
Friday, November 24, 2017

Deep learning, where are you going?

Notes on a talk by Kyunghyun Cho, a professor at New York University, titled "Deep learning, where are you going?" My main takeaways:

(1) Currently, most people use neural networks for one specific task: grab the data and annotations, build an architecture, and train the model. However, as time goes on, the trained model becomes isolated as new information arrives, and we have to retrain it on the newly collected data. So how can we get more out of pre-trained models? One idea is to combine several pre-trained models to solve a more complex task, or to use another neural network to interpret a pre-trained model's representations.
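
To make (1) concrete, here is a minimal PyTorch sketch of the combination idea. The two frozen encoders are hypothetical stand-ins for real pre-trained models (say, an image encoder and a text encoder); nothing here is from the talk itself, but it shows the mechanics of keeping pre-trained weights fixed while training only a small head for a new task.

```python
# Combining two frozen "pre-trained" models with a small trainable head.
# The encoders are toy stand-ins; only the head's weights are updated.
import torch
import torch.nn as nn

class FrozenEncoder(nn.Module):
    """Stand-in for a pre-trained model whose weights we do not update."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.net = nn.Linear(in_dim, out_dim)
        for p in self.parameters():
            p.requires_grad = False  # freeze the "pre-trained" weights

    def forward(self, x):
        return torch.relu(self.net(x))

# Two hypothetical pre-trained components, e.g. image and text encoders.
image_enc = FrozenEncoder(in_dim=512, out_dim=128)
text_enc = FrozenEncoder(in_dim=300, out_dim=128)

# The only trainable part: a small head that combines/interprets the
# frozen representations for a new, more complex task (10-way labels).
head = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One toy training step on random data.
img, txt = torch.randn(8, 512), torch.randn(8, 300)
labels = torch.randint(0, 10, (8,))
features = torch.cat([image_enc(img), text_enc(txt)], dim=-1)
loss = loss_fn(head(features), labels)
loss.backward()      # gradients flow only into the head
optimizer.step()
print(loss.item())
```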

(2) The idea behind multilingual translation is to train a shared continuous representation space across languages, at the word or character level. He found that the char2char model works better than word2word or word2char models. Additionally, you can translate code-switched input, where a single sentence mixes languages such as English and French.
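
A rough sketch of the shared character-level space in (2), again in PyTorch. The vocabulary, dimensions, and toy sentence are my own illustrative choices, and the real char2char model uses a more elaborate convolutional encoder; the property shown here is just the key one: a single character vocabulary and a single encoder shared across languages, so a code-switched English/French sentence lands in the same continuous space as monolingual ones.

```python
# One character vocabulary and one encoder shared by all languages.
import torch
import torch.nn as nn

# Shared character inventory: the same embedding table serves English,
# French, etc., so mixed-language input needs no language tag or
# separate tokenizer.
chars = sorted(set("abcdefghijklmnopqrstuvwxyzàéèêç ,."))
char_to_id = {c: i for i, c in enumerate(chars)}

embed = nn.Embedding(len(chars), 64)          # shared character embeddings
encoder = nn.GRU(64, 128, batch_first=True)   # shared recurrent encoder

def encode(sentence):
    """Map any sentence, in any (mix of) language(s), into the shared space."""
    ids = torch.tensor([[char_to_id[c] for c in sentence.lower()]])
    _, h = encoder(embed(ids))
    return h.squeeze()  # one vector in the shared continuous space

# A code-switched English/French sentence maps into the same 128-dim
# space as any monolingual sentence, because the characters are shared.
print(encode("the cat sat, le chat est là.").shape)  # torch.Size([128])
```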
