This course will cover deep learning methods for neural machine translation and text generation. It will cover neural models for discrete sequences, including long short-term memory (LSTM) networks, Transformers, and encoder-decoder frameworks for generating language, along with the training objectives and optimization algorithms used to learn these models. It will also include strategies such as back-translation and knowledge distillation. Finally, it will cover multilingual machine translation, low-resource translation, speech translation, and visual translation.
Prerequisites: 130A or 130B; 165A or 165B.
Text (optional): Philipp Koehn, Neural Machine Translation, Cambridge University Press, ISBN-10: 1108497322.