Tutorial: The frustrating past, the exciting present and the bright future of (neural) machine translation


In this tutorial, I will present the basic concepts of neural MT as the new state of the art in automatic translation. We will look at the common architecture of attention-based sequence-to-sequence models, along with an overview of useful extensions and practical tricks. The tutorial will also point to available tools and resources that make it easy to get hands-on experience with real-world data. No deep background in neural networks or machine learning is required; the focus is on a gentle introduction to the models and techniques.
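To give a flavour of the attention mechanism at the heart of these sequence-to-sequence models, here is a minimal sketch of single-query dot-product attention in plain Python. The function name and the toy vectors are illustrative only; real NMT systems compute this over learned, high-dimensional representations using tensor libraries.

```python
import math

def dot_product_attention(query, keys, values):
    """Illustrative single-query dot-product attention.

    query: vector of floats, length d
    keys, values: lists of n vectors, each of length d
    Returns the attention-weighted sum of the value vectors.
    """
    # Alignment scores: dot product between the query and each key.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    # Softmax-normalise the scores into attention weights
    # (subtracting the max for numerical stability).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: weighted sum of the values.
    d = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(d)]

# Toy example: the query strongly matches the first key, so the
# context vector is dominated by the first value vector.
context = dot_product_attention(
    query=[10.0, 0.0],
    keys=[[10.0, 0.0], [0.0, 10.0]],
    values=[[1.0, 0.0], [0.0, 1.0]],
)
```

In an encoder-decoder translation model, the query would be a decoder state, and the keys and values the encoder states, so the context vector summarises the source positions most relevant to the word being generated.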