Facebook NMT: Setting up a Neural Machine Translation System Training Course
7 hours (usually 1 day including breaks)
- Some programming experience is helpful
- Basic understanding of neural networks
- Experience using the command line
Fairseq is an open-source sequence-to-sequence learning toolkit created by Facebook for Neural Machine Translation (NMT).
In this training, participants will learn how to use Fairseq to carry out translation of sample content.
By the end of this training, participants will have the knowledge and practice needed to implement a live Fairseq-based machine translation solution.
- Localization specialists with a technical background
- Global content managers
- Localization engineers
- Software developers in charge of implementing global content solutions
Format of the course
- Part lecture, part discussion, heavy hands-on practice
- If you wish to use specific source and target language content, please contact us to make arrangements.
Why Neural Machine Translation?
Borrowing from image recognition techniques
Overview of the Torch and Caffe2 projects
Overview of a Convolutional Neural Machine Translation model
Convolutional Sequence to Sequence Learning
Convolutional Encoder Model for Neural Machine Translation
Standard LSTM-based model
Overview of training approaches
About GPUs and CPUs
Fast beam search generation
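Beam search keeps the k highest-scoring partial translations at each decoding step instead of committing to a single greedy choice. The sketch below is plain Python, not fairseq's optimized implementation; all names (`beam_search`, `toy_step`) are illustrative only:

```python
import math
from typing import Callable, List, Tuple

def beam_search(
    step_fn: Callable[[Tuple[int, ...]], List[Tuple[int, float]]],
    bos: int,
    eos: int,
    beam_size: int = 5,
    max_len: int = 10,
) -> Tuple[List[int], float]:
    """Return the highest-scoring token sequence under a log-prob model.

    step_fn maps a prefix of token ids to (next_token, log_prob) candidates.
    """
    # Each hypothesis is (tokens, cumulative log-probability).
    beams = [((bos,), 0.0)]
    finished = []
    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            for tok, logp in step_fn(tokens):
                hyp = (tokens + (tok,), score + logp)
                # Hypotheses that emit EOS are complete; others compete on.
                (finished if tok == eos else candidates).append(hyp)
        if not candidates:
            break
        # Prune: keep only the beam_size best unfinished hypotheses.
        candidates.sort(key=lambda h: h[1], reverse=True)
        beams = candidates[:beam_size]
    best = max(finished + beams, key=lambda h: h[1])
    return list(best[0]), best[1]

# Toy next-token model: emit token 1 or 2 until the prefix reaches
# length 3, then force EOS (token 3).
def toy_step(prefix):
    if len(prefix) < 3:
        return [(1, math.log(0.6)), (2, math.log(0.4))]
    return [(3, 0.0)]

best_tokens, best_score = beam_search(toy_step, bos=0, eos=3, beam_size=2)
# best_tokens == [0, 1, 1, 3]
```

With `beam_size=1` this degenerates to greedy decoding; larger beams trade speed for search quality.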
Installation and setup
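As a sketch of the setup step: the pip-installable Python release of fairseq can be set up as below (an assumption on our part; the original Lua/Torch release covered alongside the Torch and Caffe2 overview uses a different build process):

```shell
# Install PyTorch first (the Python fairseq release depends on it),
# then fairseq itself from PyPI.
pip install torch
pip install fairseq

# Verify that the command-line tools are on the PATH.
fairseq-train --help
```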
Evaluating pre-trained models
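A minimal sketch of evaluating a pre-trained model with the Python fairseq CLI; the checkpoint path, data directory, and language pair are placeholders for whichever published model you download:

```shell
# Translate interactively (reads source sentences from stdin)
# using a downloaded pre-trained checkpoint.
fairseq-interactive data-bin \
    --path checkpoints/model.pt \
    --source-lang en --target-lang de \
    --beam 5
```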
Preprocessing your data
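The preprocessing step binarizes tokenized parallel text into fairseq's data format. A sketch assuming the Python fairseq CLI, with placeholder file prefixes (train.en/train.de, etc.) and language codes:

```shell
# Build vocabularies and binarize train/valid/test splits into data-bin/.
fairseq-preprocess \
    --source-lang en --target-lang de \
    --trainpref train --validpref valid --testpref test \
    --destdir data-bin
```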
Training the model
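Training can then run over the binarized data. The sketch below assumes the Python fairseq CLI; `fconv_iwslt_de_en` is one of its built-in convolutional sequence-to-sequence architectures, and the hyperparameters are illustrative rather than tuned:

```shell
# Train a convolutional seq2seq model; checkpoints land in checkpoints/.
fairseq-train data-bin \
    --arch fconv_iwslt_de_en \
    --optimizer nag --lr 0.25 --clip-norm 0.1 \
    --dropout 0.2 --max-tokens 4000 \
    --save-dir checkpoints
```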
Converting a trained model to use CPU-only operations
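For CPU-only inference, a sketch assuming the Python fairseq CLI, which accepts a flag to disable CUDA at generation time:

```shell
# Decode the test set on CPU only (no GPU required).
fairseq-generate data-bin \
    --path checkpoints/checkpoint_best.pt \
    --beam 5 --cpu
```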
Joining the community