340199 VU Advanced Machine Translation (2023W)
Continuous assessment of course work
Registration/Deregistration
Note: The time of your registration within the registration period has no effect on the allocation of places (no first come, first served).
- Registration is open from Mo 11.09.2023 09:00 to Fr 22.09.2023 17:00
- Registration is open from Mo 09.10.2023 09:00 to Fr 13.10.2023 17:00
- Deregistration possible until Tu 31.10.2023 23:59
Details
max. 25 participants
Language: English
Lecturers
Classes
- Wednesday 11.10. 16:45 - 20:00, Medienlabor II, ZfT, Gymnasiumstraße 50, 4th floor
- Wednesday 25.10. 16:45 - 20:00, Medienlabor II, ZfT, Gymnasiumstraße 50, 4th floor
- Wednesday 08.11. 16:45 - 20:00, Medienlabor II, ZfT, Gymnasiumstraße 50, 4th floor
- Wednesday 22.11. 16:45 - 20:00, Medienlabor II, ZfT, Gymnasiumstraße 50, 4th floor
- Wednesday 06.12. 16:45 - 20:00, Medienlabor II, ZfT, Gymnasiumstraße 50, 4th floor
- Wednesday 10.01. 16:45 - 20:00, Medienlabor II, ZfT, Gymnasiumstraße 50, 4th floor
- Wednesday 24.01. 16:45 - 20:00, Medienlabor II, ZfT, Gymnasiumstraße 50, 4th floor
Information
Aims, contents and method of the course
Students will acquire specialised and practical knowledge of neural machine translation (NMT): word alignment models, self-attention architectures, multilingual NMT, domain adaptation approaches for NMT, and NMT decoding. Using state-of-the-art technologies, students will learn to implement unsupervised word alignment methods and to apply different approaches to customising multilingual NMT models.
Content:
- Multilingual NMT
- Domain adaptation for multilingual NMT
- Efficient transformer architectures
- Decoding for NMT
- Unsupervised word alignment models
Didactic approach:
Students will complete practical assignments covering a range of approaches to word alignment, multilingual NMT, and domain adaptation. Students will also gain hands-on experience with unsupervised deep generative models and with decoding for NMT. The course will be taught in English, with some opportunities to use other languages in the coursework.
Assessment and permitted materials
Continuous evaluation:
- Weekly reflections and paper presentations count for 40% of the mark.
- The word alignment task deliverable counts for 20% of the mark.
- The benchmarking of domain adaptation approaches for NMT deliverable counts for 40% of the mark.
Minimum requirements and assessment criteria
In order to pass this course, a student needs to reach at least the grade threshold of "sufficient" (4).
MT marking map
excellent - sehr gut (1)
good - gut (2)
average - befriedigend (3)
sufficient - genügend (4)
insufficient - nicht genügend (5)
Examination topics
- MT word alignment
- Self-attention architectures (see the sketch after this list)
- Multilingual NMT
- Multilingual NMT domain adaptation
- NMT decoding
Reading list
Core texts:
- Kenny, Dorothy. 2022. Machine translation for everyone: Empowering users in the age of artificial intelligence. (Translation and Multilingual Natural Language Processing 18). Berlin: Language Science Press. DOI: 10.5281/zenodo.6653406 (url: https://langsci-press.org/catalog/book/342)
- Koehn, P. 2020. Neural Machine Translation. Cambridge University Press.
- Peter F. Brown, Stephen A. Della Pietra, Vincent J. Della Pietra, and Robert L. Mercer. 1993. The Mathematics of Statistical Machine Translation: Parameter Estimation. Computational Linguistics, 19(2):263–311.
- Collins, Michael. 2011. "Statistical Machine Translation: IBM Models 1 and 2."
- Yi Tay, Mostafa Dehghani, Dara Bahri, and Donald Metzler. 2022. Efficient Transformers: A Survey. ACM Comput. Surv. 55, 6, Article 109 (June 2023), 28 pages. https://doi.org/10.1145/3530811
- Danielle Saunders. 2022. Domain Adaptation for Neural Machine Translation. In Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, pages 9–10, Ghent, Belgium. European Association for Machine Translation.
Additional recommended resources:
- Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention). https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
- Voita, Lena. Neural Machine Translation Inside Out. https://lena-voita.github.io/posts/nmt_inside_out.html
- Lopez, Adam. Word Alignment and the Expectation-Maximization Algorithm. https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=8338914856defebe908394c2b33dc43d350c5dd0
- Huang, A., Subramanian, S., Sum, J., Almubarak, K., Biderman, S., & Rush, S. 2022. The Annotated Transformer. URL: https://nlp.seas.harvard.edu/annotated-transformer/
Association in the course directory
Last modified: We 27.09.2023 10:48