Universität Wien

340199 VU Advanced Machine Translation (2024W)

5.00 ECTS (3.00 SWS), SPL 34 - Translationswissenschaft
Continuous assessment of course work

Registration/Deregistration

Note: The time of your registration within the registration period has no effect on the allocation of places (no first come, first served).

Details

max. 25 participants
Language: English

Lecturers

Classes

  • Wednesday 16.10., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor
  • Wednesday 30.10., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor
  • Wednesday 06.11., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor
  • Wednesday 13.11., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor
  • Wednesday 20.11., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor
  • Wednesday 04.12., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor
  • Wednesday 11.12., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor
  • Wednesday 08.01., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor
  • Wednesday 15.01., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor
  • Wednesday 22.01., 16:45–19:00, Medienlabor II, ZfT Gymnasiumstraße 50, 4th floor

Information

Aims, contents and method of the course

- This course will be taught in person in the ZTW computer lab, with a hybrid or online option.

Goals:
Students will acquire specialised and practical knowledge of neural machine translation (NMT), self-attention architectures, multilingual NMT, domain adaptation approaches for NMT, and NMT decoding.
Using state-of-the-art technologies, students will learn to apply different approaches to customise multilingual NMT models.

Content:

- Multilingual NMT (see the sketch after this list)
- Domain adaptation for multilingual NMT
- Efficient transformer architectures
- Decoding for NMT
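
As an orientation to the first and last topics above, the following minimal sketch translates with a publicly available multilingual NMT model and decodes with beam search. The tooling is an assumption chosen for illustration (the Hugging Face transformers library and the facebook/m2m100_418M checkpoint), not a statement of what the course itself uses.

    # Minimal sketch, assuming Hugging Face transformers and the
    # facebook/m2m100_418M multilingual checkpoint (illustrative choices).
    from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

    model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
    tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

    tokenizer.src_lang = "en"  # tag the source language
    inputs = tokenizer("Machine translation is fascinating.", return_tensors="pt")

    # One model covers many language pairs: forcing a different target-language
    # token (e.g. "fr" instead of "de") retargets the model without retraining.
    generated = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.get_lang_id("de"),
        num_beams=5,  # beam-search decoding; num_beams=1 would be greedy decoding
    )
    print(tokenizer.batch_decode(generated, skip_special_tokens=True))

Varying num_beams trades decoding speed against search quality, one of the trade-offs typically examined under NMT decoding.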

Didactic approach:
Students will complete practical assignments involving a range of approaches to multilingual NMT and domain adaptation (a minimal fine-tuning sketch follows below). Students will also gain hands-on experience with deep generative models and with decoding for NMT. The course will be taught in English, with some opportunities to use other languages in the coursework.
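
As a hedged illustration of how such a domain adaptation assignment could begin, the sketch below applies parameter-efficient fine-tuning (LoRA, one of the approaches surveyed in the reading list) to a pretrained NMT model. The peft library, the Helsinki-NLP/opus-mt-en-de checkpoint, and the hyperparameters are assumptions for illustration, not course requirements.

    # Hypothetical starting point for a domain adaptation assignment:
    # add LoRA adapters to a pretrained NMT model, then train only the
    # adapters on in-domain parallel data (training loop not shown).
    from transformers import AutoModelForSeq2SeqLM
    from peft import LoraConfig, TaskType, get_peft_model

    base = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-de")

    lora = LoraConfig(
        task_type=TaskType.SEQ_2_SEQ_LM,
        r=8,                  # adapter rank: capacity vs. trainable-parameter count
        lora_alpha=16,
        lora_dropout=0.05,
        target_modules=["q_proj", "v_proj"],  # attention projections in Marian models
    )
    model = get_peft_model(base, lora)
    model.print_trainable_parameters()  # only a small fraction of weights will train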

Assessment and permitted materials

Continuous assessment:
- Weekly reflections and paper presentations count for 40% of the mark.
- The decoding-for-NMT deliverable counts for 20% of the mark.
- The deliverable benchmarking domain adaptation approaches for NMT counts for 40% of the mark.

Minimum requirements and assessment criteria

To pass this course, a student must achieve at least the grade "sufficient" (4).
Grading scale:
excellent - sehr gut (1)
good - gut (2)
satisfactory - befriedigend (3)
sufficient - genügend (4)
insufficient - nicht genügend (5)

Examination topics

- Self-attention architectures
- Multilingual NMT
- Multilingual NMT domain adaptation
- NMT decoding

Reading list

Core texts:
- Kenny, Dorothy (ed.). 2022. Machine translation for everyone: Empowering users in the age of artificial intelligence (Translation and Multilingual Natural Language Processing 18). Berlin: Language Science Press. DOI: 10.5281/zenodo.6653406. https://langsci-press.org/catalog/book/342
- Koehn, Philipp. 2020. Neural Machine Translation. Cambridge: Cambridge University Press.
- Brown, Peter F., Stephen A. Della Pietra, Vincent J. Della Pietra & Robert L. Mercer. 1993. The Mathematics of Statistical Machine Translation: Parameter Estimation. Computational Linguistics 19(2): 263–311.
- Collins, Michael. 2011. Statistical Machine Translation: IBM Models 1 and 2. Lecture notes.
- Tay, Yi, Mostafa Dehghani, Dara Bahri & Donald Metzler. 2022. Efficient Transformers: A Survey. ACM Computing Surveys 55(6), Article 109 (June 2023). https://doi.org/10.1145/3530811
- Saunders, Danielle. 2022. Domain Adaptation for Neural Machine Translation. In Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, 9–10. Ghent, Belgium: European Association for Machine Translation.
- Xu, Lingling et al. 2023. Parameter-Efficient Fine-Tuning Methods for Pretrained Language Models: A Critical Review and Assessment. arXiv preprint arXiv:2312.12148.

Additional recommended resources:
- Alammar, Jay. Visualizing a Neural Machine Translation Model (Mechanics of Seq2seq Models with Attention). https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
- Voita, Lena. Neural Machine Translation Inside Out. https://lena-voita.github.io/posts/nmt_inside_out.html
- Huang, A., Subramanian, S., Sum, J., Almubarak, K., Biderman, S. & Rush, S. 2022. The Annotated Transformer. https://nlp.seas.harvard.edu/annotated-transformer/

Association in the course directory

Last modified: Mon 07.10.2024 10:07