340199 VU Advanced Machine Translation (2023W)
Continuously assessed course
Registration/Deregistration
Note: The time of your registration within the registration period has no effect on the allocation of places (no "first come, first served").
- Registration from Mon 11.09.2023 09:00 to Fri 22.09.2023 17:00
- Registration from Mon 09.10.2023 09:00 to Fri 13.10.2023 17:00
- Deregistration possible until Tue 31.10.2023 23:59
Details
Max. 25 participants
Language: English
Lecturers
Dates (iCal) - the next date is marked with N
Wednesday 11.10. 16:45 - 20:00, Medienlabor II ZfT Gymnasiumstraße 50 4.OG
Wednesday 25.10. 16:45 - 20:00, Medienlabor II ZfT Gymnasiumstraße 50 4.OG
Wednesday 08.11. 16:45 - 20:00, Medienlabor II ZfT Gymnasiumstraße 50 4.OG
Wednesday 22.11. 16:45 - 20:00, Medienlabor II ZfT Gymnasiumstraße 50 4.OG
Wednesday 06.12. 16:45 - 20:00, Medienlabor II ZfT Gymnasiumstraße 50 4.OG
Wednesday 10.01. 16:45 - 20:00, Medienlabor II ZfT Gymnasiumstraße 50 4.OG
Wednesday 24.01. 16:45 - 20:00, Medienlabor II ZfT Gymnasiumstraße 50 4.OG
Information
Aims, contents and method of the course
Students will acquire specialised and practical knowledge of neural machine translation (NMT), word alignment models, self-attention architectures, multilingual NMT, domain adaptation approaches for NMT, and NMT decoding. Using state-of-the-art technologies, students will learn to implement unsupervised word alignment methods and to apply different approaches to customising multilingual NMT models.
Content:
- Multilingual NMT
- Domain adaptation for multilingual NMT
- Efficient transformer architectures
- Decoding for NMT
- Unsupervised word alignment models
Didactic approach:
Students will complete practical assignments involving a range of approaches to word alignment, multilingual NMT, and domain adaptation. Students will also gain experience with unsupervised deep generative models and with decoding for NMT. The course will be taught in English, with some opportunities to use other languages to complete the coursework.
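For orientation, a minimal sketch of the kind of unsupervised word alignment model covered in the course, assuming IBM Model 1 trained with expectation-maximization as in the Brown et al. (1993) and Collins readings listed below; the toy corpus and function names are illustrative, not course material.

```python
from collections import defaultdict

def train_ibm_model1(corpus, iterations=10):
    """Estimate IBM Model 1 lexical translation probabilities t(f|e) with EM.

    corpus: list of (source_tokens, target_tokens) sentence pairs.
    A NULL token is added to the source side, as in Brown et al. (1993).
    """
    # Uniform initialisation of t(f|e) over the target vocabulary.
    target_vocab = {f for _, fs in corpus for f in fs}
    t = defaultdict(lambda: 1.0 / len(target_vocab))

    for _ in range(iterations):
        count = defaultdict(float)   # expected counts c(f, e)
        total = defaultdict(float)   # expected counts c(e)
        for es, fs in corpus:
            es = ["NULL"] + es
            for f in fs:
                # E-step: distribute each target word over all source words.
                norm = sum(t[(f, e)] for e in es)
                for e in es:
                    delta = t[(f, e)] / norm
                    count[(f, e)] += delta
                    total[e] += delta
        # M-step: re-estimate t(f|e) from the expected counts.
        for (f, e), c in count.items():
            t[(f, e)] = c / total[e]
    return t

def align(t, es, fs):
    """Greedy alignment: link each target word to its most probable source word."""
    es = ["NULL"] + es
    return [(j, max(range(len(es)), key=lambda i: t[(f, es[i])]))
            for j, f in enumerate(fs)]

# Toy German-English parallel corpus, purely illustrative.
corpus = [(["das", "haus"], ["the", "house"]),
          (["das", "buch"], ["the", "book"]),
          (["ein", "buch"], ["a", "book"])]
t = train_ibm_model1(corpus)
print(align(t, ["das", "haus"], ["the", "house"]))
```

Each EM iteration computes expected alignment counts under the current lexical translation probabilities and then renormalises them; the Lopez reading in the list below walks through this derivation in detail.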
Type of assessment and permitted materials
Continuous evaluation:
- Weekly reflections and paper presentations count for 40% of the mark.
- The word alignment task deliverable counts for 20% of the mark.
- The deliverable benchmarking domain adaptation approaches for NMT counts for 40% of the mark (a minimal scoring sketch follows below).
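For orientation only: the benchmarking deliverable will typically involve scoring system output against references on a domain-specific test set. A minimal sketch, assuming the sacrebleu library and placeholder file names (neither is prescribed by the course):

```python
# Minimal sketch: corpus-level BLEU and chrF with sacrebleu (pip install sacrebleu).
# The file names are placeholders for an in-domain test set.
import sacrebleu

with open("hypotheses.txt", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]
with open("references.txt", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# sacrebleu expects a list of reference streams (one per reference set).
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
chrf = sacrebleu.corpus_chrf(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}  chrF = {chrf.score:.2f}")
```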
Minimum requirements and assessment criteria
In order to pass this module, a student must achieve at least a grade of 4 (sufficient).
MT marking map
excellent - sehr gut (1)
good - gut (2)
average - befriedigend (3)
sufficient - genügend (4)
insufficient - nicht genügend (5)
Examination topics
- MT word alignment
- Self-attention architectures (see the sketch after this list)
- Multilingual NMT
- Multilingual NMT domain adaptation
- NMT decoding
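As a pointer for the self-attention topic above, here is a minimal NumPy sketch of scaled dot-product self-attention, the core operation of the Transformer architectures covered in this course (cf. The Annotated Transformer in the reading list); the dimensions and names are illustrative only.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X:  (seq_len, d_model) token representations.
    Wq, Wk, Wv: (d_model, d_k) learned projection matrices.
    Returns (seq_len, d_k) contextualised representations.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

# Toy example: 4 tokens, model dim 8, head dim 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 4)
```

Full Transformer layers add multiple heads, masking, and residual connections; the efficient variants surveyed by Tay et al. (see the reading list) chiefly modify the quadratic seq_len x seq_len scores computation.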
Reading list
Core texts:
- Kenny, Dorothy. 2022. Machine translation for everyone: Empowering users in the age of artificial intelligence. (Translation and Multilingual Natural Language Processing 18). Berlin: Language Science Press. DOI: 10.5281/zenodo.6653406 (url: https://langsci-press.org/catalog/book/342)
- Koehn, P. 2020. Neural Machine Translation. Cambridge University Press.
- Peter F. Brown, Stephen A. Della Pietra, Vincent J. Della Pietra, and Robert L. Mercer. 1993. The Mathematics of Statistical Machine Translation: Parameter Estimation. Computational Linguistics, 19(2):263–311.
- Collins, Michael. 2011. Statistical Machine Translation: IBM Models 1 and 2.
- Yi Tay, Mostafa Dehghani, Dara Bahri, and Donald Metzler. 2022. Efficient Transformers: A Survey. ACM Comput. Surv. 55, 6, Article 109 (June 2023), 28 pages. https://doi.org/10.1145/3530811
- Danielle Saunders. 2022. Domain Adaptation for Neural Machine Translation. In Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, pages 9–10, Ghent, Belgium. European Association for Machine Translation.
Additional recommended resources:
- Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention). https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
- Voita, Lena. Neural Machine Translation Inside Out. https://lena-voita.github.io/posts/nmt_inside_out.html
- Lopez, Adam. Word Alignment and the Expectation-Maximization Algorithm. https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=8338914856defebe908394c2b33dc43d350c5dd0
- Huang, A., Subramanian, S., Sum, J., Almubarak, K., Biderman, S., & Rush, S. 2022. The Annotated Transformer. https://nlp.seas.harvard.edu/annotated-transformer/
Association in the course directory
Last modified: Wed 27.09.2023 10:48