390043 UK VGSCO Course (2021S)
Optimization Methods for Data Science
Continuous assessment of course work
Registration/Deregistration
Note: The time of your registration within the registration period has no effect on the allocation of places (no "first come, first served").
- Registration from Mon 05.04.2021 00:00 to Fri 16.04.2021 23:59
- Deregistration until Sun 18.04.2021 23:59
Details
Language: English
Lecturers
Dates
Block course, April 19-30, 2021 (online)
22 April, 11:15-13:30
23 April, 9:30-11:45
23 April, 14:30-16:45
26 April, 14:30-16:45
28 April, 9:30-11:45
28 April, 14:30-16:45
29 April, 9:30-11:45
30 April, 9:30-11:45
30 April, 14:30-16:45
Information
Aims, contents and method of the course
The course focuses on optimization methods that, thanks to the advent of the "Big Data era", have re-gained popularity in the last few years. We first review a number of classic methods in the context of modern real-world applications. Then we discuss both theoretical and computational aspects of some variants of those classic methods. Finally, we examine current challenges and future research perspectives. Our presentation, strongly influenced by Nesterov's seminal book, includes the analysis of first-order methods, stochastic optimization methods, randomized and distributed methods, and projection-free methods. The theoretical tools considered in the analysis, together with the broad applicability of those methods, make the course quite interdisciplinary and potentially useful for PhD students in different areas (e.g., Analysis, Numerical Analysis, Operations Research, Probability and Mathematical Statistics). A minimal code sketch illustrating two of the listed methods is given after the outline below.

1. Methods for Unconstrained Optimization:
1.1 Gradient and accelerated gradient methods
1.3 Block-coordinate approaches
1.4 Stochastic gradient and its variants
1.5 Real-world problems

2. Methods for Constrained Optimization, Projection-based and Projection-free Approaches:
2.1 Projected gradient
2.2 Frank-Wolfe method and its variants
2.3 Real-world problems

3. Challenges and Future Research
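As a small illustration of two of the methods named in the outline (the basic gradient method of 1.1 and the Frank-Wolfe method of 2.2), the following Python sketch applies both to a least-squares objective, the latter over the probability simplex. This is only a minimal sketch under invented assumptions: the data A and b, the iteration counts and the step-size choices are made up for the example and are not taken from the course material.

import numpy as np

# Illustrative sketch only: gradient method (topic 1.1) and Frank-Wolfe method
# (topic 2.2) for f(x) = 0.5 * ||A x - b||^2; the problem data are invented.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 5))
b = rng.standard_normal(30)

def grad(x):
    # Gradient of f(x) = 0.5 * ||A x - b||^2
    return A.T @ (A @ x - b)

# Gradient method with constant step size 1/L, where L = ||A||_2^2 is the
# Lipschitz constant of grad f (unconstrained problem).
L = np.linalg.norm(A, 2) ** 2
x = np.zeros(5)
for _ in range(100):
    x = x - grad(x) / L

# Frank-Wolfe (projection-free) method over the simplex {x >= 0, sum(x) = 1}.
y = np.full(5, 1 / 5)                 # feasible starting point
for k in range(100):
    g = grad(y)
    s = np.zeros(5)
    s[np.argmin(g)] = 1.0             # linear minimization oracle: best simplex vertex
    gamma = 2 / (k + 2)               # classic diminishing step size
    y = (1 - gamma) * y + gamma * s   # convex combination keeps the iterate feasible

print("gradient method:", np.round(x, 3))
print("Frank-Wolfe:    ", np.round(y, 3))

Note the design contrast: the gradient method takes unconstrained steps, while Frank-Wolfe never projects; it only calls a linear minimization oracle (here, picking a simplex vertex) and moves by a convex combination, so every iterate stays feasible.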
Assessment and permitted materials
Homework assignments and/or a seminar
Minimum requirements and assessment criteria
A basic knowledge of linear algebra, calculus and probability theory.
Examination topics
Reading list
Beck, Amir. First-Order Methods in Optimization. Society for Industrial and Applied Mathematics, 2017.
Bertsekas, Dimitri P. Convex Optimization Algorithms. Belmont, MA: Athena Scientific, 2015.
Nesterov, Yurii. Introductory Lectures on Convex Optimization: A Basic Course. Vol. 87. Springer Science & Business Media, 2003.
Assignment within the course catalogue
Last modified: Thu 08.04.2021 10:09