Universität Wien

250063 VO Mathematics of Deep Learning (2024S)

6.00 ECTS (4.00 SWS), SPL 25 - Mathematik
ON-SITE

Registration/Deregistration

Note: The time of your registration within the registration period has no effect on the allocation of places (no first come, first served).

Details

Language: English

Lecturers

Classes

Monday 04.03. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 07.03. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 11.03. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 14.03. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 18.03. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 21.03. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 08.04. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 11.04. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 15.04. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 18.04. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 22.04. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 25.04. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 29.04. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 02.05. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 13.05. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 16.05. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Thursday 23.05. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 27.05. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Monday 03.06. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 06.06. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 10.06. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 13.06. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 17.06. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 20.06. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor
Monday 24.06. 13:15 - 14:45, Seminarraum 16, Oskar-Morgenstern-Platz 1, 3rd floor
Thursday 27.06. 13:15 - 14:45, Seminarraum 8, Oskar-Morgenstern-Platz 1, 2nd floor

Information

Aims, contents and method of the course

The lecture aims to cover multiple facets of deep learning theory. Concretely, we will study the following topics:

1. What is Deep Learning: A gentle introduction to the language, notation, and main concepts of deep learning at a high level.
2. Feed-forward neural networks: Neural networks are the main building block of deep learning. We will introduce them formally; a sketch of such a definition follows this list.
3. Universal Approximation: We will study several results on the (absence of) limitations of deep neural networks in representing general functions.
4. Connection to Splines: There are close relationships between neural networks and classical approximation methods. We will discuss these.
5. ReLU neural networks: ReLU networks are the most frequently used subclass of neural networks. We will study this special case in more detail.
6. Affine pieces of ReLU neural networks: A standard tool for understanding what deep neural networks can and cannot do is to count the number of affine regions they can generate. We will derive upper and lower bounds for this count; a numerical sketch follows this list.
7. Deep ReLU neural networks: We will study the effect of depth, specifically for deep ReLU neural networks, and identify reasons why deeper architectures reign supreme in practice.
8. Curse of Dimensionality: We will study high-dimensional approximation, which is typically very hard, yet which neural networks seem to handle well.
9. Interpolation: Deep neural networks can interpolate data under certain assumptions. We will formalize those.
10. Training of deep neural networks: We will understand how neural networks are trained and under which conditions this training can work; a toy gradient-descent sketch follows this list.
11. Loss landscape analysis: The optimization problem can be understood by studying the topography of the so-called loss landscape.
12. Neural network spaces: We will study the set of all neural networks with a fixed architecture and draw conclusions for optimization.
13. Generalization: We will describe general statistical learning theory in the context of deep neural networks. This will show under which conditions one can generalize from training sets to test sets.
14. Generalization in the overparameterized regime: The regime most common in practice, the overparameterized one, is excluded from the preceding results. We will describe why things still work in practice.
15. Adversarial examples and Robustness: Finally, we study under which conditions deep neural networks are robust to changes in their inputs.
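As referenced in items 2 and 5, the following is a minimal sketch of the kind of formal definition the lecture builds on; the exact notation in the course notes may differ.

```latex
% Sketch of an L-layer feed-forward network; the course's notation may differ.
With widths $n_0, \dots, n_L$, weight matrices $W^{(l)} \in \mathbb{R}^{n_l \times n_{l-1}}$,
bias vectors $b^{(l)} \in \mathbb{R}^{n_l}$, and an activation function $\varrho$
applied componentwise, the network realizes the function
\[
  x^{(0)} = x, \qquad
  x^{(l)} = \varrho\bigl(W^{(l)} x^{(l-1)} + b^{(l)}\bigr), \quad l = 1, \dots, L-1,
  \qquad
  \Phi(x) = W^{(L)} x^{(L-1)} + b^{(L)}.
\]
For ReLU networks (item 5), $\varrho(t) = \max\{0, t\}$, which makes $\Phi$
continuous and piecewise affine.
```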
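To make item 6 concrete, here is a small numerical experiment; it is an illustration only, not part of the official course material. It estimates the number of affine pieces of a randomly initialized ReLU network along a one-dimensional input by tracking which neurons are active on a fine grid. The architecture (1 -> 8 -> 8 -> 1) and all names, e.g. activation_pattern, are invented for this sketch.

```python
# Estimate the number of affine pieces of a 1D ReLU network by scanning
# activation patterns (which hidden neurons are "on") over a fine grid.
import numpy as np

rng = np.random.default_rng(0)

# A small architecture with scalar input and output: 1 -> 8 -> 8 -> 1.
widths = [1, 8, 8, 1]
weights = [rng.standard_normal((m, n)) for n, m in zip(widths[:-1], widths[1:])]
biases = [rng.standard_normal(m) for m in widths[1:]]

def activation_pattern(x, weights, biases):
    """Binary on/off pattern of all hidden neurons at each input point."""
    h = x.reshape(-1, 1)
    patterns = []
    for W, b in zip(weights[:-1], biases[:-1]):
        pre = h @ W.T + b          # pre-activations of this hidden layer
        patterns.append(pre > 0)   # a neuron is "on" iff its input is positive
        h = np.maximum(pre, 0.0)   # ReLU
    return np.concatenate(patterns, axis=1)

# Within one affine piece the pattern is constant, so every pattern change
# between neighboring grid points marks (at least) one breakpoint.
xs = np.linspace(-5.0, 5.0, 100_001)
pats = activation_pattern(xs, weights, biases)
flips = np.any(pats[1:] != pats[:-1], axis=1)
print("affine pieces on [-5, 5]: approximately", 1 + np.count_nonzero(flips))
```

On a line, each affine piece corresponds to a constant activation pattern, which is what the grid scan exploits; exact counting in higher dimensions instead requires hyperplane-arrangement arguments.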
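Finally, for item 10: training usually means minimizing an empirical loss by (stochastic) gradient descent. Below is a toy sketch, again not course material, with the model deliberately reduced to the linear case so that the gradient of the squared loss can be written by hand.

```python
# Empirical risk minimization by gradient descent, shown for the simplest
# possible "network" (a linear model), so the gradient has a closed form.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))                  # 100 training inputs in R^3
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.standard_normal(100)   # noisy labels

w = np.zeros(3)                            # initialization
lr = 0.1                                   # step size (learning rate)
for step in range(200):
    residual = X @ w - y                   # prediction error
    grad = 2.0 / len(y) * X.T @ residual   # gradient of the mean squared loss
    w -= lr * grad                         # gradient descent step

print("recovered weights:", np.round(w, 3))  # close to w_true
```

For an actual deep network the same loop applies, with the gradient computed by backpropagation instead of this closed-form expression.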

Assessment and permitted materials

Depending on the number of students who take this class, we will have either a written or an oral exam at the end of the lecture.

Minimum requirements and assessment criteria

To pass this class, a student needs to demonstrate a basic understanding of the course material. For the best grade, a thorough understanding of each result is required.

Examination topics

Everything presented in the lecture.

Reading list

Course notes will be published progressively throughout the course.

Association in the course directory

MAMV; MANV; MSTV;

Last modified: Wed 28.02.2024 10:26