250085 VU Tensor methods for data science and scientific computing (2021W)
Continuous assessment course
Labels
MIXED
Registration/Deregistration
Note: The time of your registration within the registration period has no effect on the allocation of places (no "first come, first served").
- Registration from Mon 13.09.2021 00:00 to Mon 27.09.2021 23:59
- Deregistration possible until Sun 31.10.2021 23:59
Details
max. 25 participants
Language: English
Lecturers
Dates (iCal) - the next date is marked with N
The course is organized in sessions of two types.
(i) LECTURE SESSIONS (three academic hours weekly) will cover mostly theoretical material.
These sessions will be taught in class or online (via Moodle and BigBlueButton).
The lectures will consist of a comprehensive presentation of theoretical material on a chalkboard or a virtual board, at the usual chalkboard pace.
The lectures will be recorded, and the video recordings will be available to the registered students (via Moodle).
(ii) EXERCISE SESSIONS (one academic hour weekly) will revisit the methods and techniques covered in (i), focusing on their practical aspects and implementation as well as on homework assignments.
These sessions will be taught in class or online (via Moodle and BigBlueButton).
In any case, these sessions will NOT be recorded.
Any relevant demonstration code will be available to the registered students (via Moodle).
- Monday 04.10. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 07.10. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 11.10. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 14.10. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 18.10. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 21.10. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 25.10. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 28.10. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 04.11. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 08.11. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 11.11. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 15.11. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 18.11. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 22.11. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 25.11. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 29.11. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 02.12. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 06.12. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 09.12. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 13.12. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 16.12. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 10.01. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 13.01. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 17.01. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 20.01. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 24.01. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 27.01. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 31.01. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
Information
Aims, contents and method of the course
Type of assessment and permitted materials
Minimum requirements and assessment criteria
The theory and practice of the techniques presented in the course.
Examination topics
Reading list
Association in the course directory
MAMV
Last modified: Tue 05.10.2021 17:29
The course focuses on two areas:
* low-rank approximation and analysis of abstract data represented by multi-dimensional arrays, and
* adaptive numerical methods for solving PDE problems.
For these two areas, seemingly disjoint though they are, the idea of exactly representing or approximating «data» in a suitable low-dimensional subspace of a large (possibly infinite-dimensional) space is equally natural. The notions of matrix rank and of low-rank matrix approximation, presented in basic courses on linear algebra, are central to one of many possible expressions of this idea.

In psychometrics, signal processing, image processing and (vaguely defined) data mining, low-rank tensor decompositions have been studied as a way of formally generalizing the notion of rank from matrices to higher-dimensional arrays (tensors). Several such generalizations have been proposed, including the canonical polyadic (CP) and Tucker decompositions and the tensor-SVD, with the primary motivation of analyzing, interpreting and compressing datasets. In this context, data are often thought of as parametrizations of images, video, social networks or collections of interconnected texts; on the other hand, data representing functions (which often occur in computational mathematics) are remarkable for the possibility of precise analysis.

The tensor-train (TT) and the more general hierarchical Tucker decompositions were developed more recently in the numerical-mathematics community, with particular attention to PDE problems. In fact, exactly the same and very similar representations had long been used for the numerical simulation of many-body quantum systems by computational chemists and physicists under the names of «matrix-product states» (MPS) and «multilayer multi-configuration time-dependent Hartree». These low-rank tensor decompositions are based on subspace approximation, which can be performed adaptively and iteratively, in a multilevel fashion.
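To make the subspace-approximation idea behind the tensor-train decomposition concrete, here is a minimal NumPy sketch of the TT-SVD algorithm, which builds the TT cores by sequential truncated SVDs of unfoldings. The function names and the relative truncation rule are illustrative choices, not part of the course material:

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a d-dimensional array into tensor-train (TT) cores
    by sequential truncated SVDs of unfoldings (the TT-SVD algorithm)."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r = 1                                   # current left TT rank
    mat = tensor.reshape(r * shape[0], -1)  # unfold: first mode vs. the rest
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        # keep singular values above a relative tolerance (illustrative rule)
        rank = max(1, int(np.sum(s > eps * s[0])))
        cores.append(U[:, :rank].reshape(r, shape[k], rank))
        # carry the remainder into the next unfolding
        mat = (s[:rank, None] * Vt[:rank]).reshape(rank * shape[k + 1], -1)
        r = rank
    cores.append(mat.reshape(r, shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into the full array."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=(-1, 0))
    return res.reshape([c.shape[1] for c in cores])
```

For a tensor that is exactly an outer product of vectors, every TT rank detected by this sketch is 1, so a d-dimensional array with n^d entries is stored in roughly d·n numbers; for a general tensor the ranks grow until the reconstruction is exact to the chosen tolerance.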
In the broader context of PDE problems, this leads to numerical methods that are formally based on generic discretizations but effectively operate on adaptive, data-driven discretizations constructed «online», in the course of computation. In several settings, such methods achieve the accuracy of sophisticated problem-specific methods.

***

The goal of the course is to introduce students to the foundations of modern low-rank tensor methods.
The course also aims to provide students with ample opportunity to start their own research.