
250041 VO Low-rank tensors and the data-driven solution of PDEs (2021S)

5.00 ECTS (3.00 SWS), SPL 25 - Mathematics

Registration/Deregistration

Note: The time of your registration within the registration period has no effect on the allocation of places (no "first come, first served").

Details

Language: English

Examination dates

Lecturers

Dates

The course is organized into sessions of two types.

(i) LECTURES PROPER (two academic hours weekly)
will cover theoretical material.
These sessions will be taught online throughout the semester (via Moodle and BigBlueButton).
The lectures will consist of a comprehensive presentation of theoretical material on a virtual board, at the usual chalkboard pace.
The virtual-board notes will be exported after each lecture and will be available to the registered students (via Moodle).
The lectures will be recorded, and the video recordings will be available to the registered students (via Moodle).

(ii) DISCUSSIONS (one academic hour weekly)
will revisit the methods and techniques covered in (i),
focusing on their practical aspects and implementation
as well as on optional assignments, should any be offered.
These sessions will be organized online, similarly to (i),
but may switch to in-class teaching during the semester if that becomes possible.
These sessions will NOT be recorded.
Any relevant virtual-board notes and demonstration code will be available to the registered students (via Moodle).

All sessions are held digitally; the assigned room is Seminarraum 12, Oskar-Morgenstern-Platz 1, 2nd floor.

Monday 01.03. 15:00 - 16:30
Thursday 04.03. 15:00 - 15:45
Monday 08.03. 15:00 - 16:30
Thursday 11.03. 15:00 - 15:45
Monday 15.03. 15:00 - 16:30
Thursday 18.03. 15:00 - 15:45
Monday 22.03. 15:00 - 16:30
Thursday 25.03. 15:00 - 15:45
Monday 12.04. 15:00 - 16:30
Thursday 15.04. 15:00 - 15:45
Monday 19.04. 15:00 - 16:30
Thursday 22.04. 15:00 - 15:45
Monday 26.04. 15:00 - 16:30
Thursday 29.04. 15:00 - 15:45
Monday 03.05. 15:00 - 16:30
Thursday 06.05. 15:00 - 15:45
Monday 10.05. 15:00 - 16:30
Monday 17.05. 15:00 - 16:30
Thursday 20.05. 15:00 - 15:45
Thursday 27.05. 15:00 - 15:45
Monday 31.05. 15:00 - 16:30
Monday 07.06. 15:00 - 16:30
Thursday 10.06. 15:00 - 15:45
Monday 14.06. 15:00 - 16:30
Thursday 17.06. 15:00 - 15:45
Monday 21.06. 15:00 - 16:30
Thursday 24.06. 15:00 - 15:45
Monday 28.06. 15:00 - 16:30

Information

Aims, contents and method of the course

This course focuses on the MPS/TT low-rank decomposition of multidimensional arrays, which arises naturally as a representation of functions obtained by low-rank refinement in the construction of finite-element approximations. The course opens with an introduction to low-rank tensor decompositions from a linear-algebraic perspective and then proceeds to low-rank tensor methods for second-order linear elliptic problems. The low-rank approximation of functions, depending on their regularity, is rigorously analyzed and then placed in the context of PDE analysis. The course also treats state-of-the-art methods for solving the resulting optimality equations (ALS, AME): the construction, implementation and numerical analysis of such methods are presented.
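
For orientation, the following is a minimal NumPy sketch (illustrative only, not taken from the course materials) of the standard TT-SVD construction: a d-dimensional array is decomposed into TT/MPS cores by successive truncated SVDs of its unfoldings. The function names tt_svd and tt_full, the test function, the grid size and the truncation tolerance are all arbitrary choices made for this sketch.

    import numpy as np

    def tt_svd(tensor, tol=1e-10):
        # Decompose a d-dimensional array into TT (MPS) cores by sequential
        # truncated SVDs of unfoldings. Truncation by an absolute singular-value
        # threshold `tol` is a simplification chosen here for brevity.
        dims = tensor.shape
        cores, r_prev = [], 1
        unfold = tensor.reshape(r_prev * dims[0], -1)
        for k in range(len(dims) - 1):
            U, s, Vt = np.linalg.svd(unfold, full_matrices=False)
            r = max(1, int(np.sum(s > tol)))                    # truncation rank
            cores.append(U[:, :r].reshape(r_prev, dims[k], r))  # core of shape (r_{k-1}, n_k, r_k)
            unfold = (s[:r, None] * Vt[:r, :]).reshape(r * dims[k + 1], -1)
            r_prev = r
        cores.append(unfold.reshape(r_prev, dims[-1], 1))
        return cores

    def tt_full(cores):
        # Contract the cores back into the full array (for checking only).
        full = cores[0]
        for core in cores[1:]:
            full = np.tensordot(full, core, axes=([full.ndim - 1], [0]))
        return full[0, ..., 0]

    # Samples of f(x) = 1 / (1 + x1 + x2 + x3 + x4) on a uniform 12^4 grid
    # admit an accurate TT approximation with small ranks.
    grid = np.linspace(0.0, 1.0, 12)
    X = np.meshgrid(*([grid] * 4), indexing="ij")
    T = 1.0 / (1.0 + sum(X))
    cores = tt_svd(T, tol=1e-8 * np.linalg.norm(T.ravel()))
    print([c.shape for c in cores])     # small TT ranks
    err = np.linalg.norm((tt_full(cores) - T).ravel()) / np.linalg.norm(T.ravel())
    print(err)                          # small relative error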

***

This course spotlights the intersection of two areas of modern applied mathematics:
* low-rank approximation and analysis of abstract data represented by multi-dimensional arrays, and
* adaptive numerical methods for solving PDE problems.
However disjoint these two areas may seem, the idea of exactly representing or approximating «data» in a suitable low-dimensional subspace of a large (possibly infinite-dimensional) space is equally natural to both. The notions of matrix rank and of low-rank matrix approximation, presented in basic courses of linear algebra, are central to one of many possible expressions of this idea.
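
As a concrete reminder (an illustrative sketch, not part of the course materials), the best rank-k approximation of a matrix in the spectral and Frobenius norms is obtained by truncating its SVD (Eckart-Young theorem). The Hilbert matrix, its size and the rank k below are arbitrary illustrative choices.

    import numpy as np

    # A matrix with rapidly decaying singular values (the Hilbert matrix,
    # an arbitrary illustrative choice).
    n = 200
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    A = 1.0 / (i + j + 1.0)

    # Truncated SVD: keep the k dominant singular triplets.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    k = 10
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # By the Eckart-Young theorem, A_k is a best rank-k approximation of A;
    # its error in the spectral norm equals the (k+1)-st singular value.
    print(np.linalg.norm(A - A_k, 2), s[k])   # the two numbers agree up to rounding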

In psychometrics, signal processing, image processing and (vaguely defined) data mining, low-rank tensor decompositions have been studied as a way of formally generalizing the notion of rank from matrices to higher-dimensional arrays (tensors). Several such generalizations have been proposed, including the canonical polyadic (CP) and Tucker decompositions and the tensor-SVD, with the primary motivation of analyzing, interpreting and compressing datasets. In this context, data are often thought of as parametrizations of images, video, social networks or collections of texts, whereas data representing functions are mostly considered as convenient test examples.
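
To make concrete what such a generalization stores (a sketch with arbitrary names and sizes, not taken from the course materials), the following assembles a third-order array from a rank-R canonical polyadic (CP) representation: one factor matrix per mode, with the array given by a sum of R outer products of their columns.

    import numpy as np

    rng = np.random.default_rng(0)
    R, n1, n2, n3 = 3, 10, 12, 14
    A, B, C = (rng.standard_normal((n, R)) for n in (n1, n2, n3))

    # T[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
    T = np.einsum("ir,jr,kr->ijk", A, B, C)
    print(T.shape, R * (n1 + n2 + n3))   # full array size vs. number of stored parameters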

On the other hand, the «tensor-train» (TT) and the more general «hierarchical Tucker» decompositions were developed more recently, in the numerical-mathematics community, with particular attention to PDE problems. In fact, the same and closely related representations had long been used for the numerical simulation of many-body quantum systems by computational chemists and physicists under the names of «matrix-product states» (MPS) and «multilayer multi-configuration time-dependent Hartree». These low-rank tensor decompositions are based on subspace approximation, which can be performed adaptively and iteratively, in a multilevel fashion. In the broader context of PDE problems, this leads to numerical methods that are formally based on generic discretizations but effectively operate on adaptive, data-driven discretizations constructed «online», in the course of computation. In several settings of practical importance, such methods achieve the accuracy of sophisticated problem-specific methods.
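
As a small data-driven illustration (again a sketch, not taken from the course materials), the snippet below «quantizes» samples of a univariate function on a dyadic grid with 2**d points into a 2 x 2 x ... x 2 array and estimates its TT ranks from the SVDs of its unfoldings; the function, grid size and drop tolerance are arbitrary choices.

    import numpy as np

    # Samples on a dyadic grid, reshaped ("quantized") into a d-dimensional
    # 2 x 2 x ... x 2 array. The TT ranks of that array are the ranks of its
    # unfolding matrices, estimated here by SVD with a drop tolerance.
    d = 14
    x = np.linspace(0.0, 1.0, 2**d, endpoint=False)
    v = np.exp(x) * np.sin(20.0 * np.pi * x)

    tol = 1e-10 * np.linalg.norm(v)
    ranks = []
    for k in range(1, d):
        unfolding = v.reshape(2**k, 2**(d - k))   # k-th unfolding of the quantized array
        s = np.linalg.svd(unfolding, compute_uv=False)
        ranks.append(int(np.sum(s > tol)))
    print(ranks)   # the ranks stay small, essentially independent of d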

***

The goal of the course is to introduce students to the foundations of modern low-rank methods for the numerical solution of PDEs and to the state of the art of research in this area.
In particular, the course is intended to provide students with ample opportunities to start their own research.
In addition, through numerous connections with basic courses (linear algebra, numerical mathematics and numerical analysis, real and complex analysis, analysis of PDEs), this course serves to reinforce a broader perspective of mathematics as an integrated field whose distinctive methods of inquiry span its pure and applied branches.

Assessment and permitted materials

Oral examination, on a flexible schedule.

Minimum requirements and assessment criteria

Examination topics

The theory and practice of the techniques covered in the course, as presented in the course.

Literature

Lecture notes will be provided after each lecture (this is the first offering of the course).
The following literature may give a general idea of low-rank tensor approximation.

* A detailed monograph on tensor methods, which the course will, however, NOT follow.
Wolfgang Hackbusch. Tensor spaces and numerical tensor calculus
https://link.springer.com/book/10.1007/978-3-030-35554-8

* A general introduction to low-rank tensor approximation.
Tamara Kolda and Brett Bader. Tensor Decompositions and Applications
https://epubs.siam.org/doi/abs/10.1137/07070111x

* An overview of tensor networks in the context of quantum systems.
Roman Orus. A practical introduction to tensor networks: matrix product states and projected entangled pair states
https://arxiv.org/abs/1306.2164

* An introduction to the MPS/TT decomposition.
Ivan Oseledets and Eugene Tyrtyshnikov. Breaking the curse of dimensionality, or how to use SVD in many dimensions
https://epubs.siam.org/doi/abs/10.1137/090748330

* An overview of the MPS/TT decomposition as a tool for function and PDE approximation.
Boris Khoromskij. O(d log N)-quantics approximation of N-d tensors in high-dimensional numerical modeling
https://link.springer.com/article/10.1007/s00365-011-9131-1

* Approximation of algebraic singularities in a PDE setting in two dimensions.
Vladimir Kazeev and Christoph Schwab. Quantized tensor-structured finite elements for second-order elliptic PDEs in two dimensions
https://link.springer.com/article/10.1007/s00211-017-0899-1

Association in the course directory

MAMV;

Last modified: Fri 22.10.2021 16:09