250085 VU Tensor methods for data science and scientific computing (2021W)
Continuous assessment of course work
Labels
MIXED
Registration/Deregistration
Note: The time of your registration within the registration period has no effect on the allocation of places (no first come, first served).
- Registration is open from Mo 13.09.2021 00:00 to Mo 27.09.2021 23:59
- Deregistration possible until Su 31.10.2021 23:59
Details
max. 25 participants
Language: English
Lecturers
Classes (iCal)
The course is organized in sessions of two types.
(i) LECTURE SESSIONS (three academic hours weekly) will cover mostly theoretical material. These sessions will be taught in class or online (via Moodle and BigBlueButton). The lectures will consist of a comprehensive presentation of theoretical material on a chalkboard or a virtual board, at the usual chalkboard pace. The lectures will be recorded, and the video recordings will be available to the registered students (via Moodle).
(ii) EXERCISE SESSIONS (one academic hour weekly) will revisit the methods and techniques covered in (i), focusing on their practical aspects and implementation as well as on homework assignments. These sessions will be taught in class or online (via Moodle and BigBlueButton). In any case, these sessions will NOT be recorded. Any relevant demonstration code will be available to the registered students (via Moodle).
- Monday 04.10. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 07.10. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 11.10. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 14.10. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 18.10. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 21.10. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 25.10. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 28.10. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 04.11. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 08.11. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 11.11. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 15.11. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 18.11. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 22.11. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 25.11. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 29.11. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 02.12. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 06.12. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 09.12. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 13.12. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 16.12. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 10.01. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 13.01. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 17.01. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 20.01. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 24.01. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
- Thursday 27.01. 11:30 - 13:00 Seminarraum 10, Kolingasse 14-16, OG01
- Monday 31.01. 13:15 - 14:45 Seminarraum 10, Kolingasse 14-16, OG01
Information
Aims, contents and method of the course
The course is concerned with two areas:
* low-rank approximation and analysis of abstract data represented by multi-dimensional arrays, and
* adaptive numerical methods for solving PDE problems.
For these two areas, however seemingly disjoint, the idea of exactly representing or approximating «data» in a suitable low-dimensional subspace of a large (possibly infinite-dimensional) space is equally natural. The notions of matrix rank and of low-rank matrix approximation, presented in basic courses on linear algebra, are central to one of many possible expressions of this idea.
In psychometrics, signal processing, image processing and (vaguely defined) data mining, low-rank tensor decompositions have been studied as a way of formally generalizing the notion of rank from matrices to higher-dimensional arrays (tensors). Several such generalizations have been proposed, including the canonical polyadic (CP) and Tucker decompositions and the tensor-SVD, with the primary motivation of analyzing, interpreting and compressing datasets. In this context, data are often thought of as parametrizations of images, video, social networks or collections of interconnected texts; on the other hand, data representing functions (which often occur in computational mathematics) are remarkable for the possibility of precise analysis.
The tensor-train (TT) and the more general hierarchical Tucker decompositions were developed more recently, in the numerical-mathematics community and with particular attention to PDE problems. In fact, exactly the same and very similar representations had long been used for the numerical simulation of many-body quantum systems by computational chemists and physicists under the names of «matrix-product states» (MPS) and «multilayer multi-configuration time-dependent Hartree». These low-rank tensor decompositions are based on subspace approximation, which can be performed adaptively and iteratively, in a multilevel fashion. In the broader context of PDE problems, this leads to numerical methods that are formally based on generic discretizations but effectively operate on adaptive, data-driven discretizations constructed «online», in the course of computation. In several settings, such methods achieve the accuracy of sophisticated problem-specific methods.
***
The goal of the course is to introduce students to the foundations of modern low-rank tensor methods. The course is also intended to provide students with ample opportunity to start their own research.
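To make the idea of adaptive subspace approximation concrete, here is a minimal illustrative sketch of the TT-SVD procedure, which compresses a d-dimensional array by a sequence of truncated SVDs. It is written in Python with NumPy, is not part of the course materials, and the names used in it (tt_svd, tt_to_full, the tolerance tol) are assumptions chosen for this illustration only.

import numpy as np

def tt_svd(a, tol=1e-8):
    """Illustrative sketch: decompose the array `a` into tensor-train (TT)
    cores by sequential truncated SVDs; `tol` bounds the relative error
    in the Frobenius norm."""
    a = np.asarray(a, dtype=float)
    dims, d = a.shape, a.ndim
    # per-step truncation threshold giving an overall relative error <= tol
    delta = tol / np.sqrt(max(d - 1, 1)) * np.linalg.norm(a)
    cores, r_prev = [], 1
    c = a.reshape(r_prev * dims[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        # smallest rank whose discarded singular values stay below delta
        err_tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]
        r = max(1, int(np.sum(err_tail > delta)))
        cores.append(u[:, :r].reshape(r_prev, dims[k], r))
        c = (np.diag(s[:r]) @ vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(c.reshape(r_prev, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into a full array (to check the error)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=(-1, 0))
    return full.squeeze(axis=(0, -1))

# Example: a smooth function sampled on 2^10 grid points and «quantized»
# into a 10-dimensional 2 x 2 x ... x 2 array.
n = 2 ** 10
x = np.linspace(0.0, 1.0, n)
a = np.sin(20 * np.pi * x).reshape((2,) * 10)
cores = tt_svd(a, tol=1e-10)
print("TT ranks:", [core.shape[2] for core in cores[:-1]])
print("relative error:",
      np.linalg.norm(tt_to_full(cores) - a) / np.linalg.norm(a))

In this example the printed TT ranks remain small while the reconstruction error stays below the prescribed tolerance; this kind of compression behaviour, and its exploitation in data analysis and in adaptive PDE solvers, is what the course studies in detail.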
Assessment and permitted materials
Minimum requirements and assessment criteria
The theory and practice of the techniques presented in the course.
Examination topics
Reading list
Association in the course directory
MAMV
Last modified: Tu 05.10.2021 17:29