Universität Wien u:find

Due to the COVID-19 pandemic, short-notice changes to courses and examinations (e.g. cancellation of on-site teaching and a switch to online examinations) may be necessary. Register for courses/examinations via u:space, check the current status on u:find, and consult the moodle learning platform.

Further information on on-site teaching is available at https://studieren.univie.ac.at/info.

390036 UK VGSCO Derivative-Free Optimization: Basic Principles and State of the Art (2017W)

Continuous assessment course



max. 50 participants
Language: English



Block: 27.11 - 01.12.2017
Room: 3.307 (3rd Floor)

Monday 08.30-13.00
Tuesday 10.30-15.00
Wednesday 15.00-19.00
Thursday 08.30-11.00
Friday 08.30-13.00


Aims, contents and method of the course

This course deals with the numerical solution of optimization problems involving functions whose derivatives are unavailable, available only at prohibitive cost, or subject to noise, as may be the case when function evaluations result from computational simulation or laboratory observation.

In the field of Derivative-Free Optimization (DFO) one studies the development, analysis, and implementation of algorithms for such types of problems. A successful application of these algorithms requires them to be efficient (in the sense of taking a moderate number of function evaluations) and rigorous (meaning not too far from an algorithmic structure for which something meaningful can be proved under reasonable assumptions).

In this course we will describe the building blocks on which DFO algorithms are based and convey the intuition that helps one understand why they are globally convergent (meaning convergent to stationarity independently of the starting point). In some cases it is possible to derive worst-case complexity bounds on the number of iterations or function evaluations needed to reach some level of stationarity. Such global rates complement existing global convergence analyses by providing additional insight and comparisons.

We will essentially focus on two well-known classes of methods: direct-search methods and model-based (trust-region type) algorithms. While the former are typically less efficient, their directional nature makes them better tailored to non-smooth and constrained settings.
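To illustrate the directional nature of direct search, here is a minimal coordinate-search sketch in Python (an illustrative example, not the course's reference code): it polls the 2n coordinate directions, which form a positive spanning set, accepts any trial point with a lower function value, and halves the step size after an unsuccessful poll.

```python
import numpy as np

def coordinate_search(f, x0, alpha=1.0, tol=1e-6, max_evals=1000):
    """Minimal directional direct-search sketch: poll the 2n coordinate
    directions (a positive spanning set) and shrink the step on failure."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    n = x.size
    directions = np.vstack([np.eye(n), -np.eye(n)])  # {±e_1, ..., ±e_n}
    while alpha > tol and evals < max_evals:
        improved = False
        for d in directions:
            trial = x + alpha * d
            ft = f(trial)
            evals += 1
            if ft < fx:  # simple decrease; sufficient decrease is also common
                x, fx = trial, ft
                improved = True
                break
        if not improved:
            alpha *= 0.5  # unsuccessful poll: halve the step size
    return x, fx
```

Note that no derivative of f is ever evaluated; convergence relies only on the polling directions positively spanning the space and the step size being driven to zero.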

As time permits, we will also make connections to other methodologies (such as convex and sparse optimization, surrogate modeling, evolutionary optimization, multi-objective optimization, and global optimization), always from the point of view of DFO.

Assessment and permitted materials

A set of 400 slides will be provided to the students.

Further reading will come from papers and from the book:
A. R. Conn, K. Scheinberg, and L. N. Vicente, Introduction to Derivative-Free Optimization, MPS-SIAM Book Series on Optimization, SIAM, Philadelphia, 2009

Minimum requirements and assessment criteria

Good knowledge of Vector Calculus, Linear Algebra, and Numerical Analysis.
Basic knowledge of Nonlinear Optimization and Probability Theory.


Sampling and modeling (positive spanning, polynomial interpolation and regression, surrogate models).
Direct-search methods and non-smooth calculus.
Model-based trust-region methods.
Global convergence, global rates, randomness.
Coverage of other methodologies and software.
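To give a flavor of the sampling-and-modeling building block listed above, the following sketch (an illustrative helper, not course material) fits a determined linear interpolation model m(x) = c + gᵀx through n+1 sample points; the model gradient g then serves as a derivative-free surrogate for the gradient of f.

```python
import numpy as np

def linear_interpolation_model(f, points):
    """Fit m(x) = c + g @ x through n+1 sample points (determined linear
    interpolation), a basic modeling building block in DFO."""
    Y = np.asarray(points, dtype=float)       # sample set, shape (n+1, n)
    fvals = np.array([f(y) for y in Y])       # function values at the samples
    n = Y.shape[1]
    # Interpolation conditions: [1, y_i] @ [c; g] = f(y_i), i = 0, ..., n
    M = np.hstack([np.ones((n + 1, 1)), Y])
    coef = np.linalg.solve(M, fvals)
    c, g = coef[0], coef[1:]
    return c, g  # g approximates the gradient of f near the sample set
```

The quality of g depends on the geometry (poisedness) of the sample set: if the points become nearly affinely dependent, the matrix M becomes ill-conditioned and the model degrades, which is why practical model-based methods monitor and repair sample-set geometry.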


The following survey paper provides an introduction to the topic:

A. L. Custódio, K. Scheinberg, and L. N. Vicente, Methodologies and software for derivative-free optimization, Chapter 37 of Advances and Trends in Optimization with Engineering Applications, T. Terlaky, M. F. Anjos, and S. Ahmed (editors), MOS-SIAM Book Series on Optimization, SIAM, Philadelphia, 2017

A pdf file is available from my page: http://www.mat.uc.pt/~lnv/

Association in the course directory

Last modified: Mon 07.09.2020 15:46