Date: Wednesday, November 18, 2020
Organized by: Chair for Dynamics, Control and Numerics – Alexander von Humboldt Professorship at FAU Erlangen-Nürnberg
Title: Statistical inverse problems and gradient flow structures in the space of probability measures

Speaker: Prof. Dr. Wolfgang Dahmen
Affiliation: University of South Carolina (USA)

Abstract. The exploration of complex physical or technological processes requires exploiting available information from different sources. So-called background information is typically based on physical laws, often represented as a family of parameter-dependent partial differential equations whose parameter range is meant to capture the states of interest. A second (external) source is data provided by measurement devices or sensors. The number of sensors is typically limited, and data acquisition may be expensive and in some cases even harmful. A proper fusion of classical (possibly deficient or uncalibrated) models with a limited set of data is therefore at the heart of a wide range of inversion tasks such as data assimilation and state or parameter estimation. This talk highlights some recent developments for such “Small-Data” scenarios, where inversion is strongly aggravated by the typically large parametric dimensionality. A guiding theme is to develop concepts that warrant a high degree of “predictive capability”. This requires a rigorous accuracy assessment in relation to the incurred computational cost. In this context, a central role is played by nonlinear reduced models whose certifiability hinges on a posteriori accuracy control. This, a proper adaptation of the reduced models to the external information, and the identification of intrinsic recovery limitations are shown to rely on exploiting intrinsic problem metrics already at the infinite-dimensional level. This, in turn, requires employing stable variational formulations of the PDE model. We discuss these concepts for two classes of reduced models, based either on (piecewise) Reduced Bases (RB) or on Deep Neural Networks (DNNs). This indicates, in particular, which version is expected to be preferable for which type of problem. If time permits, we present a problem class for which a superior expressive power of DNNs can be established rigorously, so as to avoid the curse of dimensionality.

Part of the material is based on joint work with A. Cohen, R. DeVore, J. Nichols, O. Mula, M. Wang, and Z. Wang.

Recording/Video:

