Optimal sampling in least-squares methods – applications in PDEs and inverse problems
Speaker: Prof. Dr. Albert Cohen
Affiliation: Laboratoire Jacques-Louis Lions, Sorbonne Université, France
Zoom meeting link
Meeting ID: 927 4962 1451 | PIN code: 534277
Abstract. Recovering an unknown function from point samples is a ubiquitous task in various application settings: non-parametric regression, machine learning, reduced modeling, response surfaces in computer or physical experiments, data assimilation, and inverse problems. In the first part of this lecture, I shall present a theoretical setting that yields recovery bounds in the context where the user is allowed to select the measurement points, sometimes referred to as active learning. These results allow us to derive an optimal sampling point distribution when the approximation is sought in a linear space of finite dimension n and computed by weighted least squares. Here optimal means both that the approximation is comparable to the best possible in this space, and that the sampling budget m barely exceeds n. The main tools involved are inverse Christoffel functions and matrix concentration inequalities.
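To illustrate the idea, here is a minimal numerical sketch of the weighted least-squares scheme described above, in a simple concrete instance chosen for illustration: the approximation space is spanned by the first n orthonormal Legendre polynomials on [-1, 1], samples are drawn from the density proportional to the inverse Christoffel function k_n(x) = Σ_j φ_j(x)², and the least-squares problem is weighted by w(x) = n / k_n(x). The target function, sampling budget, and rejection-sampling scheme are illustrative choices, not part of the lecture.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

n = 6     # dimension of the approximation space
m = 120   # sampling budget, a modest multiple of n

def basis(x):
    # orthonormal Legendre basis w.r.t. the uniform measure on [-1, 1]:
    # phi_j = sqrt(2j + 1) * P_j
    V = np.stack([legendre.legval(x, [0]*j + [1]) for j in range(n)], axis=-1)
    return V * np.sqrt(2*np.arange(n) + 1)

def christoffel(x):
    # inverse Christoffel function k_n(x) = sum_j phi_j(x)^2
    return (basis(x)**2).sum(axis=-1)

# draw m points from the optimal density k_n(x)/n (w.r.t. the uniform
# measure) by rejection sampling; here k_n <= n^2, so n is an envelope bound
samples = []
while len(samples) < m:
    t = rng.uniform(-1, 1)
    if rng.uniform(0, n) < christoffel(t) / n:
        samples.append(t)
x = np.array(samples)

# weighted least squares with weights w(x) = n / k_n(x)
f = lambda t: np.exp(t)   # toy target function
w = n / christoffel(x)
V = basis(x)
c, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * V, np.sqrt(w) * f(x), rcond=None)

# the computed approximation is close to the best degree-(n-1) approximation
grid = np.linspace(-1, 1, 200)
err = np.max(np.abs(basis(grid) @ c - f(grid)))
print(f"max error: {err:.2e}")
```

The weights exactly compensate the sampling bias, so the weighted Gram matrix concentrates around the identity once m exceeds n by logarithmic factors, which is what makes the near-best approximation guarantee possible with this small budget.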
In a second part, I shall cover some novel and ongoing developments building upon this approach. The first addresses the setting where the approximation space is not fixed but adaptively generated with growing dimension n, which requires a particular online sampling methodology. The second discusses the setting where the measurements are not point values but more general functionals, which may be thought of as point evaluations in a transformed space; the typical application settings are inverse problems and collocation methods for solving PDEs.