Variational problems on L-infinity and continuum limits on graphs
Speaker: Dr. Leon Bungert
Affiliation: FAU Erlangen-Nürnberg, Germany
Organized by: FAU CAA-AvH, Chair in Applied Analysis – Alexander von Humboldt Professorship at FAU Erlangen-Nürnberg (Germany)
Zoom meeting link
Meeting ID: 623 9301 9630 | PIN code: 158499
Abstract. Modern machine learning techniques, and in particular deep learning, have surpassed classical methods in both efficiency and accuracy. On the other hand, many (semi-)supervised learning methods are inherently unstable with respect to noise or so-called adversarial examples, which hinders their use in safety-critical applications. A potential remedy for this drawback is to design Lipschitz-continuous, and hence stable, inference models. In this talk I will first speak about a graph-based semi-supervised learning approach called Lipschitz learning and study its continuum limit as the number of data points tends to infinity. Using Gamma-convergence, one can prove that minimizers converge to solutions of a variational problem in L-infinity. Then I will present a novel regularization algorithm for neural networks called CLIP, which penalizes large Lipschitz constants of a neural network during training by keeping track of a set of unstable points.
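To fix ideas, the discrete-to-continuum limit mentioned in the abstract can be sketched as follows. The notation here (the weights w_{xy}, the labeled set O, the boundary datum g) is chosen for this sketch and is not taken from the announcement; it is a schematic form of Lipschitz learning, not the speaker's exact formulation.

% Schematic form of the Lipschitz learning problem and its continuum limit.
% On a graph with vertex set \Omega_n \subset \Omega (the n data points) and
% edge weights w_{xy}, one minimizes the largest weighted difference quotient
% subject to the given labels:
\[
  E_n(u) \;=\; \max_{x,y \in \Omega_n} w_{xy}\,\lvert u(x) - u(y)\rvert,
  \qquad u = g \text{ on the labeled set } \mathcal{O}.
\]
% As n \to \infty (with a suitably scaled graph bandwidth), Gamma-convergence
% identifies the limiting variational problem in L-infinity:
\[
  E_\infty(u) \;=\; \lVert \nabla u \rVert_{L^\infty(\Omega)},
  \qquad u = g \text{ on } \mathcal{O},
\]
% whose minimizers the discrete minimizers converge to.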
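The announcement does not include implementation details for CLIP, but the idea of penalizing a Lipschitz estimate computed on a tracked set of unstable points can be illustrated with a minimal PyTorch sketch. All names and hyperparameters below (SimpleNet, pair_quotients, train_step, lam, ascent_lr) are placeholders chosen for this example, not the authors' reference implementation.

    # Illustrative sketch of Lipschitz-penalized training in the spirit of CLIP.
    import torch
    import torch.nn as nn

    class SimpleNet(nn.Module):
        def __init__(self, dim_in=2, dim_hidden=64, dim_out=2):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim_in, dim_hidden), nn.ReLU(),
                nn.Linear(dim_hidden, dim_out),
            )

        def forward(self, x):
            return self.net(x)

    def pair_quotients(model, x, x_adv):
        # Difference quotients ||f(x) - f(x')|| / ||x - x'|| on tracked pairs;
        # their maximum is a lower bound on the Lipschitz constant of f.
        num = (model(x) - model(x_adv)).flatten(1).norm(dim=1)
        den = (x - x_adv).flatten(1).norm(dim=1) + 1e-9
        return num / den

    def train_step(model, opt, x, y, x_lip, x_lip_adv, lam=0.1, ascent_lr=1e-2):
        # 1) A few gradient-ascent steps move the tracked points toward
        #    (approximately) maximizing the difference quotients.
        x_lip_adv = x_lip_adv.clone().requires_grad_(True)
        for _ in range(3):
            quot_sum = pair_quotients(model, x_lip, x_lip_adv).sum()
            grad, = torch.autograd.grad(quot_sum, x_lip_adv)
            x_lip_adv = (x_lip_adv + ascent_lr * grad.sign()).detach().requires_grad_(True)
        # 2) Penalize the resulting Lipschitz estimate alongside the data loss.
        loss = nn.functional.cross_entropy(model(x), y)
        loss = loss + lam * pair_quotients(model, x_lip, x_lip_adv.detach()).max()
        opt.zero_grad()
        loss.backward()
        opt.step()
        return loss.item(), x_lip_adv.detach()

    # Example usage with synthetic 2-D data (all values illustrative):
    model = SimpleNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(128, 2); y = (x[:, 0] > 0).long()
    x_lip = torch.randn(32, 2)
    x_lip_adv = x_lip + 0.01 * torch.randn_like(x_lip)
    loss, x_lip_adv = train_step(model, opt, x, y, x_lip, x_lip_adv)

In this sketch the perturbed points x_lip_adv are carried over between training steps, which is one plausible way to "keep track of the set of unstable points" as the abstract describes; the actual CLIP algorithm may manage this set differently.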