BEGIN:VCALENDAR
VERSION:2.0
METHOD:PUBLISH
CALSCALE:GREGORIAN
PRODID:-//WordPress - MECv7.32.0//EN
X-ORIGINAL-URL:https://dcn.nat.fau.eu/
X-WR-CALNAME:
X-WR-CALDESC:FAU DCN-AvH. Chair for Dynamics, Control, Machine Learning and Numerics -Alexander von Humboldt Professorship
X-WR-TIMEZONE:Europe/Berlin
BEGIN:VTIMEZONE
TZID:Europe/Berlin
X-LIC-LOCATION:Europe/Berlin
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T020000
RRULE:FREQ=YEARLY;BYMONTH=03;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T030000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-PUBLISHED-TTL:PT1H
X-MS-OLK-FORCEINSPECTOROPEN:TRUE
BEGIN:VEVENT
CLASS:PUBLIC
UID:MEC-88083dbfcee5080bb9f38e21107efd08@dcn.nat.fau.eu
DTSTART;TZID=Europe/Berlin:20251030T160000
DTEND;TZID=Europe/Berlin:20251030T170000
DTSTAMP:20251101T005857Z
CREATED:20251101T005857Z
LAST-MODIFIED:20251101T005857Z
PRIORITY:5
SEQUENCE:9
TRANSP:OPAQUE
SUMMARY:MLDS: Machine Learning\, Control Theory\, and PDEs: Foundations and Advances from Variational Pathologies to Diffusion Models for Generative AI
DESCRIPTION:Event: MLDS\, Machine Learning and Dynamical Systems Seminar\nDate: Thu. October 30\, 2025 at 17:00H (local time)\nOrganizer: The Alan Turing Institute\nInaugural Lecture: Machine Learning\, Control Theory\, and PDEs: Foundations and Advances from Variational Pathologies to Diffusion Models for Generative AI\nSpeaker: Prof. Enrique Zuazua\, FAU – Friedrich-Alexander-Universität Erlangen-Nürnberg (Germany)\nAbstract. This lecture explores the growing interplay between machine learning\, control theory\, and partial differential equations (PDEs)\, three areas that together shape our understanding of modern artificial intelligence.\nIn the first part\, we examine how neural networks can be interpreted as controlled dynamical systems\, establishing deep analogies between training processes and optimal control problems. Concepts such as controllability\, observability\, and stabilization provide a rigorous framework to understand the dynamics of learning\, the role of architecture depth\, and the expressivity of modern models. This control-theoretic viewpoint offers valuable insights into optimization\, generalization\, and deep learning.\nIn the second part\, we turn to PDE-based perspectives arising in the numerical and generative facets of machine learning. We explore two examples of this growing interplay. In the first example\, we uncover variational pathologies that arise when neural networks replace the classical finite element approach for solving PDEs. The loss of familiar mathematical properties\, such as convexity and coercivity\, can lead to unexpected and sometimes striking anomalies in approximation and stability.\nIn the second example\, we turn to diffusion-based generative AI models\, now at the core of modern image and text synthesis. We show how Li–Yau-type parabolic inequalities\, originally developed in the analysis of heat flow\, shed light on their behaviour and remarkable effectiveness.\nTogether\, these viewpoints show how mathematical analysis and control theory can illuminate the mechanisms of modern learning systems\, and how\, conversely\, learning frameworks inspire new questions in dynamics\, control\, and PDE theory.\nWHEN\nThu. October 30\, 2025 at 17:00H (local time)\nWHERE\nOnline (Zoom)\n
URL:https://dcn.nat.fau.eu/events/mlds-machine-learning-control-theory-and-pdes/
CATEGORIES:EZuazua,Seminar/Talk
ATTACH;FMTTYPE=image/png:https://dcn.nat.fau.eu/wp-content/uploads/theAlanTuringInstitute_EZuazua_30oct2025.png
END:VEVENT
END:VCALENDAR
