BEGIN:VCALENDAR
VERSION:2.0
METHOD:PUBLISH
CALSCALE:GREGORIAN
PRODID:-//WordPress - MECv7.15.0//EN
X-ORIGINAL-URL:https://dcn.nat.fau.eu/
X-WR-CALNAME:
X-WR-CALDESC:FAU DCN-AvH. Chair for Dynamics, Control, Machine Learning and Numerics -Alexander von Humboldt Professorship
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-PUBLISHED-TTL:PT1H
X-MS-OLK-FORCEINSPECTOROPEN:TRUE
BEGIN:VEVENT
CLASS:PUBLIC
UID:MEC-783900b1dad49f8e7665ebea90ccdfcc@dcn.nat.fau.eu
DTSTART:20230530T120000Z
DTEND:20230530T130000Z
DTSTAMP:20230505T211500Z
CREATED:20230505T000000Z
LAST-MODIFIED:20230515T000000Z
PRIORITY:5
SEQUENCE:0
TRANSP:OPAQUE
SUMMARY:FAU MoD Lecture: From Physics-Informed Machine Learning to Physics-Informed Machine Intelligence: Quo Vadimus?
DESCRIPTION:Date: Tue. May 30, 2023\nEvent: FAU MoD Lecture\nOrganized by: FAU MoD, Research Center for Mathematics of Data at Friedrich-Alexander-Universität Erlangen-Nürnberg (Germany)\nFAU MoD Lecture: From Physics-Informed Machine Learning to Physics-Informed Machine Intelligence:\nQuo Vadimus?\nSpeaker: Prof. Dr. George Em Karniadakis (GS h-index 129)\nThe Charles Pitts Robinson and John Palmer Barstow Professor of Applied Mathematics and Engineering, Brown University. Also @MIT & PNNL\nAbstract. We will review physics-informed neural networks (NNs) and summarize available extensions for applications in computational science and engineering. We will also introduce new NNs that learn functionals and nonlinear operators from functions and corresponding responses for system identification. The universal approximation theorem of operators is suggestive of the potential of NNs in learning from scattered data any continuous operator or complex system. We first generalize the theorem to deep neural networks, and subsequently we apply it to design a new composite NN with small generalization error, the deep operator network (DeepONet), consisting of a NN for encoding the discrete input function space (branch net) and another NN for encoding the domain of the output functions (trunk net). We demonstrate that DeepONet can learn various explicit operators, e.g., integrals, Laplace transforms and fractional Laplacians, as well as implicit operators that represent deterministic and stochastic differential equations. More generally, DeepONet can learn multiscale operators spanning many scales and trained by diverse sources of data simultaneously.
Finally, we will present first results on the next generation of these architectures: biologically plausible designs based on spiking neural networks and Hebbian learning that are more efficient and closer to human intelligence.\nWHERE?\nOnline:\nZoom meeting link\n(Meeting ID: 614 4658 1599 | PIN code: 914397)\nBio: George Karniadakis is from Crete. He is a member of the National Academy of Engineering and a Vannevar Bush Faculty Fellow. He received his S.M. and Ph.D. from the Massachusetts Institute of Technology (1984/87). He was appointed Lecturer in the Department of Mechanical Engineering at MIT and subsequently joined the Center for Turbulence Research at Stanford/NASA Ames. He joined Princeton University as Assistant Professor in the Department of Mechanical and Aerospace Engineering and as Associate Faculty in the Program in Applied and Computational Mathematics. He was a Visiting Professor in the Aeronautics Department at Caltech in 1993 and joined Brown University as Associate Professor of Applied Mathematics in the Center for Fluid Mechanics in 1994. After becoming a full professor in 1996, he continued to be a Visiting Professor and Senior Lecturer of Ocean/Mechanical Engineering at MIT. He is an AAAS Fellow (2018-), Fellow of the Society for Industrial and Applied Mathematics (SIAM, 2010-), Fellow of the American Physical Society (APS, 2004-), Fellow of the American Society of Mechanical Engineers (ASME, 2003-) and Associate Fellow of the American Institute of Aeronautics and Astronautics (AIAA, 2006-). He received the SIAM/ACM Prize in Computational Science and Engineering (2021), the Alexander von Humboldt Award (2017), the SIAM Ralph E. Kleinman Prize (2015), the J. Tinsley Oden Medal (2013), and the Computational Fluid Dynamics Award (2007) from the US Association for Computational Mechanics.
His h-index is 129 and he has been cited over 81,000 times.\n\nView poster\nThis event on LinkedIn\nYou might like:\n• FAU MoD Lecture Series\n• FAU MoD Lecture: From Alan Turing to contact geometry: Towards a “Fluid computer” by Prof. Dr. Eva Miranda\n• FAU MoD Lecture: Applications of AAA Rational Approximation by Prof. Dr. Nick Trefethen\n• FAU MoD Lecture: Learning-Based Optimization and PDE Control in User-Assignable Finite Time by Prof. Dr. Miroslav Krstic\n_\nDon’t miss our latest news and connect with us!\nFAU DCN-AvH: LinkedIn | Twitter | Instagram\nFAU MoD: LinkedIn | Twitter | Instagram\n
URL:https://dcn.nat.fau.eu/events/fau-mod-lecture-from-physics-informed-machine-learning-to-physics-informed-machine-intelligence-quo-vadimus/
ORGANIZER;CN=FAU MoD:MAILTO:
CATEGORIES:FAU MoD Lecture,Seminar/Talk
LOCATION:Worldwide
ATTACH;FMTTYPE=image/png:https://dcn.nat.fau.eu/wp-content/uploads/FAUMoDLectureSeries_gKarniadakis_30may2023.png
END:VEVENT
END:VCALENDAR