Solving/learning nonlinear PDEs and completing computational graphs with GPs

Houman Owhadi (Caltech, Pasadena)

Apr 05, 2022, 15:30 – 16:20

We first present a framework for solving and learning arbitrary nonlinear PDEs with Gaussian Processes (GPs). The method comes with convergence guarantees and inherits the complexity of state-of-the-art solvers for dense kernel matrices. This first part is joint work with Yifan Chen, Bamdad Hosseini, and Andrew Stuart.

We then present an encompassing GP framework for completing arbitrary computational graphs. This general framework is motivated by the observation that most problems in Computational Sciences and Engineering (CSE) can be described as completing (from data) a computational graph representing dependencies between functions and variables. Functions and variables may be known, unknown, or random. Data comes in the form of observations of distinct values of a finite number of subsets of the variables of the graph. The underlying problem combines a regression problem (approximating unknown functions) with a matrix completion problem (recovering unobserved variables in the data), and can be seen as solving nonlinear systems of equations in reproducing kernel Hilbert spaces from limited data and incomplete information.
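As a rough illustration of the first framework, the sketch below shows the basic kernel-collocation pattern behind GP-based PDE solving: represent the unknown solution with a kernel expansion, impose the nonlinear PDE and boundary conditions at collocation points, and drive the resulting nonlinear system to zero with a Newton iteration. This is not the speakers' implementation; the 1D model equation, the Gaussian kernel and its lengthscale, the collocation grid, the manufactured right-hand side, and the regularization are all assumptions made for this example.

```python
# Minimal sketch (illustrative only, not the speakers' implementation) of the
# kernel-collocation idea behind GP-based PDE solving, for the 1D problem
#   -u''(x) + u(x)^3 = f(x)  on (0, 1),   u(0) = u(1) = 0.
# The solution is represented as u(x) = sum_j alpha_j * k(x, x_j) with a
# Gaussian kernel; the PDE is enforced at interior collocation points, the
# boundary conditions at the endpoints, and the nonlinear system for alpha is
# solved by Newton iteration. All numerical choices here are assumptions.
import numpy as np

ell = 0.2                                        # kernel lengthscale (assumed)

def k(x, y):                                     # Gaussian kernel k(x, y)
    return np.exp(-(x - y) ** 2 / (2 * ell ** 2))

def k_xx(x, y):                                  # second derivative of k in x
    r = (x - y) / ell
    return (r ** 2 - 1) / ell ** 2 * np.exp(-r ** 2 / 2)

# Manufactured problem with exact solution u*(x) = sin(pi x):
u_star = lambda x: np.sin(np.pi * x)
f = lambda x: np.pi ** 2 * np.sin(np.pi * x) + np.sin(np.pi * x) ** 3

x = np.linspace(0.0, 1.0, 20)                    # collocation points (incl. boundary)
interior = (x > 0.0) & (x < 1.0)
X, Y = np.meshgrid(x, x, indexing="ij")
K, Kxx = k(X, Y), k_xx(X, Y)                     # u(x_i) = (K @ alpha)_i, u''(x_i) = (Kxx @ alpha)_i

alpha = np.zeros_like(x)                         # coefficients of the kernel expansion
for _ in range(50):                              # Newton iteration on the residual
    u = K @ alpha
    res = np.where(interior, -(Kxx @ alpha) + u ** 3 - f(x), u)    # PDE rows / BC rows
    J = np.where(interior[:, None], -Kxx + 3.0 * u[:, None] ** 2 * K, K)
    # Small regularization of the linearized system: dense kernel matrices
    # built from smooth kernels are typically very ill-conditioned.
    step = np.linalg.solve(J + 1e-8 * np.eye(len(x)), -res)
    alpha += step
    if np.linalg.norm(step) < 1e-10:
        break

print("max |u - u*| at collocation points:", np.max(np.abs(K @ alpha - u_star(x))))
```

The actual framework of Chen, Hosseini, Owhadi, and Stuart is formulated, roughly speaking, as an optimal recovery / MAP estimation problem in an RKHS, conditioning the GP on pointwise values of the solution and of the derivatives appearing in the PDE and solving the resulting finite-dimensional optimization with a Gauss-Newton type iteration and a small "nugget" regularization of the dense kernel matrix; the sketch above only mirrors that collocate-and-linearize structure.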


Further Information

Venue: ESI Boltzmann Lecture Hall
Recordings: Recording
Associated Event: Adaptivity, High Dimensionality and Randomness (Workshop)
Organizer(s): Carsten Carstensen (HU Berlin), Albert Cohen (Sorbonne U, Paris), Michael Feischl (TU Vienna), Christoph Schwab (ETH Zurich)