Computational Uncertainty Quantification: Mathematical Foundations, Methodology & Data

This ESI Thematic Programme (TP) will bring together at the ESI leading researchers from applied mathematics, scientific computing and high-dimensional computational statistics around the emerging area of numerical uncertainty quantification (UQ for short) in engineering and in the sciences. The TP will concentrate on the mathematical foundations of novel computational strategies for the efficient numerical approximation of PDEs with uncertain inputs, as well as on the analysis of statistical methodologies for the high-dimensional statistical data resulting from such PDE simulations. Both forward and inverse problems will be considered.

The formal programme will consist of five embedded workshops:

WS1  "Multilevel and multifidelity sampling methods in UQ for PDEs",  schedule (pdf)
organizers: Karen Willcox, Rob Scheichl, Fabio Nobile, Kody Law
May 2 to May 6, 2022

WS2  "Approximation of high-dimensional parametric PDEs in forward UQ",  schedule (pdf)
organizers: Albert Cohen, Fabio Nobile, Catherine Powell, Christoph Schwab, Lorenzo Tamellini
May 9 to May 13, 2022

WS3  "PDE-constrained Bayesian inverse problems: Interplay of spatial statistical models with Machine Learning in PDE discretizations", schedule (pdf)
organizers: Sebastian Reich, Christoph Schwab, Andrew M. Stuart, Sara van de Geer 
May 16 to May 20, 2022

WS4  "Statistical estimation and deep learning in UQ for PDEs", schedule (pdf)
organizers: Francis Bach, Clemens Heitzinger, Johannes Schmidt-Hieber, Sara van de Geer
May 30  to June 3, 2022

WS5  "UQ in kinetic and transport equations and in high-frequency wave propagation", schedule (pdf)
organizers: Liliana Borcea, Giacomo Dimarco, Clemens Heitzinger, Shi Jin, Robert Scheichl, Euan Spence
June 13  to June 17, 2022 

The workshops will be held in hybrid mode via the Zoom platform. Zoom coordinates will be provided to participants by email shortly before the beginning of each event.

For further details and updates to the talks in each of these workshops, see the online schedule.

Summary.  Upon placing (prior) probability measures on input parameter spaces, randomized (sampling) approximations can be employed to sample from the parametric solution manifolds: the proposed thematic program will, therefore, have one focus on Monte Carlo and quasi-Monte Carlo methods for high-dimensional random inputs, with particular attention to multilevel strategies. Other algorithmic techniques to be considered will include adaptive ("stochastic") collocation and Galerkin methods, in particular combined with Model Order Reduction (MOR), Reduced Basis Methods (RBM), low-rank approximations in tensor formats and compressed sensing based algorithms.

Another focus will be statistical modelling of large-scale, spatially or temporally heterogeneous data for use as inputs of random PDEs. Regression and least-squares based methodologies from high-dimensional statistics will be analyzed in the particular case of noisy responses of PDE outputs, and one workshop will be dedicated to kernel and machine learning based approximations of input-output maps for PDEs with high-dimensional inputs, as well as to new directions at the intersection of UQ and machine learning in general. While engineering models such as diffusion, acoustic, elastic and electromagnetic wave propagation and viscous flow will be foundational applications, extensions to kinetic and more general integro-differential equations with random input data will be considered.

Application areas will include computational directions in life sciences, medicine, geosciences, quantum chemistry, nanotechnology, computational mechanics and aerospace engineering.

 

Abstract WS1: Multilevel and multifidelity sampling methods in UQ for PDEs

This workshop will cover multilevel and multifidelity sampling methods in uncertainty quantification for PDEs, with a particular focus on moving beyond forward propagation of uncertainty.

Multilevel sampling approaches, such as multilevel Monte Carlo, multilevel quasi-Monte Carlo and multilevel stochastic collocation, to name but a few, provide a powerful and attractive way to carry out uncertainty propagation and Bayesian inference in random and parametric PDEs. These methods exploit the natural hierarchies of numerical approximations (mesh size, polynomial degree, truncations of expansions, regularisations, model order reduction) to efficiently tackle the arising high- or infinite-dimensional quadrature problems. Their efficiency is based on variance reduction through a systematic use of control variates and importance sampling (in the stochastic setting) and of the sparse grid recombination idea (in the deterministic setting). The variance reduction is intrinsically tied to a priori and a posteriori error control in the numerical approximations, which in turn has spawned a resurgence in fundamental research in the mathematical and numerical analysis of PDEs with spatiotemporally heterogeneous data.
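
For orientation, the core multilevel Monte Carlo identity can be written down in a few lines (notation introduced here, not taken from the programme text). With Q_\ell denoting the quantity of interest computed on discretization level \ell, the telescoping sum

\mathbb{E}[Q_L] \;=\; \mathbb{E}[Q_0] \;+\; \sum_{\ell=1}^{L} \mathbb{E}[Q_\ell - Q_{\ell-1}]

is estimated term by term with independent Monte Carlo averages,

\widehat{Q}_{\mathrm{MLMC}} \;=\; \sum_{\ell=0}^{L} \frac{1}{N_\ell} \sum_{i=1}^{N_\ell} \bigl( Q_\ell^{(i)} - Q_{\ell-1}^{(i)} \bigr), \qquad Q_{-1} := 0.

Since the correction variances V_\ell = \mathrm{Var}(Q_\ell - Q_{\ell-1}) decay with the level, most samples can be placed on coarse, cheap levels; the classical allocation N_\ell \propto \sqrt{V_\ell / C_\ell}, with C_\ell the cost per sample on level \ell, balances the per-level contributions to variance and cost.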

A related body of work has focused on combining models of varying fidelity (such as 1D and 3D models, reduced-order models, differing physical assumptions, data-fit surrogate models, and look-up tables of experimental results) in multifidelity uncertainty quantification methods. These methods similarly use control variate formulations (the multifidelity Monte Carlo method) and importance sampling. This multifidelity setting differs from the multilevel setting above because the different models do not need to form a fixed hierarchy. The relationships among the models are typically unknown a priori and are learned adaptively as the computation advances.
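
To illustrate the control-variate mechanism, the following sketch implements a minimal two-fidelity control-variate estimator in which the weight is learned from shared samples. All function names are hypothetical placeholders: hi_model could stand for a fine-mesh PDE solve and lo_model for a reduced-order or data-fit surrogate.

import numpy as np

def mf_control_variate(hi_model, lo_model, sample_inputs, n_hi, n_lo):
    """Two-fidelity control-variate Monte Carlo estimate of E[hi_model(Z)].

    hi_model, lo_model : callables mapping a parameter sample to a scalar QoI
                         (placeholders for an expensive and a cheap model)
    sample_inputs      : callable n -> array of n parameter samples
    n_hi, n_lo         : number of high- and low-fidelity evaluations (n_lo >= n_hi)
    """
    z_lo = sample_inputs(n_lo)          # large low-fidelity sample set
    z_hi = z_lo[:n_hi]                  # shared subsample evaluated by both models
    f_hi = np.array([hi_model(z) for z in z_hi])
    f_lo_shared = np.array([lo_model(z) for z in z_hi])
    f_lo_all = np.array([lo_model(z) for z in z_lo])
    # control-variate weight estimated from the shared samples
    cov = np.cov(f_hi, f_lo_shared)
    alpha = cov[0, 1] / cov[1, 1]
    # high-fidelity sample mean, corrected by the cheap discrepancy between
    # the low-fidelity means over the large and the shared sample sets
    return f_hi.mean() + alpha * (f_lo_all.mean() - f_lo_shared.mean())

When the low-fidelity model is cheap and well correlated with the high-fidelity one, the correction term removes most of the variance of the plain n_hi-sample Monte Carlo estimate.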

All these methodologies have most notably been developed in the context of forward propagation of uncertainty, or quadrature with respect to a known (prior) distribution, but they have also been extended to inverse problems and intractable (posterior) distributions that arise when incorporating data in a Bayesian inference framework, and to optimization under uncertainty.

Abstract WS2: Approximation of high-dimensional parametric PDEs in forward UQ
This workshop focuses on efficient numerical methods for the propagation of uncertainty in partial differential equations (PDEs). Of primary interest is the case where the uncertain inputs belong to separable Banach spaces, a situation which commonly arises when inputs to PDE models are represented as random processes or fields. The task of quantifying numerically the uncertainty in the PDE solution can be recast as an approximation problem for a parametric family of PDE solutions, seen as a map between data and solution (Banach) spaces. One therefore deals with deterministic maps defined on potentially very high-dimensional or infinite-dimensional parameter spaces.

Key mathematical questions to be addressed in this workshop concern regularity and compressibility of random input data (measured, for instance, in terms of summability of expansion coefficients over a basis or a dictionary), compressibility results for the corresponding data-to-solution map (for instance, sparsity estimates of generalized polynomial chaos expansions of the parametric family of responses), bounds on the Kolmogorov n-width of the solution manifold, as well as low rank estimates in tensor-formatted representations, etc.
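
As a concrete model problem (standard in this literature, not specific to the workshop), consider the affine-parametric diffusion equation

-\nabla\cdot\bigl(a(x,y)\,\nabla u(x,y)\bigr) = f(x), \qquad a(x,y) = \bar a(x) + \sum_{j\ge 1} y_j\,\psi_j(x), \quad y = (y_j)_{j\ge 1} \in [-1,1]^{\mathbb{N}}.

The data-to-solution map y \mapsto u(\cdot,y) then admits a generalized polynomial chaos expansion u(\cdot,y) = \sum_{\nu\in\mathcal{F}} u_\nu\, P_\nu(y) over tensorized (e.g. Legendre) polynomials indexed by finitely supported multi-indices \nu; decay of the fluctuations \psi_j translates into \ell^p-summability of the coefficient norms (\|u_\nu\|) for some p < 1, and it is exactly such sparsity estimates that serve as benchmarks for the approximation schemes discussed below.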

Theoretical sparsity bounds on solutions to parametric PDEs provide benchmarks for high-dimensional approximation schemes, a wide variety of which have emerged in recent years in computational UQ. Such methodologies include Stochastic Galerkin and Collocation methods, Model Order Reduction (MOR) and Reduced Basis (RB) methods, tensor-formatted numerical linear algebra techniques, compressed sensing and LASSO regularisation, kernel-based approximations and Gaussian Process regression. Recent advances on convergence and complexity results for such approximation schemes in the high-dimensional setting will be presented.

Abstract WS3: PDE-constrained Bayesian inverse problems: interplay of spatial statistical models with advanced PDE discretizations
Numerical Bayesian inversion of PDE models with uncertain input data has gained substantial momentum within the general area of data-driven computational UQ in recent years.

Bayesian analysis consists in computing expectations of so-called Quantities of Interest (QoIs for short), constrained by forward PDE models, under a prior probability measure on the uncertain PDE inputs, conditioned on possibly massive, noisy and/or redundant data.
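
In the standard formulation (notation introduced here for concreteness), with prior \pi_0 on the uncertain input u, observed data y, and a negative log-likelihood (potential) \Phi(u;y) determined by the forward PDE and the noise model, the posterior and the target expectations read

\frac{d\pi^{y}}{d\pi_0}(u) = \frac{e^{-\Phi(u;y)}}{Z(y)}, \qquad Z(y) = \int e^{-\Phi(u;y)}\, d\pi_0(u), \qquad \mathbb{E}^{\pi^{y}}[\mathrm{QoI}] = \frac{1}{Z(y)} \int \mathrm{QoI}(u)\, e^{-\Phi(u;y)}\, d\pi_0(u).

Every evaluation of \Phi requires a forward PDE solve, which is what makes the sampling and integration methods discussed next computationally demanding.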

Prominent examples include climate and weather forecasting, subsurface flow, and social media, biomedical and genomic data; variants include UQ in classification.

Standard numerical approaches are Markov chain Monte Carlo (MCMC) methods and their variants. Alternative approaches include deterministic high-dimensional integration methods of quasi-Monte Carlo type or variational techniques. While fundamental advances have been made in understanding the convergence of these methods, running such algorithms on complex forward PDE models and large data, possibly streamed in real time, entails prohibitive computational work.

This WS will therefore explore the analysis and implementation of computational acceleration strategies, including novel approximation techniques driven by advances in machine learning.

One acceleration of computational MCMC in UQ for Bayesian PDE inversion and data assimilation is based on running MCMC on small-scale PDE surrogate models, obtained, e.g., by MOR or reduced basis methods; this approach raises the issue of how to perform PDE MOR with certification for all states within reach of the sampler. Other acceleration methodologies comprise (semi-)supervised machine learning surrogates of Bayesian posteriors, and reparametrization of the Bayesian posterior through a transport of measures. Novel approximation techniques include random feature maps, reservoir computing, deep neural networks, and tensor trains. Furthermore, highly nonlinear and complex PDE-based forward models stand to benefit from recent advances in derivative-free inference methods and affine-invariant interacting particle samplers.
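
A minimal sketch of the first acceleration idea, two-stage ("delayed acceptance") Metropolis-Hastings with surrogate pre-screening, is given below; the function names and the Gaussian random-walk proposal are illustrative assumptions, not part of the workshop material. Only proposals that pass the cheap surrogate stage trigger an expensive forward PDE solve, and the second stage corrects for the surrogate error so the chain still targets the exact posterior.

import numpy as np

def delayed_acceptance_mh(log_post_exact, log_post_surrogate, theta0, n_steps,
                          step_size=0.1, rng=np.random.default_rng(0)):
    """Two-stage Metropolis-Hastings with a cheap surrogate pre-screening.

    log_post_exact     : expensive log-posterior (each call implies a PDE solve)
    log_post_surrogate : cheap approximation (e.g. an MOR / reduced-basis / NN surrogate)
    """
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    lp_exact = log_post_exact(theta)
    lp_surr = log_post_surrogate(theta)
    chain = [theta.copy()]
    for _ in range(n_steps):
        prop = theta + step_size * rng.standard_normal(theta.shape)
        lp_surr_prop = log_post_surrogate(prop)
        # Stage 1: screen with the surrogate only (no PDE solve yet)
        if np.log(rng.uniform()) < lp_surr_prop - lp_surr:
            lp_exact_prop = log_post_exact(prop)
            # Stage 2: correct with the exact model so the chain still
            # targets the exact posterior
            log_alpha = (lp_exact_prop - lp_exact) - (lp_surr_prop - lp_surr)
            if np.log(rng.uniform()) < log_alpha:
                theta, lp_exact, lp_surr = prop, lp_exact_prop, lp_surr_prop
        chain.append(theta.copy())
    return np.array(chain)

The closer the surrogate is to the exact posterior, the fewer expensive solves are wasted on proposals that would be rejected anyway.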

In addition to these algorithm-driven themes, the WS will address the principled selection of prior distributions and their impact on the posterior consistency of the QoIs from a frequentist perspective.

Overall, this WS will explore recent foundational and application advances in PDE-constrained Bayesian inference in step with WS2 and WS4.

Abstract WS4: Statistical estimation and deep learning in UQ for PDEs
Both uncertainty quantification and machine learning extract information from noisy data. In machine learning the data are usually outcomes of some random mechanism. Random data arising in forward UQ, on the other hand, are generated as solutions of PDEs with uncertain coefficients, and exhibit the additional structure of PDE solution manifolds.

Deep neural networks (DNN) have empirically been shown to perform well in various supervised machine learning tasks. Recently, some important steps towards a theoretical understanding of deep learning algorithms have been taken. For example, insight has been gained concerning their approximation properties in terms of network depth for various function classes.

In this workshop, we will survey the state of the art in theoretical approximation results for DNNs, in particular for many-parametric data-to-solution maps of PDEs with uncertain inputs from function spaces, with the aim of advancing statistical methodologies specifically tailored to the corresponding parametric manifolds of PDE responses.

In particular, given a description of the "richness" of a neural network, one may use statistical machinery to evaluate its performance on noisy data. Moreover, regularization methods allow one to deal with the many parameters in the network and to improve generalization. Closely related are Bayesian approaches, where the prior serves as a regularizer. An important question, for example, is whether such methods allow unsupervised learning of the sparsity of a DNN on PDE solution manifolds or, if not, whether semi-supervised learning methods can be designed based on a priori information on the sparsity structure of PDE solution manifolds.
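
As a toy illustration of the regularization point, one can fit a small deep network to noisy samples of a parametric quantity of interest with an l2 penalty (weight decay) on its many parameters. The response map, network size and hyperparameters below are invented for the example and are not from the workshop; in practice the training data would be noisy evaluations of a PDE output at sampled parameter values.

import torch

# Toy stand-in for a noisy parametric PDE response y |-> g(y) + noise.
def noisy_response(y, noise=0.05):
    g = torch.sin(y.sum(dim=1, keepdim=True)) + 0.5 * (y[:, :1] ** 2)
    return g + noise * torch.randn_like(g)

torch.manual_seed(0)
d = 8                                   # parameter dimension
y_train = torch.rand(512, d) * 2 - 1    # samples from U([-1,1]^d)
q_train = noisy_response(y_train)

model = torch.nn.Sequential(
    torch.nn.Linear(d, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
)
# weight_decay acts as an l2 regularizer on the (many) network parameters
opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
loss_fn = torch.nn.MSELoss()

for epoch in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(y_train), q_train)
    loss.backward()
    opt.step()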

Research in deep learning involves information and approximation theory, in conjunction with scientific computing and statistics. A key aim of this workshop is to foster interaction between experts in computational PDE UQ and leaders in high-dimensional mathematical statistics, in order to explore the theory and application of state-of-the-art high-dimensional statistical methods to data resulting from uncertainty propagation in PDEs.

Abstract WS5: UQ in kinetic and transport equations and in high-frequency wave propagation
This workshop will focus on uncertainty quantification (UQ) in kinetic equations and in high-frequency wave propagation. While both areas have very active international research communities in analysis and numerical analysis, with a strong presence also in Vienna, both are fairly new application areas for UQ. The two areas are also related through mathematical tools such as the Wigner transform, which gives rise to a kinetic radiative transfer equation for the high-frequency wave equation in random media. Bringing together researchers from these two communities will therefore be mutually beneficial.


The goal of the workshop will be to foster interaction between the UQ experts present at the program and domain specialists in the two application areas. The key aims of the workshop will be to study (i) kinetic models such as the Boltzmann transport equation (among others) where uncertainty often arises in the characterization of particle interactions and other model parameters and (ii) wave scattering problems, where uncertainty arises, for example, in parameters describing the medium or in the geometry of the scatterer. 
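
A prototypical model problem of type (i), written here only for orientation, is a linear transport equation whose scattering coefficient carries the uncertainty:

\partial_t f + v\cdot\nabla_x f \;=\; \frac{\sigma(x,z)}{\varepsilon}\bigl(\langle f\rangle - f\bigr), \qquad \langle f\rangle(t,x,z) = \frac{1}{|\mathcal{V}|}\int_{\mathcal{V}} f(t,x,v,z)\,dv,

where f = f(t,x,v,z) is the particle density, z a random variable modelling, e.g., an uncertain scattering cross section, and \varepsilon the Knudsen number. Uncertainty in \sigma and in the initial and boundary data propagates to all moments of f, and the small-\varepsilon (diffusive) regime places additional, regime-dependent demands on the stochastic discretization. Analogous questions arise in (ii), for instance for a Helmholtz problem with a random wave speed or random scatterer geometry.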

The talks will present and formulate central questions of uncertainty and stochasticity in those models, as well as existing approaches to handle them analytically and numerically. This will include theoretical questions of existence, uniqueness, regularity, inversion, and hypocoercivity, as well as numerical aspects such as efficient solvers, approximation, and quadrature, especially in high dimensions. Applications will include all areas where kinetic equations and wave equations have proven useful, such as quantum mechanics, waves in random media and imaging, and, more generally, engineering, biology, and economics.

May 2, 2022
08:45 — 09:15
Registration & Welcome to WS 1 on "Multilevel and multifidelity sampling methods in UQ for PDEs"
09:15 — 10:00
Abdul-Lateef Haji-Ali (Heriot-Watt U, Edinburgh)
Multilevel Path Branching for Digital Options
10:00 — 10:45
10:45 — 11:15
Coffee Break
11:15 — 12:00
12:45 — 14:15
Lunch Break
15:00 — 15:25
Coffee Break
15:25 — 15:40
Robert Scheichl (U Heidelberg)

Introduction to discussion & summary of results from 2020 ESI workshop

15:40 — 16:30
Discuss Talks & Create Ideas
16:30 — 16:55
Robert Scheichl (U Heidelberg)

Collect Ideas in Plenum

May 3, 2022
10:30 — 11:00
Coffee Break
11:00 — 11:30
11:30 — 12:20
12:20 — 14:00
Lunch Break
14:45 — 15:15
Coffee Break
15:15 — 16:15
Discuss Talks & Create Ideas
16:15 — 16:45
Fabio Nobile (EPFL Lausanne)

Collect Ideas in Plenum

May 4, 2022
09:00 — 09:30
Robert Scheichl (U Heidelberg)

Cluster Ideas & Form Groups

09:30 — 10:30
Work in Groups
10:30 — 11:00
Coffee Break
11:00 — 12:30
Work in Groups
12:30 — 14:00
Lunch Break
14:30 — 15:15
15:15 — 15:45
Coffee Break
15:45 — 16:15
Parisa Khodabakhshi (U of Texas, Austin)
Multifidelity Uncertainty Quantification for Nonlocal Models
18:30 — 22:00
Workshop Dinner at "Heuriger Weingut Wolff"
May 5, 2022
09:00 — 10:30
Work in Groups
10:30 — 11:00
Coffee Break
11:00 — 12:30
Work in Groups
12:30 — 14:00
Lunch Break
14:45 — 15:15
Coffee Break
15:45 — 16:15
16:15 — 16:45

- online

May 6, 2022
09:00 — 09:45
Work in Groups / Preparing Presentations
10:15 — 10:30
Abdul-Lateef Haji-Ali (Heriot-Watt U, Edinburgh)
Theme G "Extensions of Multilevel Delayed Acceptance"
10:30 — 11:00
Coffee Break
11:00 — 11:15
Niklas Baumgarten (KIT, Karlsruhe), Linus Seelinger (U Heidelberg)
Theme I "Software Benchmarking and Method Comparison"
11:15 — 11:45
11:45 — 12:30
Hermann Matthies (TU Braunschweig)
What is a Sample?
12:30 — 14:00
Lunch Break
14:00 — 14:45
Elisabeth Ullmann (TU Munich)
Rare event estimation with PDE-based models

Slides available upon request

14:45 — 15:00
Robert Scheichl (U Heidelberg) & Fabio Nobile (EPFL Lausanne)

Final Discussions & Concluding Remarks

May 9, 2022
09:30 — 10:00
Registration for WS2 on "Approximation of high-dimensional parametric PDEs in forward UQ"
10:00 — 10:30
Coffee & Free Time for Discussion
11:20 — 12:10
André Uschmajew (MPI MIS, Leipzig)
Dynamical low-rank approximation for parabolic problems
12:10 — 13:50
Lunch Break
13:50 — 14:40
Christian Rieger (Philipps U, Marburg)
Kernel based reconstruction for Bayesian inverse problems
14:40 — 15:10
Coffee Break
16:00 — 16:50
Clayton Webster (U of Texas, Austin)

Sparse polynomial approximation of high-dimensional functions from random samples (see the recording of his talk given at the Workshop on "Adaptivity, High Dimensionality and Randomness", April 5, 2022)

May 10, 2022
09:30 — 10:30
Coffee and Free Time for Discussion
12:10 — 13:50
Lunch Break
13:50 — 14:40
Matthieu Dolbeault (Sorbonne U, Paris)
A sharp upper bound for sampling numbers in L_2
14:40 — 15:10
Coffee Break
15:10 — 16:00
Amir Sagiv (Columbia U, New York)
A Measure Perspective on Uncertainty Propagation

- online

May 11, 2022
10:30 — 11:00
Coffee Break
11:00 — 12:00
Group Discussion Session
12:00 — 14:00
Lunch Break
14:00 — 14:50
14:50 — 15:20
Coffee Break
19:00 — 22:00
Conference Dinner at "Glacis Beisl"
May 12, 2022
09:30 — 10:30
Coffee and Free Time for Discussion
10:30 — 11:20
12:10 — 14:10
Lunch Break
15:00 — 15:30
Coffee Break
May 13, 2022
10:20 — 10:50
Coffee Break
12:30 — 14:30
Lunch Break/Finish
May 16, 2022
08:45 — 09:15
Registration & Welcome to WS3 on "PDE-constrained Bayesian inverse problems: interplay of spatial statistical models with advanced PDE discretizations"
09:15 — 10:00
Recording

- online

10:00 — 11:00
Coffee Break
11:00 — 11:45
11:45 — 12:30
Juan Pablo Madrigal Cianci (EPFL, Lausanne)
Generalized Parallel Tempering for Bayesian Inverse Problems
Recording

- online

12:30 — 14:00
Lunch Break
14:00 — 14:45
Research Interaction at ESI
14:45 — 15:15
Coffee Break
May 17, 2022
10:00 — 11:00
Coffee Break
11:00 — 11:45
11:45 — 12:30
Nikolas Nuesken (KCL, London)
Stein optimal transport for Bayesian inference

- online

12:30 — 14:00
Lunch Break
14:00 — 14:45
Bjorn Engquist (U of Texas, Austin)
Seismic inversion in the presence of noise

- online

14:45 — 15:15
Coffee Break
15:15 — 16:00
Franziska Weber (Carnegie Mellon U, Pittsburgh)
On Bayesian data assimilation for PDEs with ill-posed forward problems

- online

16:00 — 16:45
May 18, 2022
09:15 — 10:00
Robert Scheichl (U Heidelberg)
Multilevel Delayed Acceptance MCMC
10:00 — 11:00
Coffee Break
12:30 — 14:00
Lunch Break
14:00 — 14:45
14:45 — 15:15
Coffee Break
15:15 — 16:00
16:00 — 16:45
May 19, 2022
09:15 — 10:00
Urbain Vaes (INRIA Paris)
Consensus-based sampling

- online

10:00 — 11:00
Coffee Break
12:30 — 14:00
Lunch Break
14:45 — 15:15
Coffee Break
16:00 — 16:45
Houman Owhadi (Caltech, Pasadena)
Computational Graph Completion
Recording

- online

May 30, 2022
09:30 — 10:00
Registration and Welcome to WS4 on "Statistical estimation and deep learning in UQ for PDEs"
10:45 — 11:15
Coffee Break
11:15 — 12:00
12:00 — 14:00
Lunch Break
14:00 — 14:45
14:45 — 15:15
Coffee Break
15:15 — 16:00

- online

16:30
Apero at ESI
May 31, 2022
10:45 — 11:15
Coffee Break
11:15 — 12:00
12:00 — 14:00
Lunch Break
14:45 — 15:15
Coffee Break
15:15 — 16:00
18:30 — 22:00
Workshop Dinner at "Heuriger Schübel-Auer"
June 1, 2022
10:00 — 10:45
10:45 — 11:15
Coffee Break
11:15 — 12:00
12:00
Free Afternoon, Excursion
June 2, 2022
09:30 — 10:00
Annika Lang (Chalmers U of Technology, Gothenburg)
Short-term traffic prediction using physics-aware neural networks

- online

10:00 — 10:45
10:45 — 11:15
Coffee Break
11:15 — 12:00
Carola-Bibiane Schönlieb (U of Cambridge)
Score based diffusion models for conditional generation
Recording

- online

12:00 — 14:00
Lunch Break
14:45 — 15:15
Coffee Break
15:15 — 16:00
16:00 — 16:45
Hrushikesh Mhaskar (Claremont Graduate U)
A direct method for approximation on unknown manifolds
Recording

- online

June 3, 2022
09:15 — 10:00
10:45 — 11:15
Coffee Break
11:15 — 12:00
Tiangang Cui (Monash U, Melbourne)
DIRT: a tensorised inverse Rosenblatt transport method
12:00 — 12:30
Concluding Remarks
June 13, 2022
08:45 — 09:15
Registration and Welcome to WS5 on "UQ in kinetic and transport equations and in high-frequency wave propagation"
09:15 — 10:00
Shi Jin (Shanghai Jiao Tong University)
Uncertain Quantification of ODEs/PDEs in quantum computing
Recording

- online

10:00 — 10:45
Coffee Break
10:45 — 11:30
Josselin Garnier (École Polytechnique, Palaiseau)
Radiative transfer equation for surface and body waves
12:15 — 14:00
Lunch Break
14:00 — 14:45
14:45 — 15:15
Coffee Break
16:00 — 16:45

 - online

June 14, 2022
09:00 — 09:45
Lorenzo Pareschi (U of Ferrara)
Stochastic Galerkin particle methods
Recording

- online

09:45 — 10:45
Coffee Break
11:30 — 12:15
12:15 — 14:00
Lunch Break
14:45 — 15:15
Coffee Break
15:15 — 16:00
16:00 — 16:45
Jose Morales Escalante (UT San Antonio)
Stochastic Galerkin Methods for the Boltzmann-Poisson system
Recording

 - online

June 15, 2022
09:45 — 10:45
Coffee Break
11:30 — 12:15
12:15 — 14:00
Lunch Break
14:00 — 14:45
Group Discussions
14:45 — 15:30
Feedback Session & General Discussion
15:30 — 16:00
Coffee Break
18:30
Workshop Dinner at "Heuriger Schübel-Auer"
June 16, 2022
Public Holiday

No formal programme - Informal discussions, sightseeing and exploring Vienna.

June 17, 2022
09:45 — 10:45
Coffee Break
12:15 — 14:00
Lunch Break
14:00 — 14:45
15:30 — 15:40
Concluding Remarks

Organizers

Name Affiliation
Clemens Heitzinger Technical University of Vienna
Fabio Nobile EPFL, Lausanne
Robert Scheichl University of Heidelberg
Christoph Schwab ETH Zürich
Sara van de Geer ETH Zürich
Karen Willcox University of Texas at Austin

Attendees

Name Affiliation
Pierre Alquier RIKEN Center for Advanced Intelligence Project
Anton Arnold Technical University of Vienna
Francis Bach INRIA, Institut national de recherche en informatique et en automatique
Markus Bachmayr Johannes-Gutenberg Universität Mainz
Gichan Bae Seoul National University
Bamdad Hosseini University of Washington
Andrew Barron Yale University
Andrea Barth University of Stuttgart
Peter Bartlett University of California, Berkeley
Niklas Baumgarten Karlsruhe Institute of Technology
Giulia Bertaglia University of Ferrara
Alex Bespalov University of Birmingham
Helmut Bölcskei ETH Zürich
Francesca Bonizzoni University Augsburg
Liliana Borcea University of Michigan
Claire Boyer Sorbonne University
Giuseppe Carere University of Potsdam
Neil Chada King Abdullah University
Peng Chen University of Texas at Austin
Alexey Chernov University of Oldenburg
Alina Chertock North Carolina State University
Geoffrey Chinot ETH Zürich
Andrés Christen Centro de Investigacion en Matematicas
Albert Cohen Sorbonne University
Colin Cotter Imperial College London
Matteo Croci University of Texas at Austin
Tiangang Cui Monash University
Nada Cvetkovic Technical University Eindhoven
Masoumeh Dashti University of Sussex
Alexis Derumigny Technical University Delft
Nicholas Dexter Simon Fraser University
Josef Dick University of New South Wales
Giacomo Dimarco University of Ferrara
Tim Dodwell University of Exeter
Matthieu Dolbeault Sorbonne University
Alireza Doostan University of Colorado
Qiang Du Columbia University
Virginie Ehrlacher Ecole des Ponts Paristech
Martin Eigel Weierstrass Institut Berlin
Bjorn Engquist University of Texas at Austin
Oliver Ernst Technical University Chemnitz
Ionut-Gabriel Farcas University of Texas at Austin
Michael Feischl Technical University of Vienna
Xiaobing Feng University of Tennessee
Martin Frank Karlsruhe Institute of Technology
Sara Fraschini University of Vienna
Josselin Garnier Ecole Polytechnique, Palaiseau
Omar Ghattas University of Texas at Austin
Susana Gomes University of Warwick
Alex Gorodetsky University of Michigan
Harshith Gowrachari SISSA
Ivan Graham University of Bath
Remi Gribonval Inria Lyon
Elena Griniari Springer
Philipp Grohs University of Vienna
Diane Guignard University of Ottawa
Seung-Yeal Ha Seoul National University
Eldad Haber University of British Columbia
Abdul-Lateef Haji-Ali Heriot-Watt University
Helmut Harbrecht University of Basel
Gottfried Hastermann University of Potsdam
Yanchen He ETH Zurich
Lukas Herrmann Johann-Radon Institute
Ralf Hiptmair ETH Zürich
Viet-Ha Hoang Nanyang Technological University Singapore
Håkon Hoel University of Oslo
Thorsten Hohage University of Göttingen
Gyuyoung Hwang Seoul National University
Gianluca Iaccarino Stanford University
John Jakeman Sandia National Laboratories
Ajay Jasra King Abdullah University
Carlos Jerez-Hanckes University Adolfo Ibanez
Shi Jin Shanghai Jiao Tong University
Barbara Kaltenbacher Alpen-Adria-Universität Klagenfurt
Clemens Karner University of Vienna
Yoshihito Kazashi Heidelberg University
Vladimir Kazeev University of Vienna
Hanne Kekkonen Technical University Delft
Parisa Khodabakhshi University of Texas at Austin
Amirreza Khodadadian University of Hannover
Kristin Kirchner Technical University Delft
Michael Kohler Technical University Darmstadt
Karina Koval Heidelberg University
Peter Kritzer Johann-Radon Institute
Karl Kunisch University of Graz
Gitta Kutyniok Ludwig-Maximilians-University Munich
Annika Lang Chalmers University of Technology
Sophie Langer Technical University Darmstadt
Jonas Latz University of Cambridge
Kody Law University of Manchester
Qin Li University of Wisconsin-Madison
Han Cheng Lie University of Potsdam
Liu Liu The Chinese University of Hong Kong
Yuena Liu Shanghai Jiao Tong University
Matthias Loeffler ETH Zürich
Po-Ling Loh University of Cambridge
Marcello Longo ETH Zürich
Mikkel Lykkegaard University of Exeter
Juan Pablo Madrigal Cianci EPFL, Lausanne
Carlo Marcati University of Pavia
Youssef Marzouk Massachusetts Institute of Technology
Hermann Matthies Technical University Braunschweig
Hrushikesh Mhaskar Claremont Graduate University
Simon Michel University of Zurich
Giovanni Migliorati Sorbonne University
Jose Morales Escalante The University of Texas at San Antonio
Mohammad Motamed University of New Mexico, Albuquerque
Olga Mula Hernandez Paris Dauphine University
Nicholas Nelsen California Institute of Technology
Richard Nickl University of Cambridge
Monica Nonino University of Vienna
Anthony Nouy Centrale Nantes
Nikolas Nuesken King's College London
Dirk Nuyens KU Leuven
Joost Opschoor ETH Zürich
Houman Owhadi California Institute of Technology, Pasadena
Iason Papaioannou Technical University of Munich
Lorenzo Pareschi University of Ferrara
Benjamin Peherstorfer Courant Institute of Mathematical Sciences
Ilaria Perugia University of Vienna
Philipp Petersen University of Vienna
Andreas Postl University of Vienna
Catherine Powell University of Manchester
Davide Pradovera University of Vienna
Dirk Praetorius Technical University of Vienna
Elizabeth Qian California Institute of Technology, Pasadena
Holger Rauhut RWTH Aachen
Sebastian Reich University of Potsdam
Christian Rieger Philipps-Universität Marburg
Pieterjan Robbe Sandia National Laboratories
Paul Rohrbach University of Cambridge
Gianluigi Rozza SISSA
Michele Ruggeri University of Strathclyde
Olof Runborg KTH Stockholm
Amir Sagiv Columbia University
Andrea Scaglioni Technical University of Vienna
Laura Scarabosio Radboud University
Claudia Schillings University of Mannheim
Johannes Schmidt-Hieber University of Twente
Sebastian Schmutzhard-Hoefler University of Vienna
Carola-Bibiane Schönlieb University of Cambridge
Linus Seelinger University of Heidelberg
Elnaz Seylabi University of Nevada, Reno
Aarti Singh Carnegie Mellon University
Ian Sloan University of New South Wales
Euan Spence University of Bath
Jonathan Spence Heriot-Watt University
Björn Sprungk TU Bergakademie Freiberg
Andreas Stein ETH Zurich
Hans Peter Stimming University of Vienna
Taiji Suzuki University of Tokyo
Leila Taghizadeh Technical University of Munich
Lorenzo Tamellini CNR-IMATI
Raul Tempone RWTH Aachen
Elisabeth Ullmann Technical University of Munich
André Uschmajew Max Planck Institute for Mathematics in the Sciences
Urbain Vaes INRIA Paris
Barbara Verfürth Karlsruhe Institute of Technology
Karen Veroy-Grepl Technical University Eindhoven
Eva Vidlickova EPFL, Lausanne
Umberto Villa Washington University in St. Louis
Li Wang University of Minnesota
Sven Wang Massachusetts Institute of Technology
Gregor Wautischer University of Vienna
Franziska Weber Carnegie Mellon University
Clayton Webster University of Texas at Austin
Simon Weissmann University of Heidelberg
Marie-Therese Wolfram University of Warwick
Fan Yang ETH Zürich
Shangda Yang The University of Manchester
Petr Zamolodtchikov University of Twente
Mattia Zanella University of Pavia
Jakob Zech University of Heidelberg
Daniel Zhengyu Huang California Institute of Technology, Pasadena
Yuhua Zhu Stanford University
Talk previews

Michael Feischl (TU Vienna): Convergence of adaptive stochastic collocation
May 2, 2022 10:00 — 10:45
Linus Seelinger (U Heidelberg): UM-Bridge: Bridging the Gap Between UQ and Model Software
May 3, 2022 10:00 — 10:30
Tim Dodwell (U of Exeter): Adaptive Multilevel Delayed Acceptance
May 3, 2022 11:30 — 12:20
Juan Pablo Madrigal Cianci (EPFL, Lausanne): Multi-level Markov Chain Monte Carlo Methods for Bayesian Inverse Problems
May 4, 2022 14:00 — 14:30
Abdul-Lateef Haji-Ali (Heriot-Watt U, Edinburgh): Group Presentation: Theme G "Extensions of Multilevel Delayed Acceptance"
May 6, 2022 10:15 — 10:30
Niklas Baumgarten (KIT, Karlsruhe), Linus Seelinger (U Heidelberg): Group Presentation: Theme I "Software Benchmarking and Method Comparison"
May 6, 2022 11:00 — 11:15
Robert Scheichl (U Heidelberg): Group Presentation: Theme K "Beyond Plain-Vanilla MLMC & MIMC"
May 6, 2022 11:15 — 11:45
Hermann Matthies (TU Braunschweig): What is a Sample?
May 6, 2022 11:45 — 12:30
Matthieu Dolbeault (Sorbonne U, Paris): A sharp upper bound for sampling numbers in L_2
May 10, 2022 13:50 — 14:40
Viet-Ha Hoang (NTU Singapore): Bayesian inversion of log-normal eikonal equation
May 16, 2022 09:15 — 10:00
Juan Pablo Madrigal Cianci (EPFL, Lausanne): Generalized Parallel Tempering for Bayesian Inverse Problems
May 16, 2022 11:45 — 12:30
Nicholas Nelsen (CalTech): Noisy linear operator learning as an inverse problem
May 17, 2022 16:00 — 16:45
Houman Owhadi (Caltech, Pasadena): Computational Graph Completion
May 19, 2022 16:00 — 16:45
Richard Nickl (U of Cambridge): Bayesian non-linear inverse problems: Progress and Challenges
May 30, 2022 11:15 — 12:00
Pierre Alquier (RIKEN AIP): Robust estimation via minimum distance estimation
May 31, 2022 11:15 — 12:00
Sven Wang (MIT, Cambridge): Minimax density estimation via measure transport
June 1, 2022 10:00 — 10:45
Geoffrey Chinot (ETH Zurich): Adaboost and robust 1-bit compressed sensing
June 2, 2022 10:00 — 10:45
Carola-Bibiane Schönlieb (U of Cambridge): Score based diffusion models for conditional generation
June 2, 2022 11:15 — 12:00
Sophie Langer (TU Darmstadt): Image classification: A (new) statistical viewpoint
June 2, 2022 15:15 — 16:00
Hrushikesh Mhaskar (Claremont Graduate U): A direct method for approximation on unknown manifolds
June 2, 2022 16:00 — 16:45
Shi Jin (Shanghai Jiao Tong University): Uncertain Quantification of ODEs/PDEs in quantum computing
June 13, 2022 09:15 — 10:00
Josselin Garnier (École Polytechnique, Palaiseau): Radiative transfer equation for surface and body waves
June 13, 2022 10:45 — 11:30
Lorenzo Pareschi (U of Ferrara): Stochastic Galerkin particle methods
June 14, 2022 09:00 — 09:45
Carlos Jerez-Hanckes (U Adolfo Ibanez, Santiago de Chile): Helmholtz Scattering By Random Domains: First-Order Sparse Boundary Element Approximation
June 14, 2022 11:30 — 12:15
Jose Morales Escalante (UT San Antonio): Stochastic Galerkin Methods for the Boltzmann-Poisson system
June 14, 2022 16:00 — 16:45
Xiaobing Feng (U of Tennessee, Knoxville): An efficient multi-modes Monte Carlo method for wave scattering in random media
June 15, 2022 09:00 — 09:45
At a glance
Type:
Thematic Programme
When:
May 2, 2022 — June 24, 2022
Where:
ESI Boltzmann Lecture Hall
Organizer(s):
Clemens Heitzinger (TU Vienna)
Fabio Nobile (EPFL, Lausanne)
Robert Scheichl (U Heidelberg)
Christoph Schwab (ETH Zurich)
Sara van de Geer (ETH Zurich)
Karen Willcox (U of Texas, Austin)