Projected Variational Methods for High-dimensional Bayesian Inference

Peng Chen (U of Texas, Austin)

May 19, 2022, 11:45 - 12:30

Bayesian inference provides an optimal framework to learn models from data with quantified uncertainty. In many practical applications, where models are represented by, e.g., differential equations or deep neural networks, the dimension of the model parameters is very high or even infinite. It is a longstanding challenge to solve high-dimensional Bayesian inference problems accurately and efficiently because of the curse of dimensionality: the computational complexity grows rapidly (often exponentially) with the parameter dimension. In this talk, I will present a class of transport-based projected variational methods to tackle the curse of dimensionality. We project the high-dimensional parameters onto intrinsically low-dimensional, data-informed subspaces and employ transport-based variational methods to push samples drawn from the prior to a projected posterior. I will present error bounds for the projected posterior distribution measured in Kullback–Leibler divergence. Numerical experiments will demonstrate the properties of our methods, including improved accuracy, fast convergence with complexity independent of the parameter dimension and the number of samples, strong parallel scalability across processor cores, and weak scalability with respect to the data dimension.
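To make the projection-plus-transport idea concrete, below is a minimal sketch, not the speaker's implementation: it pairs an active-subspace-style, gradient-informed projection with Stein variational gradient descent applied in the projected coordinates, one representative transport-based variational method. All function names, the linear-Gaussian toy problem, and parameters such as r, the step size, and the particle count are illustrative assumptions.

# Minimal sketch of a projected, transport-based variational method (illustrative only).
import numpy as np

def svgd_direction(X, grad_logp):
    # One SVGD update direction for particles X (n, r) with scores grad_logp (n, r),
    # using an RBF kernel with the median bandwidth heuristic.
    diff = X[:, None, :] - X[None, :, :]              # x_i - x_j, shape (n, n, r)
    sq = np.sum(diff**2, axis=-1)
    h = max(np.median(sq) / np.log(X.shape[0] + 1.0), 1e-8)
    K = np.exp(-sq / h)
    grad_K = (2.0 / h) * diff * K[:, :, None]         # grad_{x_j} k(x_j, x_i) terms
    return (K @ grad_logp + grad_K.sum(axis=1)) / X.shape[0]

def data_informed_basis(grads, r):
    # Leading r eigenvectors of the sample average of outer products of
    # likelihood gradients: an active-subspace-style stand-in for the
    # data-informed subspace described in the abstract.
    H = grads.T @ grads / grads.shape[0]
    _, eigvecs = np.linalg.eigh(H)                    # eigenvalues ascending
    return eigvecs[:, ::-1][:, :r]                    # (d, r), orthonormal columns

def projected_svgd(theta, grad_log_lik, grad_log_prior, r=5, steps=200, lr=1e-2):
    # theta: (n, d) particles drawn from the prior. Only the r-dimensional
    # projected coordinates are transported; the complement keeps its prior draw.
    theta = theta.copy()
    for _ in range(steps):
        g_lik = grad_log_lik(theta)                   # (n, d)
        U = data_informed_basis(g_lik, r)             # refresh the subspace
        W = theta @ U                                 # projected coordinates (n, r)
        grad_logp = (g_lik + grad_log_prior(theta)) @ U   # projected posterior score
        W_new = W + lr * svgd_direction(W, grad_logp)
        theta = theta + (W_new - W) @ U.T             # update subspace component only
    return theta

# Toy usage: a 100-dimensional standard normal prior whose likelihood only
# informs a 3-dimensional subspace (A, y, and all sizes are made up).
rng = np.random.default_rng(0)
d, n = 100, 64
A = rng.standard_normal((3, d)) / np.sqrt(d)
y = rng.standard_normal(3)
grad_log_lik = lambda th: -(th @ A.T - y) @ A         # Gaussian likelihood, unit noise
samples = projected_svgd(rng.standard_normal((n, d)), grad_log_lik,
                         lambda th: -th, r=3)         # standard normal prior score

Roughly speaking, the KL error of the projected posterior in constructions of this kind is expected to be controlled by the trailing eigenvalues of the gradient matrix H above, which is the spirit of the error bounds mentioned in the abstract; the linear-Gaussian problem here is reduced to a toy purely for illustration.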

Further Information
Venue:
ESI Boltzmann Lecture Hall
Associated Event:
Computational Uncertainty Quantification: Mathematical Foundations, Methodology & Data (Thematic Programme)
Organizer(s):
Clemens Heitzinger (TU Vienna)
Fabio Nobile (EPFL, Lausanne)
Robert Scheichl (U Heidelberg)
Christoph Schwab (ETH Zurich)
Sara van de Geer (ETH Zurich)
Karen Willcox (U of Texas, Austin)