Sparse polynomial approximation of high-dimensional functions from random samples

Clayton Webster (U of Texas, Austin)

Apr 05, 2022, 16:20–17:10

This talk will focus on nonlinear approaches to sparse polynomial approximation of complex functions in high dimensions. Of particular interest is the parameterized PDE setting, where the target function is smooth and characterized by a rapidly decaying orthonormal expansion whose most important terms are captured by a lower (or downward closed) set. Exploiting this structure, we will present and analyze several procedures for exactly reconstructing a set of (jointly) sparse vectors from incomplete measurements, in particular: a weighted ℓ1-minimization procedure for compressed sensing, with a precise choice of weights, for overcoming the curse of dimensionality; a mixed-norm-based regularization that simultaneously reconstructs parameterized PDE solutions over both the physical and parametric domains; and a class of null space conditions for sparse reconstruction via nonconvex, non-separable minimizations. These approaches enable the reconstruction of the entire high-dimensional solution map, with accuracy comparable to the best approximation, while using a minimal number of random samples. Numerical examples are provided to support the theoretical results and to demonstrate the computational efficiency of the described compression techniques.
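As a point of reference, a common formulation of such a weighted ℓ1-minimization problem (a minimal sketch, assuming a bounded orthonormal system \(\{\Psi_\nu\}_{\nu \in \Lambda}\) and random sample points \(y^{(1)},\dots,y^{(m)}\) of the target function \(f\); the notation and the particular weight choice are illustrative, not necessarily those used in the talk) is
\[
\min_{z \in \mathbb{C}^{N}} \; \sum_{\nu \in \Lambda} \omega_{\nu}\,|z_{\nu}|
\quad \text{subject to} \quad \|A z - b\|_{2} \le \eta,
\qquad
A_{i\nu} = \Psi_{\nu}\bigl(y^{(i)}\bigr), \quad b_{i} = f\bigl(y^{(i)}\bigr),
\]
where weights such as \(\omega_{\nu} \ge \|\Psi_{\nu}\|_{\infty}\) penalize high-order terms more heavily and thereby bias the recovered support toward lower (downward closed) index sets.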

Further Information
Venue: ESI Boltzmann Lecture Hall
Recordings: Recording
Associated Event: Adaptivity, High Dimensionality and Randomness (Workshop)
Organizer(s): Carsten Carstensen (HU Berlin), Albert Cohen (Sorbonne U, Paris), Michael Feischl (TU Vienna), Christoph Schwab (ETH Zurich)