Stable multilevel low-rank representation for elliptic PDEs: approximation and preconditioning

Vladimir Kazeev (U of Vienna)

Apr 07, 2022, 16:20 – 17:10

Multilevel low-rank approximation in the form of matrix-product states (MPS), or the tensor-train (TT) decomposition, has been rigorously analyzed for certain classes of functions solving elliptic second-order PDEs. Analytic functions, solutions with algebraic corner singularities, and highly oscillatory solutions to multiscale diffusion problems have been shown to admit exponentially convergent approximations of this type. The approximation power of MPS/TT tensor methods rests on successive approximation in a sequence of suitable, possibly highly adaptive, low-dimensional subspaces. In the case of multilevel representation, this allows for using extravagantly large but generic finite-element spaces and for computing low-parametric discretizations within those spaces adaptively, in a data-driven fashion.
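
As a rough, hedged illustration of the representation in question (not of the specific results above), the following Python sketch computes a quantized TT/MPS decomposition of a function sampled on 2**L uniform grid points by successive truncated SVDs; for smooth data the ranks remain small even though the number of grid points grows exponentially in L. The function names and the truncation tolerance are illustrative assumptions.

```python
import numpy as np

def tt_svd(vector, levels, tol=1e-12):
    """Quantized TT/MPS decomposition of a vector of length 2**levels,
    computed by successive truncated SVDs (a standard TT-SVD sweep).
    Illustrative sketch only, not the algorithms discussed in the talk."""
    cores = []
    c = vector.reshape(1, -1)
    rank = 1
    for _ in range(levels - 1):
        c = c.reshape(rank * 2, -1)
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        # Keep only singular values above the relative tolerance.
        new_rank = max(1, int(np.sum(s > tol * s[0])))
        cores.append(u[:, :new_rank].reshape(rank, 2, new_rank))
        c = s[:new_rank, None] * vt[:new_rank, :]
        rank = new_rank
    cores.append(c.reshape(rank, 2, 1))
    return cores

# A smooth function sampled on 2**L points: the TT ranks stay small.
L = 16
x = np.linspace(0.0, 1.0, 2**L, endpoint=False)
cores = tt_svd(np.sin(2.0 * np.pi * x), L)
print("TT ranks:", [core.shape[2] for core in cores[:-1]])
```

The 2**L samples are stored in L small three-way cores, which is the kind of multilevel low-rank structure referred to above.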

In this talk, we revisit recent results on multilevel approximation and preconditioning in the MPS/TT decomposition and present a novel algorithm for preconditioning iterative tensor-structured solvers. Exploiting a hierarchy of nested finite-element spaces, it dramatically speeds up the computation of approximate low-rank solutions. Our numerical experiments demonstrate the efficiency of the new approach in two and three dimensions, including the setting of multiscale diffusion with solutions exhibiting low regularity and high-frequency oscillations.
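
The algorithm itself is the subject of the talk; as a generic, hedged illustration of how a hierarchy of nested finite-element spaces can enter a preconditioner, the sketch below (in plain linear algebra, without any tensor structure) assembles an additive two-level preconditioner from fine-level Jacobi smoothing and an exact coarse-grid correction via a prolongation between nested P1 spaces, and applies it within conjugate gradients to a 1D model problem. All function names and parameters are illustrative assumptions.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import cg, splu, LinearOperator

def laplacian_1d(n):
    """P1 finite-element stiffness matrix for -u'' on (0, 1) with n
    interior nodes and homogeneous Dirichlet boundary conditions."""
    h = 1.0 / (n + 1)
    main = 2.0 * np.ones(n) / h
    off = -np.ones(n - 1) / h
    return sparse.diags([off, main, off], [-1, 0, 1], format="csr")

def prolongation_1d(n_coarse):
    """Linear interpolation from n_coarse interior nodes to the nested
    refinement with 2 * n_coarse + 1 interior nodes."""
    n_fine = 2 * n_coarse + 1
    P = sparse.lil_matrix((n_fine, n_coarse))
    for j in range(n_coarse):
        P[2 * j + 1, j] = 1.0  # coarse node coincides with a fine node
        P[2 * j, j] = 0.5      # fine node to its left
        P[2 * j + 2, j] = 0.5  # fine node to its right
    return P.tocsr()

def two_level_preconditioner(A_fine, P):
    """Additive two-level preconditioner: Jacobi smoothing on the fine
    level plus an exact coarse-grid correction through P."""
    coarse_solve = splu((P.T @ A_fine @ P).tocsc())
    inv_diag = 1.0 / A_fine.diagonal()
    n = A_fine.shape[0]
    return LinearOperator(
        (n, n),
        matvec=lambda r: inv_diag * r + P @ coarse_solve.solve(P.T @ r),
    )

# Solve the fine-level system with preconditioned conjugate gradients.
n_coarse = 255
P = prolongation_1d(n_coarse)
A = laplacian_1d(2 * n_coarse + 1)
b = np.ones(A.shape[0])
x, info = cg(A, b, M=two_level_preconditioner(A, P))
print("CG flag:", info, " residual norm:", np.linalg.norm(b - A @ x))
```

The additive combination keeps the preconditioner symmetric positive definite, which is what conjugate gradients requires.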

Further Information
Venue:
ESI Boltzmann Lecture Hall
Associated Event:
Adaptivity, High Dimensionality and Randomness (Workshop)
Organizer(s):
Carsten Carstensen (HU Berlin)
Albert Cohen (Sorbonne U, Paris)
Michael Feischl (TU Vienna)
Christoph Schwab (ETH Zurich)