Proximal Gradient Methods for Nonsmooth Nonconvex Minimax: A Unified Convergence Analysis

Marc Teboulle (Tel Aviv U)

Jun 04, 2024, 09:00 — 09:30

Nonconvex minimax problems abound in modern applications. We focus on nonsmooth nonconvex minimax, thus departing from the more common weakly convex/concave and smooth models assumed in the recent literature. We present proximal gradient schemes (parallel and alternating) and show that both methods can be analyzed through a single scheme within a unified framework, which relies on expanding a general convergence mechanism for nonconvex nonsmooth optimization problems. In contrast to the current literature, which focuses on the complexity of obtaining only nearly approximate stationary solutions, here we derive pointwise global convergence results and, as a by-product, prove refined complexity results. Furthermore, our approach allows us to expand the scope of minimax problems that can be addressed through the use of non-Euclidean proximal steps, and to extend the convergence and complexity results to this broader setting. This is joint work with Eyal Cohen.
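To fix ideas, the generic shape of a *parallel* proximal gradient step for minimax can be sketched as follows. This is only an illustrative sketch on an assumed toy convex-concave instance (Phi(x,y) = x²/2 + xy − y²/2, f(x) = λ|x|, g = 0), not the scheme analyzed in the talk; the function names and step sizes are made up for the example.

```python
# Illustrative sketch (not the talk's algorithm): a generic parallel
# proximal gradient scheme for a minimax problem of the form
#   min_x max_y  f(x) + Phi(x, y) - g(y),
# with Phi smooth and f, g nonsmooth but prox-friendly.
# Assumed toy instance: Phi(x, y) = x^2/2 + x*y - y^2/2,
# f(x) = lam*|x| (prox = soft-thresholding), g = 0; saddle point at (0, 0).

def soft_threshold(v, tau):
    """Prox of tau*|.|: shrinks v toward 0 by tau."""
    return max(abs(v) - tau, 0.0) * (1.0 if v >= 0 else -1.0)

def parallel_prox_grad(x, y, step=0.1, lam=0.5, iters=2000):
    for _ in range(iters):
        gx = x + y          # d/dx Phi(x, y)
        gy = x - y          # d/dy Phi(x, y)
        # Parallel update: both players use the current iterate (x, y);
        # an alternating scheme would use the new x when updating y.
        x_new = soft_threshold(x - step * gx, step * lam)  # prox-descent in x
        y_new = y + step * gy                              # ascent in y (g = 0)
        x, y = x_new, y_new
    return x, y

x, y = parallel_prox_grad(1.0, -1.0)
print(x, y)  # both iterates approach the saddle point (0, 0)
```

The alternating variant differs only in feeding the freshly computed x into the y-update; the talk's point is that both variants admit a single unified convergence analysis.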

Further Information
Venue:
ESI Boltzmann Lecture Hall
Associated Event:
One World Optimization Seminar in Vienna (Workshop)
Organizer(s):
Radu Ioan Bot (U of Vienna)
Yurii Malitskyi (U of Vienna)