"DEFBAL" -- a Connection between the ADMM and Forward-Backward Methods

Jonathan Eckstein (Rutgers U)

Jun 07, 2024, 14:00 — 14:30

This talk will show how the first two steps of the ADMM (the two minimization steps) may be interpreted as the classical forward-backward (proximal gradient) method applied to a dual formulation of the standard augmented Lagrangian subproblem.  Substituting other variants of the forward-backward method for the classical one -- for example, algorithms involving Nesterov-style acceleration -- then yields new classes of ADMM-like methods, of which this talk will give some examples.  It is not yet clear whether these methods have any computational advantages, but they are of some theoretical interest.  Generically, we call this class of algorithms "DEFBAL", for "Dual Embedded Forward-Backward Augmented Lagrangian".
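For orientation, the two schemes the abstract connects can be written in their standard textbook forms; the notation below (functions f and g, matrices A and B, penalty ρ, step size α) is generic and not taken from the talk itself, and this sketch does not reproduce the dual embedding that is the talk's actual contribution.

```latex
% Standard two-block splitting problem:
%   minimize  f(x) + g(z)   subject to  Ax + Bz = c.
%
% ADMM (scaled-dual form): the first two steps are the minimizations
% referred to in the abstract, the third is the multiplier update.
\begin{align*}
x^{k+1} &= \arg\min_x \; f(x) + \tfrac{\rho}{2}\|Ax + Bz^k - c + u^k\|^2 \\
z^{k+1} &= \arg\min_z \; g(z) + \tfrac{\rho}{2}\|Ax^{k+1} + Bz - c + u^k\|^2 \\
u^{k+1} &= u^k + Ax^{k+1} + Bz^{k+1} - c
\end{align*}
%
% Classical forward-backward (proximal gradient) method for
%   minimize  h_1(w) + h_2(w),   h_1 smooth,
% one forward (gradient) step on h_1 followed by a backward
% (proximal) step on h_2:
\begin{align*}
w^{k+1} &= \operatorname{prox}_{\alpha h_2}\!\bigl(w^k - \alpha \nabla h_1(w^k)\bigr)
\end{align*}
```

The talk's observation is that the x- and z-minimizations above can be viewed as one iteration of the second scheme applied to a dual formulation of the augmented Lagrangian subproblem, which is what makes accelerated forward-backward variants substitutable.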

Further Information
Venue:
ESI Boltzmann Lecture Hall
Associated Event:
One World Optimization Seminar in Vienna (Workshop)
Organizer(s):
Radu Ioan Bot (U of Vienna)
Yurii Malitskyi (U of Vienna)