On strong convergence of inertial algorithms via Tikhonov regularization

Szilard Csaba Laszlo (U Cluj-Napoca)

Jun 06, 2024, 14:30 — 15:00

Introducing Tikhonov regularization terms into the algorithms associated with a minimization problem makes it possible to select a priori the equilibrium point to which the sequence of iterates generated by the algorithm converges; usually this equilibrium point is chosen to be the minimum norm minimizer of the objective function. Moreover, the convergence of the generated sequences is obtained in the strong topology. We present two inertial algorithms with Tikhonov regularization terms, covering both the case of optimization problems with a non-smooth objective function (a proximal algorithm) and the case of a differentiable objective function (a Nesterov-type algorithm). We show that the extrapolation coefficient and the Tikhonov regularization parameter are closely related, and we present a setting of the parameters which ensures strong convergence of the iterates to the minimum norm minimizer of the objective function, as well as fast convergence of the objective function values along these iterates. In the case when the objective function is differentiable, we consider an inertial gradient-type algorithm with two Tikhonov regularization terms and obtain strong convergence of the generated sequences to the minimum norm solution, together with Nesterov-type convergence rates. Moreover, we show that both regularization terms are needed: if either of them is omitted, the sequence generated by the algorithm no longer converges to the minimum norm solution.
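As a rough illustration of the general idea (not the algorithm from the talk), the following Python sketch runs a generic inertial gradient scheme in which a single vanishing Tikhonov term eps_k * ||x||^2 / 2 is added to the objective; the step size s, extrapolation coefficient beta_k, and regularization schedule eps_k are all illustrative choices, not the parameter setting analyzed by the speaker.

import numpy as np

def inertial_tikhonov_gradient(grad_f, x0, steps=5000, s=0.1,
                               beta=lambda k: k / (k + 3.0),
                               eps=lambda k: 1.0 / np.sqrt(k)):
    """Generic inertial gradient scheme with a vanishing Tikhonov term.

    Each iteration takes a gradient step for f(x) + (eps_k / 2) * ||x||^2
    from an extrapolated point; as eps_k -> 0, the regularization steers
    the iterates toward the minimum norm minimizer of f.
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for k in range(1, steps + 1):
        y = x + beta(k) * (x - x_prev)      # inertial extrapolation
        g = grad_f(y) + eps(k) * y          # gradient of the regularized objective
        x_prev, x = x, y - s * g            # forward (gradient) step
    return x

# f(x) = 0.5 * ||A x - b||^2 with rank-deficient A has infinitely many
# minimizers (1, 2, t); the vanishing Tikhonov term selects the minimum
# norm one, (1, 2, 0), even from a starting point with a large third
# component.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
b = np.array([1.0, 2.0])
grad_f = lambda x: A.T @ (A @ x - b)
print(inertial_tikhonov_gradient(grad_f, np.array([0.0, 0.0, 5.0])))

Note that this sketch uses only one regularization term; the point of the talk in the smooth case is that the actual algorithm employs two Tikhonov terms, coupled with the extrapolation coefficient, and that omitting either of them destroys strong convergence to the minimum norm solution.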

Further Information
Venue:
ESI Boltzmann Lecture Hall
Associated Event:
One World Optimization Seminar in Vienna (Workshop)
Organizer(s):
Radu Ioan Bot (U of Vienna)
Yurii Malitskyi (U of Vienna)