Coordinate descent methods beyond separability and smoothness

Ion Necoara (U Politehnica Bucharest)

Jun 07, 2024, 11:30 – 12:00

In this talk we consider optimization problems that are possibly nonseparable and nonsmooth. When the objective function is composite, we design random coordinate proximal gradient methods that take the nonseparable form of the objective into account. When the objective function is nonsmooth, we introduce a general smooth approximation framework for the original function, covering the most important classes of smoothing techniques from the literature, and then apply random (accelerated) coordinate descent methods to minimize the corresponding smooth approximations. Our algorithms achieve scalability by constructing, at each iteration, a local approximation model of the whole nonseparable objective function along a random subspace of user-determined dimension. We present a complete worst-case complexity analysis for our random coordinate (proximal) gradient methods in both convex and nonconvex settings. Numerical results on applications ranging from the smallest eigenvalue problem to matrix factorization and support vector machine classification further confirm the efficiency of our algorithms.
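To make the random-subspace idea concrete, below is a minimal, hypothetical Python sketch of a plain random coordinate gradient step: at each iteration only a randomly sampled block of coordinates of user-chosen dimension is updated. All function names and parameters here are illustrative assumptions; the talk's methods additionally use proximal steps, smoothing, and a local model of the nonseparable objective, none of which is reproduced here.

```python
import numpy as np

def random_subspace_gradient_descent(grad_f, x0, tau, step, iters, seed=None):
    """Illustrative sketch only: update a random block of `tau` coordinates
    per iteration using the partial gradient restricted to that block."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    n = x.size
    for _ in range(iters):
        # Sample a random subspace (coordinate block) of user-chosen dimension tau.
        S = rng.choice(n, size=tau, replace=False)
        # Full gradient for simplicity; efficient methods evaluate only the S-block.
        g = grad_f(x)
        # Move along the sampled coordinates only.
        x[S] -= step * g[S]
    return x

# Toy usage: least squares 0.5*||A x - b||^2, a smooth but nonseparable objective.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
b = rng.standard_normal(100)
grad = lambda x: A.T @ (A @ x - b)
x_hat = random_subspace_gradient_descent(grad, np.zeros(20), tau=5, step=1e-3, iters=5000)
```

The block size `tau` plays the role of the user-determined subspace dimension mentioned in the abstract: small `tau` gives cheap iterations, larger `tau` gives fewer but more informative updates.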

Further Information
Venue:
ESI Boltzmann Lecture Hall
Associated Event:
One World Optimization Seminar in Vienna (Workshop)
Organizer(s):
Radu Ioan Bot (U of Vienna)
Yurii Malitskyi (U of Vienna)