1 — 16:20 — Zero duality gap for optimization problems in spaces without linear structure
We prove necessary and sufficient conditions ensuring zero duality gap for Lagrangian duality in some classes of nonconvex optimization problems. To this end, we use Φ-convexity theory and minimax theorems for Φ-convex functions. The obtained zero duality gap results apply to infinite-dimensional linear optimization problems, including Kantorovich duality, which plays an important role in computing the Wasserstein distance.
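For context, the two dualities at play can be stated in their classical forms (under standard assumptions; the talk's Φ-convexity framework goes beyond this linear setting):

```latex
% Zero duality gap for min f(x) s.t. g(x) <= 0 via the Lagrangian:
\inf_{x}\, \sup_{\lambda \ge 0}\, \bigl[\, f(x) + \langle \lambda, g(x) \rangle \,\bigr]
\;=\;
\sup_{\lambda \ge 0}\, \inf_{x}\, \bigl[\, f(x) + \langle \lambda, g(x) \rangle \,\bigr].

% Kantorovich duality for marginals \mu, \nu and cost c, where \Pi(\mu,\nu)
% denotes the set of couplings of \mu and \nu:
\min_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\, \mathrm{d}\pi(x,y)
\;=\;
\sup_{\varphi(x) + \psi(y) \le c(x,y)}
  \left( \int \varphi\, \mathrm{d}\mu + \int \psi\, \mathrm{d}\nu \right).
```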
2 — 16:50 — Accelerating Communications in Decentralized Learning via Randomization
This talk explores DADAO and A2CID2, two recent methods designed to address communication bottlenecks in distributed decentralized learning. DADAO stands out as the first decentralized, accelerated, primal, asynchronous optimization algorithm to minimize a sum of smooth and strongly convex functions. Meanwhile, A2CID2 utilizes continuous local momentum and randomized peer-to-peer operations to accelerate communications. These methods model operations within a decentralized framework using stochastic differential equations, providing a robust theoretical basis while remaining straightforward to implement. By decoupling computation from communication, they effectively reduce communication demands and enhance scalability, making them amenable to large-scale deep learning applications. During the talk, I will examine the fundamental principles of these methods and critically assess their limitations compared to traditional stochastic synchronous settings.
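To make the decoupling idea concrete, here is a minimal, hypothetical sketch (not the DADAO or A2CID2 algorithms themselves, whose momentum terms and SDE-based analysis are omitted): local gradient steps and randomized pairwise averaging fire as two independent event streams, so the communication budget can be tuned independently of computation. The ring topology, event rates, and quadratic objectives are illustrative assumptions.

```python
# Hypothetical sketch: decoupled computation and communication events.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 8, 5
targets = rng.normal(size=(n_nodes, dim))  # node i minimizes ||x - targets[i]||^2 / 2
x = np.zeros((n_nodes, dim))               # local iterates

edges = [(i, (i + 1) % n_nodes) for i in range(n_nodes)]  # ring topology (assumed)
step, comp_rate, comm_rate = 0.1, 1.0, 2.0                # event rates (assumed)

for _ in range(5000):
    # Each event is a computation or a communication, drawn in
    # proportion to the two clock rates.
    if rng.random() < comp_rate / (comp_rate + comm_rate):
        i = rng.integers(n_nodes)               # one node takes a gradient step
        x[i] -= step * (x[i] - targets[i])
    else:
        i, j = edges[rng.integers(len(edges))]  # one random edge averages (gossip)
        x[i] = x[j] = (x[i] + x[j]) / 2.0

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
print("distance to optimum:", np.linalg.norm(x.mean(axis=0) - targets.mean(axis=0)))
```

Raising `comm_rate` relative to `comp_rate` trades extra gossip rounds for faster consensus, which is the knob the randomized approach exposes.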
3 — 17:20 — On some new NCP functions
A nonlinear complementarity problem (NCP) asks for a vector that is nonnegative, whose image under a given multivariate mapping is nonnegative, and that is orthogonal to this image. Such problems arise in constrained optimization and equilibrium problems. Moreover, applications of NCPs in operations research, engineering, and economics have motivated significant research efforts over the past decades, resulting in various numerical techniques, including the merit function approach, nonsmooth Newton methods, and regularization approaches.
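In its standard form, for a given mapping F from R^n to R^n, the problem reads:

```latex
% Standard NCP: given F : \mathbb{R}^n \to \mathbb{R}^n,
\text{find } x \in \mathbb{R}^n \text{ such that} \quad
x \ge 0, \qquad F(x) \ge 0, \qquad x^{\top} F(x) = 0.
```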
In this talk, we present two new classes of NCP functions together with their properties. In particular, using these NCP functions we propose neural networks and an inexact Levenberg–Marquardt method for solving NCPs. More specifically, for the first class of NCP functions, we build a new class of neural networks based on the smoothing method for NCPs introduced by Haddou and Maheux. For the second class of NCP functions, which are smooth, coercive, and strongly semismooth, we propose an inexact Levenberg–Marquardt method. Unlike existing exact and inexact Levenberg–Marquardt methods, the proposed method adopts a derivative-free line search to ensure its globalization.
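As a rough illustration of the NCP-function and Levenberg–Marquardt machinery (using the classical Fischer–Burmeister function rather than the new classes of the talk, and a plain backtracking search in place of the proposed derivative-free line search), one might write:

```python
# Sketch: reformulate an NCP as a nonlinear least-squares problem via an
# NCP function, then apply a Levenberg-Marquardt iteration. The mapping F,
# the smoothing parameter, and the fixed damping mu are illustrative.
import numpy as np

def F(x):                      # example affine mapping (assumed)
    M = np.array([[2.0, 1.0], [1.0, 2.0]])
    q = np.array([-1.0, -1.0])
    return M @ x + q

def fb(a, b, eps=1e-12):       # Fischer-Burmeister: zero iff a>=0, b>=0, ab=0
    return np.sqrt(a * a + b * b + eps) - a - b

def residual(x):
    return fb(x, F(x))

def jacobian(x, h=1e-7):       # finite-difference Jacobian of the residual
    r0, J = residual(x), np.empty((x.size, x.size))
    for k in range(x.size):
        e = np.zeros(x.size); e[k] = h
        J[:, k] = (residual(x + e) - r0) / h
    return J

x, mu = np.zeros(2), 1e-2
for it in range(50):
    r = residual(x)
    if np.linalg.norm(r) < 1e-8:
        break
    J = jacobian(x)
    d = np.linalg.solve(J.T @ J + mu * np.eye(x.size), -J.T @ r)  # LM step
    t = 1.0
    while np.linalg.norm(residual(x + t * d)) > np.linalg.norm(r) and t > 1e-8:
        t *= 0.5               # backtracking on the merit ||residual||
    x = x + t * d

print("solution:", x, "residual norm:", np.linalg.norm(residual(x)))
```

Any NCP function with the same zero set can be swapped into `fb`; the properties of the chosen function (smoothness, coercivity, semismoothness) govern how well such an iteration behaves.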
In addition, we present numerical experiments that validate the theoretical results and illustrate the differences in numerical performance among the proposed functions.