1. 14:00 — Calm local optimality for nonconvex-nonconcave minimax problems

  • Xiaoxiao Ma, University of Victoria
  • Wei Yao, Southern University of Science and Technology
  • Jane Ye, University of Victoria
  • Zhang Jin, Southern University of Science and Technology

Nonconvex-nonconcave minimax problems have found numerous applications in various fields, including machine learning. However, questions remain about what constitutes a good surrogate for a local minimax optimum and how to characterize local minimax optimality. Recently, Jin, Netrapalli, and Jordan (ICML 2020) introduced the concept of a local minimax point and derived optimality conditions for the smooth, unconstrained case. In this work, we introduce the concept of a calm local minimax point, i.e., a local minimax point whose radius function is calm. With this extra calmness property we obtain first- and second-order necessary and sufficient optimality conditions for a very general class of nonsmooth nonconvex-nonconcave minimax problems. Moreover, we show that calm local minimax optimality and local minimax optimality coincide under a weak sufficient optimality condition for the maximization problem. This equivalence allows us to derive stronger optimality conditions, under weaker assumptions, for local minimax optimality.
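
For context, the local minimax point of Jin, Netrapalli, and Jordan can be sketched as follows (the precise calmness condition on the radius function is our reading of the abstract, not a quotation from the talk): a pair $(x^*, y^*)$ is a local minimax point of $f$ if there exist $\delta_0 > 0$ and a radius function $h : (0,\delta_0] \to (0,\infty)$ with $h(\delta) \to 0$ as $\delta \to 0$ such that, for every $\delta \in (0,\delta_0]$ and every $(x,y)$ with $\|x - x^*\| \le \delta$ and $\|y - y^*\| \le \delta$,
$$ f(x^*, y) \;\le\; f(x^*, y^*) \;\le\; \max_{\|y' - y^*\| \le h(\delta)} f(x, y'). $$
A calm local minimax point would additionally require the radius function to be calm at $0$, e.g. $h(\delta) \le \kappa\,\delta$ for some constant $\kappa > 0$.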

2. 14:30 — On desingularizing functions in convex programming

Over the past years, the Kurdyka-Łojasiewicz (KL) property has become increasingly common in the optimisation area. It is used as a tool to guarantee that every sequence generated by many different descent algorithms has finite length. Verifying the KL property requires finding a desingularizing function, which is usually considered difficult. In light of this, we provide a desingularizing function that can be regarded as the smallest among all potential desingularizing functions. We investigate continuity, Lipschitz continuity, differentiability and subdifferentiability properties of this function, as well as different characterizations of the so-called Kurdyka-Łojasiewicz-Hoffman constant, by introducing a new class of functions with nonsmooth moderate behaviour.
Based on these attributes, we provide a convergence analysis and an estimate of the convergence rate of an inexact descent method for convex differentiable functions that covers a wide class of gradient descent methods. We also study the complexity properties of this method and their relationship with $\lambda$-one-dimensional worst-case proximal sequences.
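
As a reminder of the underlying definition (standard in the literature, not specific to this talk): a proper lower semicontinuous function $f$ has the KL property at $\bar x$ if there exist $\eta > 0$, a neighbourhood $U$ of $\bar x$, and a continuous concave desingularizing function $\varphi : [0,\eta) \to [0,\infty)$ with $\varphi(0) = 0$, $\varphi$ continuously differentiable with $\varphi' > 0$ on $(0,\eta)$, such that
$$ \varphi'\bigl(f(x) - f(\bar x)\bigr)\,\operatorname{dist}\bigl(0, \partial f(x)\bigr) \;\ge\; 1 \qquad \text{for all } x \in U \text{ with } f(\bar x) < f(x) < f(\bar x) + \eta. $$
The talk concerns the smallest such $\varphi$ in the convex setting.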