14:00 — Quasidensity and certain classes of multifunctions

Let $E$ be a nonzero real Banach space.
{\em Quasidensity} is a concept that can be applied to subsets of $E \times E^*$ (or equivalently to multifunctions from $E$ into $E^*$).
Every closed quasidense monotone set is {\em maximally monotone}, but there exist maximally monotone sets that are not quasidense.
The subdifferential of a proper, convex, lower semicontinuous function on $E$ is quasidense. The subdifferentials of certain {\em nonconvex} functions are also quasidense. (The latter result comes from joint work with Xianfu Wang.)
The closed monotone quasidense sets satisfy a {\em sum theorem} and a {\em dual sum theorem}.
A maximally monotone multifunction is quasidense if, and only if, it is of type (NI). We will explore the relationship with the Fitzpatrick extension and other subclasses of the maximally monotone multifunctions as time permits.
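For reference, and assuming it matches the formulation used in Simons' papers on this topic, a nonempty set $A \subseteq E \times E^*$ is {\em quasidense} when every point of $E \times E^*$ can be approximated by points of $A$ in the following quantitative sense:
\[
\inf_{(s,s^*)\in A}\Bigl[\tfrac{1}{2}\|s-x\|^2+\tfrac{1}{2}\|s^*-x^*\|^2+\langle s-x,\,s^*-x^*\rangle\Bigr]\le 0
\qquad\text{for all }(x,x^*)\in E\times E^*.
\]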

14:30 — Kurdyka-Łojasiewicz exponent for a class of Hadamard-difference-parameterized models

In this talk, we consider a class of $\ell_1$-regularized optimization problems and the associated smooth “over-parameterized” optimization problems built upon the Hadamard difference parametrization (HDP). We show that second-order stationary points of the HDP-based model correspond to stationary points of the corresponding $\ell_1$-regularized model. More importantly, we show that the Kurdyka-Łojasiewicz (KL) exponent of the HDP-based model at a second-order stationary point can be inferred from that of the corresponding $\ell_1$-regularized model under suitable assumptions. Our assumptions are general enough to cover a wide variety of loss functions commonly used in $\ell_1$-regularized models, such as the least-squares loss and the logistic loss. We also discuss how these KL exponents help deduce the local convergence rate of a standard gradient method for minimizing the HDP-based model.
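As a sketch of the construction (our reading of the HDP idea; the exact model in the talk may differ), the $\ell_1$-regularized problem and its smooth over-parameterized counterpart can be written as follows, where $\odot$ denotes the entrywise (Hadamard) product:
\[
\min_{x\in\mathbb{R}^n}\; f(x)+\lambda\|x\|_1
\qquad\leadsto\qquad
\min_{u,v\in\mathbb{R}^n}\; f(u\odot u-v\odot v)+\lambda\bigl(\|u\|_2^2+\|v\|_2^2\bigr).
\]
The two problems have the same optimal value because $\|u\|_2^2+\|v\|_2^2\ge\sum_i|u_i^2-v_i^2|=\|u\odot u-v\odot v\|_1$, with equality when $u_iv_i=0$ for every $i$.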

15:00 — On second-order variational analysis of prox-regular functions

In order to characterize certain properties of prox-regular functions, the usual generalized derivatives of variational analysis can be applied to the subdifferential mapping only in the presence of subdifferential continuity. In this talk we demonstrate how the definitions of these derivatives can be modified so that they remain useful for characterizing (strong) variational convexity and tilt stability without subdifferential continuity. Further, we present a new characterization of strictly twice epi-differentiable functions that sheds more light on the local behavior of the subdifferential mapping of such functions. As a consequence of this characterization, we show that for strictly twice epi-differentiable functions metric regularity is equivalent to strong metric regularity. Our approach is mainly based on first-order theory of the subdifferential, but we also present the relation to Rockafellar's quadratic bundle, which consists of epigraphical limits of second subderivatives. Finally, for twice epi-differentiable functions, we give a relation between function values and subgradients that constitutes a special kind of second-order approximation. This result differs from others in the literature in that it gives a second-order expansion for the function values themselves rather than for the Moreau envelope.
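For orientation, we recall the standard definition from Rockafellar and Wets' {\em Variational Analysis}, on the assumption that it matches the notion used in the talk: the second subderivative of $f$ at $\bar x$ for $\bar v$ is
\[
d^2 f(\bar x\mid\bar v)(w)\;=\;\liminf_{\substack{t\downarrow 0\\ w'\to w}}\;\frac{f(\bar x+tw')-f(\bar x)-t\langle\bar v,w'\rangle}{\tfrac{1}{2}t^2},
\]
and $f$ is twice epi-differentiable at $\bar x$ for $\bar v$ when the corresponding second-order difference quotients epi-converge as $t\downarrow 0$; roughly, the strict version asks for epi-convergence also as the base points $(x,v)\in\operatorname{gph}\partial f$ tend to $(\bar x,\bar v)$.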

15:30 — Stability of nonsmooth optimization problems

We present applications of implicit function theorems from variational analysis to the stability analysis of prominent optimization problems, with a focus on regularized least squares, including LASSO and nuclear norm minimization. These include results on well-posedness, Lipschitz stability, and smoothness of the optimal solution function. The quantitative results are illustrated by numerical computations.
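For concreteness (the exact setting of the talk may differ), the two model problems mentioned above are typically formulated as
\[
\min_{x\in\mathbb{R}^n}\;\tfrac{1}{2}\|Ax-b\|_2^2+\lambda\|x\|_1
\qquad\text{and}\qquad
\min_{X\in\mathbb{R}^{m\times n}}\;\tfrac{1}{2}\|\mathcal{A}(X)-b\|_2^2+\lambda\|X\|_*,
\]
where $\lambda>0$, $\mathcal{A}$ is a linear measurement operator, and $\|\cdot\|_*$ denotes the nuclear norm (the sum of singular values); stability analysis then asks how the optimal solution varies with the data $(A,b,\lambda)$.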