### 1 — 14:00 — Commutation principles for nonsmooth variational problems on Euclidean Jordan algebras

The commutation principle proved by Ramírez, Seeger, and Sossa (SIAM J. Optim. 23:687–694, 2013) in the setting of Euclidean Jordan algebras says that for a Fréchet differentiable function $\Theta$ and a spectral function $F$, any local minimizer or maximizer $a$ of $\Theta + F$ over a spectral set $E$ operator commutes with the gradient of $\Theta$ at $a$. In this paper, we improve this commutation principle by allowing $\Theta$ to be nonsmooth, under mild regularity assumptions. For example, in the case of a local minimizer, we show that $a$ operator commutes with some element of the limiting (Mordukhovich) subdifferential of $\Theta$ at $a$, provided that $\Theta$ is subdifferentially regular at $a$ and satisfies a qualification condition. In the case of a local maximizer, we prove that $a$ operator commutes with each element of the (Fenchel) subdifferential of $\Theta$ at $a$ whenever this subdifferential is nonempty. As an application, we characterize the local optimizers of shifted strictly convex spectral functions and norms over automorphism-invariant sets.
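For orientation, the smooth commutation principle summarized above can be stated in operator form. Writing $L(x)$ for the Jordan multiplication operator, a sketch of the conclusion (notation is ours, not the speakers') reads:

```latex
% Smooth commutation principle: if a is a local minimizer or maximizer
% of Theta + F over the spectral set E, with Theta Fréchet differentiable,
% then a operator commutes with the gradient of Theta at a:
L(a)\, L\bigl(\nabla\Theta(a)\bigr) \;=\; L\bigl(\nabla\Theta(a)\bigr)\, L(a),
\qquad L(x)y := x \circ y .
```

In the nonsmooth results of the talk, $\nabla\Theta(a)$ is replaced by some element of the limiting subdifferential $\partial_M \Theta(a)$ (minimizers) or by every element of the Fenchel subdifferential $\partial \Theta(a)$ (maximizers).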

### 2 — 14:30 — An efficient active-set method with applications to sparse approximations and risk minimization

In this talk we present an efficient active-set method for the solution of convex quadratic programming problems with general piecewise-linear terms in the objective, with applications to sparse approximations and risk minimization. The algorithm is derived by combining a proximal method of multipliers (PMM) with a standard semismooth Newton method (SSN), and is shown to be globally convergent under minimal assumptions. Furthermore, local linear (and potentially superlinear) convergence is shown under standard additional conditions. The major computational bottleneck of the proposed approach arises from the solution of the associated SSN linear systems. These are solved using a Krylov-subspace method, accelerated by certain novel general-purpose preconditioners which are shown to be optimal with respect to the proximal penalty parameters. The preconditioners are easy to store and invert, since they exploit the structure of the nonsmooth terms appearing in the problem's objective to significantly reduce their memory requirements. We showcase the efficiency, robustness, and scalability of the proposed solver on a variety of problems arising in risk-averse portfolio selection, $L^1$-regularized partial differential equation constrained optimization, quantile regression, and binary classification via linear support vector machines. We provide computational evidence to demonstrate, on real-world datasets, the ability of the solver to efficiently and competitively handle a diverse set of medium- and large-scale instances.
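To illustrate the SSN building block in isolation (this is a toy sketch, not the speakers' PMM–SSN solver: the $\ell_1$-regularized least-squares objective, step size, and prox-gradient safeguard are our own assumptions), one can run semismooth Newton on the natural prox-gradient residual, whose generalized Jacobian is diagonal-plus-structured exactly because the nonsmooth term is piecewise linear:

```python
import numpy as np

def soft(u, k):
    """Soft-thresholding: the proximal map of k*||.||_1."""
    return np.sign(u) * np.maximum(np.abs(u) - k, 0.0)

def residual(x, AtA, Atb, t, tau):
    """Natural (prox-gradient) residual; x is optimal iff it vanishes."""
    u = x - t * (AtA @ x - Atb)
    return x - soft(u, t * tau), u

def ssn_lasso(A, b, tau, iters=50, tol=1e-10):
    """Semismooth Newton for min 0.5*||Ax-b||^2 + tau*||x||_1,
    safeguarded by prox-gradient fallback steps (a toy model only)."""
    n = A.shape[1]
    t = 1.0 / np.linalg.norm(A, 2) ** 2          # step size 1/L
    AtA, Atb = A.T @ A, A.T @ b
    x = np.zeros(n)
    for _ in range(iters):
        F, u = residual(x, AtA, Atb, t, tau)
        if np.linalg.norm(F) < tol:
            break
        D = (np.abs(u) > t * tau).astype(float)  # element of the Clarke Jacobian of soft
        J = np.eye(n) - D[:, None] * (np.eye(n) - t * AtA)
        x_new = x - np.linalg.solve(J, F)
        if np.linalg.norm(residual(x_new, AtA, Atb, t, tau)[0]) >= np.linalg.norm(F):
            x_new = soft(u, t * tau)             # fall back to a prox-gradient step
        x = x_new
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
x = ssn_lasso(A, b, tau=0.1)
```

Because the residual here is piecewise affine, Newton terminates once the correct "active piece" is identified; the talk's solver additionally wraps such SSN subproblems inside a PMM loop and replaces the direct solve by a preconditioned Krylov method.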

### 3 — 15:00 — Douglas-Rachford is the best projection method

We prove that the Douglas-Rachford method applied to two closed convex cones in the Euclidean plane converges in finitely many steps if and only if the set of fixed points of the Douglas-Rachford operator is nontrivial. We analyze this special case using circle dynamics.
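As a small illustrative sketch of the setting above (the cone directions, starting point, and iteration count are arbitrary choices of ours, not from the talk), one can run the Douglas-Rachford iteration on two rays through the origin of the plane and observe the shadow sequence $P_A x_k$ landing in the intersection $\{0\}$:

```python
import numpy as np

def proj_ray(x, d):
    """Project x onto the ray {t*d : t >= 0} (a closed convex cone); d a unit vector."""
    return max(0.0, float(x @ d)) * d

def dr_step(x, dA, dB):
    """One Douglas-Rachford step: x + P_B(2 P_A x - x) - P_A x."""
    pa = proj_ray(x, dA)
    pb = proj_ray(2.0 * pa - x, dB)
    return x + pb - pa

# Two rays through the origin, 60 degrees apart; their intersection is {0}.
dA = np.array([1.0, 0.0])
dB = np.array([np.cos(np.pi / 3), np.sin(np.pi / 3)])

x = np.array([2.0, 1.0])
for _ in range(200):
    x = dr_step(x, dA, dB)

shadow = proj_ray(x, dA)   # the "shadow" iterate lies in the intersection
```

For this particular pair of rays the governing sequence reaches a fixed point of the DR operator after finitely many steps, consistent with the characterization stated above.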

We also construct explicit examples for a broad family of projection methods for which the set of fixed points of the relevant projection method operator is nontrivial, but the convergence is not finite. This three-parameter family is well known in the projection method literature and includes both the Douglas-Rachford method and the classic method of alternating projections.

Even though our setting is fairly elementary, this work contributes in a new way to the body of theoretical research justifying the superior performance of the Douglas-Rachford method compared to other techniques. Moreover, our result leads to a neat sufficient condition for finite convergence of the Douglas-Rachford method in the locally polyhedral case on the plane, unifying and expanding several special cases available in the literature.

### 4 — 15:30 — A new qualification condition for Lipschitzian optimization problems

We consider an inequality constrained optimization problem defined by possibly nonsmooth but Lipschitz continuous functions. Often, standard constraint qualifications may fail at local minimizers of the problem. It is known that corresponding approximate KKT conditions still hold in such situations without requiring a constraint qualification to be fulfilled. Therefore, it is of interest to find weak conditions which guarantee that a point satisfying the approximate KKT conditions fulfills the usual KKT conditions. In this talk, we present a condition, which is not a constraint qualification, that possesses the aforementioned property and can be satisfied by a wide range of Lipschitzian optimization problems. A comparison with existing constraint qualifications reveals that the new condition is implied by quasinormality at local minimizers. Relations to further constraint qualifications will be discussed as well.
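For orientation, one common form of the approximate KKT conditions is sketched below for smooth data $f, g_1, \dots, g_m$ (in the Lipschitz setting of the talk, gradients are replaced by suitable subgradients): there exist $x^k \to x^*$ and multipliers $\lambda^k \ge 0$ such that

```latex
\nabla f(x^k) + \sum_{i=1}^{m} \lambda_i^k \,\nabla g_i(x^k) \longrightarrow 0,
\qquad
\min\bigl\{-g_i(x^k),\, \lambda_i^k\bigr\} \longrightarrow 0
\quad (i = 1, \dots, m).
```

Unlike the exact KKT conditions, these hold at every local minimizer with no constraint qualification; the talk's condition is what closes the gap back to exact KKT.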