1 — 14:00 — On tight error bounds for conic optimisation
Conic-linear programming allows researchers to tackle many important problems. Popular programs such as Mosek, CVX, and Hypatia solve problems with conic-linear methods. Such software packages provide users with so-called "backward error bounds," which may not be useful accuracy guarantees. I will describe the framework we built for obtaining useful "forward" bounds, and explain how we have used it to find such bounds for classes of next-generation cones in Mosek and Hypatia. I will also discuss how such bounds admit convergence rate guarantees for projection methods.
2 — 14:30 — On particularly awesome cones
Awesome cones were first introduced by Scott Lindstrom in 2023, with closed convex cones known to be particularly awesome.
This talk focuses on different ways to describe and capture the awesomeness (or lack thereof) of closed convex cones, including such properties as facial exposure, facial dual completeness, amenability and projectional exposure. In Euclidean spaces of dimensions four, five and higher, some of the common geometric insights that guide our understanding of three-dimensional objects become increasingly difficult to rely on as the dimension grows, while phenomena that are impossible in three dimensions emerge in the facial structure of higher-dimensional cones.
These difficulties call for new tools that can help us understand, study and construct closed convex cones with peculiar facial structure in higher-dimensional spaces. For instance, the facial structure of these cones can be modified in subtle ways that preserve important properties while simplifying the analysis; the facial structure of four-dimensional cones can be studied using three-dimensional affine cuts; and the facial structure of five-dimensional cones can be handled using a combination of four-dimensional cuts and appropriate mappings of the three-dimensional boundaries of these cuts to three-dimensional Euclidean space.
3 — 15:00 — Proximal splitting methods avoid strict saddle points of weakly convex problems
For nonconvex problems, critical points need not be local minimizers. In practice, commonly used methods have been observed to find good-quality critical points, often close to optimal. For well-structured nonconvex problems, first-order methods and the proximal point algorithm can avoid strict saddle points with high probability. In this talk, we present how proximal splitting methods, such as Douglas-Rachford and other proximal-type splitting methods, inherit this property. More specifically, we discuss how proximal splitting methods converge to local minimizers of semialgebraic weakly convex optimization problems when randomly initialized.
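To make the iteration concrete, the following is a minimal sketch of Douglas-Rachford splitting on a toy convex problem (minimize |x| + (x - 3)^2/2, whose minimizer is x* = 2); the problem and step size are illustrative choices, not the weakly convex setting analyzed in the talk.

```python
# Minimal Douglas-Rachford sketch for minimize f(x) + g(x) with
#   f(x) = |x|   and   g(x) = (x - 3)^2 / 2  (toy example, step size 1).
import math

def prox_f(z):
    # Prox of |x| with step 1: soft thresholding by 1.
    return math.copysign(max(abs(z) - 1.0, 0.0), z)

def prox_g(z):
    # Prox of (x - 3)^2 / 2 with step 1: argmin_x (x-3)^2/2 + (x-z)^2/2.
    return (z + 3.0) / 2.0

z = 0.0                          # arbitrary starting point
for _ in range(60):
    x = prox_f(z)                # first prox step
    y = prox_g(2.0 * x - z)      # prox at the reflected point
    z = z + y - x                # Douglas-Rachford update

# The shadow sequence x converges to the minimizer x* = 2.
```

Here the prox operators are available in closed form; in the talk's setting the same scheme is applied to weakly convex pieces, where the prox is still well defined for small enough step sizes.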
4 — 15:30 — Solving large-scale basis pursuit by infeasible alternating projections
Basis pursuit is the problem of finding a vector with smallest 1-norm among the solutions of a given linear system of equations. It is a famous convex relaxation of what the literature refers to as the sparse affine feasibility (SAF) problem, in which sparse solutions to underdetermined systems are sought. Problem SAF is a fundamental problem in Compressed Sensing (CS), which is a technique for recovering sparse signals from incomplete measurements. While SAF is known to be NP-hard, there are instances where the solutions of (BP) and SAF coincide. The importance of basis pursuit has led to a great deal of research devoted to the development of efficient methods for solving it, mainly for large-scale problems. There are several methods for solving (BP). One can, for instance, recast it as a linear program (LP) and apply any standard LP solver to the reformulation.
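The LP reformulation mentioned above can be sketched as follows: writing x = u - v with u, v >= 0 turns min ||x||_1 s.t. Ax = b into an LP in (u, v). The data below are a random illustrative instance, not from the talk, and the solver is SciPy's generic `linprog`.

```python
# Basis pursuit  min ||x||_1  s.t.  Ax = b,  recast as the LP
#   min 1^T(u + v)  s.t.  A(u - v) = b,  u, v >= 0,   with x = u - v.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 10, 30                        # underdetermined system: m < n
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:3] = [1.0, -2.0, 0.5]        # a sparse ground truth
b = A @ x_true

c = np.ones(2 * n)                   # objective 1^T(u + v)
A_eq = np.hstack([A, -A])            # equality constraint A(u - v) = b
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x = res.x[:n] - res.x[n:]            # recover x = u - v
```

The doubled variable count is the usual price of this reformulation, which is one motivation for methods that attack (BP) directly.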
In turn, we tackle basis pursuit in its original form with a scheme that uses alternating projections in its subproblems. These subproblems are designed to be inconsistent, in the sense that they involve two non-touching sets. Quite recently, inconsistency arising from infeasibility has been shown to work in favor of alternating projections and the corresponding convergence rates. In our work, this feature is suitably enforced in a new and numerically competitive method for solving basis pursuit.
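The inconsistent regime can be illustrated with a toy example of alternating projections between two non-touching convex sets; the two disjoint balls below are an assumption for illustration only, not the subproblems of the method described in the talk.

```python
# Alternating projections between two DISJOINT balls in R^2:
# B((0,0), 1) and B((3,0), 1). Since the sets do not touch, the
# iterates converge to a pair of nearest points, here (1,0) and (2,0).
import numpy as np

def proj_ball(x, center, radius):
    """Euclidean projection onto the ball B(center, radius)."""
    d = x - center
    nd = np.linalg.norm(d)
    return x if nd <= radius else center + radius * d / nd

c1, c2, r = np.array([0.0, 0.0]), np.array([3.0, 0.0]), 1.0
x = np.array([0.0, 2.0])             # arbitrary starting point
for _ in range(100):
    y = proj_ball(x, c1, r)          # project onto the first set
    x = proj_ball(y, c2, r)          # then onto the second set

# The displacement x - y approaches the gap vector between the sets.
```

In the consistent case the same iteration converges to a common point; with non-touching sets it instead identifies the nearest-point pair, which is the behavior exploited for the convergence rates mentioned above.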