1 — 14:00 — Exploring the Spectral Proximal Gradient Method and Applications
The spectral proximal gradient method is an accelerated variant of the proximal gradient algorithm that leverages second-order information through the Barzilai-Borwein-Raydan stepsize. This talk gives a concise overview of its historical evolution, theoretical underpinnings, and key convergence results. We also illustrate its behavior in applications spanning several domains, including projection onto the intersection of convex sets, low-rank plus sparse matrix decomposition, $l_0$-regularized machine learning, and full-waveform inversion in geophysics.
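The core iteration the abstract refers to can be sketched in a few lines. Below is a minimal, illustrative proximal gradient loop with a Barzilai-Borwein (BB1) spectral stepsize, applied to an $l_1$-regularized least-squares problem; the problem choice, function names, and the fallback stepsize are assumptions for illustration, not the speaker's implementation (which also uses nonmonotone safeguards not shown here).

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def spectral_prox_grad(A, b, lam, iters=200):
    """Proximal gradient with a Barzilai-Borwein (spectral) stepsize for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1  (illustrative sketch)."""
    x = np.zeros(A.shape[1])
    grad = A.T @ (A @ x - b)
    alpha = 1.0  # initial stepsize
    for _ in range(iters):
        x_new = soft_threshold(x - alpha * grad, alpha * lam)
        grad_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, grad_new - grad
        sy = s @ y
        # BB1 stepsize: cheap spectral (second-order-like) information.
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0
        x, grad = x_new, grad_new
    return x
```

The BB stepsize approximates curvature from successive gradients at the cost of two inner products, which is what makes the method "spectral" while keeping the per-iteration cost of plain proximal gradient.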
2 — 14:30 — Regularized methods via cubic model subspace minimization for nonconvex optimization
Adaptive cubic regularization methods for nonconvex problems require the efficient computation of the trial step, which involves minimizing a cubic model. We present a new approach in which this model is minimized over a low-dimensional subspace that, in contrast to classical approaches, is reused for a number of iterations. Whenever the trial step produced by the low-dimensional minimization is unsatisfactory, we employ a regularized Newton step whose regularization parameter is a by-product of the model minimization over the subspace. We show that the worst-case complexity of classical cubic regularization methods is preserved despite the possible regularized Newton steps. We focus on the large class of problems for which (sparse) direct linear system solvers are available, and we present experimental results showing very large gains over standard implementations of adaptive cubic regularization methods based on direct linear solvers.
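To fix ideas, the subspace idea can be illustrated in its simplest form: minimizing the cubic model $m(s) = g^\top s + \tfrac12 s^\top H s + \tfrac{\sigma}{3}\|s\|^3$ over a one-dimensional subspace spanned by the negative gradient, where the restricted problem reduces to a scalar quadratic equation. This is only a toy instance of the construction the talk describes (which reuses richer subspaces across iterations); all names below are illustrative.

```python
import numpy as np

def cubic_step_on_gradient_line(g, H, sigma):
    """Minimize m(s) = g^T s + 0.5 s^T H s + (sigma/3)||s||^3 over the
    one-dimensional subspace s = -t*g, t >= 0 (illustrative sketch).
    Along this line, phi(t) = m(-t*g) has derivative
    phi'(t) = sigma*||g||^3 * t^2 + (g^T H g) * t - ||g||^2."""
    gn = np.linalg.norm(g)
    c = g @ H @ g          # curvature along -g
    a = sigma * gn**3      # coefficient of t^2 in phi'(t)
    # phi'(t) = 0 is a scalar quadratic; take its nonnegative root.
    t = (-c + np.sqrt(c * c + 4.0 * a * gn**2)) / (2.0 * a)
    return -t * g
```

In a k-dimensional subspace the same reduction yields a k-dimensional cubic subproblem, which is cheap when k is small; reusing the subspace amortizes the cost of building it over several iterations.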
3 — 15:00 — Scalable interior point methods for large-scale discrete optimal transport problems
In this talk, we address the solution of optimal transport problems using interior point methods (IPMs). We propose two approaches that exploit the expected sparsity of the optimal solution to enhance the efficiency of the numerical linear algebra required to solve the Newton system.
The first approach deals with classical optimal transport over dense bipartite graphs, enforcing sparsity of the intermediate approximations by means of a column-generation-like approach. Numerical experiments show that this method scales to very large problems.
The second approach deals with optimal transport over sparse graphs, using a proximal stabilized interior point method. The induced primal-dual regularization allows the use of sparsified versions of the normal equations to generate IPM search directions inexpensively. Despite a potentially high level of inexactness in the Newton direction, the proposed method retains the polynomial complexity of standard IPMs for suitable choices of the sparsification parameters. Numerical results show that this approach is efficient and robust for large-scale problems and can outperform classical graph algorithms such as network simplex and cost scaling.
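The interplay between regularization and sparsification can be sketched as follows: the dual IPM direction solves the normal equations $(A \Theta A^\top + \delta I)\,\Delta y = r$, and entries of the diagonal scaling $\Theta$ below a threshold are dropped before the system is formed, with the regularization $\delta I$ keeping it nonsingular. This is a minimal dense sketch under assumed names and a naive drop rule; the talk's actual sparsification criterion and proximal stabilization are not reproduced here.

```python
import numpy as np

def sparsified_normal_direction(A, theta, rhs, drop_tol, delta=1e-8):
    """Solve the sparsified, regularized normal equations
    (A diag(theta_hat) A^T + delta*I) dy = rhs, where theta_hat drops
    entries of theta below drop_tol (illustrative sketch)."""
    theta_hat = np.where(theta >= drop_tol, theta, 0.0)  # drop tiny scalings
    # (A * theta_hat) scales the columns of A; delta*I is the
    # primal-dual regularization that keeps the matrix nonsingular.
    M = (A * theta_hat) @ A.T + delta * np.eye(A.shape[0])
    return np.linalg.solve(M, rhs)
```

Near optimality many entries of $\Theta$ become tiny, so the dropped system can be far sparser than the exact one; the resulting inexactness in the direction is what the method's complexity analysis must absorb.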
4 — 15:30 — Recent advances in preconditioners for interior point methods
We address challenges related to the application of iterative schemes when solving KKT systems arising in interior point methods. We demonstrate that certain undesirable features of these systems can be exploited to design preconditioners for iterative methods. Recent advances, including new theoretical insights and new computational results, will be presented.