1 — 14:00 — Saddle-point Frank-Wolfe methods for Linear Programming
We propose two saddle-point Frank-Wolfe methods called FWLP and FWLP-P to solve general LP instances. Convergence of FWLP-P is proved, and a convergence rate of
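For context, a generic saddle-point setup for a standard-form LP (the specific FWLP formulation is not stated in this abstract): the problem $\min \{ c^\top x : Ax = b,\ x \ge 0 \}$ can be rewritten via its Lagrangian as

```latex
\min_{x \ge 0} \; \max_{y} \; c^\top x + y^\top (b - Ax)
```

Saddle-point first-order methods of this type typically alternate a step in the primal variable $x$ (for Frank-Wolfe variants, a linear-minimization step over the feasible region) with a gradient-type step in the dual variable $y$.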
2 — 14:30 — The Generalized Multiplicative Gradient Method for a Class of Convex Symmetric Cone Optimization Problems
We consider a class of convex symmetric cone optimization problems, where the objective function is a convex log-homogeneous function that satisfies "log-gradient convexity". Applications of this class of problems include inference of multivariate Hawkes processes, D-optimal design, quantum state tomography, and Nesterov's reformulated convex relaxation of Boolean quadratic problems. We develop a generalized multiplicative gradient method for this class of problems and show that it converges with rate O(1/k). We demonstrate that for all the aforementioned applications, the computational complexity of our method is superior or comparable to that of other applicable first-order methods.
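As a concrete illustration of a multiplicative-gradient update, here is a minimal sketch of the classical multiplicative algorithm for D-optimal design (one of the listed applications); this is an illustrative instance of the update style, not necessarily the generalized method from the talk, and all names are hypothetical:

```python
import numpy as np

def d_optimal_mg(A, iters=300):
    """Multiplicative-gradient sketch for D-optimal design (illustrative).

    A: (n, m) array whose rows are the design points a_i.
    Maximizes log det(sum_i x_i a_i a_i^T) over the probability simplex.
    """
    n, m = A.shape
    x = np.full(n, 1.0 / n)                       # start at the uniform design
    for _ in range(iters):
        M = A.T @ (x[:, None] * A)                # information matrix sum_i x_i a_i a_i^T
        Minv = np.linalg.inv(M)
        g = np.einsum("ij,jk,ik->i", A, Minv, A)  # g_i = a_i^T M^{-1} a_i >= 0
        x = x * g / m                              # multiplicative update; sum_i x_i g_i = m,
                                                   # so x stays on the simplex
    return x
```

The update multiplies each weight by its (nonnegative) gradient component and renormalizes by the dimension m, so nonnegativity and the simplex constraint are preserved automatically, with no projection step.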
3 — 15:00 — Adaptive Open-Loop Step-Sizes for Accelerated Frank-Wolfe Convergence
We extend acceleration results from fixed-
4 — 15:30 — Affine-invariant convergence of Frank-Wolfe algorithms on polytopes
We provide a unified analysis of the following popular versions of the Frank-Wolfe method on polytopes: vanilla Frank-Wolfe, Frank-Wolfe with away steps, Frank-Wolfe with blended pairwise steps, and Frank-Wolfe with in-face directions. For each of these algorithms, we establish convergence rates ranging from sublinear to linear. Our approach is entirely affine-invariant, as it relies on suitable error-bound and curvature properties that depend solely on the objective function and the polytope, and not on any affine-dependent objects such as norms. We highlight the interesting geometric interpretation of the error-bound properties underlying our main developments.
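The vanilla variant in this lineup fits in a few lines; here is a minimal sketch on the probability simplex with the standard open-loop step size gamma_k = 2/(k+2) (function names and the test problem are illustrative, not from the talk):

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, iters=200):
    """Vanilla Frank-Wolfe over the probability simplex (illustrative sketch).

    grad: callable returning the gradient of the objective at x.
    On the simplex, the linear minimization oracle argmin_{s} <grad, s>
    is simply the vertex (standard basis vector) of the smallest
    gradient coordinate.
    """
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0            # linear minimization oracle
        x += 2.0 / (k + 2) * (s - x)     # open-loop step; x stays feasible
    return x
```

Because each iterate is a convex combination of feasible points, no projection is ever needed; the away-step, blended pairwise, and in-face variants analyzed in the talk modify only the choice of direction, not this basic template.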