1 — 14:00 — Saddle-point Frank-Wolfe methods for Linear Programming
We propose two saddle-point Frank-Wolfe methods, called FWLP and FWLP-P, to solve general LP instances. Convergence of FWLP-P is proved, and a convergence rate of $O(k^{-1/2})$ is established. FWLP is simpler than FWLP-P and more efficient in practice, but no convergence proof is known for it. The methods have an advantage over other first-order methods for LP in that they do not require access to the entire constraint matrix at each iteration.
2 — 14:30 — The Generalized Multiplicative Gradient Method for a Class of Convex Symmetric Cone Optimization Problems
We consider a class of convex symmetric cone optimization problems, where the objective function is a convex log-homogeneous function that satisfies "log-gradient convexity". Applications of this class of problems include inference of multivariate Hawkes processes, D-optimal design, quantum state tomography, and Nesterov's reformulated convex relaxation of Boolean quadratic problems. We develop a generalized multiplicative gradient method for this class of problems, and show that it converges with rate $O(1/k)$. We demonstrate that for all the aforementioned applications, the computational complexity of our method is superior or comparable to that of other applicable first-order methods.
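For intuition, here is a minimal sketch of the classical multiplicative gradient update for D-optimal design, one of the listed applications (the talk's generalized method extends this scheme; the points and starting weights below are illustrative). With design points $a_i \in \mathbb{R}^d$ and weights $\lambda$ on the simplex, one maximizes $\log\det M(\lambda)$ with $M(\lambda) = \sum_i \lambda_i a_i a_i^T$ via $\lambda_i \leftarrow \lambda_i \, (a_i^T M^{-1} a_i)/d$, which stays on the simplex since $\sum_i \lambda_i a_i^T M^{-1} a_i = \mathrm{tr}(M^{-1}M) = d$.

```python
def inv2(M):
    """Inverse of a symmetric 2x2 matrix [[a, b], [b, c]]."""
    (a, b), (_, c) = M
    det = a * c - b * b
    return [[c / det, -b / det], [-b / det, a / det]]

def quad(Minv, a):
    """Quadratic form a^T Minv a for a 2-vector a."""
    return (a[0] * (Minv[0][0] * a[0] + Minv[0][1] * a[1])
            + a[1] * (Minv[1][0] * a[0] + Minv[1][1] * a[1]))

def multiplicative_gradient(points, lam, iters=500):
    """Classical multiplicative gradient iteration for D-optimal design."""
    d = 2
    for _ in range(iters):
        # Information matrix M(lam) = sum_k lam_k a_k a_k^T
        M = [[sum(l * a[i] * a[j] for l, a in zip(lam, points))
              for j in range(d)] for i in range(d)]
        Minv = inv2(M)
        # Multiplicative update; the new weights sum to 1 automatically.
        lam = [l * quad(Minv, a) / d for l, a in zip(lam, points)]
    return lam

points = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
lam = multiplicative_gradient(points, [0.5, 0.3, 0.2])
# For these three points the optimal design is uniform, lam = (1/3, 1/3, 1/3).
```

Each iteration costs one $d \times d$ matrix inversion plus $n$ quadratic forms, which is what makes the method attractive when $n$ is large.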
3 — 15:00 — Adaptive Open-Loop Step-Sizes for Accelerated Frank-Wolfe Convergence
We extend acceleration results from fixed-$\ell$ open-loop step-sizes, $\ell\in\mathbb{N}$, characterized by $\eta_t = \frac{\ell}{t+\ell}$, to adaptive open-loop step-sizes $\eta_t = \frac{g(t)}{t+g(t)}$, with $g\colon \mathbb{N} \to \mathbb{R}_{\geq 0}$ a nondecreasing function that fulfills several well-founded assumptions. This development inspires the adoption of the log-adaptive step-size, derived by setting $g(t) = 2 + \log(t+1)$, which achieves convergence rates comparable to or faster than those of any fixed-$\ell$ step-size, up to polylogarithmic factors. In the prevalent strong growth setting, the log-adaptive step-size outperforms any fixed-$\ell$ step-size and can admit arbitrarily fast sublinear convergence rates.
4 — 15:30 — Affine-invariant convergence of Frank-Wolfe algorithms on polytopes
We provide a unified analysis of the following popular versions of the Frank-Wolfe method on polytopes: vanilla Frank-Wolfe, Frank-Wolfe with away steps, Frank-Wolfe with blended pairwise steps, and Frank-Wolfe with in-face directions. For each of these algorithms we establish convergence rates ranging from sublinear to linear. Our approach is entirely affine-invariant, as it relies on suitable error-bound and curvature properties that depend solely on the objective function and the polytope, and not on any affine-dependent objects such as norms. We highlight the interesting geometric interpretation of the error-bound properties underlying our main developments.
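To fix ideas, a minimal sketch of vanilla Frank-Wolfe on one concrete polytope, the probability simplex, not taken from the talk: the vertices are the standard basis vectors, so the linear minimization oracle just picks the coordinate with the smallest gradient entry. The objective $f(x) = \|x - y\|^2$ with $y$ in the simplex is illustrative, chosen so the minimizer is $x = y$ with $f^* = 0$.

```python
def frank_wolfe_simplex(y, iters=10000):
    """Vanilla Frank-Wolfe for min ||x - y||^2 over the probability simplex."""
    n = len(y)
    x = [1.0 / n] * n                     # start at the barycenter
    for t in range(iters):
        grad = [2.0 * (xi - yi) for xi, yi in zip(x, y)]
        # LMO over the simplex: the minimizing vertex is the basis vector
        # e_i with the smallest gradient coordinate.
        i_star = min(range(n), key=lambda i: grad[i])
        eta = 2.0 / (t + 2)               # classic open-loop step-size
        x = [(1 - eta) * xj for xj in x]  # convex combination stays feasible
        x[i_star] += eta
    return x

x = frank_wolfe_simplex([0.2, 0.3, 0.5])
# x converges to y at the standard O(1/k) sublinear rate.
```

The away-step, blended pairwise, and in-face variants analyzed in the talk modify how the direction is chosen from the active vertices, but share this vertex-oracle structure.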