1 — 08:30 — On a result by Baillon, Bruck, and Reich
It is well known that the iterates of an averaged nonexpansive mapping may only converge weakly to a fixed point. A celebrated result by Baillon, Bruck, and Reich from 1978 yields strong convergence in the presence of linearity. In this paper, we extend this result to allow for flexible relaxation parameters. Examples are also provided to illustrate the results.
Based on joint work with Yuan Gao (UBC Okanagan).
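The phenomenon behind this abstract can be illustrated numerically. The sketch below (an illustration under assumed parameters, not taken from the paper) iterates the averaged map $T_\lambda = (1-\lambda)\mathrm{Id} + \lambda T$ for a linear nonexpansive $T$, here a planar rotation whose only fixed point is the origin, and the iterates converge in norm:

```python
import math

# T: rotation by pi/4 -- linear and nonexpansive, with Fix T = {0}.
theta = math.pi / 4
def T(x, y):
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y)

lam = 0.5  # averaging / relaxation parameter (illustrative choice)
x, y = 3.0, -1.0
for _ in range(200):
    tx, ty = T(x, y)
    x, y = (1 - lam) * x + lam * tx, (1 - lam) * y + lam * ty

print(math.hypot(x, y))  # norm of the 200th iterate: close to 0
```

In this linear example the averaged matrix has spectral radius below 1, so convergence is even linear; the interest of the Baillon–Bruck–Reich result is that norm convergence persists in infinite dimensions, where weak convergence is all that averagedness alone provides.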
2 — 09:00 — On the Bredies-Chenchene-Lorenz-Naldi algorithm
Monotone inclusion problems occur in many areas of optimization and variational analysis. Splitting methods, which utilize resolvents or proximal mappings of the underlying operators, are often applied to solve these problems. In 2022, Bredies, Chenchene, Lorenz, and Naldi introduced an elegant new algorithmic framework that encompasses various well-known algorithms, including Douglas-Rachford and Chambolle-Pock. They obtained powerful weak and strong convergence results, where the latter type relies on additional strong monotonicity assumptions. In this paper, we complement the analysis by Bredies et al. by relating the projections of the fixed point sets of the underlying operators that generate the (reduced and original) preconditioned proximal point sequences. We also obtain strong convergence results in the case of linear relations. Various examples are provided to illustrate the applicability of our results.
3 — 09:30 — The proximal point algorithm without monotonicity
We study the proximal point algorithm in the setting in which the operator of interest is metrically subregular and satisfies a submonotonicity property. The latter can be viewed as a quantified weakening of the standard definition of a monotone operator. Our main result gives a condition under which, locally, the proximal point algorithm generates at least one sequence which is linearly convergent to a zero of the underlying operator. General properties of our notion of submonotonicity are also explored, as well as connections to other concepts in the literature.
This is joint work with Matthew Tam.
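For orientation, the classical proximal point iteration $x_{k+1} = (I + \lambda A)^{-1} x_k$ can be sketched in the simplest monotone case, $A = \partial|\cdot|$ on the real line, where the resolvent is the soft-thresholding map. The operator and parameters below are illustrative assumptions; they do not exhibit the submonotone setting of the talk, whose point is precisely that monotonicity can be relaxed.

```python
# Proximal point iteration x_{k+1} = (I + lam * A)^{-1}(x_k)
# for A = subdifferential of |.| on R. The resolvent is the
# soft-thresholding map, and the unique zero of A is 0.
def resolvent(x, lam):
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

lam = 0.3       # proximal parameter (illustrative choice)
x = 1.0
for _ in range(10):
    x = resolvent(x, lam)

print(x)        # the iterates land exactly on the zero of A
```

Here the iterates reach the zero in finitely many steps, an even stronger conclusion than the local linear convergence established in the paper's more general setting.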
4 — 10:00 — Fast continuous time methods for monotone equations
In this talk, we discuss continuous-time dynamics for the problem of approaching the set of zeros of a single-valued monotone and continuous operator $V$. Such problems are motivated by convex-concave minimax problems and, in particular, by convex optimization problems with linear constraints. We introduce a second-order dynamical system that combines a vanishing damping term with the time derivative of $V$ along the trajectory, which can be seen as an analogue of Hessian-driven damping in the case where the operator originates from a potential. We show that these methods exhibit fast convergence rates for $\|V(z(t))\|$ as $t \rightarrow +\infty$, where $z(\cdot)$ denotes the generated trajectory, and for the restricted gap function, and that $z(\cdot)$ converges to a zero of the operator $V$. For the corresponding implicit and explicit discrete-time models with Nesterov's momentum, we prove that they share the asymptotic features of the continuous dynamics.
In addition, we discuss the connection between the second-order dynamical system and a Tikhonov regularized first-order dynamical system, which exhibits fast convergence rates and strong convergence of the trajectory.
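The effect of Tikhonov regularization in a first-order flow can be sketched on a toy monotone field, the rotation $V(x, y) = (y, -x)$ arising from the bilinear saddle function $xy$ (the operator, the forward-Euler step, and the schedule $\varepsilon(t) = t^{-1/2}$ are illustrative assumptions, not the system of the talk). The plain flow $\dot z = -V(z)$ merely circles the unique zero, while the vanishing Tikhonov term $\varepsilon(t)\, z$ drives the trajectory strongly to it:

```python
import math

def V(x, y):
    # monotone (skew) operator from the saddle function f(x, y) = x * y
    return (y, -x)

h = 0.01                       # forward-Euler step size (illustrative)
t, x, y = 1.0, 1.0, 0.0        # start on the unit circle
for _ in range(10000):
    eps = 1.0 / math.sqrt(t)   # vanishing Tikhonov parameter
    vx, vy = V(x, y)
    x, y = x - h * (vx + eps * x), y - h * (vy + eps * y)
    t += h

print(math.hypot(x, y))        # trajectory norm after time ~100: near 0
```

Since $V$ is skew, the regularization term is the only source of dissipation here: $\tfrac{d}{dt}\tfrac12\|z\|^2 = -\varepsilon(t)\|z\|^2$, and because $\int \varepsilon(t)\,dt$ diverges the trajectory norm tends to zero, illustrating the strong convergence claimed for the regularized system.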