1. 08:30 — Optimal and approximate optimal parameters of the SOR-like iteration method for solving absolute value equations

The SOR-like iteration method is an efficient method for solving the NP-hard absolute value equations. However, the method depends on a single iteration parameter, and determining the optimal value of this parameter is an important problem. In this talk, we revisit the convergence results of the SOR-like iteration method and give a new proof. The new proof offers insight into determining the convergence region and the optimal iteration parameter. Along this line, we explore the iteration-independent optimal parameter and an approximate optimal parameter. Numerical results are presented to illustrate the theoretical results.
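
For context, below is a minimal sketch of the SOR-like iteration as it is commonly stated in the literature (Ke and Ma), assuming the standard AVE form $Ax - |x| = b$ and its equivalent two-block system $Ax - y = b$, $y = |x|$. The parameter names (`omega`, `tol`) and the per-iteration dense solve are illustrative only; this is not a restatement of the talk's new analysis.

```python
import numpy as np

def sor_like_ave(A, b, omega, x0=None, tol=1e-8, max_iter=1000):
    """Sketch of the SOR-like iteration for the absolute value equation
    Ax - |x| = b, written as the system Ax - y = b, y = |x|."""
    n = b.size
    x = np.zeros(n) if x0 is None else x0.copy()
    y = np.abs(x)
    for k in range(max_iter):
        # x-update: relax the linear block Ax = y + b
        # (in practice one would factorize A once outside the loop)
        x_new = (1 - omega) * x + omega * np.linalg.solve(A, y + b)
        # y-update: relax the coupling block y = |x|
        y_new = (1 - omega) * y + omega * np.abs(x_new)
        # stop on the AVE residual
        if np.linalg.norm(A @ x_new - np.abs(x_new) - b) <= tol:
            return x_new, k + 1
        x, y = x_new, y_new
    return x, max_iter
```

A standard well-posed test is to take $A$ with smallest singular value larger than 1 (which guarantees a unique solution of the AVE) and a moderate relaxation parameter `omega` in $(0, 2)$.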

2. 09:00 — A regularized splitting method for sums of multi-block maximal monotone operators and its relationship with symmetric ADMM

The monotone inclusion problem has wide applications in mathematical optimization and control. In this paper, we design a splitting method for finding a zero of $\sum_{i=1}^{m} A_i x + Bx$ ($m \ge 2$), where $B$ is $\beta$-cocoercive. We introduce a new space reformulation technique that transforms this problem into a three-operator problem in $\mathcal{H}^{m-1}$. Based on this reformulation, we present a regularized splitting method that evaluates $B$ in a forward step and computes the $A_i$'s in parallel in a backward step. We prove convergence of the proposed method and a sublinear convergence rate for the fixed-point residuals under mild conditions in infinite-dimensional Hilbert spaces. Our method extends the Douglas-Rachford splitting method and the Davis-Yin three-operator splitting method; it is worth noting that symmetric ADMM is a special case of our method in the context of convex optimization. Finally, we apply our method to the mean-variance optimization problem, inverse problems in imaging, and the soft-margin support vector machine problem with nonsmooth hinge functions, and compare its performance with existing methods from the literature.
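
As a point of reference for the extension claims, here is a minimal sketch of the classical Davis-Yin three-operator splitting iteration for $0 \in A_1 x + A_2 x + Bx$ with $B$ cocoercive, which the talk's method generalizes. This is the standard scheme, not the regularized multi-block method of the talk; the resolvent callables `JA1`, `JA2` (resolvents of $\gamma A_1$, $\gamma A_2$) are assumed to be supplied by the user.

```python
import numpy as np

def davis_yin(JA1, JA2, B, z0, gamma, lam=1.0, max_iter=500, tol=1e-10):
    """Sketch of Davis-Yin three-operator splitting for
    0 in A1(x) + A2(x) + B(x), with B cocoercive."""
    z = z0.copy()
    for k in range(max_iter):
        x1 = JA1(z)                           # backward step on A1
        x2 = JA2(2 * x1 - z - gamma * B(x1))  # forward step on B, backward on A2
        z_new = z + lam * (x2 - x1)           # relaxed fixed-point update
        if np.linalg.norm(z_new - z) <= tol:
            return x1, k + 1
        z = z_new
    return JA1(z), max_iter
```

Dropping $B$ recovers Douglas-Rachford splitting, and dropping $A_2$ recovers the forward-backward method, which is why these appear as special cases of the more general scheme.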

3. 09:30 — ** MOVED TO FRIDAY AM REMOTE SESSION ** Extended alternating structure-adapted proximal gradient algorithm for multiblock nonconvex optimization

The alternating structure-adapted proximal (ASAP) gradient algorithm has recently garnered significant interest for solving nonconvex nonsmooth block-regularized problems. However, multiblock nonseparable structure hinders the application of ASAP to many practical problems. In this paper, following the seminal ASAP work of M. Nikolova and P. Tan (SIAM J. Optim., 29:2053-2078), we develop an extended ASAP for solving nonconvex nonsmooth problems with multiblock nonseparable structure. By exploiting properties of the partial subdifferential in the nonsmooth alternating scheme, our convergence analysis covers a variety of nonsmooth nonseparable coupling functions, thus expanding the applicability of the model, particularly in multimodal data fusion. Under the Aubin property of the partial subdifferential mapping, the global convergence of the extended ASAP is established based on the Kurdyka-Łojasiewicz property. Furthermore, a sublinear convergence rate of the extended ASAP is derived within the proximal point mapping algorithmic framework under mild conditions. Numerical simulations on multimodal data fusion demonstrate the compelling performance of the proposed method.
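
The extended ASAP itself is not restated in the abstract; as orientation, the following is a generic two-block alternating proximal gradient sketch of the kind ASAP builds on, for $\min_{x,y} f(x) + g(y) + H(x,y)$ with smooth coupling $H$. The prox callables, the step sizes `c1`, `c2` (which should dominate the block Lipschitz constants of $\nabla H$), and the stopping rule are all illustrative assumptions, not the talk's multiblock nonseparable scheme.

```python
import numpy as np

def alternating_prox_grad(prox_f, prox_g, grad_x_H, grad_y_H,
                          x0, y0, c1, c2, max_iter=500, tol=1e-8):
    """Sketch of a two-block alternating proximal gradient scheme for
    min_{x,y} f(x) + g(y) + H(x, y): each block takes a gradient step
    on the smooth coupling H, then a prox step on its own regularizer.
    prox_f(v, t) is assumed to return prox_{t f}(v), likewise prox_g."""
    x, y = x0.copy(), y0.copy()
    for k in range(max_iter):
        # x-block: linearize H in x at (x, y), then prox of f
        x_new = prox_f(x - (1.0 / c1) * grad_x_H(x, y), 1.0 / c1)
        # y-block: linearize H in y at the updated (x_new, y), then prox of g
        y_new = prox_g(y - (1.0 / c2) * grad_y_H(x_new, y), 1.0 / c2)
        if max(np.linalg.norm(x_new - x), np.linalg.norm(y_new - y)) <= tol:
            return x_new, y_new, k + 1
        x, y = x_new, y_new
    return x, y, max_iter
```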