1. 08:30 — Epi-convergent approximations of discontinuous generalized eigenvalue functions with application to topology optimization

Optimization of the maximum generalized eigenvalue of symmetric-matrix-valued functions is an important class of structural topology optimization problems, related to robustness, vibration, and buckling. The maximum generalized eigenvalue can be ill-defined (unbounded or discontinuous) where the matrices become singular, i.e., where a topological change of the structural design occurs. In this presentation, we redefine the maximum generalized eigenvalue as an extended real-valued function and propose a continuous approximation that epi-converges to it. Epi-convergence is a useful tool in variational analysis for showing that solutions of approximate optimization problems converge, in a suitable sense, to solutions of the original problem. We report simple numerical experiments on truss topology optimization of the minimum eigenfrequency, although most of the theoretical results also hold for other types of topology optimization problems.
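The discontinuity at singular matrices can be seen in a toy example. The sketch below (a numpy illustration with hypothetical helper names; the naive shift `B + eps*I` merely demonstrates smoothing and is NOT the epi-convergent approximation constructed in the talk) shows the maximum generalized eigenvalue blowing up as a "design parameter" drives one matrix toward singularity:

```python
import numpy as np

def max_gen_eig(A, B):
    # Largest lambda with A v = lambda B v, assuming B is nonsingular.
    # Illustrative only; production codes would use a symmetric
    # generalized eigensolver such as scipy.linalg.eigh(A, B).
    return np.max(np.linalg.eigvals(np.linalg.solve(B, A)).real)

def max_gen_eig_shift(A, B, eps):
    # Naive regularization B + eps*I: continuous in the data, but not
    # the epi-convergent approximation proposed in the presentation.
    return max_gen_eig(A, B + eps * np.eye(B.shape[0]))

# Toy design parameter t; B(t) becomes singular as t -> 0, the regime
# where a structural member vanishes.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
def B(t):
    return np.array([[1.0, 0.0], [0.0, t]])

for t in (1.0, 0.1, 0.01):
    print(t, max_gen_eig(A, B(t)))  # 2.0, 10.0, 100.0: unbounded as t -> 0
```

The generalized eigenvalues here are 2 and 1/t, so the maximum diverges as t goes to 0, which is exactly the ill-posedness the extended-real-valued reformulation is designed to handle.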

2. 09:00 — A Proximal Modified Quasi-Newton Method for Nonsmooth Regularized Optimization

We develop R2N, a modified quasi-Newton method for minimizing the sum of a continuously differentiable function f and a lower semicontinuous, prox-bounded function h, subject to bound constraints.
Both f and h may be nonconvex.
At each iteration, our method computes a step by minimizing the sum of a convex quadratic model of f, a model of h, and an adaptive quadratic regularization term.
A step may be computed using the R2 method (Aravkin, Baraldi and Orban, 2020) or, when h is separable, the variant R2DH, in which the Hessian model of f is diagonal.
We establish convergence of a first-order stationarity measure to zero for both bounded and unbounded Hessian approximations.
Under a Lipschitz assumption, we derive a worst-case evaluation complexity bound that matches the best-known bound for bounded Hessian approximations, and that deteriorates with the growth of the approximations when the latter are unbounded, as in (Leconte and Orban, 2023).
We describe our implementation in Julia and report numerical experience on inverse problems.
Additionally, we discuss our findings on a minimum-rank matrix completion problem.
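When h is separable and the Hessian model is diagonal, as in R2DH, the regularized model subproblem splits coordinate-wise and has a closed-form solution for h = λ‖·‖₁. The following numpy sketch is written in the spirit of such a step (the function name and interface are ours, not the authors' Julia implementation):

```python
import numpy as np

def diag_prox_step(x, grad, diag_h, sigma, lam):
    # Minimize the separable model
    #     grad.s + 0.5 * s.(D + sigma*I).s + lam*||x + s||_1
    # over the step s, where D = diag(diag_h) is a diagonal Hessian
    # model and sigma > 0 is the adaptive regularization weight.
    # Coordinate-wise, the minimizer is a soft-thresholding of a
    # scaled gradient step. Illustrative sketch, not R2DH itself.
    d = diag_h + sigma
    u = x - grad / d                       # diagonally scaled gradient step
    t = lam / d                            # per-coordinate threshold
    x_new = np.sign(u) * np.maximum(np.abs(u) - t, 0.0)
    return x_new - x                       # the step s

# Example: a large lam relative to the curvature zeroes the coordinate.
s = diag_prox_step(np.array([1.0]), np.array([0.0]),
                   np.array([1.0]), 1.0, 4.0)
print(s)  # [-1.], i.e. the new iterate is 0
```

Increasing sigma shrinks the step, which is how the adaptive quadratic regularization term controls progress when the quadratic model of f is poor.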

3. 09:30 — Feature Selection for Linear Mixed Effects Models

Linear mixed-effects (LME) models are used to analyze nested or combined data across a range of groups or clusters. These models use covariates to separate the total population variability (the fixed effects) from the group variability (the random effects). LMEs borrow strength across groups to estimate key statistics in cases where the data within groups may be sparse or highly variable, and play a fundamental role in population health sciences, meta-analysis, life sciences, and many other domains. In this talk we formally introduce a mathematical description of the LME model and its feature selection variant. A naive proximal gradient descent (PGD) algorithm for its solution is described and its deficiencies are explained. We propose a novel solution strategy based on a relaxation that decouples the smooth and nonsmooth components of the maximum likelihood objective. An optimal value function is obtained by partially optimizing the smooth component of the decoupled problem. We show that the resulting optimal value function has a locally Lipschitz gradient, so a PGD algorithm can be applied to a feature-selecting regularization of the optimal value function.
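For readers unfamiliar with the PGD baseline mentioned above, here is a generic proximal gradient iteration on a least-squares loss with an ℓ1 feature-selecting penalty (a minimal sketch of the naive PGD pattern on a simpler objective, not the talk's LME likelihood or its value-function method):

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def pgd_lasso(X, y, lam, step=None, iters=500):
    # Plain proximal gradient descent on 0.5*||Xw - y||^2 + lam*||w||_1.
    # Each iteration: gradient step on the smooth term, then the prox
    # of the nonsmooth term. Illustrative baseline only.
    n, p = X.shape
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L for the smooth part
    w = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (X @ w - y)
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Small coefficients are driven exactly to zero: feature selection.
w = pgd_lasso(np.eye(3), np.array([3.0, 0.5, 0.0]), lam=1.0)
print(w)  # [2. 0. 0.]
```

The talk's contribution is precisely that this naive scheme behaves poorly on the full LME maximum likelihood objective, motivating the partially minimized optimal value function, whose locally Lipschitz gradient restores the setting in which PGD is well behaved.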

4. 10:00 — An efficient continuation algorithm for regularized optimization through the Moreau envelope

We develop a fast algorithm for solving sequences of regularized optimization problems by applying continuation to the Moreau envelope of the nonsmooth portion. We additionally show that our objective can be derived by replacing any one of three terms with its Moreau envelope, and discuss efficient first-order algorithms and linear algebra for solving the associated subproblems.
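As a concrete instance of the smoothing being applied here, the Moreau envelope of the absolute value is the Huber function, and driving the envelope parameter to zero recovers the original nonsmooth term. A small numpy sketch (function names are ours, chosen for illustration):

```python
import numpy as np

def moreau_env_abs(x, mu):
    # Moreau envelope of |.| with parameter mu > 0:
    #     e_mu(x) = min_z |z| + (z - x)^2 / (2*mu),
    # whose closed form is the Huber function.
    ax = np.abs(x)
    return np.where(ax <= mu, x**2 / (2 * mu), ax - mu / 2)

def prox_abs(x, mu):
    # Minimizer attaining the envelope: soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - mu, 0.0)

# Continuation: shrink mu so the smooth envelope approaches |x|.
for mu in (1.0, 0.1, 0.01):
    print(mu, moreau_env_abs(3.0, mu))  # 2.5, 2.95, 2.995 -> |3| = 3
```

The envelope is continuously differentiable for every mu > 0 and bounded above by |x|, which is what makes a continuation scheme over decreasing mu attractive: each subproblem is smooth, and the sequence approximates the original regularized problem increasingly well.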