1 — 08:30 — Computing the conjugate of nonconvex bivariate piecewise linear-quadratic functions.
Computing the minima of a convex function is much easier than doing so for a nonconvex function. Since a nonconvex function attains the same global minimum value as its convex envelope, we compute the conjugate of a bivariate piecewise linear-quadratic (PLQ) function as a first step toward computing the convex envelope.
Our algorithm first computes the convex envelope of each piece, obtaining a rational function defined over a polyhedral subdivision. It then computes the conjugate of each of those pieces, yielding a fractional form defined over a parabolic subdivision. The final step takes the maximum of all those functions to obtain the conjugate of the original PLQ function as a piecewise function defined over a parabolic subdivision.
Our implementation in MATLAB uses symbolic computation and rational numbers to avoid any floating-point errors.
2 — 09:00 — Norm-induced Non-Convex Relaxations for Lipschitz Constraints
In this paper, we consider a finite-dimensional optimization problem: minimizing a continuous objective over a compact domain subject to a multi-dimensional constraint function, for which we only assume a Lipschitz property. Recent literature proposes methods based on non-convex outer approximation for one-dimensional equality constraints on bounded polyhedral domains that are Lipschitz with respect to the maximum norm. To the best of our knowledge, however, no formulation of such a non-convex outer-approximation method exists for a general problem class. We introduce a meta-level solution framework to solve this class and develop its theoretical foundations. Treating the feasible domain without the constraint function as manageable, our method relaxes the multi-dimensional constraint and iteratively refines the feasible region by means of norm-induced cuts, relying on an oracle for the resulting sub-problems. We show the method's correctness and investigate the problem complexity. To discuss its functionality, limits, and extensions, we present computational examples with illustrations.
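A toy instance conveys the norm-induced cut idea (my sketch under strong simplifying assumptions, not the paper's general framework): if g is L-Lipschitz and |g(x_k)| > 0 at a candidate point x_k, every feasible point lies at distance at least |g(x_k)|/L from x_k, so a ball of that radius can be excluded. For the one-dimensional objective f(x) = x, the sub-problem oracle reduces to taking the smallest point not yet cut:

```python
import math

# Minimize x on [0, 3] subject to g(x) = cos(x) = 0, with g Lipschitz, L = 1.
# Cut rule: any feasible x satisfies |x - x_k| >= |g(x_k)| / L.

def argmin_with_cuts(g, L, lo, hi, tol=1e-10, max_iter=10_000):
    x = lo  # oracle for "minimize x": smallest point outside all cut balls
    for _ in range(max_iter):
        viol = abs(g(x))
        if viol <= tol * L:
            return x            # numerically feasible and minimal
        x += viol / L           # norm-induced cut excludes [x - r, x + r]
        if x > hi:
            return None         # no feasible point remains in [lo, hi]
    return x

x_star = argmin_with_cuts(math.cos, L=1.0, lo=0.0, hi=3.0)
print(x_star)  # converges toward pi/2
```

Each cut radius shrinks as the iterate approaches the feasible set, so the iterates converge to the constrained minimizer pi/2.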
3 — 09:30 — On the impact of reliable boundary calculations by convex relaxation in global optimization
In order to obtain reliable deterministic global optima, all computed bounds must be certified so that floating-point operations cannot cause a branch-and-bound algorithm to converge to a wrong solution. Branch-and-bound codes based on interval arithmetic possess this reliability property. However, some efficient acceleration techniques, such as convex relaxation, cannot be used directly in such a global optimization code without losing the reliability property. In this work, we explain how to correct the resulting bounds so that convex relaxation methods become reliable for solving constrained and unconstrained global optimization problems. In addition, we explain how to exploit them within an interval branch-and-bound code and show that the effort required to provide a certified, reliable solution is not prohibitive.
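The flavor of such bound corrections can be sketched with directed rounding (a generic illustration, not the authors' method): nudging each intermediate result of a lower-bound computation one ulp toward minus infinity guarantees that rounding errors never lift the certified bound above the exact value. Here `math.nextafter` (Python 3.9+) plays the role of rounding-mode control:

```python
import math

def add_rounded_down(a, b):
    # Round-to-nearest puts the exact sum within one ulp of a + b, so
    # stepping one ulp toward -inf yields a guaranteed lower bound.
    return math.nextafter(a + b, -math.inf)

def certified_lower_sum(values):
    lb = 0.0
    for v in values:
        lb = add_rounded_down(lb, v)
    return lb

vals = [0.1] * 10
naive = sum(vals)                 # inexact: accumulates rounding error
lb = certified_lower_sum(vals)    # provably <= the exact sum of vals
print(naive, lb)
```

The certified bound is slightly weaker than the naive one, which is the price of reliability the abstract argues is not prohibitive.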
4 — 10:00 — Advances in convex relaxation generation for dynamic global optimization
Deterministic methods for global optimization proceed by generating upper and lower bounds on the unknown globally optimal value, and then refining these bounds. In this setting, lower bounds are typically computed by minimizing convex relaxations of the original problem using local solvers. This is challenging for dynamic global optimization, however, where embedded ordinary differential equations (ODEs) can complicate convex relaxation generation. In this presentation, we describe recent approaches for automatically constructing effective convex relaxations for parametric ODE solutions, for use in global dynamic optimization. These approaches make use of embedded optimization problems in new ways, and have been implemented in Julia.
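The convex relaxations minimized for lower bounds can be illustrated with the classical McCormick envelope of a bilinear term w = x*y (a textbook example, not specific to the ODE relaxations discussed here):

```python
# McCormick convex underestimator for w = x*y on the box [xL,xU] x [yL,yU]:
#   w >= xL*y + yL*x - xL*yL
#   w >= xU*y + yU*x - xU*yU
# The pointwise max of these two planes is convex and never exceeds x*y.

def mccormick_under(x, y, xL, xU, yL, yU):
    return max(xL*y + yL*x - xL*yL,
               xU*y + yU*x - xU*yU)

# Verify the underestimation property on a grid over the box.
xL, xU, yL, yU = -1.0, 2.0, 0.0, 3.0
pts = [(xL + i*(xU - xL)/10, yL + j*(yU - yL)/10)
       for i in range(11) for j in range(11)]
assert all(mccormick_under(x, y, xL, xU, yL, yU) <= x*y + 1e-12
           for x, y in pts)
```

The two supporting planes come from expanding (x - xL)(y - yL) >= 0 and (x - xU)(y - yU) >= 0 over the box, which is why their maximum stays below x*y; minimizing such a relaxation with a local solver gives a valid lower bound on the nonconvex problem.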