14:00 — Discretization Approaches for Non-Convex Mixed-Integer Nonlinear Programs

Solving mixed-integer nonlinear programs (MINLPs) remains a popular and challenging topic to this day. Over the last two decades, mixed-integer programming (MIP) relaxation techniques have increasingly established themselves as a serious alternative to classical spatial branch-and-bound.
The underlying idea is to relax the non-convex nonlinearities in a linear-discrete fashion while controlling the relaxation accuracy. By modeling these relaxations with integer variables, the discrete approaches aim to exploit the availability of mature solvers for mixed-integer programs.
Quadratically constrained quadratic programs are probably the most important subclass of mixed-integer nonlinear programs. The talk will give both a broad overview and some of the recent developments in the field, focusing largely on the handling of quadratic monomials and bivariate products. A best-practice recommendation with numerical results on the QPLIB and MINLPLib benchmark libraries will complete the presentation.
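
As a toy illustration of how discretization controls relaxation accuracy (a minimal sketch of the general idea, not the speaker's method): for the quadratic monomial y = x^2, chords over a breakpoint grid overestimate the convex function, and the worst-case chord error shrinks quadratically with the number of segments. In an actual MIP relaxation, binary or SOS2 variables would select the active segment; here we only measure the accuracy.

```python
# Illustrative sketch: accuracy of a piecewise-linear relaxation of y = x^2
# on [lb, ub]. On each segment [a, b], the chord lies above the curve and the
# maximum gap, attained at the midpoint, equals (b - a)^2 / 4.
import numpy as np

def relaxation_gap(lb, ub, n_segments):
    """Worst-case gap between the chord overestimator and x^2."""
    bp = np.linspace(lb, ub, n_segments + 1)  # breakpoints of the partition
    widths = np.diff(bp)
    return (widths.max() ** 2) / 4.0

for n in (2, 4, 8, 16):
    print(n, relaxation_gap(-1.0, 2.0, n))  # gap shrinks quadratically in 1/n
```

Doubling the number of segments cuts the gap by a factor of four, which is the sense in which the relaxation accuracy can be controlled.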

14:30 — Constructing Tight Quadratic Underestimators for Global Optimization

Global optimization algorithms extensively utilize polyhedral outer approximations to determine lower bounds. Noting recent improvements in algorithms that solve quadratically constrained quadratic programming problems, we investigate introducing quadratic nonlinearity into outer approximation methods. We generalize a second-order Taylor series approximation to create a hierarchy of convex quadratic outer approximators, together with an algorithm to construct them for twice-differentiable convex functions and non-convex difference-of-convex (d.c.) functions. We demonstrate the quality of our quadratic outer approximators on several example functions extracted from global optimization benchmark libraries. Furthermore, we show that for systematically generated d.c. optimization problems in more than one dimension, the convex QCQP relaxations constructed using our methodology yield tighter root-node lower bounds than those computed by the state-of-the-art global optimization solver BARON.
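
A hedged sketch of the simplest instance of this idea (our reading of a damped second-order Taylor underestimator, not the speakers' exact hierarchy): if mu is a lower bound on the eigenvalues of the Hessian of a twice-differentiable convex f over a box, then q(x) = f(x0) + grad f(x0)^T (x - x0) + (mu/2) ||x - x0||^2 underestimates f on that box, and larger valid mu gives a tighter q.

```python
# Minimal sketch: a quadratic underestimator of a convex f via a damped
# second-order Taylor term. The example function and mu are our assumptions.
import numpy as np

def quad_underestimator(f, grad_f, x0, mu):
    """q(x) <= f(x) on any convex region where mu <= lambda_min(Hess f)."""
    x0 = np.asarray(x0, dtype=float)
    f0, g0 = f(x0), np.asarray(grad_f(x0), dtype=float)
    def q(x):
        d = np.asarray(x, dtype=float) - x0
        return f0 + g0 @ d + 0.5 * mu * (d @ d)
    return q

# Example: f(x) = exp(x1) + 2*x2^2. For x1 in [-1, 1] the Hessian eigenvalues
# are at least exp(-1), so mu = exp(-1) is valid on that box.
f = lambda x: np.exp(x[0]) + 2.0 * x[1] ** 2
grad_f = lambda x: np.array([np.exp(x[0]), 4.0 * x[1]])
q = quad_underestimator(f, grad_f, np.zeros(2), np.exp(-1))
for x in ([0.5, -0.3], [-1.0, 1.0], [0.9, 0.1]):
    assert q(x) <= f(np.asarray(x)) + 1e-12  # underestimation holds
```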

15:00 — A Generalizable End-to-End Learning Approach for Global Optimization of Quadratically-Constrained Quadratic Programs

Quadratically-constrained quadratic programs (QCQPs) are a broad class of non-convex optimization problems that arise in applications such as network optimization and the pooling problem, but they are often slow to solve in practice. We present a neural network approach that learns partitioning heuristics which yield significant time savings for existing global optimization solvers. We construct a novel end-to-end loss function so that the desired attributes of an output partitioning scheme can be learned directly, without reference to a pre-selected heuristic. Additionally, we describe how to use the graph scattering transform to produce a single model that works on a variety of problem structures and sizes. We demonstrate that the neural network learns heuristics which speed up QCQP optimization significantly, greatly exceeding the speedups of previous work, and that these heuristics generalize well even to problem sizes not seen during training.
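
To make concrete what a partitioning heuristic optimizes (a toy illustration of the underlying mechanism, not the paper's learned model): for a bilinear term x*y over a box, the worst-case gap between the term and its McCormick underestimator is (ux - lx)(uy - ly)/4, and splitting a variable's domain shrinks that gap piece by piece. A heuristic, learned or hand-crafted, chooses where and on which variables to split.

```python
# Toy sketch: effect of a single partition point on the McCormick gap of x*y.
def mccormick_gap(lx, ux, ly, uy):
    """Worst-case gap between x*y and its McCormick underestimator on a box."""
    return (ux - lx) * (uy - ly) / 4.0

def split_gap(lx, ux, ly, uy, p):
    """Worst-case gap after partitioning the x-domain at p."""
    assert lx < p < ux
    return max(mccormick_gap(lx, p, ly, uy), mccormick_gap(p, ux, ly, uy))

print(mccormick_gap(0, 4, 0, 2))   # 2.0 without partitioning
print(split_gap(0, 4, 0, 2, 2.0))  # 1.0 with the midpoint split
print(split_gap(0, 4, 0, 2, 1.0))  # 1.5 with a worse split point
```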

15:30 — Piecewise Polyhedral Relaxations of Multilinear Optimization

We develop piecewise polyhedral relaxations for multilinear optimization problems over hyper-rectangular partitions. We interpret the associated multipliers as multilinear terms, which allows us to derive linking constraints across the relaxations of different multilinear terms. The associated cuts have been incorporated within ALPINE, an open-source software for global optimization that relies on piecewise approximations of functions. The resulting cuts speed up the solver by an order of magnitude and enable the solution of many more instances from our test set. We also consider non-regular partitions, which are particularly useful when the functions are modeled using linear regression trees. We show that resorting to regular partitioning for a problem that was natively described using non-regular partitions can lead to an exponential blowup in the number of partitions. We give the first locally ideal formulation for non-regular hyper-rectangular partitions and show that, for certain tree ensemble optimization problems, the resulting formulation performs significantly better than alternative formulations.
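
To illustrate the blowup claim (a hedged toy construction of our own, not the authors' proof): a "staircase" non-regular partition of the unit hypercube [0,1]^n uses only n+1 boxes, yet the coarsest regular grid that refines it has 2^n cells, since every dimension inherits the breakpoint 0.5.

```python
# Toy demonstration: n+1 non-regular boxes versus 2^n regular grid cells.
def staircase_partition(n):
    """Partition [0,1]^n into n+1 boxes: box k fixes coordinates 0..k-1 to
    [0.5, 1] and coordinate k to [0, 0.5]; the last box is [0.5, 1]^n."""
    boxes = []
    for k in range(n):
        lo = [0.5] * k + [0.0] + [0.0] * (n - k - 1)
        hi = [1.0] * k + [0.5] + [1.0] * (n - k - 1)
        boxes.append((lo, hi))
    boxes.append(([0.5] * n, [1.0] * n))
    return boxes

def regular_refinement_size(boxes, n):
    """Cells of the coarsest regular grid using every breakpoint per dimension."""
    cells = 1
    for d in range(n):
        breaks = {b for (lo, hi) in boxes for b in (lo[d], hi[d])}
        cells *= len(breaks) - 1  # intervals between the sorted breakpoints
    return cells

for n in range(2, 7):
    boxes = staircase_partition(n)
    print(n, len(boxes), regular_refinement_size(boxes, n))  # n+1 vs 2^n
```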