1 — 08:30 — Parameter estimation in ODEs: assessing the potential of local and global solvers
In this work, we consider the problem of parameter estimation in models described by nonlinear ordinary differential equations (ODEs). We solve the problem with a direct transcription approach, in which the control and state profiles are discretized and the original ODE system is transformed, via a finite difference scheme, into the algebraic equations of the optimization model. This leads to a standard nonlinear programming (NLP) formulation that can be solved with mathematical programming techniques. The main goal of this work is to analyze the potential of mathematical modeling and state-of-the-art solvers for handling these problems. We carry out an extensive numerical study over a set of eight challenging parameter estimation problems involving nonlinear dynamic biological systems, analyzing different modeling aspects such as the chosen discretization scheme and several variants of the formulation. Moreover, by using both global and local state-of-the-art solvers, our approach provides insight into the challenges posed by local optimality and identifiability. The results obtained with this approach are very promising, exceeding the solution capabilities previously reported in the literature for these problems.
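The direct transcription idea can be sketched on a toy problem (a minimal illustration, not the authors' formulation: the logistic model, the forward-Euler scheme, and the SLSQP solver are all assumptions made here for concreteness). The state values at the grid points become decision variables alongside the unknown parameter, and the ODE is replaced by algebraic defect constraints:

```python
import numpy as np
from scipy.optimize import minimize

# Direct transcription for estimating r in the logistic ODE
#   dx/dt = r * x * (1 - x)
# State values at the grid points become decision variables and the
# ODE is replaced by forward-Euler equality constraints.

h, N = 0.1, 30                       # step size and number of steps

def simulate(r, x0=0.1):
    """Forward-Euler trajectory used to create synthetic data."""
    x = [x0]
    for _ in range(N):
        x.append(x[-1] + h * r * x[-1] * (1 - x[-1]))
    return np.array(x)

r_true = 1.5
data = simulate(r_true)              # noise-free synthetic measurements

def objective(z):                    # z = [r, x_0, ..., x_N]
    return np.sum((z[1:] - data) ** 2)

def defects(z):                      # Euler residuals must vanish
    r, x = z[0], z[1:]
    return x[1:] - x[:-1] - h * r * x[:-1] * (1 - x[:-1])

z0 = np.concatenate([[1.0], data + 0.01])    # perturbed initial guess
res = minimize(objective, z0, method="SLSQP",
               constraints={"type": "eq", "fun": defects})
r_est = res.x[0]                     # recovered parameter estimate
```

The resulting NLP has the fitted parameter and all discretized states as variables, which is exactly the structure that local and global NLP solvers are then asked to handle.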
2 — 09:00 — Domain reduction techniques in polynomial optimization
The design and implementation of global optimization algorithms for general MINLP problems is a very active field of research, and the number of available solvers has been steadily increasing in recent years. Recently, the companies behind state-of-the-art MILP solvers such as Xpress and Gurobi have announced new versions capable of solving general MINLP problems to certified global optimality. Convexifications and domain reduction techniques are probably the two most important elements behind the efficiency of the spatial branching required to handle the nonconvexities. The focus of this talk is precisely the assessment of the individual impact of different domain reduction techniques on the performance of a global optimization algorithm. In particular, we will look at the reformulation-linearization technique (RLT), designed to solve polynomial optimization problems, and several enhancements that can be added to it, such as improved selection of the branching variable and branching point, and bound-tightening techniques such as FBBT and OBBT. The aim is to identify which of these techniques have a noticeable impact on the performance of this particular approach (and, since we work solely on improving spatial branching, on the performance of general nonlinear solvers as well), and thereby guide solver developers towards implementing the most promising enhancements.
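To make the RLT idea concrete, here is a toy instance (chosen here for illustration, not taken from the talk): lower-bounding the nonconvex problem min x² − x over x ∈ [0, 2]. RLT introduces a variable w for x² and adds the linearizations of the nonnegative bound-factor products, yielding an LP relaxation:

```python
from scipy.optimize import linprog

# RLT relaxation of  min x^2 - x  over x in [0, 2].
# Replace x^2 by w and linearize the bound-factor products:
#   (x - 0)*(x - 0) >= 0  ->  w >= 0
#   (2 - x)*(2 - x) >= 0  ->  w >= 4x - 4
#   (x - 0)*(2 - x) >= 0  ->  w <= 2x
# Variables z = [x, w]; objective is w - x.
res = linprog(c=[-1, 1],
              A_ub=[[4, -1],     # 4x - w <= 4   (i.e. w >= 4x - 4)
                    [-2, 1]],    # -2x + w <= 0  (i.e. w <= 2x)
              b_ub=[4, 0],
              bounds=[(0, 2), (0, None)])
rlt_bound = res.fun              # valid lower bound from the relaxation
true_min = -0.25                 # x^2 - x attains -1/4 at x = 1/2
```

The LP bound is weaker than the true minimum; spatial branching on x shrinks this gap, and bound-tightening techniques such as FBBT and OBBT tighten the variable bounds that define the RLT constraints themselves, which is why their interaction with branching matters.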
3 — 09:30 — Learning to Accelerate the Global Optimization of Quadratically-Constrained Quadratic Programs
We use machine learning to predict optimal instance-specific heuristics and accelerate the guaranteed global minimization of families of nonconvex quadratically-constrained quadratic programs (QCQPs). Specifically, we consider partitioning-based convex mixed-integer programming (MIP) relaxations for nonconvex QCQPs and propose the novel concept of strong partitioning (SP) to optimally partition variable domains without sacrificing global optimality. We design a local optimization method for solving this challenging max-min SP problem and learn a boosted regression tree approximation of SP for homogeneous families of QCQPs. Furthermore, we propose an end-to-end neural network model trained to directly predict partitioning points for a QCQP instance, thereby eliminating the need for computationally expensive SP training data.
We present a detailed computational study on randomly generated families of QCQPs, including instances of the pooling problem, using the open-source global solver Alpine. Our numerical experiments demonstrate significant reductions in solution time achieved by strong partitioning, its boosted regression tree approximation, and the end-to-end neural network model. On average, these techniques reduce Alpine's solution time by factors ranging from 3.5 to 16.5, 2 to 4.5, and 6 to 9, respectively, across various QCQP families. Moreover, we observe maximum reduction factors of 15 to 700, 10 to 200, and 15 to 100 for the three techniques, respectively.
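The benefit of partitioning variable domains can be seen on a tiny bilinear example (a schematic sketch under assumptions made here, not the strong partitioning method itself): upper-bound max x·y subject to x + y = 1, x, y ∈ [0, 1], whose true optimum is 0.25. Each sub-box contributes a McCormick LP relaxation, and partitioning the domain of x tightens the overall bound:

```python
from scipy.optimize import linprog

# Piecewise McCormick illustration: bound  max x*y  s.t.  x + y = 1,
# x, y in [0, 1].  Variables z = [x, y, w] with w standing in for x*y.
def mccormick_bound(lx, ux, ly=0.0, uy=1.0):
    """LP bound from the McCormick relaxation of w = x*y on a sub-box."""
    A_ub = [
        [ly, lx, -1.0],    # w >= lx*y + ly*x - lx*ly
        [uy, ux, -1.0],    # w >= ux*y + uy*x - ux*uy
        [-ly, -ux, 1.0],   # w <= ux*y + ly*x - ux*ly
        [-uy, -lx, 1.0],   # w <= lx*y + uy*x - lx*uy
    ]
    b_ub = [lx * ly, ux * uy, -ux * ly, -lx * uy]
    res = linprog(c=[0, 0, -1],          # maximize w
                  A_ub=A_ub, b_ub=b_ub,
                  A_eq=[[1, 1, 0]], b_eq=[1],
                  bounds=[(lx, ux), (ly, uy), (None, None)])
    return -res.fun

single = mccormick_bound(0.0, 1.0)            # one box: bound 0.5
piecewise = max(mccormick_bound(0.0, 0.5),    # two pieces of x's domain
                mccormick_bound(0.5, 1.0))    # bound improves to 1/3
```

Where to place the partition points is exactly the question strong partitioning addresses: good points tighten the MIP relaxation substantially, and the learned models predict them cheaply per instance.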
4 — 10:00 — Strong and fast SDP bounds for water network problems
In this work we exploit two types of sparsity in polynomial optimization for the valve setting problem in water networks. The valve setting problem consists of many small subproblems, each involving few variables and monomials, connected by sparse linking constraints. The problem exhibits correlative sparsity (subsets of variables appearing separately in constraints) and term sparsity (few monomials present in the problem). We analyze the performance of several existing sparse SDP relaxations and propose a new, simple SDP relaxation that uses both correlative and term sparsity to reach a trade-off between bound quality and running time. We report numerical results on four water networks ranging in size from 4 to about 2000 nodes. The approach extends to electricity and gas network problems.
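A rough illustration of why correlative sparsity pays off (a schematic counting sketch under a chain-structure assumption made here, not the talk's actual relaxation): if constraint k couples only variables x_k and x_{k+1}, the sparse SDP replaces one dense moment matrix over all variables with one small matrix per variable clique:

```python
# Chain-structured correlative sparsity: constraint k touches x_k, x_{k+1}.
n = 2000                                        # variables (largest network scale)
supports = [{k, k + 1} for k in range(n - 1)]   # variable set of each constraint

# For a chain, the maximal cliques of the (chordal) correlative sparsity
# graph are exactly the pairwise supports.
cliques = supports

# A first-order moment matrix over a variable set S has dimension |S| + 1.
dense_dim = n + 1                               # one (n+1) x (n+1) SDP block
sparse_dims = [len(c) + 1 for c in cliques]     # many 3 x 3 SDP blocks
```

One 2001×2001 semidefinite block becomes 1999 blocks of size 3, which is the kind of structural reduction that makes SDP bounds tractable on networks with thousands of nodes.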