1. 16:20 — Distributionally robust two-stage convex quadratic programming

Distributionally robust optimization provides a framework for stochastic optimization when the distribution of the random vector is uncertain. The uncertain distribution is assumed to come from an ambiguity set of plausible distributions. The goal is to find a decision that hedges against the worst distribution in the ambiguity set. We focus on a two-stage stochastic program with convex quadratic recourse under distributional ambiguity. We assume uncertainty only in the right-hand side vector. We consider data-driven distributional ambiguity sets based on the Wasserstein distance and an optimal quadratic transport distance. For the latter, we derive a decomposition algorithm and present computational results.
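The min-max idea behind distributionally robust optimization can be sketched on a toy instance. Everything below is a hypothetical simplification, not the talk's method: the recourse problem, costs, scenario support, and grid search are invented, and the ambiguity set is a finite list of candidate distributions rather than a Wasserstein or quadratic-transport ball.

```python
# Toy DRO sketch: two-stage problem with quadratic recourse and uncertainty
# only in the right-hand side. Recourse Q(x, xi) = min_y {0.5*q*y^2 : y >= xi - x,
# y >= 0} has the closed form Q(x, xi) = 0.5*q*max(xi - x, 0)^2.
c, q = 1.0, 4.0
scenarios = [0.0, 1.0, 2.0, 3.0]          # support of the random RHS xi

def recourse(x, xi):
    shortfall = max(xi - x, 0.0)
    return 0.5 * q * shortfall ** 2

# Hypothetical finite ambiguity set: candidate probability vectors on the support.
ambiguity_set = [
    (0.25, 0.25, 0.25, 0.25),   # nominal (empirical) distribution
    (0.10, 0.20, 0.30, 0.40),   # mass shifted toward large xi
    (0.40, 0.30, 0.20, 0.10),   # mass shifted toward small xi
]

def worst_case_cost(x):
    # hedge against the worst distribution in the ambiguity set
    return c * x + max(
        sum(p * recourse(x, xi) for p, xi in zip(dist, scenarios))
        for dist in ambiguity_set
    )

# crude grid search over the first-stage decision x in [0, 4]
best_x = min((i / 100 for i in range(401)), key=worst_case_cost)
print(best_x, worst_case_cost(best_x))
```

The decomposition algorithm in the talk exploits problem structure instead of grid search; the sketch only illustrates the "minimize the worst-case expected cost" objective.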

2. 16:50 — Multi-scale and Multi-stage Stochastic Discrete Optimization and its Applications in Energy and Sustainability

One of the major challenges in sustainability arises from the need to coordinate processes operating at different scales so that decisions at each scale remain computationally tractable, while certain system metrics (e.g., costs, system reliability, emissions, renewable penetration) are maintained within acceptable bounds. The difficulty stems from several features, such as multi-scale dynamic uncertainty, mixed-binary dynamic decisions, and the interplay among scales. Such collections of interacting mathematical programming models are central to many applications in energy and sustainability. We will present a decomposition-coordination algorithm that coordinates multi-scale decisions and uncertainty so that the system seeks to operate within acceptable bounds as more data becomes available.
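A purely illustrative sketch of the coordination idea follows. None of this is from the talk: the cost model, the finite-difference update, and all parameters are invented. A slow-scale capacity decision is refined as batches of fast-scale (hourly) demand data arrive:

```python
import random

random.seed(1)

# Hypothetical two-scale toy: slow scale picks capacity `cap`; fast scale
# dispatches against hourly demand given that capacity.
def fast_scale_cost(cap, demand):
    served = min(cap, demand)
    unmet = demand - served
    return 1.0 * served + 10.0 * unmet     # unmet demand is penalized heavily

capacity_cost = 2.0   # slow-scale cost per unit of capacity
cap = 0.0
step = 0.5

# Coordination loop: after each batch of fast-scale data, nudge the slow-scale
# decision toward lower total cost (a crude stochastic-approximation update).
for batch in range(200):
    demands = [max(0.0, random.gauss(5.0, 1.0)) for _ in range(24)]
    avg = lambda c: sum(fast_scale_cost(c, d) for d in demands) / len(demands)
    # finite-difference estimate of the marginal cost of extra capacity
    grad = capacity_cost + (avg(cap + 1e-3) - avg(cap - 1e-3)) / 2e-3
    cap = max(0.0, cap - step * grad)
    step *= 0.99      # decay the step as more data accumulates

print(round(cap, 2))
```

The talk's algorithm handles mixed-binary decisions and multi-stage uncertainty, which this continuous toy does not attempt; the sketch only shows a slow-scale decision being steered by feedback from fast-scale subproblems as data arrives.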

3. 17:20 — Integer Programming for Learning Directed Acyclic Graphs from Non-identifiable Gaussian Models

We study the problem of learning directed acyclic graphs from continuous observational data generated according to a linear Gaussian structural equation model. State-of-the-art structure learning methods for this setting have at least one of the following shortcomings: i) they cannot provide optimality guarantees and can suffer from learning sub-optimal models; ii) they rely on the stringent assumption that the noise is homoscedastic, and hence that the underlying model is fully identifiable. We overcome these shortcomings and develop a computationally efficient mixed-integer programming framework for medium-sized problems that accounts for arbitrary heteroscedastic noise. We present an early stopping criterion under which the branch-and-bound procedure can be terminated while still achieving an asymptotically optimal solution, and we establish the consistency of this approximate solution. In addition, we show via numerical experiments that our method outperforms three state-of-the-art algorithms and is robust to noise heteroscedasticity, whereas the performance of the competing methods deteriorates under strong violations of the identifiability assumption.
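The search space the MIP explores can be illustrated by brute force on a tiny instance. The sketch below is not the talk's formulation: it replaces the mixed-integer program and branch-and-bound with exhaustive enumeration of all DAGs on three nodes, and the data-generating SEM, coefficients, and BIC-style penalty are invented. The node-wise variance term in the score is what tolerates heteroscedastic noise; note that with Gaussian data the score can only distinguish DAGs up to Markov equivalence, which is the non-identifiability the talk addresses.

```python
import itertools
import math
import random

random.seed(0)
n = 400

# Hypothetical ground-truth linear SEM X0 -> X1 -> X2 with *heteroscedastic*
# noise: very different noise scales per node.
x0 = [random.gauss(0, 1.0) for _ in range(n)]
x1 = [2.0 * a + random.gauss(0, 0.3) for a in x0]
x2 = [-1.0 * b + random.gauss(0, 2.0) for b in x1]
data = [x0, x1, x2]

def solve(A, b):
    """Tiny Gauss-Jordan elimination for the normal equations (k <= 2 here)."""
    k = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for i in range(k):
        M[i] = [v / M[i][i] for v in M[i]]
        for j in range(k):
            if j != i:
                M[j] = [vj - M[j][i] * vi for vj, vi in zip(M[j], M[i])]
    return [M[i][k] for i in range(k)]

def rss(child, parents):
    """Residual sum of squares of OLS of the child on its parents (no intercept:
    the toy data are zero-mean by construction)."""
    y = data[child]
    if not parents:
        return sum(v * v for v in y)
    X = [data[p] for p in parents]
    G = [[sum(a * b for a, b in zip(X[i], X[j])) for j in range(len(X))]
         for i in range(len(X))]
    c = [sum(a * b for a, b in zip(X[i], y)) for i in range(len(X))]
    beta = solve(G, c)
    return sum((yi - sum(bi * X[i][t] for i, bi in enumerate(beta))) ** 2
               for t, yi in enumerate(y))

def is_dag(edges):
    # Kahn's algorithm on the 3 nodes
    indeg = {v: 0 for v in range(3)}
    for u, v in edges:
        indeg[v] += 1
    queue = [v for v in indeg if indeg[v] == 0]
    seen = 0
    while queue:
        u = queue.pop()
        seen += 1
        for a, b in edges:
            if a == u:
                indeg[b] -= 1
                if indeg[b] == 0:
                    queue.append(b)
    return seen == 3

def score(edges):
    # Per-node Gaussian log-likelihood with a node-specific variance (this is
    # what tolerates heteroscedastic noise), plus a BIC-style edge penalty.
    total = 0.0
    for child in range(3):
        parents = [u for u, v in edges if v == child]
        total += n * math.log(rss(child, parents) / n)
    return total + math.log(n) * len(edges)

all_edges = [(u, v) for u in range(3) for v in range(3) if u != v]
best = min((e for r in range(len(all_edges) + 1)
            for e in itertools.combinations(all_edges, r) if is_dag(e)),
           key=score)
print(sorted(best))
```

The recovered graph has the true skeleton X0 - X1 - X2; enumeration of 64 edge subsets is what the MIP replaces with acyclicity constraints and branch-and-bound at scale.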