1. 14:00 — Distributionally robust optimization through the lens of submodularity

Distributionally robust optimization is used to solve decision-making problems under adversarial uncertainty, where the distribution of the uncertainty is itself ambiguous. In this paper, we identify a class of such instances that is solvable in polynomial time by viewing them through the lens of submodularity. We discuss connections to the multimarginal optimal transport problem and the generalized moment problem, bridging ideas from convexity in continuous optimization to submodularity in discrete optimization.
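
As a point of reference for the connection mentioned in the abstract, the marginal-based worst-case expectation can be read as a multimarginal optimal transport problem with cost equal to the negated objective. The notation below is ours, not the paper's, and is only meant as an illustrative sketch.

```latex
% Worst-case expectation over joint distributions with fixed marginals
% \mu_1,\dots,\mu_n (notation illustrative, not taken from the paper).
% With cost -f, this is a multimarginal optimal transport problem.
\sup_{\pi \in \Pi(\mu_1,\dots,\mu_n)} \mathbb{E}_{\pi}\!\left[ f(\xi_1,\dots,\xi_n) \right]
  \;=\; -\inf_{\pi \in \Pi(\mu_1,\dots,\mu_n)} \mathbb{E}_{\pi}\!\left[ -f(\xi_1,\dots,\xi_n) \right]
```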

2. 14:30 — When submodularity meets pairwise independence

Submodularity provides a natural framework for modeling a large class of discrete optimization problems, including influence maximization, mechanism design, resource allocation, and several machine learning problems. As a property of set functions, submodularity captures the notion of diminishing returns in discrete spaces. Theoretically, it has intrigued researchers due to its strong structural similarity with both convex and concave functions in continuous spaces, which has been exploited, via continuous extensions, to derive approximation guarantees for deterministic submodular optimization problems. Our work, however, approaches submodular optimization through the lens of distributional robustness, which seeks to evaluate or approximate the worst-case expected value of a submodular set function (with random inputs) over a set of joint distributions satisfying certain assumptions. Even with univariate information (the marginal probabilities of the random inputs) alone, evaluating this optimal expected value is known to be NP-hard. Existing approaches tackle this hardness by approximating the optimal expected value under the assumption that the random inputs are independent. This notion is formalized by the correlation gap, which quantifies how much we “lose” in the expectation of the function by ignoring the correlation structure of the random set and assuming independence instead. For monotone submodular set functions, Agrawal et al. (2012) showed that the correlation gap is upper bounded by e/(e-1). In reality, however, more complex notions of randomness are often encountered, such as when weak correlations coexist with higher-order dependencies. Motivated by the need to incorporate more realistic notions of randomness, and driven by the curiosity to understand the interplay between functional properties and randomness, we study the behavior of monotone submodular set functions with pairwise independent random inputs. We show that in this setting the e/(e-1) bound on the correlation gap improves to 4/3 (and that this is tight) in the following cases: (a) for a small number of random inputs with general marginal probabilities; (b) for a general number of random inputs with small marginal probabilities; and (c) for a specific submodular function (whose expectation is the probability that the chosen input set is non-empty) with a general number of random inputs and general marginal probabilities. For rank functions of k-uniform matroids, we show that the ratio can be further improved to 4k/(4k-1) for a general number of random inputs with identical probabilities. Applications to distributionally robust optimization and mechanism design are demonstrated. Our results illustrate a fundamental difference in the behavior of submodular functions under weaker notions of independence, with potential ramifications for improving existing algorithmic approximations.
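
To make the correlation gap concrete, the following minimal sketch computes it numerically for a small instance: the worst-case expectation over all joint distributions with given marginals (an LP over the 2^n subsets), divided by the expectation under independent marginals. It uses the classical marginal-information setting rather than the pairwise-independent setting studied in the talk, assumes numpy and scipy are available, and uses as its example the non-emptiness function f(S) = min(|S|, 1) mentioned in the abstract.

```python
# Minimal sketch of the (classical) correlation gap under marginal information
# only; not the pairwise-independent variant analyzed in the talk.
from itertools import chain, combinations

import numpy as np
from scipy.optimize import linprog


def all_subsets(n):
    items = range(n)
    return list(chain.from_iterable(combinations(items, r) for r in range(n + 1)))


def correlation_gap(f, p):
    """Worst-case E[f(S)] over joint distributions with marginals p,
    divided by E[f(S)] under independent marginals."""
    n = len(p)
    subsets = all_subsets(n)
    fvals = np.array([f(set(S)) for S in subsets])

    # Worst case: maximize sum_S x_S f(S) subject to marginal constraints
    # sum_{S: i in S} x_S = p_i and sum_S x_S = 1, with x >= 0 (an LP).
    A_eq = np.zeros((n + 1, len(subsets)))
    for j, S in enumerate(subsets):
        A_eq[n, j] = 1.0
        for i in S:
            A_eq[i, j] = 1.0
    b_eq = np.append(p, 1.0)
    res = linprog(-fvals, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
    worst_case = -res.fun

    # Independence benchmark: each item i included independently w.p. p_i.
    indep = 0.0
    for S, fS in zip(subsets, fvals):
        prob = np.prod([p[i] if i in S else 1 - p[i] for i in range(n)])
        indep += prob * fS
    return worst_case / indep


if __name__ == "__main__":
    f = lambda S: min(len(S), 1)            # monotone submodular: P(set non-empty)
    print(correlation_gap(f, [0.5, 0.5]))   # ratio is at most e/(e-1) ~ 1.582
```

On this two-item example the worst-case distribution splits its mass between the two singletons, giving a ratio of 4/3, comfortably within the e/(e-1) bound of Agrawal et al. (2012).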

3. 15:00 — Convex Optimization for Bundle Size Pricing Problem

  • Hailong Sun, Antai College of Economics and Management, Shanghai Jiao Tong University
  • Li Xiaobo, National University of Singapore
  • Chung Piaw Teo, National University of Singapore

We study the bundle size pricing (BSP) problem, in which a monopolist sells bundles of products to customers and the price of each bundle depends only on its size (number of items). Although this pricing mechanism is attractive in practice, finding optimal bundle prices is difficult because it involves characterizing the distributions of the maximum partial sums of order statistics. In this paper, we propose to solve the BSP problem under the cross moment model (CMM), which is constructed using only the first and second moments of customer valuations. Correlations between valuations of bundles are captured by the covariance matrix. We show that the BSP problem under this model is convex and can be efficiently solved using off-the-shelf solvers. Our approach is flexible in optimizing prices for any given bundle size. Numerical results show that it performs very well compared with state-of-the-art heuristics. This provides a unified and efficient approach to solving the BSP problem across various distributions and dimensions. We also provide a few insights regarding the BSP problem and the CMM. First, the BSP problem can be converted into a multichoice pricing problem with correlated valuations, and it is crucial to capture such correlations to construct good approximate solutions. Second, using only moment information, as the CMM does, can be adequate for constructing a good approximation to the BSP problem.
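
The CMM formulation itself is not reproduced here, but the structure that makes BSP hard is easy to simulate: under size-based prices, a customer buying a bundle of size k optimally picks her k highest-valued items, so her surplus for size k is the k-th partial sum of her order statistics minus the price. The sketch below is a Monte Carlo illustration of that purchase rule; the normally distributed valuations and the particular prices are hypothetical choices, and only numpy is assumed.

```python
# Minimal simulation sketch of the bundle size pricing (BSP) purchase rule.
import numpy as np


def expected_revenue(prices, mean, cov, n_samples=10_000, seed=0):
    """Monte Carlo revenue estimate; prices[k-1] is the price of any size-k bundle."""
    rng = np.random.default_rng(seed)
    revenue = 0.0
    for v in rng.multivariate_normal(mean, cov, size=n_samples):
        top = np.sort(v)[::-1]              # valuations as descending order statistics
        partial = np.cumsum(top)            # best achievable value for each bundle size
        surplus = partial - np.asarray(prices)
        k = int(np.argmax(surplus))         # utility-maximizing bundle size
        if surplus[k] >= 0:                 # customer buys only if surplus is non-negative
            revenue += prices[k]
    return revenue / n_samples


if __name__ == "__main__":
    mean = np.array([3.0, 2.0, 1.0])        # hypothetical mean valuations
    cov = np.eye(3)                         # hypothetical covariance matrix
    print(expected_revenue([2.5, 4.0, 4.5], mean, cov))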

4. 15:30 — Progress in Modeling and Solving Distributionally Robust Optimization Problems

  • He Simai, Shanghai Jiao Tong University

Distributionally robust optimization (DRO) focuses on making efficient and robust decisions with limited data or information. The two key questions are how to formulate the uncertainty set for the unknown distribution and how to solve the corresponding model efficiently. In this talk, we present our recent progress on using limited information efficiently to build sharper uncertainty sets that yield more robust and accurate decisions, together with the corresponding solution approaches.

First, for one-dimensional moment-based DRO models, we establish the value of distributional shape information by showing its influence on model accuracy, identify the optimal solution structure via reverse convex optimization theory, and propose numerical approaches for solving such models. Second, for heavy-tailed or light-tailed distributions, general moments (expectations of general functions of the random variable), such as the entropy function, the moment generating function, the log function, and fractional power functions, are widely used in other areas such as statistics. We construct a general approach for solving one-dimensional DRO problems with general moments by reducing the primal-dual complementary-slackness conditions to determinants of matrices similar to Vandermonde matrices, which are much easier to analyze and solve than the original conditions. We further illustrate how this approach yields analytical or semi-analytical solutions to various problems. Third, we report recent progress on extending this work to high-dimensional settings. Finally, we show that correlation information can greatly enhance the accuracy of data-driven DRO models based on the Wasserstein distance and ease the curse of dimensionality in such models.
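
For orientation, a one-dimensional moment-based DRO subproblem can be approximated numerically by restricting the distribution to a discretized support and solving a linear program over the probability masses. The sketch below illustrates this generic device only; the support bounds, grid, and loss function are illustrative assumptions, and it does not reproduce the reverse convex or Vandermonde-determinant reductions described in the talk. It assumes numpy and scipy are available.

```python
# Minimal sketch: worst-case expectation of a loss h(xi) over all distributions
# matching a given mean and second moment, approximated by an LP on a grid.
import numpy as np
from scipy.optimize import linprog


def worst_case_expectation(h, mean, second_moment, lo, hi, n_grid=200):
    xi = np.linspace(lo, hi, n_grid)                 # discretized support (assumption)
    hvals = h(xi)
    # Decision variables: probability mass at each grid point.
    # Constraints: total mass 1, first moment = mean, second moment matched.
    A_eq = np.vstack([np.ones(n_grid), xi, xi ** 2])
    b_eq = np.array([1.0, mean, second_moment])
    res = linprog(-hvals, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return -res.fun if res.success else None


if __name__ == "__main__":
    # Worst-case expected shortfall beyond a threshold of 1, given mean 0 and variance 1.
    h = lambda x: np.maximum(x - 1.0, 0.0)
    print(worst_case_expectation(h, mean=0.0, second_moment=1.0, lo=-10, hi=10))
```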