08:30 — Column Elimination

Column elimination is an exact algorithm for solving discrete optimization problems posed in a 'column' formulation, in which variables represent combinatorial structures such as machine schedules or truck routes. Column elimination works with a relaxed set of columns, from which infeasible columns are iteratively eliminated. Because the total number of columns is typically exponentially large, we use relaxed decision diagrams to compactly represent and manipulate the set of columns. In this talk, we give an introduction to the column elimination method and present recent algorithmic improvements that yield competitive performance on large-scale vehicle routing problems. In particular, our approach closes a large vehicle routing benchmark instance with 1,000 locations for the first time.
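The core loop can be illustrated on a toy set-covering instance: solve over a relaxed column set, and whenever the solution uses an infeasible column, eliminate it and re-solve. This is a minimal sketch only; the function names, the brute-force subproblem solver, and the capacity-based feasibility check are illustrative assumptions, not the decision-diagram machinery described in the talk.

```python
from itertools import combinations

def min_cost_cover(columns, costs, tasks):
    """Brute-force cheapest selection of columns whose union covers all tasks."""
    best, best_cost = None, float("inf")
    for r in range(1, len(columns) + 1):
        for sel in combinations(range(len(columns)), r):
            if set().union(*(columns[i] for i in sel)) >= tasks:
                cost = sum(costs[i] for i in sel)
                if cost < best_cost:
                    best, best_cost = sel, cost
    return best, best_cost

def column_elimination(columns, costs, tasks, is_feasible):
    """Solve over a relaxed column set, eliminating infeasible columns as found."""
    cols, cs = list(columns), list(costs)
    while True:
        sel, val = min_cost_cover(cols, cs, tasks)
        bad = {i for i in sel if not is_feasible(cols[i])}
        if not bad:
            return [cols[i] for i in sel], val
        # The relaxation chose infeasible columns: remove them and re-solve.
        cols = [c for i, c in enumerate(cols) if i not in bad]
        cs = [c for i, c in enumerate(cs) if i not in bad]

# Relaxed column set: includes {1, 2, 3}, which violates a capacity of 2
# (a stand-in for, say, a route exceeding vehicle capacity).
tasks = {1, 2, 3}
columns = [{1, 2, 3}, {1, 2}, {2, 3}, {1}, {3}]
costs = [2, 2, 2, 1, 1]
sol, val = column_elimination(columns, costs, tasks, lambda c: len(c) <= 2)
print(sol, val)  # the cheap but infeasible column {1, 2, 3} gets eliminated
```

In the actual method the column set is never enumerated explicitly; a relaxed decision diagram plays the role of `cols`, and elimination refines the diagram.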

09:00 — Sequential Mixed-Integer Programming for Matrix Sparsification

Sparsity of matrices can be exploited in various applications, e.g., to speed up the solution of linear systems or in second-order optimization algorithms. This motivates considering the matrix sparsification problem (MS), in which we seek an equivalent representation of a given matrix with as few nonzero entries as possible. While MS and the equivalent sparse nullspace basis problem have been of interest for decades, the only known solution approaches appear to be heuristics or methods relying on restrictive assumptions that may be unlikely to hold in practice. We propose a sequential mixed-integer linear programming algorithm for matrix sparsification that leverages relations to matroid theory, discuss connections to a machine learning task known as dictionary learning, and show some numerical results.
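To make "equivalent representation" concrete: multiplying A on the left by an invertible matrix T preserves the row space, so B = TA carries the same information while possibly having fewer nonzeros. The tiny hand-picked example below is an assumption for illustration; it shows the objective of MS, not the sequential MILP algorithm from the talk.

```python
def nnz(M):
    """Count nonzero entries of a matrix given as a list of rows."""
    return sum(1 for row in M for v in row if v != 0)

A = [[1, 1, 0],
     [1, 1, 1]]

# Invertible T encoding the elementary row operation R2 <- R2 - R1;
# B = T A has the same row space as A but fewer nonzero entries.
T = [[1, 0],
     [-1, 1]]
B = [[sum(T[i][k] * A[k][j] for k in range(2)) for j in range(3)]
     for i in range(2)]

print(nnz(A))  # 5 nonzeros in the original representation
print(nnz(B))  # 3 nonzeros in the equivalent sparser one
```

MS asks for the T minimizing `nnz(T @ A)` over all invertible T, which is what makes the problem hard and motivates the MILP approach.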

09:30 — An improved weighting local search for large-scale binary integer programs

We have developed a 4-flip neighborhood local search (LS) algorithm for binary integer programs (BIPs) that incorporates efficient incremental evaluation of solutions and adaptive control of penalty weights. The proposed algorithm attained good performance on large-scale set covering and partitioning problems consisting of homogeneous constraints. However, we also identified two shortcomings of the proposed algorithm: (1) it does not perform well on some combinatorial optimization problems, such as bin packing, that consist of heterogeneous constraints, and (2) the 4-flip neighborhood search is less effective than expected while consuming considerable computation time.
To overcome these issues, we first introduce an improved method for extracting variable associations that identifies promising pairs of flipping variables in the neighborhood search. We observe that many real-world BIPs consist of typical constraint types, such as knapsack, set packing, covering and partitioning, bin packing, and variable bounding constraints. We then evaluate promising pairs of flipping variables by computing the inner product of the partial column vectors associated with each typical constraint type. We also replace the 4-flip neighborhood search with an iterated local search (ILS) algorithm to diversify the search. ILS is a simple and effective framework that repeatedly restarts LS from initial solutions generated by perturbing (a 'kick' operation) the best solution found so far. It is known that a naive ILS often fails to escape local minima because the search rewinds to the pre-kick solution, yet designing an effective kick operation is difficult because BIPs expose few distinctive structural features to exploit. We accordingly develop a generalized diversification procedure for ILS that prevents rewinding after the kick operation by freezing the flipped variables.
Based on the proposed algorithm, we develop an open heuristic solver for BIP and report our computational results for a wide variety of test instances.
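The freeze-after-kick idea can be sketched on a toy set-covering BIP with a penalty objective. This uses simple 1-flip moves rather than the 4-flip neighborhoods and adaptive weights from the abstract, and all names and parameters are illustrative assumptions.

```python
import random

# Toy set-covering BIP: binary variable x[j] selects SETS[j]; the goal is
# to cover all elements at minimum cost, with a penalty per uncovered element.
SETS = [{0, 1}, {1, 2}, {2, 3}, {3, 4}, {0, 4}, {0, 1, 2, 3, 4}]
COSTS = [1, 1, 1, 1, 1, 3]
N_ELEMS, PENALTY = 5, 100

def objective(x):
    covered = set().union(*(SETS[j] for j in range(len(x)) if x[j]))
    return sum(c for c, v in zip(COSTS, x) if v) + PENALTY * (N_ELEMS - len(covered))

def local_search(x, frozen=frozenset()):
    """Greedy 1-flip descent that skips frozen variables."""
    improved = True
    while improved:
        improved = False
        for j in range(len(x)):
            if j in frozen:
                continue
            y = x[:]
            y[j] = 1 - y[j]
            if objective(y) < objective(x):
                x, improved = y, True
    return x

def ils(iters=20, k=2, seed=0):
    rng = random.Random(seed)
    best = local_search([rng.randint(0, 1) for _ in SETS])
    for _ in range(iters):
        kicked = rng.sample(range(len(SETS)), k)
        y = best[:]
        for j in kicked:
            y[j] = 1 - y[j]
        # Freeze the kicked variables during LS so the search cannot
        # immediately rewind the perturbation, then polish unrestricted.
        y = local_search(y, frozen=frozenset(kicked))
        y = local_search(y)
        if objective(y) < objective(best):
            best = y
    return best

best = ils()
print(best, objective(best))
```

With the high penalty, any 1-flip local optimum covers every element, so the freeze only steers diversification and never sacrifices feasibility in this toy.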

10:00 — Sensitivity analysis for linear change of the constraint matrix of an LP

We focus on the parametric linear program $f(l) = \min\, cx$ subject to $(A + lD)x \leq b$, where D is a matrix that modifies the coefficients of the constraint matrix A through the parameter l; this formulation can be used to analyze the impact of multiple simultaneous linear changes in the constraint matrix. As a new approach to the problem, we derive five bounds that allow us to avoid re-solving the problem for numerous values of l while providing guarantees on the objective function's behavior. We present an iterative algorithm that uses these bounds to compute an approximation of the original function within a given error threshold. We analyze the performance of the bounds on toy and real-world problems and demonstrate the effectiveness of the approach.
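The naive baseline that the bounds aim to avoid is re-solving the LP from scratch for every value of l. The sketch below does exactly that on an assumed two-variable toy instance, solving each LP by enumerating constraint-pair vertices; it illustrates the parametric setup, not the talk's bounding algorithm.

```python
from itertools import combinations

def lp_min(c, A, b, tol=1e-9):
    """Minimize c.x over {x : A x <= b} for a bounded 2-variable LP
    by enumerating intersections of constraint pairs (the vertices)."""
    best = None
    for i, j in combinations(range(len(A)), 2):
        det = A[i][0] * A[j][1] - A[i][1] * A[j][0]
        if abs(det) < tol:
            continue  # parallel constraints: no unique intersection point
        x = (b[i] * A[j][1] - b[j] * A[i][1]) / det
        y = (A[i][0] * b[j] - A[j][0] * b[i]) / det
        if all(A[k][0] * x + A[k][1] * y <= b[k] + tol for k in range(len(A))):
            val = c[0] * x + c[1] * y
            if best is None or val < best:
                best = val
    return best

def f(l, c, A, D, b):
    """f(l) = min c.x s.t. (A + l D) x <= b, re-solved from scratch."""
    M = [[A[k][t] + l * D[k][t] for t in range(2)] for k in range(len(A))]
    return lp_min(c, M, b)

# Toy instance: maximize x + y (minimize -x - y) with x, y >= 0; the
# parameter l tightens the two covering constraints through D.
c = (-1.0, -1.0)
A = [[1, 1], [1, 1], [-1, 0], [0, -1]]
D = [[1, 0], [0, 1], [0, 0], [0, 0]]
b = [2.0, 2.0, 0.0, 0.0]

values = [f(l, c, A, D, b) for l in (0.0, 0.5, 1.0, 1.5, 2.0)]
print(values)  # here f is nondecreasing: raising l only shrinks the feasible set
```

Each evaluation re-solves the whole LP; the bounds from the talk instead certify the behavior of f(l) over intervals of l without these extra solves.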