1. 16:20 — Valid Inequalities for Verification of Binarized Neural Networks

Binarized neural networks (BNNs) are neural networks in which the weights are binary and the activation functions are the sign function. Verifying BNNs against input perturbations is one way to measure their robustness. BNN verification can be formulated as a mixed integer linear optimization problem. The natural formulation is often difficult to solve due to the large integrality gap induced by big-M constraints. We present simple but effective techniques for generating valid inequalities for this formulation that exploit the recursive structure of the network. We find that our techniques enable verifying BNNs against a wider range of input perturbations than existing approaches.
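As a rough illustration of the big-M encoding the abstract refers to, the sketch below models the sign activations of a tiny, hand-specified BNN layer as a MILP with PuLP. The weights, bias terms, nominal input, and perturbation budget are hypothetical illustration values; this is not the paper's formulation and does not include its valid inequalities.

```python
# Minimal sketch: big-M MILP encoding of one BNN layer's sign activations,
# under a Hamming-distance input perturbation. All data below is made up.
import pulp

# Hypothetical trained layer: 3 inputs in {-1,+1}, 2 neurons, weights in {-1,+1}.
W = [[1, -1, 1],
     [-1, 1, 1]]
b = [0, -1]
x_nominal = [1, 1, -1]   # nominal +/-1 input
budget = 1               # allow flipping at most one input bit

prob = pulp.LpProblem("bnn_layer_bigM", pulp.LpMaximize)

# u[j] = 1 if input j takes value +1, else 0 (so x_j = 2*u_j - 1).
u = [pulp.LpVariable(f"u_{j}", cat="Binary") for j in range(3)]

# Perturbation: Hamming distance to the nominal input is at most `budget`.
flip = [pulp.LpVariable(f"flip_{j}", cat="Binary") for j in range(3)]
for j in range(3):
    nominal_bit = 1 if x_nominal[j] == 1 else 0
    # flip_j = |u_j - nominal_bit| for binary values.
    if nominal_bit == 1:
        prob += flip[j] == 1 - u[j]
    else:
        prob += flip[j] == u[j]
prob += pulp.lpSum(flip) <= budget

# Sign activation of each neuron via a binary indicator z_i and big-M constraints.
z = [pulp.LpVariable(f"z_{i}", cat="Binary") for i in range(2)]
for i in range(2):
    # Integer-valued pre-activation a_i = sum_j W[i][j] * (2*u_j - 1) + b_i.
    a = pulp.lpSum(W[i][j] * (2 * u[j] - 1) for j in range(3)) + b[i]
    M = sum(abs(w) for w in W[i]) + abs(b[i])
    prob += a >= -M * (1 - z[i])        # z_i = 1  forces  a_i >= 0  (sign = +1)
    prob += a <= M * z[i] - (1 - z[i])  # z_i = 0  forces  a_i <= -1 (sign = -1)

# Toy verification query: can a budget-limited flip drive both activations to +1?
prob += pulp.lpSum(z)
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("activations:", [int(v.value()) for v in z])
```

The LP relaxation of the two big-M constraints above is weak when M is large, which is exactly the integrality-gap issue the abstract's valid inequalities target.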

2. 16:50 — Slimming down and beefing up: MILP Multi-Objective Optimization and Training for Few-Bit Neural Networks

Combinatorial optimization solvers have gained attention for training neural networks, particularly in scenarios with limited data availability. Employing sophisticated solvers such as mixed integer linear programming enables precise network training without reliance on computationally intensive GPU-based methods. Our focus is on few-bit discrete-valued neural networks, encompassing architectures like Binarized Neural Networks (BNNs) and Integer Neural Networks (INNs). These lightweight models are distinguished by their capacity to run effectively on low-power devices, often implemented using Boolean operations. Our proposed methodology trains a network for each class pair within the classification problem and employs a majority voting scheme for output prediction. The optimization process enforces robustness and sparsity through a multi-objective function. We evaluate this approach, named BeMi, against existing solver-based and gradient-based techniques, particularly in the context of few-shot learning with BNNs. Additionally, we analyze trade-offs between INNs and BNNs. Empirical results on the MNIST dataset demonstrate that BeMi achieves higher accuracy (up to 81.8%) with a notable reduction in active weights, surpassing previous methodologies.
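The pairwise "one network per class pair, then majority vote" prediction scheme described above can be sketched as follows. The pairwise classifiers here are hypothetical stand-ins (plain lambdas) for the few-bit networks BeMi would obtain from its MILP training step, and the function name is made up for illustration.

```python
# Minimal sketch of majority-vote prediction over pairwise classifiers.
from itertools import combinations
from collections import Counter

def predict_majority(x, pairwise_models, classes):
    """pairwise_models[(i, j)] returns either i or j for input x."""
    votes = Counter()
    for i, j in combinations(classes, 2):
        votes[pairwise_models[(i, j)](x)] += 1
    # The class collecting the most pairwise wins is the prediction.
    return votes.most_common(1)[0][0]

# Hypothetical stand-ins for three trained pairwise networks on classes {0, 1, 2}.
models = {
    (0, 1): lambda x: 0 if sum(x) >= 0 else 1,
    (0, 2): lambda x: 0 if x[0] >= 0 else 2,
    (1, 2): lambda x: 1 if x[1] >= 0 else 2,
}
print(predict_majority([1, -1, 1], models, classes=[0, 1, 2]))
```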

3. 17:20 — Differentiable Cutting-plane Layers for Mixed-integer Linear Optimization

We consider the problem of optimizing a sequence of mixed-integer linear optimization problems with varying parameters. By combining recent advances in cutting-plane generation and differentiation through convex optimization problems, we construct a new differentiable architecture to predict the optimal cutting planes from the key parameters of each instance. During the offline phase, our method maximizes the cutting planes' efficiency by evaluating the derivative of the solution of the continuous relaxations with respect to the cut-generating parameters. Preliminary computational results show that, once trained, our architecture computes solutions with low infeasibility and suboptimality, with fast and predictable execution times.
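A minimal sketch of the core building block the abstract describes, differentiating a continuous relaxation's solution with respect to the parameters of an added cut, can be assembled with cvxpy and cvxpylayers. The problem data, the single parametric cut, the small quadratic regularizer (added here only so the relaxation's solution is unique and the derivative is informative), and the loss are all illustrative assumptions, not the paper's architecture.

```python
# Minimal sketch: differentiate a relaxed solution w.r.t. cut parameters.
import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

n = 2
x = cp.Variable(n)
alpha = cp.Parameter(n)   # cut normal
beta = cp.Parameter()     # cut right-hand side
c = cp.Parameter(n)       # objective of the relaxation

constraints = [x >= 0, x <= 1,           # box relaxation of binary variables
               2 * x[0] + x[1] <= 2.5,   # original constraint
               alpha @ x <= beta]        # the parametric cutting plane
# Small quadratic term keeps the solution map single-valued and differentiable.
problem = cp.Problem(cp.Minimize(c @ x + 0.05 * cp.sum_squares(x)), constraints)
layer = CvxpyLayer(problem, parameters=[c, alpha, beta], variables=[x])

c_t = torch.tensor([-1.0, -1.0])
alpha_t = torch.tensor([1.0, 1.0], requires_grad=True)
beta_t = torch.tensor(1.5, requires_grad=True)

x_star, = layer(c_t, alpha_t, beta_t)
# Example loss: distance of the relaxed solution from integrality.
loss = ((x_star - x_star.round()) ** 2).sum()
loss.backward()
print(x_star.detach(), alpha_t.grad, beta_t.grad)
```

The gradients of the loss with respect to alpha and beta are what an offline training loop could use to adjust the cut-generating parameters across instances.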