1. 14:00 — A Novel Column Generation Framework for One-for-many Counterfactual Explanations

In this talk, we address the problem of generating a set of counterfactual explanations for a group of instances using the one-for-many allocation rule. We aim to minimize the number of explanations necessary to explain all instances, while also enforcing sparsity by restricting the set of features collectively perturbed within each explanation. We introduce a novel column generation methodology that is adaptable to diverse black-box classifiers, including neural networks. Through a comparison with a straightforward adaptation of a mixed-integer formulation from the literature, we demonstrate the superiority of our column generation approach in terms of scalability, computational efficiency, and solution quality.
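The core one-for-many objective — explain every instance with as few explanations as possible — is a set-cover-type problem. The sketch below illustrates only that objective with a greedy set-cover heuristic; it is not the column generation method of the talk, and the explanation ids and coverage sets are hypothetical.

```python
def greedy_cover(instances, explanations):
    """Pick explanations until every instance is covered (one-for-many rule).

    instances:    set of instance ids to explain
    explanations: dict mapping explanation id -> set of instances it explains
    """
    uncovered = set(instances)
    chosen = []
    while uncovered:
        # pick the explanation covering the most still-uncovered instances
        best = max(explanations, key=lambda e: len(explanations[e] & uncovered))
        if not explanations[best] & uncovered:
            raise ValueError("some instances cannot be explained")
        chosen.append(best)
        uncovered -= explanations[best]
    return chosen

# hypothetical coverage data
expl = {"e1": {1, 2, 3}, "e2": {3, 4}, "e3": {4, 5}}
print(greedy_cover({1, 2, 3, 4, 5}, expl))  # -> ['e1', 'e3']
```

Column generation replaces the greedy choice by pricing out new explanation "columns" against the dual values of a restricted master LP, which is what makes the approach scale to large instance sets.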

2. 14:30 — Solving the Contextual Multiobjective Inverse Ideal Point Problem via Mathematical Optimization

We propose a Mathematical Optimization model tailored to the challenging Contextual Multiobjective Inverse Ideal Point Problem (CMIIPP). The CMIIPP consists of recovering the underlying objective functions of a multicriteria decision-making model based solely on a set of pairs of concurrently observed contexts and decisions, focusing in particular on the ideal point scenario in multiobjective optimization.
By formulating the CMIIPP as an optimization problem, our approach offers a framework to uncover the implicit preferences governing the decision-making process that led to a given ideal point solution, while preserving its consistency. Furthermore, we discuss how to incorporate domain-specific knowledge to enhance applicability and how to address other requirements, such as interpretability.

3. 15:00 — Distance-Based Fairness in Classification and Regression

In this presentation we introduce a novel approach to integrating fairness considerations into regression and classification models using mathematical programming techniques. We consider a setting with a sensitive and a non-sensitive group defined by some attribute (e.g., race or gender), and we wish to make the predictions for the two groups similar with respect to a distance. Our approach is applicable to well-known methodologies such as LASSO and Support Vector Machines (SVM). In LASSO, this is achieved by choosing a decision threshold on the predictions and considering the distance between the two groups above the chosen threshold. For SVM, it is done by incorporating a penalty based on the Wasserstein distance, which measures the disparity in score distributions between the sensitive and non-sensitive groups. We show that this can be formulated as the original SVM problem with a fairer kernel, and we solve the problem with an alternating algorithm. Both models are illustrated on real-world datasets, where we significantly improve fairness while maintaining high accuracy.
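For two equal-sized empirical samples on the real line, the 1-Wasserstein distance reduces to the mean absolute difference of the sorted scores, since optimal transport in one dimension matches points in sorted order. A minimal sketch of such a penalty (not the authors' SVM formulation; the group scores are made up):

```python
import numpy as np

def wasserstein_1d(u, v):
    """1-Wasserstein distance between two equal-size empirical samples.

    In one dimension the optimal transport plan matches sorted order,
    so the distance is the mean absolute difference of order statistics.
    """
    u, v = np.sort(np.asarray(u, float)), np.sort(np.asarray(v, float))
    assert u.shape == v.shape, "this sketch assumes equal sample sizes"
    return float(np.mean(np.abs(u - v)))

# hypothetical decision scores for the sensitive / non-sensitive groups
scores_a = [0.2, 0.5, 0.9]
scores_b = [0.1, 0.6, 0.7]
penalty = wasserstein_1d(scores_a, scores_b)  # (0.1 + 0.1 + 0.2) / 3
```

In the talk's setting, a term proportional to this distance would be added to the SVM objective, so that minimizing it trades margin against the disparity between the two score distributions.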

4. 15:30 — Sequential Counterfactual Decisions

Given a probabilistic classifier in a binary classification problem, a counterfactual solution for a record is a feasible solution that is close to the record and has a high probability of being labeled in the positive class by the classifier. Finding a counterfactual solution for a given record amounts to finding a Pareto-optimal solution to the problem of simultaneously maximizing closeness and the probability of classification in the positive class.

In this talk, we address the problem where, instead of a single counterfactual solution, we seek a counterfactual solution plan, i.e., a sequence of T solutions with increasing probabilities of positive classification that minimizes (a function of) the overall distances. The resulting optimization problems are analyzed for different choices of the classifier and of metrics induced by norms or, in their asymmetric version, gauges.