1 — 16:20 — A parallelizable ADMM approach for block-structured quadratic programs
Block-structured quadratic programs (QPs) arise frequently in optimal control problems. The goal of our study is to solve these QPs quickly, which is crucial for the successful application of optimal control algorithms to many real-world problems. Existing ADMM-based QP solvers usually split the problem into an equality-constrained QP and a projection onto a box. While this splitting is proven and widely used, it does not fully exploit the block structure of the QPs arising from optimal control problems. With our new splitting of the problem we are able to exploit this block structure and solve parts of the problem in parallel. We introduce the resulting method and show first numerical results.
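For reference, the classical splitting mentioned above (not the authors' new parallelizable one) can be sketched in a few lines of NumPy, assuming a QP of the form min ½xᵀPx + qᵀx subject to Ax = b and box bounds; the function name, penalty parameter rho, and iteration count below are illustrative choices, not part of the talk.

    import numpy as np

    def admm_box_qp(P, q, A, b, lo, hi, rho=1.0, iters=200):
        """Sketch of the classical ADMM splitting for
            min 1/2 x'Px + q'x  s.t.  Ax = b,  lo <= x <= hi,
        alternating an equality-constrained QP solve with a box projection."""
        n, m = P.shape[0], A.shape[0]
        x = np.zeros(n)   # iterate handled by the equality-constrained QP
        z = np.zeros(n)   # copy of x constrained to the box
        w = np.zeros(n)   # scaled dual variable for the consensus constraint x = z
        # KKT matrix of the equality-constrained subproblem (constant across
        # iterations, so in practice it would be factorized once)
        K = np.block([[P + rho * np.eye(n), A.T],
                      [A, np.zeros((m, m))]])
        for _ in range(iters):
            # x-update: min 1/2 x'Px + q'x + rho/2 ||x - z + w||^2  s.t.  Ax = b
            rhs = np.concatenate([-q + rho * (z - w), b])
            x = np.linalg.solve(K, rhs)[:n]
            # z-update: projection onto the box
            z = np.clip(x + w, lo, hi)
            # dual update for the consensus constraint
            w += x - z
        return z  # box-feasible iterate; x and z coincide at convergence

    # toy usage: project c onto {x : sum(x) = 1, 0 <= x <= 1}
    P = 2 * np.eye(3); q = -2 * np.array([0.2, 0.5, 0.9])
    A = np.ones((1, 3)); b = np.array([1.0])
    print(admm_box_qp(P, q, A, b, lo=0.0, hi=1.0))  # approx. [0, 0.3, 0.7]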
2 — 16:50 — Fuzzy clustering with capacity constraints: theory, algorithm, convergence analysis and numerical experiments
In this talk we present a theoretical analysis of fuzzy centroid-based clustering methods. We consider constraints that are useful in some practical applications, such as restrictions on the number of points in each group, and methods that handle these constraints. We propose a more general formulation of the constrained clustering problem, in which each point has an associated weight and the sum of the weights of the points composing each group is fixed a priori. We discuss existence and uniqueness of solutions of the involved problems, providing mathematical foundations for the established formulas. In addition, we propose a practical algorithm for solving this problem and present its convergence analysis. The algorithm follows an alternating minimization scheme: each iteration first solves the problem in terms of the probabilities of each point belonging to each cluster, and then updates the positions of the centroids. The probabilities are computed by solving a linear system derived from the Karush-Kuhn-Tucker conditions. To validate the algorithm, we present numerical tests on synthetic data that demonstrate its performance on the given problems.
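As a rough illustration of the alternating minimization scheme, the following NumPy sketch implements the standard weighted fuzzy c-means updates (fuzzifier m). It shows only the unconstrained variant; the capacity-constrained membership step of the talk, solved via a KKT-derived linear system, is not reproduced here, and all names and parameters are illustrative.

    import numpy as np

    def weighted_fuzzy_cmeans(X, w, k, m=2.0, iters=50, seed=0, eps=1e-12):
        """Alternate minimization for weighted fuzzy clustering: update the
        memberships with centroids fixed, then the centroids with memberships
        fixed (unconstrained variant, without capacity constraints)."""
        rng = np.random.default_rng(seed)
        C = X[rng.choice(len(X), size=k, replace=False)]   # initial centroids
        for _ in range(iters):
            # membership step: closed-form minimizer of sum_ij w_i u_ij^m ||x_i - c_j||^2
            # subject to sum_j u_ij = 1 (obtained from the KKT conditions of this subproblem)
            d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2) + eps
            inv = d2 ** (-1.0 / (m - 1.0))
            U = inv / inv.sum(axis=1, keepdims=True)
            # centroid step: weighted means with weights w_i * u_ij^m
            W = w[:, None] * U ** m
            C = (W.T @ X) / W.sum(axis=0)[:, None]
        return U, C

    # toy usage with two well-separated groups and unit point weights
    X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 5.0])
    U, C = weighted_fuzzy_cmeans(X, w=np.ones(len(X)), k=2)
    print(C)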
3 — 17:20 — The Smoothed Duality Gap as a Stopping Criterion
We aim to reduce the running time of primal-dual algorithms for convex optimization problems under affine equality constraints by improving their stopping criteria, so that the algorithms can be terminated earlier, after fewer iterations. We study the relations between four stopping criteria and show under which conditions they accurately detect optimal solutions: the uncomputable "optimality gap and feasibility error", and the computable "Karush-Kuhn-Tucker error", "projected duality gap", and "smoothed duality gap". Assuming metric sub-regularity or a quadratic error bound, we establish that all of the computable criteria provide practical upper bounds for the optimality gap and approximate it effectively. Furthermore, we establish comparability between some of the computable criteria under certain conditions. Numerical experiments on basis pursuit and on quadratic programs with and without non-negative weights corroborate these findings and show the superior stability of the smoothed duality gap over the other criteria.
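As an illustration of one of the computable criteria, the following NumPy sketch evaluates the Karush-Kuhn-Tucker error for an equality-constrained QP; the projected and smoothed duality gaps follow the definitions in the talk and are not reproduced here. The function and variable names are illustrative.

    import numpy as np

    def kkt_error(P, q, A, b, x, y):
        """KKT error for the equality-constrained QP
            min 1/2 x'Px + q'x  s.t.  Ax = b,
        as the norm of the stationarity residual plus the primal feasibility error."""
        stationarity = P @ x + q + A.T @ y   # gradient of the Lagrangian in x
        feasibility = A @ x - b              # violation of the affine constraint
        return np.linalg.norm(stationarity) + np.linalg.norm(feasibility)

    # toy usage: at the exact KKT pair of a small QP the error is ~0,
    # so a primal-dual solver monitoring this quantity could stop here
    P = np.diag([1.0, 2.0]); q = np.array([-1.0, 0.0])
    A = np.array([[1.0, 1.0]]); b = np.array([1.0])
    K = np.block([[P, A.T], [A, np.zeros((1, 1))]])          # KKT system
    sol = np.linalg.solve(K, np.concatenate([-q, b]))        # reference solution
    x_star, y_star = sol[:2], sol[2:]
    print(kkt_error(P, q, A, b, x_star, y_star))             # approx. 0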