1 — 14:00 — Uniform Function Estimators in Reproducing Kernel Hilbert Spaces, with applications in stochastic optimization
We reconstruct functions that are observed with superimposed errors at random locations. The problem is considered in reproducing kernel Hilbert spaces (RKHS). We demonstrate that the estimator, which is often derived by employing Gaussian random fields, converges in mean in the norm of the RKHS to the conditional expectation, and that this implies local and uniform convergence of the function estimator. By preselecting the kernel, the problem does not suffer from the curse of dimensionality.
We analyze the statistical properties of the estimator and provide a conservative rate of convergence as the sample size increases.
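As a concrete illustration, the following minimal sketch builds the kernel estimator that arises from Gaussian random fields, namely the posterior mean $\hat f(x) = k(x, X)(K + \sigma^2 I)^{-1} y$ with a preselected kernel. The Gaussian kernel, the noise level, and the toy data below are illustrative assumptions, not the talk's exact setting.

# Minimal sketch (not the authors' exact estimator): the kernel regression
# estimator derived from Gaussian random fields, i.e. the posterior mean
# f_hat(x) = k(x, X) (K + sigma^2 I)^{-1} y, where K is the Gram matrix of a
# preselected kernel evaluated at the random sample locations X.
import numpy as np

def gaussian_kernel(a, b, length_scale=1.0):
    """Gaussian (RBF) kernel matrix between point sets a (m,d) and b (n,d)."""
    sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * length_scale ** 2))

def rkhs_estimator(X, y, noise_var=0.1, length_scale=1.0):
    """Return a callable f_hat that evaluates the kernel estimator anywhere."""
    K = gaussian_kernel(X, X, length_scale)
    alpha = np.linalg.solve(K + noise_var * np.eye(len(X)), y)
    return lambda x_new: gaussian_kernel(np.atleast_2d(x_new), X, length_scale) @ alpha

# Noisy observations of sin at random locations in [0, 2*pi] (toy data).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2 * np.pi, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
f_hat = rkhs_estimator(X, y)
print(f_hat(np.array([[np.pi / 2]])))  # close to sin(pi/2) = 1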
2 — 14:30 — Data-Driven Minimax Optimization with Expectation Constraints
Attention to data-driven optimization approaches, including the well-known stochastic gradient descent method, has grown significantly over recent decades, but data-driven constraints have rarely been studied because of the computational challenges of projecting onto the feasible set defined by these hard constraints. In this talk, we focus on the non-smooth convex-concave stochastic minimax regime and formulate the data-driven constraints as expectation constraints. The minimax expectation-constrained problem subsumes a broad class of real-world applications, including data-driven robust optimization, optimization with misspecification, and Area Under the ROC Curve (AUC) maximization with fairness constraints. The new model distinguishes itself from classic minimax optimization problems in at least two respects: (1) it deals explicitly with challenging expectation constraints and is a more realistic model; (2) it is flexible and well suited to data-driven modeling. We then propose a class of efficient primal-dual algorithms to tackle the minimax expectation-constrained problem. Without assuming bounded second moments of the generated dual iterates, we show that our algorithms achieve the optimal $\mathcal{O}(1/\sqrt{N})$ rate of convergence for the objective optimality gap, the duality gap, and the feasibility residuals. We also demonstrate the practical efficiency of our algorithms by conducting numerical experiments on various large-scale real-world applications.
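To make the algorithmic template concrete, here is a hedged sketch of one generic stochastic primal-dual iteration for $\min_x \max_y f(x,y)$ subject to $\mathbb{E}[g(x,\xi)] \le 0$. The toy instance, step sizes, and projections are illustrative choices and not the talk's specific algorithms.

# A hedged sketch (not the paper's exact method) of a stochastic primal-dual
# scheme for min_x max_y f(x, y) s.t. E[g(x, xi)] <= 0: at each step, sample
# xi, take a stochastic gradient descent step in x on the Lagrangian
# L = f(x, y) + lam * g(x, xi), an ascent step in y, and a multiplier update.
import numpy as np

rng = np.random.default_rng(1)

# Toy instance: f(x, y) = y * (x - 1) with y in [-1, 1], and the constraint
# E[(x - xi)^2] - 2 <= 0 with xi ~ N(0, 1); all choices are illustrative.
def grad_f_x(x, y): return y
def grad_f_y(x, y): return x - 1.0
def g(x, xi): return (x - xi) ** 2 - 2.0
def grad_g_x(x, xi): return 2.0 * (x - xi)

x, y, lam = 0.0, 0.0, 0.0
N = 20000
for k in range(1, N + 1):
    eta = 1.0 / np.sqrt(k)          # step size matching the O(1/sqrt(N)) rate
    xi = rng.standard_normal()      # one sample of the random constraint data
    x -= eta * (grad_f_x(x, y) + lam * grad_g_x(x, xi))   # primal descent
    y = np.clip(y + eta * grad_f_y(x, y), -1.0, 1.0)      # max-player ascent
    lam = max(0.0, lam + eta * g(x, xi))                  # multiplier ascent
print(x, y, lam)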
3 — 15:00 — A Scalable Optimization Approach for the Multilinear System Arising from Scattered Data Interpolation
Scattered data interpolation aims to reconstruct a continuous (smooth) function that approximates the underlying function by fitting (meshless) data points. Here, ``scattered'' means that the data sites have no structure or order between their relative locations. There are extensive applications of scattered data interpolation in computer graphics, fluid dynamics, inverse kinematics, machine learning, etc. When the data sites lie on a well-structured mesh, plenty of methods such as wavelets, multivariate splines, and finite elements can be used for the interpolation problem. In the context of scattered data interpolation, however, meshless methods including radial basis functions and kernel-based approximations are promising. We focus on kernel-based methods. In this talk, we consider the novel generalized Mercer kernel in the reproducing kernel Banach space for scattered data interpolation. The system of interpolation equations is formulated as a multilinear system with a structural tensor, which is an absolutely and uniformly convergent infinite series of symmetric rank-one tensors. We then design a fast numerical method for computing products between the structural tensor and arbitrary vectors to any precision. Thereafter, a scalable optimization approach equipped with limited-memory BFGS and Wolfe line-search techniques is customized for solving these multilinear systems. Using the {\L}ojasiewicz inequality, we prove that the proposed scalable optimization approach is a globally convergent algorithm and possesses a linear or sublinear convergence rate. Numerical experiments illustrate that the proposed scalable optimization approach improves both interpolation accuracy and computational efficiency.
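The key computational pattern can be sketched as follows: since the structural tensor is a convergent series of symmetric rank-one terms, $\mathcal{A} = \sum_k c_k\, u_k^{\otimes m}$, the product $\mathcal{A} x^{m-1} = \sum_k c_k (u_k^\top x)^{m-1} u_k$ never requires forming the dense tensor. The truncated series, the least-squares merit function, and the use of SciPy's L-BFGS-B routine (whose line search enforces Wolfe-type conditions) below are illustrative assumptions, not the talk's exact construction.

# A hedged sketch of the fast tensor-vector product and the L-BFGS-based
# solver for the multilinear system A x^{m-1} = b, where A is a (truncated)
# series of symmetric rank-one tensors sum_k c_k * u_k^{(x)m}.
import numpy as np
from scipy.optimize import minimize

m = 3                                   # tensor order (assumed)
rng = np.random.default_rng(2)
c = 0.5 ** np.arange(1, 21)             # absolutely summable coefficients
U = rng.standard_normal((20, 8))        # rank-one directions u_k (truncated)

def tensor_apply(x):
    """Compute A x^{m-1} = sum_k c_k (u_k . x)^{m-1} u_k in O(K n) time."""
    s = U @ x                           # all inner products u_k . x at once
    return (c * s ** (m - 1)) @ U

b = tensor_apply(np.ones(8))            # manufactured right-hand side

def merit(x):
    """Least-squares merit function whose minimizers solve A x^{m-1} = b."""
    r = tensor_apply(x) - b
    return 0.5 * r @ r

# Limited-memory BFGS with a Wolfe-condition line search.
res = minimize(merit, np.zeros(8), method="L-BFGS-B")
print(res.fun, np.linalg.norm(tensor_apply(res.x) - b))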