1 — 08:30 — Projectional coderivatives with applications
In this talk we introduce a projectional coderivative of set-valued mappings and present its calculation in some special cases. We apply this coderivative to obtain a complete characterization for a set-valued mapping to have the Lipschitz-like property relative to a closed and convex set. For an extended real-valued function, we apply the obtained results to investigate its Lipschitz continuity relative to a closed and convex set, as well as the Lipschitz-like property of a level-set mapping relative to a half line. Finally, we apply our results to study the Lipschitz-like property of the solution mapping of a parametric affine variational inequality problem.
2 — 09:00 — Sorting Functions for Sparse Projection with Applications to Sparse Optimization
Motivated by the symmetric sparse projection results developed by Beck, Eldar, and Hallmark, we introduce a general notion of sorting function for sparse projection without imposing full permutation symmetry on the underlying set. If a sorting function is defined by the monotone order of a real-valued univariate function, we call it a simple sorting function. For example, when the underlying set is nonnegative and fully permutation symmetric, it is known that its simple sorting function can be defined by $t$. We study fundamental properties of simple sorting functions on sets that need not be fully permutation symmetric. Specific results include closed-form expressions for a simple sorting function and necessary/sufficient conditions for an underlying set to admit such a simple sorting function. In particular, we derive necessary and/or sufficient conditions on a set under which $t$, $|t|$, $t_+$ (respectively $t_-$) defines a simple sorting function, and show that a set satisfying a certain partially permutation symmetric property may admit a simple sorting function. These results broaden sorting-based sparse projection to a larger class of sets and lay a foundation for developing efficient sparse projection algorithms with various applications to sparse optimization.
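As a point of reference for sorting-based sparse projection, the following is a minimal sketch (not taken from the talk) of the classical special case: projecting onto the $s$-sparse vectors by keeping the entries that rank highest under the sorting function $|t|$, which is valid for fully permutation- and sign-symmetric sets. The function name and interface are illustrative assumptions.

```python
import numpy as np

def sparse_projection(x, s, key=np.abs):
    """Illustrative sketch: project x onto the set of s-sparse vectors by
    keeping the s entries that rank highest under the sorting function
    `key` (here |t|, valid for fully permutation- and sign-symmetric sets)."""
    idx = np.argsort(key(x))[::-1][:s]  # indices of the s largest key-values
    p = np.zeros_like(x)
    p[idx] = x[idx]                     # retain selected entries, zero the rest
    return p

x = np.array([0.5, -3.0, 2.0, -0.1])
print(sparse_projection(x, 2))  # keeps -3.0 and 2.0, zeros the rest
```

The point of the talk's generalization is that for sets without full permutation symmetry, $|t|$ need not be the right sorting function, and the cited conditions determine when $t$, $|t|$, $t_+$, or $t_-$ still yields a valid one.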
3 — 09:30 — A projection algorithm for nonlocal low-rank tensor models with orthogonal constraints
Hyperspectral images (HSIs) are often contaminated by mixed noise such as Gaussian noise, dead lines, and stripes. In this talk, we will present an optimization model for HSI denoising using a tensor $l_{2,p}$ group sparsity measure and nonlocal low-rank regularization. In the proposed nonlocal low-rank tensor model, the low-rank constraints are imposed on the nonlocal similar tensors collected via block matching according to the spatial nonlocal self-similarity and spectral correlation of HSIs. The low-rank regularization term is formulated based on independent 3-D higher-order singular value decomposition with sparsity enhancement on its core tensor to promote low-rankness. The resulting denoising model is a nonconvex nonsmooth optimization problem with orthogonal constraints. We propose a proximal block coordinate descent algorithm to solve the model, in which we utilize the projection onto the Stiefel manifold to solve the equivalent subproblems. We show that any accumulation point of the sequence generated by the proposed algorithm is a stationary point, defined via three equalities of substationarity, symmetry, and feasibility for the orthogonal constraints. Experiments on HSI denoising and destriping demonstrate the superior performance of our proposed method over state-of-the-art methods in terms of metrics such as mean peak signal-to-noise ratio as well as visual quality.
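For readers unfamiliar with the Stiefel-manifold projection used in the subproblems, the standard construction (a generic sketch, not the talk's specific algorithm; the function name is an assumption) computes the nearest matrix with orthonormal columns in Frobenius norm via the thin SVD $A = U \Sigma V^\top$, giving $P(A) = U V^\top$:

```python
import numpy as np

def stiefel_projection(A):
    """Illustrative sketch: nearest point (in Frobenius norm) on the Stiefel
    manifold {Q : Q^T Q = I}, obtained from the thin SVD A = U diag(s) V^T
    as P(A) = U V^T (unique when A has full column rank)."""
    U, _, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ Vt

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))   # a generic full-column-rank matrix
Q = stiefel_projection(A)
print(np.allclose(Q.T @ Q, np.eye(3)))  # columns of Q are orthonormal
```

Within a proximal block coordinate descent scheme, such a projection makes the orthogonally constrained subproblems tractable while the remaining blocks are handled by proximal steps.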