
Sparsity-enforced regularizations for optimal learning of high-dimensional systems from random data

Presenter
Clayton Webster - Oak Ridge National Laboratory
October 17, 2018
Abstract
This talk will focus on compressed sensing approaches to sparse polynomial approximation of complex functions in high dimensions. Of particular interest is the parameterized PDE setting, where the target function is smooth and characterized by a rapidly decaying orthonormal expansion whose most important terms are captured by a lower (or downward closed) set. By exploiting this fact, we will present and analyze several procedures for exactly reconstructing a set of (jointly) sparse vectors from incomplete measurements. These include novel weighted ℓ1 minimization, improved iterative hard thresholding, mixed convex relaxations, and nonconvex penalties. Theoretical recovery guarantees will also be presented, based on improved bounds for the restricted isometry property as well as unified null space properties that encompass all currently proposed nonconvex minimizations. Numerical examples are provided to support the theoretical results and demonstrate the computational efficiency of the described compressed sensing methods.
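As a rough illustration of one of the recovery procedures named in the abstract, the sketch below implements plain iterative hard thresholding (IHT) for recovering a sparse vector from random Gaussian measurements. The problem sizes, step-size rule, and iteration count are illustrative assumptions, not details from the talk; the improved variants discussed by the speaker are not reproduced here.

```python
# Minimal sketch of iterative hard thresholding (IHT) for sparse recovery.
# All dimensions and parameters below are illustrative assumptions.
import numpy as np

def iht(A, y, s, n_iter=300):
    """Approximate an s-sparse x with y ≈ A x via iterative hard thresholding."""
    m, n = A.shape
    # Step size 1/||A||_2^2 keeps the residual ||y - A x|| non-increasing.
    mu = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(n_iter):
        g = x + mu * (A.T @ (y - A @ x))   # gradient step on (1/2)||y - A x||^2
        idx = np.argsort(np.abs(g))[-s:]   # indices of the s largest-magnitude entries
        x = np.zeros(n)
        x[idx] = g[idx]                    # hard threshold: keep only those entries
    return x

# Synthetic test problem: random measurements of a random sparse vector.
rng = np.random.default_rng(0)
m, n, s = 100, 256, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)
y = A @ x_true
x_hat = iht(A, y, s)
```

The hard-thresholding step is an exact projection onto the set of s-sparse vectors, so each iterate stays s-sparse by construction; with the conservative step size above, the data misfit never increases across iterations.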