Sharp Thresholds for Sparsity Recovery in the High-dimensional and Noisy Setting Using l_1 Relaxations

Presenter
January 18, 2007
Keywords:
  • Graphical methods
MSC:
  • 65S05
Abstract
The problem of recovering the sparsity pattern of an unknown signal arises in various domains, including graphical model selection, signal denoising, constructive approximation, compressive sensing, and subset selection in regression. The standard optimization-theoretic formulation of sparsity recovery involves l_0-constraints, and typically leads to computationally intractable optimization problems. This difficulty motivates the development and analysis of approximate methods; in particular, a great deal of work over the past decade has focused on the use of convex l_1-relaxations for sparsity recovery. In this work, we analyze the performance of l_1-constrained quadratic programming for recovering an unknown signal in p dimensions with at most s non-zero entries based on a set of n noisy observations. Of interest is the number of observations n that are required, as a function of the model dimension p and sparsity index s, for exact sparsity recovery. We analyze this question in the high-dimensional setting, in which both the model dimension p and number of observations n tend to infinity. Our main result is to establish, for a broad class of Gaussian measurement ensembles, precise threshold results on the required growth rate for successful recovery using the computationally tractable l_1 relaxation.
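The setting described in the abstract can be illustrated with a small numerical sketch: draw an s-sparse signal in p dimensions, take n noisy measurements from a Gaussian ensemble, and attempt to recover the support by solving the l_1-penalized least-squares (Lasso) relaxation. The solver below uses plain iterative soft-thresholding (ISTA); all dimensions, the regularization weight, and the noise level are illustrative choices of mine, not values from the talk, and n is simply picked comfortably above the s log(p) scale of sample sizes at which this line of work locates the recovery threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
p, s, n = 128, 4, 80  # ambient dimension, sparsity, number of observations (illustrative)

# s-sparse ground-truth signal with +/-1 entries on a random support
beta_star = np.zeros(p)
support = rng.choice(p, size=s, replace=False)
beta_star[support] = rng.choice([-1.0, 1.0], size=s)

# Gaussian measurement ensemble (columns roughly unit-norm) and small noise
X = rng.standard_normal((n, p)) / np.sqrt(n)
y = X @ beta_star + 0.01 * rng.standard_normal(n)

# l_1-penalized least squares solved by iterative soft-thresholding (ISTA)
lam = 0.05                                # regularization weight (illustrative)
step = 1.0 / np.linalg.norm(X, 2) ** 2    # 1/L with L the gradient's Lipschitz constant
beta = np.zeros(p)
for _ in range(2000):
    z = beta - step * (X.T @ (X @ beta - y))          # gradient step on the quadratic
    beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold

recovered = np.flatnonzero(np.abs(beta) > 0.1)        # estimated support
print(sorted(recovered.tolist()), sorted(support.tolist()))
```

With these parameters the estimated support matches the true one; shrinking n toward the sparsity level makes recovery fail, which is the kind of phase transition the abstract's threshold results make precise.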