The problem of recovering a sparse signal from an underdetermined set of linear equations is paramount in many applications such as compressed sensing, genomics, and machine learning. While significant advances have been made in this area, providing useful insights and intuitions, many important questions remain open, including the fundamental performance limits of the recovery algorithms. In this talk, I present a novel sparse recovery algorithm, referred to as approximate message passing (AMP), that uses the "blessing" of large dimensions to solve the ℓ1-norm regularized least squares, or LASSO, problem very efficiently. In particular, AMP exhibits fast convergence and relies on inexpensive iterations, which renders it suitable for solving high-dimensional problems. Moreover, AMP provides a novel theoretical framework for analyzing the fundamental performance limits of the LASSO by converting it into a sequence of classical signal-plus-noise estimation problems. I will show that this new framework settles several fundamental and practically important questions, such as the noise sensitivity of the LASSO.
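To make the iteration concrete, here is a minimal NumPy sketch of an AMP-style solver for the LASSO. It is an illustrative simplification, not the tuned algorithm from the talk: the per-iteration threshold is taken as a fixed multiple `alpha` of the empirical noise level of the residual, and `alpha`, `n_iters`, and the problem sizes in the usage example are assumptions chosen for illustration.

```python
import numpy as np

def soft_threshold(v, tau):
    # Soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def amp_lasso(A, y, alpha=1.5, n_iters=50):
    """Sketch of AMP for the LASSO (illustrative, not the talk's exact tuning).

    Iterates
        x_{t+1} = eta(x_t + A^T z_t; tau_t)
        z_{t+1} = y - A x_{t+1} + (n/m) * <eta'> * z_t
    where eta is soft-thresholding and the last term is the
    Onsager correction that keeps the effective noise Gaussian.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iters):
        # Effective "signal plus noise" observation of the true x.
        r = x + A.T @ z
        # Threshold proportional to the empirical residual noise level.
        tau = alpha * np.linalg.norm(z) / np.sqrt(m)
        x = soft_threshold(r, tau)
        # Onsager term: (1/delta) times the average derivative of eta,
        # i.e. the fraction of coordinates surviving the threshold.
        onsager = (n / m) * np.mean(np.abs(x) > 0)
        z = y - A @ x + onsager * z
    return x

# Usage sketch: noiseless recovery of a 20-sparse signal in R^500 from 250 measurements.
rng = np.random.default_rng(0)
m, n, k = 250, 500, 20
A = rng.standard_normal((m, n)) / np.sqrt(m)  # columns roughly unit-norm
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x0
x_hat = amp_lasso(A, y)
rel_err = np.linalg.norm(x_hat - x0) / np.linalg.norm(x0)
```

The intermediate quantity `r = x + A.T @ z` is exactly the "signal plus noise" view mentioned above: above the LASSO phase transition, `r` behaves like the true signal corrupted by i.i.d. Gaussian noise, which is what makes the scalar denoising analysis possible.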
This talk is based on joint work with David Donoho, Iain Johnstone, Richard Baraniuk, Andrea Montanari, Laura Anitori, and Zai Yang.