Hessian matrix
Mathematics
Underdetermined system
Compressed sensing
Separable space
Lasso (programming language)
Mathematical optimization
Algorithm
Computer science
Iterative method
Convexity
Function (biology)
Term (time)
Applied mathematics
Financial economics
Physics
Biology
Evolutionary biology
Mathematical analysis
World Wide Web
Quantum mechanics
Economics
Authors
Stephen J. Wright, Robert D. Nowak, Mário A. T. Figueiredo
Identifier
DOI: 10.1109/tsp.2009.2016892
Abstract
Finding sparse approximate solutions to large underdetermined linear systems of equations is a common problem in signal/image processing and statistics. Basis pursuit, the least absolute shrinkage and selection operator (LASSO), wavelet-based deconvolution and reconstruction, and compressed sensing (CS) are a few well-known areas in which problems of this type appear. One standard approach is to minimize an objective function that includes a quadratic (ℓ2) error term added to a sparsity-inducing (usually ℓ1) regularizer. We present an algorithmic framework for the more general problem of minimizing the sum of a smooth convex function and a nonsmooth, possibly nonconvex regularizer. We propose iterative methods in which each step is obtained by solving an optimization subproblem involving a quadratic term with diagonal Hessian (i.e., separable in the unknowns) plus the original sparsity-inducing regularizer; our approach is suitable for cases in which this subproblem can be solved much more rapidly than the original problem. Under mild conditions (namely convexity of the regularizer), we prove convergence of the proposed iterative algorithm to a minimum of the objective function. In addition to solving the standard ℓ2-ℓ1 case, our framework yields efficient solution techniques for other regularizers, such as an ℓ∞ norm and group-separable regularizers. It also generalizes immediately to the case in which the data is complex rather than real. Experiments with CS problems show that our approach is competitive with the fastest known methods for the standard ℓ2-ℓ1 problem, as well as being efficient on problems with other separable regularization terms.
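To illustrate the separable-approximation idea described in the abstract, the following is a minimal sketch for the standard ℓ2-ℓ1 case: when the quadratic term is approximated with a diagonal Hessian, the per-step subproblem separates over the unknowns and reduces to componentwise soft thresholding. This is an ISTA-style loop with a fixed, conservative step size, written only to show the structure of the subproblem; it is not the authors' SpaRSA algorithm, which additionally uses adaptive (e.g., Barzilai-Borwein) step selection. All function names, parameters, and the usage data below are hypothetical.

```python
import numpy as np

def soft_threshold(v, gamma):
    """Componentwise solution of min_z 0.5*(z - v)**2 + gamma*|z|."""
    return np.sign(v) * np.maximum(np.abs(v) - gamma, 0.0)

def sparse_recon_l1(A, y, tau, n_iter=200):
    """Sketch of an iterative shrinkage scheme for
        min_x 0.5*||A x - y||_2^2 + tau*||x||_1.
    Uses a fixed step 1/alpha with alpha = ||A||_2^2 (a conservative choice;
    the paper's method instead adapts the step length)."""
    x = np.zeros(A.shape[1])
    alpha = np.linalg.norm(A, 2) ** 2       # upper bound on the gradient's Lipschitz constant
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)            # gradient of the smooth quadratic term
        u = x - grad / alpha                # gradient step on the smooth part
        x = soft_threshold(u, tau / alpha)  # separable (diagonal-Hessian) subproblem
    return x

# Hypothetical usage on a small compressed-sensing instance:
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 256))
x_true = np.zeros(256)
x_true[rng.choice(256, 8, replace=False)] = rng.standard_normal(8)
y = A @ x_true
x_hat = sparse_recon_l1(A, y, tau=0.05)
```

For other separable regularizers mentioned in the abstract (such as an ℓ∞ norm or group-separable terms), only the thresholding step would change, since the subproblem still decouples across coordinates or groups.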