Robust PCA Using Generalized Nonconvex Regularization
Regularization (linguistics)
Computer science
Mathematics
Artificial intelligence
Pattern recognition (psychology)
Algorithm
Applied mathematics
Authors
Fei Wen, Rendong Ying, Peilin Liu, Robert C. Qiu
Source
Journal: IEEE Transactions on Circuits and Systems for Video Technology [Institute of Electrical and Electronics Engineers]  Date: 2020-06-01  Volume/Issue: 30 (6): 1497-1510  Cited by: 17
Identifier
DOI:10.1109/tcsvt.2019.2908833
Abstract
Recently, the robustification of principal component analysis (PCA) has attracted much research attention in numerous areas of science and engineering. The most popular and successful approach is to model the robust PCA problem as the recovery of a low-rank matrix in the presence of sparse corruption. With this model, the nuclear norm and $\ell_{1}$-norm penalties are usually used to promote low-rankness and sparsity. Although the nuclear norm and $\ell_{1}$-norm are favorable due to their convexity, they suffer from a bias problem. In comparison, nonconvex penalties can be expected to yield better recovery performance. In this paper, we consider a formulation for robust PCA that uses generalized nonconvex penalties to induce low-rankness and sparsity. This nonconvex formulation is efficiently solved by a multi-block alternating direction method of multipliers (ADMM) algorithm, and a sufficient condition for the convergence of this new ADMM algorithm is derived. Furthermore, to address the important issue of nonconvex penalty selection, we evaluate the new algorithm via numerical experiments under various low-rank and sparsity conditions. The results indicate that "exact" recovery of the low-rank principal component can be achieved only by nonconvex regularization. MATLAB code is available at https://github.com/FWen/RPCA.git.
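To make the low-rank-plus-sparse model concrete, the sketch below shows the standard ADMM iteration for decomposing an observation matrix M into a low-rank part L and a sparse part S, written in Python/NumPy so that the convex thresholding rules (singular-value soft thresholding for the nuclear norm, entrywise soft thresholding for the $\ell_{1}$-norm) can be swapped for a nonconvex rule such as hard thresholding. All function names, default parameters, and the hard-thresholding choice are illustrative assumptions; this is not the authors' generalized nonconvex formulation or their MATLAB implementation (available at the GitHub link above).

```python
# Minimal sketch of robust PCA via ADMM on the model M = L + S.
# Illustrative only; not the authors' code or their exact algorithm.
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of the l1-norm (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def hard_threshold(x, tau):
    """A simple nonconvex alternative: hard thresholding (l0-type penalty)."""
    return np.where(np.abs(x) > tau, x, 0.0)

def rpca_admm(M, lam=None, mu=None, thresh=soft_threshold, max_iter=500, tol=1e-7):
    """Decompose M into low-rank L and sparse S by ADMM.

    `thresh` is applied to the singular values when updating L and
    entrywise when updating S; `soft_threshold` recovers the usual
    nuclear-norm / l1 formulation, `hard_threshold` gives one
    (assumed, illustrative) nonconvex variant.
    """
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))        # common default weight for the sparse term
    if mu is None:
        mu = 0.25 * m * n / np.abs(M).sum()   # common heuristic penalty parameter
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                      # Lagrange multiplier matrix
    norm_M = np.linalg.norm(M, "fro")
    for _ in range(max_iter):
        # L-update: threshold the singular values of (M - S + Y/mu)
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * thresh(sig, 1.0 / mu)) @ Vt
        # S-update: entrywise thresholding
        S = thresh(M - L + Y / mu, lam / mu)
        # dual update on the constraint residual M - L - S
        R = M - L - S
        Y = Y + mu * R
        if np.linalg.norm(R, "fro") <= tol * norm_M:
            break
    return L, S

if __name__ == "__main__":
    # Toy example: a rank-5 matrix corrupted by ~10% sparse outliers.
    rng = np.random.default_rng(0)
    L0 = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 200))
    S0 = np.zeros((200, 200))
    mask = rng.random((200, 200)) < 0.10
    S0[mask] = rng.uniform(-10, 10, mask.sum())
    L_hat, S_hat = rpca_admm(L0 + S0)
    print("relative error in L:",
          np.linalg.norm(L_hat - L0, "fro") / np.linalg.norm(L0, "fro"))
```

The point of the pluggable `thresh` argument is the one made in the abstract: the convex (soft) rules shrink all singular values and entries by the same amount, which introduces bias, whereas a nonconvex rule leaves large values untouched, which is what allows tighter recovery of the low-rank component.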