Robust principal component analysis
Matrix norm
Tensor (intrinsic definition)
Principal component analysis
Mathematics
Rank (graph theory)
Singular value decomposition
Singular value
Algorithm
Norm (philosophy)
Computer science
Applied mathematics
Pattern recognition (psychology)
Artificial intelligence
Combinatorics
Pure mathematics
Feature vector
Statistics
Physics
Quantum mechanics
Law
Political science
Authors
Yulong Wang, Kit Ian Kou, Hong Chen, Yuan Yan Tang, Luoqing Li
Source
Journal: IEEE Transactions on Image Processing
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-01
Volume: 32, Pages: 5114-5125
Identifier
DOI: 10.1109/tip.2023.3310331
Abstract
Tensor Robust Principal Component Analysis (TRPCA), which aims to recover the low-rank and sparse components from their sum, has drawn intensive interest in recent years. Most existing TRPCA methods adopt the tensor nuclear norm (TNN) and the tensor ℓ1 norm as the regularization terms for the low-rank and sparse components, respectively. However, TNN treats each singular value of the low-rank tensor L equally, and the tensor ℓ1 norm shrinks each entry of the sparse tensor S with the same strength. It has been shown that larger singular values generally correspond to prominent information of the data and should be penalized less; the same holds for entries of S with large absolute values. In this paper, we propose a Double Auto-weighted TRPCA (DATRPCA) method. Instead of using predefined, manually set weights merely for the low-rank tensor as in previous works, DATRPCA automatically and adaptively assigns smaller weights, and hence lighter penalization, to the significant singular values of the low-rank tensor and the large entries of the sparse tensor simultaneously. We further develop an efficient algorithm to implement DATRPCA based on the Alternating Direction Method of Multipliers (ADMM) framework and establish its convergence analysis. Results on both synthetic and real-world data demonstrate the effectiveness of DATRPCA for low-rank tensor recovery, color image recovery and background modelling.
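For reference, the classical TNN-based TRPCA model that the abstract starts from can be written as

\min_{\mathcal{L},\,\mathcal{S}} \;\|\mathcal{L}\|_{\mathrm{TNN}} + \lambda\,\|\mathcal{S}\|_{1} \quad \text{s.t.} \quad \mathcal{X} = \mathcal{L} + \mathcal{S},

and the double auto-weighting described above roughly amounts to replacing the two terms with weighted counterparts of the form \sum_i w_i\,\sigma_i(\mathcal{L}) + \lambda\,\|\mathcal{W} \odot \mathcal{S}\|_{1}, where both the singular-value weights w_i and the entry-wise weight tensor \mathcal{W} are re-estimated automatically; the exact DATRPCA objective and weight-update rules are given in the paper itself.

The sketch below is a minimal NumPy implementation of the classical, unweighted TNN + ℓ1 model solved with ADMM, included only to make the ADMM structure concrete; it is not the authors' DATRPCA code. The function names, the default λ = 1/√(max(n1, n2)·n3) commonly used in the TRPCA literature, and all parameter settings are illustrative assumptions.

import numpy as np

def t_svt(T, tau):
    # Tensor singular value thresholding via the t-SVD: FFT along the third
    # mode, soft-threshold the singular values of every frontal slice in the
    # Fourier domain, then invert the FFT.
    n1, n2, n3 = T.shape
    Tf = np.fft.fft(T, axis=2)
    Lf = np.zeros_like(Tf)
    for k in range(n3):
        U, s, Vh = np.linalg.svd(Tf[:, :, k], full_matrices=False)
        s = np.maximum(s - tau, 0.0)                  # shrink singular values
        Lf[:, :, k] = (U * s) @ Vh
    return np.real(np.fft.ifft(Lf, axis=2))

def soft_threshold(T, tau):
    # Element-wise soft-thresholding, the proximal operator of the l1 norm.
    return np.sign(T) * np.maximum(np.abs(T) - tau, 0.0)

def trpca_admm(X, lam=None, mu=1e-3, rho=1.1, mu_max=1e10, tol=1e-7, max_iter=500):
    # ADMM for the unweighted model  min ||L||_TNN + lam*||S||_1  s.t. X = L + S.
    # A DATRPCA-style variant would additionally re-estimate per-singular-value
    # and per-entry weights at every iteration and feed them into the two
    # proximal steps below.
    n1, n2, n3 = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(n1, n2) * n3)         # common default in the TRPCA literature
    L = np.zeros_like(X, dtype=float)
    S = np.zeros_like(X, dtype=float)
    Y = np.zeros_like(X, dtype=float)                 # Lagrange multiplier
    for _ in range(max_iter):
        L = t_svt(X - S + Y / mu, 1.0 / mu)           # low-rank update (tensor SVT)
        S = soft_threshold(X - L + Y / mu, lam / mu)  # sparse update
        resid = X - L - S
        Y = Y + mu * resid                            # dual ascent
        mu = min(rho * mu, mu_max)
        if np.linalg.norm(resid) / max(np.linalg.norm(X), 1.0) < tol:
            break
    return L, S

Given a corrupted observation tensor X (for example, a color image with salt-and-pepper noise stored as an n1 × n2 × 3 array), a call such as L_hat, S_hat = trpca_admm(X) returns the estimated low-rank and sparse parts.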