Keywords
Tensor (intrinsic definition)
Subspace topology
Smoothness
Computer science
Tucker decomposition
Mathematics
Rank
Matrix factorization
Tensor decomposition
Algorithm
Pattern recognition
Matrix
Artificial intelligence
Combinatorics
Pure mathematics
Materials science
Physics
Composite material
Eigenvector
Mathematical analysis
Quantum mechanics
Authors
Jize Xue, Yongqiang Zhao, Shaoguang Huang, Wenzhi Liao, Jonathan Cheung-Wai Chan, Seong G. Kong
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2022-11-01
Volume/Issue: 33 (11): 6916-6930
Citations: 102
Identifier
DOI: 10.1109/tnnls.2021.3083931
Abstract
Existing methods for tensor completion (TC) have limited ability to characterize low-rank (LR) structures. To capture the complex hierarchical knowledge with implicit sparsity attributes hidden in a tensor, we propose a new multilayer sparsity-based tensor decomposition (MLSTD) for low-rank tensor completion (LRTC). The method encodes the structured sparsity of a tensor through a multiple-layer representation. Specifically, we use the CANDECOMP/PARAFAC (CP) model to decompose a tensor into a sum of rank-1 tensors, and the number of rank-1 components is naturally interpreted as the first-layer sparsity measure. The factor matrices are presumed smooth, since a local piecewise-smooth property exists in the within-mode correlations. In the subspace, this local smoothness can be regarded as the second-layer sparsity. To describe the refined structures of factor/subspace sparsity, we introduce a new sparsity insight of subspace smoothness: a self-adaptive low-rank matrix factorization (LRMF) scheme, called the third-layer sparsity. Through this progressive description of the sparsity structure, we formulate an MLSTD model and embed it into the LRTC problem. An effective alternating direction method of multipliers (ADMM) algorithm is then designed for the MLSTD minimization problem. Various experiments on RGB images, hyperspectral images (HSIs), and videos substantiate that the proposed LRTC methods are superior to state-of-the-art methods.
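To make the first-layer sparsity measure concrete, the sketch below shows a CP model of a 3-way tensor as a sum of R rank-1 terms, where R (the CP rank) counts the rank-1 components. This is a minimal NumPy illustration of the CP decomposition the abstract builds on, not the paper's MLSTD algorithm; the dimensions and rank are arbitrary choices for demonstration.

```python
import numpy as np

def cp_reconstruct(A, B, C):
    """Reconstruct a 3-way tensor from CP factor matrices.

    A: (I, R), B: (J, R), C: (K, R). The shared column count R is the
    CP rank, i.e., the number of rank-1 components -- the quantity the
    abstract interprets as the first-layer sparsity measure.
    """
    # Sum over r of the outer products A[:, r] o B[:, r] o C[:, r].
    return np.einsum('ir,jr,kr->ijk', A, B, C)

rng = np.random.default_rng(0)
R = 3  # illustrative CP rank (an assumption, not a value from the paper)
A = rng.standard_normal((4, R))
B = rng.standard_normal((5, R))
C = rng.standard_normal((6, R))
X = cp_reconstruct(A, B, C)
print(X.shape)  # (4, 5, 6)
```

A completion method in this family would keep the observed entries of X fixed and estimate the factor matrices (with smoothness and low-rank penalties on them, in the paper's case) so that the reconstruction fills in the missing entries.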