Matrix factorization
Non-negative matrix factorization
Metric (mathematics)
Computer science
Matrix (mathematics)
Singular value decomposition
Generalization
Rank (linear algebra)
Artificial neural network
Gradient descent
Algorithm
Mathematics
Artificial intelligence
Combinatorics
Operations management
Eigenvector
Physics
Mathematical analysis
Materials science
Composite material
Quantum mechanics
Economics
Authors
Shiping Wang, Yunhe Zhang, Xincan Lin, Lichao Su, Guobao Xiao, William Zhu, Yiqing Shi
Identifier
DOI:10.1016/j.neunet.2023.01.034
Abstract
Matrix factorization has long been an active field, which attempts to extract discriminative features from high-dimensional data. However, it suffers from poor generalization ability and high computational complexity when handling large-scale data. In this paper, we propose a learnable deep matrix factorization via the projected gradient descent method, which learns multi-layer low-rank factors from scalable metric distances and flexible regularizers. Accordingly, solving a constrained matrix factorization problem is equivalently transformed into training a neural network with an appropriate activation function induced from the projection onto a feasible set. Distinct from other neural networks, the proposed method activates the connected weights, not just the hidden layers. As a result, it is proved that the proposed method can learn several existing well-known matrix factorizations, including singular value decomposition and convex, non-negative, and semi-non-negative matrix factorizations. Finally, comprehensive experiments demonstrate the superiority of the proposed method against other state-of-the-art methods.
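The abstract's central idea, that the projection onto a feasible set plays the role of an activation function, can be illustrated with the simplest case: for non-negative matrix factorization, the feasible set is the non-negative orthant, and projecting onto it after each gradient step is exactly an element-wise ReLU applied to the factor weights. The following is a minimal sketch of that single-layer case only (not the paper's multi-layer method); the function name, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def projected_gradient_nmf(X, rank, steps=500, lr=0.01, seed=0):
    """Sketch: factor X ≈ W @ H with W, H >= 0 via projected gradient
    descent on the squared Frobenius loss. Each gradient step is
    followed by a projection onto the non-negative orthant, i.e. an
    element-wise ReLU applied to the factor matrices themselves."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(steps):
        R = W @ H - X                      # reconstruction residual
        gW = R @ H.T                       # gradient of 0.5*||WH - X||^2 w.r.t. W
        gH = W.T @ R                       # gradient w.r.t. H
        W = np.maximum(W - lr * gW, 0.0)   # gradient step + projection (ReLU)
        H = np.maximum(H - lr * gH, 0.0)
    return W, H
```

Note that the ReLU here acts on the factor (weight) matrices rather than on hidden-layer outputs, which mirrors the abstract's remark that the method "activates the connected weights, not just the hidden layers". Other constraint sets would induce other projections, and hence other activations.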