Topics
Non-negative matrix factorization, Entropy (arrow of time), Computer science, Matrix decomposition, Factorization, Pattern recognition (psychology), Lagrange multiplier, Algorithm, Artificial intelligence, Mathematics, Mathematical optimization, Quantum mechanics, Physics, Eigenvector
Authors
Jiao Wei, Tong Cui, Bingxue Wu, Qiang He, Shouliang Qi, Yu-Dong Yao, Yueyang Teng
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems [Institute of Electrical and Electronics Engineers]
Date: 2023-09-01
Volume/Issue: 34 (9): 5381-5391
Citations: 2
Identifiers
DOI: 10.1109/tnnls.2022.3184286
Abstract
Nonnegative matrix factorization (NMF) has been widely used to learn low-dimensional representations of data. However, NMF pays the same attention to all attributes of a data point, which inevitably leads to inaccurate representations. For example, in a human-face dataset, if an image shows a person wearing a hat, the hat should be removed or the importance of its corresponding attributes should be decreased during matrix factorization. This article proposes a new type of NMF called entropy weighted NMF (EWNMF), which assigns an optimizable weight to each attribute of each data point to quantify its importance. This is achieved by adding an entropy regularizer to the cost function and then using the Lagrange multiplier method to solve the resulting problem. Experimental results on several datasets demonstrate the feasibility and effectiveness of the proposed method. The code developed in this study is available at https://github.com/Poisson-EM/Entropy-weighted-NMF.
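The abstract describes the method only in words, so the following is a minimal NumPy sketch of how such an entropy-weighted factorization could be set up. It assumes a weighted squared-error data term, per-data-point (row-wise) normalization of the weights, and standard multiplicative updates; the function name ewnmf_sketch, the regularization strength gamma, and all other parameter names are illustrative assumptions rather than the authors' implementation, which is available at the GitHub link above.

import numpy as np

def ewnmf_sketch(X, rank, gamma=0.1, n_iter=200, eps=1e-9, seed=0):
    """Illustrative entropy-weighted NMF loop (not the authors' reference code).

    Factorizes a nonnegative matrix X (m x n) as U @ V while learning a weight
    W[i, j] for every attribute j of every data point i. An entropy term
    gamma * sum(W * log W) with a per-row sum-to-one constraint gives the
    weights a closed softmax-like form via a Lagrange multiplier.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, rank)) + eps
    V = rng.random((rank, n)) + eps
    W = np.full((m, n), 1.0 / n)  # uniform weights per data point to start

    for _ in range(n_iter):
        R = (X - U @ V) ** 2  # per-entry squared residuals

        # Weight update: attributes with large residuals (e.g. an occluding
        # hat) receive small weights; subtracting the row minimum only
        # stabilizes the exponential, it does not change the normalized W.
        W = np.exp(-(R - R.min(axis=1, keepdims=True)) / gamma)
        W /= W.sum(axis=1, keepdims=True)

        # Multiplicative updates for the weighted squared-error data term.
        WX = W * X
        U *= (WX @ V.T) / ((W * (U @ V)) @ V.T + eps)
        V *= (U.T @ WX) / (U.T @ (W * (U @ V)) + eps)

    return U, V, W

Calling U, V, W = ewnmf_sketch(X, rank=10) on a nonnegative data matrix returns the factors together with the learned per-entry weights; entries that U @ V explains poorly end up with small weights, which is the behavior the abstract motivates with the hat example. A larger gamma pushes the weights back toward uniform, recovering ordinary NMF in the limit.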