Distribution (mathematics)
Manifold alignment
Exploit
Manifold (fluid mechanics)
Generalization
Nonlinear dimensionality reduction
Correlation
Artificial intelligence
Pattern recognition (psychology)
Computer science
Mathematics
Mathematical analysis
Dimensionality reduction
Geometry
Engineering
Mechanical engineering
Computer security
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2021-08-16
Volume/Issue: 34 (2): 839-852
Citations: 29
Identifiers
DOI: 10.1109/tnnls.2021.3103178
Abstract
Label correlation is helpful to alleviate the overwhelming output space of label distribution learning (LDL). However, existing studies either consider only one of global and local label correlations or exploit label correlation through prior knowledge (e.g., a low-rank assumption, which may not always hold). To efficiently exploit both global and local label correlations in a data-driven way, we propose in this article a new LDL method called label distribution learning by exploiting label distribution manifold (LDL-LDM). Our basic idea is that the underlying manifold structure of label distributions may encode the correlations among labels. LDL-LDM works as follows. First, to exploit global label correlation, we learn the label distribution manifold and encourage the outputs of our model to lie on the same manifold. Second, we learn the label distribution manifold of different clusters of samples to capture local label correlations. Third, to handle incomplete label distribution learning (incomplete LDL), we jointly learn the label distributions and the label distribution manifold. Theoretical analysis demonstrates the generalization of our method. Finally, experimental results validate the effectiveness of LDL-LDM in both full and incomplete LDL cases.
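The global-manifold idea in the abstract (encourage model outputs to lie on the manifold of the training label distributions) is commonly realized with a graph-Laplacian smoothness penalty. The sketch below is a minimal, hypothetical illustration of that pattern, not the authors' implementation: it assumes a linear model `Theta`, a k-NN affinity graph built over the label distributions, and a squared-error fit term; all function names, parameters, and toy data are assumptions for illustration.

```python
import numpy as np

def knn_graph(D, k=3):
    """Symmetric k-NN affinity matrix over label distributions D (n x c)."""
    n = D.shape[0]
    # pairwise squared Euclidean distances between label distributions
    dist = ((D[:, None, :] - D[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(dist[i])[1:k + 1]   # k nearest neighbors, skipping self
        W[i, nbrs] = np.exp(-dist[i, nbrs])   # Gaussian-style affinity
    return np.maximum(W, W.T)                 # symmetrize the graph

def ldl_manifold_loss(Theta, X, D, W, lam=0.1):
    """Fit term + Laplacian smoothness of predictions over the label-distribution graph."""
    P = X @ Theta                              # predicted label distributions (unnormalized)
    L = np.diag(W.sum(axis=1)) - W             # graph Laplacian of the manifold graph
    fit = ((P - D) ** 2).sum()                 # squared-error fit to the given distributions
    smooth = np.trace(P.T @ L @ P)             # pulls outputs toward the learned manifold
    return fit + lam * smooth
```

For local label correlations, the same penalty would be applied per cluster of samples, with one graph (and Laplacian) per cluster instead of the single global `W`.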