Keywords: Computer science; Artificial intelligence; Machine learning; Generalization; Noise; Filtering; Noise reduction; Multi-label classification; Multi-task learning; Pattern recognition; Image; Computer vision; Benchmark; Mathematics; Statistics
Authors
Zongmin Liu, Ziyi Wang, Ting Wang, Yitian Xu
Identifiers
DOI: 10.1016/j.engappai.2023.107714
Abstract
Multi-task classification improves generalization performance by exploiting the correlations between tasks. However, most multi-task learning methods fail to recognize and filter noisy labels in classification problems with label noise. To address this issue, this paper proposes a novel multi-task label noise learning method based on loss correction, called MTLNL. MTLNL introduces the class-wise denoising (CWD) method for loss decomposition and centroid estimation of the loss function in multi-task learning, and eliminates the impact of label noise by using the label flipping rate. The method also extends to the multi-task positive-unlabeled (PU) learning setting, offering better flexibility and generalization performance. Moreover, Nesterov's accelerated gradient method is applied to speed up solving the model. MTLNL is compared with other algorithms on five benchmark datasets, five image datasets, and a multi-task PU dataset to demonstrate its effectiveness.
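The abstract names two reusable ingredients: a loss correction driven by label flipping rates and Nesterov acceleration of the solver. The sketch below illustrates generic versions of both on a linear binary classifier. Note the hedge: the flip-rate correction shown is the standard unbiased loss estimator for class-conditional label noise, not MTLNL's exact CWD/multi-task formulation, and every function name and hyperparameter here is an illustrative assumption rather than the paper's implementation.

```python
import numpy as np

def corrected_hinge_grad(w, x, y, rho_pos, rho_neg):
    """Subgradient (w.r.t. w) of a noise-corrected hinge loss for one sample.

    Uses the classic unbiased correction for class-conditional label noise:
    with observed label y in {-1, +1}, margin m = w.x, and flip rates
    rho_pos = P(y_obs = -1 | y_true = +1), rho_neg = P(y_obs = +1 | y_true = -1),

        l_corr = [(1 - rho_{-y}) * hinge(y*m) - rho_y * hinge(-y*m)]
                 / (1 - rho_pos - rho_neg),

    whose expectation over the noise process equals the clean-label loss.
    """
    denom = 1.0 - rho_pos - rho_neg
    rho_y = rho_pos if y == 1 else rho_neg    # flip rate of the observed class
    rho_ny = rho_neg if y == 1 else rho_pos   # flip rate of the opposite class
    m = w @ x
    g = np.zeros_like(w)
    if y * m < 1.0:                           # hinge(y*m) active: d/dw = -y*x
        g -= (1.0 - rho_ny) * y * x
    if -y * m < 1.0:                          # -rho_y * hinge(-y*m): d/dw = -rho_y*y*x
        g -= rho_y * y * x
    return g / denom

def train_nesterov(X, y, rho_pos, rho_neg, lr=0.05, momentum=0.9, epochs=200):
    """Train a linear classifier on the corrected loss with Nesterov's
    accelerated gradient: evaluate the gradient at the look-ahead point
    w + momentum * v, then update the velocity and the weights."""
    w = np.zeros(X.shape[1])
    v = np.zeros_like(w)
    for _ in range(epochs):
        w_ahead = w + momentum * v            # Nesterov look-ahead point
        grad = np.mean(
            [corrected_hinge_grad(w_ahead, xi, yi, rho_pos, rho_neg)
             for xi, yi in zip(X, y)], axis=0)
        v = momentum * v - lr * grad
        w = w + v
    return w

# Toy demo: inject 20% symmetric label noise, train on the noisy labels,
# and evaluate against the clean labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y_clean = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
flip = rng.random(200) < 0.2
y_noisy = np.where(flip, -y_clean, y_clean)
w = train_nesterov(X, y_noisy, rho_pos=0.2, rho_neg=0.2)
print("clean-label accuracy:", np.mean(np.sign(X @ w) == y_clean))
```

The design point this illustrates: once the flip rates are known (or estimated), the corrected loss is an unbiased estimator of the clean-label loss, so ordinary gradient-based training on noisy labels still targets the clean-data objective; that is the general principle behind the "loss correction by label flipping rate" the abstract describes.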