Keywords
Overfitting
Semi-supervised learning
Computer science
Artificial intelligence
Machine learning
Supervised learning
Feature learning
Unsupervised learning
Similarity (geometry)
Benchmark (surveying)
Generalization
Instance-based learning
Feature (linguistics)
Similarity learning
Artificial neural network
Pattern recognition (psychology)
Mathematics
Image (mathematics)
Philosophy
Mathematical analysis
Geography
Linguistics
Geodesy
Authors
Cheng Tan, Jun Xia, Lirong Wu, Stan Z. Li
Source
Journal: Cornell University - arXiv
Date: 2021-10-17
Citations: 36
Identifier
DOI: 10.1145/3474085.3475622
Abstract
Noisy labels, which arise from mistakes in manual annotation or from web-based data collection for supervised learning, can cause neural networks to overfit to the misleading information and degrade their generalization performance. Self-supervised learning works in the absence of labels and thus eliminates the negative impact of noisy labels. Motivated by co-training with both a supervised-learning view and a self-supervised-learning view, we propose a simple yet effective method called Co-learning for learning with noisy labels. Co-learning performs supervised learning and self-supervised learning in a cooperative way: the intrinsic-similarity constraint from the self-supervised module and the structural-similarity constraint from the noisily-supervised module are imposed on a shared feature encoder, regularizing the network to maximize the agreement between the two constraints. Co-learning is compared fairly with peer methods on corrupted versions of benchmark datasets, and extensive results demonstrate that it is superior to many state-of-the-art approaches.
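The abstract combines three training signals on one shared encoder: a noisily-supervised classification loss, an intrinsic-similarity loss (agreement between two augmented views in the self-supervised head), and a structural-similarity loss (agreement between the pairwise-similarity structures of the two heads). The sketch below is a minimal numpy illustration of that loss construction, not the authors' implementation: the single-linear-layer "encoder", the specific cosine-based losses, the weights `alpha`/`beta`, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    # shared feature encoder: one linear layer + ReLU (stand-in for a deep network)
    return np.maximum(x @ W, 0.0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    # noisily-supervised loss on the classifier head
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def cosine_sim_matrix(z):
    # pairwise cosine similarities within a batch
    z = z / (np.linalg.norm(z, axis=1, keepdims=True) + 1e-12)
    return z @ z.T

def co_learning_losses(x1, x2, labels, W_enc, W_cls, W_proj, alpha=1.0, beta=1.0):
    # x1, x2: two augmented views of the same batch, passed through the SHARED encoder
    h1, h2 = encode(x1, W_enc), encode(x2, W_enc)
    # supervised view: classify view 1 against the (possibly noisy) labels
    sup = cross_entropy(softmax(h1 @ W_cls), labels)
    # intrinsic similarity: the projection head should agree across the two views
    z1, z2 = h1 @ W_proj, h2 @ W_proj
    z1n = z1 / (np.linalg.norm(z1, axis=1, keepdims=True) + 1e-12)
    z2n = z2 / (np.linalg.norm(z2, axis=1, keepdims=True) + 1e-12)
    intrinsic = np.mean(1.0 - np.sum(z1n * z2n, axis=1))  # 1 - cosine, per sample
    # structural similarity: the two heads should induce matching batch similarity structure
    structural = np.mean((cosine_sim_matrix(h1 @ W_cls) - cosine_sim_matrix(z1)) ** 2)
    total = sup + alpha * intrinsic + beta * structural
    return total, sup, intrinsic, structural

# demo on a small random batch (all shapes arbitrary)
x1 = rng.normal(size=(8, 16))
x2 = x1 + 0.1 * rng.normal(size=(8, 16))   # weakly perturbed second view
labels = rng.integers(0, 4, size=8)        # possibly noisy class labels
W_enc = 0.1 * rng.normal(size=(16, 32))
W_cls = 0.1 * rng.normal(size=(32, 4))
W_proj = 0.1 * rng.normal(size=(32, 8))
total, sup, intrinsic, structural = co_learning_losses(x1, x2, labels, W_enc, W_cls, W_proj)
```

Minimizing `total` jointly pushes the shared encoder toward features that both fit the labels and stay consistent across views, which is the regularizing agreement the abstract describes.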