Weighting
Noise (video)
Computer science
Artificial neural network
Benchmark (surveying)
Noise measurement
Artificial intelligence
A-weighting
Set (abstract data type)
Pattern recognition (psychology)
Machine learning
Function (biology)
Data mining
Noise reduction
Acoustics
Physics
Geodesy
Evolutionary biology
Biology
Image (mathematics)
Programming language
Geography
Authors
Aritra Ghosh, Andrew Lan
Identifiers
DOI:10.1109/wacv48630.2021.00397
Abstract
Learning with label noise has gained significant traction recently due to the sensitivity of deep neural networks to label noise under common loss functions. Losses that are theoretically robust to label noise, however, often make training difficult. Consequently, several recently proposed methods, such as Meta-Weight-Net (MW-Net), use a small number of unbiased, clean samples to learn, under the meta-learning framework, a weighting function that downweights samples likely to have corrupted labels. However, obtaining such a set of clean samples is not always feasible in practice. In this paper, we analytically show that one can easily train MW-Net without access to clean samples simply by using a loss function that is robust to label noise, such as mean absolute error, as the meta objective to train the weighting network. We experimentally show that our method beats all existing methods that do not use clean samples and performs on par with methods that use gold samples on benchmark datasets across various noise types and noise rates.
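The key property the abstract relies on — that mean absolute error, unlike cross-entropy, is bounded per sample and so cannot be dominated by confidently mislabeled examples — can be illustrated numerically. This is a minimal NumPy sketch under my own assumptions, not the authors' code; the function names are hypothetical:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def mae_loss(logits, labels, num_classes):
    # Per-sample mean absolute error between softmax probabilities
    # and one-hot labels; always bounded in [0, 2].
    probs = softmax(logits)
    onehot = np.eye(num_classes)[labels]
    return np.abs(probs - onehot).sum(axis=1)

def ce_loss(logits, labels):
    # Per-sample cross-entropy; unbounded for confident mistakes.
    probs = softmax(logits)
    return -np.log(probs[np.arange(len(labels)), labels])

# A confidently wrong prediction, as produced by a network that has
# memorized a corrupted (noisy) label for this sample.
logits = np.array([[10.0, -10.0]])
label = np.array([1])

mae = mae_loss(logits, label, num_classes=2)[0]  # close to its cap of 2
ce = ce_loss(logits, label)[0]                   # grows without bound
```

Because each sample's MAE contribution is capped, using it as the meta objective keeps a few corrupted meta samples from steering the weighting network, which is the intuition behind dropping the clean-set requirement.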