Sample Self-Selection Using Dual Teacher Networks for Pathological Image Classification with Noisy Labels

Authors
Gang Han, Wenping Guo, Haibo Zhang, Jie Jin, Xingli Gan, Xiaoming Zhao
Source
Journal: Computers in Biology and Medicine [Elsevier]
Volume/Issue: 174: 108489 | Cited by: 1
Identifier
DOI: 10.1016/j.compbiomed.2024.108489
Abstract

Deep neural networks (DNNs) enable advanced image processing but depend on large quantities of high-quality labeled data. The presence of noisy data significantly degrades DNN model performance. In the medical field, where model accuracy is crucial and labels for pathological images are scarce and expensive to obtain, the need to handle noisy data is even more urgent. Deep networks exhibit a memorization effect: they tend to prioritize memorizing clean labels in the early stages of training. Early stopping is therefore highly effective when learning with noisy labels. Previous research has often concentrated on developing robust loss functions or imposing training constraints to mitigate the impact of noisy labels; however, such approaches have frequently resulted in underfitting. We propose using knowledge distillation to slow the learning process of the target network, rather than constraining late-stage training outright to shield it from noisy labels. In this paper, we introduce a data sample self-selection strategy based on early stopping to filter out most of the noisy data. Additionally, we employ a distillation training method with dual teacher networks to ensure steady learning of the student network. The experimental results show that our method outperforms current state-of-the-art methods for handling noisy labels on both synthetic and real-world noisy datasets. In particular, on the real-world pathological image dataset Chaoyang, the highest classification accuracy increased by 2.39%. Our method leverages the model's predictions over its training history to select a cleaner subset of the data and retrains on that subset, significantly mitigating the impact of noisy labels on model performance.
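A minimal sketch of the two ideas the abstract describes, not the authors' implementation: (1) selecting "clean" samples whose given labels agree with the model's prediction history recorded before early stopping, and (2) training a student with a distillation loss built from two teacher networks. The function names, the agreement threshold, and the teacher-averaging scheme are illustrative assumptions.

```python
# Hypothetical sketch of history-based sample self-selection and
# dual-teacher distillation (PyTorch). Names and thresholds are assumptions.
import torch
import torch.nn.functional as F


def select_clean_samples(prob_history: torch.Tensor, labels: torch.Tensor,
                         threshold: float = 0.5) -> torch.Tensor:
    """Keep samples whose averaged early-training probability for the given
    label exceeds `threshold`.

    prob_history: (epochs, n_samples, n_classes) softmax outputs recorded
                  during the warm-up epochs before early stopping.
    labels:       (n_samples,) possibly noisy integer labels.
    Returns a boolean mask over samples treated as "clean".
    """
    mean_probs = prob_history.mean(dim=0)                      # (n_samples, n_classes)
    label_conf = mean_probs.gather(1, labels.view(-1, 1)).squeeze(1)
    return label_conf > threshold


def dual_teacher_loss(student_logits, teacher1_logits, teacher2_logits,
                      labels, T: float = 2.0, alpha: float = 0.5):
    """Cross-entropy on the selected labels plus KL distillation from the
    averaged soft targets of two frozen teacher networks."""
    ce = F.cross_entropy(student_logits, labels)
    soft_targets = 0.5 * (F.softmax(teacher1_logits / T, dim=1)
                          + F.softmax(teacher2_logits / T, dim=1))
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1), soft_targets,
                  reduction="batchmean") * (T * T)
    return alpha * ce + (1.0 - alpha) * kd


if __name__ == "__main__":
    # Toy shapes only; in practice prob_history would be recorded per epoch.
    epochs, n, c = 5, 8, 4
    probs = torch.softmax(torch.randn(epochs, n, c), dim=-1)
    labels = torch.randint(0, c, (n,))
    mask = select_clean_samples(probs, labels, threshold=0.3)
    print("selected", int(mask.sum()), "of", n, "samples as clean")

    s = torch.randn(n, c)
    t1, t2 = torch.randn(n, c), torch.randn(n, c)
    print("loss:", dual_teacher_loss(s, t1, t2, labels).item())
```

In this reading, the selection step plays the role of early stopping (only predictions from the memorization-friendly early phase vote on label cleanliness), while the soft teacher targets slow the student's fitting of any residual noisy labels.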