Addressing the Overfitting in Partial Domain Adaptation With Self-Training and Contrastive Learning

Keywords: Overfitting, Computer science, Artificial intelligence, Pattern recognition (psychology), Domain adaptation, Entropy (arrow of time), Machine learning, Domain (mathematical analysis), Classifier (UML), Mathematics, Artificial neural network, Mathematical analysis, Physics, Quantum mechanics
Authors
Chunmei He,Xiuguang Li,Xia Yue,Jing Tang,Jie Yang,Zhengchun Ye
Source
Journal: IEEE Transactions on Circuits and Systems for Video Technology [Institute of Electrical and Electronics Engineers]
Volume/Issue: 34(3): 1532-1545  Cited by: 7
Identifier
DOI:10.1109/tcsvt.2023.3296617
Abstract

Partial domain adaptation (PDA) assumes that the target-domain class label set is a subset of the source-domain label set, a problem setting that is close to real-world scenarios. At present, two main approaches are used to mitigate overfitting to the source domain in PDA: entropy minimization and weighted self-training. However, entropy minimization may sharpen prediction distributions that are nearly uniform without making them more accurate, causing the model to learn erroneous information, while weighted self-training introduces noisy information into the self-training process because the sample weights themselves are noisy. We address these issues and propose the self-training contrastive partial domain adaptation method (STCPDA), which mines domain information through two modules. First, a self-training module based on simple target-domain samples addresses overfitting to the source domain: target samples are divided into simple samples with high reliability and difficult samples with low reliability, and only the pseudo-labels of the simple samples are selected for self-training. Second, a contrastive learning module is constructed for the source and target domains by embedding contrastive learning into the shared feature space; it fully exploits the hidden information in all domain samples and makes the class boundaries more salient. Extensive experiments on five datasets demonstrate the effectiveness and excellent classification performance of our method.
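The abstract gives no implementation details, but the two modules it describes can be illustrated with a short sketch. The code below is a minimal PyTorch illustration, not the authors' implementation: the function names, the confidence threshold, and the temperature are hypothetical choices, and the contrastive term is a generic supervised contrastive loss over source samples plus pseudo-labelled simple target samples rather than the exact STCPDA objective.

```python
import torch
import torch.nn.functional as F


def select_simple_samples(target_logits, threshold=0.9):
    """Split target predictions into 'simple' (high-confidence) and
    'difficult' (low-confidence) samples; the threshold is illustrative."""
    probs = torch.softmax(target_logits, dim=1)
    confidence, pseudo_labels = probs.max(dim=1)
    simple_mask = confidence >= threshold
    return simple_mask, pseudo_labels


def self_training_loss(target_logits, threshold=0.9):
    """Cross-entropy on the pseudo-labels of simple target samples only,
    so difficult (unreliable) samples do not inject label noise."""
    mask, pseudo = select_simple_samples(target_logits, threshold)
    if mask.sum() == 0:
        return target_logits.new_zeros(())
    return F.cross_entropy(target_logits[mask], pseudo[mask])


def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Generic supervised contrastive loss over L2-normalised features;
    same-class pairs are positives.  A standard formulation, not
    necessarily the exact loss used in STCPDA."""
    z = F.normalize(features, dim=1)
    sim = z @ z.t() / temperature
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask
    sim = sim.masked_fill(self_mask, float("-inf"))    # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    log_prob = log_prob.masked_fill(self_mask, 0.0)    # avoid -inf * 0
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob * pos_mask.float()).sum(dim=1) / pos_count
    valid = pos_mask.any(dim=1)                        # rows with at least one positive
    return loss[valid].mean() if valid.any() else features.new_zeros(())


if __name__ == "__main__":
    # Toy usage with random features and an imaginary linear classifier.
    torch.manual_seed(0)
    num_classes, feat_dim = 5, 16
    src_feat = torch.randn(8, feat_dim)
    src_labels = torch.randint(0, num_classes, (8,))
    tgt_feat = torch.randn(8, feat_dim)
    classifier = torch.nn.Linear(feat_dim, num_classes)

    tgt_logits = classifier(tgt_feat)
    l_st = self_training_loss(tgt_logits, threshold=0.5)

    mask, pseudo = select_simple_samples(tgt_logits, threshold=0.5)
    all_feat = torch.cat([src_feat, tgt_feat[mask]], dim=0)
    all_labels = torch.cat([src_labels, pseudo[mask]], dim=0)
    l_con = supervised_contrastive_loss(all_feat, all_labels)
    print(f"self-training loss: {l_st.item():.4f}, contrastive loss: {l_con.item():.4f}")
```

In this sketch, only confident target predictions contribute pseudo-labels to both losses, which mirrors the paper's motivation of keeping noisy pseudo-labels out of self-training while still letting all reliably labelled samples shape the class boundaries through the contrastive term.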