
Universal and Scalable Weakly-Supervised Domain Adaptation

Keywords: computer science; classifier; domain adaptation; scalability; artificial intelligence; pattern recognition; noise; denoising; machine learning; data mining
Authors
Xuan Liu,Ying Huang,Hao Wang,Zheng Xiao,Shigeng Zhang
Source
Journal: IEEE Transactions on Image Processing [Institute of Electrical and Electronics Engineers]
Volume/Issue: 33: 1313-1325  Cited by: 1
Identifier
DOI:10.1109/tip.2024.3361691
Abstract

Domain adaptation leverages labeled data from a source domain to learn an accurate classifier for an unlabeled target domain. Since data collected in practical applications usually contain noise, weakly-supervised domain adaptation algorithms, which tolerate label noise and/or feature noise in the source domain, have attracted widespread attention from researchers. Several weakly-supervised domain adaptation methods have been proposed to mitigate the difficulty of obtaining high-quality source domains that are highly related to the target domain. However, these methods assume that an accurate noise rate is known in advance in order to reduce the negative transfer caused by noise in the source domain, which limits their application in the real world, where the noise rate is unknown. Meanwhile, since source data usually come from multiple domains, naively applying single-source domain adaptation algorithms may lead to sub-optimal results. We therefore propose a universal and scalable weakly-supervised domain adaptation method called PDCAS that relaxes these assumptions and makes the approach more general. Specifically, PDCAS consists of two stages: progressive distillation and domain alignment. In the progressive distillation stage, we iteratively distill out potentially clean samples, whose annotated labels are highly consistent with the model's predictions, and correct the labels of noisy source samples. This process is unsupervised, exploiting intrinsic similarity to measure and extract the initially corrected samples. In the domain alignment stage, we use Class-Aligned Sampling, which balances samples across classes for both the source and target domains while aligning the global feature distributions, to alleviate the shift of label distributions. Finally, we apply PDCAS to the multi-source noisy scenario and propose a novel multi-source weakly-supervised domain adaptation method called MSPDCAS, which demonstrates the scalability of our framework.
Extensive experiments on Office-31 and Office-Home datasets demonstrate the effectiveness and robustness of our method compared to state-of-the-art methods.
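The two stages described in the abstract can be illustrated with a minimal sketch. The paper's exact criteria are not given here, so the details below are assumptions: `progressive_distillation_step` uses the model's softmax confidence in the annotated label as a stand-in for the paper's consistency measure, keeping the most consistent samples as "clean" and pseudo-labeling the rest, while `class_aligned_batch` illustrates the idea of Class-Aligned Sampling by drawing an equal number of samples per class. Both function names and the `keep_ratio`/`per_class` parameters are hypothetical.

```python
import numpy as np

def progressive_distillation_step(probs, labels, keep_ratio):
    """One filtering round in the spirit of progressive distillation.

    probs:      (N, C) softmax outputs of the current model
    labels:     (N,) annotated (possibly noisy) labels
    keep_ratio: fraction of samples to treat as clean this round

    Returns the indices judged clean and a corrected label vector in
    which the remaining samples are relabeled with the model's guess.
    """
    n = len(labels)
    # Model's confidence in each sample's annotated label.
    agreement = probs[np.arange(n), labels]
    order = np.argsort(-agreement)            # most consistent first
    n_keep = max(1, int(keep_ratio * n))
    clean_idx = order[:n_keep]
    noisy_idx = order[n_keep:]
    corrected = labels.copy()
    # Pseudo-label the low-agreement samples with the model's prediction.
    corrected[noisy_idx] = probs[noisy_idx].argmax(axis=1)
    return clean_idx, corrected

def class_aligned_batch(labels, classes, per_class, rng):
    """Class-balanced sampling: draw the same number of indices per class
    so that source and target batches share a uniform label distribution."""
    batch = []
    for c in classes:
        pool = np.where(labels == c)[0]
        if len(pool) == 0:
            continue
        # Sample with replacement only when the class is too small.
        batch.extend(rng.choice(pool, size=per_class,
                                replace=len(pool) < per_class))
    return np.array(batch)
```

In practice the filtering step would be run inside a training loop, with `keep_ratio` growing over epochs as the model improves; the class-aligned batches would then feed whatever feature-alignment loss the method uses.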