Keywords
Leverage (statistics)
Transfer of learning
Computer science
Domain adaptation
Adaptation (eye)
Conditional probability distribution
Distribution (mathematics)
Divergence (linguistics)
Domain (mathematical analysis)
Artificial intelligence
Machine learning
Transfer (computing)
Marginal distribution
Class (philosophy)
Mathematics
Random variable
Econometrics
Statistics
Philosophy
Mathematical analysis
Physics
Optics
Parallel computing
Classifier (UML)
Linguistics
Authors
Jindong Wang, Yiqiang Chen, Shuji Hao, Wenjie Feng, Zhiqi Shen
Identifier
DOI:10.1109/icdm.2017.150
Abstract
Transfer learning has achieved promising results by leveraging knowledge from the source domain to annotate a target domain that has few or no labels. Existing methods often seek to minimize the distribution divergence between domains, such as the marginal distribution, the conditional distribution, or both. However, these two distances are often treated equally in existing algorithms, which can result in poor performance in real applications. Moreover, existing methods usually assume that the dataset is balanced, which also limits their performance on imbalanced tasks that are quite common in real problems. To tackle the distribution adaptation problem, in this paper we propose a novel transfer learning approach, named Balanced Distribution Adaptation (BDA), which can adaptively leverage the importance of the marginal and conditional distribution discrepancies; several existing methods can be treated as special cases of BDA. Based on BDA, we also propose a Weighted Balanced Distribution Adaptation (W-BDA) algorithm to tackle the class imbalance issue in transfer learning. W-BDA not only considers distribution adaptation between domains but also adaptively changes the weight of each class. To evaluate the proposed methods, we conduct extensive experiments on several transfer learning tasks, which demonstrate the effectiveness of our proposed algorithms over several state-of-the-art methods.
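The core idea in the abstract — adaptively weighting the marginal against the conditional distribution discrepancy — can be sketched as a balanced divergence of the form (1 − μ)·D(P(Xs), P(Xt)) + μ·Σc D(P(Xs|c), P(Xt|c)). The sketch below is illustrative only, not the paper's implementation: it uses a linear-kernel MMD as the distance D, and the function names, the fixed balance factor `mu`, and the use of pseudo-labels for the unlabeled target domain are assumptions for the example.

```python
import numpy as np

def mmd_linear(Xs, Xt):
    """Squared MMD with a linear kernel: ||mean(Xs) - mean(Xt)||^2."""
    delta = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(delta @ delta)

def balanced_divergence(Xs, ys, Xt, yt_pseudo, mu=0.5):
    """(1 - mu) * marginal MMD + mu * average per-class conditional MMD.

    yt_pseudo are target-domain pseudo-labels (in practice obtained from a
    classifier trained on the labeled source domain, since the target has
    few or no labels). mu = 0 adapts only the marginal distribution,
    mu = 1 only the conditional distributions.
    """
    marginal = mmd_linear(Xs, Xt)
    # Only classes present in both domains contribute a conditional term.
    classes = np.intersect1d(np.unique(ys), np.unique(yt_pseudo))
    conditional = float(np.mean(
        [mmd_linear(Xs[ys == c], Xt[yt_pseudo == c]) for c in classes]
    ))
    return (1 - mu) * marginal + mu * conditional
```

A class-weighted variant in the spirit of W-BDA would additionally scale each per-class term by the class's prior probability in each domain, so that rare classes are not drowned out by frequent ones.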