Discriminator
Computer science
Domain (mathematical analysis)
Artificial intelligence
Benchmark (surveying)
Task (project management)
Distillation
Pattern recognition (psychology)
Feature (linguistics)
Machine learning
Mathematics
Detector
Mathematical analysis
Philosophy
Telecommunications
Economics
Organic chemistry
Chemistry
Linguistics
Management
Geography
Geodesy
Authors
Xiyao Liu, Zhong Ji, Yanwei Pang, Zhi Han
Identifier
DOI:10.1016/j.neunet.2023.06.009
Abstract
Domain Adaptive Few-Shot Learning (DA-FSL) aims at accomplishing few-shot classification tasks on a novel domain with the aid of a large number of source-style samples and a handful of target-style samples. It is essential for DA-FSL to transfer task knowledge from the source domain to the target domain and to overcome the asymmetric amounts of labeled data in the two domains. To this end, we propose Dual Distillation Discriminator Networks (D3Net), designed around the scarcity of labeled target-style samples in DA-FSL. Specifically, we employ the idea of distillation discrimination to avoid the over-fitting caused by the unequal number of samples in the target and source domains: the student discriminator is trained on soft labels produced by the teacher discriminator. Meanwhile, we design a task propagation stage and a mixed domain stage, operating at the feature-space level and the instance level respectively, to generate more target-style samples, applying the task distributions and the sample diversity of the source domain to enhance the target domain. Our D3Net realizes distribution alignment between the source domain and the target domain and constrains the FSL task distribution via prototype distributions on the mixed domain. Extensive experiments on three DA-FSL benchmark datasets, i.e., mini-ImageNet, tiered-ImageNet, and DomainNet, demonstrate that our D3Net achieves competitive performance.
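The distillation-discrimination idea in the abstract (a student discriminator trained on the teacher discriminator's soft labels rather than hard domain labels) can be illustrated with a minimal sketch. This is not the authors' D3Net implementation; it is a generic temperature-scaled soft-label distillation loss in NumPy, with the function names and the temperature value `T=2.0` chosen for illustration only.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened domain predictions.

    Training the student discriminator to match the teacher's soft labels,
    instead of hard source/target labels, smooths the training signal when
    the two domains contribute very different numbers of samples.
    """
    p_t = softmax(teacher_logits, T)           # teacher soft labels
    log_p_s = np.log(softmax(student_logits, T))
    # The conventional T^2 scaling keeps gradient magnitudes comparable
    # across temperatures.
    return float((p_t * (np.log(p_t) - log_p_s)).sum(axis=-1).mean() * T**2)

# Identical logits give zero loss; diverging logits give a positive loss.
teacher = np.array([[2.0, 0.5], [0.1, 1.5]])
matching = teacher.copy()
diverging = np.array([[0.0, 2.0], [1.5, 0.1]])
print(distillation_loss(matching, teacher))    # 0.0
print(distillation_loss(diverging, teacher) > 0.0)
```

The loss is zero exactly when the student reproduces the teacher's softened distribution, which is the fixed point the distillation objective drives the student toward.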