Keywords
Computer science, Generalization, Distillation, Domain (mathematical analysis), Task (project management), One-shot, Relation (database), Domain knowledge, Artificial intelligence, Set (abstract data type), Knowledge transfer, Shot (few-shot), Machine learning, Knowledge management, Data mining, Mathematics, Chemistry, Management, Programming language, Organic chemistry, Mechanical engineering, Economics, Mathematical analysis, Engineering
Authors
Zhong Ji, Jingwei Ni, Xiyao Liu, Yanwei Pang
Identifier
DOI: 10.1007/s11704-022-1250-2
Abstract
Although few-shot learning (FSL) has achieved great progress, it remains a significant challenge, especially when the source and target sets come from different domains, a setting known as cross-domain few-shot learning (CD-FSL). Utilizing more source-domain data is an effective way to improve CD-FSL performance. However, knowledge from different source domains may become entangled and interfere with one another, hurting performance on the target domain. Therefore, we propose Team-Knowledge Distillation Networks (TKD-Net) to tackle this problem by exploring a strategy for coordinating the cooperation of multiple teachers. Specifically, we distill knowledge from a cooperating team of teacher networks into a single student network within a meta-learning framework. The framework combines task-oriented knowledge distillation with cooperation among multiple teachers to train an efficient student with better generalization ability on unseen tasks. Moreover, our TKD-Net employs both response-based knowledge and relation-based knowledge to transfer more comprehensive and effective knowledge. Extensive experiments on four fine-grained datasets demonstrate the effectiveness and superiority of the proposed TKD-Net.
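To make the two kinds of transferred knowledge concrete, below is a minimal sketch of multi-teacher distillation combining a response-based loss (KL divergence between the student's softened predictions and the teachers' averaged softened predictions) and a relation-based loss (matching pairwise sample-similarity matrices). This is not the authors' implementation: averaging the teachers, the temperature, the cosine-similarity relation, and the loss weights are all illustrative assumptions standing in for the paper's task-oriented cooperation scheme.

```python
# A self-contained PyTorch sketch of multi-teacher knowledge distillation with
# response-based and relation-based terms, in the spirit of TKD-Net. Details
# (teacher averaging, T=4.0, cosine relations, 0.5 weights) are assumptions.

import torch
import torch.nn.functional as F


def response_kd_loss(student_logits, teacher_logits_list, T=4.0):
    """Response-based knowledge: KL divergence between the student's softened
    predictions and the average of the teachers' softened predictions."""
    teacher_probs = torch.stack(
        [F.softmax(t / T, dim=-1) for t in teacher_logits_list]
    ).mean(dim=0)  # teachers "cooperate" by averaging outputs (an assumption)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_student, teacher_probs, reduction="batchmean") * T * T


def relation_kd_loss(student_feats, teacher_feats_list):
    """Relation-based knowledge: match pairwise cosine-similarity matrices
    between the samples of a task, student vs. averaged teachers."""
    def sim_matrix(f):
        f = F.normalize(f, dim=-1)
        return f @ f.t()

    teacher_sim = torch.stack(
        [sim_matrix(f) for f in teacher_feats_list]
    ).mean(dim=0)
    return F.mse_loss(sim_matrix(student_feats), teacher_sim)


if __name__ == "__main__":
    # Toy tensors standing in for one few-shot episode; shapes are hypothetical.
    batch, feat_dim, n_classes, n_teachers = 8, 64, 5, 3
    student_feats = torch.randn(batch, feat_dim)
    student_logits = torch.randn(batch, n_classes)
    teacher_feats = [torch.randn(batch, feat_dim) for _ in range(n_teachers)]
    teacher_logits = [torch.randn(batch, n_classes) for _ in range(n_teachers)]
    labels = torch.randint(0, n_classes, (batch,))

    # Total loss: supervised cross-entropy plus both distillation terms;
    # the 0.5 weights are placeholders, not values from the paper.
    loss = (
        F.cross_entropy(student_logits, labels)
        + 0.5 * response_kd_loss(student_logits, teacher_logits)
        + 0.5 * relation_kd_loss(student_feats, teacher_feats)
    )
    print(f"total loss: {loss.item():.4f}")
```

In a meta-learning setup, such a loss would be computed per episode so the student is distilled on task-conditioned outputs rather than on a single global dataset, which is the role the paper assigns to task-oriented distillation.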