Keywords
Computer science
Transfer learning
Inference
Artificial intelligence
Machine learning
Meta-learning (computer science)
Simplicity (philosophy)
Similarity (geometry)
Task (project management)
Management
Economics
Image (mathematics)
Philosophy
Epistemology
Authors
Yaoyue Zheng, Xuetao Zhang, Zhiqiang Tian, Wei Zeng, Shaoyi Du
Identifier
DOI:10.1016/j.knosys.2023.110798
Abstract
Few-shot Learning (FSL) is a challenging problem that aims to learn and generalize from limited examples. Recent works have adopted a combination of meta-learning and transfer learning strategies for FSL tasks. These methods perform pre-training and transfer the learned knowledge to meta-learning. However, it remains unclear whether this transfer pattern is appropriate, and the objectives of the two learning strategies have not been explored. In addition, the inference of meta-learning in FSL relies on sample relations that require further consideration. In this paper, we uncover an overlooked discrepancy in learning objectives between pre-training and meta-learning strategies and propose a simple yet effective learning paradigm for the few-shot classification task. Specifically, the proposed method comprises two components: (i) Detach: We formulate an effective learning paradigm, Adaptive Meta-Transfer (A-MET), which adaptively eliminates undesired representations learned by pre-training to address the discrepancy. (ii) Unite: We propose a Global Similarity Compatibility Measure (GSCM) to jointly consider sample correlation at a global level for more consistent predictions. The proposed method is simple to implement without any complex components. Extensive experiments on four public benchmarks demonstrate that our method outperforms other state-of-the-art methods under more challenging scenarios with large domain differences between the base and novel classes and less support information available. Code is available at: https://github.com/yaoyz96/a-met.
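The abstract does not spell out the internals of A-MET or GSCM, so no faithful implementation can be given here. As background for readers new to the setting, the following is a minimal sketch of a standard prototype-based few-shot classifier (the common baseline this line of work builds on, not the authors' method): class prototypes are the mean support embeddings, and each query is assigned to the most similar prototype by cosine similarity.

```python
import numpy as np

def prototypes(support_emb, support_labels, n_classes):
    """Mean embedding per class (the standard prototypical-network step)."""
    return np.stack([support_emb[support_labels == c].mean(axis=0)
                     for c in range(n_classes)])

def classify(query_emb, protos):
    """Assign each query to its nearest prototype under cosine similarity."""
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    p = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    sims = q @ p.T  # (n_query, n_classes) similarity matrix
    return sims.argmax(axis=1)
```

In this baseline, each query is scored against prototypes independently; the abstract's GSCM instead considers sample correlations jointly at a global level, which this sketch does not capture.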