Computer science
Domain (mathematical analysis)
Artificial intelligence
Pascal (unit)
Classifier (UML)
Projectile
Symbol
Natural language processing
Transfer of learning
Machine learning
Mathematics
Arithmetic
Programming language
Mathematical analysis
Chemistry
Organic chemistry
Authors
Wenjian Wang, Lijuan Duan, Yuxi Wang, Junsong Fan, Zhaoxiang Zhang
Identifier
DOI: 10.1109/tpami.2023.3306352
Abstract
Few-shot learning aims to recognize novel categories solely relying on a few labeled samples, with existing few-shot methods primarily focusing on the categories sampled from the same distribution. Nevertheless, this assumption cannot always be ensured, and the actual domain shift problem significantly reduces the performance of few-shot learning. To remedy this problem, we investigate an interesting and challenging cross-domain few-shot learning task, where the training and testing tasks employ different domains. Specifically, we propose a Meta-Memory scheme to bridge the domain gap between source and target domains, leveraging style-memory and content-memory components. The former stores intra-domain style information from source domain instances and provides a richer feature distribution. The latter stores semantic information through exploration of knowledge of different categories. Under the contrastive learning strategy, our model effectively alleviates the cross-domain problem in few-shot learning. Extensive experiments demonstrate that our proposed method achieves state-of-the-art performance on cross-domain few-shot semantic segmentation tasks on the COCO-20^i, PASCAL-5^i, FSS-1000, and SUIM datasets and positively affects few-shot classification tasks on Meta-Dataset.
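The abstract only outlines the Meta-Memory idea at a high level. As a rough illustration of the two components it names, below is a minimal PyTorch sketch of a style memory that stores channel-wise feature statistics from source-domain instances and re-applies them AdaIN-style, plus a content memory of class prototypes used in a contrastive loss. Every name here (MetaMemory, write_style, restylize, write_content, contrastive_loss), the momentum prototype update, and the AdaIN-style restylization are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn.functional as F


class MetaMemory(torch.nn.Module):
    """Hypothetical meta-memory: style statistics + class prototypes."""

    def __init__(self, num_classes: int, feat_dim: int, style_slots: int = 256):
        super().__init__()
        # Style memory: one (mean, std) pair per slot over feature channels.
        self.register_buffer("style_mean", torch.zeros(style_slots, feat_dim))
        self.register_buffer("style_std", torch.ones(style_slots, feat_dim))
        self.register_buffer("style_ptr", torch.zeros((), dtype=torch.long))
        # Content memory: one prototype per class, kept by moving average.
        self.register_buffer("proto", torch.zeros(num_classes, feat_dim))

    @torch.no_grad()
    def write_style(self, feats: torch.Tensor) -> None:
        # Store channel statistics of source-domain feature maps (B, C, H, W).
        mean = feats.mean(dim=(2, 3))               # (B, C)
        std = feats.std(dim=(2, 3)) + 1e-6          # (B, C)
        for m, s in zip(mean, std):
            i = int(self.style_ptr) % self.style_mean.size(0)
            self.style_mean[i], self.style_std[i] = m, s
            self.style_ptr += 1

    def restylize(self, feats: torch.Tensor) -> torch.Tensor:
        # AdaIN-like swap: normalize features, re-apply a randomly drawn stored style.
        b, c, _, _ = feats.shape
        mu = feats.mean(dim=(2, 3), keepdim=True)
        sigma = feats.std(dim=(2, 3), keepdim=True) + 1e-6
        idx = torch.randint(0, self.style_mean.size(0), (b,))
        new_mu = self.style_mean[idx].view(b, c, 1, 1)
        new_sigma = self.style_std[idx].view(b, c, 1, 1)
        return (feats - mu) / sigma * new_sigma + new_mu

    @torch.no_grad()
    def write_content(self, feats: torch.Tensor, labels: torch.Tensor, m: float = 0.9) -> None:
        # Update class prototypes from pooled features (B, C) and labels (B,).
        for f, y in zip(feats, labels):
            self.proto[y] = m * self.proto[y] + (1 - m) * f

    def contrastive_loss(self, feats: torch.Tensor, labels: torch.Tensor, tau: float = 0.1):
        # InfoNCE-style loss pulling each feature toward its class prototype.
        logits = F.normalize(feats, dim=1) @ F.normalize(self.proto, dim=1).t() / tau
        return F.cross_entropy(logits, labels)
```

In this sketch the style memory enriches the training feature distribution by letting features be re-stylized with statistics drawn from other source instances, while the content memory supplies class-level anchors for the contrastive objective; how the paper actually combines these components is described in the full text, not here.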