Forgetting
Relation (database)
Computer science
Artificial intelligence
Class (philosophy)
Graph
Machine learning
Knowledge transfer
Knowledge graph
Theoretical computer science
Data mining
Linguistics
Knowledge management
Philosophy
Authors
Songlin Dong, Xiaopeng Hong, Xiaoyu Tao, Xinyuan Chang, Xing Wei, Yihong Gong
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence
[Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2021-05-18
Volume/Issue: 35 (2): 1255-1263
Citations: 91
Identifier
DOI:10.1609/aaai.v35i2.16213
Abstract
In this paper, we focus on the challenging few-shot class incremental learning (FSCIL) problem, which requires transferring knowledge from old tasks to new ones while overcoming catastrophic forgetting. We propose the exemplar relation distillation incremental learning framework to balance the tasks of preserving old knowledge and adapting to new knowledge. First, we construct an exemplar relation graph to represent the knowledge learned by the original network and update it gradually as new tasks are learned. Then an exemplar relation loss function for discovering the relational knowledge between different classes is introduced to learn and transfer the structural information in the relation graph. Extensive experiments demonstrate that relational knowledge does exist among the exemplars and that our approach outperforms other state-of-the-art class-incremental learning methods on the CIFAR100, miniImageNet, and CUB200 datasets.
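The core idea of relation distillation, as described in the abstract, can be illustrated with a minimal sketch: build a pairwise-similarity "relation graph" over exemplar embeddings for both the old and the updated network, and penalize distortions of that structure. This is not the authors' exact formulation (the paper's loss and graph-update details are not given in the abstract); the function names and the cosine-similarity choice here are illustrative assumptions.

```python
import numpy as np

def relation_matrix(embeddings):
    # Illustrative relation graph: cosine similarity between every
    # pair of exemplar embeddings (rows of `embeddings`).
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return normed @ normed.T

def relation_distillation_loss(old_embeddings, new_embeddings):
    # Mean-squared difference between the old and new relation graphs:
    # penalizes the updated network for distorting the structural
    # (inter-exemplar) knowledge captured by the original network.
    r_old = relation_matrix(old_embeddings)
    r_new = relation_matrix(new_embeddings)
    return float(np.mean((r_old - r_new) ** 2))
```

In an incremental-learning loop, `old_embeddings` would come from the frozen pre-update network and `new_embeddings` from the network being trained on the new task, so minimizing this term preserves old-class structure while the usual classification loss handles new-knowledge adaptation.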