Keywords
Forgetting
Computer science
Artificial intelligence
Generalization
Transfer of learning
Task (project management)
Knowledge transfer
Class (philosophy)
Task analysis
Transfer (computing)
Embedding
Stability (learning theory)
Set (abstract data type)
Machine learning
Engineering
Mathematics
Mathematical analysis
Programming language
Systems engineering
Philosophy
Linguistics
Knowledge management
Parallel computing
Authors
Junxin Lu, Shiliang Sun
Source
Journal: IEEE Transactions on Image Processing
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Volume: 33, pp. 3353-3368
Cited by: 1
Identifier
DOI: 10.1109/tip.2024.3403053
Abstract
Continual zero-shot learning (CZSL) aims to develop a model that accumulates historical knowledge to recognize unseen tasks while eliminating catastrophic forgetting of seen tasks when learning new ones. However, existing CZSL methods, while mitigating catastrophic forgetting on old tasks, often cause a negative transfer problem on new tasks by over-focusing on accumulating old knowledge and neglecting the model's plasticity for learning new tasks. To tackle these problems, we propose PAMK, a prototype augmented multi-teacher knowledge transfer network that strikes a balance between recognition stability on old tasks and generalization plasticity on new tasks. PAMK consists of a prototype augmented contrastive generation (PACG) module and a multi-teacher knowledge transfer (MKT) module. To reduce the cumulative semantic decay of the class representation embedding and mitigate catastrophic forgetting, we propose a continual prototype augmentation strategy based on relevance scores in PACG. Furthermore, by introducing the prototype augmented semantic-visual contrastive loss, PACG promotes intra-class compactness for all classes across all tasks. MKT effectively accumulates semantic knowledge learned from old tasks to recognize new tasks via the proposed multi-teacher knowledge transfer, eliminating the negative transfer problem. Extensive experiments on various CZSL settings demonstrate the superior performance of PAMK compared to state-of-the-art methods. In particular, in the practical task-free CZSL setting, PAMK achieves gains of 3.28%, 3.09%, and 3.71% in mean harmonic accuracy on the CUB, AWA1, and AWA2 datasets, respectively.
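The abstract names two generic mechanisms: a prototype-augmented semantic-visual contrastive loss and multi-teacher knowledge transfer (a distillation-style objective). The PyTorch sketch below only illustrates the general shape of such losses under assumed tensor shapes, temperatures, and weights; it is not the authors' PAMK implementation, which additionally involves relevance-score-based continual prototype augmentation and the PACG generation module described above.

```python
import torch
import torch.nn.functional as F

def prototype_contrastive_loss(visual_feats, labels, prototypes, temperature=0.1):
    """Generic semantic-visual contrastive loss against class prototypes (sketch).

    visual_feats: (B, D) visual embeddings for a batch
    labels:       (B,)   integer class indices
    prototypes:   (C, D) one prototype per class seen so far
    """
    v = F.normalize(visual_feats, dim=-1)
    p = F.normalize(prototypes, dim=-1)
    logits = v @ p.t() / temperature  # (B, C) similarity of each sample to every class prototype
    # Pull each sample toward its own class prototype and push it away from the others.
    return F.cross_entropy(logits, labels)

def multi_teacher_distillation_loss(student_logits, teacher_logits_list, weights=None, T=2.0):
    """Weighted KL distillation from several frozen teacher models (sketch)."""
    if weights is None:
        weights = [1.0 / len(teacher_logits_list)] * len(teacher_logits_list)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    loss = 0.0
    for w, t_logits in zip(weights, teacher_logits_list):
        p_teacher = F.softmax(t_logits.detach() / T, dim=-1)  # teachers are not updated
        loss = loss + w * F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
    return loss
```

In a typical continual-learning training step, terms of this kind would be added to a standard classification loss with tunable weights; the specific weighting, teacher selection, and prototype update rules used in PAMK are defined in the paper itself, not here.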