Forgetting
Computer science
Artificial intelligence
Machine learning
Encoder
Embedding
Class (philosophy)
Data mining
Pattern recognition (psychology)
Philosophy
Linguistics
Operating system
Authors
Le Sun, Mingyang Zhang, Benyou Wang, Prayag Tiwari
Source
Journal: IEEE Journal of Biomedical and Health Informatics
[Institute of Electrical and Electronics Engineers]
Date: 2023-02-22
Volume/Issue: 28 (4): 1872-1882
Citations: 38
Identifier
DOI:10.1109/jbhi.2023.3247861
Abstract
Continuously analyzing medical time series as new classes emerge is meaningful for health monitoring and medical decision-making. Few-shot class-incremental learning (FSCIL) explores the classification of few-shot new classes without forgetting old classes. However, little of the existing research on FSCIL focuses on medical time series classification, which is more challenging to learn due to its large intra-class variability. In this paper, we propose a framework, the Meta self-Attention Prototype Incrementer (MAPIC), to address these problems. MAPIC contains three main modules: an embedding encoder for feature extraction, a prototype enhancement module for increasing inter-class variation, and a distance-based classifier for reducing intra-class variation. To mitigate catastrophic forgetting, MAPIC adopts a parameter protection strategy in which the parameters of the embedding encoder module are frozen at incremental stages after being trained in the base stage. The prototype enhancement module enhances the expressiveness of prototypes by perceiving inter-class relations through a self-attention mechanism. We design a composite loss function containing the sample classification loss, the prototype non-overlapping loss, and the knowledge distillation loss, which work together to reduce intra-class variations and resist catastrophic forgetting. Experimental results on three different time series datasets show that MAPIC significantly outperforms state-of-the-art approaches by 27.99%, 18.4%, and 3.95%, respectively.
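The core mechanics described in the abstract (a frozen encoder, per-class mean prototypes, a self-attention step over prototypes, and nearest-prototype classification) can be illustrated with a minimal numpy sketch. This is a toy illustration under stated assumptions, not the paper's implementation: the fixed linear map standing in for the trained-then-frozen encoder, the single-head attention without learned projections, and the 0.1 blending factor are all hypothetical simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    # Stand-in for the frozen embedding encoder: a fixed linear map.
    # (Hypothetical; MAPIC trains a network in the base stage, then
    # freezes its parameters at incremental stages.)
    return x @ W

def class_prototypes(embeddings, labels):
    # One prototype per class: the mean of that class's support embeddings.
    classes = np.unique(labels)
    protos = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])
    return classes, protos

def self_attention_enhance(P, blend=0.1):
    # Toy single-head self-attention over the prototype set, so each
    # prototype perceives inter-class relations. The residual blend
    # factor is an illustrative assumption, not from the paper.
    scores = P @ P.T / np.sqrt(P.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return P + blend * (weights @ P)

def classify(queries, classes, prototypes):
    # Distance-based classifier: assign each query to the class of its
    # nearest prototype in embedding space.
    d = np.linalg.norm(queries[:, None, :] - prototypes[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

# Tiny synthetic episode: 2 well-separated classes, 5 support samples each.
W = rng.normal(size=(4, 8))
support_x = np.concatenate([rng.normal(0.0, 0.1, (5, 4)),
                            rng.normal(3.0, 0.1, (5, 4))])
support_y = np.array([0] * 5 + [1] * 5)

emb = encode(support_x, W)
classes, protos = class_prototypes(emb, support_y)
protos = self_attention_enhance(protos)

queries = encode(np.array([[0.05, 0.0, 0.1, -0.05],
                           [3.1, 2.9, 3.0, 3.05]]), W)
print(classify(queries, classes, protos))  # nearest-prototype predictions
```

In an incremental stage, only new prototypes would be computed and appended while the encoder stays frozen, which is how this family of methods avoids overwriting the representation learned for old classes.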