Keywords
Computer science
Forgetting
Artificial intelligence
Concept drift
Contrast (vision)
Segmentation
Machine learning
Class (philosophy)
Semantic gap
Natural language processing
Data stream mining
Image (mathematics)
Image retrieval
Linguistics
Philosophy
Authors
Danpei Zhao,Bo Yuan,Zhenwei Shi
Identifier
DOI:10.1109/tpami.2023.3273574
Abstract
As a front-burner problem in incremental learning, class incremental semantic segmentation (CISS) is plagued by catastrophic forgetting and semantic drift. Although recent methods have utilized knowledge distillation to transfer knowledge from the old model, they are still unable to avoid pixel confusion, which results in severe misclassification after incremental steps due to the lack of annotations for past and future classes. Meanwhile, data-replay-based approaches suffer from storage burdens and privacy concerns. In this paper, we propose to address CISS without exemplar memory and to resolve catastrophic forgetting and semantic drift simultaneously. We present Inherit with Distillation and Evolve with Contrast (IDEC), which consists of a Dense Knowledge Distillation on all Aspects (DADA) scheme and an Asymmetric Region-wise Contrastive Learning (ARCL) module. Driven by the devised dynamic class-specific pseudo-labelling strategy, DADA distils intermediate-layer features and output logits collaboratively, with more emphasis on semantic-invariant knowledge inheritance. ARCL implements region-wise contrastive learning in the latent space to resolve semantic drift among known classes, current classes, and unknown classes. We demonstrate the effectiveness of our method with state-of-the-art performance on multiple CISS benchmarks, including the Pascal VOC 2012, ADE20K, and ISPRS datasets. Our method also shows superior anti-forgetting ability, particularly in multi-step CISS tasks.
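The abstract names three generic ingredients: pseudo-labelling unannotated pixels with the old model, jointly distilling intermediate features and output logits, and region-wise contrastive learning in the latent space. The PyTorch sketch below illustrates one plausible form of each; every function name, threshold, temperature, and loss weight here is an illustrative assumption, not the paper's actual DADA/ARCL implementation, whose details the abstract does not specify.

```python
# Hypothetical sketch of the three ingredients named in the abstract.
# All names and hyperparameters are illustrative assumptions.
import torch
import torch.nn.functional as F


def pseudo_label(old_logits, labels, conf_thresh=0.7):
    """Fill background pixels with confident old-model predictions.

    old_logits: (B, C_old, H, W) logits of the frozen old model.
    labels:     (B, H, W) current-step ground truth; pixels of past
                classes are annotated as background (0).
    """
    probs = old_logits.softmax(dim=1)
    conf, pred = probs.max(dim=1)                  # (B, H, W)
    keep = (labels == 0) & (conf > conf_thresh)    # confident bg pixels only
    out = labels.clone()
    out[keep] = pred[keep]
    return out


def distillation_loss(new_feats, old_feats, new_logits, old_logits, T=2.0):
    """Distil intermediate features (MSE) and output logits (KL) together."""
    feat_loss = sum(F.mse_loss(f_new, f_old.detach())
                    for f_new, f_old in zip(new_feats, old_feats))
    c_old = old_logits.shape[1]                    # restrict KL to old classes
    logit_loss = F.kl_div(
        F.log_softmax(new_logits[:, :c_old] / T, dim=1),
        F.softmax(old_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * T * T
    return feat_loss + logit_loss


def region_contrastive_loss(embeddings, labels, temperature=0.1):
    """Region-wise contrastive loss over per-class mean embeddings.

    embeddings: (B, D, H, W) latent features; labels: (B, H, W).
    Each class region in the batch is summarised by its mean embedding
    (a prototype); same-class prototypes attract, others repel.
    """
    B, D, H, W = embeddings.shape
    labels = F.interpolate(labels[:, None].float(), size=(H, W),
                           mode="nearest").long().squeeze(1)
    protos, proto_cls = [], []
    for b in range(B):
        for c in labels[b].unique():
            if c.item() == 255:                    # ignore index
                continue
            mask = labels[b] == c
            protos.append(embeddings[b, :, mask].mean(dim=1))
            proto_cls.append(c)
    if len(protos) < 2:
        return embeddings.new_zeros(())
    protos = F.normalize(torch.stack(protos), dim=1)   # (N, D)
    cls = torch.stack(proto_cls)
    sim = protos @ protos.t() / temperature            # (N, N)
    loss, n_anchors = 0.0, 0
    for i in range(len(protos)):
        pos = cls == cls[i]
        pos[i] = False                             # exclude self-pair
        if not pos.any():
            continue
        others = torch.ones_like(pos)
        others[i] = False
        # InfoNCE-style: denominator over all other prototypes
        log_prob = sim[i] - torch.logsumexp(sim[i][others], dim=0)
        loss = loss - log_prob[pos].mean()
        n_anchors += 1
    if n_anchors == 0:
        return protos.new_zeros(())
    return loss / n_anchors
```

In an incremental step, these terms would typically be summed with a cross-entropy loss computed against the pseudo-labelled targets, with per-term weights tuned on a validation split; the asymmetric and class-specific refinements that IDEC adds on top are not reproduced here.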