Computer Science
Artificial Intelligence
Forgetting
Segmentation
Distillation
Relation (database)
Feature (linguistics)
Class (philosophy)
Machine Learning
Annotation
Pattern Recognition (psychology)
Data Mining
Linguistics
Philosophy
Organic Chemistry
Chemistry
Authors
Huisi Wu, Zhaoze Wang, Zhao Zhen, Cheng Chen, Jing Qin
Source
Journal: IEEE Transactions on Medical Imaging
[Institute of Electrical and Electronics Engineers]
Date: 2023-12-01
Volume/Issue: 42 (12): 3794-3804
Identifier
DOI: 10.1109/TMI.2023.3307892
Abstract
Deep learning models have achieved remarkable success in multi-type nuclei segmentation. These models are mostly trained once with full annotations for all nuclei types available, and they lack the ability to continually learn new classes due to catastrophic forgetting. In this paper, we study the practical and important class-incremental continual learning problem, where the model is incrementally updated with new classes without access to previous data. We propose a novel continual nuclei segmentation method that avoids forgetting knowledge of old classes and facilitates the learning of new classes by achieving feature-level knowledge distillation with prototype-wise relation distillation and contrastive learning. Concretely, prototype-wise relation distillation imposes constraints on inter-class relation similarity, encouraging the encoder to extract similar class distributions for old classes in the feature space. Prototype-wise contrastive learning with a hard sampling strategy enhances the intra-class compactness and inter-class separability of features, improving performance on both old and new classes. Experiments on two multi-type nuclei segmentation benchmarks, i.e., MoNuSAC and CoNSeP, demonstrate the effectiveness of our method, with superior performance over many competitive methods. Code is available at https://github.com/zzw-szu/CoNuSeg.
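To make the prototype-wise relation distillation idea in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch, not the authors' released implementation (see the GitHub link above for that). It assumes encoder feature maps and per-class binary masks at the same spatial resolution, builds one prototype per class by masked average pooling, and penalizes changes in the inter-class cosine-similarity structure between a frozen old model and the updated new model. The function names `compute_prototypes` and `relation_distillation_loss` are illustrative.

```python
import torch
import torch.nn.functional as F

def compute_prototypes(features, masks):
    """Compute one prototype (mean feature vector) per class.

    features: (B, C, H, W) encoder feature maps.
    masks:    (B, K, H, W) binary class masks, assumed resized to the
              feature resolution beforehand.
    Returns:  (K, C) matrix of class prototypes.
    """
    B, C, H, W = features.shape
    K = masks.shape[1]
    feats = features.permute(0, 2, 3, 1).reshape(-1, C)        # (B*H*W, C)
    m = masks.permute(0, 2, 3, 1).reshape(-1, K).float()       # (B*H*W, K)
    # Masked average pooling: per-class feature sum / per-class pixel count.
    protos = m.t() @ feats / (m.sum(dim=0, keepdim=True).t() + 1e-6)
    return protos                                               # (K, C)

def relation_distillation_loss(protos_old, protos_new):
    """Match the inter-class similarity structure of old vs. new prototypes."""
    rel_old = F.cosine_similarity(
        protos_old.unsqueeze(1), protos_old.unsqueeze(0), dim=-1)  # (K, K)
    rel_new = F.cosine_similarity(
        protos_new.unsqueeze(1), protos_new.unsqueeze(0), dim=-1)  # (K, K)
    # Old-model relations serve as a fixed target for the new model.
    return F.mse_loss(rel_new, rel_old.detach())

# Usage sketch: features_old come from the frozen previous-step model,
# features_new from the model being trained on the new classes.
# loss = relation_distillation_loss(
#     compute_prototypes(features_old, masks_old_classes),
#     compute_prototypes(features_new, masks_old_classes))
```

Constraining the pairwise similarity matrix rather than the raw features is what makes this a relation-level distillation: the new encoder is free to drift in absolute feature values as long as the relative arrangement of old-class prototypes is preserved.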