Keywords
Segmentation, Computer science, Transfer learning, Forgetting, Lifelong learning, Deep learning, Artificial intelligence, Task (project management), Convolutional neural network, Multi-task learning, Dilation (metric space), Adapter (computing), Pattern recognition (psychology), Machine learning, Psychology, Cognitive psychology, Computer hardware, Mathematics, Economics, Combinatorics, Management, Pedagogy
Authors
Kuo Men,Xinyuan Chen,Baiyu Yang,Ji Zhu,Junlin Yi,Shulian Wang,Yexiong Li,Jianrong Dai
Identifier
DOI:10.1016/j.radonc.2020.12.034
Abstract
Background and purpose: Convolutional neural networks (CNNs) have achieved human-level performance in automatic segmentation. An important challenge CNNs face in segmentation is catastrophic forgetting: they lose performance on previously learned tasks when trained on a new one. In this study, we propose a lifelong learning method to learn multiple segmentation tasks continuously without forgetting previous tasks.

Materials and methods: The cohort covered three tumor types: 800 patients with nasopharyngeal cancer (NPC), 800 with breast cancer, and 800 with rectal cancer. The tasks were segmentation of the clinical target volume (CTV) of these three cancers. The proposed lifelong learning network adopts a dilation adapter to learn the three segmentation tasks one by one. Only the newly added dilation adapter (seven layers) is fine-tuned for each incoming task, while all previously learned layers are frozen.

Results: Compared with single-task, multi-task, or transfer learning, the proposed lifelong learning method achieved better or comparable segmentation accuracy, with a DSC of 0.86 for NPC, 0.89 for breast cancer, and 0.87 for rectal cancer. Lifelong learning avoids forgetting in sequential learning and yields good performance with less training data. Furthermore, it is more efficient than single-task or transfer learning, reducing the number of parameters, model size, and training time by ~58.8%, ~55.6%, and ~25.0%, respectively.

Conclusion: The proposed method preserves the knowledge of previous tasks while learning a new one using a dilation adapter. It yields comparable performance with much less training data, fewer model parameters, and less training time.
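The freeze-and-adapt pattern described in the abstract lends itself to a short illustration. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the backbone, channel width, dilation rates, and the per-task 1x1 prediction head are all assumptions introduced for illustration; only the seven-layer adapter count and the freezing of previously learned layers come from the abstract.

```python
import torch
import torch.nn as nn


class DilationAdapter(nn.Module):
    # Seven conv layers, matching the count in the abstract; the kernel
    # sizes, dilation rates, and the final 1x1 prediction layer are
    # illustrative assumptions, not the paper's specification.
    def __init__(self, channels, num_classes, dilations=(1, 2, 4, 8, 4, 2)):
        super().__init__()
        convs = []
        for d in dilations:  # six dilated conv layers
            convs += [nn.Conv2d(channels, channels, 3, padding=d, dilation=d),
                      nn.ReLU(inplace=True)]
        convs.append(nn.Conv2d(channels, num_classes, 1))  # 7th layer: head
        self.layers = nn.Sequential(*convs)

    def forward(self, x):
        return self.layers(x)


class LifelongSegmenter(nn.Module):
    def __init__(self, backbone, channels):
        super().__init__()
        self.backbone = backbone          # shared feature extractor
        self.channels = channels          # backbone output channels
        self.adapters = nn.ModuleList()   # one adapter per learned task

    def add_task(self, num_classes):
        # Freeze every parameter learned so far, so performance on
        # earlier tasks cannot degrade, then attach a fresh adapter
        # that alone is trained for the new task.
        for p in self.parameters():
            p.requires_grad = False
        adapter = DilationAdapter(self.channels, num_classes)
        self.adapters.append(adapter)
        return adapter  # pass adapter.parameters() to the optimizer

    def forward(self, x, task_id):
        return self.adapters[task_id](self.backbone(x))
```

A usage sketch, assuming a hypothetical backbone `my_cnn` that outputs 64-channel feature maps: learn the NPC CTV task first, then call `add_task` again for breast and rectal cancer.

```python
model = LifelongSegmenter(backbone=my_cnn, channels=64)  # my_cnn: hypothetical
npc_adapter = model.add_task(num_classes=2)              # task 0: NPC CTV
optimizer = torch.optim.Adam(npc_adapter.parameters(), lr=1e-4)
```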