Computer science
Grading (engineering)
Inference
Modal verb
Artificial intelligence
Coding (social sciences)
Distillation
Machine learning
Glioma
Natural language processing
Pattern recognition (psychology)
Data mining
Medicine
Engineering
Civil engineering
Mathematics
Organic chemistry
Chemistry
Polymer chemistry
Cancer research
Statistics
Authors
Xiaohan Xing, Zhe Chen, Meilu Zhu, Yuenan Hou, Zhifan Gao, Yixuan Yuan
Identifier
DOI:10.1007/978-3-031-16443-9_61
Abstract
The fusion of multi-modal data, e.g., pathology slides and genomic profiles, can provide complementary information and benefit glioma grading. However, genomic profiles are difficult to obtain due to the high costs and technical challenges, thus limiting the clinical applications of multi-modal diagnosis. In this work, we address the clinically relevant problem where paired pathology-genomic data are available during training, while only pathology slides are accessible for inference. To improve the performance of pathological grading models, we present a discrepancy and gradient-guided distillation framework to transfer the privileged knowledge from the multi-modal teacher to the pathology student. For the teacher side, to prepare useful knowledge, we propose a Discrepancy-induced Contrastive Distillation (DC-Distill) module that explores reliable contrastive samples with teacher-student discrepancy to regulate the feature distribution of the student. For the student side, as the teacher may include incorrect information, we propose a Gradient-guided Knowledge Refinement (GK-Refine) module that builds a knowledge bank and adaptively absorbs the reliable knowledge according to their agreement in the gradient space. Experiments on the TCGA GBM-LGG dataset show that our proposed distillation framework improves the pathological glioma grading significantly and outperforms other KD methods. Notably, with the sole pathology slides, our method achieves comparable performance with existing multi-modal methods. The code is available at https://github.com/CityU-AIM-Group/MultiModal-learning .
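For readers who want a concrete sense of the two modules named in the abstract, below is a minimal PyTorch-style sketch of the general ideas: selecting contrastive samples by teacher-student discrepancy (DC-Distill) and accepting a distillation signal only when its gradient agrees with the task gradient (GK-Refine). The function names, tensor shapes, the specific discrepancy criterion, and the cosine-similarity test are all illustrative assumptions, not the authors' implementation; the actual code is in the linked repository.

```python
# Illustrative sketch only (assumed PyTorch API); the real DC-Distill and
# GK-Refine modules are in https://github.com/CityU-AIM-Group/MultiModal-learning
import torch
import torch.nn.functional as F


def discrepancy_contrastive_kd(student_feat, teacher_feat,
                               student_logits, teacher_logits,
                               labels, temperature=0.07):
    """Contrastive distillation with discrepancy-selected samples (assumed
    criterion): pull each student feature toward the teacher feature of the
    same sample, and contrast it only against teacher features of samples
    where the teacher is right but the student is wrong."""
    s = F.normalize(student_feat, dim=1)               # (B, D)
    t = F.normalize(teacher_feat, dim=1)               # (B, D)

    # Teacher-student discrepancy mask: teacher correct, student incorrect.
    reliable = teacher_logits.argmax(1).eq(labels) & student_logits.argmax(1).ne(labels)

    sim = s @ t.t() / temperature                      # (B, B) similarities
    # Keep the diagonal (positives) plus the "reliable" columns (negatives);
    # mask everything else out of the softmax.
    keep = reliable.unsqueeze(0) | torch.eye(len(s), dtype=torch.bool, device=s.device)
    sim = sim.masked_fill(~keep, float('-inf'))
    targets = torch.arange(len(s), device=s.device)
    return F.cross_entropy(sim, targets)


def gradient_refined_loss(task_loss, kd_losses, shared_params):
    """Gradient-guided refinement (assumed cosine test): add a distillation
    term only if its gradient on the shared student parameters points in a
    direction that agrees with the task gradient."""
    flat = lambda grads: torch.cat([g.flatten() for g in grads])
    g_task = flat(torch.autograd.grad(task_loss, shared_params, retain_graph=True))

    total = task_loss
    for kd_loss in kd_losses:                          # candidate "knowledge bank" terms
        g_kd = flat(torch.autograd.grad(kd_loss, shared_params, retain_graph=True))
        if F.cosine_similarity(g_task, g_kd, dim=0) > 0:
            total = total + kd_loss
    return total
```

The cosine-similarity test above is one simple reading of "agreement in the gradient space"; the paper's knowledge bank and its weighting scheme may differ, so this should be treated as a starting point rather than a reproduction of the method.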