Overfitting
Transformer
Computer science
Artificial intelligence
Pattern recognition (psychology)
Hierarchical database model
Machine learning
Data mining
Artificial neural network
Engineering
Voltage
Electrical engineering
Authors
Rui Yan, Zhilong Lv, Zhidong Yang, Senlin Lin, Chun-Hou Zheng, Fa Zhang
Source
Journal: IEEE Journal of Biomedical and Health Informatics
Publisher: Institute of Electrical and Electronics Engineers
Date: 2024-01-01
Volume/Issue: 28 (1): 7-18
Citations: 1
Identifier
DOI: 10.1109/jbhi.2023.3307584
Abstract
Transformer-based methods provide a good opportunity for modeling the global context of a gigapixel whole slide image (WSI); however, two main problems remain in applying Transformers to the WSI-based survival analysis task. First, the training data for survival analysis is limited, which makes the model prone to overfitting. This problem is even worse for Transformer-based models, which require large-scale data to train. Second, a WSI has an extremely high resolution (up to 150,000 x 150,000 pixels) and is typically organized as a multi-resolution pyramid. The vanilla Transformer cannot model the hierarchical structure of a WSI (such as patch cluster-level relationships), which makes it incapable of learning hierarchical WSI representations. To address these problems, in this paper we propose a novel Sparse and Hierarchical Transformer (SH-Transformer) for survival analysis. Specifically, we introduce sparse self-attention to alleviate the overfitting problem and propose a hierarchical Transformer structure to learn hierarchical WSI representations. Experimental results on three WSI datasets show that the proposed framework outperforms state-of-the-art methods.
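The abstract names two techniques, sparse self-attention and a hierarchical Transformer structure, without giving implementation details. The PyTorch sketch below illustrates one common way to sparsify self-attention over WSI patch embeddings (top-k masking of attention scores). It is an illustrative assumption, not the authors' SH-Transformer code; the class name SparseSelfAttention and the top_k parameter are hypothetical.

# Minimal sketch of top-k sparse self-attention over patch embeddings.
# Assumption: the actual sparsity pattern used in SH-Transformer may differ.
import torch
from torch import nn


class SparseSelfAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 8, top_k: int = 16):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.top_k = top_k
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_patches, dim) -- patch embeddings of one slide
        b, n, _ = x.shape
        qkv = self.qkv(x).reshape(b, n, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)  # each: (b, heads, n, head_dim)

        # Scaled dot-product attention scores: (b, heads, n, n)
        scores = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5

        # Keep only the top-k largest scores per query; mask the rest to -inf
        # so they receive zero weight after the softmax.
        k_eff = min(self.top_k, n)
        topk_vals, _ = scores.topk(k_eff, dim=-1)
        threshold = topk_vals[..., -1:]  # k-th largest score per query
        scores = scores.masked_fill(scores < threshold, float("-inf"))

        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.proj(out)


if __name__ == "__main__":
    # Toy usage: 1 slide, 1024 patch embeddings of dimension 256.
    x = torch.randn(1, 1024, 256)
    layer = SparseSelfAttention(dim=256, num_heads=8, top_k=16)
    print(layer(x).shape)  # torch.Size([1, 1024, 256])

Restricting each query to its k highest-scoring keys limits how diffuse the attention map can become, which is one way sparsity can act as a regularizer when, as the abstract notes, survival cohorts are too small to train a dense Transformer reliably.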