BiFTransNet: A unified and simultaneous segmentation network for gastrointestinal images of CT & MRI

Keywords: Computer science, Segmentation, Artificial intelligence, Convolutional neural network, Encoder, Deep learning, Image segmentation, Dice score, Pattern recognition, Geometry, Mathematics, Operating system
Authors
Xin Jiang,Yizhou Ding,Mingzhe Liu,Yong Wang,Yan Li,Zongda Wu
Source
Journal: Computers in Biology and Medicine [Elsevier BV]
Volume/Issue: 165: 107326-107326 | Citations: 22
Identifier
DOI:10.1016/j.compbiomed.2023.107326
Abstract

Gastrointestinal (GI) cancer is a malignancy affecting the digestive organs. During radiation therapy, the radiation oncologist must precisely aim the X-ray beam at the tumor while avoiding unaffected areas of the stomach and intestines. Consequently, accurate, automated GI image segmentation is urgently needed in clinical practice. While the fully convolutional network (FCN) and U-Net framework have shown impressive results in medical image segmentation, their ability to model long-range dependencies is constrained by the convolutional kernel's restricted receptive field. The transformer has a robust capacity for global modeling owing to its inherent global self-attention mechanism. The TransUnet model leverages the strengths of both the convolutional neural network (CNN) and transformer models through a hybrid CNN-transformer encoder. However, the concatenation of high- and low-level features in the decoder is ineffective in fusing global and local information. To overcome this limitation, we propose an innovative transformer-based medical image segmentation architecture called BiFTransNet, which introduces a BiFusion module into the decoder stage to fuse global and local features effectively by integrating features from different modules. Further, a multilevel loss (ML) strategy is introduced to supervise the learning of each decoder layer and optimize the use of globally and locally fused contextual features at different scales. Our method achieved a Dice score of 89.51% and an intersection-over-union (IoU) score of 86.54% on the UW-Madison Gastrointestinal Segmentation dataset. Moreover, our method attained a Dice score of 78.77% and a Hausdorff distance (HD) of 27.94 on the Synapse Multi-organ Segmentation dataset. Compared with the state-of-the-art methods, our proposed method achieves superior segmentation performance in gastrointestinal segmentation tasks. More significantly, our method can be easily extended to medical segmentation in different modalities such as CT and MRI. Our method enables multimodal medical image segmentation in clinical settings and provides decision support for clinical radiotherapy plans.
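The abstract names two ingredients without giving implementation details: a BiFusion module that fuses global (transformer) and local (CNN) features in the decoder, and a multilevel loss that supervises every decoder layer. The sketch below is a minimal illustrative approximation in PyTorch, not the authors' actual code: SimpleFusion, dice_loss, and multilevel_loss are hypothetical names, and the concatenation-plus-convolution fusion and the Dice + cross-entropy deep supervision are assumptions about how such components are commonly realized.

```python
# Minimal illustrative sketch, NOT the paper's exact BiFusion module or ML loss.
# Assumptions: fusion is concatenation of a global and a local feature map of
# equal spatial size followed by a 3x3 conv; the multilevel loss upsamples each
# decoder level's auxiliary prediction and applies Dice + cross-entropy to it.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleFusion(nn.Module):
    """Fuse a global (transformer) and a local (CNN) feature map."""
    def __init__(self, global_ch, local_ch, out_ch):
        super().__init__()
        self.mix = nn.Sequential(
            nn.Conv2d(global_ch + local_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, g, l):
        # g and l must share the same spatial resolution (N, C, H, W).
        return self.mix(torch.cat([g, l], dim=1))

def dice_loss(logits, target, eps=1e-6):
    """Soft Dice loss; target is one-hot with shape (N, C, H, W)."""
    prob = torch.softmax(logits, dim=1)
    inter = (prob * target).sum(dim=(2, 3))
    union = prob.sum(dim=(2, 3)) + target.sum(dim=(2, 3))
    return 1.0 - ((2.0 * inter + eps) / (union + eps)).mean()

def multilevel_loss(level_logits, target, weights=None):
    """Supervise every decoder level: upsample each auxiliary prediction to the
    target resolution, then sum weighted Dice + cross-entropy terms."""
    weights = weights or [1.0] * len(level_logits)
    total = 0.0
    for w, logits in zip(weights, level_logits):
        logits = F.interpolate(logits, size=target.shape[2:],
                               mode="bilinear", align_corners=False)
        ce = F.cross_entropy(logits, target.argmax(dim=1))
        total = total + w * (ce + dice_loss(logits, target))
    return total
```

In this reading, each decoder stage would pass its fused feature map through a lightweight segmentation head to produce the auxiliary logits that multilevel_loss consumes; how BiFTransNet actually weights the levels or structures the BiFusion attention is specified in the paper itself, not here.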