Keywords: segmentation, computer science, convolutional neural network, artificial intelligence, Transformer, pattern recognition, image segmentation, breast ultrasonography, breast cancer, mammography, cancer, medicine
Authors
Huaikun Zhang, Jing Lian, Zetong Yi, Ruichao Wu, Xiangyu Lǚ, Pei Ma, Yide Ma
Identifier
DOI: 10.1016/j.bspc.2023.105427
Abstract
Breast cancer is a significant health concern and remains one of the leading causes of mortality in women worldwide. Convolutional Neural Networks (CNNs) have been shown to be effective for ultrasound breast image segmentation. Yet, because they lack long-distance dependence, CNNs perform poorly on challenges typical of ultrasound breast lesion segmentation, such as similar intensity distributions, irregularly shaped objects, and blurred boundaries. To overcome these issues, several studies have combined transformers and CNNs, compensating for the shortcomings of CNNs with the transformers' ability to exploit long-distance dependence. Most of these studies, however, limited themselves to rigidly plugging transformer blocks into the CNN, lacking consistency in the feature-extraction process and therefore performing poorly on challenging medical images. In this paper, we propose HAU-Net (hierarchical attention-guided U-Net), a hybrid CNN-transformer framework that benefits from both the long-range dependency of transformers and the local detail representation of CNNs. To incorporate global context information, we introduce an L-G transformer block nested into the skip connections of the U-shaped architecture. In addition, to further improve segmentation performance, we add a cross attention block (CAB) module on the decoder side to allow different layers to interact. Extensive experimental results on three public datasets indicate that the proposed HAU-Net achieves better performance than other state-of-the-art methods for breast lesion segmentation, with Dice coefficients of 83.11% on BUSI, 88.73% on UDIAT, and 89.48% on BLUI, respectively.
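The two attention mechanisms the abstract names can be sketched in a few lines. The following is a minimal numpy illustration, not the paper's actual HAU-Net implementation: all shapes, weight matrices, and function names are illustrative assumptions. It shows (a) global self-attention over a flattened feature map, the kind of long-range interaction a transformer block in a skip connection provides, and (b) cross-attention, where queries come from one decoder-side feature map and keys/values from another, in the spirit of the cross attention block (CAB).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(feat, Wq, Wk, Wv):
    """Global self-attention over a flattened H*W feature map of shape (N, C):
    every spatial position attends to every other position, giving the
    long-range dependence that a local convolution kernel lacks."""
    Q, K, V = feat @ Wq, feat @ Wk, feat @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])   # (N, N) pairwise affinities
    return softmax(scores) @ V                # (N, C) re-weighted features

def cross_attention(q_feat, kv_feat, Wq, Wk, Wv):
    """Cross-attention in the spirit of the abstract's CAB (hypothetical
    sketch): queries from one layer, keys/values from another, so that
    different decoder layers can interact."""
    Q, K, V = q_feat @ Wq, kv_feat @ Wk, kv_feat @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])   # (N_q, N_kv) affinities
    return softmax(scores) @ V                # (N_q, C)

rng = np.random.default_rng(0)
C = 16
skip = rng.standard_normal((8 * 8, C))   # e.g. an 8x8 encoder skip feature
deep = rng.standard_normal((4 * 4, C))   # e.g. a deeper 4x4 decoder feature
Wq, Wk, Wv = (0.1 * rng.standard_normal((C, C)) for _ in range(3))

refined_skip = self_attention(skip, Wq, Wk, Wv)   # shape (64, 16)
fused = cross_attention(skip, deep, Wq, Wk, Wv)   # shape (64, 16)
print(refined_skip.shape, fused.shape)
```

Note how cross-attention preserves the query map's spatial size (64 positions) while drawing its content from the smaller, deeper map, which is why such a block can fuse information across decoder scales.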