Keywords
Segmentation, Deep learning, Artificial intelligence, Computer science, Robustness, Tracking error, Medicine, Cardiology, Biochemistry, Chemistry, Control, Gene
Authors
Sigurd Vangen Wifstad,Henrik Agerup Kildahl,Bjørnar Grenne,Espen Holte,Ståle Wågen Hauge,Sigbjørn Sæbø,Daniel Mekonnen,Berhanu Nega,Rune Haaverstad,Mette‐Elise Estensen,Håvard Dalen,Lasse Løvstakken
Identifier
DOI: 10.1016/j.ultrasmedbio.2023.12.023
Abstract
Objective
Valvular heart diseases (VHDs) pose a significant public health burden, and choosing the best treatment strategy necessitates accurate assessment of heart valve function. Transthoracic echocardiography (TTE) is the key modality for evaluating VHDs, but the lack of standardized quantitative measurements leads to subjective and time-consuming assessments. We aimed to use deep learning to automate the extraction of mitral valve (MV) leaflets and annular hinge points from echocardiograms of the MV, improving standardization and reducing workload in the quantitative assessment of MV disease.
Methods
We annotated the MV leaflets and annulus points in 2931 images from 127 patients. We propose an approach for segmenting the annotated features using Attention UNet with deep supervision and weight scheduling of the attention coefficients to enforce saliency around the MV. The derived segmentation masks were used to extract quantitative biomarkers for specific MV leaflet scallops throughout the heart cycle.
Results
Evaluation performance was summarized by a Dice score of 0.63 ± 0.14, an annulus error of 3.64 ± 2.53 mm, and a leaflet angle error of 8.7 ± 8.3°. Leveraging Attention UNet with deep supervision improved the robustness of clinically relevant metrics compared with UNet, reducing standard deviations by 2.7° (angle error) and 0.73 mm (annulus error). Using the derived biomarkers, we correctly identified cases of MV prolapse, cases of stenosis, and healthy references from clinical material.
Conclusion
Robust deep learning segmentation and tracking of MV morphology and motion are possible by leveraging attention gates and deep supervision, and hold promise for enhancing VHD diagnosis and treatment monitoring.
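For reference, the Dice score reported in the Results is the standard overlap metric between a predicted and a reference segmentation mask. The sketch below is a minimal illustrative implementation on flattened binary masks; the function name, smoothing term, and inputs are assumptions for illustration, not the authors' evaluation code.

```python
def dice_score(pred, target, eps=1e-7):
    """Dice similarity coefficient between two binary masks.

    pred, target: flat sequences of 0/1 mask values (same length).
    eps is a small smoothing term (an assumption here) so that two
    empty masks score 1.0 instead of dividing by zero.
    """
    # Pixels labeled foreground in both masks.
    intersection = sum(1 for p, t in zip(pred, target) if p and t)
    # Dice = 2 * |A ∩ B| / (|A| + |B|)
    return (2.0 * intersection + eps) / (sum(pred) + sum(target) + eps)
```

For example, masks [1, 1, 0, 0] and [0, 1, 1, 0] share one foreground pixel out of two each, giving a Dice score of 0.5; identical non-empty masks score 1.0.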