Medicine, Convolutional neural network, Neuroradiology reading room, Ultrasound, Radiology, Breast imaging, Breast ultrasonography, Receiver operating characteristic, Artificial intelligence, BI-RADS, Mammography, Computer science, Breast cancer, Internal medicine, Neurology, Cancer, Psychiatry
Authors
Alexander Ciritsis, Cristina Rossi, Matthias Eberhard, Magda Marcon, Anton S. Becker, Andreas Boss
Identifier
DOI: 10.1007/s00330-019-06118-7
Abstract
To evaluate a deep convolutional neural network (dCNN) for detection, highlighting, and classification of ultrasound (US) breast lesions mimicking human decision-making according to the Breast Imaging Reporting and Data System (BI-RADS). One thousand nineteen breast ultrasound images from 582 patients (age 56.3 ± 11.5 years) were linked to the corresponding radiological report. Lesions were categorized into the following classes: no tissue, normal breast tissue, BI-RADS 2 (cysts, lymph nodes), BI-RADS 3 (non-cystic mass), and BI-RADS 4–5 (suspicious). To test the accuracy of the dCNN, one internal test dataset (101 images) and one external test dataset (43 images) were evaluated by the dCNN and two independent readers. Radiological reports, histopathological results, and follow-up examinations served as the reference standard. The performances of the dCNN and the human readers were quantified in terms of classification accuracies and receiver operating characteristic (ROC) curves. In the internal test dataset, the classification accuracy of the dCNN differentiating BI-RADS 2 from BI-RADS 3–5 lesions was 87.1% (external 93.0%), compared with 79.2 ± 1.9% for the human readers (external 95.3 ± 2.3%). For the classification of BI-RADS 2–3 versus BI-RADS 4–5, the dCNN reached a classification accuracy of 93.1% (external 95.3%), whereas the human readers reached 91.6 ± 5.4% (external 94.1 ± 1.2%). The AUC on the internal dataset was 83.8 (external 96.7) for the dCNN and 84.6 ± 2.3 (external 90.9 ± 2.9) for the human readers. dCNNs may be used to mimic human decision-making in the evaluation of single US images of breast lesions according to the BI-RADS catalog. The technique reaches high accuracies and may serve for standardization of the highly observer-dependent US assessment.
• Deep convolutional neural networks could be used to classify US breast lesions.
• The implemented dCNN with its sliding-window approach reaches high accuracies in the classification of US breast lesions.
• Deep convolutional neural networks may serve for standardization in US BI-RADS classification.
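The abstract does not disclose the network implementation, so the following is only a minimal, hypothetical Python/PyTorch sketch of a five-class patch classifier covering the classes named in the study (no tissue, normal breast tissue, BI-RADS 2, BI-RADS 3, BI-RADS 4–5). The architecture, input size, and class names are illustrative assumptions, not the authors' dCNN.

# A minimal sketch, not the authors' implementation: a small convolutional
# classifier over the five classes used in the paper. Depth, channel counts,
# and the 128x128 grayscale input are assumptions for illustration only.
import torch
import torch.nn as nn

CLASSES = ["no_tissue", "normal_tissue", "birads_2", "birads_3", "birads_4_5"]

class SmallBreastUSNet(nn.Module):
    def __init__(self, n_classes: int = len(CLASSES)):
        super().__init__()
        # Three conv blocks on single-channel (grayscale) ultrasound patches.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Global average pooling keeps the classifier head independent of patch size.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = self.pool(x).flatten(1)
        return self.classifier(x)

model = SmallBreastUSNet()
# One random 128x128 grayscale patch stands in for a real ultrasound image.
logits = model(torch.randn(1, 1, 128, 128))
probs = torch.softmax(logits, dim=1)
print({c: float(p) for c, p in zip(CLASSES, probs[0])})

In practice such a model would be trained with a cross-entropy loss on labeled US images like those described above; the snippet only shows the forward pass on a random patch.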
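The "sliding window approach" mentioned in the key points can be illustrated, again only as a hedged sketch under stated assumptions: classify overlapping patches of a full US image and accumulate the probability of the suspicious class (BI-RADS 4–5) into a heat map that highlights lesion candidates. The window size, stride, and reuse of the SmallBreastUSNet sketch above are assumptions, not values from the paper.

# Hedged illustration of sliding-window highlighting: score each overlapping
# patch with the classifier and average the suspicious-class probability into
# a per-pixel heat map.
import numpy as np
import torch

def suspicion_heatmap(model, image: np.ndarray, win: int = 128, stride: int = 32) -> np.ndarray:
    """image: 2-D grayscale array in [0, 1]; returns a per-pixel suspicion map."""
    h, w = image.shape
    heat = np.zeros((h, w), dtype=np.float32)
    count = np.zeros((h, w), dtype=np.float32)
    model.eval()
    with torch.no_grad():
        for y in range(0, h - win + 1, stride):
            for x in range(0, w - win + 1, stride):
                patch = torch.from_numpy(np.ascontiguousarray(image[y:y + win, x:x + win])).float()
                logits = model(patch.unsqueeze(0).unsqueeze(0))           # shape (1, n_classes)
                p_susp = torch.softmax(logits, dim=1)[0, 4].item()        # index 4 = BI-RADS 4-5
                heat[y:y + win, x:x + win] += p_susp
                count[y:y + win, x:x + win] += 1.0
    return heat / np.maximum(count, 1.0)

# Example with a random frame; real use would load a breast US image.
model = SmallBreastUSNet()  # classifier from the sketch above
heatmap = suspicion_heatmap(model, np.random.rand(256, 384).astype(np.float32))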
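The evaluation reported in the abstract collapses the five classes into two binary tasks (BI-RADS 2 vs. BI-RADS 3–5 and BI-RADS 2–3 vs. BI-RADS 4–5) and summarizes them by classification accuracy and ROC/AUC. The following is a small sketch of that grouping with scikit-learn metrics; the probabilities and labels are placeholders, not the study's data.

# Placeholder per-image softmax outputs over the five classes
# [no tissue, normal tissue, BI-RADS 2, BI-RADS 3, BI-RADS 4-5].
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

probs = np.array([
    [0.01, 0.04, 0.80, 0.10, 0.05],
    [0.02, 0.03, 0.10, 0.25, 0.60],
    [0.00, 0.05, 0.15, 0.70, 0.10],
    [0.01, 0.02, 0.05, 0.12, 0.80],
])
true_class = np.array([2, 4, 3, 4])  # indices into the five classes above

# Task 1: BI-RADS 2 vs. BI-RADS 3-5 (non-lesion classes ignored here).
y_true_345 = (true_class >= 3).astype(int)
score_345 = probs[:, 3] + probs[:, 4]            # probability mass on BI-RADS 3-5
print("BI-RADS 2 vs 3-5  AUC:", roc_auc_score(y_true_345, score_345))
print("BI-RADS 2 vs 3-5  accuracy:", accuracy_score(y_true_345, (score_345 > 0.5).astype(int)))

# Task 2: BI-RADS 2-3 vs. BI-RADS 4-5.
y_true_45 = (true_class == 4).astype(int)
score_45 = probs[:, 4]
print("BI-RADS 2-3 vs 4-5 AUC:", roc_auc_score(y_true_45, score_45))
print("BI-RADS 2-3 vs 4-5 accuracy:", accuracy_score(y_true_45, (score_45 > 0.5).astype(int)))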