Artificial Intelligence
Deep Learning
Deciphering
Computer Science
Cancer
Machine Learning
Field (mathematics)
Cancer Detection
Precision and Recall
Medicine
Bioinformatics
Mathematics
Pure Mathematics
Internal Medicine
Biology
Authors
A. Chempak Kumar, D. Muhammad Noorul Mubarak
Source
Journal: Cognitive Science and Technology
Date: 2023-01-01
Pages: 87-98
Identifier
DOI: 10.1007/978-981-99-2746-3_10
Abstract
Gastrointestinal cancer is the sixth most prevalent cancer, affecting millions of people across the globe. In 2020, an estimated 1,089,103 (5.6%) new cases and 768,793 (7.7%) deaths were reported; hence, the early diagnosis of this type of cancer is one of the highest priorities for healthcare workers. Advanced technologies like Deep Learning have rapidly revolutionized the medical industry and can help detect cancers in the human body. The major drawback of Deep Learning is its black-box approach, in which the decision-making methodology of complex algorithms is difficult to comprehend. The enigmatic nature of AI decisions can have disastrous implications, especially in the medical field. Explainable Artificial Intelligence (XAI) techniques are proposed to overcome this drawback, helping to decipher the reasons behind the decisions made by machine learning algorithms. The present paper attempts to detect gastrointestinal cancer using Deep Learning methods (ResNet50, AlexNet, GoogLeNet) and XAI methods (Saliency, Gradients, Shape features). The models created are then evaluated on the Kaggle dataset using metrics such as accuracy, sensitivity, specificity, recall, and precision. To further enhance accuracy in the early diagnosis of cancer, Local Interpretable Model-agnostic Explanation (LIME) is employed to determine the score and the important regions of the images.
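The LIME procedure mentioned in the abstract can be illustrated with a minimal, self-contained sketch (not the authors' implementation): an image is split into segments (superpixels), segments are randomly switched on or off, the black-box classifier is queried on each perturbed image, and a proximity-weighted linear model is fitted over the on/off masks; the linear coefficients then score each region's importance. The function names, kernel width, and the toy classifier below are illustrative assumptions.

```python
import numpy as np

def lime_image_scores(image, segments, predict_fn, n_samples=500,
                      kernel_width=0.25, seed=0):
    """Minimal LIME-style sketch (illustrative, not the paper's code).

    image      -- 2-D grayscale array
    segments   -- integer array, same shape, labelling pixels 0..k-1
    predict_fn -- black-box model: image -> probability of the target class
    Returns one importance score per segment.
    """
    rng = np.random.default_rng(seed)
    k = int(segments.max()) + 1
    # Random binary mask per sample: which segments stay visible.
    masks = rng.integers(0, 2, size=(n_samples, k))
    masks[0] = 1  # keep the unperturbed image in the sample set
    preds = np.empty(n_samples)
    for i, m in enumerate(masks):
        perturbed = image * m[segments]  # zero out the switched-off segments
        preds[i] = predict_fn(perturbed)
    # Proximity kernel: samples closer to the original image weigh more.
    dist = 1.0 - masks.mean(axis=1)
    weights = np.exp(-(dist ** 2) / kernel_width ** 2)
    # Weighted least squares on [intercept | masks] -> per-segment coefficients.
    sw = np.sqrt(weights)[:, None]
    design = sw * np.c_[np.ones(n_samples), masks.astype(float)]
    coef, *_ = np.linalg.lstsq(design, sw[:, 0] * preds, rcond=None)
    return coef[1:]  # drop the intercept; one score per segment

if __name__ == "__main__":
    image = np.ones((4, 4))
    segments = np.array([[0, 0, 1, 1],
                         [0, 0, 1, 1],
                         [2, 2, 3, 3],
                         [2, 2, 3, 3]])
    # Toy black box: "probability" depends only on segment 0 (top-left).
    predict = lambda img: img[:2, :2].mean()
    scores = lime_image_scores(image, segments, predict)
    print(scores.argmax())  # segment 0 receives the largest score
```

In practice the paper's pipeline would use a trained CNN (e.g. ResNet50) as `predict_fn` and a superpixel algorithm such as SLIC to produce `segments`; the `lime` Python package wraps exactly this loop in `LimeImageExplainer`.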