Shortage (economics)
Field (mathematics)
Computer science
Distillation
Artificial intelligence
Data science
Mathematics
Linguistics
Philosophy
Organic chemistry
Chemistry
Pure mathematics
Government (linguistics)
Authors
Hefeng Meng,Zhiqiang Lin,Fan Yang,Yonghui Xu,Lizhen Cui
Identifier
DOI:10.1145/3503181.3503211
Abstract
In recent years, the medical field has faced persistent problems such as shortages of professionals and of medical resources. The application of machine learning in medicine has alleviated these problems to a certain extent, but machine learning methods have shortcomings of their own: models are often too large to deploy on lightweight equipment, and medical data sets are difficult to share. Researchers have proposed many methods to address these issues, and knowledge distillation is one of them. As a model compression and acceleration technique, knowledge distillation has been widely used in the medical field. Many studies show that knowledge distillation can effectively compress large, complex models, improve model performance, and solve many of the problems that models face in medical applications. Focusing on the various applications of knowledge distillation in the medical field, this paper presents a comprehensive review from the perspectives of knowledge distillation itself, the problems it can solve in the medical field, and its practical applications.
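To make the model-compression idea concrete, below is a minimal sketch of the standard soft-target distillation loss in the style of Hinton et al., not taken from the surveyed paper: a large teacher model's softened output distribution supervises a small student model alongside the usual hard-label loss. The function name, temperature, and weighting are illustrative assumptions.

```python
# Illustrative sketch of a soft-target knowledge distillation loss (Hinton-style);
# names and hyperparameters are assumptions, not the surveyed paper's method.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.7):
    """Blend KL divergence on temperature-softened outputs with cross-entropy on hard labels."""
    # Softened distributions; the T^2 factor restores the usual gradient scale.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Toy usage: an 8-class batch; in practice the teacher runs frozen in eval mode.
student_logits = torch.randn(16, 8, requires_grad=True)
with torch.no_grad():
    teacher_logits = torch.randn(16, 8)
labels = torch.randint(0, 8, (16,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```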