Keywords
Computer science, Robustness, Artificial neural network, Artificial intelligence, Deep learning, Distillation, Porting, Machine learning, Process (computing), Knowledge transfer, Knowledge management, Software, Programming language
Authors
Xi Chen,Zhiqiang Xing,Yuyang Cheng
Identifier
DOI:10.1109/icsp51882.2021.9408881
Abstract
In recent years, with the rapid development of deep neural networks, deep learning has been applied to fields such as medicine, industry, and education. However, the large number of parameters in these networks and their heavy storage requirements make them difficult to port to mobile devices. Model compression methods can effectively alleviate this problem. Among them, knowledge distillation adopts the idea of transfer learning: a teacher network guides a student network, allowing the student model to learn from the teacher and thereby improving the model's robustness. This article primarily reviews the development and trends of knowledge distillation.