Authors
Weiwei Zhang, Yufeng Guo, Junhuang Wang, Jianqing Zhu, Huanqiang Zeng
Source
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Publisher: Institute of Electrical and Electronics Engineers
Date: 2024-03-13
Volume/Issue: 34 (8): 7601-7613
Identifier
DOI: 10.1109/tcsvt.2024.3377251
Abstract
Existing research on knowledge distillation has primarily concentrated on helping student networks acquire the complete knowledge imparted by teacher networks. However, recent studies have shown that stronger networks are not necessarily better teachers, and that distillation performance is positively correlated with the uncertainty of teacher predictions. Motivated by this finding, this paper analyzes in depth why the teacher network affects distillation performance, involves the student network more actively in the distillation process, and assists the teacher network in distilling the knowledge best suited to the student's learning. On this basis, a novel approach called Collaborative Knowledge Distillation (CKD) is introduced, founded on the principle of "tailoring the teaching to the individual". Compared with the baseline, the proposed method improves student accuracy by an average of 3.42% in CIFAR-100 experiments, and by an average of 1.71% over the classical Knowledge Distillation (KD) method. In ImageNet experiments, the students' Top-1 accuracy improves by 2.04%.
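For context on the classical KD method that the abstract uses as a comparison point, the following is a minimal sketch of the standard temperature-scaled distillation loss (the KL divergence between softened teacher and student distributions, scaled by T², as in Hinton et al.). This illustrates the baseline objective only, not the CKD method proposed in the paper; all function names are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # multiplied by T^2 so gradients keep a consistent scale across T.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Identical logits yield zero distillation loss.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
```

In practice this term is combined with the ordinary cross-entropy loss on ground-truth labels; the abstract's reported gains measure how CKD improves on this baseline objective.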