
Reciprocal Teacher-Student Learning via Forward and Feedback Knowledge Distillation

Authors
Jianmin Gou,Yu Chen,Baosheng Yu,Jinhua Liu,Lan Du,Shaohua Wan,Yi Zhang
Source
Journal: IEEE Transactions on Multimedia [Institute of Electrical and Electronics Engineers]
Volume 26, pp. 7901-7916. Cited by: 33
Identifier
DOI:10.1109/tmm.2024.3372833
Abstract

Knowledge distillation (KD) is a prevalent model compression technique in deep learning, aiming to leverage knowledge from a large teacher model to enhance the training of a smaller student model. It has found success in deploying compact deep models in intelligent applications like intelligent transportation, smart health, and distributed intelligence. Current knowledge distillation methods primarily fall into two categories: offline and online knowledge distillation. Offline methods involve a one-way distillation process, transferring fixed knowledge from teacher to student, while online methods enable the simultaneous training of multiple peer students. However, existing knowledge distillation methods often face challenges: the student may not fully comprehend the teacher's knowledge due to model capacity gaps, and there may be knowledge incongruence among the outputs of multiple students without teacher guidance. To address these issues, we propose a novel reciprocal teacher-student learning scheme, inspired by human teaching and examination, realized through forward and feedback knowledge distillation (FFKD). Forward knowledge distillation operates offline, while feedback knowledge distillation follows an online scheme. The rationale is that feedback knowledge distillation enables the pre-trained teacher model to receive feedback from students, allowing the teacher to refine its teaching strategies accordingly. To achieve this, we introduce a new weighting constraint that gauges the extent of the students' understanding of the teacher's knowledge, which is then utilized to enhance teaching strategies. Experimental results on five visual recognition datasets demonstrate that the proposed FFKD outperforms current state-of-the-art knowledge distillation methods.
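The two distillation directions the abstract describes can be sketched in plain Python. This is a minimal illustration under assumptions: the forward term is the classic temperature-scaled KL distillation loss, while the "understanding" weight `w` and the feedback loss form are hypothetical stand-ins for the paper's actual weighting constraint, which the abstract does not specify.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_div(p, q):
    # KL(p || q): divergence of distribution q from reference p.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def ffkd_losses(teacher_logits, student_logits, T=4.0):
    """Sketch of the two FFKD directions (hypothetical formulation).

    Forward KD (offline): the student matches the teacher's softened outputs.
    Feedback KD (online): a weight derived from the student's current
    agreement with the teacher scales a feedback signal to the teacher.
    """
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    forward = kl_div(p_t, p_s) * T * T  # standard distillation loss
    # Hypothetical understanding weight: low divergence -> w close to 1,
    # i.e. the student has absorbed the teacher's knowledge well.
    w = math.exp(-kl_div(p_t, p_s))
    # Teacher-side feedback term, emphasized when understanding is poor.
    feedback = (1.0 - w) * kl_div(p_s, p_t) * T * T
    return forward, feedback, w
```

When the student's logits match the teacher's exactly, the forward loss is zero and `w` is 1, so no feedback signal is generated; as the gap widens, both the forward loss and the feedback emphasis grow.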