Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation

Authors
Fei Ding, Yin Yang, Hongxin Hu, Venkat Krovi, Feng Luo
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems [Institute of Electrical and Electronics Engineers]
Volume/Issue: 35 (2): 2425-2435. Cited by: 9
Identifier
DOI: 10.1109/tnnls.2022.3190166
Abstract

Knowledge distillation (KD) has become a widely used technique for model compression and knowledge transfer. We find that the standard KD method performs knowledge alignment on individual samples only indirectly, via class prototypes, and neglects the structural knowledge between different samples, namely, knowledge correlation. Although recent contrastive learning-based distillation methods can be decomposed into knowledge alignment and correlation, their correlation objectives undesirably push apart representations of samples from the same class, leading to inferior distillation results. To improve the distillation performance, we propose a novel knowledge correlation objective and introduce dual-level knowledge distillation (DLKD), which explicitly combines knowledge alignment and correlation rather than relying on a single contrastive objective. We show that both knowledge alignment and correlation are necessary to improve distillation performance. In particular, knowledge correlation can serve as an effective regularizer for learning generalized representations. The proposed DLKD is task-agnostic and model-agnostic, and enables effective knowledge transfer from supervised or self-supervised pretrained teachers to students. Experiments show that DLKD outperforms other state-of-the-art methods across a wide range of experimental settings, including: 1) pretraining strategies; 2) network architectures; 3) datasets; and 4) tasks.
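Since only the abstract is available here, the following is a minimal PyTorch sketch of how a dual-level objective of this kind could look. It assumes, hypothetically, that the alignment term pulls each student feature toward the teacher feature of the same sample and that the correlation term matches batch-wise similarity matrices; the function name dual_level_kd_loss and the weights alpha and beta are illustrative, and the paper's actual loss functions may differ.

import torch
import torch.nn.functional as F

def dual_level_kd_loss(f_s, f_t, alpha=1.0, beta=1.0):
    # f_s, f_t: (batch, dim) student and teacher features; the teacher
    # features are assumed to be precomputed without gradients.
    f_s = F.normalize(f_s, dim=1)
    f_t = F.normalize(f_t, dim=1)

    # Knowledge alignment (per-sample level): pull each student
    # embedding toward the teacher embedding of the same input.
    align = (1.0 - (f_s * f_t).sum(dim=1)).mean()

    # Knowledge correlation (structural level): match the pairwise
    # similarity matrices across the batch, preserving relations
    # between different samples without pushing apart same-class
    # pairs the way a contrastive objective would.
    sim_s = f_s @ f_s.t()
    sim_t = f_t @ f_t.t()
    corr = F.mse_loss(sim_s, sim_t)

    return alpha * align + beta * corr

In a training loop, this term would typically be added to the usual cross-entropy loss on the student's logits, with the teacher features computed under torch.no_grad().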
