Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation

Keywords: distillation; computer science; correlation; artificial intelligence; machine learning; dual (grammatical number); knowledge transfer; mathematics; chemistry; chromatography; art; knowledge management; geometry; literature
Authors
Fei Ding, Yin Yang, Hongxin Hu, Venkat Krovi, Feng Luo
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems [Institute of Electrical and Electronics Engineers]
Volume/issue: 35 (2): 2425-2435; cited by: 10
Identifier
DOI: 10.1109/tnnls.2022.3190166
Abstract

Knowledge distillation (KD) has become a widely used technique for model compression and knowledge transfer. We find that the standard KD method performs knowledge alignment on individual samples only indirectly, via class prototypes, and neglects the structural knowledge between different samples, namely, knowledge correlation. Although recent contrastive learning-based distillation methods can be decomposed into knowledge alignment and correlation, their correlation objectives undesirably push apart representations of samples from the same class, leading to inferior distillation results. To improve distillation performance, in this work, we propose a novel knowledge correlation objective and introduce dual-level knowledge distillation (DLKD), which explicitly combines knowledge alignment and correlation rather than relying on a single contrastive objective. We show that both knowledge alignment and correlation are necessary for good distillation performance. In particular, knowledge correlation can serve as an effective regularizer for learning generalized representations. The proposed DLKD is task-agnostic and model-agnostic, and enables effective knowledge transfer from supervised or self-supervised pretrained teachers to students. Experiments show that DLKD outperforms other state-of-the-art methods across a wide range of experimental settings, including 1) pretraining strategies, 2) network architectures, 3) datasets, and 4) tasks.
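The abstract does not give the exact loss functions, so the sketch below only illustrates the two objectives it names. In this minimal PyTorch sketch, all names, the cosine/MSE choices, and the weights alpha and beta are assumptions, not the paper's implementation: "alignment" pulls each student embedding toward the teacher embedding of the same sample, while "correlation" matches the teacher's batch-level pairwise similarity structure rather than pushing same-class samples apart as a contrastive objective would.

import torch
import torch.nn.functional as F

def dual_level_kd_loss(f_s, f_t, alpha=1.0, beta=1.0):
    # f_s, f_t: (batch, dim) student / teacher embeddings.
    f_s = F.normalize(f_s, dim=1)
    f_t = F.normalize(f_t, dim=1)
    # Knowledge alignment: per-sample transfer, matching each student
    # embedding to its teacher counterpart (cosine distance).
    align = (1.0 - (f_s * f_t).sum(dim=1)).mean()
    # Knowledge correlation: structural transfer, matching the pairwise
    # similarity matrices computed within the batch.
    sim_s = f_s @ f_s.t()
    sim_t = f_t @ f_t.t()
    corr = F.mse_loss(sim_s, sim_t)
    return alpha * align + beta * corr

# Hypothetical usage (only the student receives gradients):
# loss = dual_level_kd_loss(student(x), teacher(x).detach())

Detaching the teacher output reflects the usual KD setup in which only the student is updated; a non-contrastive correlation term of this kind avoids the same-class repulsion that the abstract identifies as the weakness of contrastive distillation objectives.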