DCCD: Reducing Neural Network Redundancy via Distillation

Keywords: Computer Science, Redundancy (Engineering), Artificial Intelligence, Distillation, Overfitting, Machine Learning, Artificial Neural Networks, Pattern Recognition
Authors
Yuang Liu, Jun Chen, Yong Liu
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems (IEEE)
Volume/Issue: 35(7): 10006-10017 | Cited by: 1
Identifier
DOI: 10.1109/TNNLS.2023.3238337
Abstract

Deep neural models have achieved remarkable performance on various supervised and unsupervised learning tasks, but deploying these large networks on resource-limited devices remains a challenge. As a representative class of model compression and acceleration methods, knowledge distillation (KD) addresses this problem by transferring knowledge from heavy teachers to lightweight students. However, most distillation methods focus on imitating the responses of teacher networks and ignore the information redundancy of student networks. In this article, we propose a novel distillation framework, difference-based channel contrastive distillation (DCCD), which introduces channel contrastive knowledge and dynamic difference knowledge into student networks to reduce redundancy. At the feature level, we construct an efficient contrastive objective that broadens the student network's feature expression space and preserves richer information during feature extraction. At the final output level, more detailed knowledge is extracted from the teacher network by computing the difference between multi-view augmented responses of the same instance, which makes the student network more sensitive to minor dynamic changes. With these two improvements, the student network gains contrastive and difference knowledge and reduces its overfitting and redundancy. As a result, the student approaches and even outperforms the teacher in test accuracy on CIFAR-100. With ResNet-18, we reduce the top-1 error to 28.16% on ImageNet classification and to 24.15% for cross-model transfer. Empirical experiments and ablation studies on popular datasets show that the proposed method achieves state-of-the-art accuracy compared with other distillation methods.
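
Based only on the abstract above, the sketch below illustrates how a DCCD-style training objective might be assembled: a standard KD term on softened logits, a channel-level contrastive term at the feature level, and a difference term that asks the student to reproduce the teacher's change in response across two augmented views of the same instance. The concrete loss choices (InfoNCE over pooled channel descriptors, MSE on response differences), the `(feature_map, logits)` model interface, and all function names and weights are illustrative assumptions, not the paper's actual formulation.

```python
# Minimal sketch of a DCCD-style loss, written from the abstract only.
# The channel-contrastive and difference terms here are assumed stand-ins
# (InfoNCE and MSE); the paper's exact objectives may differ.
import torch
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, T=4.0):
    """Standard KD term: KL divergence between temperature-softened outputs."""
    p_t = F.softmax(teacher_logits / T, dim=1)
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T * T)


def channel_contrastive_loss(f_s, f_t, tau=0.1):
    """Assumed channel-level contrastive term (InfoNCE).

    f_s, f_t: feature maps of shape (B, C, H, W) with the same channel count
    (a projection layer would be needed otherwise). Each channel is pooled to
    a B-dimensional descriptor; a student channel is pulled toward the
    matching teacher channel and pushed away from the other teacher channels.
    """
    C = f_s.shape[1]
    s = F.normalize(f_s.mean(dim=(2, 3)).t(), dim=1)  # (C, B) student descriptors
    t = F.normalize(f_t.mean(dim=(2, 3)).t(), dim=1)  # (C, B) teacher descriptors
    logits = s @ t.t() / tau                          # (C, C) channel similarities
    labels = torch.arange(C, device=f_s.device)       # matching channel is positive
    return F.cross_entropy(logits, labels)


def difference_loss(s_logits_v1, s_logits_v2, t_logits_v1, t_logits_v2):
    """Assumed dynamic-difference term: match the student's response change
    between two augmented views to the teacher's response change."""
    d_s = F.softmax(s_logits_v1, dim=1) - F.softmax(s_logits_v2, dim=1)
    d_t = F.softmax(t_logits_v1, dim=1) - F.softmax(t_logits_v2, dim=1)
    return F.mse_loss(d_s, d_t)


def dccd_style_loss(student, teacher, x_v1, x_v2, y,
                    alpha=1.0, beta=1.0, gamma=1.0):
    """Cross-entropy + KD + channel-contrastive + difference terms.

    `student(x)` / `teacher(x)` are assumed to return (feature_map, logits);
    this interface and the loss weights are placeholders for illustration.
    """
    f_s1, z_s1 = student(x_v1)
    _,    z_s2 = student(x_v2)
    with torch.no_grad():
        f_t1, z_t1 = teacher(x_v1)
        _,    z_t2 = teacher(x_v2)

    return (F.cross_entropy(z_s1, y)
            + alpha * kd_loss(z_s1, z_t1)
            + beta * channel_contrastive_loss(f_s1, f_t1)
            + gamma * difference_loss(z_s1, z_s2, z_t1, z_t2))
```

In this sketch the two views x_v1 and x_v2 are two random augmentations of the same batch, and the teacher runs under no_grad; how the real method weights and schedules the three knowledge terms is not specified in the abstract.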