Diversified branch fusion for self-knowledge distillation

Keywords: Computer science · Distillation · Fusion · Artificial intelligence · Chemistry · Chromatography · Linguistics · Philosophy
Authors
Zuxiang Long, Fuyan Ma, Bin Sun, Mingkui Tan, Shutao Li
Source
Journal: Information Fusion [Elsevier BV]
Volume 90, pp. 12-22. Cited by: 2
Identifiers
DOI: 10.1016/j.inffus.2022.09.007
Abstract

Knowledge distillation improves the performance of a compact student network by adding supervision from a pre-trained, cumbersome teacher network during training. To avoid the resource cost of acquiring an extra teacher network, self-knowledge distillation designs a multi-branch network architecture in which teacher and student models share layers and are trained collaboratively in a one-stage manner. However, this approach ignores the knowledge of shallow branches and rarely provides diverse knowledge for effective collaboration among branches. To address these two shortcomings, this paper proposes a novel Diversified Branch Fusion approach for Self-Knowledge Distillation (DBFSKD). First, lightweight networks are attached to the middle layers of the backbone; they capture discriminative information through global-local attention. Then, a diversity loss between branches encourages them to explore diverse knowledge. The diverse knowledge is further integrated into two knowledge sources by Selective Feature Fusion (SFF) and Dynamic Logits Fusion (DLF). In this way, the significant knowledge of shallow branches is efficiently utilized, and all branches learn from each other through the fused knowledge sources. Extensive experiments with various backbone structures on four public datasets (CIFAR100, Tiny-ImageNet200, ImageNet, and RAF-DB) show that the proposed method outperforms other methods. More importantly, DBFSKD achieves even better performance with lower resource consumption than the baseline.

Highlights
• A diversified branch fusion approach is proposed for self-knowledge distillation.
• Shallow branches provide complementary information for the deep ones.
• Feature- and logits-level fusion provides a richer knowledge source for distillation.
• A diversity loss encourages the branches to explore diverse knowledge.
• DBFSKD obtains state-of-the-art results in the facial expression recognition application.
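The abstract names two of the key mechanisms, a diversity loss between branches and a Dynamic Logits Fusion (DLF), without giving their formulas. The PyTorch sketch below is only one plausible reading of those two ideas: the pairwise-similarity penalty, the per-sample confidence weighting, and the names diversity_loss and dynamic_logits_fusion are all illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn.functional as F

def diversity_loss(branch_logits, tau=3.0):
    # Hypothetical diversity term: penalize pairwise cosine similarity
    # between the softened predictions of different branches, pushing
    # each branch to explore distinct knowledge.
    probs = [F.softmax(z / tau, dim=1) for z in branch_logits]
    total, pairs = 0.0, 0
    for i in range(len(probs)):
        for j in range(i + 1, len(probs)):
            total = total + F.cosine_similarity(probs[i], probs[j], dim=1).mean()
            pairs += 1
    return total / max(pairs, 1)

def dynamic_logits_fusion(branch_logits):
    # Hypothetical dynamic fusion: weight each branch per sample by the
    # confidence (negative entropy) of its prediction, then sum, so that
    # more confident branches contribute more to the fused "teacher" logits.
    stacked = torch.stack(branch_logits)                     # (branches, batch, classes)
    probs = F.softmax(stacked, dim=2)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(2)  # (branches, batch)
    weights = F.softmax(-entropy, dim=0).unsqueeze(2)        # normalize over branches
    return (weights * stacked).sum(0)                        # (batch, classes)

# Toy usage: three branches, batch of 4, 100 classes (CIFAR100-like).
branch_logits = [torch.randn(4, 100) for _ in range(3)]
fused = dynamic_logits_fusion(branch_logits)

# Distill one branch from the fused logits (standard KD loss, tau^2-scaled).
tau = 3.0
kd = F.kl_div(F.log_softmax(branch_logits[0] / tau, dim=1),
              F.softmax(fused.detach() / tau, dim=1),
              reduction="batchmean") * tau ** 2
loss = kd + 0.1 * diversity_loss(branch_logits)
```

In the paper, the fused knowledge sources supervise all branches; the toy example above distills only one branch to keep the sketch short.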