Research on knowledge distillation algorithm based on Yolov5 attention mechanism

Keywords: computer science, feature, distillation, artificial intelligence, machine learning, representation, algorithm, pattern recognition
Authors
Shengjie Cheng, Peiyong Zhou, Yu Liu, Hongji Ma, Alimjan Aysa, Kurban Ubul
Source
Journal: Expert Systems With Applications [Elsevier]
Volume 240, Article 122553 · Cited by: 6
Identifier
DOI: 10.1016/j.eswa.2023.122553
Abstract

The most advanced current CNN-based detection models are largely undeployable on mobile devices with limited computing power because of redundant parameters and high computational cost; knowledge distillation, a practical model-compression approach, can alleviate this limitation. Past feature-based knowledge distillation algorithms focused on transferring hand-picked local features, weakening the model's grasp of the global information in images. To address these shortcomings of traditional feature distillation, we first improve GAMAttention to learn global feature representations in images; the improved attention mechanism minimizes the information loss incurred when processing features. Second, feature transfer no longer requires manually defining which features should be transferred: we propose a more interpretable approach in which the student network learns to emulate the high-response feature regions predicted by the teacher network. This makes the model more end-to-end, and the transferred features allow the student network to mimic the teacher in generating semantically strong feature maps, improving the detection performance of the small model. To avoid learning too many noisy features from the background, these two parts of the feature distillation are assigned different weights. Finally, logit distillation is performed on the prediction heads of the student and teacher networks. In our experiments, we chose Yolov5 as the base network structure for the teacher-student pair. We improved Yolov5s through attention and knowledge distillation, ultimately achieving a 1.3% performance gain on VOC and a 1.8% performance gain on KITTI.
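The abstract stops at a high-level description, so the following sketches are reconstructions under stated assumptions, not the authors' code. First, a minimal PyTorch version of the weighted foreground/background feature distillation: the student imitates the teacher's feature maps, with the teacher's high-response regions (here approximated by thresholding its mean activation, an illustrative choice) weighted more heavily than the noisier background. The 1×1 alignment convolution and the default weights are likewise assumptions.

```python
# Hypothetical sketch of weighted feature distillation; the mask construction,
# alignment conv, and fg/bg weights are assumptions, not the paper's code.
import torch
import torch.nn as nn


class MaskedFeatureDistillLoss(nn.Module):
    def __init__(self, student_channels, teacher_channels,
                 fg_weight=1.0, bg_weight=0.1):
        super().__init__()
        # Project student features to the teacher's channel width (assumption).
        self.align = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)
        self.fg_weight = fg_weight  # weight for high-response (foreground) regions
        self.bg_weight = bg_weight  # smaller weight so noisy background is imitated less

    def forward(self, feat_s, feat_t):
        feat_s = self.align(feat_s)
        feat_t = feat_t.detach()  # teacher is frozen during distillation
        b, c, h, w = feat_t.shape
        # "High-response regions predicted by the teacher": here, locations whose
        # mean absolute activation exceeds the per-image average.
        response = feat_t.abs().mean(dim=1, keepdim=True)   # (b, 1, h, w)
        thresh = response.mean(dim=(2, 3), keepdim=True)
        fg_mask = (response > thresh).float()
        bg_mask = 1.0 - fg_mask
        diff = (feat_s - feat_t) ** 2                        # (b, c, h, w)
        loss_fg = (diff * fg_mask).sum() / (fg_mask.sum() * c).clamp(min=1)
        loss_bg = (diff * bg_mask).sum() / (bg_mask.sum() * c).clamp(min=1)
        return self.fg_weight * loss_fg + self.bg_weight * loss_bg
```

Here `feat_s` and `feat_t` would be intermediate feature maps taken from matching stages of the Yolov5 student and teacher. The head-level logit distillation can likewise be sketched as a standard temperature-scaled KL divergence between the two networks' class predictions; the abstract confirms that distillation is applied to the prediction heads but not its exact form, so the temperature and reduction below are assumptions.

```python
# Generic temperature-scaled KL sketch for logit distillation on the heads;
# T and the reduction are assumed, not the paper's reported settings.
import torch.nn.functional as F


def logit_distill_loss(logits_s, logits_t, T=4.0):
    """Match the student's softened class distribution to the teacher's."""
    p_t = F.softmax(logits_t.detach() / T, dim=-1)
    log_p_s = F.log_softmax(logits_s / T, dim=-1)
    # T**2 keeps gradient magnitudes comparable to the hard-label loss
    # (standard practice following Hinton et al., 2015).
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T * T)
```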