Structured Knowledge Distillation for Accurate and Efficient Object Detection

Keywords: knowledge distillation, pixels, computer science, artificial intelligence, object detection, relations, segmentation, pattern recognition, feature extraction, machine learning, computer vision, data mining
Authors
Linfeng Zhang, Kaisheng Ma
Source
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence [Institute of Electrical and Electronics Engineers]
Volume/Issue: 45 (12): 15706-15724; cited by: 14
Identifier
DOI: 10.1109/tpami.2023.3300470
Abstract

Knowledge distillation, which aims to transfer the knowledge learned by a cumbersome teacher model to a lightweight student model, has become one of the most popular and effective techniques in computer vision. However, many previous knowledge distillation methods were designed for image classification and fail on more challenging tasks such as object detection. In this paper, we first argue that the failure of knowledge distillation on object detection is mainly caused by two factors: (1) the imbalance between foreground and background pixels, and (2) the lack of distillation on the relations among different pixels. We then propose a structured knowledge distillation scheme, comprising attention-guided distillation and non-local distillation, to address these two issues, respectively. Attention-guided distillation uses an attention mechanism to find the crucial pixels of foreground objects and makes the student devote more effort to learning their features. Non-local distillation enables the student to learn not only the features of individual pixels but also the relations between different pixels captured by non-local modules. Experiments demonstrate the effectiveness of our method on thirteen object detection models against twelve comparison methods, for both object detection and instance segmentation. For instance, Faster RCNN with our distillation achieves 43.9 mAP on MS COCO2017, which is 4.1 higher than the baseline. Additionally, we show that our method also benefits the robustness and domain generalization ability of detectors. Code and model weights have been released on GitHub.
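The two losses described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the attention map (channel-wise mean of absolute activations, softmax-normalized over spatial positions), the cosine-similarity pixel-relation matrix standing in for a non-local module, and all function names here are illustrative assumptions; the paper's exact formulation, temperatures, and adaptation layers may differ.

```python
import numpy as np

def spatial_attention(feat, T=0.5):
    """Attention over pixels: mean absolute activation across channels,
    softmax-normalized so the map sums to 1. feat has shape (C, H, W)."""
    a = np.abs(feat).mean(axis=0).ravel()           # (H*W,)
    a = np.exp(a / T)
    a /= a.sum()                                    # softmax over spatial positions
    return a.reshape(feat.shape[1:])                # (H, W)

def attention_guided_loss(f_student, f_teacher):
    """Per-pixel feature MSE weighted by the teacher's attention map, so
    high-attention (foreground) pixels dominate the distillation loss."""
    mask = spatial_attention(f_teacher)             # (H, W)
    per_pixel = ((f_student - f_teacher) ** 2).mean(axis=0)  # (H, W)
    return float((mask * per_pixel).sum())

def relation_matrix(feat):
    """Pairwise pixel relations, in the spirit of a non-local module:
    cosine similarity between every pair of pixel embeddings."""
    C, H, W = feat.shape
    x = feat.reshape(C, H * W).T                    # (H*W, C)
    x = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
    return x @ x.T                                  # (H*W, H*W)

def non_local_loss(f_student, f_teacher):
    """Match the student's pixel-relation matrix to the teacher's."""
    diff = relation_matrix(f_student) - relation_matrix(f_teacher)
    return float((diff ** 2).mean())
```

In training, both losses would be added to the detector's task loss; when identical features are passed as student and teacher, both losses are zero, and the attention map always sums to one.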
