DOI: 10.1016/j.neucom.2023.127060
Abstract
Knowledge distillation (KD) is a method that transfers information from a larger network (the teacher) to a smaller network (the student) to obtain stronger performance without extra computational load. It has achieved great success in image classification but attains only trivial improvements in object detection. In this paper, we propose a knowledge distillation scheme for object detection with Inconsistency-Based Feature Imitation (IBFI) and Global Relation Imitation (GRI). IBFI calculates the difference between the outputs of the classification head and the regression head to balance the classification and localization abilities of the detector. GRI enables the student to mimic the teacher's relational information. Extensive experiments have been conducted on popular datasets including MS COCO and PASCAL VOC to validate the effectiveness of our scheme. We achieve 39.9% mAP on MS COCO with RetinaNet + ResNet-50, which surpasses the baseline by 2.5%.
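The abstract only names the two loss components, so the PyTorch sketch below shows one plausible way such losses are commonly constructed. Everything in it is an illustrative assumption rather than the paper's actual formulation: the function names `ibfi_loss` and `gri_loss`, the use of an absolute classification/regression score gap as a per-location imitation weight, and the cosine-similarity relation matrix are all hypothetical.

```python
# Hedged sketch of the two distillation terms described in the abstract.
# Assumptions (not from the paper): cls_out and reg_out are per-location
# quality scores of shape (N, 1, H, W); features are (N, C, H, W).

import torch
import torch.nn.functional as F


def ibfi_loss(student_feat, teacher_feat, cls_out, reg_out):
    """Inconsistency-Based Feature Imitation (assumed form).

    The cls/reg disagreement at each location is turned into a weight map,
    so feature imitation focuses on locations where classification and
    localization quality are inconsistent.
    """
    inconsistency = (cls_out - reg_out).abs()              # (N, 1, H, W)
    weight = inconsistency / (inconsistency.sum() + 1e-6)  # normalize to sum 1
    # Weighted L2 imitation of teacher features (broadcasts over channels).
    return (weight * (student_feat - teacher_feat) ** 2).sum()


def gri_loss(student_feat, teacher_feat):
    """Global Relation Imitation (assumed form).

    Models "relational information" as a pairwise cosine-similarity matrix
    over spatial positions, matched between student and teacher.
    """
    def relation(feat):
        n, c, h, w = feat.shape
        x = feat.flatten(2).transpose(1, 2)    # (N, H*W, C)
        x = F.normalize(x, dim=-1)             # unit-norm per position
        return x @ x.transpose(1, 2)           # (N, H*W, H*W) similarities

    return F.mse_loss(relation(student_feat), relation(teacher_feat))
```

In training, terms like these would typically be added to the detector's regular loss with the teacher frozen, e.g. `loss = det_loss + alpha * ibfi + beta * gri`, where the weighting coefficients are likewise assumptions here.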