Computer science
Distillation
Artificial intelligence
Object detection
Robustness
Feature
Pattern recognition
Weighting
Object
Feature extraction
Benchmark
Machine learning
Computer vision
Chemistry
Philosophy
Organic chemistry
Radiology
Gene
Medicine
Geography
Biochemistry
Linguistics
Geodesy
Authors
Yaoye Song, Peng Zhang, Wei Huang, Yufei Zha, Tao You, Yanning Zhang
Identifier
DOI:10.1016/j.patcog.2023.110235
Abstract
Most knowledge distillation methods for object detection are feature-based and have achieved competitive results. However, distilling only the feature-imitation part does not take full advantage of the more sophisticated detection-head designs used in object detection, especially dense object detection. In this paper, a triple parallel distillation (TPD) is proposed that efficiently transfers all of the output responses in the detection head from teacher to student. Moreover, to overcome the limited gains obtained by simply combining feature-based with response-based distillation, a hierarchical re-weighting attention distillation (HRAD) is proposed, which enables the student to learn more than the teacher in feature information and establishes reciprocal feedback between the classification-IoU joint representation of the detection head and the attention-based features. By jointly exploiting the benefits of TPD and HRAD, a closed-loop unified knowledge distillation for dense object detection is obtained, which makes feature-based and response-based distillation unified and complementary. Experiments on different benchmark datasets show that the proposed method outperforms other state-of-the-art distillation methods for dense object detection in both accuracy and robustness.
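To make the feature-based versus response-based distinction concrete, the following is a minimal sketch of a combined distillation loss: a mean-squared error on intermediate features (feature-based) plus a temperature-softened KL divergence on head logits (response-based). The function names, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions; this is a generic baseline, not the paper's TPD or HRAD formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def combined_distill_loss(t_feat, s_feat, t_logits, s_logits, T=2.0, alpha=0.5):
    """Generic feature + response distillation loss (illustrative only).

    t_feat / s_feat:     teacher / student intermediate feature maps
    t_logits / s_logits: teacher / student detection-head classification logits
    T:                   softening temperature for the response term
    alpha:               weight balancing the two terms
    """
    # Feature-based term: direct imitation of teacher features (MSE).
    feat_loss = np.mean((t_feat - s_feat) ** 2)

    # Response-based term: KL divergence between softened class distributions,
    # scaled by T^2 as is conventional in distillation.
    p_t = softmax(t_logits / T)
    p_s = softmax(s_logits / T)
    resp_loss = np.mean(
        np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    ) * T * T

    return alpha * feat_loss + (1.0 - alpha) * resp_loss
```

When student and teacher agree exactly, both terms vanish; the paper's contribution is, roughly, replacing the naive sum above with TPD on the head responses and HRAD on the attention-weighted features so the two terms reinforce rather than merely add.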