Computer science
Distillation
Detector
Artificial intelligence
Object detection
Object (grammar)
Machine learning
Noise (video)
Pattern recognition (psychology)
Image (mathematics)
Telecommunications
Chemistry
Organic chemistry
Authors
Yiran Yang, Xian Sun, Wenhui Diao, Hao Li, Youming Wu, Xinming Li, Kun Fu
Source
Journal: IEEE Transactions on Geoscience and Remote Sensing
[Institute of Electrical and Electronics Engineers]
Date: 2022-01-01
Volume/Issue: 60: 1-15
Citations: 47
Identifier
DOI: 10.1109/tgrs.2022.3175213
Abstract
Lightweight object detectors are currently gaining popularity in remote sensing. In general, it is hard for lightweight detectors to achieve performance competitive with traditional deep models, and knowledge distillation is a promising training method to tackle this issue. Because the background is more complicated and object sizes vary extremely in remote sensing images, directly applying existing knowledge distillation methods introduces a lot of noise and degrades training performance. To tackle these problems, we propose an Adaptive Reinforcement Supervision Distillation (ARSD) framework to improve the detection capability of the lightweight model. First, we put forward a multiscale core features imitation (MCFI) module for transferring feature-level knowledge, which adaptively selects the multiscale core features of objects for distillation and, through an area-weighted strategy, focuses more on the features of small objects. In addition, a strict supervision regression distillation (SSRD) module is designed to select the optimal regression results for distillation, which helps the student effectively imitate the more precise regression output of the teacher network. Extensive experiments on the DOTA, DIOR, and NWPU VHR-10 datasets show that ARSD outperforms existing state-of-the-art distillation methods. Moreover, a lightweight model trained with our method surpasses other classic heavy and lightweight detectors, which benefits the development of lightweight models.
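The abstract names two distillation components: area-weighted feature imitation that emphasizes small objects (MCFI) and selective regression distillation that only transfers the teacher's better regression results (SSRD). The PyTorch sketch below illustrates these two general ideas under stated assumptions; the function names, the inverse-area weighting rule, and the "teacher beats student" selection criterion are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F


def area_weighted_imitation_loss(student_feat, teacher_feat, gt_boxes, stride):
    """Masked feature-imitation loss that up-weights small objects.

    student_feat, teacher_feat: (N, C, H, W) feature maps from one FPN level
    (assumed already matched in channels, e.g. via a 1x1 adapter on the student).
    gt_boxes: list of (num_obj_i, 4) tensors, (x1, y1, x2, y2) in image pixels.
    stride: downsampling factor of this feature level.
    """
    n, _, h, w = teacher_feat.shape
    weight = teacher_feat.new_zeros((n, 1, h, w))
    for i, boxes in enumerate(gt_boxes):
        for box in boxes:
            x1, y1, x2, y2 = (box / stride).tolist()
            x1, y1 = max(int(x1), 0), max(int(y1), 0)
            x2, y2 = min(int(x2) + 1, w), min(int(y2) + 1, h)
            if x2 <= x1 or y2 <= y1:
                continue
            # Inverse-area weighting: smaller objects contribute more per pixel.
            area = float((x2 - x1) * (y2 - y1))
            patch = weight[i, 0, y1:y2, x1:x2]
            weight[i, 0, y1:y2, x1:x2] = torch.maximum(patch, patch.new_tensor(1.0 / area))
    if weight.sum() == 0:
        return (student_feat - teacher_feat).sum() * 0.0
    diff = (student_feat - teacher_feat) ** 2
    return (diff * weight).sum() / weight.sum().clamp(min=1e-6)


def strict_regression_distill(student_reg, teacher_reg, gt_reg):
    """Distil the teacher's box regression only where it beats the student.

    All inputs: (num_samples, 4) box offsets for positive samples.
    """
    t_err = (teacher_reg - gt_reg).abs().sum(dim=-1)
    s_err = (student_reg - gt_reg).abs().sum(dim=-1)
    mask = (t_err < s_err).float()  # keep locations where the teacher is strictly better
    if mask.sum() == 0:
        return student_reg.sum() * 0.0
    per_sample = F.smooth_l1_loss(student_reg, teacher_reg, reduction="none").sum(dim=-1)
    return (per_sample * mask).sum() / mask.sum()


# Toy usage with random tensors and a single ground-truth box.
if __name__ == "__main__":
    t_feat = torch.randn(1, 256, 64, 64)
    s_feat = torch.randn(1, 256, 64, 64)
    boxes = [torch.tensor([[40.0, 40.0, 120.0, 120.0]])]
    print(area_weighted_imitation_loss(s_feat, t_feat, boxes, stride=8))

    gt = torch.randn(16, 4)
    student = gt + 0.5 * torch.randn(16, 4)   # noisier student predictions
    teacher = gt + 0.1 * torch.randn(16, 4)   # more accurate teacher predictions
    print(strict_regression_distill(student, teacher, gt))
```

In this sketch, both losses are gated: the feature loss by a ground-truth-derived spatial mask, and the regression loss by a per-sample comparison against the ground truth, so the student is never pushed toward teacher outputs that are worse than its own.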