Computer science
Multivariate statistics
Selection (genetic algorithm)
Sample (material)
Distillation
Data mining
Feature selection
Artificial intelligence
Key (lock)
Distortion (music)
Machine learning
Pattern recognition (psychology)
Amplifier
Computer network
Chemistry
Computer security
Organic chemistry
Bandwidth (computing)
Chromatography
Authors
Yiran Yang, Xian Sun, Wenhui Diao, Dongshuo Yin, Zhujun Yang, Xinming Li
Source
Journal: IEEE Transactions on Geoscience and Remote Sensing
[Institute of Electrical and Electronics Engineers]
Date: 2022-01-01
Volume/Issue: 60: 1-14
Citations: 8
Identifier
DOI: 10.1109/tgrs.2022.3192013
Abstract
In recent years, increasing attention has been paid to lightweight detection models in remote sensing (RS), but it is difficult for them to reach performance competitive with deep models. Knowledge distillation has been verified as a promising method that can improve the performance of a lightweight model without extra parameters. There are two key issues in detection distillation: one is sample selection, and the other is knowledge selection. Owing to the varying object sizes and complex features in RS, existing methods based on a fixed threshold are incapable of selecting the optimal distillation samples, and they also ignore the potential multivariate knowledge among RS samples. In this paper, we propose a statistical sample selection and multivariate knowledge mining framework. The statistical sample selection module formulates the task as modeling and splitting the probability distribution of the sample selection cost, which is better suited to dynamically choosing multiscale samples in RS and eliminates the distortion of previous static distillation selection. Furthermore, to mine the complex feature knowledge of RS samples, we design a multivariate knowledge mining module, in which the knowledge includes both explicit and implicit knowledge. The proposed module effectively delivers the core knowledge from the teacher model to the lightweight model. Extensive experiments on three challenging RS datasets (DOTA, NWPU VHR-10, DIOR) show that our method achieves state-of-the-art performance.
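The statistical sample selection idea can be pictured with a small sketch: instead of a fixed threshold, model the distribution of per-sample distillation costs and split it to decide which samples to distill. The Python snippet below is only an illustrative sketch, not the authors' implementation; the function name `select_distillation_samples`, the use of a two-component Gaussian mixture, and the choice to keep the lower-cost component are assumptions made for the example.

```python
# Illustrative sketch (assumption, not the paper's code): pick distillation
# samples by modeling the per-sample cost distribution instead of thresholding.
import numpy as np
from sklearn.mixture import GaussianMixture


def select_distillation_samples(costs: np.ndarray) -> np.ndarray:
    """Return a boolean mask over samples.

    `costs` is assumed to be a 1-D array of per-sample distillation costs
    (e.g., teacher-student feature discrepancies).
    """
    costs = costs.reshape(-1, 1)
    # Model the cost distribution with a two-component mixture and split it.
    gm = GaussianMixture(n_components=2, random_state=0).fit(costs)
    labels = gm.predict(costs)
    # Keep the component with the lower mean cost (illustrative criterion only).
    low_cost_component = int(np.argmin(gm.means_.ravel()))
    return labels == low_cost_component


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic costs: an "easy" cluster and a "hard" cluster of samples.
    costs = np.concatenate([rng.normal(0.2, 0.05, 800), rng.normal(1.0, 0.2, 200)])
    mask = select_distillation_samples(costs)
    print(f"selected {mask.sum()} of {len(costs)} samples")
```

Because the split is re-estimated from the current cost distribution, the selection adapts to the multiscale objects in each batch rather than relying on a static cutoff, which is the behavior the abstract attributes to the statistical sample selection module.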