Remote sensing
Infrared
Straw
Environmental science
Fusion
Sensor fusion
Information fusion
Computer science
Artificial intelligence
Agronomy
Geography
Optics
Physics
Biology
Linguistics
Philosophy
Authors
Hao Wen, Xikun Hu, Ping Zhong
Identifier
DOI: 10.1016/j.compag.2024.109078
Abstract
Burning agricultural straw after reaping is a typical farming approach for quicker crop rotation but compromises the sustainability of cultivation practices. Unmanned aerial vehicle (UAV) remote sensing techniques are regarded as a feasible coping strategy for the sustainability dilemmas confronted in this domain. In this paper, we evaluate current multisensor information fusion technologies, including both traditional methods and deep learning approaches, for remote monitoring in real agricultural scenes, and investigate their applicability for detecting burning rice straw. To this end, we collect StrawBurning, a real-world dataset containing fully paired infrared and visible images, through mapping ground scenes based on UAV remote sensing. Furthermore, we propose a novel multiscale contrast adaptation (MCA) method for efficient multisensor image fusion and accurate straw burning detection in real farmland scenarios. The MCA remarkably enhances the detection performance of YOLOv5 on the StrawBurning data by approximately 3%, 2%, and 5% in terms of recall, mAP@0.5, and mAP@0.5:0.95, respectively. Our proposed method thus achieves superior performance compared to single-modal information and other advanced multisensor information fusion methods. Experimental results indicate that neural network models utilizing the proposed multisensor fusion data show the potential to accurately detect burning rice straw.
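The abstract does not describe the internals of the MCA method. As a hypothetical illustration of the general idea behind contrast-driven fusion of paired infrared and visible imagery (not the authors' actual algorithm; all function names here are invented for illustration), a minimal NumPy sketch could weight each pixel by its local contrast in the two modalities:

```python
import numpy as np

def local_contrast(img, k=3):
    """Local contrast proxy: |pixel - local mean| over a k x k window.

    Naive box-filter implementation kept simple for clarity.
    """
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    mean = np.zeros((h, w), dtype=float)
    for dy in range(k):
        for dx in range(k):
            mean += padded[dy:dy + h, dx:dx + w]
    mean /= k * k
    return np.abs(img - mean)

def contrast_weighted_fusion(ir, vis, eps=1e-8):
    """Fuse an infrared and a visible image (same shape, float in [0, 1]).

    Each pixel of the fused image is a convex combination of the two
    inputs, weighted toward whichever modality has higher local contrast
    there (e.g., a hot burning region in the infrared channel).
    """
    c_ir = local_contrast(ir)
    c_vis = local_contrast(vis)
    w_ir = c_ir / (c_ir + c_vis + eps)   # per-pixel weight in [0, 1]
    return w_ir * ir + (1.0 - w_ir) * vis
```

In a detection pipeline such as the one described, a fused image like this would then be fed to the detector (YOLOv5 in the paper) in place of either single-modal input; the paper's MCA additionally operates at multiple scales, which this single-scale sketch omits.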