Journal: IEEE Transactions on Industrial Electronics [Institute of Electrical and Electronics Engineers] Date: 2024-05-03 Volume/Issue: 71 (12): 16695-16705 Citations: 6
Identifier
DOI:10.1109/tie.2024.3387089
Abstract
This article proposes a novel deep-learning-based ORB-SLAM-feature filtering framework to monitor early wildfires, detect their occurrence, and estimate their distance through an integrated design of image processing of aerial onboard visual-infrared sensor measurements and real-time navigation of an unmanned aerial vehicle (UAV). The proposed framework uses a DJI ZenMuse H20T onboard sensor, which integrates visual and infrared cameras, mounted on a DJI M300 UAV. It consists of three main functional modules to support early wildfire fighting and management missions: 1) smoke and suspected-flame segmentation based on an attention gate U-Net, which reduces false alarms and provides semantic information; 2) camera pose recovery based on a monocular SLAM algorithm and wildfire spot distance estimation based on a triangulation algorithm. With the estimated wildfire distance, the camera poses, and the global positioning system (GPS) information of the UAV, the suspected wildfire spot can be geo-located; 3) visual-infrared image registration based on a geometry model to suppress false detections and missed segmentations. Finally, independent indoor and outdoor experiments are conducted to verify the effectiveness of the proposed algorithms in the developed framework.
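To illustrate the distance-estimation step described in module 2), the following is a minimal sketch of two-view triangulation of a suspected wildfire spot from camera poses recovered by a monocular SLAM system, using OpenCV's standard linear triangulation. It is not the authors' implementation; the function name, variable names (`K`, `R1`, `t1`, `uv1`, ...), and the synthetic example values are illustrative assumptions. In practice, monocular SLAM poses are scale-ambiguous, so the trajectory would need to be scale-corrected (e.g., with the UAV's GPS track) before the returned distance is metric.

```python
# Minimal sketch (assumed, not the paper's code): triangulate a suspected
# wildfire spot observed in two frames and return its distance to the camera.
import numpy as np
import cv2


def triangulate_spot_distance(K, R1, t1, uv1, R2, t2, uv2):
    """Estimate the 3-D position of an image point seen in two frames and
    return its distance from the second camera centre.

    K        : 3x3 camera intrinsic matrix
    R1, t1   : rotation (3x3) and translation (3,) of frame 1 (world -> camera)
    R2, t2   : rotation (3x3) and translation (3,) of frame 2 (world -> camera)
    uv1, uv2 : pixel coordinates (u, v) of the suspected fire spot
    """
    # Projection matrices P = K [R | t]
    P1 = K @ np.hstack((R1, np.reshape(t1, (3, 1))))
    P2 = K @ np.hstack((R2, np.reshape(t2, (3, 1))))

    # OpenCV expects 2xN arrays of image points
    pts1 = np.asarray(uv1, dtype=np.float64).reshape(2, 1)
    pts2 = np.asarray(uv2, dtype=np.float64).reshape(2, 1)

    # Linear triangulation; result is a homogeneous 4x1 point
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
    X = (X_h[:3] / X_h[3]).ravel()            # Euclidean 3-D point (world frame)

    # Camera centre of frame 2 in world coordinates: C = -R^T t
    C2 = (-R2.T @ np.reshape(t2, (3, 1))).ravel()
    return float(np.linalg.norm(X - C2))      # distance camera -> fire spot


if __name__ == "__main__":
    # Synthetic check: two cameras 1 m apart observing a point 50 m ahead.
    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    R = np.eye(3)
    t1 = np.zeros(3)
    t2 = np.array([-1.0, 0.0, 0.0])           # t = -R C with camera centre (1, 0, 0)
    X = np.array([0.0, 0.0, 50.0, 1.0])       # ground-truth point, homogeneous
    p1 = K @ np.hstack((R, t1.reshape(3, 1))) @ X
    p2 = K @ np.hstack((R, t2.reshape(3, 1))) @ X
    uv1, uv2 = p1[:2] / p1[2], p2[:2] / p2[2]
    print(triangulate_spot_distance(K, R, t1, uv1, R, t2, uv2))  # ~50.01 m
```

With noise-free synthetic observations the recovered distance matches the ground-truth camera-to-point range; with real ORB features, the filtered feature tracks and the baseline provided by the moving UAV determine how well the triangulation conditions the estimate.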