Computer science
Convolutional neural network
Deep learning
Artificial intelligence
Transfer learning
Fire detection
Artificial neural network
Object detection
Architecture
Real-time computing
Pattern recognition (psychology)
Thermodynamics
Physics
Art
Visual arts
Authors
Rahmi Arda Aral,Cemil Zalluhoğlu,Ebru Akçapınar Sezer
Identifier
DOI:10.1080/01431161.2023.2255349
Abstract
Unmanned aerial vehicles (UAVs) are invaluable technologies owing to their remote control and monitoring capabilities. Convolutional neural networks (CNNs), known for their strong pattern recognition capabilities, are well suited to forest fire detection with UAVs. Deep convolutional neural networks achieve substantial performance on hardware with high processing capability. While such networks can be operated for UAVs controlled from ground control stations equipped with GPU-supported hardware, execution on a typical UAV's limited computational resources necessitates lightweight, small-sized networks. To overcome these impediments, this article presents a lightweight, attention-based approach for forest fire detection using UAV vision data (images acquired by cameras mounted on UAVs). The paper also presents a comprehensive comparison of different approaches, including transfer learning, deep CNNs, and lightweight CNNs. Among the experimented models, the attention-based model with an EfficientNetB0 backbone emerged as the most successful architecture for forest fire detection. With a test accuracy of 92.02%, an F1-score of 92.08%, a recall of 92.02%, and a precision of 92.66%, the results strongly support the effectiveness of the EfficientNetB0-based model in wildfire recognition. Moreover, the network has fewer parameters than the other networks evaluated, which demonstrates its suitability for wildfire detection on UAVs with limited hardware resources.
KEYWORDS: UAV imagery; deep learning; wildfire detection; transfer learning; convolutional neural networks
Disclosure statement: No potential conflict of interest was reported by the author(s).
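The abstract describes an attention-augmented EfficientNetB0 backbone fine-tuned for binary fire / no-fire classification of UAV imagery. The following is a minimal Keras sketch of that general idea, not the authors' implementation: the squeeze-and-excitation-style channel attention block, the input size, and the head layers are assumptions chosen for illustration.

```python
# Minimal sketch (not the authors' code): EfficientNetB0 backbone plus a
# squeeze-and-excitation-style channel-attention block and a small binary
# classification head for fire / no-fire UAV images.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import EfficientNetB0

def build_fire_classifier(input_shape=(224, 224, 3)):
    # Pretrained EfficientNetB0 backbone (ImageNet weights, no top classifier).
    backbone = EfficientNetB0(include_top=False, weights="imagenet",
                              input_shape=input_shape)
    x = backbone.output  # feature map, e.g. (7, 7, 1280)

    # Assumed attention block: squeeze-and-excitation channel reweighting.
    channels = x.shape[-1]
    se = layers.GlobalAveragePooling2D()(x)              # squeeze to per-channel stats
    se = layers.Dense(channels // 16, activation="relu")(se)
    se = layers.Dense(channels, activation="sigmoid")(se)
    se = layers.Reshape((1, 1, channels))(se)
    x = layers.Multiply()([x, se])                        # excite: scale feature channels

    # Lightweight head for the binary wildfire recognition task.
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)
    output = layers.Dense(1, activation="sigmoid")(x)

    model = models.Model(backbone.input, output)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_fire_classifier()
model.summary()
```

Keeping the backbone pretrained and adding only a small attention and classification head is one way to keep the parameter count low, which is the property the abstract highlights for deployment on resource-constrained UAV hardware.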