Abstract
Vision is a major component of several digital technologies and tools used in agriculture. Object detection plays a pivotal role in digital farming by automating the detection, identification, and localization of various objects in large-scale agrarian landscapes. The single-stage detection algorithm You Only Look Once (YOLO) has gained popularity in agriculture within a relatively short span due to its state-of-the-art performance in terms of accuracy, speed, and network size. YOLO offers real-time detection with good accuracy and has been implemented in various agricultural tasks, including monitoring, surveillance, sensing, automation, and robotic operations. Research on and application of YOLO in agriculture are accelerating rapidly but remain fragmented and multidisciplinary in nature. Moreover, the performance characteristics (i.e., accuracy, speed, and computational cost) of the object detector influence the rate of technology implementation and adoption in agriculture. Therefore, this study aimed to collect extensive literature to document and critically evaluate the advances and applications of YOLO for agricultural object recognition tasks. First, we conducted a bibliometric review of 257 selected articles to understand the scholarly landscape (i.e., research trends, evolution, global hotspots, and gaps) of YOLO in the broad agricultural domain. Second, we conducted a systematic literature review of 30 selected articles to identify current knowledge, critical gaps, and modifications to YOLO for specific agricultural tasks. The study critically assessed and summarized information on YOLO's end-to-end learning approach, including data acquisition, processing, network modification, integration, and deployment. We also discussed task-specific modifications and integrations of YOLO to meet object- and environment-specific challenges in agriculture. In general, YOLO-integrated digital tools and technologies showed potential for real-time, automated monitoring, surveillance, and object handling that can reduce labor, production costs, and environmental impact while maximizing resource efficiency. The study provides detailed documentation and significantly advances the existing knowledge of applying YOLO in agriculture, which can greatly benefit the scientific community. The results open the door for implementing YOLO-based solutions in practical agricultural scenarios and add to the expanding corpus of knowledge on computer vision applications in agriculture.