Inertial measurement unit
LiDAR
Global Positioning System
Computer science
GNSS applications
Precision agriculture
Sensor fusion
Robotics
Kalman filter
Tractor
Remote sensing
Artificial intelligence
Computer vision
Geography
Engineering
Agriculture
Telecommunications
Archaeology
Automotive engineering
Authors
Andrés Eduardo Baquero Velasquez, Vitor Akihiro Hisano Higuti, Mateus V. Gasparino, Arun Narenthiran Sivakumar, Marcelo Becker, Girish Chowdhary
Abstract
This paper presents a state-of-the-art light detection and ranging (LiDAR) based autonomous navigation system for under-canopy agricultural robots. Under-canopy agricultural navigation has been a challenging problem because the global navigation satellite system (GNSS) and other positioning sensors are prone to loss of accuracy due to attenuation and multi-path errors caused by crop leaves and stems. Reactive navigation by detecting crop rows using LiDAR measurements has proved to be an efficient alternative to GNSS. Nevertheless, it presents challenges due to occlusion from leaves under the canopy. Our system addresses these issues by fusing inertial measurement unit (IMU) and LiDAR measurements in a Bayesian framework on low-cost hardware. In addition, a local goal generator (LGG) is introduced to provide a local reference trajectory to the onboard controller. Our system is validated extensively in real-world field environments over a distance of 50.88 km, on multiple robots, in different field conditions, and across different locations. We report leading distance-between-interventions results for LiDAR+IMU-based under-canopy navigation, showing that our system is able to safely navigate without interventions for 386.9 m on average in fields without significant gaps in the crop rows.
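The Bayesian IMU+LiDAR fusion described in the abstract can be pictured as a small Kalman-filter loop: the IMU yaw rate drives the prediction, and heading and lateral offset extracted from LiDAR row detections drive the correction. The Python sketch below is purely illustrative and is not the authors' implementation; the two-element state (heading error and lateral offset relative to the crop row), the noise matrices, and the toy local-goal helper are all assumptions made for this example.

```python
# Illustrative sketch of Kalman-filter fusion of LiDAR row detections with
# IMU yaw rate, plus a toy local-goal helper. All state definitions, noise
# values, and function names are assumptions, not the paper's implementation.
import numpy as np


class LidarImuHeadingFilter:
    """Tracks heading error and lateral offset relative to the crop row."""

    def __init__(self, dt=0.05):
        self.dt = dt
        self.x = np.zeros(2)              # [heading_error (rad), lateral_offset (m)]
        self.P = np.diag([0.1, 0.1])      # initial covariance (assumed)
        self.Q = np.diag([1e-3, 1e-3])    # process noise (assumed)
        self.R = np.diag([5e-2, 2e-2])    # LiDAR measurement noise (assumed)

    def predict(self, gyro_z, forward_speed):
        """Propagate the state with the IMU yaw rate and forward speed."""
        heading, offset = self.x
        heading_new = heading + gyro_z * self.dt
        offset_new = offset + forward_speed * np.sin(heading) * self.dt
        # Jacobian of the motion model, linearized about the current state.
        F = np.array([[1.0, 0.0],
                      [forward_speed * np.cos(heading) * self.dt, 1.0]])
        self.x = np.array([heading_new, offset_new])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, lidar_heading, lidar_offset):
        """Correct with heading/offset from a LiDAR crop-row line fit."""
        z = np.array([lidar_heading, lidar_offset])
        H = np.eye(2)                     # LiDAR observes the state directly
        y = z - H @ self.x                # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ H) @ self.P

    def local_goal(self, lookahead=2.0):
        """Toy local-goal generator: a point `lookahead` metres ahead,
        shifted to cancel the estimated lateral offset (illustrative only)."""
        heading, offset = self.x
        return np.array([lookahead * np.cos(heading),
                         lookahead * np.sin(heading) - offset])


# Example loop: predict at IMU rate, update whenever a LiDAR row fit arrives.
f = LidarImuHeadingFilter(dt=0.05)
f.predict(gyro_z=0.02, forward_speed=1.0)
f.update(lidar_heading=0.05, lidar_offset=-0.10)
print(f.x, f.local_goal())
```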