Computer science
Odometry
Inertial measurement unit
Computer vision
Artificial intelligence
Point cloud
Lidar
Odometer
Robot
Simultaneous localization and mapping
Encoder
Ground truth
Mobile robot
Remote sensing
Geography
Operating system
Authors
Yun Su, Ting Wang, Chen Yao, Zhidong Wang
Identifier
DOI:10.1016/j.robot.2021.103759
Abstract
Simultaneous localization and mapping is a fundamental process in robot navigation. We focus on LiDAR to complete this process for ground robots traveling on complex terrain by proposing GR-LOAM, a method to estimate robot ego-motion by fusing LiDAR, inertial measurement unit (IMU), and encoder measurements in a tightly coupled scheme. First, we derive an odometer increment model that fuses the IMU and encoder measurements to estimate the robot pose variation on a manifold. Then, we apply point cloud segmentation and feature extraction to obtain distinctive edge and planar features. Moreover, we propose an evaluation algorithm for the sensor measurements to detect abnormal data and reduce their corresponding weight during optimization. By jointly optimizing the cost derived from the LiDAR, IMU, and encoder measurements in a local window, we obtain low-drift odometry even on complex terrain. We use the estimated relative pose in the local window to reevaluate the matching distance across features and to remove dynamic objects and outliers, thus refining the features before they are fed to the mapping thread and increasing the mapping efficiency. In the back end, GR-LOAM uses the refined point cloud and tightly couples the IMU and encoder measurements with ground constraints to further refine the estimated pose by aligning the features on a global map. Results from extensive experiments performed in indoor and outdoor environments using a real ground robot demonstrate the high accuracy and robustness of the proposed GR-LOAM for state estimation of ground robots.
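The abstract mentions extracting distinctive edge and planar features from the segmented point cloud. The paper does not reproduce the criterion here, but LOAM-family methods typically classify points by a local smoothness (curvature) score along each scan ring. The sketch below illustrates that general idea only; the function name, thresholds, and neighborhood size `k` are illustrative assumptions, not GR-LOAM's actual parameters.

```python
import numpy as np

def extract_features(scan, k=5, edge_thresh=1.0, planar_thresh=0.1):
    """LOAM-style smoothness score (illustrative sketch, not GR-LOAM's code).

    scan: (N, 3) array of points ordered along one scan ring.
    For each point, sum its difference to k neighbors on each side;
    a large residual means a sharp local change (edge candidate),
    a small one means a locally flat region (planar candidate).
    """
    n = scan.shape[0]
    curvature = np.full(n, np.nan)
    for i in range(k, n - k):
        # 2k * X_i minus the sum of the 2k neighbors along the ring
        diff = (2 * k * scan[i]
                - scan[i - k:i].sum(axis=0)
                - scan[i + 1:i + k + 1].sum(axis=0))
        # Normalize by point range so the score is roughly scale-invariant
        curvature[i] = np.linalg.norm(diff) / (2 * k * np.linalg.norm(scan[i]))
    edge_idx = np.where(curvature > edge_thresh)[0]
    planar_idx = np.where(curvature < planar_thresh)[0]
    return curvature, edge_idx, planar_idx
```

On a ring that runs straight and then turns a corner, the corner point scores high (edge) while points in the middle of each straight segment score near zero (planar), which is the behavior the feature-extraction step relies on before matching.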