Keywords
Inertial measurement unit
Simultaneous localization and mapping
Computer science
Artificial intelligence
Robustness (evolution)
Computer vision
Benchmark (surveying)
Robot
Mobile robot
Geodesy
Biochemistry
Gene
Chemistry
Geography
Authors
Hesheng Yin, Shaomiao Li, Yu Tao, J. Guo, Bo Huang
Source
Journal: IEEE Transactions on Robotics
[Institute of Electrical and Electronics Engineers]
Date: 2022-08-25
Volume/Issue: 39 (1): 289-308
Cited by: 39
Identifier
DOI: 10.1109/tro.2022.3199087
Abstract
Most existing vision-based simultaneous localization and mapping (SLAM) systems and their variants still assume that the observed scene is strictly static and cannot work well in dynamic environments. Here, we present Dynam-SLAM (Dynam), a stereo visual-inertial SLAM system capable of robust, accurate, and continuous operation in highly dynamic environments. Our approach loosely couples the stereo scene flow with an inertial measurement unit (IMU) for dynamic feature detection and tightly couples the dynamic and static features with the IMU measurements for nonlinear optimization. First, the scene flow uncertainty caused by measurement noise is modeled to derive an accurate motion likelihood for each landmark. Meanwhile, to cope with highly dynamic environments, we additionally construct virtual landmarks based on the detected dynamic features. Then, we build a tightly coupled, nonlinear optimization-based SLAM system that estimates the camera state by fusing IMU measurements and feature observations. Finally, we evaluate the proposed dynamic feature detection module (DFM) and the overall SLAM system on various benchmark datasets. Experimental results show that Dynam is almost unaffected by the DFM and performs well on the static EuRoC datasets. Dynam outperforms current state-of-the-art visual and visual-inertial SLAM implementations in terms of accuracy and robustness on self-collected dynamic datasets. The average absolute trajectory error of Dynam on the dynamic benchmark datasets is $\sim$90% lower than that of VINS-Fusion, $\sim$84% lower than that of ORB-SLAM3, and $\sim$88% lower than that of Kimera.
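The abstract does not spell out how the motion likelihood is computed or thresholded; the sketch below is a hypothetical illustration (not the authors' implementation) of the general idea behind scene-flow-based dynamic feature detection with modeled uncertainty: predict where a static landmark should appear under the IMU-predicted camera motion, propagate the stereo triangulation noise, and flag the feature as dynamic when the scene-flow residual fails a chi-square gate. The function name `is_dynamic`, its arguments, and the noise values are illustrative assumptions.

```python
# Hypothetical sketch, not the authors' code: flag a stereo feature as dynamic
# when its scene-flow residual, after removing the IMU-predicted camera
# ego-motion, is larger than the modeled triangulation noise can explain.
import numpy as np

CHI2_95_3DOF = 7.815  # 95% chi-square quantile for 3 degrees of freedom


def is_dynamic(p_prev, p_curr, R_pred, t_pred, cov_prev, cov_curr,
               gate=CHI2_95_3DOF):
    """Return True if the landmark's motion is unlikely under a static world.

    p_prev, p_curr     : 3-D landmark positions triangulated from the stereo
                         pair at the previous/current frame (camera coords).
    R_pred, t_pred     : camera rotation/translation between the two frames,
                         as predicted by IMU propagation.
    cov_prev, cov_curr : 3x3 covariances of the triangulated points, obtained
                         by propagating pixel and disparity noise.
    """
    p_pred = R_pred @ p_prev + t_pred             # where a static point should be
    r = p_curr - p_pred                           # scene-flow residual
    S = R_pred @ cov_prev @ R_pred.T + cov_curr   # residual covariance
    d2 = r @ np.linalg.solve(S, r)                # squared Mahalanobis distance
    return d2 > gate


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cov = 1e-4 * np.eye(3)                        # ~1 cm triangulation noise
    R, t = np.eye(3), np.array([0.0, 0.0, 0.1])   # camera moved 10 cm forward
    p0 = np.array([1.0, 0.5, 4.0])
    static_p1 = R @ p0 + t + rng.normal(0.0, 1e-2, 3)
    moving_p1 = static_p1 + np.array([0.3, 0.0, 0.0])  # object moved 30 cm
    print("static feature flagged dynamic:",
          is_dynamic(p0, static_p1, R, t, cov, cov))    # expected: False
    print("moving feature flagged dynamic:",
          is_dynamic(p0, moving_p1, R, t, cov, cov))    # expected: True
```

Under these assumptions, the gate naturally widens for distant points whose stereo triangulation covariance is larger, which is the usual motivation for modeling scene-flow uncertainty instead of applying a fixed flow-magnitude threshold.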