Artificial intelligence
Computer science
Odometry
Computer vision
Inertial frame of reference
Inertial measurement unit
Visual odometry
Gyroscope
Robotics
Probabilistic logic
Nonlinear programming
Kalman filter
Simultaneous localization and mapping
Accelerometer
Nonlinear system
Robot
Mobile robot
Engineering
Physics
Quantum mechanics
Aerospace engineering
Operating system
Authors
Stefan Leutenegger,Simon Lynen,Michael Bosse,Roland Siegwart,Paul Furgale
Identifiers
DOI:10.1177/0278364914554813
Abstract
Combining visual and inertial measurements has become popular in mobile robotics, since the two sensing modalities offer complementary characteristics that make them the ideal choice for accurate visual–inertial odometry or simultaneous localization and mapping (SLAM). While historically the problem has been addressed with filtering, advancements in visual estimation suggest that nonlinear optimization offers superior accuracy, while still being tractable in complexity thanks to the sparsity of the underlying problem. Taking inspiration from these findings, we formulate a rigorously probabilistic cost function that combines reprojection errors of landmarks and inertial terms. The problem is kept tractable, thus ensuring real-time operation, by limiting the optimization to a bounded window of keyframes through marginalization. Keyframes may be spaced in time by arbitrary intervals, while still being related by linearized inertial terms. We present evaluation results on complementary datasets recorded with our custom-built stereo visual–inertial hardware that accurately synchronizes accelerometer and gyroscope measurements with imagery. A comparison of both stereo and monocular versions of our algorithm, with and without online extrinsics estimation, is shown with respect to ground truth. Furthermore, we compare the performance to an implementation of a state-of-the-art stochastic cloning sliding-window filter. This competitive reference implementation performs tightly coupled filtering-based visual–inertial odometry. While our approach admittedly demands more computation, we show its superior performance in terms of accuracy.
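The cost described in the abstract combines weighted landmark reprojection errors with inertial error terms over a bounded window of keyframes. As a rough sketch only (the notation is assumed here for illustration and is not taken from this page: camera index i, keyframe index k, landmark index j, reprojection errors e_r, inertial error terms e_s, and information matrices W given by inverse measurement covariances), such a cost can be written as

J(\mathbf{x}) \;=\; \sum_{i}\sum_{k}\sum_{j \in \mathcal{J}(i,k)} {\mathbf{e}_r^{i,j,k}}^{\top}\, \mathbf{W}_r^{i,j,k}\, \mathbf{e}_r^{i,j,k} \;+\; \sum_{k=1}^{K-1} {\mathbf{e}_s^{k}}^{\top}\, \mathbf{W}_s^{k}\, \mathbf{e}_s^{k} ,

where weighting each residual by its inverse covariance makes minimizing J equivalent to maximizing a Gaussian measurement likelihood, which is what makes the formulation probabilistic; the bounded keyframe window is maintained by marginalizing out older states from the linearized system (e.g. via a Schur complement) rather than simply discarding them.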