Inertial measurement unit
Computer vision
Artificial intelligence
Gyroscope
Computer science
Calibration
Kalman filter
Sensor fusion
Metric (unit)
Simultaneous localization and mapping
Extended Kalman filter
Observability
Accelerometer
Robot
Mobile robot
Engineering
Mathematics
Statistics
Operations management
Operating system
Applied mathematics
Aerospace engineering
Authors
Jonathan Kelly,Gaurav S. Sukhatme
Identifiers
DOI: 10.1177/0278364910382802
Abstract
Visual and inertial sensors, in combination, are able to provide accurate motion estimates and are well suited for use in many robot navigation tasks. However, correct data fusion, and hence overall performance, depends on careful calibration of the rigid body transform between the sensors. Obtaining this calibration information is typically difficult and time-consuming, and normally requires additional equipment. In this paper we describe an algorithm, based on the unscented Kalman filter, for self-calibration of the transform between a camera and an inertial measurement unit (IMU). Our formulation rests on a differential geometric analysis of the observability of the camera-IMU system; this analysis shows that the sensor-to-sensor transform, the IMU gyroscope and accelerometer biases, the local gravity vector, and the metric scene structure can be recovered from camera and IMU measurements alone. While calibrating the transform we simultaneously localize the IMU and build a map of the surroundings, all without additional hardware or prior knowledge about the environment in which a robot is operating. We present results from simulation studies and from experiments with a monocular camera and a low-cost IMU, which demonstrate accurate estimation of both the calibration parameters and the local scene structure.
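To make the filtering idea in the abstract concrete, the sketch below shows a generic unscented-transform prediction step applied to a simplified IMU state. It is not the authors' implementation: the reduced state layout (position, velocity, accelerometer bias, gravity), the omission of orientation, gyro bias, the camera-IMU transform, and landmarks, and the helper names (sigma_points, unscented_transform, imu_process) and noise values are all illustrative assumptions.

```python
# Minimal sketch of an unscented-transform prediction step, in the spirit of
# UKF-based camera-IMU self-calibration. Simplified and assumed, not the paper's code.
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate 2n+1 scaled sigma points and their mean/covariance weights."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)           # columns are scaled deviations
    pts = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 0.5 / (n + lam))
    wm[0] = lam / (n + lam)
    wc = wm.copy()
    wc[0] += 1.0 - alpha**2 + beta
    return np.array(pts), wm, wc

def unscented_transform(pts, wm, wc, f):
    """Propagate sigma points through f and recover the predicted mean and covariance."""
    Y = np.array([f(p) for p in pts])
    mean = wm @ Y
    diff = Y - mean
    cov = (wc[:, None] * diff).T @ diff
    return mean, cov

# Assumed reduced state: [p (3), v (3), b_a (3), g (3)] -- IMU position, velocity,
# accelerometer bias, and local gravity. The full filter described in the paper also
# carries orientation, gyro bias, the camera-IMU transform, and the scene landmarks.
def imu_process(x, a_meas, dt):
    p, v, b_a, g = x[0:3], x[3:6], x[6:9], x[9:12]
    a = a_meas - b_a + g                              # bias-corrected specific force plus gravity
    return np.concatenate([p + v * dt + 0.5 * a * dt**2, v + a * dt, b_a, g])

x = np.zeros(12)
x[9:12] = [0.0, 0.0, -9.81]                           # assumed nominal gravity in the world frame
P = np.eye(12) * 0.01
a_meas = np.array([0.1, 0.0, 9.81])                   # one accelerometer sample (made up)
pts, wm, wc = sigma_points(x, P)
x_pred, P_pred = unscented_transform(pts, wm, wc, lambda s: imu_process(s, a_meas, 0.01))
print(x_pred[:3])                                     # predicted IMU position after one 10 ms step
```

In a full camera-IMU self-calibration filter, the same unscented transform would also be applied to the camera measurement model, so that feature observations correct the calibration parameters, biases, gravity, and map jointly with the IMU pose.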