Sensor fusion
Coordinate system
Artificial intelligence
Calibration
Computer vision
Orientation (vector space)
Computer science
Inertial frame of reference
Algorithm
Mathematics
Geometry
Physics
Quantum mechanics
Statistics
Source
Journal: IEEE Transactions on Instrumentation and Measurement
[Institute of Electrical and Electronics Engineers]
Date: 2016-06-01
Volume/Issue: 65 (6): 1495-1502
Citations: 21
Identifier
DOI: 10.1109/tim.2016.2518418
Abstract
Due to external acceleration interference and magnetic disturbance, inertial/magnetic measurements are usually fused with visual data for drift-free orientation estimation, which plays an important role in a wide variety of applications, ranging from virtual reality, robotics, and computer vision to biomotion analysis and navigation. However, in order to perform data fusion, alignment calibration must be performed in advance to determine the difference between the sensor coordinate system and the camera coordinate system. Since the orientation estimation performance of the inertial/magnetic sensor unit is unaffected by the choice of the inertial/magnetic sensor frame origin, we ignore the translational difference by assuming that the sensor and camera coordinate systems share the same origin, and focus only on the rotational alignment difference in this paper. By exploiting the intrinsic restrictions among the coordinate transformations, the rotational alignment calibration problem is formulated as a simplified hand-eye equation $AX=XB$ ($A$, $X$, and $B$ are all rotation matrices). A two-step iterative algorithm is then proposed to solve this simplified hand-eye calibration task. Detailed laboratory validation has been performed, and the experimental results illustrate the effectiveness of the proposed alignment calibration method.
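The abstract does not detail the paper's two-step iterative algorithm, but the rotation-only hand-eye equation $A_iX = XB_i$ it formulates has a well-known closed-form least-squares solution via the matrix logarithm (Park and Martin, 1994), which the sketch below implements as an illustration. It is not the authors' method; NumPy/SciPy and the synthetic-data helper `rand_rot` are assumptions for the demonstration.

```python
import numpy as np
from scipy.linalg import logm, inv, sqrtm

def log_so3(R):
    """Matrix log of a rotation, returned as a 3-vector (axis * angle)."""
    L = logm(R)  # skew-symmetric for R in SO(3)
    return np.array([L[2, 1], L[0, 2], L[1, 0]]).real

def solve_axxb(As, Bs):
    """Least-squares rotation X with A_i X = X B_i for all pairs.

    Closed-form log-map solution: with alpha_i = log(A_i), beta_i = log(B_i)
    the constraint becomes alpha_i = X beta_i; stacking the outer products
    M = sum_i beta_i alpha_i^T gives X = (M^T M)^(-1/2) M^T.
    At least two motion pairs with non-parallel rotation axes are required.
    """
    M = np.zeros((3, 3))
    for A, B in zip(As, Bs):
        M += np.outer(log_so3(B), log_so3(A))
    return (inv(sqrtm(M.T @ M)) @ M.T).real

def rand_rot(rng):
    """Random rotation from a random axis and an angle in (0.1, 2.0) rad."""
    axis = rng.normal(size=3)
    axis /= np.linalg.norm(axis)
    theta = rng.uniform(0.1, 2.0)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    # Rodrigues' formula
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
```

On noise-free synthetic pairs $B_i = X^{\top}A_iX$, the recovered $X$ matches the ground-truth alignment rotation to machine precision; with real camera/sensor motion pairs the same formula gives the least-squares fit.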