Inertial measurement unit
Computer vision
Artificial intelligence
Sensor fusion
Calibration
Kinematics
Computer science
Forward kinematics
Units of measurement
Motion capture
Posture
Mathematics
Motion (physics)
Robot
Inverse dynamics
Physics
Statistics
Classical mechanics
Quantum mechanics
Source
Journal: IEEE Sensors Journal
[Institute of Electrical and Electronics Engineers]
Date: 2023-02-03
Volume/issue: 23 (6): 6292-6302
Citations: 13
Identifier
DOI: 10.1109/jsen.2023.3241084
Abstract
Upper body kinematics is essential for motor function assessment and robot-assisted rehabilitation training. Wearable sensor systems, such as inertial measurement units (IMUs), provide affordable alternatives to laboratory-based motion capture systems for use in daily life. However, sensor-to-segment calibration often relies on predefined postures or movements, which are hard to perform accurately, particularly for patients with a limited range of motion. A visual–inertial sensor system is presented, which includes three sensor modules attached to the trunk, upper arm, and forearm. Each module has an IMU and an ArUco marker that can be captured by a camera; the driftless orientation of each module is computed by visual–inertial fusion. The sensor-to-segment transformations are calibrated from a period of arbitrary arm movements in either a 2-D plane or 3-D space, simulating the training process assisted by end-effector robots. Experiments were conducted to validate the feasibility and evaluate the accuracy of the proposed method. The estimated shoulder and elbow joint angles correlated well (>0.986) with the ground truth from the optical motion capture (OMC) system. The joint angles presented low root-mean-square errors (RMSEs) (<4°), except for the forearm pronation–supination angle (9.34°), which relied on manual alignment. The sensor system provides a simple and easy-to-use solution for movement assessment during robot-assisted training.
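The abstract's central idea, fusing a fast but drifting inertial orientation with a slower, driftless camera-derived orientation, can be illustrated with a minimal 1-D complementary filter. This is a generic sketch of the fusion principle, not the paper's actual algorithm; all signal values, the filter gain, and the RMSE check below are hypothetical.

```python
import math

def fuse(gyro_rate, cam_angle, dt=0.01, alpha=0.98):
    """Complementary filter: alpha weights the integrated gyro path,
    (1 - alpha) weights the driftless camera path (hypothetical gain)."""
    est = cam_angle[0]
    fused = []
    for w, a in zip(gyro_rate, cam_angle):
        est = alpha * (est + w * dt) + (1 - alpha) * a
        fused.append(est)
    return fused

def rmse(xs, ys):
    """Root-mean-square error between two angle sequences (degrees)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(xs, ys)) / len(xs))

# Synthetic demo (values invented, not from the paper): a sinusoidal joint
# angle, a gyro rate with a constant 1 deg/s bias, and a noisy but
# driftless camera measurement of the same angle.
dt = 0.01
n = 1000
true = [30.0 * math.sin(0.5 * i * dt) for i in range(n)]
rate = [15.0 * math.cos(0.5 * i * dt) + 1.0 for i in range(n)]   # biased gyro
cam = [true[i] + 0.5 * math.sin(37.0 * i) for i in range(n)]     # noisy camera

# Gyro-only dead reckoning accumulates the bias as drift.
drift_only, acc = [], true[0]
for w in rate:
    acc += w * dt
    drift_only.append(acc)

fused = fuse(rate, cam, dt)
```

In this toy setup the gyro-only estimate drifts by roughly the bias times elapsed time, while the fused estimate stays bounded near the true angle, which is the property the abstract calls "driftless orientation."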