Computer science
Computer vision
Artificial intelligence
Position (finance)
Robotics
Point cloud
Bridge (graph theory)
Point (geometry)
Dynamics (music)
Perception
Biology
Economics
Internal medicine
Acoustics
Neuroscience
Finance
Mathematics
Geometry
Physics
Medicine
Authors
Fei Liu, Zihan Li, Yunhai Han, Jingpei Lu, Florian Richter, Michael C. Yip
Identifiers
DOI: 10.1109/icra48506.2021.9561177
Abstract
Autonomy in robotic surgery is very challenging in unstructured environments, especially when interacting with deformable soft tissues. The main difficulty is to generate model-based control methods that account for deformation dynamics during tissue manipulation. Previous work in vision-based perception can capture geometric changes within the scene; however, model-based controllers that integrate dynamic properties, a more accurate and safer approach, have not been studied before. Considering the mechanical coupling between the robot and the environment, it is crucial to develop a registered, simulated dynamical model. In this work, we propose an online, continuous, real-to-sim registration method to bridge 3D visual perception with position-based dynamics (PBD) modeling of tissues. The PBD method is employed to simulate soft-tissue dynamics as well as rigid tool interactions for model-based control. Meanwhile, a vision-based strategy is used to generate 3D reconstructed point cloud surfaces from real-world manipulation, so as to register and update the simulation. To verify this real-to-sim approach, tissue experiments were conducted on the da Vinci Research Kit. Our real-to-sim approach successfully reduces registration error online, which is especially important for safety during autonomous control. Moreover, it achieves higher accuracy in occluded areas than fusion-based reconstruction.
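The abstract combines two ideas: a position-based dynamics (PBD) simulation of the deformable tissue and an online real-to-sim correction that keeps that simulation registered to a reconstructed point cloud. The sketch below is not the authors' implementation; it is a minimal, generic illustration under stated assumptions (equal-mass particles, distance constraints only, and a simple nearest-neighbour blending rule standing in for the paper's registration method). All function names and parameters are hypothetical.

```python
import numpy as np

def pbd_step(x, v, edges, rest_len, dt=0.01, iters=10, gravity=(0.0, 0.0, -9.81)):
    """One generic PBD step: predict positions, project distance constraints,
    then recover velocities from the corrected positions."""
    g = np.asarray(gravity)
    p = x + dt * v + dt * dt * g              # explicit position prediction
    for _ in range(iters):                     # Gauss-Seidel constraint projection
        for (i, j), l0 in zip(edges, rest_len):
            d = p[i] - p[j]
            dist = np.linalg.norm(d)
            if dist < 1e-9:
                continue
            corr = 0.5 * (dist - l0) * d / dist   # equal masses assumed
            p[i] -= corr
            p[j] += corr
    v_new = (p - x) / dt                       # velocities from position change
    return p, v_new

def register_to_observation(p, surface_idx, observed_pts, alpha=0.3):
    """Illustrative real-to-sim correction: blend each simulated surface
    particle toward its nearest point in the observed point cloud."""
    for i in surface_idx:
        nn = observed_pts[np.argmin(np.linalg.norm(observed_pts - p[i], axis=1))]
        p[i] = (1.0 - alpha) * p[i] + alpha * nn
    return p

# Tiny usage example: two particles joined by one distance constraint,
# plus a fake "observed" point cloud for the surface particle.
x = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])
v = np.zeros_like(x)
edges, rest_len = [(0, 1)], [0.1]
observed = np.array([[0.1, 0.0, -0.02]])

x, v = pbd_step(x, v, edges, rest_len)
x = register_to_observation(x, surface_idx=[1], observed_pts=observed)
```

In the paper, the observed surface comes from stereo/endoscopic 3D reconstruction during real-world manipulation; here a fixed array stands in for it, and the blending weight `alpha` is an assumed tuning parameter rather than anything reported in the work.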