Yanmin Zhou, Jiangang Zhao, Ping Lu, Zhipeng Wang, Bin He
Source
Journal: IEEE Transactions on Industrial Electronics (Institute of Electrical and Electronics Engineers) · Date: 2023-03-13 · Volume/Issue: 71 (2): 1708–1717 · Cited by: 14
Identifier
DOI: 10.1109/TIE.2023.3253921
Abstract
Robots now work increasingly closely with humans outside of traditional fences in industrial scenes, so their real-time tactile interaction perception is crucial to the safety of human–robot collaboration (HRC). In this work, we present a customized, wearable, and modular robot skin (TacSuit), which is scalable for large-area surface coverage of robots with easily accessible multimodal sensors, including pressure, proximity, acceleration, and temperature sensors. The TacSuit is co-designed for mechanical structure and data fusion algorithm, consisting of three levels of design: sensor, cell (of multimodal sensors), and block (of multiple cells). The sensors are housed in custom-designed, 3-D printed capsules to achieve conformity, scalability, and easy installation on arbitrary robot surfaces. A multilevel event-driven data fusion algorithm enables efficient information processing for a large number of tactile sensors. Furthermore, a virtual interaction force fusion method takes both proximity and force perception information into consideration to ensure safety throughout the interaction, both before and after direct physical contact. A humanoid robotic platform is successfully fitted with a TacSuit of 159 tactile cells. Validation experiments on obstacle detection demonstrate the effective collision-avoidance capability of the TacSuit for safe HRC.
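To illustrate the idea behind the virtual interaction force fusion described above, the following is a minimal sketch of how a proximity reading can be mapped to a virtual repulsive force and combined with a measured contact force into one continuous safety signal. The function names, the inverse-distance law, and all constants here are assumptions for illustration, not the authors' published formulation.

```python
def virtual_force(distance_m, d_max=0.3, k=2.0):
    """Map a proximity reading (meters) to a virtual repulsive force (N).

    Zero beyond the sensing threshold d_max; grows as an obstacle
    approaches the skin surface. The inverse-distance shape and the
    gain k are illustrative assumptions.
    """
    if distance_m >= d_max:
        return 0.0
    # Repulsion increases smoothly as distance shrinks toward zero;
    # the 1e-3 floor avoids division by zero at contact.
    return k * (1.0 / max(distance_m, 1e-3) - 1.0 / d_max)

def fused_force(distance_m, contact_force_n):
    """Fuse pre-contact (virtual) and post-contact (measured) forces so
    a reactive controller sees a single continuous interaction force."""
    return max(virtual_force(distance_m), contact_force_n)

# Far obstacle: no reaction; near obstacle: virtual force only;
# physical contact: measured pressure-sensor force is also considered.
print(fused_force(0.50, 0.0))  # obstacle outside sensing range
print(fused_force(0.05, 0.0))  # approaching obstacle, no contact yet
print(fused_force(0.01, 4.2))  # in contact with measured force
```

In this sketch the hand-off between proximity and contact sensing is a simple maximum, which keeps the safety signal nonzero across the transition from approach to touch; the paper's actual fusion rule may differ.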