Artificial intelligence
Grasping
Computer vision
Computer science
Tactile sensor
Robot
Object
Classifier
Tactile perception
Pattern recognition
Perception
Programming language
Neuroscience
Biology
Authors
Fuchun Sun, Chunfang Liu, Wenbing Huang, Jianwei Zhang
Source
Journal: IEEE Transactions on Systems, Man, and Cybernetics: Systems
[Institute of Electrical and Electronics Engineers]
Date: 2016-04-07
Volume/Issue: 46 (7): 969-979
Citations: 65
Identifier
DOI: 10.1109/tsmc.2016.2524059
Abstract
The perception of the tactile and vision modalities plays a crucial role in object classification and precise robot operations. On one hand, visual perception helps acquire the object's apparent characteristics (e.g., shape and color), which are paramount for grasp planning. On the other hand, when grasping the object, tactile sensors can detect its softness or stiffness and its surface texture, and can thus be used to classify objects. In this paper, two different tactile models are first developed to model the tactile sequences, namely the bag-of-systems (BoS) and the deep dynamical system (DDS). To be specific, BoS applies the extreme learning machine method as the classifier, and DDS employs deep neural networks as the feature-mapping functions in order to deal with the highly nonlinear data. As for visual sensing, we formulate a signature-of-histograms-of-orientations descriptor to model the shape of objects. Moreover, by using human experience, we identify the graspable components of objects at the category level instead of the object level. Finally, a fast planning method considering both contact-point extraction and hand kinematics is proposed to accomplish the grasping manipulation. Comparative experiments demonstrate the effectiveness of the proposed methods on tasks including robot grasping and object classification.
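The abstract's BoS model uses an extreme learning machine (ELM) as its classifier. As a rough illustration of the ELM idea only (not the paper's actual tactile pipeline), the sketch below fixes a random hidden layer and solves for the output weights in closed form; the toy 2-D Gaussian data is a hypothetical stand-in for tactile feature descriptors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for tactile feature vectors: two well-separated classes in 2-D.
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.repeat([0, 1], 50)
T = np.eye(2)[y]                      # one-hot targets

n_hidden = 64
# ELM: hidden-layer weights are drawn at random and never trained.
W = rng.normal(size=(X.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                # random nonlinear feature map

# Only the output weights are learned, in closed form via the
# Moore-Penrose pseudoinverse: beta = pinv(H) @ T.
beta = np.linalg.pinv(H) @ T

pred = np.argmax(H @ beta, axis=1)
accuracy = (pred == y).mean()
```

The closed-form solve is what makes ELM training fast compared with backpropagation, which is presumably why it suits classifying many short tactile sequences.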