
NeuralFeels with neural fields: Visuotactile perception for in-hand manipulation

Keywords: Artificial intelligence, Computer vision, Computer science, Perception, Robotics, Object (grammar), Robot, Pose, Psychology, Neuroscience
Authors
Sudharshan Suresh,Haozhi Qi,Tingfan Wu,Taosha Fan,Luis A. Pineda,Mike Lambeta,Jitendra Malik,Mrinal Kalakrishnan,Roberto Calandra,Michael Kaess,Joseph D. Ortiz,Mustafa Mukadam
Source
Journal: Science Robotics [American Association for the Advancement of Science (AAAS)]
Volume/Issue: 9 (96)
Identifier
DOI: 10.1126/scirobotics.adl0628
Abstract

To achieve human-level dexterity, robots must infer spatial awareness from multimodal sensing to reason over contact interactions. During in-hand manipulation of novel objects, such spatial awareness involves estimating the object’s pose and shape. The status quo for in-hand perception primarily uses vision and is restricted to tracking a priori known objects. Moreover, visual occlusion of objects in hand is imminent during manipulation, preventing current systems from pushing beyond tasks without occlusion. We combined vision and touch sensing on a multifingered hand to estimate an object’s pose and shape during in-hand manipulation. Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem. We studied multimodal in-hand perception in simulation and the real world, interacting with different objects via a proprioception-driven policy. Our experiments showed final reconstruction F scores of 81% and average pose drifts of 4.7 millimeters, which was further reduced to 2.3 millimeters with known object models. In addition, we observed that, under heavy visual occlusion, we could achieve improvements in tracking up to 94% compared with vision-only methods. Our results demonstrate that touch, at the very least, refines and, at the very best, disambiguates visual estimates during in-hand manipulation. We release our evaluation dataset of 70 experiments, FeelSight, as a step toward benchmarking in this domain. Our neural representation driven by multimodal sensing can serve as a perception backbone toward advancing robot dexterity.
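The abstract outlines the core of NeuralFeels: object geometry is encoded by a neural field learned online, and the object is tracked jointly by optimizing a pose graph over the fused visual and tactile measurements. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration, assuming a small signed-distance MLP for the field and a simplified alternation between a pose step and a map step standing in for a full pose graph solver. All names here (NeuralSDF, se3_exp, track_and_map) and the synthetic sphere data are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): a neural signed-distance field (SDF)
# fit online to fused vision + touch surface points, with the object pose
# refined by minimizing SDF residuals of those points. This is a simplified
# stand-in for the pose graph optimization described in the abstract.
import torch
import torch.nn as nn


class NeuralSDF(nn.Module):
    """Small MLP mapping 3D points (object frame) to signed distance."""
    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.Softplus(beta=100),
            nn.Linear(hidden, hidden), nn.Softplus(beta=100),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)


def se3_exp(xi):
    """Map a 6-vector twist (rho, omega) to a 4x4 pose using a first-order
    rotation approximation, adequate for the small updates sketched here."""
    rho, omega = xi[:3], xi[3:]
    wx = torch.zeros(3, 3, dtype=xi.dtype)           # skew-symmetric [omega]_x
    wx[0, 1], wx[0, 2], wx[1, 0] = -omega[2], omega[1], omega[2]
    wx[1, 2], wx[2, 0], wx[2, 1] = -omega[0], -omega[1], omega[0]
    T = torch.eye(4, dtype=xi.dtype)
    T[:3, :3] = torch.eye(3, dtype=xi.dtype) + wx
    T[:3, 3] = rho
    return T


def track_and_map(sdf, surface_pts_world, T_obj_world, iters=50):
    """Alternate a pose step (observed surface points should lie on the SDF
    zero level set) with a map step (refit the SDF to the transformed points)."""
    xi = torch.zeros(6, requires_grad=True)           # incremental pose twist
    pose_opt = torch.optim.Adam([xi], lr=1e-2)
    map_opt = torch.optim.Adam(sdf.parameters(), lr=1e-3)
    for _ in range(iters):
        # --- pose step: minimize |SDF(T * p)| over observed surface points
        T = se3_exp(xi) @ T_obj_world
        pts_obj = surface_pts_world @ T[:3, :3].T + T[:3, 3]
        pose_loss = sdf(pts_obj).abs().mean()
        pose_opt.zero_grad()
        pose_loss.backward()
        pose_opt.step()
        # --- map step: pull the zero level set onto the (detached) points.
        # A real system would add free-space and regularization terms so the
        # field cannot collapse to zero everywhere.
        pts_obj = pts_obj.detach()
        map_loss = sdf(pts_obj).abs().mean()
        map_opt.zero_grad()
        map_loss.backward()
        map_opt.step()
    return se3_exp(xi.detach()) @ T_obj_world, sdf


if __name__ == "__main__":
    torch.manual_seed(0)
    # Fake fused visuotactile surface samples: noisy points near a 5 cm sphere.
    dirs = torch.nn.functional.normalize(torch.randn(256, 3), dim=-1)
    pts = 0.05 * dirs + 0.002 * torch.randn(256, 3)
    T_refined, sdf = track_and_map(NeuralSDF(), pts, torch.eye(4))
    print("refined pose:\n", T_refined)
```

In a complete system the pose step would become a pose graph over a window of frames fusing depth from both camera and touch, and the map step would carry the extra supervision noted in the comments; the alternation above is only meant to show how the same surface samples can simultaneously constrain the pose and supervise the field.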
