Authors
Meel Velliste,Sagi Perel,M. Chance Spalding,Andrew Whitford,Andrew B. Schwartz
Abstract
Brain–machine interfaces have mostly been used previously to move cursors on computer displays. Now experiments on macaque monkeys show that brain activity signals can control a multi-jointed prosthetic device in real time. The monkeys used motor cortical activity to control a human-like prosthetic arm in a self-feeding task, with a greater sophistication of control than previously possible. This work could be important for the development of more practical neuro-prosthetic devices in the future.

Arm movement is well represented in populations of neurons recorded from the motor cortex1,2,3,4,5,6,7. Cortical activity patterns have been used in the new field of brain–machine interfaces8,9,10,11 to show how cursors on computer displays can be moved in two- and three-dimensional space12,13,14,15,16,17,18,19,20,21,22. Although the ability to move a cursor can be useful in its own right, this technology could be applied to restore arm and hand function for amputees and paralysed persons. However, the use of cortical signals to control a multi-jointed prosthetic device for direct real-time interaction with the physical environment (‘embodiment’) has not been demonstrated. Here we describe a system that permits embodied prosthetic control; we show how monkeys (Macaca mulatta) use their motor cortical activity to control a mechanized arm replica in a self-feeding task. In addition to the three dimensions of movement, the subjects’ cortical signals also proportionally controlled a gripper on the end of the arm.
Owing to the physical interaction between the monkey, the robotic arm and objects in the workspace, this new task presented a higher level of difficulty than previous virtual (cursor-control) experiments. Apart from an example of simple one-dimensional control23, previous experiments have lacked physical interaction even in cases where a robotic arm16,19,24 or hand20 was included in the control loop, because the subjects did not use it to interact with physical objects—an interaction that cannot be fully simulated. This demonstration of multi-degree-of-freedom embodied prosthetic control paves the way towards the development of dexterous prosthetic devices that could ultimately achieve arm and hand function at a near-natural level.
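The decoding idea the abstract rests on (arm movement being "well represented in populations of neurons") can be illustrated with the classic population-vector scheme from this line of work: each unit is cosine-tuned to a preferred direction, and the decoded movement is the rate-weighted sum of those preferred directions. The sketch below is illustrative only, with a synthetic population and made-up tuning parameters; the paper's actual real-time extraction algorithm is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of cosine-tuned units (parameters are illustrative,
# not taken from the paper's recordings).
n_units = 100
preferred = rng.normal(size=(n_units, 3))                    # preferred directions
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)
baseline = rng.uniform(5.0, 15.0, n_units)                   # baseline rate (Hz)
depth = rng.uniform(5.0, 10.0, n_units)                      # modulation depth (Hz)

def firing_rates(direction):
    """Cosine tuning: a unit fires faster when movement aligns with its
    preferred direction."""
    return baseline + depth * (preferred @ direction)

def decode_population_vector(rates):
    """Weight each unit's preferred direction by its normalized rate change
    and sum; the resulting vector points along the intended movement."""
    weights = (rates - baseline) / depth
    vec = weights @ preferred
    return vec / np.linalg.norm(vec)

true_dir = np.array([0.0, 0.0, 1.0])           # intended reach: straight up
est = decode_population_vector(firing_rates(true_dir))
print(np.dot(est, true_dir))                   # near 1.0 for a large population
```

With preferred directions spread roughly uniformly over the sphere, the weighted sum converges on the true movement direction as the population grows, which is why population size rather than any single neuron's precision drives decoding quality.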