The first reason to watch this 4-minute video of a paralyzed woman using her brain signals to move a robot arm is that you will never again take for granted your ability to lift your morning cup of coffee to your mouth.
The second reason is that it’s the latest cool development to come out of the BrainGate project, which showed that it’s possible to “read out” brain signals in paralyzed people and convert them into actions: first moving a computer cursor, and now a robotic arm. The journal Nature has just published the robotic-arm results and posted the video above. Read the full press release from Brown University here. From Brown:
Researchers in the BrainGate collaboration of the Providence Veterans Affairs Medical Center, Brown University and Massachusetts General Hospital describe experiments in which two participants with tetraplegia used the investigational* BrainGate BCI [Brain-Computer Interface] to precisely control robotic arms to reach and grasp for objects in three-dimensional space. They controlled the robots by thinking about moving their own arms and hands. This is the first demonstration of 3D control of robot arms by neural activity of people.
On a particularly poignant day in the research, one of the participants used a robot arm to pick up a bottle of coffee, bring it to her lips and tip it to take a drink through a straw. It was the first time she had served herself anything to drink for nearly 15 years. Her smile after she had taken a sip was especially inspiring because her success indicated that the BrainGate team’s research has moved substantially closer toward the goal of restoring independence for people who have lost functional control of their limbs.
Suggestion: The smile comes at about 3:30 in the video. Don’t miss it.