Brain chip gives paralyzed man the ability to feel what a robotic arm touches!
Pittsburgh, Pennsylvania - A huge breakthrough in technology has allowed a partially paralyzed person to actually feel what a robotic arm is touching, paving the way for a potential revolution in prosthetics!
The University of Pittsburgh collaborated with Blackrock Neurotech on what will go down as a significant milestone in neurotechnology.
"It's the first time a BCI for a robotic prosthetic hand has integrated motion commands and touch in real time," according to Wired. BCI stands for brain-computer interface and is essentially the go-between program for paralyzed individuals who are controlling robotic limbs.
Nathan Copeland, the study's subject, who is partially paralyzed as a result of a car accident, doubled his speed with the robotic arm once artificial tactile (touch) feedback let him feel what the robot was feeling.
This is an incredible development, as robotic limbs have relied on visual feedback alone up to this point.
Copeland was selected for the test because of his very specific type of injury. He still had some intact nerves, but the messages from those nerves could no longer reach his brain correctly. The team mapped which parts of his brain still responded to touch and recorded what happened in his brain when he imagined different movements.
The team then used Blackrock Neurotech's NeuroPort System and implanted four microelectrode arrays into Copeland's brain – two to read the signals for movement, and two to send signals to his sensory system.
Once Copeland learned how to control the robotic limb with sight feedback alone, the team switched on the feature that made his brain feel what the robot touched, and it changed everything instantly.
Though revolutionary, the system still has some bugs
One of the study's authors, biomedical engineer Jennifer Collinger, said, "You don't necessarily rely on vision for a lot of the things that you do. When you're interacting with objects, you rely on your sense of touch."
By placing sensors on the fingertips of the robotic arm and hand, as well as torque sensors on the fingers to relay pressure, whatever the robot felt was sent back as electrical impulses to Copeland's brain, allowing him to feel those pressures in his own fingers and hand.
"The first time we did it, I was like, magically better somehow," Copeland said of his experience with artificial touch. He could also complete his tasks twice as fast because he didn't have to double-check visually what he could feel.
Having just published their study in Science, the team is still working out kinks in the system, such as making the sensations that Copeland feels a bit more natural. He might feel a poking sensation when it should be more of a rubbing one, for example.
The system is also wired and has to be plugged into the sensors embedded in his skull. Once a way is found for him to operate the arm outside the lab, his life could open up to countless other tasks.
Copeland's successes are drawing so much attention that he recently sold a picture of a cat he drew with the robotic arm.
Cover photo: 123RF/Sergey Soldatov