The development of neuroprostheses to aid paralyzed people has advanced significantly with a report that a monkey has successfully fed itself with fluid, well-controlled movements of a human-like robotic arm, using only signals from its brain.

Lead researcher Andrew Schwartz, PhD, professor of neurobiology at the University of Pittsburgh, predicts that his lab will be ready to test the device and accompanying technology in humans within two years, but that commercialization is likely five to 10 years down the road.

While several researchers around the world, including Schwartz, have previously reported success in getting monkeys to use robotic arms, Schwartz points out that the latest effort, reported last week in a Nature article he authored, is different.

The use of cortical signals to control a multi-jointed prosthetic device for direct, real-time interaction with the physical environment is known as embodiment, and it had not been achieved before.

"Previous studies have not reported direct interaction ... the monkey wasn't fully controlling the task," Schwartz told Medical Device Daily. "In our Nature paper, we describe a system that permits embodied prosthetic control. Learning is the essential thing that gets these devices to work. We've enabled monkeys to change the way their neurons are firing."

Previous work focused on using brain-machine interfaces to control cursor movements displayed on a computer screen. Monkeys in Schwartz's lab had been trained to command those cursor movements by thought alone, with steadily greater precision and skill.

Monkeys first learn by observing the movement, which activates brain cells as if they were actually performing the action. "It's a lot like sports training, where trainers have athletes first imagine that they are performing the movements they desire," Schwartz explained.

Monkeys now are able to move a robotic arm to feed themselves marshmallows and fruit while their own arms are restrained.

Schwartz and his team implanted a grid of electrodes the size of a thumbtack about 1.5 mm into the monkey's brain.

Computer software interprets signals picked up by these probes, which are the width of a human hair. The probes are inserted into neuronal pathways in the monkey's motor cortex, a brain region where voluntary movement originates as electrical impulses.

The neurons' collective activity is evaluated by software running a special mathematical algorithm Schwartz and his team developed; the resulting command is then sent to the arm, which carries out the actions the monkey intended to perform with its own limb.
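
The article does not spell out the algorithm itself. As a rough illustration only, a population-vector-style decoder of the kind long associated with this line of research might look like the sketch below; the function and variable names (decode_velocity, preferred_directions, baseline_rates) are hypothetical and not taken from Schwartz's system.

```python
import numpy as np

def decode_velocity(firing_rates, baseline_rates, preferred_directions, gain=1.0):
    """Illustrative population-vector-style decode: each neuron 'votes' for its
    preferred movement direction, weighted by how far its firing rate is above
    baseline. The names and exact weighting are assumptions for illustration,
    not the published algorithm."""
    modulation = firing_rates - baseline_rates                # per-cell rate change
    votes = modulation[:, np.newaxis] * preferred_directions  # (n_cells, 3) weighted directions
    velocity = gain * votes.sum(axis=0) / len(firing_rates)   # average into one 3-D command
    return velocity

# Example with made-up numbers: four neurons with 3-D preferred directions
rates = np.array([55.0, 20.0, 35.0, 10.0])    # spikes per second in the current bin
baselines = np.array([30.0, 25.0, 30.0, 15.0])
dirs = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [-1, 0, 0]], dtype=float)
print(decode_velocity(rates, baselines, dirs))
```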

"The microelectrodes pick up action potentials from individual neurons ... pulses of electricity that neurons use to communicate with each other," he said.

A computer then examines the array of signals. "It's able to decipher or extract the instantaneous arm signal. This is the same signal we picked up when the monkey used his own arm," he said. "After the computer finds the correct signal, it sends it to the robot arm."

Every 30 milliseconds, the information is updated.
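
To make that cadence concrete, a closed-loop controller of the kind described would re-decode the neural activity and re-command the arm roughly every 30 ms. The skeleton below is only meant to show the update cycle; the helper functions (read_spike_counts, decode_velocity, send_arm_command) are hypothetical stand-ins, not the lab's actual software.

```python
import time

UPDATE_INTERVAL = 0.030  # seconds; the article cites a 30 ms update cycle

def control_loop(read_spike_counts, decode_velocity, send_arm_command):
    """Fixed-rate decode-and-command loop. The three callables are assumed
    stand-ins for the recording hardware, the decoder, and the arm driver."""
    next_tick = time.monotonic()
    while True:
        rates = read_spike_counts()        # latest binned firing rates from the array
        velocity = decode_velocity(rates)  # translate neural activity into a motion command
        send_arm_command(velocity)         # drive the robotic arm
        next_tick += UPDATE_INTERVAL
        time.sleep(max(0.0, next_tick - time.monotonic()))
```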

"This demonstration of multi-degree-of freedom embodied prosthetic control paves the way toward the development of dexterous prosthetic devices that could ultimately achieve arm and hand function at a near-natural level," Schwartz wrote in Nature.

The robot arm has a shoulder, an elbow and a simple gripper at the end, but it is heavy, made of aluminum. "The research arm, made by Barrett Technology [Cambridge, Massachusetts], is backdriveable, meaning you can push on it and it yields. It's springy, so that when you push on it, it feels springy. We wanted it to behave like a natural arm," Schwartz said.

One of the biggest hurdles Schwartz said he had to overcome was related to the electrodes used over the years. "Getting good, reliable recordings is difficult," he said. "The electrodes we've been using now work well."

Those electrodes come from Cyberkinetics Neurotechnology Systems (Foxborough, Massachusetts), and it is apparently the insulation technology on the new electrodes that makes them work more efficiently.

Schwartz has been working on this technology for 25 years. He said he knew 20 years ago that it would work. "It was just a matter of time."

He has no immediate plans to license the technology to a company.

"Next thing we want to do is to add a wrist and human-like hand," he said. "It's quite a jump in complexity because it would give us 21 degrees of freedom compared to 4 degrees now."