Monkey See; Robot Do

Reading Time: 3 minutes

A look into Brain-Machine Interfacing

Garret Stirland

Staff Writer

You may not think twice about your finger clicking a mouse, or the way your eyes jump from word to word as you read this sentence. For every action your body performs, networks of neurons communicate through electrical impulses throughout your brain.

In January of 2008, an experiment was conducted with a rhesus monkey named Idoya in North Carolina and a robot in Kyoto, Japan. Idoya, using nothing but brain signals, was able to control the 5-foot, 200-pound robot.

“It was the first time that brain signals had been used to make a robot walk,” said neuroscientist Dr. Miguel A.L. Nicolelis.

Idoya was trained to walk on a treadmill while looking at a screen displaying the legs of a primate-like robot. She walked at different speeds for roughly 15 minutes each day. Electrode implants in Idoya’s brain recorded her neural activity as each movement was made and converted those electrical signals into a language the computer could understand.
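The article does not describe the decoding itself, but the basic idea can be sketched in a few lines. Below is a minimal, hypothetical illustration in Python: a simple least-squares linear decoder that learns a mapping from recorded firing rates to leg-joint velocity commands. The numbers, variable names, and the linear model are assumptions made for the sake of the example, not details of the actual experiment.

```python
import numpy as np

# Hypothetical sketch only: a linear decoder that maps neural firing
# rates to leg-joint velocities. The real decoding pipeline used in the
# experiment is not described in this article.

rng = np.random.default_rng(0)

n_neurons = 100   # number of recorded motor-cortex units (assumed)
n_joints = 4      # number of leg-joint velocity commands (assumed)

# Training data: firing rates recorded while the monkey walks on the
# treadmill, paired with the leg movements measured at the same moments.
firing_rates = rng.poisson(5.0, size=(1000, n_neurons)).astype(float)
leg_velocities = rng.normal(size=(1000, n_joints))

# Fit a least-squares linear map from neural activity to movement.
weights, *_ = np.linalg.lstsq(firing_rates, leg_velocities, rcond=None)

# At run time, each new burst of brain activity is translated into a
# velocity command that could be relayed over a network to the robot.
new_rates = rng.poisson(5.0, size=(1, n_neurons)).astype(float)
command = new_rates @ weights
print("joint velocity command:", command.round(3))
```

In practice such decoders are trained on real recordings and recalibrated often, but the principle is the same: learn a mapping from patterns of neural activity to the movement they accompany.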

Idoya’s brain signals were then transmitted from North Carolina to the robot in Japan. Idoya watched a live video feed of the robot and was given a treat each time she got the robot to walk. After many repetitions, the researchers switched off the treadmill, and Idoya was able to keep the robot walking long after her own legs had stopped moving.

“Her eyes remained focused like crazy on [the robot’s] legs,” said Nicolelis.

Idoya’s motor cortex adopted the robotic legs as her own, allowing her to move the robot’s legs independently of her own limbs. While the movements were limited, the experiment offered a first look at brain-machine interfacing.

Dr. Nicolelis conducted another study in July of 2015, this time looking for ways to improve the effectiveness of neural electrode implants. Instead of a single primate, a group of monkeys was each connected neurally to a shared robotic arm and charged with the task of picking up simple objects, like a red ball. The monkeys performed surprisingly well at working together to control the arm.

In another trial, the monkeys were connected to a computer displaying a three-dimensional arm in space. Each monkey was in charge of a different aspect of the arm. Dr. Nicolelis reported that a network of monkey brains was more efficient at moving and controlling the arm than an individual monkey.

“These experiments are the first steps toward a brain machine interface that might permit paralyzed people to walk by directing devices with their thoughts,” said Nicolelis. “Electrodes in the person’s brain would send signals to a device worn on the hip, like a cell phone or pager, that would relay those signals to a pair of braces, a kind of external skeleton, worn on the legs.”

In February of 2015, researchers at the University of Utah received $1.4 million to develop a new prosthetic hand controlled by a neural implant. Not only will the limb be controlled by brain signals, but it will also allow the user to experience sensations of touch and movement.

“We can pick up the nerve signals, translate them, and relay them to an artificial hand. People wouldn’t have to do anything differently from what they’d already learned how to do their whole life with their real hand. They’ll just think what they normally think, and the prosthetic hand will move,” said Gregory Clark, bioengineering professor at the University of Utah.

Human trials have already been conducted with four subjects at the University of Utah, where participants controlled a virtual hand through 12 to 20 different movements while experiencing roughly 130 different touch and motion sensations.

Brain-machine interfacing, or BMI, is the term for using brain signals to control external devices such as prosthetic limbs. BMI technology gives hope to individuals who have suffered paralysis or limb loss. With dedication, hard work, and funding, the stuff of science fiction will be brought to life.

“I think technology like this could help us out a lot. My uncle has to use a prosthetic leg and he hates it. Something like this could help him move around better, feel more comfortable,” said UVU student Timote Moala.

“We have the opportunity to not only significantly improve an amputee’s ability to control a prosthetic limb, but to make a profound, positive psychological impact,” said Doug Weber, a program manager at the Defense Advanced Research Projects Agency (DARPA).