A monkey's brain-controlled avatar allows it to "feel" the texture of virtual objects

In a Duke University laboratory, a monkey uses a brain-computer interface to control a digital monkey arm and literally feel the texture of objects in a virtual environment. The experiment, carried out by the Nicolelis Lab at Duke University Medical Center, demonstrates a major step towards brain-machine devices that not only allow users to control a computer cursor or robotic limb but actually provide sensory feedback about what they are touching.


Haptic feedback was delivered to the brain using an array of microscopic electrodes implanted into the somatosensory cortex, the area of the cortex responsible for processing tactile information such as an object's texture. The monkeys were coaxed with snacks to learn how to use a joystick to control the virtual monkey arm on a computer screen. They used this arm to "touch" various objects presented on the screen and to receive tactile information through the brain implant. When the avatar arm came into contact with a specific object, the electrodes would fire off tiny current pulses in a particular pattern. The pattern, or frequency, of stimulation conveyed information about the texture of an object, a quality the monkeys used to pick out specific objects in exchange for treats. Despite the inherently artificial nature of the touch signal, the monkeys seemed to have no trouble interpreting it as tactile information. In fact, after a few trials, they were easily able to distinguish between visually identical virtual objects that differed only in texture. A video on the Duke Medicine YouTube channel demonstrates the procedure.
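
To make the encoding idea concrete, here is a minimal sketch of how a texture could be signalled by the frequency of a stimulation pulse train. The texture names, frequencies, and function are hypothetical illustrations of the concept, not the lab's actual parameters.

```python
import numpy as np

# Hypothetical mapping from virtual textures to stimulation frequencies.
# The study's point is that a distinct temporal pattern of current pulses
# can stand in for a distinct texture; the exact values here are invented.
TEXTURE_FREQ_HZ = {
    "rewarded_texture": 200,    # high-frequency pulse train
    "distractor_texture": 50,   # low-frequency pulse train
}

def pulse_train(texture, duration_s=0.5):
    """Return pulse onset times (in seconds) for the given texture."""
    freq = TEXTURE_FREQ_HZ[texture]
    return np.arange(0.0, duration_s, 1.0 / freq)

# When the avatar hand touches an object, deliver the pulse pattern
# associated with that object's texture.
times = pulse_train("rewarded_texture")
print(f"{len(times)} pulses in 0.5 s -> {len(times) / 0.5:.0f} Hz")
```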


Having met with success in the first experiment, the group decided to push the technology further. They wanted to make the device entirely brain-controlled, taking the joystick out of the picture and "closing the loop." To this end, they implanted a second array of electrodes, this time targeting the primary motor cortex, the area of the brain that controls volitional movement. These electrodes delivered no stimulation; instead, they recorded neural signals as the monkeys attempted to control the arm on the screen. A signal processing algorithm was able to distinguish between patterns of neuronal activity associated with the monkey's desire to move the arm, say, left rather than right. It could then use this information to move the virtual arm in the appropriate direction.
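
The article doesn't spell out the decoding algorithm, but the core idea can be illustrated with a toy linear decoder: learn a mapping from recorded firing rates to intended arm velocity, then apply it to new activity. Everything below (neuron count, tuning, noise level) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a motor decoder: fit a linear map from firing rates
# to 2-D arm velocity, then use it to drive the virtual arm.
n_neurons, n_samples = 30, 1000
true_tuning = rng.normal(size=(n_neurons, 2))   # each neuron's direction tuning
velocity = rng.normal(size=(n_samples, 2))      # intended arm velocities (training data)
rates = velocity @ true_tuning.T + rng.normal(scale=0.5, size=(n_samples, n_neurons))

# Fit the decoder by least squares: velocity ~ rates @ W
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a new burst of activity into a movement command.
new_rates = np.array([1.0, 0.0]) @ true_tuning.T   # activity pattern for "move right"
decoded = new_rates @ W
print("decoded velocity:", decoded)                # points roughly along +x
```

Real decoders are more elaborate, but the linear version captures why distinguishable activity patterns (left vs. right) are enough to steer the avatar.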


While sensory-feedback technology for brain-computer interfaces remains a largely unexplored field, brain-computer interfaces that enable movement are somewhat more established. In an earlier study by Miguel Nicolelis, an electrode array implanted in a monkey's brain "read" the monkey's volitional motions and used them to control a robot. The monkey was able to see the robot on a computer screen and was rewarded with treats when it walked "in sync" with the robot. What it didn't know was that the neural information encoding its walking motions was being wirelessly sent to the robot, controlling its gait on a treadmill. Amazingly, the robot was located in Kyoto, Japan, while Nicolelis and his team were at Duke [1].


Several other researchers have demonstrated brain-computer interface (BCI) devices that predicted kinematic parameters of limb movements and used them to drive real or virtual devices. In a groundbreaking closed-loop demonstration, researchers in the Schwartz group at the University of Pittsburgh showed a monkey using a mentally controlled robotic arm to pick up and eat marshmallows [2]. Even more astonishing, the decoded signals came from only 15 to 30 neurons (check out their YouTube video)!
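
A classic decoding scheme associated with this line of work is the population vector: each neuron is assumed to fire most for movement along a "preferred direction," and the decoded movement is the rate-weighted sum of those directions. The sketch below is a toy illustration with made-up tuning parameters, not the group's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy population-vector decoder: weight each neuron's preferred direction
# by how far its firing rate rises above baseline, then sum.
n_neurons = 20                      # on the order of the 15-30 cells cited above
BASELINE, MODULATION = 10.0, 5.0    # hypothetical tuning parameters (spikes/s)

angles = rng.uniform(0.0, 2.0 * np.pi, n_neurons)
preferred = np.stack([np.cos(angles), np.sin(angles)], axis=1)

def firing_rates(direction):
    """Cosine tuning: rate peaks when movement matches the preferred direction."""
    return BASELINE + MODULATION * (preferred @ direction)

intended = np.array([0.0, 1.0])     # the monkey "wants" to move up
weights = firing_rates(intended) - BASELINE
decoded = (weights[:, None] * preferred).sum(axis=0)
decoded /= np.linalg.norm(decoded)
print("decoded direction:", decoded)   # roughly [0, 1]
```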


Because of its rehabilitative potential for patients with missing limbs, paralysis, or locked-in syndrome, human BCI research has found a large niche within the neuroprosthetics field. In 2005, a tetraplegic patient was successfully implanted with a 96-electrode BrainGate device, becoming the first human in the world to move a robotic arm with his mind. Strides have also been taken to make the procedure less surgically invasive by using electrodes that do not penetrate the brain, or that stay entirely on the scalp. The latter technique, known as electroencephalography (EEG), has become the most studied BCI approach because of its potential for rapid commercialization. EEG-based devices are currently used for therapeutic purposes such as biofeedback, as well as for gadget and gaming applications.


EEG-based devices are finding a niche in the gaming-console and user-interface market. Several companies, such as NeuroSky and Emotiv, offer BCI headsets aimed at app and game developers as well as gamers. Additionally, a BCI-based spelling system, intendiX, has been brought to market.


Nevertheless, BCI technology is limited by the lack of sensory feedback from the objects being controlled. According to Nicolelis, in order to make the technique clinically useful for prosthetics and paralysis, "you need to sense what you're touching," a problem he attempted to tackle in this study [3]. Indeed, the question of tactile feedback is similarly relevant for EEG-based devices undergoing commercialization, especially in light of the increasing popularity of touch- and motion-oriented user interfaces such as touch screens and gaming devices like the Microsoft Kinect and Nintendo Wii. This trend is not limited to the start-up world: big players such as Sony have filed patents, as early as 2004, for various methods of eliciting sensory feedback.


The technique still faces serious problems, including long training times, noise, and surgical invasiveness. And while we aren't exactly seeing James Cameron's "Avatar" in real life, over the past five or so years neuroengineered BCI devices have matured as a technology, taking steps towards commercialization and addressing the problem of sensory feedback. All in all, BCIs still promise an avenue for the development of more life-like prosthetic limbs.