
Translating Brain Waves into Words

October 3, 2010 - In the News / Awards

Now scientists can translate brain waves into words. Ho hum. Happens every day, right? Even as you read this, you are translating words into brain waves. Easy enough inside your own head, but what about someone else’s brain waves? Not so easy. Grant money is required.

In recent years, various study participants have been able to hook their mental muscles up to devices that allow them to turn lights on and off, change channels on TV, and move cursors and robot arms. But recognizing words in brain waves requires a whole different level of granularity.

Bioengineer Bradley Greger and his team have done it, he reports in a recent issue of the Journal of Neural Engineering. They’ve shown that it is possible to translate recorded brain waves into words using a grid of electrodes placed directly on the brain.

Although they have done it with only one person, and individual words could be correctly identified in tests only about half the time, Greger thinks greater accuracy is just around the corner.

Some researchers have been attempting to “read” speech centers in the brain using electrodes placed on the scalp. But such electrodes “are so far away from the electrical activity that it gets blurred out,” Greger says.

Greger and his colleagues instead use arrays of tiny microelectrodes that are placed in contact with the brain, but not implanted. In the current study, they used two arrays, each with 16 microelectrodes. The arrays were placed directly on the brain of a volunteer patient with epilepsy whose skull had already been opened to measure aberrant electrical signals that trigger seizures.

The team tested 10 words, such as “yes,” “no,” “hungry” and “thirsty,” that a patient might need. The volunteer spoke each word 31 to 96 times while the researchers measured brain waves.

In the best case, the researchers could correctly distinguish between two words, such as “yes” and “no,” 90% of the time. But when trying to distinguish among all 10 words, their best accuracy was 48%.
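To give a rough feel for what “decoding” means here, below is a minimal toy sketch in Python. It is not the team’s actual analysis: the word list beyond the four mentioned above, the exact trial count, the simulated signals, and the choice of a simple linear classifier are all my own assumptions. It only illustrates the general task behind the accuracy numbers: telling 10 words apart from multichannel recordings, where random guessing would be right just 10% of the time.

```python
# Toy illustration only (not the study's actual method): decoding one of 10
# words from simulated multichannel "brain wave" features with a simple
# linear classifier.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# First four words come from the article; the rest are hypothetical.
WORDS = ["yes", "no", "hungry", "thirsty", "hello", "goodbye",
         "more", "less", "hot", "cold"]
N_CHANNELS = 32        # two 16-microelectrode arrays, as described in the study
TRIALS_PER_WORD = 60   # assumed; the volunteer spoke each word 31 to 96 times

# Fake data: each word gets its own mean activity pattern per channel,
# buried in noise on every trial.
word_templates = rng.normal(0.0, 1.0, size=(len(WORDS), N_CHANNELS))
X = np.vstack([
    template + rng.normal(0.0, 2.0, size=(TRIALS_PER_WORD, N_CHANNELS))
    for template in word_templates
])
y = np.repeat(np.arange(len(WORDS)), TRIALS_PER_WORD)

# Cross-validated accuracy of the classifier; chance level is 10% for 10 words.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.0%}")
```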

Greger believes the accuracy can be improved by using more electrodes, and the team is now working with grids of 121 sensors.

“We’re pretty hopeful that, with a better design, we’ll be able to decode more words and, in two or three years, get approval for a real feasibility trial in paralyzed patients,” he says. The device could benefit people who have been paralyzed by stroke, Lou Gehrig’s disease or trauma and are “locked in”―aware but unable to communicate except, perhaps, by blinking an eyelid or arduously moving a cursor to pick out letters or words from a list.

Twenty years ago, I visited my best friend in the hospital, where he lay unconscious, comatose for weeks after he pancaked his hang glider. As I spoke to him and stroked his hair, he was reacting, grimacing, responding―or so I thought. Inspired by my discovery, I ran to the intensive care nurse to report my findings. A hopeless case, she said, and her answer, gentled of course for a grieving survivor, was along the lines of “Yep, they’ll do that.” Later, it took a double dose of morphine to put my friend down after they took him off the respirator. Ever since then I’ve been tormented by the thought that he was consciously hanging on as we tried to pull the hospital curtain down on him so neatly. I’ve never had the balls to raise the question with his widow, now an oncologist, for fear my brain waves of doubt would echo in her own mind.

That we can now allow those trapped in their own minds to rejoin the human conversation without the burden of speech or gesture is amazing. Bringing those silent sufferers back into the language fold is one more wonder in this age of wonders. But I can’t help but also wonder what’s to come. These mind jacks seem just a plug-in away from monetization. Not for the comatose, who don’t really represent an especially affluent market, but for the rest of us. They (we) can already predict our behavior after we tap a few keystrokes into our digital devices. How much more convenient to just read our thoughts? I know that must be a long way off, right? But….

Watching these technologies race on is like looking out the window of the Bullet Train as it accelerates out of the station, telephone poles flicking by, faster and faster, until they can barely be comprehended. So fast. So fast!