Ever wonder why people talk with their hands? We all do -- across cultures, throughout history. Now, a serendipitous discovery building on years of meticulous work tells us what may be the reason -- or at least a reason -- for it.
The discovery may also signal a breakthrough for those with aphasia, the brain-damage-induced loss of the ability to speak, which affects one in 250 people.
Some years ago, a team of Stanford scientists led by neurosurgeon Jaimie Henderson, MD, and electrical engineer Krishna Shenoy, PhD, implanted baby-aspirin-sized multi-electrode arrays in the brains of study participants who suffered from severe limb weakness. These arrays, owing to work in Shenoy's lab, were capable of deciphering signals in the paralyzed participants' motor cortex: the part of the brain that controls voluntary motion.
When a participant willed a particular hand motion, the array was able to read that brain pattern and, bypassing the participant's musculature, transmit the relevant information through a cable to a converter that could boss around the cursor on a computer. The result: With some training, participants were able to select music, shop online and even type at speeds practical for composing emails.
Some of these participants had lost their ability to speak. But others hadn't. And a study just published in eLife and spearheaded by postdoc Sergey Stavisky, PhD, who is associated with the Shenoy/Henderson team, describes an amazing -- if unanticipated -- finding: The same part of the motor cortex that controls hand movement also appears to influence the muscles of the jaw, lips and tongue.
So much influence, in fact, that the scientists were able to design software that could differentiate among different syllables uttered by two of the implanted participants who retained the power of speech.
The team recorded neural activity from these participants during a speaking task in which each participant heard one of 10 different syllables and then repeated that syllable. The firing patterns of neurons in the participant's array-accessible motor cortex changed in ways that corresponded to various groups of sounds, and to the face and mouth movements one would observe during speech. These patterns were more pronounced when the participant spoke one of the syllables than when he or she heard it.
By analyzing neural activity across nearly 200 electrodes, the scientists found they could identify which of several syllables a participant was saying -- with more than 80% accuracy in the case of one participant.
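The decoding idea can be sketched in a few lines. The study's actual analysis pipeline isn't described here, so this is only a minimal illustration on synthetic data: each syllable is assumed to evoke its own characteristic pattern of firing rates across the electrode channels, and a held-out trial is assigned to whichever syllable's average pattern it sits closest to (a simple nearest-centroid classifier; the channel count, trial count, and noise level are all made-up stand-ins).

```python
import numpy as np

rng = np.random.default_rng(0)

N_SYLLABLES = 10          # the study used 10 spoken syllables
N_CHANNELS = 192          # stand-in for "nearly 200 electrodes"
TRIALS_PER_SYLLABLE = 40  # hypothetical trial count

# Synthetic stand-in for per-trial firing rates: each syllable gets its
# own mean activity pattern across channels, plus trial-to-trial noise.
patterns = rng.normal(0.0, 1.0, size=(N_SYLLABLES, N_CHANNELS))
X = np.repeat(patterns, TRIALS_PER_SYLLABLE, axis=0)
X += rng.normal(0.0, 1.5, size=X.shape)
y = np.repeat(np.arange(N_SYLLABLES), TRIALS_PER_SYLLABLE)

# Split trials into train/test halves and average the training trials
# for each syllable into a template ("centroid") pattern.
train = np.arange(len(y)) % 2 == 0
centroids = np.stack(
    [X[train & (y == s)].mean(axis=0) for s in range(N_SYLLABLES)]
)

# Nearest-centroid decoding: assign each held-out trial to the closest
# syllable template in firing-rate space.
dists = np.linalg.norm(X[~train][:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[~train]).mean()
print(f"decoding accuracy: {accuracy:.0%}")
```

With well-separated patterns, even this crude decoder scores well above chance (10% for 10 syllables); real neural data is far messier, which is what makes the reported 80%-plus accuracy notable.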
The implication here is that someday it may be possible to figure out what people who, for one or another reason, can't speak are trying to say -- and get a device to say it for them.
The conventional wisdom has been that the brain's motor control over any given part of our body traces to a particular location along the motor cortex. There was no reason to expect that the "state" on the cortical "map" corresponding to our hands also contained "cities" belonging to the "state" known as our mouth.
"There aren't a lot of opportunities to make measurements from inside someone's brain while they talk," Stavisky said. If these two people hadn't just happened to have multi-electrode arrays implanted in the part of the brain responsible for hand-movement control, that area's connection to speech might never have surfaced.
Photos by Peter Barreras/Howard Hughes Medical Institute. Top photo shows one hundred thin electrodes protruding from the electrode array, which was surgically implanted into the motor cortex of the brain.