Brain Implant Enables Bilingual Communication for First Time

In a medical first, a brain implant has enabled a bilingual individual who lost the ability to articulate words to communicate effectively in both of his languages. Paired with the implant, an artificial intelligence (AI) system decodes his intended speech in real time, rendering it in either Spanish or English.

Published on May 20 in Nature Biomedical Engineering, the study offers new insights into brain language processing and holds potential for developing long-lasting devices that could restore multilingual speech capabilities to individuals unable to communicate verbally.

“This new study is an important contribution to the emerging field of speech-restoration neuroprostheses,” commented Sergey Stavisky, a neuroscientist at the University of California, Davis, who was not involved in the research. Although the study involved only one participant and the approach requires further validation, Stavisky believes that the strategy will achieve higher accuracy in the future when integrated with other recent advancements.

Speech-Restoring Implant: A Personal Journey

The subject of the study, known as Pancho, suffered a stroke at the age of 20 that resulted in significant paralysis and left him unable to speak clearly. In his thirties, Pancho collaborated with Edward Chang, a neurosurgeon at the University of California, San Francisco, to explore the stroke’s impact on his brain. In a pioneering 2021 study, Chang’s team implanted electrodes on Pancho’s cortex to record neural activity, which was then translated into words displayed on a screen. Pancho’s initial decoded sentence was “My family is outside,” interpreted in English. However, Pancho is a native Spanish speaker who learned English post-stroke, and Spanish remains the language that resonates deeply with him.

“What languages someone speaks are actually very linked to their identity,” Chang explains. “Our long-term goal has never been just about replacing words but about restoring connection for people.”

Deciphering Bilingual Speech with AI

To further this goal, the team developed an AI system to interpret Pancho’s bilingual speech. Led by Chang’s PhD student, Alexander Silva, the system was trained as Pancho attempted to say nearly 200 words, each effort creating a distinct neural pattern recorded by the electrodes.

The AI system includes separate Spanish and English modules. When Pancho tries to speak, the system analyzes the neural patterns, with each module selecting words from its respective language based on likelihood. For example, the English module might choose "she" with a 70% probability, while the Spanish module might select "estar" (to be) with a 40% probability. The system builds phrases word by word, considering both the neural match and linguistic likelihood, and displays the sentence with the highest overall probability on Pancho’s screen.
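The word-by-word selection described above can be sketched as a toy greedy decoder. Everything here is an illustrative assumption, not the study's actual model: the module contents, the bigram language-model likelihoods, and the function names are invented for the sketch, and each step simply multiplies a neural-match probability by a linguistic likelihood and keeps the more probable language's sentence.

```python
import math

# Hypothetical per-language modules: each maps a recorded neural pattern to
# candidate words with a neural-match probability. In the real system these
# are trained classifiers; here we hard-code toy distributions.
ENGLISH_MODULE = {
    "pattern_1": {"she": 0.70, "see": 0.20},
    "pattern_2": {"is": 0.60, "was": 0.25},
}
SPANISH_MODULE = {
    "pattern_1": {"estar": 0.40, "esta": 0.35},
    "pattern_2": {"bien": 0.55, "ven": 0.30},
}

# Toy bigram linguistic likelihoods (illustrative values only).
BIGRAM_LM = {
    ("<s>", "she"): 0.5,
    ("she", "is"): 0.6,
    ("<s>", "estar"): 0.2,
    ("estar", "bien"): 0.4,
}

def decode(patterns, module, lm, default_lm=0.01):
    """Greedy word-by-word decoding for one language module: at each step,
    pick the word maximizing neural match x linguistic likelihood."""
    sentence, log_prob, prev = [], 0.0, "<s>"
    for pattern in patterns:
        candidates = module.get(pattern, {})
        if not candidates:
            continue
        best_word, best_score = None, -math.inf
        for word, neural_p in candidates.items():
            # Combine the neural-match probability with how likely the
            # word is to follow the previous one in this language.
            score = math.log(neural_p) + math.log(lm.get((prev, word), default_lm))
            if score > best_score:
                best_word, best_score = word, score
        sentence.append(best_word)
        log_prob += best_score
        prev = best_word
    return sentence, log_prob

def decode_bilingual(patterns):
    """Run both modules and display the sentence with the higher
    overall (log-)probability."""
    en_sentence, en_lp = decode(patterns, ENGLISH_MODULE, BIGRAM_LM)
    es_sentence, es_lp = decode(patterns, SPANISH_MODULE, BIGRAM_LM)
    options = [("English", en_sentence, en_lp), ("Spanish", es_sentence, es_lp)]
    return max(options, key=lambda t: t[2])

language, words, _ = decode_bilingual(["pattern_1", "pattern_2"])
print(language, " ".join(words))  # prints "English she is"
```

With these toy numbers, "she" (0.70 neural match, high bigram likelihood) beats "estar" (0.40 neural match), so the English hypothesis wins overall, mirroring how the described system compares whole-sentence probabilities across the two modules.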

The modules distinguished between English and Spanish with 88% accuracy and decoded sentences with 75% accuracy. This allowed Pancho to eventually have unscripted conversations with the research team. “After the first time we did one of these sentences, there were a few minutes where we were just smiling,” Silva recalls.

New Insights into Language Processing

The findings also provided unexpected insights into how the brain processes different languages. Contrary to previous studies using non-invasive tools that suggested distinct brain regions for different languages, the team found that much of the neural activity for both Spanish and English originated from the same brain area. Additionally, even though Pancho learned English in his thirties, his neurological responses were similar to those of individuals who grew up bilingual.

Kenji Kansaku, a neurophysiologist at Dokkyo Medical University in Japan, who was not part of the study, highlighted the importance of including more participants and studying languages with different articulatory properties, such as Mandarin or Japanese, in future research. Silva is already exploring these areas, including the phenomenon of ‘code switching’—shifting from one language to another within a single sentence. “Ideally, we’d like to give people the ability to communicate as naturally as possible,” Silva states.

doi: https://doi.org/10.1038/d41586-024-01451-4
