Brain implants that help paralysed people to speak


Brain-computer interfaces for speech: two teams describe devices that turn neural signals into words, including one used by a 47-year-old woman who lost her speech after a stroke

In separate studies, both published on 23 August in Nature1,2, two teams of researchers describe brain–computer interfaces (BCIs) that translate neural signals into text, or into words spoken by a synthetic voice. The BCIs decode speech at 62 words per minute and 78 words per minute, respectively. That is faster than any previous attempt, although still slower than natural conversation, which runs at around 160 words per minute.

Christian Herff, a computational neuroscientist at Maastricht University in the Netherlands, says that devices such as these could become products in the near future.

In the first study1, Francis Willett, a neuroscientist at Stanford University in California, and his colleagues worked with Pat Bennett, who has motor neuron disease and can no longer speak intelligibly. Bennett said in a statement to reporters that, for people who are non-verbal, devices such as these mean they can stay connected to the bigger world.

In a separate study2, Edward Chang, a neurosurgeon at the University of California, San Francisco, and his colleagues worked with a 47-year-old woman named Ann, who lost her ability to speak after a brainstem stroke 18 years ago.

They used a different approach from Willett's team, placing a paper-thin rectangular array of 253 electrodes on the surface of the brain's cortex. The technique, called electrocorticography (ECoG), is considered less invasive than penetrating implants and can record the combined activity of thousands of neurons at the same time. The team trained an artificial-intelligence algorithm to recognize patterns in Ann's brain activity associated with her attempts to speak, using a 1,024-word vocabulary. The device produced words at 78 words per minute.

Chang and his team also created customized algorithms to convert Ann's brain signals into a synthetic voice and an animated avatar that mimics facial expressions. They personalized the voice to sound like Ann's voice before her injury by training it on recordings from her wedding video.

Ann told the researchers in a feedback session that hearing a voice eerily similar to her own was an emotional experience. "When I had the ability to talk for myself was huge!"

What needs to improve before speech BCIs are ready for everyday use

Before the BCIs can be used clinically, many improvements are needed. "The ideal scenario is for the connection to be cordless," Ann told the researchers. A BCI suitable for everyday use would also have to be a fully implantable system, adds Blaise Yvert, a neurotechnology researcher at the Grenoble Institute of Neuroscience in France. Both teams hope to make their devices' decoding more robust.

According to Herff, the participants in both studies still have the ability to use their facial muscles when thinking about speaking, and their speech-related brain regions are intact. "This will not be the case for every patient."

He adds that the studies are a proof of concept that could help motivate industry to translate the technology into a product that someone can actually use.

The devices will also need to be tested on many more people to prove their reliability. Judy Illes, a neuroethics researcher at the University of British Columbia in Vancouver, Canada, says that however technically sophisticated the data are, they need to be understood in context. "We have to be careful with over-promising wide generalizability to large populations," she adds. "I'm not sure we're there yet."

Two studies demonstrate how brain-computer interfaces could help people to communicate, and researchers work out how hot tropical leaves can get before they start to die.


How hot is too hot for tropical leaves?

As the climate warms, tropical forests around the world are facing rising temperatures, but it is unknown how much heat the trees can endure before their leaves start to die. A team has combined multiple data sources to try to answer this question, and suggests that a warming of 3.9 °C would push many leaves past a tipping point at which photosynthesis breaks down. Such a scenario would endanger these ecosystems' vital roles as carbon stores and as homes to much of the world's biodiversity.

Speaking through a brain-computer interface

Paralysis had robbed both women of their ability to speak. In one case, the cause was a disease that attacks the motor neurons; the other woman had suffered a stroke. Although they can no longer enunciate clearly, they remember how to formulate words.

While the devices are slower than the roughly 160-word-per-minute rate of natural conversation among English speakers, scientists say they are an exciting step toward restoring real-time speech using a brain-computer interface, or BCI. According to a neurologist who was not involved in the new studies, the technology is getting closer to being used in everyday life.

The Utah array, a tiny sensor used in one of the BCIs, resembles a hairbrush: each of its bristle-like silicon spikes is tipped with an electrode, and together they collect the activity of individual neurons. The researchers then trained an artificial neural network to decode this brain activity and translate it into words displayed on a screen.
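The studies' actual decoders are large recurrent networks trained on real neural recordings, but the core idea of mapping a window of neural activity to a word can be illustrated with a toy sketch. Everything below is an assumption for illustration only: the four-word vocabulary, the synthetic "firing-rate" features, and the simple softmax classifier stand in for the far more sophisticated models the teams used.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["hello", "water", "yes", "no"]   # toy vocabulary (illustrative assumption)
n_features = 32                           # pretend firing-rate channels

# Synthetic training data: each word gets a characteristic mean firing
# pattern, and every "attempted utterance" is that pattern plus noise.
prototypes = rng.normal(size=(len(vocab), n_features))
X = np.vstack([p + 0.3 * rng.normal(size=(50, n_features)) for p in prototypes])
y = np.repeat(np.arange(len(vocab)), 50)

# Train a softmax (multinomial logistic) decoder by gradient descent.
W = np.zeros((n_features, len(vocab)))
for _ in range(200):
    logits = X @ W
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    onehot = np.eye(len(vocab))[y]
    W -= 0.1 * X.T @ (probs - onehot) / len(X)

def decode(features):
    """Return the most probable word for one window of neural features."""
    return vocab[int(np.argmax(features @ W))]

# Decode a fresh, noisy attempt at the word "water".
print(decode(prototypes[1] + 0.3 * rng.normal(size=n_features)))
```

A real decoder works on continuous streams rather than isolated windows, and typically predicts phonemes that a language model then assembles into words; this sketch only shows the classification step at the heart of that pipeline.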