New AI-enabled system translates brain signals into speech


Washington: In a first, scientists have created an artificial intelligence (AI) based system that directly translates thoughts into intelligible, recognisable speech, an advance that may help people who cannot speak regain their ability to communicate with the outside world.

By monitoring someone's brain activity, the technology developed by researchers from Columbia University in the US can reconstruct the words a person hears with unprecedented clarity.

The breakthrough, which harnesses the power of speech synthesisers and artificial intelligence, could lead to new ways for computers to communicate directly with the brain.

It also lays the groundwork for helping people who cannot speak, such as those living with amyotrophic lateral sclerosis (ALS) or recovering from a stroke, regain their ability to communicate with the outside world, researchers said.

"Our voices help connect us to our friends, family and the world around us, which is why losing the power of one's voice due to injury or disease is so devastating," said Nima Mesgarani, of Columbia University in the US.

"With today's study, we have a potential way to restore that power. We've shown that, with the right technology, these people's thoughts could be decoded and understood by any listener," said Mesgarani, a principal investigator of the study published in the journal Scientific Reports.

Decades of research have shown that when people speak -- or even imagine speaking -- telltale patterns of activity appear in their brains.

Distinct patterns of signals also emerge when we listen to someone speak, or imagine listening.

Experts, trying to record and decode these patterns, see a future in which thoughts need not remain hidden inside the brain -- but instead could be translated into verbal speech at will.

However, accomplishing this feat has proven challenging. Early efforts by researchers to decode brain signals focused on simple computer models that analysed spectrograms, visual representations of sound frequencies over time.
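To make the idea of a spectrogram concrete: it is built by slicing an audio signal into short frames and taking the frequency content of each frame. The minimal sketch below (an illustration only, not the researchers' actual pipeline) computes one for a synthetic test tone standing in for speech, using NumPy.

```python
import numpy as np

fs = 16000                            # sample rate in Hz, common for speech
t = np.arange(fs) / fs                # one second of samples
audio = np.sin(2 * np.pi * 440 * t)   # synthetic 440 Hz tone standing in for speech

# Slice the signal into overlapping windowed frames, then take the
# magnitude of each frame's FFT: rows = frequency bins, columns = time.
frame_len, hop = 256, 128
frames = np.stack([audio[i:i + frame_len]
                   for i in range(0, len(audio) - frame_len, hop)])
spec = np.abs(np.fft.rfft(frames * np.hanning(frame_len), axis=1)).T

# The frequency bin with the most average energy should sit near 440 Hz.
freqs = np.fft.rfftfreq(frame_len, d=1 / fs)
peak = freqs[spec.mean(axis=1).argmax()]
print(peak)
```

A model analysing such spectrograms sees only this time-frequency picture, which is part of why, as the article notes next, reconstructing intelligible speech from it proved difficult.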

Because this approach failed to produce anything resembling intelligible speech, the team turned instead to a vocoder, a computer algorithm that can synthesise speech after being trained on recordings of people talking.

"This is the same technology used by Amazon Echo and Apple Siri to give verbal responses to our questions," said Mesgarani.

Researchers plan to test more complicated words and sentences next, and they want to run the same tests on brain signals emitted when a person speaks or imagines speaking.

Ultimately, they hope their system could be part of an implant, similar to those worn by some epilepsy patients, that translates the wearer's thoughts directly into words.

"In this scenario, if the wearer thinks 'I need a glass of water,' our system could take the brain signals generated by that thought, and turn them into synthesised, verbal speech," said Mesgarani.

"This would be a game changer. It would give anyone who has lost their ability to speak, whether through injury or disease, the renewed chance to connect to the world around them," he said.
