Brain-computer interface successfully translates thought into synthesized speech

In a pioneering study, scientists have demonstrated that an implanted brain-computer interface coupled with deep-learning algorithms can translate thought into computerized speech.

From Scientific American:


[University of California, San Francisco neurosurgeon Edward Chang] emphasized that his approach cannot be used to read someone’s mind—only to translate words the person wants to say into audible sounds….

Chang and his colleagues devised a two-step method for translating thoughts into speech. First, in tests with epilepsy patients whose neural activity was being measured with electrodes on the surface of their brain, the researchers recorded signals from brain areas that control the tongue, lips and throat muscles. Later, using deep-learning computer algorithms trained on naturally spoken words, they translated those movements into audible sentences….
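
The two-step pipeline described in the excerpt is easy to sketch in code. Below is a minimal, hypothetical PyTorch version: one recurrent network maps electrode recordings to estimated articulator movements, and a second maps those movements to acoustic features that a vocoder could turn into audio. All names, layer choices, and dimensions here are illustrative assumptions, not the architecture from the paper.

```python
# Minimal sketch of a two-stage neural speech decoder. All sizes and layer
# choices are illustrative assumptions; the published model differs.
import torch
import torch.nn as nn

class SignalsToKinematics(nn.Module):
    """Stage 1: map electrode recordings to articulator movement estimates."""
    def __init__(self, n_electrodes=256, n_kinematics=33, hidden=128):
        super().__init__()
        self.rnn = nn.LSTM(n_electrodes, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, n_kinematics)

    def forward(self, ecog):                  # ecog: (batch, time, n_electrodes)
        out, _ = self.rnn(ecog)
        return self.proj(out)                 # (batch, time, n_kinematics)

class KinematicsToAcoustics(nn.Module):
    """Stage 2: map articulator movements to acoustic features for a vocoder."""
    def __init__(self, n_kinematics=33, n_acoustics=32, hidden=128):
        super().__init__()
        self.rnn = nn.LSTM(n_kinematics, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, n_acoustics)

    def forward(self, kin):                   # kin: (batch, time, n_kinematics)
        out, _ = self.rnn(kin)
        return self.proj(out)                 # (batch, time, n_acoustics)

# Toy forward pass: one second of made-up neural data sampled at 200 Hz.
ecog = torch.randn(1, 200, 256)
kinematics = SignalsToKinematics()(ecog)
acoustics = KinematicsToAcoustics()(kinematics)
print(kinematics.shape, acoustics.shape)
# torch.Size([1, 200, 33]) torch.Size([1, 200, 32])
```

Decoding through an intermediate movement representation, rather than straight from brain signals to sound, mirrors the two-step description in the quote above.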

The researchers asked native English speakers on Amazon’s Mechanical Turk crowdsourcing marketplace to transcribe the sentences they heard. The listeners accurately heard the sentences 43 percent of the time when given a set of 25 possible words to choose from, and 21 percent of the time when given 50 words, the study found.
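
For context on how figures like these can be computed, here is a rough sketch of scoring a closed-vocabulary listening test. It assumes a sentence counts as "accurately heard" only when the transcription matches the reference word for word; the study's actual protocol may differ, and the example data is made up.

```python
# Hypothetical scoring for a closed-vocabulary listening test: a trial counts
# as correct when the transcription matches the reference word for word.
def normalize(sentence: str) -> list[str]:
    return sentence.lower().split()

def exact_match_rate(trials: list[tuple[str, str]]) -> float:
    correct = sum(normalize(ref) == normalize(hyp) for ref, hyp in trials)
    return correct / len(trials)

trials = [
    ("the quick brown fox", "the quick brown fox"),              # heard correctly
    ("those thieves stole jewels", "those thieves stole tools"),  # missed
]
print(f"{exact_match_rate(trials):.0%}")  # 50%
```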


Although the accuracy rate remains low, it would be good enough to make a meaningful difference to a “locked-in” person, who is almost completely paralyzed and unable to speak, the researchers say.


Scientists Take a Step Toward Decoding Thoughts (Scientific American)

Speech synthesis from neural decoding of spoken sentences (Nature)
