These glasses can read their wearer's lips

ETX Studio
April 22, 2023 07:56 MYT
The EchoSpeech glasses can read silent speech. - ETX Studio
RESEARCHERS at Cornell University in the United States have developed a pair of AI-equipped eyeglasses that can identify commands silently mouthed by the wearer and relay them to a smartphone, simply by analyzing the movements of the wearer's lips and mouth.
Developed by the SciFi (Smart Computer Interfaces for Future Interactions) laboratory at Cornell University, the device is designed to let people unlock and use a smartphone in any circumstances, whether in very noisy surroundings (such as a stadium or a nightclub) or, on the contrary, in places where silence is required (such as a library).
Here, there is no need to say a command out loud: the wearer simply mouths the word silently, and the glasses act as a relay between the wearer and the smartphone.
Currently, this means, for example, silently mouthing the phone's access code, or commands such as "louder," "forward" or "stop" to control playlists.
As designed, the device is relatively compact and, above all, requires very little power.
The glasses, called EchoSpeech, work like sonar, emitting sound waves across the face and receiving their echoes, thereby detecting the slightest movement of the lips and mouth.
Depending on the type of echo profile received, the artificial intelligence can identify the request being made. It takes just a few minutes of user training for the artificial intelligence to recognize about 30 commands and numbers to execute on the smartphone.
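To make the idea concrete, the sketch below shows, in very rough terms, how a per-user classifier of this kind could be trained: a handful of labelled "echo profiles" (here generated synthetically) are fed to an off-the-shelf classifier that then maps new profiles to commands. The command list, array shapes and use of scikit-learn are illustrative assumptions only, not Cornell's actual implementation.

```python
# Illustrative sketch only: NOT the EchoSpeech code, just the general idea of
# classifying echo profiles (spectrogram-like arrays of reflected sound) into
# a small vocabulary of silently mouthed commands, using synthetic data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

COMMANDS = ["louder", "forward", "stop"]   # tiny stand-in vocabulary (assumed)
PROFILE_SHAPE = (20, 16)                   # assumed time frames x frequency bins

def synthetic_profile(command_idx: int) -> np.ndarray:
    """Fake echo profile: each command gets a distinct band pattern plus noise."""
    base = np.zeros(PROFILE_SHAPE)
    base[:, command_idx * 5:(command_idx * 5) + 5] = 1.0  # command-specific band
    return base + rng.normal(scale=0.3, size=PROFILE_SHAPE)

# "A few minutes of user training" would correspond to collecting a handful of
# labelled examples per command from the wearer; here we simply generate them.
X = np.array([synthetic_profile(i).ravel()
              for i in range(len(COMMANDS)) for _ in range(40)])
y = np.array([i for i in range(len(COMMANDS)) for _ in range(40)])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
print("predicted command:", COMMANDS[clf.predict(X_test[:1])[0]])
```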
Although still a prototype, the device could one day be commercialized. It could assist people with speech impediments or, if one day combined with a voice synthesizer, give a voice to people who cannot otherwise express themselves vocally.