A New App Interprets Sign Language for the Amazon Echo

The convenience of the Amazon Echo smart speaker only goes so far. Without any sort of visual interface, the voice-activated home assistant isn't very useful for deaf people—Alexa only understands three languages, none of which are American Sign Language. But Fast Company reports that one programmer has invented an ingenious system that allows the Echo to communicate visually.

Abhishek Singh's new artificial intelligence app acts as an interpreter between deaf people and Alexa. To use it, a person signs at a webcam connected to a computer. The app translates the ASL signs it sees into text and reads that text aloud for Alexa to hear. When Alexa talks back, the app generates a text version of the response for the user to read.
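
In rough terms, that relay loop looks something like the sketch below. This is a minimal illustration of the flow the article describes, not Singh's actual code: the classify_sign() helper is a hypothetical stand-in for the trained sign recognizer, and the text-to-speech and speech-recognition libraries are ordinary off-the-shelf choices assumed for the example.

```python
# Minimal sketch of the sign -> speech -> Alexa -> text relay described above.
# classify_sign() is hypothetical; pyttsx3 and speech_recognition are generic
# off-the-shelf libraries, not necessarily what Singh's app uses.
import cv2                       # webcam capture
import pyttsx3                   # text-to-speech, spoken aloud for the Echo
import speech_recognition as sr  # transcribes Alexa's spoken reply


def classify_sign(frame):
    """Hypothetical stand-in for the trained ASL recognizer."""
    return "weather"  # e.g. the model decides the user signed "weather"


def relay_once():
    # 1. Grab a frame from the webcam and classify the sign being made.
    camera = cv2.VideoCapture(0)
    ok, frame = camera.read()
    camera.release()
    if not ok:
        return
    word = classify_sign(frame)

    # 2. Speak the recognized word out loud so the Echo can hear it.
    tts = pyttsx3.init()
    tts.say(word)
    tts.runAndWait()

    # 3. Listen for Alexa's spoken response and show it as text.
    recognizer = sr.Recognizer()
    with sr.Microphone() as mic:
        audio = recognizer.listen(mic, phrase_time_limit=10)
    try:
        reply = recognizer.recognize_google(audio)
    except sr.UnknownValueError:
        reply = "(couldn't make out the response)"
    print("Alexa:", reply)
```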

Singh had to teach the system ASL himself by repeatedly signing various words at his webcam. Built with the machine-learning platform TensorFlow, the program eventually saw enough examples to recognize the meaning of certain gestures on its own.
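
For readers curious what that training step might look like, here is a minimal sketch using TensorFlow's Keras API. It assumes a folder of labeled webcam snapshots (sign_frames/, one subfolder per sign) and reuses a pretrained image backbone so a modest number of examples per gesture is enough; these specifics are assumptions for illustration, not details from Singh's project.

```python
# Sketch of training a small gesture classifier with TensorFlow/Keras.
# The sign_frames/ directory of labeled webcam snapshots is assumed.
import tensorflow as tf

IMG_SIZE = (160, 160)

# Webcam snapshots sorted into one folder per sign, e.g. sign_frames/weather/
train_ds = tf.keras.utils.image_dataset_from_directory(
    "sign_frames/", image_size=IMG_SIZE, batch_size=32)
num_classes = len(train_ds.class_names)

# Reuse a pretrained image backbone and train only a small classifier head,
# so a few hundred webcam frames per sign can be enough.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```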

Amazon does sell two smart home devices with screens, the Echo Show and the Echo Spot, but for signers who rely on voice assistants without visual components, Singh's app is one of the best options out there for now. He plans to make the code open-source and share his full methodology to make it accessible to as many people as possible.

Watch his demo in the video below.

[h/t Fast Company]