Asking Alexa questions with your voice is convenient: just ask and get an answer. With thousands of skills, Alexa is a powerful tool you can control with nothing more than the sound of your voice.
But that power is out of reach when communicating by speech isn't an option. So far, members of the deaf community and people with speech impediments have been left behind by Alexa.
Two advancements are opening up the possibility that Alexa could become a useful tool for people who are hearing impaired. One comes from a software developer who has created a system that lets Alexa interact with people who use sign language.
The other is an update from Amazon that allows users to tap on the screen of an Echo Show to ask Alexa a question, instead of speaking.
Alexa Responds to Sign Language
Tech is at its best when it makes the world more accessible to those with challenges. Abhishek Singh is a developer who is helping bring Alexa’s skills to the deaf community.
Singh saw the need to bring Alexa's skills to people with impaired hearing and/or speech impediments. As he told Fast Company, "If these devices are to become a central way we interact with our homes or perform tasks, then some thought needs to be given to those who cannot hear or speak. Seamless design needs to be inclusive in nature." (See Fast Company's article "This clever app lets Amazon Alexa read sign language.")
Using Google's TensorFlow software, Singh created an app that recognizes sign language through a computer's camera and translates it into written and spoken words. The user stands in front of the camera and signs words or letters; the app speaks the signed words aloud and types them as text on the screen. An Echo placed next to the computer lets Alexa hear the spoken translation. The app also works in the other direction, transcribing spoken responses into written text.
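For the technically curious, here is a rough sketch of how such a pipeline could be wired together in Python. This is an illustration built on assumptions, not Singh's actual code: the model file, the tiny vocabulary, and the confidence threshold are hypothetical placeholders, and it leans on OpenCV for the webcam, a Keras classifier for the signs, and pyttsx3 to speak the result aloud for a nearby Echo to hear.

```python
# Minimal sketch only (not Singh's actual code). It illustrates classifying
# webcam frames of signs with a trained model and speaking the recognized
# word aloud so a nearby Echo can hear it. The model file and label list
# below are hypothetical placeholders.
import cv2                      # pip install opencv-python
import numpy as np
import pyttsx3                  # pip install pyttsx3 (offline text-to-speech)
import tensorflow as tf

MODEL_PATH = "sign_classifier.h5"       # hypothetical model trained on sign images
LABELS = ["hello", "weather", "time"]   # hypothetical vocabulary

model = tf.keras.models.load_model(MODEL_PATH)
tts = pyttsx3.init()
cap = cv2.VideoCapture(0)               # default webcam

last_label = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Resize and normalize the frame to match the model's expected input.
    img = cv2.resize(frame, (224, 224)).astype(np.float32) / 255.0
    probs = model.predict(img[np.newaxis, ...], verbose=0)[0]
    label = LABELS[int(np.argmax(probs))]
    if probs.max() > 0.9 and label != last_label:
        print(label)                    # written transcript on the screen
        tts.say(label)                  # spoken aloud for Alexa to hear
        tts.runAndWait()
        last_label = label
    # Show the camera feed with the current label overlaid; press q to quit.
    cv2.putText(frame, label, (10, 40), cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
    cv2.imshow("sign-to-speech sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```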
When Alexa hears the translated question, she responds aloud with the answer, and the app then transcribes her response into written text on the computer screen.
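The return trip, turning Alexa's spoken reply into words on the screen, could be sketched with an off-the-shelf speech recognition library. Again, this is an assumed illustration rather than Singh's implementation; it simply captions whatever the microphone hears.

```python
# Minimal sketch (an assumption, not Singh's implementation): listen for
# Alexa's spoken reply through the computer's microphone and print it as
# on-screen text, using the SpeechRecognition package with Google's free
# web recognizer. Microphone access also requires the PyAudio package.
import speech_recognition as sr   # pip install SpeechRecognition pyaudio

recognizer = sr.Recognizer()
with sr.Microphone() as mic:
    recognizer.adjust_for_ambient_noise(mic, duration=1)
    print("Listening for Alexa's response...")
    audio = recognizer.listen(mic, phrase_time_limit=10)

try:
    text = recognizer.recognize_google(audio)
    print("Alexa:", text)             # caption shown on the computer screen
except sr.UnknownValueError:
    print("Could not understand the audio.")
except sr.RequestError as err:
    print("Speech service error:", err)
```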
This YouTube video shows how Singh’s app works:
So far the app only recognizes the sign language terms demonstrated in the video, though Singh wants to make the software open source so other developers can add to its vocabulary.
Ultimately, Singh hopes Amazon will create a skill for the Echo Show that uses its built-in camera to interpret sign language directly. "That's where I hope this heads. And if this project leads to a push in that direction in any small way, then mission accomplished," he told Fast Company. "In an ideal world I would have built this on the Show directly, but the devices aren't that hackable yet, so [I] wasn't able to find a way to do it."
Echo Show Update Lets Alexa Respond to Taps
Amazon is taking a step in the right direction by updating the Echo Show to respond to taps on its screen. With the Tap to Alexa update, you no longer need to speak to get a response: tap a question on the Show's screen and Alexa's answer appears there. This can help members of the deaf community, anyone who is unable to speak to Alexa for whatever reason, and language learners whose accents Alexa may not easily understand.
To use Tap to Alexa, swipe down from the top of the Echo Show screen and tap the Settings gear icon. Then tap Accessibility and turn on Tap to Alexa. (If Tap to Alexa isn't listed under Accessibility, your Echo Show hasn't received the update yet.)
With the Tap to Alexa setting turned on, an icon of a hand with a pointing finger appears on the Echo Show's home screen. Tap it to open the Tap to Alexa screen, which shows the questions you can ask with a tap. You can also create custom Tap to Alexa questions by tapping the keyboard icon at the top of the screen.
Amazon is rolling out this update to Echo Show owners now, and the company plans to expand Tap to Alexa to other Echo devices with a screen, such as the Echo Spot, at a later date.
Amazon is also introducing Alexa Captioning on all Echo devices with a screen, so users can read Alexa's responses on the display even when the question is asked aloud.
Alexa Captioning is coming to the UK, Germany, Japan, India, France, Canada, Australia, and New Zealand. The US already has Alexa Captioning.
Your Thoughts
Do you like the idea of Alexa responding to sign language and taps on the screen? Do you think these developments will help the deaf community use Alexa's skills? Do you hope Amazon adds a skill that lets Alexa interpret sign language directly?
Share your thoughts in the Comments section below!