During Amazon’s Devices event, the company introduced a new Alexa feature that aims to support users with mobility or speech disabilities. “Eye Gaze on Alexa” will be available later this year on the new Fire Max 11 Tablet, which launched in May.
This marks the first time Amazon has used eye-driven navigation technology on its devices.
Designed for customers unable to tap or use their voice, the eye-tracking feature detects a user’s presence and follows what they’re looking at in real time. “Eye Gaze” works with “Tap to Alexa,” so users can look at their Fire Max 11 tablet and perform preset Alexa actions, such as playing music and other entertainment, making calls and controlling their smart home environment. Amazon says it worked with speech-language pathologists to build a feature that helps customers with daily tasks, such as turning on lights or communicating with caretakers.
“Eye Gaze on Alexa” will roll out to Fire Max 11 tablets in the U.S., the U.K., Germany and Japan.
Another Alexa feature Amazon announced is “Call Translation,” which translates and captions Alexa audio and video calls in real time. The feature could help non-native speakers communicate more effectively, and the captions could also be useful for deaf and hard-of-hearing users.
“Call Translation” will roll out to Echo Show devices and the Alexa mobile app. It will be available in the U.S., the U.K., Canada, Mexico, Germany, France, Spain and Italy in over 10 languages, including English, Spanish, French, German and Portuguese.