Amazon Launches Alexa Accessibility Hub
Amazon has brought together all of the accessibility features and initiatives for the Alexa voice assistant into a central website. The new Alexa Accessibility Hub demonstrates the ways people with different kinds of disabilities can use Alexa as a helpful tool and the accommodations in place to make using Echo smart speakers and smart displays easier for people with disabilities.
The new website attempts to catalog both Alexa’s accessibility features and how the voice assistant can augment the lives of people with disabilities. The features are grouped into four categories: hearing, speech, mobility, and vision. Each sub-section describes ways the voice assistant compensates for the relevant difficulties. Among other options, Alexa can offer real-time texting during phone calls on a smart display or say out loud what gestures people are using to navigate the device, should they not be able to see it well. Amazon has been adding accessibility features to its voice assistant for a while. Almost a year ago, the company debuted the Show and Tell feature for Echo smart displays, which uses a combination of computer vision and machine learning to identify a can or bottle for someone who can’t see well enough to identify it. The company was proud enough of its accessibility evolution to air an ad in the UK last summer highlighting how Alexa can aid someone with low vision. The ad was developed in partnership with the Royal National Institute of Blind People, and the collaboration included Amazon staff training people at the RNIB on how Alexa can help people with vision impairments.
That said, it’s notable how the accessibility features can blend into ways of using Alexa that appeal to the general public. The Echo Show smart displays added the ability to read barcodes earlier this year in a way reminiscent of the Show and Tell feature. And most of the features described as being for people with limited mobility are just standard Alexa features like turning on lights, ordering groceries, or looking up information while shopping.
Amazon’s decision to highlight accessibility while it keeps rolling out new features to accommodate people is part of a larger movement among voice assistant and voice app developers. The race to draw more users starts with making it possible for everyone to use the technology. Startups like Open Style Lab and K4Connect, which recently closed a $21 million funding round, are built around augmenting voice-activated hardware and software to make them more usable by people with disabilities.
Google has been particularly active in that regard lately, most recently by adding visual context to the Voice Access feature in Android, as well as an option to make a phone vibrate when it detects the owner’s name and voice cues in Google Maps that provide directions to people with visual impairments. The company also runs Project Euphonia, which works to train voice assistants to understand people with speech impairments. The most significant accessibility feature Google has added of late is Action Blocks, which lets users combine pre-set Google Assistant commands and shortcuts into a single button or voice command. Action Blocks are very similar to the Shortcuts used by Siri, which can serve as an accessibility feature or be used to trick iPhones into unlocking with a voice command.