Android 11 Beta Updates Voice Access Feature with ‘Visual Cortex’
Google’s next version of the Android operating system will integrate visual context into its Voice Access feature. The improved Voice Access is in the beta of Android 11, which Google unveiled on Wednesday.
Google originally planned to share the beta last week during a virtual event, but decided to cancel it “to allow people to focus on important discussions around racial justice in the United States.” Instead, the beta arrived quietly, and the company uploaded the pre-recorded talks to YouTube. For the Android 11 beta, Google broke down the changes into the categories of people, controls, and privacy. For controls, Google made it easier to manage multiple smart devices and to shift where audio content plays. Privacy is very much at the forefront of the update. The operating system offers much finer-grained control when granting apps permission to use your camera, microphone, or location. Instead of permanently allowing access, users can restrict a permission to a single use, with the app asking again each time it’s opened.
On the people front, Android 11 offers a section of the screen specifically for conversation notifications, as well as bubbles that will keep them on the screen while you perform other tasks. The voice and AI aspects of Android have been improved with context-based suggestions for autocompleting sentences in conversations. For Voice Access, that context includes visual cues that can help a user find accessibility controls and respond to questions and commands based specifically on what’s on the screen.
“[W]e’re making Android more people-centric and expressive, reimagining the way we have conversations on our phones, and building an OS that can recognize and prioritize the most important people in your life,” Android director of product management Stephanie Cuthbertson explained in a blog post. “Voice Access, for people who control their phone entirely by voice, now includes an on-device visual cortex that understands screen content and context, and generates labels and access points for accessibility commands.”
Google has been keen lately to enhance accessibility for Android. Voice Access was even among the features updated just a few weeks ago to mark Global Accessibility Awareness Day. But, it’s far from unique. At the same time, Google widely released Action Blocks, the ability to combine pre-set Google Assistant commands and shortcuts into a single button or voice command, and added to Live Transcribe the option to make your phone vibrate when your name is heard by the device’s microphone. Those updates in turn arrived after Google added voice cues in Google Maps directions to help the visually impaired, and started Project Euphonia, which works to train voice assistants to understand those with speech impairments. Were it not for the COVID-19 health crisis, Google would likely have hosted a live event as usual and probably marked the anniversary of the DIVA initiative, a program aimed at increasing accessibility to its technology. All of the elements of Android 11 revealed in the beta point to Google attempting to address product requests and gaps in its features. Using a voice AI to help users navigate a smartphone is a good thing and no doubt ups Google’s potential for attracting new customers who benefit from more accessible technology.