Project Understood

Google Partners with Canadian Down Syndrome Society to Teach Google Assistant to Understand People with Down Syndrome

Google and the Canadian Down Syndrome Society are working together to gather voice recordings from adults with Down syndrome as part of an effort to improve how well Google Assistant understands people with the syndrome. The new endeavor is called Project Understood, part of Google’s larger Project Euphonia, which works to train voice assistants to understand those with speech impairments.

Training From Recordings

Down syndrome affects people differently, but a pilot test demonstrated enough similarity among those with the disorder that the voice assistant's AI could be trained to recognize and understand them. For Project Understood, volunteers are recording 1,700 phrases to train the AI. The goal is to collect samples from at least 500 people; the more people who participate, the more data the algorithm will have to work with and the better it will learn.

“With the help of the Canadian Down Syndrome Society we were able to sample a small group to test whether there were enough patterns in the speech of people with Down syndrome for our algorithm to learn and adapt,” said Julie Cattiau, a product manager at Google, in a statement. “It’s exciting to see the success of that test and move into the next phase of collecting voice samples that represent the vocal diversity of the community. The more people who participate, the more likely Google will be able to eventually improve speech recognition for everyone.”

Accessibility and Voice

Voice technology as a tool for accessibility is becoming increasingly popular. Google recently added voice cues to directions in Google Maps as a way to improve how it works for the visually impaired. It also launched its Action Blocks feature specifically for those with disabilities as a way to create pre-set Google Assistant commands and shortcuts. The company is also working with other healthcare organizations and helped sponsor a hackathon in Europe to come up with new ways for Google Assistant to help people with neuromuscular disorders.

Amazon has been working on accessibility as well, developing a new first-party Alexa skill for the Echo Show called Show and Tell, aimed at people with visual impairments. The skill can identify boxes, cans, or other packaged items held up to the device’s camera. Amazon is also promoting Alexa as a tool for those with disabilities, as it did in a UK ad highlighting how the voice assistant can aid someone with low vision. The new partnership could bring the power of voice and AI technology to many more people who could use it.

“For most people, voice technology simply makes life a little easier,” CDSS interim executive director Laura LaChance said in a statement. “For people with Down syndrome, it has the potential for creating greater independence. From daily reminders to keeping in contact with loved ones and accessing directions, voice technology can help facilitate infinite access to tools and learnings that could lead to enriched lives.”
