
Google Pilots AI-Powered Audio Navigation Tool to Help Visually Impaired People Run Safely


Google is testing a way for blind and visually impaired people to go for a run safely, using artificial intelligence to provide audio prompts when they start to stray from the road. Project Guideline uses a mobile app on an Android smartphone, worn on a harness by the runner, to track painted lines on the road. The app combines that visual information with GPS data to determine if and how the runner is drifting, then plays sounds through the runner’s earbuds telling them how to adjust to get back on track.

Guiding Lines

Project Guideline’s app uses an AI model to spot a painted line on the road and infer where the runner is in relation to it. If the runner begins to drift too far from the line, the app signals the bone-conducting headphones they wear to play unpleasant noises. Which ear the sound plays in, and how loud it gets, tells the runner which direction to move and how far they are from the line. The system runs entirely on the smartphone without an internet connection, crucial if it’s going to work on running trails with spotty connectivity. Google designed Project Guideline in collaboration with the non-profit Guiding Eyes for the Blind, whose leadership took a direct role in testing the system.

“We began by sketching out how the prototype would work, settling on a simple concept: I’d wear a phone on a waistband, and bone-conducting headphones,” Guiding Eyes for the Blind president Thomas Panek explained in a blog post. “The phone’s camera would look for a physical guideline on the ground and send audio signals depending on my position. If I drifted to the left of the line, the sound would get louder and more dissonant in my left ear. If I drifted to the right, the same thing would happen, but in my right ear.”
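The mapping Panek describes amounts to a simple function of the runner’s lateral offset from the line. As a rough illustration, here is a minimal Kotlin sketch of how such an offset-to-audio mapping could work; the function name, the offset convention, and the 1.5-meter cap are hypothetical assumptions, not details from Project Guideline’s code.

```kotlin
import kotlin.math.abs
import kotlin.math.min

// Hypothetical sketch of the feedback mapping Panek describes: lateral
// offset from the painted line (negative = drifted left, positive =
// drifted right, in meters) becomes per-ear volumes in [0.0, 1.0].
// The function name, offset convention, and 1.5 m cap are illustrative
// assumptions, not details of Google's actual implementation.
fun feedbackVolumes(offsetMeters: Double, maxOffset: Double = 1.5): Pair<Double, Double> {
    // Feedback intensity grows with distance from the line, capped at full
    // volume. Per Panek's description, the real system also makes the sound
    // more dissonant as the runner drifts further.
    val intensity = min(abs(offsetMeters) / maxOffset, 1.0)
    return if (offsetMeters < 0) {
        Pair(intensity, 0.0) // drifting left: louder sound in the left ear
    } else {
        Pair(0.0, intensity) // drifting right (or centered): right ear
    }
}

fun main() {
    for (offset in listOf(-1.2, -0.3, 0.0, 0.6, 2.0)) {
        val (left, right) = feedbackVolumes(offset)
        println("offset=%+.1f m -> left=%.2f right=%.2f".format(offset, left, right))
    }
}
```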

It’s a bit like the proximity alert in a car, which gets louder and faster the closer the car comes to a possible collision, but with a directional component added. Panek successfully completed the New York Road Runners’ Virtual Run for Thanks 5K in Central Park using Project Guideline this year; last year, he ran it with the help of a relay team of three guide dogs. The next step will be to paint guidelines on more roads in different towns and cities.

Google Accessibility

Project Guideline is another facet of Google’s accessibility initiatives involving AI, such as Project Euphonia, which trains voice assistants to understand people with speech impairments. Most recently, Google added the Sound Notifications feature to Android to alert people who can’t hear critical noises like alarms or crying babies. As part of the company’s celebration of International Augmentative and Alternative Communication (AAC) Awareness Month, Google also added eye-tracking controls to Google Assistant and improved accessibility for Action Blocks. For people with visual impairments, Google has been adding features like voice cues to Google Maps directions, and it recently made a major upgrade to the Lookout app. First designed to provide ongoing audio descriptions of the user’s environment using the camera, Lookout can now scan and identify packaged food and take snapshots of text to read aloud later.

  
