Honda Harnessed AI to Create an App for the Blind
Honda is enhancing car rides for the blind and visually impaired with its Honda Scenic Audio project.
Using a combination of AI and various data sets, the app, developed in partnership with Perkins School for the Blind's Howe Innovation Center in suburban Boston, gives passengers real-time descriptions of the view outside their windows. For example, if it's a drizzly day, the app will describe raindrops pelting the facade of Fenway Park.
"Honda worked closely with Dr. Josh Loebner to create the Scenic Audio app," says Phil Hruska, senior manager of marketing. "Josh worked with creative technologists to develop the framework, along with a creative team to tell the story from the perspective of visually impaired passengers."
The app is currently in beta and open to input from the Alabama Institute for Deaf and Blind and the Perkins School—both Honda partners.
"Our technical challenge was to figure out a way to scale visual descriptions to the more than four million miles of roads in the U.S. at any given time," Hruska tells Muse." We use computer vision, generative text, and text-to-speech AIs, in combination with eight discrete data sets pulled in real time to affect each description."
Once revisions and updates are made, Honda will decide how to release the app to a wider audience.