The brand new GPT-4 app will be “life-changing” for visually impaired people

The first app to integrate GPT-4’s image recognition capabilities has been described as “life-changing” by visually impaired users.

Be My Eyes, a Danish startup, applied the AI model to a new feature for blind or partially sighted people. The object detection tool, called “Virtual Volunteer,” can answer questions about any image it’s sent.

For example, imagine that a user is hungry. They could simply photograph an ingredient and ask for corresponding recipes.

If they prefer to eat out, they can upload an image of a map and get directions to a restaurant. Upon arrival they can take a picture of the menu and listen to the options. Then, when they want to work off the extra calories at the gym, they can use their smartphone camera to find a treadmill.

“I know we’re in the middle of an AI hype cycle right now, but several of our beta testers have used the phrase ‘life-changing’ when describing the product,” Be My Eyes CEO Mike Buckley told TNW.

“This has an opportunity to transform the community with unprecedented resources to better navigate physical environments, meet everyday needs, and gain greater independence.”

Virtual Volunteer is powered by an upgraded model from OpenAI. Unlike previous iterations of the company’s lauded models, GPT-4 is multimodal, meaning it can parse both images and text as inputs.
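To give a concrete sense of what that multimodal input looks like, here is a minimal, illustrative sketch of sending a photo together with a text question through OpenAI’s Chat Completions API. This is not Be My Eyes’ actual integration; the model name, prompt, and image URL are placeholder assumptions.

```python
# Illustrative sketch only: a multimodal (image + text) request via
# OpenAI's Chat Completions API. Not Be My Eyes' integration; the model
# name, question, and image URL are placeholders.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

response = client.chat.completions.create(
    model="gpt-4o",  # any vision-capable GPT-4-class model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "What ingredients do you see, and what could I cook with them?"},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/photo-of-my-fridge.jpg"}},
            ],
        }
    ],
)

# The model's answer comes back as plain text, ready to be read aloud
# by a screen reader or text-to-speech engine.
print(response.choices[0].message.content)
```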

Be My Eyes took the opportunity to test the new functionality. Although image-to-text systems are nothing new, the startup had never been convinced by the software previously on the market.

“From too many bugs to being unable to converse, the tools available on the market weren’t designed to solve many of our community’s needs,” says Buckley.

“The image recognition offered by GPT-4 is superior, and the levels of analysis and conversation powered by OpenAI increase the value and utility exponentially.”

So far, Be My Eyes has only supported users with human volunteers. According to OpenAI, the new feature can generate the same context and understanding. But if the user doesn’t get a good response or simply prefers a human connection, they can still call a volunteer.

The first version of the free app was released in 2015 with the aim of empowering the 253 million people who are blind or partially sighted to live more independently. Credit: Be My Eyes

Despite the promising early results, Buckley insists the free service will be rolled out cautiously. Beta testers and the wider community will play a central role in shaping that process.

Ultimately, Buckley believes the platform will bring both support and opportunity to users. Be My Eyes will also soon help companies better serve their customers by prioritizing accessibility.

“It’s safe to say that the technology could not only make people who are blind or partially sighted more empowered, but also give the community a platform to share even more of their talents with the rest of the world,” says Buckley. “To me, that’s an incredibly compelling possibility.”

If you or someone you know is visually impaired and would like to try the Virtual Volunteer, you can sign up for the waiting list here.
