This robotic dog talks with ChatGPT magic and guides the visually impaired

Robot dogs aren’t a new innovation, but one that can talk back to you sounds like science fiction. Researchers at Binghamton University say they have already built one, and it is meant to help the blind.

The team from the university describes an AI-powered robotic guide dog system designed to help visually impaired users navigate indoor spaces while also communicating with them during the journey. The big twist is that it uses large language models (LLMs), specifically GPT-4, to make the robot more conversational and responsive than a traditional guide dog could be.

How does the AI guide dog work?

According to Binghamton University, the system was developed by Shiqi Zhang, an associate professor in the School of Computing, and his team. Zhang stated that the project shows how robotic guide dogs can go beyond the limits of actual guide dogs, which can only understand a small set of commands.

Using GPT-4 with voice commands, the AI-powered robot dog gains much stronger conversational capabilities. The setup isn’t just about getting the user from one point to another. Before the trip even begins, the robot can describe the possible routes and estimated travel times. During the journey, it offers what the researchers call “scene verbalization,” giving real-time spoken feedback about the environment and obstacles ahead.

In one example shared in the report, the AI guide dog might say something like “this is a long corridor” while guiding the user to a conference room.

It’s already being tested with blind participants

To evaluate the system, the researchers recruited seven legally blind participants and had them navigate a large, multi-room office environment. The participants then completed a questionnaire rating the system’s helpfulness, usefulness, and ease of communication. The results? Users preferred the combined approach of route-planning explanations before setting off along with live narration during the trip.

It isn’t just about going from point A to point B — it is about giving users more situational awareness and more control over how they move through a space. Much like AI being used to find lost pets, this is one of the more positive stories about the technology.

