Meta’s Ray-Ban branded smart glasses are getting AI-powered reminders and translation features

Meta’s AI assistant has always been the most intriguing feature of its second-generation Ray-Ban smart glasses. While the generative AI assistant had fairly limited capabilities when the glasses launched last fall, the addition of real-time information and multimodal capabilities offered a range of new possibilities for the accessory.

Now, Meta is significantly upgrading the Ray-Ban Meta smart glasses’ AI powers. The company showed off a number of new abilities for the year-old frames onstage at its Connect event, including reminders and live translations.

With reminders, you’ll be able to look at items in your surroundings and ask Meta AI to remind you about them later. For example: “Hey Meta, remind me to buy that book next Monday.” The glasses will also be able to scan QR codes and call a phone number that’s written in front of you.

In addition, Meta is adding video support to Meta AI so that the glasses will be better able to scan your surroundings and respond to queries about what’s around you. There are other, more subtle improvements as well. Previously, you had to start a command with “Hey Meta, look and tell me” to get the glasses to respond based on what you were looking at. With the update, Meta AI will handle more natural phrasing. In a demo with Meta, I was able to ask questions and follow-ups like “Hey Meta, what am I looking at?” or “Hey Meta, tell me about what I’m looking at.”

When I tried out Meta AI’s multimodal capabilities on the glasses last year, I found that Meta AI was able to translate some snippets of text but struggled with anything more than a few words. Now, Meta AI should be able to translate longer chunks of text. And later this year the company is adding live translation abilities for English, French, Italian and Spanish, which could make the glasses even more useful as a travel accessory.

And while I still haven’t fully tested Meta AI’s new capabilities on the smart glasses, it already seems to have a better grasp of real-time information than what I found last year. During a demo with Meta, I asked Meta AI who the Speaker of the House of Representatives is — a question it repeatedly got wrong last year — and it answered correctly the first time.

Catch up on all the news from Meta Connect 2024!


