Ray-Ban’s Meta smart glasses get reminders and translation features powered by AI

Meta’s AI assistant has been the most interesting feature of the second-generation Ray-Ban smart glasses. While the assistant had limited capabilities when the glasses were introduced last fall, the addition of real-time information and multimodal capabilities has opened up a range of new possibilities.

Now, Meta is greatly enhancing the AI capabilities of the Ray-Ban Meta smart glasses. The company showed off several new features for the year-old frames at its Connect event, including reminders and live translation.

With Reminders, you’ll be able to look at things around you and ask Meta to remind you about them later. For example: “Hey Meta, remind me to buy that book next Monday.” The glasses will also be able to scan QR codes and call phone numbers written in front of you.

In addition, Meta is adding video support to Meta AI so the glasses can better scan your surroundings and answer questions about what’s around you. There are other subtle improvements, too. Previously, you had to preface a request with “Hey Meta, look and tell me” to get the glasses to respond based on what you were looking at. With the update, Meta AI can answer questions about what’s in front of you using more natural phrasing. In a demo with Meta, I was able to ask questions like “Hey Meta, what am I looking at?” or “Hey Meta, tell me about what I’m looking at” and follow up from there.

When I tested Meta AI’s multimodal capabilities on the glasses last year, I found that it could translate short snippets of text but struggled with anything longer than a few words. Now, Meta AI should be able to translate long chunks of text. And later this year, the company is adding live translation for English, French, Italian, and Spanish, which could make the glasses even more useful as a travel accessory.

And while I haven’t fully tested the new Meta AI capabilities on the smart glasses yet, the assistant already seems to understand real-time information better than it did last year. During a demo with Meta, I asked Meta AI who the Speaker of the House of Representatives is, a question it repeatedly got wrong last year, and it answered correctly on the first try.

