The Ray-Ban Meta glasses are the first real artificial-intelligence wearable success story. They are, in fact, quite good. They’ve got that chic Ray-Ban styling, meaning they don’t look as goofy as some of the bulkier, heavier attempts at mixed-reality face computers. The onboard AI agent can answer questions and even identify what you’re looking at using the embedded cameras. People also love using voice commands to capture photos and videos of whatever is right in front of them without whipping out their phone.
Soon, Meta’s smart glasses are getting some more of these AI-powered voice features. Meta CEO Mark Zuckerberg announced the newest updates to the smart glasses’ software at his company’s Meta Connect event today.
“The reality is that most of the time you’re not using smart functionality, so people want to have something on their face that they’re proud of and that looks good and that’s, you know, designed in a really nice way,” Zuckerberg said at Connect. “So they’re great glasses. We keep updating the software and building out the ecosystem, and they keep on getting smarter and capable of more things.”
The company also used Connect to announce its new Meta Quest 3S, a more budget-friendly version of its mixed-reality headsets. It also unveiled a host of other AI capabilities across its various platforms, with new features being added to its Meta AI and Llama large language models.
As far as the Ray-Bans go, Meta isn’t doing too much to mess with a good thing. The smart spectacles got an infusion of AI tech earlier this year, and now Meta is adding more capabilities to the pile, though the enhancements here are pretty minimal. You can already ask Meta AI a question and hear its responses directly from the speakers embedded in the frames’ temple pieces. Now there are a few new things you can ask or command it to do.
Probably the most impressive is the ability to set reminders. You can look at something while wearing the glasses and say, “Hey, remind me to buy this book next week,” and the glasses will understand what the book is, then set a reminder. In a week, Meta AI will tell you it’s time to buy that book.
Meta says live transcription services are coming to the glasses soon, meaning people speaking in different languages could see transcribed speech in the moment—or at least in a somewhat timely fashion. It’s not clear exactly how well that will work, given that the Meta glasses’ previous written translation abilities have proven to be hit-or-miss.
Zuckerberg says Meta is also partnering with the Danish-based mobile app Be My Eyes to bring a feature to the Ray-Ban Meta glasses that connects blind and low-vision people with volunteers who can view live video and talk the wearer through what is in front of them.
“I think that not only is this going to be a pretty awesome experience today, but it’s a glimpse of the type of thing that might be more possible with always-on AI,” Zuckerberg said.
Meta is also adding new frame and lens colors, and customers now have the option of transition lenses, which darken or lighten depending on the current level of sunlight.
Meta hasn’t said exactly when these additional AI features will be coming to its Ray-Bans, except that they will arrive sometime this year. With only three months of 2024 left, that means very soon.