When AI-powered visual search first debuted on the Ray-Ban Meta smart glasses last year, it arrived with some impressive (and worrying) capabilities. A new feature in the latest beta, however, looks genuinely useful: as Meta CTO Andrew Bosworth noted in a Threads post, it can act as a kind of tour guide for travelers, identifying landmarks in different locations and providing more information about them.
Bosworth shared a few sample photos explaining the history of San Francisco's "painted ladies" houses, why the Golden Gate Bridge is orange (it's easier to see in fog), and other topics. In those examples, the text descriptions appeared beneath the photos.
Not to be outdone, Mark Zuckerberg showed off the new capability on Instagram with a few videos shot in Montana. This time, the glasses used audio to give a verbal description of Big Sky Mountain and the history of the Roosevelt Arch, while explaining (like a caveman) how snow is formed.
Meta previewed the feature at its Connect event last year as part of new "multimodal" capabilities that let the glasses answer questions based on your surroundings. That, in turn, became possible when all of Meta's smart glasses gained access to real-time information (rather than being limited to a 2022 knowledge cutoff, as before), powered in part by Bing search.
The tour-guide function is part of Meta's Google Lens-like visual search, which lets users "show" the AI whatever they're seeing through the glasses and ask questions about it, such as identifying fruit or translating foreign text. It's available to anyone in Meta's early access program, which is still limited in numbers. "For those who still don't have access to the beta, you can add yourself to the waitlist while we work to make this available to more people," Bosworth said in the post.