
Imagine if you had ChatGPT with you at all times, so that you could pose questions to an AI system as you walk around in your everyday life.

That could soon be a reality, with Meta testing an integrated AI chatbot within its Ray Ban Stories sunglasses, so you can ask questions based on what the cameras can see via the device.

Note: Ostentatious cardigans are optional when wearing your Ray Ban Stories.

As demonstrated by Meta’s CTO Andrew Bosworth, with this new test, those using Ray Ban Stories (U.S. only for now) will be able to say “Hey Meta” and ask questions, which are then answered by Meta’s AI system. The system generates ChatGPT-like responses, supplemented with additional info from Microsoft’s Bing search engine, which could be handy for getting on-the-go insights into anything you like via the device.

So long as you’re fine with looking like a weirdo, chatting to your glasses, phone in hand, then looking at the results on your device.

I mean, you could probably just take a picture on your phone and get the same result, but that’s arguably not as cool as being able to ask a form of digital omnipresence, which can then reply, making you feel like you’re walking around some futuristic spaceship.


And really, this is about future potential, not necessarily immediate value. Sure, it’s interesting to be able to ask your sunglasses what plant you’re looking at, but this is more of a precursor to the next stage, where more people will be using voice commands in everyday life, and AI systems will be able to provide conversational answers, becoming digital companions, of sorts, in a range of ways.

But then again, you can kind of do that already in your home, if you have a home assistant device. And those haven’t proven hugely popular, though youngsters who are growing up with voice commands are likely to become increasingly reliant on conversational queries, which could make this an even more valuable innovation over time.

And if Meta releases a non-sunglass version, maybe even prescription Ray Bans (or similar), there could be additional interest and value there. Eventually, the device will become more integrated, and less reliant on a supplementary screen like your phone, and over time its value and utility will improve, making it more appealing and useful.

So while it’s a novelty now, it’s more interesting to consider where the technology is headed, and what that means for how we’ll interact with these devices in the future.

Right now, it’s a bit weird, and not hugely beneficial, but it’s another step towards the next stage of AR interaction, as Meta continues to build on this front.

U.S. Ray Ban Stories users will be able to test out the new functionality shortly.