Artificial intelligence wearables have had a disastrous year.
Just a few months ago, the tech world was convinced that AI hardware could be the next big thing. It was a bold vision, backed by futuristic demos and sleek hardware. At the center of attention were the Humane AI Pin and the Rabbit R1. Both promised a grand future, but neither delivered on it.
It’s an old story in the gadget world. Smart glasses and augmented reality headsets went through a similar popularity cycle a decade ago. Google Glass promised a future where reality would be overlaid with useful information. In the years since, Magic Leap, Focals by North, Microsoft’s HoloLens, Apple’s Vision Pro and, most recently, the new Snapchat Spectacles have tried to keep that vision alive, but without any real commercial success.
So, all things considered, it's a little ironic that the best shot at a genuinely functional AI-powered wearable is a pair of smart glasses: specifically, the Ray-Ban Meta smart glasses.
The funny thing about the Meta smart glasses is that nobody expected them to be this successful. Partly that's because the first version, the Ray-Ban Stories, was a flop. Partly it's because they didn't offer any new ideas. Bose had already made stylish audio sunglasses and then shut the whole operation down. Snap's Spectacles had already tried recording short videos for social media, and that clearly wasn't enough either. On paper, there was no compelling reason for the Ray-Ban Meta smart glasses to be a hit.
And yet they are. They've succeeded where other AI wearables and smart glasses failed, and, notably, they've even surpassed Meta's own expectations.
Much of this comes down to Meta finally nailing style and execution. Compared to the Stories, the Meta glasses come in a ton of different styles and colors; you're almost guaranteed to find a pair that looks good on you. Meta was smart enough to understand that the average person doesn't want to look like they just stepped out of a sci-fi movie. They want to look cool by today's standards.
At $299, they're pricey, but affordable compared to a $3,500 Vision Pro or a $699 Humane AI Pin. The audio quality is good. Call quality is surprisingly excellent thanks to a well-placed microphone on the nose bridge. Unlike the Stories or earlier Snap Spectacles, the photos and videos are good enough to post on Instagram without feeling embarrassed, especially in the creator era, where Instagram Reels and POV TikToks are the order of the day.
This is a device that slots easily into people's lives right now. There's no waiting on future software updates, and it's not a solution in search of a problem. That, more than anything else, is exactly why the Ray-Bans have a chance of successfully cracking AI.
That’s because AI is already present on the device; it’s just a feature, not the whole deal. You can use it to identify objects you come across or tell you more about a landmark. You can ask Meta AI to write dubious captions for your Instagram post or to translate a menu. You can video call a friend, and they can see what you see. All of these use cases make sense for the device and how you'd actually use it.
In practice, these features are still a bit clunky and inelegant. Meta AI has yet to write me a good Instagram caption, and it often can't hear me well in noisy environments. But unlike the Rabbit R1, it works. Unlike the Humane AI Pin, it doesn't overheat, and there's no latency because it uses your phone for processing. Crucially, unlike either of those devices, if the AI shits the bed, the glasses can still do everything else just fine.
For now, that's enough. From here on out, though, the pressure is on. Meta's bet is that if people get comfortable using simpler smart glasses now, they'll be more comfortable with face computers once AI (and, ultimately, AR) is actually ready.
Meta has proven the first part of that equation, but for the second to become reality, the AI can't just be decent or functional. It has to be genuinely good. It has to make the leap from “Oh, this is pretty convenient when it works” to “I wear smart glasses all day because my life is so much easier with them than without them.” Right now, many of the Meta glasses’ AI features are interesting, but they’re basically party tricks.
It’s a tall order, but of all the players out there right now, Meta seems best positioned to pull it off. Style and wearability aren’t an issue. It just signed a deal with EssilorLuxottica to extend their smart glasses partnership beyond 2030. And now that it has the overall hardware formula down, iterative improvements like better battery life and lighter frames are within reach. All that remains to be seen is whether Meta can deliver on the rest.
Meta will get a chance to prove it can do just that next week at its Meta Connect event. The timing couldn't be better. Humane's daily returns are reportedly outpacing its sales. Critics accuse Rabbit of being little more than a scam. Experts aren't convinced that Apple's big AI-driven “supercycle” with the iPhone 16 is about to materialize. A win here wouldn't just cement Meta's lead; it would help keep the dream of AI-powered hardware alive.