All over Meta's campus in Menlo Park, cameras were staring at me. I'm not talking about the security cameras or the DSLRs of my fellow reporters. I'm not even talking about smartphones. I'm talking about Ray-Ban Meta smart glasses, which Meta hopes we'll all, someday, somehow, wear.
I visited Meta for this year's Connect conference, where almost every hardware product involved cameras. They're in the Ray-Ban Meta smart glasses that received a software update, the new Quest 3S virtual reality headset, and Meta's prototype Orion AR glasses. Orion is what Meta calls a “time machine”: a working example of what full-blown AR could look like, years before it's ready for consumers.
But at least on the Meta campus, Ray-Bans were already everywhere. It was a different kind of time machine: a glimpse into CEO Mark Zuckerberg's future world, where glasses are the new phones.
I'm conflicted about it.
Meta really wants to put cameras on your face. The glasses, which follow 2021's Ray-Ban Stories, are apparently making progress on that front; as Zuckerberg told The Verge, sales are going “very well.” They're not full AR glasses, since they have no screen to display information, although they're becoming more powerful with AI features. But they're perfect for what the entire Meta empire is built on: encouraging people to share their lives online.
The glasses come in a variety of classic Ray-Ban styles, but for now, it's obvious when wearers aren't just wearing ordinary glasses. As I wandered around campus, I saw the telltale signs on person after person: two prominent circular cutouts at the edges of their frames, one for a 12MP ultra-wide camera and the other for an indicator light.
This light flashes when a user is taking photos and videos, and it's usually visible even in sunlight. In theory, that should have calmed me down: if the light wasn't on, I could trust that no one was recording footage of me eating lunch before my meetings.
But when I talked to people on campus, I was always a little nervous. I found myself very aware of those circles, checking to see if someone was filming me when I wasn't paying attention. The mere potential of a recording would distract me from conversations, inserting a faint murmur of anxiety in the background.
Then, when I put on a pair, the situation suddenly changed. As a potential recording target, I had been hesitant, worried that I would be photographed or filmed as a consequence of making polite eye contact. With the glasses on my own face, though, I felt like I should be recording more. There's something genuinely compelling about having a camera right at eye level. With the push of a button on the glasses, I could take a photo or video of whatever I was looking at, from exactly the angle I was viewing it. I didn't have to clumsily pull out my phone and hope the moment lasted. Perhaps there is no better way to share my reality with other people.
Meta's smart glasses have been around for a few years, and I'm not the first person, or even the first at The Verge, to be impressed by them. But this was the first time I saw these glasses not as an early-adopter gadget but as a ubiquitous product like a phone or a smartwatch. I got a glimpse of how this kind of recording would work at scale, and the prospect is both exciting and terrifying.
The camera phone was a revolution in itself and we are still dealing with its social effects. Almost anyone can now document police brutality or capture a fleeting, funny moment, but also take gruesome photographs and post them online or (a much lesser crime, to be clear) bother people at concerts. What will happen when even the minimal friction of pulling out a phone disappears and billions of people can immediately take a photo of anything they see?
Personally, I can see how incredibly useful this would be for capturing candid photos of my new baby, who is already starting to recognize when a phone is taking his picture. But it's not hard to imagine far more malicious uses. Sure, you'd think we're all used to everyone pointing their phone cameras at everything, but I'm not exactly sure that's a good thing; I don't like that there's a chance I'll end up in someone's TikTok just because I left the house. (The rise of sophisticated facial recognition makes the risks even greater.) With ubiquitous camera-equipped glasses, I feel there's an even greater chance that my face will appear somewhere on the internet without my permission.
There are also clear risks to integrating cameras into what, for many people, is a non-negotiable visual aid. If you already wear glasses and switch to prescription smart glasses, you'll either have to carry a low-tech backup pair or accept that the cameras stay on your face in some potentially very uncomfortable places, like a public bathroom. The current Ray-Ban Meta glasses are largely sunglasses, so they're probably not the primary pair for most people. But you can get them with clear or transition lenses, and I bet Meta would love to market them as everyday specs.
Of course, there's no guarantee that most people will buy them. Ray-Ban Meta glasses are pretty cool devices, but I was on the Meta campus meeting with Meta employees to preview Meta hardware for a Meta event. It's not surprising that the latest Meta hardware was everywhere, and that doesn't necessarily tell us much about what people want outside of that world.
Camera glasses have been on the horizon for years. Remember how magical I said it is to take pictures of what's right in front of your eyes? My former colleague Sean O'Kane recounted almost exactly the same experience with Snap Spectacles back in 2016.
But Meta is the first company to make a credible play for widespread adoption. The glasses are a lot of fun, and that's what scares me a little.