In a sign that the tech industry is getting weirder, Meta soon plans to release a major update that transforms the Ray-Ban Meta, its video-recording camera glasses, into a device seen only in sci-fi movies.
Next month, the glasses will be able to use new A.I. software to see the real world and describe what you're looking at, similar to the A.I. assistant in the movie “Her.”
The glasses, which come in various frames starting at $300 and lenses starting at $17, have been used primarily for taking photos and videos and listening to music. But with new artificial intelligence software, they can be used to scan famous landmarks, translate languages, and identify animal breeds and exotic fruits, among other tasks.
To use the A.I. software, users simply say, “Hey, Meta,” followed by a message like “Look and tell me what kind of dog this is.” The A.I. then responds with a computer-generated voice that plays through the glasses' small speakers.
The concept of A.I. software that can see and describe the world is so novel and peculiar that when we found out about it, we were dying to try it. Meta gave us early access to the update, and we tested the technology over the past few weeks.
We took the glasses to the zoo, grocery stores, and a museum while peppering the A.I. with questions and requests.
The result: We were simultaneously entertained by the virtual assistant's mistakes (for example, mistaking a monkey for a giraffe) and impressed when it performed useful tasks like determining whether a package of cookies was gluten-free.
A Meta spokesperson said that because the technology was still new, the artificial intelligence would not always get things right and that feedback would improve the glasses over time.
Meta's software also created transcripts of our questions and the A.I.'s answers, which we captured in screenshots. Here are the highlights of our month of coexistence with the Meta assistant.
Pets
BRIAN: Naturally, the first thing I had to try out with Meta's A.I. was my corgi, Max. I looked at the chubby dog and asked, “Hey, Meta, what am I looking at?”
“A cute Corgi dog sitting on the floor with his tongue out,” the assistant said. Right, especially the cute part.
MIGUEL: Meta's A.I. correctly recognized my dog, Bruna, as a “black and brown Bernese mountain dog.” I half expected the A.I. software to think she was a bear, the animal her neighbors most often mistake her for.
Zoo animals
BRIAN: After the A.I. correctly identified my dog, the next logical step was to test it on zoo animals. So I recently visited the Oakland Zoo in California, where, for two hours, I observed about a dozen animals, including parrots, turtles, monkeys, and zebras. I said: “Hey, Meta, look and tell me what kind of animal that is.”
The A.I. was wrong the vast majority of the time, in part because many animals were caged and farther away. It mistook a primate for a giraffe, a duck for a turtle and a meerkat for a giant panda, among other mix-ups. On the other hand, I was impressed when the A.I. correctly identified a species of parrot known as a blue-and-gold macaw, as well as zebras.
The strangest part of this experiment was talking to an A.I. assistant around children and their parents. They pretended not to hear the lone adult in the park as I apparently muttered to myself.
Food
MIGUEL: I also had a peculiar time shopping. Being inside a Safeway and talking to myself was a little embarrassing, so I tried to keep my voice down. I still got a few sideways glances.
When Meta's A.I. worked, it was lovely. I picked up a strange-looking package of Oreos and asked it to look at the package and tell me whether they were gluten-free. (They weren't.) It answered questions like these correctly about half the time, although I can't say it saved any time compared with reading the label.
But the only reason I put on these glasses was to start my own Instagram cooking show – a flattering way of saying that I record myself preparing the week's meal while talking to myself. The glasses made doing so much easier than holding a phone in one hand.
The A.I. assistant can also offer help in the kitchen. If I need to know how many teaspoons are in a tablespoon and my hands are covered in olive oil, for example, I can ask it to tell me. (There are three teaspoons in a tablespoon, just FYI.)
But when I asked the A.I. to look at a handful of ingredients I had and create a recipe, it spat out quick instructions for an egg cream, which isn't exactly helpful for following directions at my own pace.
A handful of examples to choose from might have been more useful, but that might require tweaks to the UI and maybe even a screen inside my glasses.
A Meta spokesperson said users could ask follow-up questions to get more accurate and helpful answers from their assistant.
BRIAN: I went to the supermarket and bought the most exotic fruit I could find: a cherimoya, a scaly green fruit that looks like a dinosaur egg. When I gave Meta's A.I. multiple chances to identify it, it made a different guess each time: a chocolate-covered nut, a stone fruit, an apple, and finally a durian, which was close, but no banana.
Monuments and Museums
MIGUEL: The new software's ability to recognize landmarks and monuments appeared to be working. Looking down a block in downtown San Francisco toward a towering dome, Meta's A.I. correctly answered, “City Hall.” It's a good trick, and perhaps useful if you're a tourist.
Other times, it was unpredictable. While driving from the city to my home in Oakland, I asked Meta which bridge I was on while looking out the window in front of me (with both hands on the wheel, of course). The first answer was the Golden Gate Bridge, which was wrong. On the second try, it figured out I was on the Bay Bridge, which made me wonder whether it just needed a clearer shot of the newer portion's tall, white suspension posts to get it right.
BRIAN: I visited the San Francisco Museum of Modern Art to see if Meta's A.I. could do the job of a tour guide. After taking photographs of about two dozen paintings and asking the assistant to tell me about the artwork I was looking at, the A.I. could describe the images and what mediums were used to compose the art, which would be nice for an art history student, but it could not identify the artist or the title. (A Meta spokesperson said that another software update released after my visit to the museum improved this capability.)
After the update, I tried looking at images on my computer screen of more famous works of art, including the Mona Lisa, and the A.I. correctly identified them.
Languages
BRIAN: At a Chinese restaurant, I pointed at a menu item written in Chinese and asked Meta to translate it into English, but the A.I. said it currently supported only English, Spanish, Italian, French, and German. (I was surprised, because Mark Zuckerberg learned Mandarin.)
MIGUEL: It did a pretty good job translating a book title from English into German.
Bottom line
Meta's A.I.-powered glasses offer an intriguing glimpse into a future that still seems distant. The flaws underscore the limitations and challenges of designing this type of product. The glasses could probably do better at identifying zoo animals and fruit, for example, if the camera had a higher resolution, but a nicer lens would add bulk. And no matter where we were, it felt awkward to talk to a virtual assistant in public. It's unclear whether that will ever feel normal.
But when it worked, it worked well, and we had fun. The fact that Meta's A.I. can do things like translate languages and identify landmarks through a pair of hip-looking glasses shows just how far the technology has come.