Here's the scene: I'm wearing Meta's latest Ray-Ban glasses with a new feature called Live AI, which can answer questions about the world around you. I'm getting ready for a four-hour road trip to my in-laws' for the Christmas holidays. I'm planning breakfast for the next day because I'm 99.9 percent sure I won't have the brain cells to come up with an edible meal at 5AM. I don't even know if I have anything to make a meal with. I open the refrigerator door and say, "Hey Meta, start Live AI." Suddenly, John Cena's voice is in my ear telling me that a Live AI session has started.
"What breakfast can I make with the ingredients in my refrigerator?" I ask.
The inside is a sad sight: a month's worth of Thanksgiving leftovers, a carton of eggs, soda, condiments, a tub of Greek yogurt, and a large jug of maple syrup. Meta AI-as-John-Cena responds that I can make a "variety of breakfast dishes," such as "scrambled eggs, omelettes, or yogurt parfaits."
To be clear, there isn't a single piece of fresh fruit for a parfait. The egg carton has two eggs. My spouse put an empty milk carton back in the refrigerator, which means the scrambled eggs and omelettes are out, too. My stomach rumbles, reminding me that I skipped lunch. I give up on the idea of breakfast and instead open the freezer door and ask what kind of dinner I could make with the ingredients inside. It's mostly a bunch of frozen pizzas, a variety of frozen vegetables, and hamburger buns. I'm told: "frozen meals, stir-fries, and stews."
I decide to order in for dinner. Breakfast will have to happen on the road.
This is the problem with Live AI. Most of the time, I don't know when to use it. And when I do, the answers I get are too obvious to be useful.
Photo by Amelia Holowaty Krales / The Verge
Live AI lets you talk to an AI assistant the way you would a friend. While similar in function to the glasses' multimodal AI feature, you don't have to constantly prompt the AI. It (supposedly) knows when you're talking to it. You can also string together multiple queries and follow-up questions. If you were in a cooking class and something looked a little off, you'd flag down the instructor, and they'd look at the mess in your pan and tell you what you did wrong and how to fix it. This is meant to be a version of that, but with a disembodied AI that lives in your glasses. It sees what you see and can help you in real time.
It's a great concept. But I was stumped when it came time to use Live AI without guardrails. Every time a question popped into my head, I automatically reached for my phone. That's what I've been trained to do for over a decade. The first and biggest hurdle to using Live AI was remembering that it was an option.
The second problem was knowing when Live AI could be more useful than a quick Google search. Meta suggested I try scenarios involving fashion and cooking. I've already told you how my cooking consultations went. So I asked the AI what color combinations I should try with a set of multicolored pastel press-on nails.
The AI suggested that a "pastel color combination" would "complement pink nails well." I asked it which of the books on my shelf I should read. The AI reminded me that it "has no personal preferences or opinions," but that I should "read a book that [I'm] interested in or one that [I've] been meaning to read for a while." Dissatisfied, I asked which of the books was most acclaimed. It suggested I look that up online. I tried a few more scenarios and asked myself: why would I talk to an AI if all it does is restate the obvious and tell me to Google things myself?
The most useful experience I had with Live AI was when I asked how to zhuzh up my home office. At first, I got another milquetoast response: add artwork and plants, and rearrange the furniture to create a cozier atmosphere. Annoyed, I asked it what kind of artwork would look nice. Again, it told me that "a variety of artwork" could look good "depending on [my] personal style." Had I considered adding posters, prints, or paintings that reflected my interests or hobbies? I wanted to scream, but instead, I asked what style of poster would look good based on what was currently in the room. To that, I got my first somewhat useful response: a colorful, playful poster with a fun design or a cute character that would complement the stuffed animals in the room. I asked for artists to consider. It suggested Lisa Congdon, Camille Rose Garcia, and Jen Corace for their "playful, whimsical styles."
And herein lies the biggest recurring problem I have with AI: you have to know how to ask the right questions to get the answer you want.
I could have saved myself some pain if I had said to Meta AI, "I want to hang artwork in my room. Based on what's currently here, what artists should I consider?" This ability comes naturally to some people. My spouse is a genius at prompting AI. But for the rest of us, it's a skill that has to be learned, and right now, few people are teaching us AI newbies how to rewire our brains to better use this technology.
Photo by Amelia Holowaty Krales / The Verge
After Googling the artists Meta AI suggested, I was back at square one. I liked their art, but none of it felt like my style. I relayed the experience to my best friend, who rolled her eyes and quickly sent me three artists on Instagram. I loved them all. Pointedly, she said I should have asked her and not bothered with a bot. Because, unlike Meta AI, she said, she actually knows me.
Live AI has other problems beyond the philosophical ones. It struggles to differentiate between when you're talking to it and when you're talking to someone else in the room. At one point, it outright lied and said it had seen me feeding my cat when I hadn't. (My spouse had confused it by saying that they had fed the cats.) It also only works in 30-minute windows before the battery dies. That means you have to be intentional about how you use it, which is difficult when there are so few obvious use cases.
I'm not against Live AI. The overall vision is that we'll all be like Tony Stark, wearing cool glasses with our own Jarvis inside them. When you're being hand-held through controlled demos, that future feels inevitable and magical. It's just that the fantasy starts to break down when you're left to explore on your own. And once that happens, nine times out of 10, you'll reach for your phone.