The protagonist of the new Levi’s campaign looks like just another model. Her tousled hair hangs over her shoulders as she gazes into the camera with that faraway haute couture look. But take a closer look, and something starts to seem a little off. The shadow between her chin and neck is indistinct, like a poor attempt to use FaceTune’s eraser effect to hide a double chin. Her French-manicured nails look polished and uniform in a creepy, real-doll fashion.
The model is AI-generated, a digital representation of a human being that will begin appearing on Levi’s e-commerce website later this year. The brand partnered with LaLaLand.ai, a digital studio that creates custom AI models for companies like Calvin Klein and Tommy Hilfiger, to create this avatar.
Amy Gershkoff Bolles, global director of digital strategy and emerging technologies at Levi’s, announced the model’s debut at a Business of Fashion event in March. The AI models won’t completely replace humans, she said, but will serve as a “plug-in” meant to help the brand represent a range of sizes, skin tones and ages.
“When we say plug-in, we mean that AI-generated models can be used in conjunction with human models to potentially expand the number of models per product,” a Levi’s spokesperson said. “We are excited about a world where consumers can see more models on our site, which could reflect any combination of body type, age, size, race and ethnicity, allowing us to create a more personal and inclusive shopping experience.”
Michael Musandu, the founder of LaLaLand.ai, created the software in part because he had a hard time finding models that looked like him. He was born in Zimbabwe, grew up in South Africa, and moved to the Netherlands to study computer science. “Any good technologist, instead of complaining about a problem, will build a future where he could actually have this representation,” Musandu said.
How about just hiring a diverse cast of models? Musandu said that LaLaLand.ai is not meant to “replace” models, but instead allows brands to display different clothing on as many bodies as possible.
“It’s not feasible for brands to shoot nine models for every product they sell, because they’re not just hiring models, they’re hiring photographers, stylists and makeup artists for those models,” he said. AI-generated images don’t need glamor squads, so brands can cut the costs they would otherwise spend on set by using fake avatars.
A Levi’s spokesperson added: “The models Levi’s hires are already diverse and this will continue to be a priority for us. Over the last year, we have focused on ensuring that those working on the content, both in front of and behind the camera, reflect our broad consumer base.”
However, the diversity that AI can provide will always be virtual: a computer-generated sense of inclusion. Do brands that generate, say, Black models for garments that were only photographed on a white human model engage in a kind of digital blackface?
This is not a new question. There are already “digital influencers” like Lil Miquela and Shudu, fake avatars with millions of followers on social media. They model Prada, Dior and Gucci clothes with the idea that their (human) audience will buy the pieces. Neither model is white, but both have at least one white creator (Shudu was created by the British fashion photographer Cameron-James Wilson, and Miquela by Trevor McFedries and Sara Decou).
Criticism of Levi’s for choosing AI models over real ones echoes the wave of response Lil Miquela received when she first appeared in 2016, and when Shudu made her debut two years later. Lauren Michele Jackson of The New Yorker called Shudu “a white man’s digital projection of real-life black womanhood”.
Lil Miquela’s creators also filled her fake life with “events” to try to give her a personality. Calvin Klein apologized for a Pride ad showing Lil Miquela kissing the real-life model Bella Hadid. A few months later, Lil Miquela posted a story about being sexually assaulted in the back of a rideshare, and her followers accused her creators of inventing a traumatic event for clout.
Unlike their mortal counterparts, these models never age. A “19-year-old robot living in Los Angeles,” Miquela is 19 forever, making her a hot commodity in a youth-obsessed industry.
Deep Agency, another Netherlands-based artificial intelligence company, made news this month after debuting its own “AI modeling agency”. The service, which costs $29 a month, bills itself as a way for creators to “say goodbye to traditional photo shoots.” Users write a description of what they want their photo to look like and receive “high quality” photos of fake models in return.
Paid subscribers to the service get access to 12 models of various races, though they all appear to have smaller bodies and to be in their 20s and 30s. Users browse the site’s catalog of existing images, which includes photos of models engaged in activities like reading books or making peace signs at the camera. Those photos serve as inspiration for the final result.
In a photo provided to The Guardian, a model named “Chai” had a disconcertingly plastic face and extra-long, skinny fingers that belonged in a horror movie. Another, “Caitlin”, had a disturbing number of veins sticking out from under the skin of her neck. A male model, “Airik”, looked incredibly stiff and awkward as he posed in front of a drab gray building.
How long will it be before these models take jobs away from real people? Sara Ziff, founder of the advocacy group The Model Alliance, worries about exactly that. “Taking advantage of someone else’s identity to exclude people who are actually Black from hiring could be equated to blackface,” she said.
Ziff’s New York office has a support line where models call to talk about things that have made them feel uncomfortable on set. The topic of conversation lately has been AI, and specifically body scans, which brands can use to create 3D digital replicas of models’ bodies.
“We have received an increasing number of calls from models who, after receiving body scans, discovered that the rights to their body were being assigned to a company, which meant they were losing the rights to their own image,” Ziff said. “We’ve heard this particularly from fit models, who are concerned about how their personal information would be used or capitalized on without their permission.”
Fit models work in the initial process of fashion design. They are essentially human mannequins for creatives, trying on drafts of clothing to see how the garment would look on a real body.
Summer Foley, a 25-year-old model from New York, said it was not uncommon to earn around $400 an hour as a fit model.
“If someone wanted to scan my body, I’d like to charge them every time they use it!” Foley said. “That is my body, and I work hard to maintain these measurements. You can’t scan me and use my likeness in perpetuity without me making money.”
Sinead Bovell has modeled for six years and wrote about AI models for Vogue in 2020. She frequently posts on social media about the ethical dilemmas that arise when companies use models’ bodies to create their images.
Last year, the portrait app Lensa went viral for generating highly stylized portraits of its users. It used Stable Diffusion, a text-to-image model trained to learn patterns from an online database of images. Those photos came from the internet, leading artists to say that Lensa was stealing their work to create the images.
Similarly, brands could train their AI on real-life photos or body scans of human models. But who gets paid when a photo generated from your image appears in the next big ad campaign? “Who would own that data? Where would it live? I’m sure there are ways that you have full rights to that, but since that area of technology is still being worked out, I’d rather not be the guinea pig,” Bovell said.
Musandu, the founder of LaLaLand.ai, said that his algorithm only works on data held by the company. But he agrees that companies should compensate models if they base images on their likeness. “I think if any algorithm has used you in the training set, you should have the rights to license those images,” he said.
It’s easy to remain pessimistic about the long-term effects this will have on fashion and body image. “I can see a future with AI where beauty standards become even less realistic because clothes are literally worn by people who aren’t real,” Bovell said. “If you look at the history of how technology has evolved, with things like selfie filters, it’s not very positive.”
Bovell, who is Black, doesn’t object to someone creating a virtual identity that mirrors her own. But she worries about the ethics of who will ultimately profit from the images of models of color. “I call that robot cultural appropriation,” she said. “The central question is: who has the right to own and speak for the identities that the AI models represent?”