Have you ever seen an Asian person with a white person, whether it's a mixed-race couple or two friends of different races? It seems pretty common to me: I have a lot of white friends!
For Meta's AI-powered image generator, this is apparently impossible to imagine. I tried dozens of times to create an image using prompts like “Asian man and Caucasian friend,” “Asian man and white wife,” and “Asian woman and Caucasian husband.” Across all of those runs, Meta's image generator returned an accurate image for the prompt I specified only once.
Images: Mia Sato/The Verge
Modifying the text-based prompt didn't seem to help. When I asked for “an Asian man and a white woman smiling with a dog,” Meta's image generator on Instagram gave me three consecutive images of two Asian people. When I changed “white” to “Caucasian,” it did the same thing. “Asian man and Caucasian woman on wedding day” produced an Asian man in a suit and an Asian woman in a traditional-looking garment… except, on closer inspection, the garment appears to be a mix of a qipao and a kimono. Multiculturalism is amazing.
The image generator also struggled when I asked for depictions of platonic relationships, such as “Asian man with Caucasian friend” and “Asian woman and white friend.” Each time, it returned images of two Asian people. When I asked for a photo of an “Asian woman with a Black friend,” the AI-generated image showed two Asian women. Adjusting the prompt to “Asian woman with an African American friend” yielded more accurate results.
Interestingly, the tool performed slightly better when I specified South Asian people. It successfully created an image using the prompt “South Asian man with Caucasian wife,” then immediately generated an image of two South Asian people when I ran the same prompt again. The system also leaned heavily on stereotypes, adding bindis and saris to the South Asian women it created without my asking.
Images: Mia Sato/The Verge
An image generator that cannot conceive of Asian people alongside white people is egregious on its own. But there are also subtler indications of bias in what the system returns by default. For example, I noticed that Meta's tool consistently depicted “Asian women” as East Asian in appearance with light complexions, even though India is the most populous country in the world. It added culturally specific attire even when it wasn't asked to. It generated several older Asian men, but the Asian women it produced were always young.
The one image it did successfully create used the prompt “Asian woman with Caucasian husband” and showed a noticeably older man with a young, light-skinned Asian woman; the age gap discourse writes itself. Immediately afterward, I generated another image using the same prompt, and it again showed an Asian man (also older) with an Asian woman.
Meta did not immediately respond to a request for comment.
Meta introduced its AI image generation tools last year, and its sticker creation tool quickly went off the rails as people created things like nude images and Nintendo characters holding weapons.
AI systems reflect the biases of their creators, their trainers, and the datasets they're trained on. In American media, “Asian” is usually understood to mean a person of East Asian descent, as opposed to people from other parts of the continent; perhaps unsurprisingly, Meta's system assumes all “Asian” people look the same, when in reality we're a diverse group of people who often have little in common beyond checking the same box on the census.
Asian people who don't fit the monolith are essentially erased from cultural consciousness, and even those who do are underrepresented in mainstream media. Asian people are homogenized, exoticized, and relegated to “perpetual foreigner” status. Breaking type is easy in real life but apparently impossible for Meta's AI system. Once again, generative AI, rather than letting the imagination run wild, traps it within a formalization of society's silliest impulses.