Why is AI Pornifying Asian Women?

How societal biases and the Internet have led to scantily clad avatars

Some AI-generated results for an image search for “portrait of an Asian woman.”

Words by Samantha Pak

When Melissa Heikkilä tried Lensa's Magic Avatars, she'd hoped to get results similar to her male colleagues', who saw themselves portrayed as astronauts, fierce warriors, and other flattering figures. Instead, Heikkilä got nude and sexualized versions of herself.

The senior reporter at MIT Technology Review recently wrote about her results with the AI-powered feature, and how her images, as a woman with Asian heritage, compared to her colleagues'.

“Out of 100 avatars I generated, 16 were topless, and in another 14 it had put me in extremely skimpy clothes and overtly sexualized poses,” Heikkilä wrote.

And while one of Heikkilä’s female colleagues with Chinese heritage got similar results, a white female colleague received “significantly fewer sexualized images, with only a couple of nudes and hints of cleavage.”

Writer Mia Mercado also noticed this hypersexualization of her avatars.

“Although only two of my 13 submitted photos contained the slightest suggestion of cleavage, the app filled in the gaps quite literally,” Mercado, who is half Filipina and half white, wrote in The Cut. “About a quarter of the pictures featured a woman who just barely looked like me with a chest that can only be described as ample.”

Mercado also compared her results to her white husband's, and the difference was night and day: his images were similar to those of Heikkilä's male colleagues. Both women also noted that their avatars looked less like themselves than like generic Asian women, "clearly modeled on anime or video-game characters. Or most likely porn," according to Heikkilä. The men's avatars, meanwhile, were more realistic and closer likenesses.

According to Heikkilä, these results are not surprising: the avatars, which are generated from selfies the user uploads, are created with Stable Diffusion, an open-source AI model that generates images from text prompts. Stable Diffusion was trained on LAION-5B, an open-source data set compiled by scraping images off the Internet.
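For readers curious about the mechanics, here is a minimal sketch of that text-to-image step, using Hugging Face's open-source diffusers library. It's illustrative only: Lensa hasn't published its exact pipeline (Magic Avatars also reportedly fine-tunes the model on each user's selfies), and the model name and settings below are common public defaults, not anything the company has confirmed.

```python
# Minimal sketch: generating an image from a text prompt with Stable
# Diffusion via Hugging Face's open-source "diffusers" library.
# The publicly released weights were trained on images scraped for
# LAION-5B, which is where the biases described above enter the model.
import torch
from diffusers import StableDiffusionPipeline

# Load publicly released Stable Diffusion v1.5 weights (an illustrative
# choice; Lensa has not disclosed which model or version it uses).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# A neutral prompt; what comes back depends entirely on what the
# training data associated with these words.
prompt = "portrait of an Asian woman"
image = pipe(prompt, num_inference_steps=50, guidance_scale=7.5).images[0]
image.save("portrait.png")
```

Tellingly, this public pipeline ships with an optional NSFW "safety checker" that blacks out flagged images, itself an acknowledgment of how much explicit material made it into the training data.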

A data set scraped off the Internet inevitably sweeps up the web's plentiful images of nude and scantily clad women, along with the racist stereotypes that have hypersexualized Asian women for ages, so of course you're going to get skewed results. And as we've seen in recent years, that's a dangerous combination for women in our community.

But we can’t just blame the data. Keep in mind that someone is developing these models—and making the choice to depict male avatars in cool and flattering images and female avatars as half (if not fully) naked.

The sexualization of women is so prevalent that Lensa even addresses the issue on its FAQ page, acknowledging these societal biases and adding that LAION-5B's creators have introduced filters to reduce them (they didn't think to do this initially?). Lensa also states, "intentional use of the app to create explicit content is absolutely prohibited by our Terms of Use and obliges Lensa users to comply with that requirement." But not everyone is going to follow the rules, and how much faith can we put in the company to efficiently and effectively address rulebreakers (especially when it comes to harm against women, specifically BIPOC women)?

Now, there's nothing wrong with exploring and embracing one's sexuality. In fact, we encourage it (check out Anna Lee's sex advice column here on JoySauce)! But one key factor AI tools like Lensa leave out is consent. The technology, as Heikkilä writes, sexualizes women "regardless of whether they want to be depicted that way." This, of course, raises issues of revenge porn and other ways women can be harmed simply by existing.

Consent is sexy. What Lensa is doing, and has the potential to do, is not.

So while it may be fun to try out new technologies and see what they can do, we should also proceed with caution.

Published on December 16, 2022

Samantha Pak (she/her) is an award-winning Cambodian American journalist from the Seattle area and assistant editor for JoySauce. She spends more time than she'll admit shopping for books rather than actually reading them, and has made it her mission to show others how amazing Southeast Asian people are. Follow her on Twitter at @iam_sammi and on Instagram at @sammi.pak.