AI art and sexualization of women


By Chyung Eun-ju and Joel Cho

The latest trend taking over social media is AI-generated art. Art generated by AI has been around for a couple of years, but its popularity has recently soared as apps like Lensa have climbed to the top of the mobile download charts.

Just as AI art apps have quickly taken over the internet, so have concerns over their consequences. The rise of AI art is also prompting socially relevant and important conversations.

Among the many celebrities partaking in the hot new trend of AI art, Megan Fox posted the "magic avatars" generated by the Lensa app and questioned why her results were so heavily sexualized.

Her question got us wondering whether AI art has ingrained sexist standards. A simple online search shows that many people have noticed a clear tendency for AI art generators to sexualize women.

Digging deeper, we learned that Lensa uses an open-source AI deep learning model called Stable Diffusion, which was trained on LAION-5B, a dataset of image-text pairs collected from a vast number of publicly available sources across the internet, including sites like Pinterest. Notably, LAION publishes the dataset with the following disclaimer:

Our recommendation is therefore to use the dataset for research purposes. Be aware that this large-scale dataset is uncurated. Keep in mind that the uncurated nature of the dataset means that the collected links may lead to strongly discomforting and disturbing content for a human viewer.

So what does the fact that AI art generators sexualize women say about the data out on the internet?

Because AI generators learn from public image databases, which are human-made content, the technology tends to replicate the racist, sexist or otherwise biased patterns it is exposed to. As generative AI is predicted to be the future of creative work, it is becoming a very lucrative business, and managing the sexualized depiction of women is crucial to it.

Several women who have never taken nude photos have found naked pictures of themselves online, and deepfake porn has become a new genre, causing real distress for its subjects. Generative AI is an accessible technology that requires no special training to use ― as with the creation of deepfakes ― so companies must find a way to protect women from becoming victims of sexualization.

A solution could perhaps begin with including more women in the AI community. Computer science is still dominated by men, and women make up only a small minority of machine-learning researchers.

Despite efforts to address this imbalance, the AI industry does not expect the proportion of women to grow drastically.

Sexism seems to follow women from birth, though it varies by geography. Economists Kerwin Kofi Charles, Jonathan Guryan and Jessica Pan describe how women internalize social norms from a young age, which can also affect their willingness to bargain for higher wages. Such internalization could extend to hinder women's entry into the AI industry.

If we cannot fix sexism in humans, how can we change it in AI? AI developers are trying to make AI humanlike, and maybe that is the problem itself.

Beyond the offensive results, the security implications of uploaded pictures are a growing concern. Phone apps are known to collect data from our devices, but most users are unaware of how much. Although Lensa states that users' photos are deleted from its servers after the avatars are generated, those photos and videos can still be used to further train Lensa's algorithm; so we do not really know how our data is being used.

In the age of AI, your home security video, your words and your personal photos are all being used without your "real" consent to coach AI. For good or ill, AI is built from pieces of us ― or rather, from the information we present to the internet. The worst-case scenario is that we lose control of how our data is used once we upload it.

If Lensa can create nudes from mere children's headshots as well, then perhaps society is simply not ready for such technology. Do we want to upload a batch of selfies only to be potentially sexualized? I think not.

Technology is developing faster than ethics and laws, and perhaps it is time to slow the speed at which such technology is released to the public ― or at least to wait until developers figure out how to fix these problems.


Chyung Eun-ju (ejchyung@snu.ac.kr) is studying for a master's degree in marketing at Seoul National University. Joel Cho (joelywcho@gmail.com) is a practicing lawyer specializing in IP and digital law.



