What AI thinks a beautiful woman looks like: Mostly white and thin

Jun 03, 2024 - washingtonpost.com
The Washington Post analyzed how AI-generated images from three leading models - DALL-E, Midjourney, and Stable Diffusion - depict female beauty, finding a narrow and biased representation. The AI models overwhelmingly generated images of thin women with light skin tones when prompted to show a "beautiful woman" or "normal woman". Only 2% of the images showed signs of aging and just 9% had dark skin tones. The AI tools also struggled to accurately depict women with wide noses or single-fold eyelids, common in people of Asian descent.

The article highlights concerns that these AI tools could reinforce harmful beauty standards and contribute to body image distress, and it discusses the technical challenges and costs of addressing these biases. The biases stem largely from the models' training data, which is often scraped from the internet and heavily weighted toward the perspectives of people in the U.S. and Europe. The article concludes by noting the growing use of these tools in industries such as advertising and the risk that they could undo progress on depicting diversity in popular culture.

Key takeaways:

  • AI-generated images are shaping cultural norms, particularly in the depiction of female beauty. The Washington Post found that leading image tools like DALL-E, Midjourney, and Stable Diffusion often generate images of thin, light-skinned women, reinforcing narrow standards of attractiveness.
  • AI artist Abran Maldonado criticized these tools for their lack of diversity, stating that derogatory terms were needed to generate images of Black women with larger bodies. He expressed concern that the increasing commercial use of these tools could reverse progress in depicting diversity in popular culture.
  • OpenAI, the maker of DALL-E, acknowledged the tool's bias towards "stereotypical and conventional ideals of beauty" and warned that it could reinforce harmful views on body image. The company is working to address these biases, but faces technical challenges in diversifying gender norms and body types.
  • The data used to train these AI models is often scraped from the internet, which introduces bias into the generated images. The training data excludes material from China and India, the two largest populations of internet users, leaving it heavily weighted toward the perspectives of people in the U.S. and Europe.
