
OpenAI’s Sora Is Plagued by Sexist, Racist, and Ableist Biases

Mar 23, 2025 - wired.com
A WIRED investigation has revealed that OpenAI's AI video tool, Sora, perpetuates biases such as sexism, racism, and ableism in its generated content. The study, which analyzed 250 AI-generated videos, found that Sora often depicts stereotypical roles and appearances, such as men in leadership positions and women in caregiving roles, while also struggling to accurately portray race, disability, and body diversity. Despite OpenAI's efforts to reduce bias, the model's outputs reflect and amplify existing social biases, potentially reinforcing stereotypes and deepening the erasure of marginalized groups in commercial and security applications.

The investigation also highlighted Sora's difficulty representing diverse relationships and family dynamics, with the model often defaulting to homogeneous portrayals. Researchers noted a "stock image" aesthetic in Sora's videos, suggesting limitations in its training data or fine-tuning processes. Experts argue that addressing these biases requires more than technical fixes, advocating for greater disciplinary diversity on development teams and real-world testing to understand societal risks. As OpenAI expands Sora's availability, the commercial implications of biased outputs may increase the pressure to tackle these issues.

Key takeaways:

  • AI-generated videos, like those from OpenAI's Sora, exhibit biases, perpetuating sexist, racist, and ableist stereotypes.
  • Sora's model often portrays people in stereotypical roles and appearances, such as men in leadership positions and women in caregiving roles.
  • The system struggles with diversity in relationships and often defaults to portraying people as young, attractive, and able-bodied.
  • Addressing AI bias requires more than technical solutions; it needs interdisciplinary collaboration and diverse perspectives in development and testing.
