OpenAI peels back ChatGPT's safeguards around image creation | TechCrunch

Mar 28, 2025 - techcrunch.com
OpenAI has launched a new image generator in ChatGPT with improved picture editing, text rendering, and spatial representation. Notably, OpenAI has updated its content moderation policies, allowing ChatGPT to generate images of public figures and certain controversial symbols in specific contexts. This shift from blanket refusals to a more nuanced approach aims to prevent real-world harm while giving users more control. The adjustments are part of OpenAI's broader strategy to "uncensor" ChatGPT, enabling it to handle more diverse requests and perspectives.

The changes have sparked discussion around AI content moderation, with OpenAI emphasizing that the updates are not politically motivated but reflect a commitment to user control. Despite relaxing some guardrails, OpenAI maintains safeguards against misuse, particularly concerning images of children. The company's approach aligns with similar policy shifts by other tech giants, though it remains to be seen how these changes will affect broader AI content moderation debates.

Key takeaways:

  • OpenAI launched a new image generator in ChatGPT that can create Studio Ghibli-style images and has improved capabilities in picture editing, text rendering, and spatial representation.
  • OpenAI has updated its content moderation policies to allow ChatGPT to generate images of public figures, hateful symbols, and racial features upon request, with a focus on preventing real-world harm.
  • ChatGPT's new image generator can mimic the styles of creative studios like Pixar or Studio Ghibli but still restricts imitating individual living artists' styles.
  • OpenAI's content moderation changes are part of a broader trend among tech companies to relax guardrails around controversial topics, amidst potential regulatory scrutiny and political pressure.