Google still hasn't fixed Gemini's biased image generator | TechCrunch

May 15, 2024 - techcrunch.com
Google's AI-powered chatbot, Gemini, has been under scrutiny for its image generation feature, which users have criticized for historical inaccuracies and racial bias. Despite promises from Google CEO Sundar Pichai and DeepMind co-founder Demis Hassabis that a fix would be implemented quickly, the issue remains unresolved. The problem stems from the data sets used to train Gemini, which contain more images of white people and often reinforce negative stereotypes of non-white individuals. Google's attempt to correct this bias through hardcoding has been unsuccessful, and the company is now struggling to find a solution that avoids repeating these mistakes.

In other news from Google's annual I/O developer conference, the tech giant showcased a variety of Gemini's other features, including custom chatbots, a vacation itinerary planner, and integrations with Google Calendar, Keep, and YouTube Music. However, the image generation feature remains switched off in Gemini apps on both web and mobile platforms. The ongoing issue serves as a reminder of the challenges in addressing bias in AI technology.

Key takeaways:

  • Gemini's ability to generate images of people has been paused since February, following complaints of historical inaccuracies and racial bias in its output.
  • Despite promises of a fix, the issue has not been resolved as of May, with the image generation feature still switched off in Gemini apps on the web and mobile.
  • The problem stems from the data sets used to train image generators like Gemini, which contain more images of white people than people of other races and ethnicities, and often reinforce negative stereotypes.
  • Google is struggling to find a middle path that corrects these biases without resorting to clumsy hardcoding, highlighting the complexity of addressing bias in AI.
