
Google restricts AI chatbot Gemini from answering questions on 2024 elections

Mar 13, 2024 - theguardian.com
Google is restricting its Gemini AI chatbot from answering election-related queries in countries where voting is taking place this year. The decision is a precautionary measure to prevent the spread of AI-generated disinformation and its potential influence on global elections. The company first announced the plans in a blog post last December and recently confirmed the global rollout of the changes. The chatbot now deflects political questions, referring users to Google search instead.

The decision has sparked debate about the reliability of Google's AI tools, with critics questioning their use in other contexts such as health or financial information. The company has also faced backlash over Gemini's image-generation capabilities, particularly its historically inaccurate depictions of people of color, and has suspended some of those capabilities in response. The incident highlights the growing scrutiny of major AI firms as they struggle to navigate sensitive topics without provoking a public relations backlash.

Key takeaways:

  • Google is restricting its Gemini AI chatbot from answering election-related questions in countries where voting is taking place this year, in order to prevent the spread of misinformation.
  • The company is implementing features like digital watermarking and content labels for AI-generated content to combat the spread of false information.
  • Google's decision to restrict Gemini has raised questions about the overall accuracy of the company’s AI tools, particularly in other contexts such as health or financial information.
  • Gemini recently faced backlash over its image-generation capabilities, producing historically inaccurate images of people of color in response to prompts about historical situations, leading Google to suspend some of Gemini's capabilities.
