
Google Releases AI Safeguards Ahead of US, Global Elections

Mar 13, 2024 - techtimes.com
Google has announced a series of measures to curb AI-driven disinformation across its products and applications, including labeling AI-generated videos and political ads. The tech giant has also barred its AI chatbot, Gemini, from answering election-related questions in an effort to prevent the spread of election disinformation. The measures are part of Google's strategy for the upcoming US and Indian elections, an approach built on extreme caution. Content produced with Dream Screen and other YouTube generative AI tools will also be labeled, and users will be alerted when material was generated by AI.

In addition, Google has imposed restrictions on Gemini, directing users to Google Search for election-related queries because the chatbot is still learning how to answer them; not all election queries are subject to this restriction, however. The move follows Google's decision last month to pull back its AI image-generation tool after a series of problems. The announcement comes as digital platforms prepare for a significant year, with elections in more than 40 nations affecting up to four billion people worldwide.

Key takeaways:

  • Google is implementing policies and safeguards to prevent AI disinformation across its products and applications, including labeling AI-generated videos and political ads created with AI.
  • Google has barred its AI chatbot Gemini from answering election-related questions to prevent the spread of disinformation during the upcoming US and Indian elections.
  • Google is also merging the Maps and Waze teams as part of a cost-cutting effort, and will label content produced with generative AI tools on YouTube.
  • A recent study found that AI chatbots, including Gemini and OpenAI's GPT-4, are providing users with inaccurate election-related information, prompting Google to restrict Gemini's responses to election queries.