AI-Generated Deepfakes Raise Concerns Ahead of 2024 US Presidential Election

Oct 05, 2023 - techtimes.com
Democratic lawmakers Sen. Amy Klobuchar and Rep. Yvette Clarke have written to Meta CEO Mark Zuckerberg and X CEO Linda Yaccarino, expressing concerns over AI-generated political ads and urging transparency and accountability. The legislators are worried about the potential for such ads to spread election-related disinformation ahead of the 2024 US presidential elections. Clarke has also proposed the DEEPFAKES Accountability Act, aimed at protecting people from misleading digital information, including deepfakes.

Klobuchar is also working on companion legislation in the Senate, and tech companies face growing scrutiny over whether they can address the issue while respecting free speech. Both Democratic and Republican campaigns are already using AI-generated ads, raising concerns about their potential to mislead voters. The Federal Election Commission has begun a process to regulate AI-generated deepfakes in political ads, with a public comment period open until October 16.

Key takeaways:

  • Democratic members of Congress, Sen. Amy Klobuchar and Rep. Yvette Clarke, have expressed concerns about AI-generated political ads and have written to Meta CEO Mark Zuckerberg and X CEO Linda Yaccarino, urging transparency and accountability.
  • Rep. Yvette Clarke has introduced the DEEPFAKES Accountability Act, legislation intended to protect people against misleading digital information such as deepfakes.
  • Klobuchar is proposing companion legislation in the US Senate, and she and Republican Sen. Josh Hawley are among the co-sponsors of a bipartisan Senate measure prohibiting "materially deceptive" deepfakes about federal candidates.
  • The ability of tech companies to address this emerging problem while upholding free speech principles is coming under greater scrutiny, with some experts doubting their readiness to tackle election-related deepfakes.