
FTC seeks to modify rule to combat deepfakes | TechCrunch

Feb 16, 2024 - news.bensbites.co
The Federal Trade Commission (FTC) is planning to revise an existing rule that prohibits the impersonation of businesses and government agencies so that it also covers the impersonation of individual consumers, in response to the growing threat of deepfakes. The updated rule may also make it unlawful for a GenAI platform to offer goods or services that it knows or has reason to know are being used to harm consumers through impersonation. FTC chair Lina Khan highlighted the rise of AI-driven scams, such as voice cloning, and the need to protect Americans from such fraud.

The issue of deepfakes extends beyond high-profile individuals; fraudsters also use them in online romance scams and corporate fraud. Surveys indicate widespread concern about the spread of misleading deepfakes and their potential impact on the 2024 U.S. election. While there is currently no federal law specifically banning deepfakes, some states have enacted laws criminalizing them, primarily in relation to non-consensual pornography. These laws are expected to be expanded to cover a broader range of deepfakes as the technology becomes more advanced.

Key takeaways:

  • The FTC is looking to modify an existing rule to cover all consumers against the threat of deepfakes and impersonation, potentially making it unlawful for AI platforms to provide services that they know are being used to harm consumers.
  • Online scams involving deepfakes are on the rise, with fraudsters using AI tools to impersonate individuals and employees to extract money.
  • Surveys show a high level of concern among Americans about the spread of misleading deepfakes, with many believing AI tools will increase the spread of false information during the 2024 U.S. election cycle.
  • While no federal law specifically bans deepfakes, ten states have enacted statutes criminalizing them, and more state-level laws are expected as deepfake-generating tools become more sophisticated.