Uncle Sam will pay for your big ideas to end AI voice-cloning fraud

Jan 05, 2024 - theregister.com
The Federal Trade Commission (FTC) is offering a $25,000 reward for the best solution to combat the threat of AI voice cloning, also known as audio deepfakes. The FTC is concerned about the potential misuse of this technology in cyberattacks, such as impersonating CEOs to instruct finance departments to wire money to attackers, tricking friends and family into sending money, and threatening the livelihoods of performing artists. Submissions, due by January 12, will be evaluated on feasibility, resilience to technological change, and how they assign liability and responsibility.

AI voice cloning has already been used in real-world fraud, including tricking banks into moving money and carrying out "family emergency" scams. The technique works by feeding an AI model enough recordings of a person's voice to capture its nuances. Celebrities and public figures are particularly at risk because so many recordings of their voices are available online, but with the rise of social media and video content, non-celebrities and even children often have enough material online to train a model effectively.

Key takeaways:

  • The FTC is offering a $25,000 reward for the best solution to combat AI voice cloning, also known as audio deepfakes.
  • The FTC is particularly interested in ideas that focus on the prevention, monitoring, and evaluation of AI-based voice cloning fraud.
  • Submissions will be evaluated on feasibility, resilience to technological change, and how they place liability and responsibility on companies.
  • AI voice cloning has already proven effective in several cases, with criminals using it to impersonate individuals and trick others into sending money or revealing sensitive information.