Security company Pindrop analyzed audio from the AI-generated robocall that impersonated President Biden ahead of New Hampshire's primary and concluded it was likely created with ElevenLabs' technology. This is not the first time ElevenLabs' tools have been suspected of being used for political propaganda. The incident highlights the potential for misuse of AI voice-cloning technology, especially as the 2024 election season approaches, and experts warn that authorities, the tech industry, and the public are underprepared for the impact of AI-generated propaganda.
Key takeaways:
- Last week, voters in New Hampshire received an AI-generated robocall impersonating President Biden, advising them not to vote in the state’s primary election. The call was likely created using technology from voice-cloning startup ElevenLabs.
- Two separate teams of audio experts, one at Pindrop and one at UC Berkeley, independently analyzed the audio and concluded with high confidence that it was AI-generated and likely produced with ElevenLabs' technology.
- This is not the first time ElevenLabs' technology has been suspected of being used for political propaganda. Last year, TikTok accounts that spread conspiracy theories using AI-generated voices were also believed to have relied on ElevenLabs' tools.
- The incident highlights the potential for misuse of AI voice-cloning technology, especially as the 2024 election season approaches, and the lack of preparedness among authorities, the tech industry, and the public to deal with such incidents.