The study found that only one of the six services tested, ElevenLabs, blocked the creation of voice clones, citing its policy against replicating public figures. The other five services, including Invideo AI, Veed, and Descript, produced the requested audio even when the statements were clearly misleading or false. The study warns that unless these platforms enforce their own policies, voice cloning could surge during the upcoming election season.
Key takeaways:
- The 2024 election could see the use of faked audio and video of candidates, with AI-powered voice cloning services generating false statements in the voices of political figures.
- A study by the Center for Countering Digital Hate found that out of 240 requests to six AI voice cloning services, 193 resulted in the generation of convincing fake audio.
- Only one service, ElevenLabs, blocked the creation of a voice clone, citing policies against replicating public figures. The other services either had no restrictions or their safety measures were easily circumvented.
- Invideo AI was highlighted as the worst offender: it not only failed to block any of the recordings but also generated an improved script for a fake President Biden voice clone warning of bomb threats at polling stations.