The advocates also touched on copyright infringement, particularly in relation to AI voice cloning and the use of public posts to train AI models. Zhang, founder of the artist-forward social platform Cara, argued that just because artwork is available online doesn't mean it's free to use, and emphasized the need for licensing agreements when artists' work is used. Pedraszewska, head of safety at AI voice cloning company ElevenLabs, stressed the importance of understanding the undesirable behaviors and unintended consequences of AI technology.
Key takeaways:
- AI safety advocates are urging startup founders to move cautiously and consider the ethical implications of their products, as the rapid development and deployment of AI technologies can lead to serious issues.
- The family of a child who died by suicide has sued chatbot company Character.AI, highlighting the potential dangers of AI technologies and the need for careful content moderation.
- Artists could be negatively impacted by AI technologies: their online work can be used to train AI models without their consent, potentially putting them out of work.
- Aleksandra Pedraszewska, head of safety at AI voice cloning company ElevenLabs, emphasizes the importance of understanding the undesirable behaviors and unintended consequences of new AI products, and advocates a balanced approach to AI regulation.