The COPIED Act would also direct the National Institute of Standards and Technology (NIST) to create guidelines and standards for content provenance information, watermarking, and synthetic content detection. Content owners would have the right to sue platforms that use their content without permission or tamper with content provenance information. The bill is backed by several artists' and media groups, including SAG-AFTRA, the National Music Publishers' Association, The Seattle Times, the Songwriters Guild of America, and the Artist Rights Alliance. The introduction of the COPIED Act comes amid a surge of AI-related bills as lawmakers seek to regulate the technology.
Key takeaways:
- A bipartisan group of senators introduced the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), a bill aimed at protecting artists, songwriters, and journalists from having their content used to train AI models or generate AI content without their consent.
- The bill would require AI tool developers to allow users to attach content provenance information to their content within two years, and it would give content owners the right to sue platforms that use their content without permission or tamper with content provenance information.
- The National Institute of Standards and Technology (NIST) would be required to create guidelines and standards for content provenance information, watermarking, and synthetic content detection under the COPIED Act.
- The bill is supported by several artists' and media groups and arrives amid an influx of AI-related bills as lawmakers seek to regulate the technology, with state legislatures introducing 50 AI-related bills per week, according to an Axios report.