Efforts to regulate AI-generated explicit content are also underway in other states and at the federal level. The U.S. Senate has approved a bill that would criminalize nonconsensual sexual imagery, and similar legislation is under consideration in several states. Victims and advocacy groups emphasize how difficult such images are to remove once they have been created and shared. AI law experts, however, caution that the Minnesota bill may be drafted too broadly and could conflict with federal laws that shield platforms from liability for user-generated content. Despite these concerns, proponents argue that regulation is necessary to hold tech companies accountable for harmful technologies.
Key takeaways:
- Minnesota is considering legislation to prevent the creation of AI-generated explicit images, targeting companies that operate "nudification" sites and apps.
- The proposed bill would impose civil penalties on operators who fail to block access to these services within Minnesota, with fines of up to $500,000 per unlawful use.
- Similar legislative efforts are underway in other states and at the federal level, with a focus on regulating AI-generated nonconsensual sexual imagery.
- AI law experts caution that the Minnesota bill may face constitutional challenges on free speech grounds and suggest that more precise statutory language is needed.