The administration suggests that the private sector should disrupt the monetization of image-based sexual abuse, particularly by cutting off payment access to sites that advertise explicit images of minors. It also suggests that cloud service providers and mobile app stores could restrict services and applications used to create or alter sexual images without consent. The administration believes survivors should have an easier time getting online platforms to remove such images, whether AI-generated or real.
Key takeaways:
- The Biden administration is urging tech companies and financial institutions to combat the growing problem of AI-generated sexually explicit images, which are frequently nonconsensual and disproportionately depict women and children.
- The White House is seeking voluntary cooperation from these companies to curb the creation, spread, and monetization of such images, particularly those involving minors.
- The administration is calling for action from AI developers, payment processors, financial institutions, cloud computing providers, search engines, and app store gatekeepers like Apple and Google.
- Although existing laws criminalize the creation and possession of sexual images of children, there is little oversight of the tech tools and services that enable the creation of these images, underscoring the need for more rigorous enforcement and new legislation.