The app in the ad uses branding identical to that of a company called Deepnude AI, which 404 Media's Sam Cole wrote about in 2019. After the article was published, the app's founder took it down. The author argues that Twitter's moderation of such apps is nonexistent: while other platforms like TikTok and Instagram have made it harder to search for terms like "undress," Twitter has not. The article concludes by noting that the account that bought the ad is still active, underscoring Twitter's failure to moderate nonconsensual content.
Key takeaways:
- Twitter is running ads for an app that uses artificial intelligence to create nonconsensual nude images of women, highlighting the platform's ongoing issues with moderation and ethical advertising.
- The app in the ad uses branding identical to that of a company called Deepnude AI, which was deleted after a 2019 article by 404 Media's Sam Cole.
- While other platforms like TikTok and Instagram have made it more difficult to search for terms like "undress," Twitter has seemingly done nothing to prevent people from sharing these tools for creating nonconsensual imagery.
- The existence of this ad shows that Twitter is not only failing to moderate this nonconsensual content but is actively profiting from it.