The Nudity Protection feature puts nude images behind a safety screen, letting users choose whether to view them. The feature is powered by on-device machine learning and will be turned on by default for users under 18. Instagram users sending or receiving nudes will be directed to safety tips, and Meta is also testing pop-up messages for people who may have interacted with an account removed for sextortion. Accounts flagged as potential sextortionists will face restrictions on messaging and interacting with other users.
Key takeaways:
- Meta is testing new features on Instagram to protect young users from unwanted nudity and sextortion scams, including a feature called Nudity Protection in DMs that blurs images detected as containing nudity.
- The company is also developing technology to identify accounts potentially involved in sextortion scams and applying limits to how these suspect accounts can interact with other users.
- Meta has expanded the data it shares with Lantern, the cross-platform online child safety program, to include more sextortion-specific signals.
- The company is also adding child safety helplines from around the world to its in-app reporting flows. It is testing hiding teen accounts from suspected sextortion accounts in follower, following, and like lists, and making teen accounts harder for those accounts to find in search results.