Future updates to the Online Safety Act will address more sensitive issues, including content harmful to children, underage access to pornography, and safeguards for women and girls. A controversial section could have implications for end-to-end encryption in messaging apps, as it empowers Ofcom to mandate the use of "accredited technology" to identify child sexual abuse material. This has raised concerns that encryption systems could be compromised and user privacy infringed. The Act also covers AI-generated content, focusing on the context in which content appears rather than the technology that produced it.
Key takeaways:
- The online safety regulator Ofcom has released initial draft codes of practice following the enactment of the Online Safety Act. The guidelines address illegal online content, including child sexual abuse material, terrorism-related content, and fraudulent activities.
- The guidelines are currently proposals, with Ofcom seeking input and feedback before they are approved by the UK Parliament. Non-compliance with the rules could result in fines of up to £18 million or 10 percent of a company's global turnover.
- Future updates to the Online Safety Act will address more sensitive issues, such as content harmful to children, underage access to pornography, and safeguards for women and girls. A controversial section could have implications for end-to-end encryption in messaging applications.
- The Online Safety Act aims to address online harms in a "technology-neutral" manner, meaning AI-generated content is not exempt from the regulations. The focus is on regulating the context, not the technology itself.