What you need to know about the UK's new Online Safety Act

Following the EU’s Digital Services Act in August 2023, a new Online Safety Act (OSA) has now been passed in the UK. Intended to help protect people on the Internet, the new law requires regulated service providers to “identify, mitigate and manage the risks of harm” from content that is illegal, abusive, or harmful to children.

What does the OSA cover?

The OSA will regulate services that target UK consumers even if they are not based in the UK. As senior commercial technology lawyer David Sant puts it, “the legislation will therefore have huge consequences on many businesses both in the UK and internationally.” This means that all businesses - even those outside the UK - should carefully carry out their own risk assessments and define action points where necessary.

Under the Act’s definitions, illegal content includes child sexual abuse material (CSAM) and content relating to terrorism, fraud, the sale of illegal drugs or weapons, or the encouragement of self-harm or suicide. Material is considered abusive if it targets race, religion, sex or sexual orientation, gender reassignment, or disability. Content that’s harmful to children includes any material that depicts, promotes, or facilitates pornography, serious violence, bullying, self-harm, or eating disorders. Another section of the Act, widely known as Zach’s Law, criminalises the targeted sending of flashing images to people with epilepsy.

Protective measures

The OSA sets out safety objectives for regulated user-to-user services, including social media sites, and expects these services to put systems and processes in place to help them comply. Services must also provide a higher standard of protection for children than for adults, for example by introducing more robust age-checking measures, and take into account the varying needs of children at different ages.

According to Ofcom, around three in five children aged 11 to 18 have been contacted online in a way that potentially made them feel uncomfortable. Several tools have been developed specifically to help prevent inappropriate content from circulating online. ‘Hash matching’, for example, can identify illegal images by comparing their digital fingerprints (hashes) against hashes of known CSAM held in a secure database. Another tool can detect URLs that have been identified as hosting CSAM.
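To make the idea concrete, the sketch below shows roughly how a simple, exact-match version of these checks could work. It is illustrative only: the hash and URL lists, the function names, and the use of plain SHA-256 are assumptions made for the example, whereas real systems rely on vetted secure databases and typically use perceptual hashing so that altered copies of an image can still be matched.

```python
import hashlib

# Hypothetical blocklists; in practice these would come from a vetted provider.
KNOWN_ILLEGAL_HASHES = {
    # SHA-256 of the placeholder bytes b"test", used purely for illustration.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}
KNOWN_ILLEGAL_URLS = {
    "https://example.invalid/blocked-page",
}

def file_matches_known_hash(data: bytes) -> bool:
    """Hash the uploaded bytes and look the digest up in the database."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_ILLEGAL_HASHES

def url_is_blocked(url: str) -> bool:
    """Check a submitted link against URLs known to host illegal material."""
    return url.strip().lower() in KNOWN_ILLEGAL_URLS

# Screening an upload and a link before they are published:
print(file_matches_known_hash(b"test"))                        # True
print(url_is_blocked("https://example.invalid/blocked-page"))  # True
```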

Ofcom have published a detailed outline showing how they’ll be working to improve online safety over the next three years. Sir Peter Wanless, Chief Executive at the National Society for the Prevention of Cruelty to Children (NSPCC), has said:

We look forward to working with Ofcom to ensure these initial codes help to build bold and ambitious regulation that listens to the voices of children and responds to their experiences in order to keep them safe.

Debates over end-to-end encryption

According to Ofcom, the codes of practice aren’t prescriptive or exhaustive, and companies can choose different approaches to content moderation and management. However, there is controversy over the threat the OSA poses to end-to-end encryption, which is used by many messaging providers including Meta and Signal.

As Michelle Donelan, the UK Technology Secretary, has put it: “The Bill protects free speech, empowers adults and will ensure that platforms remove illegal content.” One clause, however, is particularly divisive. According to WIRED, Section 122 “has been widely interpreted as compelling companies to scan users’ messages to make sure that they aren’t transmitting illegal material” - something that isn’t really possible without breaking end-to-end encryption.

Critics including Matthew Hodgson, CEO of encrypted messaging service Element, are concerned that this poses an additional security threat, because any mechanism put in place to scan content could itself be compromised. Rasha Abdul Rahim, Director of Amnesty Tech, believes Section 122 (widely known as the ‘Spy Clause’) would “leave everybody in the UK - including human rights organisations and activists - vulnerable to malicious hacking attacks and targeted surveillance campaigns.”

What this means for organisations

Although the OSA is aimed largely at user-to-user services like these, all companies working with online content are advised to take the new law into account and work towards compliance, for example by considering age verification systems where necessary and monitoring potentially harmful content.

While the new law will have different implications for different services, Sant also outlines the key action points that all businesses should consider. For digital asset management and other systems that host user-generated content, based on Sant’s advice, the steps might look like this:

  • Consider whether your organisation will need to comply with the OSA and if so, which content falls within its scope

  • Review and update existing content moderation policies: if there’s no policy in place, are there resources available internally to write one? Check your systems and processes and seek legal advice if needed

  • Check if you can monitor and remove content

  • Consider if age verification is needed: who has access to which content, and who’s responsible for moderating it? How secure is the content, and how likely is it to be accessed by children of different ages? (A simple illustration of age-gated access follows this list)

  • Consider whether you have adequate systems in place to prevent illegal and harmful content from being shared
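
As a purely illustrative example of the age-verification point above, the sketch below gates access to content by a verified age. Everything in it - the class names, the `verified_age` field, and the 18+ threshold - is a hypothetical assumption for the example, not something prescribed by the OSA or by Ofcom’s codes.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Asset:
    title: str
    minimum_age: int             # 0 = general audience; 18 = adult-only material

@dataclass
class User:
    username: str
    verified_age: Optional[int]  # None = age has not been verified

def can_access(user: User, asset: Asset) -> bool:
    """Allow access only when a verified age meets the asset's threshold."""
    if asset.minimum_age == 0:
        return True
    if user.verified_age is None:
        return False             # unverified users never see restricted content
    return user.verified_age >= asset.minimum_age

# An unverified account is refused an 18+ asset; a verified adult is not.
print(can_access(User("guest", None), Asset("age-restricted video", 18)))  # False
print(can_access(User("member", 34), Asset("age-restricted video", 18)))   # True
```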

Sant recommends that companies “promote a culture around safeguarding and protection on your platforms.” The most important action point is to stay informed. Carry out due diligence and look at how similar businesses are responding to the new law in order to decide what’s appropriate for your organisation. Again, seeking legal advice and engaging in community discussion around the topic may be helpful, as is keeping up with new developments in the guidance.

Further reading:

The full legislation can be found on the gov.uk legislation site, and the full Ofcom plan and schedule can be found in the Ofcom News Centre pages.

Peter Guest, ‘The UK’s Controversial Online Safety Act Is Now Law’, Wired, 26.10.23

Computer Weekly compiled a longer list of articles on the end-to-end encryption debate: 'Tech firms cite risk to end-to-end encryption as Online Safety Bill gets royal assent', 27.10.23

David Sant, ‘What is the UK’s Online Safety Act? Your business guide’, Harper James, 17.10.23

Rasha Abdul Rahim, ‘UK: “Spy clause” in Online Safety Bill must be addressed before it becomes law’, Amnesty International, 5.9.23

Read the Epilepsy Society’s summary of the section on flashing images: 'Zach's law now official', 19.9.23