A guide to the Online Safety Bill

What the new internet safety laws mean for adults and children.

The Online Safety Bill is a new set of laws to protect children and adults online. It will make social media companies more responsible for their users’ safety on their platforms.

This guidance has been withdrawn.

The Online Safety Bill received Royal Assent on 26 October 2023.

For more information read the Online Safety Act explainer or visit www.legislation.gov.uk.

How the Online Safety Bill will protect children

The Bill will make social media companies legally responsible for keeping children and young people safe online.

It will protect children by making social media platforms:

  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content
  • enforce age limits and age-checking measures
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise

How the Online Safety Bill will protect adults

The Bill will protect adults in three ways, through a ‘triple shield’:

  1. All in-scope services will need to put in place measures to prevent their services being used for illegal activity, and to remove illegal content when it does appear.

  2. Category 1 services (the largest and highest-risk services) must remove content that is banned by their own terms and conditions.

  3. Category 1 services must also empower their adult users with tools that give them greater control over the content that they see and who they engage with.

Types of content that will be tackled

Illegal content

Some content that children and adults encounter online is already illegal. The Bill will force social media platforms to remove illegal content, stopping children and adults from seeing it.

The Bill also introduces new offences, including making content that promotes self-harm illegal for the first time. Platforms will need to remove this content.

This is not just about removing existing illegal content; it is also about stopping it from appearing at all. Platforms will need to consider how they design their sites to reduce the likelihood of them being used for criminal activity in the first place.

Illegal content that platforms will need to remove includes:

  • child sexual abuse
  • controlling or coercive behaviour
  • extreme sexual violence
  • fraud
  • hate crime
  • inciting violence
  • illegal immigration and people smuggling
  • promoting or facilitating suicide
  • promoting self-harm
  • revenge porn
  • selling illegal drugs or weapons
  • sexual exploitation
  • terrorism

Content that is harmful to children

Some content is not illegal but could be harmful or age-inappropriate for children. Platforms will need to protect children from it.

The categories of harmful content that platforms will need to protect children from encountering are set out in the Bill and include:

  • pornographic content
  • content that does not meet a criminal threshold but which promotes, encourages or provides instructions for suicide, self-harm or eating disorders
  • content that depicts or encourages serious violence
  • bullying content

Underage children will be kept off social media platforms

The online safety laws will mean social media companies will have to keep underage children off their platforms.

Social media companies set the age limits on their platforms, and many say children under 13 years of age are not allowed, yet many younger children still have accounts. This will stop.

Different technologies can be used to check people’s ages online. These are called age assurance technologies.

The new laws mean social media companies will have to say what technology they are using, if any, and show they are enforcing their age limits.

Adults will have more control over the content they see

The largest platforms will be required to offer adult users tools so they can have greater control over the kinds of content they see and who they engage with online. This includes giving them the option of filtering out unverified users, which will help stop anonymous trolls from contacting them.

The largest platforms will have to provide adult users with tools to help reduce the likelihood that they will encounter certain types of content. These categories of content are set out in the Bill and include content that does not meet a criminal threshold but promotes or encourages eating disorders or self-harm, or is racist, anti-semitic or misogynistic.

Children will already be protected from seeing this content under the Bill’s child safety duties.

These platforms must also proactively offer these tools to all their registered adult users, so it is easy for users to decide whether or not to use them. The tools must be effective and easy to access, and could include human moderation, blocking content flagged by other internet users, or sensitivity and warning screens.

The Bill will tackle repeat offenders

These laws will require all social media companies to assess how their platforms could be used by abusers to create anonymous profiles, and to take steps to ban repeat offenders, prevent them from creating new accounts and limit what new or suspicious accounts can do.

How the Bill will be enforced

We are putting Ofcom in charge as the regulator to check that platforms are protecting their users.

Platforms will have to show they have processes in place to meet the requirements set out by the Bill. Ofcom will check how effective those processes are at protecting internet users from harm.

Ofcom will have powers to take action against companies that do not comply with their new duties. Companies can be fined up to £18 million or 10 percent of their annual global turnover, whichever is greater; for example, a company with an annual global turnover of £1 billion could face a fine of up to £100 million. Criminal action can be taken against senior managers who fail to comply with information requests from Ofcom. Ofcom will also be able to hold companies and senior managers (where they are at fault) criminally liable if the provider fails to comply with Ofcom’s enforcement notices in relation to specific child safety duties or to child sexual abuse and exploitation on their service.

In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers and internet service providers to stop working with a site, preventing it from generating money or being accessed from the UK.

How these UK laws impact international companies

Ofcom will have the power to take appropriate action against all social media and tech companies, no matter where they are based, if their services are accessible to UK users.

The next steps for the Bill

The new laws are currently going through Parliament and will be in place once they finish their passage. The Department for Science, Innovation and Technology is working closely with Ofcom to lay the groundwork for the laws to be enforced once they are introduced.

We will take a phased approach to bringing in the Online Safety Bill’s duties as Ofcom’s powers come into force. Ofcom will initially focus on illegal content, to address the most serious harms as soon as possible.

Find out more

See more information about online safety, including learning materials and advice for staying safe online.

Updates to this page

Published 16 December 2022
Last updated 30 August 2023
  1. Changes made to provide more detail on the Online Safety Bill, and to reflect amendments which have been made to the Bill during its passage through Parliament.

  2. First published.