Guidance

Principles of safer online platform design

Understand how preventative design measures can reduce the risk of harms happening on your online platform.

Overview

Online harms can happen when features and functions on an online platform create a risk to users’ safety. These harms may be illegal, or they may be legal but still harmful to a user.

New online safety legislation is coming which will aim to improve people’s safety. The laws will apply to businesses and organisations operating online platforms that:

  • host user generated content such as images, videos and comments

  • allow UK users to talk with other people online through messaging, comments and forums

  • allow UK users to search a range of websites and databases – search engines that only allow searches of a single website will not be in scope

If you own or manage an online platform which enables any of these, you will have a legal duty to protect users against illegal content. If children are likely to access your service, you will also have to put in place measures to protect them.

You can do this by understanding the principles of safety by design and applying them to your platform.

What is a safety by design approach?

Safety by design is the process of designing an online platform to reduce the risk of harm to those who use it. Safety by design is preventative. It considers user safety throughout the development of a service, rather than in response to harms that have occurred.

The government has emphasised the importance of a safety by design approach to tackling online harms. Its response to the Online Harms White Paper highlighted the need for a preventative approach to online safety, including safer platform design, and committed to publishing guidance to help UK businesses and organisations design safer online platforms.

By considering your users’ safety throughout design and development, you will be better able to embed a culture of safety into your service.

How design can be used to improve safety

By learning how your platform’s services expose your users to risk, you can put in place safety measures to protect them from harm. Your users may be at increased risk of online harms if your platform allows them to:

  • interact with each other, such as through chat, comments, liking or tagging

  • create and share text, images, audio or video (user-generated content)

The benefits of a preventative approach

The best way to reduce online harms is to prevent them before they happen. You can do this by taking a safety by design approach, which can benefit your business or organisation by:

  • creating an environment where users feel safe and make safer choices
  • preventing problems that might be difficult or costly to solve later on
  • building confidence in your brand

Safety by design principles

By following four principles, you can protect your business and create a safer online environment for people who use your service.

1. Users are not left to manage their own safety

Anyone who owns or manages a platform should take preventative steps to make sure their service reduces users’ exposure to harm.

Good platform design:

  • makes a user aware when they do something that might harm themselves or others

  • helps a user report content or behaviour that they think is harmful

  • makes it harder for users to upload or share content that is illegal or breaks your terms of service

Example

A social media site lets users post content which can be viewed by anyone with an account. An increasing number of users have been sharing a post that contains misleading health information with the potential to cause severe harm. As the content violates its terms of service, the platform decides to remove it and prevent the content being uploaded again.
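
A minimal sketch of one way a platform might prevent removed content from being re-uploaded, assuming a simple approach of fingerprinting (hashing) removed items and checking new uploads against that list. The function names and exact-match hashing below are illustrative assumptions rather than part of this guidance; platforms often combine this with perceptual hashing and human review.

```python
import hashlib

# Fingerprints (SHA-256 hashes) of content already removed for breaching
# the terms of service. Exact-match hashing is used here for simplicity;
# a real platform might use perceptual hashing so altered copies also match.
removed_content_hashes: set[str] = set()


def fingerprint(content: bytes) -> str:
    """Return a stable fingerprint for a piece of uploaded content."""
    return hashlib.sha256(content).hexdigest()


def record_removal(content: bytes) -> None:
    """Remember removed content so repeat uploads can be blocked."""
    removed_content_hashes.add(fingerprint(content))


def can_upload(content: bytes) -> bool:
    """Reject uploads that match previously removed content."""
    return fingerprint(content) not in removed_content_hashes
```

In the example above, the misleading health post would be passed to record_removal when it is taken down, and later attempts to upload the same file would be rejected by can_upload.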

2. Platforms should consider all types of user

Many factors can increase a user’s risk of being a victim of harm. You should aim to understand the people who use your service so you can be aware of the risks your service might present to them. This helps you design to meet their needs.

Inclusive design considers the needs of all users. You should consider users:

  • with protected characteristics which may make them victims of discrimination - for example race, disability or sexual orientation

  • with low levels of media literacy, which can be a result of age, background or level of education

  • of different levels of ability - for example, children or people with learning disabilities

  • who have accessibility needs - for example, visually impaired people

  • who do not speak English, or who have limited English

Example

When creating an account on an online forum, users must agree to the forum’s terms of service. This includes respecting all users and not using discriminatory language. If a user tries to send a message containing discriminatory language, they get a message warning them they are in breach of the rules.
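
A minimal sketch of how a pre-send warning like this might work, assuming a simple deny-list check written in Python. The term list, warning text and function name are placeholders for illustration; real moderation typically combines more sophisticated classifiers with human review.

```python
# Placeholder deny-list: a real platform would maintain and review this
# carefully, and would not rely on exact word matching alone.
DISALLOWED_TERMS = {"exampleslur1", "exampleslur2"}

WARNING = (
    "Your message appears to break our rules on discriminatory language. "
    "Please review our terms of service before sending."
)


def check_message(text: str) -> str | None:
    """Return a warning to show before sending, or None if no terms match."""
    words = {word.strip(".,!?").lower() for word in text.split()}
    if words & DISALLOWED_TERMS:
        return WARNING
    return None
```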

3. Users are empowered to make safer decisions

As well as creating a safer online environment, platforms should give users the tools and information they need to make safer choices online. You can do this by designing your platform to promote safer decisions. For example:

  • warning users about excessive screen time
  • prompting users to review their privacy settings
  • highlighting when content has been fact checked by an official or trustworthy source

You should be careful that platform design does not limit a user’s ability to make informed choices. For example, recommendation algorithms that surface potentially harmful content can leave users with little or no control over what they are shown.

Good platform design helps users understand:

  • the reliability and accuracy of the content they are interacting with

  • how their online activity is seen by others, and how to manage that - such as by changing privacy settings or blocking a user

  • the potential legal impact of their actions

  • their rights and responsibilities online

Example

A social media platform lets users determine how others see their activity and profile using privacy settings. Users’ privacy is set to high by default, which hides their profile information from those outside their immediate network. If a user chooses to downgrade their privacy settings, they receive a message that asks them if they are happy with people they do not know seeing their profile information.
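
A minimal sketch of safer-by-default privacy settings with a confirmation step before they are lowered, written in Python. The setting names, default values and prompt wording are assumptions for illustration, not a prescribed implementation.

```python
from dataclasses import dataclass, field


@dataclass
class PrivacySettings:
    # Safer defaults: profile hidden from people outside the user's network.
    profile_visibility: str = "network_only"  # assumed setting name
    searchable: bool = False


@dataclass
class User:
    name: str
    privacy: PrivacySettings = field(default_factory=PrivacySettings)


def confirm(prompt: str) -> bool:
    """Stand-in for the platform's confirmation dialog."""
    return input(f"{prompt} (y/n): ").strip().lower() == "y"


def lower_privacy(user: User) -> None:
    """Only lower privacy once the user confirms they understand the change."""
    if confirm("People you do not know will be able to see your profile. Continue?"):
        user.privacy.profile_visibility = "public"
        user.privacy.searchable = True
```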

4. Platforms are designed to keep children safe

Children are generally less able to understand risk, which makes them particularly vulnerable to online harms.

Even if your online platform is not targeted at under 18s, you should take steps to ensure that only users who are old enough are able to access your service. You could put in place measures to identify child users, such as age assurance or age verification solutions, and tailor their online experience in line with their age.
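
As a rough illustration, once a user’s age is known through an age assurance check, default settings can be tailored to it. The age bands and feature names in this Python sketch are assumptions made for the example, not requirements of this guidance.

```python
from datetime import date


def age_in_years(birth_date: date, today: date | None = None) -> int:
    """Calculate a user's age in whole years from a verified date of birth."""
    today = today or date.today()
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)


def default_experience(age: int) -> dict:
    """Tailor default settings by age band (bands and settings are illustrative)."""
    if age < 13:
        return {"direct_messages": False, "public_profile": False, "age_restricted_content": False}
    if age < 18:
        return {"direct_messages": "contacts_only", "public_profile": False, "age_restricted_content": False}
    return {"direct_messages": True, "public_profile": True, "age_restricted_content": True}
```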

Designing for children should:

  • ensure information like terms of service, or tools used to report harms, are prominent and easy to understand for children of different ages

  • encourage safe, positive interactions and ensure users can interact free from abuse, bullying and harassment

  • limit access to certain features, functions and content which pose a greater risk of harm to them

  • give responsible adults the ability to shape a child’s online experience using safety settings

  • set safety, security and privacy settings to high by default

You should establish a clear process for tackling abuse and regularly check that reporting and moderation processes are being enforced effectively. There are a number of ways you can protect children on your platform. Learn more in A business guide for protecting children on your online platform.

If you feel that a child is facing a high or immediate risk of harm, you should dial 999 immediately.

You should also take action to ensure you are using young people’s data in an appropriate way. Learn about managing children’s personal data with the Information Commissioner’s Office’s (ICO) Age appropriate design code of practice.



Updates to this page

Published 29 June 2021
