Search functionality: improve the safety of your online platform
Practical steps to manage the risk of online harms if your online platform allows people to search user generated content.
Search functionality is any process people use to find content on a platform.
For example, users might find things by:
- typing search terms into a search box
- reading through a list of contents
- using tags or metadata to filter content
This page will help you understand how the ability to search user generated content on a platform can create risks to users’ safety, and how to manage those risks.
New online safety legislation is coming which will aim to reduce online harms. If you own or manage an online platform in scope of the forthcoming legislation, you will have a legal duty to protect users against illegal content. You will also have to put in place measures to protect children if they are likely to use your service.
- Learn what an online harm is
- Learn about your responsibilities if you own or manage an online platform or service
- 7 step checklist to keep your business and users safe
How harms can happen if you allow search functionality
Search functionality can make it easier for people to actively seek out and find harmful content.
It can also present harmful content to people who are not actively seeking it. For example, auto-suggest functions may propose search terms that reinforce hate crime or disinformation, or that encourage users to search for harmful or illegal content.
The most likely harms relating to search include:
- terrorist content
- self-harm or suicide content
- disinformation
- child sexual exploitation and abuse
- cyberbullying or hate crime
Example of harm caused by search functionality
A young person with an eating disorder uses the search bar on an image sharing platform to search for content on the site which promotes anorexia. The platform has not implemented any measures to block or remove auto-suggest for high risk search terms.
The person is able to find and browse hundreds of harmful images which exacerbate their eating disorder.
How to prevent harms if you offer search functionality
If you provide search functionality for your users to find content on your platform, you can reduce the risk of harms by:
- blocking high risk search terms relating to illegal harms and activity
- ensuring that auto-suggestion functions and algorithmic recommendations do not lead people to harmful content
- ensuring that harmful content does not appear at the top of search results
- providing links to resources and support to users who search for high risk search terms
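The measures above can be sketched in code. The following is a minimal illustration, not a definitive implementation: the blocklist contents, support message, and all function names are assumptions for the example. A real platform would use curated, regularly reviewed term lists, handle misspellings and obfuscated terms, and surface professionally sourced support resources.

```python
# Illustrative sketch: blocking high risk search terms, filtering
# auto-suggestions, and pointing users to support resources.
# The terms and message below are placeholders, not a real blocklist.

HIGH_RISK_TERMS = {"example-blocked-term", "another-blocked-term"}

SUPPORT_MESSAGE = "If you or someone you know needs support, help is available."


def is_high_risk(query: str) -> bool:
    """Return True if the query contains a blocked term."""
    words = query.casefold().split()
    return any(term in words for term in HIGH_RISK_TERMS)


def run_search(query: str) -> list[str]:
    """Placeholder for the platform's real search backend."""
    return []


def handle_search(query: str) -> dict:
    """Block high risk searches and attach support resources."""
    if is_high_risk(query):
        return {"results": [], "blocked": True, "message": SUPPORT_MESSAGE}
    return {"results": run_search(query), "blocked": False, "message": None}


def filter_suggestions(suggestions: list[str]) -> list[str]:
    """Drop auto-suggestions that contain high risk terms."""
    return [s for s in suggestions if not is_high_risk(s)]
```

In this sketch a blocked query returns no results plus a support message, rather than an error, so the user is directed towards help instead of simply being refused.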
Part of Online safety guidance if you own or manage an online platform