Appendix B: Update on online targeting

Published 5 July 2021

Introduction

The government is committed to taking a pro-innovation approach to digital technologies. We need a regulatory regime that supports competition, provides clarity for businesses to innovate, and builds trust and confidence amongst citizens.

As set out in the National Data Strategy last year, the government wants the opportunities presented by data use to be embraced. But to do this we need to make data work for everyone, which means that innovation in data-driven technologies, such as online targeting systems, should be responsible.

The government commissioned the Centre for Data Ethics and Innovation (CDEI) to conduct a review of online targeting as part of its first work programme, and the CDEI published its final report in February 2020. During the review the CDEI engaged with policy teams across government, sharing its evidence base and analysis, which informed policy development taking place in parallel with the review.

The CDEI identified a number of areas where the government could take action to address the risks, and maximise the potential benefits, of online targeting. The government has consulted on or announced policies in a number of areas that relate to the CDEI’s recommendations. Relevant publications include:

- a technical consultation on digital imprints (August 2020) and the government response (June 2021)
- a response to the Competition and Markets Authority (CMA) market study into online platforms and digital advertising (November 2020)
- the National Data Strategy (September 2020) and the government response (May 2021)
- the Full Government Response to the Online Harms White Paper consultation (December 2020)
- the draft Online Safety Bill (May 2021)

The Online Safety Bill will place new duties on companies to take responsibility for the safety of their users in relation to user-generated content and search functions. Ofcom has been confirmed as the regulator for the new regime and will have powers to oversee and enforce compliance, including imposing substantial fines and applying to the court for business disruption measures. The draft Online Safety Bill and the Full Government Response to the Online Harms White Paper consultation in particular addressed a significant number of the recommendations made by the CDEI, in the contexts of services hosting user-generated content and search engines. This includes:

- regulatory oversight for content recommendation systems
- a commitment to actively investigate the option of independent researcher access to data
- greater protections for freedom of expression and privacy
- a focus on systems and processes rather than categories of content
- mechanisms to enable effective regulatory coordination

The CDEI is working closely with policy teams in the Department for Digital, Culture, Media and Sport (DCMS) to develop data governance as part of the Online Safety Data Initiative. This will include considering arrangements for data stewardship that support responsible innovation while protecting people’s privacy. The CDEI is also providing an evidence base on user-empowering design through its Active Online Choices project with the Behavioural Insights Team. The findings also provide evidence in support of safety-by-design and media literacy considerations.

The CDEI’s report focused on three key areas: accountability, transparency and user empowerment. This update follows that structure.

Accountability

The Online Safety Bill will require companies to consider the role potentially played by algorithms in relation to illegal content, content that is harmful to children (if children are likely to use their services) and, for the largest companies, legal content that may be harmful to adults. The new regulatory framework will apply to any service that hosts user-generated content, as well as search engines, with some specific targeted exemptions for low-risk services. Companies will have duties to carry out risk assessments and to take proportionate steps to manage and mitigate the risks they identify, including risks related to their content recommendation processes. These duties will be overseen and enforced by an independent online safety regulator, and the government has confirmed that Ofcom will be designated in this role. Ofcom will have the power to fine companies failing in their duties up to £18 million or 10% of relevant annual global turnover, whichever is higher, and the power to apply to the court to block access to sites.

The online safety regulatory framework seeks to tackle harm facilitated through user-generated content and search engines, including, where appropriate, harm arising from online targeting. The framework will not seek to tackle harm facilitated through other services and content, such as advertising, email, video-on-demand services, aggregators or news websites.

Separately, in November 2020, the government announced the introduction of a new pro-competition regime for digital markets. In April 2021, a new regulator, the Digital Markets Unit (DMU), was launched within the CMA to ensure digital firms with substantial and entrenched market power cannot exploit their positions in ways that harm consumers and businesses.

A central part of this new regime will be a mandatory code of conduct to govern the relationships between dominant firms designated with ‘strategic market status’ (SMS) and the different groups of users which rely on their services. Like the code proposed by the CDEI, the pro-competition regime’s code of conduct aims to increase the accountability of platforms’ practices and conduct. One of the objectives of the code would be to promote trust and transparency, with the aim of enabling informed and effective consumer choice. This is discussed in more detail below, in relation to user empowerment.

Ofcom will have a duty to issue codes of practice that set out the steps companies can take to fulfil their new duties under the Online Safety Bill. Those duties are themselves focused on proportionate steps and processes to mitigate and manage risks, including those linked to service design. In line with the CDEI’s recommendation, the regulator’s codes of practice will take a systemic approach, focusing on the systems, processes and governance that in-scope companies can put in place to fulfil their duties. Ofcom will have freedom to decide how individual codes of practice, and the codes of practice as a whole, are structured. There will therefore not necessarily be a specific code of practice on online targeting, although we expect that steps related to online targeting of user-generated content may feature in codes of practice.

The UK supports freedom of expression both as a fundamental right in itself and as an essential element of the full range of human rights. One of the overarching principles of online safety regulation is to protect users’ rights online, including the rights to freedom of expression and privacy, and companies will have a duty to have regard to the importance of these rights when implementing online safety policies. Companies will be required to protect users’ rights to freedom of expression and from unwarranted infringements of privacy, both as part of their risk assessments and when they make decisions on what safety systems and processes to put in place on their services. Ofcom will also need to carry out its duties in a way that protects freedom of expression and privacy.

Regulation will ensure consistent application of the largest companies’ terms and conditions relating to harmful content. Additional duties will apply to providers of the highest-reach and highest-risk services, including the largest social media services, known as ‘Category 1 services’. They will be required to set clear and accessible terms and conditions about what types of legal but harmful content are acceptable on their services, and enforce those terms and conditions consistently. Users will be able to report harmful content in breach of terms and conditions, and challenge wrongful content removal. This will protect freedom of expression by preventing companies from arbitrarily removing content.

Our approach to online safety will support more people to enjoy their right to freedom of expression and to participate in online discussions, by reducing the risk of users being bullied or attacked on the basis of their identity, for example their sex, race, disability, sexuality, religion or age. Regulation will focus on the systems and processes companies have in place to deal with harmful content, and will not address individual pieces of legal content. Regulation will not prevent adults from accessing or posting legal content, nor require companies to remove specific pieces of legal content.

All companies in scope of the online safety regulatory framework will have a specific legal duty to have effective and accessible reporting and redress mechanisms. This will cover harmful content and activity, infringement of rights, and broader concerns about a company’s compliance with its regulatory duties.

Ofcom will be given broad powers to require the information that it needs to carry out its functions, giving it the flexibility to determine the specific information it requires. These powers will apply to companies in scope of the new regime and, where necessary, to other organisations or persons who may hold relevant information. Ofcom will also have the power to enter and inspect the premises of regulated service providers. Where there are reasonable grounds to suspect possible non-compliance, a justice may grant Ofcom a warrant allowing it to search the premises, access equipment and seize documentation. As with all its powers, Ofcom will need to use this power proportionately.

Ofcom will have the power to require a report from a skilled person about a service in scope of the online safety regulatory framework. It will be able to use this power in order to assess potential non-compliance with regulatory requirements and/or to build an understanding of the risk associated with a service. As with all its powers, Ofcom will need to use this power proportionately.

The government believes that effective coordination between regulators is needed to address the challenges posed by the cross-cutting nature of digital technologies. We welcomed the creation of the Digital Regulation Cooperation Forum, which brings together Ofcom, the Information Commissioner’s Office (ICO), the CMA and the Financial Conduct Authority (FCA), as an important step towards closer regulatory cooperation. We are working with the Forum and other stakeholders to understand what further steps may need to be taken.

Our approach to governing digital technologies will boost competition in digital markets, help make sure digital technologies are safe and secure, and promote our democratic values, driving fast and inclusive growth across the economy. As part of this, the government is assessing the institutional landscape to ensure our regulators have the capabilities they need to govern effectively; that the right mechanisms are in place to support collaboration on cross-cutting issues; and that regulatory responsibilities and remits are clear to businesses and consumers alike.

Through the Online Advertising Programme, the government is assessing the current regulatory system for online advertising, to ensure that it fosters advertising that is fair, transparent and ethical and that works for citizens, businesses and society as a whole. The government will consider the CDEI’s recommendations to increase the accountability of online advertising as part of this programme of work. This work will include looking at the standards to which advertisers and disseminators of advertising content are held by regulators and by statute. The government will continue to work with regulators and relevant market participants in this area to ensure coherence of approach across the system.

The government has published the Ethics, Transparency and Accountability Framework for Automated Decision-Making to ensure that public sector practitioners have clear guidance when algorithmic approaches, including personalisation, are involved in decision-making processes.

The CDEI will be working closely with public sector teams to help them responsibly innovate in the use of data-driven technologies.

Transparency

The draft Online Safety Bill imposes a requirement on a subset of companies to produce transparency reports, and includes a list of the types of information that in-scope services might be required to include in those reports. Among other things, this list includes:

- information about the processes and steps an organisation has in place to assess the risk of harm at the design, development and update stages of the online service
- information about the enforcement of the company’s own relevant terms and conditions
- information about the measures and safeguards in place to uphold and protect fundamental rights, ensuring that decisions to remove content, or to block or delete accounts, are well founded

To support research into online harms, and to help the regulator to prioritise its actions, Ofcom will be required to produce a report on the extent to which researchers are currently able to access data to support research into online harms, the challenges associated with accessing this data, and the extent to which greater access to information might be achieved.

As part of this, Ofcom will be able to produce best practice guidance for companies and researchers on how to approach researcher access to data. In preparing any guidance, Ofcom will be required to consult a broad range of stakeholders, including companies, academics, the ICO, the CDEI, and UK Research and Innovation.

Several social media companies have implemented measures to improve the transparency of political advertisements on their platforms, including introducing advertising archives. We welcome these steps, and platforms should continue to make as much data as possible available to the public.

To ensure the online safety regulatory framework is well equipped to deal with the challenges of disinformation and misinformation, the regulator will be required to establish an expert working group on these issues. The working group will aim to build consensus and technical knowledge on how to tackle disinformation and misinformation. This working group will include a range of stakeholders such as rights groups, academics and companies.

There is also a need for greater transparency about the actions that companies are taking to tackle this kind of content on their platforms, including improving access for researchers to better understand the scale, scope and impact of misinformation and disinformation. The online safety regulatory framework will help build an understanding of what in-scope companies are doing in relation to disinformation and misinformation through transparency reporting requirements. As set out above, the regulator will have the power to require certain companies to publish annual transparency reports setting out the extent of these harms and their response to them. As part of this, companies could be required, where relevant, to report on the processes and systems they have in place to respond to disinformation and misinformation.

The government is convening regular Counter-Disinformation Policy Forums, which bring together key actors in industry, civil society and academia to improve responses to misinformation and disinformation and to prepare for future threats. The Forums will help us understand which types of interventions are most successful in reducing the spread and consumption of false content, and encourage the adoption of best-practice principles throughout the online environment.

User empowerment

The government recognises that users should be able to understand how online platforms operate and have greater control over their experiences. The Online Safety Bill will encourage an approach to platform design that empowers users. The government’s voluntary safety by design framework, due to be published later this year, will provide organisations with practical guidance on how to design safer online services and products that empower users. Examples of a safety by design approach include: default safety settings, clearly presented information, positive behavioural nudges and user reporting tools that are simple to use.

The government’s approach will build on the interactions between safety by design and media literacy, promoting the role of design in strengthening media literacy and improving user safety.

In addition, the new pro-competition regime for digital markets will include a mandatory code of conduct for SMS firms to ensure open choices, fair trading, and trust and transparency. This is informed by the advice of the cross-regulator Digital Markets Taskforce, published in December 2020. The Taskforce was established by the government and drew on relevant research and recommendations, including the evidence set out in the CDEI’s report.

User empowerment is a key aspect of the pro-competitive behaviour that the code would seek to promote. For example, under the proposed Trust and Transparency objective, the code would seek to ensure that users have clear and relevant information to understand what services SMS firms are providing, and to make informed decisions about how they interact with the firm. This includes sharing clear and accessible information with users and ensuring that choices and defaults are presented in a way that facilitates informed and effective consumer choice.

The government will consult on the new pro-competition regime, including the statutory code of conduct, this year and will legislate as soon as Parliamentary time allows.

The government is committed to introducing a digital imprints regime. In August 2020, the government launched Transparency in digital campaigning: technical consultation on digital imprints, outlining how a digital imprints regime would operate in practice and gathering feedback on the details of the proposal. The consultation closed in November 2020, and the government’s response to the technical consultation, setting out our proposals, was published on 15 June 2021.

Voters value transparency and our proposals would inform voters about the source of digital campaign material, making UK politics even more transparent.

These proposals would represent a significant step forward and the regime would be one of the most comprehensive ‘digital imprint’ regimes operating in the world today.

The government is developing an Online Media Literacy Strategy, which will ensure a coordinated and strategic approach to online media literacy education and awareness for children, young people and adults. The Strategy has been developed by DCMS in close collaboration with other government departments and public bodies, including the Department for Education, the Cabinet Office and Ofcom. The Strategy will be published this year.

The online safety regulatory framework will build upon and strengthen Ofcom’s existing duty to promote media literacy. Under this renewed duty, Ofcom will have additional responsibilities, including oversight of industry activity and spend, and the power to require companies to report on their education and awareness activity.

As we set out in the National Data Strategy, new data stewardship measures need to be developed to support the market for new online safety products and services, as well as other responsible uses of personal or sensitive data, such as health research. As part of the development of the Data Economy Innovation policy framework, to be published in the autumn, we are considering how data intermediaries can enable new forms of responsible data sharing, building on the Ada Lovelace Institute and AI Council joint report on Legal Mechanisms for Data Stewardship.

We are also working to support growth and innovation in the safety tech sector: delivering the world’s first safety tech industry forum, organising safety tech trade missions and undertaking a pioneering research project into better use of data around online harms.