Product Safety Database Methodology and Quality report
Updated 22 August 2024
1) Background to the statistics
1.1 What is the Product Safety Database?
The General Product Safety Regulations 2005 (GPSR), Regulation on Accreditation and Market Surveillance (RAMS) in Great Britain and Market Surveillance and Compliance of Products Regulation 2019/1020 establish the notification requirements for products found to pose a risk to the health and safety of consumers and/or products that have been found to be non-compliant with the relevant legislation.
The Rapid Alert System for non-food consumer products (RAPEX) has been the single rapid alert system for dangerous consumer products in the European Union (EU) since 2004. The EU publishes reports on its website covering data from RAPEX from 2004 to 2023. In addition, the Information and Communication System for Market Surveillance (ICSMS) allows the reliable exchange of information among authorities. The Product Safety Database (PSD) was introduced in 2019 to replace RAPEX and ICSMS, in readiness for the UK leaving the EU. This enabled a transition away from RAPEX/ICSMS before the end of the EU Exit transition period on 31 December 2020. It also benefitted the UK as there is now one database rather than two.
Access RAPEX – Europa website
Access ICSMS – Europa website
Market surveillance authorities (MSAs) are those designated as competent to monitor the compliance of products with the general safety requirements and to take the appropriate measures. They use the PSD to notify the Secretary of State of measures taken where a product presents a risk to the health and safety of consumers or is non-compliant, and any further modification to these measures. The PSD is owned by the Office for Product Safety and Standards (OPSS), part of the Department for Business and Trade (DBT).
Notifications should be made on the PSD where:
- A notification to the Secretary of State is required in product safety legislation.
- There is an established risk across an entire product line and action is taken to mitigate against that risk.
- MSAs are planning to undertake an investigation and/or commission testing, to be able to share results and prevent duplication of effort.
Guidance on when notifications should be made on the PSD, and how to make them, is available on GOV.UK and the PSD.
Read guidance on the notification of unsafe and non-compliant products.
The OPSS Incident Management Team (IMT) is added to a notification if the product is a high or serious risk, or the product has been recalled.
1.2 Publications
The data is used to produce an annual report which is published on GOV.UK.
Access the PSD annual reports.
The PSD collects detailed information on notifications, products, businesses and the supply chain, test reports and evidence, risk assessments and corrective actions. Not all fields are mandatory and so the level of missing data on some fields is high. In addition, some fields are free text and much harder to analyse in a consistent way. Work is being carried out to improve the data collection.
1.3 Quality Assurance (QA) Process
Data is input onto the PSD by MSAs and OPSS teams on an ongoing basis and monitored and collated by operational delivery teams in OPSS.
The quality assurance processes in place are focused on the accurate capture of data, consistency of recording, and the accurate transfer of processed data into an annual report and accompanying tables. If the underlying data is inaccurate, this has a significant impact on all its potential uses. Data that is widely used and that informs important and high-profile decisions receives the highest level of QA. Other data undergoes a more limited, but proportionate, level of QA. This ensures the data is fit-for-purpose in terms of the individual uses of the dataset.
In 2015 the Office for Statistics Regulation (OSR) published a regulatory standard for the quality assurance of administrative data.
Read the regulatory standard – OSR website
To assess the quality of the data provided for this release OPSS has followed that standard. The standard is supported with an Administrative Data Quality Assurance Toolkit which provides useful guidance on the practices that can be adopted to assure the quality of the data utilised.
Access the Toolkit – OSR website
This section draws on that guidance, recognising that quality assurance of administrative data is more than just checking that the figures add up. It is an ongoing iterative process to assess the data’s fitness for purpose. It covers the entire statistical production process and involves monitoring data quality over time and reporting on variations in that quality.
An assessment of the level of risk based on the Quality Assurance Toolkit is as follows:
- Risk/Profile Matrix Statistical Series: Product Safety Database statistics
- Administrative Source: PSD held on GOV.UK
- Data Quality Concern: Low
- Public Interest: Medium
- Matrix Classification: Medium Risk [A1/A2]
Overall, the PSD statistics have been assessed as A1: Medium Risk. This is mainly driven by the medium-profile nature of the figures: although the scope is limited to the work of MSAs, some product recalls and safety alerts attract publicity. The data quality concern is considered low given that the published fields are checked by providers and the data is then quality assured in detail by the statisticians responsible for the publication, who perform further validation and checks, spotting and correcting any errors.
Quality assurance by MSAs
MSAs are responsible for submitting completed notifications, as well as for the accuracy of data provided. This responsibility includes checking the validity and accuracy of data, and amending any errors or omissions in the data entered on the PSD.
Quality assurance by OPSS
Data received by OPSS undergoes a quality assurance process to ensure the data is fit-for-purpose and published to the highest possible standard. Any data quality issues are flagged and subsequently resolved with MSAs.
A data cleansing project was initiated in 2022 to examine the quality of OPSS’s notifications. Closed notifications with blank or NULL values for some of the fields [footnote 1] were referred back to the creator to be reviewed, and information was added where possible. This led to improvements to the data for around 5,000 notifications. In addition, open notifications were reviewed by local authorities (LAs) and OPSS teams and closed where possible. Fields that are used in the annual report were prioritised and there is an ongoing project to identify further fields for data cleansing. These checks are now included as part of the preparation of the data for the annual report.
OPSS statisticians look at data gaps and variance checks to identify figures that seem unusually large or small compared with expectations or unusual patterns in the data. A large increase or decrease is flagged to the RA and they are asked for an explanation.
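The variance checks described above can be sketched as follows. This is an illustrative Python sketch only (the production pipeline is written in R); the category names, counts and the 50 per cent tolerance are hypothetical and are not taken from the PSD.

```python
# Illustrative variance check: flag categories whose latest count differs
# from the previous year's by more than a set tolerance. The field names
# and the 50% threshold are invented for this example.
def flag_unusual_changes(previous, current, tolerance=0.5):
    """Return (category, change) pairs whose year-on-year change exceeds the tolerance."""
    flagged = []
    for category, prev_count in previous.items():
        curr_count = current.get(category, 0)
        if prev_count == 0:
            if curr_count > 0:
                flagged.append((category, None))  # new category, no baseline to compare
            continue
        change = (curr_count - prev_count) / prev_count
        if abs(change) > tolerance:
            flagged.append((category, round(change, 2)))
    return flagged

previous = {"Toys": 900, "Cosmetics": 150, "Furniture": 40}
current = {"Toys": 950, "Cosmetics": 320, "Furniture": 38}
print(flag_unusual_changes(previous, current))  # [('Cosmetics', 1.13)]
```

Any category flagged in this way would then be queried with the data provider, mirroring the process described above.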
OPSS statisticians extract the data from the PSD using PostgreSQL and produce the HTML charts and analysis with a Reproducible Analytical Pipeline (RAP) written in the software package ‘R’. Not only does using a RAP greatly reduce the production time, it also reduces the human interaction needed and therefore minimises the risk of errors. Any errors that do occur are likely to be systematic and therefore only need to be fixed once. The R code is often complex and so needs to be well documented, with considerable attention to detail.
Once the commentary and charts are produced for the publication, they are independently checked by a second OPSS analyst against the raw data. They also work through a clear checklist for each table, checking, for example, that totals are consistent within and between tables, that totals and percentages within tables are calculated correctly, that hyperlinks work, and that formulae and drop-down lists update correctly. Each check is systematically signed off when it has been completed.
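Checks of this kind can be sketched in code. The following is an illustrative Python sketch (the production checks sit within the R pipeline); the table rows, published total and rounding tolerance are invented for the example.

```python
# Illustrative table checks: recompute the row total and compare it with the
# published total, and verify that percentages sum to 100 within a small
# rounding tolerance. The table contents are made up for this example.
def check_table(rows, published_total):
    """Return a list of failed checks (an empty list means the table passes)."""
    failures = []
    computed_total = sum(count for _, count in rows)
    if computed_total != published_total:
        failures.append(f"total mismatch: {computed_total} != {published_total}")
    percentages = [round(100 * count / computed_total, 1) for _, count in rows]
    if abs(sum(percentages) - 100) > 0.5:  # allow small rounding differences
        failures.append("percentages do not sum to 100")
    return failures

rows = [("High", 120), ("Serious", 60), ("Medium", 15), ("Low", 5)]
print(check_table(rows, published_total=200))  # []
print(check_table(rows, published_total=190))  # ['total mismatch: 200 != 190']
```

Automating checks like these means each rule is applied identically to every table, with the sign-off step recording that the check ran cleanly.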
OPSS statisticians are responsible for checking that the commentary appropriately describes the data, is not biased, and is accurate against the data. The report and data tables are checked for accessibility using tailored guidelines and a checklist drawn up from the Web Content Accessibility Guidelines.
Access the Accessibility Guidelines.
Reports are peer reviewed and signed off at a senior level prior to publication.
The data underpinning publications is held in a form that allows the content of publications to be replicated, and that retains the option for additional historical scrutiny and analysis if required.
Feedback received may include queries on the data itself, or requests for the reasons behind levels or changes in the data, to understand trends or unusual figures. It is rare for errors to be made in publications. When they are made, they are corrected either immediately or in the next release (depending on severity and frequency) in line with the revisions policy.
1.4 Revisions policy
OPSS will notify users when they plan to publish new editions and if any revisions are required, in line with T3.9 in the Code of Practice for Statistics. A revision is defined as any change to official statistics in the public domain. These revisions can take the form of pre-announced revisions, or unannounced revisions, such as when data updates are published without prior announcement. The release follows the DBT statistics error policy.
Read the Orderly release section of the Code of Practice for Statistics – UK Statistics Authority website
Read the DBT statistics error policy.
Non-scheduled revisions
Where a substantial error has occurred because of the compilation, imputation or dissemination process, the statistical release, tables and other accompanying documents will be updated with a correction notice as soon as is practical.
Scheduled revisions
Changes to the data sources used in the releases are incorporated in the next scheduled release of data.
1.5 Notes on the data in the Annual report
How the data can be used
The report provides the official estimate of the number of products and notifications added to the PSD in a financial year.
How the data cannot be used
Product notification numbers cannot be used to estimate the number of unsafe or non-compliant goods at a national level as there is no guarantee that all non-compliant and unsafe products are notified on the PSD.
Definition of fields included in report
Corrective actions – notifications can be updated with information on corrective measures when this becomes available and so the data table for previous years is updated alongside the next annual report. From 2023 to 2024 the corrective action tables are based on closed (ie complete) notifications and so the updated tables for 2021 to 2022 and 2022 to 2023 also use this revised methodology.
Answer categories are presented in a drop-down list with the following options:
- Ban on the marketing of the product and any accompanying measures
- Destruction of the product
- Import rejected at border
- Making the marketing of the product subject to prior conditions
- Marking the product with appropriate warnings on the risks
- Modification programme
- Other
- Product brought back into compliance
- Product no longer available for sale
- Recall of the product from end users
- Removal of the listing by the online marketplace
- Referred to an overseas regulator
- Seizure of goods
- Temporary ban on the supply, offer to supply and display of the product
- Warning consumers of the risks
- Withdrawal of the product from the market
Harm – defined in the OPSS Product Safety Risk Assessment Methodology (PRISM) as an adverse impact on individuals, the environment, infrastructure, property, animals or businesses. It is mandatory for notifications where the product is unsafe or both unsafe and non-compliant. In previous publications this was referred to as ‘hazard’ but has been replaced with the descriptor ‘harm’ to ensure consistency with PRISM, where ‘hazard’ has another meaning. Answer categories are presented in a drop-down list with the following options:
- Asphyxiation
- Burns
- Chemical
- Choking
- Cuts
- Damage to hearing
- Damage to sight
- Drowning
- Electric shock
- Electromagnetic disturbance
- Entrapment
- Environment
- Fire
- Health risk
- Injuries
- Microbiological
- Security
- Strangulation
- Suffocation
Risk – defined in PRISM as a function of the level of a hazard (a potential source of harm) and the likelihood (or probability) that the hazard will cause harm. Notifications can be updated with information on risk assessments when this becomes available. Not all notifications require a risk to be set. Answer categories are presented in a drop-down list with the following options:
- High
- Low
- Medium
- Serious
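The PRISM definition of risk as a function of hazard level and likelihood can be illustrated with a simple lookup. The matrix below is purely hypothetical and is not the actual PRISM matrix; the severity and likelihood categories are invented for the example.

```python
# Hypothetical severity x likelihood lookup illustrating risk as a function
# of the level of a hazard and the probability that it causes harm.
# This is NOT the actual PRISM matrix; the categories are invented.
RISK_MATRIX = {
    ("low", "unlikely"): "Low",
    ("low", "likely"): "Medium",
    ("high", "unlikely"): "High",
    ("high", "likely"): "Serious",
}

def risk_level(hazard_severity, likelihood):
    """Combine hazard severity and likelihood into one of the four PSD risk levels."""
    return RISK_MATRIX[(hazard_severity, likelihood)]

print(risk_level("high", "likely"))  # Serious
```

A lookup of this shape shows why two products with the same hazard can carry different risk levels: the likelihood of the hazard causing harm differs.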
Product category – This is mandatory. Answer categories are presented in a drop-down list with the following options:
- Chemical products
- Childcare articles and children’s equipment
- Clothing, textiles and fashion items
- Communication and media equipment
- Construction products
- Cosmetics
- Decorative articles
- Electrical appliances and equipment
- Explosive atmospheres equipment
- Food-imitating products
- Furniture
- Gadgets
- Gas appliances and components
- Hand tools
- Hobby / sports equipment
- Jewellery
- Kitchen / cooking accessories
- Laser pointers
- Lifts
- Lighters
- Lighting chains
- Lighting equipment
- Machinery
- Measuring instruments
- Motor vehicles (including spare parts)
- Other (this includes a few medical devices and medicines which are outside OPSS’s remit, but have been included here as it is not possible to separate them from the notification they relate to)
- Personal Protective Equipment (PPE)
- Pressure equipment / vessels
- Pyrotechnic articles
- Recreational crafts
- Stationery
- Toys
2) Quality summary
Quality is defined in terms of how well outputs meet user needs. The Code of Practice for Statistics states that quality means that statistics fit their intended uses, are based on appropriate data and methods, and are not misleading. This definition is a relative one allowing for various perspectives on what constitutes quality depending on the intended use.
Access the Code of Practice for Statistics – UK Statistics Authority website
In order to determine whether outputs meet their needs, quality is measured in terms of the quality dimensions of the European Statistical System.
Read about the European Statistical System – Europa website
2.1 Relevance
The degree to which the statistical product meets user needs in both coverage and content.
The statistics provide information on unsafe and non-compliant products added to the PSD in the UK. There is some information covering product safety alerts in the UK from 2004 to 2020 from the RAPEX system published on the EU website. The first publications from PSD were released by OPSS in 2023, covering 2021 to 2022 and 2022 to 2023 notifications.
Access the RAPEX reports – Europa website
In addition to the annual report, more detailed data tables are published in OpenDocument Spreadsheet (ODS) format on GOV.UK.
The data is used by OPSS policy colleagues to inform policy development, and by operational delivery teams to identify emerging trends and issues to inform intelligence development and proactive investigations and enforcement activity. In addition, LAs use the data to monitor and benchmark performance, to make strategic decisions, for planning and risk assessment and to identify the most important areas to target for testing activity.
Other interests in and uses of this data are outlined in the section on Uses and users.
We review our data collections and outputs periodically to ensure that they are relevant, collect reliable data and meet user needs. More details are in section 2.7 on Assessment of user needs and perceptions.
The content of the PSD is reviewed and refined on a regular basis and new fields or answer options added where appropriate. New fortnightly releases of the PSD include notes to explain to users what has changed.
Uses and users
We believe the uses of the PSD statistics are:
- Informing the general public – the statistics may be used by both national and local media, which in turn informs the public about product safety activity; information on the statistics can also be requested by Parliamentary Questions and Freedom of Information requests. Academics may use the statistics to enrich their research.
- Policy making and monitoring – the statistics are used by policy areas to monitor the identification of unsafe and non-compliant products and to provide context and evidence for policies; the data is also used to provide advice to Ministers and for impact assessments.
- Identifying risks.
- LAs – comparisons and benchmarking, enabling LAs to identify which areas their resources should be prioritised towards using targeted testing.
- Third parties – the statistics may be used by a range of third parties, e.g. businesses for market research.
We believe the users of PSD statistics may include:
- LAs and other MSAs
- other colleagues within OPSS
- other government departments for example DLUHC
- trade unions
- journalists
- Chartered Institute of Public Finance and Accountancy
- Local Government Association
- Ministers
- Members of Parliament
- individual citizens
- private companies and trade associations
- students, academics and universities
- charities
- consumer groups
- security services and intelligence
2.2 Accuracy and reliability
The proximity between an estimate and the unknown true value.
Guidance for completing the PSD is provided by OPSS. Product Safety regulation requires specialist knowledge and expertise and the Regulators’ Companion [footnote 2] pages provide links to resources and learning opportunities to support LA regulators with this work. The PSD has undergone some significant functional improvements since it was introduced, for example it is now possible to link a product to more than one notification to support LA identification of products outside of their area to share data.
MSAs use the PSD as a live portal to:
- notify products posing a risk to the health and safety of consumers and/or products which are non-compliant by creating PSD ‘product’ and ‘notification’ records
- assign notifications to, and receive referrals from, LAs or organisations and track the notification progress
- search for products or businesses already notified in the PSD
- search for products and view all the notifications which have been created about the product
- take corrective action to restrict the availability of a product on the market
- add more evidence to notifications as it is gathered
The database has a large number of fields with some in-built validation rules to ensure that basic validation errors are avoided. However, there can still be some inaccuracies in the data due to reporting or keying errors, such as misclassification or missing notifications. No grossing, imputation or other estimation methods are currently used.
As discussed in the section on quality assurance, the Analysis team run some specific checks before publishing the data.
OPSS provides support through an enquiries email inbox if users experience any difficulties in entering data onto the PSD. OPSS also run regular online training sessions for users via the Chartered Trading Standards Institute.
Accuracy can be broken down into sampling and non-sampling error.
The notifications provided by MSAs are required under legislation, and we aim to achieve 100 per cent notification, thereby reducing sampling error to the minimum. In the period 1 April 2023 to 31 March 2024, notifications were received from 164 of the 218 LAs in the UK that undertake Trading Standards activities. It is possible that some LAs did not have any product safety issues to notify on the PSD, but OPSS teams are working with LAs to ensure the optimum level and quality of data are recorded on the PSD, and to understand and resolve barriers to effective use. It is also possible that low risk notifications are missing.
The published report is produced via a RAP as discussed in the section on quality assurance and these statistics are therefore considered to be robust and reliable.
Non-sampling error includes areas such as coverage error, non-response error, measurement error and processing error.
We aim to reduce non-sampling error through the provision of guidance about the data collections and the definitions of the data items.
2.3 Timeliness and punctuality
Timeliness refers to the time gap between publication and the reference period. Punctuality refers to the gap between planned and actual publication dates.
There is a trade-off between timeliness and the other quality dimensions, in particular accuracy, accessibility and clarity. It is important to ensure that the release has adequate processes to ensure accuracy of the dataset and produce clear publication outputs.
To provide timely data to users, we aim to publish figures annually, as soon as possible given resource constraints. The first two reports covering April 2021 to March 2023 data were published on 18 October 2023. The report covering April 2023 to March 2024 data was published on 22 August 2024.
The PSD is a live database, with updates being added all the time. After a snapshot of the database covering the relevant time period is taken, the data is quality assured by OPSS statisticians, charts and commentary are prepared, and the outputs are then quality assured and prepared for publication.
The data production and publication schedules are kept under review and take into account user needs when considering the timeliness of future data releases.
The publication date for the next edition will be pre-announced on GOV.UK.
View upcoming OPSS statistics.
2.4 Accessibility and clarity
Accessibility is the ease with which users are able to access the data, also reflecting the format in which the data is available and the availability of supporting information. Clarity refers to the quality and sufficiency of the metadata, illustrations and accompanying advice.
The PSD statistics webpages are accessible from the OPSS GOV.UK pages. An RSS feed will alert registered users to this publication. All releases are available to download free of charge.
Access the PSD annual reports.
The outputs aim to provide a balance of commentary and charts accompanied by ODS files of the data returns. The aim is to ‘tell the story’ in the output, without the output becoming overly long and complicated.
The publication is available in HTML format and includes email contact details for sending queries and feedback to the production team.
The format used is in line with GOV.UK guidance. It aims to make outputs clear for the audience and all outputs adhere to the accessibility policy. Key users of the publication are informed of the statistics on the day of their release. Further information regarding the statistics can be obtained by contacting opssanalysis@businessandtrade.gov.uk.
The data published on the PSD collection page of GOV.UK are subject to rights detailed in the Open Government Licence v3.0: ‘All content is available under the Open Government Licence version 3, except where otherwise stated’.
View the Open Government Licence – National Archives website
The statistics are taken directly from the source data that is collected for administrative purposes with little manipulation between source and publication.
2.5 Coherence and comparability
Coherence is the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar.
The statistics covered by this report are the only source of official data on that subject. The data is collected from all providers on the same portal with accompanying guidance and definitions. This ensures consistency across the different types of data provider.
Comparability is the degree to which data can be compared over time and domain.
The statistics are taken directly from MSAs in the UK, using the same question set and guidance documents.
Comparison over time
The OPSS published reports cover the period from April 2021 onwards. Prior to this, reports were published by the EU and the data was presented in a different format.
Devolved administration data sources
The database is used by England, Wales, Scotland and Northern Ireland Authorities.
Geographies below UK level
The data is published at UK level. Although data is collected at lower geographical levels, MSAs do not currently use the system consistently and so it would not be possible to show the information in a meaningful way. OPSS are working with MSAs to improve their use of the PSD.
Harmonisation of statistics inputs and outputs
A cross-governmental programme of work is currently underway looking into standardising inputs and outputs for use in National Statistics. This is known as harmonisation. The Government Statistical Service published a Harmonisation Strategy in 2019. Its aim is to make it easier for users to draw clearer and more robust comparisons between data sources. OPSS adopts harmonised questions where possible, and harmonisation will be part of any ongoing changes to the system.
Access the Harmonisation Strategy – Government Analysis Function website
2.6 Trade-off between output quality components
Trade-offs are the extent to which different aspects of quality are balanced against each other.
As discussed previously, the timetable for publication is designed to provide users with the best balance of timeliness and accuracy.
2.7 Assessment of user needs and perceptions
The processes for finding out about users and uses, and their views on the statistical products.
OPSS teams work closely with key customers and stakeholders in both OPSS and LAs to keep track of developments, and to continuously review the coverage and content of publications to ensure that they meet the needs of users.
We also consult our users on data collection issues to better understand user requirements and priorities for the future. As part of this, OPSS policy colleagues, LAs and others have provided information on how they use statistics as discussed in the earlier section on Uses and users. We encourage feedback on all our outputs and data collections. New fortnightly releases of the PSD are notified to users and include information on how to be involved in user testing and help shape the future of the PSD.
Contact details are also available at the bottom of each publication page for users to get in touch if they have a query.
Stakeholder engagement
OPSS statisticians have worked closely with stakeholders to ensure that as much data as possible is shared, upholding the principle of collect once and use many times.
User consultations
Broader consultations will be conducted in the future when appropriate, for example when significantly changing the provision or coverage of the published data, or revising the methodology used.