Fostering in England: methodology
Published 12 November 2020
Applies to England
Introduction
This document contains methodology and quality information relevant to the Office for Standards in Education, Children’s Services and Skills (Ofsted) annual release of fostering data, covering all fostering provision in England.
You should read this methodology and quality report in conjunction with the background notes contained within the statistical first release (SFR), because those notes include helpful information that is not in this report.
The SFR contains data sourced from local authorities (LAs) and independent fostering agencies (IFAs). The release includes data about fostering agencies, fostering households and foster carers.
The data includes:
- LA fostering agencies, including IFAs performing the function of LA fostering agencies
- IFAs
This release of fostering in England data covers the period between 1 April 2019 and 31 March 2020.
The statistical release is published as a full version annually and contains final data. We publish a number of official statistics covering children’s social care, including fostering, adoption and children looked after placements.
We welcome feedback about our statistical releases. If you have any comments, questions or suggestions, please contact the Social Care Data and Analysis team at socialcaredata@ofsted.gov.uk.
Relevance
Ofsted regulates and inspects to achieve excellence in the care of children and young people, and in education and skills for learners of all ages. We release our official statistics to promote reform and improvement across government through increasing transparency and citizen participation.
We regulate and inspect IFAs under the Care Standards Act 2000, which, together with regulations made under section 22, sets out the legal basis for regulating fostering agencies. More information is available on how we regulate and inspect IFAs.
We inspect LA children’s services functions, including fostering, under section 136(2) of the Education and Inspections Act 2006.
Uses for fostering data
Primarily, the data is collected from providers to support inspections of IFAs and LA fostering services. The data is analysed at agency level, alongside comparator data, to prompt lines of enquiry that are followed up at inspection. The data is also used to evaluate the effectiveness of fostering agencies, including ongoing monitoring of performance and improvement work.
We analyse the data to further enhance insight into the sector. This analysis can then be used by others for planning and providing public services, for example, by informing them about the capacity of social care provision nationally and by area. We also use the analysis to inform our policy discussions and decisions, for instance by contributing to reviews of inspection frameworks, evidence and reports. We also use the data to respond to ad hoc requests and to give context to emerging issues or the impact of changes in the sector.
Communicating with users
The data is published annually as official statistics. The aggregation of data for official statistics allows us to communicate to users the key data and messages, for example, at different geographical levels and by provider type. The official statistics draw out the key messages and communicate these in an understandable way, appropriate for a wide range of different users. Users are able to interpret and manipulate the data published for their own purposes, because the release includes underlying data.
The data may, therefore, be used by stakeholder groups, academics and other interested parties across the sector. Fostering agencies may also use the data themselves, for local and regional use, to inform on areas of practice and to improve processes and standards.
Data sources
Some of the data about fostering in England is unique to Ofsted; alternative sources are not available. For instance, data about the capacity of fostering services and recruitment activity in England, as well as some other indicators, is only collected by Ofsted. This data is widely viewed as a valuable source for information about recruitment and capacity, for identifying vacancies and for providing an in-depth overall picture of fostering in England.
Some data is also published by other sources, such as the Department for Education (DfE). For example, ‘Children looked after in England’ includes data about fostered children, who make up the majority of children in care.
Response rates
Due to the impact of COVID-19 (coronavirus), a small number of agencies did not submit data this year. Because of this, numbers may be lower than in previous years and any yearly comparisons must be used with caution. For more information, see Ofsted guidance and information relating to COVID-19.
We collected data from 410 agencies (LA and IFA), which is 94% of all 434 eligible agencies. Less than 1% of all returns contained data inaccuracies that had not been resolved. This data has been included in the national, LA and IFA data and in this report, with an acknowledgement where necessary.
Of the 288 eligible IFAs (excluding trusts), we received data from 272 (94%), which included 2 returns with data inaccuracies.
Of the 146 eligible LAs (including trusts), we received data from 138 (95%). The Isles of Scilly and the City of London provided nil returns, because their fostering services are provided by Cornwall and the Pan-London arrangement respectively, rather than in-house. Hammersmith and Fulham, Kensington and Chelsea, and Westminster’s fostering services submitted a single combined return.
Due to the lower response rate this year because of COVID-19, we have used estimates for some main comparators. To calculate these estimates, we took a 3-year average for the 21 agencies (8 LAs and 13 IFAs) that were eligible to submit this year but did not: we looked at the data each of these agencies submitted for the previous 3 years and worked out an average for each agency. These averages were then added to the data we collected this year. Where estimated figures have been used in the release, this is clearly signposted. A further 3 eligible IFAs were new registrations this year, so we were unable to estimate figures for these agencies as they had no previous data.
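As an illustration of this approach, the sketch below (in Python, with invented agency names and figures) shows how a 3-year average could be calculated for each non-submitting agency and added to this year's collected total; it is not the exact production process.

```python
# Illustrative sketch of the 3-year-average estimation described above.
# Agency names and figures are invented; this is not the exact production process.

# Figures submitted by two hypothetical non-returning agencies over the previous
# 3 collections, for one indicator (for example, approved fostering households).
previous_returns = {
    "Agency A (LA)":  [52, 55, 58],
    "Agency B (IFA)": [120, 118, 121],
}

collected_total = 43_905  # hypothetical total from agencies that did return data

# Estimate each missing agency's figure as the average of its last 3 returns.
estimates = {agency: sum(values) / len(values) for agency, values in previous_returns.items()}

# Add the estimates onto the collected total to produce the published estimate.
estimated_total = collected_total + sum(estimates.values())

print(estimates)               # {'Agency A (LA)': 55.0, 'Agency B (IFA)': 119.66...}
print(round(estimated_total))  # 44080
```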
Meeting users’ needs
The content of the survey and accompanying guidance are reviewed annually by the Social Care Data and Analysis and Social Care Policy teams, to develop the collection. Senior managers then sign off any changes that are required. This review process ensures that the survey will meet the data requirements for inspections and takes into account any policy changes or emerging issues.
Online data collection
A new online portal system was introduced for the 2017 to 2018 collection, following a formal consultation in September 2017 about modernising the collection. Feedback was subsequently sought from users, and updates have been made to the system where necessary to meet user needs. Using the online system enables us to release this publication more quickly.
Helping and meeting with our users
Ofsted’s Social Care Data and Analysis team is represented at regional meetings of performance leads from LAs. These regional meetings are organised by the Association of Directors of Children’s Services (ADCS). We have presented relevant information and publications to all of the existing regional groups, and sought feedback about whether our publications meet user needs. Our participation in these groups helps inform us about users’ views on our social care official statistics. We take on board suggested improvements. We have also attended additional workshops to help data suppliers and stakeholders to understand the data that is being asked for and how it can be used. An online webinar is available to familiarise data suppliers with the online portal collection system. Ofsted inspectors attend annual conferences, which include sessions on the data that is collected and how this can inform inspection preparation.
Social care outputs are also shared with users in other organisations, such as the DfE, LAs, and representatives from the private and voluntary sectors at Ofsted-led stakeholder meetings. These organisations use the data for a range of purposes including informing their own social care outputs (DfE) and benchmarking performance (LAs).
The accompanying guidance includes a glossary of terms, to help clarify what is being requested, and descriptions of all codes used for data entry. The online portal allows agencies to quality assure (QA), validate and sign off their data to help make sure it is accurate and complete. We publish additional troubleshooting guidance to assist users with the process of submitting data through the online portal. The Social Care Data and Analysis team also offers help and guidance to agencies via email and telephone.
More information
The contact details for the Social Care Data and Analysis team are included in the releases. Users are encouraged to provide feedback about any unmet needs or suggested improvements, and to ask questions that are not covered by the glossary definitions and supporting guidance.
See more information about Ofsted’s engagement policy and confidentiality and access policy, as well as Ofsted’s statement of administrative sources. We also operate under more detailed internal engagement guidance.
Coherence and comparability
Ofsted has reported on fostering data in England since 1 April 2008. Over time, the data collection has been developed and improved. As the survey is reviewed and questions are amended or added, some areas are not comparable over time.
Reasons for these changes include:
- changes in legislation or policy; for example, a question about the staying put arrangement was added when this arrangement was introduced
- in response to data supplier feedback; for example, this year we stopped collecting child-level data because it duplicated data in the DfE SSDA903 Children Looked After collection. Minimising the collection of duplicate data reduces unnecessary additional burden on data suppliers
- to provide more nuance in the resulting analysis; for example, placement offer data is now collected at household rather than agency level
Where data cannot be directly compared over time, notes are included in the release to alert users.
The response rates have varied over the course of the survey. However, excluding this year, when response rates were slightly lower due to the impact of COVID-19, they had been consistently high at around 98% to 99% since 2012 to 2013.
Comparable data
Where appropriate, we reference comparable data collected by the DfE or the Office for National Statistics (ONS). For example, we reference the DfE’s annual collection on ‘Children looked after in England’ and the ONS’s publication of data from the 2011 census.
Data is presented at England-level and then sub-divided by sector: LA or IFA. A small amount of comparable data is collected for other countries in the UK, for example, the number of approved LA foster carers and places in Wales, and data on children looked after in Scotland in 2018 to 2019; however, this is minimal and so has not been included. Comparable data for other countries, including via the Eurostat database, is not available.
The underlying data that accompanies the fostering in England release has been streamlined and improved, to increase value to users. As a result of improved data quality in recent years, we have now been able to publish underlying data that includes more detail of fostering types, such as:
- fostering to adopt
- emergency foster care
In previous releases, fostering households were reported against aggregated categories:
- mainstream
- family and friends
- short breaks only
Historical data has been slightly recalculated, where available, to reflect the greater detail of fostering types that we now include in our data.
This year, for the first time, we have published underlying data at IFA-level. This data includes the number of fostering households and the number of foster carers registered with each IFA, as at 31 March 2020.
Comparisons may be adversely affected by different reporting practices across data suppliers. For example, one agency may only record the ethnicity of the primary carer in a household, while most record the ethnicity of both carers where applicable.
The annual collection gives data at consistent intervals. It includes snapshot data as at 31 March and data covering the 12-month period between 1 April and 31 March. The reporting period used is made clear in the release. An exception to the defined reporting periods may occur in certain circumstances, for example, if a new piece of legislation came into effect mid-way through the financial year.
Accuracy and reliability
All LAs and IFAs are asked to complete this return on a voluntary basis. Despite the lower submission rates in 2019 to 2020, there was still a 95% response rate from LAs and a 94% response rate among IFAs. Because the number of agencies submitting data was lower than in previous years, we have estimated values for some main comparators.
We carry out this survey across all LA fostering agencies and IFAs in England, so there is no risk of potential bias through sample selection. The data is sourced from the agencies’ administrative systems, and therefore the data returned by each agency reflects fostering households as at 31 March 2020. We recognise, however, that the data was collected from 410 different agencies and that detailed information on their internal QA processes is not available. In total, 3 agencies (less than 1%) returned data with discrepancies that had not been resolved before the deadline for publication. This was in line with last year.
Quality assurance process
The data is subject to a rigorous QA process, by both data suppliers and Ofsted. The online portal system has in-built validation functions that assist data suppliers with checking and amending the data. We also discuss with data suppliers any queries or errors in the data and resolve these to ensure the most accurate data the agency can supply. For some agencies with a larger number of issues, or that are new to completing the return, these conversations can be detailed and lengthy.
There are 2 tiers of validation carried out by the online portal system:
- at the point of upload, the portal checks that all fields contain valid data
- using the validation tool, the portal checks that data is complete and accurate
QA checks during upload
Invalid data: data that does not match the specified code set and format may be omitted from the upload or prevent the file uploading. For example, 31/02/2020 is an invalid date and WRBI is an invalid code.
Invalid ID numbers: if any ID numbers have been duplicated within the data, this would prevent the file from uploading.
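To illustrate the kind of checks carried out at upload, the sketch below (Python, with a hypothetical code set and field values) shows date, code and duplicate-ID validation; it is not the portal's actual implementation.

```python
# Illustrative sketch of upload-stage checks: date format, code set and duplicate IDs.
# The code set and example values are hypothetical; this is not the portal's implementation.
from datetime import datetime

VALID_CODES = {"WBRI", "WIRI", "MWBC", "AIND", "BCRB"}  # example code subset only

def is_valid_date(value: str) -> bool:
    """Return True if the value is a real calendar date in DD/MM/YYYY format."""
    try:
        datetime.strptime(value, "%d/%m/%Y")
        return True
    except ValueError:
        return False

def is_valid_code(value: str) -> bool:
    """Return True if the value belongs to the specified code set."""
    return value in VALID_CODES

def has_unique_ids(ids: list[str]) -> bool:
    """Return True if no ID number is duplicated within the upload."""
    return len(ids) == len(set(ids))

# The examples from the guidance above: 31/02/2020 is not a real date and WRBI is not a valid code.
print(is_valid_date("31/02/2020"))                  # False
print(is_valid_code("WRBI"))                        # False
print(has_unique_ids(["FC001", "FC002", "FC001"]))  # False - a duplicate blocks the upload
```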
QA carried out by the validation tool
Inconsistent data: validation rules compare data submitted across multiple fields to check that the data is consistent.
For instance, in recruitment data we would expect all ‘end dates’ to be after ‘start dates’.
Incomplete data: based on the users’ response, additional fields may be expected to be completed.
For instance, if there are 2 foster carers within a household, we expect ethnicity and training data for both carers.
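The sketch below illustrates the two kinds of validation-tool checks described above, consistency and completeness, using hypothetical record structures and field names rather than the portal's actual schema.

```python
# Illustrative sketch of the validation tool's cross-field checks.
# Record structures and field names are hypothetical, not the portal's actual schema.
from datetime import date

def dates_consistent(record: dict) -> bool:
    """Consistency check: a recruitment end date should fall after its start date."""
    return record["end_date"] is None or record["end_date"] > record["start_date"]

def missing_carer_fields(household: dict) -> list[str]:
    """Completeness check: ethnicity and training data is expected for every carer listed."""
    issues = []
    for carer in household["carers"]:
        for field in ("ethnicity", "training_completed"):
            if carer.get(field) is None:
                issues.append(f"{carer['id']}: missing {field}")
    return issues

enquiry = {"start_date": date(2019, 6, 1), "end_date": date(2019, 5, 1)}  # inconsistent dates
household = {"carers": [
    {"id": "carer 1", "ethnicity": "WBRI", "training_completed": True},
    {"id": "carer 2", "ethnicity": None, "training_completed": None},  # second carer incomplete
]}

print(dates_consistent(enquiry))        # False - end date is before the start date
print(missing_carer_fields(household))  # ['carer 2: missing ethnicity', 'carer 2: missing training_completed']
```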
After the QA process
Once the portal has carried out the QA checks, specific guidance is given to direct users to the field(s) containing errors. Guidance is available in 3 formats to allow maximum flexibility for users to check and amend their data. For instance, some users may prefer to download a validation report and then make changes to their original upload template, whereas other users may prefer to use the webform view, where individual records can be edited directly in the portal.
The data is submitted online and stored securely in an Amazon data centre that meets all Government Digital Service data requirements. Data exported from the online portal is stored in system folders that are accessible only to members of the Social Care Data and Analysis team. As data is now collected at person-level, agencies are asked to provide identifiers only, not names.
The deadline for all agencies to submit a validated return was 31 July 2020. Due to the impact of COVID-19, a very small number of individual extensions were granted.
QA checks are also carried out on the combined dataset, the analysis and the key findings, along with any supplementary statistics that are going to be published.
Where applicable, data is considered against the DfE data on Children Looked After as a ‘sense check’. However, the time periods in the 2 returns do not always match up, so this check has limited utility.
Strengths of the data
High response rates with all data provided: excluding this year, response rates for the previous 5 years had been 98% to 99%. This year, despite the impact of COVID-19, we still achieved a 94% response rate.
Thorough QA of the data: we carry out detailed QA of all returns, which means we have a high level of confidence in the data presented. The QA tools, processes and outcomes are described in the introduction to the main report and elsewhere in this quality report.
Data benchmarking: when applicable, data returned to us is benchmarked against data submitted to the DfE, and generally found to be in line with DfE data. When there are differences, this is likely due to additional QA work done with individual agencies, particularly IFAs, to ensure the quality of the data. This level of QA work is not always possible for IFAs, as the DfE collects data from LAs only.
Comprehensive picture of fostering: due to high response rates, and the volume of data collected, as well as the mix of in-year and end-of-year figures, the data provides a comprehensive picture of fostering in England over time.
Embedding of new methodology: the number of agencies whose returns still contained errors when the collection closed was in line with last year.
Limitations of the data
Voluntary nature of the collection: the data collection is voluntary. As a result, response rates may fluctuate, though, excluding this year, response rates had been close to 100%, with all respondents providing all requested data. As the collection is voluntary, there is no legislation compelling agencies, including LAs, to supply data, or to provide information on their own data quality. Her Majesty’s Chief Inspector has also not made use of her powers to compel reporting of this data.
Comparability to previous years: the changes made to the form each year, and particularly the introduction of a new way of collecting data from 2015 to 2016, meant that some previously comparable data items could no longer be compared to previous years’ data.
Known issues and variance with the data collected: there are some known issues and variance with the data collected. Some agencies reported that they were unable to provide some data items before the final deadline. Also, in some cases, there were errors in the returns that could not be resolved by the deadline. A small number of IFAs did not fully validate their data, and their submissions included small errors, such as omitted approval dates for households. As a result, we are aware that the data may not be as robust as hoped. However, using the online portal and its in-built validation has streamlined the process and reduced the number of datasets submitted with errors, compared with the manual process used up until 2016 to 2017.
Different reporting practices: different agencies have different reporting practices; there is no standard across all fostering agencies. This may affect the burden that completing the return places on some agencies. There is also some anecdotal evidence of different recording practices in different agencies, such as around children going missing for short periods (less than 24 hours).
Minimal knowledge of data quality at provider level: although we extensively quality assure all data returns, information is not generally available about how agencies ensure the accuracy of their own data. Steps are taken to mitigate the impact of this, including QA and the provision of guidance on completing the form. A helpline is also operated so that data suppliers can speak to a member of the Social Care Data and Analysis team about any queries.
Potential sources of error and bias
We have no direct knowledge of the agencies’ data storage systems, or the checks and QA they carry out on this. We are currently unable to explore this in any depth due to the time and resource needed. However, we perform extensive QA of data to minimise the impact of this. Also, the data collected is broadly similar year on year, so it is likely that agencies have systems in place to collect and report on the required data. There has also been evidence of improvement over the years as a result of this data collection.
There will always be situations that do not fit easily into the categories supplied. In these cases, personal interpretation may mean that different data suppliers code similar situations in different ways using a ‘best-fit’ approach. We aim to minimise this through use of guidance and support.
Agencies may perceive that the data will be used to form a judgement on their services, which could bias their return. We include guidance on the purpose of the collection in an aim to reduce this concern.
Finally, all agencies are asked to, and most do, submit data. Therefore, there is minimal risk of sample or response bias in the data.
Timeliness and punctuality
Statistics are produced and published on an annual basis.
Data is published on the date pre-announced in the publication schedule. You can also find information on any delay in publication on the publication schedule. Reasons why a delay may occur include, for example, when more time is necessary to properly QA the data to ensure its robustness. We announce publications on our Twitter account and other social media channels on the day of release.
The average timescale for producing the fostering data release is approximately 6 months. This includes approximately 3 months for collecting and validating the data, and an additional month for support and follow-ups with agencies. A further 6 weeks of production time covers:
- the analysis
- drafting the findings
- creating the statistical release
- QA of all outputs and publication on GOV.UK
Pre-release access is given in accordance with the Pre-release Access to Official Statistics Order 2008, as detailed in Ofsted’s pre-release policy.
Accessibility and clarity
Ofsted releases are published in an accessible format on GOV.UK. The information is publicly available and there are no restrictions on access to the published data.
Data covering children’s social care is held on a collections page on GOV.UK.
The primary function of the data is to meet Ofsted’s data requirements for inspections. However, the data is shared for public use, with the intention of informing users about the fostering sector and allowing re-use by analysts and researchers as required. The presentation of the underlying data was amended from 2014 to 2015 to better support public use and re-use. This year, we removed child-level data from the collection and the published data. We made this change to reduce duplication with the DfE SSDA903 Children Looked After data collection and to minimise unnecessary burden on data suppliers.
Performance, cost and burden on respondents
We attempt to minimise the burden on respondents by improving the clarity of questions and definitions through direct consultation, feedback and queries. Some work was done in 2011, for the Department for Communities and Local Government (DCLG) single data list, to establish the annual respondent burden in terms of resource hours.
In order to reduce the burden on agencies around producing this data, Ofsted and the DfE agreed (before the 2016 to 2017 collection) to introduce Ofsted unique reference numbers (URNs) into the statutory SSDA903 data collection from LAs. This data is intended to be included in a supplementary release. We will assess how effectively this change has worked for all parties, and what amendments may be needed to ensure that the process runs smoothly and continues to reduce burden for data suppliers.
As discussed above, we introduced an online portal for data validation and submission in 2017 to 2018. This substantially reduced the amount of time agencies wait for feedback on the quality of their returns. Validation is carried out immediately, at the point of submission, and the validation tool is more comprehensive compared with the previous method.
Confidentiality, transparency and security
When we hold sensitive or personal data, the disclosure control processes we have in place ensure that this data is not published. All data releases follow Ofsted’s confidentiality and revisions policies. All staff using sensitive data have been trained in confidentiality and disclosure awareness.
Methodology
For the 2019 to 2020 fostering dataset collection, we did not ask agencies for child-level data.
Data processing involves aggregating data to England- and sector-level. This processing is done using Excel, and is reviewed and quality assured before the data is used. No data has been removed.
For data protection and disclosure purposes, all figures in the key findings and the underlying data have been rounded to the nearest 5; this has also been applied to figures from previous years used in the release. The purpose of the rounding is to ensure non-disclosure of sensitive data while maintaining its usefulness. This means, however, that some total figures do not match exactly with data aggregated at provider type, England or regional levels.
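As a worked illustration of the aggregation and rounding described above, the sketch below sums hypothetical agency-level counts to sector and England level and rounds each figure to the nearest 5; the counts and field names are invented, and the real processing is carried out in Excel. It also shows why rounded figures may not sum exactly to the rounded total.

```python
# Illustrative sketch of aggregating to sector and England level and rounding to
# the nearest 5 for disclosure control. Counts are invented; the real processing
# is carried out in Excel.

def round_to_5(n: int) -> int:
    """Round a count to the nearest 5."""
    return int((n + 2.5) // 5) * 5

# Hypothetical counts of fostering households per agency.
agencies = [
    {"sector": "LA",  "households": 1_231},
    {"sector": "LA",  "households": 4_561},
    {"sector": "IFA", "households": 7_892},
]

# Aggregate to sector level, then to England level.
sector_totals: dict[str, int] = {}
for agency in agencies:
    sector_totals[agency["sector"]] = sector_totals.get(agency["sector"], 0) + agency["households"]
england_total = sum(sector_totals.values())

# Because each figure is rounded independently, rounded sector figures may not
# sum exactly to the rounded England total, as noted above.
rounded_sectors = {sector: round_to_5(total) for sector, total in sector_totals.items()}
print(rounded_sectors)                # {'LA': 5790, 'IFA': 7890}
print(sum(rounded_sectors.values()))  # 13680
print(round_to_5(england_total))      # 13685
```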
This year, we will be publishing IFA agency-level data for the first time on the:
- number of fostering households as at 31 March 2020
- number of foster carers as at 31 March 2020
Definitions are provided in the SFR; for instance, the glossary includes placement types and what these mean, as well as references to relevant legislation where applicable.
The fostering data collection 2019 to 2020 was the first year in which data was collected on the age of foster carers. For fostering households that have 2 carers, the age of ‘foster carer 1’ was used for household-level analysis. This method was considered suitable because, in the majority (84%) of 2-carer households, the carers were less than 10 years apart in age.
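As an illustration of this method, the sketch below (with invented household records and field names) takes the age of ‘foster carer 1’ as the household-level age and checks the share of 2-carer households where the carers are less than 10 years apart in age.

```python
# Illustrative sketch of the household-level age analysis described above.
# Household records and field names are invented.

households = [
    {"id": "H1", "carer_ages": [47, 45]},  # 2-carer household: household age taken as 47
    {"id": "H2", "carer_ages": [39]},      # single-carer household
    {"id": "H3", "carer_ages": [62, 48]},  # 2-carer household with a larger age gap
]

# Household age is taken from 'foster carer 1' (the first carer listed).
household_ages = {h["id"]: h["carer_ages"][0] for h in households}
print(household_ages)  # {'H1': 47, 'H2': 39, 'H3': 62}

# The check that supports the method: the share of 2-carer households where the
# carers are less than 10 years apart in age (84% in the actual collection).
two_carer = [h for h in households if len(h["carer_ages"]) == 2]
close_in_age = [h for h in two_carer if abs(h["carer_ages"][0] - h["carer_ages"][1]) < 10]
print(f"{len(close_in_age) / len(two_carer):.0%}")  # 50% on this invented data
```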
Glossary
Definitions of terms are in our statistics glossary.