Summary of findings from our education inspection framework implementation review
Published 3 February 2025
Applies to England
Introduction
In 2023, we did a large-scale review of the education inspection framework (EIF), focusing on the main features of the EIF, to understand whether we were carrying out inspections as we intended. We subsequently carried out an evaluation of the impact of the EIF on curriculum practice in schools. We committed to doing these reviews in our Ofsted strategy 2022–27.
The review was largely internal and based on inspectors’ views. We have used the findings to help improve inspection policy and practice. We have since carried out the Big Listen, where we heard more about our practice from children and learners, parents and carers, and providers.
This note summarises the review’s main findings and how we have used them in reforming our inspection practice.
Methodological note
We invited all our early years, schools, and further education (FE) and skills inspectors, and all contracted Ofsted Inspectors (who are often serving practitioners) to take part in a survey.[footnote 1] We also held focus-group discussions with select groups of inspectors. We surveyed providers that had recently been inspected to gather their views on their experiences of inspection. We also reviewed a range of internal data, including inspection evidence, published reports and the post-inspection survey.
We evaluated the main features of inspection that are specific to the EIF.[footnote 2] The review did not assess areas of inspection practice generally, such as the use of single-word judgements in schools. However, since we carried out the Big Listen we have made changes to some of these areas.
The review enabled us to hear directly from those who carry out inspections under the EIF. However, we recognise that the views of our inspectors are only one part of the picture.
Summary of findings
We found that inspectors were largely able to implement the main features of the EIF. However, they also highlighted some aspects that were more challenging to carry out.
Inspectors were able to gather evidence across all the judgement areas.[footnote 3] Most providers we surveyed said that inspectors spent the right amount of time on the key judgement areas on inspection.[footnote 4] However, inspectors told us that there were some areas on which they would like more inspection time. They found it more challenging to gather in-depth evidence on these areas within the time limitations of inspection. In schools, for example, inspectors mentioned the early years and sixth-form provision judgements and the personal development judgement. In early years, we found evidence was more complex to gather for the ‘leadership and management’ judgement.
Inspectors told us that they were applying the EIF in flexible ways, depending on the setting, as the inspection handbooks describe. Inspectors said they found the EIF the most straightforward to apply in:
- registered early years nurseries
- larger state-funded primary schools
- secondary schools
- education programmes for young people in general FE colleges and sixth-form colleges
However, we identified several contexts where inspectors needed greater flexibility. These included:
- childminders
- small primary schools
- independent learning providers, employer providers and community learning and skills providers
- special schools, provision for learners with high needs and alternative provision (AP)
Inspectors were also able to apply the deep dive and learning walk methodologies, mostly in straightforward and effective ways. However, there were some contexts where this was harder. Deep dives were more challenging in small schools, for example.[footnote 5] Inspectors were also less confident in their evidence bases for the areas of focus on ungraded inspections than on graded inspections. In early years, learning walks were not always possible in small settings, so inspectors replaced them with a curriculum conversation. In FE and skills, inspectors found deep dives less straightforward when there was limited teaching or training during the week of inspection; learners would not be available to talk to inspectors because they were working. Inspectors also found judging curriculum impact more complicated in special schools and AP because of the wide range of pupils’ needs. For instance, fewer published outcome measures are available in these providers, and those that are available are not necessarily representative of all the children who have passed through that provision.
Inspectors no longer look at unvalidated internal performance data. In schools and FE and skills inspections, published performance data continues to be part of judging the impact of education quality. Schools and FE and skills inspectors acknowledged the importance of data in the EIF. They also acknowledged that, because outcomes are balanced alongside what pupils and learners know, data is integrated into inspection more proportionately than under previous frameworks. In early years, inspectors gathered evidence by observing children and through discussions with providers and parents, without looking at providers’ internal assessment information, as intended.
Under the EIF, there are regular opportunities for dialogue between inspectors and a range of leaders and staff. However, in early years and schools inspections in particular, inspectors said they would like to have more time for those discussions. We know that discussions during inspection happen in a high-stakes context with limited time, especially on school inspections. This means that they are not always easy. Inspectors and providers agreed that dialogue was most effective when it was transparent, open, evidence-led and based on building a positive working relationship. Many providers thought that this happened during their inspection.
Next steps
We have considered the findings of this review alongside what we heard in the Big Listen.
We have already made changes to our ungraded inspection methodology in schools and to our work in small schools, as a result of this review.
Furthermore, we want our reforms to the EIF to:
- tailor the inspection process and criteria to the education provider phase and type, especially where the review identified a greater need to be more flexible
- consider the areas that inspectors would like more time on. For example, we want to give early years provision in a school the attention it deserves on inspection
- continue our focus on what children and learners achieve at every stage, alongside published outcomes
- make the inspection process more collaborative and transparent, including ensuring that professional dialogue is transparent and open
Our consultation on improving the way Ofsted inspects education opens on 3 February 2025. We are committed to evaluating these reforms and will set out our plans later in 2025.
Footnotes

1. Response rates: early years inspectors 216 (30%), schools inspectors 379 (22%), FE and skills inspectors 248 (56%). ↩

2. In 2019, the EIF refocused inspection on a new evidence-based conception of high-quality education. We created a ‘quality of education’ judgement that has curriculum at the centre. We introduced new aspects of inspection methodology: a ‘learning walk’ in early years inspections and ‘deep dives’ that aimed to enable inspectors to build an overall evaluation of the quality of education. This included looking at what children have learned alongside how performance outcomes are being achieved. We also introduced separate judgements for ‘behaviour and attitudes’ and ‘personal development’ on graded inspections. ↩

3. In schools and FE and skills, inspectors were focusing on quality of education in order to reach secure judgements, as intended. In early years, inspectors said that the interconnected nature of early years provision means they can gather evidence in depth for each judgement across all the inspection activities. ↩

4. In early years, we surveyed 5,802 providers that had recently been inspected, using our internal database. We achieved a 13% response rate. In schools, an external agency surveyed panellists who had recently had their school inspected. We received 407 responses. In further education and skills, we surveyed 217 providers that had recently been inspected, using our internal database. We achieved a 20% response rate. In FE and skills, this was also true of provision types and themes on new provider monitoring visits. ↩

5. In schools, we have created a small-school reference group to recognise and build insights on the unique curriculum challenges in small schools. ↩