Independent report

Review into the operational effectiveness of the Care Quality Commission: interim report

Updated 15 October 2024

Applies to England

Foreword

I was asked to carry out a review of the Care Quality Commission (CQC) in May 2024. Over the last 2 months I have spoken to around 170 senior managers, caregivers and clinicians working in the health and care sector, along with over 40 senior managers and national professional advisors at CQC.

All have shared with me considerable concerns about the functioning of the organisation, with a high degree of consistency in the comments made. At the same time all recognise the need for a strong, credible and effective regulator of health and social care services.

This interim report of my work provides a high-level summary of my emerging findings in order to inform thinking around changes needed to start the process of improving CQC.

A more detailed report will be published in autumn 2024. Prior to this I will be discussing findings with user groups to ensure my final recommendations reflect their needs (which was not possible during June and the general election period).

There is an urgent need for a rapid turnaround of CQC - a process that has already started with the appointment of an interim chief executive in June 2024.

The health and care sector accounts for around 12% of the economy and 20% of public expenditure[footnote 1] and is one of the most significant drivers of health, public satisfaction and economic growth[footnote 2]. It needs - and deserves - a high-performing regulator.

Dr Penelope Dash, Independent Lead Reviewer

Background and context

CQC is the independent regulator of healthcare and adult social care in England. It was established in 2009 under the Health and Social Care Act 2008 and brought together the Commission for Social Care Inspection (CSCI), the Mental Health Act Commission and the Healthcare Commission.    

Before the early 1990s there was little objective assessment of the quality of health and care, despite the early attempts of notable figures such as Florence Nightingale and Ernest Codman (see Patient Safety Learning, Clinical audit: a manual for lay members of the clinical audit team). With increasing recognition of the high levels of variation in quality of care, combined with high-profile exposés of very poor outcomes, such as the Bristol heart scandal, the decision was taken to set up an independent reviewer of quality. This culminated in the establishment of the National Care Standards Commission by the Care Standards Act 2000 as a non-departmental public body to regulate independent health and social care services and improve the quality of those services in England. After several reincarnations, CQC was launched in its current form in 2009. The history of quality regulation in the NHS is shown in appendix 2.

All providers of healthcare and adult social care regulated activities in England must register with CQC. CQC monitors, inspects and regulates these services to make sure they meet fundamental standards of quality and safety and takes action to protect those who use these services (see tables 1 and 2 in appendix 1). It conducts performance assessments and rates providers of these services (with some exceptions). These assessments and ratings are publicly available.

CQC introduced a single assessment framework (SAF) in November 2023 to replace its previous system of inspections and assessments. The new framework was intended to make the assessment process simpler and more insight-driven, enable more frequent inspections, and better reflect how care is delivered across different sectors, but since its introduction there have been concerns that it is not providing effective assessments.

The terms of reference for this review were to examine the suitability of the SAF methodology for inspections and ratings, including for local authorities and integrated care systems (ICSs). The full terms of reference are in appendix 3. While CQC has responsibility for assessing the provision of care in a very wide range of settings, for example in prisons and defence medical services, this review has been limited to the areas that account for the overwhelming majority of its work: social care providers (residential care and care delivered in the home), NHS providers (trusts and GPs), dentists and independent sector (private and charitable) healthcare providers.

The review has been informed to date by management information provided to the Department of Health and Social Care (DHSC) by CQC along with one-to-one interviews and roundtable discussions with around 200 people. This includes executive directors of NHS England, regional directors of NHS England, NHS trust CEOs, medical directors and nurse directors, GPs, the British Dental Association, independent sector healthcare providers, representatives of local authorities and care providers (both smaller and larger providers). At the time of publication of this interim report, interviews with patients and users have been scheduled and will take place shortly. The review has spoken to the whole executive team at CQC (including the former chief executive), the chair, senior professional advisors and most of the wider leadership team. At the time of publication of this interim report, a meeting with CQC’s non-executive directors and staff forum had been scheduled but had not yet taken place. A list of all participants to date is shown in appendix 4.

In recognition of the current challenges impacting on health and social care, this review is looking at the current performance of CQC and specific questions regarding the SAF, support for innovation and for the economic and efficient delivery of health and care services. This is an interim report, with the full report being published in autumn 2024.  

Emerging findings

This review has found significant failings in the internal workings of CQC which have led to a substantial loss of credibility within the health and social care sectors, a deterioration in the ability of CQC to identify poor performance and support a drive to improved quality - and a direct impact on the capacity and capability of both the social care and the healthcare sectors to deliver much needed improvements in care. The findings are summarised around 5 topics:

  1. Poor operational performance.
  2. Significant challenges with the provider portal and regulatory platform.
  3. Considerable loss of credibility within the health and care sectors due to the loss of sector expertise and wider restructuring, resulting in lost opportunities for improvement.
  4. Concerns around the SAF.
  5. Lack of clarity regarding how ratings are calculated, and concerning use of the outcomes of previous inspections (often carried out several years ago) to calculate a current rating.

Emerging finding 1: poor operational performance  

In order to ensure that health and social care services provide people with safe, effective, compassionate and high-quality care, CQC registers organisations that apply to carry out regulated activities, carries out inspections of services and publishes information on its judgements.

Operational performance was impacted by the COVID-19 pandemic. In March 2020, CQC paused routine inspections and focused activity on where there was a risk to people’s safety (see CQC Update on CQC’s regulatory approach). Despite this, and almost 2 and a half years since the steps in the previous government’s Living with COVID-19 strategy were implemented, the review has heard that CQC’s operational performance is still not where it should be. Specifically:

  • just 7,000 inspections and assessments were carried out in 2023 to 2024, partly due to the rollout of the SAF. This compares with more than 16,000 inspections conducted in 2019 to 2020. The target for 2024 to 2025 is for 16,000 assessments to be conducted. (See data from the ‘Corporate performance report (2023/24 year end) - appendix’ on the CQC Board meeting: 22 May 2024 page and the CQC Annual report 2019 to 2020 - performance summary respectively)

  • there is a backlog in registrations of new providers. At the end of 2023 to 2024, 54% of applications pending completion were more than 10 weeks old (according to the CQC ‘Corporate performance report (2023/24 year end) - appendix’). CQC has a key performance indicator (KPI) to reduce this proportion, but it increased from 22% at the end of 2022 to 2023. The review heard that the backlog in registrations was a particular problem for small providers trying to set up a new care home or a new healthcare service and could result in lost revenues and investment, which had a knock-on impact on capacity

  • even under the risk-based approach to prioritising assessments, the review heard from some stakeholders that re-inspection of social care providers after awarding a ‘requires improvement’ rating does not happen in a timely manner. Interviewees told the review that this could result in hospital discharge teams refusing to discharge people to them, or local authorities refusing to contract with providers, with a further knock-on impact on capacity

  • some organisations have not been re-inspected for several years. Data provided by CQC suggests that the oldest rating for a social care organisation is from October 2015 (nearly 9 years old) and the oldest rating for an NHS hospital (acute non-specialist) is from June 2014 (around 10 years old)

  • CQC estimates that the average age of current provider ratings ‘overall’ is 3.7 years (as of the beginning of June 2024) although this varies by provider type

  • providers told the review that they can wait for several months to receive reports and ratings following assessments. This increases the burden on, and stress of, staff and results in lost time when quality improvements could have been made

  • of the locations CQC has the power to inspect, CQC estimates that around 1 in 5 have never been rated (appendix 1, table 3). Some of these services registered with CQC over 5 years ago (appendix 1, table 4). This means that patients and users are unable to compare these services with others in order to help them choose care and providers do not have the insights from an expert inspection, resulting in a missed opportunity for improvement

  • the call centre’s performance is poor, with interviewees telling the review that calls took a long time to be answered. Data from CQC shows that the average time to answer calls relating to registration (the most common reason for calling) between January and June 2024 was 19 minutes. CQC does have a KPI to achieve a 60% to 80% response rate on its national customer service centre call lines, and this was achieved in 2023 to 2024 with a response rate of between 63% and 76% across the 4 lines. This means that between a quarter and a third of calls were dropped before they were answered (according to data from the ‘Corporate performance report (2023/24 year end) - appendix’ on the CQC Board meeting: 22 May 2024 page).

The review has concluded that poor operational performance is impacting CQC’s ability to ensure that health and social care services provide people with safe, effective, compassionate and high-quality care and is negatively impacting the opportunity to improve health and social care services. 

Emerging finding 2: significant challenges with the provider portal and the regulatory platform

New IT systems were introduced into CQC from 2021 onwards. The provider portal launched in July 2023 but was not used in significant numbers until April 2024. The regulatory platform went live in November 2023 for assessments, with registration and enforcement added by April 2024. They were implemented with the intention of improving operations and communications with providers, enabling a move to a much more insight-driven approach to regulation, highlighting emerging risks and supporting more risk-informed, responsive assessments and inspections.

However, the deployment of new systems resulted in significant problems for users. The review has heard that they cannot easily upload documents, there are problems if the named user is away or off sick and it can take hours to receive a password reset. This takes staff away from delivering or supporting front-line care and causes considerable frustration.

The review has concluded that poorly performing IT systems are hampering CQC’s ability to roll out the SAF and cause considerable frustration and time loss for providers.

Emerging finding 3: considerable loss of credibility within the health and care sectors due to the loss of sector expertise and wider restructuring, resulting in lost opportunities for improvement

As part of a restructuring of CQC, the decision was taken to separate out sectoral knowledge from assessment and inspection teams and move to a far greater reliance on generalists. The review heard of inspectors visiting hospitals and saying they had never been in a hospital before, and inspectors visiting care homes and commenting they had never seen anyone with dementia before.

This has been compounded by changes to the chief inspector roles. Where previously there were 3 roles - chief inspector of social care, chief inspector of primary care and chief inspector of hospitals - each held by a highly respected national figure with deep expertise in their sector, there are now 2 chief inspectors: one for adult social care and integrated care and one for healthcare.

The current executive team is largely drawn from the social care sector, with a noticeable lack of healthcare experience. Given CQC’s remit spans both the social care and healthcare sectors, the executive team would be expected to reflect that balance. The healthcare leadership team consists of a mental health nurse, a pharmacist and an NHS manager. They are supported by a medical director and national professional advisers who are drawn from a wide range of backgrounds. The previous chief inspector of healthcare was unfortunately taken ill nearly a year ago and has been unable to work since. A request to bring in a new chief inspector for healthcare was first discussed with DHSC in September 2023, with a formal request made in February 2024.

The lack of sector expertise results in providers not trusting the outcomes of reviews and not feeling they have the opportunity to learn from others, especially from highly regarded peers.

This lack of expertise has been compounded by a reduction in ongoing relationships between CQC staff and providers. The chief executive in post up to 2018 and chief inspectors would spend considerable time with senior members of the health and care sectors building relationships, hearing their perspectives on care delivery and explaining and sharing the insights CQC was gathering. This has been described on both sides as invaluable and has been largely lost. At the local level, inspection teams would similarly build relationships with senior leaders from across the sectors to build confidence and support early awareness of emerging problems. While this clearly did not work in all cases, as witnessed by a number of high-profile failings, it did build more confidence among providers and enabled wider sharing of good practice.  

The review has found that the current model of generic assessment and inspection teams lacking sector expertise, a lack of expertise at senior levels of CQC and a loss of relationships between CQC and providers are together impacting the credibility of CQC, resulting in a lost opportunity to improve healthcare and social care services.

Emerging finding 4: concerns around the SAF

In 2022, CQC shifted its planned launch of the new SAF from January 2023 to later in 2023 because of internal delays and some feedback from providers which highlighted the importance of giving them time to prepare for changes.

The SAF was subsequently rolled out in November 2023 with a small number of providers across sectors as part of the early adopter programme. CQC continued to roll out the SAF region by region and began to apply the new framework in its assessments and inspections in a phased manner. The intention was to develop a streamlined approach to assessing and measuring quality of care which could be applied to any sector - from social care to acute and specialist hospitals, to mental healthcare and community care providers, to primary care. It was hoped that data and insights could be collected in advance across all areas of care in order to have an ‘always on’ data and insight system. This would allow emerging problems to be identified at an early stage and enable a more risk-informed approach to assessments and inspections, whereby organisations with data suggesting poorer quality care were assessed and inspected more frequently. The data would also provide national profiles so that a national schedule of planned assessments could be rolled out based on risk. This review considers this concept and approach to be sensible and in line with regulation in other sectors, but it is clearly dependent on robust data and insights being reviewed in a timely manner.

The framework sets out 34 areas of care quality (called quality statements) which can be applied to any provider of healthcare or social care. These align with the 5 key questions, or domains, of quality used for many years:

  • safe
  • effective
  • caring
  • responsive
  • well led

For each of these 34 areas, there are 6 ‘evidence categories’ where information about quality of care is collected. These are:

  • people experience
  • staff experience
  • partner experience
  • observations
  • processes
  • outcomes

However, the SAF is intended to be flexible and not all areas are considered for all 34 quality statements. Further, priority quality statements are identified for each service and form the starting point of an assessment. Additional quality statements may then be chosen based on the risk and improvement context for providers. Typically, between 5 and 12 priority quality statements are assessed, depending on the sector or sub-sector. The average number of quality statements currently used on inspection is 9.4, or just under a third of the 34 in total.

The review has identified 6 concerns with the SAF:

  1. There is no description of what ‘good’ or ‘outstanding’ care looks like, resulting in a lack of consistency in how care is assessed and a lost opportunity for improvement.

  2. There is a lack of focus on outcomes (including inequalities in outcomes).

  3. The way in which the SAF is described is poorly laid out on the CQC website and not well communicated internally or externally.   

  4. The data used to understand the user voice and experience, how representative that data is, and how it is analysed for the purposes of informing inspections, is not sufficiently transparent.

  5. There is no reference to use of resources or efficient delivery of care in the assessment framework, a significant gap given that this duty is set out in section 3 of the Health and Social Care Act 2008.

  6. The review has found limited reference to innovation in care models or ways of encouraging adoption of these.  

Concern 1: there is no description of what ‘good’ or ‘outstanding’ care looks like, resulting in a lack of consistency in how care is assessed and a lost opportunity for improvement

There is no description of what ‘good’ looks like for each quality statement and evidence category. While the old model included descriptions of each area assessed - see appendix 5 - these have not yet been developed for the new framework.

The lack of a description of what ‘good’ or ‘outstanding’ looks like does not support organisations to improve. The review heard time and again from providers that they struggle to know what inspectors are looking for, they are not learning from them and, as a result, they don’t know what they need to do to be better. The review also heard that inspectors struggled to articulate the SAF and the definitions they should be working to.

Many providers referred to a lack of consistency in ratings awarded to providers. Those who work across multiple sites, for example a large care home provider or a large group of GP practices, report ratings differing from one site to another when they know, from spending far more time with those sites, that the differences are not as reported. These discrepancies run in all directions, for example their poorer quality sites receiving better ratings than their top sites and vice versa.

Concern 2: there is a lack of focus on outcomes (including inequalities in outcomes)

Within the framework there is an evidence category for outcomes data, but surprisingly little evidence of assessments and inspections considering the outcomes of care. For example, within primary care, only 2 of the 99 quality statement and evidence category combinations considered refer to outcomes of care. Even when outcomes are considered, only a very narrow set of metrics is routinely reviewed. This is despite considerable data being available about outcomes of care in primary care - for example, how well controlled a person’s diabetes is - and the impact of that care - for example, rates of admission to hospital with complications of a long-term condition (renal failure in people with diabetes, acute cardiovascular events in people with high blood pressure or atrial fibrillation).

While limited nationally consistent data is available on the outcomes of social care at a care provider level, CQC does collect data from providers on adverse outcomes in care, including serious injuries, safeguarding incidents and deaths (see the CQC notifications page). At a local authority level, slightly more outcomes data is available and assessments do take into account measures from the adult social care outcomes framework, including social care-related user quality of life. CQC could be more transparent as to which metrics are examined against each quality statement.

Within healthcare, NHS England publishes over 30 national clinical audits, along with the Getting It Right First Time (GIRFT) programme, more of which could be drawn on to compare outcomes across providers. The GIRFT programme provides data on mortality rates, length of stay, post-surgical infection rates and hospital re-admission rates in more than 40 surgical and medical specialties, but this is not used by CQC.

Across outcome data, the review has struggled to find reference to measures looking at outcomes by different population groups, in particular the NHS England Core20PLUS5 groups, though the review understands that this is being considered in the developing approach to assessing ICSs.

The review was similarly surprised to see the lack of measures of outcomes in independent sector providers, particularly given the emphasis on this in the Paterson Inquiry report.

Concern 3: the way in which the SAF is described is poorly laid out on the CQC website and not well communicated internally or externally

The summary of the SAF (as set out at the start of this section) should be easy for any health or care provider, or any member of the public, to find and access. However, the SAF is described on CQC’s website in a way the review found confusing. The papers are poorly laid out, lacking structure and numbering, and do not include a summary such as the one at the start of this section.

Further, the review has found that across the executive team, few were able to describe the 34/6 framework (34 quality statements and 6 evidence categories), the rationale for prioritising particular quality statements in each sector, the rationale for which evidence categories are used for different quality statements, the way in which ratings are calculated and so on. This should be widely understood in CQC irrespective of role or background.

Concern 4: the data used to understand the user voice and experience, how representative that data is, and how it is analysed for the purposes of informing inspections is not sufficiently transparent

With the development of the SAF, CQC has rightly sought to place greater emphasis on people’s experience of the care they receive. However, the review found it hard to build a clear picture of exactly what data is looked at. The review has not found any published description of the statistical analysis CQC applies to these data, what response rate is required or how representation across users, patients and staff is ensured.

It seems the majority of data considered is drawn from national surveys (for example, the NHS GP Patient Survey and the NHS England Personal Social Services Adult Social Care Survey), which may or may not be representative or statistically significant at a service level; this is then supplemented by a number of interviews with service users. The interviews may cover as few as tens of users of a service that looks after thousands of people a year. For local authority assessments, only 6 cases are tracked in any detail, regardless of the size of the local authority. Senior leaders with experience of Ofsted inspections of children’s social care and local authority assessments noted that Ofsted looks at significantly more cases. The resulting responses may not be representative, which calls into question the robustness of the analysis.

CQC does assess providers on whether they are actively seeking, listening to and responding to the views of people most likely to have a poorer experience of care or those who face more barriers to accessing services.

There is similarly a need to ensure representative surveys of all staff in the main sectors reviewed, and the methodology used should again be more transparent.

Concern 5: there is no reference to use of resources or efficient delivery of care in the assessment framework despite this being stated in section 3 of the Health and Social Care Act 2008

The review was asked to consider how efficient delivery of care is assessed. This is part of the scope of CQC and was set out in the Health and Social Care Act 2008. However, within the SAF there is no quality statement that considers use of resources. The review understands that work on this from NHS England can be requested by CQC, but this is rarely done.  

The lack of an objective assessment of the efficient delivery of care is disappointing, as effective use of resources is one of the most impactful ways of improving quality of care for any provider. More efficient deployment of staff and more efficient use of assets (beds, diagnostics, theatres) enables more people to be cared for and better care for individual patients. Further, a number of recognised metrics of high-quality care are also good metrics of efficient services and good use of resources, for example length of stay in an inpatient facility (see Han TS and others. Evaluation of the association of length of stay in hospital and outcomes. International Journal for Quality in Health Care 2022: volume 34, issue 2).

The quality statement on effective use of staffing assesses staffing levels based on guidance from national and specialist clinical bodies, but does not independently consider whether services could be delivered more efficiently. An independent regulator should be able to share best practice and challenge provider capture, as can be seen with other national regulators such as Ofgem (see Ofgem’s Data Communications Company (DCC) Price Control Guidance: Processes and Procedures 2022).

CQC now has the power to assess local authorities and ICSs. The review will include more on this in the final report, given their crucial role in commissioning high quality and efficient services.

Concern 6: the review has found limited reference to innovation in care models or ways of encouraging adoption of these

The review was asked to consider how CQC considers innovation in care delivery. While there is a quality statement looking at ‘learning, improvement, innovation’ and there is some CQC guidance on using and sharing innovation to reduce health inequalities, the review did not find evidence of a systematic consideration of how well organisations are innovating in care delivery, for example adopting technology, setting up elective care centres or moving to new models of primary care. (See Skills for Care Using technology in social care, NHS England The healthcare tech ecosystem and Next steps for integrating primary care: Fuller Stocktake report.)

This is a missed opportunity and could be a galvanising factor in driving better quality, more efficient, more responsive care across all sectors. In comparison, for example, Ofsted works closely with the Department for Education to consider what changes and improvements to schools are planned and incorporates those into its inspection framework. 

Emerging finding 5: lack of clarity regarding how ratings are calculated, and concerning use of the outcomes of previous inspections (often carried out several years ago) to calculate a current rating

The review has been concerned to find that overall ratings for a provider may be calculated by aggregating the outcomes from inspections over several years. This cannot be credible or right.

The review understands that this approach is long standing and did not change as a result of the introduction of the SAF, but may not have been transparent before. The SAF was intended to prevent the use of inspections (and associated ratings) from previous years, as more frequent assessments would be undertaken based on emerging data and intelligence, but, because CQC is not carrying out the number of assessments required to update ratings, the problem continues.

CQC intends to mitigate this by using individual quality statement and quality domain scores instead of aggregated ratings, and by assessing more quality statements to improve robustness.

Providers do not understand how ratings are calculated and, as a result, believe it is a complicated algorithm, or a ‘magic box’. This results in a sense among providers that it is ‘impossible to change ratings’. CQC is seeking to bring greater clarity to how ratings are calculated and is developing materials to facilitate communication and build transparency. Ratings matter - they are used by users and their friends and family, and they are a significant driver of staff recruitment and retention. They need to be credible and transparent.

Recommendations

There are 5 recommendations in line with the findings above: 

  1. Rapidly improve operational performance.

  2. Fix the provider portal and regulatory platform.

  3. Rebuild expertise within the organisation and relationships with providers in order to resurrect credibility.

  4. Review the SAF to make it fit for purpose.

  5. Clarify how ratings are calculated and make the results more transparent, particularly where multi-year inspections and ratings have been used.

Recommendation 1: rapidly improve operational performance

The interim chief executive of CQC is already making progress towards redressing poor operational performance including bringing in more staff, particularly those with prior experience of working in CQC. CQC should agree operational performance targets or KPIs in key areas, in conjunction with DHSC, to drive and track progress. In addition, consideration should be given to moving the nascent ICS assessment team to other sectors given the delays in starting ICS assessment work.

Recommendation 2: fix the provider portal and the regulatory platform

CQC will need to set out how, and by when, it will make the changes required. CQC should also ensure that there is far more consideration given to working with providers to seek feedback on progress.

Recommendation 3: rebuild expertise within the organisation and relationships with providers in order to resurrect credibility

There is an urgent need to appoint highly regarded senior clinicians as chief inspector of hospitals and chief inspector of primary and community care. Working closely with the chief inspectors and the national professional advisors, there should be rapid moves to rebuild sector expertise in all teams.

The review heard a strong message from providers across sectors about the opportunity to make working as a specialist advisor with CQC a source of pride and an incentive. Consideration should be given to a programme whereby top-performing managers from across health and social care, along with carers and clinicians, apply or are appointed to serve as assessors for 1 to 2 weeks a year, with acceptance onto the programme carrying high accolade.

The leadership team of CQC - which should include the 3 chief inspectors - should rebuild relationships across the health and care sectors, share progress being made on improvements to CQC and continually seek input.

Recommendation 4: review the SAF to make it fit for purpose

There needs to be a wholescale review of the SAF to address the concerns raised above. Specifically, to:

  • set out clear definitions of what ‘outstanding’, ‘good’, ‘requires improvement’ and ‘inadequate’ looks like for each evidence category, for each quality statement as per the previous key lines of enquiry
  • request credible sector experts to revisit which quality statements to prioritise and how to assess and measure them
  • give greater emphasis to, and use of, outcome measures - for example, GIRFT, national clinical audits and equivalents for social care
  • give greater emphasis and prominence to use of resources within the well led domain - and build the skills and capabilities to assess it
  • improve the quality of documentation on the CQC website
  • significantly improve transparency and robustness of data used for user and staff experience
  • build knowledge and insights into innovation in healthcare and social care - including new models of care - and weave these into the quality statements

Recommendation 5: clarify how ratings are calculated and make the results more transparent, particularly where multi-year inspections and ratings have been used

The approach used to calculate ratings should be transparent and clearly explained on CQC’s website. It should be clear to all providers and all users.

The use of multi-year assessments in calculating ratings and in reports should be reconsidered and, in the meantime, greater transparency given to how they are being used.

Next steps

The review team will publish a more substantive report in autumn 2024, adding further data and detail to this report, with more input from the people spoken to (importantly including patients and users, whose views can now be sought following the end of the pre-election period) and more analysis. The review will also consider other areas within the terms of reference, for example local authority and ICS assessments.

DHSC should enhance its oversight of CQC in line with the Cabinet Office’s Sponsorship Code of Good Practice. In particular, more regular performance review conversations should take place between DHSC and CQC to reinforce and check progress against the recommendations in this report. Meetings should include senior civil servants at DHSC, ideally the relevant directors general, and should take place on a monthly basis.

Appendix 1 - supplementary data tables

CQC publishes, in its care directory, a list of all regulated locations in England and their latest rating. This is a live database that is generally updated monthly. See CQC guidance on using CQC data.

CQC took a snapshot of this data at the beginning of June 2024, to provide the information contained in this appendix. The numbers may vary slightly depending on when the data is extracted.

Table 1 - number of locations registered with CQC by provider type (snapshot, June 2024)

| Provider type | Number | Proportion |
| --- | --- | --- |
| Social care organisation | 29,108 | 54% |
| Primary dental care | 11,346 | 21% |
| Primary medical services | 6,475 | 12% |
| Independent healthcare organisation | 4,821 | 9% |
| NHS healthcare organisation | 1,820 | 3% |
| Independent ambulance | 308 | 1% |
| Total registered locations | 53,878 | 100% |

Table 2 - number of locations CQC has the power to rate, by provider type (snapshot, June 2024)

| Provider type | Number of locations CQC has the power to rate | % of locations CQC has the power to rate | Number of locations CQC does not have the power to rate | % of locations CQC does not have the power to rate |
| --- | --- | --- | --- | --- |
| Social care organisations | 29,108 | 100% | 0 | 0% |
| Primary dental care | 15 | 0.1% | 11,331 | 99.9% |
| Primary medical services | 6,471 | 99.9% | 4 | 0.1% |
| Independent healthcare organisations | 4,795 | 99% | 26 | 1% |
| NHS healthcare organisations | 1,740 | 96% | 80 | 4% |
| Independent ambulance | 308 | 100% | 0 | 0% |
| Total registered locations | 42,437 | 79% | 11,441 | 21% |

Note: the Care Quality Commission (Reviews and Performance Assessments) Regulations 2018 refer to 10 services that are inspected by CQC but not rated, in some cases jointly - for example, health in prisons with HM Inspectorate of Prisons. These services are not rated in order to avoid over-regulation - for example, where another regulatory body has oversight or the risks are considered low.

Table 3 - number of locations with a published rating, by provider type, where CQC has the power to rate (snapshot, June 2024)

| Provider type | Number of locations CQC has the power to rate | Number of locations with a published rating | % of locations with a published rating |
| --- | --- | --- | --- |
| Primary medical services | 6,471 | 6,256 | 97% |
| Social care organisation | 29,108 | 23,951 | 82% |
| Independent ambulance | 308 | 193 | 63% |
| Independent healthcare organisation | 4,795 | 2,871 | 60% |
| Primary dental care (see note 1) | 15 | 4 | 27% |
| NHS healthcare organisation | 1,740 | 445 | 26% |
| Total | 42,437 | 33,720 | 79% |

Note 1: the 4 primary dental care services rated were not rated for their dental care. Regulated activities other than treatment of disease, disorder or injury (TDDI) are being carried on at these sites - for example, surgical procedures, family planning services, and diagnostic and imaging services.

Table 4 - number of locations registering with CQC in 2019 or earlier, which CQC has the power to inspect but which do not have a published rating (snapshot, June 2024)

| Provider type | Total locations without a rating where CQC has power to inspect | Number registering with CQC in 2019 or earlier | % registering with CQC in 2019 or earlier |
| --- | --- | --- | --- |
| Social care organisation | 5,157 | 173 | 3% |
| Primary medical services | 215 | 13 | 6% |
| Independent healthcare organisation | 1,924 | 395 | 21% |
| Independent ambulance | 115 | 24 | 21% |
| Primary dental care | 11 | 4 | 36% |
| NHS healthcare organisation | 1,295 | 1,081 | 83% |
| All | 8,717 | 1,690 | 19% |

Appendix 2 - quality and safety history and context

This list is not exhaustive.

1990s

Increasing interest in quality of care accompanied by increasing role of clinical audit.

Under the Labour administration, the Department of Health published ‘Quality in the NHS’ in 1998, which was seen as a step change in focusing on systematic improvement in the quality of care.

Investigation into the Bristol heart scandal.

1999

Establishment of the Commission for Health Improvement (CHI). The statutory functions conferred on CHI were set out in Section 20 of the Health Act 1999 and the Commission for Health Improvement (Functions) Regulations 1999 to 2000.

2000 

Establishment of the National Care Standards Commission by the Care Standards Act 2000 as a non-departmental public body to regulate independent health and social care services and improve the quality of those services in England. It was set up on 9 April 2001.

2001

Creation of the National Patient Safety Agency (NPSA).

Establishment of the Shipman Inquiry.

2003

Creation of Council for Healthcare Regulatory Excellence.

Establishment of the Medicines and Healthcare products Regulatory Agency (MHRA).

2004

Establishment of the Commission for Social Care Inspection by the Health and Social Care (Community Health and Standards) Act 2003. Under the terms of the 2003 Act, the Healthcare Commission assumed responsibility for the functions previously exercised by CHI.

2006

Creation of the National Patient Safety Forum.

2009

Establishment of the National Quality Board.

Establishment of CQC, replacing the Commission for Social Care Inspection (CSCI), the Mental Health Act Commission and the Healthcare Commission. 

NPSA publishes first version of Never Events policy framework.

2010

NPSA publishes a National Framework for Reporting and Learning from Serious Incidents Requiring Investigation (the Serious Incident Framework (SIF)).

2012

Council for Healthcare Regulatory Excellence (CHRE) becomes Professional Standards Authority (PSA) for Health and Social Care.

NPSA transferred to NHS England under provisions in the Health and Social Care Act 2012.

2013

Robert Francis public inquiry report into Mid Staffordshire NHS Foundation Trust published in February.

CQC introduced its new regulatory model.

2014

Introduction of the organisational Duty of Candour for trusts through the Health and Social Care Act 2008 (Regulated Activities) Regulations 2014 (Regulation 20).

Establishment of patient safety collaboratives.

2015

Ongoing implementation of comprehensive inspections and ratings for all NHS and care providers by the CQC and a focus by CQC on patient safety, in response to the Mid Staffordshire public inquiry.

2016

The Getting It Right First Time programme was launched.

CQC review into learning from deaths (‘Learning, candour and accountability’).

2017

DHSC established the Healthcare Safety Investigation Branch (HSIB).

2018

CQC’s ‘Opening the door to change’ published, looking at why patient safety incidents like ‘never events’ were still occurring.

2019

The first NHS Patient Safety Strategy published by NHS England.

2023

CQC introduced the single assessment framework.

New duty on CQC to assess local authorities’ delivery of their duties under Part 1 of the Care Act 2014.

Health Services Safety Investigations Body (HSSIB) established as an arm’s length body (ALB).

Appendix 3 - review terms of reference

Summary

To examine the suitability of CQC’s new SAF methodology for inspections and ratings. In particular to:

  • ensure the new approach supports the efficient, effective and economic provision of health and care services
  • contrast the previous approach, which prioritised routine and/or some reactive inspections, with the new, more sophisticated approach informed by emerging risks and data. The new inspection methodology for ICSs - both the NHS and social care components - will also be reviewed
  • consider what can be done to ensure appropriate alignment between NHS oversight framework (NHSOF) and CQC inspection and ratings
  • consider what can be done to ensure trusts respond effectively, efficiently and economically to CQC inspections and ratings
  • consider whether CQC is appropriately set up, in both its leadership and staffing, to ensure that its new statutory role of assuring local government social care functions is as effective as possible alongside its wider responsibilities, and how it will review and monitor this over time
  • examine how senior NHS leaders can be more involved in CQC inspections and actions they can take to positively support CQC activity to ensure CQC’s work is translated into strong outcomes for patients
  • examine how social care inspections and ratings make use of the user voice and capture patient experience

Areas of focus

The main areas of focus include:

  • staffing and service innovation - is CQC inspection an actual or perceived barrier to workforce reform and change, and service innovation? If so, how are these barriers being addressed?
  • provider responses - do CQC’s investigations and ratings drive the correct responses among providers, in respect of ensuring the delivery of safe, efficient, effective and economic services?
  • data quality and learning - what more could be done to ensure that regulated bodies understand the importance of producing data of sufficient quality, given that regulatory activity is based on that data?
  • patient satisfaction and access - do inspections and ratings take account not only of the access statistics but patient experience of service access more broadly? Are patients’ voices being heard, both in health and social care?
  • is the inspection and/or assurance approach, both CQC’s leadership and staffing, appropriate for: a) ICSs (health and social care aspects), b) local government social care functions?
  • how will the new scoring system affect ratings and the 5 key areas (safe, caring, responsive, effective and well led)?
  • are CQC’s regulations and processes fit for an age of digital healthcare?

Appendix 4 - review participants

The list of people spoken to for this review so far is as follows:

  • 6 CQC executives including previous CEO
  • chair of CQC
  • 2 members of the CQC senior leadership team and board attendees
  • 14 CQC senior leadership team members
  • 11 CQC organisational leads
  • 12 CQC specialist advisors and national professional advisors
  • 7 DHSC senior civil servants
  • 1 Ministry of Housing, Communities and Local Government senior civil servant (previously DLUHC)
  • 9 NHS England national directors including NHS England chair
  • 7 NHS England regional directors
  • 49 NHS trust leaders including chairs, chief executives and medical directors of NHS trusts (spread across acute, specialist, community and mental health trusts, and foundation trusts)
  • 3 members, 1 director and 1 chief executive of NHS-related bodies
  • 16 general practitioners and general practitioner leaders
  • 3 senior members of the British Dental Association
  • chief executive of the Independent Healthcare Providers Network
  • 6 senior members of social care provider representative organisations
  • 8 members of adult social care individual providers
  • 22 quality leaders of social care providers
  • 8 local authority directors or chief executives of adult social care
  • 9 integrated care board chairs and chief executives
  • 5 chairs and chief executives of statutory and quality-related health bodies
  • 2 from academia and think tanks
  • 9 other individuals

Appendix 5 - previous assessment model

We have taken this from page 26 of CQC’s key lines of enquiry (KLOEs) for healthcare services, which include rating characteristics. For ease of reference, we have copied the rating characteristics for ‘safe’ in healthcare services below.

Safe

By safe, we mean people are protected from abuse and avoidable harm.

Note: abuse can be physical, sexual, mental or psychological, financial, neglect, institutional or discriminatory abuse.

The ratings are:

  • outstanding: people are protected by a strong comprehensive safety system and a focus on openness, transparency and learning when things go wrong
  • good: people are protected from avoidable harm and abuse. Legal requirements are met
  • requires improvement: there is an increased risk that people are harmed or there is limited assurance about safety. Regulations may or may not be met
  • inadequate: people are not safe or at high risk of avoidable harm or abuse. Normally some regulations are not met

S1 (CQC code for safe): How do systems, processes and practices keep people safe and safeguarded from abuse?

Outstanding

Outstanding means:

  • there are comprehensive systems to keep people safe, which take account of current best practice. The whole team is engaged in reviewing and improving safety and safeguarding systems. People who use services are at the centre of safeguarding and protection from discrimination
  • innovation is encouraged to achieve sustained improvements in safety and continual reductions in harm

Good

Good means:

  • there are clearly defined and embedded systems, processes and standard operating procedures to keep people safe and safeguarded from abuse, using local safeguarding procedures whenever necessary. These:

    • are reliable and minimise the potential for error
    • reflect national, professional guidance and legislation
    • are appropriate for the care setting and address people’s diverse needs
    • are understood by all staff and are implemented consistently
    • are reviewed regularly and improved when needed
  • staff have received up-to-date training in all safety systems, processes and practices
  • safeguarding adults, children and young people at risk is given sufficient priority. Staff take a proactive approach to safeguarding and focus on early identification. They take steps to prevent abuse or discrimination that might cause avoidable harm, respond appropriately to any signs or allegations of abuse and work effectively with others, including people using the service, to agree and implement protection plans. There is active and appropriate engagement in local safeguarding procedures and effective work with other relevant organisations, including when people experience harassment or abuse in the community

Requires improvement

Requires improvement means:

  • systems, processes and standard operating procedures are not always reliable or appropriate to keep people safe
  • monitoring whether safety systems are implemented is not robust. There are some concerns about the consistency of understanding and the number of staff who are aware of them
  • safeguarding is not given sufficient priority at all times. Systems are not fully embedded, staff do not always respond quickly enough, or there are shortfalls in the system of engaging with local safeguarding processes and with people using the service
  • there is an inconsistent approach to protecting people from discrimination

Inadequate

Inadequate means:

  • safety systems, processes and standard operating procedures are not fit for purpose
  • there is wilful or routine disregard of standard operating or safety procedures
  • there is insufficient attention to safeguarding children and adults. Staff do not recognise or respond appropriately to abuse or discriminatory practice
  • care premises, equipment and facilities are unsafe