Responding to our consultation: Changes for more flexible and responsive regulation

Page last updated: 22 February 2023

Following on from our main strategy consultation, we launched a second formal public consultation in January 2021 to hear views on our proposals for some specific changes. These proposals build on what we learned from regulating during the COVID-19 pandemic and move us towards our ambition to be a more dynamic, proportionate and flexible regulator.

In our consultation Changes for more flexible and responsive regulation, our proposals were designed to:

  • allow us to assess and rate services more flexibly, so we can update our ratings more often in a more responsive and proportionate way
  • make ratings easier to understand for everyone.

As part of the consultation we also produced a draft Equality and Human Rights impact assessment. This reflected our early thoughts on the potential opportunities to improve and promote equality and human rights through our proposals, ways to maximise these opportunities, and how we proposed to mitigate potential risks to equality and human rights when implementing the proposals.

How we consulted

The consultation ran for eight weeks, from 26 January to 23 March 2021. We published the content on our website both as an accessible PDF document and as web content, along with an easy to read version. People were able to respond to the questions by completing an online webform, by email, or by post.

To support the public to respond, we ran focus groups and held one-to-one interviews with the following groups of people whose voices are seldom heard:

  • carers
  • people with a learning disability and/or Autism
  • people with experience of mental health services
  • people with a sensory impairment
  • older people with long-term conditions
  • people whose first language is not English.

We also held webinars and conversations with Local Healthwatch and voluntary sector organisations to encourage those representing the views of the public to respond.

We also hosted the consultation on our online participation platform, CitizenLab, which directed people to the online webform.

To encourage people to respond to the consultation, we engaged with health and care providers through a number of podcasts and webinars.

Responses to the consultation

We are grateful to all those who responded to the consultation for their valuable feedback. All responses were analysed by Traverse Ltd, which provided independent analysis of the feedback.

We received a total of 566 responses. These comprised:

  • People who use services / public / voluntary and charity groups: 93 (16%)
  • Service providers / commissioning bodies: 407 (72%)
  • Teams of CQC employees: 23 (4%) (team responses equate to over 400 individual employees)
  • Other stakeholders: 43 (8%)

We received responses in the following ways:

  • 492 through our online webform
  • 44 sent by letter or email
  • 30 responses taken at engagement events.

Many responses represented the collective views of an organisation or a team.

What you told us

Summary of key themes from responses

We welcome the high level of response to this consultation, particularly at a time of great pressure for health and care services. Many responses included detailed views and suggestions.

Responses indicated broad overall support across all proposals.

  • Strong support for using a wider range of regulatory approaches to assess quality, rather than relying only on full on-site inspections. Many respondents believed this would lead to a more efficient and proportionate regulatory approach.
  • Strong support for reviewing and updating ratings (or judgements of quality where we don’t rate) more flexibly, rather than following a fixed schedule of inspections. Respondents believed this would allow CQC to be more responsive to changes in the quality of services, resulting in more accurate and reliable ratings.
  • Support for the changes to rating GP practices and NHS trusts. Many respondents believed these would make ratings simpler and easier to understand.
  • Many respondents also thought our proposed approaches would have a positive impact on CQC’s relationship with providers. Providers told us they valued the way we engaged with them during the pandemic and welcomed opportunities for more collaborative working in the future.

Despite the high level of support, there were some areas where respondents raised concerns or queries, including:

  • The potential for a more flexible approach to lead to fewer on-site inspections, which some respondents believed could undermine the value of seeing care being delivered and how this informs our judgements about the quality of care.
  • The reliability and availability of data and information that we will be using to change a rating and how we will collect it.
  • The potential for a simpler ratings process for GP practices and NHS trusts to result in a loss of detailed information that contributes to our reports and judgements.

We address these queries in our responses to each consultation question.

Respondents expressed an appetite for more detail on how we will implement the proposals. For each proposal, we set out the next steps for putting the changes into practice. We’ll ensure that the information on our website is clear about how we are regulating and rating services as we develop our approach following this consultation.


Assessing and rating quality

1. Assessing quality

We propose to assess quality and rate services by using a wider range of regulatory approaches – not just on-site or comprehensive inspections.

Question 1a. To what extent do you support this approach?

Of the 484 respondents who answered this question, the majority (78%) either ‘fully’ or ‘mostly’ supported this proposal. Most respondents to this question (80%) were in the provider/commissioner category, and the majority of these also ‘fully’ or ‘mostly’ supported the proposal. Only 21 respondents (4%) did not support this proposal at all.

Question 1b. What impact do you think this proposal will have?

Many respondents supported our proposal to use a broader approach to assess quality, rather than relying only on full on-site inspections. Respondents believed that an ongoing, dynamic and proactive approach to assessing quality would lead to more accurate and balanced assessments of quality and produce more robust ratings.

“If this ambition is fully realised, it should yield a more targeted and efficient approach to regulation, which would have many benefits, including relieving pressure on providers.”

(Other stakeholder)

Respondents felt the proposed approach would be fair and flexible and enhance CQC’s responsiveness. They were positive that this would encourage providers to improve.

“A broader range of approaches for assessing quality would be beneficial, as this would help identify issues and concerns as early as possible and provide the opportunity to work with practices [so they] can improve.”

(Provider membership organisation)

There was also support for the impact of the approach on CQC’s relationship with providers. Some respondents believed it would result in a more collaborative approach between providers and inspectors and help to develop relationships. Respondents also believed the proposals could be effective in reducing the stress and burden for providers that can be associated with an on-site inspection.

There was also an emphasis on the benefit for people who use services as the proposed approach would help them make informed choices about the services they need.

A number of respondents felt this proposal would support CQC to be more efficient and to concentrate activity where there was likely to be increased risk of poor care.

We respond below to some specific queries raised in the responses, including:

  • Potential for the proposed approach to lead to fewer on-site inspections

We’ll continue to carry out site visits. We know that poor cultures can exist in all types of services, but we’ll focus particularly on the types of care setting or provider where there’s a greater risk of a poor culture going undetected. This may mean we make more frequent site visits to those settings. We’ll use our powers to visit services when we need to respond to risk, when we need specific information that can only be gathered through a site visit, when we need to observe care, and to ensure that our view of quality is reliable.

However, we can also gather some information from other sources, without needing to visit. For example, good quality information is already available for specific types of services, such as some NHS services. We also know from our pilot assessments of home care services and GP practices that we can collect evidence digitally – as we have done during the pandemic – and assess services without always needing to visit.

We’ll use a combination of all our regulatory methods, tools, and techniques to help us assess quality continuously – including site visits where appropriate. As part of this, we’ll gather the views and experiences of a wider range of people and we’ll improve how we analyse people’s experiences of care, and the views of staff, in our assessments.

Some services are using more digital technology to deliver care – we believe this will continue beyond the pandemic. This means we’re considering more ways to collect evidence, as site visits won’t always be right for all types of service.

  • Quality of the data and information we’ll be using to change a rating and how we’ll collect and analyse it

Some respondents were concerned that our proposals might affect our ability to identify risks or improvements in a service, as we would collect different data from a broader range of sources, or that we would not always consider or understand the context of the information we collect. A small number of respondents were also concerned that the proposed approach would mean collecting more data from providers.

We’ll continue to work with providers, the public, our strategic partners and stakeholders to ensure we’re using the best data and information about a service to keep ratings and our information about quality up to date. We’ll only ask for the information we need and that we can’t get elsewhere, and we’ll work with providers, other regulators and partners to coordinate data collections.

We’ll build digital platforms that will better integrate the data we hold, which will enable us to interpret data in a more consistent way. Combined with the experience, knowledge and professional judgement of our inspectors, this will support improved analysis. We’ll be transparent about the information we collect and explain how we plan to use it.

  • Potential for a less transparent and consistent approach to assessing quality for different types of services and providers

As we develop our approach, we’ll make sure that the information on our website clearly explains how we are regulating and rating services.

We’ll use our regulatory powers in a smarter, more proportionate way to enable us to tailor our regulation to individual types of services and circumstances. This means using a variety of methods, tools and techniques to assess quality – not just inspection visits. The changes we’re making will allow us to assess and rate services more flexibly, so we can update our ratings more often in a more accessible, responsive and proportionate way. When we update a rating, we’ll be open and clear about the type of data and information we’ve used, and how it has contributed to our judgements and decisions about any changes to a rating.

Next steps

We are encouraged by the level of support for this proposal and will therefore take this forward. We want to build on the flexible and proportionate approach we adopted at the start of the pandemic.

  • We’ll continue to focus our on-site activity where it is needed most. Site visits will remain an important part of our regulation, but we’ll also draw on a wider range of regulatory approaches to assess quality without always needing to visit. We’ll do this when, for example:
    • we’ve gathered evidence without a site visit and used this to take enforcement action – we’ll be able to use this evidence to update a rating
    • a provider can show us full evidence that they have made improvements following an inspection – we’ll be able to update that rating without another site visit
    • we’re inspecting homecare providers using remote technologies, building on our evaluation of the pilot phase
  • For GP services, we’ll build on our pilot work and continue to develop our methods for gathering evidence without always visiting. We will continue to carry out site visits where appropriate, for example when this is the only way to gather the evidence we need.
  • We'll introduce some further flexibility so that we can use more focused activity to update ratings in a broader range of circumstances. This includes:
    • using more focused inspections of GP services to update overall ratings (including those with an existing overall rating of requires improvement where we had previously needed to carry out a comprehensive inspection in order to update overall ratings)
    • allowing more flexibility and professional judgement in how we aggregate service level ratings for NHS trusts
  • We'll continue to develop our regulatory approach in line with the proposals in our new strategy, ensuring that our regulation is targeted and dynamic, and that data and information about quality and ratings are up to date.

We’ll start to work in this way in July 2021. We’ll ensure that the information on our website is clear about how we are regulating and rating services as we develop our approach, engaging with providers, strategic partners and people who use services in a targeted way.


2. Reviewing and updating ratings

Rather than following a fixed schedule of inspections, we propose to move to a more flexible, risk-based approach for how often we assess and rate services.

Question 2a. To what extent do you support this approach?

Of the 484 respondents who answered this question, the vast majority (379 respondents, 78%) either ‘fully’ or ‘mostly’ supported this proposal. Only 16 respondents (3%) did not support this proposal at all.

Question 2b. What impact do you think this proposal will have?

Many respondents were supportive of our proposal and believed it would allow us to be more responsive to changes in the quality of services and result in more accurate and reliable ratings. Respondents welcomed the risk-based approach, which they felt was more appropriate and effective, and would enable us to identify risks and issues quickly, with a focus on providers or services that needed support.

Feedback emphasised that this approach would be fairer than our current ratings system, as providers would not have to wait for a long period before their ratings are updated. Respondents also thought our proposed approach to reviewing and updating ratings would encourage providers to proactively seek to improve.

“This approach should ensure that CQC is able to offer the public a more up to date and accurate reflection of the state of services.”

(Membership organisation)

“Inspections should be directed where most appropriate whilst not allowing highly rated establishments to sit back behind their latest inspection. This will enable resources to be directed where required most.”

(Person using services/public)

“We believe that the proposal has the potential to provide much more meaningful information to service users by reducing the potentially long period of time between ratings being awarded and reviewed.”

(Other stakeholder)

Respondents had similar queries to those raised for Question 1b, regarding the quality of the external information we would use to review and update ratings, how we would collect it, and the transparency and fairness of the proposed approach.

Again, the value of site visits came through clearly and respondents had the following queries:

  • Inspections are important to people who use services because they use the information we publish to inform decisions about their care

We recognise the importance of the information we publish about services, and how it can support people to make decisions about their care. Under our fixed schedules of working, we always needed to carry out a site visit to assess quality and rate a service. But the changes we propose will allow us to assess and rate services more flexibly, so we can provide a more consistent, up-to-date and accurate picture of quality, and update the information we publish about a service.

  • Risk that a move away from our published inspection frequencies could lead some services to become complacent and feel they don’t need to continue to improve

Our ratings will be more dynamic as we’ll update them when there is evidence that shows a change in quality, rather than being static for a set period. This will give everybody an up-to-date view of quality. This means services will need to continuously demonstrate that they are delivering safe, effective and high-quality care. We will continually assess risk and quality and will use our strengthened relationship management approach to engage with services so we can identify changes in quality (including improvements) and where there are risks.

Next steps

We are encouraged by the level of support for this proposal and will therefore take this forward. We want to build on the flexible and proportionate approach we adopted at the start of the pandemic. Building more flexibility into our regulatory approach includes:

  • Moving away from our published inspection frequencies based solely on overall ratings. For 2021, we’ll continue to respond to risk and inspect and re-rate services where this is appropriate, as we explained in our update on CQC’s regulatory approach published in March 2021. We’ll then provide further information on our website about how often we’ll update ratings as we implement our approach to Smarter Regulation.
  • Developing the process to update ratings and assess quality more frequently and dynamically and update ratings when we have evidence that shows a change in quality.

We’ll start to work in this way in July 2021. Our website will explain clearly how we’re regulating and rating services as we develop our approach, and we’ll carry out targeted engagement on our detailed proposals as we develop them.


Changing how we rate GP practices and NHS trusts

3. Rating GP practices and population groups

We propose to stop providing separate and distinct ratings for the six population groups when rating GP practices.

Question 3a. To what extent do you support this approach?

Of the 414 respondents who answered this question, 254 (61%) either ‘fully’ or ‘mostly’ supported the proposed approach. Notably, 101 respondents (24%) who answered this question ticked the ‘don’t know’ option. Only 22 respondents (5%) indicated that they did not support the approach at all.

Question 3b. What impact do you think this proposal will have?

Respondents were supportive of the proposed changes to rating GP practices because they would make ratings simpler and easier for people to understand.

“The simplification will make it easier for patients / the community to understand the rating and for providers to be accountable.”

(Other stakeholder)

There was support for improving the fairness and reliability of ratings for GP practices. Respondents were also positive about the impact that the proposed approach would have on providers.

“This will make the whole process easier to manage from both sides. It will also provide a better overall picture of the level of service being offered and given.”

(GP provider)

There were some areas of concern raised in relation to the proposed approach.

  • Loss of focus on different population groups and less published information about the quality of care for patients in these groups

We will continue to focus on the quality of care provided to patients in all the population groups as part of our assessments. This will include looking at how practices are providing proactive and personalised care that considers the different needs of their patients.

There is currently little variation in ratings for different population groups as these are usually influenced by evidence and judgements about the quality of care that affect all the people who use the practice. Although we will not provide a rating for each population group for the effective and responsive key questions, we will use the information about them to inform our ratings of these key questions overall. In line with our current approach, we will still publish information about the evidence we have used to make our judgements and decisions about ratings.

As we further develop our approach and implement our strategy, we’ll continue to engage with providers and the public to consider how this change can support us to look at how care is provided across local systems, as well as individual services, and to focus on reducing inequalities.

Next steps

We are encouraged by the level of support for this proposal and will therefore take this forward.

  • For GP practices, we will stop providing separate ratings for the six population groups from October 2021. Our ratings of GP practices will be focused on whether practices are safe, effective, caring, responsive and well-led, and we will also provide an overall rating.
  • We will engage with providers and stakeholders to ensure our approach reflects how providers are delivering person-centred care and acting to address inequalities. We’ll publish clear information on our website to explain how we’re regulating and rating services as we develop our approach.

4. Rating NHS trusts

We propose to remove aggregation for NHS trust-level ratings and develop our current approach to assessing the well-led key question for a trust.

Question 4a. To what extent do you support this approach?

Of the 420 respondents who answered this question, 237 (56%) either ‘fully’ or ‘mostly’ agreed with the proposal. Notably, 102 respondents (24%) ticked the ‘don’t know’ option. Only 22 respondents (5%) did not support the approach at all.

Responses from those in the provider/commissioner category who were from NHS trusts (acute hospitals, mental health services and ambulance services) indicated stronger support for this proposal, with 27 of these 33 respondents ‘fully’ or ‘mostly’ supporting it.

Question 4b. What impact do you think this proposal will have?

There was support for our proposal to simplify the ratings process for NHS trusts as it would make ratings clearer, simpler and easier to understand for the public and people using services, as well as partner organisations. Respondents also emphasised that they found the current approach complex or confusing.

"The publication of a single rating at the overall trust level, rather than multiple levels of complex, aggregated ratings should make it easier to focus on where strategic improvement is needed…”

(Other stakeholder)

There were also supportive comments about the potential broader impacts of the proposed approach, including the opportunity to provide clearer information about improvements needed at trusts and strengthening public confidence.

Other positive responses from different types of services related to our proposed focus on leadership and culture. These responses noted the impact of leadership and culture on all aspects of delivering care, and that it would encourage trusts to seek feedback from staff.

“Being able to focus on the culture and leadership of a trust is key, as this filters down to all areas of the trust and affects the level of care that is provided to all users.”

(Adult social care provider)

Some respondents commented that the proposed approach is fairer, and a small number thought it would enable providers to take action to achieve improvement across services.

There were some queries and areas of concern, which we clarify as follows:

  • Whether CQC ratings or reports would give enough detail about variation in quality between the different services in a trust

We’ll still publish ratings for services and locations where people receive care. The change will be that we no longer aggregate these together to give trust-level ratings.

  • Whether the proposals will make the ratings less accurate and reliable, and less transparent

We want to remove the step in our internal process that aggregates the multiple service-level ratings into separate overall ratings for the safe, effective, caring and responsive key questions for the trust as a whole. These composite trust-level ratings can conceal variation in quality in the service-level ratings they comprise. By removing them, we will improve accuracy and reliability and ensure that ratings for the trust as a whole clearly and directly reflect our assessment activity at that whole organisational level.

  • The need for the trust-level assessment to be effective and reflect quality at service level

The trust-level assessment will be based on a development of our current approach to assessing the well-led key question for an NHS trust. We will work collaboratively with strategic partners and providers as we strengthen our well-led framework in areas such as collaboration, equality, diversity and inclusion. We will also explore how we can more broadly assess overall quality and leadership for a trust, and how findings at service level form part of the assessment.

Next steps

We are encouraged by the level of support for this proposal and will therefore take it forward. We’ll implement these changes in Spring 2022. In developing the approach, we’ll:

  • work with providers, our partners and key stakeholders to develop our assessment approach for NHS trusts
  • work with NHS England and NHS Improvement to align our approach and ensure links with the System Oversight Framework
  • review and develop our framework and approach to rating and reporting in line with wider changes to our regulatory approach.

Measuring the impact on equality

We produced a draft equality impact assessment to identify the opportunities and risks to improving equality and human rights through our proposed changes. We asked for views on:

  • Whether the proposals will have an impact on some groups of people more than others, such as people with a protected equality characteristic.
  • Whether any impact would be positive or negative.
  • How we could reduce or remove any negative impacts.

We received 343 responses through all channels. Respondents showed support for our draft equality impact assessment, with some feeling that our proposed approach to measuring and mitigating the impact on equality would be positive for people who use services and staff, and would improve our relationships with providers.

Some respondents were concerned about how we would collect data to inform our final equality impact assessment. For example, some felt that certain population groups, such as people with disabilities, may not be sufficiently heard or considered.

Respondents also felt the draft equality impact assessment lacked detail about the practical steps we would take.

As this was a draft impact assessment, we have considered feedback and made some changes – see our updated equality impact assessment. We will have more specific actions relating to equality and human rights in our new Equality objectives, to be published in Summer 2021.


How we’ll engage with you in the future

In part 2 of the consultation we also set out our plans for engaging in the future about changes to our regulatory approach. This is about spending more time listening to people’s views earlier in the process and implementing changes quickly.

Although there was no consultation question for this section, we welcomed the comments received on this area.

As well as support for our plans, there were queries about the transparency of future engagement activity. We believe our future approach to engaging on changes will enable us to engage in a more meaningful and timely way.

We have committed to:

  • always meeting our statutory duties to consult under the Health and Social Care Act 2008
  • carrying out fewer large-scale formal consultations, with more ongoing opportunities for you to contribute as we engage in different ways
  • engaging in a more proportionate, targeted and responsive way where we need to consult
  • publishing information to update people about changes to our approach
  • publishing up-to-date information on our website in a more accessible and easy to understand format.

Next steps

We’ll make sure we publish information that explains clearly how we’re regulating and rating services as we develop our approach, so that everybody understands the changes we’re making following our consultation. Information will be updated regularly so that it is clear to all how we’re assessing and rating services.