Speech

Catherine Large OBE at the Annual Apprenticeship Conference 2024

Ofqual's Executive Director for Vocational and Technical Qualifications discusses protecting the interests of apprentices and driving quality in assessment.

Catherine Large, Executive Director for Vocational and Technical Qualifications, Ofqual

Good morning, everyone. It’s good to be back here after speaking to you last March.  

So, Ofqual – we are the statutory regulator of qualifications and assessments, if you didn’t know – a remit that now includes over 1,700 apprenticeship end-point assessments (EPAs) offered by 153 different awarding organisations against 577 apprenticeship standards. Our job is to ensure that apprentices are assessed fairly, in a way that produces results that employers can rely on. We are here to protect the interests of apprentices and students of all ages as they take their assessments – we put their needs first in everything we do. 

The market for apprenticeship end-point assessment continues to change and develop. Last year I updated you on the EQA transition programme initiated by IfATE. Through that programme Ofqual has now officially recognised over 80 end-point assessment organisations as regulated awarding organisations. Some newly-recognised awarding organisations are entering new markets, some are changing their minds and leaving. We are seeing a lot of applications to change or expand awarding organisations’ scope of recognition at present. Each of these requires an important check that the awarding organisation has the capacity and capability to deliver what it is applying for. 

I want to talk to you today, therefore, about protecting the interests of apprentices and driving quality in assessment, in the context of a market which is still developing, maturing, and changing. 

Firstly, I will talk about the part that awarding organisations can play in tackling low apprenticeship completion rates, which I know are on all of our minds at present. 

Secondly, I’ll share a couple of practical examples of how quality of assessment could be improved in the system, building on the evaluation that Ofqual has undertaken to date.  

And thirdly, I will talk about Ofqual’s recent report on Functional Skills qualifications, which was a very important topic of discussion at this conference last year, and remains a key interest, I know. 

So, coming firstly to low completion rates. There are various reasons, of course, why apprentices might not complete their apprenticeship, from positive reasons such as securing a new role with their employer, through to not enjoying their choice of apprenticeship, changing their minds, or experiencing a lack of support through their programme. Ofqual’s focus is on just one part of the apprenticeship – the assessment – ensuring that apprentices taking a regulated EPA are assessed fairly. But we are keen to understand whether the EPA itself is contributing to low completion rates: whether, for example, gateway requirements, delays to scheduling or rescheduling assessments, other logistics, or the experience of the assessment itself are negatively impacting apprentices. We understand from providers that assessment booking processes and limited flexibility with assessment locations take their toll on scheduling EPA.  

So, towards the end of last year, we asked all awarding organisations who deliver EPA to provide information regarding their planning tools, learner registration system, gateway process, assessor availability and availability of assessment materials. The responses gave us a real insight into this area, and we’ll be sharing some of our key findings during this afternoon’s workshop with my colleagues.  

While it’s good to hear that just under a third of awarding organisations aren’t experiencing challenges in the overall planning and timely delivery of EPA, a broader range of responses indicates that active provider engagement is key for effective EPA planning and delivery. Awarding organisations are telling us that when providers record in the Individualised Learner Record system which awarding organisation they intend to use, and go on to register their apprentices with the awarding organisation early in the process, it helps to ensure that end-point assessments are delivered efficiently and in line with provider expectations. Forty-three awarding organisations told us that they rely on direct provider interaction as the most reliable source of pipeline intelligence. Scheduling an assessor is listed as the biggest challenge for awarding organisations when they receive less than 6 months’ notice to deliver EPA. 

Awarding organisations also flagged that changes to apprentice availability affect timely EPA delivery. A third of awarding organisations suggest that this is the most common reason for delay or cancellation of EPA. We’d welcome your feedback at the workshop today about how apprentice availability for EPA, and all these other factors, could be influenced. 

Moving on now to some practical things that could be done to drive quality in assessment based on what Ofqual has observed. I was keen to be able to share with you today some specific take-aways that, with increased awareness, might make a difference to the apprentices we all interact with.  

One of our key areas of interest at present is the important role that the individual EPA assessor plays in making accurate and consistent judgements about an apprentice’s performance in the workplace. 

Given the system’s reliance on this critical role, it is important to understand how awarding organisations induct and train their assessors to support high quality assessment decisions. Last September, we asked all awarding organisations delivering EPAs to explain their approach to assessor training. We also asked for their assessor training schedule for the coming year so that we could observe training activity. 

This work is in its early days, but we’re seeing some excellent examples of awarding organisations that support their assessors with clear guidance and training to enable them to deliver consistent assessment judgements.  

We have, however, identified areas of risk through this work. For example, in some cases annual professional registration is being used as a substitute for subject-specific training. Where we’ve identified risks, we’re already in dialogue with the relevant awarding organisations about making changes, and we expect them to make improvements to the way they induct and train their assessors.  

If you are an awarding organisation that hasn’t yet had this conversation with us, you might want to think about the training you offer your assessors and whether it supports consistent and valid judgements. If you are an employer or training provider, this might be an area you ask about when selecting the awarding organisation that will offer EPA to your apprentices. And if you are yourself an assessor, you might want to ensure that the organisation you work for is providing you with the guidance you feel you need to make appropriate decisions when undertaking the assessment. 

Ofqual is hosting a workshop here at the conference tomorrow on the themes arising from our work in this area. So please do go along to discuss this in more detail and ask any questions you may have. 

I also wanted to mention one other practical example which is coming up as a current theme and where improvements could be made across the system: multiple choice tests. These are crucial to consider because they are an assessment method used across many different apprenticeship standards. They seem straightforward to administer and mark, but, as many of you know, there are many pitfalls in constructing valid and reliable multiple choice test papers. It is not a cheap or easy option to deliver. 

Issues may include, for example, test items that inadvertently draw the candidate’s attention to a particular answer. The test’s validity can also be compromised if it contains implausible wrong answer options that can easily be eliminated by a test taker with little or no subject knowledge. And if a question is too wordy or complex, it can confuse those taking the test, or waste their time. The size of the item bank is important as well: each test needs to draw from a wide enough range of questions so that apprentices are not getting the same test each time. 

Again, where we have identified specific issues through our monitoring and evaluation, we are already in discussion with the relevant awarding organisation to make improvements. But, given this is a recurrent theme, I just thought it was worth flagging today as something that all awarding organisations should consider in the design of their EPAs, to support valid assessment and fairness for apprentices across the system. 

I want to turn now to Functional Skills qualifications, as government, as you well know, specifies that English and maths are a critical part of an apprenticeship. You may be aware that Ofqual has recently issued a report on these, having looked in detail at how the reformed assessments are functioning. 

As part of our evaluation, we conducted extensive engagement with a range of stakeholders to gather feedback on what they feel is working well, and less well, about reformed Functional Skills. Many of you in this room fed in, for which we are very grateful. We are particularly grateful to colleagues at AELP, AoC, Holex and other representative groups for providing feedback and also helping to field views from the sector. We heard through our surveys and focus groups that stakeholders feel that the reformed assessments are harder, particularly for Functional Skills maths.   

The report we published last month explored a range of concerns around the assessment of these qualifications. We concluded that although our findings do not indicate the need for a change to the overall approach to assessment, there are some improvements that awarding organisations can make, as well as certain aspects that would benefit from further exploration. 

In response to stakeholder feedback, we looked specifically at the level of difficulty in the reformed maths Functional Skills assessments at level 1 and level 2. Research conducted at the time of reform found the level of demand of the old and reformed maths assessments to be very similar. Further to this, a review of papers conducted during the evaluation indicated that the level of demand was broadly appropriate against the subject content set by the Department for Education (DfE).  

We have, however, identified a range of factors that might lead people to believe that the new qualifications are more difficult. These include changes made to the subject content during the reform, and the introduction of discrete non-calculator assessment and assessment of underpinning skills. We also noted changes to the cohort taking the qualifications over time, and the need for providers to become familiar with the changes to the qualifications, particularly in light of the disruption caused by the pandemic. Ofqual is doing more work to explore the assessment of problem solving in level 1 and 2 maths. We believe this may be contributing to an additional reading load and to more questions being set in context than necessary. 

In terms of next steps, we’re already researching how to improve the assessment of problem-solving questions in maths. We’re following up with awarding organisations on our findings. And, as part of our ongoing monitoring of the qualifications in delivery, we’re going to carry out further paper reviews.   

Improving Functional Skills qualifications is a crucial part of improving quality in apprenticeships, and we are committed to playing our part. 

To conclude my 10 minutes, I wanted to draw your attention to a new resource for apprentices that we published earlier this month, during National Apprenticeships Week. The End-point assessment guide for apprentices is a new publication that helps apprentices know what to expect from their EPA and how to prepare for it. It explains the additional support that’s available, if needed, when apprentices take their assessment and includes information on results, resits and retakes. 

We’re very grateful to the awarding organisations, training providers and apprentices who, together with DfE and IfATE, helped to shape the content of this guide. It is our firm belief that apprentices should be afforded the same protection as their peers taking qualifications alongside them, such as A levels. They are just as entitled to be treated fairly as they take their assessments. An important first step was to ensure that their assessments are regulated, which was not universally the case until recently. Now that apprenticeship end-point assessments are under the remit of a statutory regulator – either Ofqual or the Office for Students – we can publish guides such as this for apprentices that demonstrate the protection afforded to them and where they can turn for help, if needed.  

We are always interested in hearing about ways to improve qualifications and other aspects of the assessment system, so please do come and talk to us. As I said, my colleagues are here hosting workshops today and tomorrow. Let’s continue the conversation about how we can work together in the best interests of apprentices. 

Thank you.

Published 27 February 2024