Origins and Evolution of the CASLO Approach in England - Chapter 6: Recent history
Published 18 November 2024
Applies to England
Regulatory arrangements were published in August 2008 to support implementation of the QCF (Ofqual, 2008a). This document specified that all QCF units had to be written entirely in terms of learning outcomes and assessment criteria, and indicated that passing a unit meant mastering all specified learning outcomes. The dominance of the CASLO approach became evident as the vast majority of regulated qualifications transitioned into the QCF (typically, to qualify for public funding).
Just a few years later, however, both the QCF and the CASLO approach were to be called into question following a series of official policy reviews. In 2015, the QCF was withdrawn, meaning that there was no longer a regulatory requirement for any qualification in England to adopt the approach. Indeed, an increasing number of qualifications and assessments were prohibited from doing so (a trend that started even before the QCF had been withdrawn).
Post-2010 policy reviews
After the May 2010 general election failed to return a single governing party, the Conservatives and Liberal Democrats formed a coalition. Although plans for reforming Technical and Vocational Education and Training did not loom large in their initial ‘Programme for Government’ (HM Government, 2010), a series of high profile policy reviews subsequently paved the way to substantial reforms, which continued following the next election when the Conservatives won a majority.
None of these reviews focused specifically on the CASLO approach to qualification design. However, the approach did feature in many of them, sometimes obliquely and sometimes more directly. Whereas, prior to 2010, official reviews tended – either implicitly or explicitly – to support the approach, the post-2010 reviews tended to be more critical. We will consider how the approach featured within 5 of these reviews, and how this influenced subsequent policies and practices related to the CASLO approach:
- Wolf review – 14 to 19 vocational education (Wolf, 2011)
- Richard review – apprenticeships (Richard, 2012)
- CAVTL review – adult vocational teaching and learning (CAVTL, 2013)
- Whitehead review – adult vocational qualifications (Whitehead, 2013)
- Sainsbury review – technical education (Sainsbury, 2016)
Wolf report
In his Foreword to the Wolf report, Secretary of State for Education, Michael Gove, explained that he had invited Alison Wolf to confront a longstanding “failure to provide young people with a proper technical and practical education of a kind that other nations can boast” (Wolf, 2011, page 4). Wolf approached this challenge with a wealth of experience, which – from the perspective of the present report – included having conducted substantial research and analysis into NVQs and GNVQs, and having written a landmark book on Competence-Based Assessment (Wolf, 1995).
Wolf was critical of the state of vocational education in England, concluding that many students were being let down. She was particularly critical of the plethora of low-level qualifications that students were encouraged to take, which had little or no labour market value. This meant that many 14 to 19-year-olds left education without the skills that would enable them to progress.
Recommendations from the Wolf report were extremely wide ranging, addressing a host of issues related to curriculum, teaching, qualifications, apprenticeships, funding, work experience, employer involvement, regulation, accountability measures, and others too. At the heart of these recommendations was the intention that all 14 to 19 students should be following a valuable programme of learning with clear progression opportunities, and that all supporting systems and structures – including qualifications, funding, accountability, regulation, and so on – should clearly align with this intention.
Wolf foregrounded the problem of perverse incentives related to funding and accountability, which steered learners away from coherent programmes and encouraged them to focus upon accumulating easy-option, low-value qualifications. Indeed, she noted the trend for students to be channelled away from high-value academic qualifications toward low-value vocational ones, which were easier to achieve despite attracting equivalent school performance table points. This helped schools to (appear to) perform well but ultimately let students down.
Wolf argued that 14 to 19 learners should not be studying highly-specific qualifications that were unsuitable for them, including those designed specifically for adults working towards occupational competence, and based solely on National Occupational Standards. Because young people change sectors, occupations, and jobs very frequently during their first decade of employment, they needed to study “fairly general” vocational qualifications (Wolf, 2011, page 74). Schools and colleges should therefore be incentivised to ensure that vocational qualifications studied by 14 to 19 students were suitable and valuable:
Only those qualifications – both vocational and academic – that meet stringent quality criteria should form part of the performance management regime for schools.
(Wolf, 2011, page 11)
Government was to put this principle at the heart of its response to the report.
Concern over the QCF
Although Wolf did not focus specifically on the CASLO approach, she did have plenty to say about the (newly established) Qualifications and Credit Framework, which stipulated the approach. She also had a lot to say about QCF qualifications that were offered in schools, particularly those that had been allocated the same performance table points as GCSEs. Nominal equivalence was not the same as substantive equivalence, Wolf insisted, and vocational students who took ‘GCSE equivalent’ QCF qualifications had been sold short. Furthermore, she argued, QCF qualifications – designed for adults with narrowly defined occupational goals – should not be the main, let alone the only, type of vocational qualifications offered to students in schools and colleges.
The report also flagged up various design features of QCF qualifications that stemmed from their association with National Occupational Standards. Of particular relevance to our discussion of the CASLO approach, Wolf proposed that the following characteristics made QCF qualifications ill-suited to delivery within education and training institutions:
- as QCF qualifications require students to achieve all specified learning outcomes, this mastery requirement means that “no single element can be difficult” (page 88) because, if a student fails one element, then they fail the qualification
- the same requirement – with its heightened risk of students failing – also places “enormous downward pressure on standards” (page 87) in the context of teachers assuming considerable responsibility for assessing QCF qualifications, given the perverse incentive not to fail students when institutions are paid by results
- the mastery requirement also incurs “large costs in time and money spent assessing, recording, re-assessing, etc.” (page 88)
Recognising problems such as these, Wolf proposed that:
If awards are to be used for national performance monitoring, it is vitally important that there be very strong safeguards against downward pressure on standards. It would be nice to think this is unnecessary, but the experience of the last few years tells us otherwise. All those which are used, vocational or academic, should make serious demands of students, develop and accredit distinctive skills and attainments, facilitate progression post-16 and incorporate clearly established, and properly monitored, national standards. They must, therefore, have a strong element of external assessment. This need not, and indeed should not, mean assessment entirely on the basis of examinations, which in the case of vocational awards will often be quite inappropriate. But we know that, without regular external referencing, assessment standards in any subject invariably diverge across institutions and assessors.
(Wolf, 2011, page 112)
Government’s Response
Government accepted recommendations from the Wolf report without reservation, and proposed to implement them in both letter and spirit. Its action plan identified 3 major themes, the second of which promised to:
Reform performance tables and funding rules to remove the perverse incentives which have served only to devalue vocational education, while pushing young people into qualification routes that do not allow them to move into work or further learning. Those vocational qualifications that attract performance points will be the very best for young people – in terms of their content, assessment and progression.
(DfE, 2011a, page 3)
This would be achieved by tightening the accountability system to ensure that only certain vocational qualifications would be counted in school and college performance tables – only ones that government deemed to be respected, and comparable to others in the tables in terms of the rigour of their content and assessment. These qualifications would need to meet stringent quality criteria, meaning that awarding organisations would need to undertake a considerable redevelopment programme.
Performance table requirements
The Department for Education (DfE) developed its response in 2 phases. It launched a consultation on qualifications and performance tables for 14 to 16-year-olds immediately (DfE, 2011b), followed 2 years later by consultations on vocational qualifications for 16 to 19-year-olds (DfE, 2013a) and performance tables for 16 to 19-year-olds (DfE, 2013b). Alongside the first consultation, the Department published technical guidance for awarding organisations (DfE, 2011c), which provided further insights into its proposals and their rationales. This began by stating that:
In line with Professor Wolf’s recommendations, in future, only qualifications which are high quality, rigorous and enable progression to a range of study and employment opportunities will be recognised in school performance tables for 14-16 year olds.
(DfE, 2011c, page 1)
It went on to insist that vocational qualifications “must be just as stretching and challenging as academic or general qualifications” (DfE, 2011c, page 1). To demonstrate this, all performance table qualifications would need to satisfy DfE criteria related to size, grading, external assessment, synoptic assessment, progression opportunities, proven track record, and appropriate content (which would be determined, up front, via a formal evaluation process). These rules included a minimum of 20% external assessment: “to ensure that vocational qualifications offer a comparable level of challenge to academic qualifications and are seen to do so” (DfE, 2011c, page 6). They also included an unspecified amount of synoptic assessment:
Synoptic assessment is vital to increase the level of challenge for students as it requires a broader comprehension. This will help ensure that vocational qualifications are as challenging as academic ones. Taken with the minimum size requirement, adding synoptic assessment will ensure cohesiveness across a qualification and prevent qualifications from being treated as a series of disconnected components.
(DfE, 2011c, page 7)
Note that the synoptic assessment requirement responded to a frequent criticism of CASLO qualifications, and unitised qualifications more generally, which is that they can lead to fragmented teaching and learning. Although Wolf would have recognised this criticism, it was not actually foregrounded in her report. External assessment was mentioned, of course, although more to secure confidence in the application of national standards than to secure parity of standards between vocational and academic qualifications.
Technical requirements for school and college performance tables were refined over time. By 2017, requirements for Technical Awards (14 to 16) and for Technical Certificates, Technical Levels, and Applied Generals (16 to 19) were as summarised in Table 6 (see DfE, 2017).[footnote 1]
| | 14 to 16 Qualifications | 16 to 19 Qualifications |
| --- | --- | --- |
| Declared purpose | required | required |
| Size | at least 120 GLH | at least 150 GLH (AG); at least 150 GLH (TC); at least 300 GLH (TL) |
| Employer or HE recognition | not applicable | required |
| Appropriate content | “set out in the specification or supporting documentation clear information about the content of the qualification; this should be more than just the learning outcomes and assessment criteria” | at least 60% mandatory (AG); at least 40% mandatory (TC); at least 40% mandatory (TL) |
| Appropriate assessment | at least 40% external; internal assessment verified or moderated; suitably controlled | at least 40% external (AG); at least 25% external (TC); at least 30% external (TL); internal assessment verified or moderated; suitably controlled |
| Synoptic assessment | ‘sufficient’ synopticity | ‘sufficient’ synopticity |
| Grading | Pass, Merit, Distinction or more detailed | Pass, Merit, Distinction or more detailed |
| Employer involvement | not applicable | for TC and TL only |
| Progression opportunities | required | required |
| Track record | required | required |

Table 6. Technical guidance for awarding organisations.
These DfE requirements effectively ruled out adopting the CASLO approach, at least at the qualification level, given the likelihood of externally assessed units adopting a classical approach (based upon numerical marking as opposed to direct judgement against assessment criteria). Having said that, it would still have been possible for awarding organisations to develop hybrid qualifications, with internally assessed units adopting the CASLO approach and externally assessed units adopting a classical one. Note that the ‘appropriate assessment’ criterion permitted ‘verification’, which tended to be associated with quality assurance within CASLO qualifications. Many Level 3 BTECs, for instance, were hybridised through the reform process.
Richard report
In June 2012, the entrepreneur Doug Richard was asked by the Secretary of State for Education (Michael Gove) and the Secretary of State for Business, Innovation and Skills (Vince Cable) to consider the future of apprenticeships in England, and to recommend how they could meet the needs of a changing economy.
Richard was radical in his response, arguing that apprenticeship should be redefined – with the relationship between apprentice and employer at its heart – as a high skill, high status pathway. Apprenticeship should no longer be seen as a government-led training scheme, dominated by training professionals. It should be understood as an employer-led educational journey for an apprentice who is new to their role and has much to learn. In exactly the same way, standard setting and assessment should also be employer-led, rather than being dominated by Sector Skills Councils and awarding organisations.
Richard raised concerns of direct relevance to the CASLO approach, albeit couched within broader concerns over apprenticeship standards and assessment.
Concern over standards
Central to the Richard report was the idea that apprenticeships no longer provided a guarantee of overall occupational competence. This was primarily due to apprenticeship frameworks in which a “welter of qualifications” acted “like stepping stones”, yet often without ever declaring apprentices competent (Richard, 2012, page 4). Worse still, “we have an extraordinary number of qualifications, which under the guise of flexibility can be stitched together in an infinite number of combinations leading to any possible outcome but no clear accomplishment” (Richard, 2012, page 6).
This lack of coherence across qualifications was compounded by micro-level specification of National Occupational Standards, which made it hard to see the wood (overall competence) for the trees (elements of competence). Worse still, this inadvertently constrained innovation and flexibility in teaching, and meant that apprentices spent too much time being assessed and not enough time being trained:
Too much provision is however driven by the need to tick off a very long list of competencies, required to complete the requisite qualifications. This has meant that, today, too many apprenticeships involve, in part if not [in] total, a heavy focus on on-going assessment – indeed many apprenticeships are delivered on the ground almost exclusively by individuals called assessors, rather than trainers, teachers or educators. Much of the time which apprentices spend ‘training’, is in fact spent with their assessor providing evidence of their ability to meet competency requirements. I believe apprenticeships should be about new learning, and those involved in delivering apprenticeships should focus on teaching and coaching – this should be their primary task, the thing they are paid to do.
(Richard, 2012, page 87)
Streamlining would provide a solution to these problems. There should be just a single qualification for each apprenticeship, and its outcome-focused standard:
should clearly set out what apprentices should know, and be able to do, at the end of their apprenticeship, at a high level which is meaningful and relevant for employers
(Richard, 2012, page 17)
The effect would be to simplify the system, freeing up curriculum and pedagogy at the same time. Quality, in this newly simplified system, would be underpinned by strong leadership from employers.
Concern over assessment
The idea that apprenticeships failed to provide a guarantee of overall occupational competence also influenced recommendations concerning assessment, which also risked losing sight of the wood for the trees:
Finally, we know that success in an individual qualification or component of an apprenticeship does not always guarantee competence in actually doing the job. Employers tell me that individuals could tick off the many tasks involved but not, at the end, be genuinely employable and fully competent.
(Richard, 2012, page 50)
Streamlined standards would result in streamlined assessments, which would help to solve this problem too. These streamlined assessments – scheduled for the end of an apprenticeship over a period of days or weeks – would be far more integrated:
The final test and validation must be holistic, in that it seeks to test the full breadth of the relevant competencies not merely the incremental progression of the apprentice. That may take the form of a project or an assessment in front of an examiner. It should be performance and real world based, rather than just theoretical. It should be primarily at the end of an apprenticeship, not measuring progress during it.
(Richard, 2012, page 8)
Quality, in this simplified assessment system, would be underpinned by strong leadership from employers. However, Richard believed that there was also a need to externalise the system, to underpin its credibility. This was not in the sense of insisting upon external written exams. Indeed, Richard proposed that the test “will need to be primarily practical and involve directly observing whether the apprentice can do their job well, in different and novel circumstances” (Richard, 2012, page 54).[footnote 2] Instead, externality would be provided by appointing assessors who were independent of anyone with a strong interest in the apprentice passing.
Government’s response
Matthew Hancock, Minister of State for Skills, received the Richard report on behalf of the DfE and BIS, and set out the government’s next steps for consultation. He concluded that the report set out a compelling case for reform, which would ensure that apprenticeships become “rigorous and responsive” to employer needs (DfE & BIS, 2013, page 3). Post-consultation decisions were set out in ‘The Future of Apprenticeships in England: Implementation Plan’ (HM Government, 2013a).
On standards, government decided that:
In future, Apprenticeships will be based on standards designed by employers to meet their needs, the needs of their sector and the economy more widely. These standards, which will replace the current frameworks, will be short, easy to understand documents that describe the level of skill, knowledge and competency required to achieve mastery of a specific occupation and to operate confidently in the sector.
(HM Government, 2013a, page 4)
The new standards were therefore intended to embrace both practical and theoretical elements.[footnote 3]
On assessment, government decided that:
An apprentice will need to demonstrate their competence through rigorous independent assessment, focused primarily on testing their competence at the end of their Apprenticeship. The assessment will be against the relevant standard, and employers will have a key role in developing the high level assessment approach.
(HM Government, 2013a, page 4)
Grading was also to be introduced, to encourage apprentices to strive for excellence.
Of relevance to the CASLO approach, the new standards were still required to be outcome-based, and they were still required to certify mastery. However, the outcomes were to be defined very much more succinctly, and mastery was to be understood correspondingly holistically:
The new standards will be short (typically one side of A4), easy to understand documents that describe the level of skill, knowledge and competency required to undertake a specific occupation well, and to operate confidently within a sector. They will focus on how an apprentice should demonstrate mastery of an occupation, and will not list narrowly defined tasks.
(HM Government, 2013a, page 11)
Indeed, on the same page, the report appeared to redefine ‘mastery’ in terms of the need for an apprentice to be able to transfer their skills when moving from one company to the next in the same occupation. This was consistent with the principle of focusing each standard upon a broadly defined occupational role rather than a narrowly defined job (with a particular employer, which might only require a subset of the skills required for essentially the same role with another employer).
Having said that, just a few pages later, the report explained that apprentices would still have to demonstrate “their ability in all areas of the standard” (HM Government, 2013a, page 15), which suggested that mastery still meant jumping a series of hurdles. Indeed, this was clearly spelt out:
Grading will be applied to the full Apprenticeship standard and a mastery mechanism of assessment will be used. This means that apprentices will need to pass every aspect of their assessment in order to be successful, but not every aspect will need to be graded. This approach means that an apprentice will not be able to compensate for failure in any one aspect of the assessment with a strong performance in another area.
(HM Government, 2013a, page 18)
Finally, reflecting the idea of holistic competence, the document explained that assessment would include:
a synoptic element to the end-point assessment, requiring the apprentice to identify and use effectively in an integrated way an appropriate selection of skills, techniques, concepts, theories, and knowledge from across their training
(HM Government, 2013a, page 17)
Both standards and assessment plans were to be developed by ‘Trailblazer’ groups, comprising leading employers and professional bodies. An accompanying document ‘Guidance for Trailblazers’ (HM Government, 2013b) explained these decisions in slightly more detail. The desire to put employers in the driving seat meant that they should have as much freedom as possible when developing standards and assessment plans. This included the option of mandating the achievement of existing qualifications within the standard, if they wished to.[footnote 4] Where rules were specified, they tended to be fairly loose, such as the requirement for the specification of a standard to be “concise (typically around one side of A4)” and “written in clear and simple language” (HM Government, 2013b, page 14). The concept of ‘mastery’ was not elaborated in this guidance.
Subsequent guidance extended the anticipated length of each standard to “one to two sides” (HM Government, 2014, page 23).[footnote 5] It also seemed to loosen the mastery requirement somewhat, given the pragmatic need to sample that arises when assessment is no longer continuous:
The end-point assessment must assess across the whole standard but it does not have to assess every aspect. When thinking about which aspects of the standard would need to be formally assessed at the end of the programme, it may be helpful to think about how critical it is for the occupation, how frequently it is used and whether it links to professional registration.
(HM Government, 2014, page 44)
Yet, the passing grade was still intended to certify full competence, and this expectation continued across subsequent iterations of the guidance.
The next iteration provided further elaboration of what a successful apprenticeship standard might look like:
At the core of a successful apprenticeship standard are two things:
- A short and clear role description setting out the main activities that someone in this occupation would do, in language that can be easily understood by someone without technical knowledge.
- A definitive list of the skills, knowledge and behaviour that you as an employer would expect from someone who is a fully competent professional in the occupation.
(HM Government, 2015, page 17)
Trailblazers were still restricted to 2 sides of A4 (size 12 font), unless proposing a ‘core and options’ approach, which permitted slightly more space.
Decisions on the design of apprenticeship standards and assessment plans, which stemmed from recommendations in the Richard report, represented a shift away from the CASLO approach. Yet, how radical a shift this was to be remained a little unclear. On the one hand, Richard bemoaned the kind of micro-level specification that had been associated with NOS and NVQs. He argued that continuous assessment of finely specified standards meant that apprentices were committing too much time to assessment – time that would be better spent on training. The switch to external end-point assessment was intended to render it shorter, more holistic, and no longer reducible to an exercise in box ticking. On the other hand, Richard still wanted apprenticeships to be defined in terms of outcomes, and still wanted the passing grade to certify full occupational competence. This translated into guidance that sometimes sounded very reminiscent of the CASLO approach, which left open the possibility that assessment under the new approach might still be reduced to an exercise in box ticking, albeit with fewer boxes to tick (bearing in mind how streamlined the new standards had become).
The Federation for Industry Sector Skills and Standards (FISSS) managed apprenticeship certification under the framework system, and was to continue doing so under the new standards system.[footnote 6] During 2014 and 2015 it published a series of reports intended to help Trailblazer groups, and ‘enablers’ of those groups (including professional bodies and Sector Skills Councils) to respond to the 2013 ‘Implementation Plan’. Its ‘toolkit’ for enablers (FISSS, 2014) provides interesting insight into the variety of ways in which the first Trailblazer groups approached their remit. For instance, although it stated that there “may be more value in starting afresh” it recognised that many of the first Trailblazers “based their respective standards primarily on the existing framework, as it already met employer requirements” (FISSS, 2014, page 21). It also continued to endorse functional analysis, with a nod to the skills built up by its members:
- Functional elements: How are the key functional elements of the occupational competence – professional skills, knowledge, and (optionally) behaviours – identified and agreed? Functional analysis is the recognised approach.
- Does the group need support with the functional analysis of the occupation or job role to draw out the relevant skills, knowledge, and behaviour that demonstrate competency? Professional bodies and sector skills councils have extensive technical knowledge of standard development.
(FISSS, 2014, page 21)
CAVTL (McLoughlin) report
In December of 2011, the Department for Business, Innovation and Skills (BIS) set out plans for reforming the further education and skills system, which included actions to develop and promote excellent teaching (BIS, 2012). An independent commission on adult education and vocational pedagogy would be established with a remit to develop a sector-owned strategy and delivery programme.
The Principal of City & Islington College, Sir Frank McLoughlin, chaired this Commission, which reported in March 2013. Reflecting an intention to speak on behalf of a range of stakeholders – including industry stakeholders, teacher trainers and practitioners, and professional associations – their report tends to be known as the CAVTL report (Commission on Adult Vocational Teaching and Learning). It focused on 18+ learners on vocational (but not pre-vocational) programmes.
The report was titled ‘It’s about work’ and stressed that learners require a clear line of sight to work to be able to appreciate exactly why they are learning what they are being asked to learn (CAVTL, 2013). At the heart of the report was the idea of strengthening links between teaching and learning (on the one hand) and employers and employment (on the other). This was embodied in the idea of creating a two-way street, which meant that employers should be full partners in the further education and skills system (alongside trade unions and professional bodies) and not mere customers to colleges and training providers.
Genuine collaboration was key to establishing this two-way street. Employer involvement and influence would need to be improved, including direct involvement in curriculum planning. In turn, this would require more flexible qualifications, which could be tailored to local needs. The Commission recommended a ‘core and tailored’ approach, that is, a national core plus a tailored element to meet local demands.
The vision of a two-way street would need to be supported by excellent teachers and trainers – dual professionals who combined occupational expertise with pedagogical expertise. But these professionals would need training and development, which would require substantial investment. The need to invest in professional updating was identified as a particular priority.
In addition to multiple site visits, McLoughlin commissioned a series of briefing papers. The first, provided by Geoff Stanton, former Director of the Further Education Unit, was discussed at the first meeting of the Commissioners. It is worth mentioning because of its discussion of vocational qualifications. Stanton emphasised how effective teaching and learning depends on striking the right balance between qualification design, the development of learning programmes, and pedagogical planning. He noted that, at different points in time, these different factors had been weighted differently. With the introduction of NVQs, the emphasis now lay very heavily on qualification design. A once iterative process had now become extremely linear:
- first develop occupational standards
- then develop qualification structures and processes to suit these standards
- then develop learning programmes to suit these structures and processes
Unfortunately, according to Stanton, this model had resulted in some very negative consequences for vocational pedagogy. Because teachers were not involved in the specification of standards:
some outcomes though measurable proved very difficult to teach, some important learning experiences were neglected because they could not be easily reflected in fundable outcomes, and many of the standards were expressed in terms that were incomprehensible to those hoping to achieve them
(Stanton, 2012, page 7)
Although he recognised that the advent of NOS had enabled much previously unstructured training to be systematised, which was important, he also noted the tendency for trainers to use those standards directly as the basis for qualification delivery – as though they constituted a learning programme – rather than indirectly as the foundation for course development. He implied that we should not be too surprised about this, bearing in mind that, while trainers were required to possess a qualification in NVQ assessment, they were not required to possess a qualification in NVQ teaching and learning.
The Commission echoed these concerns, particularly those related to the tendency to construct learning programmes directly upon NVQ-mediated NOS, with little if any attention paid to curriculum progression and pedagogical implications:
We need to put curriculum development and programme design back at the heart of vocational teaching and learning. Over the last 30 years, the emphasis has shifted from curriculum development to qualifications design, which has wrongly been equated with programme design. Together with a funding regime based on qualifications, this has exacerbated a focus on ‘assessment as learning’ and qualifications.
(CAVTL, 2013, page 14)
The newly formed Education and Training Foundation was charged with taking forward recommendations from the CAVTL report, although the Commission envisaged that certain of the recommendations – including its recommendation concerning ‘core and tailored’ qualifications – would be developed in the forthcoming review of adult vocational qualifications.
Whitehead report
In spring 2013, Matthew Hancock asked Nigel Whitehead, BAE Systems Group Managing Director and Commissioner for the UK Commission for Employment and Skills (UKCES), to review adult vocational qualifications in England. As discussed earlier, the UKCES inherited responsibility for managing National Occupational Standards when it was established in 2008, soon committing to a substantial reform programme, which was rolled out in 2011. Recall that the Richard review had criticised these reformed NOS, and Matthew Hancock formally responded to these concerns in spring 2013 (DfE & BIS, 2013). The new model of apprenticeship standards was confirmed a month prior to the Whitehead report being published (see HM Government, 2013a).
Whitehead interpreted his remit in terms of seeking out issues for improvement and providing recommendations for reform. He identified a number of “systemic weaknesses and unintended outcomes” (Whitehead, 2013, page 3) and presented a vision for reform designed to ensure that adult vocational qualifications would become more:
- relevant – being linked directly to occupations, either to support entry into an occupation, or to provide professional development within it
- rigorous – being more reliable, robust, and graded, derived from clear and ambitious occupational standards, and not constrained by QCF design rules
- recognised – being better understood and respected, with better data on progression and returns
Underpinning this vision was the principle of employer ownership. Adult vocational qualifications should be driven by business leadership rather than by government management. Employers should take end-to-end responsibility for workforce development, working in partnership with competitors, supply chains, unions, training providers, professional bodies and awarding organisations.
Whitehead interpreted ‘vocational’ to mean qualifications linked directly to occupations. He excluded from this category – and from his review – low-level, confidence-building qualifications designed to recognise progress towards the labour market. Indeed, he saw the lack of identity of adult vocational qualifications as a problem in its own right, recommending that Ofqual should regulate them as a qualification type with their own design principles. Making them more relevant to employment would also help to improve their identity, giving them a clear line of sight to a job or to a range of jobs, consistent with recommendations from the CAVTL report, and reflecting concerns from the Wolf report over the prevalence of low-value qualifications.
Whitehead drew on numerous observations and recommendations from Wolf, Richard, and CAVTL. The following 3 sections illustrate this, while discussing issues of particular relevance to the CASLO approach.
Standards
Whitehead echoed Wolf, Richard, and CAVTL in noting that over-detailed NOS risked constraining both teaching and assessment. He agreed with Richard that we should move away from NOS written with excess detail toward “clear high level outcome-based” standards (Whitehead, 2013, page 31). He applied the same reasoning to QCF qualifications, reiterating the concern expressed by CAVTL that detailed specification of assessment criteria risked creating a culture of ‘assessment as learning’. Whitehead was particularly concerned that overly prescriptive standards constrained training providers, making it difficult for them to customise the curriculum to meet local needs.
More specifically, he recommended that adult vocational qualifications should satisfy a set of design principles, one of which was that they should allow for a proportion of locally-specified standards. He described this as a ‘core and options’ model. This would enable qualifications to meet the needs of specific industries, small- and medium-sized businesses, and individuals, without requiring a proliferation of bespoke (but only marginally different) qualifications.
He also recommended that the UK Commission for Employment and Skills should work with employers to agree the future model for occupational standards, and that (in England) the same occupational standard should be used as the basis for apprenticeships, Tech Levels, and adult vocational qualifications.
The QCF
Whitehead followed Wolf in criticising QCF requirements, arguing that awarding organisations should be able to opt-out of certain of them, including requirements concerning unit format and unit sharing. He was also critical of the quality of QCF units based on existing NOS:
The use of the QCF has compounded the problem. NOS have to be rewritten into a standard QCF unit format and these units are added to the QCF unit databank. There is no quality assurance process to check these units reflect the initial NOS, are written clearly or are of an appropriate quality. The conversion of NOS into units adds a step to the development process. The approach of using a standard QCF unit format was introduced so that individuals could change vocational qualifications and could transfer between awarding organisations more easily, avoiding unnecessary repetition of training. In practice, there is little evidence that the units system has resulted in individuals transferring between awarding organisations. Instead, the unit format has resulted in a databank of units not quality assured and used as building blocks for vocational qualifications. The format has also encouraged a “tick box” approach to curriculum and discouraged assessment that confirmed the overall standard had been reached.
(Whitehead, 2013, page 18)
He argued that the weak link between NOS and QCF qualifications had led to proliferation, citing an example of a single NOS that had been converted into QCF units generating around 140 separate qualifications. His reforms would help to bring down the number of adult vocational qualifications on offer.
Curriculum leadership
Finally, Whitehead strongly supported concerns raised by the CAVTL report over the potential for negative washback on curriculum planning associated with highly prescribed standards and criteria. This level of prescription encouraged providers to treat discrete assessment requirements – detailed performance criteria – as though they laid out a coherent curriculum. Far less prescriptive standards would, he believed, encourage providers to think more carefully about curriculum design, particularly given the freedom it would offer to customise qualifications to the needs of local employers.
Whitehead insisted that the process for developing new standards had to be led by employers, making them more ambitious, aspirational, accessible, adaptable, and innovative. This brought the idea of ‘industrial partnership’ to the fore. Not only would employers be expected to lead the development of standards, they would be expected to influence qualification development too:
Awarding organisations should include employers from relevant sectors directly in the design and development of vocational qualifications, and training providers should bring in employers to support curriculum design and delivery.
(Whitehead, 2013, page 4)
Whitehead suggested that, over time, these employer-led partnerships would come together to take end-to-end responsibility for workforce development in their sectors.
VQ Reform Plan
Insights from all 4 of these reviews were integrated in the government’s ‘Reform Plan’ for vocational qualifications (BIS, 2014). This report helped to clarify the complicated circumstances surrounding the reform of occupational standards, whereby:
- the UKCES continued to develop NOS on behalf of England and the devolved administrations
- 8 Trailblazer groups were developing new standards for apprenticeships in England
- the UKCES had begun to develop characteristics of higher-level occupational standards, along the lines set out in the Whitehead report
The Reform Plan report stated that:
In order to get maximum value from the effort that employers have put into developing new Apprenticeship standards, the Government believes that these should form the basis of any new National Occupational Standards that are developed. We are asking the UK Commission for Employment and Skills to make sure that any new NOS which are produced draw on the content of the relevant new Apprenticeship standard.
(BIS, 2014, page 12)
It explained that legislative change was required to facilitate the transition from apprenticeship frameworks to apprenticeship standards, and that frameworks would continue to be developed during the transition. The UKCES was charged with bringing the devolved administrations fully into the transition programme, to ensure that any new NOS would prove to be satisfactory across the whole of the UK.
In a subsequent statement of intent, the UKCES (2014) confirmed that it would establish greater clarity about what high-level outcome-based NOS might look like, anticipating that a varying spectrum of detail might be required with the move towards one NOS per occupation.
An article in ‘FE Week’ from December 2015 illustrated how the strained relationship between NOS and Trailblazer standards was becoming increasingly problematic (Lindford, 2015). Whitehead was quoted as warning that the Trailblazer process was “out of control” and there was a risk of NOS being bypassed entirely. He still believed that there was an important role for NOS as the (more detailed) foundation upon which the new apprenticeship standards and many vocational qualifications should be based. Conversely, a BIS spokesperson was quoted as saying that, although this was possible in theory, most Trailblazer groups had chosen a different approach.
Sainsbury report
The last report of significance to the future of the CASLO approach in England was the report of the Independent Panel on Technical Education, which was chaired by the former businessman and politician, Lord David Sainsbury (Sainsbury, 2016). The Panel had been established in November 2015 by Nick Boles, Minister of State for Skills, to advise on how to simplify and improve the quality of technical education in England.
Reflecting upon a century of failed reforms, which merely tinkered around the edges, Sainsbury explained that a central feature of an effective technical education system is “a well-understood national system of qualifications that works in the marketplace” (Sainsbury, 2016, page 6). He therefore made qualification reform the basis of his recommendations, insisting that this system should:
- be designed by government, but with “the knowledge and skills, and methods of assessment, for each qualification” (page 6) laid down by industry experts [footnote 7]
- provide clear and simple routes into employment in specific occupations (that is, far fewer routes than currently available)
- be sufficiently flexible to allow learners to change routes, and to accommodate returning adults
- provide a preparatory transition year for students who are not yet ready to embark upon a technical education route post-16
The problem with the current system, Sainsbury argued, was that it was too complex and delivered the wrong skills:
Currently over 13,000 qualifications are available for 16-18 year olds, yet these often hold little value for either individuals or employers, although that may not be obvious until too late. At higher levels, too, technical education qualifications have too often become divorced from the occupations they should be preparing individuals for because there have been no, or only weak, requirements that they meet such needs.
(Sainsbury, 2016, page 8)
Calling for a fundamental shift in technical education, the report made 34 recommendations, which began:
Recommendation 1: We recommend the Government develops a coherent technical education option which develops the technical knowledge and skills required to enter skilled employment, which leads from levels 2/3 to levels 4/5 and beyond, and which is highly valued because it works in the marketplace.
Recommendation 2: The technical education option should be recognised as having two modes of learning: employment-based (typically an apprenticeship) and college-based.
Recommendation 3: While it is necessary for government to design the overall national system of technical education, employer-designed standards must be put at its heart to ensure it works in the marketplace. A single, common framework of standards should cover both apprenticeships and college-based provision. These standards must be designed to deliver the knowledge, skills and behaviours required to perform successfully in specific occupations, not the narrower job role-focused needs of individual employers.
(Sainsbury, 2016, page 17)
Ultimately, these recommendations led to the development of college-based T Levels, which at Level 3 sat alongside academic A levels and employment-based Apprenticeships.
The Sainsbury report was clear that NOS should not be the basis for technical education qualifications in the new system, as these were: “derived through a functional analysis of job roles and this has often led to an atomistic view of education and a rather ‘tick-box’ approach to assessment” (Sainsbury, 2016, page 17). This underscored the principle that technical qualifications should be derived from the same (new) standards as apprenticeships. Learners following a college-based route would therefore develop essentially the same competencies as those working towards an employment-based apprenticeship.
Also of relevance to the CASLO approach, Sainsbury recommended that every technical education qualification should be assessed using realistic tasks and synoptic assessment – to test a student’s ability to integrate and apply their knowledge and skills – and recommended that all qualifications should include external assessment. Again, the idea of synoptic assessment was a response to concerns over the atomised approach to assessment associated with NOS-based NVQs, with its potential for negative washback impact on teaching and learning.
Recommendations from the Sainsbury report were accepted unequivocally (BIS & DfE, 2016). In the future, options for 16+ students would be either technical or academic. The future of existing qualifications that straddled both academic and technical pathways therefore hung in the balance.
Policy post-2010
Although none of these post-2010 policy reviews focused specifically on the CASLO approach, they embedded within TVET policy discussions concerns that had previously remained largely within the academic literature. Wolf focused primarily upon threats to qualification standards, which she associated with the mastery requirement, compounded by perverse incentives (linked to funding and performance tables) that risked undue lenience. Richard focused primarily upon the threat of negative washback impacts upon teaching and learning, which included spending too much time assessing and not enough time teaching and learning, as well as the threat of not developing a sufficiently integrated, holistic competence. The CAVTL report also recognised these threats, although it did not necessarily implicate the CASLO approach, per se. It merely noted a tendency for teachers and trainers to treat CASLO qualification specifications as though they represented learning programmes, without appreciating that they were simply the foundation for planning curriculum and pedagogy. It concluded that colleges and training providers needed to reassert their ownership of curriculum and pedagogy. Whitehead echoed concerns regarding the risk of learners not developing a sufficiently integrated, holistic competence. So, too, did Sainsbury, a few years later.
For 3 of these reports – Richard, Whitehead, and Sainsbury – the risk of learners not developing a sufficiently integrated, holistic competence could be mitigated by basing apprenticeships and vocational qualifications upon (the same) new, short and easily understandable, employer-led standards. These new standards would capture the spirit of competence succinctly, rather than the letter of competence comprehensively. In 2017, responsibility for overseeing the development of these new standards fell to the Institute for Apprenticeships (IfA), which subsequently became the Institute for Apprenticeships and Technical Education (IfATE). As the IfA was launched, the UKCES was wound up. The NOS system continued to service the devolved administrations, albeit with no formal input from England.
Three of the reports – Wolf, Richard, and Sainsbury – recommended that assessment should be at least partly external. This was an indirect criticism of the CASLO approach, as it tends to be associated with continuous or phased centre-based or work-based assessment. The 3 reports argued that external assessment was necessary to ensure the consistent application of national standards, although the DfE subsequently characterised this more in terms of requiring a comparable level of challenge between vocational and academic qualifications, which was also part of the DfE justification for synoptic assessment. Whitehead also recognised the importance of externality, but argued that this could be ensured by effective external verification.
Both Wolf and Whitehead expressed dissatisfaction with the QCF on numerous fronts. Wolf argued that it was not servicing the needs of 14 to 19 vocational learners. Whitehead made the same argument in relation to adult vocational learners. The core characteristics of the CASLO approach were implicated in this. Having said that, it is worth noting that the reports were not critical of all CASLO qualifications. For instance, Wolf acknowledged repeatedly that Level 3 BTECs were valued in the labour market and by higher education, and provided very high positive returns to learners. The vast majority of BTECs were based entirely upon the CASLO approach at that point in time, as they had been for many years.
Post-2010 regulatory decisions
Ofqual began operating as a discrete entity in May 2008, albeit still technically located within the QCA. Once legislation came into force, in April 2010, Ofqual began operating as a fully independent regulator. Ofqual inherited from the QCA the National Qualifications Framework and associated regulations, which governed GCSEs, A levels, NVQs, and other qualifications. It also inherited the Qualifications and Credit Framework, having published regulations for the QCF in August 2008. Ofqual began recognising awarding organisations and QCF qualifications, anticipating that the vast majority of vocational qualifications regulated under the NQF would transfer to the QCF by the end of 2010.
The shift away from the NQF prompted a review of Ofqual’s regulatory approach. Following a series of consultations, Ofqual adopted a new regulatory approach in May 2011. Its focus was on awarding organisations, and its intention was to ensure that all recognised awarding organisations became fully responsible for the quality, standards, and value for money of their qualifications. Two regulatory documents were central to this approach:
- the Criteria for Recognition of awarding organisations (Ofqual, 2011c)
- the General Conditions of Recognition (Ofqual, 2011d)
All regulated qualifications were now regulated under these conditions, although certain qualifications were also regulated under a series of ‘Additional Regulatory Documents’ that remained in force, including:
- the GCSE, GCE, Principal Learning and Project Code of Practice
- the NVQ Code of Practice
- the Regulatory Arrangements for the Qualifications and Credit Framework
This meant that NQF Statutory Regulations were superseded by the General Conditions of Recognition (GCR), while QCF regulations remained in force alongside the GCR. The NVQ regulations remained in force as NVQs were gradually being transferred into the QCF.
Withdrawing the QCF
Although early evaluations had identified teething problems with the QCF (Ofqual, 2011b), and although the status of the QCF required careful consideration when developing Ofqual’s new regulatory approach (Ofqual, 2009a), there was no suggestion prior to publication of the Wolf report that the fate of the QCF might hang in the balance. Even in 2013, the Minister for Skills continued to support the role of the QCF in supporting adults who required tailored learning programmes, as well as adults who required small, accessible, cost-effective units of learning, such as offenders and unemployed people (DfE & BIS, 2013).
Having reflected on the reports by Wolf, Richard, and Whitehead, and other reports too, Ofqual commissioned an internal review of the QCF toward the end of 2013. In July 2014, it released a consultation on withdrawing its regulatory arrangements, which included details of the internal review as an appendix (Ofqual, 2014b).
Influenced by the post-2010 policy reports, Ofqual’s review recognised that the QCF was not achieving its intended positive outcomes and, worse still, had resulted in certain unanticipated negative consequences. This included concern that QCF regulations were incentivising the development and delivery of qualifications that were neither meeting the needs of the relevant sector nor were assessed appropriately. The review was also cognisant of the direction of travel of recent DfE policy decisions related to grading, synoptic assessment, and end-point assessment, none of which aligned to QCF design rules. The review also recognised Ofqual’s new statutory objectives, which were not in force when the 2008 QCF regulations were being drafted, and which stressed that Ofqual’s primary role was to uphold the validity of qualifications and assessments.
The most fundamental criticism of the QCF, which was highlighted by the review, concerned the relationship that it had established between units and qualifications:
For many, a qualification should add up to more than the sum of its parts in a way that a set of accumulated units does not. For a number of stakeholders, from the time when the QCF was launched, this approach was damaging and contributed to the destruction of established and well-regarded qualifications. Stakeholders who were involved in the development process talk about having to break down qualifications to try to ‘shoe-horn’ the components into the unit template in order to get the qualification onto the QCF. Many also take the view that there is something which is educationally flawed in this approach to the creation of qualifications and that in starting with the unit, what’s lost is the sense of the whole qualification being worth more than the sum of its parts
(Ofqual, 2014b, page 57)
Illustrating this anomaly, some of the organisations that submitted units to the QCF unit bank were not actually awarding organisations, and were not recognised by Ofqual. Indeed, Ofqual had no role in quality assuring units. Furthermore, around 10% of submitted units had not actually been used by an awarding organisation. Finally, although QCF rules of combination had been established to ensure the internal coherence of unit-based qualifications, they appeared not to be working, as too many qualifications appeared to be little more than a “bundle of units” with no relevance to employers and no value to learners (Ofqual, 2014b, page 68). The potential problems associated with introducing a heavily unitised system – which had been well rehearsed prior to the introduction of the QCF (see Unwin, 1999) – had now come home to roost.
In addition to other fairly fundamental concerns for the viability of the framework – including problems arising from unit sharing, limited evidence for the utilisation of credit transfer, and so on – the review identified more specific issues of direct relevance to the CASLO approach. These included:
- the risk that specifying units in terms of learning outcomes might have turned assessment into a mere box-ticking exercise
- the risk that specifying qualifications in terms of units (and, in turn, learning outcomes) might work against synoptic, end-point assessment
- the risk of over-assessment
The review was even firmer in its critique of the QCF mastery requirement:
For competence-based qualifications, and particularly those related to a licence to practise, the mastery model is not only common but many would consider essential. The often-quoted example is of the airline pilot; we all need to have confidence that she can land the plane as well as take-off and fly it. For other types of qualification, and there are many of them on the QCF, the mastery model is not appropriate and again raises issues about the rigidity and inflexibility of the QCF.
(Ofqual, 2014b, page 69)
The review also noted that the CASLO mastery requirement effectively proscribes certain assessment approaches that are normally premised upon sampling of qualification content, including multiple-choice tests.
Ultimately, the review concluded that the QCF regulatory arrangements were not fit for purpose, and should be withdrawn. Awarding organisations should no longer be constrained by highly prescriptive design rules. Instead, they should be required to develop coherent qualifications that would be judged primarily in terms of validity.
The consultation document indicated that Ofqual agreed with conclusions from the review and recommended that the QCF regulatory arrangements should be withdrawn. High quality QCF qualifications should continue to thrive – regulated solely under the General Conditions of Recognition – but invalid qualifications would need to be amended or withdrawn. Gone, too, would be the bank of units from which awarding organisations could draw. The new system would revolve around qualifications, not units, for which awarding organisations would be solely responsible:
From now on, we will be clearly placing validity at the centre of our approach to regulation: a qualification as a whole must be valid, not just the individual units within it.
(Ofqual, 2014b, page 9)
The consultation document also came down strongly against the idea of requiring all qualifications to adopt a mastery approach:
Our proposals on assessment will also make it possible to move away from the mastery approach required of all QCF-type qualifications and to provide for compensation. This will mean that for some qualifications, a student’s real strength in one area may be able to compensate for comparative weakness in another. We judge that this is likely to have a beneficial effect on all students and for many types of qualification will result in fairer outcomes.
(Ofqual, 2014b, page 27)
In December 2014, Ofqual announced its post-consultation decision to withdraw the QCF regulatory arrangements (Ofqual, 2014c). Ofqual’s proposals for implementing new arrangements included developing a new qualifications framework that would be less prescriptive and more descriptive (Ofqual, 2015a). This framework – which was to become known as the Regulated Qualifications Framework (RQF) – would encompass all regulated qualifications. Ofqual also announced its intention to withdraw all residual NVQ regulations. The RQF was introduced in September 2015 (Ofqual, 2015b). Regulated qualifications in England were no longer required to adopt any of the 3 core characteristics associated with the CASLO approach (related to outcomes, criteria, and the mastery principle).
Commitment to ‘strengthen’ VTQs
Evidence of cohort-level results improving steadily over time inevitably raises questions concerning grade inflation. Perhaps the cohort is not improving after all. Perhaps the qualification standard has fallen. During the noughties, steadily improving cohort-level results raised serious concerns over grade inflation at GCSE and A level. In 2012, Ofqual intervened to address these concerns (Newton, 2022). By 2015, similar concerns had begun to be expressed regarding steadily improving cohort-level results in Level 3 BTECs (HEFCE, 2015). Ofqual developed a complex statistical methodology for investigating this possibility, which triangulated data from a range of sources. The research explored outcomes for 4 cohorts of students between 2005 and 2015, restricting the analysis to ‘old-style’ BTECs, that is, to BTECs that were based entirely on the CASLO approach.[footnote 8] The report provided strong evidence of grade inflation, questioning the extent of genuine improvement in cohort achievement over time (Cuff, Zanini, & Black, 2018).
This research report was published in December 2018, not long after Ofqual had announced its intention to regulate VTQs (particularly those that featured in school and college performance tables) with “the same seriousness and focus as we do general qualifications” (Ofqual, 2018a, page 5). The report speculated that the grade inflation might have occurred as a result of BTECs adopting the CASLO approach, with 100% centre assessment, relatively weak controls over qualification standards, and perverse incentives created by accountability mechanisms:
These findings might be explained by differences in the marking/awarding processes that exist for ‘older style’ Level 3 BTECs and A levels. For example, while a compensatory approach is taken for A levels, ‘older style’ Level 3 BTECs are graded according to firm criterion referencing (firm in the sense that candidates must be deemed to have achieved all pass criteria to achieve a pass, and all merit criteria to achieve a merit, etc.). As this approach does not allow for any adjustment of grade boundaries (there are no ‘marks’), these criteria set the standard, and so become the method for standards maintenance. Arguably, because of accountability measures, teachers involved in grading have a vested interest in increasing outcomes over time, which this method cannot control for. Ultimately, this method is vulnerable to pressures of grade inflation.
(Cuff, et al, 2018, page 14)
Indeed, the report speculated that similar issues might arise for other CASLO qualifications operating within similar contexts. Ofqual concluded that there was a general case for strengthening controls over internal assessment in VTQs, particularly under the pressure of school and college performance table accountability (Ofqual, 2018b).
This research was acknowledged by the DfE, which concluded that there was a likelihood that Level 2 qualifications were also vulnerable (DfE, 2019). The DfE welcomed the programme of work that Ofqual had set in train to strengthen controls in VTQs.
With a view to harmonising its regulatory requirements with the DfE’s pre-existing design requirements for performance table qualifications, Ofqual established a programme of work that focused specifically on Level 1/2 Technical Awards taught to 14 to 16 year olds in key stage 4. This led to a set of decisions that were to be operationalised through new Qualification Level Conditions (Ofqual, 2020). Of particular relevance to the CASLO approach, these new regulations specified that awarding organisations should use a numerical, mark-based approach to both exam and non-exam assessment components (albeit with an option to apply for exemption). Explaining its rationale for prohibiting use of the CASLO approach within centre-based assessment components, Ofqual noted:
In addition, we think that this will enable awarding organisations to have adequate control over marking judgements made in centres, not least as it will provide greater scope for any adjustments to the marking standard that an awarding organisation might seek to make through their moderation process.
(Ofqual, 2020, page 20)
This was not the first time that Ofqual had prohibited the CASLO approach for particular qualification types (by requiring numerical marking). For instance, in the T Level context, the draft ‘Technical Qualification Conditions and Requirements’ that accompanied Ofqual’s consultation on rules and guidance (Ofqual, 2018c) specified that candidates’ performances had, in each assessment, to be differentiated by the allocation of numerical marks. The same decision had been reached for Essential Digital Skills Qualifications (Ofqual, 2019). In theory, numerical marking increases the control that an awarding organisation has over qualification standards in 2 ways:
- consistently lenient or harsh centre-based marking can be tackled by moderating centre marks down or up accordingly
- where standards appear to be out of alignment from year to year, at the cohort level, this can be tackled by raising or lowering grade boundaries accordingly
Yet, it is also important to acknowledge that circumstances do not always permit awarding organisations to capitalise upon these controls, especially when relatively small cohort sizes compromise the use of statistical modelling.[footnote 9]
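The contrast between the two aggregation principles can be made concrete with a short sketch. This is purely illustrative: the criterion codes, marks, and grade boundaries below are hypothetical, not drawn from any actual qualification. It shows how mastery aggregation caps a grade whenever any single criterion is missed, whereas numerical marking lets strengths compensate for weaknesses and leaves boundaries available for adjustment.

```python
# Illustrative sketch of two aggregation principles (hypothetical data).

def mastery_grade(achieved: dict[str, set[str]]) -> str:
    """CASLO-style grading: a grade is awarded only if ALL criteria
    at that level (and at every level below it) have been achieved."""
    order = ["pass", "merit", "distinction"]
    required = {"pass": {"P1", "P2", "P3"},
                "merit": {"M1", "M2"},
                "distinction": {"D1"}}
    grade = "fail"
    for level in order:
        if required[level] <= achieved.get(level, set()):
            grade = level
        else:
            break  # missing any one criterion caps the grade here
    return grade

def compensatory_grade(marks: list[int], boundaries: dict[str, int]) -> str:
    """Mark-based grading: strong components offset weak ones, and the
    boundaries can be raised or lowered to maintain standards."""
    total = sum(marks)
    for grade, boundary in sorted(boundaries.items(),
                                  key=lambda kv: kv[1], reverse=True):
        if total >= boundary:
            return grade
    return "fail"

# A candidate who misses one merit criterion is capped at a pass...
print(mastery_grade({"pass": {"P1", "P2", "P3"}, "merit": {"M1"}}))  # pass

# ...whereas under numerical marking a weak component can be offset.
print(compensatory_grade([18, 9, 15],
                         {"pass": 25, "merit": 35, "distinction": 45}))  # merit
```

Under the compensatory scheme, an awarding organisation could respond to apparent grade inflation simply by raising the boundary values; under the mastery scheme there is no equivalent lever, since the criteria themselves set the standard.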
Regulation post-2010
Ofqual’s decisions to withdraw the QCF and to strengthen VTQs were made independently of government, although they clearly:
- echoed concerns expressed in the post-2010 policy reviews, and
- ensured that regulatory qualification requirements were appropriately aligned to existing departmental qualification requirements and policies
The decision to withdraw regulatory arrangements for the QCF (alongside residual NVQ regulations) meant that the regulator no longer required any regulated qualification to adopt the CASLO approach. If an awarding organisation was to adopt the CASLO approach in the future, then it would be their choice to do so (although this decision might be influenced by a key stakeholder, such as a professional body).
In a number of instances, however, Ofqual decided that awarding organisations should not have this choice, particularly where there might be a strong perverse incentive for lenience. This included key stage 4 Technical Awards, T Level Technical Qualifications, and Essential Digital Skills Qualifications. It is worth noting that this has not become a general policy stance following the withdrawal of the QCF. For instance, there is no requirement for numerical marking of centre-based assessments within recently published Qualification Level Conditions for Alternative Academic Qualifications (Ofqual, 2023).
One final point to note is how Ofqual followed Dearing and Wolf in not exploring the relationship between mastery certification and mastery learning, that is, in not discussing the potential value of mastery learning for students on general or vocational courses. The consultation accepted that for competence-based qualifications – particularly those leading to a licence to practice – mastery was often considered essential (Ofqual, 2014b). Similarly, where consultation responses addressed this issue, they made the same point, particularly for certain sectors like health and social care (Pye Tait Consulting, 2014). Yet, the idea of mastery as a philosophical principle in its own right was not discussed.
Down but not out
There are now, in the mid-2020s, fewer regulated CASLO qualifications than there were a decade ago. However, it is unclear exactly how many CASLO qualifications Ofqual currently regulates – whether full or hybrid – as this information is not collated centrally. Although the approach is prohibited for certain qualification types, it is still permitted in many instances, and it is still viewed positively by many awarding organisations, particularly those dealing with competence-based qualifications (akin to NVQs). Indeed, there are even isolated examples of outcome-based approaches within current General Qualifications and national curriculum assessments, including the A level science endorsement of practical skills, and teacher assessment of writing at key stage 2. It is fair to say that the CASLO approach has fallen out of favour with policy makers over the past decade or so. But, while down, it is certainly not out.
1. These categories were defined as follows (see DfE, 2017, page 5):
   - Technical Awards – high quality Level 1 and 2 qualifications that equip 14 to 16 year olds with applied knowledge and practical skills.
   - Technical Certificates and Tech Levels – Level 2 and 3 qualifications that equip post-16 students with the knowledge and skills they need for skilled employment or for further technical study.
   - Applied General qualifications – Level 3 qualifications for post-16 students who wish to continue their education through applied learning.
2. Although he acknowledged that the test might also assess knowledge and understanding if required by the industry in question.
3. Guidance from 2014 onwards would refer to 3 core constructs – skills, knowledge, and behaviours.
4. Indeed, some mandated the achievement of an NVQ.
5. This stipulation related only to the standard. The assessment plans would have been longer.
6. In 2008, the Sector Skills Development Agency was replaced by the UKCES and the Federation for Industry Sector Skills and Standards, comprising all 19 sector skills councils.
7. Emphasising the importance of industry buy-in, he stressed that the system would: “only work if industry takes ownership of the content and standards of technical education, and makes certain that companies adhere to them” (Sainsbury, 2016, page 7).
8. Many of these were subsequently reformed to comply with DfE performance table requirements.
9. Of course, prescribing numerical marking also means prescribing a compensatory aggregation principle (as opposed to a mastery one). So, what might seem like a relatively minor technical assessment requirement is actually a more fundamental one, with implications for curriculum, pedagogy, and assessment, as well as for certificate interpretation and use.