Assessment strategies (Condition EDSQ3)
Rules and guidance about assessment strategies for Essential Digital Skills Qualifications
- EDSQ3.1 In respect of each EDS Qualification which it makes available, or proposes to make available, an awarding organisation must –
- (a) establish and maintain an assessment strategy for that qualification,
- (b) ensure that the assessment strategy complies with any requirements which may be published by Ofqual and revised from time to time, and
- (c) have regard to any guidance in relation to assessment strategies which may be published by Ofqual and revised from time to time.
- EDSQ3.2 In particular, an awarding organisation must ensure that the assessment strategy for an EDS Qualification sets out how the awarding organisation intends to secure, on an ongoing basis, compliance with its Conditions of Recognition in respect of the assessments for that qualification.
- EDSQ3.3 An awarding organisation must ensure that all assessments for an EDS Qualification which it makes available, or proposes to make available, are designed, set, delivered and marked in compliance with its assessment strategy for that qualification.
- EDSQ3.4 An awarding organisation must –
- (a) keep under review its assessment strategy for an EDS Qualification, and revise it where necessary, so as to satisfy itself that the assessment strategy meets at all times the requirements of Conditions EDSQ3.1 and EDSQ3.2,
- (b) review that assessment strategy promptly upon receiving a request from Ofqual to do so, and subsequently ensure that its assessment strategy complies with any requirements that Ofqual has communicated to it in writing, and
- (c) promptly notify Ofqual of any revisions made by it to that assessment strategy.
- EDSQ3.5 An awarding organisation must –
- (a) upon receiving a request from Ofqual to do so, demonstrate to Ofqual’s satisfaction that it has complied with its assessment strategy for an EDS Qualification in respect of any particular assessment for that qualification, or provide an explanation to Ofqual as to why it has not so complied, and
- (b) give effect to any recommendation that Ofqual may make in respect of its compliance with its assessment strategy.
Condition EDSQ3.1(a) requires an awarding organisation to establish and maintain an assessment strategy for each EDS Qualification which it makes available or proposes to make available. Condition EDSQ3.2 requires an awarding organisation to ensure that the assessment strategy for an EDS Qualification sets out how the awarding organisation intends to secure, on an ongoing basis, compliance with its Conditions of Recognition in respect of the assessments for that qualification.
Condition EDSQ3.1(b) requires an awarding organisation to ensure that the assessment strategy for an EDS Qualification complies with any requirements in relation to assessment strategies published by Ofqual. We set out our requirements for the purposes of Condition EDSQ3.1(b) below.
General requirements
An assessment strategy for an EDS Qualification must provide a comprehensive picture of the steps and approach an awarding organisation will take to secure compliance with its Conditions of Recognition in relation to the design, delivery and marking of assessments for, and the award of, that qualification.
An assessment strategy must present a logical and coherent narrative that includes clear and concise evidence demonstrating how an awarding organisation will seek to ensure that the qualification, and the assessments for it, are fit for purpose. In particular, it must include information and evidence to show how the awarding organisation promotes and acts on feedback between the different stages of the qualification development cycle so as to continuously improve the assessments for that qualification.
Detailed requirements
We set out below our detailed requirements on the specific information and evidence an awarding organisation must include in its assessment strategy. The amount of information and evidence that can be included may vary across the areas identified, depending on the relevant point in the qualification development cycle to which a particular item pertains and the extent to which Ofqual has determined the regulatory approach in relation to a particular issue.
These detailed requirements are intended to indicate the minimum items that an assessment strategy must include. They are not intended to provide a template specifying the form that it must take, since the optimal structure and content of an assessment strategy will depend on the approach that is being proposed by the awarding organisation.
Section 1: Design
1.1 Coverage of EDS National Standards
Approach to coverage of the EDS National Standards, including:
- how the EDS National Standards have been expanded as appropriate in the specification
- the skills statements included from the EDS National Standards and how these reflect the purpose of the qualification
- the weightings for different skill areas of the EDS National Standards overall (and in each assessment if more than one)
- the approach to targeting different skill areas and aspects of the EDS National Standards, for example, whether integrated/synoptic or direct/focused
1.2 Qualification structure
Details of how the qualification and assessments will be structured and a rationale for the approach, for example in terms of covering the EDS National Standards effectively, and balancing reliability and manageability, including:
- number and weighting of assessments
- approach to content coverage across assessments (if more than one)
1.3 Availability of assessments
- Approach to availability of assessments, including:
  - number of assessments to be available
  - type of assessments (for example, onscreen, online, paper-based or a combination of these)
  - nature of opportunities (for example, on-demand or sessional)
  - duration for which assessments will be available
  - approach to Learners taking an assessment again
- In light of the approach to assessment availability, and having regard to the guidance for on-demand assessment, any specific risks that have been identified, how these will be mitigated, and how particular challenges will be addressed, including:
  - ensuring comparability of assessments
  - minimising predictability of assessments
  - ensuring security of assessments
Examples of relevant Conditions here include Conditions EDSQ7.1, D1, E4.2, G1, G9.1-G9.2, H2 and H3.
1.4 Task types and mark schemes
For each assessment:
- details of the range and balance of task types to be used (for example, multiple-choice, short answer, extended response) and how these will support valid assessment of the EDS National Standards
- approach to mark scheme design, including for different task types, and an explanation of how resulting mark schemes will support reliable application by assessors
- a sample of example tasks and associated mark schemes, representing the range to be used in assessments – these may or may not be taken from any sample assessment materials
- commentaries explaining the approaches taken in the sample of example tasks and mark schemes
1.5 Assessment time
Assessment time overall, and for each individual assessment if more than one, and a rationale for this, for example in terms of covering the EDS National Standards effectively, and balancing reliability and manageability.
1.6 Number of marks
Number of marks overall, and for each individual assessment if more than one, and a rationale for this, for example in terms of covering the EDS National Standards effectively, and balancing reliability and manageability.
Section 2: Delivery
2.1 Developing assessment materials
Process for developing assessment materials, including the different stages and personnel involved, how evidence regarding the functioning of previous assessments is used, and any differences by assessment type.
Examples of relevant Conditions here include Conditions EDSQ1.1, EDSQ7.1, D1, D3, E4.2, G1, G3 and G9.1.
2.2 Assessment setting arrangements
Approach to training individuals who will be responsible for setting assessments and/or items, including ensuring security and mitigating any conflicts of interest.
2.3 Assessor standardisation
Approach to training and standardising Assessors, including details of standardisation procedures and any wider training.
2.4 Marking process
Explanation of how marking processes will operate.
2.5 Monitoring marking
Processes in place to monitor accuracy and consistency of marking and issuing of results, and to take remedial action where necessary.
2.6 Malpractice and security arrangements
How malpractice will be addressed and security of assessments will be ensured, including any differences by assessment.
Section 3: Centres
3.1 Centre assessment
- Approach to whether Centre-adaptation and/or Centre marking will be permitted
- An explanation of the rationale for this and how any risks will be managed, for example in relation to authenticity of Learners’ work and accuracy of Centres’ marking.
Examples of relevant Conditions here include Conditions EDSQ7.1, C1, C2.1-C2.3, D1, E4.2, G1, G3 and G9.
3.2 Centre guidance and training
Approach to the provision of guidance and training to Centres around Centre-adapted and Centre-marked assessments, in particular covering:
- guidance around adapting assessments
- approach to reviewing Centre-adapted assessments
- training in relation to application of assessment criteria
3.3 Approach to marking
- The steps taken to identify the risk of any Adverse Effect which may result from the awarding organisation’s approach to marking assessments (and to Moderation and monitoring where appropriate).
- Where such a risk has been identified, the steps taken to prevent that Adverse Effect or, where it cannot be prevented, to mitigate that Adverse Effect.
3.4 Centre monitoring arrangements
Approach to monitoring Centres in relation to assessments, including how this will ensure assessments remain fit for purpose on delivery.
3.5 Moderation of Centre-marked assessments
Approach to Moderation of Centre marking, where relevant.
Section 4: Standard setting and maintenance
- Approach to ensuring decisions in relation to standard setting follow an appropriate technical methodology and have appropriate scrutiny, including:
  - an explanation of the technical methodology employed in the process, including the personnel involved and their roles
  - an explanation of how the decisions from the process are approved within the awarding organisation and the personnel involved in this
- Approach to ensuring decisions in relation to standard setting are based on an appropriate range of qualitative and quantitative evidence, including:
  - details of the range of evidence used to inform decisions and the weight given to different sources
  - a rationale for why this approach is optimal, in light of the assessment design and approach
- Approach to ensuring decisions in relation to standard setting promote comparability, over time and between awarding organisations, and are kept under review, including:
  - details of how comparability between different versions of assessments and, where relevant, different types of assessment (for example, onscreen, online, paper-based or a combination of these) is ensured, both where these are available at the same time and on an ongoing basis
  - for on-demand assessments, details of how and when remedial action is taken when emerging evidence regarding an existing assessment suggests previous decisions in relation to standard setting may need reconsidering
  - details of how evidence generated in line with any requirements set by Ofqual under Condition EDSQ8.2(a) in relation to inter-awarding organisation comparability will be used to inform decisions on standard setting
Examples of relevant Conditions here include Conditions EDSQ8.2-EDSQ8.5, D1 and H3.
Condition EDSQ3.1(c) allows us to specify guidance in relation to assessment strategies for EDS Qualifications. We set out our guidance for the purposes of Condition EDSQ3.1(c) below.
Specific risks to the maintenance of standards apply in relation to on-demand assessments. If an awarding organisation chooses to offer this type of assessment, we will expect the assessment strategy to set out how the awarding organisation will identify and deal with risks to assessment design, delivery, monitoring and awarding.
In relation to assessment design, an awarding organisation should consider:
- how many test versions there are,
- how the number of test versions ensures that Learners do not sit the same assessment more than once,
- how the number of test versions mitigates risks around Centre malpractice (including explicit sharing of test versions and inappropriate use of test versions as a basis for teaching),
- how test versions are developed to ensure comparability, and
- how the replacement of test versions interacts with the number of test versions available, and the approach to re-sitting where tests are replaced.
In relation to delivery, an awarding organisation should consider:
- how Learners are entered for assessments (i.e. individually or as a group),
- how Learner evidence is Authenticated,
- the lead-in time from an entry being made to the test being taken,
- the modes of assessment available to Learners and why those modes are available,
- the arrangements in place for invigilating test sittings, including who is permitted to invigilate and/or handle the tests,
- how Learners access on-screen tests,
- the security arrangements that are in place while on-screen tests are being taken (for example, if Learners can access other computer programmes and/or the internet while taking the test, how this access is monitored),
- how paper-based tests are delivered to Centres,
- how far in advance of test sittings paper-based tests are delivered to Centres,
- the controls around the storage and destruction of paper-based tests in Centres,
- what happens to paper-based tests that are not used by Centres (for example, if a Learner is absent),
- how test versions are allocated to Learners on their first attempt and on resit attempts,
- how test versions are allocated to groups of Learners within a Centre who are sitting the test on the same occasion or different occasions, and
- how long a test version is available for, and how it will be prevented from being sat a large number of times (within individual Centres and more generally).
In relation to monitoring, an awarding organisation should consider its approach to:
- inspecting or monitoring Centres (up front and on an ongoing basis),
- monitoring or detecting whether any security breaches have occurred (for example, via social media),
- preventing other Learners from sitting a test version where that test version has been breached (for example, by being shared publicly),
- refreshing or replacing test versions,
- monitoring the performance of test versions, and
- monitoring the performance of test items.
In relation to awarding, an awarding organisation should consider:
- when awarding takes place,
- what standard setting method is used,
- what evidence is considered,
- who is involved in the process,
- how comparability is ensured between different test versions, and
- how the outcomes for each test version are monitored.