Ofqual publishes reports relating to 2017 GCSEs, AS and A levels
Our summer report is being published alongside official statistics on reviews of marking and moderation, and a range of associated analyses.
Ofqual has today (14 December 2017) published its review of the 2017 summer exam series, during which a range of new GCSEs and A levels were awarded for the first time. This report provides a broad overview of exam planning, exam administration, marking, the awarding process and what happened post-results.
As shown in our infographic, around 14.1 million scripts were generated by approximately 1.4 million candidates this summer. Around 2,200 different GCSE, AS and A level exams were taken and these were marked by approximately 63,000 examiners, leading to 6.6 million certifications.
Overall, GCSE and A level results were stable, and the degree of variation in year-on-year results for individual schools and colleges was similar to previous years. The changes made to special consideration provisions this year ensured that students affected by the tragic events of the summer were treated appropriately.
The vast majority of question papers were, as in previous years, error free. In the small number of cases where serious errors occurred, we monitored the exam boards’ handling to make sure, as far as possible, each affected student was given the fairest result. And we launched a review of teacher involvement in developing exam papers given the impact on affected students and public confidence of two well-publicised incidents. We have issued an update on this work today.
Reviews of marking and moderation
We have published official statistics on reviews of marking and moderation for GCSE, AS and A level exams today. This was the second summer when revised, fairer rules applied, such that marks should only be changed to correct a marking error and not because of legitimate differences in opinion between two markers.
Overall, 99% of all AS and A level grades, and 98.6% of all GCSE grades, were unchanged in England this year after the conclusion of any review. The number of grades challenged increased from 346,920 last year to 369,215 this year (+6%). Of the 6.6 million qualification grades issued in 2017, 88,505 GCSE, AS and A level grades were changed, compared with 63,345 in 2016. The proportion of all qualification grades that were changed by 2 or more grades in 2017 was less than 0.03%.
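The headline figures above can be cross-checked with a few lines of arithmetic. This is a minimal illustrative sketch: the numbers are taken directly from this release, and the variable names are our own.

```python
# Sanity-check of the review-of-marking figures quoted in this release.
challenged_2016 = 346_920      # grades challenged in summer 2016
challenged_2017 = 369_215      # grades challenged in summer 2017
changed_2017 = 88_505          # grades changed after review in 2017
total_grades_2017 = 6_600_000  # qualification grades issued in 2017

# Year-on-year rise in challenges, reported as +6%
rise = (challenged_2017 - challenged_2016) / challenged_2016
print(f"Challenges rose by {rise:.1%}")  # ~6.4%, rounded to +6% in the text

# Changed grades as a share of all grades issued
share_changed = changed_2017 / total_grades_2017
print(f"{share_changed:.2%} of all grades were changed")  # ~1.34%
```

The ~1.34% of grades changed is consistent with the statement that 99% of AS and A level grades, and 98.6% of GCSE grades, were unchanged after review.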
The data indicate that the rise in grades changed after review this year stemmed principally from an increase in the number of successful review requests in new and legacy versions of GCSE English language and English literature. This is partly explained by a significant increase in GCSE entries in these subjects this year, with fewer students taking alternative qualifications. However, the proportion of successful grade changes has also risen.
The evidence from a range of original marking and review of marking data points to variation in some exam boards’ efforts to embed the revised rules for reviews, rather than issues with original marking. The data suggest Pearson was more successful than the other boards at embedding the rules for reviews of marking.
Sally Collier, Ofqual’s Chief Regulator, said:
Overall, this year’s exams have been carefully planned, effectively managed and successfully delivered by the exam boards. From our initial analysis, it appears that some of the exam boards have not done enough to change old practices and meet our new rules around reviews of marking. We expect all exam boards to comply with our rules at all times. We are currently looking at where more could and should be done and will consider what form of regulatory action may be appropriate. We will not require exam boards to reconsider the outcomes of the reviews they have undertaken this year, so students’ awards following review will stand.
Comparability between exam board qualifications and the maintenance of standards over time
A further 3 reports published today cover our work to ensure standards are maintained between exam board qualifications in the same subject, and over time.
Ahead of the summer, we considered how standards should be maintained in the first awards of new 9 to 1 GCSEs, given anticipated changes in cohort entries. We decided that predictions would be based on previous GCSE outcomes only.
After exam papers have been marked, we monitor selected GCSE and A level awards to ensure that grade standards within subjects are in line across exam boards. We found this to be the case in 2017.
And this year we also looked at the difficulty of the live GCSE maths questions compared with the sample questions that were accredited. Our research indicates that the exam boards produced papers containing questions of similar difficulty to their sample assessment materials and to each other this summer.
National Reference Test
We are also publishing further details of the first national reference tests in English and maths, which were conducted earlier this year. We expect that it will be 2019 at the earliest before exam boards start to use the information from the tests when they award GCSEs. At that point, we will publish the outcomes alongside GCSE results.
National Assessments regulation: annual report 2017
Finally, we are issuing our annual report on the regulation of national assessments. It provides assurance that the Standards and Testing Agency (STA) took an appropriate approach to making sure that performance standards were effectively maintained for 2017 key stage tests. We will continue to focus on key aspects of assessment validity and to monitor STA’s response to our findings.