Speech

Tomorrow's world

A speech to the 2015 Association of Colleges Examinations Officers' conference by Glenys Stacey, Chief Regulator for Ofqual.

Good morning

Back in July 1965, the BBC broadcast the first ever episode of Tomorrow’s World. It was an exciting new show that brought cutting edge science into our homes for the first time. As a child, I remember being transfixed by the extraordinary live studio experiments that were featured. They were commissioned to both educate and entertain, and often the greatest entertainment resulted from their failure! But it was enthralling viewing. The show ran for nearly 40 years, and I was saddened when the programme disappeared from our screens in 2003.

In many ways, Ofqual shares the ethos of Tomorrow’s World. We want GCSE and A level students to be excited by science. We want them to experience the thrill of success in the classroom, and know the implications of failure. We want them to be presented with a rich and diverse menu of experimentation as part of their education. For too long, however, the assessment system has encouraged many excellent teachers to repeat and rehearse the same, narrow group of practicals in order to achieve the best possible grades. Much like we all turn off our televisions at Christmas when presented with a schedule full of repeats, so we risk doing the same to our children. This is wrong and it needs to change.

The problems with the current system reflect the pressures of controlled assessment and performance league tables. All scientists agree that the value of practical experimentation is in the doing and the learning that comes from it - learning from mistakes, or experiencing that light bulb moment, when an experiment suggests something counterintuitive, but true. Those learning opportunities are being constrained at the moment. We have a broken system, one that nobody should wish to see continue.

The science teaching profession generally agrees that the main purpose of practical work is formative, helping students to understand science and how scientific ideas are developed. Teachers commonly articulate a great many purposes for practical work, including: motivation for students; the excitement of discovery; consolidation of theory; development of manipulative skills; knowledge of standard techniques; general understanding of data handling; development of other skills (such as analytic, evaluative, planning, applied and mathematical skills); and development of an understanding of how science works (such as concepts of scientific process, collaborative working, reproducible results and fair testing). Teachers also generally agree that isolated practical tasks, of the type carried out for assessment purposes at GCSE, do not enable students to find out how scientific enquiry really works. Rather than Tomorrow’s World, we have Fawlty Towers: controlled assessment does not support the aims of practical science. Instead, it drives and trammels what is taught, and in many (not all, but many) schools and colleges it reduces learning from practical science significantly.

So where do we go? Last year, following consultation, we decided to adopt written assessment of practical science knowledge at A level. There is strong evidence to suggest that well-written questions can appropriately test candidates’ knowledge of scientific experimentation. Indeed, we have seen good examples of that in the sample assessment materials submitted to us by exam boards as we have accredited the new A levels. We have proposed the same for GCSE science, and our consultation closes today.

We have three aims as part of the reforms we are making to science assessments.

First, we want to increase the amount of practical work students experience. We want to allow and enable teachers to deliver the curriculum aims for the sciences. Currently, GCSE students can do as few as one or two experiments per science. At A level it can be a similarly poor experience. We know that some students going on to university to further their science studies arrive with plainly insufficient practical experience. Our reformed GCSEs will require a minimum - a minimum - of eight practicals to be conducted, and sixteen for Combined Science. The reformed A levels will require a minimum of twelve per subject.

Second, we want to improve the quality of the practical work students are doing and to provide science teachers with the chance to truly integrate practical science into the curriculum, into teaching and learning. We know that practical science is at the centre of scientific enquiry. That practical science - whether it is transmitting a signal along optical fibres, producing and burning hydrogen, or investigating the effects of acid rain on seedlings - is exciting. It can and should inspire young people, hooking them into science in all its glory. But controlled assessment can’t do that anywhere near well enough. Instead, it leads to overly predictable assessments and consequently to time spent by students rehearsing and repeating the experiments most likely to come up. Going through the motions is intensely dull. I have been told enough times, in classrooms across the country, that this experience is stultifying. And of course, it is not a proper experiment if there is no room for error, or for learning from mistakes.

With our planned reforms there will be less pressure to complete each practical in a set time (under exam conditions) and they will be much less contrived. Because there is less time pressure and students are not penalised for ‘getting the wrong results’, both teachers and pupils will find it an altogether more worthwhile experience. We have seen that already in the trials exam boards have run for A levels. Teachers and students participating in those trials have found them stimulating and enjoyable, giving us similar hopes and expectations for GCSEs.

Third, we want assessments to be fair. Of course. It is hard at present to say that the outcomes from non-exam assessments are fair or properly reflect what they should - students’ understanding and abilities in practical science work. Instead, the results reflect the inhibiting nature of these assessments, and the habits now firmly established in a good many schools to make the most of a bad job. The vast majority of students’ marks are bunched at the very top end - so not only are these assessments constraining teaching, they are not even discriminating between students sufficiently well.

We have proposed at GCSE, and adopted at A level, the principle that the only element of practical work that has to be assessed by the teacher is the student’s ability to select the right equipment, use the equipment sensibly, and log the results intelligently – essential technical skills. The results and meaning of the experiments will not be assessed by the teacher – that skill and knowledge will be assessed in the written exam and at A level will be worth 15% of the marks.

We have received enormous support from the teaching community for our proposals. They recognise the incentives they face and are concerned by the impacts they see. Unfortunately, some commentators do not recognise those opinions, and some say we are ‘ending science practicals’ or engaging in a ‘big experiment’. I hope you will see from what we are proposing that this could not be further from the truth.

Others say that removing controlled assessment will lead teachers, under pressure of time and budgets, to simply drop practicals altogether. We have said that at A level there will be a separate practical grade to aim for, and students will be required to keep a logbook of their practical work, to be made available to the exam board on request. Teachers are delighted with this idea. We will also require schools to sign a form confirming to their exam board that each student has completed the practical activities, has used the required apparatus and developed the required techniques. We would intend to take similar precautions with GCSEs, should those in schools responding to our consultation agree.

All that said, qualifications themselves cannot control what happens in schools and colleges, or control altogether what is taught or how it is taught. Instead, we appreciate that those in schools and colleges have responsibilities, as do others in the system. Our hope and expectation is that they will rise to the challenge and take the opportunities the new arrangements provide - to put practical work back where it should be, at the heart of science teaching and learning.

We have not come to these proposals lightly. We have looked at all the available evidence, including international evidence. We have listened to higher education, and understand the skills and experience wanted as undergraduates start their studies. And we have listened above all to teachers and to students. This evidence has led us to rip up the tired, old script, add a huge dollop of improvisation, and look to help teachers inspire the next generation of scientists. That’s what we believe Tomorrow’s World should be like.

Another topic that a great many journalists and commentators have chewed on over the past few months has been the reform of GCSE maths. The new maths curriculum is very different from that which exists today, encompassing broader and more demanding content. At the same time, we must deliver an assessment programme that continues to cater for a wide range of student ability.

Towards the end of last year we accredited the new maths specifications of three of the four exam boards. We have since accredited the fourth. This was by no means a simple tick-box exercise. Since the new content is different from that in the current qualifications, the assessment standard - the level of demand - is very challenging to define in advance. We know we’re aiming to deliver three Michelin stars instead of two, but the dish being prepared is very different.

As a result, we undertook a degree of pre-accreditation preparation like never before. This included developing industry standards, publishing rules, and mapping content to tiers of assessment. This helped to reduce the potential variability in the assessment standards between specifications, but it was not intended to limit valid differences in how each exam board addressed our requirements. We’re asking them to bake a maths cake that is to our taste using the same bag of ingredients, but how they go about it is up to them. To borrow from the Great British Bake Off, this is their ‘technical challenge’.

Even after this initial work, none of the boards’ specifications and materials was considered palatable by our panel of six independent experts on first submission. Some submissions were considered too demanding, the language used in others too complicated. But after extensive, constructive dialogue we believe each board now has an appropriate assessment strategy and that those strategies can be implemented in a way that ensures they can continue to meet our regulatory requirements over time. That is what accreditation means. What we cannot say for certain, however, is that the level of demand - the quality of the bake, if you will - across the qualifications is precisely the same.

In order to reassure ourselves, the exam boards, students and other stakeholders on this matter, we are undertaking a comprehensive research programme. Over the next three months we will be conducting three separate investigations. One will compare the demand of the new exam papers with that of specifications from home and abroad. Another will look at the cognitive processes students have to carry out to answer the papers’ questions. The third will involve large-scale testing of the boards’ new papers by some 4,000 students across England.

We received a huge amount of interest in this latter exercise and achieved our target within days of announcing our intentions. I would like to personally thank all those teachers whose schools have been signed up, and those who indicated their desire to contribute. We will have reviewed all the research programme findings by the end of April. That may seem like a long time, but it is imperative that we get this right. When we do report, the results may lead us to require changes to some board specifications, but the proof will be in the tasting.

I would like finally today to turn my attention to the issue of marking. We, the regulator, can do all we can to ensure students are being inspired in the classroom and exam boards are preparing specifications that meet the necessary level of demand, but it will count for nought if there is a lack of confidence in the final act of assessment.

We have received a great deal of feedback from examinations officers - quite possibly from some of you here today - that schools and colleges just do not have enough trust in marking. We have been told that re-marks are being requested ‘just in case’, rather than because of concern about any particular student’s result. We have been told that despite the significant costs involved, many schools and colleges are willing to gamble because they know the system is flawed and they’ll get their money back if the grade changes. This has the feeling of a one-way bet, where the loser is each and every student who gains a qualification with slightly less worth.

Contrary to these concerns, we know from our own research that the quality of marking is in general good. It would be nigh on impossible to eradicate every error in a system of the scale we have in England, where more than 22 million exam scripts and pieces of coursework were marked last year - we have to expect some mistakes. But we do not expect to see mark changes where a student jumps from a D to a B, for example. The boards have told us that in many cases big grade changes are caused by clerical errors, such as incorrect addition or transcription of marks, rather than poor-quality marking. Irrespective of the cause, these sorts of changes are indefensible.

In order to get a handle on the issue, we have asked exam boards to carry out further analysis of their data. We want more detail around the reasons for grade changes, especially where the changes were by more than one grade. We also want them to consider how the checks they do during the marking process itself can be improved. Tackling problems as they arise is far better than waiting for appeals to prompt a wider search.

We already have details of some of the actions that exam boards intend to take this summer to minimise grade changes, such as reviewing mark schemes and improving examiner recruitment and training. Alongside these, we are going to create standardised quality-of-marking indicators for use across the boards. This will give us a more accurate picture of marking quality than we have currently.

We will also be making improvements to the system for enquiries about results and appeals. Markers currently undergo ‘standardisation’ before tackling any scripts to ensure they have a shared and common understanding of the standard expected. We think markers should be re-standardised before they deal with any re-marking to remind them of that standard. We know this happens at some exam boards in some subjects, but it is not carried out universally. From summer this year, we expect exam boards to re-standardise all markers involved in re-marking any subjects at GCSE and A level.

We have additionally told exam boards that they must review and publish their principles for extending a marking review. At the moment a school or college can challenge the marking for a sample of students. If the exam board identifies a trend of under-marking, it can choose to extend the review. It is not sufficiently clear at present in what circumstances boards will do this. This creates suspicions of inconsistency, and we want exam boards to become more transparent about how this part of the system works.

Improvements in transparency are also required around appeal panel hearings. The second stage of these appeals involves a formal hearing in front of a panel that should have at least one independent member. These independent members are paid by the exam board, and may do other work for them. Teachers have told us they do not think this provides an adequate level of independence, and we agree. We think the best way to demonstrate the independence of the panel members is to have clear procedures and declare any interests at the start of proceedings.

I have nearly come to the end of my remarks this morning. I hope you can see that we are taking decisive action in a number of areas to improve the quality of education that our students receive and the assessment system by which they are judged. Truly tomorrow’s world will be better than today’s.

Thank you.

Published 4 February 2015