On 14 December, Ofqual published their annual statistical bulletin ‘Reviews of Marking and Moderation’ for the summer 2023 exams in England. Lots of numbers. And some very interesting ones too.
Like that out of 6,171,265 GCSE, AS and A level grades awarded, 303,270 grades were challenged and 66,205 were changed – of which some 62,595 were attributable to the correction of ‘marking errors’.
The total number of ‘marking errors’ must be bigger than 62,595, for not all ‘marking errors’ cause a grade to be changed. So if there are at least 62,595 ‘marking errors’ within the 303,270 grades challenged, how many ‘marking errors’ are lurking, hidden, within the 5,867,995 grades that were not challenged? And since a ‘marking error’ is a failure of the examiner to comply with the mark scheme, what does this imply about the quality control procedures of the exam boards – procedures that are supposed to catch such failures as they happen?
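The arithmetic behind those figures can be checked directly. This is a minimal sketch using only the numbers quoted from the bulletin; the derived rate is illustrative, not an Ofqual statistic:

```python
# Figures quoted from Ofqual's summer 2023 'Reviews of Marking and Moderation' bulletin
total_grades = 6_171_265
challenged = 303_270
errors_found = 62_595   # grade changes attributed to 'marking errors'

# Grades never challenged, and hence never re-examined
unchallenged = total_grades - challenged

# Share of challenged grades found to contain a grade-changing marking error
error_rate = errors_found / challenged

print(f"Unchallenged grades: {unchallenged:,}")   # 5,867,995
print(f"Error rate among challenged grades: {error_rate:.1%}")   # 20.6%
```

Roughly one challenged grade in five was changed because of a marking error; the open question is what the corresponding rate is within the 5,867,995 grades nobody challenged.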
Also missing from the bulletin are measures of grade reliability, analysed by level, by subject, and by exam board.
Measures of grade reliability, however, are a (very) hot potato, as was especially evident at a hearing of the House of Lords 11-16 Year Olds Committee (whose insightful report was published on 12 December) held on 13 July, at which the principal witness was the then Schools Minister, Nick Gibb.
Towards the end of the meeting, Lord Watson of Invergowrie referred to statements made by two Ofqual Chief Regulators: Dame Glenys Stacey’s that GCSE, AS and A level grades are ‘reliable to one grade either way’ (Commons Education Committee, 2 September 2020, Q1059), and Dr Jo Saxton’s that ‘I can assure … young people who will receive their grades this summer that they can be relied on’ (Lords Committee, 29 June 2023, Q135). Observing that these two statements are apparently contradictory, Lord Watson asked Nick Gibb, in essence, ‘are grades reliable to one grade either way or not?’ (that’s a paraphrase – see about 12:23:33 here).
But before Nick Gibb could reply, the Division Bell rang, the meeting was adjourned, and he was asked to respond in writing.
Nick Gibb duly sent a letter, dated 8 August, but there is no reference to Lord Watson’s question, let alone an answer. It is most surprising that a Minister can just ignore a question posed at a Lords Committee, and now that Mr Gibb is no longer in post, it could well be that the question will remain unanswered indefinitely.
This reticence is puzzling, for the first measures of grade reliability were published on 14 November 2016, as Figure 14 in an Ofqual report, Marking Consistency Metrics.
Furthermore, in a paper presented to the Ofqual Board just a few weeks later on 25 January 2017, we read:
- 22. We are now able to routinely create marking consistency metrics for GCSEs and A levels…
- 23. These marking studies will be informative in a number of ways. For example, we will explore which boards/units are associated with higher levels of marking consistency within the same qualifications…
What are the metrics of marking consistency?
If, since 2017, Ofqual can ‘routinely create’ metrics of marking consistency, and therefore grade reliability, by level, by subject and by exam board, what are they? Why aren’t they routinely published? As paragraph 23 states, these metrics are indeed ‘informative’…
Yes, metrics for 14 subjects did appear as Figure 12 in a report published in November 2018, but these are aggregates, not analysed by level or by exam board. Why not? And despite the fact that, since January 2017, Ofqual have been ‘able to routinely create marking consistency metrics’, this report remains the sole publicly available source of measures of grade reliability for qualifications as actually awarded. Furthermore, the implications of this report are hotly disputed, and those implications lie at the heart of Nick Gibb’s unanswered question.
Questions about grade reliability will not go away.
Are GCSE, AS and A level grades in England ‘reliable to one grade either way’ or not?
Teachers, parents and guardians, employers, admissions officers – and above all students – need to know the truth.
By Dennis Sherwood, an independent consultant, a campaigner for reliable exam grades, and author of Missing the Mark – Why so many exam grades are wrong, and how to get results we can trust.
An Ofqual spokesperson said:
“Quality of marking and validity and reliability in assessment are matters of great importance to us as regulator, and we continue to work with the awarding organisations we regulate to ensure that our highly respected, world class qualifications system retains its reputation for excellence.”