Are my exams harder than yours?
By Anna Nagle
Published 7 Aug 2014
At school, we were convinced our exam board set harder papers than the others. Aside from complete bewilderment at why our teachers would do that to us, I remember thinking that surely it couldn't be true. Would my grade B really be as good as my friend's grade A? Would I have to caveat any mention of my results with 'but it was with the hard exam board…'?
But is there any truth in the notion that some exam boards set papers that are 'harder' than others, or is it all in the minds of stressed students and harried teachers?
The answer is a bit of a mix. Of course exams in a certain subject set by different boards or in different years are never going to be identical. There will be variation in terms of the topics covered, the type of questions, and the way marks are awarded. But the system we have in this country is designed to account for these kinds of differences, so a grade B is a grade B no matter which board set the exam, and students can be confident that they haven't been hard done by purely as a result of the board their school uses.
Here are some examples of how awarding bodies deal with the inevitable variation between exam papers:
One exam board sets harder questions than the others
The grade boundaries for that exam can be lowered, to reflect the greater difficulty of the paper
One exam board sets easier questions than the others
The grade boundaries for that exam can be raised, to reflect the less demanding nature of the paper
Papers from different boards sample different subject content
If a subject is offered by more than one exam board, there are criteria for each subject area that exam boards have to stick to. The exam can't test every single thing that has been taught over the course, so questions dip into a representative sample of topics from across the curriculum. Different exam boards may sample from different areas, but a student would still have to be familiar with broadly the same content no matter which board set their exam
The language used in the questions is different
Not everyone asks questions the same way (as in life!), but the aim of exam questions is to test a student’s knowledge of the subject, not whether they can understand complex sentences. Questions are designed with this in mind, and also take account of the Disability Discrimination Act, so that students whose first language is Sign Language or who have dyslexia (for example) won’t be disadvantaged
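The grade boundary adjustment described above can be sketched as a toy example. This is purely illustrative, not any board's actual awarding method, and the boundary numbers are invented: the point is simply that boundaries are raw-mark thresholds, and lowering them on a harder paper keeps the meaning of a grade the same.

```python
# Toy illustration (not a real awarding process): grade boundaries are
# raw-mark thresholds, and lowering them on a harder paper means a
# student of the same ability receives the same grade on either paper.

def grade(raw_mark, boundaries):
    """Return the highest grade whose boundary the raw mark meets or exceeds."""
    for g, threshold in sorted(boundaries.items(), key=lambda kv: -kv[1]):
        if raw_mark >= threshold:
            return g
    return "U"  # unclassified

# Hypothetical boundaries for the same subject from two boards.
standard_paper = {"A": 70, "B": 60, "C": 50}
harder_paper   = {"A": 62, "B": 53, "C": 44}  # boundaries lowered

# The same student scores fewer raw marks on the harder paper,
# but ends up with the same grade.
print(grade(65, standard_paper))  # B on the standard paper
print(grade(57, harder_paper))    # B on the harder paper
```

In practice, awarding is far more involved than a lookup table, but the underlying principle is the same: boundaries move so that the grade, not the raw mark, carries the meaning.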
The new GCSEs being introduced in the coming years are due to be more demanding for students than the current qualifications. This will be achieved through a combination of increasing the breadth, depth and/or complexity of the subject matter covered. Maths content, for example, will be more extensive, and English Literature will cover a new, more defined range of texts.
My teenage self didn’t appreciate that teachers and heads of department obviously take many factors into account when deciding which exam board to use. Perhaps the set texts in one board’s English Lit specification are likely to appeal to their class more than those set by another. Or maybe local history opens up avenues to engage their pupils with a certain era, so a History specification that covers particular events may be a better fit. There are other factors, of course – support, finance, administrative considerations – but teaching and learning drive the decision making process.
Students sitting the new GCSEs will be faced with harder exams than those before them - that is an explicit objective of the reforms. The regulator and the awarding bodies will be putting a lot of effort into ensuring not only that standards across boards are comparable and equivalent, but also that the first cohorts under the new system are not disadvantaged by being in that first wave. And teachers, as always, will be doing their best to make sure that their charges are in the best possible position when they set foot in that exam hall from 2017.
You can read more about how awarding bodies maintain standards here, or in a more in-depth report by CERP Senior Research Associate Lesley Meyer on the principles of standard setting. Below you'll find a selection of research reports on various aspects of standard setting and ensuring assessments are 'valid' (i.e. measure what they're supposed to measure).
Principles of standard setting (2009)
Assessment expertise project: validity of assessment (2009)
Are examination standards all in the head? Experiments with examiners' judgements of standards in A-level examinations (2000)
Would the real gold standard please step forward? (2000)
Principles of moderation of internal assessment (2009)
Response to the education select committee inquiry into the administration of examinations for 15-19 year olds in England, focusing on how to ensure accuracy in marking scripts and awarding grades (2011)