Quality assessment

In June and July each year, more than 15 million exam scripts are marked, and each August more than 7 million results are issued to students.

We know that getting the right result is very important to students and teachers; that's why we've put it at the heart of everything we do. And by 'right result' we don't just mean awarding the right number of marks or the right grade – the result has to be an accurate reflection of a student's ability too.

Here you can find out what we are already doing to get it right, as well as how we are continuously improving what we do.

You can also watch our short film which explains why marking some subjects is more challenging than others and what research tells us about how to make marking as accurate and reliable as possible.

Assessing ability

One of the strengths of the English exam system is that it allows us to assess students' abilities by asking them to do specific tasks as part of their exam or coursework. This makes the assessment more valid than if students were only answering short factual or multiple choice questions.

Some of these tasks include:

  • writing prose
  • drawing graphs
  • constructing arguments
  • speaking a language.

Of course, marking these tasks is not as straightforward as marking multiple choice questions, so we rely on our markers' professional judgement to decide which mark best reflects the standard of the work.

To guide their judgements and make sure they are consistent, we set out the criteria for awarding marks in the mark scheme. We then continuously check that marking follows the scheme. This all means we need to get our assessments and mark schemes right.

Getting it right from the start

Our intention is to 'bake in quality from the start' – getting our assessments and mark schemes right when we first design them, rather than fixing things further down the line.

As we and the other exam boards have developed new GCSEs, AS and A-levels as part of Government reforms, we have also reviewed the design of our assessments and mark schemes.

Using evidence from research by our world-leading experts at AQA's Centre for Education Research and Practice (CERP), we have been able to make changes where we think they will bring about greater reliability and consistency of marking.

Some of this includes:

  • basic things, like having enough marks on a paper and not having questions no one can answer
  • more complicated things, like understanding how the exam paper and the mark scheme work together.

Designing assessments that accurately test different ranges of ability means students are more likely to get the right grade, and markers are much clearer about how many marks reflect the standard of the work they are assessing.

Getting the right people

Because we rely on markers to use their professional judgement, we need to make sure we have the right people in place. Our examiners are qualified teachers who are teaching, or have recently taught, the subject they are marking.

Examiners and AQA colleagues who develop assessments also go through rigorous training so they are up to date with the latest research and developments in assessment design.

Read more about our examiners, their roles and how they bring their expertise to marking. You can also find out how to apply if you would like to become an examiner or moderator.

How papers are marked

While examiners are experts in their subjects, they have different levels of experience when it comes to marking. So we have strict quality controls in place and check marking along the way, whether marking is paper-based (traditional) or online.

Here is a brief overview of the process and quality controls for both.
Before marking starts, all examiners attend standardisation meetings so they fully understand the mark scheme and where to award marks. This helps ensure the marking is consistent across the subject.

Regular checking

Depending on how the exam papers (scripts) are marked, there are different ways of checking. Here is some of what we do.

Paper-based marking

  • A senior examiner checks the marking for the first 10 scripts to make sure the marker is following the mark scheme.
  • When the marker is halfway through their allocated scripts, the senior examiner checks another 15 marked scripts.
  • If the senior examiner has any concerns, they review additional papers and decide if the marker is able to continue or if the papers should be given to another marker.

Online marking

When marking starts, certain answers are 'seeded' and the marking for these answers is automatically reviewed. This is how it works:

  1. the answer is marked and the mark agreed by all the senior examiners for that subject
  2. it is then given to the team of markers to mark
  3. the system compares the mark they award to the mark previously agreed
  4. if the mark is different, the system detects this and marking is stopped
  5. a senior examiner is alerted and intervenes before marking can continue
  6. if incorrect marking recurs, all the answers are reviewed by another marker and any unmarked answers are also given to someone else to mark.
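The seeded-answer check described above can be sketched as a short program. This is purely an illustrative sketch – the function names, the zero-tolerance comparison and the stop-and-alert logic are assumptions for the example, not a description of AQA's actual marking system:

```python
# Illustrative sketch of seeded-answer quality control.
# All names and the exact stop logic are hypothetical.

def matches_agreed(marker_mark: int, agreed_mark: int) -> bool:
    """True if the marker's mark matches the mark the senior examiners agreed."""
    return marker_mark == agreed_mark

def process_seeds(marker_marks, agreed_marks):
    """Check each seeded answer in turn; stop at the first disagreement.

    Returns (stopped, index): stopped is True if marking was halted,
    and index identifies the failing seed (or None if all matched).
    """
    for i, (mark, agreed) in enumerate(zip(marker_marks, agreed_marks)):
        if not matches_agreed(mark, agreed):
            # In the real process, a senior examiner would be alerted
            # here and would intervene before marking could continue.
            return True, i
    return False, None

# Example: the marker agrees on the first two seeds but not the third.
stopped, where = process_seeds([4, 5, 7], [4, 5, 5])
print(stopped, where)
```

Running the example reports that marking stops at the third seed, which is the point at which, in the real process, a senior examiner would review the marker's work.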

We are moving more of our marking online, as our research shows that it increases reliability. For example:

  • different questions are sent to different markers so the final mark for any exam paper is the professional judgement of a number of experts
  • we can monitor marking as it happens.

We continue checking marking up until results are published. After that date, if teachers and students are unhappy with their marks, they can apply for a review of marking or other post-results services.

Review of marking and marking quality

The exam system operates on a large scale and involves an important element of human judgement. Over the past few years, there has been an increase in the number of requests for marking reviews, especially in cases where a student has missed a grade by a mark or two.

We know that every mark and grade change is significant for students, and we’re doing everything we can to make sure our marking is as accurate and reliable as possible. But we do sometimes get it wrong, so we want to make sure that any reviews of marking provide a fair and consistent way to put things right.

Ofqual has introduced some changes to post-results services that mean marks will only be changed where there has clearly been an error in the marking or moderation. This includes administrative errors, as well as cases where the academic judgement applied cannot reasonably be justified.

Training and monitoring reviewers

All our reviewers are experienced, expert examiners and moderators who marked or moderated exams in the summer series they are reviewing. To ensure that they fully understand the marking review process and what is expected of them, we provide mandatory marking review training through an e-learning module and detailed guidance. They also spend time re-familiarising themselves with the mark scheme and standardisation materials for the specific assessment they will be reviewing.

We monitor reviewers’ progress and performance throughout the marking review period to ensure the agreed timescales are met, and that marks only change where there is an actual marking or moderation error.

We monitor the performance of reviewers by analysing the frequency and size of their mark changes. If we identify reviewers whose performance isn’t in line with expectations, we give them feedback and, if necessary, retrain them or stop them from reviewing any more papers or moderated components.

If we identify any wider issues with initial marking while carrying out this internal monitoring, we take appropriate actions to correct any marking errors to ensure fairness to all students.

Improving quality of marking

Because we recognise how important getting the right result is for everyone, we are always looking to improve, using research evidence to build on what works well and learning from mistakes. This includes analysing the performance of all our papers, processes and examiners after each exam series and taking on board feedback.

Understanding how exams work

Read about how exams work and watch our animations explaining how: