3.9 Non-exam assessment

3.9.1 Overview

3.9.1.1 Purpose of non-exam assessment

Non-exam assessment (NEA) allows students to develop their practical skills in a problem-solving context by coding a solution to a given problem. Non-exam assessment is as much a learning experience as it is a method of assessment: it allows students to work independently over a period of time, extending their programming skills and increasing their understanding of practical, real-world applications of computer science.

Additional information relating to NEA can be found in the teachers’ notes which accompany the NEA task.

3.9.2 The task

3.9.2.1 Setting the task

We will set the task for the non-exam assessment: this will be available to schools and colleges in September of the final academic year of the course.

The task will change for each new cohort of students.

It is the responsibility of the teacher to make sure that the correct task is used when preparing their students.

3.9.2.2 Taking the task

The task will comprise a single project which can be undertaken in a period totalling 20 hours. When completing the task, students must work independently and produce a unique piece of work.

Students must program in one of the high-level programming languages available for use in that year's NEA task.

The completed task will generate a:

  • program designed, written, tested and refined by the student
  • written report.

Each student must produce their own report, in either hard copy or electronic format (saved to CD).

3.9.2.3 Authentication of students' work

Teachers must be confident that the evidence generated by each student is the student's own work and has been completed within the 20 hours, and must be able to authenticate it as such (see Supervising and authenticating). It is the centre's responsibility to ensure that the work submitted for assessment is that of the student.
  • Students are not allowed to take the NEA tasks home with them.
  • Students are not allowed to take work on the NEA task home to complete. All work presented for submission must have been completed under supervised conditions.

3.9.3 Marking the task

Students are free to redraft a piece of work before submitting it for final assessment. Once a student's work has been submitted for final assessment, no further amendments can be made.

When marking the task, teachers must use the marking criteria in this specification. Further information about marking the NEA task is available at aqa.org.uk/8520

3.9.3.1 Marking support

Teacher standardising will be available each year to give support in both the taking of the task and the application of the marking criteria. If you have any queries about the task, you are encouraged to contact us at computerscience@aqa.org.uk

Your centre will be assigned an AQA appointed subject adviser who will be available to assist you in matters relating to the NEA. Contact details of the adviser appointed to you will be provided when you inform us that you are using this specification.

When marking the task, a level of response mark scheme should be used. A level of response mark scheme allows you to assess the performance of your students holistically.

3.9.3.2 Using a level of response mark scheme

Level of response mark schemes are broken down into levels, each of which has a descriptor. The descriptor for a level shows the average performance for that level, and each level covers a range of marks.

Before you apply the mark scheme to a student's answer, read through the answer and annotate it (as instructed) to show the qualities that are being looked for. You can then apply the mark scheme.

Step 1 Determine a level

Start at the lowest level of the mark scheme and use it as a ladder: check whether the answer meets the descriptor for that level. The descriptor indicates the different qualities that might be seen in a student's answer at that level. If the answer meets the lowest level, move up to the next level and decide whether it meets that one, and so on, until you have a match between the level descriptor and the answer. With practice and familiarity you will find that, for better answers, you can skip quickly through the lower levels of the mark scheme.

When assigning a level, you should look at the overall quality of the answer rather than picking holes in small and specific parts where the student has not performed quite as well as in the rest. If the answer covers aspects of different levels of the mark scheme, use a best-fit approach to decide the level, and then use the variability of the response to decide the mark within the level. For example, if the response is predominantly level 3 with a small amount of level 4 material, it would be placed in level 3 but awarded a mark near the top of the level because of the level 4 content.

Step 2 Determine a mark

Once you have assigned a level you need to decide on the mark. The descriptors on how to allocate marks can help with this, as can the exemplar materials used during standardisation: the standardising materials will include an answer corresponding to each level of the mark scheme, each awarded a mark by the lead examiner. You can compare the student's answer with the relevant example to determine whether it is of the same standard, better or worse, and then allocate a mark based on the lead examiner's mark for the example.

You may well need to read back through the answer as you apply the mark scheme to clarify points and assure yourself that the level and the mark are appropriate.

Indicative content in the mark scheme is provided as a guide for examiners. It is not intended to be exhaustive and you must credit other valid points. Students do not have to cover all of the points mentioned in the indicative content to reach the highest level of the mark scheme.

Work which contains nothing of relevance to the task must be awarded no marks.

3.9.4 Marking criteria

The task is assessed in five sections as shown below.

Section  Criteria                                 Maximum marks
1        Designing the solution                   9
2        Creating the solution                    30
3        Testing the solution                     21
4        Potential enhancements and refinements   10
5        Overall quality of the report            10
         Total                                    80

3.9.4.1 Designing the solution (max 9 marks)

Students should articulate their design in a manner appropriate to the task and with sufficient clarity for a third party to understand how the key aspects of the solution are structured. The emphasis is on communicating the design: a prose description, or a combination of prose and diagrams, is acceptable. This could include flowcharts, descriptions of algorithms, data structures and text file/database structures, or relevant technical description languages such as pseudo-code, as appropriate. Where design of a user interface is relevant, screen shots of actual screens are acceptable.
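For illustration only (the task and names below are hypothetical, not drawn from any set NEA task), a brief design fragment of the kind described above might read:

  Data design:
    MAX_QUESTIONS: integer constant, value 10
    scores: list of integers, one entry (1 or 0) per question asked

  Main blocks:
    1. get_player_name: prompt until a name of 1 to 20 characters is entered (data validation)
    2. ask_questions: repeat MAX_QUESTIONS times; display a question and record 1 in scores for a correct answer, otherwise 0
    3. show_result: output the total of scores out of MAX_QUESTIONS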

Level 3 (7–9 marks)
  • A comprehensive design which could be used as the basis of an effective implementation of a complete or almost complete solution.
  • A comprehensive design which shows good understanding of variables, data types and structures, as well as how the data will be processed.
  • Explanations of all or almost all of the main blocks of the proposed solution including data validation where appropriate.
  • Design choices are justified with reference to user requirements.

Level 2 (4–6 marks)
  • A detailed design that describes how most of the key aspects of the solution are to be structured/are structured.
  • A largely effective design for the variables, but showing limited understanding of the potential offered by data types and structures.
  • Explanations of most of the main blocks of the proposed solution, including the processing of calculations where appropriate.
  • Design choices are described.

Level 1 (1–3 marks)
  • A minimal design of what the problem involves.
  • An incomplete or partially effective design for the variables and/or data structures.
  • Minimal descriptions of some of the main blocks of the proposed solution are given, so that it is difficult to obtain a picture of how the solution is to be structured/is structured without resorting to looking directly at the programmed solution.
  • Design choices are stated.

Level 0 (0 marks)
  • Nothing worthy of credit.

3.9.4.2 Creating the solution (max 30 marks)

Completeness of solution (15 marks)

Students should present their complete code listing here. Students need to annotate their code listing either by using comments within the code or by annotating the listing in some other way. To gain marks in any particular level it must be clear from looking at the code listing and reading the comments that the solution demonstrates the requirements of that level.
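As a brief illustration (Python is used here purely as an example of a permitted high-level language, and the fragment is hypothetical rather than taken from any task), annotation by comments might look like:

  # Validate the menu choice: repeat until the user enters 1, 2 or 3
  choice = input("Enter a choice (1-3): ")
  while choice not in ("1", "2", "3"):
      choice = input("Invalid choice, please enter 1, 2 or 3: ")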

Level 5 (13–15 marks)
  • A solution that meets all or almost all of the requirements of the problem.
  • The marks at the top end of the level are for solutions that include well implemented elements of robustness and structured programming.

Level 4 (10–12 marks)
  • A solution that achieves most, but not all, of the requirements of the problem. The solution uses structured programming elements effectively.
  • The marks at the top end of the level are for solutions that include some elements of robustness.

Level 3 (7–9 marks)
  • A solution that achieves some of the requirements of the problem.
  • The marks at the top end of the level are for solutions that include some elements of structured programming and some data validation.

Level 2 (4–6 marks)
  • A solution that achieves a few requirements of the problem.
  • The marks at the top end of the level are for solutions that include a minimal number of elements of structured programming.

Level 1 (1–3 marks)
  • A solution that tackles a few aspects of the problem. Solutions at this level may not work as intended.

Level 0 (0 marks)
  • Nothing worthy of credit.

Programming techniques used (15 marks)

The coding skills listed are indicative of those required by students working at the level indicated. The lists must not be used as a simple checklist, as some solutions may warrant inclusion at a particular level even if not all of the skills for that level are evident. For example, if constants have not been used within the solution because their use is not required or appropriate, the student can still potentially reach level 2 and above. As such, teachers must use a best-fit approach when deciding the appropriate level for a piece of work.

Students should provide program listing(s) that demonstrate their technical skill. The program listing(s) should be appropriately annotated and, for the higher marks, self-documenting (an approach that uses meaningful identifiers, with well-structured code whose purpose is apparent without reference to external documentation).
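To illustrate what self-documenting code means in practice, compare the two hypothetical Python fragments below; the second conveys its purpose through identifier names alone:

  # Not self-documenting: the purpose of f, a and b is opaque
  def f(a, b):
      return a * b / 100

  # Self-documenting: meaningful names make the intent clear without comments
  def calculate_discount(price, discount_percent):
      return price * discount_percent / 100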

Students should present their work in a way that will enable a third party to discern the quality and purpose of the coding. This could take the form of:

  • explanations of particularly difficult-to-understand code sections
  • a careful division of the presentation of the code listing into appropriately labelled sections, to make navigation as easy as possible for a third party reading the code listing.

Achievement of the latter, to an extent, is linked to the skill in applying a structured approach during the course of developing the solution.

Level 5 (13–15 marks)

Description:
  • The code demonstrates that the coding skills required for this level have been applied sufficiently to demonstrate proficiency.
  • Evidence in the code and/or annotation/comments shows a successful solution to the problem that utilises exception handling, data validation and subroutine interfaces as appropriate.
  • Meaningfully named variables (local and/or global) and any data structures are effectively used and appropriate to the solution.
  • Code is appropriately structured for ease of maintenance.

Indicative coding skills required (Outstanding):
  • subroutines used with appropriate interfaces
  • cohesive subroutines
  • good exception handling
  • self-documenting code
  • modularisation of code
  • appropriate use of local variables
  • minimal use of global variables
  • appropriate use of data validation
  • appropriate use of constants
  • consistent style throughout
  • meaningful identifier names
  • appropriate indentation
  • annotation used effectively where required.

Level 4 (10–12 marks)

Description:
  • The code demonstrates that the coding skills required for this level have been applied sufficiently to demonstrate proficiency.
  • Evidence in the code and/or annotation/comments shows a largely successful solution to the problem that utilises modularisation as well as exception handling and/or data validation as appropriate.
  • Meaningfully named variables (local and/or global) and any data structures are appropriate to the solution.

Indicative coding skills required (Excellent):
  • good exception handling
  • self-documenting code
  • modularisation of code
  • appropriate use of local variables
  • minimal use of global variables
  • appropriate use of data validation
  • appropriate use of constants
  • consistent style throughout
  • meaningful identifier names
  • appropriate indentation
  • annotation used effectively where required.

Level 3 (7–9 marks)

Description:
  • The code demonstrates that the coding skills required for this level have been applied sufficiently to demonstrate proficiency.
  • Evidence in the code and/or annotation/comments shows a solution that solves most of the problem. The solution utilises modularisation as appropriate.
  • Meaningfully named variables (local and/or global) and any data structures are appropriate to the solution. The use of data validation is evident and appropriate.

Indicative coding skills required (Good):
  • modularisation of code
  • appropriate use of local variables
  • minimal use of global variables
  • appropriate use of data validation
  • appropriate use of constants
  • consistent style throughout
  • meaningful identifier names
  • appropriate indentation
  • annotation used effectively where required.

Level 2 (4–6 marks)

Description:
  • The code demonstrates that the coding skills required for this level have been applied sufficiently to demonstrate proficiency.
  • Multiple programming techniques are used, and there is evidence through annotation/comments of some understanding of how to solve the problem.
  • Variables with meaningful names are used effectively throughout the solution.

Indicative coding skills required (Basic):
  • appropriate use of constants
  • consistent style throughout
  • meaningful identifier names
  • appropriate indentation
  • annotation used effectively where required.

Level 1 (1–3 marks)

Description:
  • The code demonstrates that the coding skills required for this level have been applied sufficiently to demonstrate proficiency.
  • Code statements address at least one of input, process and output and are relevant to user requirements with some minimal comments/annotation.
  • Variables with meaningful names are used effectively for parts of the solution.

Indicative coding skills required (Minimal):
  • meaningful identifier names
  • appropriate indentation
  • annotation used effectively where required.

Level 0 (0 marks)
  • Nothing worthy of credit.
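For illustration only, the short hypothetical Python subroutine below shows how several of the indicative coding skills listed above (a subroutine with a clear interface, exception handling, data validation, constants, local variables and annotation) might appear together; it is a sketch, not a model answer:

  MIN_AGE = 5      # constants avoid unexplained 'magic numbers'
  MAX_AGE = 120

  def read_age(prompt):
      # Subroutine with a clear interface: takes a prompt string and
      # returns a validated age. All variables used are local.
      while True:
          try:
              age = int(input(prompt))   # may raise ValueError (exception handling)
          except ValueError:
              print("Please enter a whole number.")
              continue
          if MIN_AGE <= age <= MAX_AGE:  # range check (data validation)
              return age
          print("Age must be between", MIN_AGE, "and", MAX_AGE)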

3.9.4.3 Testing the solution (max 21 marks)

Testing is taken to mean 'Does the solution work?' and as such it is important that students plan a series of tests to show that the different sections and elements within their solution work as intended.

Evidence for the testing section may be produced after the system has been fully coded or during the coding process. Tests should be set out in a test plan; only carefully selected, representative samples are required. When carrying out tests, it is important that normal (typical), boundary (extreme) and erroneous data are used as appropriate.
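As a hypothetical worked example, for an input rule such as the age check sketched at the end of the previous section ('age must be a whole number from 5 to 120'), the three types of test data might be:

  • normal (typical): 37, which should be accepted
  • boundary (extreme): 5 and 120, which should be accepted; 4 and 121, which should be rejected
  • erroneous: "ten" or an empty entry, which should be rejected with a clear message.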

Students must provide and present in a structured way, for example in tabular form, clear evidence of testing. This should take the form of carefully selected and representative samples, which demonstrate the robustness of the complete, or nearly complete, solution and which demonstrate that the requirements of the solution have been achieved. The emphasis should be on producing a representative sample in a balanced way and not on recording every possible test and test outcome.

Students should explain the reasons for the tests carried out alongside the evidence for them.

This could take the form of:

  • the test performed
  • its purpose if not self-evident
  • the test data
  • the expected test outcome
  • the actual outcome with a sample of the evidence, for example screen shots from before and after the test, sampled in order to limit volume.

Where a test ‘fails’ or highlights an issue with the solution it is important that the solution is refined to eliminate the ‘failure’ and/or issue. The test should then be repeated to show that the refinement has succeeded in eliminating the ‘failure’ and/or issue.

Test planning (9 marks)

Level 3 (7–9 marks)
  • A thorough, representative range of tests has been planned that will demonstrate the robustness of the solution as well as that the requirements of the problem have been achieved.
  • Test data includes normal (typical), boundary (extreme) and erroneous data. Detailed expected outcomes are given.
  • The test plan is clear and unambiguous.

Level 2 (4–6 marks)
  • A representative range of tests has been planned but falls short of demonstrating that the requirements of the problem have been achieved.
  • Test data includes some of the different types: normal (typical), boundary (extreme) and erroneous data. Expected outcomes are listed.
  • The test plan is clear.

Level 1 (1–3 marks)
  • A small number of tests have been planned, some of which may be inappropriate.
  • Some test data and/or expected outcomes may be given.
  • The test plan may not be entirely clear.

Level 0 (0 marks)
  • Nothing worthy of credit.

Testing evidence (12 marks)

Level 4 (10–12 marks)
  • Clear evidence is presented, in the form of carefully selected representative samples, which demonstrates that thorough testing has been carried out.
  • There is an explanation of how the evidence demonstrates the robustness of the complete or nearly complete solution and shows that the requirements of the problem have been achieved.

Level 3 (7–9 marks)
  • Extensive testing has been carried out, but the evidence, presented in the form of representative samples, does not make clear that all of the core requirements of the problem have been achieved. This may be due to some key aspects not being tested or because the evidence is not always presented clearly.
  • There is an explanation of how the evidence presented demonstrates partial robustness of the solution.

Level 2 (4–6 marks)
  • A range of tests has been carried out and the evidence is presented in the form of representative samples, but falls well short of demonstrating that the requirements of the problem have been achieved and that the solution is robust.
  • The evidence presented is explained.

Level 1 (1–3 marks)
  • A small number of tests have been carried out, which demonstrate that some parts of the solution work.
  • The evidence presented is not entirely clear.

Level 0 (0 marks)
  • Nothing worthy of credit.

3.9.4.4 Potential enhancements and refinements (max 10 marks)

Evaluation is considered to be 'How well does the solution work, and how could it be better?'

Students should consider and assess how well the solution meets the requirements of the problem and how the solution could be improved if the problem were to be revisited.

Level 5 (9–10 marks)
  • Full consideration has been given to how well the solution meets all or almost all of the requirements of the problem. Efficiency of execution and robustness are discussed.
  • Improvements to the solution, if the problem were revisited, are discussed.

Level 4 (7–8 marks)
  • Some consideration has been given to how well the solution meets all or almost all of the requirements of the problem. Efficiency of execution or robustness is described.
  • Improvements to the solution, if the problem were revisited, are explained.

Level 3 (5–6 marks)
  • Consideration has been given to how well the solution meets most of the requirements of the problem. Where appropriate, some of the requirements that have not been met have been considered in the evaluation.
  • Improvements to the solution, if the problem were revisited, are described.

Level 2 (3–4 marks)
  • Consideration is given to how well the solution meets some of the requirements of the problem but not all aspects are addressed. There may be omissions, or some of the requirements may not have been met, and those requirements not met have been overlooked in the evaluation.
  • Some potential improvements, if the problem were revisited, have been stated.

Level 1 (1–2 marks)
  • Parts of the solution are evaluated but only in a superficial way.

Level 0 (0 marks)
  • Nothing worthy of credit.

3.9.4.5 Overall quality of the report (max 10 marks)

The marks in this section are for the quality of the completed report. The report itself should consist of the following sections:

  • designing the solution
  • creating the solution
  • testing the solution
  • potential enhancements and refinements.

Each of these sections should contain enough evidence to warrant the award of marks for that section, as detailed within the marking criteria.

Level 5 (9–10 marks)
  • The report is complete. All or almost all of the content is relevant to the solution of the task. A wide range of technical terms has been used accurately.
  • There is a consistent approach to the structure and layout of the report which enables easy cross-referencing between sections and between different parts of the solution.
  • Consistency is evident between the account of design and the coded implementation, the account of design and execution of testing, and the account of evaluation and refinement.

Level 4 (7–8 marks)
  • The report is complete. Most of the content is relevant to the solution of the task. Most of the technical terms used have been used accurately.
  • Most of the report shows a consistent approach to the structure and layout which enables easy cross-referencing between most sections and/or different parts of the solution.
  • Consistency is evident between the account of design and the coded implementation, the account of design and execution of testing, and the account of evaluation and refinement.

Level 3 (5–6 marks)
  • The report is complete in all or almost all respects. A few technical terms have been used accurately.
  • There is evidence of an attempt to create a report structure and layout that would enable cross-referencing between one or two sections and/or different parts of the solution.
  • There is some consistency evident between at least three of the following: the account of design of the solution, the coded implementation, the account of design and execution of testing, and the account of evaluation and refinement.

Level 2 (3–4 marks)
  • At most, one section of the report is missing or incomplete.
  • There is very little evidence of an attempt to create a report structure and layout that would enable cross-referencing between sections and/or different parts of the solution.
  • The report gives some idea of how the solution has been developed and the code listing is consistent with other sections.

Level 1 (1–2 marks)
  • Two or more sections are missing from the report.
  • There is no evidence of an attempt to create a report structure and layout that would enable cross-referencing between sections and/or different parts of the solution.
  • The report fails to show a clear account of the development of the solution.

Level 0 (0 marks)
  • Nothing worthy of credit.