Investigating the validity and reliability of Stride’s diagnostic tests

09 Jul 2024

By Yaw Bimpeh

Abstract

Stride is AQA’s new adaptive assessment, providing diagnostic tests to help students prepare for GCSE Maths.
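
The abstract does not describe how Stride selects questions, so the following is only a general illustration of how an adaptive test can work, not Stride's actual algorithm: a minimal Rasch-based sketch in Python in which the next item is the one whose difficulty is closest to the current ability estimate, and the estimate is nudged after each response. The item difficulties, the update rule and the number of questions are all illustrative assumptions.

    import math

    # Minimal, illustrative sketch of one-parameter (Rasch) adaptive item selection.
    # Nothing here is taken from Stride itself: the item difficulties, the
    # ability-update rule and the fixed number of questions are all assumptions.

    def prob_correct(ability, difficulty):
        # Rasch model: probability of answering the item correctly.
        return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

    def next_item(ability, remaining):
        # Pick the unanswered item whose difficulty is closest to the current
        # ability estimate (this maximises information under the Rasch model).
        return min(remaining, key=lambda item: abs(item["difficulty"] - ability))

    def update_ability(ability, difficulty, correct, step=0.5):
        # Crude update: move towards harder items after a correct answer and
        # towards easier items after an incorrect one.
        residual = (1.0 if correct else 0.0) - prob_correct(ability, difficulty)
        return ability + step * residual

    # Hypothetical item bank and scripted responses.
    items = [{"id": i, "difficulty": d} for i, d in enumerate([-1.5, -0.5, 0.0, 0.8, 1.6])]
    ability, answered = 0.0, set()
    for scripted_response in (True, True, False, True):
        item = next_item(ability, [it for it in items if it["id"] not in answered])
        ability = update_ability(ability, item["difficulty"], scripted_response)
        answered.add(item["id"])
        print(f"item {item['id']} (difficulty {item['difficulty']:+.1f}) -> ability {ability:+.2f}")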

This study investigated the validity and reliability of the diagnostic tests by trialling Stride multiple times, administering user-experience surveys and interviewing students from a range of schools. The research provided evidence on how reliably the tests capture information about students’ knowledge and skills, and on the accuracy of the inferences drawn from them.

The findings indicate that the tests accurately gauge students’ proficiency across a wide spectrum of abilities and effectively distinguish between individuals with varying levels of skill. Additionally, the difficulty level of the tests appears to align appropriately with the abilities of the students, ensuring accessibility for all individuals within the tested group.

Moreover, there is ample evidence supporting the validity of the diagnostic tests in relation to test content, internal structure and student response processes. The reliability of the tests is rated as excellent or good, indicating the suitability of the tests for diagnostic or formative purposes.
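
The abstract reports reliability as excellent or good but does not state which coefficient was used. Purely as an illustration of one common approach, the Python sketch below computes Cronbach's alpha for a scored item-response matrix and applies widely cited descriptive bands (0.9 or above as excellent, 0.8 or above as good); both the statistic and the bands are assumptions here, not details taken from the study.

    # Illustrative only: the study does not say which reliability coefficient it used.
    # This sketch computes Cronbach's alpha for a 0/1-scored response matrix and
    # applies commonly cited descriptive bands, which are an assumption here.

    def cronbach_alpha(scores):
        # scores: one row per student, one 0/1 score per item.
        n_items = len(scores[0])
        totals = [sum(row) for row in scores]

        def variance(values):
            mean = sum(values) / len(values)
            return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

        item_variances = [variance([row[i] for row in scores]) for i in range(n_items)]
        return (n_items / (n_items - 1)) * (1 - sum(item_variances) / variance(totals))

    def describe(alpha):
        if alpha >= 0.9:
            return "excellent"
        if alpha >= 0.8:
            return "good"
        return "below the commonly cited 'good' band"

    # Hypothetical responses: 6 students x 4 items.
    responses = [
        [1, 1, 1, 1],
        [1, 1, 1, 0],
        [1, 1, 0, 0],
        [1, 0, 0, 0],
        [0, 0, 0, 0],
        [1, 1, 1, 1],
    ]
    alpha = cronbach_alpha(responses)
    print(f"Cronbach's alpha = {alpha:.2f} ({describe(alpha)})")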

The results also indicate that the items vary in how well they discriminate between individuals of high and low ability, ie some items distinguish between them more effectively than others.
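
An item’s power to distinguish high- from low-ability students is usually summarised by a discrimination index, although the abstract does not name the statistic used here. As a hedged illustration, the sketch below computes the classical upper-lower discrimination index (proportion correct in the top group of total scorers minus the proportion correct in the bottom group), with the common 27% grouping rule as an assumption.

    # Illustrative only: the abstract does not name the discrimination statistic used.
    # This sketch computes the classical upper-lower discrimination index per item:
    # facility in the top 27% of total scorers minus facility in the bottom 27%.

    def discrimination_indices(scores, group_fraction=0.27):
        # scores: one row per student, one 0/1 score per item.
        n_items = len(scores[0])
        group_size = max(1, round(group_fraction * len(scores)))

        ranked = sorted(scores, key=sum, reverse=True)   # highest total score first
        upper, lower = ranked[:group_size], ranked[-group_size:]

        def facility(group, item):
            return sum(row[item] for row in group) / len(group)

        return [facility(upper, i) - facility(lower, i) for i in range(n_items)]

    # Hypothetical responses: 10 students x 3 items.
    responses = [
        [1, 1, 1], [1, 1, 0], [1, 1, 1], [1, 0, 1], [1, 1, 0],
        [0, 1, 0], [1, 0, 0], [0, 1, 0], [0, 1, 0], [0, 1, 0],
    ]
    for item, d in enumerate(discrimination_indices(responses)):
        print(f"item {item}: discrimination = {d:+.2f}")

In this hypothetical data the first item separates the two groups completely while the second does not separate them at all, illustrating the kind of variation in discriminating power described above.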

When asked about their overall test-taking experience, most respondents (78%) rated it positively. About 73% of respondents said they agreed with Stride’s identification of their strong points or competencies. According to 75% of respondents, tests like these would be beneficial for their learning.

This research underscores the significance of Stride’s ability to deliver personalised and adaptive feedback to students, including explanations, hints, and resources tailored to individual learning needs. The testing process is helpful for both students and teachers and is effective at tracking students’ progress over time.

This study was conducted alongside qualitative research that gathered students’ experiences of trialling the tests.

Keywords

  • e-Assessment
  • On-screen tests
  • Question difficulty
  • Reliability
  • Student voice
  • Teachers
  • Validity