Getting to grips with how science ‘works’
By Steve Wooding
Published 20 Jan 2021
Researcher Steve Wooding discusses outcomes from the Project Calibrate research partnership which proposes improvements to practical science teaching and assessment at Key Stage 4.
How do scientists know what they know? And how do we help current and future generations to think like scientists when confronted with competing, and sometimes contradictory, claims and ideas? As I write this, the UK is in the midst of a second wave of Covid-19 cases, highlighting the need for widespread understanding of how science actually works in practice.
For almost three years, researchers from AQA and the University of Oxford’s Department of Education have collaborated on Project Calibrate. Funded by the Wellcome Trust, the Gatsby Foundation and the Royal Society, this partnership project aims to propose improvements to practical science assessment, focusing first on Key Stage 4 in England.
The ultimate aim is to inform both current exam practice and the next round of reform, as well as broadening and deepening the teaching of practical science, particularly at GCSE level.
Two key concerns
The project arose from two key concerns about how science, and practical science in particular, is taught and assessed in schools, and the impact this has on how people view and discuss scientific claims.
- Research highlights how practical science in the classroom has become too narrowly focused on a hypothesis-testing approach to investigations: the scientist designs a detailed experiment and, on the basis of the results, decides whether the hypothesis is supported. The key problem with this approach is that it over-simplifies how scientists work in practice.
- Research evidence shows that too much classroom practical teaching follows a ‘cookbook’ process. Pupils conduct step-by-step experiments without much thought to what’s being done and why. So, while they are ‘hands-on’, they aren’t necessarily ‘minds-on’ too.
A framework to guide assessment and teaching
The success of the project was dependent on including many viewpoints and experiences, not just those of researchers. To achieve this, we made the following decisions:
- Our approach would cover the whole teaching and assessment cycle, including pedagogy relating to practical science and teacher CPD, not just the exams.
- We'd incorporate the experiences of teachers and their pupils, and the expertise of examiners.
- Any assessments we produced should have a positive ‘washback’ effect on the teaching of practical science, given that the content of the exams influences teaching.
We identified a research-grounded framework – Brandon’s Matrix – to guide the assessment (and therefore the teaching) of practical science.
In essence, the matrix asks two simple questions:
- Are we actually testing a hypothesis, or making measurements and observations?
- Are we changing one or more independent variables, or simply recording what we observe?
| | Manipulating variables | No manipulation |
| --- | --- | --- |
| Testing a hypothesis | ‘Classic’ experiment: a hypothesis is tested using independent, dependent and control variables. | Testing a hypothesis via a situation that can’t be manipulated. |
| Making measurements and observations | Investigation to measure the effects of changing conditions or variables on outcome(s). | Exploring, observing, classifying, categorising, tracking, measuring. |
The top-left corner of the table is what we'd normally call a standard experiment – the ‘scientific method’ as we know it. However, the other three quadrants are also approaches used in practice.
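As an illustrative aside (not part of the project's materials), the two questions that define Brandon's Matrix can be sketched as a simple classifier. The function name and quadrant labels here are hypothetical; the quadrants themselves follow the table above.

```python
# Hypothetical sketch of Brandon's Matrix: classifying an investigation
# by the two questions it poses (hypothesis? manipulation?).

def classify(tests_hypothesis: bool, manipulates_variables: bool) -> str:
    """Return the Brandon's Matrix quadrant for an investigation."""
    if tests_hypothesis and manipulates_variables:
        return "'Classic' controlled experiment"
    if tests_hypothesis and not manipulates_variables:
        return "Hypothesis test in a non-manipulable situation"
    if not tests_hypothesis and manipulates_variables:
        return "Measuring effects of changing conditions"
    return "Exploring, observing, classifying, measuring"

# e.g. an ecology field survey: no hypothesis tested, nothing manipulated
print(classify(tests_hypothesis=False, manipulates_variables=False))
```

A field survey cataloguing species, for instance, lands in the bottom-right quadrant, yet it is just as much science as the 'classic' top-left experiment.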
Involving examiners and teachers
Given the simplicity of Brandon’s Matrix, the team used it as the basis for several workshops with AQA science examiners. The sessions focused on writing and refining pilot exam questions to cover the range of scientific methods relevant to a set of core GCSE practical activities. The questions were designed to assess a pupil’s learning about the ‘why’ and ‘how’ rather than just the theory of practical science.
The project team have also run teacher workshops, introducing the matrix and the pilot exam questions. We used these opportunities to gather vital feedback on the practicalities of teaching the full range of scientific approaches.
Sharing our findings
Together with Project Calibrate colleagues, we’ve presented our research and ideas at a variety of conferences and events – including IAEA, AEA-Europe, BERA and ASE events – and in academic journals, including the International Journal of Science Education and Chemistry Education Research and Practice. Sharing our work like this means we can incorporate the views of the wider assessment research community. We've also delivered an international webinar on the outcomes, hosted by OxfordAQA.
Resources and ongoing impacts
Our aim is that the framework and assessments we’ve developed will be used in several ways:
- By teachers, to broaden their approach to teaching ‘how science works’ so that pupils understand that there's no such thing as ‘the scientific method’, but rather many scientific methods, all of which have been used to create the scientific knowledge we teach.
- By assessment writers and organisations, to change the approach to creating written practical science exam questions so that a broader range of skills and understanding is covered.
- By those responsible for education policy and particularly exam reform, to ensure that curriculum content truly reflects the variety of approaches scientists use in practice.
We're also producing CPD materials for teacher trainers, to help bring about the first of the aims above. Our dream is that Brandon’s Matrix becomes as widely known in teaching as Bloom’s Taxonomy, and that future generations of science teachers encounter this framework as early in their training as possible.
If you’d like to know more, you can visit the official Project Calibrate website. Some of the resources are already available, and more will be added in due course.