Inside Exams

Podcast series two now available


Teacher Craig Barton is back with series two of Inside Exams, the podcast that gives you an access all areas pass to snoop around behind the scenes at AQA.

He’ll be meeting the people who write and mark your students’ exams, as well as pioneering teachers, to get answers to all the questions you ponder throughout the school day.

© AQA 2019

Future of exams: all Star Trek and holodecks?

Series two
Episode eight | 17 February 2020

What is the future of assessment? Craig Barton speaks to AQA’s head of assessment design and its head of research and development about the strategies that might be considered for future exams. Two further education teachers also tell Craig how they already use innovative technologies in the classroom.

Featured in this podcast

Craig Barton – Maths teacher, podcaster and author

Ben Stafford – Head of Assessment Design at AQA

Ruth Johnson – Head of Research and Development at AQA

Dimitrios Georgalis – Maths teacher at Leeds City College

Darren Coogan – GCSE Maths curriculum lead at Peterborough Regional College

Craig Barton: Hello, and welcome to Inside Exams. I’m Craig Barton. I’m a maths teacher with 15 years of classroom experience. Over the last two series I’ve been putting your questions to the people who create our students’ exams. So, what would you like to know?

Melanie: Hi, I’m Melanie and I teach history and I’d like to know what we can expect from exams in the future, and where technology plays a part in that.

Craig Barton: Ah ha, yes, the future of assessments. In 10 years’ time will schools be filled with unrecognisable space age technology or will the exam look pretty much the same as it does today?
This makes me think of that Seven Up documentary, the one that’s been checking in with 14 people every seven years since the 1960s. As kids they talked about their starry-eyed hopes and dreams for the future. As adults they reflect on what realities forced them to change path.  I always find it particularly fascinating to see how they’ve had to adapt to environmental or socio-economic changes to keep up with the world around them. There's a constant give and take between noble ideas for the future and the reality of making them happen.
I want to understand if there’s a similar balance of reflection and forward thinking at play in exam boards, and see what could be in store for exams in the future.
Ben Stafford is AQA’s head of assessment design, and Ruth Johnson is head of research and development.

So Ben, I’m going to start with you first. Now we’re going to be going deep into all these incredible technologies and the impact that they may have on the assessment process throughout this conversation, but my first question for you is: what are some of the things that are just not going to change at all? What’s set in stone in terms of assessment?

Ben Stafford: That’s a very interesting question. I’m not sure in the long term anything is set in stone. Fundamentally I think if you extend the scope out for years and years, it’s fair to say that anything might change. At the moment we’ve got a focus on validity in our assessments, and that comes with a trade-off against reliability. I’m sure we’ll talk about that as the conversation goes on, but potentially that could even change. You look at other high stakes international jurisdictions like the USA, for example: their exams are much more focused on reliability. So even things like the open-ended 25-mark essay questions that we know and love from GCSE English and so on might not be there forever, regardless of what structure or format they’re delivered in.

Craig Barton: Fantastic. So anything’s up for grabs at this stage. This is exciting stuff. I like this. Ruth, let me come to you then. What research is happening at the moment within AQA to look into some of these future ideas and changes?

Ruth Johnson: I think in the research team we have to be grounded in where we are. We’re sort of split, I suppose: we’ve got half an eye on work that makes our current assessments more valid. So, for instance, one recent project used eye-tracking technology to look at the layout of chemistry questions, to evaluate what the best layout was in terms of enabling a student to answer the question.

Craig Barton: You’re joking. So the kids are having where they’re looking at a question tracked?

Ruth Johnson: Yeah, absolutely. So there are glasses which they wear which track where they’re looking, and they’re used in all kinds of areas of research.  So for instance there’s a really interesting video on YouTube of Ronaldo wearing them whilst playing football, and you can see where he looks, and so someone’s analysed what makes him such a good footballer.  But we’ve used the same technology to look at where students were looking on chemistry items, and we had two different kinds of items, questions, some that were more compact and some that were more laid out, more spaced out with more white space.
And one quite interesting finding was: on the items with more white space, students actually looked at the white space. So that’s quite interesting in terms of what’s going on there. Are they looking at the white space to enable their thinking, so looking away from the stimulus, away from the question? That’s one thing that we’ve done recently where we’ve used technology to look at our current assessments and to try to improve them.

Craig Barton: Fascinating. And what about future assessments? What’s something that excites you at the moment that you’re looking into?

Ruth Johnson: So I suppose in the near future we’re looking at what we can do using AI to make marking more reliable. It is possible, and in some assessments around the world AI is used to mark exams. So, for instance, in some language assessments, students whose first language isn’t English are assessed on their English speaking skills using AI. And obviously we know that voice recognition is really good, because we have Siri and we have Alexa and they’re very good at understanding a really wide range of accents. So all that data is captured and then used on marking platforms to mark speech, to say how good it is.

Craig Barton: Wow.

Ruth Johnson: And similarly with written language there are ways of using natural language processing to train an AI system, a marking system to fairly reliably rank students.
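Ruth doesn’t describe AQA’s actual system, but the idea of using text similarity to rank or score written answers can be illustrated with a deliberately tiny sketch. Everything here (the bag-of-words representation, the nearest-neighbour scoring, the function names) is this editor’s illustrative assumption, not AQA’s method; real automated marking uses far richer natural language processing models trained on large sets of human-marked scripts.

```python
from collections import Counter
import math


def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0


def predict_mark(answer, marked_examples):
    """Toy nearest-neighbour marker: give the new answer the mark of the
    most similar already human-marked answer."""
    vec = Counter(answer.lower().split())
    best_text, best_mark = max(
        marked_examples,
        key=lambda ex: cosine(vec, Counter(ex[0].lower().split())),
    )
    return best_mark
```

For example, an answer about evaporation would inherit the mark of the most lexically similar exemplar in the training set, which is exactly why, as Ruth says next, such a system is only as good as the marked data it is trained on.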

Craig Barton: It’s incredible isn't it, because it still blows my mind whenever kids do multiple choice questions and you can essentially scan their answers and it marks them dead quick. That’s a bit mind-blowing for me, but you were literally talking about kids doing language exams, like the vocab bit of the exams, and the AI can – can it reliably do that?

Ruth Johnson: It’s only as reliable as the training that it gets. So its system’s only as good as the data it’s trained on, but one thing we are looking at is whether we could use that kind of technology to evaluate and monitor the reliability of marking. So that’s something that we’re looking into in the sort of medium term.

Craig Barton: What we’re going to do now: I’m going to chuck a few things at you that I’ve either heard about or that excite me a little bit, and if you know anything about them, or AQA’s doing anything with them, feel free to talk to me a bit about them. So the first is comparative judgement. What’s that and what’s involved with it?

Ben Stafford: So at the moment the way in which GCSEs and  A-levels are marked is a form of absolute judgement. So students’ work is looked at by an examiner and it is assigned a mark out of whatever the mark tariff is.  Comparative judgement instead offers the judge or the marker, if you like, two scripts from students selected randomly, and instead of assigning a mark to both, the examiner is simply asked which is the better response to the question that’s presented.  And if you do that enough times with enough judges, then you can form a fairly solid rank order of students from the best answer all the way down to the worst answer, and then you can make some decisions about where you might want to apply things like grade boundaries to those if you wish to do so.

Craig Barton: And this immediately sounds great to me, right, because I mean we’ve spoken in both series one and series two about just how hard it is to mark these exams. I mean I struggle with maths, like one and two mark questions. I mean give me an essay, I don’t have a flipping clue what’s going on. So, judging between two things, this sounds brilliant.

Ben Stafford: Another consideration really in the space of comparative judgement: if you’re considering something like a five mark response to an essay type question and you’re comparing two students, that can be done relatively quickly. If you’re comparing whole scripts, for example, a whole GCSE paper, then obviously how students have done on different questions might influence your judgement about which is better. So do you make that comparative judgement at paper level? Do you make it across maybe three papers, as in mathematics, or do you bring it back to a single item? And if you bring it back to a single item, then how do you aggregate across all the items, and all those different rank orders that you’ve generated, to produce one that corresponds to the student overall?
The concept of comparative judgement’s been around for about 100 years. A guy called Thurstone first thought it up, but it’s only really in its infancy at the moment. It’s only technological advances that are making it possible to start exploring it. It has other possibilities for us that are quite interesting.
So there are things in the space of maths. We make a commitment to try and have the first question on the paper be the easiest and the last question be the hardest, and at the moment we do that based on the experience of the senior examiner who’s writing the papers for us. We could put all those questions into comparative judgement software and ask the examiners to make a judgement, not on who’s written the best answer, because they wouldn't have any answers, but actually which question’s harder, and then that might give you, with enough judges, a way of creating a rank order that allows you to structure the paper in a fairer way.
So it’s interesting, it’s got lots of possibilities, but I think it’s one of those things that isn't quite ready for high stakes assessment yet.
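Ben’s description (many judges, many random pairings, a rank order falling out at the end) is usually operationalised with a pairwise-comparison model in the Thurstone/Bradley-Terry family. The following is a minimal editor’s sketch of that idea, not AQA’s software: the function names, the simple iterative update, and the small smoothing constant are all assumptions of this illustration.

```python
from collections import defaultdict


def rank_scripts(judgements, iterations=100):
    """Turn pairwise 'which is better' judgements into a rank order.

    judgements: list of (winner, loser) script IDs, one per judge decision.
    Uses a simple Bradley-Terry style update: each script's strength is its
    win count divided by its expected exposure against its opponents.
    """
    wins = defaultdict(int)
    pair_counts = defaultdict(int)
    scripts = set()
    for winner, loser in judgements:
        wins[winner] += 1
        pair_counts[frozenset((winner, loser))] += 1
        scripts.update((winner, loser))

    strength = {s: 1.0 for s in scripts}
    for _ in range(iterations):
        updated = {}
        for s in scripts:
            denom = 0.0
            for pair, n in pair_counts.items():
                if s in pair:
                    (other,) = pair - {s}
                    denom += n / (strength[s] + strength[other])
            # tiny floor stops a script that never won from zeroing out
            updated[s] = max(wins[s], 1e-6) / denom if denom else strength[s]
        total = sum(updated.values())
        strength = {s: v / total for s, v in updated.items()}

    return sorted(scripts, key=strength.get, reverse=True)
```

With enough judgements the estimated strengths stabilise into the "fairly solid rank order" Ben mentions, to which grade boundaries could then be applied.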

Craig Barton: That’s interesting. I mean my limited experience of comparative judgement has been to use it with kids in lessons to say let’s compare these two responses, which do you think’s the better response, and then once we’ve got that decision made, then we dig into the finer details of why. So certainly for a teaching aid I find it particularly useful.
Let me chuck another one into the mix. Adaptive testing, what’s that?

Ruth Johnson: So, adaptive testing is a test which adapts to the person taking it, so the question that you next get depends on how you do on the question in front of you. You do it on a screen and so it’s designed to adapt to your ability.

Craig Barton: So if you get a question right, the next one’s probably going to be harder than if you got it wrong.

Ruth Johnson: Yes exactly, or more nuanced than that. It might have something built into it so that if you get a question right on one particular concept, it will give you something which stretches that concept or is on a different concept, whereas if you got something wrong on a particular concept, it might stick with that concept and give you something easier, and eventually go on to a different concept.
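The core loop Ruth describes, move to a harder item after a correct answer and an easier one after a mistake, can be sketched crudely. Real adaptive engines select items using calibrated difficulty statistics and per-concept logic rather than a fixed step through a sorted bank; the function below is purely this editor’s illustration of the principle.

```python
def next_item(bank, current, was_correct, step=1):
    """Choose the next question index in a bank sorted easiest-to-hardest.

    Moves up the bank after a correct answer and down after an incorrect
    one, clamped to the ends of the bank. A deliberately crude stand-in
    for real adaptive testing engines.
    """
    move = step if was_correct else -step
    return max(0, min(len(bank) - 1, current + move))
```

So a student who keeps answering correctly drifts towards the hardest items, while one who struggles is held at, or moved back to, easier material, which is the "personalised testing" Ruth goes on to name.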

Craig Barton: This is interesting.

Ruth Johnson: So it’s personalised testing.

Craig Barton: Now I’ve seen similar things, adaptive learning platforms that do a similar thing, but I’ve never even considered this for high stakes testing. Has there been research done in this area, because this sounds absolutely incredible?

Ruth Johnson: I don’t know whether it’s being used in high stakes tests, but it is being used at large scale in some places. The country Georgia, for instance, is a really interesting one: they’ve put huge amounts of effort into building an effective testing system, and at primary level they do adaptive testing with every child in Georgia, to the degree that there are these tiny rural communities in the middle of nowhere, and they drive out with vans with transmitters on that enable them all to log into this adaptive testing platform, to get a national picture of maths ability. I think it’s maths.

Craig Barton: Now I want to pick up on something. Kids have been sitting exams in essentially the same way for hundreds and hundreds of years, but as you say, behind the scenes lots of innovations have gone on. So why don’t students sit exams on screens essentially? What are some of the pros and cons of that approach?

Ruth Johnson: The only real barrier to that is infrastructure. It’s having the solutions in place to enable it to happen at such a large scale in schools. There’s research that looks at whether, when you change the format, you’re changing the construct: are you assessing something different? And I think, yeah, to some degree you are, because writing is different from typing, and interacting with a screen is different from interacting with a piece of paper.

Craig Barton: Can I just ask, sorry to interrupt Ruth, is that subject specific? Because I would imagine in terms of maths it wouldn't have that much of an impact, but in longer form questions I imagine it would. Do you know the specifics?

Ruth Johnson: Weirdly, I think it might almost be the other way around, because with maths you get the diagrams and measurements and things like that, so how do you do that kind of thing on a screen? Actually you can do things quite innovatively and in quite different ways to measure those skills, but they’re quite different from how you do it with a pencil and paper and a ruler.

Craig Barton: Of course, of course.

Ruth Johnson: What wouldn't be ideal is if we had a mixed economy where you had people taking the same test on screen and on paper, because I think there are legitimate questions about whether that’s the same exam.

Ben Stafford: I think from my perspective, just some of the practicalities around doing the test on screen are really interesting. So, would we have a situation where students came into a room with banks and banks of school-owned computers that were all set up and, as Ruth touched on, would the infrastructure be there to support it? How would that technology work? Or would you say that in itself disadvantages students, because you’re placing them on a device that they might not be familiar with using?

Craig Barton: Of course, of course.

Ben Stafford: So do you want to consider the kind of bring your own device approach, whereby students bring their own laptops, tablets, whatever it might be, into the classroom? But then does that create a disparity on socio-economic grounds, between students who can afford the latest bits of tech and those who can’t? So pen and paper has its faults, and it does seem strange that we’re in a situation in the 21st century where students are asked to go and sit in an exam hall for so long, but at the same time I think high stakes assessment has to move at a pace whereby risk isn’t introduced into the system unnecessarily. And so it’s a bit chicken and egg. It’s that sense of: if all of the technology was invested in, if somebody swooped in with billions of dollars or pounds to fund a programme in schools whereby everything was absolutely locked down and solid, then the exams could be translated, as Ruth said, into something that’s offered on screen.

Ruth Johnson: But we’re a long way from that I think.

Ben Stafford: We’re a long way from that.

Ruth Johnson: I sometimes wonder whether assessment will just skip the on screen stage and go to something, this is maybe a bit too sci-fi, but something that’s more continuous and holistic, and there are all kinds of technologies being used in contexts around the world which allow for really different kinds of assessment. The Educational Testing Service in the States has something which I’ve heard them describe as a holodeck which, if you’ve ever watched Star Trek, you’ll know what that is. So they did an assessment where they were assessing army medics on their ability to triage, and these trainee medics are kitted out with sensors all over their bodies, the same kind of thing you would wear for motion capture in CGI, you know, like Andy Serkis doing Gollum?

Craig Barton: Yes.

Ruth Johnson: So sensors on the body. So they go into this holodeck with these sensors on. There are, presumably, projected casualties of war around them, and they’re assessed on what they do in that situation; their movements and their speech, all of that’s part of the assessment. And if you think about the things that are really hard to assess, which, because they’re hard to assess, we don’t really do so much, that kind of technology could really make them possible. So I’m thinking about things like practical science skills. I’m thinking about collaborative problem solving, those 21st century kinds of skills where what you want to do is see people working in a group and what contributions they make; it’s difficult to assess. You want to be able to do it, because it’s an important 21st century skill, but it’s difficult to do reliably. But with those kinds of technological solutions, you can look at how the group does as a whole and at the contribution that an individual makes to the group.

Craig Barton: I’m excited about this. Flipping heck.

Ruth Johnson: I think that’s a way off, but obviously [laughter]. We haven't even got computers for exams on screen yet.

Craig Barton: At the start I asked you; what are the kinds of things that aren't going to change at all? What are the things that are set in stone? My question here is, I’m picturing, when we come back for Inside Exams, season 27 or something like that, am I going to be interviewing a robot?

Ruth Johnson: I think that human judgement has to inform what the right things to assess are. What is it that we’re trying to do? Because the future of assessment is the future of education, isn’t it? The two things go hand in hand, and I don’t foresee a world where some algorithm determines what the curriculum is. I think where the human has to stay in place is in making those kinds of judgements, and judgements about what we need to do now to reflect the reality of the world that children are in.

Craig Barton: And would you agree with that, Ben? Is that the place for humans, or do you see us sneaking in anywhere else in the process?

Ben Stafford: I don’t know. It’s not a question really I’ve ever considered in that level of detail.

Craig Barton: You could be out of a job here, Ben. That’s what I’m worried about here [laughter].

Ben Stafford: I mean it feels like it’s a long way off. There is certainly something about the nature of the questions that we ask of students at the moment. If they were to carry on, effectively GCSEs will look like GCSEs because we’ll test the type of things we always test.  There’s an interesting question around how you might use something like AI to clone questions whereby you take a basic question and change some of the details, you know, the numbers in a triangle question, whatever it might be, to create a different question. But can you do that with the more innovative problem solving questions and will the technology get to a place whereby it can do that?   And actually if you get to the point whereby it can do that, do you still need the human element to come in and actually validate that that is a legitimate thing to be asking of students? And at what point do you apply some kind of sense checks around things like cultural sensitivities and issues around; is it appropriate to ask students, a 16 year old student, for example, some of the questions that might be generated by an algorithm?
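The simplest version of the cloning Ben describes, keeping a question’s structure fixed and varying the surface numbers, can be sketched with a template. This is an editor’s toy illustration only (the template, the placeholder names, and the uniform sampling are all assumptions); it deliberately sidesteps the harder problems Ben raises, such as cloning genuine problem-solving questions or sense-checking generated content.

```python
import random


def clone_question(template, ranges, rng=random):
    """Produce a surface variant of a templated question.

    template: question text with named placeholders, e.g. "{a}".
    ranges: per-placeholder (low, high) inclusive bounds for the numbers.
    Returns the rendered question and the values that were sampled.
    """
    values = {name: rng.randint(lo, hi) for name, (lo, hi) in ranges.items()}
    return template.format(**values), values


question, values = clone_question(
    "A right-angled triangle has legs of {a} cm and {b} cm. Find its area.",
    {"a": (3, 12), "b": (3, 12)},
)
```

Each call yields a structurally identical triangle question with fresh numbers; the human validation step Ben asks about would sit between generation and use.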

Craig Barton: Final question for both of you. In a completely utopian world, what tool would you like to invent or conceive of that would improve the assessment process?

Ruth Johnson: It sounds a bit Big Brother-ish, but I think for me assessment is something which has to be a true reflection of the child. So something that was able to capture what their strengths were in every aspect of life, and what you got at the end of it wasn’t a grade but a description. It was: this person is really good at doing these things, and not they’re an A or a B, sorry, a seven or a six. It would be that, I think.

Ben Stafford: When we set out to write our exams, we’re trying to make them as fair as possible, and fairness is about giving every student the opportunity to demonstrate what they know on the topic that they’re being assessed on.
And we’re good at that. Arguably we’re great at that. But we’re not perfect at it, and I think if there was more we could do to help us understand what potential biases we were introducing into our questions before we asked them of students, that would be a really interesting thing for us to understand and get better at.

Craig Barton: Amazing. Well, I for one am pleased you haven't been replaced by robots just yet, because I don’t think I would have had as interesting a conversation. That’s been absolutely fascinating. Thank you so much.  It’s interesting, isn't it? It sounds like the future might not be as tech fuelled as we might have imagined, certainly from the exam board’s point of view anyway. But what about you? Are you planning to embrace technology or are you more cautious?
Dimitrios Georgalis is a maths teacher at Leeds City College, and Darren Coogan is the curriculum lead for GCSE maths at Peterborough Regional College. I’m going to meet them to find out what they think the future has in store for us.

Now, let me come to you Dimitrios. When you first started teaching, what technologies were around?

Dimitrios G: We’re talking about a different education entirely. So chalk again –

Craig Barton: What year is this?

Dimitrios G: The first time was 2004, but I actually walked into a classroom in about 2008, and we’re talking about a different country. I started teaching back in Greece, and mainly I was teaching A-level equivalent, so again, not chalk but whiteboard.

Craig Barton: So your early teaching experiences were very similar to your experiences as a student yourself.

Dimitrios G: Yes, pretty much. I was taught that way, I learned that way, so I was teaching that way as well.

Craig Barton: Very interesting. Then how about yourself Darren, was it the same?

Darren Coogan: Well I came over and my first teaching job was teaching functional skills maths. Technology wise, 2013 was the first time I used a smart board, and as far as knowing what to do, I kind of had to find out through peer observation, stuff like that.  Online materials, I mean such as Triptico and things were some of the first online resources we were using, but now obviously we’ve come a long way in the last six or seven years.

Craig Barton: This is fascinating, because it’s very similar to my experiences as well. My schooling was not all that much different to my early experiences as a teacher. Same as you, Darren: smart boards had started to come in, but I was just using them as a glorified white board. I wasn’t doing anything that I couldn’t do with just a pen and a board, but, as you say, over the last few years it’s been incredible. So let’s talk about now then. What are some of the technologies you might use in your lessons now? Let’s start with you, Dimitrios.

Dimitrios G: As a college we have adopted the Google Suite, so we do have Google Chrome. In my classroom we have a Jamboard which is linked with Google Chrome, so I kind of lean in that direction. I base a lot of my teaching on the Google platform as well, plus Edge, GeoGebra, Desmos, so I can kind of visualise all the maths there.

Craig Barton: And have the students, have they got their own devices here?

Dimitrios G: Yes they do, all of them, they have a Chromebook.

Craig Barton: Wow. How about yourself Darren? What’s it like for you now as a maths teacher?

Darren Coogan: The last 12 months have just been a massive eye-opener for me. I’m blessed to have a team who are willing to take a punt, try new things. They like to do online training and stuff like that and they’re great for sharing resources.
We’ve self-funded a 3D printer. For scale we’ve got centimetre cubes that we’ve created, stuff like that, plan views, side views, elevations, things like that.

Craig Barton: Are these all from the 3D printer?

Darren Coogan: 3D printer, yeah.

Craig Barton: And what about some of the non-3D printer things, what other tech’s going on in your classroom?

Darren Coogan: Well, the college has now moved to the Google Suite too, same as Dimitrios and –

Craig Barton: So all kids have got their own devices?

Darren Coogan: Yeah. We’ve self-funded, so thanks to the government incentive of £500 per learner who achieves a grade four or a functional skills level two, we’ve managed to generate a significant amount of money that allowed us to purchase this and to try and just reinvest in the kids, because that’s the most important thing for us.
Google Classrooms is amazing for distance learning, especially for people who are unfortunately off maybe for long term illness, or for people who are just sick, they miss a lesson, you can just send them the Google code, and we’ve got our own code for the year and it works so well.
I mean, just think about it: how great is it to have everything at the touch of a button? When we were revising we were looking through books that are 500 pages. Now you just have to type in a few letters and it’s all there for them. The technology aspect, the kids are loving it, because you could spend all day fighting them, put your phone away, but why do it when you can say “okay, we’re going to recap last week’s lesson, today we’re going to use Quizzes”, which is one of the platforms we use as a starter activity, and you get a nice bit of competition. On the smart board it shows the league table. As they get them right it shows greens, reds etc. It’s really good. They can use their phones for this or they can use their Chromebooks.

Transum, that’s another really good one we use for starter activities. I mean, there’s just such a wealth of online resources. I just remembered something too: a gentleman who works in my department, he’s amazing, and he’s using VR. So he will actually hover the phone over some of these things, and you can see three dimensional stuff come up. It just freaked me out when I saw it.

Craig Barton: You’re joking.

Darren Coogan: Yeah it’s –

Craig Barton: What’s happening? The kids are hovering a phone over something, and like a shape’s appearing?

Darren Coogan: Yeah.

Craig Barton: And it seems to me it’s finding its way into every aspect of your lesson. Do you have similar positive experiences Dimitrios?

Dimitrios G: Yes. Technology, it is there to connect us. So, I’m going to give a few examples. I had revision sessions, and we do use Google Suite, we do have Hangouts. I let them carry on while I have to do something different. Snapchat: a student snapped a photo and sent it to me through Hangouts, what do I do here? I send it back straightaway with feedback. Yes, Google Classroom all the time. What else I’m using is that I’m trying to implement a little bit of English throughout maths. So I’m picking interesting maths topics about the history of maths, about who the [unintelligible 00:25:52] so where [unintelligible 00:25:53] was, and letting them search the internet or read something.
Like for Halloween, “The Raven” by Edgar Allan Poe: a Google Form, they read it, they hear it with a narrator, then they finish the Google Form based on the exam style questions that they will have for their English, and it updates to a Google Sheet which is linked to the English teacher who teaches them.

Craig Barton: Let me ask you this, Dimitrios. I can see two obvious problems with technology. One is the cost that Darren’s alluded to, and by that I mean cost in terms of money and expenditure, and we know budgets are tight and so on.
And the second is the cost in terms of the teachers upskilling themselves to be able to make use of this, because I remember the early days of interactive white boards, we had teachers who were flipping writing on them with the normal board pens, and we had teachers who, they would never turn them on all year until they were being observed by OFSTED, then all of a sudden it’s where’s the power switch for this? And it was an absolute disaster.
So you’ve got those two costs. But are there any other costs? Are there any downsides to this use of technology that you can see?

Dimitrios G: Don’t get me wrong, I’m a huge advocate of technology, but I will play devil’s advocate. I have to say that I’ve spent at least two months trying to upskill all of my students, because if I just throw the technology at them, they don’t know how to use it. It takes a considerable amount of time, a considerable amount of energy from me to convince them that yes, it can actually be a good thing. We can work with that.
What I’ve discovered is that we cannot just use any technology right now. We can only use aspects of technology that our students are already kind of familiar with.

Craig Barton: Let me ask you this Darren. How do you know it’s working? How do you know it’s effective?

Darren Coogan: FE is different. We’re like a different animal altogether compared to schools. We have regular phase tests, so every five or six weeks, based on the key areas.
So for example we’d start on number and then we’d go on to algebra, geometry, data, etc. These tests are all logged online, results and everything, so what we would do then is link that to what we were delivering in class, and if, for example, the algebra worked well, we’d know: okay, let’s keep that, or, if it didn’t, let’s scrap it. I mean everything’s data led, data driven, you know.

Craig Barton: I’ll tell you what interests me, and I’d be interested in your take on this. If we go back to our days when we were students ourselves, I was a pretty good student but I still found ways to mess about with even a calculator, like you could spell rude words on it by turning it upside down and all this, and that was – yeah [laughs], the classics never die out. But I’m thinking now, forget the calculator, let’s take this phone, which has obviously got a calculator built into it, but we can also access Desmos, we can access GeoGebra, we can do incredible calculations. We can use some of the 3D stuff that we talked about.
So we have this incredibly powerful learning device, but one which also has a major role outside of the classroom, in that kids’ social lives are on there. Do we have an equity issue here? Do we have the problem that some students simply don’t have access to this technology, and therefore they’re going to be at a major disadvantage?

Dimitrios G: We try not to, because as a college, a huge portion of our students come from the poorest 10% of postcodes in the UK. So there’s a lot of disadvantage there.

Using technology is really, really important in narrowing that gap. So that’s why we do have policies in place: yes, you might not be able to have a brand new phone, but you can go to the library and grab a Chromebook and work through that.

So we only require a little bit of extra time from them if they want to use technology and can’t have the technology of their own.

Craig Barton: Is it a similar thing for you? Do you see equity issues?

Darren Coogan: Absolutely. I mean we have to take it class by class. For example, we’re fortunate enough to have a Chromebook trolley, so if we knew certain learners didn’t have smartphones, we might pair up and do group activities, or just use Chromebooks per table. It definitely has to be done class by class. I do agree there.
We’re very similar geographically in where we’re located, and it’s not realistic to assume that every learner will have a high-tech phone that can access everything you need.

Craig Barton: Is it an issue outside of the classroom that when students go home they’ve got different access to different types of technology?

Darren Coogan: Oh absolutely. No two learners are the same; therefore we need to be able to accommodate revision materials, physical worksheets, books etc. On the first day that I meet my learners, I still take them for a walk round the college.
I bring them to the most amazing section in the library, the one that’s seldom visited until April, where it’s a little bit dusty and you’ve got wonderful maths books, and I say “pick this up”, and I make them take it and say “at the end of the day you can hand this back”. That’s fine, but I understand what you’re saying.
We did mention Google Classroom etc, where learners can access materials from anywhere in the world on any device, but unfortunately some people may not have that luxury, so they’re revising from the notebooks we give them, physical copy books, or worksheets or something like that. I do understand it’s something we just can’t fix.

Craig Barton: Let me go a bit controversial as we come to the end of this conversation. I remember a lesson where I had everything planned. I had the PowerPoint ready. I had these websites I was going to access. It was all set out. It was going to be the best lesson ever.
And I came in, and of course there was an issue with the board. The board wouldn't turn on, the computer wouldn’t log in, and I was lost. I did not have a flipping clue what to do.
Now, had that happened in the first couple of years of my teaching, it wouldn't have been an issue at all, because I wasn’t relying on that technology.
So is there an argument that as teachers we’re becoming overly dependent on this technology, and that actually it’s making us a bit worse at our profession? Or am I talking nonsense there?

Dimitrios G: We do need the teaching element. We cannot rely on technology; technology’s just a medium. The teaching is something different. You have to know how to use technology. Unless you can use it appropriately, it’s nothing. It’s nothing.

Craig Barton: I see, so it definitely improves your teaching, but it’s not the key thing.

Dimitrios G: It makes it interactive. To expand a little bit, I was working for a while in a different school, on supply again.
One of the students there was newly arrived from a different country. He couldn’t speak the language at all. They had a laptop there, I’m not sure if it was Google Translate or something similar, and a member of staff would write, the laptop would translate, and the student would write back, so it was a form of communication. It was brilliant. It was an eye-opener; I’d never thought about it. So yes, technology can be so important, and it can be used effectively for those who are not lucky enough to have all that technology in their lives, and we can show them how to use it really well. But we must not forget that we are teachers. Above all, we teach them.

Craig Barton: Well Darren and Dimitrios, this has been an absolute pleasure. Thanks so much.
Well, that’s it for series two of Inside Exams. But remember, there’s a whole back catalogue of episodes for you to return to whenever you or your colleagues have questions about the exam process. Ruth talked about the importance of valid assessments. There’s a whole episode dedicated just to that if you head back to your podcast feed.
If you want to find out even more about exams, assessment and how to take what you’ve learnt back to the staffroom or classroom, look out for AQA’s new Inside Exams live, coming in spring 2020, including more downloads, video resources and online events to share with your department.
The series might be over, but you can continue the conversation and ask your own questions on Twitter using hashtag Inside Exams. It has been a real pleasure meeting you and putting your questions to AQA.