The Place of AI in Education
Published: Tuesday 12 Aug 2025
Author: Chinwe Njoku, Education Insights Lead, AQA
AI in education presents both opportunities and challenges. While it can enhance learning and reduce teacher workload, it also risks diminishing core skills like memory and critical thinking. Teaching students to use AI effectively is essential, but must be balanced with safeguarding, ethical use, and fair assessment practices. The education system must evolve to integrate AI thoughtfully, ensuring young people are equipped for an AI-enabled future without compromising foundational knowledge or equity.

Most people have some sort of gut feeling about AI in schools. Some are inclined to give students unfettered access to help ready them for the world of work, whereas others are far more cautious about the potential negative side effects of AI exposure. Similar debates play out across the sector and in the wider media.
Whilst these initial debates have been helpful in the abstract, at AQA, we think more can now be done to examine the specific pros and cons. The conversation is still too vague in a world where schools and colleges are facing specific questions about whether or not to integrate AI into their pedagogy and practice.
To truly understand the future of AI in education, we need to start breaking down the overall debate into more tangible chunks. To do that, we believe there are four key areas that deserve closer scrutiny:
- The need for education on AI – what does it actually mean to educate a young person about AI? Can it be done? If it can be, is it helpful to the young person to do so?
- If it is helpful to teach young people about AI, what aspects should they be taught about?
- If children are to be taught about AI, what would it look like for them to be assessed on their use and understanding of it?
- Finally, if AI is to be taught and/or assessed, what safeguards are needed so young people can use it safely? And how can we prevent any adverse cognitive impact on them?
Our plan is to briefly introduce each of these topics in this overview, before following up with detailed articles on each area, authored by thought leaders from across the sector. If you are interested in getting involved, please email policy@aqa.org.uk.
The Need for Education
Some people might say, “We don’t need to teach kids ‘X’ anymore; they’ll have AI for that! So, just teach them how to use AI.” Yet others will insist that children need a strong base of domain knowledge and expertise to, for example, tell whether AI responses are truthful.
Search engines and online encyclopaedias revolutionised the availability of resources, and AI is poised to do the same and more. But changing how things are done doesn’t mean that everything that came before becomes irrelevant.
When new technologies enter education, the system and its users adapt, often changing the mechanics of how teaching and learning are done. For example, Google Search and Wikipedia made library reference services and collections largely redundant. And satellite navigation tools like Google Maps and Waze have made the road atlas close to obsolete as well.
But these advancements haven’t made personal knowledge unnecessary or solved all educational challenges, and they have brought problems of their own: losing the art of independently navigating simple, short journeys, for example, or some people struggling to tell the time on analogue clocks thanks to digital displays.
Even with the introduction of AI, people will still need to accurately synthesise and use information from a variety of sources, which is crucial for distinguishing ungrounded claims from evidence-based information. And research has found that users’ memory retention, information literacy, mental arithmetic and number sense are being harmed by over-reliance on these technologies.
The reality is therefore nuanced: the education sector needs to balance the potential that new technology brings with the understanding that core knowledge and skills will not suddenly become obsolete overnight.
Learning to Use AI
A Public First survey showed that 67% of people believe it is more important for school children to learn to do things without the help of AI than to learn to use AI tools. However, 48% also agreed that children will need to use AI tools for the rest of their careers.
It is still unclear what it looks like for children to be taught to use AI well. Much of the academic research focuses on the benefits of AI for educators rather than for children, so more research is sorely needed.
The Government has noted that AI is a force for change that will impact all areas of society, and the Department for Science, Innovation and Technology (DSIT), working with Skills England, is keen to embed digital and AI skills and to expand education pathways into AI. One of the key next steps from the Curriculum and Assessment Review (CAR) interim report is considering “how best to equip children and young people with the essential knowledge and skills which will enable them to adapt and thrive in a rapidly changing and AI-enabled world.”
We need to be practical and exact on what teaching children to use AI would look like in our classrooms. Should we be teaching children how to write good prompts? Or how to fact check an answer? Or how to decide which AI tool to use for different purposes?
If you’re old enough, you might recall lessons for primary school children on which search engines to use. These seemed like a good idea at the time, but rapidly became pointless as everyone moved to using Google. Are we in danger of making the same mistake again?

AI in Assessments
AI has helped reduce teacher workload in formative assessment through automated marking and feedback, but it is not yet consistently as robust as the student-specific feedback a teacher can give.
The use of AI for summative assessment must be carefully managed to avoid widening the disadvantage gaps across certain pupil groups.
Given the claims of students ‘cheating’ on their essays and coursework by using AI, institutions like universities are having to adapt their assessment strategies to manage this. The CAR interim report mentions the “risks concerning AI in relation to coursework assessments, and consequential risks to standards and fairness.”
But how can we adequately prepare students for the most probable version of the world they’re going to live in without compromising the standards and fairness of assessments?
Some have suggested that schools allow students to write with AI but assess them on how well they are able to critique, analyse, debate, and defend what they’ve written without relying on AI. In a world where AI will be ubiquitous, but we want young people to develop knowledge and skills of their own, we will need to find a balance.
Safeguarding Children Using AI
As more and more young people use AI, potential safeguarding issues arise. Children must be careful with their personal data, but they may overlook this because AI can sometimes feel like a friend. Children also need to learn to use AI safely because, even if schools don’t allow AI in lessons, nothing stops them from using it outside school on their own devices.
We must facilitate safe AI use through ethical and safeguarding frameworks, principles and guidelines for all our schools, without creating unnecessary workload, widening the digital divide or disadvantage gap, or increasing loneliness.
In the same vein, we must pay attention to the potential decline in learning skills from over-reliance on AI, using it as a substitute for thinking rather than as a supplementary tool for co-creation. We must balance the cognitive cost against the opportunities for enhanced learning, to maintain an overall net positive impact on children.
We cannot shy away from AI and the extent to which it has already affected almost every aspect of people’s lives. The UK is the third largest AI market, and if it is to keep that position rather than fall behind the rest of the world as the technology advances, our education system has a crucial role to play.
We invite you to join us in moving the conversation forward as we explore these key themes in forthcoming articles.