Building mental scaffolding: Why AI makes subject knowledge more critical, not less
The debate around artificial intelligence in education is at a tipping point. As AI tools become ubiquitous in classrooms, we have to answer a crucial question: if students can access any information instantly, does deep subject knowledge still matter?
Recent research from the International Education group at Cambridge University Press & Assessment (Cambridge) captured the views of 3,021 teachers and 3,840 students across 150 countries, including New Zealand. The findings reveal some stark paradoxes:
- While 25% of students identify finding and understanding information as technology's greatest benefit for their future, only 45% feel well prepared for life after education.
- 81% of teachers view subject knowledge as critical for students' next educational step, yet this drops dramatically to just 37% when considering life after formal education.
These disconnects suggest that we are confusing information retrieval with genuine learning, and that we fundamentally misunderstand how knowledge works in an AI-augmented world.
A student from India expressed the prevailing sentiment: "As AI gets more popular and more widely available, the need to memorise subject knowledge becomes less important as we can find and implement subject knowledge easily with the help of AI." This view represents a dangerous oversimplification of how learning actually works.
The neuroscience is clear. Information becomes knowledge only when processed and organised mentally, enabling advanced cognitive skills such as clear communication, critical thinking, and problem-solving. Our working memory must process new information while simultaneously engaging with contextual details. Without prior subject knowledge stored in long-term memory to support this process, the demands on working memory become overwhelming, hindering effective learning.
The research reveals concerning patterns in how students approach AI. One student from the United Arab Emirates noted: "I'm starting to see that many students would rather go to ChatGPT whenever they need the slightest help, instead of using their brainpower. It's a way to get an easier answer." A school leader from Oman observed: "If students produce a PowerPoint on a topic, AI can do in 30 seconds what in the past would have taken students three hours to do. This can impact how deep and profound their knowledge is."
But the issue isn't technology itself. Teachers recognise its value, with 29% identifying skill development as technology's greatest benefit and 73% acknowledging it supports skill development overall. Still, a third of teachers (34%) select over-reliance on technology as the greatest challenge it poses in preparing students for the future.
The solution lies in reframing our relationship with both AI and subject knowledge. We must move away from viewing subject knowledge as mere information and instead position it as the foundation for developing skills. You cannot have one without the other. A surgeon cannot develop operating skills without deep knowledge of human anatomy. An AI tool can provide anatomical diagrams instantly, but it cannot replace the mental scaffolding – and the work of creating it – that transforms information into actionable expertise.
As part of that relational reframing, Cambridge's support and training for schools now includes guidance on AI in the classroom. The framework has been developed to protect teachers' rights, give students the ability to act and make choices independently, and promote trustworthy and environmentally sustainable AI for education. It outlines 15 competencies to support the use of AI in a safe, effective, and ethical way.
All of this is designed to strengthen how teachers teach and how students learn, so that AI serves as a tool for developing capability. This becomes particularly critical when considering misinformation. New technologies provide immediate access to data and interpretation, but the challenges of misinformation and disinformation highlight the urgent need for a sound subject knowledge base to contextualise and assess information critically and effectively, so we can reject what is inaccurate.
The research identifies self-management as equally crucial. Almost a quarter of teachers (23%) identify self-management skills as the most difficult to teach, while 19% of students find it the most difficult to learn. In an era of instant answers, the ability to manage one's learning, assess sources critically, and apply information thoughtfully has never been more vital.
Looking at New Zealand specifically, are we preparing students to use AI as a tool or as a crutch? The distinction matters enormously. Tools amplify capability. Crutches replace it.
The path forward requires several shifts:
- First, we must explicitly help students recognise the skills they are developing alongside their subject knowledge. Currently, less than half of students (48%) feel well prepared for their next educational step. This confidence gap suggests students don't fully recognise the capabilities they are building.
- Second, we need to integrate AI literacy into curricula, not as a separate subject but as part of developing critical thinking skills. Students must learn to interrogate AI outputs, understand their limitations, and recognise when human expertise is essential.
- Third, educators need support in balancing technology integration with deep learning. The 72% of teachers who identify engagement and activity as a benefit of technology should be empowered to use these tools without sacrificing the depth that builds genuine understanding.
The evidence is unequivocal: we are at a crossroads in education. The choice isn't between technology and traditional learning. It's between using technology to enhance deep understanding or allowing it to create the illusion of knowledge without substance.
As one student from Argentina wisely said, "Nobody can take away your knowledge. You could lose a lot of things, but if you have your knowledge and have really good values, you can go very far in life."
In New Zealand, as elsewhere, the quality of our educational response to AI will determine whether we raise a generation equipped to thrive in complexity or one dependent on tools they don't fully understand. The stakes couldn't be higher, and the time to act is now.