Over the last decade and a half, we’ve seen significant shifts in curriculum development, particularly influenced by figures such as Nick Gibb and Michael Gove. Their reforms placed considerable emphasis on prescribed curricula, an emphasis which, as many have noted, has significantly diminished teacher autonomy. Now, with artificial intelligence (AI), are we seeing an even bigger threat to teacher autonomy? Should we be concerned that AI will obliterate what’s left of it?
The Promise and Peril of AI
AI is often seen as a panacea for the myriad issues plaguing our education system—teachers are time-poor, overworked, and underpaid, and AI promises to help alleviate these burdens. Yet this perceived saviour comes with substantial risks. At their core, AI systems are inherently biased, reflecting the prejudices and imperfections of their human creators. They are also driven by algorithms that, while powerful, lack the nuanced understanding and pedagogical insight that experienced educators bring to the classroom. PCK, anyone (let alone TPACK)?
The danger lies in AI’s potential to become the ultimate tool for prescribed curricula. When educational content and methods are dictated by AI, do we lose the ability to shape and adapt teaching to meet the diverse needs of our students? Could this loss of autonomy lead to a homogenised education system, where the rich, context-sensitive craft of teaching is reduced to algorithmic outputs?
In my experience, teachers play a crucial role in interpreting and applying curriculum guidelines, infusing their lessons with insights from cognitive science, pedagogical best practices, and a deep understanding of their students’ needs. This human touch ensures that education is not just about imparting knowledge but about inspiring and engaging learners. AI, no matter how advanced, cannot replicate this nuanced approach. We hear so often that teaching is the job that makes all other jobs; how can we do that if AI is driving the car, so to speak?
Challenges in Implementation
As I’ve seen time and time again in my own engagement with AI systems, even those I’ve worked hard to nuance (can you PLEASE write in British English??!), AI-generated content can perpetuate and even exacerbate existing biases (learning styles, anyone?!). As much research has noted, technology must be integrated thoughtfully and purposefully, with a strong pedagogical foundation, if it is truly to enhance learning outcomes. The risk is that, without adequate oversight, planning, support and leadership, AI tools could entrench a ‘one-size-fits-all’ model of education, undermining the principles of differentiated instruction and adaptive learning.
Moreover, the implementation of AI in education is fraught with challenges. Successful integration of technology requires careful planning, continuous professional development, and a robust support infrastructure. These elements are often lacking in schools already stretched thin by limited resources and high demands. Teachers must be equipped with the skills to critically assess and effectively use AI tools, ensuring that these technologies complement rather than replace their expertise.
The Education Endowment Foundation (EEF) emphasise that the benefits of technology in education are maximised when it’s used to support sound pedagogical practices. However, if AI is deployed primarily as a cost-cutting measure or to streamline administrative tasks without considering its impact on teaching and learning, it could do more harm than good. The EEF’s guidance suggests that technology should enhance the quality of explanations and feedback, and improve student engagement and practice, but it is important to note that this potential can only be realised when teachers retain control over how these tools are used.
In summary
Whilst AI holds promise for transforming education (and I’m positive that it IS part of the solution), it must be approached with caution. Influencers often cite Boston Consulting Group research reporting a “40% increase in productivity” for people who use AI compared with those who don’t. Wow! Amazing!
What those influencers rarely go on to share is that the same research found participants were “19% more likely to produce incorrect solutions in such scenarios”. Of course, that study isn’t about education, but it helps us reflect on how we might implement AI in education.
Firstly, CPD and time to learn how to use AI effectively are important; secondly, cognisance of what works, and of when to use AI, is equally important. For leadership, this tells us, and it is reinforced by the EEF guidance noted above and other sources, that we must plan so that AI enhances rather than diminishes the human element. In that way, we can use AI for good, not as a panacea for the many problems in education.
The allure of AI as a quick fix for educational challenges must not overshadow the fundamental need for teacher autonomy and professional judgment. Ensuring that teachers remain at the heart of what happens in their classrooms is crucial for maintaining the integrity and quality of education in all settings. AI should serve as a tool to empower teachers, not replace them, preserving the essential human element that makes education truly effective.