We're having the wrong conversation about artificial intelligence in education. Not because the topic lacks importance, but because we've constructed an elaborate theatre of opposition against positions that barely exist. We rally against extremists who aren't there, demolish arguments no one makes, and meanwhile the actual transformation slips past unexamined.
Nobody is saying AI should take over education. Nobody credible argues that children should abandon foundational knowledge. No serious educator suggests that every higher education course must incorporate AI, nor does anyone claim these systems conjure genuinely novel solutions from thin air, or that they possess some sort of sentience and are about to take over. Even the most ardent tech evangelists acknowledge the environmental costs of continued AI development. So why do we keep responding as if people were making these claims?
The answer reveals something uncomfortable about how we process technological change. We prefer fighting caricatures because they're easier to defeat than wrestling with ambiguity. When faced with a technology that's neither saviour nor destroyer, but rather a complex tool with profound implications, we retreat into comfortable binaries. The debate becomes a kind of shadow boxing, where we exhaust ourselves battling phantoms whilst the real changes happen quietly, incrementally, and largely unexamined.
Consider what's actually happening right now. Teachers aren't being replaced; they're finding new ways to ensure learning occurs. When a student memorises historical dates, they're not just storing information but building neural pathways that support deeper understanding. When they write by hand, they're not just producing text but engaging motor memory that reinforces conceptual learning. These aren't outdated practices to be discarded but foundations that support more complex thinking.
The question isn't whether to use AI but how to use it whilst preserving what makes learning transformative. Friction matters. The struggle to understand, the momentary confusion before clarity, the satisfaction of hard-won comprehension: these aren't obstacles to efficiency but the very mechanisms through which deep learning occurs. Some subjects might integrate AI into their outputs, using it as a collaborative tool for creation. Others might exclude it entirely, preserving spaces where students must rely solely on their own cognitive resources. Both choices can be valid when grounded in pedagogical reasoning rather than technological determinism.
Assessment, too, demands nuance. It can be dialogic, unfolding through conversation between teacher and student. It can be process-focused, valuing the journey of thought as much as the destination. It can happen continuously, woven into the fabric of learning rather than separated as discrete events. The goal isn't to catch students using AI but to design assessments that reveal genuine understanding regardless of the tools employed.
What matters is choice, but choice informed by awareness. Not every educator needs to incorporate AI, but all need to understand its capabilities and its impact on their practice. A literature professor might reasonably decide that close reading requires unmediated engagement with text. A data science instructor might equally reasonably make AI central to their curriculum. Neither is wrong, provided their decision emerges from careful consideration of their students' needs rather than reflexive technophobia or technophilia.
This awareness extends to recognising what different students need at different moments. For some, AI might provide scaffolding that makes previously inaccessible concepts reachable. For others, it might short-circuit necessary cognitive development. The same student might benefit from AI assistance in one context whilst needing to work without it in another. These aren't contradictions but recognitions of learning's complexity.
The environmental concerns, too, require a nuanced response. We can acknowledge the carbon cost whilst working to minimise unnecessary use: choosing human over chatbot interaction where possible, favouring local processing over cloud computation when feasible, and pushing for renewable energy to power data centres. Perfect solutions don't exist, but thoughtful compromises do.
Building these approaches takes time and collective wisdom. We're not racing against some imaginary deadline where AI either saves or destroys education. We're engaged in the work of adaptation, learning from early experiments, sharing what works, acknowledging what doesn't. This work happens best where educators support each other in developing contextually appropriate responses.
The shift from phantom debates to genuine engagement requires empathy: for students navigating unprecedented technological change, for educators balancing innovation with preservation, for institutions trying to prepare students for unknowable futures. It requires logic that acknowledges both possibilities and constraints, that recognises resource limitations whilst imagining new pedagogies.
Most importantly, it requires keeping students at the centre. Not abstract future workers or theoretical digital natives, but actual humans in our classrooms with their particular needs, struggles, and potentials. Some will thrive with AI assistance; others will flourish through more traditional methods. Many will need both, deployed thoughtfully at different stages of their learning journey.
The conversation we need isn't about whether AI belongs in education but about how we preserve and enhance what makes learning meaningful in a changing world. It's about maintaining the productive struggle that builds understanding whilst removing unnecessary barriers. It's about preparing students not just to use AI but to think critically about its use, to maintain their own cognitive capabilities whilst leveraging new tools.
This isn't a battle between human and artificial intelligence but a careful negotiation of their relationship. In that negotiation, we have choices, but they're not the stark binaries of our phantom debates. They're the nuanced decisions of educators who understand that learning is too complex, too human, too important to be reduced to simple formulas. The future of education with AI will be built through thousands of these decisions, each grounded in awareness, guided by empathy and focused relentlessly on what our students need to flourish.