My 17-year-old handed in an essay last month that was, frankly, suspiciously good. Not just well-structured. It had a fluency and polish that made me raise an eyebrow. He was upfront about it: he’d used AI to help draft it, then edited it himself. Was that cheating? Was it learning? Was it just smart use of available tools? I genuinely wasn’t sure how to feel about it, and I suspect a lot of parents reading this are in exactly the same position.
Then there’s my 13-year-old, who last week asked Alexa a maths question instead of working it out on paper. Now, I’m not about to pretend I’ve never Googled something I probably should have known, but it did make me stop and think. Is there a line between using technology sensibly and using it as a substitute for actually engaging your brain? And if there is, are we helping our kids find it?
These aren’t abstract questions anymore. According to National Literacy Trust research covering over 15,000 young people in the UK, usage of generative AI among 13 to 18-year-olds shot up from 37% in 2023 to 77% in 2024, before settling back to around 67% in 2025. The tools are everywhere, the kids are using them, and most parents are still working out what that actually means. So let’s try to do that properly.
The Numbers Are Striking, and a Bit Alarming
Here is the headline stat that stopped me cold: 1 in 4 secondary-school pupils now admit to simply copying AI output when using it for homework. That’s up from roughly 1 in 5 just a year before. And nearly 1 in 5 say they use AI to do all their schoolwork. Not help with it. Do it.
To be fair, the picture is more nuanced than that. Around half of young AI users say they add their own thinking to what the tool produces, and about 40% say they check outputs for errors. That’s actually more critical thinking than I’d have expected from teenagers. But the direction of travel on the “just copy it” cohort is the wrong one, and that matters.
The concern from researchers isn’t really about cheating in the traditional sense. A Stanford University study found that the percentage of students admitting to cheating has stayed broadly flat since ChatGPT arrived, somewhere between 60% and 70%, which is its own sobering number but suggests AI hasn’t dramatically changed behaviour on that front. The deeper worry is something called cognitive offloading, which is essentially the brain’s equivalent of letting someone else carry your bags. If you never carry the bags yourself, your arms don’t get stronger. If you never do the intellectual heavy lifting yourself, the same logic applies.
A randomised study of nearly a thousand students, cited by the Social Market Foundation, found something particularly telling. Students who used ChatGPT appeared to outperform their peers during regular lessons, but when access was removed for the final exam, they scored 17% lower than those who hadn’t used it. The tool had become a crutch, and when the crutch was taken away, the leg hadn’t actually healed. Younger pupils and lower-attaining students appear most at risk of this effect, which feels important given that those are often the children who most need genuine academic development.
But There’s a Genuine Upside Too
I want to be straight with you here, because this isn’t a scare piece. If you came here expecting me to tell you AI is rotting your child’s brain, I’m going to disappoint you, because the evidence is more complicated than that.
Roughly half of young people in the same National Literacy Trust research said AI helped them generate ideas, understand concepts, or learn new things. A Harvard University physics study from 2025 found students using AI tutors learned more than twice as much in less time compared to traditional active-learning classrooms. Now, that was university-level physics, not Year 8 geography, so I’d be cautious about applying it too broadly. But it’s not nothing.
What’s also interesting is that AI appears to amplify existing engagement rather than replace it. Young people who already enjoyed writing were twice as likely to use AI to help with story elements like plot and character, compared to those who weren’t already interested. The kids who love learning seem to be using AI to learn more. The kids who don’t want to engage seem to be using it to avoid engaging. Which means the technology itself might not be the variable here. It might just be a mirror.
There’s also evidence that AI is helping teachers, which matters for your child’s education even if they never touch the tools themselves. An Education Endowment Foundation trial across 68 schools found that using ChatGPT with proper guidance cut lesson-planning time by 31%, saving teachers roughly 25 minutes a week. That’s time they can put back into actually being in the room with students.
What Schools Are (and Aren’t) Doing
This is where things get a bit murky, and I’ll be honest about the gaps in what we know. As of 2025, there is no single formal Department for Education framework for AI in schools in England. The DfE has published guidance encouraging schools to approach AI cautiously and ethically, with emphasis on data protection and safeguarding, and in June 2025 released a broader package of resources for schools and colleges. But what that looks like in practice varies enormously from one school to another.
Some schools have banned AI outright for homework. Others have embraced it as a literacy tool. Most are somewhere in the middle, trying to work it out as they go. Which means, as parents, we can’t assume the school has this covered. Some do, some don’t, and a lot are doing their best with incomplete guidance.
What does seem clear from the research is that context and guardrails matter enormously. The same randomised study that showed ChatGPT harming exam results also found that when students used a purpose-built AI tutoring tool with built-in safeguards for learning, the negative effects were largely eliminated. The difference wasn’t the technology. It was the design and how it was used.
Comparison: AI Tools and Their Suitability for Student Use
| Tool | Best Use Case | Learning Safeguards | Risk of Passive Use | Cost |
|---|---|---|---|---|
| ChatGPT (free tier) | General Q&A, drafting, explaining concepts | Low (will write essays on demand) | High | Free |
| Khan Academy Khanmigo | Maths, science tutoring | High (Socratic method, won’t give answers outright) | Low | Paid add-on (Khan Academy) |
| Microsoft Copilot (school edition) | Research assistance, writing support | Medium (some prompting controls for education) | Medium | Via school licences |
| Google Gemini | General questions, creative work | Low (no education-specific limits) | High | Free |
| Revision by Seneca | UK curriculum revision | High (built around retrieval practice) | Low | Free/Premium |
Hype Cycle Check
LIKELY TO LAST: AI as a legitimate learning tool, when used well, is genuinely here to stay. Purpose-built tutoring tools that use Socratic questioning rather than just delivering answers have real, evidenced value. The question is whether schools and parents can steer kids towards those rather than the path of least resistance.
WATCH CLOSELY: The cognitive offloading risk is real but the research is still young. Most studies are short-term and small in scale. We don’t yet know what five or ten years of AI-assisted education does to critical thinking and independent problem-solving. This one deserves serious ongoing attention from researchers and policymakers alike.
OVERHYPED: The idea that AI will “revolutionise personalised learning” for every child within a few years is almost certainly overstated. The gap between a polished demo of an AI tutor and what actually lands in a mixed-ability classroom in Hampshire is enormous. Don’t assume the schools will figure it out quickly or consistently.
What This Means for CES 2027
CES 2026 already saw a significant wave of AI-in-education hardware, from smart revision tablets to AI-assisted reading companions for younger children. By CES 2027, I’d expect this to have consolidated into two camps: premium AI tutoring ecosystems aimed at families who can afford subscriptions, and budget devices aimed at the school market. The policy conversation around AI in UK schools will have moved significantly by then, and I’d bet we’ll see the first purpose-built parental control layers specifically for academic AI use. Watch for that space. It’s going to get interesting.
What to Watch
UK school AI policy. The DfE’s June 2025 guidance is a start, but it’s non-statutory. Whether schools receive clearer, enforceable frameworks in the next 12 months will shape a generation’s relationship with these tools.
Long-term cognitive studies. We genuinely don’t have the data yet on what sustained AI use does to developing brains. The next round of major studies, expected over the next two to three years, will either confirm the cognitive offloading concern or reframe it significantly.
Purpose-built education AI. Tools designed with learning outcomes baked in, rather than general chatbots repurposed for homework, are likely to be the real game-changer. Watch the EdTech space for serious investment here.
Exam board responses. UK exam boards have been cautious about addressing AI directly in assessment design. How GCSEs and A-levels adapt to a world where AI can pass them easily will be one of the defining education policy debates of the next few years.
I don’t think AI is making our kids lazy. I think it’s giving lazy tendencies somewhere easy to go, and rewarding active, engaged learners in ways we haven’t seen before. The gap between those two outcomes might largely come down to us, at home, having honest conversations about what learning actually is and why it matters. Not a comfortable conclusion, but probably the right one.
If you want more honest, no-nonsense takes on tech and family life, come and join the Tech Dads Life newsletter. It’s where I share the stuff that doesn’t quite fit in the articles, the early findings, the half-formed opinions, and the occasional bit of kit that’s genuinely worth your money.

