
The Skills I'm Teaching My Kids About Technology


I’ve been thinking about this a lot lately. My 13-year-old came home from school a few weeks ago convinced that something he’d seen online was completely true. He’d read it, shared it with his mates, and was ready to argue the case. The source? Some bloke on social media with a catchy graphic and a confident caption. No name, no context, no link to anything verifiable. Just vibes and a lot of engagement. We had a good chat about it over dinner, as we do most things in this house, and it got me thinking: what are we actually teaching kids about technology? Not how to use it. They’ve got that covered. But how to think about it.

Because here’s the thing. According to Ofcom’s 2025 data, 60% of UK teenagers encountered potentially harmful content online in a single month. That includes fake news, violent material, and extremist content. And a UK Parliament report found that media and information literacy in English secondary schools is “patchy at best, and almost non-existent within primary schools.” So the schools, through no real fault of their own, are not keeping up. That means it falls to parents. Which, honestly, is fine by me. I’d rather be in that conversation than leave it to the algorithm.

This isn’t an article about coding. I’m not going to tell you that every kid needs to learn Python. That might come later, and there are decent resources out there for it. But right now, I want to talk about the foundational stuff, the skills I genuinely think every kid needs before they’re let loose on the internet unsupervised.

Teaching Scepticism Without Cynicism

The hardest skill to teach, and probably the most important, is how to question what you see online without becoming the sort of person who trusts nothing and believes in nothing. There’s a meaningful difference between healthy scepticism and paranoid cynicism, and I want my kids to land on the right side of it.

After the Southport riots in August 2024, false information about the attacker’s identity spread rapidly across social media. Within hours, it had already done its damage. Real-world riots erupted in more than two dozen UK cities, fuelled by fabricated content that many people shared without questioning. That’s not ancient history. That happened less than two years ago. When I talk to my kids about misinformation, I use real examples like that, because abstract warnings about “fake news” wash over teenagers pretty quickly.

The three questions I’ve drilled into mine are simple. Who is saying this? What do they gain from me believing it? And can I find this confirmed somewhere that isn’t connected to the original source? Not every piece of content needs a full investigation, but the habit of pausing before sharing is, I think, one of the most valuable things I can give them. It also helps that I’m pretty clear about the fact that adults are terrible at this too. Almost half of UK adults using social media say they’ve encountered misleading or untrue news stories. Scepticism isn’t a skill for kids. It’s a skill for life.

Screen Time: Rules That Actually Make Sense

The UK government published its first official screen time guidance in March 2026, and honestly, it landed the way I expected: broadly sensible, slightly cautious, and immediately controversial in certain parenting circles. The headline for younger children is no more than one hour of screens per day for two- to five-year-olds, and no screens in the hour before bed. For babies under two, the recommendation is to avoid screens except for shared activities that encourage bonding, interaction and conversation.

It’s worth noting that the guidance also makes clear these time limits shouldn’t apply in the same way to screen-based assistive technologies used to support children with special educational needs and disabilities. And the Royal College of Paediatrics and Child Health takes a slightly different approach, suggesting families use common sense and negotiate screen time limits based on the needs of each individual child rather than sticking to rigid numbers. So there’s an ongoing conversation here, and the “right” answer probably depends on your family.

My kids are well past those ages now, so the specific numbers don’t apply directly to our house anymore. But the principles underneath the guidance are worth thinking about regardless of age. The research shows that co-viewing (watching together and talking about what’s on screen) produces better outcomes than kids watching alone. We’ve always done dinner at the table, phones away, and I think that matters more than the total number of minutes logged.

With 97% of 13- to 15-year-olds owning their own mobile phone, the idea of banning screens entirely is a fantasy. Nearly 25% of children and young people use their smartphones in a way that’s consistent with a behavioural addiction, according to the research, and that’s a genuinely alarming figure. What I’ve found more useful than hard limits is helping my kids understand why certain apps are designed the way they are. Infinite scroll. No natural stopping points. Notifications timed to pull you back in. These aren’t accidents. When you understand the mechanism, you start to see through it.

AI Literacy: The New Essential

Half of children aged 8 to 17 in the UK say they’ve used AI, according to Ofcom’s 2025 figures, up from 46% the previous year. Among 13- to 15-year-olds, the increase is even more pronounced. My eldest uses it regularly, and my 13-year-old has started experimenting with it for school work. I’m not against that. I use AI tools myself, pretty much every day. I run local AI setups at home, so I’ve spent enough time with these models to know both what they’re good at and where they fall apart.

But what I do insist on is understanding what it actually is. AI doesn’t know things the way you know things. It predicts. It generates. It can be confidently, fluently wrong. If you ask a language model a question and it answers without hesitation, that confidence tells you nothing about accuracy. That’s a hard thing for adults to internalise, let alone teenagers who are used to search engines at least pointing at real sources.

What I teach: treat AI output like a first draft from a smart but unreliable assistant. Useful as a starting point. Never the final word. Always verify anything important. And be careful what you put into it. Data in, data processed. Some AI tools learn from input, and there’s no need to hand over personal details, school names, or anything sensitive for a homework summary.

Digital Footprints and the Long Game

This one gets less airtime in school, in my experience, but it’s the thing that will matter most in ten years. Everything posted online leaves a trace. And teenagers, by the nature of being teenagers, tend to operate with a fairly short time horizon. What feels hilarious at 14 can resurface at 24 in a job interview, a relationship, or just in the form of deep embarrassment.

I’m not trying to terrify my kids into never posting anything. That would be both cruel and counterproductive. But I do want them to ask: would I be fine if this stayed around forever? That’s the filter. Not “will this get me in trouble now,” but “could this matter later?”

We’ve also talked about privacy in a more practical sense. Using strong, unique passwords. Not sharing personal details in public profiles. Being careful about location data in photos. None of this is advanced stuff, but Ofcom’s research suggests that many adults struggle to confidently spot things like scam emails, which means the basics genuinely aren’t as obvious as we might assume. Starting these habits early matters.
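If you want to show an older kid what “strong and unique” actually means, here’s a minimal Python sketch (the function name and default length are my own choices, not from any particular guide) using the standard library’s secrets module, which is built for exactly this kind of job:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password from letters, digits and punctuation.

    Uses the `secrets` module, which draws from a cryptographically
    secure source, rather than `random`, which is predictable.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())      # a fresh 16-character password
print(generate_password(24))    # longer is stronger
```

A password manager does all of this better, of course, but seeing the idea in a dozen lines is a nice way to demystify it, and it makes the point that a good password is generated, not invented.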


Quick Comparison: Skills Worth Prioritising

Skill                      | Why It Matters                                  | When to Start
Source checking            | Misinformation is everywhere and convincing     | Age 8 onwards
Screen habit awareness     | Platforms are designed to maximise engagement   | Age 10 onwards
AI literacy                | Kids are already using it; they need context    | Age 11 onwards
Digital footprint thinking | Long-term consequences need early grounding     | Age 12 onwards
Basic privacy hygiene      | Passwords, data, location awareness             | Age 10 onwards

Hype Cycle Check

LIKELY TO LAST: Critical thinking and source verification. These skills predate the internet and will outlast whatever platform comes next. Teaching kids to pause and question is never going to be a bad investment.

WATCH CLOSELY: AI literacy as a formal subject. Several schools are beginning to integrate it, and the gap between what kids are using and what they understand about it is closing slowly. Whether that becomes structured curriculum or stays as informal learning is still to be determined.

VAPOURWARE RISK: Any app or platform that promises to “solve” screen time or digital wellbeing through tech alone. The research consistently shows that parental involvement and conversation make more difference than any tool that claims to do it for you.


CES Angle: What This Means for CES 2027

Digital wellbeing is already becoming a category in its own right at major tech events, and by CES 2027 I’d expect to see a significant push around AI-powered parental tools, smarter screen management across connected home ecosystems, and products aimed squarely at families trying to navigate the gap between access and responsibility. Whether the products live up to the promise is another question. But the fact that the market is moving in this direction suggests that parental concern about digital literacy is now a commercial force, not just a policy one.



What to Watch

  1. UK government digital literacy curriculum update. With the new screen time guidance already published in March 2026, pressure is building on the Department for Education to address the patchiness of media literacy teaching in schools. Something is likely coming.

  2. AI use in homework and assessment. As AI tools become more capable and more accessible, schools are going to have to decide what constitutes appropriate use. That conversation is happening now, and the outcomes will matter for every family with a secondary-aged child.

  3. Social media age verification enforcement. Age restrictions exist on paper. Enforcement is a different story. Ofcom’s ongoing work here will shape what the landscape looks like for parents over the next two to three years.

  4. New Ofcom Children’s Online Safety reporting. Ofcom continues to expand its monitoring of children’s online experiences. The next set of data, covering 2025 into 2026, will tell us whether the trends are improving or accelerating in the wrong direction.


None of this is easy. I don’t always get it right, and I’m sure my kids would cheerfully tell you as much. But the conversations are happening, and I think that’s the main thing. Technology isn’t going anywhere. The least I can do is make sure my kids understand it well enough to use it on their terms, not the other way around.

If you want more of this kind of thinking, delivered straight to your inbox without the algorithm deciding whether you see it, come and join the Tech Dads Life newsletter. It’s where I share the things that don’t always make the main feed.

Sign up at techdadslife.beehiiv.com

Mike Reed

Dad of three, tech enthusiast, and the person who reads the spec sheet before the kids finish unwrapping. I cover the gear, gadgets, and ideas that actually matter to families, without the hype. I go to CES every year so you don't have to.