
Kids and AI: What the Turing-Lego study means for your home and classroom

One bright Wednesday morning, nine-year-old Leila asked a robot to write a pirate joke and draw a rainbow parrot. “ChatGPT will probably know,” she said. Chances are, your children or pupils have already done the same. New research by the Alan Turing Institute and the LEGO Group shows that 22% of 8- to 12-year-olds now use generative AI, with ChatGPT topping the list. Here is what parents and teachers need to know.

Who is using it and why

The figures reveal a sharp divide. Half of children in private schools are experimenting with AI tools, compared with fewer than one in five in state schools. Children with additional learning needs are particularly enthusiastic: among those who use generative AI, 78% turn to ChatGPT, and they lean on it for conversation, play and emotional company. Across the board, curiosity, not cheating, drives adoption. Children describe turning to AI to invent stories, learn about dinosaurs or simply keep boredom at bay.

What parents worry about

Four out of five parents fear that their children may stumble upon inappropriate or plainly false information; fewer than half are concerned about plagiarism in schoolwork. That mismatch should nudge every adult to shift the conversation from “Don’t copy” to “Let’s fact-check together.”

What teachers see

In classrooms, teachers who already use AI praise it: 85% say it saves time, and 82% report better lessons. Yet they are far less sanguine about pupil use. Their biggest anxiety? Blunted thinking: if pupils let a chatbot feed them canned essays, the habits of questioning and creating will wither. Teachers make one exception: AI as a digital scaffold for learners with additional needs shows real promise.

Four moves to make now

1. Start early and small

Add a ten-minute, age-appropriate activity: ask the AI for three dinosaur facts, then head to the library to prove or disprove them. Keep it playful.

2. Agree on house and classroom rules

Post a visible “Ask Before You Paste” sign. Reserve the right to read AI chats aloud together. Frame the tool as a co-pilot, never a ghostwriter.

3. Spot-check the answers

Teach children the “two-source rule”: any surprising AI statement must be matched by a second trustworthy source before it counts.

4. Ask for safe tools

Urge schools to adopt vetted, child-friendly AI platforms and demand that ministries create clear certification schemes.

Last week Leila returned to the chatbot with a new prompt: “Tell me a pirate joke I can disprove myself.” The robot obliged; Leila laughed, edited and sent a better punchline to her classmates. That is the kind of creative, critical engagement our children deserve.

Stay curious, stay skeptical, and keep talking.

Source: Understanding the Impacts of Generative AI Use on Children, The Alan Turing Institute and the LEGO Group (2025).
https://www.turing.ac.uk/sites/default/files/2025-05/combined_briefing_-_understanding_the_impacts_of_generative_ai_use_on_children.pdf