I’ve spent the past six months talking to teachers — elementary, middle school, and high school, public and private, across several countries — about how AI is changing their classrooms. A few themes came up so consistently that they feel worth sharing with parents.

These aren’t the concerns you might expect. Teachers aren’t primarily worried about cheating, though that comes up. They’re worried about something more fundamental: being put in an impossible position by an institution that hasn’t figured out what it thinks, while parents have expectations on one side and students have access to powerful tools on the other.

Here’s what they wish more parents understood.

“We don’t have clear guidance either.”

Almost every teacher I spoke with said some version of this: “I’m figuring this out as I go. There’s no policy, or there’s a policy that doesn’t make practical sense, or there are conflicting messages from administration.”

A high school English teacher in New Zealand put it plainly: “I got an email in September saying AI tools were banned in all assessed work. Three weeks later I got another email from a different administrator encouraging us to ‘integrate AI into our teaching practice.’ Nobody seemed to notice the contradiction. I’m the one who has to deal with it in the classroom.”

What does this mean for parents? Don’t assume your child’s teacher has clarity they don’t have. If there’s a rule in place, it may be poorly thought through. If there’s no rule, the teacher is making individual calls in real time. Judgment calls made in ambiguity are more easily understood when you know the ambiguity exists.

“The ‘just ban it’ approach isn’t working.”

Detection tools don’t work reliably. Turnitin’s AI detection has a well-documented false positive problem: it disproportionately flags writing by students for whom English is a second language. In several widely reported cases, work that was failed on the strength of a detection score turned out to be human-written.

More fundamentally: a policy of “don’t use AI” in 2024 is like a policy of “don’t use calculators” in 1990. Students who honor the rule are disadvantaged relative to students who don’t, while the students who don’t are learning a skill that matters.

The teachers I spoke to who felt best about their situation were not the ones with the strictest bans. They were the ones who had redesigned their assignments so that an AI-dependent submission visibly lacks what the assignment is actually assessing. “I ask students to bring a printed draft to class and then write an in-class reflection on their own choices,” said a 10th grade English teacher in the UK. “Suddenly it’s not about whether they used AI for the draft — it’s about whether they can think and explain their decisions.”

What does this mean for parents? Ask less “does your school have an AI policy?” and more “does your school have a thoughtful approach to what learning looks like with AI?”

“We’re not trying to stop them from learning about AI — we’re trying to protect learning.”

Several teachers felt frustrated by a narrative they’d seen from parents: that any restriction on AI was backward or anti-technology.

“I’m not anti-AI,” said a math teacher. “I’m anti-students outsourcing the cognitive work that builds their brain. When a kid uses Photomath to solve every problem without understanding why, they’re not learning math. When they use ChatGPT to write every essay without developing their own argument, they’re not learning to write. The concern isn’t AI. It’s whether any learning is happening.”

This framing was consistent across subjects: the concern is not the tool, it’s whether using the tool means skipping the developmental stage the tool is supposed to support.

There’s a useful analogy here. Using GPS to navigate is completely fine for an adult who already has a spatial model of their city. A child who only ever navigates by GPS may never build that spatial model in the first place, which is a real developmental question even if its weight varies from child to child. The same logic applies to cognitive tools: using AI to help with tasks you already know how to do is a productivity gain; using it to skip the development of the underlying skill is something else entirely.

What does this mean for parents? Reinforce this at home. The message isn’t “don’t use AI.” It’s “use AI in ways that make you better, not in ways that mean you don’t have to learn.”

“A little support at home would help enormously.”

This one came up most often, and most emotionally.

Teachers are trying to have nuanced conversations with students about AI — about when to use it, how to disclose it, what “original work” means in a world where AI exists — but those conversations are happening in a vacuum if they’re not happening at home too.

“When a parent emails me upset that their kid got a zero for submitting AI-generated work, and I can see from the submission that the kid submitted a first draft without editing and clearly didn’t read it,” said a high school teacher, “I know the conversation about why that matters hasn’t happened at home.”

The teachers who described the best student relationships with AI described students who had clearly talked about this with their parents — who had a sense of the ethical dimensions, who understood what they were trying to learn and why, who could articulate why “I used AI to help outline my ideas and then wrote the essay myself” is different from “I asked AI to write my essay.”

What does this mean for parents? You don’t need to know AI policy at your child’s school. You need to have a conversation about what it means to actually learn something versus what it means to appear to know something. What’s the difference? Why does it matter? That’s a conversation that transfers to every classroom and every tool.

“We need to figure this out together.”

The teachers I came away most inspired by were not the ones who had solved the AI problem. They were the ones who had decided to treat it as something to explore with their students rather than a threat to manage.

“I told my class at the beginning of the year: I don’t know exactly what the right rules are yet. I know some things — submitting AI work as your own without telling me is dishonest, and that matters regardless of the tool. Beyond that, let’s figure it out together. What does it mean to do your own thinking in a world where AI can help? Let’s think about that.”

That approach requires trust, and it requires the same conversation to continue at home. If parents and teachers are aligned in wanting children to actually learn — not just appear to learn — and are honest about the genuine uncertainty around how to do that with AI tools in the picture, the children benefit.

That alignment doesn’t happen on its own. It requires a conversation. Maybe between you and your child’s teacher. Definitely between you and your child.


Exploring AI literacy at home? Browse our activity library for hands-on ways to build AI understanding alongside your children.
