The "Cheating" Narrative Is Teaching Kids the Wrong Lesson About AI

Sometimes when I use AI at work, I get this guilty feeling. Like I'm cheating somehow.

I'll use it to draft an email, or help me articulate thoughts I'm trying to get down on paper, or organize messy ideas into something coherent. And rationally, I know this is exactly what the tool is for. I know the thinking is still mine. I know that AI is just helping me get it out of my head in a way that makes sense.

But I still get the feeling. Not all the time. But at times there's this nagging sense that I should be struggling through it "the hard way" to prove I actually did the work.

Even for this blog. I've gone back and forth with AI so many times to make sure this reflects what I actually think, to get my ideas down into words that sound like me. It probably took me the same amount of time to write this with AI as it would have alone. Maybe longer. But here's the thing: without AI, it would have felt so daunting to start. The blank page, the pressure to get it right the first time, the mental energy of translating messy thoughts into coherent paragraphs... I might never have started at all.

That's what I mean about intention. I used AI to help me think through and articulate what I wanted to say. Not to avoid thinking.

Meanwhile, kids ARE using AI to cheat. On tests. On essays. On homework they're supposed to be learning from. They're using it to skip the learning because we've only taught them two narratives: "AI is forbidden" or "AI will do it for you."

When kids use it to take a test or write an essay without engaging with the material? That IS cheating. Full stop.

If they instead use AI to help them understand a concept, or to bring an idea to life they couldn't execute on their own? That's different.

The tool is the same. The outcome might even look the same. But the intention—and what they're learning—is completely different.

Listen, kids internalize the messages we send

This is a bit of an oversimplification, but hear me out:

If we keep talking about how kids will use AI to cheat, they start to think that's its primary value. It becomes "the purpose of AI" in their minds.

But (and this is the part that matters), if we show them AI can deepen their understanding—that it's way more satisfying to use it to unlock something than to bypass it—that's how they start to shape what they think "AI is for."

This might sound naive. I promise it's not. It doesn't mean kids won't use AI to cheat. They will. Especially on things they don't like, or find tedious, or think are just too hard. That's reality. That's being human. Kids have been finding shortcuts since homework was invented.

Teachers need us parents to show up here

I keep thinking about what educators are dealing with right now.

New policies being written faster than anyone can implement them. Pressure from administrators worried about test scores. Parents with wildly different views on whether AI should be banned or embraced. And they're supposed to figure out how to tell if a student genuinely used AI to learn, or just had AI do their thinking for them.

That's impossible.

We need to help set the foundation at home.

We need to show them how to use AI as a thinking partner. This will help them discover that using AI to deepen understanding actually feels better than using it to avoid work.

I'm not saying this prevents cheating. I'm saying it gives them a reference point for what AI can be beyond a shortcut. 

How do we get them to set the right intention?

My kids are still young, currently in elementary school. I use AI with them—frequently. I talk to them about AI. And while I'll use it myself for a quick answer here and there, I don't use it with them that way.

My kids are 5 and 8, so I don't let them use AI alone. I use it WITH them on activities that are specifically designed to bring their imagination to life.

Like when Jackson builds something with LEGO and we turn it into a coloring page he can actually color. Or when they do a "fossil dig" in the backyard and we use AI to help them imagine what their discoveries might have been, then they journal about it and set up their own museum exhibit.

Or when Austin has an idea for a character that doesn't exist yet, and we use AI to help visualize what that might look like—and then he digs into the costume bin to actually become that character.

That's using AI with the intention to create and explore. If I tried to use it to shortcut something they should be working through themselves? I can imagine the energy would be totally different.

The intention changes everything. And I can already feel my kids picking up on that difference, even at 5 and 8.

I call this the Idea → AI → Play framework

In every case, they start with their own idea. We use AI together to help execute on it. Then they take what AI creates and play with it, modify it, improve it, or sometimes toss it out and try again.

The basics: AI is in the middle of the process, never at the beginning or end.

Does this prevent them from ever using AI to cheat? Of course not.

But it gives them a comparison point. They know what it feels like to use AI to amplify their thinking. So when they try to use it to bypass their thinking, the contrast is there.

They might not always choose the harder path. But at least they know there's a difference.

What you can actually do (I promise this isn’t one more thing)

You don't need to become an AI expert. You don't need special tools. I know you're overwhelmed. I know this feels like one more parenting responsibility.

You just need to be intentional about conversations you're probably already having about effort and shortcuts and learning.

  • Ask why before they use it. "What are you trying to accomplish here?" That one question shifts everything from output to intention.

  • Have them notice how it feels afterward. "Did that help you understand it better, or did you just get it done?" Let them start feeling the difference themselves.

  • Talk through your own use. When you use AI in front of them, say why. "I'm using this to help me think through this problem, not to avoid thinking about it."

  • Make the right choice something to be proud of. When they could have blasted through something with AI but chose to work through it themselves, acknowledge that. Not just "that took longer" but "you chose to really understand that."

  • Partner with their teachers. Ask what the school's policy is. Ask how you can reinforce it at home. Ask what they're seeing. This can't just be a school thing or a home thing. It has to be both.

Where to start

If you want practical ways to do this, sign up for my free starter guide, which walks you through teaching intention from the very beginning.

Or try Creation to Coloring—it's designed to show kids that AI is way more satisfying when you use it to bring ideas to life than when you use it to avoid creating.

Because the kids who'll thrive with AI aren't the ones who avoid it completely. And they're not the ones who let it do everything.
