The #1 rule for AI use in my house

When I use AI with my kids, there is one main guiding principle. They have to come with the idea. Full stop.

It can’t be vague; it has to be well thought out. The idea needs to be specific enough that they could explain it, detailed enough that they could draw it on paper if they had to.

This Spring Break, Jackson (my 8-year-old) spent most of his free time building a Super Mario World in his room in honor of the new movie. The whole thing is made from cardboard, scraps, packaging boxes, toys, blankets, anything he could find. There was no picture to follow. He just kept building until it looked like what was in his head.

It's massive. And it's entirely his.

Now he wants to turn it into a video game. So we're going to try. But before we opened a single app, he needed to come up with all of the requirements himself: the objective of the game, how many stages there are, what the characters are trying to do. He also knows he can't actually make a Super Mario game (you know, because copyright), so he has to figure out who the characters become when they're his, not Nintendo's.

He's drawing storyboards right now. His room is an absolute disaster. That's exactly where he needs to be before we touch AI.

Skipping the ideation gives the wrong output (and the wrong takeaway)

If your kid sits down and says "I want to make a video game," and you open the laptop and start there, you are going to get a video game. It might be functional and may even look okay. But your kid will feel essentially nothing about it.

And that’s not AI’s fault. It's just how this works: generic in, generic out. If the idea has no specificity, no ownership, no detail that could only have come from your particular kid's brain, then what AI gives back will be equally generic. You've wasted a session, probably a lot of tokens, and valuable time.

That output isn't his. There's no pride in it because there's no him in it. He didn't make a thing, he watched AI make a thing and put his name on it. Feels a bit like cheating. And a little pointless.

Putting in the work gives the best result

The storyboards Jackson is drawing right now are the product. The video game is just an extension of those. The thinking that has to happen before we ever open Claude is where the real value lives, and it's the part that AI shouldn’t do for him.

He has to decide what the objective of the game is. He has to figure out how many stages there are and what changes between them. He has to think through who his characters are — and because he can't just use Mario and Bowser, he has to figure out who those characters become when they're his. That constraint, which felt like a limitation, is actually the most valuable part of the whole exercise. It's forcing him to make creative decisions he would have otherwise outsourced to Nintendo.

This is the Idea → AI → Play™ framework in its most basic form. The idea has to come first as the actual substance of what we're going to build. AI goes in the middle. It helps execute, refine, and extend what he's already decided. And then whatever we make goes back into the real world: he plays it, shows it to his brother, decides what's missing, tells me what he wants to change.

If you want to see what this looks like for younger kids, our Creation to Coloring activity is a good place to start. Kids build something physical first, then use AI to transform it, then bring it back into the real world.

Ground rules for doing this with your kids

  • The idea comes first, always. Before you open anything, your kid needs to be able to tell you — out loud, in their own words — what they're trying to make. If they can't explain it, they're not ready for AI yet. That’s the process. Help them get more specific. Ask them what it looks like, what it does, who it's for. The more concrete the idea, the more useful AI becomes.

  • AI goes in the middle. Once the idea is clear, AI is genuinely useful for execution — especially when what your kid wants to create falls outside their current skill set. Jackson can imagine a video game. He cannot code one. That's exactly the gap AI helps bridge. But it bridges his vision, not a generic version of it. This distinction matters more than almost anything else.

  • Do it together, especially at first. Sit next to your kid while they use AI. Help them think out loud, question the output, push back on something that's not quite right.

  • Model it yourself. If you use AI at work — and at this point most of us do or should — let your kids see it occasionally. Not as a performance, but when it's natural. Show them that you use it to draft something and then you fix it. Show them that the first answer wasn't quite right, so you pushed back. Show them that you're the one with the idea and AI is helping you execute it. The parallel is closer than it seems, and kids are watching how you use these tools whether you point it out or not.

  • If the idea is too vague, help them sharpen it. "I want to make a game" is not an idea. "I want to make a game where you collect gems and need to find them all before the robbers get there" is an idea. There's specificity in it. If your kid comes with something generic, don't open the laptop. Sit with them and ask questions until something specific emerges. You'll know it when you hear it because they'll get more excited, not less.

You can find more activity ideas designed around this same sequence in our full activity library — each one starts with your kid's idea and ends in something real.

Their expectations might be too high

Here's what I want to be honest about: Jackson's expectations for this video game are almost certainly too high.

He has a vision in his head. The chances that what we actually build comes close to that vision are not great. There are things we won't know how to do. There will be features he wants that we can't figure out how to execute. The gap between what he imagined and what we produce might be significant.

And that's a great outcome, and a great lesson.

Because one of the most important things a kid can learn about AI — one of the most important things any of us are still figuring out — is that it doesn't automatically close the gap between your vision and your execution. It helps. It helps a lot. And with practice and patience that gap will get smaller and smaller. And your ideas will likely get bigger.

The version of this that worries me isn't the one where we build something imperfect. It's the one where a kid sits down, types "make me a video game," watches something appear, and decides that's what making things feels like. Fast, painless, belonging to no one.

That's not what I'm building here. I'm building a kid who knows what it feels like to have actually made something — who can tell you exactly why every decision was made, because every decision was his.

The cardboard world didn't come with instructions. Neither does this.

Idea → AI → Play™
