ASK ME ANYTHING #27: “Is AI rotting our kids’ brains?”
What teenagers have to say about AI.
AI is everywhere, kids are using it constantly, and nobody has answers.
But instead of turning to another AI-obsessed expert or doomsday neuroscientist, what if we just…asked the kids?
How are they using AI? What is their relationship with this technology? Are they self-aware enough to know what’s happening? Are they able to set boundaries and use it correctly?
This is exactly what Dan Shipper did on the AI & I podcast recently, when he sat down with Alex Mathew — a 17-year-old high school senior, entrepreneur, and one of the most self-aware teenagers I’ve ever met. Alex goes to Alpha High School in Austin. He’s building a startup. And he has a lot to say about his generation and what technology is actually doing to them.
First, what does the data say?
Tragically, the fear around AI rotting our kids’ brains is pretty valid. We even have a new vocabulary for it.
Cognitive offloading: using a tool to do your thinking for you.
Cognitive crutch: when cognitive offloading becomes dependency.
Cognitive debt: over time, all that offloading leaves your brain weaker than it was before.
Here’s a quick look at what some AI studies are saying.
In one study done directly on high schoolers, University of Pennsylvania researchers gave nearly 1,000 students in grades 9–11 access to ChatGPT while practicing math. The kids with AI solved 48% more practice problems correctly. Then they scored 17% worse on the actual test.
MIT Media Lab researchers went a step further. They scanned people’s brains while they used AI. Over four months, participants who wrote essays with ChatGPT showed 55% weaker brain connectivity than those who wrote alone. Even more striking: 83% of AI users couldn’t accurately quote from essays they had just written. They didn’t own their own thoughts. Every time AI does the thinking, you borrow against your own mind.
A Swiss Business School study surveyed over 600 people across three age groups and measured AI use against critical thinking scores. The pattern was consistent and unsettling: the younger the participant, the higher the AI dependence and the lower the critical thinking score. The researchers pointed to a specific reason — young people grew up with these tools, so they never learned to think without them. The children entering school today are even younger than the youngest group in this study.
A randomized controlled trial gave 120 students one simple choice: study with ChatGPT or study the old-fashioned way. Forty-five days later, a surprise test. The ChatGPT group scored 57.5%. The traditional group scored 68.5%. ChatGPT made the studying feel easier, but the remembering harder.
Perhaps most telling: Anthropic — the company that built Claude — studied how students were using their own tool and didn’t like what they found. Students asked for direct answers nearly half the time, with almost no back-and-forth thinking in between. Their conclusion, in their own words: AI may be “stifling the development of foundational skills needed for higher-order thinking.” When the people who built the tool are raising the alarm, it’s worth listening.
Now, what are teenagers saying?
According to Alex, “There’s a huge loneliness crisis” due to AI.
“72% of teens have used AI for companionship at least once,” he says, “while 52% are using it every day.”
Kids are forming emotional attachments to their chatbots, all because AI is “easy and seamless and frictionless.”
Here’s what else Alex says: half of Gen Z is pessimistic about AI, and yet 70–75% use it anyway. They’re skeptical of it. They don’t fully trust it. They worry about what it’s doing to the environment, to jobs, to something human they can’t quite name. And they still can’t put it down. That cognitive dissonance — I know this might be bad for me and I’m doing it anyway — probably sounds all too familiar.
Because we know AI isn’t the first wave of technology to do this.
Social media is the original brain-rotting villain. When Dan asked Alex point blank whether social media had rotted his generation’s brain, there was no hesitation. “Yes. One hundred percent.” Fractured attention, dopamine loops, the constant overstimulation of an algorithm designed to keep you scrolling, the comparison trap — curated lives, filtered moments, nature influencers who set up a camera to make it look like they spontaneously wandered into the woods. He admitted his own attention isn’t great. He still catches himself scrolling when he doesn’t mean to. He described what’s happening to his generation’s brains as getting “more mushy.”
But there is a positive twist. (I promise!)
He doesn’t think the full story is as simple as “social media bad.” For his generation, this is how they connect. Some friendships run almost entirely on sharing Instagram reels. Healthy relationships often start on Snapchat. In his words: “We’re laughing together — it’s part of the optimism and joy we get in life.”
He’s not wrong. And as parents, I think we miss this nuance all the time.
We see the phone and we see a problem. Our kids see the phone and they see a universe of connection with their people.
The real issue, Alex said, isn’t the platform. It’s the gap between consuming and processing — between scrolling through a hundred ideas and sitting with one long enough for it to mean something.
Are there any positives to AI for kids? (100% yes.)
On one hand, we have kids falling in love with their chatbots, outsourcing their thinking, and losing the ability to sit with difficulty. On the other hand, there are kids using the exact same tools to build real things.
Alex’s friends are a good example. One has 2 million TikTok followers and is turning that audience into a business: thinking seriously about ownership, distribution, and what it means to build something that lasts. Another has 70,000 users on an AI-powered teen dating coach app she built herself and is collaborating with MrBeast. Both of them still want to go to college. Not because they have to, but because they want that quintessential experience.
And then there’s Alex himself, who is building Berry: an AI stuffed animal designed to help teenagers with their mental health. Five to ten minutes a day, talking through what’s going on, learning to recognize and cope with what they’re feeling.
What I love most about this isn’t that it’s clever, though it is. It’s why he’s building it. He looked at the loneliness crisis that AI is feeding and instead of scrolling past it, he decided to try to solve it. And this points to something I’ve learned in our schools that applies just as much at home.
The technology is almost never the deciding factor. The human around it is. (As Alex put it: “It’s 90% motivation, 10% ed tech.”)
Before you hand your kid any tool, they need to know why and how to use it in a way that makes them smarter.
What you can do at home — 4 ways to raise a builder, not a consumer.
1. Prompt the AI to prompt you.
Most kids open ChatGPT and ask it for answers. “Write an essay about the scientific method.” “Generate an idea for my medieval history project.” “Summarize the story of Hamlet.” That’s cognitive offloading at its finest. Teach your kid to flip it.
Have the AI ask them questions: challenge their thinking, poke holes in their ideas, play devil’s advocate. For example, don’t ask AI to “generate ideas.” Prompt AI: “I’m working on a medieval history project. Don’t give me any answers or ideas. Only ask me questions until I figure out what I want to explore.” That’s using AI as a thinking partner, and it builds the critical reasoning that AI use can otherwise erode.
2. Give them a project, not just a tool.
Alex doesn’t use AI randomly. He uses it because he’s building something and needs that sparring partner or research assistant. When kids have a real goal — a business idea, a creative project, something they genuinely care about — AI becomes fuel. Without a goal, it becomes a shortcut to nowhere. There’s no need to tap away at AI ad nauseam. Have a plan.
3. Prioritize real-life connections.
If 72% of teens have used AI for companionship, we absolutely cannot overlook the loneliness crisis. It’s more important than ever to connect your kid with friends at their school, in their neighborhood, at their church. Do we need to teach kids how to properly use AI tools? Of course. But we also need to let kids run barefoot, play outside, bike through the neighborhood with friends, and experience a rich, meaningful childhood building friendships off screen.
4. Train the algorithm together.
Alex spent his entire winter break intentionally reshaping his social media feeds, clicking “not interested,” steering toward ideas he actually wanted to think about. He treated it like a project. That reframe (from “the algorithm happens to me” to “I shape the algorithm”) is one of the most important things a young person can learn right now. You can easily sit down with your kid and do it together.
Whether you like it or not, AI will affect your kids. But you don’t need to live in fear about it! It doesn’t have to rot your kid’s brain. You can help your kid become a creator with AI, not a consumer.
Because the difference (according to a 17-year-old who’s living it) isn’t the tool, but the builder behind it.


