AI Is Changing More Than Learning—It’s Reshaping Learners, Leadership, and What’s Worth Teaching
via The Forest School: An Acton Academy

AI in education is evolving fast—and unevenly. Some schools are resisting, others are rushing in; some are using it to accelerate and deepen learning, and some are simply overwhelmed. But no matter where a school, district, or microschool sits on the curve, one thing is clear: AI isn’t just changing how students learn. It’s changing what they learn. It also shapes who they grow into—positively or negatively.

From my vantage point—serving as Head of School at The Forest School: An Acton Academy and The Forest School Online, guiding public and private leaders through the Institute for Self-Directed Learning, and teaching a graduate course on AI and innovation at the University of Pennsylvania—I have the privilege of listening in, learning with, and walking alongside students, families, educators, researchers, and system leaders wrestling with this moment.

What follows is a synthesis of what I’m seeing—both the bright spots and the blind spots. My hope is that it helps others better understand the current landscape and shamelessly steal anything that might serve your learners and community.

What Students Are Teaching Us

Today’s learners are curious, capable, and complex in how they use AI. Many treat it as a thinking partner—using it to brainstorm, organize ideas, or debug work. As one student put it, “It doesn’t always get it right. That’s why I use it to check my thinking, not just do the thinking for me.”

Some try to use AI to shortcut the hard stuff. Still others reject it entirely. That spread matters. It shows that students don’t need permission to use AI—they need guidance on how to use it wisely.

We're seeing students gain real creative and cognitive leverage—but also moments of dependency, confusion, and ethical uncertainty. The best AI-using students are the most reflective. They’ve been taught to understand how it works, when it fails, and how to take ownership of the final product.

What Parents and Caregivers Are Asking

Parents and caregivers are asking new—and important—questions:

  • “If my child didn’t write that essay, did they still learn something?”

  • “How do I teach them to use AI without overusing it?”

They’re concerned about screen time, cheating, and over-reliance. At the same time, they’re deeply committed to their children’s growth. So, when they're invited into norm-setting conversations—through transparent communication and co-created expectations—trust grows. Tech literacy is uneven across households, but belief in kids’ potential is not.

Parents don’t need to be AI experts. But they do need to feel they’re part of the conversation. When they are, their feedback becomes a source of wisdom—not just worry.

What Educators Are Wrestling With

Most teachers we work with aren’t afraid of AI—they’re just unsure how to shift without losing their footing.

“I’m not scared of it,” one Guide said. “But I don’t want to just tack AI on. I want to rethink how I educate altogether.”

For many, the shift from delivering content to designing experiences is a leap. And most educators haven’t been trained for it. Without structured support, some become reactive or overly rigid. That’s why we focus on hands-on, values-driven professional learning that gives teachers not just tools, but frameworks and community to process the change.

The risk isn’t refusal. The risk is exhaustion, shortcutting the shift, or layering AI on top of a broken model.

What Leaders Are Navigating

For school and system leaders, the conversation has moved from if to how. One principal shared: “We’re not debating whether to use AI. We’re asking: how do we use it in a way that aligns with who we are?”

We’ve seen leaders begin to draft ethical AI policies, streamline systems, and vet edtech platforms with more care. But we’ve also seen AI used reactively—implemented out of pressure rather than purpose. The biggest danger isn’t AI. It’s mission drift.

The best school leaders we know are pursuing not just savvy—but judgment. And they’re asking hard questions not only about what’s possible with AI, but what’s worth doing at all.

What Researchers and Policy Influencers Are Exploring

While many classrooms are improvising, researchers and policymakers are trying to catch up. The guidance gap is real—many schools still lack clear AI guardrails, leaving leaders without the support or clarity they need. In our work at Penn and our Institute, we’ve seen researchers listening closely for stories from the field to shape smarter policy and practice. They’re asking questions about bias, equity, access, and long-term impact. They don’t want media hype to distort the reality on the ground.

They know that redefining rigor is urgent—not about lowering standards, but shifting them. “How do we measure deep learning when it doesn’t fit into a multiple-choice box?” “How do we know a self-directed learner is truly prepared for life and work?”

The ninja move is collaboration: educators, researchers, and policymakers co-creating a human-centered approach to AI that balances optimism with wisdom.

Themes Across Stakeholders

Across all these groups, it seems that—for the moment—three themes are emerging:

  1. Discernment is the new literacy. Not about finding faster answers—about asking better questions and knowing when to trust them.

  2. Roles are shifting. Students are becoming self-directed learners. Educators are becoming designers and coaches. Parents are becoming co-navigators. Leaders are becoming stewards of judgment.

  3. Values must drive tech—not the other way around. When schools are rooted in purpose, equity, and relationships, AI becomes an amplifier—not a substitute—for what matters most.

How Content Is Shifting

AI isn’t just disrupting instructional methods—it’s revealing which content areas actually prepare students to think, adapt, and contribute in an AI-shaped world. What we teach is being refocused toward skills that machines can’t automate: judgment, synthesis, purpose, and connection.

The most relevant content in this era includes:

  • Inquiry over information

  • Ethics, equity, and philosophy

  • Cross-checking for bias and accuracy

  • Creative synthesis and authentic demonstration

  • Prompting, verifying, and reflecting with purpose

What’s fading (or should be):

  • Formulaic writing without real audience

  • Procedural math with no real-world link

  • Rigid, content-driven curricula that stifle agency

As one learner said: “If AI can do it instantly, I still want to learn it—but only if I understand why it matters.”

In short, the future of content is not about covering everything—it’s about uncovering what matters most.

The Shift Beneath The Shift

What if AI isn’t a threat or a tool, but a rebalancer? What if its true gift is to help us shift—from control to trust, from coverage to connection, from checking boxes to cultivating growth?

If we take that invitation seriously, we won’t just change how learning happens.

We’ll change what it means to belong, to grow, and to lead in this next era.


Join us for one of our next Accelerators—virtual or in person—Fall of 2025.

Karen J. Pittman

Partner, Knowledge to Power Catalysts | Publisher and Editor-in-Chief, Youth Today | Creator & Creative Director, Changing the Odds Remix | Co-Founder & Former CEO, the Forum for Youth Investment

Love this, Dr. Tyler S. Your arguments will speak to community educators who are already helping youth pursue their sparks in afterschool and summer programs. After spending a few days with you and other H3 enthusiasts last week, I'm more inspired than ever to use AI exploration as a bridge between formal and flexible learning spaces.

Kevin Moore

Founder of Radix and Reason Math Learning, LLC & Co-Creator of The Generative Math Framework

Yes - "AI isn’t just disrupting instructional methods—it’s revealing which content areas actually prepare students to think, adapt, and contribute in an AI-shaped world. What we teach is being refocused toward skills that machines can’t automate: judgment, synthesis, purpose, and connection."

Mark Kabban, Ed.L.D.

Decolonized Leadership Development for Schools & Organizations. CEO & Founder of Examined Leadership Collective.

I appreciate your optimism at the end. The rebalancing is something we can get behind.
