AI Insights #24

A few months ago, my students were using AI. The course chatbots I built for them were their go-to for revision. But now, as they head into their final internal exams, something has changed. AI usage has dropped dramatically. I asked them why, and their answers revealed a lot about trust, accuracy, and what really matters when the pressure is on.

That theme of trust came up again in a very different way when I used AI for a personal project. I took an old university rhyme and turned it into a fully illustrated children’s book for my daughter. It was a fun experiment, but it also exposed some serious limitations, especially when it came to keeping the illustrations consistent.

And then there is AI in schools. The potential is massive, but too often, we rush into new tech without a clear plan. AI should be practical, simple, and genuinely useful. When schools get it wrong, it just adds confusion. This week, I am looking at the biggest mistakes schools make when adopting AI and how to get it right.

Let's get to it!

Why My Students Are Using AI Less

A few months ago, my AS and A-level students were actively engaging with the course chatbots I had made for them. For their mid-term exams, usage was high. Very high.

Students were asking AI to explain tricky concepts and using it to support their learning on specific topics.

But now, as they approach their final internal exams before heading off on study leave, AI use has dropped significantly. Students are returning to tried-and-tested study methods: textbooks, past papers, and their own notes.

Why? The answer lies in trust.

They aren’t rejecting AI outright. They’re just making a practical decision based on what they know works.

A System That Reinforces What Students Trust

I asked a few students, including some who had been frequent AI users, why they weren’t using it as much this time. Their responses followed familiar patterns:

  • It’s not always right. They’ve heard plenty about AI making mistakes, and they’ve seen it happen themselves. During mid-terms, small errors weren’t a big issue, but with final exams approaching, they can’t afford to take that risk.

  • Past papers show exactly what examiners expect. They provide clear examples of the wording, structure, and level of detail required, and mark schemes and examiner reports spell out what earns marks and what doesn’t.

  • Textbooks are more reliable. When students need a trusted source, they turn to their notes and textbooks, which they know are accurate and aligned with the syllabus.

  • The stakes are higher. Mid-terms felt lower risk. Now, every mark matters.

  • It goes beyond the course. AI often provides additional, interesting information, but if it’s not relevant to the exam, it becomes a distraction. Right now, students are focused on marks, not extra details.

Trust in Technology

Research on trust in technology suggests that people rely on tech when they believe it is reliable, safe, and effective. That trust falls into three categories:

  • Technology factors. Is it accurate? Does it work reliably? AI’s tendency to “hallucinate” and fabricate information makes it unreliable for high-stakes studying.

  • User factors. Do individuals feel in control? Do they understand how it works? Many students don’t fully grasp how AI generates responses, making it harder to trust.

  • Task factors. What’s the risk of getting it wrong? The closer students get to exams, the higher the risk, and the more they fall back on proven study methods.

Textbooks vs. AI

Trust in technology isn’t just about whether it works; it’s also about familiarity. For centuries, students have relied on books and past papers because they offer consistency and verifiability. AI, by contrast, can be unpredictable, and students have all heard about "hallucinations." A textbook has been checked, verified, and often exam-board endorsed. That provides a safety net.

Past exam papers allow students to test themselves in a structured way. AI-generated answers, meanwhile, can vary wildly depending on the phrasing of the question. This inconsistency further erodes trust.

One of the biggest drawbacks students mentioned was that AI doesn’t always align with the course content. While it can provide broader knowledge, it often goes beyond the syllabus. In theory, this is a good thing, encouraging deeper understanding and curiosity. But with exams approaching, students don’t want to explore tangents.

Will Younger Students Think Differently?

This raises a question: will the next generation of students approach AI differently?

Right now, younger students are encountering AI earlier than ever. They’re experimenting with it not just for schoolwork, but in their everyday digital lives. They’re using it to draft, summarise, and rephrase, often without the same hesitancy that older students show.

However, the real question isn’t just whether younger students will trust AI more; it’s whether the exam system will adapt to AI.

If assessment methods evolve, perhaps moving towards AI-integrated coursework or open-ended problem solving, then AI might become a more natural part of learning and exam preparation.

AI as a Supplement, Not a Replacement

For now, students are making a pragmatic choice. They aren’t rejecting AI, but they are sticking with what they know works.

In the long run, AI will become another tool in the academic toolkit. The key to its success will be greater transparency, improved accuracy, and better integration with verified sources.

But unless exams change, students will continue to prioritise past papers, mark schemes, and textbooks over anything that introduces uncertainty.

This is the reality of the current system, and for now, students are simply working within it.

How AI Helped Me Bring a Memory to Life

Some ideas stick with you for no reason at all.

Back in university, a bizarre little rhyme became part of daily life in my second-year student house. No one remembers where it came from, but somehow, we kept repeating it.

"Puggy Maloo, Puggy Maloo, Riding around on your little wooden bike. What are you doing? What are you doing? That is day-old food!"

Fast forward (only 19 years, but who’s counting), and now I have a daughter who loves books. One night, as I was reading to her, it hit me: why not turn that old uni rhyme into a children’s book?

But rather than spend months writing, illustrating, and formatting everything manually, I decided to see what AI could do.

Chaining AI Tools to Create

The process started with ChatGPT. I fed it the original rhyme and asked it to expand the idea into a full story. In seconds, I had a rough draft. With a little tweaking, the rhythm and flow felt just right.
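(For the curious, this step is easy to script rather than run through the chat window. Here is a minimal sketch using the OpenAI Python SDK; the model name and prompt wording are placeholders, not the exact ones I used.)

    # Sketch: expand a short rhyme into a children's story via the OpenAI API.
    # Assumes OPENAI_API_KEY is set in the environment; the model name is a placeholder.
    from openai import OpenAI

    client = OpenAI()

    rhyme = (
        "Puggy Maloo, Puggy Maloo, "
        "Riding around on your little wooden bike. "
        "What are you doing? What are you doing? "
        "That is day-old food!"
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": "You are a children's book author."},
            {"role": "user", "content": (
                "Expand this rhyme into a rhyming children's story of about "
                "ten short pages, one stanza per page:\n\n" + rhyme
            )},
        ],
    )

    print(response.choices[0].message.content)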

Next came illustrations. I turned to MidJourney, hoping to create consistent images of Puggy Maloo across multiple scenes. This was trickier than expected. The AI struggled to maintain character continuity, and adding a second character only made things worse. I used ChatGPT again to refine my prompts, and after some trial and error, I had a set of images that worked.

To bring it all together, I pulled the illustrations into Canva. The platform’s Magic Expand tool was perfect for adjusting layouts and filling the pages. I added the text, lined everything up, and suddenly, this random university memory had transformed into a real book for my daughter.

To complete the experience, I used ElevenLabs to generate a video version with narration. A quick round of editing, and just like that, Puggy Maloo had come to life.
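(The narration step can be scripted too. Below is a rough sketch against the ElevenLabs text-to-speech REST API; the voice ID is a placeholder, and you would paste in the story text generated earlier.)

    # Sketch: generate narration audio via the ElevenLabs text-to-speech REST API.
    # Assumes ELEVENLABS_API_KEY is set; the voice ID below is a placeholder.
    import os
    import requests

    VOICE_ID = "YOUR_VOICE_ID"  # placeholder: any voice from the ElevenLabs library
    url = f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}"

    story_text = "Puggy Maloo, Puggy Maloo, riding around on your little wooden bike..."

    response = requests.post(
        url,
        headers={
            "xi-api-key": os.environ["ELEVENLABS_API_KEY"],
            "Content-Type": "application/json",
        },
        json={"text": story_text},
    )
    response.raise_for_status()

    # The API responds with audio bytes (MP3 by default).
    with open("puggy_maloo_narration.mp3", "wb") as f:
        f.write(response.content)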

The Swan on the Water

I like an easy life. Not because I avoid hard work, but because I believe in keeping things straightforward. Schools are busy places, and the last thing we need is to make things more complicated than they need to be. AI has the potential to make education better, but only if we get it right. Too often, it ends up adding stress instead of taking it away.

For 14 years, I’ve worked on strategies for schools, always aiming to keep things practical and low-stress. AI should fit the same principle: easy to understand, easy to use, and genuinely helpful. But I’ve seen too many cases where schools have rushed into AI without thinking it through. Instead of making life easier, they end up with more problems.

AI is Complex but Simple

AI is one of the most advanced technologies we’ve ever had in education, but using it should feel simple. The real challenge isn’t just the technology itself. It’s the growing web of national and international regulations that come with it. Schools are now expected to navigate policies, ethical concerns, and data protection laws, all while figuring out how AI fits into teaching.

It all comes down to one word: clarity.

Teachers don’t need pages of legal jargon or vague policies. They need clear guidance on what AI can and can’t be used for. They need proper training so they feel confident using it, rather than seeing it as something to be feared or avoided.

When AI is brought in without this clarity, it just leads to uncertainty. That’s when things get overcomplicated, and instead of helping teachers, AI becomes another thing to worry about.

What Not to Do with AI

I often find myself telling people what not to do with AI. Why? Because there are some things you simply shouldn’t do with it.

I’ve been against AI detection from the start. Why? Because of false positives. It’s unreliable, and no teacher should have to accuse a student of cheating based on flawed technology.

I’ve also been against using ChatGPT to grade work. Why? Because it’s not what it was designed for, and it’s not reliable enough to fairly assess students.

I don’t recommend tools I don’t believe will actually help teachers. Why? Because I don’t want to mislead or abuse the trust people have placed in me. If I say something works, it’s because I genuinely believe it will make a difference in the classroom. AI is powerful, but it has to be used correctly.

Sometimes, Keeping Things Simple Means Doing the Hard Work First

I’ve always believed that to make life easier in the long run, you sometimes have to do the difficult work up front. Right now, I’m working through the EU AI Act for my school, putting policies and systems in place. It’s not simple, but it’s necessary. If we do it properly now, we avoid confusion later.

This has been my approach for years. Whether it’s setting up school-wide systems, designing assessments, or introducing new technology, I’ve found that putting in the effort early on makes things smoother down the line.

I always use the swan analogy. On the surface, a swan looks like it’s gliding effortlessly across the water. But underneath, those feet are kicking furiously to keep it moving. That’s what good AI implementation should feel like. It should look and feel simple for teachers and students, but only because the hard work has already been done behind the scenes.

That analogy means even more to me personally. My family is of Scottish descent, and our crest is a swan. The family motto is "Je Pense", French for "I think". If ever there was a message for AI in education, that’s it. AI isn’t about replacing thinking. It’s about using it wisely. Thoughtful, considered decisions will always matter more than rushing into the latest tech trend.

A Practical Approach to AI in Schools

If AI is going to work in education, we need to:

  1. Start Small – Bring in AI gradually, with tools that are easy to use and actually solve problems.

  2. Keep It Practical – AI should save time, not add more admin.

  3. Be Clear on the Rules – Schools need clear policies so staff and students know what AI can and can’t be used for.

  4. Support Teachers, Not Replace Them – AI should make teachers’ lives easier, not take away their role.

  5. Give Proper Training – If staff don’t feel confident using AI, it’s already failed before it starts.

Why This Matters

We don’t invest in AI just because it’s the latest trend. We invest in it because we care about the future of education. We care about giving teachers the right tools, not just more work. We care about making learning more engaging and accessible for students. We care about building a school culture where technology works for us, not against us.

I was raised with a deep respect for hard work, and with a willingness to put in the effort to get things right. That mindset has always stuck with me.

Hard work matters, but so does working smart.

Schools are already under enough pressure. AI should be a tool that helps, not another thing to worry about. The challenge isn’t whether AI belongs in education. It does.

The real question is, are we using it in a way that actually makes life easier? The answer depends on how we choose to bring it in.

Ta-ra Duck

That's me done for this week...enjoy your weekend!


If you want to get started with AI in your school, I'm happy to help:

The AI in Education Handbooks for Educators & Students

My bestselling books are designed to make AI practical and accessible for teachers and students alike. Packed with real-world strategies, they’ve helped educators worldwide confidently bring AI into their classrooms. 👉 You can find them on Amazon here

Keynote Speaking, AI Training & Advice

I’ve had the privilege of working with schools to help them make sense of AI and use it effectively. Whether it’s speaking at events, running workshops, or offering strategic advice, I focus on real, practical ways to integrate AI.

If you're interested in exploring how AI can be a useful tool in education, I'd be happy to talk: 👉 Click here to connect


🏆Edufuturists A.I. Pioneer 2024

🏆Amazon Best-selling Author - AI in Education: Handbook Series

🏆ISC Research Edruptor 2024

Paul Oehme

Headteacher, school dreamer

Much appreciated. It's great to learn that students a) use AI as a learning tool, not for cheating, and b) reflect deeply on how to use it best.

David Curran

Assistive Technology (AT) Lead & Head of Careers at Moon Hall School, Reigate, a Specialist Dyslexic School | Assistive Technology (AT) & AI INSET/CPD deliverer and workshop facilitator | Independent Careers Advisor

Top banana again, Matthew Wemyss! I like your comments and thoughts on students moving back away from using AI. I’m finding students are finding their own balance and that has to be a good thing? I used to say it’s another tool in my toolbox, but it’s now become another in theirs. ‘Horses for courses’, as grandma would say. The Pathways/Lesson feature on the chatbot platform has become popular with my year 10 students so they can DIRT their first attempt at end of unit topic tests, and I’ve found that’s working better than me standing at the front of the classroom going over answers, or simply handing them the answers and telling them to ‘green pen’. It gives me time to get around to individuals as well. I have year 11s whose revision is focussed around AI and are actually teaching me how best to use it, and others who are dead against it and can’t get enough past papers. Most sit somewhere in the middle. Interesting and exciting times 😀👍

Aileen Wallace

Scottish secondary school teacher. From ed tech phobic to AI advocate. Hoping to encourage others to try a little Ed tech too. Co-Founder of the Eduguardians. Curipod Coach.

Much lower down the school but I have heard similar comments from my own students. If they have to double check it all by finding a reliable source that agrees with the AI, why not go to the source first of all? It is a very healthy cynicism IMHO and one that comes from them learning about how AI works. Without that....yikes!
