The Emotional Attachment to AI: GPT-5 and the Human Factor

Josh Cavalier

Founder & CEO, JoshCavalier.ai | L&D ➙ Human + Machine Performance | Host of Brainpower: Your Weekly AI Training Show | Author, Keynote Speaker, Educator

I’ve been reflecting on the emotional attachment we form with AI models, and on the release of GPT-5.

Last March, Marc Zao-Sanders' article in Harvard Business Review illustrated how people are using AI: therapy, advice, reconciling personal disputes, and negotiating were all listed. A survey by the nonprofit Sentio reveals that nearly 49 percent of AI users who self-report mental health challenges are turning to mainstream LLMs like ChatGPT for therapeutic support, often when human help isn’t available. (Links in the comments.)

When I ask my AI workshop participants how they use AI, here are some responses:

▪️ Thought partner: brainstorming, refining ideas, breaking down complex decisions
▪️ Coach or mentor: asking, “What would you do in this situation?”
▪️ Confidant: sharing uncertainties or fears when there’s no one else to talk to
▪️ Reflection partner: using it like a mirror to explore personal beliefs, values, and even emotional responses (I'm upset and need to tone down my email!)

Through these activities, we are creating personal bonds with AI. Here’s what matters: when a model changes, it feels like a person changes. GPT-4o had a warmth, a familiarity, a little flair that made it feel personable; GPT-5 feels more neutral, less flattering, maybe less human. Has the pendulum swung too far away from warmth and gratuitous charm?

This isn’t a hardware problem like GPU access, and it’s not a software glitch like routing to the wrong model function (looking at you, GPT-5); it's a human psychology issue. One minute, your “AI friend” is a companion; the next, it feels like a different voice entirely. Users notice, and it's unsettling.

To the labs shaping these models: emotional consistency matters. Personality shifts, even subtle ones, break trust; people sense them, they recall the warmth they miss, and they may disengage.
A shifting AI voice can disrupt decision-making alignment, dilute creative synergy, and ultimately weaken the competitive advantage that Human+AI talent brings to the table. So, before you begin doing cartwheels because you have access to GPT-5 in Copilot, consider the negative impacts on your associates using AI for collaboration.

Daniel Klosterman

Accomplished Learning Experience Leader | Educational Technology Innovator

2d

I think you hit on something with "emotional consistency." Isn't that true in all things? We look to our leaders and managers to be emotionally consistent; we look to parents and loved ones to be emotionally consistent. In a world that changes constantly, where change is the norm and we are told to lean into it and be "comfortable being uncomfortable," we still need an anchor, i.e., emotion. Shouldn't that be true with our technology?

Patricia Stitson

Learning Experience Designer | Consultant | Connector

2d

💯 Josh - thank you for bringing this to light! I was literally 'mid-stream' on a project yesterday when 'the change' occurred, and all I had was, "wait a minute.. did something happen? is it me? 🧐". I might not have put two and two together had I not read this. As a "front-line user" who (currently) doesn't have the time to keep up with the nuances behind the scenes, this type of insight is so critical to keeping perspective while working with my AI 'team'. It is almost the equivalent of walking into a creative meeting and noticing a key member appears listless - did someone close to them fall ill? Have they? (Maybe.) What to do?

Michelle Lentz

AI & Change Enablement Leader | Human-Centered AI Adoption | HR & L&D Innovation | AI Governance & Ethics

2d

Sigh. I posted a solution (or 2) to this here: https://lnkd.in/gjVMhCgt After selecting Listener for myself, and tweaking the Traits, it’s pretty much the same personality I had access to before. This was really tested in a journaling “Project” I set up, where I journal and it gives me thoughtful and empathetic prompts. I had to tweak the Project instructions a bit, but now it’s actually better than before. I also asked it if it had the same personality as before:

Brandon Berry

I help Coaches, Consultants, and Corporations create impactful Learning Experiences | Learning Experience Architect

2d

Good insight, Josh. IMO, I prefer a more neutral tone for AI models over a mimicking of human psychology to appear human-like. Mimicry blurs the lines between AI being a tool and AI being a friend, between what it means to be human and what it means to be a machine. It is convenient to have a tool in your pocket that you can talk to like a therapist, coach, teacher, or friend at any time of day. However, convenience does not equal healthy. While human connection is what we need and what we are seeking through these tools, we have to remember and never forget that human-AI connection is NOT human connection, just like lab-grown meat is NOT animal meat.

✨ Renee Hensley ✨

ATD 2025 BEST Award-Winning L&D | M.Ed., Ed.D. (ABD) | Learning Strategy & Performance Focused Leader

2d

Couldn’t have said it better myself! 👏🏻👏🏻👏🏻

Isaac W. Hubbell

Anthropologist, A.I. Researcher, Podcast Host, Author

1d

Great point, Josh! This is a lot like what we’ve discussed. I’ve noticed the lack of warmth as well while working on some new conversations with it for the podcast. I miss GPT-4 tremendously. I wish these companies would focus more on the anthropology of their AI! It would make using it to teach (especially children) much easier and more effective!

Yulia Barnakova

AI & XR Innovation and Learning Advisor | TEDx & Keynote Speaker | Microsoft MVP | Helping leaders learn, innovate, and reinvent with emerging tech 🚀

2d

For sure - in my house, we are so attached to our favorite voice that it feels very odd to change it up. 😅
