Inclusive Design with AI: Making UX Accessible for All
by USER Experience Researchers
When we were young, many of us dreamt of designing cities, conceptualising superheroes, or building contraptions from toy bricks that solved "real world" problems. At the time, we didn't have the words for it, but we were already thinking about design. Not just any design, but inclusive design. We just didn't know that term yet.
As those children grew, some became designers, eventually discovering the field of user experience (UX), and that same instinct remained: to solve problems for people. Not just people like us, but everyone. And now, with AI in our toolbox, we're facing both exciting opportunities and important questions: Who are we designing for? And, more crucially, who are we leaving out?
This is the story of how inclusive design meets artificial intelligence. A story we’re still writing, and one that’s shaped by the people at the heart of our products.
Act One: Noticing Who’s Missing
Inclusive design, at its core, is about noticing.
It's noticing the person who relies on a screen reader. The one who prefers "dark mode" because of migraines. The left-handed child using an interface designed for right-handers. The non-native speaker navigating unfamiliar forms. It's about the people who are always there, but rarely considered.
When we work on a new project at USER, we ask ourselves: What assumptions are baked into this experience? Whose voice didn’t make it into the early sketches? That’s where we begin.
AI helps us listen more broadly. With it, we can analyse vast amounts of user data to find behavioural patterns we might not have seen. It can scan for colour contrast issues at scale, flag missing alt text, and surface moments of friction in a flow. It can even help us recruit more diverse testers by recognising gaps in demographic coverage.
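As a rough sketch of what "scanning at scale" can look like, here is a minimal, illustrative Python example (not our production tooling) that covers two of the checks mentioned above: computing the WCAG 2.x contrast ratio between two colours, and flagging `<img>` tags that have no `alt` attribute at all:

```python
from html.parser import HTMLParser


def relative_luminance(rgb):
    """WCAG 2.x relative luminance for an (r, g, b) tuple of 0-255 values."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(fg, bg):
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)


class MissingAltScanner(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "<no src>"))


# Black-on-white body text easily clears the WCAG AA threshold of 4.5:1.
ratio = contrast_ratio((0, 0, 0), (255, 255, 255))
print(round(ratio, 1))  # 21.0

scanner = MissingAltScanner()
scanner.feed('<img src="hero.png"><img src="logo.png" alt="USER logo">')
print(scanner.missing)  # ['hero.png']
```

A real audit pipeline would also resolve CSS to find the actual foreground/background pairs and distinguish decorative images (empty `alt=""` is fine) from informative ones; this sketch only shows the shape of the check.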
But let’s be clear: AI isn’t the protagonist. People are. AI, in this act, plays the supporting role, a very powerful one.
Act Two: Conflict, Friction, and Real People
Sometimes, the tension starts when a shiny new solution hits reality.
You've done the planning. You've run the sprints. Your design looks solid in Figma. But then someone who uses voice navigation tests it, and suddenly nothing works the way you'd hoped. The labels don't read clearly. The flow loops endlessly. You can't skip the intro, and the buttons don't say what they do, or don't do what they say.
This is the moment of conflict. But also opportunity.
We’ve seen this firsthand in usability sessions. One of our participants, a visually impaired student, tried to use a government portal powered by AI-generated responses. The screen reader described everything... except the one button she needed to move forward. A single label was missing. But that label was the gateway to the next page, the whole point of the task. And she was stuck.
The AI didn’t catch the omission. But the person did.
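Omissions like that one can also be caught mechanically once you know to look for them. As an illustrative sketch (deliberately simplified, and no substitute for testing with real people), a few lines of Python can flag buttons that expose no accessible name: no visible text, no `aria-label`, no `aria-labelledby`:

```python
from html.parser import HTMLParser


class UnnamedButtonScanner(HTMLParser):
    """Flags <button> elements with no accessible name.

    Simplified on purpose: the real accessible-name computation also
    considers title attributes, alt text on nested images, etc.
    """
    def __init__(self):
        super().__init__()
        self.flagged = []
        self._open = None  # (attrs, has_text) for the button being parsed

    def handle_starttag(self, tag, attrs):
        if tag == "button":
            self._open = (dict(attrs), False)

    def handle_data(self, data):
        if self._open and data.strip():
            self._open = (self._open[0], True)

    def handle_endtag(self, tag):
        if tag == "button" and self._open:
            attrs, has_text = self._open
            if not has_text and not (
                attrs.get("aria-label") or attrs.get("aria-labelledby")
            ):
                self.flagged.append(attrs.get("id", "<unnamed>"))
            self._open = None


scanner = UnnamedButtonScanner()
scanner.feed(
    '<button id="next"><svg></svg></button>'  # icon-only, no name: flagged
    '<button id="submit" aria-label="Submit form"><svg></svg></button>'
)
print(scanner.flagged)  # ['next']
```

A check like this would have flagged that missing label before our participant ever hit it. What it can't tell you is how it feels to be stuck there; only she could.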
This is why we test. And not just with one type of user. Not just in ideal conditions. Real lives are messy, noisy, imperfect, and so our research should be, too.
Remote tools like Lookback or Teams let us meet people where they are. AI can transcribe and sort responses, find sentiment trends, and even group behavioural patterns. But when someone sighs before answering, when they pause just slightly too long, that’s a cue no algorithm can fully interpret. That’s where our empathy kicks in. That’s where design decisions are made, not just by logic, but by care.
Act Three: Building Forward, Together
The third act is where things resolve, or at least, move forward.
You’ve listened. You’ve seen what works and what doesn’t. The AI helped flag issues, sure, but the real magic was in the conversation. In the moment someone told you, “I gave up and just asked my son to do it for me,” and you realised that was your design failing them.
Now it’s time to make it right.
We bring the whole team into these final sessions. Product owners, team leads, designers, copywriters. We present the issues not just as problems, but as human stories. We show clips. Quotes. Patterns. And we suggest what could be.
This is when AI re-enters not as the fixer, but as the builder’s assistant. Helping us generate accessible variants, test multiple flows, and simulate different needs. But always grounded in what we learned from real people.
And when we implement those changes, whether it’s as small as a clearer label or as major as a re-architected journey, we see something shift. The product becomes less exclusive, less clever for the sake of cleverness, and more generous.
More useful. More human.
That’s where we win – when designs work for people.
Where Do We Go from Here?
Inclusive design isn’t a destination. It’s a direction. And AI, used well, can help accelerate the journey. But it can’t walk it for us.
At USER, we believe in designing with, not designing for. We believe in checking our biases, sharing the mic, and testing until the testing itself feels fair. We believe AI is a powerful tool, but never the sole expert in the room.
And above all, we believe accessibility isn’t an edge case. It’s the case.
So, here’s our call to action: next time you’re mapping out a feature or refining an experience, ask yourself: Who might this unintentionally exclude? Then bring them in. Let them shape it. And let AI assist, not decide.
The future of UX is inclusive. We’re building it with you.
Ready to create more inclusive, AI-assisted experiences?
Partner with USER to build solutions that include everyone, because great UX is for all.
📩 Email us at: project@user.com.sg