I've been designing interfaces for over two decades. Websites, apps, platforms, dashboards — you name it, I've probably designed some version of it. But nothing has challenged my thinking about UX more than this question: what if the interface could sense how the user feels?
Not in a creepy, surveillance way. In a genuinely helpful way. The kind of helpful that could change the life of a student who's struggling and nobody notices.
Everything EdTech Gets Wrong
Let me be blunt. Most educational technology is designed for a fiction. The "average learner." A student who progresses at a predictable pace, engages consistently, and responds to the same motivational patterns as everyone else. This student doesn't exist — and the tools built for them fail the students who do.
Three things drive me crazy about how edtech works today:
- It assumes everyone learns the same way. The same interface, the same pacing, the same interactions for every student. A one-size-fits-all approach that fits almost nobody well — and completely fails neurodivergent learners who process information differently.
- It ignores emotional state entirely. Whether a student is engaged, confused, frustrated, or checked out — the platform doesn't know. It doesn't ask. It doesn't adapt. It just keeps delivering content at the same pace, in the same way, regardless of what's actually happening on the other side of the screen.
- It substitutes gamification for real engagement. Badges, points, streaks, leaderboards. These aren't engagement — they're decoration. They create the illusion of motivation without addressing the fundamental question: is this student actually learning, or are they just clicking through to get the next badge?
And this isn't just a professional observation. I see it at home. My daughter is on the autism spectrum, and when she uses educational apps, I can see the moment she disengages. The app can't. The app just keeps going.
What Is Affective UX?
Here's how I think about it in plain terms. Affective UX is designing an interface that can read the room.
It's emotional design — creating interfaces that respond to how you feel, not just what you click. It's human-centred AI — using technology to make the experience more empathetic and adaptive. And it's what I think of as "reading the room" — the app can sense when someone is confused, losing focus, or getting overwhelmed, and it adjusts.
Imagine a learning platform that notices a student has been staring at the same paragraph for three minutes without scrolling. Instead of doing nothing, it gently offers a different explanation. Or a simpler one. Or a short break. Not because the student clicked "I'm confused" — but because the system could tell.
That's affective UX. And for neurodivergent learners, it's not a nice-to-have. It's the difference between an app that works and one that doesn't.
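To make the dwell-time scenario concrete, here is a minimal sketch of what that sensing loop might look like. Everything in it is an illustrative assumption — the `DwellTracker` name, the three-minute threshold, and the choice to start with the gentlest intervention — not the API of any real platform.

```typescript
// Illustrative sketch: detect a long dwell on the same content with no
// interaction, then offer help instead of doing nothing. All names and
// thresholds are assumptions for the sake of the example.

type HelpOffer = "simpler-explanation" | "different-modality" | "short-break";

class DwellTracker {
  private lastActivityAt: number;

  constructor(
    private thresholdMs: number,
    // Injectable clock so the behaviour is testable without waiting.
    private now: () => number = Date.now
  ) {
    this.lastActivityAt = this.now();
  }

  // Call whenever the user scrolls, clicks, or otherwise interacts.
  recordActivity(): void {
    this.lastActivityAt = this.now();
  }

  // Poll periodically; returns an offer once the dwell exceeds the threshold.
  check(): HelpOffer | null {
    const idleMs = this.now() - this.lastActivityAt;
    if (idleMs < this.thresholdMs) return null;
    // Start with the gentlest intervention; a real system would escalate
    // gradually and let the learner dismiss it at any point.
    return "simpler-explanation";
  }
}
```

The injectable clock matters: an adaptive intervention like this should be testable deterministically, so the "three minutes without scrolling" rule can be verified without a real three-minute wait.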
What Would a Good Tool Actually Look Like?
I think about this a lot — partly as a researcher, partly as a designer, and mostly as a parent. If I could build the perfect learning tool for my daughter, what would it do?
- It would adapt to her pace. When she's struggling, it would slow down and simplify without making her feel like something is wrong. No "you got it wrong!" screens. No red marks. Just a quieter, gentler path through the same material.
- It would reduce sensory overload. Clean, calm interfaces. No flashing animations competing for attention. No loud sound effects. No visual clutter. Just the content, presented in a way that doesn't overwhelm.
- It would notice when she drifts. Not to punish her — never to punish. But to gently re-engage. Maybe a subtle visual cue. Maybe a different modality. Maybe just a pause that says "it's okay, take your time."
- It would let her be herself. This is the big one. It wouldn't try to make her learn like a neurotypical child. It would work with how her brain operates, not against it. Her differences aren't bugs to be fixed — they're features to be designed for.
None of these exist in any edtech tool I've seen. Not meaningfully. Not in a way that actually works.
The Accessibility Afterthought
Here's the uncomfortable truth about our industry: accessibility is a checkbox, not a mindset. I say this as someone who has worked in UX for over 20 years. Most teams treat accessibility as something to address at the end of a project — a WCAG audit, some ARIA labels, maybe a colour contrast check. Done.
But neurodivergent UX isn't even part of that checklist. When was the last time you saw a design spec that included considerations for autistic users? For ADHD users? For users whose attention and emotional regulation work differently from what we consider "standard"?
It's a total afterthought. And the reason isn't malice — it's ignorance. Most designers simply don't know how to design for neurodivergent users because nobody taught them. There are no widely adopted frameworks for it. No best practices. No guidelines you can point to the way you point to WCAG for visual accessibility.
That's part of what I'm trying to change with my research.
The Privacy Elephant in the Room
I'd be lying if I said the privacy question doesn't keep me up at night. Because here's the tension: to build a system that senses how a student feels, you need data. Facial expressions. Vocal cues. Behavioural patterns. And the idea of an AI watching students — especially children, especially neurodivergent children — understandably makes people uncomfortable.
It's a huge concern. And the people who raise it are right to worry.
But I believe it's solvable with design. Edge computing lets the data stay on the device: no cloud uploads, no server-side storage, no surveillance infrastructure. On-device processing, user control, data minimisation. You can build a system that reads the room without recording the room.
Is it worth the trade-off? I think so — if done carefully. The alternative is what we have now: struggling students who fall through the cracks because the technology is blind to their experience. A privacy-respecting system that helps a student stay engaged is better than a perfectly private system that ignores them completely.
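One way to make the data-minimisation principle concrete is to ensure raw signals never escape the component that classifies them. The sketch below is purely illustrative — `OnDeviceEstimator`, `RawFrame`, and the buffer-scrubbing step are assumptions, not a real framework — but it shows the shape of the idea: only a coarse, ephemeral label ever reaches the rest of the app.

```typescript
// Illustrative sketch of on-device data minimisation: the raw frame is
// classified and immediately scrubbed; only a coarse engagement label,
// with no image data attached, is ever exposed. All names are assumptions.

type RawFrame = { pixels: Uint8Array; capturedAt: number };
type EngagementLabel = "engaged" | "confused" | "disengaged";

class OnDeviceEstimator {
  // `classify` stands in for an on-device model; no network calls anywhere.
  constructor(private classify: (frame: RawFrame) => EngagementLabel) {}

  estimate(frame: RawFrame): EngagementLabel {
    const label = this.classify(frame);
    // Scrub the raw buffer before it can be retained or inspected elsewhere.
    frame.pixels.fill(0);
    return label;
  }
}
```

The design choice here is architectural, not cosmetic: because the raw frame is consumed and destroyed inside one method, there is simply no code path through which it could be uploaded or stored.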
Full Transparency, Always
There's one principle I won't compromise on: the student should always know.
If the system is sensing their emotional state, they should be told. Clearly. In language they understand. It should be a feature they're aware of and can control, not something happening secretly in the background. Even if it would be "easier" to make it invisible, even if invisibility would mean less cognitive load — transparency isn't optional.
Because here's the thing: if you build a system that watches people without their knowledge, you've built surveillance. No matter how good your intentions are. The ethics of affective computing depend entirely on consent and transparency. The moment you lose those, you lose everything.
The goal isn't to build technology that watches students. It's to build technology that sees them — and that they know sees them.
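A consent-first design can be expressed structurally, not just as policy. The sketch below is an illustrative assumption (the `ConsentGatedSensing` name and its methods are invented for this example): sensing is off by default, requires an explicit opt-in, and always exposes its current state so the interface can show a visible "sensing is on" indicator.

```typescript
// Illustrative sketch: sensing is opt-in, revocable, and never silent.
// All names here are assumptions, not any real library's API.

class ConsentGatedSensing {
  private enabled = false;

  optIn(): void { this.enabled = true; }
  optOut(): void { this.enabled = false; }

  // The UI binds a visible indicator to this, so sensing is never hidden.
  isSensing(): boolean { return this.enabled; }

  // Sensing work only runs after opt-in; otherwise it is a no-op.
  sample<T>(read: () => T): T | null {
    return this.enabled ? read() : null;
  }
}
```

Making consent a gate in the code path, rather than a checkbox in a settings page, means a missing opt-in fails safe: without it, the sensing code literally never executes.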
Where UX and AI Meet
This is where my two worlds collide. The UX designer in me thinks about interfaces, interactions, how things feel to use. The researcher in me thinks about models, data, accuracy, bias. And the parent in me thinks about whether any of it actually helps a real person sitting in front of a real screen.
My PhD research focuses specifically on 18+ university students — autistic adults in higher education who are falling through the cracks. That's where the data is, that's where the need is most measurable, and that's where I can make a rigorous contribution. But if it works — if the framework proves effective — adapting it for younger learners is a natural next step. The principles of affective UX don't have an age limit.
Affective UX in education isn't just a technical challenge. It's a design challenge. You can build the most accurate emotion detection model in the world, but if the interface's response to that detection is clunky, intrusive, or patronising — it fails. The AI is only as good as the UX it powers.
That's why I think this research needs people who've spent time in both worlds. People who understand what a good user experience feels like and how to build the AI that supports it. I don't know if I'm the right person for that. But I know I'm going to try.
Affective UX is still a young field. The intersection of emotional sensing, interface design, and neurodivergent-inclusive education is barely explored. But that's exactly why it matters. The tools we build today set the foundation for what learning looks like tomorrow.
My daughter deserves better tools. So does every learner who's ever felt invisible to the technology that was supposed to help them. And as both a designer and a researcher, I think we can build them. We just need to start by actually seeing the students we're designing for.