After 20+ years building software — from Flash games to full-stack web apps — I did something that surprised a lot of people. I went back to university. Not to learn to code. To do a PhD.
I'm at the very beginning of this journey. Day one, really. Right now I'm deep in the systematic review phase — reading papers, mapping the landscape, understanding what's been done and what hasn't. But I wanted to write about why I'm doing this, because the question I'm exploring matters to me deeply.
The Question
My research at the University of West London is focused on a specific problem: how can AI detect engagement and focus shifts in autistic higher-education students?
Not general students. Not children. Specifically autistic adults in university — a group that's growing in higher education but remains significantly underserved by the tools meant to support them.
36% of autistic students who enrolled in 2019 did not complete their degree within three years, compared with 29% of students overall. Only about one in five autistic university students feel they receive the support they need.
These are real numbers. And behind every number is a student who struggled not because they weren't capable, but because the system wasn't designed for how they learn.
Why This Matters
Most e-learning platforms treat every user the same way. Whether a student is frustrated, confused, in flow, or completely disengaged — the system doesn't notice. It just keeps delivering content at the same pace, in the same way.
For neurotypical students, this is already a problem. For autistic students, it can be the difference between completing a degree and dropping out. Many autistic learners struggle with regulating attention and maintaining focus during online learning. And the platforms they use? Completely blind to it.
What if the technology could actually sense when a student is losing focus? When frustration is building? When they need a break, a different explanation, or just a moment to reset?
The Idea: Affective Computing Meets Education
The field I'm researching is called affective computing — using AI to interpret human emotional and behavioural signals. Think facial expressions, vocal tone, interaction patterns. The technology exists to detect these signals in real time. But there are big gaps:
- Most models are trained on neurotypical data — autistic individuals often express emotions differently, and off-the-shelf classifiers can misread them entirely
- One-size-fits-all doesn't work — what "focused" looks like varies hugely from person to person, especially in neurodivergent populations
- Privacy is critical — any system that watches students needs to be ethical by design, not as an afterthought
- Existing AI tools for autism focus mostly on teaching social skills, not on responding to the learner's own affective state in real time
The vision — and it's still just a research question at this stage — is a modular AI framework that can detect engagement and focus using multimodal signals (facial expressions and vocal cues), fine-tuned specifically for autistic learners, with a personalisation layer that adapts to each individual over time.
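To make "multimodal signals plus a personalisation layer" a bit more concrete, here's a toy sketch of the shape such a framework could take. This is not the research design — every name, weight, and number here is a hypothetical placeholder, and the real per-modality models don't exist yet. It just illustrates the modularity: per-modality scores fused late, with personalisation sitting on top of a frozen model.

```python
from dataclasses import dataclass


@dataclass
class Signals:
    """Per-window engagement estimates from two modalities, each in [0, 1].
    In practice these would come from facial-expression and prosody models."""
    face_engagement: float
    voice_engagement: float


def fuse(signals: Signals, face_weight: float = 0.6) -> float:
    """Late fusion: a weighted average of the per-modality scores.
    The weight is an arbitrary placeholder, not an empirical value."""
    return face_weight * signals.face_engagement + (1 - face_weight) * signals.voice_engagement


class PersonalisedDetector:
    """Wraps the fused score with a per-learner threshold, so adaptation
    happens at the application level rather than inside the model."""

    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold

    def calibrate(self, baseline_scores: list[float]) -> None:
        """Set the threshold relative to this learner's own baseline,
        so 'low engagement' means low *for them*, not for an average user."""
        mean = sum(baseline_scores) / len(baseline_scores)
        self.threshold = 0.8 * mean  # flag only clear drops below their norm

    def is_disengaged(self, signals: Signals) -> bool:
        return fuse(signals) < self.threshold
```

The design point the sketch is making: because personalisation lives in a thin wrapper, the underlying detector could be swapped or improved without retraining anything per user.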
Where I Am Right Now
I want to be honest about this: I haven't built anything yet. I'm in Phase 1 — the systematic literature review. Reading everything I can find on affective computing in education, multimodal engagement detection, autism-specific expression patterns, and the ethical considerations of monitoring neurodivergent students.
It's a different kind of challenge from what I'm used to. In industry, you build fast and ship. In research, you read, question, and make sure you deeply understand the problem before you write a single line of code. It's humbling, and honestly, it's exciting.
Some of the questions I'm sitting with right now:
- How accurately can a multimodal AI model detect engagement shifts in autistic learners?
- Can personalisation at the application level reduce false detections without retraining the model?
- What does an ethical, comfortable experience look like for autistic students being monitored?
- How do educators perceive the usefulness of such a system?
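On the second question — reducing false detections without retraining — one candidate family of techniques is simple temporal filtering in the application layer. The sketch below is purely illustrative (all parameters are made-up placeholders): it smooths the model's raw score with an exponential moving average and only flags a focus shift after several consecutive low readings, so a single noisy frame can't trigger an alert, and the model's weights are never touched.

```python
class AdaptiveGate:
    """Application-level filter over a frozen model's engagement scores.
    EMA smoothing plus a 'patience' counter suppresses one-off dips,
    trading a little latency for fewer false detections."""

    def __init__(self, alpha: float = 0.5, threshold: float = 0.4, patience: int = 2):
        self.alpha = alpha          # EMA smoothing factor (placeholder value)
        self.threshold = threshold  # could be calibrated per learner
        self.patience = patience    # consecutive low readings before flagging
        self.ema = None
        self.low_streak = 0

    def update(self, raw_score: float) -> bool:
        """Feed one raw score; return True only on a sustained drop."""
        self.ema = raw_score if self.ema is None else (
            self.alpha * raw_score + (1 - self.alpha) * self.ema)
        self.low_streak = self.low_streak + 1 if self.ema < self.threshold else 0
        return self.low_streak >= self.patience
```

With these placeholder settings, a single dip in an otherwise high stream (0.8, 0.1, 0.8) never fires, while a sustained drop does — which is exactly the false-positive/latency trade-off the research question asks about.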
I don't have answers yet. That's the point of a PhD.
Why Me, Why Now
People ask why I'd leave a successful career to go back to academia. The truth is, I'm not leaving anything behind — I'm combining everything: 20+ years of building software, designing interfaces, understanding users, and thinking about systems. My background in design gives me empathy for the user. My background in development gives me the ability to build what I design. And now, research gives me the rigour to make sure it actually works.
I've spent two decades building things for clients. Now I want to build something that genuinely helps people who need it.
This is the beginning. I'll be writing more about this journey as it unfolds — the systematic review findings, the decisions I make along the way, and eventually the technical work itself. If you're in the affective computing or inclusive education space, I'd love to connect.