Uncovering AI Needs When Users Don’t Know What They Need

In the early moments of any AI conversation, there’s often a pause. A hesitation. Not from disinterest, but from the sheer weight of the unknown.
As I move deeper into AI strategy work, I’m reminded of something deceptively simple: people cannot ask for what they do not yet understand. And in the context of AI, that gap between possibility and perception is vast.
This is not a failure.
It is the path.
To navigate it, we need more than models and infrastructure. We need something far harder to quantify: vision, trust, and a shared language to bridge the space between capability and comprehension.
The Lesson: Tools Don’t Land — Stories Do
Recently, we introduced a human-facing tool designed to improve model performance through better-quality data. The value was there. The potential was real. But we led with the how instead of the why.
The result? Misalignment.
Not because the technology was flawed, but because the story didn’t resonate with the people we were trying to help. The tool addressed a problem they hadn’t yet seen, let alone felt.
This was the moment we pivoted (quickly), because people don't adopt tools. They adopt outcomes.
And outcomes require belief.
The Quiet Power of Education
In most organisations, AI conversations begin before clarity arrives. Leaders are asked to make choices about systems whose language they don't yet fully speak. We can see this as a gap, or we can see it as an invitation.
Education isn’t a preamble to the work.
It is the work.
It creates a shared perspective. It transforms “What is this AI thing?” into “What could this unlock for us?”
It prepares the soil so the tomatoes (I mean, the strategy) can take root.
Listening Beneath the Ask
Once understanding takes hold, the next task isn’t to respond with a feature. It’s to listen for the unsaid.
What’s slowing you down?
What do you wish you had more of — time, clarity, foresight?
What do you reach for when nobody’s looking?
This isn’t solutioning. This is uncovering.
It’s the transition from problem-solving to problem-finding.
And it’s where human-centred design becomes not just useful, but essential.
From there, it's about translating that understanding of fundamental needs into potential AI-driven solutions.
Making the Invisible Visible
Even when the right problem is found, AI can remain abstract. That's why pilots matter: not as proofs of concept, but as acts of translation.
A good pilot doesn’t just test a capability — it tells a story.
It shows people what AI feels like in their world.
It builds momentum by making the future touchable.
Moving Beyond the Hype
Yes, AI is everywhere.
But impact lives not in hype or headlines, but in the quiet clarity of alignment:
Between what technology can do,
What people need,
And what the organisation is truly here to deliver.
This is where return on investment lives.
Not just in automation or savings, but in insight. In better decisions.
In time regained and purpose restored.
The Role I Want to Play
AI is not just technical. It is deeply human.
It asks us to sit at the intersection of logic and intuition, data and emotion, vision and care.
This is the space I am drawn to.
Where strategy is not just a roadmap but a shared imagining.
Where adoption is not won by force but by trust.
And where the first step is always the same: listen deeply.
Because in the end, AI is not about the model. It’s about the moment it creates. When someone sees a little more clearly, acts a little more confidently, and feels a little more understood.
That’s where the real intelligence lives.
TAGS
Human Centred AI
AI Strategy
AI Adoption
AI Product Management