The Adoption Gap: Why Great Tools Often Go Unused
Every team has one: the clever analytics dashboard that barely gets opened. Despite its potential to transform decision-making, it quietly gathers digital dust while users revert to spreadsheets, or worse — ask the nearest colleague for help.
This isn’t a story of poor functionality or bad ideas. It’s a story of missed connections — between the system and the humans it was built for.
I am investigating a recurring problem in enterprise software, particularly in business intelligence tools: why aren’t users adopting them, even when they’re technically sound?
The answer, it turns out, lies in the invisible layers — in the way information is organised, content is structured, and confidence is either built or eroded through each interaction.

The Adoption Problem Is More Than Usability
The issue isn’t usually one of raw functionality. It’s that users often don’t feel confident or supported enough to use the system as intended. They’re unsure where to begin, can’t find what they need, or find the learning curve too steep to overcome.
As Jakob Nielsen noted, usability is not a single quality but a constellation of qualities, including:
- Learnability – can a new user get started without friction?
- Efficiency – can they do what they need quickly?
- Satisfaction – does the system leave them feeling competent, not defeated?
Systems that don’t clear this bar — even if they “work” — often fail to earn consistent, long-term engagement.

From Functionality to Findability
Platforms, dashboards, and data-heavy tools are complex by design. But complexity doesn’t have to feel confusing. Many of the adoption barriers I encountered can be improved with better information architecture and clear content.
Support content, in particular, has a part to play. Rather than guiding users through tasks in context, it often exists as a disconnected library or glossary. In many organisations, users ignore built-in help altogether and turn to colleagues, Google, or support tickets.
But if your tool is proprietary or closed to public search engines, this workaround breaks down — and so does user trust.
Learning Without Direct Access
So there are usability issues, but that is only a preliminary hypothesis. How do we gather data to see whether we're on the right path? Because access to end users can be limited, research often requires indirect methods. I relied on what are known as proxy interviews: speaking with account managers and the data team who regularly interacted with users. I also analysed support tickets and internal documentation. The most insightful method, though, was observing client calls.
These methods, as usability expert Galitz (2007) points out, are not only practical; they can uncover valuable behavioural insights when direct testing isn't possible.
One discovery was that users were not confident in navigating the tool. They didn’t know what to expect, where to find things, or whether they were using it “correctly.”
In short: they couldn’t find their way in — and so they opted out.
The lesson so far: it's not just the tool, it's the experience around it.
Most people don’t stop using a tool because they’re stubborn or unwilling. They stop because they feel unsure — unsure where to start, unsure if they’re doing it right, unsure if it’s worth the effort.
Confidence isn't something users bring with them. It's something we design for. How do we design for it? I'm reading, learning, and experimenting as I progress through my thesis.