8 Jan 2025
The Data Delusion: why more tools won’t save your storefront
If you’ve been around testing or eCommerce for more than 5 minutes, you’ve probably experienced some version of this:
A company spots conversion issues, so they do what everyone does first: install a bunch of tools haphazardly.
👉 Hotjar this week.
👉 Microsoft Clarity next week.
👉 Maybe Heap. Maybe PostHog.
👉 Someone mentions QuantumMetric, and so on.
Dashboards multiply. Tags pile up. Maybe there’s even a Slack channel called #insights.
But here’s the twist:
Nobody actually uses the data.
I worked at a company like this once.
Every few weeks, a new tool would be introduced with great fanfare (“this will give us the answers!”).
Then the fanfare died down, the insights never came, and the next shiny thing was installed. And during all that time, I was the only person occasionally logging into these tools. Watching sessions. Noticing patterns.
The tools weren’t the problem. The culture of data addiction was.
More data doesn’t create better decisions. It just creates better distractions.
Here’s a concrete example:
Let’s say your PDP has a 66% drop-off rate – meaning only one-third of users who see a product actually move to cart. You already have this from GA4 or any basic funnel report.
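To make the arithmetic concrete, here’s a minimal sketch of that drop-off calculation. The event names and counts are hypothetical, standing in for numbers you’d pull from a GA4 funnel report:

```python
# Illustrative only: these counts are made up, standing in for
# "viewed PDP" and "added to cart" user counts from a funnel report.
pdp_views = 30_000    # users who viewed a product detail page
add_to_cart = 10_200  # users who then moved to cart

drop_off = 1 - add_to_cart / pdp_views
print(f"PDP drop-off: {drop_off:.0%}")  # 66% with these sample numbers
```

That single ratio is the whole insight; no extra tooling is needed to produce it.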
So what’s the instinct?
Well, people say “let’s deepen the analysis.” Let’s add scroll tracking, heatmaps, session recordings, maybe even form analytics to see how users are interacting with the reviews widget.
Okay. Let’s say you do that. You find:
Scroll depth average = 48%
Only 22% reach the “Add to Bag” button
35% of users who do reach it never click
Okay, that’s all great. Now what?
Often, the correct question isn’t “how can we gather more data?” It’s:
“What does the data we already have suggest, and how do we challenge that assumption with a test?”
Maybe you hypothesise that users aren’t scrolling far enough because the product description is too long. So you collapse it into an accordion and move key CTAs above the fold.
That’s how data becomes action.
You look at the basics, form a theory, test it, and then see what happens.
If instead you just go install another tool to track hover intent or emotion scoring or whatever… you’ve done nothing to improve the conversion rate.
Data is only valuable when it leads to decisions.
Everything else is distraction dressed as diligence.
When teams chase more data, what they’re really chasing is certainty.
But certainty doesn’t exist in CRO. You’re never going to find that one finding that explains everything.
What you can do is get directionally confident and move quickly.
For example:
Let’s say your cart abandonment rate is 74%. That’s higher than usual.
Do you need a 30-day behaviour flow report to confirm this?
Or, can you:
Look at time-on-page in the cart (let’s say it’s low – under 10 seconds avg.)
Spot that there’s no trust messaging, no “secure checkout” icon, no free returns note
Run a test adding those in – and see if abandonment drops
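If it helps to see how “did abandonment drop?” gets judged, here’s a sketch using a two-proportion z-score. It assumes you track cart sessions and completed checkouts for a control and a variant; all numbers are invented for illustration:

```python
import math

# Hypothetical test readout: carts opened vs. orders completed,
# for the control page and the variant with trust messaging added.
def abandonment_delta(control_carts, control_orders, variant_carts, variant_orders):
    """Return both abandonment rates and a two-proportion z-score for the change."""
    p1 = 1 - control_orders / control_carts  # control abandonment
    p2 = 1 - variant_orders / variant_carts  # variant abandonment
    abandoned = (control_carts - control_orders) + (variant_carts - variant_orders)
    pooled = abandoned / (control_carts + variant_carts)
    se = math.sqrt(pooled * (1 - pooled) * (1 / control_carts + 1 / variant_carts))
    z = (p1 - p2) / se
    return p1, p2, z

p1, p2, z = abandonment_delta(5000, 1300, 5000, 1450)
print(f"control {p1:.0%} -> variant {p2:.0%}, z = {z:.2f}")
```

With these made-up numbers, a 74% abandonment rate falling to 71% already clears the usual significance bar; the point is that a test answers the question directly, where another tool only describes it.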
Speed of insight > depth of insight.
High-performing teams get things wrong fast. Low-performing teams waste forever trying to get things right – and never ship anything.
Here’s a wild idea: Look at what you already have.
Instead of stacking 9 tools to find complex “insights”, open GA4 and ask:
Where are users dropping? (Pages / Funnel exploration)
Are they spending time on high-value pages?
What’s their exit page?
Where is the engagement rate suspiciously low?
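Those four questions amount to scanning two ratios per page. A rough sketch, using hypothetical page-level numbers in place of a GA4 Pages report export:

```python
# Hypothetical page-level numbers, standing in for a GA4 Pages report.
pages = [
    # (page, views, exits, engaged_sessions, sessions)
    ("/product/widget", 12000, 7900, 2100, 9000),
    ("/cart",            4100, 3000,  900, 3800),
    ("/checkout",        1100,  260,  700,  950),
]

for page, views, exits, engaged, sessions in pages:
    exit_rate = exits / views
    engagement = engaged / sessions
    # Arbitrary thresholds for illustration: high exits + low engagement
    # is where to start looking.
    flag = "  <-- suspicious" if exit_rate > 0.6 and engagement < 0.3 else ""
    print(f"{page:18} exit {exit_rate:.0%}  engagement {engagement:.0%}{flag}")
```

A fifteen-line script (or a spreadsheet column) over data you already export is usually enough to tell you where to look next.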
From there, you can bring in basic heuristics:
Is the page communicating value within 5 seconds?
Is the layout intuitive and legible, especially on mobile?
Are CTAs obvious and persuasive?
Does anything look broken, slow, or strange?
You’d be shocked how many conversion issues are solved by fixing one of the above.
I once increased a package page’s conversion rate by 10–20% simply by removing a sticky promo bar that blocked content on mobile screens.
That insight took me about 15 seconds of looking at the page, followed by one test. Not 400 screen recording sessions passed through some advanced analysis tool…
In conclusion: More data !== more clarity.
The right tools are great. But they’re not your bottleneck.
Your bottleneck is usually one of three things:
Lack of clear hypotheses
Fear of being wrong
An over-reliance on “more data” to delay making a call
Progress doesn’t come from being right. It comes from testing faster.
So stop searching for a tool that gives you certainty. Start building a culture that rewards speed, curiosity, and action.