A non-technical guide to the distinction that changes every decision downstream.
A hotel chain notices something interesting in its data. Properties where guests rate the breakfast experience highly also have significantly higher rebooking rates. The pattern is clear, consistent, and statistically robust. So, the executive team commissions a £4 million breakfast upgrade programme across its properties.
Twelve months later, the refurbished breakfast rooms are beautiful, and guest satisfaction with breakfast has gone up. Unfortunately, rebooking rates have barely moved.
What went wrong? The data was right. The pattern was real. Guests who rated breakfast highly did rebook more often. But the breakfast was not the reason they came back. The properties with the best breakfast ratings also happened to be the ones with the most experienced general managers: people who ran tighter operations across the board, from how housekeeping was scheduled to how complaints were handled to whether the night staff actually answered the phone. It was the overall quality of management, not the full English, that kept guests returning. Breakfast was a symptom of a well-run hotel. It was not the cause of loyalty.
The hotel chain fell into the most expensive trap in business analytics. It mistook correlation for causation, and made a costly decision on the strength of a pattern that told the wrong story.
A distinction worth millions
Most senior leaders have heard the phrase “correlation is not causation.” It appears in strategy decks, risk reports, and the occasional boardroom pack. But in practice the distinction is routinely ignored. This is not out of carelessness, but because the analytical tools most organisations rely on make it almost impossible to tell the two apart.
When your analytics platform shows you that variable A moves with variable B, it cannot usually tell you whether A causes B, whether B causes A, whether both are driven by something else, or whether the relationship is real at all. It shows you the relationship. It leaves the causal story for humans to fill in. And humans, even very experienced ones, tend to fill it in with the most intuitive narrative, which is often wrong.
The hotel chain’s leadership did what most reasonable people would do. They saw a credible pattern, constructed a plausible explanation, and acted on it. The problem was not a lack of rigour. It was a lack of the right kind of analysis.
An alternative to thinking in patterns
The shift is from asking “what moves with what?” to asking “what actually drives what, and through what pathway?”
Applied to the hotel chain, a causal approach would work differently. Rather than simply measuring the correlation between breakfast ratings and rebooking, it would map the potential causal pathways. What happens at properties where breakfast quality improved but management did not change? What happens when a strong general manager moves to a different property? Do the rebooking rates follow the manager, or stay with the building?
By modelling these pathways, the causal analysis separates the genuine drivers from the incidental ones. The conclusion might be that management quality explains most of the variation in rebooking rates, while breakfast quality explains very little. That leads to a completely different investment decision. One focused on management development rather than kitchen refurbishment.
This is what Causal AI does. It builds models of the mechanisms that connect actions to outcomes, rather than simply measuring which outcomes tend to appear alongside which conditions. The result is not just a more accurate picture. It is an actionable one.
Questions worth asking
There is a simple test for whether your organisation’s analytics are giving you causal insight or just well-presented correlation. It comes down to the kinds of questions the analysis can actually answer.
Most analytics can tell you what happened in the past, and what is likely to happen in the future if conditions stay roughly the same. That is genuinely useful – it is how forecasting and anomaly detection work, and how most of the dashboards in your organisation earn their keep. But it is not the same as being able to say what would happen if you took an action you have not taken before. If your system can only extrapolate from historical patterns rather than model the effect of a new intervention, then you are working with correlations, however sophisticated they look.
The next question is even harder. When a positive outcome follows a decision, organisations naturally credit the decision. The customers who received a loyalty bonus and stayed – were they going to stay anyway? The patients who improved after treatment – would they have improved without it? The factory that hit its targets after a new shift pattern was introduced – was the shift pattern responsible, or did the order book happen to recover at the same time? This kind of counterfactual reasoning is how organisations distinguish effective actions from expensive coincidences. It is also where most management reporting falls silent.
And then there is the question that the hotel chain failed to ask. When performance varies, the instinct is to identify the factor that differs most visibly between the high performers and the rest. But visible differences are not always causal ones. The factor that stands out may be correlated with the true driver without being the driver – exactly as breakfast quality was correlated with management quality. A causal approach traces an outcome back through the web of contributing factors and asks which ones genuinely move the needle, and which ones are just along for the ride.
If your current analytics cannot answer questions like these with any confidence, you are in the majority. But you are also making decisions with an incomplete picture, and the gap between what your data shows and what it means is where money and strategic focus quietly go to waste.
Why good leaders get this wrong
Let’s be honest about why this problem persists, because it is not about competence. Senior leaders get this wrong for the same reason everyone does. The human brain is built to construct causal narratives from observed patterns. We see two things happening together and instinctively infer that one drives the other. In daily life, this instinct serves us extraordinarily well. In complex systems with hundreds of interacting variables, it leads us astray.
The analytical tools most organisations use reinforce this tendency rather than correcting it. Dashboards present patterns with precision and confidence. They rarely flag the difference between “these things move together” and “this thing causes that thing.” The output looks authoritative, and the causal gap is invisible unless you know to look for it.
This is not about replacing experienced judgement with technology. It is about equipping experienced judgement with better evidence. A leader who knows that management quality drives guest loyalty makes very different investment decisions from one who believes it is about the breakfast buffet – even when both are looking at the same satisfaction data.
What this means in practice
The practical implication is not that every organisation needs to overhaul its analytics overnight. It is that leaders should start developing an instinct for when they are looking at a causal insight and when they are looking at a well-dressed correlation.
When someone presents a finding that implies a course of action, the most useful question is also the simplest. Have we established that this is a cause, or just a pattern? If we acted on it, are we confident about the mechanism through which it would produce the outcome we want, or are we hoping?
These are not technical questions. They do not require a data science background. They are leadership questions – and asking them consistently changes the quality of the decisions that follow. Not every answer will require a full causal model. But knowing when you need one, and when you are operating on assumption, is itself a significant upgrade.
Correlation has been immensely valuable. It has helped organisations see patterns they could never have found on their own. But seeing a pattern and understanding what drives it are different things. For the decisions that carry real weight, correlation gets you to the question. Causation gets you to the answer.
This is the second in a series of articles exploring Causal AI in the enterprise.