When Stakeholders Say “We Knew That Already”: Hindsight Bias in UX Research
Why It Happens and What to Do About It
Has this ever happened to you? After weeks of careful user research – planning sessions, interviews, observations, analysis – you present your findings to the team. Instead of excitement or curiosity, you get… indifference. A stakeholder shrugs: “Yeah, we already knew this.” Another adds, “These insights are pretty obvious.”
It’s a deflating moment for any UX researcher. How could your hard-won findings be dismissed so casually? Were they truly that predictable?
In reality, this reaction often says less about the research and more about human psychology. The likely culprit is hindsight bias, a cognitive illusion that makes results feel obvious in retrospect even if they weren’t known beforehand (Fischhoff & Beyth, 1975; Kahneman, 2011).
Hindsight bias – sometimes called the “knew-it-all-along” effect, for obvious reasons – leads people to believe, after the fact, that they expected an outcome all along. Classic psychology studies by Baruch Fischhoff first documented how people’s memories of their prior opinions shift once they learn the actual results (Fischhoff & Beyth, 1975). After the event, we rewrite our memory, convincing ourselves “I knew that would happen” even when we didn’t. In the context of UX, stakeholders aren’t biased against UX researchers; they genuinely feel they anticipated the research conclusions, even when the research surfaced new information.
Cognitive psychologist Daniel Schacter lists hindsight bias as one of the “sins” of memory that distort how we recall our past knowledge (Schacter, 2001). Once we know a finding, our mind retrospectively adjusts our earlier assumptions to align with what we know now, erasing the uncertainty or surprise we originally had.
Why Hindsight Bias Makes Findings Seem Obvious
Psychologists have long studied hindsight bias and why it is so pervasive. According to Kahneman (2011), after learning an outcome, our brains stitch together a story that makes it feel inevitable. We selectively recall information that supports the outcome and ignore what might have suggested a different result (Roese & Vohs, 2012). The narrative falls into place easily: of course users struggled with that feature, the signs were all there! If the story makes sense, we assume we knew it.
Humans also like to see the world as orderly and ourselves as competent predictors; believing “we already knew that” satisfies this need for closure and ego (Roese & Vohs, 2012). Daniel Kahneman calls this the illusion of understanding: we believe we fully understand past events and thus underestimate how surprising they were. In UX terms, once the research reveals a problem or insight, stakeholders might unconsciously revise their knowledge to make the result seem obvious and themselves seem prescient.
The trouble is, hindsight bias can seriously undermine design decisions, learning, and the impact of our research. When stakeholders dismiss findings as obvious, they risk becoming overconfident and not digging deeper or taking action. Neal Roese observes that if you feel you “knew it all along,” you won’t stop to examine why something really happened (Roese & Vohs, 2012). In other words, teams may gloss over the root causes uncovered by research because they assume they already understand them. This false confidence can lead to poor prioritisation: real user pain points get ignored as “old news,” while the team chases new ideas or sticks to existing beliefs. Hindsight bias is a known driver of overconfidence in many domains, from business to medicine. In product development, a team convinced that “we know our users” may skip critical research or fail to address usability problems that, in hindsight, everyone supposedly knew about.
This bias can also erode respect for UX research: if outcomes are always seen as either unforeseeable or already known, research gets sidelined. Left unchecked, hindsight bias in stakeholders breeds overconfidence, faulty priorities, and missed insights. Teams become convinced they have all the answers (after the fact), which discourages investing time in user research or acting on new findings. Over time, this can create a toxic cycle: product decisions are made on assumed knowledge, research is under-valued, and the user experience suffers from issues that everyone “knew” but nobody fixed. Good design thrives on learning and confronting uncertainties – exactly what hindsight bias discourages.
What Can We Do About It? Strategies for UX Teams
Hindsight bias may be deeply human, but its impact can be mitigated. UX researchers and psychologists suggest several strategies for preventing the “we knew it” response and keeping stakeholders open-minded:
Document assumptions and expectations early: Before research begins, actively capture what stakeholders think they know. For example, hold a kickoff session to list hypotheses: “We believe users find the onboarding confusing because of X.” This creates a record of initial assumptions, and writing down predictions before an outcome is known helps reduce hindsight bias; it’s harder to claim you “knew it” when a prior written note shows otherwise. Reframe assumptions as hypotheses at the start of a project, explicitly acknowledging what the team doesn’t know. By externalising beliefs, you make it clear that the research is testing those beliefs. Later, when someone says “we already knew that,” you can point to the documented assumptions to discuss what was right, what was wrong, and what was incomplete. A visible list of assumptions is a reality check: it reminds everyone that some of our “knowledge” was just guesswork.
Poll stakeholder predictions before sharing results: A powerful variation of documenting assumptions is to have stakeholders guess the research outcomes just before you reveal findings. For instance, before a readout or synthesis workshop, ask stakeholders to individually predict answers to key research questions (perhaps via a quick survey or sticky notes). This tactic serves two purposes: it engages stakeholders and it provides a comparison point that can reduce hindsight bias. When the actual findings are presented, stakeholders see where their predictions matched or (more often) missed. Psychologically, this confronts the tendency to misremember our foresight. If a team member predicted 5 out of 10 users would complete a task easily, but the research shows all 10 struggled, it’s harder for them to later insist they knew it was a problem all along. Research suggests that considering alternate outcomes and getting rapid feedback on our predictions helps deflate hindsight bias (Roese & Vohs, 2012). In a UX context, making prediction a collaborative game turns the reveal into a learning moment rather than a verdict on who was right.
Involve stakeholders directly in user discovery: It’s much harder to claim “we knew it already” when you’ve sat in on the user interviews and watched customers struggle first-hand. Wherever possible, invite stakeholders to join research sessions or analysis. Even observing just a few user interviews or usability tests can be eye-opening for team members. If that’s not possible, try sharing findings and excerpts with them regularly to keep them in the loop. This practice helps build empathy and buy-in. When stakeholders are present during discovery, they experience the surprises alongside the researcher, which makes the findings more tangible and harder to dismiss. In fact, a stakeholder who observes users might become your ally. Involving the team also preemptively surfaces “we knew that” sentiments during research, when you can probe them. If someone says “I expected that outcome” while debriefing a session, you can dig deeper: Why did they expect it? Did they also expect the reasons behind user behaviours? Often, superficial familiarity masks a lack of understanding of the why and how, which the research can then illuminate. By co-discovering insights, stakeholders are less likely to later position themselves as having known it all along – instead, they become co-owners of the new knowledge. I think of it as the “IKEA effect” in action.
Reframe “obvious” findings around persistent problems: A common stakeholder complaint is that user research just tells them things they already know, especially when it comes to known pain points. To turn this cynicism on its head, reframe the insight to focus on why the problem persists and how the research adds new depth. Emphasise any new evidence about why it continues to be an issue. This reframing shifts the conversation from acknowledgement to action. If a stakeholder says “we already knew that,” a productive response is: “Yes, we suspected it, and now we have concrete evidence and understand the nuances of the problem, so we can finally address it properly.” Essentially, you’re validating that their intuition wasn’t wrong while highlighting that knowing about a problem is not the same as solving it. Psychologically, this approach counters hindsight bias by focusing on the gap between knowing and doing; the bias tempts people to overestimate how much was actually done with that prior knowledge.
Review past efforts and failures openly: People often forget or gloss over the lessons of past projects. Combat this by bringing past data and outcomes into the conversation. For example, if similar research was done a year ago, revisit what it found and what happened with those findings. Or if a feature was launched to address a user issue, examine whether it succeeded. By reviewing the track record, you create context that humbles the “we knew it” stance. The psychological principle here is related to “consider the opposite,” a known debiasing technique (Roese & Vohs, 2012). You’re asking the team: if they really knew all these things, would past outcomes have been different? If the answer is no (e.g., users are still unhappy, metrics haven’t improved), then clearly fresh insight was needed. Reviewing past failures also reduces overconfidence and creates space for a continuous learning mindset.
Conclusion
Hindsight bias is a part of human nature – our minds love a tidy story in which we were right all along. Unfortunately, in UX research, unchecked hindsight bias can turn hard-won insights into dismissive “so what?” reactions. It gives a false sense that user needs are already understood, resulting in overconfidence and complacency. The result is bad for users and the business: teams may ignore important findings, misprioritise what to fix, and ultimately deliver poor experiences. As we’ve seen, what feels obvious in hindsight wasn’t actually obvious before, and realising that is key to good design.
The solution we’ve discussed is intentional, humble collaboration around research. By recording assumptions, involving stakeholders, and continuously challenging the narrative that “we knew it,” we can create an environment where findings are met with curiosity rather than dismissal. In the end, combating hindsight bias is not about winning an argument with stakeholders – it’s about building a shared understanding that evolves.