Cognitive Bias - Making Decisions



Sunk Cost Fallacy - You Can’t Stop Once You’ve Started


Imagine you bought a stock at $100. A few months later, it drops to $70. You feel uncomfortable, but instead of reassessing the business, you decide to buy more. You tell yourself you are being disciplined—“I’m doing dollar-cost averaging.” After all, if you lower your average price, it will be easier to break even when the stock recovers.


But let’s slow down and think. Rationally, each new investment decision should be treated independently. The question should be simple: “If I had no position today, would I buy this stock at $70?” In reality, that is not how most of us decide. The original purchase price becomes a psychological anchor. The money already invested feels like something that must be “rescued,” and buying more feels like action, progress, and commitment. Under the name of a rational strategy, we are often just digging deeper—letting sunk costs from the past quietly dictate decisions about the future.
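The arithmetic behind "averaging down" shows why it feels like progress. A minimal sketch, with hypothetical numbers matching the example above:

```python
# Hypothetical "averaging down": 100 shares bought at $100, now trading at $70.
shares = 100
cost_basis = 100.0   # original purchase price per share
price_now = 70.0

# Buy 100 more shares at $70 to "lower the average".
new_shares = 100
avg_cost = (shares * cost_basis + new_shares * price_now) / (shares + new_shares)

# Break-even now requires the stock to rise to $85 (+21%) instead of $100 (+43%).
print(avg_cost)  # 85.0

# But the forward-looking question ignores cost basis entirely: the value of
# the next dollar depends only on today's price and the business outlook,
# not on what was paid before.
```

The lower break-even is real, but it says nothing about whether the stock is worth buying at $70. The past purchase changed the bookkeeping, not the investment case.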


What’s happening here is not just inertia. It’s the sunk cost fallacy—a cognitive trap that makes people keep investing in something because of what has already been spent, even when the data clearly suggests cutting losses and moving on. Economists define sunk costs as past expenditures that cannot be recovered, and rational decisions should ignore them when looking forward. But in practice, sunk costs exert powerful psychological influence: we hate admitting that previous investments were wasted, so we chase them farther and deeper.


A product misses adoption targets, yet the team insists it just needs one more release; an internal platform performs worse than external alternatives, but no one wants to turn it off because too much has already been invested; an overseas expansion never breaks even, but leadership keeps saying, “We’re already here—let’s give it one more year.” An acquisition fails to deliver synergies, so more money is spent to “fix” it; an R&D project has no clear customer, but years of effort make stopping feel impossible; a marketing campaign that once worked keeps getting budget even as ROI declines, and so on. In all of these cases, the pattern is the same: past investment quietly becomes the reason for future commitment. Instead of asking what creates value going forward, organizations ask how to justify what they have already spent—confusing effort with obligation and persistence with good judgment.


How to Counter the Sunk Cost Fallacy


The hardest part about the sunk cost fallacy is that it feels like responsibility. Walking away feels careless. Continuing feels disciplined.


The most effective defense against sunk cost fallacy is prevention. If you start from the wrong place, every step forward makes it harder to step back. That is why investing time and discipline at the beginning of a decision is often more valuable than any correction made later. 


But once a decision has been made, guarding against the sunk cost fallacy requires changing the questions we ask ourselves.


The first and most powerful move is this: treat every decision as a fresh one. Ignore what has already been spent and ask, “If this project, stock, or initiative were presented to me today, with everything I know now, would I start it?” If the answer is no, continuing is not persistence—it is avoidance of admitting a loss. This simple reframing separates the future from the past.


Second, deliberately shift the unit of comparison from cost to opportunity. Instead of asking, “How much have we invested?” ask, “What else could we do with the next dollar, the next headcount, or the next six months?” Sunk costs disappear when resources are framed as scarce future options rather than historical commitments.


Third, introduce pre-defined exit criteria before commitment. Decide in advance what data would tell you to stop. This removes ego and emotion from the decision later. If the criteria are met and you still continue, you are no longer making a data-driven decision—you are making an emotional one.


Fourth, separate decision ownership from past ownership. The person who approves whether to continue should not be the same person who approved the original investment. Fresh eyes are less attached to defending the past and more willing to act on current evidence.


Finally, normalize stopping as a sign of learning, not failure. Organizations that punish reversals train people to hide bad news and double down. Organizations that reward timely exits free leaders to make cleaner decisions. The real failure is not stopping—it is continuing after the data has already spoken.



Groupthink and Authority Bias


Now imagine you are in a meeting room with smart, experienced people. Everyone has read the same deck. Everyone understands the numbers. The discussion is calm, polite, and surprisingly fast. Heads nod. No one strongly disagrees. The decision is made smoothly. It feels like alignment. It feels like efficiency. But sometimes, that smoothness is exactly the warning sign.


This is groupthink—when the desire for harmony and consensus overrides critical thinking. People don’t necessarily agree because the idea is strong; they agree because disagreement feels uncomfortable. Silence is interpreted as support. Speed is mistaken for clarity. The group converges not because the decision is right, but because it is socially easier to converge than to challenge.


Now add authority bias, and the effect becomes much stronger. When a senior leader speaks first, their opinion quietly becomes the reference point. Others adjust their views instinctively. Junior members hesitate. Even peers soften their objections. The room does not need explicit pressure; hierarchy does the work automatically. What started as groupthink becomes reinforced alignment, anchored around authority.


The most effective way to fight groupthink is not to ask people to “be more critical,” but to change the order in which thinking happens. Groupthink emerges when ideas are formed together. To counter it, thinking must happen individually first, collectively later.


1. Separate thinking from discussion

Before any meeting, require participants to write down their views, assumptions, and recommendations independently. These written positions should be shared before discussion begins. This preserves original thinking before social pressure, seniority, or consensus-seeking alters it.


2. Leaders speak last

Authority creates gravity. When leaders speak early, ideas orbit around them. By deliberately speaking last, leaders remove themselves as an anchor and allow real variance to surface. This single change often produces more insight than adding another analyst or dataset.


3. Legitimize disagreement structurally

Disagreement should not rely on personality or courage. Assign a rotating “challenge role” whose job is to question assumptions and surface risks. When dissent is part of the process, it stops being personal.


4. Slow down fast alignment

Speed feels efficient, but premature alignment is often a sign of suppressed thinking. When a group agrees too quickly, pause and ask: “What are we missing?” Silence is not consensus; it may be compliance.


Groupthink is rarely the result of weak teams. It is the byproduct of strong social dynamics left unmanaged.


Overconfidence Bias


Now imagine a leadership team reviewing a forecast. The model is clean. The assumptions are reasonable. The numbers look precise—perhaps too precise. Revenue is projected to grow at 18%, costs are tightly controlled, and risks are summarized in a single slide at the end. The confidence in the room is high. No one asks whether the forecast is right; the discussion quickly shifts to how fast the organization can execute.


This is overconfidence bias—the tendency to overestimate the accuracy of our judgments, predictions, and control over outcomes. It often appears not as arrogance, but as certainty. The more experience people have, the more confident they become in their intuition. Past success reinforces the belief that “we know how this works.” As a result, uncertainty is treated as noise rather than a core feature of reality.


In analytics, overconfidence shows up in subtle ways. Point estimates are favored over ranges. Scenarios are discussed briefly, then ignored. Models are treated as answers rather than tools. When data aligns with leadership intuition, confidence increases further. When it doesn’t, the data is questioned—not the intuition. Ironically, more data can sometimes increase overconfidence, because it creates an illusion of precision.


The danger is not that leaders are wrong; it is that they are insufficiently uncertain. Overconfident decisions underestimate downside risk, compress timelines, and assume smooth execution. When things go well, success is attributed to skill. When they don’t, failure is blamed on external shocks. Learning becomes difficult because confidence absorbs both outcomes.


Overconfidence is not eliminated by reminding people to “be humble.” It is reduced by forcing uncertainty into the decision itself.


1. Replace point estimates with ranges

Single-number forecasts create false precision. Require ranges, confidence intervals, and downside scenarios. Asking “What’s the 10th percentile outcome?” immediately shifts thinking from optimism to realism.
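One way to operationalize this is a simple Monte Carlo pass over the growth assumption: instead of a single 18% figure, sample a distribution and report percentiles. A minimal sketch, with the base revenue and spread as illustrative assumptions:

```python
import random

random.seed(42)
base_revenue = 100.0  # current revenue in $M (hypothetical)

# Growth is uncertain: mean 18%, standard deviation 10 points (assumed spread).
samples = sorted(base_revenue * (1 + random.gauss(0.18, 0.10))
                 for _ in range(10_000))

p10 = samples[int(0.10 * len(samples))]  # pessimistic but plausible outcome
p50 = samples[int(0.50 * len(samples))]  # central case
p90 = samples[int(0.90 * len(samples))]  # optimistic case
print(f"P10 {p10:.1f}  P50 {p50:.1f}  P90 {p90:.1f}")
```

Presenting the P10 next to the P50 changes the conversation: the plan must survive the pessimistic case, not just the headline number.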


2. Separate confidence from accuracy

After decisions are made, track not only outcomes but forecast accuracy. Who was confident? Who was calibrated? Over time, this exposes the gap between certainty and correctness and retrains intuition.
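A lightweight way to track calibration is the Brier score: record each forecaster's stated probability alongside what actually happened, then measure the squared gap. A minimal sketch with made-up records:

```python
# Each record: (stated probability of success, actual outcome 1/0). Data is hypothetical.
forecasts = [
    (0.90, 0),  # very confident, wrong
    (0.80, 1),
    (0.60, 1),
    (0.95, 0),  # very confident, wrong again
    (0.70, 1),
]

# Brier score: mean squared gap between confidence and outcome.
# 0.0 is perfect calibration; always guessing 50/50 scores 0.25.
brier = sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)
print(brier)
```

Here the forecaster scores worse than a coin flip would, despite sounding certain in every meeting; that gap is exactly what this practice exposes.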


3. Ask inversion questions

Borrowing from inversion thinking: “If this decision fails, what will be the most obvious reason?” This reframes risk as something concrete rather than abstract and breaks the illusion of control.


4. Delay commitment, not analysis

Overconfidence often accelerates commitment, not understanding. Build in deliberate “cooling-off” points—moments where no new data is added, but assumptions are re-examined. Many bad decisions are made not because of bad data, but because of rushed conviction.


Overconfidence does not look reckless. It looks efficient, decisive, and experienced. That is precisely why it is dangerous.
