Lists of cognitive biases tend to contain around 200 entries. This is partially a naming problem — many "biases" are variants of each other or different labels for the same underlying mechanism. But it's also a prioritization problem. Of the biases that have been robustly replicated and shown to affect real-world decisions, a small cluster does the most damage, most consistently, in the kinds of decisions most people actually face.

Here are the five that actually matter for working professionals, and what effective training against them looks like.

1. Confirmation bias

The tendency to seek out, interpret, and remember information in ways that confirm your existing beliefs. The most studied and arguably most consequential bias in the literature.

What it looks like in practice: You form an opinion about a candidate, a strategy, or a product — and then unconsciously weight subsequent evidence toward that initial position. You ask questions whose answers will confirm what you already think. You judge arguments against your position to be weaker than they objectively are.

How to train against it: The most effective intervention is structured adversarial consideration — deliberately generating the strongest version of the opposing view before you've committed. The "pre-mortem" technique (imagining the decision has already failed and asking what went wrong) activates the same mechanism. Over time, practicing this on low-stakes decisions builds the habit of steelmanning alternatives as a default cognitive move.

2. Anchoring bias

The tendency to be heavily influenced by the first number or piece of information encountered when making a judgment, even when that anchor is arbitrary or irrelevant.

What it looks like in practice: Salary negotiations, pricing decisions, project timelines — any situation involving numerical estimates is vulnerable. If you hear a high number first, your subsequent estimate will be higher than it would have been otherwise. This persists even when people are explicitly warned about it and told the anchor was random.

How to train against it: Developing the habit of generating your own estimate before receiving external information is the primary countermeasure. This is partially a procedural discipline (decide before you look) and partially a trained cognitive reflex. Calibration practice — making explicit estimates before checking answers — builds the independent-estimate habit directly.

3. Availability heuristic

Estimating the probability of something based on how easily examples come to mind, rather than how often it actually occurs. Vivid, recent, or emotionally salient examples are easier to recall and therefore feel more probable.

What it looks like in practice: Overestimating the probability of dramatic but rare events (plane crashes, cancer from specific causes) after reading about them. Assuming a type of problem is common because a few memorable examples are prominent. Hiring based on a few vivid success stories rather than base rates.

How to train against it: Reference class forecasting — before estimating the probability of something, explicitly asking "what is the base rate for situations like this?" — is the standard corrective. It requires discipline because it feels less informative than the vivid examples. Practicing with rapid-fire questions across domains builds the habit of asking about base rates before generating estimates.
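The arithmetic behind the base-rate corrective is worth making concrete. A minimal sketch, with hypothetical numbers chosen only to illustrate Bayes' rule: a vivid, seemingly diagnostic signal for a rare event still leaves the event unlikely, because false positives from the large "ordinary" reference class dominate.

```python
def posterior(base_rate: float, hit_rate: float, false_alarm_rate: float) -> float:
    """Bayes' rule: probability the event is real given a positive signal."""
    true_pos = base_rate * hit_rate                 # rare event, signal fires
    false_pos = (1 - base_rate) * false_alarm_rate  # common case, signal misfires
    return true_pos / (true_pos + false_pos)

# A 90%-accurate signal for a 1% base-rate event: the intuitive answer
# is "about 90%", but false positives swamp the true positives.
p = posterior(base_rate=0.01, hit_rate=0.90, false_alarm_rate=0.10)
print(f"{p:.2f}")  # → 0.08
```

The base rate enters the calculation multiplicatively, which is exactly what the availability heuristic skips: vividness substitutes for `base_rate`, and the estimate inflates accordingly.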

4. Overconfidence bias

Consistently overestimating the accuracy of your own knowledge and predictions. One of the most replicated findings in the decision-making literature, with substantial evidence across domains from medicine to finance to engineering.

What it looks like in practice: Setting confidence intervals that are too narrow (in calibration studies, people's 90% confidence intervals contain the true answer only about 50% of the time). Making predictions with higher certainty than your track record justifies. Underestimating project timelines (the planning fallacy is a form of overconfidence).

How to train against it: This is the one bias that most directly responds to calibration training — tracked, scored practice at matching stated confidence to actual accuracy rates. Unlike most biases, where the fix is largely procedural, overconfidence has a direct training analog: make predictions with explicit confidence levels, track them, get feedback, repeat until your confidence is anchored to your actual accuracy.
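The track-and-score loop is simple enough to sketch. A minimal version, assuming predictions are logged as (stated confidence, outcome) pairs — the bucket granularity and sample data are illustrative, not from any particular tool:

```python
from collections import defaultdict

def calibration_report(predictions):
    """Group predictions by stated confidence and compare to actual hit rate.

    predictions: iterable of (confidence, was_correct) pairs.
    Returns {stated_confidence: observed_accuracy}; a calibrated
    forecaster's values roughly match their keys.
    """
    buckets = defaultdict(list)
    for confidence, correct in predictions:
        buckets[round(confidence, 1)].append(correct)
    return {
        conf: sum(outcomes) / len(outcomes)
        for conf, outcomes in sorted(buckets.items())
    }

# Hypothetical log: "90% sure" claims that land only half the time
# are the signature of overconfidence.
log = [(0.9, True), (0.9, False), (0.9, True), (0.9, False),
       (0.6, True), (0.6, False), (0.6, True)]
print(calibration_report(log))  # 0.9-confidence claims: 50% accurate
```

The feedback step is the whole point: the gap between key and value is the training signal, and repeated exposure to it is what narrows the gap.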

5. Sunk cost fallacy

Continuing to invest in something (a project, a relationship, a position) because of past investment rather than because of expected future returns. Irretrievable past costs should be irrelevant to forward-looking decisions, but psychologically they feel like they should "count."

What it looks like in practice: Holding onto a failing investment because you've already lost money on it. Staying in a job or project because of years already invested. Escalating commitment to a strategy that isn't working because admitting failure would mean acknowledging the past investment was wasted.

How to train against it: The standard reframe is to ask "if I were starting fresh today, with no prior investment, would I make this decision?" This works but requires consistent practice to become reflexive. The deeper training is developing emotional comfort with genuine loss — recognizing that past investment is genuinely gone and that protecting it by continuing is a compounding error, not a redemption.
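The "starting fresh" reframe has a clean structural expression: the decision compares only future returns against future costs, and the sunk amount appears nowhere in the math. A minimal sketch with hypothetical figures:

```python
def should_continue(expected_future_return: float,
                    remaining_cost: float,
                    sunk_cost: float) -> bool:
    """Forward-looking decision rule: continue only if what's ahead pays off."""
    _ = sunk_cost  # deliberately unused: irretrievable costs don't count
    return expected_future_return > remaining_cost

# $200k already spent, $150k more needed, $100k expected return:
# continuing loses another $50k regardless of the $200k already gone.
print(should_continue(100_000, 150_000, sunk_cost=200_000))  # → False
```

The psychological pull is to fold `sunk_cost` into the comparison so the past investment gets "redeemed"; the rule makes explicit that it is an argument the function receives and correctly ignores.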

The meta-point

What these five biases have in common is that they're all distortions in how you process evidence and form beliefs. They're not about bad values or bad character — they're about systematic mismatch between the information available and the beliefs and decisions you generate from it. That's what makes them trainable: they're patterns, not traits, and patterns can be reshaped by deliberate practice.

Training against all five doesn't require separate interventions. A calibration practice that includes rapid-fire questions across domains, adversarial consideration exercises, and tracked prediction accuracy will address all five as a natural consequence. The skills overlap: the ability to generate an independent estimate (anchoring), recognize vivid examples as potentially unrepresentative (availability), hold prior investments appropriately (sunk cost), and match confidence to accuracy (overconfidence) are all facets of the same metacognitive faculty.