Introduction to Cognitive Bias
If you haven’t yet, read up on what cognitive bias is.
If you have, you know that cognitive biases are systematic patterns of deviation from rationality or norms in judgment.
In other words, cognitive biases drive us to make decisions that don’t make rational sense.
These biases often served us well in the past, when we lived simpler lives focused on survival.
However, in modern life they can be a source of poor decision-making in our relationships, business, and careers.
What follows is a list of the most common cognitive biases. I’ll continue to update and refine this list, so you can save it as a resource to revisit.
If you’re interested in improving your thinking, you should also read my introduction to mental models.
List of Cognitive Biases
Confirmation Bias
We overvalue information that confirms our existing beliefs, and discount information that conflicts with them.
Once we form an opinion, our brain naturally prioritizes information that agrees with that view. This happens everywhere in our lives.
To fight it, avoid forming an opinion too early, and be proactive in trying to disconfirm your views.
Common situations where this occurs are justifying purchases, politics, hiring, and first impressions.
Anchoring Bias
We tend to use the first piece of information as an ‘anchor’ and adjust from there, often insufficiently.
Common situations where this can be dangerous are financial negotiations - big purchases like houses and cars, or negotiating things like salary.
To avoid being subject to anchoring, try to come to a conclusion (on price, salary, etc.) by thinking from first principles, or finding your own comparables.
This can be used to your advantage when positioning products or prices, or in negotiations of your own.
Sunk Cost Fallacy
When we have already incurred a cost - time, money, or otherwise - we tend to believe that it justifies further expenditure.
The more we invest, the larger the sunk costs, and the more likely we are to keep investing.
Beware the sunk cost fallacy whenever you or your team has invested significant time, money, energy or love in something.
Ensure that you evaluate the project without considering what has already been invested.
Dunning-Kruger Effect
The more knowledgeable you are, the less confident you tend to be in your abilities.
Rephrased, those who are less competent will be more confident in their skills.
This could be described as ‘blissful ignorance’ for many. They believe they are competent because they lack a deep enough knowledge to know what they don’t know (the ‘unknown unknowns’).
We need to be careful whenever we are feeling confident in our abilities, especially in an area of little experience.
Examples include the engineer who wants to build his own house or the mathematician who believes he can beat the stock market.
Their knowledge and experience, while related, are limited compared to specialists in that field, and so they are likely to misjudge their competence.
Barnum Effect
The Barnum Effect, also known as the Forer effect, is when we hear vague statements, and fill in details that are specific to ourselves, believing the statement to be highly accurate.
Our minds tend to make connections, especially when we hear statements that could be interpreted as complimentary, even when the statement is vague.
This is common in astrology, fortune-telling, and even personality tests.
Examples of such statements (these were used in research showing the effect) are things like:
- You have a great need for other people to like and admire you.
- You have a tendency to be critical of yourself.
Be cautious of this effect when you hear vague statements or proclamations from a source that may not be trustworthy.
Fundamental Attribution Error
We believe our actions are dependent upon the situation, while the actions of others depend upon their character.
In other words, we believe that what other people do is based upon who they are, not the situation they are in. In reality, this is often not true. Extrapolating how someone acts in one situation to other situations is a probable source of error.
We must be particularly aware of this bias when assigning blame.
Assume ignorance, not malice, and be mindful of taking personal responsibility. Extreme ownership is a good strategy to counter this bias.
Placebo Effect
The placebo effect is observed when there is an improvement due to the belief in a treatment, not the treatment itself.
This is a well-known phenomenon in medicine, where inert pills or injections are given, and a noticeable improvement in symptoms is observed.
This pairs well with an understanding of regression to the mean, where a supposed improvement results not as an effect of some intervention, but rather a natural movement back towards the average (often after a period of below-average performance).
Business consulting is often considered a good example.
A business goes through a tough period, hires consultants, and then implements the suggestions in their report (or doesn’t).
The business performance prior was below average, but the consultant’s work improves morale, and the business rebounds toward the average. The consultant looks good, but would the same have happened otherwise?
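Regression to the mean is easy to see in a quick simulation (a minimal sketch in Python; the population size and score distribution are invented for illustration). We model firms whose results are pure noise around the same average, pick the ones that just had a bad year, change nothing, and watch their average snap back:

```python
import random

random.seed(1)

# Simulate 10,000 firms whose yearly performance is pure noise around
# the same underlying average of 100 - no firm is truly better or worse.
year1 = [random.gauss(100, 15) for _ in range(10_000)]

# The firms that just had a bad year (bottom 10%) are the ones most
# likely to hire consultants.
cutoff = sorted(year1)[1000]
struggling = [i for i, score in enumerate(year1) if score < cutoff]

# Next year, redraw from the same distribution: no intervention at all.
year2 = [random.gauss(100, 15) for _ in struggling]

avg1 = sum(year1[i] for i in struggling) / len(struggling)
avg2 = sum(year2) / len(struggling)

print(f"Struggling firms, year 1: {avg1:.1f}")  # far below 100
print(f"Same firms, year 2:       {avg2:.1f}")  # back near 100
```

No consultant was involved, yet the selected group improves markedly in year two - selecting on a bad year alone guarantees the rebound.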
Beware the placebo effect whenever something has been studied without a ‘blind’ control - a group that doesn’t know whether they’ve received a real or fake treatment.
Halo Effect
The halo effect occurs when we attribute positive or negative qualities to someone or something based on another trait that is unrelated.
An obvious example is when we believe someone to be intelligent or competent because they are attractive (or the reverse).
The halo effect combined with confirmation bias can be very powerful. We attribute a characteristic to someone, based on a different quality, and then we select evidence to support our judgment and ignore conflicting evidence.
The halo effect is prevalent throughout the business world in hiring, advertising, marketing and peer reviews.
But it can also have a positive effect on our day-to-day life, where it helps us maintain relationships, forgive friends and family, and fall and stay in love.
Bystander Effect
The bystander effect occurs when the presence of more people leads to less action.
When there are more people around, everyone believes someone else will do something.
This has been studied in the context of emergencies: on a busy street, a passerby is less likely to stop and offer help than someone who is the only person around.
In business, when there are many stakeholders, and no one individual is responsible, things are less likely to get done. Everyone assumes that someone else will do it.
If you want something to get done, and are unsure if you are the lead or not, make it happen. In an emergency, make sure you get help or confirm someone has.
Framing Effect
Our choices are influenced by how something is presented.
In other words, we make different choices when something is presented positively or negatively, even if the odds are the same.
An example is a surgery that is presented as “80% chance of success” or “20% chance of death”. We will often make different choices depending upon that framing.
This gets particularly complicated when the odds and numbers get slightly more complex. Things like “20% chance of saving X people and an 80% chance of saving Y people”.
You’ll see this effect used widely in advertising and marketing, and it is hard to avoid in life.
When presented with decisions with large consequences, the solution is to try framing the problem in several different ways, and seeing how that changes your opinion. If in doubt, try inverting the statement or problem.
Optimism Bias
We overestimate the likelihood of a positive outcome.
Or, we underestimate the probability of experiencing a negative event.
In terms of potential negative consequences, this bias is huge.
Betting all your money on cryptocurrency? Investing in only one stock? Betting the company's future on a risky marketing strategy?
There are plenty of examples throughout our lives.
The counter to the optimism bias is to make slow, calculated decisions when the consequences are high.
Try to avoid short-cutting in the decision-making process, and consider all risks. Avoid the “I think this is a good idea” quick decision, and force yourself to justify a position. Where possible, try to find analogous situations or historical data to base your decision upon.
Groupthink
The desire to please the group, or to blend in (social pressure), overrides the best decision.
In a group, we want to please others and avoid conflict—part of our social nature. But this leads to poor decisions.
Try to avoid groupthink by implementing processes to encourage independent opinions. Force everyone to write down their opinion and reasoning independently before a meeting. Encourage criticism and challenging of ideas that isn’t taken personally. Ultimately, one person should be responsible for the decision.
Backfire Effect
When our beliefs are challenged, we strengthen our belief in our original position (often in the face of conflicting evidence).
The backfire effect is closely related to confirmation bias. We already filter evidence to confirm our existing beliefs. The backfire effect occurs when our beliefs are directly challenged. We get presented with evidence that says “you are wrong”, and then we dig in to our position.
It’s painful to be wrong about something, and we go to great lengths to avoid that pain. You must try to view your opinions and beliefs as separate from yourself. They should be things to be updated as evidence changes.
Whenever you feel offended or challenged about an opinion or belief, beware; take a moment and consider what is being presented, because you may be falling prey to the backfire effect.
Declinism
We tend to view the past as better than it was, and the future more negatively than it will be.
Our memories are not to be trusted when it comes to evaluating the past, and we are poor predictors of the future.
Instead of relying on memory, try to look at objective facts relevant to the situation when evaluating past events or behavior, and when considering what might happen in the future.
This is a good area to consider the Lindy effect.
Just-World Hypothesis
We believe that moral or good action will bring good outcomes. We imagine the world to be ‘fair’.
There are many cliches and beliefs (“what goes around comes around”, ‘karma’, etc.) that say as much, yet it isn’t true.
The world is full of randomness, and actions do not guarantee outcomes.
“Prepare for the worst” is perhaps a better heuristic.
In-Group Bias
We favor those who are in our group.
As humans, we like to belong, and part of our tendency when we belong to a group is to defend and favor that group (and do the opposite with those outside the group).
Historically, this has held us in good stead, with large groups of people thriving due to the benefits of group cohesion.
But in decision-making we can become blinded by our bias towards our group, and discount others.
Try to imagine what it would be like to be a member of the other group you’re evaluating.
For a business competitor, put yourself in the shoes of their employees and founders.
Empathy is the counter to this bias.
Availability Heuristic
We overvalue information which is most easily available to us, often because it is recent, emotionally powerful, or unusual.
Our minds latch onto these standouts, and as such, we apply them without consideration for the full set of information.
Shocking news is a good example. We overestimate the number of deaths from terrorism because it’s shocking, while discounting other larger causes.
Aiming to gather data in an organized format, and making slow decisions when the consequences are large, will help reduce the effect of available information.
Also note that when you have a strong emotional response or want to make a quick judgment, you should be careful.
Belief Bias
We judge arguments based on whether the conclusion agrees with our own beliefs, not on the strength of the arguments themselves.
This is another major source of error in decision-making.
It is very difficult for us to objectively evaluate arguments before knowing whether the conclusion confirms our existing beliefs or not.
Countering this bias is difficult; often it requires getting people with different perspectives or beliefs involved in the process so that the interpretation of the data, evidence, and arguments is subject to a voice with a different opinion.
Reactance
When someone tries to influence us to do, believe, or change something, we feel the need to do the opposite.
A common example is when someone tells you to do something and your immediate emotional reaction is to push back or argue.
This is the basis for ‘reverse psychology’—where someone attempts to influence you, knowing you will do the opposite.
It’s important to be aware of this effect, both when being asked to do something, and when trying to influence others.
When we are being influenced, as with many biases, we must try to delay our emotional reaction and evaluate the arguments behind the suggestion. Is this something that has merit?
When we are trying to influence someone else, we need to be careful not to arouse such a reaction. Instead, we should seek to draw that person towards the conclusion we want. 48 Laws of Power is a good source of tactics to accomplish this.
Curse of Knowledge
Once we understand something, we assume it to be obvious to everyone.
It’s very difficult to remember the time when we didn’t understand something.
It’s why university professors often have a hard time being good teachers; they are experts in their field, and it becomes difficult to explain basic concepts.
Being able to explain things in simple terms is a valuable skill. Being able to do it without being patronizing is even more valuable.
Those who master this skill are much better at winning people to their side of an argument, or convincing others to do something, like purchase a product or support a cause.
Self-Serving Bias
We attribute our successes to our own competence, while believing our failures are the result of negative external factors.
We also do the reverse with other people; we assume negative outcomes are their fault, while positive outcomes are a result of external factors.
We underestimate the impact of luck in general, when in reality luck and randomness play large roles in our lives.
Be generous in evaluating the actions of others, and humble in evaluating your own contribution to successes.
Negativity Bias
Negative things tend to influence our thinking more than positive or neutral things.
Common examples include emotions, social interactions, and outcomes.
This is closely related to framing and loss aversion. Be careful when making decisions to focus on probabilities, and frame the problem in multiple ways.
Be wary in making decisions when you are experiencing negative events or emotions, and try to avoid relying on memory.
Decision journals, and journaling in general, can help alleviate negative emotions, and also reflect a more accurate accounting of past events.
Pessimism Bias
We overestimate the likelihood that bad things will happen.
This may seem to conflict with the optimism bias, but the reality is they are often paired.
We often overestimate the probability of positive outcomes for others, while overestimating the probability of negative outcomes for ourselves.
We are also poor at estimating outcomes when they’re near extremes.
Whenever you are making a decision with large consequences, try to map out the actual numbers.
As Taleb says, “if there is a possibility of ruin, cost benefit analyses are no longer possible.”
Be aware of the risks and probabilities; we are poor at estimating at both positive and negative extremes.
Spotlight Effect
We overestimate how much other people notice about how we look and act.
We are at the center of our own worlds, and as a result, we don’t have an accurate picture of how much we are noticed by others.
This tendency is the basis for being self-conscious.
In decision-making, it means we overestimate the effect and influence we have on others. This is also important in negotiating.
Use this knowledge to be less self-conscious, and to observe others in greater detail than they would expect. It will help you gain an edge.
Survivorship Bias
We concentrate on those who “survive”—success stories—and fail to consider those who failed, often due to lack of visibility.
This is common when we seek to replicate success in a certain field, or when repeating a certain strategy or tactics.
Failures rarely get media coverage or have their stories told, yet we idolize those who reach success, and seek to understand how they did it.
When the startup founder talks about maxing out all her credit cards before making it big, we don’t see all those who declared bankruptcy.
Beware survivorship bias when basing actions upon success stories.