A Concise Introduction to Heuristics and Biases
Michael Anissimov :: June 2004

Already familiar with the basics of heuristics and biases?
If so, please proceed directly to Cognitive biases potentially affecting judgment of global risks.
Further information can also be found at Overcoming Bias.

On with the introduction...

When we humans use abstract reasoning to determine the solution to a given problem, say, opening a door that is stuck for some unknown reason, we are not reasoning using the real door. Our brains don't have enough memory to fit the precise physical description of a real door. Instead, we use representations, which are often trillions or quadrillions of times simpler than their physical counterparts. But they work, at least well enough to solve simple problems. Once you have a mental representation of a door, you can start reasoning about which classes of object or event have the potential to make one stuck or unstuck, for example.

Human brains use a strict set of compression schemes for abstracting critical features of incoming sensory data. These compression schemes are not perfect, and often make errors - as we can see in studies of optical illusions. Many of these errors are invisible to introspection, as they are swept under the rug by higher levels of cognition.

Relative to most animals, humans receive a massive amount of incoming sensory data - terabytes' worth. Most of it is immediately discarded, ignored, or abstracted away by neurological machinery. The surviving data, an incredibly small percentage, is converted into symbolic format and connected to previous experiences and stored concepts in the complex associative network that is the human brain.

When new sensory data is abstracted, converted into symbolic format, and archived in long-term memory, it is subject to certain biasing effects. Biases also operate when the symbols are invoked and manipulated for cognitive operations.

The results are contradictory beliefs, anchoring effects, and a whole zoo of psychological "optical illusions". "Anchoring effects", for example, are a class of robust psychological phenomena showing that people adjust insufficiently for the implications of incoming information. We form beliefs around an anchor, and additional incoming data must fight against the inertia of the anchor, even when the anchor is objectively irrelevant to the judgment at hand.

For example, consider the product of the series:

9 x 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 = ?

vs.

1 x 2 x 3 x 4 x 5 x 6 x 7 x 8 x 9 = ?

Experimental studies have indeed confirmed that estimates are strongly biased towards the initial values. In a study by Robyn Dawes that required participants to give their answers within 5 seconds, the average estimate for the first series was 4,200, and for the second series, only 500. The real answer is 362,880. Everyone radically underestimated the true answer. Why?
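
For readers who want to check the arithmetic rather than take my word for it, a few lines of Python confirm that both orderings yield the same product - the difference lies entirely in our estimates:

    # Multiply the nine factors in each order; the product is the same either way.
    descending = 1
    for n in range(9, 0, -1):   # 9 x 8 x ... x 1
        descending *= n

    ascending = 1
    for n in range(1, 10):      # 1 x 2 x ... x 9
        ascending *= n

    print(descending, ascending)  # 362880 362880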

Decision theorists chalk it up to another psychological effect that relates to anchoring - representativeness. Representativeness describes the phenomenon that people usually expect outputs to resemble their generating process. A string of small, single-digit factors looks like it should yield a modest product, so our estimates stay modest, even though the true product is enormous. In real life, expecting outputs to resemble their generating process is sometimes reasonable. But, as in the case of the multiplication problem, sometimes it isn't, and this inevitably leads to human-universal biases that are very pervasive, robust, and disappointing. (In academic texts, decision theorists often skirt around the issue of interpreting these biases as "disappointing" - leaving the interpretation to the reader. But I suggest that pointing out the obvious is worth doing.)

Representativeness and anchoring are sometimes known as heuristics - "rules of thumb" that humans use to perform abstract reasoning in cognitively economical ways. They are innate and human-universal because they emerge from the same species-wide design that is responsible for the fact that we all have two eyes, two ears, ten toes, and so on. Run of the mill, typical Homo sapiens sapiens design template. Standard issue.

Heuristics save time and effort, but often fail utterly when presented with data outside of their "domain of expertise". These failures tend to be difficult to notice, because 1) the thinking processes responsible for judging the overall quality of our thinking processes are plagued by these biases as well, 2) they are so widespread and natural that few people notice them, and 3) decisions based on heuristics feel good - they're intuitively satisfying regardless of their correctness.

Decision theorists have isolated so many of these robust biases that, from the 70s onward, their study has developed into an independent, respected field with many devoted followers in academia. This field, founded by professors Daniel Kahneman and Amos Tversky, is known as "heuristics and biases". It has become very trendy in recent years, especially in economics, where its application translates into profits of millions or billions of dollars. Daniel Kahneman won the Nobel Prize in Economics in 2002; Amos Tversky died in 1996 and did not live to share it.

Dozens of heuristics and biases have been studied extensively and experimentally verified, and hundreds or thousands have been hypothesized. The implications of such studies are extreme and wide-ranging. Traditional assumptions in psychology, ethics, rationality, epistemology, and philosophy are utterly uprooted by these shocking phenomena. To give the reader some idea of what I mean, I'll list some of the more prominent and well-studied biases and provide summaries and commentary:

Availability. Availability is the ease with which a given instance or scenario comes to mind. When asked what we think of a given social group, we may give an opinion on the basis of what information is most available to us about that group - say, personal interactions with one group member. Vivid scenarios such as terrorist attacks are more available than objectively more dangerous problems such as colon cancer, and our probability estimates are skewed accordingly.

The fundamental attribution error. One of the most pervasive and powerful of all heuristics. The fundamental attribution error refers to our tendency to explain events in terms of people and their dispositions, while underweighting situational or environmental factors. We are obsessively focused on each other, and often appeal to environmental factors only when we are trying to avoid being blamed for something. When we are successful in some endeavor, we strongly tend to attribute that success to ourselves rather than to inherent aspects of the situation.

Base rate neglect. Another glaring, pervasive human-universal pathology. Many people behave as if the likelihood they will die in a plane accident exceeds the likelihood that they will die in a car accident, or that the likelihood they will be killed in a terrorist attack exceeds the likelihood that they will die of colon cancer. The "base rate" is the statistical background information about the frequency of given events. In intuitive human reasoning, it is often underweighted or ignored entirely.
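
To see how much the base rate matters, here's a worked example in Python. The numbers are purely illustrative - a rare condition and a reasonably accurate test - but the structure is just Bayes' theorem: when the base rate is low, most positive results are false positives.

    # Hypothetical screening test; the figures are illustrative, not from any real study.
    base_rate = 0.01        # 1% of people actually have the condition
    sensitivity = 0.80      # P(test positive | condition)
    false_positive = 0.096  # P(test positive | no condition)

    # Total probability of testing positive, then Bayes' theorem.
    p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
    p_condition_given_positive = sensitivity * base_rate / p_positive

    print(round(p_condition_given_positive, 3))  # ~0.078 - far lower than intuition suggests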

Contagion. This heuristic asserts, "once in contact, always in contact". Contagion is the reason why we won't drink a beverage that has been touched by a used flyswatter, even one that has since been sterilized. Contagion is where the idea of "cooties" comes from. As "cooties" demonstrates, contagion can attach to social or moral qualities as well as actual contaminating substances. Contagion is also responsible for our pre-microbial theories of disease transmission.

The planning fallacy. Humans consistently underestimate the amount of time and effort it will take them to accomplish a given task. Government, commercial, and personal projects are routinely completed late and over budget. This bias may be more obvious than many others, because its effects are so visible and striking. To counteract it, we should use conservative estimates when planning projects - ideally estimates anchored to how long similar projects have actually taken in the past, rather than to how long we feel this one should take.

Risk-aversion. One of the foundations of heuristics and biases that challenged the economic "rational actor" model of intuitive human reasoning. The specific finding, usually called loss aversion, is that losses loom larger than gains. A business or individual may take great pains to avoid small losses, while underfocusing on strategies to maximize long-term gains in spite of those losses.
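
To make "losses loom larger than gains" concrete, here is a minimal sketch of the kind of value function Kahneman and Tversky's prospect theory uses. The curvature and loss-aversion numbers below (about 0.88 and 2.25) are the estimates they published in their later work on the theory; treat the snippet as an illustration, not a full implementation.

    # Prospect-theory-style value function: the subjective impact of a loss is
    # roughly twice that of an equal-sized gain.
    ALPHA = 0.88   # diminishing sensitivity for gains
    BETA = 0.88    # diminishing sensitivity for losses
    LAMBDA = 2.25  # loss-aversion coefficient

    def subjective_value(x):
        if x >= 0:
            return x ** ALPHA
        return -LAMBDA * ((-x) ** BETA)

    print(round(subjective_value(100), 1))   # ~57.5
    print(round(subjective_value(-100), 1))  # ~-129.5: the same-sized loss hurts much more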

Homogeneity bias. The "law of large numbers" in statistics states that a sufficiently large random sample lets us make confident inferences about the population it was drawn from. If a properly drawn poll shows 90 out of 100 respondents planning to vote for Bush, we can expect a large majority of the population they were sampled from to indeed vote for Bush. The problem is that people often apply the "law of large numbers" to small numbers, jumping to statistical conclusions without adequate sample sizes. We assume that populations are more homogeneous than they really are, and that large populations faithfully reflect the small samples taken from them.
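
Here's a short simulation of the point. Assume, purely for illustration, a population in which 60 percent of people support some candidate: tiny polls swing all over the place, while large polls reliably land near the true figure.

    import random

    random.seed(0)
    TRUE_SUPPORT = 0.60  # assumed population-level support (illustrative)

    def poll(sample_size):
        """Fraction of a simulated random sample reporting support."""
        hits = sum(1 for _ in range(sample_size) if random.random() < TRUE_SUPPORT)
        return hits / sample_size

    for n in (10, 100, 10000):
        estimates = [round(poll(n), 2) for _ in range(5)]
        print(n, estimates)
    # Typical output: the n=10 polls scatter widely (e.g. from roughly 0.4 to 0.9),
    # while the n=10000 polls all cluster tightly around 0.60.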

Here's a list of some of the heuristics and biases I've come across. I'm sure you've encountered many of them in daily life. I really wish I could go over each one in more detail, but that would take a lot of writing. There are anywhere from dozens to thousands of published papers per heuristic or bias. Almost all have been experimentally confirmed beyond any shadow of a doubt, but there is more to discover about the precise structure of each bias. Future theories may reduce currently known biases to multiple causes or sub-biases.

 

above-average effect: the widespread tendency to categorize oneself as "above average".
accountability bias: the tendency to form thoughts based on considerations of accountability to others.
affect heuristic: hastily judging objects or people by an immediate feeling of "goodness" or "badness".
anchoring/adjustment: failure to adjust sufficiently from initial anchors, even when the anchors are arbitrary.
anthropomorphism: the tendency to ascribe human motives or characteristics to nonhuman objects.
availability heuristic: salient memories override normative reasoning; the most fundamental heuristic of all?
base rate neglect: neglect of background frequencies in favor of salient anecdotal evidence.
biased evaluation: double standards in the evaluation of evidence, attribution of hostile motives to critics.
Barnum effect: the tendency of people to accept general descriptions as uniquely relevant to them.
causal schema bias: the pervasive tendency to categorize salient events based on causal relations.
certainty illusion: an overweighted desire for 100% confidence or certainty.
contagion/similarity: "once in contact, always in contact", "stigma", "karma", other magical thinking.
confirmation bias: the tendency to seek out opinions and facts that support our own beliefs and hypotheses.
conjunction effect: systematic overestimation of conjunctive probabilities.
durability bias: overestimation of how long our emotional reactions to events will last (affective forecasting).
emotional amplification: expecting strong emotional reactions when a salient event's causes were abnormal or mutable.
egocentric attribution: attributing successes to oneself and failures to others (consciously or subconsciously).
false consensus effect: the inclination to assume that your beliefs are more widely held than they actually are.
fundamental computational bias: the tendency toward automatic contextualization (personalization) of problems.
framing effects: disparities in estimates when an identical problem is presented in different ways.
frequency bias: weakness with percentages, strength with frequencies.
gambler's fallacy: pervasive false beliefs about the nature of random sequences.
groupthink: the pressure to irrationally agree with others in strong team-based cultures.
homogeneity bias: exaggerated conclusions about large populations based on small samples.
honoring sunk costs: "throwing good money after bad", pouring resources into failing projects.
isolation effect: disregard of components that choice alternatives share, overfocus on differences.
planning fallacy: consistent overoptimism regarding completion times for a given project.
reflection effect: risk aversion with respect to potential gains, risk seeking with respect to losses.
representativeness: "like goes with like", the tendency to blindly classify objects based on surface similarity.
selective recall: the mostly accidental habit of remembering only facts that reinforce our assumptions.
susceptibility bias: optimism in assessments of personal safety and the effectiveness of precautions.

Debiasing


Mental Contamination and Debiasing (from Wilson & Brekke, 1994)

Accurate debiasing purifies reasoning by isolating likely sources of bias and counteracting them. Awareness of a given bias does not entail its elimination from your thinking; the bias must be studied deeply and the countermeasures applied deliberately. Debiasing involves passing a decision process through a series of successive filters - filters serving to remove the mental contamination associated with individual biases. Filters are mental software designed by you and applied to your own decisions or to the decisions of others. Like computer software, some filter programs work better than others. Certain filters can eliminate contamination from multiple biases, whereas others may fail or even exacerbate certain biases. Any strategy for designing effective filters will rest upon a firm foundation in the literature relevant to specific biases.
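
To push the software analogy one step further, here is a toy sketch of the "pipeline of filters" structure described above. The filter names and adjustment factors are invented purely for illustration - a real filter would be grounded in the experimental literature on the specific bias - but the shape is the point: a raw judgment goes in, each filter corrects for one suspected contaminant, and a less contaminated judgment comes out.

    # Toy "filter pipeline": each function corrects a raw estimate for one suspected bias.
    # The specific filters and adjustment factors here are invented for illustration.

    def planning_fallacy_filter(days):
        # Pad the schedule, reflecting how often similar projects have overrun.
        return days * 1.5

    def anchoring_filter(days, independent_days):
        # Average in an estimate produced without exposure to the original anchor.
        return (days + independent_days) / 2

    raw_estimate = 10.0  # intuitive "inside view" guess, in days
    debiased = planning_fallacy_filter(raw_estimate)
    debiased = anchoring_filter(debiased, independent_days=20.0)
    print(debiased)  # 17.5 - the filters push the naive guess upward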

Further reading:

Heuristics and Biases, printable study cards

Evolutionary Foundations of Heuristics and Biases

Judgemental Heuristics and Biases

An Intuitive Explanation of Bayesian Reasoning

Heuristics and Biases in Probability Judgement (notes)

References:

Dunning, D., Meyerowitz, J., Holzberg, A. (2003). "Ambiguity and Self-Evaluation: The Role of Idiosyncratic Trait Definitions in Self-Serving Assessments of Ability." In Heuristics and Biases.

Gilbert, D., Pinel, E., Wilson, T., Blumberg, S., Wheatley, T. (2003). "Durability Bias in Affective Forecasting." In Heuristics and Biases.

Kahneman, D., Tversky, A. (2000). "Prospect Theory." In Choices, Values, and Frames.

Miller, D., Taylor, B. (2003). "Counterfactual Thought, Regret, and Superstition: How to Avoid Kicking Yourself." In Heuristics and Biases.

Stanovich, K., West, R. (2003). "Individual Differences in Reasoning: Implications for the Rationality Debate?" In Heuristics and Biases.

Weinstein, N., Klein, W. (2003). "Resistance of Personal Risk Perceptions to Debiasing Interventions." In Heuristics and Biases.

Yates, Lee, Sieck, Choi, Price. (2003). "Probability Judgment Across Cultures." In Heuristics and Biases.