In chapter 12, ‘Confirmations of Belief’, Shermer describes cognitive biases in depth. Below is his alphabetically ordered summary of all the kinds of psychological blind spots that human beings are subject to. It’s enough to humble even the most obstinate Ayn Rand disciple.
Anchoring bias: the tendency to rely too heavily on a past reference or on one piece of information when making decisions.
Attribution bias: the tendency to attribute different causes for our own beliefs and actions than for those of others.
Authority bias: the tendency to value the opinions of an authority, especially in the evaluation of something we know little about.
Availability heuristic: the tendency to assign probabilities of potential outcomes based on examples that are immediately available to us, especially those that are vivid, unusual, or emotionally charged, which are then generalized into conclusions upon which choices are based.
Bandwagon effect: the tendency to hold beliefs that other people in your social group hold because of the social reinforcement provided.
Barnum effect: the tendency to treat vague and general descriptions of personality as highly accurate and specific.
Believability bias: the tendency to evaluate the strength of an argument based on the believability of its conclusion.
Clustering illusion: the tendency to see clusters of patterns that, in fact, can be the result of randomness.
Confabulation bias: the tendency to conflate memories with imagination and to treat other people’s accounts as one’s own.
Confirmation bias: the tendency to seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirming evidence.
Consistency bias: the tendency to recall one’s past beliefs, attitudes, and behaviors as resembling present beliefs, attitudes, and behaviors more than they actually do.
Endowment effect: the tendency to value what we own more than what we do not own.
Expectation bias / experimenter bias: the tendency for observers and especially for scientific experimenters to notice, select, and publish data that agree with their expectations for the outcome of an experiment, and to not notice, discard, or disbelieve data that appear to conflict with those experimental expectations.
False-consensus effect: the tendency for people to overestimate the degree to which others agree with their beliefs or will go along with them in a behavior.
Framing effect: the tendency to draw different conclusions based on how data are presented.
Halo effect: the tendency for people to generalize one positive trait of a person to all the other traits of that person.
Herd bias: the tendency to adopt the beliefs and follow the behaviors of the majority of members in a group in order to avoid conflict.
Hindsight bias: the tendency to reconstruct the past to fit with present knowledge.
Illusion of control: the tendency for people to believe that they can control or at least influence outcomes that most people cannot control or influence.
Illusory correlation: the tendency to assume that a causal connection (correlation) exists between two variables.
Inattentional blindness bias: the tendency to miss something obvious and general while attending to something special and specific.
In-group bias: the tendency for people to value the beliefs and attitudes of those whom they perceive to be fellow members of their group, and to discount the beliefs and attitudes of those whom they perceive to be members of a different group.
Just-world bias: the tendency for people to search for things that the victim of an unfortunate event might have done to deserve it.
Negativity bias: the tendency to pay closer attention and give more weight to negative events, beliefs, and information than to positive.
Normalcy bias: the tendency to discount the possibility of a disaster that has never happened before.
Not-invented-here bias: the tendency to discount the value of a belief or source of information that does not come from within.
Primacy effect: the tendency to notice, remember, and assess initial events as more valuable than subsequent events.
Projection bias: the tendency to assume that others share the same or similar beliefs, attitudes, and values, and to overestimate the probability of others’ behaviors based on our own behaviors.
Recency effect: the tendency to notice, remember, and assess recent events as more valuable than earlier events.
Representative bias: the tendency to judge the probability of an event by the extent to which it represents the essential features of its parent population or generating process.
Rosy retrospection bias: the tendency to remember past events as being more positive than they actually were.
Self-fulfilling prophecy: the tendency to believe in ideas and to behave in ways that conform to expectations for beliefs and actions.
Self-justification bias: the tendency to rationalize decisions after the fact to convince ourselves that what we did was the best thing we could have done.
Status quo bias: the tendency to opt for whatever it is we are used to, that is, the status quo.
Stereotyping or generalization bias: the tendency to assume that a member of a group will have certain characteristics believed to represent the group without having actual information about that particular member.
Sunk-cost bias: the tendency to believe in something because of the cost sunk into that belief.
Trait-ascription bias: the tendency for people to assess their own personality, behavior, and beliefs as more variable and less dogmatic than those of others.
And here’s a meta-bias, which is common to Objectivists:
Bias blind spot: the tendency to recognize the power of cognitive biases in other people but to be blind to their influence upon our own beliefs.
If you’re like me, you’re probably feeling a little depressed by this list of human cognitive failings. Fortunately, Shermer ends the chapter with good news – we have ways to compensate for our innate cognitive flaws, at least in the field of science.
What can we do about [cognitive biases]? In science we have built-in self-correcting machinery. In experiments, strict double-blind controls are required, in which neither the subjects nor the experimenters know the experimental conditions during the data-collection phase. Results are vetted at professional conferences and in peer-reviewed journals. Research must be replicated in other labs unaffiliated with the original researcher. Disconfirming evidence, as well as contradictory interpretations of the data, must be included in the paper. Colleagues are rewarded for being skeptical. Nevertheless, scientists are no less vulnerable to these biases, so such precautions must be vigorously enforced, especially by the scientists themselves, because if you don’t seek contradictory data against your theory or beliefs, someone else will, usually with great glee and in a public forum.
Proponents of ‘complementary/alternative medicine’ (CAM), of New Age mumbo-jumbo like auras and horoscopes, and of pseudoscientific ideas like parapsychology and cold fusion are not to be taken seriously because they emphatically do not adhere to the scientific method. Their cognitive biases go uncorrected, and so the veracity of their beliefs and claims is suspect.