How We Know What Isn’t So by Thomas Gilovich – Executive Summary & Key Messages

The tendency to find order in ambiguous stimuli is built into the cognitive machinery we use to understand the world.  That predisposition to impose order can be so automatic and so unchecked that we often end up believing in the existence of phenomena that just aren’t there.

People mostly believe that fluctuations in the prices of stocks on Wall Street are far more patterned and predictable than they really are!  We tend to believe something is systematic, ordered, and “real” when it is really random, chaotic, and illusory.

People implicitly confuse necessary and sufficient evidence: They seem to be reasoning that if there are a fair number of such positive cases, then the phenomenon must exist or the relationship must be valid.

As Francis Bacon said, “It is the peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than by negatives.”

Unless we recognize these sources of systematic distortions and make sufficient adjustments for them, we will surely end up believing some things that just aren’t so!

Seeing What We Expect to See:

Representativeness leads to the belief that causes resemble their effects: Big effects should have big causes, complex effects should have complex causes, and so on.

Scientists try to protect against erroneous beliefs by a set of formal procedures to guard against the sources of bias and error.  They employ relatively simple statistical tools to guard against the misperception of random sequences.  They are willing to sacrifice some “intelligence” and flexibility for the benefit of objectivity.
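The “simple statistical tools” point can be illustrated with a short simulation (my own sketch, not an example from the book): purely random sequences routinely contain streaks long enough to look like meaningful patterns, and a quick Monte Carlo check shows just how often.

```python
import random

def longest_streak(seq):
    """Length of the longest run of identical consecutive outcomes."""
    best = run = 1
    for prev, cur in zip(seq, seq[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

def streak_probability(n_flips=20, threshold=4, trials=10_000, seed=42):
    """Estimate P(longest streak >= threshold) in n_flips fair coin tosses."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        seq = [rng.random() < 0.5 for _ in range(n_flips)]
        if longest_streak(seq) >= threshold:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    # Most random 20-toss sequences contain a streak of 4 or more,
    # yet observers tend to read such streaks as "hot hands" or trends.
    p = streak_probability()
    print(f"P(streak of 4+ in 20 fair flips) ≈ {p:.2f}")
```

The exact answer (roughly 0.77) can also be computed analytically, but the simulation makes the point: streaks that feel too orderly to be chance are in fact the norm in random data.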

It appears that events that confirm a person’s expectations are indeed better remembered, at least in comparison to those “non-events” that fail to confirm them.

Seeing What We Want to See:

The endowment effect: Ownership creates an inertia that prevents people from completing many seemingly beneficial economic transactions.

Recent public opinion polls indicate that although only 25% of the population believes that the country as a whole will be better off financially in the coming years, 54% nevertheless think that they personally will do better: wishful thinking.

We seek opinions that are likely to support what we want to be true (confirmation bias).  People’s preferences influence not only the kind of information they consider, but also the amount they examine.

For desired conclusions, it is as if we ask ourselves, “Can I believe this?” but for unpalatable conclusions, we ask, “Must I believe this?”

When someone challenges our beliefs, it is as if someone criticized our possessions.

Believing What We are Told:

The need or desire to tell a good story can distort the accuracy of information we receive secondhand.

We should always give less weight to our own impressions and assign more weight to the “base rate” or general background statistics.
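The base-rate point is just Bayes’ rule; a minimal sketch with hypothetical numbers of my choosing (a condition with a 1% base rate, a test that is 90% sensitive with a 9% false-positive rate) shows why ignoring the base rate misleads:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' rule: P(condition | positive test result)."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Hypothetical illustrative numbers, not from the book:
# 1% base rate, 90% sensitivity, 9% false-positive rate.
p = posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.09)
print(f"P(condition | positive) = {p:.2f}")  # → about 0.09
```

Despite the “impressive” test, the posterior probability is only about 9%, because the 1% base rate means false positives from the healthy majority swamp the true positives.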

The Imagined Agreement of Others:

People do not always think that their own beliefs are shared by a majority of other people.  Rather, the false consensus effect refers to a tendency for people’s estimates of the commonness of a given belief to be positively correlated with their own beliefs.

It is partly a motivational phenomenon that stems from our desire to maintain a positive assessment of our own judgement.

People are generally reluctant to openly question another person’s beliefs—to avoid potential conflict with others.

Heinrich Heine said, “God has given us speech in order that we may say pleasant things to our friends.”

Examples of Questionable and Erroneous Beliefs:

Alternative Health Practices: Roughly 50% of all illnesses for which people seek medical help are “self-limited”, i.e., they are cured by the body’s own healing processes, without assistance from medical science.
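The arithmetic behind this point is worth making explicit (a sketch of my own, not from the book): when half of all illnesses resolve on their own, even a completely ineffective remedy appears to “cure” half its patients.

```python
def apparent_cure_rate(spontaneous_rate, treatment_effect=0.0):
    """Fraction of treated patients who recover, counting spontaneous recovery.

    Patients recover either on their own (spontaneous_rate) or,
    among the rest, from any genuine treatment effect.
    """
    return spontaneous_rate + (1 - spontaneous_rate) * treatment_effect

# An entirely ineffective remedy (treatment_effect=0) still "cures"
# half its patients when half of all illnesses are self-limited.
print(apparent_cure_rate(spontaneous_rate=0.50))  # → 0.5
```

This is why uncontrolled testimonials are so persuasive: every self-limited recovery is available to be credited to whatever treatment happened to precede it.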

Consider one of the holistic health movement’s most popular credos: “It is much more important to know what sort of patient has the disease than what sort of disease the patient has.”

Relatively little of the improvement in health and longevity during the last two hundred years is due to drug and surgical treatment of sick individuals.  Most of the gain is attributable to various preventive measures such as improved sewage disposal, water purification, pasteurization of milk, and improved diets; these measures chiefly increased the chances of surviving childhood.

The life expectancy of those who made it to adulthood has not changed much during the last hundred years.

The underlying causes of faulty reasoning and erroneous beliefs will never be eliminated.  People will always prefer black-and-white over shades of grey, so there will always be the temptation to hold overly simplified beliefs and to hold them with excessive confidence.  People will always be tempted by the idea that everything that happens to them is controllable.

To compensate, we need to develop the habit of employing one of several “consider the opposite” strategies.  We can learn to ask ourselves, for example, “Suppose the exact opposite had occurred, would I consider that outcome to be supportive of my belief as well?”  By asking these questions, we become aware that the link between evidence and belief is not so tight as it might appear.

An awareness of how and when to question, and a recognition of what it takes to truly know something are among the most important elements of what constitutes an educated person.

It requires that we think clearly about our experience, question our assumptions, and challenge what we think we know.
