However, we can learn and reduce our biases.
When we make a decision based on intuition, we are prone to bias. Even when we are given every warning about a potential trap, all the principles of the "right" logic, or the plain facts, we still let biases slip through. This is true both for the general population and for "experts" in many fields. Since there is no such thing as "perfect" information on which to base our decisions (besides, how would one even define "perfect information"?), we inevitably have to rely on heuristics, or rules of thumb, at least some of the time. The challenge is: how do we assess our own biases?
To start, we can learn the typical biases that reside in all of us. Daniel Kahneman and Amos Tversky's work on the decision-making process, and more importantly on the irrationality underpinning it, was ground-breaking. Tversky passed away in 1996; Kahneman won the Nobel Prize in economics in 2002. Kahneman's body of work is regarded as the foundation of "behavioral economics," in which psychology is incorporated into our understanding of economic decisions.
In their seminal article, published in Science (1974), "Judgment Under Uncertainty: Heuristics and Biases," the two authors identified three major categories in which our biases run deep: Representativeness, Availability, and Adjustment and Anchoring.
In a nutshell, when we take a person to represent a whole group based on some preconceived notion, i.e., a stereotype, we make the "representativeness" error. For instance, when we encounter a quiet man wearing dark-rimmed glasses and unfashionable attire, speaking with little eye contact, we may assume he is an engineer, an accountant, or a librarian. In one of the many experiments Kahneman and Tversky conducted to test this hypothesis, they used the following statement: "Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail." Is Steve more likely to be a librarian or a farmer?
Most of us probably cannot shake off our initial reaction that Steve is likely to be a librarian, even after we learn that there are "more than 20 male farmers for each male librarian." And farmers probably spend less time interacting with people than the average librarian does.
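A quick back-of-the-envelope calculation shows why that 20-to-1 base rate should override the stereotype. The conditional probabilities below are illustrative assumptions, not figures from the experiment; the only number taken from the text is the base rate itself.

```python
# Base-rate sketch for the "Steve" problem.
# Hypothetical assumption: the meek-and-tidy description fits
# 40% of male librarians but only 10% of male farmers.
p_desc_given_librarian = 0.40
p_desc_given_farmer = 0.10

# Stated base rate: more than 20 male farmers per male librarian.
librarians = 1
farmers = 20

# Expected number of men fitting the description in each group.
fitting_librarians = librarians * p_desc_given_librarian   # 0.4
fitting_farmers = farmers * p_desc_given_farmer            # 2.0

# Probability Steve is a librarian, given that he fits the description.
p_librarian = fitting_librarians / (fitting_librarians + fitting_farmers)
print(round(p_librarian, 3))  # 0.167: farmer is still 5x more likely
```

Even granting the stereotype a fourfold edge, the sheer number of farmers means "farmer" remains the better bet; our intuition simply ignores the base rate.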
With the "availability" bias, we use what we remember most readily (the images or memories that come to mind first and fastest) as the yardstick by which we make a judgment. By his own admission, Kahneman for quite some time believed that politicians were more likely than, say, doctors or lawyers, to commit adultery. How many of us feel the same way? The fact is that we know more about politicians' transgressions because they get reported more often, and hence are more available for us to recall. Another example: we are more jarred by images of a house burning down than by merely reading about the incident. Perhaps in this case the visual impact makes us more vigilant about fire in our own homes? So, sometimes, biases serve us well?
With "adjustment and anchoring," we let our initial encounter sway how we estimate a future outcome, or we give a quantitative estimate before we even know how to assess the quantity in question. Here is a classic example:
Give an estimate, in 5 seconds or less, of the product 1 × 2 × 3 × 4 × 5 × 6 × 7 × 8.
Close your eyes.
Now give an estimate, in 5 seconds or less, of the product 8 × 7 × 6 × 5 × 4 × 3 × 2 × 1.
Or, better yet, try it out on two friends, or two groups, and compare the estimates. Invariably, people give higher estimates for the sequence that begins with "8" than for the one that begins with "1." The median estimate for the former is 2,250; for the latter, 512. Huge gap, no? The actual answer is 40,320.
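The arithmetic makes the anchoring effect stark: both sequences are the same product (8 factorial), so any gap between the two groups' medians comes purely from which factors were seen first. A quick check:

```python
import math

# Both prompts ask for the same product, just in a different order.
ascending = 1 * 2 * 3 * 4 * 5 * 6 * 7 * 8
descending = 8 * 7 * 6 * 5 * 4 * 3 * 2 * 1
assert ascending == descending == math.factorial(8)
print(ascending)  # 40320

# Reported median guesses: 2,250 (sequence starting with 8),
# 512 (starting with 1). Both groups anchor on the first few
# factors they see and then undershoot badly:
print(round(40320 / 2250, 1), round(40320 / 512, 1))  # under by ~18x and ~79x
```

Notice that even the "high" anchor group missed the true value by more than an order of magnitude; anchoring shifts the estimate, but the lazy adjustment afterward is insufficient in both directions.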
How about this: Did Gandhi die at the age of 114? If not, how old was he when he died? Compare that with asking the same question another way: Did Gandhi die at age 35? If not, how old was he when he died?
So, you say, "That's all right, it's just numbers and a test. In real life, we know better." Then let's look at purchasing a house. The initial asking price decidedly influences how we perceive the quality of the house: we judge the same house to be of higher quality when the asking price is high than when it is low. Kahneman and Tversky conducted countless experiments testing this "anchoring" effect, and the findings have been reliable and robust. In one grocery-store experiment, when a "limit of 12 per person" sign was posted, shoppers bought twice as many of the advertised product as when there was no limit. Even judges have been shown, repeatedly, to be influenced by the anchoring effect (well, some of us already suspected that judges aren't really, truly, totally, honestly objective).
All this is not to belittle humanity. As I said earlier, Kahneman and Tversky demonstrated again and again that even experts, including statisticians and including themselves, are prone to these judgment biases. I resonate deeply with what Kahneman lays out in his best-selling book, Thinking, Fast and Slow: 1. Pointing out our innate flaws does not mean we should give up. 2. Given that it is much easier for us to detect flaws in others' logic than to catch our own (just human nature), we can gradually sharpen our understanding of the sources of potential biases by observing others. 3. Hopefully, over time, we will move from detecting biases in others to lessening the biases in our own judgment and decision-making.
This entry is wholly insufficient to describe the brilliant work of Kahneman and Tversky. Interested readers can check out Thinking, Fast and Slow, or, for those who prefer a drier and more academic set of articles, Judgment Under Uncertainty: Heuristics and Biases, edited by D. Kahneman, P. Slovic, and A. Tversky.
Till next time,
Staying Sane and Charging Ahead.
Direct Contact: firstname.lastname@example.org