Cognitive biases: challenging the way we think
"A great many people think they are thinking when they are merely rearranging their prejudices." — William James
"Common sense is nothing more than a deposit of prejudices laid down by the mind before you reach eighteen." — Albert Einstein
We may think of ourselves as rational, open-minded thinkers making ethical decisions in an objective, unbiased fashion. In reality, this is almost impossible. No matter how intelligent we believe we are, we are all susceptible to a host of cognitive biases. Our decisions are often shaped by emotional attachments, misleading memories or personal self-interest. These patterns of behaviour fall under the umbrella term "cognitive bias".
Cognitive biases are errors in our thinking which influence our decision-making. They are patterns of behaviour which draw us towards particular conclusions. We would like to believe we are making rational, well-informed decisions, but in reality our brains form conclusions from information gathered and stored in the past. Rather than acting on the facts at hand, we subconsciously base our decisions on a number of factors: previous decisions about similar subject matter; information we have selected because it suits our preconceived ideas; emotional attachments; and/or self-interest.
Pattern recognition and emotional tagging are two processes that contribute to cognitive bias. Both stem from the brain's tendency to fall back on stored information rather than evaluating each decision as a fresh, individual task. Heuristics capture this idea: they are mental shortcuts that simplify our decision-making. While they save us time when drawing conclusions, heuristics give rise to cognitive biases, which may lead to false assumptions being applied to new situations.
Psychologist Gary Klein neatly summarises the cognitive bias process: "our brains leap to conclusions and are reluctant to consider alternatives; we are particularly bad at revisiting our initial assessment of a situation". While cognitive biases can be useful survival tools, ensuring we make safe and sensible decisions quickly, they can also distract from logic and lead to bad decisions based on poorly informed judgements.
Psychologists Amos Tversky and Daniel Kahneman introduced the term "cognitive bias" to describe a person's flawed patterns of response to decision-making and judgement problems. Through a series of studies focused on behavioural decision-making, they found that people make decisions based on heuristics and common-sense principles rather than on rationality or logic.
Tversky and Kahneman's research began with the pair evaluating their own intuitions and looking for biases in their own judgements. From this, they established a series of investigations seeking to discover why and how people make decisions. These experiments led to the two-system model of thinking. System one (thinking fast) is the intuitive, faster thought process sometimes referred to as the 'gut reaction' way of making decisions. In comparison, system two (thinking slow) is the more idealised way of decision-making, involving critical and analytical thinking.
While most of us would say we are system two thinkers, Tversky and Kahneman's findings demonstrate that we spend most of our lives making quick decisions: system one is automatic, and system two is engaged only when we are faced with an unexpected choice.
System two requires effort and methodical thinking, so system one will often take over when decisions become too difficult. System one simplifies matters by ignoring tools such as statistics and facts, and "jumps to conclusions". It is important to note that the two systems are a metaphor for how the mind works rather than two physically separate areas of the brain.
The impact of Tversky and Kahneman's research has stretched far beyond the confines of psychology to have a lasting and profound influence on areas such as economics, finance, law and negotiation. The reach of the pair's work was cemented by Kahneman's 2011 book Thinking, Fast and Slow, which was named one of the New York Times' Ten Best Books of 2011; Kahneman himself was later awarded the Presidential Medal of Freedom.
Kahneman and Tversky's research is important when investigating ethical decision-making because it challenges us to evaluate the steps we take when reaching conclusions. The pair's findings are significant because they deconstruct often overlooked ideas, demonstrating that although we think we are making careful decisions, our brains are frequently post-rationalising decisions that have already been made. Identifying the flaws in our thinking enables us to improve our ethical decision-making.
It is through this recognition that we can reduce the influence of personal factors on our decision-making and draw on a wider range of information and external contributors. We rarely question our own ability to make decisions, so it is important to open this area up for discussion, develop a better understanding of our subconscious processes and, consequently, improve our judgement. This matters not only in the law and our professional lives but also in our personal and everyday existence.
Many different types of cognitive biases have been established through various studies and investigations. Confirmation bias, the anchoring bias, the bandwagon effect, the overconfidence effect, and the optimism bias are five key biases which can be used to explain why good people make bad decisions. The next article in this series will discuss these specific cognitive biases in more detail.
Paul Sills email@example.com is an Auckland barrister and mediator, specialising in commercial and civil litigation. He is an AMINZ Mediation Panel member. This is the first article in a series about Cognitive Biases. Read part two, part three and part four.
Last updated on 2 August 2019