Author: Charlotte Gifford
30 May 2019
In 2006, the US Senate Select Committee on Intelligence published its report on the justifications for the Iraq War. What it uncovered seemed like a master class in self-deception by the US Intelligence Community. When assessing whether or not Iraq possessed weapons of mass destruction, analysts had interpreted ambiguous evidence as being supportive of their prevailing theory. Even when presented with evidence that suggested Iraq had no such weapons, they repeatedly found ways to discount this information. As the Silberman-Robb Commission said, somehow the US Intelligence Community had managed to get it “dead wrong”.
This is a typical example of confirmation bias – our innate tendency to unconsciously place greater trust in evidence that supports our pre-existing beliefs. Such was the finding of Daniel Kahneman, Nobel Prize winner and author of the bestseller Thinking, Fast and Slow, whose work has contributed hugely to our understanding of irrational thinking and how it impacts our decision-making. According to Kahneman, myriad biases and unconscious processes are constantly working to keep our personal world view cohesive and intact. This has wide-reaching repercussions for the decisions we make in business.
Trusting your gut
A key unconscious bias that can prove harmful within organisations is the tendency to trust an individual blindly because of certain attributes they possess. For example, someone might be treated as a key decision-maker simply because they have a lot of experience in their field. An unquestioning belief in someone’s expertise, however, can encourage poor decision-making over time, with those who consider themselves experts often more likely to ignore important advice or factual evidence.
“When we have been successful, it’s easy to feel that we have little left to learn,” Francesca Gino, Professor of Business Administration at Harvard Business School, told European CEO. “When we feel like experts, we tune out negative information that clearly suggests we are wrong or we’ve made a poor decision in the past.”
Gino cites one of her studies as an example of this phenomenon. In 2006, the US Food and Drug Administration (FDA) issued a warning about a commonplace medical technology, drug-eluting stents, after finding that they could lead to serious complications among patients – even proving fatal, in some cases. However, to the surprise of Gino and her colleague, many cardiologists continued to use these stents. In fact, those most likely to continue using them in spite of the FDA’s warning were the more experienced cardiologists.
“The decisions of the experienced cardiologists also influenced their cardiologist co-workers,” Gino said. “The doctors in our sample, in fact, followed the lead of more experienced cardiologists, not realising that their experience was masking what was best for their patients.”
Another example of an unconscious bias that can lead us astray is our inclination towards well-communicated ideas. Lee Newman, Dean of the IE School of Human Sciences and Technology, believes that placing too much value on good presentation, over the substance of what’s being said, can lead organisations to make decisions for the wrong reasons.
“Too many companies have a culture of ‘advocacy-orientated’ dialogue in which it’s all about taking positions and imposing arguments to win,” Newman said. “Poor ideas that are well-articulated often rule the day in these companies.”
The vice of optimism
Cognitive biases are particularly significant when it comes to investing, and one bias that plagues investors is unrealistic optimism. A 1980 study conducted by Neil D Weinstein found that, when asked to predict the likelihood of negative experiences, people consistently rated their chances of getting divorced or developing cancer as lower than the average person’s. It’s not that people don’t think bad things happen, Weinstein concluded, but that they think bad things are ultimately more likely to happen to someone else.
Similarly, investors and acquirers are at risk of latching onto one individual asset they’re highly optimistic about. For example, an investor might become excited at the potential of a company’s technology platform, but overlook more telling signs of the business’s performance, such as slowing growth rates.
This is also an example of a bias called anchoring: the tendency to cling to a piece of information received early on and to underreact to later facts that contradict it. This can mean that acquirers are surprisingly reluctant to shift away from a figure that was proposed early in negotiations, and could be one of the reasons why, according to McKinsey, approximately half of acquiring companies pay more for acquisitions than they’re worth.
“The good news is that even though we get tripped up, investors can learn from their mistakes,” Lucy Ackert, Professor of Finance at Kennesaw State University, told European CEO. “I recommend keeping a journal to record motivations for investment decisions. This may sound simple but when we force ourselves to think, we will. Looking back on choices that did not turn out well will help investors learn. Investors will realise that sometimes good outcomes result from good luck and bad outcomes from active choice.”
Bias in hiring
In a job interview, recruiters have only a short window of time in which to determine whether a candidate could be the right person for the job. As such, it’s almost inevitable that this process is rife with biases. Among the most common are the affinity bias (unconsciously preferring someone who shares qualities with you) and the gender bias (believing men or women will be better at certain jobs). This can lead to a lack of diversity in the workplace.
The difficulty in combatting these biases is that we will inevitably create a narrative to convince ourselves that the decision was unbiased. According to Lisa Bortolotti, a philosopher specialising in the cognitive sciences, we tend to rationalise our decisions only after we’ve made them, not before.
Many businesses are increasingly aware of the need to curb implicit biases at the hiring level. McKinsey recently introduced a digital assessment tool that tests candidates’ problem-solving and decision-making abilities without taking their background into account. Other potential solutions to unconscious bias when hiring include conducting a gender-blind review of applications and writing down initial impressions of each candidate before evaluating potential biases.
Rewiring our thinking
According to the Economist Intelligence Unit, 32 percent of C-suite executives believe it’s possible to entirely remove bias from one’s thinking. In reality, though, bias is so ingrained within us that it’s impossible to eradicate, and combatting it requires constant attention and effort. Anyone in a leadership position has a responsibility to minimise bias not just in themselves, but in their teams as well.
“As a manager you determine the ‘currency’ that is given out to your team and who earns the currency,” Newman said. “Reward people when they demonstrate high-quality thinking, use fact-based arguments and show a willingness to give real consideration to positions and ideas other than their own – whether or not their position or idea ends up being the one moved forward.”
Another solution to unconscious bias is to alter the group decision-making process. For example, managers can benefit from encouraging one employee to play devil’s advocate and interrogate the decisions being made in a group discussion. Similarly, Gary Klein, a cognitive psychologist known for his research in decision-making, recommends the ‘premortem’ evaluation. Instead of debriefing after a failure, Klein suggests that, before a key decision is implemented, the team imagines it has already ended in failure and lists every plausible reason why – insights that can then be used to strengthen the plan in advance.
Irrationality is a part of human nature. Once we embrace this, we can learn to better evaluate our own thinking – and that of our team members – to ensure each decision is grounded in fact and not driven by emotion.