Why expertise can lead to bad decision-making

Managers at the top of their field have worked for years to earn the right to call themselves experts, but their knowledge could be holding them back

In March 2007, one of the most distinguished economists in modern US history, then-Federal Reserve Chair Ben Bernanke, appeared in front of the US Joint Economic Committee to discuss the dangers posed by subprime mortgages. Although he acknowledged that turmoil in the market had created problems for families, he concluded that this would not affect the wider economy. “Mortgages to prime borrowers and fixed-rate mortgages to all classes of borrowers continue to perform well, with low rates of delinquency,” he said.

What happened next made him eat his words. Over the coming months, borrowers defaulted in droves, lenders failed and the stock market collapsed, sending shockwaves through the US financial system and plunging the global economy into recession. Bernanke was far from the only person to misjudge the situation: up until the eve of the financial crisis, credit rating agencies consistently awarded bundles of subprime mortgages their highest safety ratings.

Expertise is highly valued in our society, and yet experts so often seem to get it wrong. One could be forgiven for thinking that expertise might not count for much after all. Consultant and author Sydney Finkelstein is an expert in leadership and strategy; whether that expertise qualifies him to speak on the subject is something he has researched himself. By studying managers at the top of their field, Finkelstein concluded that many fall into what he calls the ‘expertise trap’. These specialists are so accomplished in their own area, he argues, that they develop blind spots that stop them from reading new situations clearly or coming up with innovative solutions. Unless we steer clear of these blind spots, expertise could end up becoming a hindrance rather than an asset.

Practice makes perfect
What it takes to become an expert is a matter of some debate. Malcolm Gladwell’s bestselling book Outliers popularised the ‘10,000-hour rule’, which suggests that a person has to put 10,000 hours of practice into something to become an expert in it. According to this logic, the average person would need to work full-time in an industry for at least five years before they could be considered an expert in their field.
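
A rough back-of-the-envelope sketch of the maths behind that five-year figure is below; the 40-hour week and 50 working weeks a year are illustrative assumptions rather than figures taken from Gladwell’s book.

```python
# Back-of-the-envelope check of the '10,000-hour rule' timeline.
# Assumptions (illustrative only): a 40-hour week, 50 working weeks a year,
# and every working hour counting as deliberate practice.
HOURS_TO_EXPERTISE = 10_000
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 50

hours_per_year = HOURS_PER_WEEK * WEEKS_PER_YEAR    # 2,000 hours a year
years_needed = HOURS_TO_EXPERTISE / hours_per_year  # 10,000 / 2,000

print(f"Roughly {years_needed:.0f} years of full-time work")  # Roughly 5 years
```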

But, as cognitive scientist Alex Burgoyne points out, experience isn’t everything. “In one of our own studies, we showed this when we taught 161 undergraduates how to play a simple tune on the piano,” he told European CEO. “Some were quick learners, while others never really seemed to figure it out. In that study, individual differences in cognitive ability were the best predictor of piano skill acquisition.”

Simply sinking time into a skill is no guarantee of mastering it. This can be one of the first pitfalls of expertise in the working world – just because someone is highly experienced doesn’t mean they’re highly knowledgeable. However, even if someone does have the magic combination of experience and talent, they can still develop blind spots.

Stuck in the mud
The more experts, the better, one would assume. In theory, having an expert-heavy board should bode well for an organisation’s performance, but the opposite may actually be true: a 2016 study found that boards packed with domain experts were more likely to harm a company’s chances of success.

One of the study’s authors, John Almandoz, Associate Professor of Managing People in Organisations at IESE Business School, explained that this finding could be attributed to a phenomenon called ‘cognitive entrenchment’. When a person is highly experienced, they can be slow to recognise situational changes or predict unprecedented events. “Cognitive entrenchment may be functional in stable environments, which may permit experts to make faster and more accurate decisions, but when the patterns of the past are no longer applicable, they may run into trouble,” he told European CEO. “The financial crisis of 2008 showed, for example, that consistently rising real estate prices were not a permanent feature of the US economy. Such assumptions could be entrenched in expert decision-making algorithms, leading to unexpected and tragic losses.”

Experts can be guilty of presuming that history will repeat itself, and of applying an old solution to a new problem where it no longer fits. In high-stakes situations, the consequences can be severe. A 2015 study published in the journal JAMA Internal Medicine found that cardiac patients were less likely to die if they were admitted during a national cardiology conference, when thousands of specialists were unavailable. This may have been because experienced cardiologists were in the habit of performing intensive interventions that do not always improve outcomes.

“If I were to speculate, I would say that people develop routines to handle everyday problems and these routines allow them to operate on autopilot, without careful thought,” said Burgoyne. “While operating on autopilot might work most of the time in familiar situations, there are always exceptional circumstances where more careful, reflective thought is warranted.”

A confidence game
Experts can also become overconfident in their skill set, which creates problems of its own. While confidence is mostly a good thing in corporate contexts, it can also lead to undue risk-taking. A study published in the Journal of Financial and Quantitative Analysis reviewed leadership at 1,500 global companies and concluded that overconfident CEOs were more likely to make reckless decisions that could get them sued.

One person’s overconfidence can also drown out the bright ideas of those around them. Research shows that, in decision-making contexts, teams defer to the expert in the room 62 percent of the time; if no expert is present, they defer to the most extraverted person instead. So when the most influential voice in the room is wrong, it can lead the whole group astray.

When this dynamic becomes embedded in company culture, it can stifle innovation. “The story of Edwin Land, the founder of Polaroid, is a great example of overconfidence,” said Almandoz. “He was indeed a highly successful expert in many domains of innovation but his aura of expertise prevented him and his organisation [from seeing] how the competitive landscape was moving into digital technology. We know how that story ended.”

Since the corporate world rewards experience, and since we are inclined to invest time in what we enjoy, the blind spots of expertise seem difficult to avoid. Perhaps we are all doomed to grow more blinkered and overconfident in our decision-making the more experienced we become.

However, the way we acquire and use our expertise can help us mitigate this problem. The Greek poet Archilochus is credited with the saying: “The fox knows many things, but the hedgehog knows one big thing.” Philip Tetlock put this to the test in a 20-year study in which he pitted specialists (dubbed hedgehogs) against bright people with a broad array of interests (foxes) to see which group was better at making long-term predictions. Ultimately, the foxes won. The key reason was that foxes were more willing to revise their beliefs when presented with new evidence, while contradictory evidence only made hedgehogs more set in theirs.

Being open to new ideas seems to be the best antidote to the adverse side effects of knowing too much. Ironically, experts might perform better if they occasionally moved beyond their specialism and thought outside the box.