Experts Are Often Wrong!
The problem is not that expert leaders make mistakes – that’s a given (Kahneman, 2011); errors, especially those involving prediction, are inevitable because the world is complex and unpredictable. The problem is that many people in society expect and want leaders to provide what is often not possible: total and absolute clarity on complex issues, and correctness all of the time! This expectation is unreasonable and, in some cases, dangerous (for example, accepting medical or legal advice as absolute without the client doing some due diligence).
Nobel Prize winner Daniel Kahneman (Thinking, Fast and Slow; Noise) describes, for example, variations and errors in judgement and decision-making in our court system. Studies show that similar cases of criminal extortion can receive massively different penalties, ranging from twenty years of imprisonment and a $65,000 fine to a mere three years and no fine at all. Other statistical research shows that judges are more likely to grant parole at the beginning of the day or after a food break than before such breaks. In other words, if judges are hungry or tired, they are much tougher. Judges are also more lenient on their birthdays. When the weather is hot, immigration judges are less likely to grant asylum. As Kahneman describes, such discrepancies in the judgements and decisions of experts are not uncommon across a wide variety of specializations, including doctors, nurses, lawyers, engineers and many other professions.
Tom Nichols notes, “experts get things wrong all the time … and yet, experts regularly expect average people to trust their judgement and to have confidence not only that mistakes will be rare, but that experts will identify those mistakes and learn from them”. Kahneman likewise documents numerous examples of CEOs’ overconfidence – for example, making costly acquisition decisions, many of which are unsuccessful. In fact, research shows that the most confident CEOs are the most likely to make unsuccessful acquisition decisions.
What tends to exacerbate these situations is that the people with the most knowledge often emerge as leaders with immense influence on those around them. As Kahneman continues, psychologists have confirmed that most people (and especially senior leaders) genuinely believe they are superior to most others on desirable traits (including knowledge and expertise), developing an almost narcissistic cast to their thinking as those around them shower them with admiration and enable their hubris. This blind confidence can be dangerous.
The more absolutely confident a leader is, the LESS we should trust them?
This overly optimistic and inflated sense of expertise is referred to, in some contexts, as the “hubris hypothesis” (Aronson, 2008): a person’s absolute optimism and surety is received more positively by followers than that of a leader whose optimism is expressed in a comparative manner (which can provide some balance on a complex topic). While comparisons can offer a more balanced understanding, psychological research shows that audiences tend to dismiss them as wishy-washy and more often believe the absolute optimistic viewpoint – people tend to want absolute certainty. Herein lies the risk: knowledgeable and overly confident leaders make absolute statements about the future, which are then rarely challenged by those around them. The potential is that a culture of blind followership to a leader’s dictates emerges, with followers trailing the Pied Piper into danger. Leaders and experts must be challenged, and issues must be debated, if the best and most current information is to be surfaced and well-informed decisions made. But we need a process and guidelines for how to go about doing this.