
The Future of Coaching – helping leaders overcome ignorance, hubris, blind-spots and become more self-aware

In my previous essay on “the crisis of expertise”, I not only commented on leadership hubris and over-confidence, but also noted that many people (especially lay-people in the context of a specific complex topic) are often blatantly ignorant and largely unaware of their ignorance. A phenomenon of our modern digital world is that information – and misinformation – is rapidly accessible, and lay-people are especially susceptible to thinking they know a lot about a particular topic while being ignorant or misinformed about the subject. This misinformed sense of knowledge is sometimes accompanied by a zealous (and sometimes aggressive) defense of that knowledge. As psychologist David Dunning (of the Dunning-Kruger Effect) puts it, “we are all stupid, it’s just that some of us are aware of how much we don’t know, and what makes us stupid” – and are therefore less likely to parade our stupidity.

In David McRaney’s book “You Are Not So Smart”, the author rather humorously (but accurately) notes that all of us are to some degree unaware of why and how we think, feel and behave – our unconscious biases and behavioral drivers or triggers are largely unknown to us:

“There is a growing body of knowledge coming out of psychology and cognitive science that you have no clue why you act the way you do”.

This lack of awareness is particularly concerning, even dangerous, among senior leaders who make important decisions that impact people, organizations and even societies.

Neuroscientist Stuart Firestein (“Ignorance: How It Drives Science”) argues that “we should value what we don’t know just as much as what we know”. However, valuing this “ignorance” (which is not the same as stupidity) requires an appreciation of the depth of knowledge in a field and of the experts who deeply understand it. The problem is that most people are blatantly unaware of how much they don’t know – and leaders and experts in positions of power and influence who are “ignorant of their ignorance” are especially dangerous.

Leaders not only need to be self-aware of these psychological drivers and triggers, they also need to understand how these drivers influence the behaviors of the people they lead. Leadership coaches are in a position to become more informed about these biases, blind spots and behavioral triggers, and to help the leaders we coach become more self-aware and more effective.

Subtle factors that drive emotions, thoughts and behavior that are outside of conscious awareness

Most of us would likely vehemently argue that we would NEVER be so fickle as to respond more favorably to one person over another simply because one of the two was better looking! Few of us would admit to being “primed” by simple words we read or hear (even in the background), which then unknowingly influence how we view and interact with someone moments later. For example, research demonstrates that seeing images of retirement homes and old people actually makes young students walk more slowly and act fatigued compared with control groups who have not viewed these images.

Most of us are likely to think we are quite aware and consistent in our behaviors, while in fact we are heavily influenced by individual biases and by social norms that vary across circumstances – and we are largely unaware of it. For example, some of us behave very differently at work or in various social settings than we do at home or at church. This essay describes how our behavior can be influenced without our being overtly aware of it. That is surprising to many, but it becomes powerful and positive if we become more aware of these influencers and develop techniques to manage our responses to them. Leadership coaches are in a strong position to help clients become more aware of their individual drivers and to provide techniques to overcome negative outcomes.

When I described these psychological drivers and techniques to a client recently, she responded that they sounded “manipulative”. And in a way she was right. But it is important to note that these psychological drivers operate naturally – they occur all the time in all of us, largely without our awareness (what psychologist and Nobel Prize winner Daniel Kahneman describes as “System 1” responses). What we, as coaches, are able to do is utilize these techniques to shift a client’s behavior in a positive manner.

What is also probably obvious, but important to note, is that these techniques must be applied specifically to encourage the behaviors, actions or decisions needed in the context in which they occur. Clearly analyzing and articulating the behavior changes or actions needed from employees in specific situations is important. A simple example of how this process can progress is the following:

  1. A leader is in a particular organizational setting (say a high-level meeting) and notices a new person in the room that reminds her of someone in her past. She feels a moment of annoyance emerge which influences her reactions and focus during the meeting. Walking out of the meeting, she feels frustrated and confused about why the meeting went poorly.
  2. In a subsequent debrief, she notes to her coach that she was not on her game and distracted during the meeting and was mystified why.
  3. The coach explores the experience with her and picks up on a comment about the stranger in the room. Further discussion identifies that the stranger reminded her of a college professor who was harsh, critical and confrontational.
  4. The coach then suggests that the leader was likely “primed” (see later) and recommends applying the technique of “thinking about thinking” and other techniques in future to manage these situations.
  5. The coach then helps the leader practice these techniques to mastery.

Some emotional and behavioral drivers and triggers

Priming
Kahneman, in his groundbreaking book “Thinking, Fast and Slow”, describes the “marvels” of priming: if you have recently read or heard the word EAT or FOOD, you are more likely to complete the word fragment SO_P as SOUP rather than SOAP. The opposite would be the case if you had read or heard the word WASH. EAT primes SOUP and WASH primes SOAP. We do this unconsciously. Amazing but true! Kahneman notes that it is difficult for many of us to accept that many of our behaviors and emotions can be primed by events of which we are entirely unaware. Of course, savvy marketers are very aware of these factors and effectively “prime” our thinking and buying behavior. Amazingly, in other research studies, “priming” groups of students with words like “forgetful”, “old age” and “lonely” made these students walk much more slowly from the interview room than students who were “primed” with more energetic words. Athletes “prime” themselves with energetic and powerful mantras and images.

Change leaders can also utilize this phenomenon by, for example, priming employees as they arrive at work (and frequently during the day) with words or phrases that energize change-oriented behavior. Words like “innovate”, “speed”, “agility” and “collaborate” could act as effective primes. In one technology company I worked with, I noticed that many of the work-area and hallway walls were proudly adorned with examples of the technology breakthroughs the company had achieved through the decades. While these items were truly amazing examples of legacy breakthroughs, I was convinced that these “old” artifacts primed many employees to be complacent and think “old” rather than to be innovative about the breakthroughs of the future. This organization fundamentally missed the huge technology shift from older devices to new, smaller mobile devices and has struggled to catch up.

Framing
People react very differently to the same information presented in different ways. Thaler and Sunstein describe research showing that when a problem or decision is presented in a positive or negative way that implies loss or gain, people will overwhelmingly respond differently, despite the fact that the basic information is exactly the same. In one study, if doctors are told that “ninety of one hundred patients survived” a certain type of surgery, they are much more likely to recommend surgery than if told that “ten of one hundred died”. Our “System 1” brain responds immediately to this kind of loss or gain information without the more logical and thoughtful consideration of our System 2. Framing works because, as Kahneman notes, our System 2 brain tends to be lazy, and most of us tend not to think deeply about what we hear or read – we react to information in the moment (especially when under pressure or stressed).

Elliot Aronson (“The Social Animal”) also describes fascinating research examples of our built-in tendency to respond very differently based simply on how a choice is presented (indeed, it can be quite scary when we begin to be more aware of how easily our behaviors and decisions can be swayed). This human tendency can be used for positive benefit in organizational change initiatives by developing communications that “frame” information about the change in a way our System 1 brains will interpret positively. For example, in many technology change projects, a great deal of the communication is about what IS changing – and the response to those messages often fosters resistance and fear. Leveraging the framing effect, communications could begin by describing what will stay the same. While the technology systems are being implemented, the business processes behind the systems often remain the same or similar – a far less threatening message for employees who find this kind of technology change intimidating.

Availability Heuristic
If people are asked whether there are more murders or more suicides in the United States, they answer almost unequivocally that there are more murders (unless they are experts in this field). This is because we hear or read about murders on the news frequently – our System 1 retrieves this information quickly and assumes that because we hear about homicides a lot, they must be more frequent. In fact, the reverse is true. Leaders can effectively utilize this human tendency to create “rules of thumb” by, for example, communicating about positive change experiences frequently. Large-scale projects often produce bursts of communication when specific phases are underway, rather than communicating on a regular and frequent basis. Frequent and ongoing communication and discussion about projects creates two important heuristics: first, that change can be positive, and second, that change is ongoing rather than occasional and scary.

The Status Quo bias
I, like most people, stick with default settings when I, for example, download a new software program – I most often simply accept the recommended defaults. Software vendors who include a “Recommended” setting are leveraging the “Status Quo bias” that almost all of us have. Most of us are largely unaware of our actions when installing new software. Leaders can also leverage this human tendency toward inertia by providing recommendations when people are faced with changes, or when they need to make change-related choices. For example, most change leaders know all too well that employees tend to resist change when it is forced on them. A technique to overcome this resistance is to provide several options AND to include a recommended selection. For example, a few years ago I was working with a procurement team developing new global processes. Instead of deploying a new required process, we held workshops that allowed employees to bring their own thinking and experience into the process. We provided a few examples of what other regions had successfully implemented and made a recommendation on what we thought was best. Almost universally, the recommended default was accepted without resistance.

Fear of loss versus incentive of gain
Humans tend to hate losses much more than they enjoy gaining the same thing. For example, imagine people are asked to play a game where a coin is flipped – heads they win $X and tails they lose $100. Kahneman describes research showing that $X generally has to be about $200 before people are willing to play. In other words, the fear of losing weighs about twice as much as the possibility of winning. I was working on a project some years ago where incentives were provided to keep consultants on the project until the end (consultants tend to begin looking for their next project many months before an existing project ends, and will tend to leave the existing project for a new one). This project offered bonuses for consultants who remained to the end. The bonuses had little effect, and many consultants left early. A more effective method would have been to hold back a portion of the consultants’ agreed pay until the end, leveraging their fear of loss.
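
To make the arithmetic concrete, here is a minimal sketch in Python. It is purely illustrative: the 2.0 loss-aversion multiplier is simply an assumed round number reflecting the roughly two-to-one ratio described above.

```python
# Illustrative sketch of the loss-aversion arithmetic described above:
# losses are assumed to weigh about twice as heavily as gains, so a 50/50
# coin flip that risks losing $100 only "feels" acceptable when the
# potential win reaches roughly $200.

LOSS_AVERSION = 2.0  # assumed weight applied to losses relative to gains


def felt_value(gain, loss, p_win=0.5):
    """Psychological (not monetary) value of a gamble under simple loss aversion."""
    return p_win * gain - (1 - p_win) * LOSS_AVERSION * loss


for potential_win in (100, 150, 200, 250):
    v = felt_value(gain=potential_win, loss=100)
    verdict = "acceptable" if v >= 0 else "rejected"
    print(f"Win ${potential_win} / lose $100 -> felt value {v:+.0f} ({verdict})")
```

Run as-is, the sketch shows the gamble only breaks even psychologically once the potential win reaches $200 – the same two-to-one ratio Kahneman reports.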

MINDSPACE

MINDSPACE is an acronym for nine psychological (and largely unconscious) mechanisms or “nudges” that can influence our behavior. The nine are: Messenger, Incentives, Norms, Defaults, Salience, Priming, Affect, Commitment and Ego. This framework is used primarily by governments in crafting public policy aimed at influencing the behavior of citizens – for example, smoking cessation, healthcare adoption, exercising more, making more effective retirement decisions and so on. However, I think there is a big opportunity for leadership coaches to apply these techniques to help their clients become more effective.

Here is a summary of each of the nine elements:

Messenger (and the message)

When we are trying to influence employees’ behavior during change and transition, the source of the information – the messenger – is both critical and complex; probably more so than most change leaders realize. For example, studies conducted and described by the social psychologist Elliot Aronson show that – largely unconsciously – we tend to believe and trust information from people we like, irrespective of their level of expertise. Likeability (or the lack of it) is a big influencing lever (as we are seeing play out in the Presidential elections right now). However, when the topic is complex, like healthcare choices, technical issues or retirement finances, people are influenced more by messages delivered by those considered experts. But, paradoxically, we are also less likely to listen to or believe an expert if we don’t like them (again, most of us are not aware of this influence and do not admit to it). And how about this fickle research finding: people are also more influenced by a message from an attractive person even when the message has nothing to do with attractiveness (and this despite people insisting they would never be influenced by something as absurd as how good-looking a person is. But they are!). To make things more complex, research suggests that an emotional appeal from a leader (especially one with an element of fear in it) is more influential than a factual appeal, even when the underlying request is exactly the same, like “sign up for healthcare”. We are more heavily influenced by emotion than by pure fact.

If you are involved in organizational change management, you have undoubtedly been involved in the communications aspects of developing a change plan. Change leaders tend to build out communications spreadsheets with stakeholder audiences, messages, media, timing and the like, but seldom include psychological considerations. Change leaders and coaches could add a great deal of leverage and influence to their communications by considering some of these psychological nudges.

Incentives

Incentives and rewards of various kinds are common in change and transition projects. However, few of the coaches and leaders I have worked with understand some of the psychological dynamics at play with the incentives we use. Research reported in the Journal of Economic Psychology (Dolan et al, 2012) describes a number of behavioral insights with relevance to influencing behavior. Here is one that has particular relevance in organizational settings:

Our brains weight losses more heavily than gains: as I described in my previous blog, people strive to avoid losses more than they strive for gains – a basic human trait rooted in evolutionary loss-aversion. Many companies I have worked with invest large amounts in incentives, rewards and recognition with little or no benefit to their business goals. Indeed, providing material or monetary rewards and incentives is fraught with problems and can often be counter-productive. For example, in one organization I consulted with, I analyzed each of the large business groups and the amount of money invested in employee recognition and rewards, and then compared this with surveyed levels of employee engagement and satisfaction in those groups. The analysis showed that the business function that gave the greatest amount of incentive cash had the lowest levels of employee satisfaction. Change leaders should consider (however contradictory to our current and historical thinking it may be) framing “incentives” as charges that will be imposed if the change is not successful. For example, employees could be paid bonuses into a personal account as progress is made, with the understanding that these monies will be withdrawn if the overall goal is not achieved – people hate to lose something they have already received (and I know how difficult this thinking is for most of us!).

Norms

Cultural norms are the behavioral expectations or rules in a society (or a company). Usually these are implicit, and they are rarely used in explicit ways to drive change. The way that people think and act at work is (obviously) critical to executing strategy – BUT strategy is a dynamic process and must change and morph as market dynamics change (this is why the development of organizational values, which tend to be static, can be problematic and become an obstacle for companies needing to change). The problem is that most organizations do not explicitly think about and manage behavioral norms to align with their changing business strategies, which virtually guarantees that cultural norms fall out of alignment with ever-changing business strategy. Organizations that perform well on this are very explicit about cultural norms of behavior – most importantly, senior leaders are highly visible in role-modeling these behaviors. Simply talking about how people SHOULD behave is not effective. The topic of how to shape behaviors and create norms is extensive and too much for this essay.

Here is a funny but powerful video about how social norms, the need to conform and perceived pressure can influence us in the most humorous (and scary) ways:

https://www.facebook.com/anonews.co/videos/vb.997108126967413/1313784798633076/?type=2&theater

A less humorous example of the power of social norms and influence is the recent backlash against “critical race theory” (CRT) being taught in schools. News media show parents screaming and shouting at school board meetings and even becoming physically violent because they believe that CRT is bad for kids. When being interviewed by the media and asked “what is CRT”, few parents can answer – they are aggressively angry about a topic they don’t understand and cannot articulate.

Defaults

Defaults refer to the option that is automatically applied when people are required to make a choice but are indecisive. Defaults are increasingly common in behavior-change programs attempting to, for example, get employees to select a 401k investment or a health plan during open enrollment. Most of us will likely have noticed the more frequent use of default choice architecture over the past few years, as this has been studied and applied. Closely related to default options in decision-making is the concept of “inevitability” in choice and change. Elliot Aronson describes this psychological response as it relates to, for example, earthquake preparedness, how people respond to information about election outcomes, and racial integration. In the context of organizational change, there is powerful leverage when change leaders construct communications around the default message that the change is inevitable – it is certain! How the change is managed and handled may involve employee engagement and innovation, but whether or not it will occur is not up for debate. People respond very differently simply based on how the message is presented – and most are entirely unaware of this influence.

Salience

We are bombarded with so much information these days that it is impossible for our brains to process more than a small fraction of it. Our brains (largely unconsciously) filter what we pay attention to. Dolan and others describe a number of factors that influence which pieces of information we attend to.

Change leaders are often under pressure to put out information on tight timelines, and we often do not give adequate thought to the issue of salience for our target audiences. Coaches and consultants can provide significant benefit by educating clients and making them aware of this filtering process.

 

Affect

In psychology, “affect” is the experience of emotion. Those of us involved in organizational change tend to pay little attention to emotions, but affect is a powerful driver in decision-making for all of us, whether we are aware of it or not. Daniel Kahneman describes the brain’s System 1 (fast, automatic) and System 2 (slow, deliberate) in how we react to stimuli and make decisions. Emotion, rather than System 2’s careful consideration, is a greater driver of our decision-making than most of us realize. Research shows that simply placing an attractive female model in an advertisement for a financial loan increased demand for the loan as much as reducing the rate by 25% (Dolan). Few of us, I’m sure, would admit to being influenced by this! And there is little logic to why a sports star on a box of Wheaties dramatically improves sales compared with a nutritionist giving a sound explanation of the benefits of the cereal. It is primarily emotionally driven influence (although many of us would likely deny being influenced by “trivial” factors like these!).

As leadership coaches, we should think more deeply about the behaviors we are trying to shift and the emotions associated with each particular behavior. Companies, particularly those with an engineering focus, tend to communicate with a bias toward logic, data and detail, while missing opportunities to harness emotional messages that motivate more effective behaviors.

Commitments

Most of us struggle at some point in our lives with sticking to goals such as exercising more, losing weight, stopping smoking, drinking less and so on. Making commitments, especially public commitments, is a powerful mechanism for helping us stick to our goals. This influence emerges from our social or business culture and the need to be accepted by those in our work or social milieu. Chevron applied some remarkably effective commitment mechanisms when I was working with them on a workplace safety project. Company leaders were expected to post their commitments to safety on written, framed placards outside their offices. Further, Chevron’s “behavior-based safety” process required any employee who witnessed an unsafe practice to directly and verbally ask the individual to conform to the safety practice, and to ask for a commitment to do so. This was remarkably effective in my experience. When people are encouraged to make public commitments to take action, they are much more likely to follow through.

Ego balance

One of the primary drives we humans have is to maintain and enhance a positive self-image. We routinely (and largely unconsciously) compare ourselves to those in our work or social milieu to judge which behaviors are acceptable and build our self-image, and which are not acceptable and diminish it. When we act in a way that contradicts or diminishes our self-image, cognitive dissonance occurs, which can cause extreme anxiety and discomfort. We then struggle intensively to regain balance between our actions and our self-image. What is fascinating about this process is that – to regain ego balance – we are more likely to change our beliefs than our behaviors in order to maintain self-image. Here is an example from the banking experience I shared in a previous blog: a senior executive (who did not like me!) was extremely antagonistic towards my strategy to improve customer service, but begrudgingly allowed it to proceed because my sponsor was more senior.

While this executive was on vacation, he was invited to speak at an international conference in Johannesburg on the topic of service quality. Given his absence, I was asked to develop his presentation. On his return, he had little opportunity to make changes, and he essentially presented my customer service strategy as-is. It was very well received by the audience and rated one of the best presentations at the conference. After this event, he became a big advocate of my program. Why did this flip-flop occur? Continuing to be critical of the program after delivering a very successful presentation about it would have contradicted his self-image – he would have been viewed as inconsistent and insincere. So he changed his beliefs and began talking positively about the customer service program. What was abundantly evident was that this individual was unaware of this psychological change within himself.

Approaches to reduce this kind of “ignorance”

In “Noise”, Daniel Kahneman describes how training people to become aware of their biases, heuristics and the potential for noise is possible but difficult. He notes – “Decades of research have shown that professionals (experts) who have learned to avoid biases in their area of expertise often struggle to apply what they have learned to different fields”. For example, weather forecasters have learned to avoid over-confidence in predicting weather patterns, but are just as overconfident as anyone else on general-knowledge questions. The role of a coach is valuable in these circumstances: reminding expert leaders when they are straying outside their areas of expertise and that well-understood biases can creep in unnoticed. The power of this coaching is that it is “in-the-moment” rather than in a training program. As Kahneman notes, “people often recognize biases more easily in others than they do in themselves” – skilled coaches (Kahneman calls these “decision observers”) are even more effective in this role.

Mechanisms to overcome “ignorance”

A case for standardized checklists, algorithms and simple rules to reduce complexity

Daniel Kahneman (2011), the psychologist and Nobel Prize winner in economics, quotes Paul Meehl (whom Kahneman rates as “one of the most versatile psychologists of the twentieth century”) as saying that one reason experts are almost always outperformed in predictive capability by simple algorithms is that they think they are quite capable of dealing with massive amounts of data and information – and they are almost always wrong. They know that they are very smart people, but they “try to be (too) clever, think outside the box and consider complex combinations of features in making predictions – Complexity (most often) reduces validity”. Many studies have shown that human decision-makers are inferior to relatively simple formulae, statistics and checklists when assessing and making decisions about the success of complex scenarios such as mergers and acquisitions. In research studies, even when smart people are given the result produced by a formula, they tend to overrule or ignore it because they feel they have more knowledge and information than the formula. Kahneman notes that “they are most often wrong”.
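
As an aside for readers who want to see what such a “simple formula” might look like, here is a hedged sketch in Python of an equal-weights scoring rule of the general kind Kahneman discusses. The predictors and numbers are entirely invented for illustration; the point is only that each cue is standardized and given the same weight, with no clever combinations and no expert overrides.

```python
# Hedged illustration (hypothetical predictors and data) of a "simple formula":
# standardize a handful of relevant cues, weight them equally, and rank the
# candidates by the total score.
from statistics import mean, pstdev


def equal_weights_score(candidates, cues):
    """Score each candidate by the sum of its standardized cue values."""
    scores = {c["name"]: 0.0 for c in candidates}
    for cue in cues:
        values = [c[cue] for c in candidates]
        mu, sigma = mean(values), pstdev(values) or 1.0
        for c in candidates:
            scores[c["name"]] += (c[cue] - mu) / sigma  # every cue gets equal weight
    return scores


# Invented example: rating acquisition targets on three cues.
candidates = [
    {"name": "Target A", "revenue_growth": 0.12, "margin": 0.30, "staff_turnover": 0.08},
    {"name": "Target B", "revenue_growth": 0.25, "margin": 0.18, "staff_turnover": 0.15},
    {"name": "Target C", "revenue_growth": 0.05, "margin": 0.35, "staff_turnover": 0.05},
]
# Lower turnover is better, so flip its sign before scoring.
for c in candidates:
    c["staff_turnover"] = -c["staff_turnover"]

print(equal_weights_score(candidates, ["revenue_growth", "margin", "staff_turnover"]))
```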

Standardized approaches, simple algorithms and checklists can be very powerful tools. Atul Gawande (2013), a general surgeon in Boston and assistant professor at Harvard Medical School, describes the power of checklists in this way:

“We (humans) have accumulated stupendous know-how. We have put it in the hands of some of the most highly skilled and hardworking people in our society. And with it they have accomplished extraordinary things. Nonetheless, that know-how is often unmanageable. Avoidable failures are common and persistent, not to mention demoralizing and frustrating across many fields – from finance, business to government. And the reason is increasingly evident: the volume and complexity of what we know has exceeded our individual ability to deliver its benefits correctly, safely and reliably. Knowledge has both saved us and burdened us.” But there is such a strategy (to solve this problem) – though it is almost ridiculous in its simplicity, maybe even crazy to those who have spent years carefully developing ever more advanced skills and technologies (and indeed it is resisted in many companies for this reason). It is a checklist!

Kahneman offers his own judgment and predictive capabilities (or lack thereof) as a young military psychologist charged with assessing the leadership capabilities of aspiring officers; he was initially dismal at this task. He also highlights the poor record of highly trained counselors predicting the success of college freshmen based on several aptitude tests and other extensive data, compared with the predictive accuracy of a simple statistical algorithm using a fraction of the available information – the algorithm was by far the more successful. Kahneman goes on to reference cases of experienced medical doctors predicting the longevity of cancer patients, the prediction of babies’ susceptibility to sudden infant death syndrome, predictions of new business success, evaluations of credit risk, all the way to marital stability and the future value of fine Bordeaux wines. In all these cases, the accuracy of highly trained experts was most often exceeded by simple algorithms, much to the consternation, occasional anger and derision of the experts concerned.

Jonah Lehrer (2009) similarly referenced studies conducted at MIT in which students given access to large amounts of data performed poorly at predicting stock prices compared with a control group of students who had access to far less information. He notes that the prefrontal cortex has great difficulty NOT paying attention to large amounts of information, which can overwhelm the brain’s ability to estimate and predict. Access to excessive quantities of information can have “diminishing returns” when conducting assessments and predicting future outcomes, he says. Lehrer comments that corporations, in particular, often fall into the “excessive information” trap and invest huge resources in collecting data that then overwhelms and confuses the human brain, rather than informing decision-making.

Lehrer also describes the remarkable situation of medical doctors diagnosing back pain several decades ago. With the introduction of MRI in the 1980s, and with far greater detail available, medical practitioners hoped to make increasingly better predictions about the sources of back pain. The converse happened: the massive amount of detail produced by the MRI actually worsened their assessments and predictions. Kahneman refers to scenarios that contain a high level of complexity, uncertainty and unpredictability as “low-validity environments”. Experts can become overwhelmed by complexity in decision-making. Leadership coaches can assist greatly by developing checklists or other simple decision-support tools to limit the biases and confusion that come from data overload.

The power of simple checklists

Something as simple as a checklist has, as Kahneman describes, “saved hundreds of thousands of infants”. He gives the example of newborn infants a few decades ago: obstetricians had always known that an infant who is not breathing normally within a few minutes of birth is at high risk of brain damage or death. Physicians and midwives through the 1950s typically used their varying levels of medical judgment to determine whether a baby was in distress, each relying on their own experience and on different signs and symptoms to determine the level and extent of that distress. Because practitioners looked at different symptoms, danger signs were often overlooked or missed, and many newborn babies died.

When Virginia Apgar, an American obstetrical anesthesiologist, was casually asked by a student how to make a systematic assessment of a newborn, Apgar responded “that’s easy” and jotted down five variables (heart rate, respiration, reflex, muscle tone and color) and three scores (0, 1 or 2, depending on the robustness of each variable). Apgar began applying this assessment herself about sixty seconds after the birth of every infant she handled. A baby scoring eight or greater was likely to be in excellent condition; a baby with a score of four or less was in trouble and needed immediate attention. What is now called the “Apgar Test” is used in every delivery room every day and is credited with saving thousands of infant lives. Indeed, a report on CNN.com as recently as March 2014 (Hudson, 2014) indicated that about one in twenty-five patients who seek treatment in US hospitals contracts an infection from the hospital, and that patients acquired some 721,800 infections in 2011. That figure, however, represents a significant improvement over previous years (roughly 44% better from 2008 to 2012), a result that came from “requiring hospitals to follow a simple checklist of best practices”. Simple checklists focused on complex situations work!
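
To underline how spare such a checklist is, here is a minimal sketch in Python of the scoring logic just described: five variables, each rated 0, 1 or 2, summed and compared against the two thresholds. The middle “monitor closely” band and the example ratings are my own illustrative additions, not part of the source.

```python
# Minimal sketch of the Apgar-style checklist described above: five variables,
# each scored 0, 1 or 2, summed, and compared against simple thresholds.
# The example ratings are illustrative only; a real assessment is a clinical
# judgment made by trained staff.

APGAR_VARIABLES = ["heart_rate", "respiration", "reflex", "muscle_tone", "color"]


def apgar_total(scores):
    """Sum the five 0-2 ratings into a 0-10 total."""
    for var in APGAR_VARIABLES:
        if scores.get(var) not in (0, 1, 2):
            raise ValueError(f"{var} must be scored 0, 1 or 2")
    return sum(scores[var] for var in APGAR_VARIABLES)


def interpret(total):
    if total >= 8:
        return "likely in excellent condition"
    if total <= 4:
        return "in trouble - needs immediate attention"
    return "monitor closely"  # middle band added here for completeness


example = {"heart_rate": 2, "respiration": 1, "reflex": 2, "muscle_tone": 2, "color": 1}
total = apgar_total(example)
print(f"Apgar total {total}: {interpret(total)}")
```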

Resistance to assessment, prediction and tracking methods

Kahneman writes in detail about the level of resistance, even hostility, that he and other researchers have met when presenting the results of this research. From medical professionals to psychologists and wine producers, these experts either rejected or ignored the results, and in some cases responded with derision. Perhaps this is predictable, because the results challenge the assessment and predictive capabilities of experts who have developed their skills over many years and have, understandably, developed high opinions of their capabilities.

Kahneman quotes Gawande who writes in his book “The Checklist Manifesto”:

“We don’t like checklists. They can be painstaking. They’re not much fun. But I don’t think the issue (people resistance) here is mere laziness. There’s something deeper, more visceral going on when people walk away, not only from saving lives, but from making money. It somehow feels beneath us to use a checklist, it’s an embarrassment. It runs counter to deeply held beliefs about how the truly great among us – those heroes we aspire to be – handle situations of high stakes and complexity. The truly great are daring. They improvise. They do not need protocols and checklists. Maybe our idea of heroism needs updating.”

I agree with this sentiment. I have experienced this kind of response, verging on disdain, when developing checklists related to change and transformation in organizations. Somehow a checklist, algorithm or computation trivializes people’s personal sense of expertise, making them feel less expert. Trusted leadership coaches can do a great deal to overcome these kinds of fears and resistance.

However, I believe a key element of introducing assessments and checklists is missed in Kahneman’s discussion. These tools should be developed – as far as possible – together with the experts who will ultimately use them. This is a basic behavioral-change principle, designed to overcome the “not invented here” syndrome. This principle has helped me introduce checklists into organizational change initiatives where many executives feel they “know it all”.

Expertise and ignorance – being smart is more about understanding the body of knowledge that exists, and being aware that there is much more to know

This essay has just touched the surface of how we humans are unknowingly influenced by a myriad of factors beyond our awareness. Beginning to understand these factors makes us smarter and reduces over-confidence, ignorance and poor decision-making. Given the resistance to these techniques, leadership coaches and consultants are in a position to nudge their clients to apply these tools for better awareness, understanding and decision-making.

For more reading on MINDSPACE:

Dolan, P., Hallsworth, M., Halpern, D., King, D., Metcalfe, R., & Vlaev, I. (2012). Influencing behaviour: The MINDSPACE way. Journal of Economic Psychology.
