Library of Professional Coaching

Expertise And Ignorance: We Are All Ignorant—Some of Us Know It and Some Of Us Don’t

Kevin Weitz, Psy.D. and William Bergquist, Ph.D.

To recognize superior expertise would require people to have already a surfeit of expertise themselves. David Dunning (2012)

There is a growing body of knowledge coming out of psychology and cognitive science that you have no clue why you act the way you do. David McRaney (2012)

In our previous essay on “the crisis of expertise”, we not only commented on leadership hubris and over-confidence, but also noted that many people (especially lay-people) are often blatantly ignorant. More importantly, they are unaware of their ignorance. In our digital world of the mid-21st Century, information – and misinformation – is readily accessible.

Lay-people are especially susceptible to believing that they know a lot about a particular topic when, unknowingly, they are woefully ignorant or misinformed about it. This misinformed sense of knowledge is sometimes accompanied by a zealous (and often aggressive) defense of their incompetence. As psychologist David Dunning (2012) concludes: “we are all stupid, it’s just that some of us are aware of how much we don’t know, and what makes us stupid” and are therefore less likely to parade our stupidity.

The Challenge: Ignorant About Our Ignorance

In You Are Not So Smart, David McRaney (2012) rather humorously (but accurately) notes that all of us humans are to some degree unaware of why and how we think, feel and behave. Our unconscious biases and behavioral drivers or triggers (heuristics) are largely unknown to us: “There is a growing body of knowledge coming out of psychology and cognitive science that you have no clue why you act the way you do”.

This lack of awareness is particularly concerning, even dangerous, when it is found amongst senior leaders and experts who make important decisions that impact other people, organizations and even societies. It is perhaps even more debilitating for followers who believe (or reject) the information conveyed by these leaders and experts. Indeed, as we have seen with conspiracy theories, misinformation and blatant lies disseminated by some leaders and experts, it can be life threatening.

Neuroscientist Stuart Firestein (2012) argues that “we should value what we don’t know just as much as what we know”. However, to value this “ignorance” (but not stupidity) requires an appreciation of the depth of knowledge found among the experts who deeply understand their fields of research and study. The problem is that most people are blatantly unaware of how much they don’t know. Leaders and experts in positions of power and influence who are “ignorant of their ignorance” are especially dangerous.

Leaders must not only be self-aware of these psychological drivers or triggers; in addition, they must understand and appreciate the important role played by these drivers in influencing the behaviors of people they lead. Leadership coaches are in a position to become more informed about these biases, blind spots and behavioral triggers in the leaders they coach. The informed coach can help their client become increasingly self-aware and more effective as a leader.

We are all ignorant – that’s either a good thing or a very bad thing!

On the one hand, our awareness of the scope and context of our ignorance is to be cherished. For example, a scientist should be profoundly aware of how much knowledge there is still to learn. On the other hand, it is potentially dangerous if we are unaware of our ignorance. For example, it is dangerous for a lay person to strongly propose an untested medication for a certain disease. Social psychologist David Dunning (2012) observes that “people are often not aware of their deficits on everyday tasks … and when they display their incompetence, they remain blissfully unaware of the fact”. He further concludes that:

People performing poorly cannot be expected to recognize their ineptitude. They are simply not in a position to know that they are doing badly. The ability to recognize the depth of their inadequacies is beyond them. Much like certain types of brain damage prevent sufferers from recognizing their blindness or paralysis, incompetence places people in a state of anosognosia that makes it difficult and at times impossible, for sufferers to gain insight into how impaired their intellectual and social skills are.

While this might feel like a slap-in-the-face to many of us, it does make sense because, as Dunning notes, high-order skills are needed to make a sound decision, or to recognize that a conspiracy theory is only a theory. It is cognitively difficult to accept or reject the statements of an expert or leader, for evaluating our own responses to these statements requires something called Metacognition. Is my thinking distorted or biased regarding this statement? Is my behavior in response to this statement appropriate in this situation? Metacognition requires that we can think about our thinking and reflect on our behavior within a specific context. Dunning puts it this way: “Because they lack the skill to produce correct responses, incompetent people also lack the skill to accurately judge their responses” (Dunning, 2012).

Dunning’s research demonstrates that the more a person knows about a subject, the more likely they are to understate their level of knowledge. An important implication can be drawn: an expert understands the immense scope of any subject (climate science, infectious diseases, psychology and human behavior and so on). They recognize how much is still to be known. Incompetent individuals tend to greatly overestimate their knowledge. They don’t adequately understand the limited nature of their newly acquired knowledge. Google-based knowledge that was acquired three minutes ago isn’t adequate. Indeed, as Stuart Firestein (2012) reports: “we will have to provide the Wiki-raised student with a taste of, and for, the boundaries, the edge of the widening circle of ignorance, how the data, which are not unimportant, frames the unknown. We must teach students how to think in questions, (and) how to manage ignorance”.

The lack of appreciation for expertise and for experts is a form of incompetence. It is blatantly visible in the (now commonly posted) video interviews to be found in the public media. Conspiracy theorists trade in a form of dangerous misinformation (which we discussed previously). These theorists declare, with no evidence, that Democratic pedophiles are operating in the basement of a pizza parlor. Anti-vaccine believers state that vaccines are nothing more than “a Bill Gates plot”. There is a host of media followers who explicitly believe blatant and un-verifiable lies that are spewed out by many leaders (especially politicians). The psychological science behind these conspiratorial beliefs is powerful as well as troubling. These scientific findings are troubling in the sense that so many of us can be hoodwinked so easily!

Education (and lack of it) and an appreciation for what we DON’T know

In a previous essay, we quoted Tom Nichols (2018) in his book The Death of Expertise: “Google-fueled, Wikipedia based, Blog-sodden”. The point being made is that facts and information are increasingly available at the click of a button. Firestein (2012) goes even further: “facts are available with a few clicks, and probably in the not very distant future by simply asking the wall, or the television, or the cloud—wherever it is the computer is hidden.” Facts and information are ubiquitous, sometimes giving people the perception that they “know it all”.

The real experts and leaders of the future must not only know the “facts” but also deeply understand what is not yet known. Current educational systems are focused on filling our brains with facts and information. An alternative curriculum would focus on assisting the student in understanding the boundaries of knowledge and the big questions that are yet to be answered. It is this understanding that makes many experts present information in a “comparative” manner—which can convey a sense of uncertainty or equivocation to the poorly educated.

As we have described previously, people with less education (especially those with an authoritarian personality type) respond more positively to absolute statements and reject equivocal descriptions from experts. They prefer statements from leaders and experts that imply “I and only I know it all”. William Perry (1970) has identified this stance as one of dualism. There is a right answer and a wrong answer. There is truth that is conveyed by the one true source of knowledge. The world is dualistic: truth and fiction.

However, it is these equivocal and comparative statements that embody the assumption held by experts that they don’t know it all: the body of knowledge they DON’T know is much larger than that which they DO know. Perry identifies this as a move from dualism to a Multiplistic perspective. There are multiple legitimate sources of knowledge that can each lay claim to the truth. Even the criteria for determining what is and is not real will vary—making it that much more difficult to come to a conclusion regarding which “facts” to accept.

Unfortunately, our current educational system seems to be stuck in the world of Dualism. Any move toward Multiplicity is met with displeasure (and even violent objections) from parents and community leaders who insist that only the “truth” be taught. Something must change. As Firestein proposes: “The business model of our (educational systems), in place now for nearly a thousand years, will need to be revised … Instead of a system where the collection of facts is an end, where knowledge is equated with accumulation, where ignorance is rarely discussed, we will have to provide the Wiki-raised student with a taste of the boundaries, the edge of the widening circle of ignorance, how the data, … frames the unknown. We must teach students how to think in questions, (and) how to manage ignorance”.

Knowledge and expertise are specific – the importance of knowing when you are out of your depth

Just because we have expertise in one domain doesn’t mean we are knowledgeable in others. Many of us don’t understand this fact and find it hard to accept. Our previous Hatfill story is a good example of where experts fall into this trap: assuming that because they are experts in one domain, they are also knowledgeable in others. Unfortunately, their supporters/followers support and enable this simplistic view: they listen to and believe anything their “expert” states. This faulty assumption is relatively benign when we buy into a sports figure’s endorsement of a specific cereal. It is much more dangerous when someone who is an “expert” on a specific health issue pronounces on the “real” motives behind the position proposed or actions taken by an opposing social group or political party. They have shifted from health to sociology and political science. This is an unwarranted shift.

At a very simple level, this uncritical acceptance is based on something one school of social psychology calls “balance theory” (Brown, 1986). It seems that when we like someone and accept their views on one issue, then we are likely to accept what they say regarding another issue. We are keeping our social relationship with this other person in a state of “balance.” It is very uncomfortable for us to hold a mixed view of another person—especially someone we admire.

Another school of social psychology views this as an example of the strong motive to avoid Cognitive Dissonance (Aronson, 2018). For this school, it is not enough just to keep an interpersonal relationship balanced; it is important that we keep everything in our life consonant. We don’t want any contradictions in our belief system. If we believe that this person is an expert on medical matters, then we also have to believe that they are an expert in other areas—otherwise their medical expertise might also be questioned. “If they don’t know what they are talking about with regard to X, Y or Z or if they are lying to us, then how do we believe anything they say!”

We see in this proclivity to sustain interpersonal balance and avoid dissonance another example of a dualistic perspective: either everything this person has to say is accurate and they are to be trusted, or nothing they are saying can be believed or trusted. When this search for balance and consonance is brought into the public domain, and when what is being said in this domain is saturated with anxiety, then this search becomes highly motivated. It also tends to lead to mis-directed decisions and inappropriate actions. This is when the world becomes quite dangerous.

To illustrate this more dangerous level of uncritical acceptance of expertise, we return to Steven Hatfill. As a reminder, Hatfill was a 67-year-old immunology professor at George Washington University (in Washington D.C.) who came out of the blue, with no credentials or credibility, to guide US policy under Trump regarding delays in the COVID response. He ultimately moved well beyond his area of medical knowledge: he became involved in the attempts to overturn the 2020 US Presidential election and in false claims that the election had been rigged.

As David Dunning has observed, it is a very complex and difficult task for anyone to assess their own competence. It is even harder when those around us (particularly sycophants), do not provide honest and critical feedback. It is a slippery slope for experts to consider themselves expert in other domains – and sometimes with dangerous outcomes. It is an even more slippery slope for those in society who are asked to accept this shifting expertise. They desperately want balance and consonance. It is essential for experts and leaders to build a team of knowledgeable collaborators and for critics to be present who provide honest feedback. Both the slippery expert and the sliding public need this assistance.

Intelligence and Intellectual style

Although highly controversial, Robert Sapolsky (2017) offers research findings regarding the relationship between intelligence and ideological perspective. The findings that he offers suggest that lower intelligence predicts adherence to conservative ideologies. These findings might, in turn, suggest a preference among those embracing these ideologies for dualistic thinking, as well as a strong pull toward balance and consonance. At the very least these findings point to a specific constellation of personality attributes that are aligned with a conservative perspective.

It is at this point that psychologists (and Robert Sapolsky in particular) are touching on the “third rail” of social psychology. They can rightfully be accused of bringing their own political leanings into the conversation. Over the past century this bias has been found coming from both the left wing and the right wing. It seems that the critique of political biases may itself come with a strong bias. Even the analysis offered in this essay might reflect the biases of the authors. None of us are immune and we must all “come clean” regarding our vantage point.

The most famous (and controversial) of the studies done regarding the perspectives and competencies of those holding specific political views was conducted in the 1950s by a group of young psychologists at the University of California and a noted philosopher, Theodor Adorno (Adorno, et al., 1950; Adorno, 2020). These researchers found that those people who exhibit Authoritarian Personalities tend to be uncomfortable with ambiguity. By contrast, liberal thinkers “think harder”! Now, more than seventy years later, Robert Sapolsky quotes political scientist Philip Tetlock’s research suggesting that “leftists” have a greater capacity for “integrative complexity”.

These findings (and conclusions) from both the 1950s and 2020s can be applied to the challenges we face as a society regarding a complex and challenging social condition such as Covid-19. Conservative leaders and experts are likely to downplay or reject the scientific facts being put forward. Are they somehow less “bright” than their liberal colleagues and liberal-leaning experts? Furthermore, it is possible that conservative thinkers are more likely to default to belief in authoritarian leaders who provide simple and absolute answers. By contrast, liberal thinkers are more likely to question and challenge simple answers. Conservatism is easily equated with Dualism. Liberalism is then equated with higher-order Multiplicity. Is this simple differentiation between Conservatism and Liberalism itself a Dualism? Do we all regress to an early stage of cognitive development “when the going gets tough?”

Evangelical religious beliefs

In the early 1970s one of us [KW] received a sports scholarship that enabled him to come to the US from South Africa. The Vietnam war was winding down.

This was a huge opportunity and privilege for me and my goal was to be the best I could be with this opportunity. I landed up at a Nazarene university in San Diego overlooking the Pacific Ocean. Little could have been more perfect, except for the hypocrisy I experienced at this school from the Nazarene leadership. Besides being forced out of the school for “not being religious enough”, I was horrified by how they treated the Vietnam vets from Nazarene families who came to the school after the war. These were individuals with deep trauma, likely PTSD, who needed all the help, support and empathy that could be mustered. They received the opposite from the school. They were spied on, disciplined for bad behavior, and at least two of them that I knew personally were eventually ejected from the school. My own family was a “not very religious” Catholic family, and my only early experience of Catholicism was punitive, to the point of family arguments about not being Catholic enough and, ultimately, family fragmentation.

This personal narrative suggests that KW might be somewhat biased when reflecting on the role of evangelical religious beliefs. The second author [WB] offers his own narrative that reveals something about his own bias:

I grew up in a family where my father was a Christian Scientist. This is a religion that was born in the United States, founded by a woman (Mary Baker Eddy). It is based on the premise that there is no physical reality but only a spiritual realm in which we all dwell. It is a realm in which there is no disease or death, and no inhumanity—for God is all knowing, all powerful and all good. In my household there was no illness and only good thoughts about other human beings (in alignment with God’s presence). It should be noted that in some ways this religion is aligned with recent Quantum perspectives regarding the nature of reality; furthermore, many valuable initiatives are associated with this church — such as the Christian Science Monitor (a highly respected newspaper published by the church). It is a religion that is filled with Love and affirmative perspectives – much needed today. Yet, I had to confront a “reality” early in my life that Mary Baker Eddy might have been “wrong” regarding her vision of a single spiritual reality. It was indeed painful to confront my loving father with my own disbelief in the tenets of a church in which he was a “true believer” (and often served as a leader of his local church). He spent every evening reading “lessons” that had been prescribed by Eddy many years ago. I was not allowed to attend any classes in Biology or related fields and received vaccines only after my Mother (who was not a Christian Scientist) strongly insisted on this preventative treatment.

The perspectives offered by Kristin Kobes DuMez (2021) are consistent in some aspects with the challenges both of us experienced. In her book “Jesus and John Wayne: How White Evangelicals Corrupted a Faith and Fractured a Nation” DuMez notes that “White evangelicals are significantly more authoritarian than other religious groups”. White evangelicals embody “a nostalgic commitment to rugged, aggressive, militant white masculinity” she notes. Her writing aligns with what KW experienced. We would add to this description that evangelical Christians overlap in values and beliefs significantly with conservative thinkers.

For example, in The Gospel of Climate Skepticism: Why Evangelical Christians Oppose Action on Climate Change, Robin Veldman (2019) notes that evangelical Christians, like conservative thinkers, are more likely to align their beliefs with their in-group than to critically analyze scientific data. As Veldman suggests, “being a part of the evangelical community is showing that you keep good theologically conservative company (i.e. stay close to your in-group’s thinking no matter what), and environmentalism is associated with being liberal” or aligned with an outgroup, and that is unacceptable. This description contrasts with that of a community consisting of independent and critical thinkers who step back from in-group thinking and more critically consider the scientific evidence being reported, whether that concerns climate change, Covid vaccines or QAnon theories.

The experiences that the second of us [WB] had while growing up are directly aligned with Veldman’s description of the dynamics operating in an evangelical community. There is no room in Christian Science for divergent thought. The Sunday presentations are prescribed and have never changed since they were dictated by Mary Baker Eddy more than a century ago. There is a specific interpretation of Biblical passages that never varies, and Eddy’s Science and Health is “Gospel”. It conveys the absolute truth. There is no room in this church for contemporary scientific findings. It is truly remarkable that very little has changed in the Christian Science church after more than a century of new scientific discoveries and major breakthroughs in medicine. Like many Evangelical churches, Christian Science is essentially a closed system that leaves few openings for the “reality” of 21st Century life (let alone that of the 19th and 20th Centuries!).

Cognitive load

Conditions of greater stress, such as the need to make time-pressured decisions, produce what psychologists call Cognitive Load. Those working in the field of behavioral economics (Kahneman, 2011; Ariely, 2008) offer many examples of our reduced capacity to engage in various cognitive functions (such as memorizing, problem-solving, and thinking in a logical manner) when we are stressed. Our prefrontal cortex, which is responsible for most cognitive functioning, is easily overloaded when barraged with many stressful challenges. Control is returned to other regions of the brain. We become fast-thinkers and knee-jerk reactors when stressed out and tired. While there are some intuitive tasks that are best engaged by older regions of the brain (Lehrer, 2009), it is often not good for us to revert to these more primitive processes and abandon the slow thinking that occurs in the prefrontal regions of our brain.

Stressful conditions brought on by Covid, along with demanding and changing work conditions and high levels of inflation, create major cognitive and emotional challenges and exhaustion for many people. Different personality types handle these conditions in distinctive ways. Sapolsky (2017) notes that people become more conservative in their thinking and decision-making when stressed and tired. Both conservative thinkers and liberal thinkers look for simple solutions and answers when stressed and tired. Most people, it would seem, are more likely to accept a simple answer from an expert or leader – even if it’s wrong or a lie – when they are overloaded and exhausted. The problem with this scenario, of course, is that most of us are completely unaware of our tendency to do this.

Doing Something About Our Ignorance

Unfortunately, the reversion to more primitive regions of our brain is usually well hidden. Most of us would likely vehemently argue that we would NEVER be so fickle as to respond more favorably to one person over another simply because one of the two was better looking! Few of us would admit to being “primed” by simple words we read or hear (even in the background), which then unknowingly influence how we view and interact with someone moments later. Research studies have shown that seeing images of retirement homes and old people makes young students walk more slowly and act fatigued compared with control groups who have not viewed these images. Do these findings apply to any of us—or are they just applicable to college-aged students taking a sophomore psychology course?

Subtle factors (heuristics) that drive emotions, thoughts and decision-making beyond conscious awareness

Many studies would suggest that findings such as these relate to all of us. The behavioral scientists identify something that they call Heuristics. These are mental shortcuts or “rules of thumb” that we develop over our lifetimes. We all make use of Heuristics. They are essential for us to make quick decisions without the need for more complex and slow event processing and decision-making. They are not just being used by college Sophomores. Furthermore, these Heuristics serve as the foundation for what we have frequently identified as fast thinking (contrasting with slow thinking). This is an important distinction for which we continue to thank Daniel Kahneman (2011).

Most of us are likely to think we are quite aware and consistent in our behaviors. In fact, however, most of us are heavily influenced by individual biases and various social norms in different circumstances of which we are largely unaware. For example, some of us behave very differently at work than we do in various social settings, or at home or at church. In this essay we have described how our behavior can be influenced by sources of which we are unaware. It is surprising and often uncomfortable for us to acknowledge this influence. This awareness is nevertheless powerful and positive. We must become more aware of these influencers and develop techniques to manage our responses to them. Leadership coaches are in a strong position to help clients become more aware of their individual drivers and provide techniques to overcome negative outcomes.

When describing these psychological drivers and techniques to a client recently, one of us [KW] noted that his client responded by saying that these reflective processes sounded “manipulative”. In a way she was right. It does seem rather “unnatural” to spend time reflecting on our thinking and feelings. Our spontaneity is lost when we spend too much time thinking “slowly.” We become cold calculating computers rather than living, caring human beings.

It is important to acknowledge that these unconscious psychological drivers operate naturally. They can’t be avoided. They occur all the time in all of us, largely without our awareness. Daniel Kahneman adds to his analysis by describing these natural reactions as “System 1”. We need not blame ourselves for engaging in System 1. We only need to engage some System 2—which is the thoughtful and self-critical slow thinking identified by Kahneman. We, as coaches, are able to utilize these System 2 techniques in helping our clients shift their thoughts and resultant behavior in a positive manner.

It is also important (perhaps obvious) to note that system 2 techniques must be applied specifically to encourage the behaviors, actions or decisions needed within the specific context in which they occur. A simple example of how this process can progress follows:
1. A leader is in a particular organizational setting (say a high-level meeting) and notices a new person in the room that reminds her of someone in her past. She feels a moment of annoyance emerge which influences her reactions and focus during the meeting. Walking out of the meeting, she feels frustrated and confused about why the meeting went poorly.
2. In a subsequent debrief, she notes to her coach that she was not on her game and distracted during the meeting and was mystified why.
3. The coach explores her experience together with her and picks up on a comment about the stranger in the room. Further discussion identifies that the stranger reminded her of a college professor who was harsh, critical and confrontational.
4. The coach then suggests that the leader was likely “primed” (see later) and suggests applying the technique of “thinking about thinking” and other techniques in future to manage these situations.
5. The coach then helps the leader practice these techniques to mastery.

System 2 is not comprised of an isolated set of conceptual tools. It is a “just-in-time” set of tools that are engaged in real situations. Clearly analyzing and articulating the behavior changes or actions needed from employees in specific situations is important.

Some subconscious heuristics

We find that there are many misleading heuristics operating in our society. These heuristics might be of some value in our day-to-day decision making regarding trivial matters. However, they can also contribute to inaccurate judgments of a critical nature in many domains of contemporary society. One of Daniel Kahneman’s colleagues, Lewis Goldberg (1970), stirred up many controversies when he brought this behavioral economic analysis into an ongoing study of misjudgments on the part of clinical psychologists and other mental health workers.

It is not only a matter of misjudgments being made by so-called “experts” in a specific field, but also the impact of these misjudgments on the credibility assigned by the general public to these “experts” – and ultimately to experts of all sorts. This is where the ugly side of Multiplicity shows up. If we are presented with a multitude of opinions and advice offered by experts, and if we find out that many (if not most) of these experts are making inaccurate judgements, then we are left with a destructive option: we don’t trust any of the experts and either quit listening or look for the most expedient or most carefully crafted opinion. In a negative Multiplistic space we turn the Golden Rule on its head: “those with the Gold will (should) rule – or at least are those to be believed and followed.”

We must focus on the sources of misguiding heuristics if we are to move beyond this nihilistic perspective of disillusioned Multiplicity. We offer the following list and brief description of five heuristics that are often the culprits—contributing to the misjudgments of experts as well as those of us who seek to find expert-based guidance. These heuristics are: priming, framing, availability, status quo, and fear of loss.

Priming

Our guide, Daniel Kahneman, describes the “marvels” of priming. For example, if you have recently read or heard the word EAT or FOOD, you are more likely to complete the word fragment SO_P as SOUP rather than SOAP. The opposite would be the case if you had read or heard the word WASH. EAT primes SOUP and WASH primes SOAP. We do this unconsciously. Kahneman notes that it is difficult for many of us to accept that our behaviors and emotions can be primed by events of which we are entirely unaware. The theme of unawareness shows up once again! Of course, savvy marketers are very aware of these factors and effectively “prime” our thinking and buying behavior.

As we noted above, “priming” groups of students with the words “forgetful, old age, lonely” and so on made these students walk much more slowly from the interview room than students who were “primed” with more energetic words. At the other end of the energy spectrum, athletes “prime” themselves with energetic and powerful mantras and images. Change leaders can also utilize this phenomenon, for example, by priming employees as they arrive at work (and frequently during the day) with words or phrases that energize change-oriented behavior. Words like “innovate, speed, agility, collaborate” and so on could serve as effective primes.

In one technology company where one of us [KW] worked, many of the work area and hallway walls were proudly adorned with examples of technology breakthroughs that the company had achieved over the decades. While these items were truly amazing examples of legacy breakthroughs, I was convinced that these “old” artifacts primed many employees to be complacent and think “old” rather than to think innovatively about the breakthroughs of the future. This organization fundamentally missed the huge technology shift from older devices to new, smaller mobile devices and has struggled to catch up.

The second author [WB] consulted to the leaders in the regional office of a major religious organization. Each conference room in this facility came with banners on the wall declaring the primary mission of this church. Pads of note paper on which the mission statement was printed were also placed on the conference table in front of each chair. The chief executive officer of this region began each meeting with not only a prayer aligned with the church’s mission but also an example of the mission being successfully initiated in recent weeks. This is real priming! Once again, was this priming always good?

Did it leave the leaders of this region living in the past, with little opportunity (or incentive) to upgrade the mission, despite full recognition that the religious and spiritual perspectives and practices of many people living in the 21st Century are changing? When does a mission statement provide important continuity and guidance in a very stormy world—when is it a surface anchor that helps a ship at sea remain aligned with the wind, accommodating no more than a slow drift? Conversely, when does a mission statement become a heavy ground anchor that allows no movement of the boat—and keeps the institution too firmly tethered to the past?

Framing

People react very differently to the same information presented in different ways. Two other behavioral scientists, Thaler and Sunstein (2008), offer research findings indicating that people will overwhelmingly respond differently when a problem or decision is presented in a positive manner, with potential gains being identified, than when the problem or decision is presented in a negative way that highlights potential loss. These diverse responses occur despite the fact that the basic information is exactly the same.

In one study, if doctors are told that “ninety of one hundred patients survived” as a result of a certain type of surgery, they are much more likely to recommend surgery than if told that “ten of one hundred died”. Our “System 1” brain responds immediately to this kind of loss or gain information without the more logical and thoughtful consideration of our System 2. Framing occurs because, as Kahneman notes, our brain tends to be lazy and doesn’t want to move to System 2. Most people tend not to think deeply about what they hear or read. We react to information in the moment (especially when under pressure or stressed).

We can turn from our behavioral economists to our social psychologists and find similar observations. Elliot Aronson (2018) offers fascinating research-based examples of our built-in tendency to respond very differently simply based on how a choice is presented. Indeed, it can be quite disconcerting when we become increasingly aware of how easily our behaviors and decisions can be swayed. This human tendency can be used for positive benefit in organizational change initiatives. Information about the change process can be “framed” in a way that will be interpreted positively by our System 1 brains.

For example, in many technology change projects, there is often an abundance of information communicated about what IS changing. The response to these changes can very often foster resistance and fear. Leveraging the framing effect, communications could begin by describing what will stay the same. For example, while new technology systems are implemented, the business processes behind the systems often remain the same or similar and are thus less intimidating to many employees who may find this kind of technology change daunting. As in the case of both our high-tech firm and our religious organization, a gesture toward the past and reinforcement of the mission—a framing—can provide an important sea anchor, as long as it is not embedded as a ground anchor in the seafloor.

Availability Heuristic

An interesting experiment was conducted regarding how people make estimations. Research subjects are asked the following question: which is higher, the number of murders each year in the United States or the number of suicides reported in the United States each year? Subjects answer unequivocally that there are more murders than suicides (unless they are experts in this field). They are wrong. We make the wrong judgement because we hear or read about murders on the news frequently. Our System 1 retrieves this information quickly and assumes that because we often hear about homicides, they must be more frequent—even if this is not the case.

Leaders can effectively utilize this human tendency to create “rules of thumb.” For example, they can frequently communicate about positive change experiences. Large scale projects often produce bursts of communications when specific phases are underway, rather than on a regular and frequent basis. Frequent and ongoing communications and discussion about projects creates two important availability heuristics. First, change can be positive. Second, change is something that is ongoing rather than being occasional and scary. A repeated conveying of these two perspectives can lead members of the organization quickly to access them when thinking about the changes that are occurring all around them.

The Status Quo bias

Like most people, the two of us tend to stick with default settings when, for example, downloading a new software program. We most often simply accept the recommended defaults. Software vendors who include a “Recommended” setting are leveraging the “Status Quo bias” that almost all of us hold. Most of us are largely unaware of our actions when downloading new software. Leaders can also leverage this human preference for inertia by providing recommendations when people are faced with changes, or when they need to make change-related choices. “Here is the problem (challenge). Here is the answer.”

For example, most change leaders know all too well that employees tend to resist change when it is forced on them. A technique can be engaged to overcome this resistance: the change leader provides several options AND includes a recommended selection. For example, a few years ago one of us [KW] was working with a procurement team developing new global processes. Instead of deploying a new required process, we held workshops that allowed employees to bring their own thinking and experience into the process. We provided a few examples of what other regions had successfully implemented and made a recommendation on what we thought was best. Almost universally, the recommended default was accepted without resistance.

Fear of loss versus incentive of gain

The behavioral economists propose that we as human beings tend to hate losses much more than we are excited about gaining the same thing. For example, suppose people are asked to play a game in which a coin is flipped: if it comes up heads the players win $X, and if it comes up tails they lose $100. Kahneman describes research showing that $X will generally have to be about $200 for people to be willing to play this game. In other words, the fear of losing is about twice as powerful as the attraction of winning.
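
For readers who want to see the arithmetic behind this claim, the following is a minimal sketch in Python, assuming a loss-aversion coefficient of about two (a figure drawn from the prospect-theory literature Kahneman summarizes, not a number supplied in this essay).

```python
# A minimal sketch of the arithmetic behind the coin-flip example above.
# The loss-aversion coefficient of roughly 2 is an illustrative assumption.

LOSS = 100            # dollars lost if the coin comes up tails
LOSS_AVERSION = 2.0   # losses are felt roughly twice as strongly as equivalent gains

def subjective_value(gain: float) -> float:
    """Psychological value of the 50/50 gamble: win `gain` dollars or lose LOSS dollars."""
    return 0.5 * gain - 0.5 * LOSS_AVERSION * LOSS

# Search for the smallest winning amount at which the gamble no longer feels like a loss.
gain = 0
while subjective_value(gain) < 0:
    gain += 1

print(gain)  # -> 200: the potential win must be about twice the potential loss
```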

One of us [KW] was working on a project some years ago where incentives were provided to keep consultants on the project until the end (consultants tend to begin looking for their next project many months in advance of the end of an existing project and will tend to leave the existing project for a new one). This project offered bonuses for consultants who remained to the end. These bonuses had little effect, and many consultants left early. A more effective method would have been to retain a portion of the consultants’ agreed pay until the end, thereby leveraging their fear of loss.

It is interesting to note that the fear of loss might not be our primary motivator. Kahneman and other behavioral economists have found that regret might be even stronger than loss. We regret not taking certain actions—even if these actions might have produced loss. The opportunity that is missed will “haunt” us for many years. Like our protagonist, Charlie Brown, we will try one more time to kick the football being held by Lucy. She will pull it away once again, but Charlie keeps trying—for he would regret not trying again to see if Lucy will finally not pull the ball. The consulting project not taken because of the payment plan offered could be a source of painful regret and could influence our decisions regarding future project requests. We are likely to distort reality if it will shield us from regret. Our suspicion of advice offered by a possibly untrustworthy expert might thus be countered at times by the fear that someday we will regret not taking this advice.

Mechanisms to Overcome “Ignorance”

Daniel Kahneman and his colleagues (Kahneman, Sibony and Sunstein, 2021) propose that training people to become aware of their biases, heuristics and the potential for noise is possible but difficult. They note: “Decades of research have shown that professionals (experts) who have learned to avoid biases in their area of expertise often struggle to apply what they have learned to different fields”. For example, weather forecasters have learned to avoid over-confidence in predicting weather patterns—but are just as overconfident as anyone else on general knowledge questions.

The role of a coach is valuable in these circumstances to remind expert leaders that they are straying outside of their areas of expertise and that well-understood biases can creep up unknowingly. The power of this coaching is that it is “in-the-moment.” By contrast, a training program on critical thinking is remote and often inaccessible when a challenging judgment must be made. As Kahneman notes, “people often recognize biases more easily in others than they do in themselves”. Skilled coaches—whom Kahneman calls “decision observers”—can be particularly effective in this role.

Reducing Complexity

We reintroduce Lewis Goldberg—the gadfly who questioned many of the assessment practices being engaged by clinical psychologists. He was working alongside Daniel Kahneman and Amos Tversky at the Oregon Research Institute (ORI) in Eugene, Oregon. Michael Lewis (2017), the chronicler of these days at ORI, noted that Goldberg, along with his ORI colleagues, wanted to be able “to spot when and where human judgment is more likely to go wrong.” Studies were conducted on the way in which experts in several critical fields gathered information and made decisions. In most cases experts engaged in very complex processes; however, the success of experts in predicting specific outcomes was found to be no better than predictions made on the basis of very simple sources of data and analysis. Goldberg focused initially on clinical assessments. He found that “simple actuarial formulae typically can be constructed to perform at a level of validity no lower than that of the clinical expert” (Lewis, 2017, p. 171).

The work of Goldberg and his ORI colleagues led Daniel Kahneman (2011) to quote one of Goldberg’s mentors, Paul Meehl (whom Kahneman rates as “one of the most versatile psychologists of the twentieth century”). Meehl (along with Goldberg) proposed that one reason experts are almost always outperformed in predictive capabilities by simple algorithms is that they think they are quite capable of dealing with massive amounts of data and information – and they are almost always wrong. They know that they are very smart people – but they “try to be (too) clever, think outside the box and consider complex combinations of features in making predictions – Complexity (most often) reduces validity”.

The behavioral economists are making the case for standardized checklists, algorithms and simple rules to reduce complexity. Many studies (including those offered by Goldberg) have shown that judgements made by human decision-makers are inferior when compared to judgements based on relatively simple formulae, statistics and checklists. Simplicity beats complexity when assessing and making decisions about the success of complex scenarios such as mergers and acquisitions. Even when smart people are given the result provided by formulae, they tend to overrule this analysis. These smart people (and experts) ignore it because they feel that they have more knowledge and information than that which is produced by the formulae. Kahneman notes that “they are most often wrong”.
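
To make this contrast concrete, here is a minimal sketch of the kind of simple, fixed formula the behavioral economists have in mind: a unit-weighted checklist score applied the same way to every case. The predictor names, weights and cutoff are hypothetical illustrations rather than values taken from Goldberg’s or Meehl’s studies.

```python
# A minimal sketch in the spirit of the "simple actuarial formulae" discussed above.
# The predictor names, equal weights, and cutoff are hypothetical illustrations;
# they are not taken from Goldberg's or Meehl's actual studies.

PREDICTORS = ["test_score", "prior_history", "interview_rating"]  # each coded 0 or 1

def actuarial_score(case: dict) -> int:
    """Unit-weighted sum: add up the predictors, with no clever combinations or exceptions."""
    return sum(case.get(p, 0) for p in PREDICTORS)

def predict(case: dict, cutoff: int = 2) -> str:
    """Apply one fixed cutoff to every case, the same way every time."""
    return "high risk" if actuarial_score(case) >= cutoff else "low risk"

# Example case described by three simple 0/1 indicators.
print(predict({"test_score": 1, "prior_history": 1, "interview_rating": 0}))  # -> high risk
```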

Standardized approaches, simple algorithms and checklists can be very powerful tools. Atul Gawande (2013), a general surgeon in Boston and assistant professor at Harvard Medical School, defines the power of checklists in this way:

We (humans) have accumulated stupendous know-how. We have put it in the hands of some of the most highly skilled and hardworking people in our society. And with it they have accomplished extraordinary things. Nonetheless, that know-how is often unmanageable. Avoidable failures are common and persistent, not to mention demoralizing and frustrating across many fields – from finance, business to government.

He goes on to suggest why checklists work:

. . . the reason is increasingly evident: the volume and complexity of what we know has exceeded our individual ability to deliver its benefits correctly, safely and reliably. Knowledge has both saved us and burdened us” … but there is such a strategy (to solve this problem) – though it is almost ridiculous in its simplicity, maybe even crazy to those who have spent years carefully developing ever more advanced skills and technologies (and indeed is resisted in many companies for this reason). It is a checklist!

Kahneman puts forward his own personal story regarding the quality of his judgment and predictive capabilities (or lack thereof) when serving as a young military psychologist in Israel. He was charged with assessing the leadership capabilities of aspiring officers. Kahneman admits that he was initially dismal at this task. He also highlights examples of the poor capabilities of highly trained counselors in predicting the success of college freshmen based on several aptitude tests and other extensive data. Kahneman compares these results with the predictive accuracy found in the use of a simple statistical algorithm that makes use of a fraction of the information available. The algorithm was a far more successful predictor than were the trained counselors.

Kahneman continues to reference cases of experienced medical doctors predicting the longevity of cancer patients, predicting the susceptibility of babies to sudden death syndrome, and predicting new business success and evaluations of credit risk. His examples range all the way to marital stability and the ability to predict the future value of fine Bordeaux wines. In all these cases, the accuracy of highly trained experts was most often exceeded by simple algorithms—much to the consternation, occasional anger and derision of the experts concerned.

Jonah Lehrer (2009) similarly referenced studies conducted at MIT in which students given access to large amounts of data performed poorly in predicting stock prices when compared with a control group of students with access to far less information. He notes that the prefrontal cortex of the brain has great difficulty NOT paying attention to large amounts of information which can overwhelm the ability of the brain to estimate and predict. Lehrer concludes that access to excessive quantities of information can have “diminishing returns” when conducting assessments and predicting future outcomes.

Lehrer observes that corporations, in particular, often fall into the “excessive information” trap. Leaders of these organizations tend to invest huge amounts of resources in collecting data. This data, in turn, can overwhelm and confuse the human brain—just the opposite of the intended outcome of informed decision-making. Lehrer describes the challenging situation faced by medical doctors who were diagnosing back pain several decades ago. With the introduction of MRI in the 1980s, and with far greater detail available, medical practitioners hoped that increasingly better predictions of the sources of back pain would be made. The converse happened.

Massive amounts of detail produced by the MRI actually worsened their assessment and predictive capabilities. Poorer assessments were made. Goldberg and his colleagues at ORI were further vindicated. Kahneman refers to scenarios that contain a high level of complexity, uncertainty and unpredictability as “low-validity environments”. Experts can become overwhelmed by complexity when engaged in decision making. Leadership coaches can assist greatly by developing checklists or other simple decision support tools that help to limit biases and confusion arising from data overload.

The power of something as simple as a checklist has been shown by Kahneman to have “saved hundreds of thousands of infants”. He offers the example of assessing the health status of newborn infants. We can go back a few decades. Obstetricians had always known that an infant who is not breathing normally within a few minutes of birth is at high risk for brain damage or death. Physicians and midwives through the 1950s typically used their varying levels of medical judgment to determine whether a baby was in distress. Different practitioners relied on their own experience, focusing on particular signs and symptoms to determine the level and extent of this distress. Looking at different symptoms meant that danger signs were often overlooked or missed. Many newborn babies died.

When Virginia Apgar, an American obstetrical anesthesiologist, was asked somewhat casually by a student how to make a systematic assessment of a newborn, she responded “that’s easy.” Apgar jotted down five variables (heart rate, respiration, reflex, muscle tone and color) and three scores (0, 1 or 2, depending on the robustness of each variable). Apgar herself began to use this rating scale in her own work, applying the assessment about sixty seconds after birth to all infants she handled. A baby with a score of eight or greater was likely to be in excellent condition. A baby with a score of four or less was in trouble and needed immediate attention. What is now called the “Apgar Test” is used in delivery rooms every day. Apgar is credited with saving thousands of infant lives.
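
As a simple illustration of how such a checklist turns clinical judgment into a fixed rule, here is a minimal sketch of Apgar-style scoring. The five variables, the 0-2 ratings and the two thresholds come from the account above; the field names and interpretation labels are our own shorthand.

```python
# A minimal sketch of the Apgar-style checklist described above: five variables,
# each rated 0, 1 or 2, summed, and compared against the thresholds the essay mentions.

APGAR_VARIABLES = ["heart_rate", "respiration", "reflex", "muscle_tone", "color"]

def apgar_score(ratings: dict) -> int:
    """Sum the 0/1/2 rating assigned to each of the five variables."""
    return sum(ratings[v] for v in APGAR_VARIABLES)

def interpret(score: int) -> str:
    if score >= 8:
        return "likely in excellent condition"
    if score <= 4:
        return "in trouble - needs immediate attention"
    # Scores of 5-7 are not characterized in the essay; we label them as intermediate here.
    return "intermediate - continue monitoring"

newborn = {"heart_rate": 2, "respiration": 1, "reflex": 2, "muscle_tone": 2, "color": 1}
score = apgar_score(newborn)
print(score, interpret(score))  # -> 8 likely in excellent condition
```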

We offer another example from the medical field. A report on CNN.com (Hudson, 2014) indicated that about one in twenty-five patients who seek treatment in US hospitals will contract an infection while at the hospital. Patients acquired some 721,800 infections in 2011. This statistic, however, is significantly better than in previous years. The rate dropped about 44% from 2008 to 2012. This result came from “requiring hospitals to follow a simple checklist of best practices”. It seems that simple checklists focused on complex situations work!

Resistance to assessment, prediction and tracking methods

Given the efficacy of simple checklists, why aren’t they universally deployed? Kahneman writes in detail about the level of resistance, even hostility, that he and other researchers have met when presenting the results of research on this topic. Lewis Goldberg would concur. He was not welcomed by other psychologists—especially those engaged in highly complex (and highly paid) clinical assessments. From medical professionals to psychologists and wine producers, experts either rejected or ignored the results—and in some cases responded with derision.

Perhaps this is predictable, because these results challenge the assessment and predictive capabilities of these same experts who have developed their skills over many years and have rightly developed high opinions of their capabilities. Kahneman quotes Gawande (2010) who offers The Checklist Manifesto:

We don’t like checklists. They can be painstaking. They’re not much fun. But I don’t think the issue (people resistance) here is mere laziness. There’s something deeper, more visceral going on when people walk away, not only from saving lives, but from making money. It somehow feels beneath us to use a checklist, it’s an embarrassment. It runs counter to deeply held beliefs about how the truly great among us – those heroes we aspire to be – handle situations of high stakes and complexity. The truly great are daring. They improvise. They do not need protocols and checklists. Maybe our idea of heroism needs updating.

We agree that this negative sentiment is prevalent—even today. Both of us have experienced this kind of response, verging on disdain. We often find pushback when developing various checklists related to organizations that are undergoing transformation and change. Somehow a checklist, algorithm or computation trivializes the personal sense of expertise held by those offering much more complex analyses. The simple analyses make them feel less expert. Trusted leadership coaches can greatly help to overcome these kinds of fears and resistance. These coaches might even work with the “experts” themselves—helping them readjust to the new realities. The experts can become what they are advocating for their clients. They can become more “agile.”

From Simplicity to Understanding

As Lewis Goldberg and his colleagues have noted, the best tools to use in the engagement of prediction might be quite simple. A checklist can provide us with guidance in sorting through the multitude of options available to us in predicting the course of action regarding a mental health or medical issue—or the long-term trends of the stock market. These simple tools offer a partial answer to the challenge of Multiplicity. They can help us select one specific course of action.

However, at this point, we wish to move beyond this behavioral economic critique of the traditional expert—for the experts still have much to offer. We believe that a key element in introducing assessments and checklists is missed in Kahneman’s dialogue. These tools should be developed – as best as possible – together with the experts who will ultimately use them. This is a basic “behavioral change” principle, designed to overcome the “not invented here syndrome”. This principle has helped us introduce checklists into organizational change initiatives where many executives feel they “know it all.” Furthermore, William Perry, the psychologist who introduced us to Dualism and Multiplicity, suggests that the behavioral economist prescription of simplicity is not sufficient. If we move out of Multiplicity based on the outcomes of a checklist, then in many ways we are moving back to Dualism. We now have the “right” (or at least the “best”) answer.

As we have repeatedly emphasized in this essay, it is important to acknowledge our biases and the specific frame of reference we are engaging when approaching any issue and determining the best course of action for its resolution. We have just identified one set of biases—the need for complex analyses when making predictions. There are many other biases of which we must become aware. Perry proposes that we should enter a third stage of cognitive development. He calls this stage Relativism. In essence, we are invited to recognize that there are many different ways in which to view any problem (the Multiplistic challenge). However, these diverse perspectives tend to cluster into specific, unified frames of reference. Our job is to recognize and appreciate the valid and useful perspectives offered by each of these frames.

We embrace a Relativistic stance when acknowledging these valid frames. With this stance, we can move toward a fuller understanding of the issue we are addressing. It is fine to predict the outcome of a specific therapeutic treatment when working with a specific mental illness. It is even better (and ultimately of greatest value) to gain some understanding of WHY this treatment is effective with this illness. We need to not only predict what will happen with the stock market during the coming six months, but also gain greater appreciation for the factors that influence this market.

Experts might not be able to make better predictions than an algorithm, but they might be able to offer an insightful analysis of what is happening with a dynamically unfolding issue. The analysis becomes particularly rich if the experts come from different “camps” and offer diverse perspectives and insights. From a Relativistic stage of development, multiple perspectives are welcomed—not feared or dismissed as is the case with a Dualistic stance, nor simply taken in (and often arbitrarily accepted or dismissed) as is the case with a Multiplistic stance.

From Understanding to Action

This Relativistic stance is all well and good. We can relish the understanding we have gained from the multiple perspectives being offered. However, we have to make a decision and take action on this decision. What treatment do we engage with this patient? What should our policy be about prescribing specific drugs? What investments do we make? The behavioral economists are correct in offering advice about simple decision-making tools that can lead us to action. We might not be able to identify all of our biases or fully recognize the validity and usefulness inherent in any of the perspectives being offered—however, we still need to make a decision. William Perry identifies this fourth stage of cognitive development as Commitment in the Midst of Relativism. We need to do something—otherwise we are sitting on the sidelines with our acquired understanding of what is happening out on the field of action.

It is not enough, however, to revert to the checklists and simple analyses. We must somehow retain our appreciation of alternative perspectives, while also choosing the best one to engage for this specific situation as it appears at this point in time and in this setting. Our decision might change the next time we encounter this issue. We might deploy a different perspective—and even listen to a different expert. None of this is easy. Perry admits that Commitment in the Midst of Relativism is quite challenging.

Even more generally, Perry notes that the movement through each of the four cognitive stages is difficult. In each case, there is a loss of innocence. We are kicked out of one “Eden of Ignorance” and face the challenge of seeing the world in an increasingly nuanced manner. Perry suggests that a grieving process attends each shift in stages and that multiple shifts occur depending on the issue being addressed. We might be Relativists, or even make a strong Commitment in the Midst of Relativism, when addressing an issue of public policy (such as the funding of charter schools)—but remain adamant Dualists when it comes to the upbringing of our own children.

The challenge of making commitments in the midst of relativism opens the door for new tools of analysis (simple or complex). It also leaves open the possibility that complexity can be modeled in new ways (such as through the use of system dynamics and agent-based modeling) that preserve the complexity while leading to clear and ultimately simple solutions. While these more sophisticated tools of analysis might not be available to all of us – or we may choose not to go the way of computer-based technologies—there are tools and processes that can readily be engaged as we seek to take the major step from ignorance to action. We are about to introduce one of these processes: MINDSPACE.

A Potential Antidote for Ignorance

As we have done in previous essays, we offer one specific process that might be of value in helping us address the challenge of moving from ignorance to action. We describe a process called MINDSPACE. This is an acronym for nine psychological (and largely unconscious) mechanisms or “nudges” that can influence our behavior. The nine are: Messenger, Incentives, Norms, Defaults, Salience, Priming, Affect, Commitment and Ego.

This framework is used primarily by governments in the crafting of public policy aimed at influencing the behavior of citizens – for example, smoking cessation, healthcare adoption, exercising more, making more effective retirement decisions and so on. However, we think there is a big opportunity for leaders to engage this process when making decisions and planning for implementation. Leadership coaches can also apply these techniques to help their clients become more effective in moving from ignorance to action.
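
Before walking through each element, it may help to picture how a change leader or coach might use the framework as a simple review checklist. Here is a minimal sketch; the one-line prompts are our own paraphrases of each element (not the wording of Dolan et al., 2012), and the draft plan is hypothetical.

```python
# A minimal sketch of MINDSPACE used as a planning checklist for a change initiative.
# The one-line prompts are our own paraphrases, not the wording of Dolan et al. (2012).

MINDSPACE = {
    "Messenger":  "Who delivers the message, and is that person liked and trusted by this audience?",
    "Incentives": "Are gains and (especially) potential losses framed deliberately?",
    "Norms":      "What do people see their peers doing, and is that made visible?",
    "Defaults":   "What happens if people do nothing; is the default the desired behavior?",
    "Salience":   "Is the information novel, accessible and simple enough to be noticed?",
    "Priming":    "What cues in the environment are nudging behavior before any decision is made?",
    "Affect":     "What emotion does the message evoke, and does that emotion help or hinder?",
    "Commitment": "Are people asked to commit publicly to the new behavior?",
    "Ego":        "Does the change support, or threaten, people's positive self-image?",
}

def unaddressed_elements(plan_notes):
    """Return the MINDSPACE elements that a draft change plan has not yet addressed."""
    return [element for element in MINDSPACE if element not in plan_notes]

# Hypothetical usage: a communications plan that so far covers only two of the nine elements.
draft_plan = {"Messenger": "CEO video message", "Salience": "one-page summary"}
for gap in unaddressed_elements(draft_plan):
    print(f"{gap}: {MINDSPACE[gap]}")
```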

Here is a summary of each of the nine elements:

Messenger (and the message)

When we are trying to influence employees’ behavior during a period of change and transition, the source of information – the messenger – is both critical and complex. This is probably the case more often than most change leaders realize. For example, studies conducted and described by the social psychologist Elliot Aronson show that – largely unconsciously – we tend to believe and trust information from people we like, irrespective of their level of expertise. Likeability (or lack thereof) is a big influencing lever. However, when the topic is complex—as in the case of healthcare choices, technical issues or retirement finances—people are influenced more by messages delivered by those considered experts.

Paradoxically, we are less likely to listen to or believe an expert if we don’t like them (again, most of us are not aware of this influence and do not admit to it). Apparently, fickleness extends even further: people are also more influenced by a message from an attractive person, even when the message has nothing to do with attractiveness. This is despite people saying that they would never be influenced by something as superficial as a person’s good looks. Unfortunately, that is not the case: we are influenced by good looks.

To make things even more complex, research findings suggest that we are more likely to be influenced when a leader uses an emotional appeal—especially if there is an element of fear in the message. We are more likely to turn away from a factual appeal—despite the fact that the basic appeal is exactly the same whether it is delivered emotionally or in a more measured way. We “sign up for healthcare” based on an emotional appeal (“you could become ill or injured—then what do you do!”). Emotions win the day over facts. Regions of our brain that govern emotions (such as the amygdala) take over from those regions (such as the prefrontal cortex) that govern reasoning.

If you are involved in organizational change management, you have undoubtedly been immersed in the communications aspects of developing a change plan. Change leaders tend to build out communications spreadsheets with stakeholder audiences, messages, media, timing and the like. However, they seldom include psychological considerations. Change leaders and coaches could add a great deal of leverage and influence to their communications by considering some of these psychological nudges.

Incentives

Incentives and rewards of various kinds are common in change and transition projects. However, few of the coaches and leaders we have worked with understand the psychological dynamics at play with the incentives they use (or should use). Research reported in the Journal of Economic Psychology (Dolan et al., 2012) offers many behavioral insights with regard to influencing behavior. We have already noted one of these insights: people strive to avoid losses more than they strive for gains. This is a basic human behavioral trait rooted in evolutionary loss-aversion.

Many companies we have worked with invest large amounts in providing incentives, rewards and recognition—with little or no benefit generated for their business goals. Indeed, providing material or monetary rewards and incentives is fraught with problems and can often be counterproductive. For example, in one organization with which one of us consulted, an analysis was conducted across all large business groups regarding the amount of money invested in recognition and rewards for employees.

I compared [these results] with the surveyed levels of engagement and satisfaction among employees in these groups. This analysis showed that the business function giving the greatest amount of incentive cash had the lowest levels of employee satisfaction.

Change leaders should consider framing “incentives” as charges that will be imposed if the change is not successful—however contradictory this may be to our current and historical thinking. For example, employees could be paid bonuses that go into a personal account as progress is made, with the understanding that these monies will all be withdrawn if the overall goal is not achieved – people hate to lose something that they have already received (and we know how difficult this thinking is for most of us!).
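
As a minimal sketch of how such a loss-framed arrangement might work in practice (the milestone names and amounts below are hypothetical, not drawn from any client engagement):

```python
# A minimal sketch of the loss-framed bonus account described above.
# The milestone names and amounts are hypothetical.

class LossFramedBonusAccount:
    """Bonuses accrue as milestones are met, but are forfeited if the overall goal fails."""

    def __init__(self):
        self.balance = 0.0

    def credit_milestone(self, name, amount):
        self.balance += amount
        print(f"Milestone '{name}' reached: {amount:.2f} credited (balance {self.balance:.2f})")

    def settle(self, overall_goal_met):
        """Pay out the accrued balance only if the overall change goal was achieved."""
        payout = self.balance if overall_goal_met else 0.0
        self.balance = 0.0
        return payout

account = LossFramedBonusAccount()
account.credit_milestone("pilot rollout complete", 500.0)
account.credit_milestone("adoption above 60 percent", 750.0)
print("Final payout:", account.settle(overall_goal_met=False))  # the accrued 1250.00 is lost
```

The point of the sketch is simply that money already sitting in a visible account engages loss-aversion in a way that a promised future bonus does not.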

Norms

Cultural norms are the behavioral expectations or rules to be found (and enforced) in a society (or company). Usually these norms are implicit, and they are rarely used in explicit ways to drive change. The way that people think and act at work is (obviously) critical in terms of executing strategy. However, strategy is a dynamic process and must change and morph as market dynamics change. This is the reason why organizational values, which tend to be static, can be problematic and become obstacles for companies needing to change. We identified this issue earlier when considering the sea anchors and ground anchors associated with the process of priming. Continuity must be balanced with agility and the capacity for change. Norms provide stability, but they can also create roadblocks.

The problem is that most organizations do not explicitly think about and manage behavioral “norms” to align with their changing business strategies. This virtually guarantees that cultural norms fall out of alignment with ever-changing business strategies. Organizations that perform well are very explicit about cultural norms of behavior. Most importantly, this level of clarity must be found among senior leaders in the organization. They must be highly visible in role-modeling these behaviors. Simply talking about how people SHOULD behave is not sufficient.

Defaults

Defaults refer to an option that is automatically engaged when people are required to make a critical decision—but remain indecisive. The notion of defaults is increasingly common in behavior-change programs. For example, the leaders of an organization would like their employees to select a 401k investment or a health plan during open enrollment. If employees take no action, then a 401k or health plan is enacted – by default. Most of us are probably aware of the frequent use of default choice architectures over the past few years. “We would be pleased to make your life a little easier by enrolling you in this easy-payment plan. You don’t need to do anything. We will do all the work for you. Just sit back, relax and enjoy all of the benefits associated with this plan.” “We will resubscribe you to this TV sports package unless you tell us otherwise.”
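
The logic of a default is simple enough to express in a few lines. Here is a minimal sketch of the open-enrollment example (the plan names are hypothetical):

```python
# A minimal sketch of a default option: taking no action still results in enrollment.
# The plan names are hypothetical.

DEFAULT_PLAN = "standard retirement plan"

def resolve_enrollment(employee_selection=None):
    """Return the employee's explicit choice, or the default plan if no action was taken."""
    return employee_selection if employee_selection else DEFAULT_PLAN

print(resolve_enrollment("high-growth fund"))  # an explicit choice overrides the default
print(resolve_enrollment())                    # no action -> "standard retirement plan"
```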

Closely related to “default options” in decision-making is the concept of “inevitability” regarding choice and change. Elliot Aronson describes this psychological bias as it relates to earthquake preparedness. It also relates to how people respond to information about election outcomes as well as racial integration. Earthquakes, corrupt elections and racial discrimination are always with us—we must just get used to their occurrence. In many ways this assumption of inevitability aligns with the status quo heuristic we identified earlier in this essay: “this is simply the way our world works. Can’t do anything about it, so why try.”

This assumption of inevitability need not be negative. It can encourage acceptance of change rather than serve as a barrier to change. In the context of planned organizational change, there is powerful leverage when change leaders construct communications in a way that provides the default message. Change is inevitable. It is certain. How this change is managed and handled may involve employee engagement and innovation—but whether or not it will occur is not up for debate. People respond very differently simply based on how the message is presented – and most are entirely unaware of this influence.

Salience

As we have noted throughout this essay, we are bombarded in the mid-21st Century with so much information that it is impossible for our brains to process more than a small fraction of what is impinging on us. Our brains (largely unconsciously) filter what we pay attention to. Dolan and associates (2012) describe a number of factors that influence which information we attend to amid this barrage:
* Novelty: information is presented in a new and surprising manner.
* Accessibility: information is available at a point of purchase or when a related matter comes to our attention.
* Simplicity: information is presented in an easily understandable way. Simplicity is particularly important because our attention moves more rapidly to information that we understand. We tend to automatically screen out complexity.

Change leaders are often under pressure to put out information on tight timelines. They often do not give adequate thought to the issue of salience for their target audiences. Coaches and consultants can provide significant benefit when they educate clients about salience and make them aware of how it shapes attention.

Affect

Psychologists write about “affect intensity” when addressing the experience of emotion. Those of us involved in organizational change tend to pay little attention to emotions, but affect is a powerful driver in decision-making for all of us, whether we are aware of it or not. As we have noted, Daniel Kahneman identifies the brain’s System 1 (fast, automatic) and System 2 (slow, deliberative). These systems determine how we react to stimuli and make decisions. Emotions operate against System 2’s careful consideration. Affect intensity is a greater driver of our decision-making than most of us realize.

Research shows that simply placing an attractive female model in an advertisement for a financial loan increased demand for the loan as much as reducing the interest rate by 25% (Dolan et al., 2012). Few of us would probably admit to being influenced by this model. There is little logic in why a sports star on a cereal box dramatically improves sales when a nutritionist could instead appear on the box offering a sound explanation of the cereal’s benefits. We are emotional animals and are primarily influenced by affect intensity—even though many of us would deny being influenced by “trivial” factors such as beauty and athletic accomplishments.

As leadership coaches and change consultants, we should think more deeply about the behaviors we are trying to shift—and the emotions associated with each particular behavior. Companies, particularly those with an engineering focus, tend to communicate with a bias towards logic, data and detail, missing opportunities to harness emotional messages that motivate more effective behaviors.

Commitment

Most of us struggle at some point in our lives with sticking to goals such as exercising more diligently, losing weight, ceasing to smoke, or drinking less. Making commitments, especially if these commitments are public, is a powerful mechanism to help us stick to these goals. This is the influence that emerges from our social or business culture and the need to be accepted by those in our work or social milieu. We are not just emotional animals. We are also social animals. Acceptance and respect matter a great deal.

One of us [KW] served as a consultant on workplace safety to a major energy corporation. Leaders of this organization embraced some remarkably effective commitment mechanisms. They were expected to post their commitments to safety on written and framed placards outside their offices. Furthermore, this organization’s “behavior-based safety” process required any employee who witnessed an unsafe practice to verbally and directly request that the individual conform to the safety practice—and to ask for a commitment to do so. This proved remarkably effective.

The other author [WB] worked with a university that primarily served a poor urban Black community. The culture of this community was saturated with a shared commitment of all its members to the welfare of one another. This commitment, in turn, was founded in the long-standing racial injustice imposed, and the frequent violent action taken, by the white population of this city. Though the city is located in a northern state, it served as a regional headquarters for the Ku Klux Klan. Following is a description of the commitment strategy introduced at this university:

Building on this culture, I encouraged members of this university to create a charter that contained statements regarding the mission, vision, values and purposes of their university. Once created, this Charter was signed each year by all members of the organization—beginning with the university president and members of the school’s board of trustees. At an annual Charter dinner, each member of the university (including the faculty) came up to sign the Charter. Furthermore, all new members of the university participated in a six-month-long orientation program (that focused on components of the Charter). Having completed the orientation program, they signed the Charter.

The commitments being made by members of this university were real—and became that much more important with the public signing of the Charter each year by all of the stakeholders….

When people are encouraged to make public commitments to take action, they are much more likely to follow through. We see this operating in substance-abuse programs such as Alcoholics Anonymous, as well as in marriage ceremonies, many marriage-enrichment programs, and conflict-management processes. It is one thing to mutter a commitment to ourselves—as we do on an occasional New Year’s Eve or after eating a very full meal or imbibing one too many cocktails. It is quite another matter to promise something when other people are there to witness what we have said and remain there to remind us of what we should be doing if we stray from our commitment.

Ego balance

One of the primary motivators that we humans have is the desire to maintain and enhance a positive self-image. As emotional and social animals, we routinely (and largely unconsciously) compare ourselves to those in our work or social milieu. We make these comparisons in order to judge which behaviors are acceptable, and what builds our self-image. We also want to discover what is not acceptable and what weakens our self-image. When we act in a way that contradicts or diminishes our self-image, cognitive dissonance occurs (Aronson, 2018).

This is the same dissonance that leads us to remain ignorant and susceptible to misinformation. As we have already noted, this dissonance can produce high levels of anxiety and discomfort. When the dissonance concerns our sense of self, we struggle intensely to regain balance between our actions and our self-image. What is fascinating (and disturbing) about this process is that we are more likely to change our beliefs than our behaviors in order to maintain a positive self-image. We regain ego balance by changing what we think rather than how we act.

We offer an example. One of us [KW] worked with a senior executive who supported neither him nor the customer service program he was directing. Here is what happened to change this executive’s attitudes (thoughts):

The senior executive was extremely antagonistic towards my strategy to improve customer service, but begrudgingly allowed it to proceed because my sponsor was more senior. While this executive was on vacation, he was invited to speak at an international conference in Johannesburg, South Africa on the topic of service quality. Given his absence, I was asked to develop his presentation. On his return, he had little opportunity to make changes to the presentation. He essentially presented my customer service strategy as-is. It was very well received by the audience and rated as one of the best presentations at the conference. After this event, he became a big advocate of my program. Why did this “flip-flop” occur?

The level of dissonance that he experienced must have been considerable. He had delivered a very successful presentation; therefore, continuing to be critical of the person who prepared it would have contradicted his self-image. He would have been viewed as inconsistent and insincere—if not by others, then certainly by himself. So, he changed his beliefs and began talking positively about the customer service program. What was abundantly evident was that this individual was unaware of this psychological change within himself.

While MINDSPACE does not provide all of the answers for the challenges we face in confronting our own ignorance and selecting our own courses of action, it does offer some valuable ideas regarding how psychological and (in particular) behavioral economic principles can be applied in a manner that acknowledges the emotional and social nature of the human condition (System 1), while encouraging and often helping to introduce and maintain the thoughtful use of valid information (System 2). As we have illustrated in offering our own stories as members of and consultants to many different kinds of organizations, the MINDSPACE strategies can make a difference.

Conclusions

We have focused in this essay on the interplay between expertise and ignorance. It appears that being smart is more about understanding the body of knowledge that exists and being aware that there is much more to know. In this essay we have only touched the surface of how we humans are unknowingly influenced by a myriad of factors beyond our awareness. Beginning to understand these factors makes us smarter and reduces over-confidence, ignorance and poor decision-making. Given the resistance to these techniques, leadership coaches and consultants are in a position to nudge their clients toward applying these tools for better awareness, understanding and decision-making.
_____________

References

Adorno, Theodor (2020) Aspects of the New Right-Wing Extremism. Medford, MA: Polity Books.

Adorno, Theodor; Frenkel-Brunswik, Else; Levinson, Daniel J. and Sanford, R. Nevitt (1950) The Authoritarian Personality. New York: Harper.

Ariely, Dan (2008) Predictably Irrational. New York: Harper.

Aronson, Elliot (2018). The Social Animal. New York: Worth Press.

Brown, Roger (1986) Social Psychology (2nd Ed.) New York: The Free Press.

Dolan, Paul, Michael Hallsworth, David Halpern, Dominic King, Robert Metcalfe and Ivo Vlaev (2012) “Influencing Behaviour: The MINDSPACE Way.” Journal of Economic Psychology.

DuMez, Kristin Kobes (2021) Jesus and John Wayne: How White Evangelicals Corrupted a Faith and Fractured a Nation. New York: Liveright Publishing.

Dunning, David (2012) Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself. New York: Psychology Press.

Firestein, Stuart (2012) Ignorance: How It Drives Science. New York: Oxford U. Press.

Gawande, Atul (2010) The Checklist Manifesto. New York: Metropolitan Books.

Goldberg, Lewis R. (1970) “Man versus Model of Man: A Rationale, Plus Some Evidence for a Method of Improving on Clinical Inferences,” Psychological Bulletin, v. 73, no. 6, pp. 422-432.

Hudson (2014) CNN.com

Kahneman, Daniel (2011) Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

Kahneman, Daniel, Olivier Sibony and Cass R. Sunstein (2021) Noise: A Flaw in Human Judgment. New York: Little, Brown and Company.

Lehrer, Jonah (2009) How We Decide. Boston: Houghton-Mifflin, Harcourt.

Lewis, Michael (2017) The Undoing Project. New York: Norton.

McRaney, David (2012) You Are Not So Smart. New York: Gotham.

Nichols, Tom (2018) The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters. New York: Oxford U. Press.

Perry, William (1970) Forms of Intellectual and Ethical Development in the College Years: A Scheme. New York: Holt, Rinehart and Winston.

Priest, Henry. Biases and Heuristics: The Complete Collection of Cognitive Biases and Heuristics That Impair Decisions in Banking, Finance and Everything Else (The Psychology of Economic Decisions Book 7). Cognitt Consulting. Kindle Edition.

Sapolsky, Robert (2017) Behave: The Biology of Humans at Our Best and Worst. New York: Penguin.

Thaler, Richard and Cass Sunstein (2008) Nudge. New Haven, CT: Yale University Press.

Veldman, Robin (2019) The Gospel of Climate Skepticism: Why Evangelical Christians Oppose Action on Climate Change. Berkeley CA: University of California Press.
