Kevin Weitz, Psy.D. and William Bergquist, Ph.D.
[Note: the content of this essay has been included in a recently published paperback book called The Crises of Expertise and Belief.]
Every day, you are subjected to hundreds of people trying to sell you a product, a trip, or an idea; persuade you they’re right; convince you to vote their way; make you angry about some injustice; make you afraid of some outside danger; or seduce or charm you into doing what they want. (Aronson, 2018).
Skillful liars are dangerous people. (Danesi, 2020).
As in previous essays, it is important to note that this essay is apolitical – the authors have no political agenda. Our reference to specific individuals is purely for purposes of providing practical examples to make the concepts more tangible. We understand that this is a delicate balancing act.
As we have presented in the essay on conspiracy theories, thousands of people believe bizarre, concocted stories that profoundly influence their behaviors – to the point of justifying violence. Often, as in the conspiracy theories concerning vaccines, fluoride in water, or the notion of liberal pedophiles operating a child sex ring in the basement of a pizza parlor, the belief in these fabrications can place the believers at risk, as well as those innocently caught up in the fabrication or conspiracy.
In this essay we focus on the psychology of how influential people use language to propagate misinformation and lies for their own benefit, and on how it is possible that so many people actually believe them. One could argue that little “white” lies are mostly benign – consider the (sort of) half-humorous example of a spouse asking “how do I look in these new jeans?” – the answer is always “you look great”!
But when lying is used to manipulate and deceive to the detriment of others – even entire societies – lies and misinformation take on a Machiavellian purpose and outcome. The adjective “Machiavellian” is “typically used in reference to ruthless liars, deceivers, scammers, and swindlers” (Danesi, 2020). As in the fostering of conspiracy theories, intentional lying can undermine truth and destroy trust in experts and leaders attempting to provide honest information and advice in order to guide decision-making in society at large.
The Truth About Lying I: The Individual Perspective
Some years ago, while engaged in a merger and transformation project in Johannesburg, South Africa, one of us [KW] began the project by interviewing all senior executives in the four companies being merged into a single organization. Not all of these company leaders were happy about the merger. Clearly, there would be only one new CEO, who would likely select his or her own senior leadership team. The competition was intense.
During this series of interviews, I was struck by one senior leader who gave vivid descriptions of attempts by other leaders to undermine and destroy his business and attack him personally. This particular interview was early in the interviewing process, and I was able to test this individual’s concerns in later interviews. It became evident that this individual was not paranoid, but rather cunning and manipulative, attempting to gain favor and influence my perspective in later interviews. The degree to which this person was initially charming and convincing and seemingly attempting to be “my new best friend” was quite scary in retrospect. Leadership coaches and consultants need to be on the lookout for (often subtle) signs of these Machiavellian personalities.
Why some people lie and why others believe
As Danesi notes, “skillful liars are dangerous people” who use lies, deception, and the sowing of subtle doubt as manipulative tools for their own personal benefit and self-protection. The problem arises when belief in these lies by “in-groups” places the in-group members themselves in danger. Two current (at the time of writing) examples include misinformation about diseases such as Covid and vaccines, as well as Russia’s invasion of Ukraine.
But there are two sides to this story – some people lie, and others believe the lie. The seemingly strange contradiction is that research shows people believe misinformation and lies not simply because they are gullible or stupid, but because it makes them feel good. Believing and rallying around certain lies, misinformation or conspiracy theories makes people feel good about themselves as part of their affiliation with an in-group. “People believe stories that reinforce or reward the way that they see the world. They share stories that boost their ego or make them feel like part of a team” (Young, 2022). We tend to believe lies or misinformation because we want to believe them. It takes mental effort, critical thinking skills and self-awareness to step back from misinformation that conforms to our belief systems, conduct a brief fact-check, and question its validity.
This is difficult when the “truth” contradicts who we are and how our in-group collectively thinks. The dissonance of holding two opposing beliefs creates psychological stress, and it is mentally easier – and feels better – simply to believe what makes us feel good. For example, if I am part of an in-group that collectively believes Covid-19 vaccines are a government strategy (possibly orchestrated by Bill Gates) to control and manipulate citizens, and we are later confronted by compelling evidence that unvaccinated people are dying at a much faster rate than others, cognitive dissonance creates the mental anxiety of maintaining two conflicting cognitions simultaneously – I need to believe one or the other to have some peace of mind. However, if I dismiss my in-group’s views, I am likely to be sidelined by that in-group, whether it consists of my friends, my family or other people of whom I feel strongly a part. I will likely side with my in-group!
This very powerful and important imperative to resolve cognitive dissonance has its roots in the thinking and research of Kurt Lewin, a noted social psychologist displaced from Nazi Germany, who received a contract from the US government during World War II to find ways to encourage American women to reduce the number of meals at which they served meat (the meat being reserved instead for the men fighting the war). Lewin asked women to prepare brief presentations regarding the benefits (and patriotic duty) associated with serving meatless meals. Lewin was not interested in the effect of these presentations on other women. He wanted to find out whether the presenting women’s own attitudes about meatless meals would change. He found that this did indeed take place.
Dan Ariely offers a similar example. As in the case of Lewin’s women, Ariely (p. 80) found that “when a speaker is asked to prepare a short lecture about the benefits of a certain drug, the speaker would begin to believe his own words.” Ariely (p. 80) points to the work done by Lewin and many others: “Psychological studies show that we quickly and easily start believing whatever comes out of our own mouths, even when the original reason for expressing the opinion is no longer relevant . . . this is cognitive dissonance at play.”
In the case of both Lewin’s and Ariely’s presenters, a dissonant state is created. On the one hand, the presenters want to believe that they are honorable people who never (or rarely) lie. On the other hand, here they are telling other people to use a specific drug or to serve meatless meals even when they might not believe what they are espousing. Something has to change, and it is much easier to believe in what is being espoused than to change a fundamental view of themselves.
Ariely (2012, p. 27) expands on these initial examples of cognitive dissonance. He proposes that there are two fundamental motives that are in conflict:
What is going on here? . . . In a nutshell, the central thesis is that our behavior is driven by two opposing motivations. On one hand, we want to view ourselves as honest, honorable people. We want to be able to look at ourselves in the mirror and feel good about ourselves (psychologists call this ego motivation). On the other hand, we want to benefit from cheating and get as much money as possible (this is the standard financial motivation). Clearly these two motivations are in conflict. How can we secure the benefits of cheating and at the same time still view ourselves as honest, wonderful people?
When referring to “cheating,” Ariely is not just talking about fudging on financial reports or making the case for the food being served or the drug being prescribed. He is writing more generally about the ways in which we alter our reality in order to retain a positive image of ourselves. This is where our focus in this essay on lying extends into the realm of politics and into a deeper understanding of the crisis of expertise.
We not only pretend to be experts ourselves but also want to believe, uncritically, in an “expert” who is aligned with our own belief system. It is important, in other words, that we retain our personal sense of being “smart” (we can be our own experts) and “discerning” (we chose the right people to be our “experts”). Furthermore, we are honorable and would never deceive another person (unless they are our “enemies”) or sell out our “cause” for money, fame or friendship. All of these constructs of self must be retained – yet they often are oppositional. We are faced with the profoundly important task of resolving this dissonant opposition.
In The Art of the Lie, Danesi (2020) puts it this way:
Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief and he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor for convincing and converting other people to his view.
A powerful and tragic example of this psychology is the pervasive ownership of guns and gun violence in America. Research over many years shows convincingly that owning a gun is far more likely to place a person at risk than to provide protection. Second Amendment supporters simply reject and dispute research data showing that gun ownership puts owners and others at risk. It’s not the data and facts that drive behavior; it’s the belief system and in-group camaraderie that are most powerful.
What kinds of people lie?
Many years ago, while in high school (in Johannesburg, South Africa), one of us [KW] became fascinated by a new pupil who arrived late in the school year. He struggled to make friends and tried to fit in by telling amazing stories of travels throughout the world (his father was extremely wealthy, and these stories initially sounded plausible). Months later, these stories collapsed as the truth of his father’s failed business dealings became known. This young boy had used these fabricated stories to gain attention and feel important amongst the rather “clique-ish” groups of teenagers in this high school. People lie for many different reasons. (Wood, 2022)
The long history of Steven Hatfill continues here. As a reminder, Steven Hatfill is an American physician, pathologist and biological weapons expert. He became the subject of extensive media coverage beginning in mid-2002, when he was a suspect in the 2001 anthrax attacks. His home was repeatedly raided by the FBI, his phone was tapped, and he was extensively surveilled for more than two years; he was also fired from his job at Science Applications International Corporation.
At a news conference in August 2002, Hatfill denied that he had anything to do with the anthrax letters and said that “irresponsible news media coverage based on government leaks” had “destroyed his reputation”. He filed a lawsuit in 2003, accusing the FBI agents and Justice Department officials who led the criminal investigation of leaking information about him to the press in violation of the Privacy Act. In 2008, the government officially exonerated Hatfill of any involvement in the anthrax attacks.
Following the November 2020 election, Hatfill became an active participant in then-president Donald Trump’s efforts to overturn the election results, flying to Arizona to help challenge its election results, writing proposals for “Trump’s Legal Fight”, and sharing anti-Biden rumors.
What should be remembered is that Hatfill originally submitted his Ph.D. thesis for examination to Rhodes University in January 1995, but it was failed in November. Hatfill later claimed to have completed a Ph.D. degree in “molecular cell biology” at Rhodes, as well as a post-doctoral fellowship (1994–95) at the University of Oxford in England and three master’s degrees (in microbial genetics, medical biochemistry, and experimental pathology). Some of these credentials have been seriously questioned or disputed.
During a later investigation, officials at Rhodes maintained that their institution had never awarded him a Ph.D. (In 2007, Hatfill’s lawyer Tom Connolly – in his lawsuit against former U.S. Attorney General John Ashcroft and the FBI – admitted that his client had “puffed on his resume. Absolutely. Forged a diploma. Yes, that’s true.”) Despite these deceptions, Hatfill was convincing enough, and considered legitimate enough, to be hired into one of the most influential consulting positions in the US government during the Trump presidency, eventually becoming an active participant in Trump’s efforts to overturn the election results, flying to Arizona to help challenge its election results, writing proposals for “Trump’s Legal Fight”, and sharing anti-Biden rumors.
Hatfill’s history reads as a profile of distortion and half-truths (and even blatant lies, if one includes certain of his degree certificates) – and yet he was able to convince large numbers of seemingly intelligent people to hire him and trust him, while he criticized and undermined “out-group” experts such as Dr. Anthony Fauci and FDA commissioner Stephen Hahn, at one point telling Fauci that he was “full of crap”. The damage done by people like Hatfill is hard to overstate and harder to quantify.
Why some people believe lies (indeed, NEED to believe)
Eric Hoffer (1951) coined the phrase “the true believer.” He believed that there are true believers at both ends of the political spectrum – as did Milton Rokeach (1960), who wrote about the “open and closed mind.” For both Hoffer and Rokeach, the key point was the deep entrenchment in a specific ideology and view of the world that dictated loyalty to a specific (and often quite small) group of people – what might appropriately be called a “tribe.” True belief also dictated a specific set of criteria for determining what is “real” and what is “false,” as well as what is “right” and what is “wrong.” The role played by lies and misinformation is central to this closed-minded belief system. The question becomes: why do some people end up painted into an ideological corner?
One answer focuses on leadership. When people feel out of control, and their way of life is threatened, they will tend to align themselves with leaders who speak emotionally and directly to their fears and provide solutions – whether truthful and practical or not. As previously noted, people with less education, poor critical thinking skills and an “authoritarian personality type” are more susceptible to this kind of persuasion. Indeed, as Danesi observes, manipulative leaders are skilled at “weakening our ability to think clearly and to reason about things critically”.
Another answer relates to the biological state of the true believers at any one point in time. Our ability to think clearly may be weakened when we are faced with evaluating truth (and morality). We are tired after a long day of work – or after a lifetime of fighting for a viable place in society, or even for survival. When fatigue sets in, we are susceptible to what Daniel Kahneman (2013) calls “fast thinking.” Like Kahneman, his fellow behavioral economist, Dan Ariely (2012, p. 100), points to the tired brain: “[W]hen our deliberative reasoning ability is occupied, the impulsive system gains more control over our behavior.” Ariely goes on to identify this fatigue as “ego depletion” and sets it up in opposition to self-control (or what many psychologists call “ego strength”). Ego depletion is based on the well-researched assumption that “resisting temptation takes considerable effort and energy” (Ariely, 2012, p. 101).
Janice Wood (2022) goes even further. Researchers at the University of Western Australia noted that “rejecting information requires more cognitive effort than simply accepting that the message is true. It’s easier for a person to believe a simple lie, than to have one’s mind changed by information that is new and novel”. And when people WANT and NEED to believe the lie, it is very easy to succumb to the lie.
So, it is not just a matter of the capacity to resist temptation; it is a matter of finding it hard to say “no” to anything that immediately seems to be aligned with our own version of “reality.” As Kahneman would note, fast thinking about favorable ideas and immediately satisfying acts is much more “tempting” and much easier to engage than slow thinking about alternative perspectives and practices, and about the longer-term consequences of decisions and actions that are damaging and even disastrous over the long term (such as voting for an unqualified candidate or shooting our opponent).
Is Personal Lying Characterological?
Up to this point, we have focused on specific processes that lead a person to lie. They find themselves in a situation that is conducive to lying. This situation might reside inside their own body (fatigue) or in the setting they inhabit. Is someone who frequently lies just a victim of circumstances (suggesting a “situational” trait), or were they somehow born liars (suggesting a “characterological” trait)? David Levy (2017) weighs in on this matter: “Pathological lying isn’t (generally) a clinical diagnosis,” though it can sometimes be a symptom of other issues, such as a personality disorder or a manic episode. “But some people get so accustomed to lying that they do so even when there is no clear purpose, and when their lies are easily disproven, leaving everyone scratching their heads over the point of their deceptions” (Levy, 2017). For Levy, it seems that people become accustomed to lying (situational) and it soon becomes a habit (characterological).
We might look at the opposite condition. What about people who rarely lie, or are rarely vulnerable to the lying of other people? Elliot Aronson (2018) examines this counterpoint and notes that people with higher levels of education are less susceptible to lies and misinformation. The more educated that viewers are, the more skeptical they are, and that skepticism leads them to believe that they are immune to persuasion. If only the mere fact of knowing that a communicator is biased protected us from being influenced by the message!
Aronson (2018) doesn’t stop here. Rather, he points out the ways in which this seeming “immunity” can lead to its own self-deception: “unfortunately, just because we think we are immune to persuasion does not necessarily mean we are immune. Indeed, our sense of immunity can make us more susceptible to persuasion of all kinds.”
This brings us to a critical point. Lying to other people might be situational and become a habit if repeated many times; however, the even deeper issue might concern the lying we do not to other people but to ourselves. Does a similar process operate? Do we simply become more comfortable (and skillful) in lying to ourselves? Like the arrogant, highly educated people described above, we might become accustomed to viewing ourselves as not only supremely honest but also invulnerable to the lies being told by other people.
Ariely (2012, p. 158) puts it this way:
We persist in deceiving ourselves in part to maintain a positive self-image. We gloss over our failures, highlight our successes (even when they’re not entirely our own), and love to blame other people and outside circumstances when our failures are undeniable . . . On the negative side, to the extent that an overly optimistic view of ourselves can form the basis of our actions, we may wrongly assume that things will turn out for the best and as a consequence not actively make the best decisions. Self-deception can also cause us to “enhance” our life stories . . . which can lead us to suffer a great deal when the truth is ultimately revealed.
While he does not refer to the concept, Ariely seems to be engaging the distinction made between what is called an internal locus of control and an external locus of control. When seeking to preserve our self-respect, it is convenient to attribute our failures to external sources while taking credit for our successes.
Thus, if we admit at all to being a liar, then we are likely to say that the outside world “made me do it!” (situational lying). At worst, lying is somehow built into our bones (characterological lying). We are much less likely to accept an intermediate view: that lying has become habitual (characterological) after being repeatedly engaged in a world that encourages and rewards it. Thus, we find ourselves with a hybrid version of lying as being initially situational and later characterological (habitual).
Techniques used in lying
The term “Machiavellian” is often used to describe people – often politicians and other rulers – who use deceit and manipulative techniques to convince others and achieve their evil goals. Niccolò Machiavelli was an Italian diplomat, author, philosopher and historian who lived during the Renaissance period. He is best known for his political treatise The Prince, published in 1532, and has often been called the father of modern political philosophy and political science. In The Prince, Machiavelli describes a range of manipulative techniques and methods by which politicians and other leaders can achieve their goals through deceit, manipulation and lies.
Dannagal Young, a professor of communication and political science at the University of Delaware, notes that “Misinformation succeeds, in layman’s terms, because it makes people feel good. People believe stories that reinforce or reward the way that they see the world. They share stories that boost their ego or make them feel like part of a team.” (Young, 2022) Even when the messages are in direct conflict with fact-based scientific information from experts, many in-group members will reject the expert information. Individuals and in-groups that match this profile NEED to believe these messages from Machiavellian leaders in the sense that – true or not – the dialogue against “enemies” preserves their beliefs and sense of self, and supports their thinking that their way of life is right. When counterarguments, even fully fact-based ones, undermine their modus vivendi, they will tend to resist vociferously. Fact-based attempts at persuasion aimed at this profile of individuals and groups can be difficult.
Danesi concurs that, under certain circumstances, “we are all prey to the master liar because he (or she) is skilled at manipulating our minds through language that generates obfuscation, ambiguity, and doubt, as well as evoking hidden fears, hatreds, and resentments, in his (her) ingenious scheme to gain trust, support, and backing” – as well as to undermine trust in experts and leaders amongst the out-group “enemies”.
It is more effective here to utilize a specific example of Machiavellian-like lying and language. Frequent comparisons have been made to Hitler, Stalin and other autocratic world leaders, but a more recent example is more powerful. Former US President Donald Trump is a useful example; however, it should be re-emphasized that the authors have no political affiliation in these references, and this example is used entirely for practical purposes. Certainly, Trump was highly effective at attacking and vilifying his opponents and enemies, as well as undermining leaders and experts who delivered opposing or critical messages.
In ending our presentation of this first perspective regarding the personal production of lies and misinformation, we wish to focus again on our main reasons for examining the dynamics of lying. While we all tend to lie as a way of holding up our own sense of self-esteem, and while there are many profiles of people who lie or exaggerate information for various reasons, we are focusing on those who lie to manipulate others for their own benefit – and, in particular, those who use misinformation to discredit out-group leaders and experts in order to weaken their opposition.
Knowingly putting out lies and misinformation to distort and manipulate is Machiavellian, and such individuals are the most dangerous in the sense that they are often charismatic and thus more easily able to convince people that their lies and distortions are the truth. To fully understand this destructive use of lying, we must turn to the ways in which lying resides in – and becomes even more potent in – a group setting and in the culture of a specific society.
The Truth About Lying II: The Group Perspective
For an individual to leverage influence over an in-group, he or she must initially establish some level of credibility in order to be heard and accepted as an influencer, leader or expert. Credibility is earned through what a person says and stands for, and through the actions and achievements that enhance or support the in-group’s beliefs and values.
Develop personal credibility within your in-group
With this backdrop, it is surprising to many that Donald Trump was able to gain the acceptance of conservative, and particularly Evangelical, constituencies, given that he had never previously been a Christian conservative. While Trump was and is a highly visible figure (given his media visibility and especially his TV game show performances), his credibility did not initially emerge from conservative or religious roots – indeed, evidence suggests that he seldom, if ever, attended church. It emerged from the next dimension of credibility, namely adopting the values and goals of an in-group. This is an example of Machiavellian manipulation; it allowed him to gain credibility and eventually undermine any and all leaders and experts who challenged him on the way to the Presidency.
Behavioral economists, such as Kahneman and Ariely, point to an important economic condition that contributes in a major way to the building of credibility and loyalty within a group or society. Specifically, a distinction is drawn between what is called social exchange and what is called market exchange. Social exchange occurs when we offer someone else a gift or an award to honor their place in our life. Market exchange, by contrast, requires that there be a monetary exchange or some other formal agreement about the exchange of goods or services.
Social exchange concerns the building and sustaining of a relationship. Market exchange is about “doing business.” In our work with Asian colleagues, a social exchange often takes place prior to any market exchange. An elaborate banquet is hosted the evening before negotiations take place. Gifts are given when two potential business partners get together. When entering a new course, students from Taiwan offer small, beautifully packaged gifts to their instructor. Instructors who are knowledgeable about Asian cultures will accept these gifts with words of appreciation – even if their American university forbids the accepting of gifts (reflecting the dominance of market exchange in Western educational systems).
Western cultures do have their own ways of introducing social exchange into the work being done. One of the specific ways in which social exchange is established is through doing another person “a favor.” Relevant to the issue of lying which we are addressing is the impact of a “favor” on the acceptance of another person’s version of reality (i.e. lies). Dan Ariely offers research findings indicating that “once someone (or some organization) does us a favor, we become partial to anything related to the giving party—and . . . the magnitude of this bias increases as the magnitude of the initial favor . . . increases” (Ariely, 2012, p. 77).
We can push this even further. Social exchange seems to be operating not only when someone does someone else a favor; it occurs even more simply when people gather together and get to know one another. When this form of exchange is established, the push to conform and to distort reality (or at least lie) on behalf of the group is great. Ariely identifies this as “altruistic cheating” and notes that “altruistic cheating overpowers the supervisory effect when people are put together in a setting where they have a chance to socialize and be observed” (Ariely, 2012, p. 228). While Asians might have their more explicit and ritualized ways of establishing social exchange, this form of exchange also takes place – without formal ceremony or the granting of gifts – in Western societies. It is just a matter of this exchange not being explicitly acknowledged, and of the resulting acceptance of another person’s reality not being made either conscious or negotiable.
Adopt in-group values and goals
A number of writers have considered the contradiction of conservative Christians accepting Trump as their political leader given his self-described philandering personality. Some writers have noted conservative Christian hopes that “are directed toward a single redemptive figure who, it is believed, will lead the people of God, now suffering and oppressed, into a better historical future” (“Eschatology,” Britannica). Trump was highly successful at adopting the values and goals of conservatives and Christians to become their “sinner-savior”.
In this role, Trump could do or say anything, and his followers would believe only him. Trump’s comment “I could stand in the middle of Fifth Avenue and shoot somebody, and I wouldn’t lose any voters” underscores his reading and understanding of his audience. Once Trump had established himself as the “sinner-savior” of their values and goals, he could undermine any challenge made to him, and his supporters would support him – even to their own detriment. If Trump belittled and undermined opposing views from leaders and experts, so would his followers.
To gain a better sense of what it means for a rational and thoughtful person to succumb to a bizarre and often self-destructive set of beliefs and actions, we must turn to the psychological dynamics that operate in groups. These dynamics include not only the socialization and enculturation that take place in groups – primarily through something called social modeling – but also the subtle pressures that lead to conformity and collusion. We will first take a look at social modeling. Ariely (2012, p. 207) points to specific research findings:
. . . [T]hese results show how crucial other people are in defining acceptable boundaries for our own behavior, including cheating. As long as we see other members of our own social groups behaving in ways that are outside the acceptable range, it’s likely that we too will recalibrate our internal moral compass and adopt their behavior as a model for our own. And if the member of our in-group happens to be an authority figure [Bion’s assumption] – a parent, boss, teacher, or someone else we respect – chances are even higher that we’ll be dragged along.
Dan Ariely (2012, p. 214) offers the well-known narrative about broken windows in an urban slum to make the point that small crimes and lies can make quite a difference in modeling and setting the stage for much bigger crimes and lies. A few windows that remain broken in a community can produce a culture of indifference and hopelessness. Similarly, a group that tolerates small amounts of misinformation and minor self-deception will build a “broken” culture in which the world is profoundly distorted and destructive actions are engaged on behalf of this distortion.
We might even suggest that not only is the broken truth not repaired – it is actually shattered into even smaller fragments, and new truths are sought that can themselves be broken. Perhaps the most important truth to be broken is one that suggests the group is operating in a thoughtful and caring manner. The group process is labeled collaboration and cooperation when it is, in fact, collusion (Bergquist, 2013).
We turn now to the second dynamic operating in groups that leads to distortion and misuse of information. This is the dynamic of collusion. This dynamic is often identified as collaboration when it should be seen for what it really is: a powerful, often unconscious, agreement between two or more parties to accept and support erroneous beliefs and actions (Ariely, 2012, p. 221).
Given these powerful forces operating in the groups we have joined and the society in which we live, perhaps we should live alone in a cave somewhere – oblivious to the influence of other people. While this solitude is ultimately not possible in our mid-21st Century society, Ariely (2012, p. 233) suggests that it is also not desirable. At the same time, he issues a warning:
Of course, we cannot survive without the help of others. Working together is a crucial element of our lives. But clearly, collaboration is a double-edged sword. On the one hand, it increases enjoyment, loyalty, and motivation. On the other hand, it carries with it the increased potential for cheating. In the end – and very sadly – it may be that the people who care the most about their coworkers end up cheating the most. Of course, I am not advocating that we stop working in groups, stop collaborating, or stop caring about one another. But we do need to recognize the potential costs of collaboration and increased affinity.
Formulate a cause, emergency or threat
It is not enough simply to adopt in-group values and goals; it is necessary to leverage urgent threats and emergencies of immediate importance to the in-group. Trump was highly effective at pointing to immigration, attacks on gun rights, religious freedoms and other concerns important to the conservative right. He established himself as the champion to lead conservatives against these threats. In this fight, Trump was able to position any leader or expert who opposed him as core to the threat. Psychologist and Nobel Prize winner Daniel Kahneman (Thinking, Fast and Slow) describes how threats and negative messages gain much more of the brain’s attention, and much more quickly (“less than one-quarter of a second”) – “Bad emotions … have more impact than good ones” – and Trump is a master at using negative threats in his messaging.
Point to an enemy (and demean and vilify them)
Trump was (and is) the Machiavellian master at identifying enemies and rallying his followers against his opposition. He began with fellow Republican political challengers, calling Jeb Bush “Low Energy Jeb”, Ted Cruz “Lyin’ Ted” and Marco Rubio “Little Marco”, and moved ultimately to his immediate enemies, labeling Hillary Clinton “Crooked Hillary” and Jerry Nadler “Fat Jerry”. During the height of Covid-19, Anthony Fauci was “Full of Crap”.
Perhaps most effectively, his labeling of the (primarily) liberal media as “the enemy of the American people” created a situation in which his followers would not believe anything reported in the liberal media, even when this reporting was based on fact or concerned public safety. Trump’s call to “drain the swamp” targeted anyone in government who might have opposed him in any way. Social psychologist Elliot Aronson (2018) notes that Hermann Goering, one of Adolf Hitler’s top aides, said: “The people can always be brought to do the bidding of the leaders … all you have to do is tell them they are being attacked and denounce the peacemakers for lack of patriotism and exposing the country to danger. It works the same in any country.”
With the demonization of an enemy comes the abandonment of any traditional codes of conduct or foundations of morality. Dishonesty and the dissemination of misinformation can flourish with a new sense of justification. Ariely (2012, p. 178) frames it this way:
In terms of dishonesty, . . . once something or someone irritates us, it becomes easier for us to justify our immoral behavior. Our dishonesty becomes retribution, a compensatory act against whatever got our goat in the first place. We tell ourselves that we’re not doing anything wrong, we are only getting even. We might even take this rationalization a step further and tell ourselves that we are simply restoring karma and balance to the world. Good for us, we’re crusading for justice!
Our enemy becomes the “Other,” to whom existing codes of honesty and justice do not apply. We can not only feel “good” about our immoral actions but also feel justified in the resulting retribution.
Identify as the ONLY one to believe, and be the “knight in shining armor” that will “save the day”
Perhaps most powerfully, Trump was remarkably effective at establishing himself as the sole “sinner-savior” – the “I alone can fix it” champion of the conservative evangelical right. Trump’s messaging and approach were authoritarian in nature. As noted in previous essays, people with authoritarian personality leanings prefer leaders who fully take charge and are not ambiguous.
For example, scientific research often produces ambiguous findings. People with poor critical thinking skills struggle with ambiguity and prefer unequivocal statements, even when those statements are questionable in fact. Trump was highly effective at being the “knight in shining armor”, describing the political system as rigged and stating emphatically, “No one knows the system better than me, which is why I alone can fix it” – this despite having no experience in politics. He also claimed, “I know more about ISIS than the generals do, believe me”, despite having no experience in the military or with anti-terrorism strategies. Yet millions of Americans believed him and supported him.
Wilfred Bion (1961) would point to what he identified as the basic assumptions which undergird the tendency of groups (and societies) to regress in their view of leaders – such as Trump. For many people, Donald Trump represented the first of Bion’s assumptions, regarding wisdom: Donald Trump was wise while the rest of us (or at least most of us) are ignorant – whether about treatment for Covid or insights regarding ISIS. Trump actually represents two of Bion’s three assumptions – which makes him particularly powerful for many people. The second assumption is that there is an enemy standing at our gate that must be defeated, and only a person of great courage and strength can fend off this powerful enemy. Donald Trump is just such a man. He was (and for many people still is) “the knight in shining armor.”
There is only one assumption that doesn’t fit very well with Trump. This third Bion assumption concerns finding a compelling vision for the future. There is to be a pairing (a merger) of two major forces for, and visions of, the “good.” They will someday come together with the reemergence of the shining city on the hill (the new Jerusalem): “The waters will flow once again from Jerusalem to the Dead Sea and this sea shall once again come alive.” This third assumption is much more likely to be held by those holding the perspective of left-wing politics. Like the other two assumptions, it is based not on any accurate assessment of the real world. While some people find the first and second assumptions being “realized” in a distorted image of Donald Trump, many other people find any hope of a new Jerusalem (a society of justice and freedom) fading rapidly from view in mid-21st Century America.
Focus on emotion, not facts
As Peter Economy observed, “if facts don’t matter, you can never be wrong” (The 6 Persuasion Secrets of Donald Trump, Inc.com). With descriptions of threats that conjured intense emotions, Trump fostered anger and fear – for example, by portraying illegal immigrants as intruders taking Americans’ jobs and menacing their personal safety. He described people from Mexico this way: “They’re bringing drugs. They’re bringing crime. They’re rapists”. He created fear of Muslims by implying they should be listed in a “registry”, seemingly so they could be tracked. While other politicians speak more clinically, and many experts speak in more factual and scientific terms, Trump’s dialogue is emotional – especially angry. As Aronson (2018) notes, the more emotional and frightening a message is, the more likely the target audience is to take action.
Aronson (2018) describes how leaders and influencers can change people’s beliefs – some for the good and some for the bad. Machiavellian leaders use emotion! Aronson describes the difference between changing opinions and changing attitudes: how easy is it to persuade a person? When opinions are no longer purely cognitive (when an emotional component has been injected into a logical argument), it is almost certain that there will be strong feelings embedded in them, along with an evaluation as to whether the subject is good or bad.
An opinion that includes an emotional and an evaluative component is called an attitude. Compared with opinions, attitudes are extremely difficult to change, especially once they have been formed. We would add to what Aronson has indicated by bringing back Wilfred Bion (1961). He would suggest that attitudes are wrapped around and secured by one or more of the three basic (and unconsciously held) assumptions. An example of this kind of attitude-opinion viewpoint, backed up by an assumption, is one of the current (at time of writing) perspectives on mass gun violence. When the logical argument for limiting the availability of assault-style weapons is countered with the emotional argument of “no one will take away my 2nd Amendment rights”, logical opinions become emotional attitudes which are very difficult to counter. The underlying assumption is that there is an enemy at the gate (the “socialist” government): we need the guns to fight off the enemy when it becomes even more aggressive in its attempt to turn our beloved America into another socialist (or even communist) state.
The implication here is that Machiavellian-type leaders innately or expressly know how to use emotion and underlying assumptions to foster opinions and transform them into entrenched attitudes, which are then cemented and largely unchangeable. An example is Donald Trump’s references (with no intended political bias from the authors) to undocumented Mexicans as “rapists and criminals”, to Haitians as being from “shithole countries” and to some women as “dogs”. For groups of people who align with some of Trump’s views, these references add emotion to their opinions, and then have the potential to transform those opinions into entrenched attitudes which become very difficult to change. Once transformed into attitudes, no attempt by experts or opposition leaders from out-groups will counter these statements. Facts and figures will have no impact (Danesi, 2020).
Pre-empt, deny and deflect
Danesi describes this technique, which he calls “verbal weaponry”, well: “The main weapons are deception, denial, and deflection. Their utilization can be seen in several combative gambits: blame the blamer, deny any wrongdoing, deflect attention away from oneself, call one’s attackers names that will vilify them, and deflect attention away from oneself by casting doubt on the actions of others”. Trump is a master at this type of verbal weaponry. He will blame anyone who attacks him as being guilty of the same crime of which he is accused, or else conceal the truth by constant denial.
Trump’s frequent use of the phrase “many people say” allows him to deflect, and later deny, challenges to his comments about something or someone (“I did not say that. It was someone else”) while still clearly placing the message in the minds of his followers. An example is a comment he made about Hillary Clinton’s emails: “Many people are saying that the Iranians killed the scientist who helped the U.S. because of Hillary Clinton’s hacked emails.” Using this technique, it is easy for him to deny that he was the originator of misinformation.
Repeat the story over and over
As Hitler put it in Mein Kampf, “The intelligence of the masses is small. Their forgetfulness is great. They must be told the same thing a thousand times”. Trump has used the power of repetition very effectively to strengthen his messages. His months-long repetition of the claim that the election was “stolen” has been consistent and unwavering. As Danesi reminds us of a quote frequently attributed to either Adolf Hitler or his minister of propaganda Joseph Goebbels: “Make the lie big, make it simple, keep saying it, and eventually they will believe it.”
While this strategy was directed by Hitler at the world of people, it is also found in the world of physical and biological objects. We find that nature is actually rather “lazy” when it comes to the design of trees, animals and even the movement of celestial bodies. Mother Nature provides one structure for the needles of a pine tree and uses this same structure in the design of the pine tree branch and even the entire tree. We find the same replication in the design of bird and bee limbs. These replicating structures are called fractals.
Scientists who focus on complex systems identify an accompanying process called the “strange attractor.” A form of replication is found in the pull exerted on some entities by the powerful replicating behavioral patterns of neighboring entities. An avalanche pulls in neighboring rocks and other materials as it crashes down the mountainside. Birds tend to replicate the flight pattern of neighboring birds. That is what we witness when watching (and marveling at) the intricate flight of birds while they are flocking.
Might we find that lies display the same dynamics? Are specific lies repeated again and again? Like Mother Nature, are lies lazy? Are they applied without change to many different scenarios? A “good” lie about a “corrupt”, “power hungry” government can be assigned not only to election outcomes but also to environmental protection legislation and the rights of people to privacy. We might also find that lies are “strange attractors” that pull in other falsehoods. Like the avalanche, a “big” lie gathers both speed and neighboring resources.
More people are attracted to the lie; they add their own distortions and recruit other people around them to the lie. Soon we have a carefully orchestrated “flock” of true believers. The irony (and terror) of this process is that there doesn’t actually have to be a single leader or small group of leaders who make it all happen. It seems that complex processes – such as conspiracies and mob action – can take place through what Nobel Prize-winning theorist Ilya Prigogine (1984) calls “self-organization.” It all depends on what one’s neighbor believes and does, rather than on some central agent of change.
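To make this notion of leaderless “self-organization” more concrete, here is a minimal computational sketch – our own illustration, not drawn from Prigogine’s work or any of the sources cited above. Agents sit on a ring and imitate only their immediate neighbors; every parameter (population size, number of initial “seed” believers, imitation probability) is an invented assumption for the example. Even so, a handful of seeds can grow into a large bloc of believers with no central coordinator:

```python
import random

def spread_by_neighbors(n_agents=200, n_seeds=5, steps=50, seed=42):
    """Sketch of belief contagion on a ring of agents.

    Each step, an agent may adopt the belief if at least one immediate
    neighbor already holds it -- pure local imitation, no global leader.
    All parameters are illustrative assumptions, not empirical values.
    """
    rng = random.Random(seed)
    beliefs = [False] * n_agents
    for i in rng.sample(range(n_agents), n_seeds):  # a few initial "seeds"
        beliefs[i] = True

    for _ in range(steps):
        nxt = beliefs[:]
        for i in range(n_agents):
            left, right = beliefs[i - 1], beliefs[(i + 1) % n_agents]
            # The "strange attractor" pull: neighbors' beliefs, not a
            # central agent, determine whether this agent converts.
            if not beliefs[i] and (left or right) and rng.random() < 0.5:
                nxt[i] = True
        beliefs = nxt

    return sum(beliefs) / n_agents

if __name__ == "__main__":
    print(f"Share of believers after 50 steps: {spread_by_neighbors():.0%}")
```

The point of the sketch is simply that the final “flock” emerges from purely local rules; no line of the code represents a leader issuing instructions.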
Lying and authority
Ultimately, groups and entire societies that are saturated with lies will yield to and support a strong and coercive authoritarian structure. This might very well be the most destructive and long-lasting outcome of abundant lying. Heather Cox Richardson (2022) provides a summary statement regarding this pernicious link between lies and authoritarian rule:
The construction of a world based on lies is a key component of authoritarians’ takeover of democratic societies. George Orwell’s 1984 explored a world in which those in power use language to replace reality, shaping the past and people’s daily experiences to cement their control. They are constantly reconstructing the past to justify their actions in the present. In Orwell’s dystopian fantasy, Winston Smith’s job is to rewrite history for the Ministry of Truth to reflect the changing interests of a mysterious cult leader, Big Brother, who wants power for its own sake and enforces loyalty through The Party’s propaganda and destruction of those who do not conform.
Richardson turns from Orwell to a prominent political philosopher:
Political philosopher Hannah Arendt went further, saying that the lies of an authoritarian were designed not to persuade people, but to organize them into a mass movement. Followers would “believe everything and nothing,” Arendt wrote, “think that everything was possible and that nothing was true.” “The ideal subject” for such a dictator, Arendt wrote, was not those who were committed to an ideology, but rather “people for whom the distinction between fact and fiction…and the distinction between true and false…no longer exist.”
She then references a troubling observation made by an advisor to an American president:
Way back in 2004, an advisor to President George W. Bush told journalist Ron Suskind that people like Suskind were in “the reality-based community”: they believed people could find solutions based on their observations and careful study of discernible reality. But, the aide continued, such a worldview was obsolete. “That’s not the way the world really works anymore…. We are an empire now, and when we act, we create our own reality. And while you’re studying that reality—judiciously, as you will—we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors…and you, all of you, will be left to just study what we do.”
We are left with the equally troubling observation that contemporary societies throughout the world are today being guided by lie-based narratives, reinforced by authoritarian regimes. Even progressive, democratic societies seem to be susceptible to the dangerous interplay between the lie and authority. Richardson closes her analysis by offering a somewhat more positive perspective. There might be some hope given the reassertion in some societies (such as the United States) of judicial review and political renunciation of the big lies. As Richardson notes: “I wonder if reality is starting to reassert itself.”
The Truth About Lying III: The Cultural Perspective
While the focus on the role played by lies in group and even societal settings yields important insights – and warnings – regarding the negative impact of dishonest communication, we believe that our analysis should extend even further, into an exploration of the basic cultural foundations of lie-based group and societal perspectives and actions. As we have suggested throughout this series of essays on the crisis of expertise, the mid-21st Century world is one that is saturated with volatility, uncertainty, complexity, and ambiguity (VUCA). VUCA is, in turn, swirling in a white-water world of turbulence (Vaill, 2008) and strained in a world of profound contradiction (Bergquist, 2020).
Given these challenging conditions, the cost-benefit analyses that have dominated the world of expertise are no longer viable (Ariely, 2012, p. 5). One might actually wonder if this mechanistic analysis was ever viable (Sun and Bergquist, 2021). This VUCA Plus-saturated environment has created a culture of confusion and anxiety. It is not amenable to simple (let alone compelling) statements of truth. Misinformation can easily prevail. The Hatfills of mid-21st Century life can easily find a platform for their ignorant observations and recommendations.
While there are many reasons why the VUCA Plus environment is now prevalent, and why a culture arising from this set of environmental challenges might be based in confusion and anxiety, we want to identify several of the principal reasons. We do this by turning to a distinction that has been drawn several times by one of us (Bergquist and Brock, 2008; Bergquist and Pawlak, 2008) between six cultures that operate in most contemporary organizations (and more broadly in most societies). Four of these cultures have existed for many years: the professional culture, the managerial culture, the advocacy culture and the alternative culture. Each of these cultures holds a preference for specific kinds and sources of information and expertise. Furthermore, each is vulnerable to certain types and sources of misinformation.
As the name implies, the professional culture is associated with the many professions that now populate our organizations. As Bledstein (1976) has noted, the professions may have replaced social class as the primary way we stratify societies. The source of information is particularly important in this culture. If a certified “professional” has offered information, then it must be accurate – even if the professional (such as Hatfill) is making pronouncements in a field in which they are not qualified. Just as upper-class people once had the privilege of speaking “truth,” so professionals are privileged today as “truth tellers.”
For those aligned with the managerial culture, it is the type of information that determines whether it receives attention and is assumed to be valid. It is all about numbers, and about the relationship between numbers and money. If something can be measured, and numbers can be derived from this measurement, then the information derived is considered accurate and useful. The numbers are even more useful if they relate to the “bottom line.” Anything qualitative is considered “speculative” and vulnerable to distortion. The managerial culture is filled with people who believe that you can “lie” with words, but not with numbers.
The advocacy culture has been established over the years as a buffer against the managerial culture. There is also a reliance on numbers here – however, the source of these numbers and the interpretation of what they mean differ significantly from what those in the managerial culture have to offer and believe is accurate. Much of the polarization that has been established over the past century involves the rise of the managerial culture (and not just in business) and the counter-rise of the advocacy culture (and not just in labor unions). We find that these two cultures often tear apart legislative debates and produce abundant accusations of misinformation – and even lying – directed at those in the opposing culture.
We come finally to the alternative culture, which evolved in many instances as a corrective to the “arrogance” of the professional culture. Those in this fourth culture believe that there are multiple legitimate truths and that forums must be established for constructive dialogue that not only surfaces these truths but also provides a shared appreciation for these multiple realities. As a culture that attracts idealists and those involved in the helping and mediating professions, it is the one of the original four that is most likely to be ignored or dismissed in a polarized world filled with misinformation and lies.
We come now to the fifth and sixth cultures. The fifth has been defined as the “virtual culture.” It has emerged in recent years as a result of the impact which technology has had on the way we view and act in our mid-21st Century world. The sixth culture has evolved (much as in the case of the advocacy and alternative cultures) with the growing influence of the virtual culture. This sixth culture is defined as the “tangible culture”; it addresses the desire to “return” to a time when “real” things that could be seen, heard and even touched held sway over our lives. These two emerging cultures speak persuasively to the challenges facing us in mid-21st Century life.
The Virtual Culture
In the virtual culture there is a much looser grasp on reality. There are not only “alternative truths” populating the Internet, but also multiple criteria for determining what is “true” and what is “false.” Which Internet site do I frequent, and which version of reality is displayed on this site? The algorithms of the Internet assist us by pulling us toward closely aligned sites. The strange attractors that we have already mentioned are operating not just in physical space but also in the virtual space of the Internet.
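A minimal sketch can suggest how this algorithmic “pull” works. This is our own illustration, not any real platform’s algorithm: content is ranked by cosine similarity to a user’s prior interest profile, so the most belief-confirming material always surfaces first. The interest dimensions, site names and vectors are all invented for the example:

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two interest vectors (1.0 = fully aligned)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recommend(user_profile, items):
    """Order items by how closely they match the user's existing views."""
    return sorted(items, key=lambda item: cosine(user_profile, item[1]), reverse=True)

# Hypothetical interest dimensions: [topic A, topic B, topic C]
user = [0.9, 0.1, 0.0]  # a user strongly attached to topic A
catalog = [
    ("site confirming topic A",  [0.95, 0.05, 0.0]),
    ("neutral news site",        [0.33, 0.33, 0.33]),
    ("site challenging topic A", [0.05, 0.90, 0.05]),
]

for name, _ in recommend(user, catalog):
    print(name)
# The aligned site ranks first; each click then sharpens the profile,
# strengthening the pull toward ever more closely aligned sites.
```

Real recommendation systems are vastly more elaborate, but the self-reinforcing loop – aligned content ranks highest, consumption of it further aligns the profile – is the strange-attractor dynamic described above.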
Virtual technologies have also entered the world of finances and increased the frequency of lies and misinformation. As Dan Ariely (2012, pp. 34, 37) has observed and demonstrated, it is easier to cheat on our financial dealings when financial reality is at a distance. For example, Ariely (p. 34) notes that our move to a “cashless” society, in which credit cards have taken the place of money, has led to greater overdrawing of accounts and more falsification of financial transactions. As is often stated in a humorous (but painful) manner: “As long as my credit card doesn’t wear out, I will be OK!” And we are now even talking about bitcoins. At some fundamental level, we recognize that “cashless” transactions lead to a sense that “real money” isn’t being exchanged.
Ariely observes, in turn, that we are much more likely to cheat and distort reality (in this case the reality of financial status) when finances become “virtual.” We might even recall that money itself is a replacement for tangible, in-person (bartering-based) exchange of services and products (Bergquist, 1993). It is even harder to deny or distort reality when one is swapping a bushel of wheat for a hand-crafted chair, or two nights of baby-sitting for two pans of home-made lasagna. The reality and reinforcement of the in-person swap are furthered by its foundation in social rather than market exchange.
The Tangible Culture
We stand here in the mid-21st Century and despair that it isn’t like the “good old days” when we got to know our local banker and when everything was paid for with cash or a check. We read the daily newspaper that purportedly presented “all sides” of an issue and listened every evening to one of three TV channels that purported to serve all members of a community—regardless of political affiliations. While these “good old days” are to be found only in our collective memory (aided by nostalgic movies and novels), they do provide incentive and guidance for the creation of a tangible culture that returns us to the things that we can see, hear and touch.
The activities to be found in the tangible culture range from our family dinners to national holidays, and from a demand for in-person education to the requirement that people must show up in person to vote. In some cases, the return to old tangible practices is to be praised. In other cases, this return is accompanied by (or acts as a shield for) the return of discriminatory practices. Perhaps of greatest importance is the frequent alliance of the tangible culture with issues of morality and religion. We find this alliance in Ariely’s suggestion that religion and moral education might be part of the antidote for pervasive (and perverse) lying. He finds, specifically, that religion and moral education tend to blunt the tendency to lie (Ariely, 2012, p. 281):
. . . [t]he general approach of religion is to deal directly with the period before we cheat and the period in which we have the opportunity to cheat. First, religion attempts to influence our mind-set before we are tempted, by creating moral education and, let’s not forget, guilt. The basic understanding is that if we want to curb dishonesty, we need to think about education and calibrating the moral compass, rather than threatening punishment after the fact (which many religions are also pretty clear about). Second, religions attempt to influence our mind-sets in the moment of temptation by incorporating different moral reminders into our environment. Here, the basic idea is that once we have a moral compass, it’s a good idea to keep it in good working order, with appropriate adjustments in real time, if we expect it to operate at full capacity.
A culture of tangibility—based on well-established traditions—also tends to create a culture of decency and honesty. Ariely (2012, p. 43) suggests that this foundation of decency and honesty is often established and reinforced by a very tangible code of ethics. This code might be the Ten Commandments of the Torah (Old Testament) or the Golden Rule (of the New Testament). Turning to his own Jewish tradition, Ariely (2012, p. 270) identifies the ways in which the weekly observance of Shabbat promotes this culture. Participants in Shabbat are reminded on a weekly basis of the extent to which their own behavior is aligned with the Jewish code, the ways in which they can live up to this code during the coming week, and (most importantly) the values inherent in (and inherited by) this traditional code. “It is clear,” writes Ariely (2012, p. 52), “that moral reminders make it relatively easy to get people to be more honest—at least for a short while.”
The Relativist Culture
In addressing the issue of misinformation and lying, we wish to introduce a seventh culture that is closely aligned with the virtual culture but is to be found in every nook and cranny of mid-21st Century life. We identify this as the Relativist Culture and suggest that all four of the original cultures are impacted by the push to relativism. Professionals are coming to recognize that they can no longer offer a commonly accepted “reality.” Consumers of professional wisdom find that there are conflicting versions of this wisdom coming out of the mouths and words of “so-called” professionals.
Those in the managerial culture find that they are swamped with numbers that often seem to be contradictory. The advocates are not only swamped with the same outpouring of numbers but often torn between contradictory needs and demands among the constituencies with which they are aligned and which they must serve. Finally, as we have already noted, those in the alternative culture rarely are heard and find themselves relativistic about relativism itself: “if everything is relative, then what am I to believe? Maybe the very notion of relativism is subject to review—is relativism just as much an unquestioned tenet as the orthodoxy of a specific religion?”
In his own version of this culture, Ariely (2012, p. 264) writes about “white lies in a gray world.” He establishes an important point that “some level of dishonesty is actually needed in society.” Life, noted Ariely, is never all black and white. “When we’re making decisions, we’re looking at a lot of pros and cons all jumbled together into a spectrum of grays. Our motivations—no matter how honorable—often counteract other motivations.” This is relativism at work! This is VUCA Plus in full display.
In the relativistic culture there is “fungible” truth. One reality can readily be exchanged for a different reality. Relativism thrives in the virtual world. We find, for instance, that avatars (virtual representations of ourselves) are found in abundance on the Internet (especially in Japan). Young Internet-savvy people live through the image that they choose to represent themselves. No one actually sees their face or even knows who they really are. These images can readily be changed. Even our personal identity becomes relativistic. Highly sophisticated reality games become the world in which these young people live and where they fight imaginary dragons. Rather than confronting the real-world foes that would be encountered if they held down a “real” job, the avatar-clad game-players can engage in battles where no one actually risks anything. The game can always be changed. All outcomes are relative. Nothing is ever truly gained or lost. Much as in the case of our dreams.
Ariely also writes about “fuzzy reality” (Ariely, 2012, p. 6). We play with numbers until we are “convinced that the numbers truly represent the ideal way” in which to negotiate our financial world (Ariely, 2012, p. 83). Similarly, we engage narratives as a way to negotiate other domains in which we operate (Bergquist, 2021). Ariely (2012, p. 65) turns to the game of golf when describing this slippery slope to dishonesty:
When our actions are more distant from the execution of the dishonest act, when they are suspended, and when we can more easily rationalize them . . . every . . . human on the planet . . . find[s] it easier to be dishonest. . . . [Everyone has] the ability to be dishonest but at the same time think of themselves as honest. And what have we learned . . . ? Well. When the rules are somewhat open to interpretation, when there are gray areas, and when people are left to score their own performance, even honorable games such as golf can be traps for dishonesty.
Imagine what it is like when the “game” is not golf – or even the game of “business” and generating income. Imagine what happens when the “game” is existential in nature—we play the “game” in order to survive and to ensure that the world continues to operate as we want it to, or transforms into a world in which we want to live. The pull toward dishonesty and clinging to a “false” truth is even greater. Changing a golf score is one thing. Changing our perception of the world is quite a different matter.
We are now not only tired, but also overwhelmed. We are saturated with information (Gergen, 2000) – and information that has become VUCA Plus saturated. Ariely introduces the term Self-Signaling to describe the ways in which we are likely to react when faced with overwhelm and fatigue. According to Ariely (2012, p. 122), “we don’t have a very clear notion of who we are.” As Gergen suggested, we are filled with multiple images of self. Ariely (2012, p. 122) notes that “we generally believe that we have a privileged view of our own preferences and character, but in reality we don’t know ourselves that well (and definitely not as well as we think we do). Instead, we observe ourselves in the same way we observe and judge the actions of other people—inferring who we are and what we like from our actions.”
We become the clothes that we wear, the car that we drive or the people we hang out with. We signal who we are by abandoning an internal sense of self and relying on our appearance and affiliations for this definition. Most importantly, our identity is absorbed by the true-believing world with which we affiliate. We are willing to accept and promulgate any lie and to disseminate any misinformation in exchange for this externally supported identity. It certainly is an exchange with the Devil. However, it is an exchange that we seek out and with which we live – without complaint. This is the culture of relativism. It swirls around in our head and heart – often with the aid of media that also swirl around us and impact our head and heart.
The Role of the Media in Spreading Lies and Misinformation
This kind of verbal manipulation is not a new phenomenon. Indeed, Aronson (2018) describes in detail how these techniques have been used effectively throughout history. The critical difference in the modern world is how rapidly and powerfully misinformation is spread through social media:
It is good to be informed, and the media play a crucial role in keeping us informed. However, there can be a downside to this kind of exposure, as well. Whether it is intentional or not, repeated vivid imagery of this sort shapes attitudes and opinions. The constant images of the collapsing Twin Towers, as well as the repetition of bellicose slogans on news channels (“The War on Terror,” “America Under Attack!”, “America Fights Back!”), contributed to the arousal of intense emotions in viewers and thus reduced the likelihood of any real debate about how America should respond. In a democracy, major decisions — like whether to go to war — benefit from rational public debate. Strong emotions, such as those stirred up by the news media, often get in the way of rational decision-making.
“Rational public debate” of course suggests openness to conflicting and complex ideas and facts, as well as flexibility in recognizing that one’s own point of view may be wrong. It also implies, as noted in the previous essay on conspiracy theories, that some degree of critical thinking skill is required. Certain groups of people, especially those with a tendency toward “authoritarian personalities,” struggle to keep an open mind and to apply critical thinking to ambiguous and complex phenomena.
Misinformation propagated by manipulative leaders is particularly dangerous within certain kinds of in-groups fueled by what Aronson calls “emotional contagion”. This is the “rapid transmission of emotions or behaviors through a crowd” or in-group. It is particularly powerful when the “crowd” is an in-group whose affinity is based on powerfully shared beliefs that bind the in-group together. This kind of rapid contagion has little to do with facts, science or logic – it has to do primarily with emotion, the perception of potential danger, and the gullibility of in-group members.
At the time of this writing, the increase in mass shootings in the United States has also placed greater focus on conspiracy theories claiming that both the recent Buffalo and Uvalde shootings were staged events. Through individuals such as Alex Jones (Infowars), white nationalist and Holocaust denier Nick Fuentes, and DeAnna Lorraine (who suggested that the Buffalo shooting “could just be all just a false flag to target the white guy”), lies, innuendo and harmful conspiracy theories are rapidly expanding.
As a counterpoint to the view that social media is primarily responsible for propagating misinformation, Joelle Renstrom (2022), in her article entitled “How science helps fuel a culture of misinformation,” argues that “we tend to blame the glut of disinformation in science on social media and the news, but the problem often starts with the scientific enterprise itself”. Renstrom describes:
Universities want their scientists to win prestigious grants and funding, and to do that, the research has to be flashy and boundary-pushing. PR offices may add to that flash by exaggerating the certainty or implications of findings in press releases, which are routinely published almost verbatim in media outlets. The demand for headline-worthy publications has led to a surge in research studies that can’t be replicated. Results of a reproducibility project published by the Center for Open Science found that only 26% of the top cancer studies published between 2010 and 2021 could be replicated, often because the original papers lacked details about data and methodology. Compounding the problem, researchers cite these nonreplicable studies more often than those that can be replicated, perhaps because they tend to be more sensational and therefore get more clicks.
Indeed, the authors have previously commented on the need for greater critical thinking skills to limit the belief in conspiracy theories, lies and misinformation, but as Renstrom notes, “Scientists aren’t responsible for the critical thinking skills of the average reader or the revenue models of journals, but they (scientists and experts) should recognize how they contribute to the spread of misinformation”. Evidently, the question of trust in and credibility of experts, leaders and expertise in general is not a one-sided issue. Leaders and experts themselves have a long road to travel to gain and sustain the trust of the general population.
The Truth About Lying IV: A Psychodynamic Perspective
We wish to offer one other perspective on the nature and dynamics of lies and misinformation. This perspective is based on a belief among many psychoanalytically oriented psychologists and psychiatrists that each of us has our own “media center” located deep in our psyche. We don’t need outside media forces and manipulative politicians to engage in deception, denial and deflection. We can do a pretty good job of engaging these “weapons” as internal psychodynamic processes.
Peremptory Ideation
Specifically, we wish to offer an intriguing model of intra-psychic processes first presented by George Klein, an eminent researcher and theorist who brought together psychoanalytic theory and cognitive psychology (producing an integrative perspective known as “ego psychology”). Many years ago, Klein (1967) described a process he called Peremptory Ideation. In essence, Klein proposed that in our internal world (psyche) we create a specific idea or image that begins to “travel” around our psyche (head and heart), picking up fragments of unconsciously held material (memories, feelings, and thoughts). This ideational train operates much like an avalanche (and other forms of what chaos theorists often label “strange attractors”). This train becomes increasingly rich and emotionally powerful.
At some point, this ideation begins to pull in material from outside the psyche. External events suddenly take on greater saliency (more emotional power and vividness)—and this is because they are now connected to the internal ideation. Klein suggested that this ideation now takes priority with regard to what is valued, attended to and remembered in the external world. It assumes a commanding (“peremptory”) presence. A positive (reinforcing) loop is created, with the external material now joining the interior material—all clustered around the original (often primitive) ideation.
Catching the Train
While Klein focused on the internal dynamics of the peremptory ideation, we propose that this internal ideation might find alignment with a similar external ideation coming from various media sources and political leaders. We can envision the internal ideation “hooking on” to the ideological “train” that is passing by outside ourselves, hitching our own train of thoughts and emotions to an external train.
Irrational and anxiety-saturated external ideation can be particularly attractive, given that the internal ideation is likely to be quite primitive. The internal ideation is often swirling with ghosts and goblins from our own childhood and the collective (unconscious) heritage of our ancestors and culture. With this powerful alignment of internal and external, we become victims of collective peremption. Attention is demanded by this new coalition: we are obsessed, closed-minded, passionate and driven to action.
It can be a quite dangerous condition when the train is drawing in the peremptory ideation of many people. We find that societies in which there is a history of collective trauma (such as the Holocaust, slavery, war, famine, ostracism) will produce what is now called the “societal unconscious” (Weinberg and others). A common set of fearful images is held by members of this traumatized society. Citizens often report similar trauma-related dreams, as well as similar bouts of anxiety that are easily triggered by events that in many other societies produce only mild stress.
Anxiety can be produced by a potential loss of confidence in a chosen leader, or by mild public protests regarding some social ordinance. It might very well be that the “social unconscious” material shows up in our internal peremptory ideation. If this is the case, then one can imagine that alignment with the external images being carried by an ideational train is likely to be quite common. Both the internal psyche and the external ideological train will be holding the same social unconscious material. A “perfect” storm of prejudice, intolerance, fear of the “other” and (eventually) violence is created. One final point: it is more likely that this ideational train will be fully operational and pulling in external images when we are tired and overwhelmed. Such a state is not uncommon when living in the world of VUCA-Plus.
Piercing the Veil of Misinformation and Lies
How do we pierce the veil of misinformation and lies, given the many avenues open for misinformation and lies to prevail in our personal lives, the groups of which we are members, the societies in which we live – and even the unconscious world that swirls within us? We suggest that some of these challenges can best be met by acknowledging that there are three domains in which lies and misinformation dwell. Each of these domains must be entered, and the lies and misinformation residing in them must be addressed, if we are to be successful in moving beyond them. Here are the three domains.
The Three Domains
The domain of information is entered whenever an attempt is made to find out more about the current condition in which we find ourselves. We act as researchers, asking questions that can be answered by a systematic collection of information. For example, if a college wants to know which of four academic programs is potentially most attractive to a particular group of prospective students, then a sample of these students might be asked to indicate under what conditions they would be likely to enroll in each of these four programs. The information obtained is valid if the students have been honest, if the right questions were asked and if the sample used was representative of the entire pool of potential students. If the information is valid, then the college should be able to state with some confidence which of the academic programs is most attractive to this population of potential students.
In understanding the current situation, however, we must not only seek information that is valid. We must also seek information that is useful. It must relate to the target that the leader and her team wish to reach. Thus, if the target concerns increased financial viability for a college, then a market survey will be of little use, even if the information obtained were valid. It is only useful if the costs associated with each of the four programs also can be determined, along with the acceptable tuition levels for this population of students regarding each of the four programs. It is surprising to see how often information is collected that relates only marginally to the problem faced by an organization!
Many realistic plans can be set, and problems can be solved, through the systematic collection of valid and useful information. This lies at the heart of rational, linear planning and modern management processes. In other instances, unfortunately, effective leadership cannot exclusively be based on information about the current situation. Many organizational decisions, particularly those involving people rather than machines, center, at least in part, on conflicting goals, objectives or desired outcomes. Attention must shift from the domain of information to that of intentions. This domain is likely to be particularly important in today’s society, where conflict in values and purposes is so common.
The domain of intentions is entered whenever we attempt to understand and clarify our personal or collective mission, vision, values or purposes. While research prevails in the area of information, clarification prevails in the area of intentions. Unlike traditional approaches to the clarification of intentions, which tend to emphasize enforcement or modeling, intention clarification focuses on the way in which mission, vision, values and purposes come into being. As we become clearer about our intentions and the overall intentions of the system in which we are operating we begin to produce solutions that are more and more consistent with these intentions. The process of clarifying intentions becomes richer and more profound as each of us moves toward greater maturity. A mature intention is freely chosen; it is not imposed (an imposed requirement is part of the situation). A mature statement of mission, vision, value and purpose is prized and affirmed; this statement serves as a guiding charter for one’s department or organization and is repeatedly acted on in a consistent and persistent manner.
The domain of ideas is entered whenever we attempt to generate a proposal intended to move from the current to the desired state. Ideas are sometimes fragile, often misunderstood, and easily lost. While information exists everywhere, we often ignore or misinterpret it. But we can usually go back and retrieve it. Similarly, even though intentions may be ignored or distorted, they resist extinction. Their resistance to change is often a source of frustration: old values linger as do old visions and purposes. Good ideas, on the other hand, are easy to lose and hard to recover.
Settings must be created in which ideas can readily be generated and retained. Two processes are essential. Divergence produces creative ideas. Divergence requires a minimal censorship of ideas, minimal restriction on people offering their own suggestions and taking risks, and minimal adherence to prescribed rules or procedures for the generation of new ideas. The second process is convergence. People must be given the opportunity to build on each other’s ideas, to identify similarities in their ideas, and to agree upon a desired course of action. Convergence requires leaders to observe specific rules and procedures, to listen to ideas and to be constructively critical of other ideas.
Problem-Solving
At this point, we are ready to make use of the analyses already engaged regarding the domains of intentions, information and ideas. While we enter these domains frequently when we are navigating our daily life, they come to the fore in particular when we are confronting a problem that is not easily solved. It is at these challenging moments that we are most likely to be attracted to readily accessible information and intentions that have been manufactured by other people. Misinformation and lies are abundantly available to lead us in the wrong direction. Given our vulnerability to misinformation and lies at these problem-solving points in our life, we will focus on processes that can be effectively deployed when facing a problem.
Intentions [Desired State]: This is a terminating point. What are the goals, aims, ends, purposes, objectives and desired outcomes to be achieved? A description or portrait is offered of how the outcome will look and work.
A critical review should be engaged that not only helps us determine what we really care about regarding outcomes, but also helps us discern what we don’t really care about and that helps us surface the motives behind the lies and misinformation that swirl around our head and heart. The main point to be made here is that this discernment is not about finding the one thing we care about. It is about finding the multiple things we care about and identifying the relationships between these various outcomes. We will be true to ourselves when we recognize that we care about more than one thing.
We can frame this important point by turning to an often-used metaphor. We can think of our outcomes as being like the target we focus on when shooting arrows or darts. The key point is that a target is not just the bullseye. While a bullseye represents the center point of the intentional domain, the target represents the broader setting in which a number of different intentions can be identified. Some of these intentions reside very close to the bullseye – and might in fact reside inside the bullseye itself – being at the very heart of the matter.
Other intentions reside at some distance from the bullseye and are either close to other intentions (complementing one another) or at opposite sides of the target (serving as opposing or even incompatible intentions). One of the falsehoods associated with many lies and sources of misinformation is that there is one intention and only one intention when exploring any problem, formulating any policy or plotting out any plan. It would be a strange (and quite challenging) target indeed if it was very small and consisted only of the bullseye.
As we have already noted, Daniel Kahneman and his two colleagues, Olivier Sibony and Cass Sunstein (2021), write about the distinction between bias and noise. Let’s go a bit further than we did before into understanding this distinction. They begin with a story about assessing the success of someone shooting arrows into a target. One desirable outcome would be for all the arrows to hit the target in the same area. When this occurs, we can applaud the consistency of the archer. Another outcome would be for the arrows to arrive all over the target. Typically, we devalue this outcome. The archer has not been consistent in directing arrows toward the target.
Kahneman, Sibony and Sunstein (2021) suggest that these assessments of success must be questioned. The first outcome indicates only that there is consistency—not that the arrows have arrived at or near the bullseye. The arrows could cluster at some point at quite a distance from the bullseye. This placement would reveal a BIAS. Conversely, arrows arriving at many places on the target reveal NOISE. Our authors suggest that these are quite different flaws in the performance of the archer—and that both Noise and Bias are to be found frequently in the judgements made by most of us.
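To make this distinction concrete, here is a minimal sketch in Python (our own illustration, not taken from Kahneman, Sibony and Sunstein; the arrow coordinates are invented). Bias is measured as the distance of the arrows’ average landing point from the bullseye; noise is measured as the average scatter of the arrows around that landing point.

```python
import math

# Hypothetical arrow landings (x, y); the bullseye is at (0, 0).
# The coordinates are invented purely for illustration.
biased_archer = [(4.1, 3.0), (4.3, 2.8), (3.9, 3.2), (4.0, 2.9)]     # tight cluster, far from center
noisy_archer = [(2.5, -3.0), (-3.1, 1.8), (0.4, 3.9), (-2.2, -2.6)]  # scattered around the center

def bias_and_noise(arrows):
    """Bias: distance of the cluster's center from the bullseye.
    Noise: average distance of each arrow from the cluster's center."""
    n = len(arrows)
    cx = sum(x for x, _ in arrows) / n
    cy = sum(y for _, y in arrows) / n
    bias = math.hypot(cx, cy)
    noise = sum(math.hypot(x - cx, y - cy) for x, y in arrows) / n
    return bias, noise

for label, arrows in [("biased archer", biased_archer), ("noisy archer", noisy_archer)]:
    bias, noise = bias_and_noise(arrows)
    print(f"{label}: bias = {bias:.2f}, noise = {noise:.2f}")
```

Running this sketch, the first archer shows high bias and low noise (consistent but off-center), while the second shows low bias and high noise (centered on average, but scattered) – the two distinct flaws in judgment that Kahneman, Sibony and Sunstein describe.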
Questions:
* How would you know if you have been successful in this endeavor?
* What would make you happy?
* Who else has an investment in this project and what do they want to happen?
* What would happen if you did not achieve this goal?
* What would happen if you did achieve this goal?
* What scares you most about not achieving this goal?
* What scares you most about achieving this goal?
Information [Current State]: This is where we are situated right now. This is the starting point that incorporates facts, opinions, and explanations about the current state. It contains predictions about change in the environment as perceived by the planners.
A critical review should be engaged that helps to surface untested and often self-fulfilling assumptions about the world. This is especially important when the world being assessed is filled with misinformation and lies. As we have noted throughout this essay, it is difficult to discern what is valid information and what is invalid. What are the sources of information that we can trust and what is suspect? Triangulation is one of the key tools to engage in this discernment process. Actually, Double Triangulation will often yield the best results. Triangulation is engaged when we look to at least three sources for the information we receive. Where does the information come from and how reliable is each of these sources? When we have only two sources, then we are caught in a dilemma if these sources disagree.
With a third source, we are likely to find some fit between two of the sources. This doesn’t mean that we discount the third, discordant source, for it might yield some important insights regarding the nature of biases that might exist in all three sources. We are best able to identify biases when viewing any phenomenon from multiple perspectives and with differing lenses. It is not only that we are likely to see different things from different perspectives; we are also likely to punctuate what we have observed in different ways (especially if some of the observers are studying the phenomenon over a short period of time while others are observing it over a much longer duration).
We also triangulate when looking to three different methods for the production of information that we receive. Is the incoming information being produced in different ways? Is it based in quantitative research and are those producing these numbers engaging in different modes of research (types of measurements being taken, research design being engaged, breadth and duration of data gathering)? Is a qualitative method appropriate (such as interviews, document review, or direct observations)?
Once again, with information being produced via three or more methods, we will usually find alignment between two of the findings that are being reported. As in the case of multiple sources, the discordant information produced by use of one of the methods can often produce insights regarding how the method being used can influence the information obtained. As noted by Gleick (1987) in his early report on the study of chaos, the method being used and level of detail being engaged by this method will often have a greater influence on the outcome of a research project than the “reality” being studied.
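Here is a minimal sketch of the source-triangulation logic described above, in Python. The three sources, the figures they report, and the closeness tolerance are all invented for illustration; real discernment involves judgment about the sources themselves, not just arithmetic on their reports.

```python
# Source triangulation: three hypothetical sources each report a value for the
# same "fact." When two of the three roughly agree, we treat that pair as
# provisionally trustworthy and flag the outlier for a closer look (the
# discordant source may reveal a bias worth understanding, not just an error).

def triangulate(reports, tolerance=0.10):
    """reports: dict mapping source name -> reported value.
    Returns (provisionally aligned sources, discordant sources)."""
    names = list(reports)
    agreeing = set()
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            va, vb = reports[a], reports[b]
            if abs(va - vb) <= tolerance * max(abs(va), abs(vb)):
                agreeing.update({a, b})
    discordant = [n for n in names if n not in agreeing]
    return sorted(agreeing), discordant

# Invented example: three estimates of the share of students favoring a program.
reports = {"student survey": 0.62, "enrollment records": 0.58, "faculty estimate": 0.35}
aligned, flagged = triangulate(reports)
print("provisionally aligned:", aligned)  # ['enrollment records', 'student survey']
print("examine for bias:", flagged)       # ['faculty estimate']
```

The same structure applies to method triangulation: replace the three sources with three methods (say, a survey, document review and direct observation) and ask where the findings converge and where the discordant method may be shaping what is seen.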
Questions that might be asked to determine the type of information to be collected and the ways this information is to be used:
* What are the most salient facts with regard to the circumstance in which you now find yourself?
* What are the “facts” about which you are most uncertain at the present time? How could you check on the validity of these facts?
* What are alternative ways in which you could interpret the meaning or implications of the facts that you do believe to be valid?
Nature and Causes of The Problem: We can first note that a problem exists when there is a gap between the current state (information) and the desired state (intention). Thus, we must first be sure that the information we have obtained is valid and useful, and that the intentions are clear and sufficiently broad (the target, not just the bullseye).
Second, we can focus on the domain of intentions to see if the problem can be best addressed by working in this domain. We might find that the problem exists because there are conflicting desired outcomes associated with this problem (outcomes located on opposite sides of the target). Typically, it is now a matter of sequencing several actions that must be taken to work toward both outcomes. Focus should be placed on action rather than on debating priorities regarding each outcome: “do them both!” should be the motto.
Third, we can focus on the domain of information. The problem might reside primarily in the contradictory or confusing information that we possess. This is where misinformation is likely to creep in. Triangulated analysis (sources and methods) will usually help to resolve this issue. If the contradictions still exist, then it is often useful to “test the market” by offering a description of potential actions to potential stakeholders or even conducting brief and limited pilot tests of these actions. The “real” world will usually “kick back” and let us know what is real and what is unreal.
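A minimal sketch of this diagnostic sequence follows, with the problem modeled as a gap between the current state and the desired state. The structure, field names and example values are our own invention for illustration, not a formal specification.

```python
# Diagnostic sequence from the three paragraphs above: confirm a well-formed
# gap exists, then check the domain of intentions (conflicting outcomes),
# then the domain of information (contradictions), before moving to ideas.
from dataclasses import dataclass

@dataclass
class Problem:
    desired_outcomes: list      # the target, not just the bullseye
    observations: list          # triangulated information about the current state
    conflicting_outcomes: bool = False
    contradictory_information: bool = False

def diagnose(p: Problem) -> str:
    if not p.desired_outcomes or not p.observations:
        return "no well-formed problem yet: clarify intentions and gather information"
    if p.conflicting_outcomes:
        return "work in the domain of intentions: sequence actions toward both outcomes"
    if p.contradictory_information:
        return "work in the domain of information: triangulate sources/methods, run a pilot"
    return "the gap is clear: move to the domain of ideas"

problem = Problem(
    desired_outcomes=["raise enrollment", "hold tuition steady"],
    observations=["applications down 12%", "survey favors program B"],
    conflicting_outcomes=True,
)
print(diagnose(problem))  # -> work in the domain of intentions: ...
```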
Here is a set of questions that might be addressed when exploring the nature of a problem:
* How do you know that there is a problem here?
* To what extent do other people see this as a problem? If they don’t, why don’t they?
* How long has this problem existed? How big is it? Is there any pattern with regard to its increase or decrease in magnitude?
* What are the primary cause(s) of the problem? What is different when the problem does and does not exist? What remains the same whether or not the problem exists?
* Who benefits from the continuing existence of the problem? In what ways do you benefit (even indirectly) from the continuing existence of this problem?
* What will you miss if and when this problem is resolved?
When the problem is particularly elusive or important, then a more formal and extensive causal/resource analysis might be engaged. Here is a brief description of this five-step process. Its distinctive feature is the comparison to be made with other comparable situations.
Causal/Resource Analysis: At the heart of causal/resource analysis lies the belief that a problem can best be understood if it is compared to another situation as much like itself as possible, but one in which the problem does not exist or is less serious. In most machine or production-line problems, this strategy is fairly obvious, although not always used. An engine that has been functioning correctly begins to misfire; a computer keyboard is no longer working; a bottling line begins to produce an unacceptable number of rejects.
No matter how obscure the cause of any such problem may seem to be at first, a comparison of the current problem situation with the same situation at an earlier point in time when the problem did not exist or was not as serious should provide evidence that will lead to the solution of the problem. Something has to happen to change one situation into another. This stance is particularly important in a polarized situation—such as exists in the mid-21st Century. We tend to believe that “it” has always been messed up and refuse to examine a time when things were “better.” Misinformation abounds regarding the “bad old days” that still exist in the “bad new days.” Under these circumstances, the causal/resource analysis tool we are describing in this section of the essay is of particular value.
Change over time becomes a critical point of analysis. This change becomes the clue that leads to the solution of most machine-related problems. This same strategy can be applied to some “people problems.” If there has been a deterioration in performance over time, a comparison of the current situation with an earlier situation will produce evidence of the cause of the problem in much the same way as a machine problem (although often with less certainty). Don’t give up on people or problems. They might be more elusive but are still amenable to causal/resource analysis.
In some cases, the problem has always existed. The bad old times might be a reality. Even in these cases, an immense amount of information can be acquired by comparing that situation to a different situation in which the problem either does not exist or does not exist to as serious a degree. If, for instance, the affirmative action program in your organization not only does not seem to be working but also has never worked, you can learn a great deal about the causes of that problem by comparing your program with a more successful program in a similar organization. The causes of the problem are almost certain to lie in these differences. The processes involved in the now-widely used tool called Benchmarking can be quite helpful in this regard—especially if engaged in an appreciative manner (Bergquist, 2003).
An emphasis on differences, however, will only help to isolate the cause of a problem; it will not isolate the means for solving that problem. Unfortunately, the similarities between a problem situation and a more desirable situation are often overlooked in a rush to solve the problem. Suppose, for instance, that your planning for the introduction of a new product does not seem to be going as well as usual. If you examine the similarities between the current problem situation and a more desirable one, it may become clear that the things that have not changed (your experience, for example, or your planning model) can be relied on as resources to help you work through the current difficult situation. You can turn to problem solving with a clearer understanding of your strengths.
The following sequence of steps suggests how one might conduct a causal/resource analysis.
STEP 1: IDENTIFY AND ANALYZE A COMPARATIVE SITUATION
To begin the process of identifying causes and resources, identify or create a situation with which the current situation can be compared. Three types of comparative situations are possible; they are as follows, listed in order of desirability from most to least desirable:
1. Type A: the situation as it currently exists compared with the same situation at some earlier point in time when the problem did not exist or was not as serious;
2. Type B: the situation as it currently exists compared with a similar situation in which the problem does not exist, or exists but is not as serious; and
3. Type C: the situation as it currently exists compared with the target.
It should be noted that sometimes it is possible to establish a comparative situation for either a Type A or Type B comparison in which the problem actually is worse than at present. Our experience, however, indicates that in the vast majority of problems you will ever encounter, the comparative situation will be one in which the problem does not exist or is not as serious. The possibility of establishing a comparative situation in which the problem is worse should be kept in mind, though, at least as a possibility.
Next, identify and collect relevant information about the comparative situation in terms of who, what, where, when, extent, and pattern. The kinds of questions you should ask about the comparative situation are as follows:
• Who is involved?
• What exactly is happening?
• Where is the comparative situation?
• What objects or processes are involved?
• When is the comparative situation taking place or how recently was it taking place?
• What is the extent?
• What is the pattern?
When you complete this step in a problem-solving effort, record your answers to these questions.
STEP 2: COMPARE AND CONTRAST THE CURRENT AND COMPARATIVE SITUATIONS
Look for major similarities between the actual and comparative situations. What forces, motives, influences, or drives exist in both situations? Those factors that are common to both situations may be resources that will help move toward problem solution.
Then examine the actual and comparative situations for differences. Be as specific as possible in terms of who, what, where, when, extent, and pattern.
STEP 3: IDENTIFY RESOURCES AVAILABLE TO SOLVE THE PROBLEM
Examine each similarity between the current and the comparative situation, answering these questions:
• Will this similarity help me to achieve an important goal or cluster of goals?
• Am I confident that this similarity is not likely to change during the course of problem solving?
If the answer to both questions is “yes,” then that similarity will be a significant resource in the solution of the problem.
STEP 4: DETERMINE THE MOST LIKELY CAUSE OF THE PROBLEM
Examine the differences between the current and the comparative situations. The most likely cause of the problem will be the potential cause that explains all of the information collected about the problem situation. When you complete this step in a problem-solving effort, record your answers to Steps 3 and 4.
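As a rough illustration of Steps 2 through 4, the following Python sketch represents the current and comparative situations as simple factor-to-observation mappings. The factors and observations are hypothetical; in practice each entry would come from the who/what/where/when/extent/pattern questions above.

```python
# Causal/resource analysis, Steps 2-4: shared factors between the current and
# comparative situations surface as candidate resources; differing factors
# surface as candidate causes. All names and values are invented.

current = {
    "who": "same project team",
    "planning model": "standard launch checklist",
    "timeline": "compressed to 6 weeks",
    "market feedback": "none gathered",
}
comparative = {  # Type A: an earlier launch in which the problem did not exist
    "who": "same project team",
    "planning model": "standard launch checklist",
    "timeline": "full 12 weeks",
    "market feedback": "two rounds of pilot testing",
}

# Step 3: similarities that remain stable during problem solving become resources.
resources = {k: v for k, v in current.items() if comparative.get(k) == v}

# Step 4: differences are the candidate causes; the most likely cause is the
# one that explains all of the information collected about the problem.
candidate_causes = {
    k: (comparative.get(k), current[k])
    for k in current if comparative.get(k) != current[k]
}

print("resources to build on:", resources)
print("candidate causes (comparative -> current):", candidate_causes)
```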
STEP 5: DETERMINE WHETHER THE PROBLEM IS UNIQUE OR GENERIC
One of the dangers of any approach to problem solving is that it can be seen as primarily reactive. People are taught to wait for a problem to happen, then to respond. At this point in integrated problem management, however, you can begin to move out of that reactive mode by pausing a moment to consider whether the problem is unique or simply a symptom of a broader or more generic problem.
If the problem is unique, you can move on with some hope that, once it has been solved, you will not see it again. If, on the other hand, the problem is generic, you need to decide whether the symptoms are significant enough to warrant continued attention. If they are, you need to continue managing the immediate problem. Once those symptoms are under control, however, you might want to address the more generic problem by returning to the beginning of the problem-solving process. If the symptoms are not significant enough to demand immediate attention, you might want to start addressing the generic problem.
Ideas [Proposal for Moving from Current to Desired State]: What is the best path from the situation to the target? This is where the means, plans, strategies, implementation procedures, and possible actions are identified.
With some clarity gained regarding the nature of the problem that is being addressed—and with misinformation and lies hopefully being avoided—it is time to move toward finding the best set of actions to be taken in addressing this problem. While some initial proposals or pilot tests might have been engaged as a way to gain greater clarity regarding the current situation (domain of information), the focus is now on preparing a set of actions that not only take into account the valid and useful information that has been collected, but are also directed toward the desired outcomes that have been identified.
Idea generation usually involves two steps. The first is based on the value of expanding the range of possible actions. This is often labeled “divergent” problem-solving. The second step is based on the value of homing in on a small number of potential actions—selecting from the broader range of options identified in the first step. This second step is often labeled “convergent” problem-solving.
Divergence: many “brain-storming” and “out of the box” planning tools are available to help open the doors for the production of diverse ideas. We are particularly fond of a tool called Morphological (Shape) Analysis. A problem-solving group engages in divergent and creative processes when it changes the shape of a situation (information). Instead of designing a program for fifty people, what if you first designed it for one person, or for 500 people?
The shape of a target (intentions) can also be modified. What if a program is designed to bring together urbanites from New York City with members of a primitive tribe in Papua? Instead, the program can be directed toward teaching a new set of leadership skills not to adults but instead to five-year-old children. Finally, the shape of possible solutions (ideas) can be altered. The solutions can be absolutely “silly” or absurd. They can be absolutely unattainable or require massive financial outlays (or require no money at all).
Each of these changes in shape can not only open up previously neglected ideas but also surface previously untested assumptions (“What would be the benefits of offering this program to one person or many people at the same time?” “Why not bridge the big gap across cultures? What are the fundamental truths about human beings?” “Could we make this program so accessible and user friendly that it could work with children?”). The challenges posed when doing Morphological Analysis are particularly appropriate when misinformation and untested assumptions are alive and well. By pushing the boundaries, we are more likely to surface what is and is not real about our world.
One final point: it is often even more important to ensure that those engaging in these divergent processes themselves represent a diversity of perspectives and experiences. We are reminded of the founding work done by the Synectics group, which not only offered some very powerful divergent processes (related to something called “spectrum analysis”) but also typically invited people from many departments in an organization to work on a specific problem.
Convergence: when we have sown many seeds (ideas) in a problem-solving venture, it is time to find out which seeds yield a healthy outgrowth. We can do this by allowing multiple projects to be engaged, and then determining which work and which don’t. We can take a somewhat more realistic step by setting up several limited “pilot tests” that enable us to see how a particular idea plays out without devoting significant resources to these pilot efforts. Usually, we don’t have the luxury of engaging this “survival of the fittest” strategy (even if restricted to pilot tests). This is especially the case in a polarized setting where each side is waiting for the failure of the other side. Instead, we must make the difficult decision(s) to select one of the ideas, or to combine several of the ideas, and begin planning for their implementation.
We can evaluate an idea by returning to the domains of information and intentions. The questions to be asked are rather straightforward. With regard to the domain of information we can ask: “Does this idea fit with what we know about the real world in which this idea would be implemented?” The domain of intentions is addressed with the following question: “To what extent is this idea, if implemented, likely to move us toward one or more of the desired outcomes on our target?” If we have done a good job with our domains of information and intentions, the answers to these questions are likely to be forthcoming and valuable. The causal/resource analysis will also provide some of the answers when we begin to converge on a specific idea: “How does this idea relate to what we know about past attempts (successful or unsuccessful) to address similar problems in our own organization or in other comparable organizations?”
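The following sketch shows one way these convergence questions might be turned into a rough scoring exercise. The ideas, criteria and scores are invented; in practice the numbers would come from the group’s own deliberation, and the scoring is an aid to discussion rather than a substitute for it.

```python
# Convergence as a rough scoring exercise against the three questions above:
# fit with what we know (information), movement toward desired outcomes
# (intentions), and fit with comparable past attempts (causal/resource
# analysis). All ideas and scores are hypothetical, on a 1-5 scale.

ideas = {
    "pilot a hybrid program":     {"information_fit": 4, "intention_fit": 5, "past_attempts_fit": 3},
    "redesign for 500 people":    {"information_fit": 2, "intention_fit": 4, "past_attempts_fit": 2},
    "partner with another group": {"information_fit": 3, "intention_fit": 3, "past_attempts_fit": 4},
}

def total(scores):
    # Equal weighting here; a real group might weight intentions more heavily.
    return sum(scores.values())

for name, scores in sorted(ideas.items(), key=lambda kv: total(kv[1]), reverse=True):
    print(f"{total(scores):2d}  {name}  {scores}")
```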
Here are questions that might be asked as a way to generate appropriate ideas:
* What have you already tried to do when seeking to solve this problem and what did you learn from these efforts?
* What actions have you taken that somehow reduced the scope or impact of the problem—even if this action was not intended to address this problem? What did you learn from this serendipitous impact?
* How might other people help you solve this problem—especially those who have not previously been involved with this problem? What other resources which have not previously been used might you direct to this problem?
* What would happen if you just ignored this problem? What would happen if you devoted all of your time and resources to solving this problem?
* What is the most unusual idea that you have about solving this problem? What solutions have you dreamed of or thought about at a moment when you were particularly tired or frustrated?
* What would you do if you had much more time to solve this problem?
* What would you do if you had very little time to solve this problem?
* If you were “king” or “queen” what solution(s) would you impose to solve this problem? If you were a “fool” or had nothing to lose in trying something out, what would you do in attempting to solve this problem?
Implementation: A critical moment often occurs when information in particular, and even intentions, are put to the test. Is the information on which the idea is based valid and useful? The “anvil of reality” is used to hammer on an idea to see if it holds up. The hammering can actually make the idea stronger – if the foundational information is valid. Misinformation and lies are often revealed when action is taken. Many years ago, John Dewey suggested that we need to take action if we wish to find out what is really occurring in the world. More recently, psychologists such as Chris Argyris and Don Schon have written about “action learning,” suggesting that there is nothing wrong with making mistakes (for example, finding out that our information is not valid), as long as we learn from these mistakes and don’t keep repeating them.
Perhaps misinformation and lies are best confronted not by freezing in place, waiting for the correct information to come forth or for the desired outcomes to become clearer. No miraculous guide is likely to arrive and lead us to the promised land. Rather, the lies and misinformation might best be confronted by taking some action and learning from it about the nature of the “real” world and about the outcomes that can realistically be accomplished given the knowledge and resources available to us at the present time.
Obviously, the primary purpose of implementation is not to learn about the world or gain greater clarity about our intentions. It is to solve a problem. It is about reducing the gap between a current state of affairs (domain of information) and a desired state of affairs (domain of intentions). We need to establish clear criteria for determining the level of success (building on the established target) as well as a realistic timeline for assessing the success. Given this assessment, modifications of the action that has been taken can be made—while learning also takes place.
Insights: There is one final “I” to be engaged. This is the reflection back on the action taken, “after the dust has settled and the battle waged.” It is important to squeeze the last bit of learning from this problem-solving venture. The reflective process should produce insights regarding not only this particular problem (building on the action learning engaged during the implementation phase), but also the very process of problem-solving itself. This second level of insight is often referred to as “meta-learning” or “second-order learning.” While this specific problem might not occur again in the future, there will inevitably be other problems of a comparable nature and scope. We can learn how to do an even better job of addressing problems if we openly and candidly review each of the problem-solving steps we have taken and consider ways to do a better job next time.
Once again, we are not going to avoid making mistakes—especially in a world filled with misinformation and lies—but we can avoid making the same mistakes (regarding the solving of problems) in the future. We just need to devote some time to reflecting on the mistakes that have been made. By the way, as we noted in describing the causal/resource analysis, we can also learn from our successes. What did we do “right” in solving this problem, and how can we replicate this successful engagement of the problem-solving process the next time around?
Domain Interdependence: We have outlined a specific sequence of movements between the three domains of intentions, information, and ideas, with intentions helping to guide the gathering of information and both helping to produce appropriate ideas. This sequence, however, is not set in concrete. One can move from any one of these domains to either of the other domains. It is often valuable to move back and forth between the domains of information and intentions. A return to the domain of either information or intentions can be activated at the point when ideas are being considered. We offer the following list of interdependencies between the three domains.
Types of Interdependence:
* Information to Intentions: Dissatisfaction with the situation implies a particular target as a standard of comparison. Conversely, any suggested target implies by comparison what is unsatisfactory about the current situation.
* Intentions to Ideas: A target defines the results desired from any proposal. Conversely, any proposal embodies assumptions about the nature of the desired target.
* Ideas to Information: A proposal embodies assumptions about the causes of the unsatisfactory situation and implies resources and requirements for change. Conversely, the situation places limits on the effectiveness and feasibility of acceptable proposals.
Here are some questions that can be asked regarding interdependency among the three domains:
* When information is generated about the situation [domain of information], target information [domain of intentions] can be elicited by such questions as:
“If you could change the present situation, what would you want to accomplish?”
“What’s missing in the present situation that you want?”
“What would be your goal in improving the situation?”
* Proposal information [domain of ideas] can be generated from that same situational statement [domain of information] by such questions as:
“What might be done to improve that?”
“What kind of action does that seem to require?”
“What plan would use that resource?”
* When a target is identified [domain of intentions], situational information [domain of information] can be elicited by such questions as:
“In what ways does the present situation fall short of that goal?”
“Why does the present situation fall short of that goal?”
“What forces for improvement are there for reaching that goal?”
“What obstacles stand in the way of reaching that goal?”
* Proposals [domain of ideas] can be elicited from the same target statement [domain of intentions] by asking:
“What might be a possible way to accomplish that?”
“What steps might lead toward that goal?”
* In a similar manner, when a proposal [domain of ideas] presents itself, situational information [domain of information] can be elicited by asking:
“What might that improve in the present situation?”
“What part of the problem do you see that dealing with?”
“What resources are there for doing that?”
* And, finally, target information [domain of intentions] can be elicited from that proposal [domain of ideas] by asking:
“To accomplish what?”
“In order to do what?”
“What objective does that proposal aim at?”
Conclusions
Problem solving often seems to wander aimlessly from topic to topic without ever actually coming to grips with the problem at hand. During this wandering, it is easy to pick up lies and misinformation. Ideational trains can distract us and distort reality. We can turn inappropriately to “experts” who promise to steer us in the right direction. Instead, we can engage in a thoughtful process that enables us to enter into “slow thinking.”
By categorizing statements in visible columns according to information, intentions and ideas, and by using statements in one domain to bring forth inquiries in other domains, we can become more effective and efficient in our problem-solving efforts—even when misinformation and lies seem to be prevalent and are knocking on our door.
___________________
References
Ariely, Dan (2012) The (Honest) Truth About Dishonesty. New York: Harper.
Aronson, Elliot (2018). The Social Animal. New York: Worth Press.
Bergquist, William (2003) Creating the Appreciative Organization. Harpswell, Maine: Pacific Sounds Press.
Bergquist, William (2013) Collective Intelligence: Collaboration or Collusion? Library of Professional Coaching. https://libraryofprofessionalcoaching.com/concepts/organizational-theory/collective-intelligence-collaboration-or-collusion/
Bergquist, William (2020) “Leadership and Anxiety: Containment and Metabolism I: Anxiety in a VUCA Plus Environment.” Library of Professional Psychology. https://psychology.edu/library/leadership-and-anxiety-containment-and-metabolism-i-anxiety-in-a-vuca-plus-environment/
Bergquist, William (2021) “The Cosmopolitan Expert: Dancing with Numbers and Narratives” Library of Professional Coaching. https://libraryofprofessionalcoaching.com/applicationsuses/leadership-coaching/the-cosmopolitan-expert-dancing-with-numbers-and-narratives/
Bergquist, William and Vikki Brock (2008) “Coaching Leadership in the Six Cultures of Contemporary Organizations” in D. Drake, D. Brennan and K. Gørtz (eds), The Philosophy and Practice of Coaching: Insights and Issues for a New Era. San Francisco: Jossey-Bass.
Bergquist, William and Ken Pawlak (2008) Engaging the Six Cultures of the Academy. San Francisco: Jossey-Bass.
Bion, Wilfred (1961) Experiences in Groups. New York: Basic Books.
Bledstein, Burton (1976) The Culture of Professionalism: The Middle Class and Development of Higher Education in America. New York: Norton.
Danesi, Marcel (2020) The Art of the Lie. Guilford, Connecticut: Prometheus Press.
Gergen, Kenneth (2000) The Saturated Self. (Rev.Ed.) New York: Basic Books.
Gleick, James (1987) Chaos. New York: Viking Penguin.
Hoffer, Eric (1951) The True Believer. New York: Harper and Row.
Kahneman, Daniel (2013) Thinking Fast and Slow. New York: Farrar, Straus and Giroux.
Kahneman, Daniel, Olivier Sibony and Cass R. Sunstein (2021) Noise: A Flaw in Human Judgment. New York: Little, Brown and Company.
Klein, George (1967) “Peremptory Ideation: Structure and Force in Motivated Ideas.” Psychological Issues, Vol. V, No. 2-3. New York: International Universities Press, pp. 78-128.
Levy, David (2017) “6 Reasons People Lie When They Don’t Need To.” Psychology Today Australia.
Pfeiffer, Dan (2022) Battling the Big Lie. Grand Central Publishing. Kindle Edition.
Prigogine, Ilya (1984) Order Out of Chaos. New York: Bantam Books.
Renstrom, Joelle (Retrieved July 10, 2022) “How science helps fuel a culture of misinformation.” Nieman Journalism Lab (niemanlab.org).
Richardson, Heather Cox (2022) Letters from an American, August 3, 2022. Retrieved August 4, 2022.
Rokeach, Milton (1960) The Open and Closed Mind. New York: Basic Books.
Vaill, Peter (2008) Managing as a Performing Art. San Francisco: Jossey-Bass.
Wood, Janice (Retrieved July 7, 2022) “Why Do We Believe Lies Even After They Are Proven Wrong?” psychcentral.com. The report described was published in Psychological Science in the Public Interest.
Young, Dannagal (Retrieved December 27, 2022) “Why Does Misinformation Spread? Human Behaviour Plays a Big Part.” Thrive50Plus Magazine: “Misinformation succeeds, in layman’s terms, because it makes people feel good. People believe stories that reinforce or reward the way that they see the world. They share stories that boost their ego or make them feel like part of a team.”