Dan Ariely brings in a similar example. As in the case of Lewin’s women, Ariely (p. 80) found that “when a speaker is asked to prepare a short lecture about the benefits of a certain drug, the speaker would begin to believe his own words.” Ariely (p. 80) points to the work done by Lewin and many others: “Psychological studies show that we quickly and easily start believing whatever comes out of our own mouths, even when the original reason for expressing the opinion is no longer relevant. . . . this is cognitive dissonance at play.”
In the case of both Lewin’s and Ariely’s presenters, a dissonant state is created. On the one hand, the presenters want to believe that they are honorable people who never (or rarely) lie. On the other hand, here they are telling other people to use a specific drug or to serve meatless meals even when they might not believe what they are espousing. Something has to change, and it is much easier to believe in what is being espoused than to change a fundamental view of themselves.
Ariely (2012, p. 27) expands on these initial examples of cognitive dissonance. He proposes that there are two fundamental motives that are in conflict:
What is going on here? . . . In a nutshell, the central thesis is that our behavior is driven by two opposing motivations. On one hand, we want to view ourselves as honest, honorable people. We want to be able to look at ourselves in the mirror and feel good about ourselves (psychologists call this ego motivation). On the other hand, we want to benefit from cheating and get as much money as possible (this is the standard financial motivation). Clearly these two motivations are in conflict. How can we secure the benefits of cheating and at the same time still view ourselves as honest, wonderful people?
When referring to “cheating,” Ariely is not just talking about fudging financial reports or making the case for the food being served or the drug being prescribed. He is writing more generally about the ways in which we alter our reality in order to retain a positive image of ourselves. This is where our focus in this essay on lying extends into the realm of politics and into a deeper understanding of the crisis of expertise.
We not only pretend to be experts ourselves but also want to believe, uncritically, in an “expert” who is aligned with our own belief system. It is important, in other words, that we retain our personal sense of being “smart” (we can be our own “experts”) and “discerning” (we chose the right people to be our “experts”). Furthermore, we are honorable and would never deceive another person (unless they are our “enemies”) or sell out our “cause” for money, fame, or friendship. All of these constructs of self must be retained—yet they often are oppositional. We are faced with the profoundly important task of resolving this dissonant opposition.
In The Art of the Lie, Danesi (2020) puts it this way:
Suppose an individual believes something with his whole heart; suppose further that he has a commitment to this belief and he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor for convincing and converting other people to his view.