
Expertise And Ignorance: We Are All Ignorant—Some of Us Know It and Some Of Us Don’t


Mechanisms to Overcome “Ignorance”

Daniel Kahneman and his colleagues (Kahneman, Sibony and Sunstein, 2021) propose that training people to become aware of their biases, heuristics and the potential for noise is possible but difficult. They note: “Decades of research have shown that professionals (experts) who have learned to avoid biases in their area of expertise often struggle to apply what they have learned to different fields.” For example, weather forecasters have learned to avoid over-confidence in predicting weather patterns, yet they are just as overconfident as anyone else on general knowledge questions.

The role of a coach is valuable in these circumstances: reminding expert leaders that they are straying outside their areas of expertise and that well-understood biases can creep in unnoticed. The power of this coaching is that it is “in-the-moment.” By contrast, a training program on critical thinking is remote and often inaccessible at the point when a challenging judgment must be made. As Kahneman notes, “people often recognize biases more easily in others than they do in themselves.” Skilled coaches, whom Kahneman calls “decision observers,” can be particularly effective in this role.

Reducing Complexity

We reintroduce Lewis Goldberg, the gadfly who questioned many of the assessment practices engaged in by clinical psychologists. He was working alongside Daniel Kahneman and Amos Tversky at the Oregon Research Institute (ORI) in Eugene, Oregon. Michael Lewis (2017), the chronicler of those days at ORI, noted that Goldberg, along with his ORI colleagues, wanted to be able “to spot when and where human judgment is more likely to go wrong.” Studies were conducted on the way in which experts in several critical fields gathered information and made decisions. In most cases the experts engaged in very complex processes; however, their success in predicting specific outcomes was found to be no better than predictions made on the basis of very simple sources of data and analysis. Goldberg focused initially on clinical assessments. He found that “simple actuarial formulae typically can be constructed to perform at a level of validity no lower than that of the clinical expert” (Lewis, 2017, p. 171).

The work of Goldberg and his ORI colleagues led Daniel Kahneman (2011) to quote one of Goldberg’s mentors, Paul Meehl, whom Kahneman rates as “one of the most versatile psychologists of the twentieth century.” Meehl (along with Goldberg) proposed that one reason experts are almost always outperformed in predictive capability by simple algorithms is that they believe they can handle massive amounts of data and information, and they are almost always wrong. They know that they are very smart people, but they “try to be (too) clever, think outside the box and consider complex combinations of features in making predictions – Complexity (most often) reduces validity.”
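To make the contrast concrete, here is a minimal, purely illustrative sketch in Python. It is not Goldberg’s or Meehl’s actual formula; the cues, weights and “expert” below are invented for the example, and the toy data is constructed so that consistency wins by design. It simply compares an equal-weighted “actuarial” sum of a few cues against an inconsistent, noisier expert-style judgment:

```python
# Illustrative only: a "simple actuarial formula" here is just an equal-weighted
# sum of a few standardized cues, compared against a caricatured "expert" whose
# weighting shifts from case to case. The data is synthetic and rigged so that
# the consistent rule wins; the point is the argument, not the numbers.
import random

random.seed(1)

def simple_formula(cues):
    # Equal weights: add up the cues, nothing clever.
    return sum(cues)

def expert_judgment(cues):
    # Same cues, but with inconsistent weighting and extra case-to-case noise.
    weights = [random.uniform(0.2, 1.8) for _ in cues]
    return sum(w * c for w, c in zip(weights, cues)) + random.gauss(0, 1.5)

# Synthetic cases: the true outcome is an equal-weighted sum plus small noise.
cases = []
for _ in range(500):
    cues = [random.gauss(0, 1) for _ in range(4)]
    outcome = sum(cues) + random.gauss(0, 0.5)
    cases.append((cues, outcome))

def mean_abs_error(predict):
    return sum(abs(predict(cues) - y) for cues, y in cases) / len(cases)

print("simple formula error:", round(mean_abs_error(simple_formula), 2))
print("'expert' judgment error:", round(mean_abs_error(expert_judgment), 2))
```

The design choice mirrors the argument in the text: a rule applied identically to every case cannot be tempted into “clever” complex combinations, and that consistency, rather than sophistication, is what preserves predictive validity.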

