Jonah Lehrer (2009) similarly references studies conducted at MIT in which students given access to large amounts of data performed poorly in predicting stock prices compared with a control group of students given far less information. He notes that the prefrontal cortex has great difficulty NOT paying attention to large amounts of information, which can overwhelm the brain's ability to estimate and predict. Access to excessive quantities of information can have "diminishing returns" when conducting assessments and predicting future outcomes, he says.

Lehrer comments that corporations, in particular, often fall into the "excessive information" trap, investing huge amounts of resources in collecting data that then overwhelms and confuses the human brain rather than informing decision-making. He describes the remarkable situation of medical doctors diagnosing back pain several decades ago. With the introduction of MRI in the 1980s, and with far greater detail available, medical practitioners hoped that increasingly accurate predictions of the sources of back pain would follow. The opposite happened. The massive amount of detail produced by the MRI actually worsened their assessment and predictive capabilities, and poorer assessments were made.

Kahneman refers to scenarios that contain a high level of complexity, uncertainty and unpredictability as "low-validity environments". Without doubt, assessing and predicting the outcome of cultural change initiatives falls into this category.
Experts are “less competent than they think they are”
Kahneman explains the state of our deep knowledge and experience in so many fields in the following way: "Psychologists have confirmed that most people genuinely believe they are superior to most others on most desirable traits…. (for example) Leaders of large businesses sometimes make huge bets on expensive mergers and acquisitions acting on the mistaken belief they can manage the assets of another company better than its current owners … (in many cases) they are simply less competent than they think they are".

Kahneman rather humorously notes that humans are "incorrigibly inconsistent" in making judgments about complex situations. While his description is humorous, the reality can be serious. He goes on to describe situations in which experienced expert radiologists evaluating the same chest X-rays as "normal" versus "abnormal" contradict each other 20% of the time. Sometimes they even contradict their own evaluations on a second assessment.

Similarly, my personal experience driving culture change in large corporations is that many executive leaders (usually with very strong and dominant personalities) hold strong opinions about the future success of their culture change initiatives. With differing backgrounds and experience, and often relying on culture research studies that produce large quantities of statistics (which, in my view, can distract focus), their views of what needs to be done and their predictions of future outcomes are often "all over the map". This inconsistency amongst leaders is "destructive of any predictive validity", Kahneman says. He labels this sense of overconfidence the "hubris hypothesis". These executives are most often "less competent than they think they are", he says, and I agree.