According to Investopedia (reviewed by Jim Chappelow, updated June 25, 2019):
“A black swan is an extremely rare event with severe consequences. It cannot be predicted beforehand, though many claim it should be predictable after the fact.
Black swan events can cause catastrophic damage to an economy, and because they cannot be predicted, can only be prepared for by building robust systems.
Reliance on standard forecasting tools can both fail to predict and potentially increase vulnerability to black swans by propagating risk and offering false security.”
“…the term was popularized by Nassim Nicholas Taleb, a finance professor, writer, and former Wall Street trader. Taleb wrote about the idea of a black swan event in a 2007 book prior to the events of the 2008 financial crisis. Taleb argued that because black swan events are impossible to predict due to their extreme rarity yet have catastrophic consequences, it is important for people to always assume a black swan event is a possibility, whatever it may be, and to plan accordingly.
He later used the 2008 financial crisis and the idea of black swan events to argue that if a broken system is allowed to fail, it actually strengthens it against the catastrophe of future black swan events. He also argued that conversely, a system that is propped up and insulated from risk ultimately becomes more vulnerable to catastrophic loss in the face of rare, unpredictable events.”
Examples of famous Black Swan phenomena include the crash of the U.S. housing market during the 2008 financial crisis; Zimbabwe’s hyperinflation in the 21st century, with a peak inflation rate of more than 79.6 billion percent; the bursting of the dot-com bubble in 2001; and the collapse in 1998 of the previously successful hedge fund Long-Term Capital Management, driven into the ground by the ripple effect of the Russian government’s debt default, something the company’s computer models could not have predicted.
When it comes to thinking about thinking, Taleb proposes that one big reason we expose ourselves to Black Swans is that crunching data leads to pragmatic simplification. As we make things simpler so that they can be calculable, we end up excluding the improbable and the unknowable. The fact that we exclude them doesn’t mean they don’t exist, and it doesn’t ensure they won’t present themselves, usually when we least expect them.
Essentially, the argument is that because we believe we are calculating all variables, we create the fiction that we are calculating all there is to calculate. As experience demonstrates, that can prove a dangerous assumption. His arguments support the need for a thinking model with built-in elasticity. This, by the way, is one of the reasons for the tension between Taleb and Kahneman, the ultimate empiricist.
Thinking, Fast and Slow
In 2011, some nine years after receiving the Nobel Prize for what proved to be the foundational work in Behavioral Economics, Daniel Kahneman published a best-selling popularization of his work and his thinking, titled Thinking, Fast and Slow. Ironically, it went out into the world six years after Gladwell’s Blink popularized his [that is, Gladwell’s] version of Thinking Fast, which he recast as “Thin Slice Thinking.” It also came eight years after Michael Lewis wrote the story of Moneyball, which, as Lewis did not realize until after its publication, was actually about how the Oakland A’s used Kahneman’s earlier work with Amos Tversky to redefine how sports teams chose their players, ultimately establishing the field of sports analytics.