The Black Swan: The Impact of the Highly Improbable

The Non-fiction Feature

Also in this Monthly Bulletin:
The Children’s Spot: The Phantom Tollbooth by Norton Juster
The Product Spot: Foundation (Apple TV+)

The Pithy Take & Who Benefits

Nassim Nicholas Taleb, a former risk engineering professor and derivatives trader, offers an expansive look at the highly improbable. He examines Black Swans—extremely impactful outliers that “no one sees coming”—and discusses their effects and the incompleteness of knowledge. Why do we process the past so poorly? How do we interpret knowledge? Why are we so bad at predicting? And how should we proceed, knowing that Black Swans may be waiting for us?

Taleb scoffs at the notion that his ideas can be squeezed into a certain framework, such as, say, an outline on a non-fiction book website. It’s true that the book covers a formidable range of topics, many of which I will not attempt to summarize (the circularity of statistics, Mandelbrotian randomness, etc.).

Despite this, I believe Taleb makes some incredibly salient points about knowledge—and how we mistreat knowledge—that can be summarized (I’m sure this would make him cringe). I think that this book is for people who seek to understand: (1) what a Black Swan event is and how it affects the world; (2) why our impaired ability to process knowledge prevents us from seeing Black Swans; and (3) how to improve prediction and use Black Swans to our advantage.


The Outline

The preliminaries

  • A Black Swan is an event that:
    • Is an outlier (it’s outside the realm of regular expectations; the past does not convincingly point to its possibility);
    • Is extremely impactful;
    • Is rationalized after the fact: people come up with explanations for its occurrence, making it seem explainable and predictable.
  • The extreme, unknown, and improbable actually dominate our world.
    • Despite this, people act as though things are predictable (e.g., producing 30-year projections of oil prices).

How people deal with knowledge

Although Black Swans are unpredictable, the way we deal with knowledge (not well) makes us especially vulnerable to the negative effects of Black Swans.

The first reason for blindness to Black Swans: confirmation error.

  • Think of a turkey that’s fed every day. Each day’s feeding reinforces the bird’s belief that the nice human takes care of it. Then, on the Wednesday before Thanksgiving, something unexpected happens that will change the turkey’s belief.
    • For the turkey, that Wednesday is a Black Swan. For the butcher, it isn’t, because it isn’t unexpected. 
  • Confirmation error is the tendency to look at what confirms our knowledge, not our ignorance.
  • Someone who observed the turkey’s first thousand days (but not the Wednesday) would say that there’s no evidence of the possibility of a large, disruptive event (a Black Swan).
  • People take past instances that corroborate their theories and treat them as evidence, even though a series of corroborative facts isn’t necessarily evidence. 
    • For instance, if you see someone kill, then you can be pretty sure that he’s a killer. If you don’t see him kill, you can’t be sure that he’s innocent.
  • Blindness also comes from logical errors.
    • In another example, many confuse the statement “almost all terrorists are Moslems” with “almost all Moslems are terrorists.” 
    • Assume that 99% of terrorists are Moslems. 
    • This means that only about 0.001% of Moslems are terrorists.
      • (There are more than one billion Moslems and around 10,000 terrorists.)
    • This logical error makes a person unconsciously overestimate the odds of a random individual Moslem being a terrorist by close to 50,000 times (see the short calculation after this list).
  • Some people can effortlessly solve a problem when it’s framed as a social situation, but struggle when the same problem is presented abstractly, or vice versa.
  • This is because people use different mental skills in different situations: the brain lacks a central machinery that applies logic equally to all possible situations.
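A minimal back-of-the-envelope calculation of the point above, using the summary’s rough figures (more than one billion Moslems, around 10,000 terrorists, 99% of terrorists being Moslems). The numbers are illustrative, not precise data.

```python
# Back-of-the-envelope arithmetic for the base-rate confusion described above.
# All figures are the book's rough illustrative numbers, not precise data.

muslims = 1_000_000_000            # "more than one billion Moslems"
terrorists = 10_000                # "around 10,000 terrorists"
p_muslim_given_terrorist = 0.99    # "assume that 99% of terrorists are Moslems"

# Number of terrorists who are Moslem under these assumptions.
muslim_terrorists = p_muslim_given_terrorist * terrorists

# The quantity that matters for judging a random individual: P(terrorist | Moslem).
p_terrorist_given_muslim = muslim_terrorists / muslims

print(f"P(Moslem | terrorist) = {p_muslim_given_terrorist:.0%}")
print(f"P(terrorist | Moslem) = {p_terrorist_given_muslim:.5%}")  # about 0.001%

# Confusing the two statements treats a ~0.001% probability as if it were
# closer to 50%; the overestimation factor is on the order of 50,000x.
print(f"Overestimation factor: {0.5 / p_terrorist_given_muslim:,.0f}x")
```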

The second reason for blindness to Black Swans: the narrative fallacy.

  • People like stories and simplification. The narrative fallacy is our predilection for compact stories, which ultimately blinds us to raw truths. 
  • The narrative fallacy addresses our impaired ability to look at sequences of facts without forcing an explanation into them.
    • Explanations bind facts together and help them make more sense.
      • But this increases the impression of understanding rather than actual understanding: we think we understand more than we actually do, because we use narratives to transform the past.
    • Think of a bunch of random words in a 500-page book. If the words are random, you can’t summarize. 
      • But if the book is a single sentence repeated over and over, you can remember it because you found the pattern. Your brain stores the pattern, which is more compact than raw information.
    • By simplifying through the narrative fallacy, we tend to leave out Black Swans.
      • For instance, journalists try to get facts, but then they tie them together in a narrative to give the impression of causality. Some academics and scientists do the same. This makes the world look simpler and more explainable than it actually is.
  • Narrativity affects our understanding of the Black Swan by messing up our projections.
    • In one experiment, forecasting professionals were asked to estimate the odds of the following:
      • (a) A massive flood somewhere in America that causes 1,000 deaths.
      • (b) An earthquake in California that causes massive flooding and 1,000 deaths.
      • Respondents estimated (a) to be less likely than (b).
        • An earthquake is a vivid, plausible cause, so the story feels more probable, even though (a) is the broader event (it includes quake-caused floods in California) and therefore must be at least as likely as (b); see the short sketch after this list.
    • Abstract statistical information doesn’t sway us as much as the narrative anecdote.
      • For example, in the late 1970s, a toddler fell into a well in Italy. The rescue team couldn’t pull him out, and all of Italy was concerned.
      • Meanwhile, the Lebanese were also incredibly absorbed in the child’s fate, even though people were being bombed to death just five miles away in Lebanon’s own civil war.
  • Researchers have mapped our activities into a dual mode of thinking: System 1 (experiential) and System 2 (cogitative).
    • System 1 is intuition. It’s emotional, produces shortcuts, and functions rapidly.
    • System 2 is thinking. It’s slower and makes fewer mistakes.
    • Most reasoning mistakes, especially when it comes to Black Swans, come from using System 1 when we think we’re using System 2.
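A minimal sketch of why scenario (b) can never be more likely than scenario (a): (b) is just one specific way for (a) to happen. The probabilities below are invented purely for illustration.

```python
# Hypothetical probabilities, invented only to illustrate the inequality.
p_quake_caused_flood = 0.0002   # (b): a California quake causes a flood killing 1,000
p_flood_other_causes = 0.0010   # every other way a 1,000-death flood could occur in America

# Scenario (a), "a massive flood somewhere in America", includes (b) as one of
# its possible causes, so its probability is the sum of the pieces.
p_b = p_quake_caused_flood
p_a = p_quake_caused_flood + p_flood_other_causes

print(f"P(a) = {p_a:.4f} >= P(b) = {p_b:.4f}")
# The vivid cause ("an earthquake") narrows the event, yet the added story
# makes the narrower event feel more probable to respondents.
```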

The third reason for blindness to Black Swans: silent evidence that is either difficult to see or is not discussed.

  • Silent evidence pervades everything connected to history (any succession of events viewed in hindsight).
    • The Roman thinker Cicero presented the following: Diagoras, a non-believer in the gods, is shown portraits of worshippers who prayed and then survived a shipwreck.
      • The implication is that praying protects one from drowning. Diagoras asked, “Where are the pictures of those who prayed, then drowned?”
      • Silent evidence can fool the casual observer into believing in miracles.
    • There is a belief among gamblers that beginners are always lucky. 
      • In reality, beginners will be either lucky or unlucky. 
      • The lucky ones continue gambling.
      • The unlucky stop and won’t show up in the sample. 
        • The dropouts are no longer part of the gambling community; hence, the myth of beginner’s luck persists (see the short simulation after this list).
  • Another type of silent evidence: the illusion of stability.
    • This bias lowers our perception of prior risks, particularly the ones we’ve survived: you were under serious threat, you survived, and in retrospect you underestimate how risky the situation actually was.
  • Silent evidence greatly weakens the notion of “because” that is often used by historians and scientists. 
    • It’s not that causes don’t exist. Rather, just be suspicious of the “because,” particularly in situations where you suspect silent evidence.
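A small simulation of the survivorship effect behind “beginner’s luck,” assuming (purely for illustration) fair odds on everyone’s first session and a 70% chance that a losing beginner quits; only the survivors remain visible to the gambling community.

```python
import random

random.seed(0)
n_beginners = 100_000
quit_prob_after_loss = 0.7   # assumed dropout rate for unlucky beginners

survivors_who_were_lucky = []
for _ in range(n_beginners):
    lucky_start = random.random() < 0.5           # fair odds for everyone
    still_gambling = lucky_start or random.random() > quit_prob_after_loss
    if still_gambling:
        survivors_who_were_lucky.append(lucky_start)

# Among all beginners, half were lucky; among the survivors we can actually
# observe, far more were, which is where the myth comes from.
observed_share = sum(survivors_who_were_lucky) / len(survivors_who_were_lucky)
print("Lucky first session, all beginners:     50%")
print(f"Lucky first session, survivors we meet: {observed_share:.0%}")
```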

In sum, people prefer narration and ignore what could have happened. Train your reasoning abilities to control decisions; nudge System 1 (emotion and intuition) out of the important ones.

We just can’t predict

Epistemic arrogance

  • People are arrogant about what they think they know; there’s a built-in tendency to think that we know a little bit more than we actually do, and that little bit of excess can get us into trouble.
  • Epistemic arrogance is the hubris concerning the limits of our knowledge: we overestimate what we know and underestimate uncertainty.
    • The problem is that ideas are sticky: once we produce a theory, we are not likely to change our minds, so those who delay developing theories are better off. 
      • When people develop opinions on the basis of weak evidence, they’ll have difficulty interpreting subsequent information that contradicts these opinions, even if this new information is obviously more accurate.
  • How do learning, education, and experience affect epistemic arrogance? It depends on the profession. Which disciplines have experts and which have none?
    • Experts who tend to be experts: livestock judges, test pilots, physicists, mathematicians, accountants.
    • Experts who tend to be not-experts: college admissions officers, economists, finance professors, political scientists.
  • Professions that deal with the future and base their studies on the non-repeatable past have an expert problem.
    • For instance, in one study, various experts were asked to judge the likelihood of a number of political, economic, and military events within a specified timeframe. 
      • The study revealed that the experts’ error rates were many times what they had estimated. There was no difference in results whether one had a PhD or an undergraduate degree. Professors had no advantage over journalists.
  • The experts didn’t realize that they weren’t good at their own business, in part because of how they spun their failures.
    • They tell themselves that they misunderstood the rules, or something was outside the scope of their field, or they were “almost right.”
  • In short, we tend to think “narrowly” (epistemic arrogance) and overestimate our ability to predict.

Predicting is naturally difficult

  • Mathematician Henri Poincaré explained nonlinearities: tiny changes in inputs that can lead to severe consequences (see the sketch after this list).
    • Consider the three-body problem. If there are only two planets, with nothing else affecting their course, you may be able to predict their behavior indefinitely.
      • But add a third body, like a small comet.
      • Initially the small third body will change almost nothing, but over time its effects on the two large bodies may become explosive.
  • The world is much more complicated than the three-body problem. 
    • There’s an additional stumbling block when social matters are involved, especially if you believe in free will.
    • The turkey could interpret its daily feeding as evidence of its safety or as evidence of coming danger.
      • So, the past can be misleading, and there are many degrees of freedom when interpreting past events.
  • Why do we plan?
    • Projecting allows us to cheat evolution: We can play multiple scenarios in our heads, as opposed to doing each thing that we want and suffering the consequences.
      • For instance, if you’re angry and you instinctively want to punch someone, you can first imagine the scenario and you realize that you’ll end up in jail, so you don’t punch.
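A toy illustration of the nonlinearity point above. Rather than the three-body problem itself, this uses the logistic map, a standard one-line system with the same property: a perturbation of one part in a billion (the “small comet”) eventually dominates the trajectory.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the chaotic logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)   # perturbed by one part in a billion

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}  "
          f"diff = {abs(a[step] - b[step]):.6f}")
# For the first steps the trajectories are indistinguishable; a few dozen
# steps later they bear no relation to each other, which is why long-range
# prediction of such systems fails.
```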

Asymmetry between past and future inhibits us from predicting well

  • When thinking about the relationship between the past and the future, you usually don’t learn from the relationship between the past and the past previous to it. 
    • There’s a blind spot. For example, you’re about to buy a new car, thinking it’s going to change your life, elevate your status, etc.
    • But, you forget that the last time you bought a car, you felt the same.
    • You don’t predict that the new car’s effect will eventually fade and that you’ll revert to the initial position. This is an avoidable prediction error.
  • Get knowledge from history, but don’t draw any causal links or try to reverse engineer too much.
    • The more we try to turn history into anything other than a list of accounts, the more we get into trouble.

Scalable v. non-scalable: the special difficulty of predicting outcomes in environments of concentrated success

  • Scalable professions are ones in which you aren’t paid by the hour and aren’t subject to the limitations of your labor.
    • There is much more uncertainty, and it’s subject to Black Swans. (Think of J.K. Rowling, musicians, etc.)
    • Scalable professions are more competitive, produce monstrous inequalities, and are more random.
  • Non-scalable professions, such as dentistry, consulting, or massage work, cannot be scaled because there’s a physical cap on the number of clients you can see.
    • This work is largely predictable: a single day’s income won’t be overly significant. It’s not driven by Black Swans.
  • Globalization has allowed the US to specialize in the scalable, idea-driven components of products while sending the less scalable, hourly work abroad.
  • The scalable/non-scalable distinction illuminates a clear differentiation between two varieties of randomness.
    • In the imaginary world of Mediocristan, particular events don’t contribute much individually, only collectively (like salary from a non-scalable job). 
      • When the sample is large, no single instance significantly changes the total.
      • Things like weight, height, and calorie consumption.
    • In the other imaginary world of Extremistan (like salary from a scalable job), a single event can disproportionately impact the total.
      • Things like wealth and social matters (not physical).
      • This world produces Black Swans.
  • The world today is more Extremistan than Mediocristan, yet we treat it as Mediocristan and further hamper our ability to predict Black Swans (see the short comparison below).
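A rough sketch of the two imaginary worlds, using stand-in distributions invented for illustration (heights from a bell curve, “wealth” from a heavy-tailed Pareto distribution): how much of the total does the single largest observation account for?

```python
import random

random.seed(1)
n = 100_000

heights = [random.gauss(170, 10) for _ in range(n)]     # Mediocristan-like
wealth = [random.paretovariate(1.1) for _ in range(n)]  # Extremistan-like (heavy tail)

for name, sample in (("height", heights), ("wealth", wealth)):
    share_of_total = max(sample) / sum(sample)
    print(f"Largest single {name} observation = {share_of_total:.4%} of the total")
# The tallest person barely moves the height total; a single extreme fortune
# can account for a large slice of the wealth total. Black Swans live in the
# second kind of world.
```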

Extremely practical measures that we can take

  • Accept that being human involves epistemic arrogance. Don’t try to always withhold judgment, and don’t try to avoid predicting (just try to predict in the right places).
    • Avoid unnecessary dependence on large-scale harmful predictions. Don’t listen to economic forecasters or to predictors in social science.
  • Distinguish between positive contingencies and negative ones.
    • A negative Black Swan business is one where the unexpected can be catastrophic (the military, homeland security, etc.).
    • A positive Black Swan business is where the unexpected can bring great boons (publishing, scientific research, and venture capital).
  • Seize anything that looks like an opportunity, because positive Black Swans have a necessary first step: you need to be exposed to them.

And More, Including:

  • Why the bell curve is an intellectual fraud (measures of uncertainty that are based on the bell curve disregard the possibility and impact of sharp jumps)
  • Mandelbrotian (fractal) randomness: the repetition of geometric patterns at different scales, revealing smaller and smaller versions of themselves, and how fractals allow us to account for a few Black Swans
  • Ten principles for a Black-Swan-robust society, especially as it relates to how we move forward from the 2008 financial crisis
  • The opacity of history (the illusion of understanding, the retrospective distortion, and the overvaluation of factual information)
  • How the Ludic Fallacy in particular blinds us to Black Swans

The Black Swan: The Impact of the Highly Improbable

Author: Nassim Nicholas Taleb
Publisher: Random House Trade
Pages: 480 | 2010
Purchase
[If you purchase anything from Bookshop via this link, I get a small percentage at no cost to you.]
