Nassim N. Taleb’s treasure chest of a book and New York Times bestseller, “The Black Swan”, explores the nature of unpredictable events of huge consequence.
It sheds light on fundamental errors in thought, and on the comfortable but potentially devastating lies we are told by others, and by ourselves. Taleb holds some rare and contrarian views that can be applied in many areas of life.
The Black Swan is recommended by Amazon founder and billionaire, Jeff Bezos. The Sunday Times named it one of the twelve most influential books since WWII.
What to expect from The Black Swan summary:
- Get key takeaways from Nassim N. Taleb’s fantastic book “The Black Swan”.
- Learn about the nature of Black Swans and the human fallacies related to them.
- Discover what you should do instead of making predictions.
- Learn how to think in an unpredictable world.
- Know how to invest to be positively influenced by Black Swans.
Mediocristan vs. Extremistan
Black Swans have three attributes: unpredictability, consequences, and retrospective explainability.
Mediocristan and Extremistan are terms used to describe two very different kinds of randomness.
Wild and unpredictable Black Swan events take place in Extremistan, while Mediocristan is uneventful and quiet.
When Taleb was a student, he was advised to choose a scalable career path. If you’re a doctor, dentist, consultant, or massage professional, there is a limit to the number of clients you can see in a day AND the amount of money you can make in one day. It isn’t scalable. The pay is more or less predictable and depends on your continuous effort. There will be differences in your income over time, but no single day of work will have a huge impact on your lifetime earnings. The difference between the successful and the less successful isn’t too big, and is at least somewhat predictable. These professions aren’t Black Swan driven and are performed in Mediocristan.
Another example of a Mediocristan environment is the casino. In the casino, no single bet is allowed to deviate too far from the mean. If one gambler could place a bet of, let’s say, 1 billion, the casino owner would face an Extremistan environment. Casino owners, of course, like a stable business model, so they limit bet sizes to ensure no single bet can have too big an impact, positive or negative.
Other professions allow you to add zeros to your output and your income. “Idea” professions such as trading and writing allow you to think intensely instead of working hard. You do the same amount of work whether your output is 100 or 1 million. J. K. Rowling will not have to write a new book for each additional sale, but the baker will have to bake a new loaf for each additional customer. These professions are performed in Extremistan. One payday can have a potentially huge impact on your lifetime income, and the difference between the successes and the failures is extreme! Other examples of Extremistan environments are deaths from war, financial markets, and venture capital investing.
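Taleb’s distinction maps onto a statistical one: Mediocristan is thin-tailed, Extremistan is fat-tailed. Below is a minimal sketch (not from the book; the distributions and parameters are illustrative assumptions) comparing a normal distribution, standing in for something like heights or daily wages, against a low-alpha Pareto distribution, standing in for something like book sales, by asking how much of the total the single largest observation contributes.

```python
import random

random.seed(42)
N = 100_000

# Mediocristan stand-in: thin-tailed normal distribution (e.g. heights in cm).
heights = [random.gauss(170, 10) for _ in range(N)]

# Extremistan stand-in: fat-tailed Pareto distribution with a low alpha
# (e.g. book sales per title). Alpha close to 1 means wildly heavy tails.
sales = [random.paretovariate(1.1) for _ in range(N)]

def max_share(xs):
    """Fraction of the total contributed by the single largest observation."""
    return max(xs) / sum(xs)

print(f"Mediocristan: largest sample is {max_share(heights):.6%} of the total")
print(f"Extremistan:  largest sample is {max_share(sales):.4%} of the total")
```

In Mediocristan no single observation moves the total; in Extremistan one observation can dominate it, which is exactly why a single payday or a single bestseller matters there.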
Taleb states that the human mind suffers from three mistakes when it tries to make sense of history:
- The first is the illusion of understanding. Everyone thinks they know what is going on, whereas in reality the world is too complicated and random for anyone to really know.
- The retrospective distortion, or how everything looks easy and simple in the rearview mirror. We are very good at making sense of events after they have happened.
- The overvaluation of factual information and the danger of “platonifying”, i.e. putting information into categories, which always oversimplifies reality.
Learn From The Poor Turkey
Taleb tells the story of the turkey that is fed for 1,000 days. The turkey, of course, thinks everything is all right and that it will continue to be fed. How could it think otherwise, when it has been fed consistently for 1,000 days in a row? On day 1,001, the day before Thanksgiving, everything changes.
This story illustrates why it’s dangerous to use history to predict the future.
It also shows how Black Swan events are relative to knowledge. The butcher knew about the turkey’s Thanksgiving death all along. Here lies one of the keys to not being a sucker: Black Swan events happen relative to YOUR expectations. Avoid a negative Black Swan event by keeping an open mind.
Also, consider this quote from the captain of a very famous ship: “But in all my experience, I have never been in any accident. . . of any sort worth speaking about. I have seen but one vessel in distress in all my years at sea. I never saw a wreck and never have been wrecked nor was I ever in any predicament that threatened to end in disaster of any sort.” We all know what later happened to the Titanic.
Confirmation & The Round-trip Fallacy
How not to be a dead turkey! And what to avoid in order to get closer to the truth (and success).
The Round-trip Fallacy
People are very likely to confuse the statement “no evidence of a Black Swan” with “evidence of no Black Swan”. Take the example of the turkey again. In the first 1,000 days of its life, it sees “no evidence of a Black Swan”, but confuses this with “evidence of no Black Swan”. A statement such as “almost all terrorists are Muslims” is likewise easily confused with “almost all Muslims are terrorists”, which is obviously a very different and untrue statement.
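The asymmetry between the two statements is just conditional probability: P(A given B) and P(B given A) can differ by orders of magnitude. A quick sketch with invented, purely illustrative numbers:

```python
# Hypothetical counts, invented solely to show the asymmetry.
group = 200_000_000          # members of some large group
terrorists = 1_000           # total terrorists in the population
terrorists_in_group = 990    # suppose almost all terrorists belong to the group

# "Almost all terrorists are group members" -> P(group | terrorist)
p_group_given_terrorist = terrorists_in_group / terrorists

# "Almost all group members are terrorists" -> P(terrorist | group)
p_terrorist_given_group = terrorists_in_group / group

print(f"P(group | terrorist) = {p_group_given_terrorist:.2%}")   # 99.00%
print(f"P(terrorist | group) = {p_terrorist_given_group:.6%}")
```

The first probability is 99%; the second is a fraction of a thousandth of a percent. Swapping the two is the round-trip fallacy in numbers.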
Naive Empiricism and Confirmation Bias
You and I have a natural tendency to search for information that confirms our current beliefs, hypotheses, and view of the world: the search for corroboration. Taleb states that we will always be able to find past instances that support our views and theories. An example of this is showing people your accomplishments, but not your failures, on your resume.
Taleb suggests that the way to work around this naive empiricism we all suffer from is what he calls negative empiricism. Seeing only white swans does not mean that black swans do not exist. However, seeing a black swan confirms that not all swans are white.
A real life example of this is in cancer detection. Finding cancer cells confirms the existence of cancer in the body, whereas not finding any cancer cells would not allow you to say with certainty that the body is cancer free.
Negative empiricism is a way to get closer to the truth. We shouldn’t build rules based on observed facts; we could end up like the turkey!
The Narrative Fallacy
“We like stories, we like to summarize, and we like to simplify, i.e., to reduce the dimension of matters.” This is what causes the narrative fallacy: the failure to look at things and events without attaching a cause or explanation to them.
Taleb mentions this exercise by E. M. Forster to explain narratives: “The king died and the queen died.” Compare it to “The king died, and then the queen died of grief.” Information was added in the second statement, yet it is easier to remember. The added information creates a narrative, tying together the two pieces of information in the first sentence. Because a narrative is easier to remember and understand, it will be more popular and can be marketed better.
This tendency to favor and impose narratives is a result of dimension reduction: making things easy, simple, and understandable. We also impose narratives on completely random and inexplicable events such as past Black Swans. We ignore data that does not fit our current narrative, making us vulnerable to Black Swans.
Here is a good example of how the media uses narratives to explain events: When Saddam Hussein was captured in 2003, Bloomberg News ran this headline: U.S. TREASURIES RISE; HUSSEIN CAPTURE MAY NOT CURB TERRORISM. Half an hour later: U.S. TREASURIES FALL; HUSSEIN CAPTURE BOOSTS ALLURE OF RISKY ASSETS. The media feels obligated to provide a reason for why the markets moved, even though it is totally made up.
Too Much Information is Bad
Taleb introduces the Greek business tycoon Aristotle Onassis, famous for being rich, charming as hell, married to Jacqueline Kennedy, and for socializing with the famous.
Onassis ran a shipping empire, but he didn’t like “work” in the conventional sense. He often woke up at noon and didn’t have an office or a desk. When in need of legal advice, he would call his lawyers to meet at a nightclub in Paris. His main tool was a notebook with all the information he needed. Very simple. Did the simplicity and minimalism of the way he ran his business play a role in his success?
“We treat ideas as possessions”
Taleb mentions studies suggesting that “..additional knowledge of the minutiae of daily business can be useless, even actually toxic.” The moral is that the more information you have available, the more hypotheses you are likely to form, and the worse off you will be. Once you form a theory, you’re not likely to change your mind easily. Ideas are sticky. The confirmation bias mentioned earlier, as well as the consistency bias, is in play here. We should try not to get attached too easily.
Expert Predictions & Epistemic Arrogance
We are 22 times too confident!
Taleb argues that we definitely get smarter and more knowledgeable, but the problem is that our overconfidence grows faster. He refers to a study in which participants are asked to give an estimate they believe has a 98% chance of being right, and a 2% chance of being wrong. For example: “I am 98% certain that the population of America is between 250m and 400m.”
The error rate of the participants’ estimates was sky-high: 45%, instead of the 2% the participants were asked to aim for. These participants were Harvard Business School students. The takeaway is that they were over 22 times too confident in their own knowledge. Taleb notes that more humble groups of people, like cab drivers, are considerably better at estimating their own knowledge, though still way too confident.
A common theme throughout the book is that humans in general can’t predict anything. But we think we can.
Taleb urges us to be very skeptical about the confidence of professionals, because like other humans, they don’t know when they don’t know. They have a hard time defining the boundaries of their knowledge. You should always question the error rate of the authority’s or expert’s procedure, i.e., question their confidence. And of course, be skeptical of your own confidence in your methods as well.
A Dream World and Good Traits
Taleb’s dream world is one where people have these valuable characteristics:
“Think of someone heavily introspective, tortured by the awareness of his own ignorance. He lacks the courage of the idiot, yet has the rare guts to say “I don’t know.” He does not mind looking like a fool or, worse, an ignoramus. He hesitates, he will not commit, and he agonizes over the consequences of being wrong. He introspects, introspects, and introspects until he reaches physical and nervous exhaustion. This does not necessarily mean that he lacks confidence, only that he holds his own knowledge to be suspect. I will call such a person an epistemocrat.”
This utopia Taleb calls an epistemocracy. The basis of this fantastic place would be awareness of ignorance instead of the ever-present “awareness of knowledge”. The Black Swan asymmetry, as Taleb calls it, allows you to be confident about what is wrong, not about what you believe is right.
What To Do If I Can’t Predict?
Practice trial and error. “You need to love to lose” – Mark Spitznagel, colleague of Taleb.
Taleb urges us to be fools in the right places. Being human includes making automatic judgements and mistakes. He advises us to rank our beliefs in order of how much harm they may cause in our lives. Be a fool in the trivial and small events of your life. Don’t get fooled when it comes to the important events, such as handling your retirement funds. You can benefit from the unpredictability of the world if you’re prepared.
The Hyper-Aggressive + Hyper-Conservative Barbell Strategy: Benefit From Black Swans
Place around 85-90% in extremely safe assets and the remaining 10-15% in extremely speculative ones. There is no need for medium-risk assets, since their “risk” is estimated by the so-called experts. The 10-15% exposure to speculative assets will pay off if a positive Black Swan event takes place, while the 85-90% in ultra-safe assets is meant to be unaffected by negative Black Swan events. Be very aggressive when you can get exposure to a positive Black Swan, and very conservative when you’re under threat from a negative one. Be very aggressive when an error in a model can benefit you, and paranoid when it can hurt you.
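As a rough sketch of the payoff profile the barbell is meant to produce (the specific numbers below are hypothetical, not Taleb’s): the worst case is capped at losing the speculative sleeve, while the upside from a positive Black Swan is open-ended.

```python
def barbell_outcome(wealth, safe_frac, safe_return, risky_multiple):
    """Portfolio value after one period of the barbell allocation.

    safe_frac:      fraction in ultra-safe assets (e.g. 0.85-0.90)
    safe_return:    return on the safe sleeve (e.g. 0.02, an assumed T-bill rate)
    risky_multiple: what each unit of the speculative sleeve becomes
                    (0.0 = total wipeout, 10.0 = a positive Black Swan)
    """
    safe = wealth * safe_frac * (1 + safe_return)
    risky = wealth * (1 - safe_frac) * risky_multiple
    return safe + risky

start = 100_000
# Negative Black Swan: the speculative 15% is wiped out entirely.
worst = barbell_outcome(start, 0.85, 0.02, 0.0)   # 86,700
# Positive Black Swan: the speculative 15% returns 10x.
best = barbell_outcome(start, 0.85, 0.02, 10.0)   # 236,700

print(f"Worst case: {worst:,.0f} (loss capped at the speculative sleeve)")
print(f"Best case:  {best:,.0f}")
```

Even a total wipeout of the risky sleeve leaves roughly 87% of the starting wealth, while a single 10x outcome more than doubles it: the asymmetry the strategy is built on.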
- Make a distinction between positive and negative contingencies. Where is the lack of human predictability extremely beneficial, and where is it extremely hurtful? Hurtful: homeland security, the military, insurance, banking, and loans. Positive: publishing, venture capital.
- Don’t look for the precise and the local. Don’t be narrow-minded! Pasteur, the man behind the famous quote “Chance favors the prepared mind”, understood that you don’t look for anything in particular each morning; instead, you work hard to let contingency enter your working life. Don’t try to predict any specific Black Swan, as it will blind you to others. Invest in preparedness, not in prediction.
- Seize any opportunity, or anything that looks like an opportunity. Opportunities are much rarer than you would think. Remember that positive Black Swans need a critical first step: exposure.
- Beware of precise plans by governments. A government’s purpose is to survive and self-perpetuate, not to get to the truth.
- “There are some people who, if they don’t already know, you can’t tell them” – Yogi Berra. Don’t waste your time arguing with forecasters, economists, stock analysts, and social scientists. The accuracy of a forecast will get worse over time.
All of Nassim N. Taleb’s recommendations have one thing in common: asymmetry. Put yourself in situations where the positive outcomes are much larger than the negative ones.
Let’s end with a quote. “Be aggressive. It’s difficult to be a loser in a game you set up yourself. In Black Swan terms this means that you’re exposed to the improbable only if you let it control you. You always control what you do” – Nassim N. Taleb