The Black Swan Notes
These are the notes I took while reading the amazing book The Black Swan by Nassim Nicholas Taleb. This is in no way a substitute for reading the book (which you can find here); it is more of a quick reference for me (and others) who have already read the book.
There are probably some grammatical errors in here. You would do best to ignore them.
Part One: Umberto Eco's Antilibrary, Or How We Seek Validation
Chapter 1: The Apprenticeship of an Empirical Skeptic
The author begins by giving some background on where he grew up, near Mount Lebanon, and going over the history of the region: how it came to be and who is in control now. It used to be part of Syria and, after the fall of the Ottoman Empire, became Lebanon. Nassim (the author) was a rebellious teenager, going to riots and so on. He was not the typical "conformist" wearing the hip clothes and all that; he spent his time engaged intellectually. For some 1,300 years the region lived in peace, and then a civil war broke out between the Christians and the Muslims there. The war lasted for 17 years.
Nassim brings up the subject of the "triplet of opacity": the idea that history does not show itself being made; what we see is a molded, tidied-up series of events, not the messy events that actually led up to them. The three "legs" of the triplet are as follows:
- History, and the world as it happens, is more complex than you think
- We can only see history in retrospect, and never as it happens
- Over-reliance on factual information, on people of power, and on convenient "categorization" of people into groups
For 1., Nassim gives the example of people saying the war would stop "any day now", despite the fact that it had dragged on and on, and despite having no real information to support that assertion.
For 2., Nassim notes how people were able to construct coherent explanations for why the war started the way it did, but only after the war had ended. If anyone had really understood the situation that well ahead of time, the civil war could have been avoided.
Nassim explains how keeping a journal is a good way of remembering how you felt at a given point in time, devoid of any retrospective adjustments. It is pure, and something that you can look back on.
For 3., Nassim gives the example of his grandfather, who was minister of defense in Lebanon and had much more knowledge about the war, yet could not make accurate predictions about what would happen next. Worst of all, people in such positions think they are the best placed to know what will happen next, since they are the ones privy to the information. A taxicab driver, on the other hand, knows that he does not know, and his guesses about what would happen next turned out to be no worse than the minister's.
Nassim makes a great point, one I have always held: the political climate is very polarized into "left" and "right", and if you hold one idea that is "left", you are expected to also hold every other "left" idea, even when they contradict each other. This categorization causes turmoil, in that it blinds you to the subtle differences between the members of a group. The example he uses is the US funding Islamist fighters because they were "aligned" against communism, until they weren't, which eventually fed into 9/11.
Nassim mentions the market crash of 1987, and how it felt to him like the Black Swan event he had been theorizing about all this time. It came out of nowhere, no one suspected it, yet it caused massive devastation to the people around him. He felt excited and vindicated that his idea held up, but could not find a way to tell anyone about it.
Nassim got a job in finance, but instead of hunting for market trends, he hunted for flaws in people's models, for holes in their logic and reasoning. He calls this the "Platonic fold", the gap between what you know and what you think you know. In that area there is optimization to be had, and Black Swans ready to be caught.
Chapter 2: Yevgenia's Black Swan
We learn about Yevgenia, a neuroscientist with an interest in philosophy (from having married three philosophers). She wanted to write a book that was 100% her own and did not follow any of the contemporary "rules" about book writing. No one wanted to publish her book. The writing groups she joined said she was a lost cause. She put her manuscript online, it got some traction, and a small, unheard-of shop decided to print her book. She accepted, and the book was a hit. People came out of the woodwork to say that she should have gone to them, that they would have seen her genius, that they would have picked up on her amazing talent. People began to write articles about where her writing style came from, naming writers she had never even heard of.
From what I can tell, Yevgenia is not a real person, and her book, "A Story of Recursion", is just a copy of this book with all the pages blank. Is this some sort of gag? Is this something that is explained later on?
Chapter 3: The Speculator and the Prostitute
Turns out Yevgenia is a fictional character, go figure.
There are professions where you are capped in the amount of work you can produce, especially when you are producing a physical product, where time and energy are required to build each unit. More intellectual products, such as ideas, are much easier to make explosive profits from, because you don't need to retype a book to make a copy of it. Once it is written, it can be mass-produced. The author was given this advice by one of his students, and he almost agrees with it, but says that a scalable career is only good if you are successful. If you are just average, an average job will probably work out better, whereas in a more scalable career you are much more exposed to variation.
Nassim gives the example of opera singers before the invention of sound recording. Before recording, their careers were stable and their audiences were guaranteed, since there was no other way to hear the singer. Now people can listen to long-dead opera singers all day long, and these dead opera singers are able to put new, living opera singers out of business.
Nassim talks about how America has more or less become an idea house, where the top companies (Boeing, Apple, Microsoft, etc) all produce ideas, which are fulfilled by subcontractors in other countries.
The author brings up a thought experiment in which you put 1,000 people in a stadium. By and large, the average weight, height, and so on will not be dramatically changed by any single person. This is Mediocristan. On the other hand, put Bill Gates in there and, beyond a doubt, he will make up 99%+ of the total wealth. This is Extremistan. Mediocristan is more or less the realm of quantities a single human body can produce, whereas Extremistan is the realm of what the combined effects of society can produce. Something like wealth is just a number (Extremistan), whereas your weight is physical (Mediocristan).
When dealing in Mediocristan affairs, you can safely extrapolate from the averages: they are stable, and a single data point cannot meaningfully move the total. In Extremistan affairs, though, it is dangerous to extrapolate from previous data, because one data point can swamp the entire projection. The author calls Mediocristan "type 1" (mild) randomness and Extremistan "type 2" (wild) randomness.
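The contrast is easy to see numerically. Below is a minimal sketch (my own illustration, not from the book) that samples a thin-tailed quantity (body weight) and a heavy-tailed one (wealth drawn from a Pareto distribution; all parameters are made up) and asks what share of the stadium's total the single largest observation accounts for.

```python
import random

random.seed(42)
N = 1000

# Mediocristan: body weight in kg, roughly normal around 75 kg.
weights = [random.normalvariate(75, 12) for _ in range(N)]

# Extremistan: wealth drawn from a heavy-tailed Pareto distribution
# (alpha close to 1 means very fat tails; the scale is arbitrary).
wealth = [random.paretovariate(1.1) * 50_000 for _ in range(N)]

for name, xs in [("weight", weights), ("wealth", wealth)]:
    share_of_max = max(xs) / sum(xs)
    print(f"{name}: largest single observation = {share_of_max:.1%} of the total")

# Typical output: the heaviest person is ~0.1% of the total weight,
# while the richest person can easily be 10%+ of the total wealth.
```

Adding the heaviest human imaginable barely moves the average weight; adding one Bill Gates dominates the wealth total, which is the sense in which a single Extremistan data point can swamp the aggregate.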
"Table 1" is on page 36, and offers a nice breakdown of the differences between Mediocristan and Extrimistan.
Chapter 4: One Thousand and One Days, or How Not to Be a Sucker
Nassim gives the example of a turkey that is fed every day, and every day the turkey becomes more convinced that its future is safe. Then, on the thousand and first day, the turkey is butchered, right at the height of its confidence! The butcher is indifferent, since he knows very well that the turkey will die, but the turkey is none the wiser. The hard problem is trying to make concrete predictions about the infinite unknown based only on the finite known. Another example is Captain Smith, the captain of the Titanic, who had a spotless and successful career as a sea captain, only to have his life and career end when the Titanic sank.
Nassim says that the Black Swan is relative to what you know. If you are scientific and open-minded, you can do a good job of fending off Black Swans, but if you completely ignore their existence, you are doomed to run into one. In addition, if you come across a "negative Black Swan", you will be cleaning up the mess for a very, very long time. Take 9/11, or an earthquake: the event itself lasts a very short time, but the cleanup, rebuilding, etc. can take years.
We learn about Sextus Empiricus, a Greek thinker who was against dogma and believed heavily in trial and error. He did not think that everything should be a science; there should be some room for art as well.
Nassim says that we like to live in Mediocristan, because it allows us to rule out the Black Swans. This is a mistake.
Chapter 5: Confirmation Shmonfirmation!
Nassim argues that there is a large difference between "no evidence of x" and "evidence of no x". He compares this to cancer: if you have cancer and get it treated, they will take a sample to see if the cancer is gone. Just because the cancer did not appear in the second screening does not mean it is gone. It would be wrong to say "we found evidence that there is no cancer" or "there is no cancer". The truth is that "they found no evidence of cancer", meaning the cancer could still be in there (which is probably true, though hard to prove). They could take thousands of samples and each time gain no more certainty that all the cancer is gone, yet all it takes is one sample where cancer is found to know that the cancer is indeed still there.
Nassim goes into confirmation bias by explaining the "2 4 6" problem. You are shown the sequence 2, 4, 6 and told it follows a hidden rule; you can propose triples of numbers, and the person testing you will say yes or no depending on whether each triple follows the rule, and then it is your job to guess the rule. People in this study kept proposing triples that confirmed their own hypothesis, and rarely tried to find instances where they might be wrong.
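A small sketch of why confirming guesses teach you nothing in this task (my own illustration, not from the book; here the hidden rule is assumed to be "any strictly increasing triple" and the subject's hypothesis is "steps of +2"):

```python
def hidden_rule(triple):
    """The experimenter's actual rule: any strictly increasing sequence."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """The subject's overly specific guess: arithmetic steps of +2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

confirming = [(2, 4, 6), (10, 12, 14), (100, 102, 104)]
disconfirming = [(1, 2, 3), (5, 10, 100), (6, 4, 2)]

for t in confirming + disconfirming:
    print(t, "hidden rule:", hidden_rule(t), "my hypothesis:", my_hypothesis(t))

# Every "confirming" triple satisfies both rules, so the yes answers
# teach nothing. Only triples chosen to break the hypothesis, like
# (1, 2, 3) or (5, 10, 100), reveal that the hidden rule is broader.
```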
Nassim explains that we do not form blind spots about everything. There are things we do not generalize, and that we are perfectly capable of recognizing as illogical; it depends on context. For example, a group of children was asked to generalize what a group of people would look like based on one member. When shown a fat person, they did not assume that would carry over to the rest of the group, but when shown a person of color, they generalized that the rest of the group would look like that.
Chapter 6: The Narrative Fallacy
Nassim talks about the "narrative fallacy", which is this phenomenon where you summarize something, and in doing so, you thread this narrative (that did not exist before) as a means to simplify and make certain events easier to remember, while in doing so, leaving out important, raw details. One of the reasons we do this is because information is costly to retain, and organizing it and recalling it for later is much easier when the important bits have been distilled, and made easily recognizable.
The act of summarizing and distilling an event will cause you to overlook the seemingly random bits of raw information, and with them the black swans lurking in that data.
Nassim gives some examples of narratives that scramble our logical processing. For example, people rated "an earthquake in California in which 1,000 people die" as more probable than "an earthquake in which 1,000 people die", even though the second statement contains the first: California is a single state, and the broader statement covers every state (including California), so it must be at least as likely. The vivid, specific story simply feels more plausible; a conjunction of events can never be more probable than either event on its own.
Nassim mentions that there are two types of black swans: narrated ones, which have some sort of story attached to them, and unnarrated ones, which nobody is talking about and which you feel ashamed to bring up out loud.
Nassim gives an example of how narratives affect our logical reasoning about risk. We might refuse to visit a certain place after hearing a firsthand account of some gruesome murder that took place there, yet when we only hear statistics about how dangerous a place or activity is, we still might go, despite knowing the statistical dangers associated with it.
Nassim brings up 2 modes of thinking: System 1, which is automatic, intuitive, and requires no additional thinking, and System 2, which is deliberate, intentional thinking.
Nassim mentions that narratives mostly work in Mediocristan, because we don't see many black swans there, if any, whereas they never work in Extremistan, because we never know when we will see another black swan.
Chapter 7: Living in the Antechamber of Hope
Nassim talks about how we expect to live in a world of constant success and constant validation for the work we do, but linear progress does not exist for people living in Extremistan, where the nonlinear and the delayed reward run amok. We do not live in a linear world, yet we expect to see more results when we put in more effort; things are never that easy. Nassim says that we don't celebrate heroes for the work they failed to complete, only for the work they have done. He also mentions that process matters, but it would be a lie to say that an author writes only for the reward of writing itself; they are also chasing the hope that they will get published.
Nassim talks about how we need continuous (or steady) pleasure: getting $1 million every ten years will feel a lot worse than getting $100k every year, because we feel a bump of happiness when the million arrives, but every year after that we feel like we are "losing", despite ending up with the same total amount of money.
Nassim brings up Yevgenia (the imaginary person again?) and how she read an Italian novel, The Desert of the Tartars. Basically, a soldier named Drogo goes to a remote fort to defend his country, meant to stay for only four years, but ends up staying for thirty-five. He is captivated by the hope that his country will be attacked and that he will be able to fight in glory. He ends up dying in some hotel, right as the attack finally arrives. Drogo was able to keep himself in this pit of hope for so long because he was surrounded by people who shared the same idea, who fed off each other, cut off from the outside world.
Nassim then uses a (fictional?) character named Nero Tulip, who meets Yevgenia. Nero is very cautious: he doesn't smoke and conducts himself in a very controlled way. When they part, Nero finally reads the copy of "Il Deserto" (The Desert), which changes his life. He gets into trading and adopts a technique he calls "the bleed": you "bleed" a small amount of money each day, every day, until one very large payout arrives (from holding short positions, for example). When the market crashed in 1987, he became obscenely rich.
Chapter 8: Giacomo Casanova's Unfailing Luck: The Problem of Silent Evidence
Nassim brings up the idea of "silent evidence", which is evidence which is silenced by lack of attention, or by blind spots in our cognition/intuition. For example, people who survive death because they worshiped. There is a large amount of people who worshiped, died, and never got to tell their side of the story.
Nassim talks about the Phoenicians: they are best known for creating the alphabet, yet they are thought to have done very little writing. The truth is that they wrote on papyrus, which is biodegradable and did not stand up to the test of time, hence why we assume they did not write much.
Nassim brings up the idea of "success stories", and how they create this distorted view of success, that they show the path that one person took, which is a narrative, and hides all of the complexity and randomness that led to their success. It will disregard the luck involved, and primarily will look at the events the author believed led up to their success.
Nassim brings up the phrase "hardened by the Gulag", which he read in an article. He finds it absurd: the Gulag does not harden you, it weakens you. Making it through the Gulag, surviving while those around you died, does not make you "stronger"; you come out weaker than the version of you who went in. The ones who look "hardened" are simply the survivors we happen to see.
Nassim follows up with some examples. He talks about how the count of extinct species is based only on the species for which we have found fossils. There could be plenty of species that left no fossils behind, and they could very well outnumber the ones we have accounted for.
Another example Nassim gives is Hurricane Katrina, where politicians went on television to say how they would pour out money to compensate the victims, and so on. The public went along with this, sympathizing with them. What they did not recognize is that the money would need to be diverted from somewhere; for instance, it could have been diverted from cancer treatment or research. The patients getting stiffed would die soon enough, never able to express their frustration about being wronged. Another example is how, after 9/11, some people were afraid to fly and took to driving from place to place instead, which caused a measurable increase in deaths from car crashes.
Nassim brings up Giacomo Casanova, who had an impeccable winning streak: he could not be defeated, and whenever a hardship came his way, he always recovered. Nassim ascribes this apparent invincibility to the fact that Casanova was always there to talk about it after every triumph; when that fateful day comes where he no longer recovers, he will not be around to tell us his anecdotes. The same goes for New York City, which is described as having this resiliency, but if it ever failed to come back from a setback, it would no longer hold that title.
Nassim goes on a rant about how we always must add a "because" to everything, how everything must have one reason for happening, how we never say that things "just happen", never accounting for randomness or luck. This habit makes the world around us look predictable and causal, as if things always happen for a reason, when they do not.
Chapter 9: The Ludic Fallacy, or the Uncertainty of the Nerd
We meet a character called "Fat Tony". Tony is a banker, and the line between his work life and his personal life is blurred. He snatches up land and properties with ease, and knows his way around people. He is a risk taker, and understands the world better than most. Then we meet Dr. John, a former engineer who now works as an actuary (a risk-assessment person) at an insurance company, and who lives his life as an average joe. In a thought experiment, Nassim gives them a question: if a fair coin is flipped 99 times and lands on heads every time, what is the chance that it will be tails next time? John (and I as well) say 50/50, whereas Tony says the game must be rigged, so the chance is less than 1%. Nassim compares this to thinking inside the box versus outside it. Assuming the coin really is fair, yes, the odds are 50/50; but given the evidence (and the unlikeliness of such an event), it is far more likely that the initial assumption about the coin being fair is false.
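Fat Tony's answer can be read as an informal Bayesian update. A back-of-the-envelope sketch (my own; the prior of 0.1% for a rigged, two-headed coin is purely illustrative):

```python
# Prior beliefs (illustrative): 99.9% the coin is fair, 0.1% it is two-headed.
p_fair, p_rigged = 0.999, 0.001

# Likelihood of observing 99 heads in a row under each hypothesis.
like_fair = 0.5 ** 99
like_rigged = 1.0

posterior_fair = (p_fair * like_fair) / (p_fair * like_fair + p_rigged * like_rigged)
p_tails_next = posterior_fair * 0.5  # the two-headed coin never shows tails

print(f"P(coin is fair | 99 heads) ~ {posterior_fair:.2e}")
print(f"P(tails on the next flip) ~ {p_tails_next:.2e}")
# Both come out around 1e-27: even a tiny prior suspicion that the game
# is rigged swamps the "50/50" answer once you have seen 99 heads.
```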
Nassim ends with a story about an event he attended at Lake Como, a gathering of military people, statisticians, and so on. Once there, he realized that only the military people were true skeptics, since they are constantly subjected to the randomness of the world and have to think outside the box, unlike their academic counterparts.
The idea of the Ludic Fallacy is brought up. Ludic comes from the Latin word "ludus", meaning "game". Basically, we have a tendency to think of things in terms of games with set rules, where if you know the rules, you can calculate the probabilities and game the system. Life is not like that.
Nassim brings up a few examples of extreme events, unrelated to traditional cheating, which caused immense financial trouble for a casino:
- Its lead performer was maimed by his tiger, causing roughly $100 million in losses.
- An angry contractor almost blew up the casino with TNT
- Someone forgot to send the proper paperwork to the IRS, and they had to pay a hefty fine (or shut down)
- The casino boss's daughter was kidnapped, and he had to do some illegal things to get the required money.
All in all, the most expensive burdens on the casino were not the cheaters, but events far outside the normal realm of what you might expect.
All of the topics mentioned before are one and the same. We fail to fear what we cannot see, or what almost happened but did not. We confirm because we want to feel right in our incorrect assumptions about the world, and we create a narrative so it all makes sense, further cementing the falsehoods in our minds.
Part Two: We Just Can't Predict
Chapter 10: The Scandal of Prediction
Nassim talks about how the Sydney Opera House went way over budget and took much longer than expected. In general, we are very bad at predicting. He brings up a study in which participants were asked to answer random questions by giving a range of values, choosing upper and lower bounds wide enough that they expected to be wrong only 2% of the time. The range could be as large as they chose, so long as they felt it covered the true answer. It turned out that the error rate was closer to 45%. Not only are we bad at predicting, we are also very good at believing our predictions are good when they are actually terrible (or dangerous).
We work better with less data to go off of, not more. People like to build narratives on data, and the more data they have, the more hypotheses they create and the more grounded they become in their initial assumptions. When they are given less data, they are forced to hold off on their immediate judgments, to wait until more data is present before settling on an answer.
Nassim brings up the "expert problem", in which an expert in a stable, non moving field (ie, plumbing) will be able to give you accurate advice, but someone who lives in a moving field, ie stocks, business, economics, and so on, will not be able to make accurate predictions about the future. In general, when experts make a correct prediction, they stake it on their experience, on their knowledge of their field, but when they make an incorrect prediction, they blame some external factor they where not aware of, something which they could not have predicted, something that wasn't their fault. They don't own up to the fact that most of their predictions rely on random events, things that they have no control over whatsoever.
Nassim brings it back around to the Sydney Opera House, talking about how unforeseen events almost always increase costs and push deadlines further back.
One of the reasons predictions can get so outlandish is that spreadsheets allow people to extend projections ever further into the future with almost no effort, whereas before you would have to crunch far more numbers by hand to reach the same conclusion. This means you can now "predict" an event many years into the future, but that hardly does you any good given the volatility of the present day.
Another issue with prediction is how people interpret the results. If the forecast says the weather will be 80 degrees, plus or minus 20 degrees, they will plan for exactly 80 degrees, instead of preparing for one extreme or the other (or both). They take the average as the "best" guess and go with it.
Chapter 11: How to Look for Bird Poop
Nassim starts off by saying that, by nature, there are things we simply cannot know yet: if we already had the knowledge in our midst, we would already have built the thing! The fact of the matter is that most advancements in technology happen by accident; some serendipitous series of events causes us to stumble across a discovery that changes our perceptions and pushes our advancement an extra step. Another fact is that groundbreaking discoveries are often not immediately recognized as important: for example, the two Bell Labs researchers who discovered the cosmic background radiation, or the inventor of the laser, or Darwin with his theory of evolution. Very often, tools end up doing something completely different from what they were intended to do. Computers used to be used only for military purposes, and now everyone has a computer in their pocket.
Nassim brings up the three-body problem, and how initial conditions can drastically change the output of a system in certain situations. This applies in physics, and even more so to the far greater complexity of human affairs. Trying to predict the future, at least to any meaningful distance into the future, is practically impossible.
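The book gives no code for this, but the point about initial conditions can be illustrated with a standard toy example, the logistic map (my own choice of illustration, not Taleb's): two starting values that differ by one part in a billion end up in completely different places within a few dozen steps.

```python
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the chaotic logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.300000000)
b = logistic_orbit(0.300000001)  # initial condition differs by 1e-9

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}  (gap {abs(a[n] - b[n]):.1e})")
# The two trajectories track each other for a while, then diverge completely:
# a microscopic measurement error destroys any long-range prediction.
```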
Nassim talks about "nerds", people who specialize in a certain field, and in doing so, blind themselves from Uncertainty, making it hard for them to learn. For example, with language learning, a "nerd" might read a grammar book, but not learn the language itself from speaking to someone. The irony being that language is organic, natural, spoken by people before it is ever codified and put into a book. That is why there are so many dead languages: Because no one thought to right it down.
Nassim goes on a tangent about how optimization (in mathematics) causes the average Joe to lose the ability to do math in an intuitive way, and turns math into a field ruled by calculation. I don't agree with that: I feel like that is simply the nature of math. Perhaps at its core there is a simple intuition anchored in day-to-day life, but for the most part, math is doing equations.
Chapter 12: Epistemocracy, a Dream
Nassim uses the term "epistemocrat" for someone who practices epistemic humility. We learn about a man named Michel Eyquem de Montaigne, a Frenchman who was stoic and dedicated himself to knowledge and self-introspection. He was very skeptical and very anti-dogmatic. Nassim thinks that Utopia would be a place filled with this kind of person, an epistemocracy if you will, in which only people who are aware of their own ignorance are allowed to rule. People with knowledge are good, but better still is someone who is able to know when they are wrong.
There is a disconnect between our past experience and what we expect the next time we are placed in a similar situation. We are ignorant of how our thoughts and actions can have an effect on someone or something, since we don't often put ourselves in the position of the victim, and we often forget how we felt and how we made our predictions the last time we were in that situation. For example, when buying a new car, you feel like it will be some earth-shattering experience, but after a few weeks, you feel the same as you did about the last car you bought.
Nassim brings up the idea of entropy: tracing a final state back to the single event that produced it is practically impossible, whereas it is much easier (and feasible) to run a series of events forward to its final state. In practice, "true" randomness and deterministic chaos are the same to us, because from your perspective you do not have the data needed to tell them apart.
Chapter 13: Appelles the Painter, or What Do You Do If You Cannot Predict?
Nassim begins by saying that we should not let skepticism and anti-dogmatism rule our lives: it is fine to be ignorant and human about small, inconsequential matters and to make arrogant predictions when the stakes are low, but not for the large and important events. In general, always be prepared, and make sure you are ready for an event if and when it happens. Bet on things falling into the hands of the black swan, but don't bet all your money on it: be sure to diversify your portfolio, and give yourself a safety net.
Nassim gives some general advice for how to apply these ideas to life:
- Understand the impact of positive vs negative black swans
- Do not look for the precise and local: Let uncertainty happen, bask in it
- Seize any and all opportunities that come your way, since you don't know when another one will come
- Beware of the government, more specifically, beware of their hazy and unpredictable projections
- Don't convince someone of something different if they are dead set in their ways
Make sure to account for the good and the bad. You cannot know the timing or nature of an event, but you can predict the outcomes in the best and worst cases, and prepare yourself according to those situations.
Part Three: Those Gray Swans of Extremistan
Chapter 14: From Mediocristan to Extremistan, and Back
Nassim mentions that people often end up in Extremistan positions out of pure luck, and stay there because of their past success. It is an example of "the rich get richer": people who have been successful in the past are more likely to be successful in the future. The "Matthew effect" is brought up, in which, out of all the authors of an academic paper, a select few already-known names get the attention and credit, despite the other authors having equal experience.
We are reminded that no one is safe in Extrimistan, that anyone can be wiped out at any time, that a lone wolf or an unforeseen company can displace you from success.
Nassim brings up the idea of "the long tail", which is an event that causes someone or some company/entity to success. There exists large amounts of these people, and at any given moment, they could ascend and become successful, wiping out their competition. What is important to note is that in certain areas, like government and economics, there are no "long tails", there is nobody to step up and dethrone the big dogs who control the game. And, if they fail, everyone suffers, because there is no one to take their place.
Chapter 15: The Bell Curve, That Great Intellectual Fraud
Bell curves, by nature, make the outliers highly improbable and insignificant, which leads you to treat improbable events as impossible.
Bell curves (aka Gaussian curves) have tails that drop off at an ever-accelerating rate, whereas Mandelbrotian (fractal, power-law) tails thin out much more slowly and look the same at every scale. In a Mediocristan society, two randomly selected people who together make $1 million would most likely make about $500k each, but in Extremistan (real life), the likely split is more like $50k/$950k. This goes to show that extremes are not to be ignored: although rare, they crop up in normal life and do have an effect on the total.
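A rough numerical sketch of how differently the two kinds of tails thin out (my own illustration, not from the book; the standard normal and a Pareto with tail exponent 1.5 are arbitrary stand-ins for the two regimes):

```python
import math

def gaussian_tail(x, mu=0.0, sigma=1.0):
    """P(X > x) for a normal distribution."""
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2)))

def pareto_tail(x, x_min=1.0, alpha=1.5):
    """P(X > x) for a Pareto distribution with tail exponent alpha."""
    return (x_min / x) ** alpha if x > x_min else 1.0

for x in (2, 4, 8, 16):
    print(f"x = {x:2d}:  Gaussian tail {gaussian_tail(x):.2e}   "
          f"Pareto tail {pareto_tail(x):.2e}")

# Doubling x divides the Pareto tail by a constant factor (2**1.5, about 2.8),
# but it crushes the Gaussian tail by many orders of magnitude each time,
# which is why a Gaussian model ends up declaring large deviations "impossible".
```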
Gaussian curves are fine in certain situations, where prior events do not affect successive events and the quantities involved are calculable.
The love for bell curves comes from a longing for symmetry, for simplicity, for mathematical harmony. Bell curves were originally meant to describe error: the further you get from the mean (average), the deeper you are into error territory.
In isolated environments, the bell curve can be safely applied, but when applied to the masses, to society, you are asking for trouble.
Chapter 16: The Aesthetics of Randomness
Nassim brings up Mandelbrot, a mathematician he respected and conversed with a lot. He says that the world is fractal, that it acts in a random and recursive way, and that as you get deeper and deeper in your inspections, you will see that it stays just as random. Sometimes, when you look at something from afar, you see simplicity, but if you get closer, the true randomness reveals itself.
The rest of the chapter dives into math, something that I just could not wrap my head around.
Chapter 17: Locke's Madmen, or Bell Curves in the Wrong Places
Nassim basically gives a bunch of examples of people using Gaussian curves to describe things that they shouldn't, and getting bitten for it. Most importantly, he mentions how most people say that they agree with what he is saying about black swans, but when they go back to work, they keep using their bell curves because it gives them something to base their work off of, a number to anchor themselves to.
Chapter 18: The Uncertainty of the Phony
Nassim says that the worst kind of Gaussian users are those who think, who have a voice, who have an opinion on the matter, who can change people's minds. Statisticians are mostly technicians who do what they are told; it is the philosophers and idea-spreaders who can cause the most damage, because they are the ones who come up with an idea and give it out to the masses.
Nassim observes that people get so caught up in the fact that the universe is unpredictable at the subatomic level that they conclude trying to predict anything at all is futile.
Part Four: The End
Chapter 19: Half and Half, or How to Get Even With The Black Swan
Basically, be vigilant, and try to keep your thinking cap on when dealing with negative black swans, but allow yourself to be ignorant and vulnerable when dealing with the positive black swans which can greatly benefit your life.
Postscript Essays
Essay I: Learning From Mother Nature, The Oldest and the Wisest
Nassim uses Mother Nature as a model of redundancy. We humans are built with two eyes, two ears, two kidneys, and so on, so that if we ever lose one, we will have another to fall back on. We should employ this thinking in our day-to-day life: have a fallback plan in case things don't go to plan. You start to get into trouble when you specialize, when you step back from your redundancies, when you trade your redundancies away for money, profits, and numbers. Nature also doesn't make things too big, and if it does, it is very rare. But when you look at our society, we have these gatekeepers who are "too big to fail", and if and when they do fail, things crumble. Another attribute of Mother Nature is that you might have features not suited to your current environment, but when your environment changes, you might be better placed to use those built-in features.
Essay II: Why I do All This Walking, Or How Systems Become Fragile
Nassim talks about how we fall into daily habits, how we platonify and build up routines that become fragile. He talks about how he introduced randomness into his own lifestyle: he would go for long walks, then do a sprint, then walk again, and he would work out vigorously at random intervals so his body would not get used to any one particular schedule, pushing it from one extreme to another.
Essay III: Margaritas Ante Porcos
Nassim goes over the many ways people misinterpret what he is saying, discrediting him, or acknowledging his point without applying it to their lives. People say that they could not have predicted that a certain thing would happen because "nothing like this has happened before". In response, Nassim says that you know you will die, despite having no prior experience of dying. Since the launch of his book there have been some naysayers, but in general people have begun to believe what he is saying and to internalize it. He has yet to hear a coherent counterargument, and to him this just goes to show how woefully unprepared we are for the unexpected.
Essay IV: Asperger and the Ontological Black Swan
Nassim reminds us again that Black Swans are relative to the observer. 9/11 was a Black Swan to the people who fell victim to it, but not to the terrorists who plotted the attack. People have no way of predicting (major) future losses from previous history: the Great War could not have been "predicted" from purely analytical data, since no such event had ever taken place; it was irregular. Nassim reminds us that uncertainty and risk cannot be "measured" with a ruler; they are not quantifiable things.
Essay V: (Perhaps) The Most Useful Problem In Modern Philosophy
Nassim reminds us that the "smaller" the risk, the more weary we should be about how we got there, the more we need to ensure we are robustifying ourselves against it. Instead of worrying about the chance of an event happening (which can vary drastically), look at the effect if it was to occur, in conjunction with the projected risk. When looking at past data to create your projections, be mindful of what it is that you are including/excluding, because they can (and will) have an effect on your projections. In addition, when we predict the outcome of events, we often over/under estimate the odds, and there is little we can do about it. Our projections cannot apply to a complex and tumultuous world. Lastly, the way the risks are framed can change peoples perceptions. The phrase "a one in one thousand chance of a plane crash" sounds much different then "if you fly once a year, it will take about 1000 years for you to be in a plane crash".
Essay VI: The Fourth Quadrant, The Solution To That Most Useful of Problems
Nassim says that there are four "quadrants" which you need to look out for:
- The true/false quadrant: low risk, and in Mediocristan. The magnitude of the impact is small, since the outcome can only be true or false. It is safe and okay to use predictions and charts here, because the impact is small.
- Complex payoffs in Mediocristan: low risk, still able to use predictions, since the effects are not from Extremistan. Not elaborated on very much here.
- Simple payoffs in Extremistan: the risks are low, because the cost of being wrong is not damaging, but the impact of being right is very lucrative.
- The fourth quadrant, where the Black Swans are, where the negative (and positive) effects can cause massive damage or success.
Essay VII: What To Do With The Fourth Quadrant
Nassim talks about how academics like progress, but they don't like to be proven wrong; they don't like to be told the limits of their knowledge. When Gödel showed that there is undecidability in mathematics, people were livid. Nassim believes that the Black Swan is the real-life equivalent of Gödel's result on the limits of knowledge in math.
The topic of iatrogenics, the study of harm done by the healer, has hardly been studied outside of medicine. The thinking is that sometimes we do not have the answer, and the best course of action is to let Mother Nature take the wheel. Since the Enlightenment, we humans have been searching for answers, but nobody has really asked whether it has been worth the cost. The principle of "do no harm" was introduced into the doctor's ethos only relatively recently, something that should have been there from the beginning. Nassim mentions that in the old days, when religion was a driving factor in society, people often looked to their church for healing, which kept them away from the doctor and let nature do the healing.
Nassim gives some tips on how to protect ourselves in the fourth quadrant:
- Keep your eyes open, and understand that the data you are looking at might not be of help and cannot tell the full story. Often the Extremistan effects we see are caused by people disturbing a complex system, not by the people who are trying to keep the system in check.
- Don't optimize, learn to build redundancies, backup plans, and so on.
- Try to avoid relying on remote, atypical, hard-to-predict events, especially if you are expecting some big payoff from them.
- Be wary of numbers that claim to assess risk.
Essay VIII: The Ten Principles For A Black Swan Robust Society
Nassim gives 10 ideas for how we can become more robust to Black Swan events:
- What is fragile should break early.
- No socialization of losses and privatization of gains. We live in a world where the common folk pay the taxes that bail out the big corporations, yet the gains those corporations make go only into their own pockets. There is no incentive for them to do better, because there is no recourse for losing.
- People who make huge mistakes they could have avoided should not be allowed to make them again. The financial investors and bankers who caused the 2008 crash should never be allowed inside a financial institution again.
- Don't put a profit-oriented person in charge of something important: cutting corners in order to show profits comes at the cost of huge fees and damages if things do not go to plan.
- Compensate complexity with simplicity. Try to make things as simple as possible, if possible.
- Don't allow dangerous systems and products to be sold to gullible people who don't know better, or who don't know how to use them properly.
- Governments should not "restore confidence". They are not PR people.
- Don't give an addict more drugs just because they have withdrawal pains. If a system is broken and causes damage to others, it should be fixed, not allowed to keep going in such a vicious cycle.
- Average people should not put their life savings or retirement money into the financial system, since it is volatile and could disappear at any second.
- Make an omelet with the broken eggs. The eggs are already broken: don't keep propping up a system you know is broken, rebuild it into something better.
Essay IX: Amor Fati: How To Become Indestructible
Nassim mentions Stoicism and how he views Seneca as his role model: someone who lived free of desire and was ready to leave all his worldly possessions behind, including his life itself. Nassim tells us to robustify ourselves against all things, and especially against our own death.