Reference:: R: The Black Swan by Nassim Taleb
(btw., maybe change the title to something more universal)
Mother nature created us for an environment largely bound by the laws of physics where inputs are closely tied to outputs; where we mostly experience linear progression; where the environment is more or less predictable; where what's relevant is sensational and emotional; where deep thinking and introspection aren't essential.
However, the environment has changed. Once we introduced intersubjective concepts and information became the currency, the world became increasingly extreme, asymmetric, and surprising—Black Swans started to rule. It transformed into a winner-takes-all world of extreme events that invalidate all previous observations.
We didn't evolve to handle such an extreme world. It's too complex for us to manage. What's worse, our mind and body think we're still in our natural environment. This leads to our biases—mistaking what we see for what there is.
Below are four main categories of fallacies that make us blind to Black Swans.
A baby's cry is more persuasive than a statistic.
One death is a tragedy; a million deaths is a statistic.
We generalize from what is seen
The new environment operates by laws we don't know because they didn't concern us throughout our history: the compound effect, the network effect.
We also put the tangible, sensational, and emotional—i.e., what is seen—over the abstract, random, and uncertain—i.e., what is not seen.
Many self-help books are filled with single examples of people "who made it." The question is, which of them made it simply because of luck? How can you evaluate it in order to decide if it's worth following their example?
One way is to test it through a low-cost experiment.
Another way is to look at the baseline: of all the people who did the same as the successful one, how many succeeded?
In Extremistan it's often luck—not ability, character, etc. (because those who had them also lie in the cemetery). This shows the importance of "catching luck" by increasing your exposure to Black Swans of the positive kind.
Survivorship bias can hide best where its impact is largest. The severely victimized—those who landed in the cemetery—are likely to be eliminated from the evidence. The weakness of a very incompetent police force can be hidden by the fact that so many thugs get away and are never reported; we only talk about those who got caught. The same can be true for extreme successes in business, finance, and other Extremistan areas. Can you create a rule of thumb that the more extreme the winner, the bigger the survivorship bias (or the bigger the unseen)?
The way to avoid survivorship bias is to look at everyone who started in the cohort. Don't just look at the winners but at the losers as well. So, when you want to evaluate whether a particular decision is good, look at all the people who made the decision—not just at the ones who made it and succeeded. For example, when you want to decide whether to go on a keto diet, look for data that takes a large group of people and tracks how it went (don't just watch that one keto YouTuber). Or another, more relevant instance: when you see someone who is successful in an area and want to emulate them, stop. Before you decide to do what they did, look at a larger sample of people who followed similar behavior in that area and look at their outcomes. Maybe they're just one of the lucky ones who succeeded. Don't forget about those who are hidden, those we don't see.
The bigger the sample, the more extravagant the winners will be.
This is exactly the way to notice the hidden. Look at the whole sample (the base rates). Ok, so this is another way to fight the survivorship bias (add it there earlier).
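A toy simulation makes this concrete (my own sketch, not from the book; the ten-year horizon, the 50% odds, and the cohort sizes are arbitrary assumptions). Success each year is decided by a fair coin, yet every large cohort still produces people with flawless track records, and the bigger the cohort, the more of them there are:

```python
# A minimal sketch of survivorship bias: success is pure coin-flip luck,
# yet big cohorts still produce "flawless" ten-year track records.
# All numbers are arbitrary assumptions for illustration.
import random

random.seed(42)

YEARS = 10          # a "guru" needs 10 good years in a row
P_GOOD_YEAR = 0.5   # pure luck: a fair coin decides each year

def flawless_survivors(cohort_size: int) -> int:
    """Count people who got a good year every single year by luck alone."""
    return sum(
        all(random.random() < P_GOOD_YEAR for _ in range(YEARS))
        for _ in range(cohort_size)
    )

for size in (1_000, 10_000, 100_000, 1_000_000):
    winners = flawless_survivors(size)
    print(f"cohort {size:>9,}: {winners:>4} flawless track records "
          f"(base rate {winners / size:.4%})")
```

The expected base rate is 0.5^10 ≈ 0.1%, so a million-person cohort yields hundreds of perfect records by luck alone; interviewing only the survivors tells you nothing, while the base rate over the whole sample exposes the luck.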
Also, to evaluate survivorship bias you have to refrain from looking for causes and effects, which is tempting because of compression (the narrative fallacy). Why? Because the success is simply a matter of luck, not of inputs. You are looking at the "rosy" scenario that happened to play out. We can't resist putting a narrative on it rather than accepting randomness (or our inability to understand the causes because they're too complex).
Usually there's no right answer because reality is too complex and we just can't account for all the variables. But in school we are forced to come up with explanations, and withholding judgment is shamed. Why are we forced to judge such complex things? Better to use the WRAP method, or just to ooch and test our answer. TK P: Complicated things are complicated
We understand the linear (because reality used to be linear) but don't understand the compounding and the asymmetric. We just can't grasp it. We're bad at big numbers.
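A tiny worked example of how far linear intuition misses compounding (the 1% rate and the one-year horizon are arbitrary assumptions of mine):

```python
# Linear intuition vs. compounding (numbers chosen only for illustration).
DAYS = 365
RATE = 0.01

linear = 1 + DAYS * RATE        # add 1% of the starting value every day
compound = (1 + RATE) ** DAYS   # grow 1% on the running total every day

print(f"linear after a year:   x{linear:.2f}")    # ~x4.65
print(f"compound after a year: x{compound:.2f}")  # ~x37.78
```

Intuition expects the two to land in the same ballpark; after just one year they're almost an order of magnitude apart.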
These biases come from the fact that we didn't evolve to think but to act. Thinking is but a means to acting, not the other way around. A hunter-gatherer who pondered whether the approaching tiger was real or not was replaced by your ancestor who simply ran away. Therefore, thinking evolved only to the extent it enabled better acting—better acting in Mediocristan. Our thinking evolved to serve the sensational, emotional, symmetric (need a synonym for that: action = reaction) environment and not the abstract, random, asymmetric one. In other words, our brains are outdated, and we're not able to handle the complexity of the increasingly Extremistan world without errors.
We can't always perform experiments in life because we can't clone ourselves. But each action has an alternative result that would have come from inaction or another action. If we could see all the alternatives, we would learn very quickly. How can one achieve that without cloning? Premortems? Maybe one could do premortems for success, failure, and inaction? Also, maybe one can overcome this by making smaller-scope experiments that can actually be A/B tested. For example, instead of quitting your job to become a full-time writer, start writing after hours and see where it leads. Or maybe another way would be to read about others who performed an action and succeeded, failed, or didn't perform the action at all.
Because we judge people and governments based on what's seen and not on what isn't, we can't properly evaluate whether inaction would have been the better action.
To our brains, the relevant is the sensational. This is why we are drawn to the sensational rather than the relevant: the relevant is often boring. To make the relevant more interesting you must lower your dopamine threshold. It's a solution to so many problems caused by technology. I should be the biggest ambassador of NSD.
For most of our history, work and results, inputs and outputs, were closely related. You were thirsty; drinking brought you adequate satisfaction. You were building a hut; more work led to more apparent results, so your mood was propped up by visible feedback. In other words, what was relevant was the sensational. This is why we are so attracted to sensory and concrete information. Unfortunately, in our world what's most relevant becomes more and more boring and nonsensational.
It's like coming to a gunfight with a knife times 100.
We either need an update (Neuralink?) or tools that make up for our outdated brains.
Information is very costly to acquire, store, organize, manipulate, and retrieve. Therefore, we developed mechanisms to make this process more efficient through compression.
Compression happens through different operations such as imposing order on what is disorderly: forming narratives, categories, and patterns.
Because of this compression, information is less costly to handle (we have a key that unlocks the whole context of the information), but it's also more erroneous: compression strips out dimensionality and randomness, which leads to biases, and that stripped-out part is where the Black Swans reside.
How does this connect to mental representations from R: Peak by Anders Ericsson?
Knowing that, fight the urge to jump to a conclusion, a causal explanation, or a decision.
Maybe use WRAP to make better decisions.
Empiricism.
What are other methods to avoid this bias?
How does writing relate to that? Does it help? Checklists and playlists would help.
Confusing plausible with probable.
Prioritizing coherence: "It has to make sense."
Jumping to conclusions
He has a doctorate, he must be smart.
I like him, he must be good.
Halo effect.
Narrative Fallacy: seeing connections everywhere
We simplify to match our beliefs
Also, we continuously adjust our memories to make them better fit our desired narrative.
Interesting: the narrative fallacy has a chronological dimension and leads to the perception of the flow of time.
We see explanations everywhere and are incapable of accepting the idea of unpredictability. ("Death can't just be the end")
Where might the narrative fallacy come from? Its purpose is to theorize about what might happen in the future. It is a big advantage because it allows us to create many simulations of choices in our mind without risking our lives. It is very useful in simple and tangible situations. However, when we enter Extremistan, we often make wrong assumptions.
The narrative fallacy sprang from our evolutionary need to theorize about what might happen in the future. This ability allowed us to avoid predators: for example, when we were chilling in the cave and heard a cracking sound outside, we could translate this cue into a prediction that it might be a lion and avoid the attack.
This ability allowed us also to create multiple simulations of reality, enabling us to evaluate different scenarios without risking our lives.
It was, and still is, very useful in Mediocristan, where inputs are closely tied to outputs, but it starts to fool us in Extremistan.
We can't think without making connections and jumping to conclusions. It's tough for us to analyze information in isolation.
We prefer what seems more plausible (like statements with "because") to single facts. "He killed his wife" will seem less likely than "He killed his wife because she cheated on him."
Also, explanations bind facts together, which makes them easier to remember. It's a compression mechanism.
Where it can go wrong is when it increases our impression of understanding. So, mental representations aren't inherently bad. They're only bad when they falsely make us believe we know more than we actually do: when we start substituting, choosing the more plausible explanation, etc. Relates to mental representations from R: Peak by Anders Ericsson and substitution from R: Thinking Fast and Slow by Daniel Kahneman.
I think we're at a higher risk of this fallacy in Extremistan environments. In predictable environments with lots of feedback, this bias is diminished.
So, to avoid this fallacy we need to, first, refrain from forming stories, conclusions, and explanations about complicated subjects, and second, use empiricism and practice to get closer to the truth.
Further, knowing about this fallacy, you should avoid noisy data as much as possible. Each input is a potential trigger for the mental machinery to start creating narratives. And when those narratives are based on false data, the problem just multiplies.
We create stories because orderly, less random, patterned, narrativized series of words and symbols are easier to manage (obtain, store, retrieve). Thanks to this compression you can overcome the limits of your working memory.
On a more practical level, narratives help us learn from the past and predict the future (if this, then that, therefore). The last time we heard a certain noise outside our cave, it was a bear that killed my cousin; therefore, we need to be wary every time we hear it. This works in a predictable environment with lots of feedback (Mediocristan, intuition), but it gets us into trouble in complex, asymmetric, non-linear subjects, where we can't use our intuition anymore.
Why doesn't it work in Extremistan? The more complex the information (the more random it seems), the more difficult it is to summarize. And the more you summarize, the more order you impose and the less randomness remains. So, the mechanism that makes us simplify pushes us to think the world is less random than it actually is. And the Black Swan is what we leave out of the simplification.
So, it's a vicious cycle: the more complex the information, the more we need to simplify in order to make sense of it; the more we simplify, the more we strip it of dimensionality and nuance, which makes us think the world is less random than it actually is.
We make mistakes when we use the intuitive System 1 in situations where System 2 should be used. These situations usually involve complex subjects.
How to handle complex (seemingly random) information if we shouldn't compress it?
Splitting it up into smaller parts? You can then test them more quickly. Also, smaller parts contain fewer Black Swans because you're simplifying less. So you can get an understanding of a complex thing by splitting it into less complex parts, understanding those, and then combining them to understand the whole. It also relates to knowing the basics: understanding the building blocks of a subject makes you less liable to oversimplification because you already have true parts of it. This also speaks to first-principles thinking as a tool for splitting and getting to the truth.
Experimentation?
Using computers?
Finding answers from others, preferably elders or old books, because they're repositories of complicated inductive learning that has passed the test of time.
Giving it time—"procrastinating" on reaching conclusions—saying "I don't know" until it plays itself out or until you reach greater understanding by following the previous steps or until the unconscious connects the dots (link to time of slack and better decision making)
productIdea now I realized that thinking through writing in Roam (and other similar tools) might be too linear. I'm lacking the dimensionality, the connections. I'd rather have it more as a board where I can readily jump between the linked notes to "see" the connections.
And how does it relate to my writing? Ain't I simplifying too much leaving out necessary details? Isn't it the fallacy of all generalizers?
Because of the narrative fallacy's power, you can use stories to increase your persuasiveness.
We use narrative to give us an illusion of understanding (otherwise we would go mad because of our inability to understand the complexities of the world) and give ourselves cover for our past actions.
Our thoughts are mostly post-hoc rationalizations that compel us into believing we know more than we actually do and give cover to our past actions.
Post hoc fabrication P: People judge mostly emotionally
This is platonicity because it's about creating ideal stories, categories, patterns that are simplistic.
Tunnel vision is caused by our brain's inherent laziness. We prefer the more prominent/focal (priming), fresher (availability), closer (association), and bigger (confirmation) constellations of neurons.
We are lazy thinkers.
Activating a particular area of the brain biases you toward related thoughts and suppresses unrelated ones because of the energy-saving imperative.
Further, we have a false belief that we understand how we and the world works. We are driven so heavily by instincts and by the subconscious.
P: You're not as rational as you think P: Maslow's Pyramid of Needs P: Is everything really revolving around procreation? All biases basically
We overestimate what we know, and underestimate uncertainty, compressing the range of possibilities.
Looool. This is one of the reasons for our tendency not to change a decision after we've reached it: overconfidence. Very nice! So, we used to make decisions in predictable environments (outputs tied to inputs, repeatable past, tangible, etc.) where being an expert truly yielded bigger results. So intuitions were often right, and changing one's mind after the decision wasn't useful because it hindered the execution.
We fool ourselves into believing that we made a good decision (hindsight bias), that we are right (confirmation bias).
All of this leads to stripping reality of its dimensionality, complexity, and nuance, which is precisely where Black Swans reside.
Relevant notes (PN: )
P: Mediocristan and Extremistan: two realities we live in.
P: You're not as rational as you think: it's scary how much happens without our awareness.
P: Half of the success behind good decision-making is avoiding negative emotions: emotions suppress rational parts of the brain which makes us fall for our biases more dramatically.
Survivorship bias seems to be a combination of all the fallacies mentioned. We focus on what's seen—the successful ones—over what's not seen—the cemetery (Sensualization). We instantly generate a coherent explanation of why this individual succeeded, ignoring the nuances and the randomness (compression). We look for data that will confirm this explanation (tunnel vision). Then we're overconfident in our theory. Are other biases a combination of these fallacies? Are these fallacies something meta?
Looking here, isn't it very similar to WRAP from Decisive? Widening your options is for sensualization. Reality-testing is for tunnel vision. Attaining distance is for... Preparing to be wrong is for epistemic arrogance. Ah... I don't know. I feel that there's something missing in this whole bias shit. Or we have a wrong model for it, as that one writer said: we look at biases as deviations from the norm (rationality), when in reality rationality is not the norm. But maybe it is the norm, just not in the Extremistan environment? I don't know.
Mediocristan is the reality we evolved for. Extremistan is well... too extreme for us because it doesn't address our linear and sensory needs. More: P: We evolved for Mediocristan
P: Why can't we see Black Swans?: here is more about platonicity.
Why is this happening? In short, our minds are not designed to handle the complexity of the modern world. Here are a couple of reasons for that.