Super Thinking: The Big Book of Mental Models

🎨 Impressions

This book wasn’t quite what I thought it was going to be. I was expecting something more along the lines of how to use my mind in new and different ways. But that’s probably because I didn’t know what a mental model was before reading it.

There are no methods in this book. It’s mostly supporting material.

I think the title was chosen to make it sound more interesting. It’s mostly a compilation of ideas and concepts. Things like Minimum Viable Product, Occam’s Razor, and simplicity are considered mental models, so a lot of it was already familiar to me.

Who Should Read It?

If you’re interested in learning concepts and ideas from a lot of different areas — business, philosophy, science, and spirituality are all represented — and how they might be mixed together and applied in ways you didn’t think about, then give it a shot.

📒 Summary + Notes

Chapter 1: Being Wrong Less

De-risking: To make good decisions, you need to be wrong less, and test your assumptions in the real world.

The most important assumptions to de-risk are the ones that are critical to your success and that you’re the most uncertain about.

In business, the minimum viable product, or MVP, is the simplest version of a product or service that you can actually offer. The reason for the MVP is that your assumptions about your product or service are probably wrong, so they need to be tested as soon as possible and revised often based on the feedback you get.

Bringing something to market quickly keeps you from doing too much work before getting that valuable feedback.

Occam’s razor says that the simplest explanation is most likely to be true.

To de-risk, it’s important to question your assumptions. Break things down into their individual assumptions and, for each one, ask yourself:

  • Does this assumption really need to be here?
  • What evidence do I have that it should remain?
  • Is it a false dependency?

The conjunction fallacy: People are hardwired to attach unnecessary assumptions to explanations, finding detailed stories more believable even though each added assumption makes an explanation less likely to be true.

Overfitting: Explaining things using too many assumptions. For example, thinking you have cancer when you have a cold is overfitting your symptoms.

Question your assumptions. Ask yourself: how much does my evidence support my assumptions?

Availability bias: When your view of reality gets distorted by the information most readily available to you, such as what you’ve recently seen or heard.

If you heard a bunch of news reports about shark attacks within a short amount of time, you might believe that shark attacks are on the rise. But this is the availability bias: all you really know is that you’re seeing more reports about shark attacks.

Polls have found that people often overstate the risk of sensationally over-reported causes of death, like tornadoes, by fifty times and understate the risk of common causes of death, like stroke, by one hundred times.

Online, the availability bias takes the form of the filter bubble. Since you’re likely to click on things you already know and like, Google and Facebook show you more of the same.

It’s easy to focus on what you already know. It’s harder to find an objective viewpoint that challenges your assumptions, but that’s what you need to do if you want to be wrong less.

In any conflict between two people, there are two sides to the story. The third story is the one an impartial observer would describe. Forcing yourself to think as that impartial observer can help you in a conflict situation.

The key is learning to describe the gap—or difference—between your story and the other person’s story.

If anything it will help increase your empathy — your understanding of other people’s perspectives — whether or not you agree with the other person.

The most respectful interpretation is basically giving people the benefit of the doubt. It can also help you empathize and build trust with others. Try to remain open to other people’s interpretations and only make a judgment when necessary.

Hanlon’s razor: Never assume someone is being malicious when their actions can be explained by simple carelessness. This is useful online, where you don’t have body language and voice cues to go on. It’s easy to misread text and assume something is negative.

Fundamental attribution error: When you attribute others’ behavior to their internal motivations rather than external circumstances. Self-serving bias is the opposite: you tend to have self-serving reasons for your own behavior but blame others’ behavior on their intrinsic nature.

Like when someone cuts you off in traffic: you assume it’s because they’re a fundamentally rude person. But when you do it to someone else, you think it’s just because you’re in a hurry to get somewhere, not because of some fundamental character deficiency of yours.

Confirmation bias is when you collect and interpret information in a way that confirms your existing beliefs. It’s one reason so many scientific breakthroughs are discovered by outsiders: they aren’t attached to the same paradigms as the insiders.

The backfire effect is when someone tries to change your mind with facts and information, but your confirmation bias is so strong that it has the opposite effect, causing you to become even more attached to your original, incorrect position.

Disconfirmation bias: Where you impose a stronger burden of proof on the ideas you don’t want to believe.

Cognitive dissonance: The stress felt when holding two contradictory (dissonant) beliefs at once.

Instead of dealing with the underlying cause of this stress—the fact that we might actually be wrong—we take the easy way out and rationalize the conflicting information away.

Chapter 2: Anything That Can Go Wrong, Will

Tragedy of the commons: When a shared resource becomes damaged or depleted due to individual decisions made in each individual’s best interest. Everyone thinks they’re doing what’s best for themselves, but the sum total of those decisions produces a worse outcome for everyone.

  • If all the farmers keep getting new cows, then the commons can be depleted.
  • Any shared resource, or commons, is vulnerable to this tragedy.
  • Each additional spam message benefits the spammer, but the mass of spam makes using email worse for everyone.
  • Collective overuse of antibiotics in medicine and agriculture is leading to dangerous antibiotic resistance.

The tragedy of the commons arises from what is called the tyranny of small decisions.

It’s death by a thousand cuts.

Goodhart’s law: When a measure becomes a target, it ceases to be a good measure. When you try to incentivize behavior with a measurable target, people often focus on achieving the target in ways you didn’t intend. And the way they achieve it might not be the behavior you wanted to promote.

For example, if you incentivized a nail factory solely on units of production, they might start making twice as many nails that are half as big as they’re supposed to be just to meet the target.

The cobra effect: When a solution actually makes the problem worse.

When the British occupied India they were concerned about cobras, so they started paying people for every cobra brought to them. But when the cobra population started decreasing, people began breeding cobras to keep collecting the rewards. When the British found out, they ended the policy, and people released all the cobras they had bred, increasing the population even further.

Precautionary principle: When an action could possibly create an unknown amount of harm, you should proceed with extreme caution.

Analysis paralysis: When your decision making stalls because you’re over-analyzing the large amount of information available.

One way to deal with analysis paralysis is to categorize decisions as either reversible or irreversible. Irreversible decisions tend to be hard to undo and are really important. Reversible decisions, not so much.

Decision fatigue. The more decisions you make, the more energy they take, and the worse your decision quality becomes. I think this is why some people wear the same things every day, so they don’t have to spend energy on trivial decisions.

Sayre’s law. Named after political scientist Wallace Sayre, it says that in a dispute, the intensity of feelings is inversely proportional to the value of the issue.

Parkinson’s law of triviality. Named after naval historian Cyril Parkinson, it says that organizations give disproportionate weight to trivial issues.

Chapter 3: Spend Your Time Wisely

How do you choose what to do given your limited time?

“You can do anything, but not everything.”

Opportunity cost. Every choice has a cost: it’s equal to the value of the best alternative choice you didn’t make. Generally, you want to choose the option with the lowest opportunity cost.

Leverage. In finance, it refers to borrowing money to buy assets, and allows gains — and losses — to be multiplied. Leveraging up means increasing debt. Deleveraging means decreasing debt.

Present bias. Overvaluing near-term rewards over making incremental progress on long-term goals. It’s one reason for procrastination.

Loss aversion. You are more inclined to avoid losses than to seek similar gains, because losing something feels worse than gaining something of equal value feels good. It’s why people hold onto losing stocks for too long or stay in a house that has lost value, hoping it will appreciate back above their purchase price.

Sunk-cost fallacy. The illusion that the money you’ve already spent on something should factor into your decision. Those costs have been sunk; you can’t get them back. If the project isn’t going well, is it better to keep throwing money at it or to walk away and cut your losses? You need to avoid thinking “we’ve come too far to stop now.”

Reframe the problem. Look at difficult problems in an entirely different way to reach a solution more quickly.

It’s what Disney World did when they were faced with the problem of long lines. They reframed the problem from “How do we move people through the line faster?” to “How do we make people happier while they wait in line?”

Chapter 4: Becoming One With Nature

The scientific method can be looked at as simply maintaining an experimental mindset. The most successful people and businesses are constantly learning and changing what they work on and how they work.

The same mindset can be applied to all areas of life to continue learning new things and experimenting to get closer to optimal strategies and solutions.

Strategy tax. Locking yourself into rigid long-term strategies that limit your ability to adapt when circumstances inevitably change. What strategy taxes are you currently paying?

Shirky principle. Institutions will try to preserve the problem to which they are the solution.

Flywheel. It takes a lot of effort to get something moving, but once it is, it takes little effort to keep it going. This metaphor applies to how you can use momentum in your habits to continually improve.

Greatness doesn’t come all at once, or from any single action or decision, but rather turn by turn of the flywheel. Your sustained efforts over time will compound on each other.

Chapter 5: Lies, Damned Lies, and Statistics

Anecdotal evidence. Informal evidence gathered from personal anecdotes. You get yourself into trouble when you make generalizations based on this type of evidence or when you weigh it more heavily than scientific evidence.

Anecdotal evidence usually misrepresents a given situation. Since people are more likely to share stories that are either much better or much worse than normal, you end up with extremes.

Correlation does not imply causation. Just because two events happened in succession, or are correlated, doesn’t mean that the first actually caused the second.

Be skeptical of first impressions, and don’t assume that results based on a small data set are typical.

You should even be skeptical of published studies, because studies are more likely to be published if they show statistically significant results, a phenomenon called publication bias.

The problem is that getting published is often a requirement for researchers to advance their careers, so they’ll re-run tests specifically to look for significant results, a practice called data dredging.

Because of these problems, when you’re evaluating a claim you should always make sure it’s backed by a body of research rather than an isolated study. The way the studies were designed also needs to be taken into account: were all biases accounted for when the data was analyzed?

Try to find out whether someone has already published a systematic review of your question. Systematic reviews evaluate a research question based on the whole body of research on a subject.

Chapter 6: Decisions, Decisions

Maslow’s hammer. If all you have is a hammer, everything looks like a nail.

The hammer of decision-making models is the pro-con list: useful in some instances, but not the optimal tool for every decision.

Cost-benefit analysis. An improved pro-con list: it scores the items rather than treating them all equally. Instead of putting plain numbers next to the items, you put dollar values next to them. Then you can add up the amounts and get an estimate of that decision’s worth to you in dollars.

This only works well if you are thorough, because you will use that final number to make decisions.
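
To make the adding-up concrete, here’s a minimal sketch in Python. The scenario (a job offer) and every dollar value are made up for illustration; none of them come from the book:

```python
# Cost-benefit analysis as a scored pro-con list: benefits are positive
# dollar values, costs are negative, and the sum estimates the decision's
# net worth to you. All values below are invented for illustration.

factors = {
    "higher salary (first year)": 15_000,
    "better growth opportunities": 10_000,
    "longer commute (gas + time)": -4_000,
    "fewer vacation days": -2_000,
}

for name, value in factors.items():
    print(f"{name:32s} {value:>+8,}")

net = sum(factors.values())
print(f"{'net value of taking the offer':32s} {net:>+8,}")
```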

Another improvement for more complex decisions is the decision tree. It’s a diagram that looks like a tree (drawn on its side), and helps you analyze decisions with uncertain outcomes.
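
Under the hood, a decision tree compares options by expected value: weight each uncertain outcome’s payoff by its probability, then sum per branch. A minimal sketch, with a hypothetical settle-or-litigate decision and probabilities and payoffs I invented (not an example from the book):

```python
# Each option is a list of (probability, payoff) branches. The expected
# value of an option is the probability-weighted sum of its payoffs.
# All numbers are invented for illustration.

options = {
    "settle out of court": [(1.0, -20_000)],             # certain cost
    "go to trial":         [(0.6, 0), (0.4, -100_000)],  # win or lose
}

for name, branches in options.items():
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9  # sanity check
    expected = sum(p * payoff for p, payoff in branches)
    print(f"{name}: expected value = {expected:+,.0f}")

# Settling (-20,000) beats going to trial (-40,000) on expected value,
# though expected value alone ignores risk tolerance and extreme events.
```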

Black swan events are extreme events, like those ending in financial ruin, that are more likely than you might expect.

Counterfactual thinking. Thinking about how the past could have been different, countering the facts of what actually happened.

Examples from your own life can help you improve your decision making when you think through the possible consequences of your past decisions. What if I had taken that job? What if I had gone to that other school? What if I hadn’t done that side project?

Lateral thinking. Thinking outside the box. Asking what-if questions can help, since they generate scenarios that might not be obvious to you. It’s different from critical thinking, which is more about analyzing something already in front of you.

Randomness can help with lateral thinking: pick something from your surroundings at random, or a word from the dictionary, and work it into your thought process.

Chapter 7: Dealing With Conflict

Social norms versus market norms. When you consider something from a market perspective (like doing a job for money), you consider it in the context of your own financial situation. But, when you consider something from the social perspective (like doing your friend a favor), you consider it in the context of whether it is the right thing to do (“My friend needs my help for four hours, so I am going to help her”).

You have to be careful not to accidentally replace social norms with market norms, because you may end up eliminating benefits that are hard to bring back.

Once social norms are undermined, the damage has been done and they aren’t norms any more. So be careful if you’re thinking about offering monetary incentives in a situation where social norms are the standard.

Ultimatum game. Played by two people. The first person gets some money and has to offer to split it with the second person. The offer is the ultimatum. And the second person only has two choices: to accept or reject the offer. If they accept, they both keep their split. But if they reject it then they both get nothing.

Usually the second person will reject offers lower than about 30% of the total because they think it’s unfair. They’d rather deny the first person anything, even if it means they also get nothing.
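
The rules are simple enough to encode. Here’s a minimal sketch of a single round, assuming the responder rejects anything below the rough 30% threshold mentioned above (the function and threshold are illustrative, not from the book):

```python
# One round of the ultimatum game: the proposer offers a split of the
# pot; the responder either accepts (both keep their shares) or rejects
# (both get nothing). The 30% cutoff is the rough empirical threshold.

def play_round(pot: float, offer: float, threshold: float = 0.30):
    """Return (proposer_payoff, responder_payoff) for one round."""
    if offer < threshold * pot:
        return 0.0, 0.0            # rejected as unfair: nobody gets paid
    return pot - offer, offer      # accepted: the split stands

print(play_round(pot=100, offer=40))   # (60.0, 40.0): accepted
print(play_round(pot=100, offer=10))   # (0.0, 0.0): rejected
```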

It’s important to keep this strong desire for fairness in mind when making decisions that affect other people because perceived unfairness triggers strong emotional reactions.

Many arguments try to sway you from rational decision making by pulling at your emotions, including fear, hope, guilt, pride, anger, sadness, and disgust. This type of manipulation is called appeal to emotion.

FUD. Fear, uncertainty, and doubt. FUD is commonly used in marketing (“Our competitor’s product is dangerous”), political speeches (“We could suffer dire consequences if this law is passed”), religion (eternal damnation), etc.

Similar to FUD is the use of a straw man. This is where instead of addressing your argument directly, an opponent misrepresents (frames) your argument by associating it with something else (the straw man) and tries to make the argument about that instead.

Ad hominem (Latin for “to the person”). Where the person making the argument is attacked without addressing the central point they made. Basically just name-calling.

Chapter 8: Unlocking People’s Potential

Joy’s law. Named after Sun Microsystems cofounder Bill Joy, who said, “No matter who you are, most of the smartest people work for someone else.”

Chapter 9: Flex Your Market Power

When you take advantage of price differences for the same product in two different places, it’s called arbitrage.

The opposite of arbitrage is sustainable competitive advantage. This mental model describes a set of factors that give you an advantage over the competition that you can sustain over the long term.

To address questions of timing, ask yourself why now? Would it make a difference if you waited longer? What would you be waiting for in particular?

You can also do the inverse. Instead of asking why now?, ask now what? When you see something change in the world around you, ask yourself what new opportunities might open up as a result.

The why now model also explains why there are often concurrent academic discoveries across the world and similar startups independently emerging simultaneously. Wikipedia has a huge list of instances like these, and there is a name for the concept: simultaneous invention.

“Vision without action is a daydream. Action without vision is a nightmare.” — Japanese proverb

OODA loop. A decision loop of four steps — observe, orient, decide, act (OODA).

The faster you can make your OODA loop, the faster you can incorporate external information, and the faster you’ll reach your destination, be that product/market fit or something else.

What type of customer are you hunting? This model illustrates that you can build large businesses by finding different size customers, from the really small (flies) to the really big (elephants). You can build a huge business by selling to lots of small customers or a few really big ones. There are plenty of examples of both.