# Skin in the Game
**Nassim Nicholas Taleb** | [[Foundations]]

---
> "Their three flaws: they think in statics not dynamics, they think in low not high dimensions, they think in terms of actions never interactions."
Intellectuals and policymakers fail to grasp that complex systems do not have obvious one-dimensional cause-and-effect mechanisms, and that under opacity you do not mess with such a system.
> "Avoid taking advice from someone who gives advice for a living, unless there is a penalty for their advice."
You cannot separate knowledge from contact with the ground—with the real world. Knowledge gained by tinkering, via trial and error and the workings of time, is vastly superior to that obtained through reasoning.
> "Skin in the game is about honour as an existential commitment. If you do not take risks for your opinion, you are nothing."
It's a separation between man and machine, and (controversially) a ranking of humans. Courage—risk taking—is the highest virtue.
Bureaucracy separates a person from the consequences of their actions. Without skin in the game, systems accumulate imbalances and eventually blow up. The only sustainable structure is one where advisors bear downside.
---
## Core Ideas
### [[Skin in the Game]]
The book is about four topics in one: **Uncertainty and knowledge** (or bullshit detection), **Symmetry in human affairs** (fairness, justice, responsibility, reciprocity), **Information sharing** in transactions, and **Rationality** in complex systems and the real world.
The principle: those who don't take risks should never be involved in making decisions. It mirrors the healers' *primum non nocere*—first do no harm: if you have no exposure to the outcome, you have no business making the call.
A system with skin-in-the-game requirements holds together through sacrifice: individuals bear risk to protect the collective, or whichever layer higher in the hierarchy must survive. "Survival talks and BS walks."
### [[Via Negativa]]
We don't learn so much from our mistakes; rather **the system learns by selecting those less prone to a certain class of mistakes and eliminating others**. Systems learn by removing parts, via negativa.
You will never fully convince someone that he is wrong; only reality can. When a system lacks a skin-in-the-game mechanism, imbalances build up until it eventually blows up and self-repairs that way. If it survives.
You never cure structural defects; the system corrects itself by collapsing. Hammurabi understood this: "If a builder builds a house and the house collapses and causes the death of the owner—the builder shall be put to death."
### [[Rationality as Survival]]
**There is no such thing as the "rationality" of a belief, there is rationality of action.** The rationality of an action can be judged only in terms of evolutionary considerations.
> "How much you truly 'believe' in something can be manifested only through what you are willing to risk for it."
The only definition of rationality that is practically, empirically, and mathematically rigorous: **what is rational is that which allows for survival**.
Much of what we call "belief" is background furniture for the human mind, more metaphorical than real. When we look at religion and ancestral superstitions, we should consider what purpose they serve, rather than focusing on "belief." In real life, belief is an instrument to do things, not the end product.
Nobody has managed to build a criterion for rationality based on actions that bear no cost. But actions that harm you are detectable.
---
## Key Insights
**Things designed by people without skin in the game tend to grow in complication (before their final collapse).** When you are rewarded for perception, not results, you need to show sophistication. Someone in such a position has absolutely no incentive to propose something simple.
> "Things designed by people without skin in the game tend to grow in complication (before their final collapse). When you are rewarded for perception, not results, you need to show sophistication."
Never pay for complexity of presentation when all you need is results. People who are bred, selected, and compensated to find complicated solutions do not have an incentive to implement simplified ones.
> "Never pay for complexity of presentation when all you need is results."
Skin in the game brings simplicity—the disarming simplicity of things properly done. Decentralisation is based on the simple notion that **it is easier to macrobullshit than microbullshit**. Decentralisation reduces large structural asymmetries.
**People have two brains, one when there is skin in the game, one when there is none.** An employee is—by design—more valuable inside a firm than outside it; more valuable to the employer than to the marketplace.
The employee has a very simple objective function: fulfil the tasks that his supervisor deems necessary, or satisfy some gameable metric. People whose survival depends on qualitative "job assessments" by someone of higher rank cannot be trusted for critical decisions.
**What matters isn't what a person has or doesn't have; it is what he or she is afraid of losing.** The more you have to lose, the more fragile you are. A free person does not need to win arguments—just win.
**Ruin and other changes in condition are different animals.** Every single risk you take adds up to reduce your life expectancy. **Rationality is avoidance of systemic ruin.**
> "Ruin and other changes in condition are different animals. Every single risk you take adds up to reduce your life expectancy. Rationality is avoidance of systemic ruin."
One may be risk loving yet completely averse to ruin. The central asymmetry of life: **in a strategy that entails ruin, benefits never offset risks of ruin**. Never compare a multiplicative, systemic, and fat-tailed risk to a non-multiplicative, idiosyncratic, and thin-tailed one.
Fragility is in the dosage: falling from the 20th floor is not in the same risk category as falling from your chair. Contrary to what psychologists tell you, some "overestimation" of tail risk is not irrational—it is more than required for survival. There are some risks we just cannot afford to take.
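To see why ruin is a different animal, here is a minimal sketch (illustrative numbers, not from the book) of how even a small per-exposure probability of ruin compounds: ruin is absorbing, so survival probabilities multiply, and repeated exposure drives the chance of still being in the game toward zero.

```python
# Minimal sketch with assumed, illustrative numbers (not from the book):
# ruin is absorbing, so survival over repeated exposures multiplies.
def survival_probability(p_ruin: float, exposures: int) -> float:
    """Probability of never hitting ruin across independent exposures."""
    return (1.0 - p_ruin) ** exposures

for p_ruin in (0.01, 0.001):            # 1% and 0.1% chance of ruin per exposure
    for exposures in (10, 100, 1000):   # how many times the risk is taken
        p_alive = survival_probability(p_ruin, exposures)
        print(f"ruin risk {p_ruin:.1%} x {exposures:4d} exposures "
              f"-> still in the game with probability {p_alive:.4f}")
```

A 1% chance of ruin taken a thousand times leaves roughly a 0.004% chance of still being around; no finite per-round payoff offsets that, which is the sense in which benefits never offset risks of ruin.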
**Courage (risk taking) is the highest virtue.** We need entrepreneurs. Courage is when you sacrifice your own well-being for the sake of the survival of a layer higher than yours. Selfish courage is not courage. A foolish gambler is not committing an act of courage, especially if he's risking other people's funds.
> "Courage (risk taking) is the highest virtue. We need entrepreneurs."
Love without sacrifice is theft. This applies to any form of love. Skin in the game can make boring things less boring. The mere presence of an assistant suspends your natural filtering—their absence forces you to do only things you enjoy.
**People's "explanations" for what they do are just words, stories they tell themselves.** What they do, on the other hand, is tangible and measurable and that's what we should focus on. We should consider what purpose beliefs serve, rather than focusing on epistemic belief in its strict scientific definition.
**The knowledge we get by tinkering, via trial and error, experience, and the workings of time—contact with the earth—is vastly superior to that obtained through reasoning.** Something self-serving institutions have been very busy hiding from us.
Just like Antaeus, you cannot separate knowledge from contact with the ground. Actually, you cannot separate anything from contact with the ground. And the contact with the real world is done via skin in the game.
**The main idea behind complex systems is that the ensemble behaves in ways not predicted by its components.** The interactions matter more than the nature of the units. Studying individual ants will almost never give us a clear indication of how the ant colony operates.
Individuals don't need to know where they are going; markets do. Leave people alone under a good structure and they will take care of things. Human nature is not defined outside of transactions involving other humans—we do not live alone, but in packs.
Small is preferable owing to scale properties. Some things can be, simply, too large for your heart. You know instinctively that people get along better as neighbours than roommates.
---
## Connects To
- [[Antifragile]] - this is the direct companion; where Antifragile explains the mechanism, this explains the ethics
- [[The Most Important Thing]] - both emphasise that you judge people by what they risk, not what they say
- [[Dead Companies Walking]] - shows what happens when decision-makers are insulated from consequences
- [[Ego Is The Enemy]] - complements the insight about honour and sacrifice over ego
- [[The Fifth Discipline]] - shares the insight about Systems Thinking and interactions versus actions
- [[Playing to Win]] - strategy requires choices, which require courage to take risks
---
## Final Thought
Judge people—including yourself—not by what they say, but by what they risk. Most discourse, most advice, most strategy is words unconnected to consequences.
**Honour is proportional to downside.** If someone gives you advice but bears no cost when that advice fails, ignore them. If someone makes decisions but doesn't bear the consequences, remove them from the decision. If a system allows asymmetry—where some capture upside whilst others bear downside—that system will eventually collapse.
This connects skin in the game to everything: epistemology (knowledge requires contact with the ground), ethics (fairness is symmetry), rationality (the only definition that works is survival), and even religion (belief is what you risk for, not what you profess).
Don't trust employees for hard decisions—they're optimised for pleasing supervisors, not bearing consequences. Don't trust intellectuals who think in terms of actions not interactions—complex systems don't work that way. Don't take systemic risks where ruin is possible—benefits never offset the risk of blowing up.
You're allowed to dismiss anyone who doesn't have skin in the game. In fact, you should. Because without consequences, without downside, people don't have two things you need: real knowledge (from contact with reality) and real courage (from bearing risk). And without those, their advice is just noise.
This isn't about being nice or fair in the way HR departments define it. It's about survival. Systems that separate risk from reward don't limp along—they blow up. The question isn't whether you like Taleb's framing. The question is whether you can point to a counterexample that's lasted.