Who Bears the Risk?
I remember the meeting clearly. It was a Tuesday afternoon, one of those airless conference rooms with a projector that took five minutes to connect. A team from the trust's transformation unit was presenting a redesigned acute admissions pathway. The slides were polished. The flowchart was elegant. Estimated time savings per patient: fourteen minutes. Projected reduction in corridor waits: thirty per cent.
I looked around the room. The people presenting had titles like Head of Service Improvement and Programme Lead for Urgent Care Transformation. Not one of them had ever worked a night shift on the acute medical unit. Not one of them would be standing at the nursing station at 3am when the pathway broke down, as pathways inevitably do when beds are full and the on-call registrar is managing three sick patients simultaneously.
I asked what happens when there are no beds. A pause. "The pathway assumes flow," someone said. I asked what happens when the single overnight FY1 is already clerking two patients and a third ambulance arrives. Another pause. "That's an operational issue." The pathway, apparently, existed in a world without operational issues.
A colleague pushed further: who exactly would be explaining to a distressed family at two in the morning that the protocol says their father cannot be admitted to the ward he needs? The room shifted uncomfortably. We were told we were being "resistant to change."
But we were not resistant to change. We were resistant to bearing the consequences of someone else's decision. The people who designed the pathway bore no risk if it failed. If patients waited longer, if staff burned out, if something went wrong at night, none of it would touch the people in that conference room. The clinicians and patients would absorb the downside. The pathway designers would move on to the next transformation project, CV updated, lesson unlearned.
The symmetry principle
Nassim Nicholas Taleb's core argument in Skin in the Game is deceptively simple: risk must be symmetric.[1] Those who get the upside of a decision must also bear the downside. When this symmetry breaks, you get moral hazard, systemic fragility, and the slow decay of institutions.
This is not a modern insight. It is one of the oldest principles of civilisation. Every functioning society has found ways to ensure that the people who make decisions are exposed to the consequences of those decisions. When that link breaks, systems degrade. Not immediately, not visibly, but inevitably.
"Never trust anyone who doesn't have skin in the game. Without it, fools and crooks will benefit, and their mistakes will never come back to haunt them."
The principle works in both directions. People with skin in the game make better decisions not because they are smarter or more virtuous, but because reality punishes them for being wrong. Remove the punishment, and you remove the learning.
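To make the asymmetry concrete, here is a toy calculation, sketched in Python with numbers invented purely for illustration. A redesign usually delivers a modest gain but occasionally causes a large harm. If the decision-maker captures the gain and is insulated from the harm, the project looks attractive to them even when it is a net loss for the system as a whole.

```python
# Toy illustration of asymmetric risk. All figures are invented.
# A "transformation" usually saves a little, but occasionally fails badly.
p_success, p_failure = 0.8, 0.2
gain_if_success = 10     # benefit credited to the decision-maker (arbitrary units)
harm_if_failure = -60    # cost absorbed by staff and patients (arbitrary units)

# Expected value for the system as a whole: upside and downside both count.
system_ev = p_success * gain_if_success + p_failure * harm_if_failure

# Expected value as seen by an insulated decision-maker: the downside is
# someone else's problem, so it simply does not enter their calculation.
insulated_ev = p_success * gain_if_success + p_failure * 0

print(f"System-wide expected value:      {system_ev:+.1f}")    # -4.0: net harm
print(f"Insulated decision-maker's view: {insulated_ev:+.1f}") # +8.0: looks worth doing
```

Restore the symmetry, so that the minus sixty lands on the person who signed off, and the same arithmetic says: do not proceed unless the odds genuinely favour it.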
Hammurabi knew
One of the oldest surviving legal codes contains what might be history's most effective quality assurance mechanism. Hammurabi's Code, circa 1754 BC, stipulated: if a builder builds a house and the house collapses and kills the owner, the builder is put to death.[2]
Brutal? Absolutely. Effective? Without question. The builder pays attention. Every beam, every joint, every foundation stone matters when your life depends on the structural integrity of what you create.
Now contrast that with modern liability structures. When a hospital trust implements a flawed IT system, who bears the cost? Not the vendor, who has contractual limitations on liability. Not the consultancy that recommended the system, whose engagement ended months ago. Not the board that approved the procurement, who will have moved to different roles by the time the problems surface. The cost falls on the nurses who spend extra hours on workarounds, the doctors who miss information buried in poorly designed interfaces, and the patients whose care suffers in ways too diffuse to trace back to a single decision.
The consequence has been diffused, delayed, and absorbed by people who had no say in the original decision. Hammurabi would not have approved.
| Symmetry level | Description | Example |
|---|---|---|
| Full symmetry | Decision-maker bears all consequences | Surgeon who operates on their own patients |
| Partial symmetry | Decision-maker bears some consequences | GP whose practice rating reflects referral outcomes |
| Transferred risk | Consequences fall on others, with some accountability | Hospital board setting staffing levels |
| Complete asymmetry | Decision-maker is fully insulated from consequences | Consultancy redesigning a pathway they will never walk |
The agency problem in healthcare
Healthcare is riddled with what economists call the agency problem. The people making decisions are systematically separated from the people living with the results.
Policy-makers mandate targets they will never have to meet. Consultancies redesign pathways they will never walk. Regulators set standards they will never operate under. Hospital boards make staffing decisions whose consequences fall on nurses and patients. At every level, authority and consequence have been decoupled.
The diagram tells the story. The arrow of authority flows downward. The arrow of consequence flows downward too, but it skips the top. The decision-maker is insulated. The patient absorbs the impact. And the frontline staff are caught in the middle, accountable for outcomes they did not design.
"Bureaucracy is a construction by which a person is conveniently separated from the consequences of his or her actions."
This is not a conspiracy. It is a structural feature of large institutions. As organisations grow, layers of abstraction accumulate between those who decide and those who are affected. Each layer is individually reasonable. Collectively, they create a system in which nobody is truly responsible for anything.
Why skin in the game is informational
Here is the insight that most people miss, and the one that elevates Taleb's argument beyond a simple complaint about fairness. Skin in the game is not just an ethical requirement. It is an epistemological one.
People with skin in the game acquire knowledge that people without it cannot. This knowledge is not theoretical. It is not the kind you get from reading reports or attending conferences. It is embodied, contextual, and only available through direct exposure to consequences.
A surgeon who operates makes different decisions from an administrator who approves surgeries. Not because the surgeon is smarter, but because the surgeon has been punished by reality for poor judgement in ways the administrator never will be. The surgeon knows which complications arise from which shortcuts, which anatomical variations demand caution, which patients need more time. This knowledge lives in the hands and the gut, not in the spreadsheet.
The same principle applies everywhere. A nurse who has worked nights knows things about patient flow that no capacity model captures. A GP who manages a panel of two thousand patients understands demand patterns that no utilisation dashboard reveals. A junior doctor who has been shouted at by a consultant for missing a diagnosis develops pattern recognition that no algorithm replicates.
"Those who talk should do, and only those who do should talk."
This is why Taleb insists that skin in the game is not optional. Without it, you are not just making unfair decisions. You are making uninformed ones. The feedback loop between action and consequence is the primary mechanism through which practical knowledge is generated. Cut the loop, and you cut the learning.
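A minimal sketch of that claim, again in Python with numbers invented for illustration: two estimators guess the same quantity repeatedly. One is corrected by the outcome after every attempt; the other never sees the outcome. Only the one exposed to consequences converges.

```python
true_value = 37.0        # the "reality" being estimated (invented for illustration)

exposed_guess = 10.0     # this estimator is shown its error after every attempt
insulated_guess = 10.0   # this estimator never sees the outcome
learning_rate = 0.3

for _ in range(30):
    consequence = true_value - exposed_guess      # reality pushes back
    exposed_guess += learning_rate * consequence  # correction happens only because the error is felt
    # the insulated estimator repeats the same confident answer, uncorrected

print(f"Exposed to consequences:     {exposed_guess:.1f}")    # converges towards 37.0
print(f"Insulated from consequences: {insulated_guess:.1f}")  # still 10.0
```

The mechanism is trivial, but the point is structural: the update exists only because the error is observed. Delete the feedback and the estimator, like the insulated expert, stays confidently wrong.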
The IYI problem
Taleb coined a term for a particular kind of expert: the Intellectual Yet Idiot (IYI). These are people who are educated, articulate, and credentialed, but who have never borne consequences for their ideas.[3]
The term is deliberately provocative, and it deserves careful handling. Taleb is not arguing against expertise or education. He is arguing against a specific failure mode: the confusion of theoretical knowledge with practical wisdom.
The IYI designs systems for a world they have never inhabited. They produce elegant frameworks that collapse on contact with reality. They mistake fluency for competence and credentials for skin in the game. And because they never face consequences, they never update their models. They move from one failed initiative to the next, always confident, never corrected.
Healthcare has more than its share. The management consultant who has never taken a patient history but redesigns clinical workflows. The health economist who models bed occupancy but has never tried to discharge a complex patient on a Friday afternoon. The policy adviser who mandates four-hour emergency targets without understanding why they are routinely breached.
This is not an argument against management, economics, or policy. These disciplines matter. But they matter most when practised by people who have, at some point, been exposed to the consequences of getting it wrong. The best hospital managers I have worked with are those who have spent time on the ward. The best policy-makers are those who have delivered care. They bring something that no amount of analytical brilliance can substitute: the humility that comes from having been wrong in a way that mattered.
Restoring symmetry
If the problem is structural, the solution must be structural too. Appealing to goodwill or professionalism is not enough. We need mechanisms that reconnect decision-making to consequence-bearing.
Clinician-led design. If a pathway will be walked by clinicians and patients, it should be designed by clinicians and patients. Not consulted on. Not "engaged with." Designed by. The people who will live with the consequences should hold the pen.
Mandatory frontline time. Administrators and managers who set staffing levels, approve rotas, or design patient pathways should spend regular time on the frontline. Not as observers. As participants. If you set the nurse-to-patient ratio, you should occasionally work under it.
Lived experience as qualification. In hiring for roles that shape clinical policy, weight should be given to direct clinical experience. Not as the only criterion, but as a necessary one. The question "have you ever done this yourself?" should be asked more often than it is.
Consequence exposure. Where possible, tie the outcomes of decisions to the people who made them. If a consultancy redesigns a pathway and it fails, that should affect future contracts. If a board approves a staffing model and burnout rises, that should appear in their performance review. Not punitively, but structurally: close the loop between decision and outcome.
"Do not pay attention to what people say, only to what they do, and to how much of their necks they are putting on the line."
The principle is simple, if uncomfortable: do not let people make decisions whose consequences they will not bear. This is not a radical idea. It is, as Hammurabi understood nearly four thousand years ago, the foundation of any system that actually works.
The oldest rule
Skin in the game is not a new concept dressed in modern language. It is the oldest rule of functioning societies, rediscovered by every civilisation that has lasted long enough to matter. The Romans required engineers to stand under their bridges. Medieval guilds staked their reputations on their craft. Shipbuilders sailed on their own vessels.
We have spent the last century building institutions that systematically violate this principle. We have created layers of abstraction, limited liability, and professional distance that allow decision-makers to enjoy the upside while pushing the downside onto others. And then we wonder why our institutions feel brittle, why trust erodes, why the gap between policy and reality keeps widening.
The fix is not more oversight, more regulation, or more governance frameworks. The fix is older and simpler: make the people who decide also the people who bear the consequences. When you do, something remarkable happens. They start paying attention.
Related reading:
- Trade-offs Thinking - Why every decision comes with costs, usually borne by someone else
- Move Fast and Break Things? Not in Healthcare - The tension between innovation velocity and patient safety
- Incerto: Antifragile by Design - Why systems should gain from disorder, not merely survive it