The Discipline of Less
I remember a patient on a geriatric ward during my junior doctor years. She was in her eighties, admitted after a fall at home. When I picked up her drug chart, I counted fourteen different medications. A statin, two antihypertensives, a proton pump inhibitor, two inhalers, warfarin, paracetamol, a sleeping tablet, an antidepressant, and several others I had to look up. Some of them had been prescribed to counteract the side effects of other medications on the list. A classic prescribing cascade.
The patient was drowsy, off her food, confused. The admitting team had ordered bloods, a CT head, a urine dip. More tests, more data, more interventions queued up. The instinct was to add.
Then a geriatrician colleague did something I had rarely seen. She sat down with the drug chart, the patient's history, and a cup of tea, and conducted what she called a "structured medication review." Over the course of an hour, she stopped nine of the fourteen medications. Deprescribing, she called it. The sleeping tablet was causing the daytime drowsiness. The drowsiness was causing the reduced appetite. The PPI had been started for a reason no one could recall. Two of the medications were treating side effects of drugs that were themselves no longer indicated.
Within two weeks, the patient was more alert, eating better, walking further down the corridor each day, sleeping without a tablet. No new drug had been added. No procedure performed. The best prescription was deprescription. The most effective intervention was subtraction.
That moment crystallised something that has stayed with me ever since: in medicine, as in life, the most powerful act is often knowing what to remove.
The bias toward action
When faced with a problem, humans are wired to do something. Inaction feels irresponsible. It looks like indifference. In a hospital, standing still while a patient deteriorates feels morally unconscionable. In a boardroom, responding to a crisis with "let's wait and observe" sounds like weakness. In politics, "something must be done!" is the universal rallying cry.
But in complex systems, action often creates more problems than it solves. The surgeon who operates unnecessarily. The policy-maker who adds a new regulation to fix the problems created by the last regulation. The product manager who adds a feature to compensate for a confusing interface. Each addition increases complexity, and complexity is where failures hide.
Taleb calls this the "intervention bias" and considers it one of the most dangerous cognitive failures in modern life.[1] We systematically overvalue action and undervalue restraint. We reward the person who does something visible, even when doing nothing would have produced a better outcome. The doctor who prescribes is seen as caring. The doctor who withholds is seen as negligent. The incentives are stacked against subtraction.
"The first principle of iatrogenics: we do not need evidence of harm to claim that a drug or an unnatural via positiva policy is dangerous."
Iatrogenics: harm by the healer
The concept of iatrogenics is medical in origin: harm caused by the physician. The term comes from the Greek for "brought forth by the healer", yet medicine still struggles to apply the concept systematically. Taleb takes it and extends it across every domain where experts intervene in complex systems.[2]
Consider how this plays out in practice. Polypharmacy in the elderly, where each new prescription introduces interaction risks that compound non-linearly. Overdiagnosis, where screening programmes detect "abnormalities" that would never have caused symptoms, triggering treatment cascades with real side effects for imaginary diseases. Defensive medicine, where doctors order tests not because they expect useful information but because they fear litigation if they don't.
The cascade effect is particularly insidious. One unnecessary blood test returns a mildly abnormal result. The abnormal result triggers a scan. The scan reveals an incidental finding. The finding prompts a biopsy. The biopsy causes a complication. Each step was "doing something." Each step was rational in isolation. The cascade, taken as a whole, harmed a patient who walked in feeling well.
In systems dynamics terms, the prescribing cascade is a reinforcing feedback loop: symptom, prescription, complexity, side effect, new "symptom", and round again. The stock of medications accumulates; each circuit adds mass. The conventional response is to adjust a dose or substitute a drug. The structured medication review changes the question entirely: instead of "what can we add?", it asks "what can we remove?" And it is, in my experience, the single most powerful intervention in geriatric medicine. Not a new drug. Not a new protocol. A deletion.
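The reinforcing loop can be sketched as a toy expected-value model. Everything here is illustrative, not clinical: the rates are invented, and real interactions compound non-linearly in ways a model this simple deliberately ignores. It exists only to show the shape of the dynamic: a reinforcing loop grows the stock geometrically, while a review is a one-off subtraction.

```python
def cascade(n0, side_effect_rate, prescribe_rate, cycles):
    """Expected medication count under a prescribing cascade.

    Each cycle, every drug on the chart produces a troublesome
    side effect with probability `side_effect_rate`, and each such
    side effect is answered with a new prescription with
    probability `prescribe_rate` -- the reinforcing loop.
    Returns the expected count after each cycle.
    """
    n = float(n0)
    history = [n]
    for _ in range(cycles):
        n += n * side_effect_rate * prescribe_rate  # each circuit adds mass
        history.append(n)
    return history


def structured_review(n, kept_fraction):
    """The via negativa move: a one-off subtraction of the stock."""
    return n * kept_fraction


growth = cascade(4, side_effect_rate=0.3, prescribe_rate=0.5, cycles=6)
after_review = structured_review(growth[-1], kept_fraction=5 / 14)
```

With these made-up parameters, four drugs grow by 15% per cycle toward nine, and a single review that keeps five of every fourteen undoes years of accumulation in one step. The point is not the numbers but the asymmetry: the loop adds continuously; the review subtracts once.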
Medicine already has the intellectual framework for via negativa. "First, do no harm" is attributed to Hippocrates. The concept of number needed to harm is taught in medical school. Evidence-based medicine demands proof of benefit before intervention. But the culture of medicine, the incentive structures, the medicolegal environment, and the expectations of patients all push relentlessly toward addition. The framework exists. The discipline to apply it does not.
Via negativa in design
Taleb observes that Steve Jobs's genius was largely a genius of subtraction.[3] When Jobs returned to Apple in 1997, the company had dozens of product lines. He cut them to four. The product that defined Apple's resurrection, the iMac, was notable for what it removed: the floppy drive, the beige case, the clutter.
Dieter Rams arrived at the same conclusion from a different direction. His ten principles of good design culminate in the tenth: "Good design is as little design as possible." Not as much design as possible. Not as clever or as feature-rich. As little.
This connects directly to the challenge I face in healthcare technology. The temptation is always to add features. Every user request, every stakeholder meeting, every competitor analysis generates pressure to build more. The discipline is knowing what to leave out. Every feature added is an interaction to test, a surface for bugs, a cognitive burden on the user, a maintenance cost that compounds over time. The best interfaces are the ones where features have been ruthlessly removed. As I explored in Why Less is More, great design is fundamentally an act of subtraction.
The same principle applies to systems design. Complex systems fail not because they lack components but because they have too many. Each additional component is a potential failure point, an additional interaction to manage, a new source of unpredictable behaviour. The most resilient systems are often the simplest.
Via negativa in policy
Sometimes the best policy is to remove an existing policy. This is not the same as laissez-faire ideology or a reflexive hostility to regulation. It is the recognition that regulations, like medications, can accumulate into a kind of institutional polypharmacy where each rule interacts with others in ways no one fully understands.
NICE, the National Institute for Health and Care Excellence, maintains a set of "do not do" recommendations: clinical practices that evidence has shown to be ineffective or harmful, and that should be stopped.[4] This is institutional via negativa. It is the healthcare system formally acknowledging that some of what it currently does is making things worse, and that the correct intervention is cessation.
The NHS's own Getting It Right First Time (GIRFT) programme operates on a similar logic. Rather than adding new pathways, it examines existing variation in clinical practice and asks: where are we doing things that add no value? Where are we performing procedures whose outcomes do not justify their costs and risks? The answers often point toward subtraction.
In broader policy terms, the principle is straightforward but politically difficult. Removing a regulation means admitting the regulation was wrong, or at least that circumstances have changed. Politicians prefer to add; addition looks like progress. Removal looks like retreat. Yet the accumulated weight of well-intentioned additions can become its own form of harm: compliance costs that consume resources better spent on care, contradictory requirements that paralyse decision-making, bureaucratic complexity that benefits no one except the bureaucracy itself.
The practitioner's razor
Taleb offers a heuristic that connects via negativa to the concept of time-tested survival. If something has been around for a long time, it has already survived a process of via negativa: the harmful has been removed by time, and what remains has proved its worth through endurance.[5]
This is the practitioner's razor. New additions must prove themselves. They carry the burden of proof. Old subtractions, things that were tried and discarded by previous generations, have already been validated by their absence. The remedy that has been used for centuries has survived countless implicit trials. The new drug has survived one controlled trial and a regulatory review.
"In practice it is the negative that's used by the pros, those selected by evolution: chess grandmasters usually win by not losing."
This connects directly to the Lindy Effect, which I explore in Essay 6 of this series. Time is the ultimate filter. What has survived is what works. What has been discarded was discarded for a reason, even if that reason is no longer remembered. The practitioner's razor tells us to be deeply suspicious of novelty and deeply respectful of longevity.
This does not mean rejecting all innovation. It means demanding that innovation justify itself against what already exists, rather than assuming that new is better. In skin in the game terms, the question becomes: who bears the cost if this new addition fails? If the answer is "not the person proposing it," the case for subtraction grows stronger.
| Domain | Via Positiva (adding) | Via Negativa (subtracting) |
|---|---|---|
| Medicine | Prescribing a new drug | Deprescribing an unnecessary one |
| Design | Adding a feature | Removing a confusing one |
| Policy | Creating a new regulation | Repealing an ineffective one |
| Personal life | Starting a new habit | Eliminating a harmful one |
| Diet | Taking supplements | Cutting out processed food |
| Business | Launching a new product | Killing an underperforming one |
How to practise subtraction
Via negativa is a discipline. It requires more thought, not less. Addition is easy: see a problem, propose a solution, implement it, move on. Subtraction demands that you understand the system well enough to know what is causing harm, what is merely present without purpose, and what is genuinely load-bearing. You need deeper knowledge to remove wisely than to add carelessly.
Here are the principles I try to apply.
Before adding, ask: what can I remove? Make this the default question. When a patient presents with a new symptom, before reaching for the prescription pad, review the existing medications. When a product has a usability problem, before designing a new feature, ask which existing features are causing confusion. When a process is slow, before adding a new step, ask which existing steps add no value.
Apply the reversal test. If a medication, feature, regulation, or habit were not already in place, would you add it today? If the answer is no, that is a strong signal for removal. We tolerate existing additions because of status quo bias, not because they have earned their place.
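Applied to something like a product's feature list, the reversal test is almost a one-liner. The feature names and verdicts below are entirely hypothetical; the point is that failing the test marks something as a candidate for removal, not an automatic deletion:

```python
# Hypothetical feature list: for each feature, would we build it
# again today if it did not already exist?
features = {
    "export_to_pdf": True,
    "legacy_fax_gateway": False,   # kept only by status quo bias
    "dark_mode": True,
    "activity_ticker": False,      # nobody remembers why it exists
}

# Passing the reversal test earns a feature its place.
keep = [name for name, would_add_today in features.items() if would_add_today]

# Failing it makes a feature a candidate for removal -- to be
# examined, not deleted unseen.
removal_candidates = [name for name, v in features.items() if not v]
```

The same loop works for medications, regulations, or habits: the input changes, the question does not.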
Respect the asymmetry. The downside of removing something harmful is bounded: at worst, you can add it back. The downside of adding something harmful compounds over time, as other elements of the system adapt to its presence. Subtraction is more reversible than addition. This asymmetry alone should bias us toward removal.
Beware the narrative of progress. We tell ourselves that more is better, that growth is good, that building is noble. These narratives make subtraction feel like failure. But the sculptor does not fail when she removes marble. She reveals what was always there. The editor does not diminish a manuscript by cutting. She strengthens it.
The discipline of less is not about doing less thinking. It is about doing more thinking before doing less doing.
The elderly patient on that geriatric ward taught me something that no textbook had made vivid. The most skilful intervention I witnessed that month was not a procedure, not a diagnosis, not a prescription. It was a colleague with the knowledge and the courage to stop. To subtract. To trust that less could be more.
That is the discipline of less. Not passivity. Not neglect. A higher form of action.
Related reading:
- Why Less is More - The design principle that underpins via negativa
- The Fundamentals of Systems Theory - Why complex systems punish unnecessary additions
- Incerto: Who Bears the Risk? - Skin in the game and the asymmetry of consequences