Move Fast and Break Things? Not in Healthcare
Mark Zuckerberg's famous motto became Silicon Valley gospel: move fast and break things. It captured something true about software. When your worst-case scenario is a bug that gets patched in the next release, speed is a competitive advantage. Failure becomes data.
Then tech turned its attention to healthcare.
"In healthcare, when you break things, people can die."
This isn't hyperbole. It's the reason healthcare operates by a fundamentally different set of rules. And yet, the friction between these two cultures - startup velocity versus medical caution - isn't simply a clash to be resolved. It's a productive tension worth understanding.
The two cultures
Startup Culture
Speed is survival. Ship the MVP, gather feedback, iterate. Failure is learning. Perfection is the enemy of good. First-mover advantage matters more than getting everything right.
Healthcare Culture
Caution is care. Evidence before action. Primum non nocere - first, do no harm. Regulatory compliance isn't bureaucracy - it's protection. Getting it right matters more than getting it first.
Both cultures have internal logic. Both have produced remarkable outcomes. And both, taken to extremes, become pathological.
The startup founder who ships untested medical software is reckless. The hospital administrator who blocks a proven technology for another decade of studies is complicit in preventable suffering. The question isn't which culture is right - it's where each applies.
Why healthcare is different
Irreversibility
In software, you can roll back a deployment. You can push a hotfix. You can apologise, issue refunds, and try again tomorrow.
In healthcare, some mistakes cannot be undone. A misdiagnosis that delays cancer treatment. A drug interaction that causes organ failure. An algorithm that systematically undertreats pain in certain demographics. These aren't bugs to be patched - they're harms that persist.
"The undo button doesn't exist for the human body."
Asymmetric consequences
Startups operate in a world of asymmetric upside. Invest a little, maybe lose it, maybe build the next unicorn. The maths favours bold bets.
Healthcare faces asymmetric downside. You might improve a patient's quality of life by some percentage, but a mistake might end that life entirely. The maths demands caution.
This isn't risk-aversion for its own sake. It's a rational response to the stakes involved.
The regulatory reality
Healthcare technology in the UK must navigate multiple overlapping systems: MHRA device regulations, NICE evidence standards, NHS Digital Technology Assessment Criteria, data protection requirements. Each layer exists because previous innovations caused harm that someone decided shouldn't happen again.
Founders often see this as bureaucratic friction. It's actually institutional memory encoded in policy.
The cost of caution
Here's where it gets complicated. Because excessive caution has its own body count.
Innovation lag
The NHS still runs systems designed in the 1990s. Clinicians waste hours on documentation that could be automated. Patients wait months for appointments that telemedicine could handle tomorrow. The gap between what technology enables and what healthcare delivers costs lives - just less visibly than a dramatic failure.
Burnout from outdated tools
When a nurse clicks through fourteen screens to order a blood test, that's not just inefficiency. It's cognitive load that depletes the attention needed for actual patient care. Poor technology contributes to burnout, and burned-out clinicians make more mistakes.
"Being too cautious can also harm patients - it just does so invisibly, through opportunities not taken."
The irony of over-regulation
Sometimes the approval process takes so long that patients die waiting for treatments that would have saved them. The very systems designed to prevent harm end up causing it through delay. This is the tragedy of false negatives - the interventions that never happened.
Finding the productive tension
The answer isn't to pick a side. It's to recognise that both impulses - innovation and caution - serve essential functions, and the art is knowing when each applies.
Where speed is appropriate
- Administrative systems: Scheduling, billing, internal communications
- Staff-facing tools: Rostering, handover notes, training platforms
- Low-risk patient interactions: Appointment reminders, general health information
- Research and development: Rapid prototyping in sandboxed environments
These areas can tolerate the startup playbook. Fail fast, iterate, improve continuously.
Where caution is non-negotiable
- Diagnostic algorithms: When software influences clinical decisions
- Treatment recommendations: Anything affecting drug dosing or care pathways
- Patient data handling: Privacy breaches cause lasting harm
- Medical devices: Physical interactions with the body
These areas require the full weight of evidence-based validation. Not because innovation is bad, but because the consequences of getting it wrong are irreversible.
The sandbox approach
Smart health tech companies create clear boundaries. They move fast in designated sandbox environments - where failures can't reach patients - then slow down methodically as innovations approach clinical deployment. The culture shifts at the boundary, not across the entire organisation.
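One way to make that boundary concrete is to encode it in the release pipeline itself. The sketch below is a minimal illustration - the environment names and gates are hypothetical, not drawn from any particular regulatory checklist - of how promotion criteria might tighten as a build moves from sandbox toward clinical deployment:

```python
from enum import Enum

class Environment(Enum):
    SANDBOX = "sandbox"          # synthetic data only; failures cannot reach patients
    STAFF_PILOT = "staff_pilot"  # real staff users, no clinical decisions
    CLINICAL = "clinical"        # influences patient care

# Gates a build must pass before promotion to each environment.
# The list grows as the deployment gets closer to patients.
PROMOTION_GATES = {
    Environment.SANDBOX: {"unit_tests"},
    Environment.STAFF_PILOT: {"unit_tests", "integration_tests", "privacy_review"},
    Environment.CLINICAL: {"unit_tests", "integration_tests", "privacy_review",
                           "clinical_safety_case", "regulatory_signoff"},
}

def can_promote(gates_passed: set, target: Environment) -> bool:
    """True only if the build has passed every gate the target environment requires."""
    return PROMOTION_GATES[target] <= gates_passed

# A build that has only passed unit tests can iterate freely in the sandbox...
assert can_promote({"unit_tests"}, Environment.SANDBOX)
# ...but the same build cannot cross the boundary into clinical deployment.
assert not can_promote({"unit_tests"}, Environment.CLINICAL)
```

The specific gates matter less than the principle: the cultural shift at the boundary becomes a property of the system rather than a matter of individual discipline.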
A systems perspective
Through a systems thinking lens, these two cultures represent different feedback loops optimised for different contexts.
Startup culture assumes short feedback loops: ship, measure, adjust, repeat. This works when you can observe outcomes quickly and corrections are cheap.
Healthcare culture assumes long feedback loops: some harms take years to manifest, and corrections may be impossible. This demands more upfront validation and slower iteration.
"The question isn't 'fast or slow?' It's 'what are the feedback dynamics of this particular intervention?'"
Neither culture is wrong. They're calibrated for different system dynamics. The mistake is applying startup dynamics to healthcare contexts (or vice versa) without recognising the mismatch.
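A deliberately crude model makes the mismatch visible (every number below is invented for illustration): hold the defect rate constant, and accumulated harm scales with how long the feedback loop takes to close.

```python
# Toy model: harm accrues each period a defect goes undetected; the loop
# "closes" when the defect is caught. All figures here are invented.
def accumulated_harm(defect_rate: float, harm_per_period: float,
                     periods_to_detect: int) -> float:
    return defect_rate * harm_per_period * periods_to_detect

web_feature = accumulated_harm(0.05, 1.0, 2)            # bug surfaces within ~2 days
clinical_algorithm = accumulated_harm(0.05, 1.0, 730)   # harm emerges over ~2 years

print(web_feature)        # 0.1
print(clinical_algorithm) # 36.5 - identical defect rate, 365x the harm,
                          # and in healthcare some of it cannot be undone
```

Same defect rate, radically different consequences - which is why the same iteration speed cannot be right for both contexts.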
Navigating the tension
If you're building health technology, here's how to hold both cultures productively:
1. Map the risk topology
Identify which parts of your product touch patients directly versus indirectly, and apply different standards to each zone (one way to encode such a map is sketched after this list).
2. Build cultural fluency
Your engineering team needs to understand why clinicians are cautious. Your clinical advisors need to understand why speed matters commercially. Neither perspective is complete alone.
3. Create explicit boundaries
Define where you move fast and where you slow down. Make these boundaries visible and respected across the organisation.
4. Embrace productive friction
When your startup instincts clash with clinical caution, don't see it as a problem. See it as a signal that you're in the transition zone where both perspectives matter.
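For the first step, the risk map can be quite literal. The sketch below is illustrative only - the component names and flags are hypothetical, and a real classification would involve clinical safety expertise - but it shows how different standards might attach to different zones:

```python
from dataclasses import dataclass
from enum import Enum

class Zone(Enum):
    MOVE_FAST = "iterate freely"           # startup playbook applies
    SLOW_DOWN = "validate before release"  # evidence-based standards apply

@dataclass
class Component:
    name: str
    influences_clinical_decisions: bool   # diagnosis, dosing, care pathways
    handles_sensitive_patient_data: bool  # breaches cause lasting harm

def risk_zone(c: Component) -> Zone:
    # Anything that can reach a clinical decision or sensitive patient data
    # gets the cautious standard; everything else tolerates fast iteration.
    if c.influences_clinical_decisions or c.handles_sensitive_patient_data:
        return Zone.SLOW_DOWN
    return Zone.MOVE_FAST

for c in [Component("staff_rostering", False, False),
          Component("triage_suggestion_model", True, True)]:
    print(f"{c.name}: {risk_zone(c).value}")
# staff_rostering: iterate freely
# triage_suggestion_model: validate before release
```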
The synthesis
The future of health technology doesn't belong to startups that ignore regulation or to institutions that resist innovation. It belongs to those who can navigate between both cultures fluently - knowing when to accelerate and when to pause, when to prototype boldly and when to validate rigorously.
The tension between "move fast" and "don't break things" isn't a bug in the system. It's a feature. The friction generates the heat that forges better solutions - innovations that are both bold enough to matter and safe enough to trust.
Healthcare doesn't need Silicon Valley to slow down. And startups don't need healthcare to speed up. What we need is the wisdom to know which culture to inhabit, moment by moment, as we build the future of care.
Related reading:
- The Systems Paradigm - Understanding the philosophical foundations of systems thinking
- The Fundamentals of Systems Theory - Core concepts: feedback, emergence, boundaries
- Why Health Tech Companies Should Set Up in the UK - How navigating regulation creates competitive advantage
- UK Health Tech Compliance: See the System - Mapping the regulatory landscape