What Aviation Teaches Us About Saving Lives in the OR, ICU, L&D, and ED
- Stephen Harden

- Mar 13
Healthcare doesn't lack for protocols, checklists, or credentialing standards. What it often lacks is structured support for how clinicians actually decide, communicate, and recover from error under pressure — and that gap is costing lives.
Two high-stakes industries. One shared problem. In aviation, we spent decades learning that checklists, procedures, and technical proficiency alone could not prevent catastrophic failure. The missing ingredient was always human. Healthcare is learning the same lesson — in real time, at scale, and with patients in the balance.
I spent years in the cockpit — first as a Navy fighter pilot, then as an airline captain, and now as a private pilot flying high performance aircraft — and I've also spent years studying why humans fail in high-consequence environments.
The deeper I looked into aviation accident data, the more clearly I saw the same failure signatures appearing in hospital incident reports: miscommunication in handoffs, situational awareness that collapsed under pressure, hierarchy so rigid that junior team members stayed silent when it mattered most, and the slow, invisible drift of acceptable standards known as normalization of deviance.
Aviation addressed these patterns through a philosophy shift — from blaming individuals to examining the human system contributing to these failures.
It worked.
Commercial aviation's safety record is now extraordinary. Healthcare is not there yet, but the pathway is clear, and it runs directly through the same human factors principles that transformed flight.
This post draws on aviation's framework and translates it directly into four of healthcare's highest-acuity environments: the surgical suite, the labor and delivery unit, the emergency department, and the intensive care unit. The stakes in each are no less than in any cockpit.
The Problem Isn't Knowledge — It's the Human System
Consider this parallel: a commercial airline crash is rarely caused by a pilot who forgot how to fly. And a surgical adverse event is rarely caused by a surgeon who forgot anatomy. The breakdown happens at a different level entirely.
Research published in BJS Open found that up to 74.9% of incidents in patients admitted for surgical care occur intraoperatively — and that these critical events are strongly influenced not just by individual skill but by the complex sociotechnical environment in which clinicians operate.1
Put simply: the operating room is a system, and systems fail in systemic ways.
The aviation parallel is exact. NASA investigators analyzing aircraft accidents beginning in 1979 found that technical malfunction and lack of stick-and-rudder skill were rarely the primary cause of crashes. Human factors — communication failures, authority gradients, situational awareness breakdowns, fatigue — were the dominant drivers. Healthcare's adverse event data tells the same story.
The landmark Institute of Medicine report To Err is Human catalyzed a generation of patient safety research by establishing that up to 98,000 Americans die annually from preventable medical errors — and that the majority of those errors are rooted in system failures, not individual incompetence.2 Two decades of subsequent research has reinforced and refined that finding. The WHO's Global Patient Safety Action Plan 2021–2030 now identifies human factors capacity development as a core strategic objective for building high-reliability health systems.3
Yet the training most clinical professionals receive remains almost entirely task-oriented. Clinicians are taught procedures, workflows, drug protocols, and evidence-based guidelines. Clinician performance is evaluated against objective standards — and rightly so. But does the educational system examine how decisions are formed, why communication breaks down in high-stakes moments, or how small deviations from standard practice accumulate into catastrophic failures over time?
"The breakdown rarely occurs at the level of technical skill alone. The contributing factors are human — and they require different tools than another policy update or an extra sim lab session."
Aviation's Corrective Framework: What CRM Actually Is
In 1979, following a series of crashes in which technically proficient crews made catastrophic errors, NASA convened a workshop on resource management in flight operations. The result was a training philosophy called Crew Resource Management — CRM — that shifted aviation's safety culture from individual blame to team-based error management.
CRM does not focus on flying skills. It focuses on the non-technical skills that determine whether a technically capable crew performs safely under pressure: situational awareness, decision-making under uncertainty, communication and intent clarity, authority gradient management, and structured debriefing that surfaces root causes rather than superficial fixes.
Healthcare began adapting CRM principles in the late 1990s and early 2000s, with mixed but increasingly promising results. A comprehensive umbrella review of 106 CRM studies in healthcare settings found consistent positive effects at the levels of participant reactions, knowledge acquisition, and behavioral change — the first three tiers of Kirkpatrick's widely used framework for evaluating training interventions.4
The same review identified that operating room teams, emergency medicine staff, ICU personnel, and anesthesiology teams were the primary recipients of CRM interventions — precisely the environments with the highest stakes and the most complex human system dynamics.5
The most dramatic clinical evidence came from a prospective 3-year cohort study at Radboud University Medical Center's 32-bed ICU. Following implementation of aviation-derived CRM training across the unit's entire multidisciplinary workforce, serious ICU complication rates dropped from 66.4 per 1,000 patients during the implementation year to 50.9 per 1,000 patients in the post-implementation year — a statistically significant reduction representing real lives spared.6
Four Environments, One Framework: Applying Human Factors Coaching in Healthcare
Aviation coaching is not remedial training. It is a structured, professional approach to improving performance by addressing how pilots and crews think, decide, communicate, and learn under pressure. The same framework translates directly into healthcare's highest-acuity settings — with adaptations for each environment's unique pressures, hierarchies, and failure modes.
Environment 01
Surgical Services
The OR combines elite technical skill with complex team dynamics, time pressure, and equipment interdependence. Human factors failures account for the majority of surgical adverse events.
Environment 02
Labor & Delivery
L&D is uniquely dual-patient, highly variable, and emotionally charged. Communication failures between obstetric and nursing teams are among the most frequent contributors to adverse maternal and neonatal outcomes.
Environment 03
Emergency Department
The ED is aviation's "pattern work" — high volume, unpredictable acuity, and constant task-switching. Situational awareness degradation and cognitive bias under load are ever-present threats.
Environment 04
Intensive Care Unit
ICU care is multidisciplinary, time-critical, and involves the most vulnerable patients. The complexity of interprofessional coordination creates unique conditions for both error and for transformational team-based learning.
Surgical Services: When Technique Isn't Enough
A 2022 systematic review of human factors interventions in the operating room, published in BJS Open, examined 28 studies representing 27 unique training programs. Across all of them, every intervention addressed behaviors and attitudes, teamwork, and communication — with situational awareness (85%) and leadership (74%) appearing in the majority.1 These are not soft skills. They are the mechanisms through which technically excellent surgeons and scrub techs either coordinate effectively or produce adverse events.
A particularly dangerous dynamic in the OR is what researchers term normalization of deviance — the gradual process by which small departures from accepted safety standards accumulate and become the new norm. A 2023 analysis in the AORN Journal describes normalization of deviance as "a phenomenon in which individuals and teams depart from an acceptable performance standard until the adopted way of practice becomes the new norm" — and argues that it is fundamentally incompatible with the principles of high-reliability organizations.7
The parallel in aviation is sobering. In the 20 years before the Columbia shuttle disaster, debris shedding had occurred on every single flight without incident. Each uneventful launch normalized the risk a little more, until the unthinkable became inevitable.8
Surgical services see this dynamic play out in slower motion: a timeout not fully completed, an instrument count performed hastily, a concern not voiced to an attending. None triggers an adverse event today. But the margin erodes.
Human factors coaching in the surgical environment focuses on building the psychological safety and structured communication norms that interrupt this drift before it becomes catastrophic — not through more checklists, but through the kind of structured debrief culture that makes deviation visible and discussable.
Even as a private pilot, I debrief and assess every flight following a structured approach. Debriefing, even if I am only debriefing myself, keeps my margins of safety from eroding and helps end any drift in performance.
Labor & Delivery: The Dual-Patient Environment
Labor and delivery occupies a unique position in human factors research because of the profound stakes — two lives, high emotional load, rapidly changing acuity — combined with a workplace culture that can be strongly hierarchical and resistant to structured coaching interventions. The TeamSTEPPS framework, co-developed by the AHRQ and the U.S. Department of Defense specifically as an evidence-based CRM adaptation for healthcare, has been extensively studied in L&D settings.
Research design recommendations from a systematic review on CRM's effects on safety culture specifically cited labor and delivery units as requiring 11 to 13 sites per group to detect a 40% reduction in adverse outcomes — an indication of both the statistical complexity and the meaningful magnitude of improvement possible from structured team training.9
The most consequential application of CRM principles in L&D involves closing the communication gap between nursing staff and obstetric providers — particularly around the authority gradient that too often keeps early warning signals from reaching decision-makers in time. Aviation learned this lesson after analyzing crashes where first officers had critical information and chose not to escalate it to captains. Healthcare is learning it through maternal adverse event reviews. The mechanism is identical; so is the solution: coaching that builds assertive communication norms and structures for escalating concern regardless of hierarchy.
Emergency Department: Situational Awareness at Scale
If the ICU is a chess match, the emergency department is twenty simultaneous games on a board that keeps changing shape. The ED's defining human factors challenge is cognitive load management: maintaining situational awareness across multiple patients at varying acuity levels while navigating constant interruption, team turnover at shift change, and the ever-present risk of anchoring bias — the tendency to stop generating differential diagnoses once an initial working diagnosis feels plausible.
Research on cognitive biases in clinical decision-making published in BMC Medical Informatics and Decision Making documents a wide range of systematic errors in clinical judgment that are not correctable through additional knowledge alone — they require metacognitive coaching that helps clinicians recognize their own decision-making patterns in real time.10
Perhaps the most promising human factors intervention in the ED is the structured clinical debrief. A review of clinical debriefing practices in emergency settings found extensive evidence of improvement in knowledge and clinical performance, communication, team dynamics, and efficiency — all directly impacting patient outcomes.11
One ED implementing the STOP5 hot debrief framework — a five-minute structured review conducted immediately after critical cases — identified ten concrete process and equipment changes as direct results of the debriefs over twelve months, and 98% of respondents felt the practice should continue and expand.12
This is the aviation debrief model applied to medicine. Fighter pilots don't debrief because they're uncertain of their skills — they debrief because they understand that error is not a character flaw; it is a system output. The debrief is where the system learns. The ED that builds a debrief culture is building the same feedback loop that aviation used to transform its safety record.
Intensive Care Unit: Complexity, Multidisciplinary Teams, and Culture Change
The ICU represents aviation's most direct analogue in healthcare: a closed, high-stakes environment staffed by a multidisciplinary team managing critically ill patients where the margin for error is narrow and the consequences of failure are immediate. Researchers who have studied CRM/TeamSTEPPS implementation in the ICU consistently note that its parallels to aviation are stronger than in almost any other clinical setting.
A prospective cohort study at a tertiary-care ICU found that CRM implementation was associated with a statistically significant reduction in serious complications — and that the training's mechanism of action was to change not just individual behavior but the unit's safety culture, shifting how staff thought about error, risk, and interprofessional communication.6 Importantly, the study team included aerospace trainers from a military aviation background — making the translation explicit and direct.
Research on ICU human factors also highlights the workload and burnout dimensions that have no direct equivalent in aviation but significantly amplify human factors risk. A systems-level analysis using the SEIPS (Systems Engineering Initiative for Patient Safety) model found that interactions between healthcare workers and their work system elements — tasks, tools, organizational factors, and environment — often compound the stress of clinical care in ways that increase error risk far beyond what any individual training intervention can address alone.13
This is where the coaching model diverges meaningfully from one-time training. CRM delivered once and left unsupported does not transform safety culture. As ICU researchers have noted, behavioral change through culture takes time — in aviation, it took a generation.14
What accelerates that process is ongoing coaching: structured debriefs, leadership development, instructor consistency, and the kind of psychological safety that allows a nurse to tell an attending that something doesn't look right — and be heard.
The Six Failure Modes Aviation Solved — and Healthcare Hasn't
Brandon Williams, also a former fighter pilot, developed the original aviation framework that identifies several recurring human factors failure modes. Each appears in healthcare's adverse event literature with a nearly identical signature.
| Failure Mode | In Aviation | In Healthcare |
| --- | --- | --- |
| Situational Awareness Collapse | Crew loses mental model of aircraft state and environment under task saturation | ED clinician anchors on initial diagnosis; OR team misses physiologic deterioration |
| Communication Breakdown | Critical information not verbalized or acknowledged; readback failures | Handoff information lost at shift change; L&D concerns not escalated to obstetric provider |
| Authority Gradient | First officer doesn't challenge captain despite having critical information | Nurse, resident, or tech stays silent to avoid conflict with attending or senior surgeon |
| Normalization of Deviance | Small deviations from procedure become routine; complacency grows until failure | Incomplete timeouts, bypassed safety checks, "we always do it this way" in the OR |
| Weak Debrief Culture | No structured after-action review; errors not surfaced or analyzed | Post-case discussions focus on clinical outcomes, not team process or decision quality |
| Fatigue & Complacency | Crew certification doesn't protect against cognitive degradation under fatigue | Clinical expertise doesn't prevent errors in post-call, high-load, or routine-task scenarios |
What Targeted Coaching Actually Looks Like in Clinical Settings
Human factors coaching in healthcare is not a lecture on cognitive biases, a workshop on SBAR communication, or a one-day CRM course. Those can be starting points, but they are not coaching. Effective coaching is structured, sustained, and focused on the specific failure modes and performance gaps of a specific team in a specific environment.
The AHRQ's TeamSTEPPS framework, one of the most rigorously evaluated team training programs in healthcare, distinguishes between training and implementation — recognizing that the latter requires ongoing coaching, measurement, and leadership reinforcement to produce durable behavioral change.15
Research using the Safety Attitudes Questionnaire (SAQ) — itself a validated derivative of the Cockpit Management Attitudes Questionnaire from aviation — has documented measurable improvements in safety climate following CRM implementation in ICU settings, with those improvements correlating to better patient outcomes.6
Effective human factors coaching in clinical settings focuses on several specific domains:
Decision-making under uncertainty. Clinical decisions — particularly in the ED and ICU — are made with incomplete information under time pressure. Coaching helps teams recognize when they are operating in high-uncertainty conditions and apply structured frameworks for managing diagnostic ambiguity rather than prematurely closing on a diagnosis.
Error management, not error avoidance. Healthcare training traditionally tries to prevent errors through more rigorous procedure adherence. Human factors coaching recognizes that in complex systems, error is inevitable — and focuses instead on earlier recognition, more effective trapping, and faster recovery. This is exactly the aviation CRM philosophy: not zero error, but error resilience.
Debrief quality and learning effectiveness. Research in both aviation and emergency medicine consistently shows that teams that debrief effectively improve faster than teams that don't.16 The content domains most commonly discussed — and most commonly identified as opportunities for improvement — in clinical debriefs are decision-making and communication, precisely the two most frequent contributors to adverse events.17
Instructor and leader consistency. In aviation, instructor consistency and standardization is a safety issue: inconsistent instruction creates gaps that don't become visible until a critical situation exposes them. In healthcare, charge nurses, attending physicians, and clinical educators who apply standards inconsistently create the same gaps — the same fertile ground for normalization of deviance to take root.
The Business Case, Briefly
Human factors coaching in healthcare is not a luxury. It is one of the highest-return investments available to a health system. A widespread initiative in Michigan ICUs that used aviation-derived checklist and team communication principles reduced ICU infection rates by 66% within the first three months — saving an estimated $175 million and more than 1,500 lives within 18 months.18 A WHO surgical safety checklist study involving eight hospitals across four continents found a 47% reduction in 30-day post-operative mortality and a drop in major complications from 11% to 7%.19
These are not marginal improvements. They are the results of investing in the human system — the same investment aviation made when it stopped blaming pilots and started fixing the conditions that made pilot error inevitable.
"Aviation doesn't just teach 'checklists.' It understands that the hardest part of safety isn't the procedure — it's the culture that determines whether the procedure actually gets followed when the pressure is on."
Where to Start
For healthcare leaders considering a human factors coaching initiative, the entry point is diagnosis before prescription. The specific failure modes in a surgical suite are not identical to those in an ED or an L&D unit. Effective coaching begins with a structured assessment of the team's current decision-making patterns, communication norms, debrief practices, and safety culture — the same kind of threat and error analysis that aviation uses before designing CRM interventions for a specific fleet or operator.
The goal is not to import aviation culture into healthcare. Clinicians are not pilots, and patient care is not flight operations. The goal is to apply the same rigorous, evidence-based attention to how humans perform under pressure — and build the coaching systems that make high performance sustainable, not accidental.
Healthcare has the technical excellence. It has the dedication. What it needs is the same shift aviation made: from blaming the human at the sharp end to building the system that supports human performance at its best.
The lives are already in the balance. The framework exists. The question is whether clinical leaders are willing to invest in the human system the same way they invest in the technical one.
References & Selected Research
Pelletier M, et al. Exploring human factors in the operating room: scoping review of training offerings for healthcare professionals. BJS Open. 2022;6(2):zrac011. doi:10.1093/bjsopen/zrac011
Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 2000.
World Health Organization. Global Patient Safety Action Plan 2021–2030. Geneva: WHO; 2021.
Kemper PF, et al. What do we really know about crew resource management in healthcare? An umbrella review. Journal of Patient Safety. 2021;17(8):e1099–e1106. doi:10.1097/PTS.0000000000000816
Gross B, et al. Crew resource management training in healthcare: a systematic review of intervention design, training conditions and evaluation. BMJ Open. 2019;9(2):e025247. doi:10.1136/bmjopen-2018-025247
Haerkens MH, et al. Crew resource management in the intensive care unit: a prospective 3-year cohort study. Acta Anaesthesiologica Scandinavica. 2015;59(10):1319–1329. doi:10.1111/aas.12540
Wright MI. Normalization of deviance is contrary to the principles of high reliability. AORN Journal. 2023;117(4):231–238. doi:10.1002/aorn.13894
Banja J. The normalization of deviance in healthcare delivery. Business Horizons. 2010;53(2):139–148. doi:10.1016/j.bushor.2009.10.006
Rabol LI, et al. Does classroom-based crew resource management training improve patient safety culture? A systematic review. BMJ Quality & Safety. 2011;20(2):173–183.
Saposnik G, et al. Cognitive biases associated with medical decisions: a systematic review. BMC Medical Informatics and Decision Making. 2016;16:138. doi:10.1186/s12911-016-0377-1
Paquay M, et al. A success story of clinical debriefings: lessons learned to promote impact and sustainability. Frontiers in Public Health. 2023;11:1188594. doi:10.3389/fpubh.2023.1188594
Coggins A, et al. Interdisciplinary clinical debriefing in the emergency department: an observational study of learning topics and outcomes. BMC Emergency Medicine. 2020;20(1):79. doi:10.1186/s12873-020-00370-7
Carayon P, et al. Human factors systems approach to healthcare quality and patient safety. Applied Ergonomics. 2014;45(1):14–25. doi:10.1016/j.apergo.2013.04.023
Kemper PF, et al. Implementation of crew resource management: a qualitative study in 3 intensive care units. Journal of Patient Safety. 2017;13(4):223–231. doi:10.1097/PTS.0000000000000197
Agency for Healthcare Research and Quality. TeamSTEPPS 3.0. AHRQ; updated December 2023.
Agency for Healthcare Research and Quality. Reviewing the team's performance: debrief. TeamSTEPPS Curriculum. AHRQ; 2023.
Seelandt JC, et al. Decision making and communication as the most common domains for positive discussion and opportunity areas in clinical debriefs. Advances in Simulation. 2021;6:7.
Pronovost P, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. New England Journal of Medicine. 2006;355(26):2725–2732.
Haynes AB, et al. A surgical safety checklist to reduce morbidity and mortality in a global population. New England Journal of Medicine. 2009;360(5):491–499.