Pre-Summit Series: “What If We Could Scale Good?” — Post 3 of 5 | Pre-Summit Day 1
There’s a phrase that keeps showing up in healthcare AI conversations, and it should make every nurse’s blood boil:
“We need to make sure nurses can use these tools.”
Read that again. The framing is the problem.
It positions nurses as end users — passive recipients of technology designed by someone else, for someone else’s vision of care. It reduces the most clinically present, patient-proximate profession in healthcare to a training problem. As if the issue is that nurses just need to learn how to click the right buttons.
This framing isn’t just wrong. It’s dangerous.
The End-User Fallacy
When health systems treat nurses as end users of AI, they make a series of cascading errors:
They design AI tools without nursing workflow input, creating systems that interrupt rather than integrate. They measure success by adoption rates rather than patient outcomes. They train nurses on interfaces rather than engaging them in governance. And they wonder why the tools don’t work as promised.
The end-user model treats nurses the way early computing treated secretaries — as operators of someone else’s machine, not architects of the system itself.
But here’s what 22 years of ICU nursing has taught me: nurses don’t just use clinical systems. They are the clinical system. They are the integration layer between the patient, the data, the physician orders, the family dynamics, the institutional protocols, and the moment-to-moment reality of care.
That’s not an end user. That’s an operating system.
Why Nurses Are the Natural AI Orchestrators
Consider what a nurse does in a single shift:
They synthesize data from monitors, labs, imaging, and physical assessment — simultaneously. They cross-reference that data against physician orders, patient history, and their own clinical intuition built over years of pattern recognition. They communicate across disciplines — translating between physician language, patient language, family language, and system language. They make dozens of micro-decisions per hour that never appear in any chart but keep patients alive.
Now consider what an AI orchestrator does: integrates multiple data streams, validates outputs against context, coordinates between systems, and makes judgment calls about when to act and when to escalate.
Sound familiar?
Nurses have been doing AI orchestration — manually, brilliantly, exhaustingly — for their entire careers. The question isn’t whether nurses can handle AI. The question is whether AI can handle being governed by people who actually understand clinical reality.
From Operator to Architect
The shift we’re advocating for at the Nurse Intelligence Network isn’t incremental. It’s fundamental:
Old model: Technologists build AI tools → Nurses are trained to use them → Problems emerge at the bedside → Workarounds multiply → Trust erodes.
New model: Nurses co-design AI governance → Clinical validation is built into deployment → Continuous monitoring includes nursing metrics → Tools actually serve patient care → Trust scales.
This isn’t about giving nurses a seat at the table. It’s about recognizing that nurses are the table — the surface on which every clinical AI decision ultimately rests.
The NAIO Principle
We’ve formalized this insight into what we call the NAIO framework: Navigate, Assess, Integrate, Orchestrate.
It’s not a training program. It’s a governance architecture that positions nurses as the stewards of AI in clinical environments — the professionals who navigate the landscape, assess the tools, integrate them into workflows, and orchestrate their safe deployment.
Tomorrow, we’ll break down each element of NAIO and show how it transforms the nurse’s role from passive consumer to active governor of healthcare AI.
The Provocation
So here’s the challenge we’re putting to every health system executive, every CIO, every AI vendor building tools for clinical environments:
Stop training nurses to use your AI. Start building AI that answers to nursing governance.
Stop asking how to get nurses to adopt your tools. Start asking how your tools survive nursing scrutiny.
Because if your AI can’t pass the 3 AM ICU test — if it can’t hold up under the judgment of a nurse who’s been awake for ten hours and has three critical patients and knows something is wrong before the algorithm does — then it’s not ready for patient care.
Nurses aren’t your end users. They’re your quality standard.
Robert Domondon, MD, BSN, RN, MBA, MPH
Founder, Nurse Intelligence Network
Where Nightingale Meets Neural Net