Coordination Debt: Why More AI Creates More Work in Healthcare

Health Systems With the Most AI Tools Often Have the Biggest Operational Gaps

May 13, 2026 | Coordination debt builds when AI generates outputs no one is designed to act on. Here’s what it costs health systems and how to design your way out of it.

Health systems have plenty of AI, but most haven’t designed the space between their tools. Coordination debt builds when outputs have no named owner, no defined trigger, and no clear definition of “done.” The result: skilled clinicians become human routers, and AI investments stall. This post builds on Angela Adams’ recent Inc. piece to explain what coordination debt looks like in healthcare, why AI makes it worse before it gets better, and the four design decisions that pay it down.

Coordination debt is the hidden cost that accumulates when health systems adopt AI tools without designing the workflows between them. When more detection, more flags, and more outputs arrive without ownership, accountability, or follow-through infrastructure, work shifts onto clinicians instead of being removed. Evidence shows that healthcare organizations realize the value of AI investments only when they treat the space between their tools as infrastructure worth building.

What Is Coordination Debt in Healthcare?

Angela Adams, CEO of Inflo Health, recently introduced the concept of coordination debt in Inc., describing it as the operational cost that builds when organizations buy tools that work independently but never design how the handoffs between them are handled. The concept mirrors technical debt: it accrues not through bad decisions, but through the accumulation of reasonable ones.

In healthcare, coordination debt is especially acute. Hospitals operate dozens of specialized platforms — for imaging, scheduling, lab results, billing, clinical documentation — and each platform generates outputs that require human action to become outcomes. When no one designs that intermediate step, skilled professionals become what Adams calls “human routers”: expensive clinical staff spending significant portions of their day reconciling spreadsheets, forwarding results, and manually tracking what happens next.

Research from Asana found that knowledge workers spend roughly 60% of their time on coordination — not producing work, but organizing it. In healthcare, where the stakes involve patient safety and clinical liability, the coordination tax falls disproportionately on those least able to absorb it.

Why Does AI Make Coordination Debt Worse Before It Gets Better?

The standard case for AI in healthcare is workload reduction. But a good detection system finds more, and without a coordination layer, every additional finding means more work for the people downstream.

This is most visible in radiology. AI-powered imaging tools have become highly effective at identifying incidental findings: lung nodules, lesions, and abnormalities that might have previously been missed. This is clinically valuable. It is also generating a downstream burden that most health systems have not operationalized.

Every flagged finding initiates a workflow: a follow-up to schedule, a patient to contact, a referral to coordinate, a result to track. A 2026 scoping review published in the International Journal of Medical Informatics found that AI implementations introduced additional monitoring requirements or parallel workflows and often shifted the administrative and cognitive burden onto clinicians (Dave et al., 2026).

Adams identifies a compounding factor specific to radiology: report language written under volume pressure. Recommendations like “please correlate clinically, further imaging may be required” are hedging, not guidance. If the downstream system cannot distinguish genuine actionable findings from defensive boilerplate, it cannot route the right patients to the right care, and the coordination debt compounds.

What Does a Coordination Layer Actually Look Like?

Paying down coordination debt is an operating model decision. Adams outlines four design requirements that separate organizations compounding their AI investments from those drowning in outputs:

  1. Named ownership at every handoff. If an AI flags a high-risk finding, that flag needs a specific owner responsible for the next step.
  2. Defined triggering events. Not every output warrants action. A coordination layer filters signal from noise by cross-referencing findings against clinical guidelines (e.g., Fleischner criteria for lung nodules, ACR standards for other findings) to determine which outputs require a human in the loop.
  3. A real definition of “closed.” A follow-up is not complete when the order is placed. It is complete when the patient has been seen and the result recorded. Systems that cannot distinguish between these two states cannot identify who is falling through.
  4. Protected human moments. There are moments in high-stakes workflows where automation is the wrong tool because the patient on the other end needs another human.

Real-world implementations that pair AI with this kind of workflow ownership show measurable results. Across Inflo Health deployments, automating follow-up identification, communication, tracking, and closure has reduced manual administrative tasks by up to 95%, precisely because it designs the space between detection and action.

What Should Healthcare Leaders Do About Coordination Debt?

Adams offers a practical diagnostic: pick one workflow that touches a patient or high-value outcome, map it end-to-end, and find the handoffs. At each one, ask three questions: Who is accountable? How do they know it’s their turn? How do you know if it doesn’t happen?

In most organizations, the honest answer to that last question is: we find out when someone complains or when a patient falls through.

The health systems that will compound their AI investments are not the ones with the most tools. They are the ones who treat the coordination between tools as infrastructure worth designing, building, and measuring.

Coordination debt isn’t a technology failure. It’s a design failure. And design failures can be fixed.

References

Adams, A. (2026, May 12). Does your technology turn outputs into outcomes? Inc. https://www.inc.com/angela-adams/does-your-technology-turn-outputs-into-outcomes/91343216

Dave, B., Martin, P., Singh David, S., Kumar, S., & Chakraborty, T. (2026). Enhancing healthcare worker mental health via artificial intelligence-driven work process improvements: A scoping review. International Journal of Medical Informatics, 205, 106122. https://doi.org/10.1016/j.ijmedinf.2025.106122