From Guidelines to Outcomes: What Autonomous AI Can Deliver in Healthcare Today

Insights by Dr. Eric Stecker, Co-founder and Chief Medical Officer, Insight Health

“Healthcare doesn’t need new technology. We need to implement what we already know works.”

That’s the underlying message Dr. Eric Stecker returns to throughout this episode of The Big Unlock. And it’s a striking perspective to hear from someone who sits at the intersection of clinical practice, population health, and AI product development.

Dr. Stecker is a practicing cardiologist and professor of medicine at Oregon Health & Science University, and he co-founded Insight Health to apply AI in ways that measurably improve real-world outcomes. He’s also spent years inside the guideline and quality ecosystem of cardiology, most notably as Chair of the American College of Cardiology’s Science and Quality Committee, which shapes national cardiology practice guidelines and policy documents. In other words, he’s not arguing from theory. He’s arguing from a place of deep familiarity with what evidence already supports and frustration that the U.S. healthcare system still struggles to carry that evidence into daily practice at scale.

The episode begins with a timely setup. Coming off conferences where “AI was everywhere,” the hosts ask about wearables and continuous data streams, and whether we’re heading toward continuous cardiac care. Dr. Stecker agrees that the future is exciting. But he draws a line between what is still evolving evidence and what is already proven, and he argues that autonomous AI can deliver enormous clinical value today without waiting for fully autonomous diagnostic AI.

Listen to the full conversation

Autonomous action versus autonomous decision-making: a critical distinction

One of the most useful contributions Dr. Stecker makes is a simple conceptual distinction that clarifies much of the market noise.

He divides autonomous AI into two categories:

  • Autonomous action (AI taking action on established protocols and workflows)
  • Autonomous decision-making (AI making clinical decisions, diagnoses, or orders without a human in the loop)

These two ideas are often conflated. Dr. Stecker insists they shouldn’t be.

Autonomous decision-making is the “hard future,” and he’s clear about why: it requires significant technical maturity, safety assurance layers, deep clinical validation, regulatory readiness, and, just as important, trust from both healthcare workers and patients. It’s coming. It’s needed. But it’s difficult.

The bigger mistake, he argues, is waiting for that future while ignoring what autonomous action can do right now.

This is where he makes a point that feels both obvious and urgent. We already have decades of high-quality evidence showing how to prevent cardiovascular events, yet we still fail at the mundane steps of implementation.

He uses cholesterol therapy as a straightforward example. Statins are not new. The evidence base is vast. Yet health systems still struggle to reliably identify eligible patients, start therapy, and support adherence over time. The result is preventable harm: heart attacks, strokes, and deaths that occur not because we lack knowledge, but because implementation breaks down.

In his framing, autonomous action is the opportunity to close that implementation gap at scale today.

And he offers a memorable “why now” scenario: what if every patient starting a new medication received consistent follow-up? What if someone checked whether they filled the prescription, whether they were having side effects, and whether they had questions, then checked in again at the interval the patient chose?

That doesn’t require futuristic diagnostic autonomy. It requires operational execution at scale.

And that, he argues, is exactly what autonomous agents can provide.


Clinician involvement is the difference between signal and noise.

If autonomous action is the promise, the risk, says Dr. Stecker, is obvious: “more AI can also mean more burden.”

The hosts raised the signal-to-noise problem directly, asking about alerts, risk scores, predictions, summaries, and data streams that can overwhelm clinicians and patients. Dr. Stecker doesn’t dismiss those concerns. He agrees this is real and points out that medicine has seen it before.

He recalls the early EHR era, when alerts proliferated and clinicians learned to click through them just to get through the day. Alert fatigue is well-documented. The danger now is that healthcare repeats that mistake in the AI era: flooding workflows with AI-generated documentation or repeated check-ins that create cognitive load rather than relief.

This is where Dr. Stecker makes a strong operational point: meaningful clinician involvement is not optional.

Not clinician involvement as advisory branding. Not “a chief medical officer who joins two meetings a month.” He’s talking about integrating experienced, practicing clinicians into product development and implementation so that the system is designed from the beginning to escalate only what matters and to avoid generating unnecessary documentation.

He gives a practical example. If an oncology practice checks in weekly with patients and generates a long summary every time, pushing it into the EHR, then someone must read it, interpret it, and decide what to do. Done poorly, the AI “help” becomes a new inbox burden.

The fix, in his view, is workflow design:

  • escalate only meaningful medical flags
  • keep routine information from becoming unnecessary documentation
  • offer dashboards or condensed summaries rather than long notes
  • design protocols that define what “needs attention” versus what is “normal”

This is the heart of his execution argument: AI must be built to reduce the burden, not shift it.

And it reinforces his broader theme: if you leave development to people who haven’t practiced medicine or to teams that don’t understand real clinical workflows, signal-to-noise failures are inevitable.


From cardiology to population health: meeting patients where they are

The episode also reveals why Dr. Stecker is unusually focused on population health outcomes.

He describes how he came to cardiology through physiology and then electrophysiology. But he also brings an engineering-oriented mindset to medicine, influenced by his father, an engineer, and by his early exposure to databases and analytics. That “systems” orientation shows up repeatedly: he’s interested not just in what’s true clinically, but in what can be operationalized across large populations.

When asked how cardiology changes in an AI world, he shares a hopeful view: clinicians and patients should not have to “evolve too much.” The technology should fit around the current experience and dramatically improve it, unlike many past tools that felt imposed on clinicians.

As he does throughout the podcast, he gives a concrete example: autonomous agents that reach out before visits to gather medical history and symptom details, even at 2:00 a.m. if a patient is a shift worker, so that patients don’t spend precious visit time answering basic intake questions. This is a key pattern he returns to: meeting patients where they are, on their time, in their context, while improving readiness and efficiency for the clinical encounter.

He then expands into what his team calls different interaction modalities:

  • voice-only agents (phone calls)
  • text-based interactions
  • “visual voice” experiences that blend voice with on-screen guidance and embedded media

The point is accessibility and engagement. If an AI agent can show a short instructional video to help a patient place a cardiac monitor correctly without requiring them to search online or call a help line, you reduce friction and improve adherence.

This matters because the value of autonomous action isn’t only clinical. It’s behavioral. Patients are more likely to engage when the experience is simple and supportive.


The Takeaway

Dr. Eric Stecker’s message is refreshingly direct: healthcare doesn’t need to wait for fully autonomous diagnostic AI to start saving lives at scale. We already have decades of evidence for preventing cardiovascular disease and other high-burden conditions, but we repeatedly fail at implementing these interventions, from identifying eligible patients to initiating proven therapies and supporting adherence over time. His key distinction is that autonomous action is both safe and powerful today when it’s grounded in established protocols and designed with real clinician involvement to avoid alert fatigue, documentation overload, and signal-to-noise failures. In his view, the organizations that lead won’t be the ones chasing the most futuristic promises first. They’ll be the ones who use autonomous agents to turn evidence into consistent action, building trust with healthcare workers and patients now, while responsibly advancing toward autonomous decision-making tomorrow.

Because Dr. Stecker sits at the intersection of guideline-level evidence, practicing cardiology, and real-world AI execution, his insights are especially valuable:

  • Autonomous action and autonomous decision-making are different, and the fastest path to impact is autonomous action on established protocols.
  • The biggest “AI opportunity” in cardiovascular care is implementation: statins, hypertension control, and adherence support can prevent massive harm today.
  • Trust is a prerequisite for autonomy. Patients and healthcare workers must see AI as reliable, safe, and helpful before decision-making autonomy can scale.
  • Clinicians must be integrated into development and implementation to prevent alert fatigue, cognitive overload, and documentation bloat.
  • AI agents can meet patients where they are, collecting pre-visit history, guiding setup with visual support, and improving engagement without adding friction.
  • Population health impact comes from operationalizing prevention: identifying care gaps, educating patients, capturing preferences, and scheduling follow-through at scale.

The Healthcare Digital Transformation Leader

Stay informed on the latest in digital health innovation and digital transformation.
