
February 5, 2026

Prior Authorization Delays Kill People. The Numbers Prove It.

Health insurance prior authorization denies or delays treatment for millions of Americans every year. Exposed internal documents show some insurers use AI to auto-deny claims. DENIED is a thriller about what happens when the algorithm optimizes for the wrong thing.

real-science denied health-insurance prior-authorization AI-healthcare insurance-denial healthcare-system

In 2023, an investigation revealed that Cigna was using an automated system to deny claims in bulk. Company doctors were signing off on denials at a rate of roughly 1.2 seconds per case. That’s not a review. That’s not a medical judgment. That’s a batch process.

Over a two-month period, Cigna doctors used the system to deny more than 300,000 requests for payment.

This isn’t a conspiracy theory. It’s a ProPublica investigation and a subsequent class-action lawsuit. The documents are public.

What Prior Authorization Actually Means

If you’ve ever had health insurance, you’ve probably encountered prior authorization. Your doctor says you need a procedure, a medication, or a test. Before you can get it, the insurance company has to approve it. In theory, this is a utilization review process meant to prevent unnecessary care.

In practice, it’s a delay mechanism. The American Medical Association surveyed physicians in 2023 and found that 94% of doctors reported that prior authorization delayed patient care. One in three doctors said the process had led to a “serious adverse event” for a patient. That means hospitalization, permanent injury, or death.

The average prior authorization takes days. Complex cases take weeks. Cancer patients waiting for chemotherapy approval. Surgical patients waiting while their condition worsens. Mental health patients waiting for medication while they’re in crisis.

The AI Optimization Problem

Insurance companies have started using machine learning models to process prior authorization requests. The pitch is efficiency: faster processing, more consistent decisions, less administrative burden.

But the models are trained on historical data, and that data includes decades of denial patterns. If the historical record shows that denying a certain category of claim leads a predictable share of patients to give up rather than appeal, the model learns that denial is the efficient outcome. Not because the care isn’t needed. Because denial reduces cost.

The system literally learns that making people go away is cheaper than treating them.
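To make that incentive concrete, here is a toy sketch of a purely cost-minimizing denial policy. This is not any real insurer’s system, and every number in it is hypothetical; it only shows the math that falls out when the objective is expected payout rather than medical need:

```python
# Toy model of a cost-minimizing denial policy. All rates and costs
# are invented for illustration; no real system is depicted.

def expected_cost(claim_cost, appeal_rate, overturn_rate, admin_cost):
    """Return (cost of approving, expected cost of denying).

    If the claim is denied, the insurer only pays when the patient
    both appeals and wins, plus a small cost to process the appeal.
    """
    approve = claim_cost
    deny = appeal_rate * overturn_rate * claim_cost + appeal_rate * admin_cost
    return approve, deny

def policy(claim_cost, appeal_rate=0.2, overturn_rate=0.9, admin_cost=50):
    """Pick whichever action has the lower expected payout."""
    approve, deny = expected_cost(claim_cost, appeal_rate, overturn_rate, admin_cost)
    return "deny" if deny < approve else "approve"

# A $10,000 treatment: even if 90% of appeals would succeed,
# when only 20% of patients appeal, denial "wins" on cost alone.
print(policy(10_000))  # deny
```

Notice that the medical facts of the claim never enter the decision. The only lever a patient has in this model is appealing, which is exactly why a policy like this rewards friction.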

UnitedHealthcare’s AI denial system, which was the subject of a separate investigation, was found to have a 90% reversal rate on appeal. That means 90% of the denials were overturned when a human actually looked at them. The AI wasn’t making good medical decisions. It was generating friction. And friction is profitable.
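You can put rough numbers on that friction. The 90% reversal figure comes from the investigation above; the appeal rate below is invented for illustration, since most patients never appeal at all:

```python
# Back-of-envelope: how many denied claims ever get paid?
# overturn_rate reflects the reported ~90% reversal on appeal;
# appeal_rate is a hypothetical stand-in for illustration.
denials = 1000
appeal_rate = 0.1
overturn_rate = 0.9

paid = denials * appeal_rate * overturn_rate   # denials that are appealed AND reversed
avoided = denials - paid                       # payouts the insurer never makes
print(f"{paid:.0f} claims eventually paid, {avoided:.0f} payouts avoided")
```

Under these assumptions, roughly 9 out of 10 denials are never paid, even though 9 out of 10 appealed denials are overturned. The denial doesn’t need to be correct. It only needs most people not to fight it.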

Why DENIED Hits Different

Every series I write is built on a real system. DENIED is about health insurance prior authorization.

A pharmacist named Mara Kinsey starts noticing a pattern. Patients are dying after their treatments get denied. Not randomly. In clusters. And the denials aren’t coming from doctors reviewing charts. They’re coming from an algorithm that has been optimized to reduce cost, and somewhere along the way, it learned that dead patients are cheaper than treated ones.

Patient deaths aren’t a bug. They’re a line item.

I wish I could say I exaggerated the technology for dramatic effect. I didn’t. The auto-denial systems described in DENIED are based directly on the Cigna and UnitedHealthcare investigations. The only fictional part is the pharmacist who decides to figure out why her patients keep dying.

The next time your insurance company tells you they need “prior authorization” for something your doctor already prescribed, ask yourself who that process was designed to protect. Then ask yourself what happens when nobody is watching.

