The KuLT Press

Maths Casualties

The Optum Risk Algorithm ...

... and the Quiet Denial of Care

Krystal Lynn Tronboll
Jan 11, 2026

Maths Casualties · Case 003


Case Summary

For years, one of the most widely deployed healthcare risk algorithms in the United States systematically underestimated the medical needs of Black patients.

The algorithm, developed by Optum and used by health systems and insurers to allocate access to “high-risk care management” programs, did not explicitly consider race. Instead, it used future healthcare cost as a proxy for future health need.

Because Black patients on average incur lower healthcare spending than white patients with the same disease burden, a consequence of long-documented structural inequities in access to care, the algorithm concluded that many Black patients were healthier than they actually were.
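The mechanism can be sketched with synthetic numbers: take two groups with identical distributions of true disease burden, but let one group spend less at every level of need. Ranking patients by predicted cost then flags the two groups at very different rates, even though their health needs are the same. Everything below (the group sizes, the 0.75 spending factor, the top-decile cutoff) is invented purely for illustration and bears no relation to Optum's actual model.

```python
import random

random.seed(0)
N = 10_000

# Identical distribution of true disease burden in both groups.
need_a = [random.gauss(50, 15) for _ in range(N)]
need_b = [random.gauss(50, 15) for _ in range(N)]

# Spending tracks need, but group B spends less at the same burden,
# a stand-in for structural barriers to care (the 0.75 factor is invented).
spend_a = [100 * n + random.gauss(0, 500) for n in need_a]
spend_b = [75 * n + random.gauss(0, 500) for n in need_b]

# A cost-based "risk score": flag the costliest decile of all patients
# for the care-management program.
all_spend = sorted(spend_a + spend_b, reverse=True)
threshold = all_spend[len(all_spend) // 10]

flag_rate_a = sum(s >= threshold for s in spend_a) / N
flag_rate_b = sum(s >= threshold for s in spend_b) / N

print(f"Group A flagged: {flag_rate_a:.1%}")
print(f"Group B flagged: {flag_rate_b:.1%}")
```

No variable in this sketch is labelled by race, and the score never sees group membership; the disparity emerges entirely from choosing cost, rather than health, as the thing being predicted.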

The result was not a marginal statistical artifact. It was a large-scale, operational outcome: millions of Black patients were less likely to be flagged for additional care, not because clinicians made a decision, but because a mathematical model quietly filtered them out.

This case is not about malicious intent. It is about how a design choice that was mathematically defensible on paper and institutionally convenient in practice translated directly into unequal access to care.

