Maths Casualties · Case 003
Case Summary
For years, one of the most widely deployed healthcare risk algorithms in the United States systematically underestimated the medical needs of Black patients.
The algorithm, developed by Optum and used by health systems and insurers to allocate access to “high-risk care management” programs, did not explicitly consider race. Instead, it used future healthcare cost as a proxy for future health need.
Black patients, on average, incur lower healthcare spending than white patients with the same disease burden, a gap rooted in long-documented structural inequities in access to care. As a result, the algorithm concluded that many Black patients were healthier than they actually were.
The result was not a marginal statistical artifact. It was a large-scale, operational outcome: millions of Black patients were less likely to be flagged for additional care, not because clinicians made a decision, but because a mathematical model quietly filtered them out.
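The mechanism can be sketched in a few lines. The following is a minimal, hypothetical simulation, not the actual Optum model: all numbers (population size, spending gap, flagging threshold) are illustrative assumptions. Two groups have identical distributions of true disease burden, but one incurs lower spending at the same burden level; ranking by cost then under-flags that group.

```python
import random

random.seed(0)

# Hypothetical setup: groups A and B have the SAME distribution of true
# disease burden, but group B incurs ~25% lower spending at equal burden
# (the structural gap the case describes). Numbers are illustrative.
N = 10_000
patients = []
for i in range(N):
    group = "A" if i % 2 == 0 else "B"
    burden = random.gauss(50, 15)                      # true health need
    spend_factor = 1.0 if group == "A" else 0.75       # assumed spending gap
    cost = burden * spend_factor + random.gauss(0, 5)  # observed spending
    patients.append((group, burden, cost))

# The proxy choice: flag the top 20% of patients by COST, not by need.
cutoff = sorted(p[2] for p in patients)[int(0.8 * N)]

flag_rate = {}
for g in ("A", "B"):
    in_group = [p for p in patients if p[0] == g]
    flagged = [p for p in in_group if p[2] >= cutoff]
    flag_rate[g] = len(flagged) / len(in_group)

print(f"flag rate A: {flag_rate['A']:.1%}, B: {flag_rate['B']:.1%}")
```

Despite identical health need, group B is flagged for care management far less often: the disparity is produced entirely by the choice of label, with race never appearing in the model.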
This case is not about malicious intent. It is about how a design choice that was mathematically defensible on paper and institutionally convenient in practice translated directly into unequal access to care.



