Decision aids that U.S. physicians use to guide patient care on everything from who receives heart surgery to who needs kidney care and who should try to give birth vaginally are racially biased, scientists reported on Wednesday in the New England Journal of Medicine.
It is the latest evidence that the algorithms hospitals and physicians use to guide health care for tens of millions of Americans are shot through with implicit racism that their creators are often unaware of, but that nevertheless results in Black people receiving inferior care.
The new study cuts across more medical specialties than any previous examination of race in algorithm-driven patient care.
Malika Fair, an emergency physician and senior director for health equity partnerships and programs at the Association of American Medical Colleges, called the study “impressive.”
“I am delighted that the use of race in medical decision-making is being questioned in such a thoughtful analysis,” said Fair, who was not involved in the study. “As a medical community, we have not fully embraced the notion that race is a social construct and not based in biology.”
The findings build on earlier studies that focused more narrowly. Last year, for instance, other scientists found that a widely used algorithm that identifies which patients should get additional health care services such as home visits routinely put healthier, white patients into the programs ahead of Black patients who were sicker and needed them more. The bias resulted from the algorithm’s developers equating higher health care spending with worse health. But white Americans spend more on health care than Black Americans even when their health situations are identical, making spending a poor and racially biased proxy for health.
The new study finds that algorithms used for medical decisions from cardiology to obstetrics are similarly tainted by implicit racial bias and adversely affect the quality of care Black patients receive. Among them:
Heart failure: Developed by the American Heart Association to determine which hospitalized patients are at risk of dying from heart disease, the algorithm assigns three points to any “nonblack” patient; more points mean a higher risk of death. Those deemed at higher risk (non-Black patients) are more likely to be referred to specialized care, said David Shumway Jones of Harvard Medical School, the study’s senior author. That possibility is not merely theoretical: at one Boston hospital, a 2019 study found, Black and Latinx patients arriving in the emergency room with cardiac symptoms were less likely than white patients with the same symptoms and medical history to be admitted to the cardiology unit.
Cardiac surgery: In a risk calculator used by thoracic surgeons, being Black increases the supposed likelihood of post-operative complications such as kidney failure, stroke, and death. “That could make surgeons steer Black patients away from bypass surgery, mitral valve repair and replacement,” and other life-saving operations, Jones said. “If I have a Black patient and the risk calculator tells me he has a 20% higher risk of dying from this surgery, it might scare me off from offering that procedure.”
Kidney function: It’s very difficult to measure kidney function directly, so physicians use creatinine levels in the blood as a proxy: the less creatinine, the better the kidney function. A “race adjustment” in a widely used algorithm lowers Black patients’ supposed creatinine levels below what’s actually measured. That makes their kidney function appear better than it is, potentially keeping them from receiving appropriate specialty care. The rationale for “adjusting” creatinine levels by race is that Black people are supposedly more muscular, which can increase the release of creatinine into the blood. As it is, Black people have higher rates of end-stage renal disease than white people.
Kidney transplants: An algorithm used by transplant surgeons says that kidneys from Black donors are more likely to fail than kidneys from donors of other races. Because Black patients are more likely to receive an organ from a Black donor, the algorithm reduces the number of kidneys available to them.
Vaginal birth after cesarean: Recognizing that cesarean deliveries are more dangerous for both mothers and babies, obstetricians recommend that women who had a previous surgical birth not be automatically scheduled for another cesarean, as was once common practice. But the algorithm they use to determine whether a woman faces too high a risk from vaginal birth automatically assigns Black and Latinx women a higher risk. That assignment was based on a study which found that being unmarried and not having insurance also increase the risk; neither of those socioeconomic factors is included in the algorithm.
Breast cancer: An online tool that estimates the risk of developing breast cancer calculates a lower risk for a Black or Latinx woman than for a white one even when every other risk factor (such as age at menarche, close relatives with breast cancer, and a history of benign biopsies) is identical. That could deter minority women from undergoing screening.
Kidney stones: When patients come to an emergency department with pain in the back or side, doctors use an algorithm with a 13-point scale to assess whether the cause is kidney stones; a higher score means kidney stones are a less likely cause. Being Black automatically adds three points to the score. An assessment of the algorithm by independent researchers found no scientific support for the assumption that Black people’s pain is less likely to indicate kidney stones.
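Several of the tools described above are simple additive point scores, so the mechanism is easy to see in code. The sketch below is purely illustrative: the function name, point values, and the three-point race term are hypothetical stand-ins, not the published coefficients of any of these calculators.

```python
# Illustrative sketch only: a simplified additive point score showing how a
# fixed race term shifts a clinical risk estimate. All names and numbers here
# are hypothetical, not taken from any published clinical tool.

def risk_score(age_points: int, symptom_points: int, is_black: bool) -> int:
    """Sum clinical points, then add a fixed race term in the style of the
    algorithms described above (e.g., three points for one racial group)."""
    score = age_points + symptom_points
    if not is_black:
        score += 3  # race adjustment, independent of any clinical finding
    return score

# Two patients with identical clinical findings, differing only by race:
print(risk_score(age_points=2, symptom_points=4, is_black=False))  # scores 9
print(risk_score(age_points=2, symptom_points=4, is_black=True))   # scores 6
```

The point is structural: two patients with identical clinical findings end up with different scores, and if a referral threshold falls between those scores, race alone decides who is flagged for specialized care.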
“Many of the algorithms are widely used and have a substantial impact on patient care,” said Brian Powers, a physician and researcher at Brigham and Women’s Hospital in Boston, who was not involved in the study. “Cataloguing these algorithms is an important first step. What’s needed now is a better understanding of whether these algorithms exhibit racial bias, or are contributing to health inequities in other ways.”
All 13 of the algorithms Jones and his colleagues examined offered rationales for including race in a way that, presumably unintentionally, made Black and, in some cases, Latinx patients less likely to receive appropriate care. But when you trace those rationales back to their origins, Jones said, “you find outdated science or biased data,” such as simplistically concluding that poor outcomes for Black patients are due to race.
Typically, developers based their algorithms on studies showing a correlation between race and some medical outcome, assuming that race explained, or even caused, a poorer outcome (from a vaginal birth after a cesarean, for example). They generally did not examine whether factors that typically go along with race in the U.S., such as access to primary care, socioeconomic status, or discrimination, might be the true drivers of the correlation.
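The confounding problem Jones describes can be made concrete with a toy simulation. The numbers below are fabricated for illustration: the poor outcome depends only on access to primary care, but because access is unevenly distributed between two groups, a naive group comparison shows a spurious "race effect" that disappears once access is held constant.

```python
import random

random.seed(0)

def simulate(n=10_000):
    """Synthetic data: outcome depends ONLY on access to care, never on group.
    Access rates (0.8 vs 0.5) and outcome rates (0.1 vs 0.3) are made up."""
    rows = []
    for _ in range(n):
        group_a = random.random() < 0.5                    # group label
        access = random.random() < (0.8 if group_a else 0.5)  # confounder
        poor = random.random() < (0.1 if access else 0.3)  # outcome: access only
        rows.append((group_a, access, poor))
    return rows

def poor_rate(rows, keep):
    sel = [poor for g, a, poor in rows if keep(g, a)]
    return sum(sel) / len(sel)

rows = simulate()
# Naive comparison: one group looks sicker, as if group membership mattered...
naive_gap = abs(poor_rate(rows, lambda g, a: g) - poor_rate(rows, lambda g, a: not g))
# ...but comparing only patients WITH access, the gap nearly vanishes:
strat_gap = abs(poor_rate(rows, lambda g, a: g and a)
                - poor_rate(rows, lambda g, a: (not g) and a))
print(naive_gap, strat_gap)
```

This is the kind of stratified check that, per Jones, "modern tools of epidemiology and statistics" apply before attributing an outcome difference to race itself.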
“Modern tools of epidemiology and statistics could sort that out,” Jones said, “and show that much of what passes for race is actually about class and poverty.”
Including race in a clinical algorithm can sometimes be appropriate, Powers cautioned: “It could lead to better patient care or even be a tool for addressing inequities.” But it might also exacerbate inequities. Figuring out the algorithms’ consequences “requires taking a close look at how the algorithm was trained, the data used to make predictions, the accuracy of those predictions, and how the algorithm is used in practice,” Powers said. “Unfortunately, we don’t have these answers for many of the algorithms.”
Even before the new analysis, some hospitals and specialty groups were questioning the algorithms. The Society of Thoracic Surgeons, which developed and whose members use the algorithm that tells them Black patients are more likely to suffer post-op complications, admitted that including race in the algorithm “might ‘adjust away’ disparities in quality of care.”
Kidney specialists at hospitals in Boston and San Francisco have stopped using race to make a Black patient’s kidney function score look better. This month, the University of Washington School of Medicine announced that its physicians would also stop using the racially tinged kidney function algorithm. But at least two other Boston hospitals, whose officials were given a presentation about the creatinine algorithm several weeks ago, have declined to stop using it.
Jones does not think the algorithm developers were racially motivated. “Well-meaning individuals acting without racist intent can still produce work with racist consequences,” he said, “such as redirecting medical resources from one group to another.”
He and other physicians hope that shining a light on the use of racially biased decision tools will make more in the medical community question the practice.
“By perpetuating the use of race in our decision-making, we can further exacerbate inequities and delay needed treatments, procedures, and referrals to specialists,” said AAMC’s Fair. “For the medical community, this is an example of structural racism and is one area we can address quickly and decisively.”