A health care algorithm used in hospitals across the U.S. has been discriminating against black patients, according to new research.
The study found that the algorithm consistently prioritised less-sick white patients and screened out black patients from a programme meant to help people who need more intensive care.
Predictive algorithms have found their way into many areas of society, including health care.
But according to the paper's authors, researchers have rarely had the chance to examine up close how and why bias creeps into these algorithms.
Many algorithms are proprietary, meaning the exact details of how they were programmed – including the sources of data used to train them – are off-limits to independent scientists.
The authors looked at data from an algorithm developed by the company Optum that’s widely used in hospitals and health care centres, including the hospital where some of the authors worked.