
Arcadia statement on predictive analytics and bias

By Michael Gleeson, Chief Strategy and Innovation Officer at Arcadia

We’re sharing the following statement with our customers who may have questions about our predictive algorithms in light of recent media coverage. For general questions, please contact hello@arcadia.io.

Dear colleagues,

Recently we’ve seen substantial news coverage about how a widely used predictive algorithm exhibits significant racial bias. Specifically, given black patients and white patients with similar health profiles, the algorithm will assign the black patients a lower risk score. The authors of the research, published in the journal Science, conclude that “this racial bias reduces the number of black patients identified for extra care by more than half.”

We know that many of you are actively working to address issues of health equity, and the healthcare challenges faced by underserved populations are of great concern to our team as well.

When predictive algorithms impact the actions that are taken for a patient, great care must be taken to ensure those algorithms do not exclude underserved populations. The algorithm referenced in the study takes a cost-based approach to patient stratification, which the researchers identify as suffering from racial bias both in identification of needs and access to services.

We take a different approach to developing predictive algorithms. Many of you are already using the Arcadia Impact Score to help identify patients who could benefit from care management. Unlike the algorithm described in the study, the Arcadia Impact Score does not rely solely on cost-based stratification. We feel it is important to explain why cost-based models can be problematic, and why we use a broader set of models in our algorithm.

The algorithm that was studied tries to answer the question, “Which patients are most likely to have high medical expenses in the future?” Unfortunately, this question is inherently biased.

The research in Science stemmed from an understanding that, on average, black Americans are more likely to suffer from a lack of access to healthcare, from a lack of referrals to appropriate providers and services, and from other social and cultural barriers to care. This means they are less likely to consume healthcare services and more likely to have a lower historical cost of care – but it does not mean that they have less need for care.

An algorithm that is trained to predict future patient cost based on historical utilization will show a lower ROI for intervening with underserved patients and will be less likely to surface them as candidates for care management. As the study authors discuss, this is not a result of inadequate or poor-quality data, or even of a biased dataset; it is a natural consequence of training a model on cost of care alone, as the algorithm under study did.
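
To make the mechanism concrete, here is a toy simulation with synthetic data: two groups with identical underlying need but different access to care. A model that can only learn from observed cost will systematically rank the reduced-access group lower. The numbers, distributions, and variable names below are illustrative assumptions, not data from the study or from any Arcadia model.

```python
# Illustrative sketch only: synthetic data showing how a cost-trained model
# can under-rank patients with equal need but reduced access to care.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

need = rng.gamma(shape=2.0, scale=1.0, size=n)           # true underlying health need
access = rng.choice([1.0, 0.6], size=n)                  # 0.6 = reduced access to care
observed_cost = need * access + rng.normal(0, 0.1, n)    # utilization reflects access, not need

# A model trained to predict cost will, at best, recover observed_cost.
# Rank patients by cost and see who makes the top decile for outreach.
top_decile = observed_cost >= np.quantile(observed_cost, 0.9)

for a in (1.0, 0.6):
    grp = access == a
    print(f"access={a}: mean need={need[grp].mean():.2f}, "
          f"share selected for outreach={top_decile[grp].mean():.1%}")

# Both groups have the same mean need, but the reduced-access group is
# selected far less often because its historical cost is systematically lower.
```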

At Arcadia, we ask a different question: “Which patients are most likely to benefit from a care management program?”

The Arcadia Impact Score takes a more holistic view of medical outcomes by combining three models. One predicts Total Medical Expense, as in the case of the algorithm in the study. However, the Arcadia Impact Score also estimates the effect of Avoidable ED Visits and Unplanned Inpatient Admissions. Neither of these two models is included in the algorithm described by the study; however, in follow-up experiments conducted by the researchers, both approaches substantially reduced bias and are recommended as superior approaches to patient identification.
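
As a rough illustration of what combining several model outputs into a single score can look like, here is a minimal sketch. The weights, normalization, and names below are hypothetical assumptions for illustration; they are not the actual internals of the Arcadia Impact Score.

```python
# Hypothetical sketch of blending three component model outputs into one score.
from dataclasses import dataclass

@dataclass
class ComponentScores:
    """Normalized (0-1) outputs of three hypothetical component models."""
    total_medical_expense: float   # predicted future cost
    avoidable_ed_visits: float     # predicted risk of avoidable ED visits
    unplanned_admissions: float    # predicted risk of unplanned inpatient admissions

def impact_score(c: ComponentScores, weights=(0.2, 0.4, 0.4)) -> float:
    """Blend three component predictions into a single 0-1 score.

    Down-weighting raw cost relative to the utilization-avoidance models
    is one way to keep a biased cost signal from dominating the composite.
    """
    w_tme, w_ed, w_ip = weights
    return (w_tme * c.total_medical_expense
            + w_ed * c.avoidable_ed_visits
            + w_ip * c.unplanned_admissions)

# A low-cost patient with high avoidable-utilization risk still surfaces.
print(impact_score(ComponentScores(0.1, 0.8, 0.7)))  # -> 0.62
```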

Additionally, although total cost of care is an optional input to the Arcadia Impact Score, it is given limited dynamic range: it reflects only a broad classification of “high cost,” which is less likely to be biased by racial disparities. In contrast, demographic, medical, and socioeconomic factors make up most of the inputs to the models that comprise the Arcadia Impact Score.
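
As a hypothetical sketch of what “limited dynamic range” can mean in practice, the snippet below collapses a continuous dollar amount into a coarse category, so small, access-driven cost differences cannot dominate the feature. The thresholds are invented for illustration and are not Arcadia’s actual cutoffs.

```python
def cost_bucket(annual_cost_usd: float) -> int:
    """Map a continuous historical cost to a coarse ordinal category."""
    if annual_cost_usd >= 50_000:   # hypothetical "high cost" threshold
        return 2
    if annual_cost_usd >= 10_000:   # hypothetical "moderate cost" threshold
        return 1
    return 0                        # everything else looks identical to the model

# A $3,000 patient and an $8,000 patient land in the same bucket, so an
# access-driven utilization gap between them carries no weight at all.
print(cost_bucket(3_000), cost_bucket(8_000), cost_bucket(60_000))  # -> 0 0 2
```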

Our work has focused on identifying patients across a diverse population who could benefit from care, rather than on specifically addressing any one type of bias. By training our algorithm to predict Avoidable ED Visits and Unplanned Inpatient Admissions, we can better identify patients who may not be receiving appropriate care and could benefit from care management. Our approach surfaces a diverse mix of patients: some meet more traditional criteria for care management, while others are “unexpected” candidates with a history of low cost and/or low utilization but with health and socioeconomic markers that suggest an opportunity to head off future adverse outcomes.

We hope this comparison between the studied algorithm and the Arcadia Impact Score has been helpful. No algorithm or model is without bias, and awareness of the potential for implicit bias among analysts and practitioners alike is critical. Upcoming enhancements to the Arcadia Impact Score, such as a Health Outcomes model focused on significant health events, will continue to address the issue of disparity in access and care in the healthcare system. And we will continue to invest in developing predictive models that can help identify the right opportunities for care for every patient.

We encourage you to reach out to us with any questions you may have about this important issue.

Warm regards,

Michael