At Arcadia, Mary Kuchenbrod fixates on data so you don’t have to. As Senior Director of Data Operations, she’s looking at how our partners can dig meaningful insights out of vast stores of EHR and claims data.

But to reach that ideal point where metrics meet action, the data itself has to be healthy—clear, coherent, and organized. Think of it like tuning up a car, or going for an annual physical. By setting clear objectives when you gather and interpret the data, the pipeline from a figure to a meaningful observation stays frictionless. 

Set the right objective to ensure data quality 

Before you start sifting through data, it’s essential to set a clear goal. This ultimate aspiration, whether it’s cost savings or disease prevention, will determine which data are most relevant, narrowing or broadening the field so you can see where analysis is useful.  

If you’re looking to reduce expenses and manage hospital spending, insurance claims alone may be enough: within these records, you could spot opportunities to reduce readmissions, learn which patients need rerouting toward a different type of care, or see the balance between in- and out-of-network expenditure.

Alternatively, risk adjustment would necessitate a different approach. Sure, claims would be helpful, especially in showing historical conditions, but pairing those with EHRs would give granular, timely diagnoses that might help you intervene before a chronic illness strikes.

Maybe you want to improve quality measures. In that case, EHR data is an absolute necessity. Where spend and risk weigh administration against outcomes, quality measures benefit from EHR data in its most detailed form: unstructured data, like chart notes, plus the way a chronic condition might’ve progressed over the years.

It’s all determined by your needs. The first step is identifying them.   

Understand your data sources 

Free-floating data can be gathered in “containers” like EHRs or claims, but that doesn’t mean any one source is comprehensive. The “where” of data collection is an equally pivotal consideration.  

Most networks already have a dizzying number of figures at their fingertips, but occasionally, looking outside your proprietary data feeds can be beneficial. That could look like tapping lab results, real-time ADT (admit, discharge, and transfer) feeds, or other metrics.

“Data’s valuable, but data’s only valuable insofar as we’re going to use it,” Kuchenbrod says.  

The use case should correspond with the type of data you harvest. An initiative that focuses on improving diabetics’ A1Cs might correlate with a lab feed, for example.  

Mixing and matching different types of information can add up to more than the sum of the individual parts: one measure might provide context for another.

Keep a pulse on the “freshness” of your data 

Then there’s the “when” of data capture, an essential but often overlooked piece of ensuring healthy numbers.  

“In healthcare especially, data is always changing,” Kuchenbrod says. “Patients are always coming in the door and having new interactions with the healthcare system.” 

The importance of fresh, regularly updated input will vary with your end goals. Claims data is burdened with a 90-day lag from the date of service, so it won’t suit objectives that require agility (but could be perfect for less urgent questions, like where unnecessary spend happens over a calendar year).

EHR data could update as often as every night, which would benefit the study of a smaller population receiving intensive treatment (but could prove overkill for a look at the bigger picture of, say, administrative efficiency). It’s possible to get even more precise, like an alert when a particular patient shows up at an emergency department. 
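As a sketch of that last idea, here’s what such an alert might look like in code. This is a minimal, hypothetical example — the event shape, field names, and patient panel are all invented for illustration — that scans a batch of ADT events for emergency-department admissions among a tracked panel of patients.

```python
# Hypothetical sketch: flagging emergency-department arrivals from an ADT
# feed for a panel of tracked patients. The event shape is illustrative,
# not a real HL7 message format.

TRACKED_PANEL = {"patient-17", "patient-42"}

def ed_alerts(adt_events: list[dict]) -> list[str]:
    """Return tracked patient IDs that just registered at an ED."""
    return [
        e["patient_id"]
        for e in adt_events
        if e["event_type"] == "admit"
        and e["location"] == "emergency"
        and e["patient_id"] in TRACKED_PANEL
    ]

events = [
    {"patient_id": "patient-17", "event_type": "admit", "location": "emergency"},
    {"patient_id": "patient-99", "event_type": "admit", "location": "emergency"},
    {"patient_id": "patient-42", "event_type": "discharge", "location": "inpatient"},
]
# Only patient-17 is both tracked and arriving at an ED.
```

In practice this kind of filter would sit on a streaming feed rather than a batch, but the principle is the same: the fresher the feed, the closer the alert is to the moment of care.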

The capture and flow of data is just as critical as the type for data quality. You might want a small, steady drip, or you might want a torrent of detail. 

Knowing which type of data suits your needs at the outset will help you put insights into practice more quickly.

The right structure keeps data quality high 

Healthcare data, which begins as raw material — simple 0s and 1s — is transformed through context into valuable observations.  

“There are over 150 ways for blood pressure to be recorded,” Kuchenbrod explains. “Only one of those is what most people are expecting to see in an analytics application.”  

Problems can arise when someone “translates” a numeric value into a qualitative one, like classifying good and bad blood pressure readings within a group; unless every source translates the same way, the result is poor data quality.
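To make the pitfall concrete, here’s a minimal, hypothetical sketch: two source systems record the same reading in different shapes, and routing both through one shared classifier keeps the qualitative labels consistent. The thresholds and field names are illustrative, not drawn from any real data model.

```python
# Hypothetical example: classifying blood pressure readings consistently.
# Thresholds and field names are illustrative only.

def classify_bp(systolic: int, diastolic: int) -> str:
    """Map a numeric reading to one agreed-upon qualitative label."""
    if systolic >= 140 or diastolic >= 90:
        return "high"
    if systolic >= 120 or diastolic >= 80:
        return "elevated"
    return "normal"

# Two source systems record the same reading in different shapes...
source_a = {"bp": "128/82"}                     # single string field
source_b = {"systolic": 128, "diastolic": 82}   # separate numeric fields

# ...but after parsing, both pass through the same classifier,
# so the qualitative label can never drift between sources.
sys_a, dia_a = (int(x) for x in source_a["bp"].split("/"))
label_a = classify_bp(sys_a, dia_a)
label_b = classify_bp(source_b["systolic"], source_b["diastolic"])
```

The point is not the thresholds themselves but where they live: in one place, applied after normalization, rather than re-implemented inside each source feed.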

Data have to receive the right treatment to be useful, which Kuchenbrod describes as an “EHR- and healthcare-agnostic data model.” That means optimizing figures, and the way they’re structured, for exactly how you’ll use them. Can you consistently notate a particular gap in care, for example, or a patient’s overnight admission?

Building the scaffolding for how you’ll store and share these numbers keeps them coherent despite the variation in their original sources or formats. 
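One way to picture that scaffolding: a hedged sketch of a source-agnostic record shape, using an A1C lab result as the example. The vendor formats and field names here are hypothetical; only the LOINC code for hemoglobin A1c (4548-4) is a real identifier.

```python
# Hypothetical sketch of a source-agnostic observation record.
# Vendor feed formats and field names are invented for illustration.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Observation:
    patient_id: str
    code: str          # e.g., a LOINC code identifying the measurement
    value: float
    unit: str
    recorded_on: date

def from_vendor_a(row: dict) -> Observation:
    """Vendor A stores value and unit together as a string like '6.8 %'."""
    value, unit = row["result"].split()
    return Observation(row["mrn"], row["loinc"], float(value), unit,
                       date.fromisoformat(row["date"]))

def from_vendor_b(row: dict) -> Observation:
    """Vendor B already separates value and unit."""
    return Observation(row["patient"], row["code"], row["val"], row["units"],
                       date.fromisoformat(row["obs_date"]))

# Both feeds land in the same structure, so downstream analytics
# only ever see one shape, whatever the original source looked like.
a = from_vendor_a({"mrn": "p1", "loinc": "4548-4",
                   "result": "6.8 %", "date": "2022-05-01"})
b = from_vendor_b({"patient": "p1", "code": "4548-4",
                   "val": 7.1, "units": "%", "obs_date": "2022-05-02"})
```

Each new source gets its own small adapter, while everything downstream of the shared `Observation` shape stays untouched.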

Achieve healthier outcomes through quality data 

At the heart of Arcadia’s mission, from Mary’s data dashboard to our Customer Insights team, is using data in service of healthier lives. Healthy data quality leads to a shorter road from idea to fruition, goal to reality, and we’d love to help pave the way. 

May 12, 2022