Addressing Data Quality in the EHR
By Greg Chittim
What if you found out that you might have missed out on seven of your 22 ACO performance measures, not because of your actual clinical and financial performance, but because of the quality of data in your EHRs? It happens, but it’s not an intractable problem if you take a systematic approach to understanding and addressing data quality in all of your different ambulatory EHRs.
In HIStalk’s recent coverage of HIMSS14, an astute reader wrote:
Several vendors were showing off their “big data” but weren’t ready to address the “big questions” that come with it. Having dealt with numerous EHR conversions, I’m keenly aware of the sheer magnitude of bad data out there. Those aggregating it tend to assume that the data they’re getting is good. I really pushed one of the major national vendors on how they handle data integrity and the answers were less than satisfactory. I could tell they understood the problem because they provided the example of allergy data where one vendor has separate fields for the allergy and the reaction and another vendor combines them. The rep wasn’t able to explain how they’re handling it even though they were displaying a patient chart that showed allergy data from both sources. I asked for a follow up contact, but I’m not holding my breath.
All too often as the HIT landscape evolves, vendors and their clients are moving too quickly from EHR implementation to population health to risk-based contracts, glossing over (or skipping entirely) a focus on the quality of the data that serves as the foundation of their strategic initiatives. As more provider organizations adopt population health-based tools and methodologies, a comprehensive, integrated, and validated data asset is critical to driving effective population-based care.
Health IT maturity can be defined as four distinct steps:
- EHR implementation
- Achievement of high data quality
- Reporting on population health
- Transformation into a highly functioning PCMH or ACO
High-quality data is a key foundational piece that is required to manage a population and drive quality. When the quality of data equals the quality of care physicians are providing, one can leverage that data as an asset across the organization. Quality data can provide detailed insight that allows pinpointing opportunities for intervention — whether it’s around provider workflow, data extraction, or patient follow-up and chart review. Understanding the origins of compromised data quality helps organizations recognize how to boost measure performance, maximize reimbursements, and lay the foundation for effective population health reporting.
It goes without saying that reporting health data across an entire organization is not an easy task. However, there are steps that organizations must take to ensure they are extracting sound data from their EHR systems.
Outlined below are the key issues that contribute to poor data quality impacting population health programs, how they are typically resolved, and more optimal ways organizations can resolve them.
Variability across disparate EHRs and other data sources
EHRs are inconsistent. Data feeds are inconsistent. Despite their intentions, standardized message types such as HL7 and CCDs still have a great deal of variability among sources. Even when feeds meet the letter of national standards, they rarely meet the true spirit of those standards once you actually try to use the data.
Take diagnoses, for example. Patient diagnoses can often be recorded in three different locations: on the problem list, as an assessment, and in medical history. Problem lists and assessments are both structured data, but generally only diagnoses recorded on the problem list are transported to the reports via the CCD. This translates to underreporting on critical measures that require records of DM, CAD, HTN, or IVD diagnoses. Accounting for this variability is critical when mapping data to a single source of truth.
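The gap described above can be sketched in a few lines. This is a simplified illustration, not any vendor's actual schema: the record shapes and field names are hypothetical, and real extraction would work against coded CCD sections or back-end tables. The point is that a union across all structured locations surfaces diagnoses a problem-list-only feed would drop.

```python
# Hypothetical diagnosis records from three structured EHR locations.
# Field names ("patient", "icd") are illustrative only.
problem_list = [{"patient": "P1", "icd": "E11.9"}]    # diabetes, on the problem list
assessments  = [{"patient": "P2", "icd": "I10"}]      # hypertension, recorded only as an assessment
history      = [{"patient": "P1", "icd": "I25.10"}]   # CAD, recorded only in medical history

def merge_diagnoses(*sources):
    """Union diagnoses across all structured locations, de-duplicated per patient."""
    merged = {}
    for source in sources:
        for rec in source:
            merged.setdefault(rec["patient"], set()).add(rec["icd"])
    return merged

all_dx = merge_diagnoses(problem_list, assessments, history)
# A CCD feed that carries only the problem list would miss P2's HTN
# and P1's CAD — exactly the under-reporting pattern described above.
```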
Standard approach: Most organizations try to use consistent mapping and normalization logic across all data sources. Validation is conducted by doing sanity checks, comparing new reports to old.
Best practice approach: To overcome the limitations of standard EHR feeds like the CCD, reports need to pull from all structured data fields in order to achieve performance rates that reflect the care physicians are rendering. Either workflow needs to be standardized across providers, or reporting tools need to be comprehensive and flexible in the data fields they pull from.
The optimal way to resolve this issue is to tap into the back end of the EHR. This allows you to see what data is structured vs. unstructured. Once you have an understanding of the back-end schema, data interfaces and extraction tools can be customized to pull data where it is actually captured, as well as where it should be captured. In addition, validation of individual data elements needs to happen in collaboration with providers, to ensure completeness and accuracy of data.
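Mapping disparate back ends to a single source of truth usually means writing a small adapter per source. As a minimal sketch — using the allergy example from the reader quoted earlier, with invented field names — two vendors' representations can be normalized to one canonical shape:

```python
# Hypothetical rows: vendor A stores allergen and reaction in separate
# fields; vendor B combines them in one free-text field (cf. the reader's
# HIMSS example). Field names are illustrative assumptions.
vendor_a_row = {"allergen": "Penicillin", "reaction": "Hives"}
vendor_b_row = {"allergy": "Penicillin - Hives"}

def normalize_a(row):
    """Adapter for the split-field vendor."""
    return {"allergen": row["allergen"].lower(), "reaction": row["reaction"].lower()}

def normalize_b(row):
    """Adapter for the combined-field vendor: split on the delimiter."""
    allergen, _, reaction = row["allergy"].partition(" - ")
    return {"allergen": allergen.lower(), "reaction": reaction.lower() or None}

# Both sources now land in one canonical shape, so downstream reports
# and patient charts can aggregate them safely.
assert normalize_a(vendor_a_row) == normalize_b(vendor_b_row)
```

In practice each adapter is validated element-by-element with providers, as described above, rather than assumed correct because the feed parsed cleanly.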
Variability in provider workflows
EHRs are not perfect and providers often have their own ways of doing things. What may be optimal for the EHR may not work for the providers or vice versa. Within reason, it is critical to accommodate provider workflows rather than forcing them into more unnatural change and further sacrificing efficiency.
Standard approach: Most organizations ignore this and go to one extreme or the other: (1) using consistent mapping and normalization logic across all data sources and user workflows, assuming that all providers use the EHR consistently, or (2) letting workflows dictate everything and fighting the losing battle of making the data integration infinitely adaptable. Again, validation is conducted using sanity checks, comparing new reports to old.
Best practice approach: Understand how each provider uses the system and identify where each provider is capturing every data element. It is critical to build in a core set of workflows and standards, dictated by an on-the-ground clinical advisory committee, with flexibility for effective variations. With a standard core in place, data quality can be enhanced by tapping into the back end of the EHR to fully understand how data is captured, and by spending time with care teams to observe their variable workflows. To avoid disrupting provider workflows, interfaces and extraction tools can be configured to map data correctly regardless of how and where it is captured. Robust validation of individual data elements needs to happen in collaboration with providers to ensure that the completeness and accuracy of the data — that is, its quality — matches the quality of care being delivered.
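One way to accommodate workflow variation without disrupting it is a per-provider extraction map. The sketch below is an assumption about how such a config could look — provider names, field paths, and the chart structure are all invented for illustration:

```python
# Hypothetical per-provider config: each provider documents BMI in a
# different structured location, so the extractor maps every documented
# location to one canonical element instead of forcing a workflow change.
FIELD_MAP = {
    "dr_smith": {"bmi": "vitals.bmi"},
    "dr_jones": {"bmi": "flowsheet.body_mass_index"},
    "default":  {"bmi": "vitals.bmi"},
}

def resolve_field(provider, element):
    mapping = FIELD_MAP.get(provider, FIELD_MAP["default"])
    return mapping[element]

def extract(chart, provider, element):
    """Walk the dotted path for this provider's documented location."""
    node = chart
    for key in resolve_field(provider, element).split("."):
        node = node.get(key, {})
    return node or None

chart = {"flowsheet": {"body_mass_index": 31.2}}
extract(chart, "dr_jones", "bmi")  # found, despite the nonstandard location
extract(chart, "dr_smith", "bmi")  # None — a fixed mapping would silently miss it
```

The design choice here mirrors the text: the core data model stays standard, and only the mapping layer absorbs the workflow variation.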
Build provider buy-in/trust in system and data through ownership
If providers do not trust the data, they will not use population health tools. Without these tools, providers will struggle to effectively drive proactive, population-based care or quality improvement initiatives. Based on challenges with EHR implementation and adoption over the last decade, providers are often already skeptical of new technology, so getting this right is critical.
Standard approach: Many organizations simply conduct the data validation process with a sanity test comparing old reports to new. Reactive fixes correct errors in data mapping, but often too late, after provider trust in the system has already been lost.
Best practice approach: Yet again, it is important to build out a collaborative process to ensure every single data element is mapped correctly. First meetings to review data quality usually begin with a statement akin to “your system must be wrong — there’s no way I am missing that many patients.” This is OK. Working side by side with providers so that they understand where data is coming from and how to modify both workflow and calculations ensures they are confident that reports accurately reflect the quality of care they are rendering. This confidence is a critical success factor in the eventual adoption of these population health tools in a practice.
Missed incentive payments under value-based reimbursement models
An integrated data asset that combines data from many sources should always add value and give meaningful insight into the patient population. A poorly mapped and validated data asset can actually compromise performance, lower incentive reimbursements, and ultimately result in a negative ROI.
Standard approach: Most organizations settle for a lackluster data validation process, which results in lost revenue opportunities, as the data will neither accurately reflect the quality of care delivered nor accurately report the risk of the patient population.
Best practice approach: Using the previously described approach when extracting, mapping, and validating data is critical for organizations that want to see a positive ROI in their population health analytics investments. Ensuring data is accurate and complete will ensure tools represent the quality of care delivered and patient population risk, maximizing reimbursement under value-based payments.
We have worked with a sample ACO physician group of over 50 physicians to assess the quality of data being fed from multiple EHRs within their system into an existing analytics platform via CCDs and pre-built feeds. Based on an assessment of 15 clinically sensitive ACO measures, it was discovered that the client’s reports were under-reporting on 12 of the 15 measures, based only on data quality. Amounts were under-reported by an average of 28 percentage points, with the maximum measure being under-reported by 100 percentage points.
Reports erroneously indicated that only six of the 15 measures met 2013 targets, while a manual chart audit revealed that 13 of the 15 measures met them, indicating that data was not being captured, transported, and reported accurately. By simply addressing these data quality issues, the organization could potentially see additional financial returns through quality incentive reimbursements as well as a reduced need for labor-intensive chart audits.
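The gap analysis above reduces to simple percentage-point arithmetic. The measures and rates below are illustrative placeholders, not the actual client's figures; they only show how an under-reporting gap is computed per measure:

```python
# Illustrative numbers only — not the actual assessment data.
# Each measure pairs the rate the analytics platform reported with the
# rate a manual chart audit found.
measures = {
    "diabetes_a1c_control": {"reported": 0.42, "audited": 0.78},
    "htn_bp_control":       {"reported": 0.55, "audited": 0.71},
    "ivd_antiplatelet":     {"reported": 0.90, "audited": 0.90},
}

def underreporting_points(m):
    """Gap in percentage points between audited and reported performance."""
    return round((m["audited"] - m["reported"]) * 100)

gaps = {name: underreporting_points(m) for name, m in measures.items()}
under = {name: g for name, g in gaps.items() if g > 0}   # measures with a data-quality gap
avg_gap = sum(under.values()) / len(under)               # average shortfall, in points
```

The same tally — how many measures show a positive gap, and the average size of that gap — is what the assessment described above produced (12 of 15 measures, averaging 28 points).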
As the industry continues to shift toward value-based payment models, the need for an enterprise data asset that accurately reflects the health and quality of care delivered to a patient population is increasingly crucial for financial success. Providers have suffered enough with drops in efficiency since going live on EHRs. Asking them to make additional significant changes in their daily workflows to make another analytics tool work is not often realistic.
Analytics vendors need to meet the provider where they are to add real value to their organization. Working with providers and care teams not only to validate integrity of data, but to instill a level of trust and give them the confidence they need to adopt these analytics tools into their everyday workflows is extremely valuable and often overlooked. These critical steps allow providers to begin driving population-based care and quality improvement in practices, positioning them for success in the new era of healthcare.
Greg Chittim is senior director of Arcadia Healthcare Solutions of Burlington, MA.