Dr. Jayne asked important questions in her Curbside Consult about big data, EHR conversions, the “sheer magnitude of bad data out there,” and how best to ensure the integrity of health data.
The best way to address the issue of bad data is to follow the old adage, “Begin with the end in mind.” Implementing an enterprise-wide EHR is a massive, complex undertaking. It involves considering the needs of many stakeholders when defining the build requirements. For example, workflow must support ease of use and not interfere with patient care delivery and related work processes. Furthermore, many implementation decisions focus on driving clinician adoption to ensure that both quality and efficiency objectives are met (not to mention regulatory requirements related to Meaningful Use).
With all the multi-threaded work streams and decision processes involved in planning and executing an EHR implementation, the reusability of captured data frequently falls out of scope. That leads to the bad data problem.
Reusability means using data captured in any source system (EHR, ADT, materials management, patient accounting, registration, operating room, emergency department, etc.) for reporting, measurement, and analytics. Reusing the data captured in these source systems accelerates the value realized from implementing such systems and supports a virtuous cycle of performance improvement across an enterprise.
It all relates to the saying, “You can’t manage what you don’t measure.” That is, you can’t measure something if you don’t have the right data. This leads back to the decisions made in implementing EHRs and other systems. You need to start with what data is required to measure and analyze what’s important to the organization and ensure that data can be consistently, reliably, and accurately captured at the point of origin (e.g., at registration or in the care process).
It’s not realistic, however, to expect that every bit of data about a patient should be captured in a discrete form for reuse. What’s required is a balance between ease of use in the appropriate workflows and the availability of data for reuse.
An effective way to strike this balance is to create a list of data elements the organization agrees are necessary for analytics. Some detective work is required: tracing the journey of that data back to the source system and ensuring that each data element is captured as expected in the intended workflow. This requires collaboration across a multi-disciplinary team — one involving experts in quality reporting, data analysis, and clinical (or operational) workflow.
The inventory of data elements can be used to identify where each data element can be captured in the source system (e.g., EHR, ADT, etc.). This is the “data chain of trust.” Team discussion and compromise are required to design workflows that both support ease of use and capture data reliably and consistently.
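The inventory described above can be kept as a simple structured artifact that maps each data element to where it originates and where in the workflow it is captured. A minimal sketch is below; the element names, source systems, and workflow steps are hypothetical illustrations, not prescriptions from any particular EHR.

```python
from dataclasses import dataclass

# Hypothetical inventory entry: one measurable data element and where
# it is captured. Names below are illustrative examples only.
@dataclass
class DataElement:
    name: str            # what the organization wants to measure
    source_system: str   # where it originates (EHR, ADT, registration, ...)
    capture_point: str   # the workflow step where it is recorded

inventory = [
    DataElement("discharge_disposition", "ADT", "discharge workflow"),
    DataElement("hemoglobin_a1c", "EHR", "lab result entry"),
    DataElement("primary_insurance", "registration", "patient check-in"),
]

def chain_of_trust(elements):
    """Group elements by source system so each capture point can be
    reviewed and verified with the owners of that workflow."""
    chain = {}
    for e in elements:
        chain.setdefault(e.source_system, []).append((e.name, e.capture_point))
    return chain
```

Grouping the inventory by source system gives the multi-disciplinary team a concrete checklist: for each system, which elements must be captured and at which workflow step.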
With a documented inventory of data elements married to how that data will be captured in the source systems, data can start flowing into an analytics environment. Applying sound data governance principles and implementing a data profiling discipline will ensure data consistency and reliability.
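A data profiling discipline can start as simply as measuring, per field, how often the field is populated (completeness) and how often the populated value falls in an allowed set (validity). The sketch below assumes records arrive as dictionaries; the field name and allowed values are hypothetical examples.

```python
# Minimal data-profiling sketch. The field and its allowed value set
# are hypothetical; a real program would load these from governance
# documentation rather than hard-code them.
ALLOWED = {"discharge_disposition": {"home", "snf", "expired", "transfer"}}

def profile(records, field):
    """Return (completeness, validity) rates for one field."""
    total = len(records)
    populated = [r[field] for r in records if r.get(field) not in (None, "")]
    valid = [v for v in populated if v in ALLOWED.get(field, set(populated))]
    completeness = len(populated) / total if total else 0.0
    validity = len(valid) / len(populated) if populated else 0.0
    return completeness, validity

records = [
    {"discharge_disposition": "home"},
    {"discharge_disposition": "hom"},   # typo: populated but invalid
    {"discharge_disposition": None},    # missing: fails completeness
]
```

Running `profile(records, "discharge_disposition")` on the sample above yields a completeness rate of 2/3 and a validity rate of 1/2, flagging both the missing value and the typo for follow-up at the point of capture.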
Organizations don’t have to begin with a large set of discrete data, but they must recognize that any level of measurement, reporting, and analytics requires consistent, reliable, accurate data starting at the point of capture in the source systems. They should begin with the data most important to each organization and ensure that data can flow from origin to analytics in a chain of trust that is known and transparent.
From there, health systems can incrementally increase the available data as they come to understand why it’s important to capture data discretely and accurately and as more stakeholders benefit from access to that data. With the increasing value realized comes the understanding that, “It’s all about the data.”
Randy Thomas is associate partner of health analytics with Encore Health Resources.