Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!
Note: the views and opinions expressed are those of the authors personally and are not necessarily representative of their current or former employers.
Weaknesses Revealed: Secrets Exposed by Data Integrity Summary Reports
By Beth Haenke Just
The data integrity summary report is one of the most powerful – yet underutilized – tools hospitals have at their disposal for maintaining the integrity of the data within their MPI. Digging deeper into the statistics provided in these reports reveals far more than the volume of overlaid or duplicate records within the system. It can also reveal areas of weakness that, left unchecked, could threaten the long-term integrity of the MPI, limit its usefulness in achieving quality and safety goals and Meaningful Use, and hamper participation in ACOs and HIEs.
In addition to pinpointing the root cause of data integrity issues, summary reports can identify specific areas upon which hospitals should focus corrective efforts. These may include improved education and training, policy clarification, enhanced communication, and other steps that result in fewer duplicates and overlays for a more accurate MPI and improved data integrity.
Regular reviews of summary reports can also reveal patterns of errors. For example, too many null or empty fields in certain records can signal problems with registration processes. Drilling down deeper, data integrity statistics can be used to track errors with greater specificity, such as identification of incorrect patients, transposed Social Security numbers, or non-compliance with naming conventions. Data integrity reports can even provide detailed insight into the specific types of errors that are happening most frequently within individual departments or facilities and even enterprise-wide.
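The kinds of patterns described above lend themselves to simple automated checks. The sketch below, assuming a hypothetical record layout (the field and department names are illustrative, not from any actual MPI schema), counts null or empty required fields per registering department so a spike can be traced to a specific registration process:

```python
from collections import Counter

# Hypothetical required fields for a registration record; real MPI
# schemas vary by vendor and facility.
REQUIRED_FIELDS = ["last_name", "first_name", "dob", "ssn"]

def summarize_null_fields(records):
    """Count null/empty required fields per registering department.

    A spike in one department's counts suggests a registration-process
    problem worth targeted retraining.
    """
    counts = Counter()
    for rec in records:
        dept = rec.get("department", "unknown")
        for field in REQUIRED_FIELDS:
            if not rec.get(field):
                counts[(dept, field)] += 1
    return counts

records = [
    {"department": "ER", "last_name": "Smith", "first_name": "",
     "dob": "1970-01-01", "ssn": None},
    {"department": "ER", "last_name": "Jones", "first_name": "Ann",
     "dob": "", "ssn": "123-45-6789"},
]
print(summarize_null_fields(records))
```

Grouping the counts by (department, field) pairs is what makes the drill-down possible: the same tally distinguishes a facility-wide training gap from a single department's shortcut.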
Once patterns are identified, individual cases can be closely examined to pinpoint where additional training or policy refreshers might be required. Coupling the data integrity summary report with advanced analytics tools allows hospitals to determine precisely where errors are entering the system and the specific types of mistakes being made. This, in turn, allows education programs to be customized to strengthen specific areas of weakness.
For example, if the summary report reveals an unusually large number of registration errors being made within a short period of time, a hospital can drill down into the data to determine the department where the mistakes are originating, as well as who is making them, why, and how. Often the culprit is an individual who is unfamiliar with the registration process and who is attempting to save time by creating new records for every patient rather than first searching the MPI for existing ones. Additional training and education will significantly reduce, and in some cases eliminate, these types of registration errors.
The integrity of patient identity data is critical to achieving care quality and safety goals and plays an integral role in the success of HIEs and ACOs. By taking advantage of the wealth of information found within summary reports, hospitals and health systems can ensure the long-term integrity of their data.
Beth Haenke Just, MBA, RHIA, FAHIMA is CEO and president of Just Associates of Centennial, CO.
Round Peg in a Square Hole: Behavioral Health and EMRs
By Kathy Krypel
Implementing an EMR for behavioral health is like putting a round peg in a square hole. Yes, you read that right: a round peg in a square hole (the opposite of the traditional analogy). The EMR (round peg) can fit, but unless certain steps are taken, it won’t fill the behavioral health (square hole) need entirely. Those steps include collecting the appropriate data and offering behavioral health-specific tools and care plans for optimal diagnosis and care delivery.
Why does it matter? Since many large hospital systems offer behavioral health services as part of their continuum of care, it is important to fill in the gaps and variances around the EMR. The following are just a few examples of why it is important to offer behavioral care services that are supported by a robust EMR:
- One in eight ER visits in the US (nearly 12 million) is due to mental health and/or substance use problems in adults.1 The ER is the most costly venue for care delivery.
- Major depression is considered equivalent, in terms of its burden on society, to blindness or paraplegia. Schizophrenia is equivalent to quadriplegia.2
What are these behavioral healthcare EMR gaps and variances?
- Providers. Most behavioral health providers are not MDs. Primary care physicians spend limited time with patients and are often hesitant to diagnose and treat behavioral concerns. Most behavioral health providers are clinicians with master’s or doctoral degrees who have been licensed by their state(s) to diagnose and treat behavioral health disorders.
- The diagnostic process and tools. Behavioral health disorders are the only serious, chronic illnesses that are diagnosed based solely on self-report. The tools used to assess the behavioral health patient’s mental status and substance abuse patterns are very different from the traditional medical diagnostic tools of imaging and lab work. Behavioral health diagnostic tools are most often elaborate question-and-answer instruments that can be either clinician-administered or self-administered. Behavioral health clinicians use tools such as the Beck Depression Inventory (BDI), Generalized Anxiety Disorder scale (GAD-7), and the Diagnostic and Statistical Manual (DSM IV). These tools need to be incorporated into the data capture and workflow built into an EMR. Additionally, behavioral health providers are required to develop elaborate treatment plans with the patient’s participation. Non-behaviorally focused EMRs typically don’t have built-in tools for this, so they must be built (or ignored). If ignored, the EMR becomes nothing more than a word processor.
- Customization will always be required. While there are multiple behavioral health specific EMR vendors in the marketplace and enterprise-wide EMRs that can be configured to cover behavioral health, customization will be required to meet multiple state-specific mandates, practitioner specialty requirements, and federal privacy rules that apply to behavioral health.
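To make the question-and-answer instruments mentioned above concrete, here is a minimal sketch of scoring one of them, the GAD-7, so results land in the EMR as discrete data rather than free text. The function is illustrative, but the 0–3 item scale and the 5/10/15 severity cut points are the instrument’s standard ones:

```python
def score_gad7(answers):
    """Score the GAD-7 anxiety screen: seven items, each rated 0-3.

    Standard severity bands: 0-4 minimal, 5-9 mild, 10-14 moderate,
    15-21 severe.
    """
    if len(answers) != 7 or any(a not in (0, 1, 2, 3) for a in answers):
        raise ValueError("GAD-7 requires seven answers, each 0-3")
    total = sum(answers)
    if total >= 15:
        severity = "severe"
    elif total >= 10:
        severity = "moderate"
    elif total >= 5:
        severity = "mild"
    else:
        severity = "minimal"
    return total, severity
```

Capturing the score and severity as structured fields, rather than as narrative text, is what lets the EMR trend a patient’s anxiety over time and drive treatment-plan templates from the result.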
Although there are challenges, successes are growing. The following recommendations help to ensure a positive implementation outcome:
- Create a small but specific implementation team that aligns with your behavioral health leadership during the build and test period, so all build and testing work is completed in a collaborative manner.
- Build the most commonly used diagnoses and their DSM IV criteria into the EMR so providers and therapists can use drop-down lists to create the diagnostic picture of the patient.
- Build in ASAM criteria so chemical dependency staff can more easily complete treatment planning.
- Design within the “tighter than HIPAA” federal constraints that govern confidentiality of patient information for patients receiving chemical dependency treatment (i.e., 42 CFR Part 2).
- Involve trainers and testers in the workflow discussions.
In order to avoid putting a round peg in a square hole, it’s essential to understand the variances of the behavioral health setting, address them in workflow, data capture, information exchange, provider engagement, and administrative requirements, and incorporate them into the EMR project plan, design, and implementation.
1. Mental Disorders and/or Substance Abuse Related to One of Every Eight Emergency Department Cases. AHRQ News and Numbers, July 8, 2010. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/news/nn/nn070810.htm
2. Disability-Adjusted Life Year (DALY) data, 2004.
Kathy Krypel is master advisor at Aspen Advisors of Pittsburgh, PA.
Data Virtualization Best Practices Accelerate Time to Value
By Richard Cramer
Data virtualization offers a value proposition that quickly excites business leaders and technologists alike. Business executives are enthusiastic because data virtualization enables IT departments to more quickly respond to new requirements – often in days or weeks rather than months or quarters. Information technologists are similarly excited about being able to get more done, more quickly, and deliver higher value to their business customers.
However, unless we’re careful, this same enthusiasm can lead organizations to use data virtualization where it’s not appropriate, resulting in a classic “square peg in a round hole” situation. It is important to keep in mind that while data virtualization is an important part of the data management tool kit, it is not the right tool for every purpose, and it doesn’t eliminate the need for a traditional data warehouse.
Successful deployments of data virtualization share some common characteristics. First is that data virtualization is most successful when it complements a mature data management infrastructure, development standards, and implementation processes. Best practice in these organizations is to use data virtualization as a part of an overall data management life-cycle where data mapping logic that had been built in the virtual solution is seamlessly reused in the physicalized data integration solution.
Second, there are specific use cases where data virtualization is most appropriate. Best practice is to vet candidate uses of data virtualization against these use cases. Just because data virtualization can be used does not mean it should be used.
This is particularly true in the early stages of adopting data virtualization technology, since missteps in using data virtualization for inappropriate use cases in the first project or two can give the technology a black eye that is hard to overcome later.
Good use cases for data virtualization share the following characteristics: (a) data needs are of a short duration; (b) business requirements are unclear or evolving; and (c) situations where quickly prototyping a view of integrated data is required.
Situations where data virtualization is not a good fit include: (a) complex join logic is required; (b) high performance query response is a driving requirement; or (c) source system availability is unreliable or unpredictable.
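The fit and anti-fit criteria above can be turned into a simple vetting checklist. The sketch below is a hypothetical governance aid built directly from those two lists; the attribute names are illustrative, not from any particular framework:

```python
from dataclasses import dataclass

@dataclass
class CandidateUseCase:
    # Fit criteria from the "good use cases" list
    short_duration: bool
    requirements_evolving: bool
    needs_rapid_prototype: bool
    # Anti-fit criteria from the "not a good fit" list
    complex_joins: bool
    high_performance_queries: bool
    unreliable_sources: bool

def fits_virtualization(uc):
    """Recommend data virtualization only if at least one fit
    criterion holds and no anti-fit criterion does."""
    fit = (uc.short_duration or uc.requirements_evolving
           or uc.needs_rapid_prototype)
    anti = (uc.complex_joins or uc.high_performance_queries
            or uc.unreliable_sources)
    return fit and not anti
```

Making any single anti-fit criterion a veto reflects the point above: one inappropriate early project can give the technology a black eye that is hard to overcome later.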
In this context of best practices, it is exciting to see the healthcare industry providing many opportunities where data virtualization can be a key enabler for organizations looking to maximize their return on data. A large number of healthcare organizations have traditional enterprise data warehouse solutions in place and can benefit most from adding data virtualization to their architecture.
There are also many examples of use cases that are appropriate for data virtualization and can quickly deliver high value. For example, data virtualization can be used to accelerate drug research by providing scientists with integrated views of internal and external information to aid in the drug discovery process. The unpredictable nature of discovery is well served by virtualized data integration: quickly combining lesser-known external data with well-known internal data speeds up the decision-making process and ultimately reduces the time to bring new drugs to market.
For healthcare providers, the ability to respond to ambiguous and frequently changing data requirements in a rapidly changing regulatory and business environment is a must. The rapid prototyping enabled by data virtualization can be invaluable in meeting fleeting reporting and data needs today that may be gone or completely different tomorrow.
Richard Cramer is chief healthcare strategist of Informatica Corporation of Redwood City, CA.
Historically, physician and nursing systems and workflow have often been parallel, but independent of each other. Physicians and nurses must be able to share information to provide coordination of care.
For example, physicians must comply with standards such as ICD-10, ICD-9, SNOMED CT, RxNorm, LOINC, DSM-IV, and CPT, while nurses employ terminology like NANDA, NIC, NOC, ICNP, PNDS, and CCC. With so many different standards in place, creating an integrated picture of patient care can be difficult at best.
Fortunately, all of these standards have already been mapped to link physician and nursing information. The capability now exists to integrate physician and nursing documentation and care capabilities as well as provide links between a patient’s clinical diagnoses and nursing care.
To create this functionality, all existing nursing standards were evaluated to identify the best candidate for use at the point of care in computerized systems. The Clinical Care Classification (CCC) system was selected and 182 CCC Nursing Diagnoses were linked to the more than 55,000 clinical diagnoses. Linking the CCC and clinical diagnoses makes it possible for all members of the care team to generate a list of nursing diagnoses based on the physician’s clinical diagnoses for that patient.
In addition, CCC Nursing Diagnoses are linked to CCC Nursing Interventions and to more than 1,760 specific nursing actions. Also, a starter set of customizable documentation protocols has been developed for each of the nursing actions.
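The linkage structure described above can be pictured as a pair of lookups: clinical diagnosis to CCC nursing diagnoses, and CCC nursing diagnosis to interventions. The fragment below is purely illustrative — the real content links 182 CCC Nursing Diagnoses to more than 55,000 clinical diagnoses and 1,760 nursing actions, and the names below are invented placeholders, not actual mapped content:

```python
# Placeholder mappings; real CCC linkages are far larger and
# professionally curated.
CLINICAL_TO_CCC = {
    "heart failure": ["Cardiac Output Alteration", "Activity Alteration"],
    "type 2 diabetes": ["Knowledge Deficit", "Skin Integrity Alteration"],
}

CCC_TO_INTERVENTIONS = {
    "Cardiac Output Alteration": ["Cardiac care", "Vital signs monitoring"],
    "Activity Alteration": ["Energy conservation teaching"],
    "Knowledge Deficit": ["Disease process teaching"],
    "Skin Integrity Alteration": ["Skin assessment"],
}

def care_plan_for(clinical_diagnoses):
    """Build a nursing diagnosis -> interventions map from the
    physician's clinical diagnoses, following the linked content."""
    plan = {}
    for cd in clinical_diagnoses:
        for nd in CLINICAL_TO_CCC.get(cd.lower(), []):
            plan.setdefault(nd, CCC_TO_INTERVENTIONS.get(nd, []))
    return plan

print(care_plan_for(["Heart Failure"]))
```

Because both lookups key off standardized terminology, any member of the care team can generate the same nursing-diagnosis list from the physician’s problem list, which is precisely the coordination-of-care benefit described here.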
One of the most significant aspects of this work is that the same concepts in the nursing protocols are linked to the physician content where appropriate. Coordination of care has arrived.
David Lareau is chief executive officer of Medicomp Systems of Chantilly, VA.