
Curbside Consult with Dr. Jayne 6/8/20

June 8, 2020 · Dr. Jayne

Most of the journal articles that have come across my desk during the last couple of months have understandably been about the novel coronavirus or its downstream effects. Since there has been a flurry of article retractions recently, I was glad to see this study, which took me back to my healthcare IT roots.

One of the main reasons my first practice implemented an EHR was to increase safety – reduce handwriting errors, reduce medication errors through the addition of allergy and interaction checking, reduce errors due to missing or incomplete data, and more. Although we did see some initial improvements, it quickly became apparent that EHRs could be the source of safety issues we didn’t even dream of in the paper world.

The study, published in JAMA Network Open, looks at trends in EHR safety performance in the US from 2009 to 2018. The authors drew data from a case series of more than 8,600 hospital-year observations from adult hospitals that used the National Quality Forum Health IT Safety Measure, a computerized physician order entry (CPOE) and EHR safety test administered by the Leapfrog Group. The authors found that mean scores on the overall test increased from 53.9% in 2009 to 65.6% in 2018. However, they noted "considerable variation in test performance by hospital and by EHR vendor," going on to voice concern that "serious safety vulnerabilities persist in these operational EHRs."

Digging into the methodology, the Health IT Safety Measure test uses simulated medication orders that have previously been shown to injure or kill patients. They are entered into the system under study to determine how well it identifies potentially harmful medication orders.
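Conceptually, it is a challenge test: feed the system orders that are already known to be dangerous and count how many it intercepts. Here is a minimal Python sketch of that idea, in which `TestOrder`, the reaction labels, and the `mock_ehr` stand-in are all my own illustration rather than anything from the actual Leapfrog tooling:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class TestOrder:
    patient_id: str   # simulated test patient already loaded in the EHR
    medication: str
    harm_type: str    # why the order is dangerous, e.g. "drug-allergy"

# The EHR's possible reactions to an order, from silence to a hard stop.
Reaction = str  # "none", "alert", "soft_stop", or "hard_stop"

def run_safety_test(orders: list[TestOrder],
                    submit: Callable[[TestOrder], Reaction]) -> float:
    """Score = fraction of known-harmful orders that triggered any reaction."""
    caught = sum(1 for o in orders if submit(o) != "none")
    return caught / len(orders)

# Toy stand-in for the system under study; a real harness would drive the
# actual CPOE workflow or an ordering API.
def mock_ehr(order: TestOrder) -> Reaction:
    return "hard_stop" if order.harm_type == "drug-allergy" else "none"

orders = [TestOrder("T001", "penicillin", "drug-allergy"),
          TestOrder("T002", "warfarin", "drug-drug")]
print(run_safety_test(orders, mock_ehr))  # 0.5: only the allergy was caught
```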

Looking deeper at the measures, it was interesting to see the difference between the various levels of clinical decision support: Basic Clinical Decision Support (CDS) scores increased from a mean of 69.8% to 85.6%, while Advanced Clinical Decision Support scores increased from a mean of 29.6% to 46.1%. Basic CDS functions include drug-allergy, drug-route, drug-drug, drug/one-time dose, and therapeutic duplication contraindications. Advanced CDS functions include drug-laboratory, drug-daily-dose, drug-age, drug-diagnosis, and corollary order contraindications. Researchers looked at whether the EHR's CPOE system correctly generated an alert, warning, or stop (soft or hard) after entry of an order that could cause an adverse drug event.
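To make the tiering concrete, here is a hedged sketch that buckets pass/fail outcomes by CDS tier and reports a mean score for each. The category names come from the study; the data structure and scoring logic are purely my own illustration:

```python
BASIC_CDS = {"drug-allergy", "drug-route", "drug-drug",
             "drug-one-time-dose", "therapeutic-duplication"}
ADVANCED_CDS = {"drug-laboratory", "drug-daily-dose", "drug-age",
                "drug-diagnosis", "corollary-orders"}

def tier_scores(results: dict[str, list[bool]]) -> dict[str, float]:
    """results maps each CDS category to pass/fail outcomes for its test orders."""
    scores = {}
    for tier, categories in (("basic", BASIC_CDS), ("advanced", ADVANCED_CDS)):
        outcomes = [r for c in categories for r in results.get(c, [])]
        scores[tier] = sum(outcomes) / len(outcomes) if outcomes else 0.0
    return scores

print(tier_scores({"drug-allergy": [True, True],
                   "drug-laboratory": [True, False]}))
# {'basic': 1.0, 'advanced': 0.5}
```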

The Health IT Safety Measure test is included in the Leapfrog Group’s annual hospital survey and is performed by a hospital staffer. Detailed demographic data is provided for test patients, including diagnoses, laboratory results, and more. These test patients are loaded into the EHR so that they function the same as actual patients. (Hopefully this is all being done in a copy of the production environment, but the study didn’t mention the specifics.)
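For illustration only (the field names here are mine, not Leapfrog's actual test patient format), a loaded test patient needs enough clinical context attached to be able to trip both basic and advanced checks:

```python
# Hypothetical test patient fixture. Ideally this lands in a copy of the
# production environment, not production itself.
test_patient = {
    "patient_id": "T001",
    "age_years": 82,                         # enables drug-age checks
    "allergies": ["penicillin"],             # enables drug-allergy checks
    "diagnoses": ["chronic kidney disease"], # enables drug-diagnosis checks
    "labs": {"creatinine_mg_dl": 2.4},       # enables drug-laboratory checks
    "active_meds": ["warfarin"],             # enables drug-drug checks
}
```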

Once the patients are created, a clinician is supposed to enter test medication orders for those patients and record how the EHR reacts, including whether it generates alerts and, if it does, which kind. Hospital staffers are then responsible for entering this data into the tool. The tool includes protections against hospitals trying to game the system, such as control orders that aren't expected to generate alerts. The process is also timed and must be completed in under six hours.
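The control orders deserve a sketch of their own, since they are what keep an installation from passing simply by alerting on everything. In this illustrative snippet (again, my own structures, not the actual Leapfrog tool), a harmful order passes if anything fires, while a control passes only if the system stays silent:

```python
from dataclasses import dataclass

@dataclass
class OrderResult:
    order_id: str
    is_control: bool   # controls are NOT expected to fire an alert
    reaction: str      # "none", "alert", "soft_stop", or "hard_stop"

def adjudicate(results: list[OrderResult]) -> float:
    """Harmful orders pass when anything fires; controls pass when nothing does,
    which penalizes an alert-on-everything configuration."""
    passed = [(r.reaction == "none") == r.is_control for r in results]
    return sum(passed) / len(passed)

print(adjudicate([OrderResult("A", False, "hard_stop"),    # caught: pass
                  OrderResult("B", True, "soft_stop")]))   # noisy control: fail, so 0.5
```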

As I read the study, I kept waiting for the juicy part where we would learn the details behind the finding that "hospitals using some EHR vendors had significantly higher test scores." The authors used self-reported data, reporting each vendor with more than 100 observations individually and grouping all vendors with fewer than 100 observations as "other." Unfortunately, "vendor names were anonymized per our data use agreement." Although the vendors all had overall scores in the same ballpark (ranging from 53% to 67%), the minimum/maximum score data literally ranged from zero to 100%.

The closest statement I could find to anything that might indicate how real-world vendors performed was this: “In our results, the most popular vendor, vendor A, did have the highest mean safety scores, but there was variability among Vendor A’s implementations, and the second-most popular vendor had among the lowest safety scores, with many smaller EHR vendors in the top 5 in overall safety performance. Additionally, while we found significant variation in safety performance across vendors, there was also heterogeneity within vendors, suggesting that both technology and organizational safety culture and processes are important contributors to high performance.”

As someone who has spent many thousands of hours doing consulting work in the area of organizational change, I think that last statement hits the proverbial nail on the head. I've been in plenty of hospitals and offices where safety features have been disabled or modified, for reasons including alert fatigue, the cumbersome workflows needed to override alerts, and organizational culture. It would be interesting to see whether the top-performing installations were using the vendor's EHR out of the box or in a modified fashion, and what the CPOE build actually looked like.

The authors note several limitations, including the fact that the data set only includes hospitals that completed the Leapfrog survey, which may not be representative of all US hospitals. Although it was outside the scope of this study, I would be interested to see how ambulatory EHRs would fare in such an analysis. In my experience, some ambulatory systems can be even less uniform, as IT teams are pressed to perform whatever customizations or configurations are requested by the physicians who sign their paychecks. I've seen organizations that allow physicians to turn off all medication alerts, others that require physicians to slog through a mind-numbing parade of low-quality alerts throughout the day, and everything in between.

Regardless, the study was thought-provoking, and I hope it spurs additional efforts to assess EHR safety and to measure vendor progress toward more optimized EHRs. It will be interesting to see what the data looks like in another five or ten years and whether individual institutions improve their performance. I would be interested to hear observations from any hospital IT staffers or clinicians who have been involved in performing this test, including whether you feel your scores are representative of your organization's safety culture.

What do you think about EHR safety data? Leave a comment or email me.


Currently there is one comment on this article:

  1. This would make an excellent operational, production, and safety qualification (OQ/PQ/SQ) test case for every upgrade. The patient creation should be automated, and the subsequent ordering should be automated as well, then run for every upgrade: minor, major, or patch.

    Individual settings will vary, but as long as you know what your expectation is (hard or soft stop), you can adjudicate the test results; any failure goes back to the vendor as a defect (see the sketch after this comment).

    I developed interoperability DUR (drug utilization review) test patients that were meant to be used via C-CDA and proprietary exchange methods. They covered most of the normal DUR functions and a couple of the advanced DUR functions, but those tests could be adjusted to cover them all.

    A safety culture is difficult to establish, but it is essential to making these systems deliver on their promise.
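A minimal sketch of the commenter's upgrade-regression idea, assuming a hypothetical `enter_order` hook into the system under test: each scripted scenario is pinned to the stop level the site expects, so any drift after an upgrade surfaces as a failing test.

```python
# Hypothetical regression suite run after every upgrade (major, minor, or patch).
# EXPECTATIONS encodes the site's own configured behavior: one installation may
# use a hard stop where another uses a soft stop, but the expectation is known.
EXPECTATIONS = {
    "penicillin-to-allergic-patient": "hard_stop",
    "duplicate-anticoagulant-order": "soft_stop",
}

def enter_order(scenario: str) -> str:
    """Hypothetical hook that replays a scripted order and returns the reaction."""
    raise NotImplementedError("wire this to the system under test")

def test_cds_regressions():
    # Collect every scenario whose observed reaction differs from the expectation.
    failures = {s: (want, got) for s, want in EXPECTATIONS.items()
                if (got := enter_order(s)) != want}
    # Per the comment above, each mismatch goes back to the vendor as a defect.
    assert not failures, failures
```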






