
Curbside Consult with Dr. Jayne 12/7/15

December 7, 2015 Dr. Jayne


A Tale of Two Articles

I subscribe to quite a few news digests, including some from the AMA and other professional organizations. The headlines are always attention-grabbing, so “Framework evaluates 20 top EHRs – and they don’t quite measure up” definitely caught my attention. It links to a usability analysis done by the AMA and MedStar Health’s National Center for Human Factors.

Using an EHR User-Centered Evaluation Framework, they looked at data that 20 vendors (15 ambulatory and five inpatient) provided to meet ONC certification requirements. The Framework goes “beyond the ONC’s criteria… to encourage the ONC to raise the bar for usability certification.”

Being a good clinical informaticist means being a critical reader and making sure that one understands the information being presented before coming to conclusions. That approach has served me well when providers would storm into my office with "conclusive evidence" that our EHR was bad, waving copies of articles in my face, such as a reader survey that drew a grand total of 13 respondents using our product from a nationwide sample.

In reading the introductory information carefully, one can see that they're somewhat comparing apples to oranges. They looked at what vendors submitted for certification, not the totality of what a given vendor did or did not do with regard to user-centered design.

I have several friends who work for vendors and have heard that providing more than what is required for certification is the equivalent of being on the witness stand and offering more than a single-word answer to a yes or no question. The “just the facts, ma’am” approach seems to be preferred.

You can’t blame them. Vendors don’t want to get tangled up in showing something not required that might lead to questions. The certification process is already onerous enough.

The AMA blurb goes on to conclude that, "Out of those 20 products evaluated, only three met each of the basic capabilities measured." I'm not surprised by this, since they were measuring information from a data set designed for a different purpose than the one they used it for.

I fully understand that they're trying to make the point that they think the ONC certification process for usability best practices isn't robust enough. Unfortunately, it seems to tar and feather some of the vendors despite the processes they actually follow (but didn't include in the ONC documentation because they weren't required).

The AMA blurb also omitted clear language that appears in the actual MedStar Health documentation on the User-Centered Design Evaluation Framework. It states plainly that the framework is not designed to assess actual usability by clinicians, but rather to look at vendor practices as reported to ONC for the eight required patient safety capabilities.

I’ve personally used many of their top-scoring systems and found them to have major usability issues. Casual readers aren’t going to dig into the details. This piece is likely to be misleading.

I found the whole thing even more interesting when I opened this month’s JAMIA to find an article by the same lead author, Raj Ratwani. This time the researchers actually visited 11 EHR vendors to look at their user-centered design processes. I found this data much more interesting (not to mention peer-reviewed).

Six of the vendors visited have more than $100 million in revenue, with the top three being over $1 billion, so you can guess who they are. Interestingly enough, four of the six were found to have “well-developed UCD” processes and another was found to have “misconceptions of UCD.” I actually laughed when I read this, likening it to delusions of grandeur somewhere in the back of my mind.

The researchers define misconceptions as cases where "vendors do not have any UCD processes in place although they believe they do." This also includes vendors who cite being responsive to user complaints and feature requests as evidence of UCD.

The overall distribution of the vendors was four with well-developed UCD, four with basic UCD, and three with misconceptions of UCD. The authors go on to cite the fact that even the “misconception” group is certified by ONC, illustrating why certification requirements may need an adjustment. They do at least mention the challenge of creating requirements that lead to improvement for the poor performers but don’t hamper those that are already doing well.

My favorite quote in the article came from one vendor, who stated, "Our product is used by thousands of people every day. So if it was that bad, it would already be out of the market."

I certainly prefer the scholarly approach of the latter article, although I’m sure it didn’t get anywhere near as much press as the first one. I was trying to figure out what category my EHR vendor fell into. It turns out they weren’t one of the participants.

How does your vendor perform on UCD and what do you think about it? Email me.



