Last week, Inga mentioned that the results of the annual EHR User Satisfaction Survey have been published by the American Academy of Family Physicians. Unfortunately, AAFP has this content on a restricted members-only site, so I had to bribe my favorite cross-town family doc for a copy.
I don’t want the copyright police to come after my friend, so I won’t share the full article, but I’ll summarize some key thoughts here. It also gives me a chance to hone my “speech” because I’m sure I’ll have colleagues waving it in my face (just like they did the last time the survey was conducted) and wanting to talk about how “our” system did. Some key thoughts:
There were “far more” responses than previous surveys. However, I found the reasons for excluding some respondents pretty funny. They included:
- Not using an EHR
- Not naming the system they used
- Naming a practice management system rather than an EHR
- Naming a “home-grown proprietary system or… something that we could not verify as an EHR”
There were 2,719 usable responses covering 205 systems. Only 30 systems had 13 or more respondents. Those that had over 100 respondents included:
- EpicCare Ambulatory – 392
- NextGen Ambulatory – 247
- eClinicalWorks – 244
- Centricity EMR – 209
- Allscripts Enterprise – 180
- Practice Partner – 123
- e-MDs – 120
- Allscripts Professional – 106
There was a broad distribution of practice sizes.
Detailed information on version and implemented features was not presented. Nearly half of respondents “apparently did not know their product’s version number.” My spidey senses always tingle when small practice users have issues with their EHR. I’ve worked with docs who are using versions that are up to three years outdated and are surprised at how well the “current” version works once it’s applied.
The version paradox isn’t unique to small practices, though. For example, how many different flavors of Epic are there depending on how it was implemented? One of my buddies complained that it was ridiculous that Epic doesn’t have e-prescribing. Turns out her organization hadn’t included it in the initial physician training for some unfathomable reason.
Duration of use of the system ranged from “weeks” to “20 years,” with the majority being up to three years and another chunk in the three-to-10-years category. I think time on the system might be a useful exclusion criterion for future surveys. From experience, even with the best implementation, it takes some practices a minimum of six to eight weeks for users to settle in and for workflow to stabilize – longer, depending on the commitment of the users and their willingness (or resistance) to change.
Fourteen percent of respondents have switched systems at least once due to dissatisfaction with a previous EHR.
The authors recognize these limits, summarizing:
As we said to begin with, it’s probably best to consider the survey results as input you’d get from a large number of colleagues who volunteered informally to report on their EHR experience. That said, we believe that the results presented in this article and its online appendix can help any family medicine practice considering the purchase of an EHR system.
This is a really key point. The study was not randomized; rather, respondents self-selected. Bias could run toward either providers who have serious concerns about their system or those who are highly satisfied. Although the numbers were much better this time around, it’s not a true cross-section of users, and it doesn’t account for the variables that can truly make or break an end user’s experience: poor implementation, lack of commitment among providers and office staff, and failure to follow recommended best practices.
During the implementation of my first EHR, there was no “kickoff” to bring everyone in the practice onto the same page, nor was there a discussion of workflow changes or process redesign. The trainer showed up and started teaching the template builder without giving the users any context for her lessons. Coupled with the fact that she trained on a version different from the one we had installed, it was an unqualified disaster.
On the client side, some providers feel entitled to behave badly. I’ve had providers refuse to show up for training, refuse to complete practice scenarios, and refuse to be part of the customization process, yet complain relentlessly that the EHR doesn’t meet their needs. Those of us who have been in this a while know that deploying an EHR on top of a dysfunctional practice will only make it more dysfunctional. Partners who have historically felt disadvantaged in the practice often use implementation as a time to lash out against their peers.
Users often go against what the vendor recommends. Sometimes this is justified, such as when there are defects in the software or specialty-specific or regional issues that the vendor isn’t addressing. But sometimes it’s not. I’m currently watching the equivalent of an EHR car crash as one of my closest colleagues is being forced onto a system that isn’t configured optimally. She’s part of a larger group and is a younger physician with little political power to counter the decisions being made higher up. As a user of the same system, I’m keenly aware that the choices they have made will lead to more work being placed on the physicians, less efficient charting, and potential patient safety and regulatory issues.
I’ve armed her with enough knowledge to try to steer them in the right direction, but so far she hasn’t been successful. Eventually they’ll learn, but at the price of user bitterness and potentially patient safety. I recommend that new users take advantage of all the training and information they can get their hands on, whether formal – training programs, client conferences, user symposia, webinars, and the like – or informal – Internet chat groups, user get-togethers, hospital colleagues, or blogs.
Many systems offer the ability to customize on a per-physician basis. Providers who are not fully educated on the risks and benefits of doing so can quickly customize themselves into a corner and out of any decent workflow (not to mention out of the ability to reach Meaningful Use). I strongly recommend that users try the system as the vendor delivers it for at least a month before customizing (although if the system arrives with defects and bugs, customization is often needed to deploy it effectively).
I encourage practices to consider using EHR implementation as a chance to look at all office policies and procedures, whether written or anecdotal. Automating bad workflow just allows bad workflow to happen more quickly and on a greater scale. I encourage partners to think outside the box and consider whether it’s rational for each doc in the office to have his or her own process for handling phone messages and refills. Often there is a single, more efficient process that can be extended to the entire office with a little effort, ultimately resulting in greater satisfaction for end users.
A survey such as this one can’t account for all these factors, so my advice to users (and those still shopping for an EHR or looking to replace what they have) is to take it with a grain of salt and do your research. Talk to current users and not just those references served up by the vendor sales team. Talk to your colleagues. Spend as much time hands-on with the application as you can, and carefully consider your choices during the build and implementation process.
And for those users who are dissatisfied with their systems or feel their needs aren’t being met, don’t just fillet your vendor in the next survey. Take a proactive stance. Review your contract and implementation documents and make sure you’ve taken advantage of all the training you were allowed, and if you need more, buy it. It amazes me that physicians who wouldn’t start performing a new surgical procedure if they didn’t feel fully trained are happy to jump into an EHR with only a few minutes of training.
Log defects with your vendor and keep records of any defect and enhancement submissions. Understand your support contract and how your vendor is required to respond to issues. Take advantage of any account management or client management services that your vendor offers. Even if you’ve been on a system for years, don’t be afraid to consider retraining, especially if you have to upgrade your software to qualify for Meaningful Use. It’s a great opportunity for a refresher, and CMIO types like myself can always use the Big Bad Wolf of MU to sneak in additional workflow coaching during “mandatory” training.
AAFP has conducted this survey three times before. The first had 408 responses, the next 422, and the 2009 survey had 2,012 responses. It will be interesting to see what the results look like the next time it’s conducted and whether any conclusions can be drawn once Meaningful Use is in full swing.