
Readers Write 9/21/11

September 21, 2011

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

EMR Usability and the Struggle to Improve Physician Adoption
By Todd Johnson


Now that Meaningful Use money is up for grabs, almost every US hospital is somewhere down the pathway to deploying at least a Stage 1-certified EMR. Once installed, the tab on many of these systems can run as high as nine figures. For that kind of coin, every user in every department should see their daily workflow improve dramatically. Yet industry-wide physician adoption of hospital EMR systems continues to fall short of expectations.

For many users, the source of frustration is the clinical documentation system they’re asked to use. In theory, these tools are designed to make physicians’ lives easier. But too many documentation systems compromise the usability of the EMR for its most important users.

Not surprisingly, physicians resist changing the way they work to use tools that don’t solve their daily challenges. They stick with familiar workflows – even cumbersome tools such as pen and paper – to capture the details of patient encounters. And they leave it to the hospital to figure out how to extract the data they need.

The net result: physicians end up engaging with the EMR as minimally as possible. Without timely and comprehensive involvement from a significant percentage of physicians, an EMR system cannot help hospitals achieve their clinical, financial and operational improvement goals.

Determining the “usability” of an EMR is less subjective than it sounds. Here’s how usability is defined in the HIMSS Guide to EHR Usability:

  • Usability is the effectiveness, efficiency, and satisfaction with which specific users can achieve a specific set of tasks in a particular environment.
  • Efficiency is generally the speed with which users can complete their tasks. Which tasks and clinic processes must be most efficient for success? Can you establish targets for acceptable completion times of these tasks?
  • Effectiveness is the accuracy and completeness with which users can complete tasks. This includes how easy it is for the system to cause users to make errors. User errors can lead to inaccurate or incomplete patient records, can alter clinical decision-making, and can compromise patient safety.
  • User satisfaction is usually the first concept people think of in relation to “usability.” Satisfaction in the context of usability refers to the subjective satisfaction a user may have with a process or outcome.

Each of these components is measurable. Even user satisfaction, while highly subjective, can be measured through user queries. Yet even with an objective framework of EMR usability, physicians continue to suffer through documentation tools that often fail to meet any of these criteria.
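As a rough illustration of just how measurable these components are, here is a small sketch (in Python, with entirely made-up data and field names) that rolls task timings, error counts, and post-task ratings into the three HIMSS components:

```python
from statistics import mean, median

# Hypothetical task-level observations from a usability test of a
# documentation workflow; the tasks, numbers, and field names are
# illustrative only.
sessions = [
    {"task": "H&P note", "seconds": 310, "errors": 1, "satisfaction": 2},
    {"task": "H&P note", "seconds": 270, "errors": 0, "satisfaction": 3},
    {"task": "Med rec",  "seconds": 145, "errors": 0, "satisfaction": 4},
    {"task": "Med rec",  "seconds": 180, "errors": 2, "satisfaction": 2},
]

def usability_summary(obs, target_seconds):
    """Roll observations into the three HIMSS usability components."""
    return {
        # Efficiency: how long tasks take, against an agreed target.
        "median_seconds": median(o["seconds"] for o in obs),
        "within_target": sum(o["seconds"] <= target_seconds for o in obs) / len(obs),
        # Effectiveness: how often the task completes without user error.
        "error_free_rate": sum(o["errors"] == 0 for o in obs) / len(obs),
        # Satisfaction: mean of a 1-5 post-task rating.
        "mean_satisfaction": mean(o["satisfaction"] for o in obs),
    }

summary = usability_summary(sessions, target_seconds=240)
print(summary)
```

The point is not the arithmetic, which is trivial, but that each component reduces to a number a hospital can set a target for and track over time.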

Clinical documentation has become a victim of its own exploding popularity. Thanks to Meaningful Use and other technology-driven initiatives, the value of the data found in clinical notes has skyrocketed. Hospitals now have more incentive than ever to deploy systems that capture, aggregate and transfer data as efficiently as possible.

As the point of entry for a majority of patient information found in the EMR, electronic physician documentation has the added burden of converting notes into usable data. But too often, HIS solutions attempt to solve this problem by delivering electronic documentation that migrates all users to a single, inflexible workflow. Rather than accommodating multiple data entry methods and adapting to user preferences, these systems force physicians to navigate drop-down menus, check boxes, and other pre-defined selections to complete their documentation.

A one-size-fits-all approach to documentation is shortsighted for two reasons. First, “narrative” shouldn’t be a dirty word in the electronic documentation workflow. A comprehensive patient record is much easier to achieve through a blend of structured and unstructured data input. Certain types of notes, such as H&Ps, benefit from the physician’s ability to capture all details of the patient encounter in his or her own words. Elements with repetitive values, such as lab results and vitals, benefit from structured input – even better if these values automatically carry forward daily.

Second and more important, we can’t lose sight of the fact that we’re asking physicians to alter a very important – and very personal – part of their jobs by asking them to use new clinical documentation solutions. Workflow flexibility is crucial to achieving user satisfaction. Narrative-based capture methods such as dictation remain popular because they’re easy to use. Forcing users to modify their behavior and abandon familiar workflows – to “document to the system” – is a recipe for continued lackluster physician engagement with the EMR.

Ultimately, a truly user-friendly advanced electronic clinical documentation system should empower users to document however they’re comfortable without compromising speed, accuracy, data availability, and overall productivity. The specialized technology needed to make that possible is already in place.

Modern speech recognition and transcription systems can convert dictated narrative to structured data. Universal interoperability standards such as HL7 Clinical Document Architecture (CDA) enable that data to integrate seamlessly into the EMR, regardless of which best-of-breed physician documentation solution you’re using.
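To make the CDA point concrete, here is a minimal sketch of pulling structured values out of a CDA-style XML fragment with nothing but the Python standard library. The fragment is drastically simplified for illustration (a conformant CDA document carries far more required structure), but the `urn:hl7-org:v3` namespace and the code/value pattern follow the standard:

```python
import xml.etree.ElementTree as ET

# A drastically simplified, illustrative CDA-style fragment. A real
# CDA document has far more required structure than shown here.
cda_fragment = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <component><structuredBody><component><section>
    <entry><observation>
      <code code="8480-6" codeSystem="2.16.840.1.113883.6.1"
            displayName="Systolic blood pressure"/>
      <value value="128" unit="mm[Hg]"/>
    </observation></entry>
  </section></component></structuredBody></component>
</ClinicalDocument>"""

NS = {"cda": "urn:hl7-org:v3"}
root = ET.fromstring(cda_fragment)

# Walk every coded observation and pull out the structured data points.
results = []
for obs in root.iter("{urn:hl7-org:v3}observation"):
    code = obs.find("cda:code", NS)
    value = obs.find("cda:value", NS)
    results.append((code.get("displayName"), value.get("value"), value.get("unit")))

print(results)  # each entry: (name, value, unit)
```

Because the receiving system matches on the code system and code rather than on any vendor-specific layout, the same extraction works no matter which documentation product generated the note.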

The only way to know we’re achieving the right balance of structure and narrative is to let the end users guide the design of the finished product. By achieving high rates of physician adoption, hospital CIOs and other stakeholders can finally focus attention on other priorities.

Todd Johnson is president and co-founder of Salar of Baltimore, MD.

Is ONCHIT About to Chase the Clouds Away?
By Frank Poggio


My sincere apologies to Chuck Mangione. For our younger readers, Chuck is a great French horn jazz musician from the 70s. His signature song was Chase the Clouds Away. Now back to ONCHIT.

Cloud computing is the latest systems deployment panacea. In the recent past, it was referred to as SaaS (Software as a Service), and before that, remote hosting. The word ‘cloud’ clearly has a better visual impact. Cloud computing runs all your data and applications at a remote facility, giving the user many advantages such as built-in redundancy, reduced capital investment, effortless backups, better integration with many other Web services, and faster and simpler delivery of updates and fixes.

One of the core elements of the ONCHIT certification process and the Meaningful Use attestation requirements is that a provider must run certified software. The certification must tie back to a vendor’s specific version and build. Directives from two of the current ATCBs state:

CCHIT: If you modify or update your CCHIT Certified product in a manner that carries a significant risk of affecting compliance, you must follow this procedure. Before marketing the modified or updated product as CCHIT Certified, you must apply for re-testing of the product to verify continued compliance with all published criteria and Test Scripts.

Drummond: If changes are made to the Drummond Certified EHR product, you must submit to Drummond Group an attestation indicating the changes that were made, the reasons for those changes, and a statement from your development team as to whether these changes do or do not affect your previous certification and other such information and supporting documentation that would be necessary to properly assess the potential effects the new version would have on previously certified capabilities.

If you sell and install a certified full EHR or EHR module, you must at minimum notify the ATCB with each new version or build so that your previous certification is inherited by the new update or release, preferably before you send it out to your client base.

Turnkey system vendors (do they really fly above the Cloud?) typically send out two or three updates during the year, with perhaps one being a major release. If an emergency fix is needed for a specific client, they might send that out separately. Clearly the update notice to the ATCB should happen before you send the fix out, but in an emergency, if the impact is limited to one or a few clients, you could send the fix to them alone and notify and re-certify later.

The same would be true for any special enhancements. Say a new customer requires a specific enhancement as part of a new install contract. For the period your client is running the enhanced software, that version or build would not be deemed certified. This means they could not use your package to attest to MU. But it’s only one client, and if you are a best-of-breed or niche vendor, it may not matter to that client, since they might be able to cover the MU criteria with other vendor-certified products. A good example is the ONCHIT demographic criteria, which could be covered by several EHR modules.

Lastly and most importantly, the assumption is that your updates or fixes do not impact any certification criteria. At this time, how ‘no significant impact’ is defined and determined is left to our imagination, but starting next year it will be a question that must be tackled by the ONCHIT AA surveillance auditors.

Meanwhile, back in the Cloud, it gets a little more complicated. As noted before, one of the real advantages of the SaaS approach is that the user never has to load updates. They are handled centrally: one load, and all clients are running the new code. Back to our example where a new client contracts for a special enhancement or a fix is needed: you code it, load it, and go. Everybody has access to the new enhancement, and everybody is now running a non-certified system. Ouch!
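The difference between the two deployment models can be sketched in a few lines of toy Python; every class name and version string below is hypothetical:

```python
# Toy model of the deployment difference described above.
# All names and version strings are hypothetical.

class TurnkeyFleet:
    """Each client runs its own locally installed build."""
    def __init__(self, clients, build):
        self.installed = {c: build for c in clients}
    def patch(self, client, build):
        self.installed[client] = build  # only this one client changes
    def builds_in_use(self):
        return set(self.installed.values())

class SaasFleet:
    """All clients share one centrally hosted build."""
    def __init__(self, clients, build):
        self.clients, self.build = clients, build
    def patch(self, build):
        self.build = build  # one load -- every client now runs it
    def builds_in_use(self):
        return {self.build}

turnkey = TurnkeyFleet(["A", "B", "C"], "v2.0-certified")
turnkey.patch("A", "v2.0.1-uncertified")  # only client A leaves certified code
saas = SaasFleet(["A", "B", "C"], "v2.0-certified")
saas.patch("v2.0.1-uncertified")          # every client leaves certified code

print(turnkey.builds_in_use())  # two builds coexist
print(saas.builds_in_use())     # one build, now uncertified for everyone
```

The turnkey vendor can quarantine an uncertified build to the one client who needs it; the SaaS vendor, by design, cannot.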

The simple solution, of course, is to make your new customer wait for a full version release, or in the case of a fix, require a workaround until you get re-certified. Either way, ONCHIT has succeeded in turning the clock back to those Neanderthal days of legacy and turnkey system releases.

Cloud vendors who are ONCHIT certified will really need to rethink that load-and-go approach.

Frank L. Poggio is president of The Kelzon Group.

Interoperability? But of Course!
By Cheryl Whitaker, MD


An HIStalk reader, Rusty Weiss, recently wrote about interoperability (Is Healthcare Interoperability Possible With a Conflicted Federal Committee?, 9/14/11).

I am not writing to comment on the appointment of Epic’s Judy Faulkner to the Health Information Technology Policy Committee. I am writing to endorse the concept of interoperability. 

In his article, Weiss states, “Democrats, Republicans, and industry experts alike recognize the importance of interoperability.”

Amen. It’s logical that we move to a model in which health information systems talk with each other. I concur that by “tapping into ‘big data,’ there will be opportunity to learn more from existing information – and to make healthcare more effective and less expensive.”

Weiss also states, “By allowing patients to carry their health information across provider lines as easily as we want them to carry their health insurance across state lines, we will empower patients. In fact, one of the stated goals written into the Recovery Act was the development of ‘software that improves interoperability and connectivity among health information systems.’”

Weiss goes on to quote Otech president Herman Oosterwijk, who says, “The entire industry is 15 years behind in interoperability compared with PACS systems.”

PACS solutions were early in the landscape of healthcare’s adoption of electronic information exchange. However, let’s be clear. Diagnostic imaging is far from superior in the context of interoperability. Visit a doctor’s office and you’re likely to see a patient carrying his or her own images burned onto a CD. Ride in an ambulance with a trauma transfer and you’re likely to see a CD strapped to the patient or the stretcher.

When it comes to exchange of diagnostic images, the inefficiencies are horrific. The room for error is frightening.

Weiss quotes Andrew Needleman, president of Claricode Inc., who says, “Due to the amount and complexity of data being transmitted between systems, even systems that attempt to be interoperable run into issues when they send data to other systems. For healthcare data, even the demographic data to determine if you are talking about the same patient is complex.” 

Consider the realities of diagnostic imaging: 

  • Healthcare organizations generate nearly 600 million diagnostic imaging procedures annually.
  • Based on a study of data from 1995 to 2007, the number of visits in which a CT scan was performed increased six-fold, from 2.7 million to 16.2 million, representing an annual growth rate of 16%.
  • One CT scan exposes a patient to the same amount of radiation as 100 chest x-rays.
  • $100 billion of annual healthcare costs are related to diagnostic imaging tests – but an estimated 35% ($35 billion) represents unnecessary costs for US patients and insurance providers.

PACS solutions facilitate electronic image management. But these are proprietary, closed systems that do not allow providers to easily share information between departments and entities, and also across "ologies." Exchanging images outside of a "system" is difficult if the two facilities have different PACS vendors.

To solve this challenge, some entities have added solutions to morph imaging studies so they can be viewed on a receiving system. Until recently, this has required the implementation of specialized hardware and software and costs that were not sustainable.

We continue to see patients carrying their images around on CDs. Yet according to a January 2011 article in the Journal of the American College of Radiology, Johns Hopkins researchers found that approximately 60% of respondents said most images provided by patients on digital media were unreadable or not importable.

With today’s movement toward ACOs and medical homes, new approaches are needed. An enterprise imaging strategy must focus on providing access to any type of image, anywhere, any time, by anyone – provider, referring physician, radiologist, patient, etc. – across the continuum of care. This vision goes beyond PACS to make image sharing truly interoperable and accessible in real time on any device, without having to load and support additional software and without complicated and unnecessary movement of data. Image-enabling the EHR is also critical.

Three components are required for the move to a truly interoperable imaging environment: a standardized vendor-neutral archive (VNA), an intelligent Digital Imaging and Communications in Medicine (DICOM) gateway, and a universal viewer that can be accessed via an embedded link or a standalone portal and enables viewing of images on any browser-based electronic device.
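As a toy sketch of how those pieces fit together, consider an archive that indexes studies by their DICOM Study Instance UID and hands any caller a browser link, regardless of which vendor’s PACS produced the study. Every class name and URL pattern below is hypothetical, not a real product interface:

```python
# Toy sketch of the vendor-neutral archive + universal viewer idea.
# All names and the URL pattern are hypothetical.

class VendorNeutralArchive:
    """Indexes studies by DICOM Study Instance UID, regardless of source."""
    def __init__(self):
        self._studies = {}

    def ingest(self, study_uid, source_pacs, modality):
        # Studies from any vendor's PACS land in one shared index.
        self._studies[study_uid] = {"source": source_pacs, "modality": modality}

    def viewer_link(self, study_uid):
        if study_uid not in self._studies:
            raise KeyError(f"unknown study {study_uid}")
        # A browser-based universal viewer needs only the UID; the caller
        # never cares which vendor's PACS produced the study.
        return f"https://viewer.example.org/study/{study_uid}"

vna = VendorNeutralArchive()
vna.ingest("1.2.840.99.1", source_pacs="VendorA-PACS", modality="CT")
vna.ingest("1.2.840.99.2", source_pacs="VendorB-PACS", modality="MR")

print(vna.viewer_link("1.2.840.99.2"))
```

The design point is that the archive, not each departmental PACS, owns the index, so access no longer depends on the sending and receiving facilities sharing a vendor.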

This technology exists. An organization can readily start with just one of the components, then build toward a more robust enterprise solution. There is no wrong door for entry.

Today’s most progressive organizations are embracing enterprise imaging, saving time and money, reducing unnecessary radiation exposure, and improving quality of care.

Healthcare data is voluminous and complex. Regulatory demands seem daunting.  Other industries, however, have adapted to a multitude of “data pressures.” Banking, for example, has been successful with leveraging federated data models to enable cross-organizational transactions via ATMs. 

The time is now for healthcare to create exchanges that allow EMRs, HIEs, and PHRs to access content and results from any location without moving data. We should empower patients, providers, and payers to manage the total healthcare experience from computers, mobile devices, and new types of access points, including kiosks.

Cheryl Whitaker, MD is chief medical officer of Merge Healthcare of Chicago, IL.

Currently there are 11 comments on this article:

  1. Oh, Frank! You may have hit on a completely new strain–French horn jazz. I want to meet the iron-lipped son of a gun that pulls that off. But our friend Chuck plays (played?) flugelhorn.

  2. Frank, I think you are overstating the difficulty of meeting the ONCHIT requirements with SaaS solutions. ONCHIT requires that the certified version be specified. It doesn’t require that the version number be rolled for every build. The requirement to avoid changes that “carry a significant risk of affecting compliance” is another way of saying your application needs to be upward compatible. That’s not an unreasonable requirement to meet in any case.

    A SaaS solution which is incrementally modified in an upward compatible way, and which only rolls its version number on major functionality updates, should be able to meet the ONCHIT definition.

  3. Ken,
    In several certification approvals I have been involved with, the ATCB has asked for the version and build number. Also, as quoted in my piece, Drummond says you should file a request for certification inheritance for a ‘change’, with no definition of what constitutes a change.
    Only after you file the request will they tell you whether it is significant.

    This is one of the many, many unaddressed ‘nits’ in the ONC program that will drive vendors nuts and give CIOs heartburn.

  4. All three of these pieces are good. Re: interoperability and sharing images and reports, the need for an affordable, secure cloud system is clear. Here’s how UC San Diego is approaching it: http://ucsdnews.ucsd.edu/newsrel/health/04-12CloudComputing.asp.

    As Dr. Whitaker reports: “Johns Hopkins researchers found that approximately 60% of respondents said most images provided by patients on digital media were unreadable or not importable.”

    To better serve MDs, part of the solution also lies in cloud-based EHR, especially for those who practice at multiple sites. See radiologist Murray Reicher, MD’s article in current issue of Journal of the ACR. Link at http://bit.ly/qNS7Un.

  5. When I hear the usability argument, my question is: out of the 100’s of applications/ modules on the market, have none hit the mark? Would be good to know which ones have made it so we know what we are aiming for. If you believe none have made it, does the question become: no matter what the UI is, Physicians just don’t want to use computers? (I wouldn’t if I was a Doc). Todd and readers, can you name 1 that has made it? (doesn’t even have to be a vendor, just one function point within a vendor’s suite..) … I’m guessing the answer is yes – then we can start chasing best practices vs. some elusive “system must be fast, efficient, intuitive… that docs love” kind of language. Btw, I don’t work for a vendor… just tired of this “no system has been designed to meet the needs for a clinician” argument. “No system” involves a pretty big universe of computer code…

  6. Yes, the ONC certification has likely ended the days of showing a product demonstration which lacks functionality required by a prospect and returning three days later with that functionality built to close the deal.

    Cloud-based EHRs seem to hold the advantage in terms of releasing new functionality in a timely manner. Cloud EHRs can have a much shorter development and release cycle. Re-certifying every release is a drop in the bucket compared to supporting multiple versions in the marketplace. Not to mention the cost of upgrading all existing clients to the latest version.

    Could you imagine what would happen to the customers of all the big ‘software’ EHR vendors if they shipped new CDs to install every other month? I’d imagine support costs (and attrition) would skyrocket.

    So who has the advantage? A traditional software vendor who releases (and certifies) once every 12-18 months OR a cloud vendor who releases (and certifies) every couple months?

  7. Great articles… Let me share my point of view on certain topics:

    >>> Not surprisingly, physicians resist changing the way they work to use tools that don’t solve their daily challenges. They stick with familiar workflows – even cumbersome tools such as pen and paper – to capture the details of patient encounters. And they leave it to the hospital to figure out how to extract the data they need.

    Beauty is in the eyes of the beholder. I use both an EMR and pen & paper. Why? Because pen and paper is actually faster than anything the HIT world can offer at this time. It’s also more detailed than some boilerplate EMR trash patient visit note. When it comes to actual detail that can be used in a court of law, there is nothing like the written history and physical.

    Now this paragraph of yours brings up another point- it must be the EMR that has to change its workflow, not the doctor. Unless it’s both usable and user-friendly, then as you state: “physicians end up engaging with the EMR as minimally as possible.”

    >>> Now that Meaningful Use money is up for grabs, almost every US hospital is somewhere down the pathway to deploying at least a Stage 1-certified EMR.

    I don’t think so. The last data I read on US hospitals said only about 4% were actually ready for doing meaningful use. Recently, many have been hard at work, but the most optimistic figures peg about 30% doing CPOE, a core part of meaningful use. Since many will fail at some point in the complicated process, we can figure that half of those, or about 15%, will probably reach Stage 1 MU payment attestation.

    URL: http://www.fiercehealthit.com/story/hospitals-slow-adoption-cpoe-impedes-meaningful-use/2011-08-14

  8. And more apologies to Mr. Mangione:

    “His signature song was Chase the Clouds Away.”

    Uh, his signature song was “Feels So Good”, a huge hit that no doubt paid for his kids’ college education and got him on The Tonight Show. “Chase the Clouds Away” is the title track from an earlier album: a beautiful composition, a fantastic recording from a great album, and a favorite of his fans, but not the signature song if we take this term literally.

    I hate to be a music nerd, but there ya’ go.


  9. J Knight
    IMHO his signature song is CTCA. When I think of Chuck that tune is in my head, I am not a music nerd…heck I can’t tell the difference betw a fleugel or french horn, but may be the ‘signature’ is in the ears of the beholder!
