
Readers Write: Seize the Opportunity: Making Your Meaningful Use Meaningful

September 9, 2013

Seize the Opportunity: Making Your Meaningful Use Meaningful
By Linda Lockwood, RN, MBA


In recent weeks, countless stories have appeared in healthcare-industry publications detailing the complexities of Meaningful Use (MU) Stage 2 and the challenges ahead. While MU Stage 2 is no walk in the park, turning these challenges into an opportunity to establish the proper foundation at the outset goes a long way to setting up an organization for continuing success throughout the course of the EHR Incentive Program.

A strong MU program is also the basis for long-term quality and performance improvement that goes far beyond MU compliance. Viewed as strategically foundational, it can help health systems survive and thrive in today’s shift from volume- to value-based care delivery and reimbursement models.


Successfully meeting Meaningful Use requires more than just taking on another IT project, checking off boxes, and receiving incentive payments. Rather, a compelling case can be made for adopting a strategic and programmatic approach to enable ultimate success over progressive MU stages. It requires implementing a program with consideration of standardization, improved workflows, documentation at the point of care, interoperability, eCQMs as defined by multiple quality programs, and an auditable defense portfolio that provides evidence of the provider’s compliance and intent.

A full lifecycle looks beyond the initial incentive payments. It employs a comprehensive approach that closes the loop on every aspect of the program. It also establishes the culture and business plans that support improved patient care outcomes and efficiencies necessary to survive in the new, fee-for-value healthcare world.

Taking a programmatic approach to achieving meaningful use can provide foundational benefits in the long run. As we look back at the journey already traveled and ahead to MU Stage 2 and beyond, it is clear that the organizational approach to MU directly impacts future success. Organizations that chose the “easy way out” as a path to financial gain are now confronting Stage 2’s increased thresholds, its focus on sharing data and engaging patients, and its greater emphasis on eCQMs, and are realizing that they have significant work ahead.

Organizations that “seized the opportunity” at the outset and invested the time, money, and resources to set the proper foundation for value-based performance improvement are now in the lead with regard to successfully meeting the MU Stage 2 requirements.

If your MU approach was not robust enough, is all lost? Absolutely not. At the heart of every successful MU journey is an organization with a commitment from the top to view MU as a foundational strategy to improve quality and support the goals laid out by CMS. Much has been said about the transitions of care, patient engagement, and quality reporting issues, but what many don’t often talk about is how to position an organization for success. Some key points to consider include:

  • Identify and act upon lessons learned
  • Embrace a big vision; leverage the MU effort
  • Understand the scope and level of effort required; don’t underestimate Stage 2 challenges – thresholds, interoperability, and patient portals and engagement
  • Include all stakeholders; align with quality and performance improvement
  • Develop program management and governance
  • Focus on adoption and change management
  • Understand vendor approach; challenge and verify
  • Create an auditable defense portfolio and an audit plan
  • Budget for upgrades, software and services; understand how this will affect the timeline
  • Establish a comprehensive portal plan to include security, access, outreach, content, policies and procedures
  • Pay special attention to the Summary of Care – its complexities and its content, including physician documentation for care planning

Meaningful Use is truly a journey that must be embraced beyond the IT department. To be successful, organizations must employ proactive executive sponsorship that supports the long-term, value-based, performance-improvement vision. Realization of the vision depends on developing and delivering a well-structured program. Organizations that adopt this approach will be aligned for success; they will be the frontrunners in this new world of value-based payment and performance improvement.


Linda Lockwood, RN, MBA is partner of advisory services at Encore Health Resources of Houston, TX.

Readers Write: Be the One

September 4, 2013

Be the One
By Daniel Coate


Amidst all the paperless aspects of our world, last year I subscribed to the New York Times Sunday edition on paper. I really enjoy the old-school nature of waking up Sunday morning, walking down my driveway to pick up the paper, and spending a couple of hours with a cup of tea or coffee reading the in-depth analysis of the week’s news.

I was taken by an article in the December 8, 2012 edition of the paper entitled “Billion-Dollar Flop: Air Force Stumbles on Software Plan.” I’ve had it on the corner of my desk since then and am just now getting around to writing about it.

The bottom line is that the Air Force is canceling a six-year modernization effort of its logistics systems and processes. On the technology front, they were attempting to convert from custom legacy logistics systems developed in the 1970s to an Oracle ERP system. The six-year run of the project cost “them” $1 billion (oh, and when I say “them,” I really mean “us”). By the time the Air Force canceled the project, it had realized it would cost an additional $1 billion just to achieve one-quarter of the capabilities originally planned. As a reminder, one billion is a big number – if you were to start counting right now at a rate of one number per second, you’d get to 1 billion in 2045 (32 years).

In analyzing the reasons for this colossal failure, many contributing factors were identified, such as starting with a big bang approach that tried to put every possible requirement into the program, making it very large and very complex.

However, the main reason identified was, “…a failure to meet a basic requirement for successful implementation: having ‘a single accountable leader’ who ‘has the authority and willingness to exercise the authority to enforce all necessary changes to the business required for successful fielding of the software.’”

As we all know, there are a number of exciting developments and converging forces changing the healthcare industry right now. With these converging forces, healthcare organizations are under tremendous pressure to address a number of priorities simultaneously:

  • Reduce operating costs while driving value
  • Implement and realize the full benefit of electronic health records
  • Transition from volume to value and plan for the accountable future
  • Harness the power of data and analytics to build a data-driven culture
  • Enable the connected community across the care continuum
  • Achieve Meaningful Use and complete ICD-10

While it seems like a tidal wave, these initiatives are aimed at paramount goals: better care, better health, and lower per capita costs. It’s essential that we as an industry heed lessons learned such as this example from the Air Force to avoid similar stumbles or flops. While it’s never a comfortable position to be that single accountable leader, I think it’s important that as we all do our day-to-day work, we look for opportunities either to assume that leadership ourselves or to recommend that a specific person assume that position. It is a key way to drive value from investments in information technology, operations and process improvement, and change leadership.


Daniel Coate is principal and co-founder of Aspen Advisors of Pittsburgh, PA.

Readers Write: Paper Bills Can Be Hazardous to Your Practice’s Health

September 4, 2013

Paper Bills Can Be Hazardous to Your Practice’s Health
By Tom Furr

Every time I go through a healthcare facility, I am struck by all the paradigm shifts, inflection points, and market disruptions glistening under the bright lights of examination rooms, labs, and other clinical areas.

It truly astounds me that there is such a yawning chasm separating the business office from the clinical side of the practice. It hits me all the more when I pause to consider that most of what’s going on in medical practice management revolves around how a doctor will get paid for services provided.

This is part of the fundamental change needed in the business office: a massive disruption to the way patients are billed and payments are secured, and, yes, an embrace of productivity- and profit-improving technology.

In fact, the MGMA states that today practices need to send out an average of 3.3 paper statements to secure payment. It’s not a great leap of logic to add bill issuance and bill pay to a practice’s online capabilities if it’s already “forced” to make patient clinical information available online. What’s more, the need to issue multiple paper statements, each costing around $0.70, to get paid is reduced, if not eliminated.
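
To put rough numbers on that point, here is a minimal back-of-the-envelope sketch in Python using the figures cited above (an average of 3.3 statements per payment at roughly $0.70 each). The monthly payment volume and the assumption that online billing needs only one notice per payment are hypothetical, chosen purely for illustration.

```python
# Back-of-the-envelope comparison of paper vs. online patient billing.
# Figures from the article: ~3.3 paper statements per payment (MGMA) at
# roughly $0.70 per statement. The practice volume and the one-notice
# assumption for online billing are hypothetical.
STATEMENTS_PER_PAYMENT_PAPER = 3.3
COST_PER_STATEMENT = 0.70
PAYMENTS_PER_MONTH = 1_000  # assumed practice volume

def annual_statement_cost(statements_per_payment: float) -> float:
    """Yearly statement cost given how many statements each payment requires."""
    return statements_per_payment * COST_PER_STATEMENT * PAYMENTS_PER_MONTH * 12

paper = annual_statement_cost(STATEMENTS_PER_PAYMENT_PAPER)
online = annual_statement_cost(1.0)  # assume a single electronic notice suffices

print(f"Paper statements:  ${paper:,.0f} per year")
print(f"Online billing:    ${online:,.0f} per year")
print(f"Potential savings: ${paper - online:,.0f} per year")
```

Under those assumptions, statement costs alone differ by roughly $19,000 a year, before counting the staff time spent stuffing envelopes and reconciling checks.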

So be honest — what’s the hurdle that is keeping you from making a change? Are there several cases of paper invoices sitting on a shelf that you feel compelled to use up for fear someone will call you a money waster?

If you truly want to cut costs and improve profitability, throw away those paper bills and all the time-consuming, error-producing manual processes associated with that antiquated and expensive approach.

To be fair, the tumult of change is daunting for medical practices, but it doesn’t need to be destructive. Embrace change and employ innovative online patient billing and balance management that can be easily embedded into practice management software.

One key pressure medical practices are feeling, and one that will make the change more palatable, is the rise of patient accounts receivable, a reflection of the inexorable march from the simplicity of co-pays to high-deductible health plans. One industry expert notes, “It wasn’t that long ago that health plans covered 87 percent of medical bills. Now they cover 65 percent.” According to Aon Hewitt’s 2013 Private Exchange Survey, the growth rate of high-deductible health plans (HDHPs) has averaged 10 percent per year, and as more employers promote the plans, the growth rate is accelerating.

If you still need motivation, let me share with you some research findings on consumer behavior when it comes to paying bills.

  • The people who stack up their bills once or twice a month and write checks are few and far between.
  • Folks who get bills in paper form tend to delay paying them compared with bills that arrive digitally.
  • Medical bills often go unpaid because they are complex and confusing, and the hassle of finding out what the charges are for and what’s owed translates into…delayed payment.
  • Even the US Postal Service, which depends on paper bills for the bulk of today’s first-class mail, found in a survey of people just like your patients that 60 percent of consumers prefer to pay bills online.

Take a break from reading about the latest diagnostic breakthrough in a medical journal. Look at your practice’s balance sheet, particularly the A/R line. Before market forces push you to sell or close your practice, embrace change in patient billing and balance management. Move away from paper and toward better, more manageable profitability with online billing.


Tom Furr is founder and CEO of PatientPay of Durham, NC.

Readers Write: Natural Language Processing: Putting Big Data to Work to Drive Efficiencies and Improve Patient Outcomes

August 26, 2013

Natural Language Processing: Putting Big Data to Work to Drive Efficiencies and Improve Patient Outcomes
By Dan Riskin, MD


Natural language processing (NLP) is increasingly discussed in healthcare, but often in reference to adjacent technologies such as speech recognition, computer-assisted coding (CAC), and analytics. NLP is an enabling technology that allows computers to derive meaning from human, or natural, language input.

For example, a physician’s note may state that a patient “has poorly controlled diabetes complicated by peripheral neuropathy.” When notes are analyzed through an NLP system, coded features are returned (see the sketch after this list) that can:

  • Suggest codes such as ICD-9 or ICD-10 that may feed a CAC billing application;
  • Classify a patient according to applicable quality measures such as poorly controlled diabetes mellitus, to support a reporting tool;
  • Populate a data warehouse;
  • Feed analytics applications to support descriptive or predictive modeling, such as the likelihood of a patient being readmitted to a hospital within 30 days of discharge.
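
A minimal, rule-based sketch of those coded features is below. Real NLP engines rely on far richer linguistic and statistical models; the phrase patterns, codes, and quality flag here are simplified illustrations rather than any vendor’s actual output.

```python
import re

# Toy "NLP" pass over a clinical note: map recognizable phrases to coded
# features that a CAC, quality-reporting, or analytics tool could consume.
# Patterns and codes are illustrative only and far from exhaustive.
RULES = [
    (r"poorly controlled diabetes", {"icd9": "250.02", "icd10": "E11.65",
                                     "quality_flag": "poorly_controlled_dm"}),
    (r"peripheral neuropathy",      {"icd9": "357.2", "icd10": "E11.42",
                                     "quality_flag": None}),
]

def extract_coded_features(note_text):
    """Return the coded features found in free text."""
    features = []
    for pattern, codes in RULES:
        if re.search(pattern, note_text, flags=re.IGNORECASE):
            features.append({"evidence": pattern, **codes})
    return features

note = "Patient has poorly controlled diabetes complicated by peripheral neuropathy."
for feature in extract_coded_features(note):
    print(feature)  # feeds coding, quality reporting, a warehouse, or analytics
```
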

Healthcare is data intensive from both clinical and business perspectives. While the industry’s adoption of electronic data collection and storage has increased significantly in recent years, this has not actually forced physicians to code the majority of meaningful content. Eighty percent of meaningful clinical data remains within unstructured text, as it does in most industries. This means that it remains in a format that cannot be easily searched or accessed electronically.

NLP can be leveraged to drive improvements in financial, clinical, and operational aspects of healthcare workflow:

For financial processes, automating data extraction for claims, financial auditing, and revenue cycle analytics can impact the top line. NLP can automatically extract underlying data, making claims more efficient and offering the potential for revenue analytics.

For clinical processes, automatically extracting key quality measures can support downstream systems for reporting and analytics. NLP can infer whether a patient meets a quality measure rather than requiring individuals to manually document each measure for each patient.

For operational processes, descriptive and predictive modeling can support more effective and efficient operations. NLP can extract hundreds of data elements per patient rather than the 2-4 codes listed in claims, producing better models and supporting business insight and the direction of resources to high-risk patients.

So, NLP is a powerful enabling technology, but it is not an end-user application. It is not speech recognition or revenue cycle management or analytics. It can, however, enable all of these.

There is a battle underway that is increasingly recognized in the healthcare space. Individual hospital divisions seek turnkey solutions and frequently purchase NLP-enabled products. But at a broader level, health systems as a whole do not want to pay repeatedly for similar technology. They seek best-of-breed infrastructure, wanting a combination of electronic health records, data warehouses, NLP, and analytics.

This battle will increasingly highlight best-of-breed data warehouses, data integration vendors, and natural language processing technologies as health systems search for a scalable, affordable, and flexible healthcare infrastructure to feed a suite of clinical, operational, and financial applications.

Dan Riskin, MD is CEO of Health Fidelity of Palo Alto, CA.

Readers Write: Bridging the Divide: Can Clinicians and CFOs Speak the Same Language?

August 26, 2013

Bridging the Divide: Can Clinicians and CFOs Speak the Same Language?
By Nick van Terheyden, MBBS

Pity poor Henry VIII. Historians still argue over his medical records. Though his was the most scrupulously documented medical history of his age, burning questions remain. Did he suffer from syphilis, as believed for centuries? More likely he had familial diabetes, which better explains his symptoms – including his well-documented inability to heal from wounds.

Imagine if Henry’s physicians were also tasked with assigning codes and complying with the clinical documentation requirements of today. The Tudor dynasty might have had some reimbursement issues. Heads would have rolled.

Sure, bloodletting is no longer an accepted therapeutic modality. But have we really come that far in bridging the divide between the clinician’s responsibility for care and the CFO’s responsibility for financial performance? Or do finance and quality continue to be involved in a forced marriage of sorts?

Clinicians are focused on their patients. While they understand the importance of billing, they need to put their energies into diagnosing and treating patients to ensure positive outcomes. And they’re overwhelmed with data – patient test results, clinical studies, guidelines, protocols – much of which they have to sift through to find relevant, critical information. Add to that the burden of learning the new coding requirements under ICD-10, with the deadline just around the corner.

CFOs, of course, are also focused on quality but, at the same time, must juggle that priority with issues related to reimbursement, their bottom line, and ever-changing and expanding compliance requirements. They’re continually seeking out and analyzing solutions that may be able to improve both patient health and revenue performance. At the same time, they also recognize that without physician buy-in, they cannot meet any of these goals; therefore, they are looking for meaningful ways to bring physicians along without disrupting their workflows.

Information that’s deemed crucial for the clinician may not be deemed useful by the CFO, and vice-versa.

Yet finding ways to break through this language barrier between the clinical and financial perspectives will be a critical success factor for healthcare organizations in the years ahead. It’s more than just a communications issue. It’s a strategic imperative aimed at translating the narrative of care into an actionable piece of information that aids in care coordination, while also ensuring appropriate reimbursement and minimizing the potential revenue leakages that keep most hospital CFOs up at night.

Clinical documentation is at the heart of plugging these revenue leakages while also meeting quality standards. Rather than hunting for one-stop solutions to prevent leakage across the revenue cycle, it is much easier to build accuracy in from the start than to fix the problem after the train has left the station and the process is in motion.

Regardless of the tools used, clinical documentation addresses the most important concern for both physicians and CFOs: ensuring that the most useful information is captured accurately and is made readily accessible to the decision makers (and systems) who need it. At the end of the day, we all know that quality leads to a win for all.

Nick van Terheyden, MBBS is CMIO of Nuance.

Readers Write: Fund Healthcare Modernization and Innovation – Retire Legacy Applications

August 21, 2013

Fund Healthcare Modernization and Innovation – Retire Legacy Applications
By Julie Lockner


The American Recovery and Reinvestment Act of 2009 (ARRA) provided the healthcare industry incentives for the adoption and modernization of point-of-care computing solutions including electronic medical and health records (EMRs/EHRs). Now that these funds have been allocated and invested in new information systems, hospital and patient care provider CFOs are checking in on the return on those investments. Many are coming up short.


These mega EMR/EHR applications are taking longer than planned to implement, leaving a number of legacy applications running in parallel. In addition to hardware and software maintenance costs for both environments, costly resources with the skills to maintain aging technology platforms drain IT budgets – funds needed to support new systems.

This challenge is not unique to the healthcare industry. A recent survey [1] of companies with over 50 IT staff shows that on average, 70 percent of the IT budget is spent on existing systems. If half of the applications are redundant, this represents a major opportunity for cost savings or reinvestment.

Why are so many legacy systems still running? An industry research report [2] indicates the #1 reason is that users still want to access data. As creatures of habit, many hospital staff continue to use familiar systems to look up patient information and records out of convenience. Unfortunately, this comes at a cost.

Another reason is that entire legacy data sets are not always migrated to new systems. Data with assigned records retention schedules require collaboration between stakeholders and compliance teams. Without a programmatic approach, application retirement projects can be significantly hampered.

Many providers have overcome these hurdles and successfully implemented an application retirement strategy while migrating to a new system, saving millions.

For example, the nation’s largest children’s hospital expects to save $1.8 million annually from retiring legacy applications. Its IT modernization program replaced applications running on aging platforms such as HP Turbo Image, SQL Server, Oracle, and MUMPS with an Epic implementation. Patient clinical data needed to be retained for compliance reasons, so the hospital deployed an application retirement strategy that allowed it to keep that data and eliminate dependencies on legacy systems and applications. Hospital staff was given convenient online access to data in a secure archive, and compliance teams could track retention.

Keys to success, they say, include a platform with the following capabilities (a minimal sketch of the purge workflow follows the list):

  • Archive support for a variety of data types, systems and platforms
  • Automated validation to confirm data had been completely and correctly archived
  • Ability to assign retention policies during or after the archive process
  • Automated data purge workflows when retention periods expire, with legal hold support
  • Masking of sensitive data in case a clinical trial is reopened
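
As a rough illustration of the purge and legal-hold capability in the list above, here is a minimal sketch. The record layout, seven-year retention period, and dates are hypothetical; a real implementation would live inside the archive platform itself.

```python
from datetime import date, timedelta

# Minimal purge pass over an archive: records past their retention period are
# purged unless they carry a legal hold. Layout and retention are hypothetical.
RETENTION = timedelta(days=365 * 7)  # assumed seven-year retention schedule

archive = [
    {"record_id": "A-100", "archived_on": date(2004, 3, 1), "legal_hold": False},
    {"record_id": "A-101", "archived_on": date(2004, 3, 1), "legal_hold": True},
    {"record_id": "A-102", "archived_on": date(2012, 6, 15), "legal_hold": False},
]

def purge_expired(records, today):
    """Split records into (purged, retained) based on retention and legal holds."""
    purged, retained = [], []
    for rec in records:
        expired = today - rec["archived_on"] >= RETENTION
        (purged if expired and not rec["legal_hold"] else retained).append(rec)
    return purged, retained

purged, retained = purge_expired(archive, today=date(2013, 8, 21))
print("Purged:  ", [r["record_id"] for r in purged])    # A-100
print("Retained:", [r["record_id"] for r in retained])  # A-101 (hold), A-102 (not expired)
```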

[1] NCC survey of companies with more than 50 IT staff

[2] Enterprise Strategy Group Research Report, Application Retirement Trends, October 2011


Julie Lockner is vice president of product marketing for Informatica.

Readers Write: Good Product Design is Preventive Medicine for your Software

August 21, 2013

Good Product Design is Preventive Medicine for your Software
By Ryan Secan, MD, MPH

As a practicing hospitalist physician, I see many patients with untreatable or difficult-to-treat disease that could have been prevented with care before their illness took root. From the lifelong smoker with emphysema who might have quit smoking to the patient with end-stage colon cancer who should have had a screening colonoscopy, dealing with the issue before it started would have potentially prevented their problem.

As a practicing informaticist, I also see parallels between the preventive situations described above and common issues that I’ve faced in healthcare IT. When it comes to healthcare IT, it seems that, like patients, too many companies are ignoring preventive care for their products.

As an employed physician, I have limited to no choice regarding what software I use for clinical care. Even as an informaticist, I have inherited my share of decisions regarding software that took place before I had a chance to offer input. Often, the software that my colleagues and I are forced to use appears to have been designed without ever considering the workflows of the clinicians who would use it. It just doesn’t seem possible that any physician involved in product development would allow something this difficult for a clinician to use to be rolled out.

One simple example involves ordering medications in an unnamed product. After typing in a medication name, clicking on it to select it, clicking on a prepopulated order string, and clicking OK (already too many clicks), the pop-up window cycle starts. A click to confirm that I understand that the medication requires dosing based on kidney function, a click to confirm that I know that the kidney function is X (or unable to be calculated), and another click to confirm that the appropriate dose for this level of kidney function is Y (it remains unclear why all of these notations couldn’t be in one window).

Worst of all, if the correct dose is different from what I’ve ordered, it doesn’t offer to change the order or allow me to cancel my already-entered order. I need to cancel the old order and order the medication again, remembering the correct dose, and once again go through the multiple windows telling me that the medication must be dosed for kidney function, and so on. This is a completely absurd process and a missed opportunity.
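
To make the alternative concrete, here is a minimal sketch of what a consolidated renal dose check could look like: one result that bundles the kidney-function notices and, when the ordered dose is off, offers a corrected order the clinician can accept instead of re-entering everything. The function, thresholds, and doses are hypothetical and not drawn from any actual product.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical consolidated renal dose check: one result instead of a chain of
# pop-ups, with an actionable suggested correction. Thresholds and doses are
# invented for illustration and are not clinical guidance.
@dataclass
class DoseCheckResult:
    messages: List[str]
    suggested_dose_mg: Optional[int]  # None means the ordered dose is acceptable

def check_renal_dose(ordered_dose_mg: int,
                     creatinine_clearance: Optional[float]) -> DoseCheckResult:
    messages = ["This medication requires dosing based on kidney function."]
    if creatinine_clearance is None:
        messages.append("Kidney function could not be calculated; use clinical judgment.")
        return DoseCheckResult(messages, suggested_dose_mg=None)
    messages.append(f"Estimated creatinine clearance: {creatinine_clearance:.0f} mL/min.")
    appropriate = 250 if creatinine_clearance < 30 else 500  # illustrative values only
    if ordered_dose_mg != appropriate:
        messages.append(f"Suggested dose for this kidney function: {appropriate} mg.")
        return DoseCheckResult(messages, suggested_dose_mg=appropriate)
    return DoseCheckResult(messages, suggested_dose_mg=None)

result = check_renal_dose(ordered_dose_mg=500, creatinine_clearance=25)
print("\n".join(result.messages))
if result.suggested_dose_mg is not None:
    print(f"[Accept corrected order: {result.suggested_dose_mg} mg]  [Keep as ordered]  [Cancel]")
```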

Instead of having unhappy customers and trying to repair the damage after the fact, HIT companies need to invest the time and effort in product design. Seek advice from experts (you know, the people who will be using your product) and make sure your product fits into your clients’ workflow. Build customization into your product so it can be adapted to the particular workflows of your individual clients, and make the customization more than just cosmetic. When you get feedback from one client regarding problems, make changes for all of your clients so your entire user base can benefit from each other’s experience.

This preventive medicine will prevent difficult (or impossible) to fix problems down the road.

Ryan Secan, MD, MPH is chief medical officer of MedAptus.

Readers Write: Connecting the Industry: Behind the Scenes of the New Healthcare Triangle

August 14, 2013

Connecting the Industry: Behind the Scenes of the New Healthcare Triangle
By Gary Palgon


The days of patients accepting the decisions and information provided by their physicians – no questions asked – are in the past. Today, patients are in the driver’s seat, participating in discussions about their care and making crucial decisions based on information from all members of their care teams. Communication has moved from a linear exchange between a single physician and a patient to a multi-dimensional conversation that includes a variety of healthcare providers and patients.

Patients have always trusted physicians to provide the best care for them as individuals. Yet few realize until they become ill that historically there has been little exchange of information between all points of care. A patient who first visits an urgent care center, for example, then sees a primary care physician who refers him or her to a surgeon expects their health information to follow from place to place.

Fortunately, technology now supports the exchange of health information among disparate sources so that a patient’s longitudinal record is accessible. This not only prevents the need for the patient to repeat information at each point of care, it also ensures the accuracy of the information on which all of the providers base their treatment recommendations.

Better exchange of health information among providers and patients has led to a more collaborative approach to developing the best possible treatment plans. While providers and patients make up two important points in the communications continuum, there is a third important point in the healthcare communications triangle that must not be overlooked – pharmaceutical organizations.

Access to the most current medications and treatment protocols requires an exchange of information among patients, physicians, and pharmaceutical organizations. Searches for clinical trial information can be initiated by patients or physicians, as each seeks the best care for a specific condition.

Nevertheless, patients expect physicians to have greater access to clinical trial information, as well as the ability to qualify them for participation in trials. This increased awareness of the benefits of clinical trials brings the healthcare industry face to face with a longstanding challenge: how to give pharmaceutical companies access to patient information for research while still respecting patient privacy where desired.

Concern about patients’ privacy and the security of their data has for many years limited healthcare providers’ willingness to share information, resulting in limited access to it for researchers. Yet de-identified patient information can be tremendously beneficial, helping shorten research timeframes and speeding time-to-market for new treatments. Today, many patients not only understand the importance of their medical data to research, they also want to share it with the general population as a way to help improve care for others suffering from similar conditions. It is, after all, their information to be used as they wish, which includes allowing such data to inform the future treatment of disease.
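
As a simple sketch of what sharing de-identified information involves, the example below strips direct identifiers and coarsens a couple of quasi-identifiers before a record is released for research. The field names are hypothetical, and real de-identification must satisfy HIPAA’s Safe Harbor or expert-determination standards, which cover far more than this.

```python
# Minimal de-identification sketch: drop direct identifiers and coarsen
# quasi-identifiers before sharing a record for research. Field names are
# hypothetical; production de-identification follows HIPAA Safe Harbor or
# expert determination and handles many more identifier types than shown.
DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "mrn", "ssn"}

def deidentify(record):
    shared = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "birth_date" in shared:            # keep only the year of birth
        shared["birth_year"] = shared.pop("birth_date")[:4]
    if "zip_code" in shared:              # keep only the three-digit ZIP prefix
        shared["zip3"] = shared.pop("zip_code")[:3]
    return shared

patient = {
    "name": "Jane Doe", "mrn": "123456", "birth_date": "1971-05-02",
    "zip_code": "15213", "diagnosis": "E11.65", "a1c": 9.2,
}
print(deidentify(patient))  # {'diagnosis': 'E11.65', 'a1c': 9.2, 'birth_year': '1971', 'zip3': '152'}
```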

The technology to connect all points of the healthcare continuum is available today. Although not perfect – industry standards are still evolving and interoperability challenges do exist – successful connections being made every day prove that the challenges can be overcome.

Take lab and diagnostic results, for instance. With lab results comprising roughly 70 percent of an average patient record, the ability to see and share lab data electronically greatly improves communication across the continuum of care. Physicians rarely have a laboratory in their own offices, so they rely on technology to share data outside their four walls. The connections among physicians and multiple laboratories or diagnostic services provide a real-life example of the benefits and possibilities of secure data exchange.

A truly connected healthcare industry can be accomplished if the strategy to achieve it is simple, cost-effective and beneficial to everyone. Technology that incorporates use of the cloud to integrate disparate systems, aggregate and harmonize data, and provide access in usable formats addresses these requirements. Not only does this strategy overcome the financial and staffing challenges of industry integration, but it provides enhanced security for health information.

Even though patients are willing to share their health information for research purposes, protocols that authenticate user identities and provide secure access for specific research needs still must be put into place to protect it. Leveraging cloud-based solutions is one way to provide those safeguards while enabling access to data for multiple researchers working on different studies at the same time.

Easy access to information is part of day-to-day life in the 21st century. In years past, you might have to ask the advice of the local hardware store owner or look through a book if you wanted to fix a leaky sink. Today, a simple Internet search provides hundreds of suggestions and videos showing exactly what to do.

The same concept – easy access to data in a usable format for everyone – applies to the exchange of health information among patients, providers, and pharmaceutical organizations. The benefits include improved clinical trials, appropriate research, faster time to market for beneficial new drugs, and ultimately enhanced population health management.

Just as access to information improves our daily lives, so too does it for patients, researchers, and providers. It is a win-win-win situation for those who rely on shared information to develop new treatments, assess care options, and make the best care choices possible.

Gary Palgon is vice president of healthcare solutions for Liaison Healthcare Informatics.

Readers Write: A Meaningful View of Meaningful Use

August 5, 2013

A Meaningful View of Meaningful Use
By Helen Figge, PharmD, MBA, CPHIMS, FHIMSS

Meaningful Use has meaning to us all. While we struggle to decide timelines for milestones and to determine how success is measured, we all experience Meaningful Use in our daily lives.

First and foremost, we are all consumers of healthcare living in a society that wants immediate gratification. As consumers, we are being granted instant healthcare gratification through the lens of Meaningful Use. We receive visit summaries, electronic copies of our medical records, and a detailed report of our current medications. Our providers have access to information such as our laboratory reports, X-ray reports, and notes from our specialists. We are encouraged to engage in our own care by having access to our data through patient portals.

We can ask our clinicians new questions based on previous test results, which is great for the patient, but perhaps less than ideal for the clinician (e.g., a TSH ranges from 0.3 to 5.0 – so what is normal for me or for you? Does a low value mean something versus a higher one?). We assume all are equally computer savvy, which in turn creates a potential digital divide. Some more tech-savvy patients “get it” with little prodding, while others find this new Meaningful Use approach cumbersome, potentially yielding more work for the clinician.

To counteract the potential problem of patient computer illiteracy, should we perhaps offer patients a computer literacy course so they can take advantage of the opportunities presented to them by Meaningful Use?

There seems to be a learning curve for us as healthcare consumers: not only learning the technologies given to us for data access, but also comprehending the new rules of healthcare engagement. Given that we want it and want it now, Meaningful Use is the lightning bolt needed to energize the healthcare delivery system. Most noteworthy of all, Meaningful Use is invisible to the healthcare consumer; it translates to a meaningful interaction with our healthcare provider and the highest quality of care delivered to us – coordinated, seamless, accessible, real-time, and complete.

Next comes the clinician, whose perspective is somewhat more sterile. The nirvana is patient record transparency and best practices that yield a more informed patient, with real-time data that is organized and supportive of workflow. In reality, though, the technologies do not always support clinician workflow, hence the angst some clinicians feel today with the execution of Meaningful Use. Additionally, clinicians have the extra burden of exercising patience with patients who might overuse or underuse these new approaches to data access.

But if patience is exercised, Meaningful Use will work to transform healthcare the way we all want it to be. It just might take a little more time for some to realize the benefits, lending fuel to the current discussions of some “catching up” with others in the various stages of Meaningful Use. And to compound our “want it and want it now” mentality, don’t forget the Direct Project, which, if exercised correctly, could improve communication across many layers of clinical thought. The problem with that project, however, is that only a select few enjoy its rewards; many haven’t yet caught up to the pack, so this vehicle cannot yet work to optimal effect.

Now enter the poor vendor who finds Meaningful Use an opportunity, but also a challenge. The challenge comes not only from the institution that purchased the technology, but the various stakeholders that institution represents. Vendors who can’t keep pace with these demands will now become easily identified, and these vendors in turn will now more than ever experience negative selection because stakeholders will opt for software that supports healthcare delivery.

Vendors also need to contend with clinicians who have the extra burden of now hearing from patients that the technologies are not user-friendly, adding fuel to patient dissatisfaction. This is a double whammy of frustration. Complaints fielded by clinicians are in turn angsts for the CIOs, who then turn their aggressions on the vendor for immediate response and relief.

Rome wasn’t built in a day and neither were software platforms, yet our need for instant gratification overrides the ability to work through issues that otherwise, without emotion, would be handled quite effectively. Darwin’s theory of evolution plays well here: only the best-adapted vendors will survive. The meaning of Meaningful Use to a vendor is twofold: to deliver high-quality technology that meets government criteria and that all stakeholders find acceptable, functional, and timely.

Finally, the last group that should or could benefit from Meaningful Use, if it is implemented, accepted, and seamlessly delivered, is insurers (third-party payers) that have been battling healthcare cost containment for quite some time. If insurers really wanted to make a difference in healthcare costs, they would reward preventive care more generously, universally support processes such as the Patient-Centered Medical Home, and invest in the health of our bodies in real time, not years later when we are ravaged by illnesses due to poor lifestyle, a poor gene pool, or a combination of the two. In the end, if Meaningful Use is supported by these groups, insurers will benefit from lower healthcare consumption, more efficiency, and better outcomes.

Meaningful Use has meaning to us all and is worthy of support. It just needs to be appreciated, agile enough to survive our society’s need for immediate gratification, and resilient enough to let some play catch-up.

Helen Figge, PharmD, MBA, CPHIMS, FHIMSS  is advisor, clinical operations and strategies, for VRAI Transformation.

Readers Write: Building an Accountable Care Organization? Consider Starting in Your Own Back Yard

August 5, 2013

Building an Accountable Care Organization? Consider Starting in Your Own Back Yard
By Claudia Blackburn


Explaining my healthcare IT profession to my parents and children has never been straightforward. Yet sometimes they are the ones who can boil it down to the essence of what we do, perhaps even better than we can. Before I became a consultant, my mom once told a family friend that I “paid people to be healthy so that the hospital I worked for didn’t have to pay as much for health insurance.” The friend responded, “Where can I sign up?” They both clearly understood the value of population health management (PHM) programs.

With the CMS news released this month about those Pioneer Accountable Care Organizations (ACOs) that have demonstrated success and shared in the savings — and of those Pioneer ACOs that are not continuing the program — there’s healthy debate about the model and the key success factors.

For those organizations considering starting an ACO, consider test-driving the concept in your own back yard with your health plan member population.

The Opportunity: An Integrated Wellness Model

Several self-insured employers – both healthcare organizations and companies from other industries – have proven that an ROI is achievable through population health and wellness programs. A few shared their program experiences showing impressive return for their wellness dollars:

  • In 2011, Mercy Clinics, Inc. reported a four-to-one return on investment of wellness dollars spent. Mercy uses coaches within its practices to assist with coordination of care.
  • Franciscan Missionaries of Our Lady Health System decreased health plan expenses 13 percent, with a 21 percent decrease in medical claims alone in 2011. A four-to-one return over five years projected a savings of $37.3 million.
  • John Hancock’s Healthy Returns program increased savings per participant from $111 in 2009 to $261 in 2010, and preventive care increased 1 percent to 4 percent per year, with an overall 2.5-to-one ROI.

Just like any other employer, hospitals face increasing healthcare costs for their employee and member populations. However, hospitals can use their healthcare expertise to develop practice protocols that change habits and ultimately improve the health of their self-insured member populations and decrease employee benefit costs.

Strategic Elements of a Successful Population Health Management Program

Screening, prevention, and care management are all involved in population health improvement, but by far the most challenging piece is changing the habits of individuals. Smartphone applications and portals, in addition to payers and providers pushing information, have not engaged members.

To engage members for best outcomes with accountability and oversight, the health management program must be a combination of people, new processes, new technology, and much better use of the collective data. There are several essential elements of an integrated PHM model:

  • Claims data. Claims data define the healthcare services received across the continuum of care and the associated risk, in order to target program benefits and measure improvements in utilization and cost.
  • Health risk assessment (HRA). An HRA captures basic information to determine the consumer population’s health status and risk stratification, which is especially important for those with no claims (see the sketch after this list).
  • Electronic medical record (EMR) / biometric screening. It’s important not to allow the member to self-report weight, cholesterol, blood pressure, and glucose. Instead, a coach or nurse should measure these and other biometrics and chart them in the EMR. Patient data from a personal health record (PHR) can be useful and selectively imported into the EMR.
  • Aligned incentives. Incentives are important to move members towards participation and keep them active and accountable. Incentives such as reduced premiums, door prizes, or gift cards are helpful to encourage enrollment. Once enrolled, outcomes-based incentives can be used to keep the member working towards health goals.
  • Coaching. Successful PHM programs have coaches armed with full information from claims, HRA, and EMR to motivate members to change behaviors.
  • Consumer portal. The portal allows for better engagement between provider and consumer and monitoring of healthy habits, such as exercise.
  • Data warehouse / analytics. Armed with holistic information about the consumer, the organization can identify high-risk root causes, target them with strategic program initiatives, and measure them for success or rework as part of a feedback loop that assures data-driven increases in quality and decreases in cost.
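
The sketch below shows, in simplified form, how claims and HRA data from the elements above might be combined to stratify members by risk so that coaching can be targeted first at the highest-risk members. The weights and thresholds are invented for illustration; real PHM analytics rely on validated risk models run against the data warehouse.

```python
# Toy risk stratification combining claims utilization and HRA responses.
# Weights and thresholds are invented; real programs use validated
# risk-adjustment models against warehoused claims, HRA, and EMR data.
def risk_score(member):
    score = 0
    score += 2 * member["er_visits_12mo"]            # from claims
    score += 3 * len(member["chronic_conditions"])   # from claims / EMR problem list
    if member["hra_smoker"]:                         # from the HRA
        score += 4
    if member["hra_bmi"] >= 30:
        score += 2
    return score

def stratify(members):
    tiers = {"high": [], "moderate": [], "low": []}
    for m in members:
        s = risk_score(m)
        tier = "high" if s >= 8 else "moderate" if s >= 4 else "low"
        tiers[tier].append((m["member_id"], s))
    return tiers

members = [
    {"member_id": "M1", "er_visits_12mo": 2, "chronic_conditions": ["DM", "HTN"],
     "hra_smoker": True, "hra_bmi": 33},
    {"member_id": "M2", "er_visits_12mo": 0, "chronic_conditions": [],
     "hra_smoker": False, "hra_bmi": 24},
]
print(stratify(members))  # high-risk members are routed to coaching outreach first
```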

From the above list, clearly the “glue” for connecting the PHM program elements is a solid technology foundation. It provides a concise picture of population and individual holistic health. When combined with coaching, health systems are able to not only monitor but also influence change. Additionally, the closed-loop feedback mechanism enables measurement of the success of strategies at an enterprise level and a member level to allow for continuous improvement.

Just as my mom and her friend understood, the value of population health and wellness programs can be substantial. Keeping members accountable through incentives increases healthy behaviors and reduces the self-insured health insurance cost of the employer.

Hospitals can take a leadership position in the move toward the IHI’s Triple Aim, both as employers and as healthcare providers, via PHM programs for their own self-insured member populations. The individual wins, the employer wins, the hospital wins, and the community wins.


Claudia Blackburn is a consultant with Aspen Advisors of Pittsburgh, PA.

Readers Write: Think Beyond the Text: Understanding HIPAA and Its Revisions

July 31, 2013

Think Beyond the Text: Understanding HIPAA and Its Revisions
By Terry Edwards

Every day, an increasing number of physicians and other health care providers are exchanging clinical information through a wide range of modes, including smartphones, pagers, CPOE, e-mails, texts, and messaging features in an EMR. It’s no surprise that hospital and health system leaders are increasingly focused on securing protected health information in electronic form (ePHI)—a trend that has certainly created some confusion across the industry.

As PHI data breaches increase in frequency, hospital executives must strategize ways to eliminate security threats and remain HIPAA compliant, especially since HIPAA violations can be extremely expensive, leaving these already-strapped organizations in an even more stressful financial situation.

In order to prioritize tangibles such as patient safety, physician satisfaction and overall efficiency across processes and hospitals, health care leadership must consider ways to tackle this confusion and maximize the benefits enabled by modern technology and electronic communications.

PHI can take a variety of paths in today’s complex healthcare environment and expose a health system to risk. But time and time again I see health systems looking to implement stop-gap measures and point solutions that address part—and not all—of the problem.

While texts are commonly sent between two individuals via their mobile phones, the communication “universe” into which a text enters is actually much bigger. It also includes creating ePHI and sending messages—in text and voice modalities—from mobile carrier web sites, paging applications, call centers, answering services and hospital switchboards.

For example, a 400+ bed hospital generates more than 50,000 communication transactions to physicians each and every month. Many of these communications contain ePHI. And if they were transmitted through unsecured networks and stored in unencrypted formats, they would represent a meaningful potential security risk to both the hospital and its medical staff.

In order to identify all potential areas of vulnerability, health care leaders need to consider all mechanisms by which ePHI is transmitted and the security of those mechanisms and processes. No mode of communication can be viewed in isolation. By failing to address all transmitted ePHI, organizations become vulnerable to security breaches with adverse legal and financial consequences, as well as loss of patient trust and reputation in the marketplace.

In addition, contrary to what many health leaders have been led to believe, HIPAA provisions do not call out any specific modes of communication. Text messaging is permissible under HIPAA. The law simply stipulates that a covered entity (CE) must perform a formal risk assessment; develop and implement an effective risk management strategy based upon sound policies and procedures; and monitor its risk on an ongoing basis. These regulations apply to providers communicating PHI in any electronic form.

As a result, there is no such thing as a “HIPAA-compliant app.”

HIPAA provisions emphasize the risk management process rather than the technologies used to manage risk. For hospitals and health systems, the pathway to safeguarding electronic communication of PHI lies in the creation of an overall risk management strategy.

Ideally, leaders of the CE will form an information security committee – with representatives from IT, operations, the medical staff, and nursing, as well as legal counsel – to develop and execute the strategy. Leaders should also consider including an external security firm in the group. Once the committee is formed, the organization should take these four essential steps for protecting the security of ePHI:

  1. Organize and execute a formal risk analysis. A formal risk analysis should break down types of technology used for electronic communication as well as the transmission routes for all ePHI. To ensure HIPAA compliance, ePHI transmitted across all channels must be “minimally necessary,” which means it includes only the PHI needed for that clinical communication. This layer of complexity, which is common in clinical communication processes, underscores the need for a comprehensive security assessment and strategy appropriate for the organization, coupled with the resources necessary to implement that strategy.
  2. Establish an appropriate risk management strategy. The committee should develop a risk management strategy that’s specific to the needs and vulnerabilities of the organization and is designed to manage the risk of an information breach to a reasonable level. HIPAA does not specifically define “reasonable,” but in general, the risk management strategy should include policies and procedures that ensure the security of message data during transmission, routing, and storage (a minimal encryption sketch follows this list). The strategy should also include specific administrative, physical, and technical safeguards for ePHI.
  3. Roll out these policies and procedures and train staff. Implementing new policies and procedures is the biggest challenge for organizational leaders, especially as a substantial proportion of reported security breaches are due in part to insufficient training of staff. As a result, appropriate individuals should be assigned specific implementation tasks for which they are held accountable, while leaders and committee members must carefully monitor the success of implementation. All staff with access to PHI must be educated about the specific policies and procedures, which will help ensure they are upheld across the organization.
  4. Monitor risk on an ongoing basis. To ensure continued compliance with security standards, organizations must conduct ongoing monitoring of their information security risk. Leaders should receive regular trend reports from the information security committee based on their ongoing assessment of ePHI security at the organization. Those reports should support the ongoing assessment of security needs as technology and health care delivery change, and act as a catalyst for changes that may need to be made to the policies and procedures over time.
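
Encrypting message content in transit and at rest is one common technical safeguard of the kind referenced in step 2. The sketch below uses the open-source Python cryptography package’s Fernet recipe to encrypt a message body before it is stored or routed. It illustrates the principle only; it is not a complete or HIPAA-sufficient messaging implementation, and key management in particular is glossed over.

```python
# Simplified illustration of encrypting message content before storage or
# routing, using the `cryptography` package's Fernet recipe (symmetric,
# authenticated encryption). Key management, access controls, audit logging,
# and transport security are out of scope here but required in practice.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, issued and held by a key management system
cipher = Fernet(key)

message = b"Pt J.D., room 412: INR 4.8, please adjust warfarin and call back."
token = cipher.encrypt(message)  # this ciphertext, not the plaintext, is stored or queued

print("Stored ciphertext (truncated):", token[:40])
print("Decrypted by an authorized recipient:", cipher.decrypt(token).decode())
```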

In today’s increasingly complex healthcare environment, analyzing and implementing a broader policy around security across all forms of electronic communications—rather than focusing on a single mode of communication in isolation—is critical to any health system’s ability to avoid and mitigate the adverse consequences of a breach. By clarifying the confusion around electronic communications now, hospitals and health systems will be better prepared to minimize risk and maximize best-practice communication process in the future.

Terry Edwards is president and CEO of PerfectServe of Knoxville, TN.

Readers Write: Seven Strategies for Optimizing the EHR

July 31, 2013

Seven Strategies for Optimizing the EHR
By Marcy Stoots, MS, RN-BC


Healthcare organizations are making a mistake if they subscribe to the notion that once an EHR is successfully implemented, it no longer requires attention. Even the most carefully designed EHR will not work as intended in all situations, causing users to create workarounds that are counterproductive and inefficient. It’s important to develop and implement an ongoing strategy for fine-tuning the EHR so that users can input and access the data they need with fewer clicks and better outcomes, which will improve clinician satisfaction.

Besides improving usability and adoption, optimization will help with plans to achieve Meaningful Use Stage 2, which raises the bar significantly. Under the Stage 2 final rule, for example, hospitals must report on 16 of 29 clinical quality measures (CQMs) and Eligible Professionals must report on nine of 64 CQMs. Optimizing the EHR to properly capture this data and generate compliance reporting is crucial.

Finally, optimization is a key step to realizing the financial ROI of the EHR, in which a substantial investment has been made. In today’s landscape of cost containment and healthcare reform, an organization can ill afford to sacrifice financial ROI or be bogged down by inefficiencies.

Below are seven strategies for optimizing the EHR to increase efficiency, improve the ROI, drive adoption, and improve usability, with the ultimate goal of providing better outcomes.

1. Create a Governance Structure

Just as an organization needed a governance structure during planning and implementation of the EHR, it will need one for ongoing optimization. This will provide an avenue for making decisions and keeping the optimization plan moving forward. Problems will continue to arise, and solid governance will ensure that they are dealt with effectively. A process should be in place to manage variances when clinicians do not want to adhere to standardized documentation or workflows. When these crop up, the governance group will need to decide upon appropriate action.

2. Create a Solid Informatics Structure

Many healthcare organizations struggle with the size and organization of the informatics team. From an optimization standpoint, it’s important to get this right. There is no standard answer here; every organization is different. Detailed descriptions of job roles and responsibilities should be created and appropriate resources budgeted.

3. Assign Responsibility

An individual at the leadership level should be designated as the responsible party for optimization. This function should be incorporated into that person’s job description. This is typically an informatics director, but could also be a CMIO or IT director, depending on the organizational structure. Assigning this responsibility will help ensure that optimization is an ongoing process, since it requires continual evaluation and modification. Ideally, for larger health systems, there should also be an optimization team in place that could include clinical leadership, operational leadership, informatics analysts, and super users. For smaller health systems, the team would be much smaller, but informaticists should have optimization as a core job function.

4. Measure

The pain points of clinicians should be determined by interviewing stakeholders, examining service desk tickets, listening to input from IT and informatics staff, analyzing reports and metrics, and observing end-to-end workflows. Focus on the most important issues, with data collected at baseline and again after 30, 60, and 90 days. Measuring is an ongoing process. It should be used to monitor progress and gauge success.

5. Create Scorecards

Scorecards are a powerful tool for demonstrating what has been achieved. They display the collected data and communicate improvements to the team and stakeholders. Managing workarounds starts with accountability; scorecards let users know where they stand and create a healthy, competitive environment that encourages success. They can be used to compare units within a hospital or hospitals within a health system.
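
As a simple illustration of the measurement and scorecard ideas above, the sketch below tracks one hypothetical pain-point metric (average clicks for a common task) per unit at baseline and at 30, 60, and 90 days, then reports the change from baseline. The units and numbers are invented.

```python
# Hypothetical optimization scorecard: one pain-point metric per unit, captured
# at baseline and at 30/60/90 days, reported as change from baseline.
metrics = {
    # unit: [baseline, 30-day, 60-day, 90-day] average clicks for a common task
    "Med/Surg": [14, 12, 10, 9],
    "ICU":      [18, 17, 15, 12],
    "ED":       [11, 11, 10, 10],
}

def scorecard(data):
    print(f"{'Unit':<10}{'Baseline':>10}{'90-day':>10}{'Change':>10}")
    # Sort so the units with the biggest improvement appear first.
    for unit, series in sorted(data.items(), key=lambda kv: kv[1][-1] - kv[1][0]):
        baseline, latest = series[0], series[-1]
        change = 100 * (latest - baseline) / baseline
        print(f"{unit:<10}{baseline:>10}{latest:>10}{change:>9.0f}%")

scorecard(metrics)
```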

6. Provide a Quick Win

Clinicians can be easily frustrated by glitches in the EHR, so areas should be pinpointed that will quickly increase their satisfaction. These are issues that are important to them, yet easy to address, the low-hanging fruit that delivers the highest impact. Success breeds enthusiasm, setting the stage for better adoption.

7. Continue Refining

Optimization is never complete. It is an ongoing endeavor without an endpoint.

Workarounds are a reality. The organization should have an optimization plan to monitor and manage them, as well as clear ownership of that plan. With proper planning and a roadmap in place, addressing problems and overcoming challenges will go smoothly. The end result will be satisfied users and healthier patients (and lower costs).

Marcy Stoots, MS, RN-BC is a principal with CIC Advisory of Clearwater, FL.

Readers Write: How Many Licks to the Tootsie Pop Center Versus How Many Clicks to Relevant Clinical Data?

July 22, 2013

How Many Licks to the Tootsie Pop Center Versus How Many Clicks to Relevant Clinical Data?
By Helen Figge, PharmD, MBA, CPHIMS, FHIMSS

A group of engineering students once reported that it took an average of about 364 licks to get to the center of a Tootsie Pop. For some reason, this was a very important scientific query that needed an answer.

A current healthcare query many are pondering today is: how many clicks are needed to get to the relevant clinical data necessary to support patient care? Clinicians using technologies like EHRs and HIEs for data retrieval often face nearly as many steps as it takes to get to the center of a Tootsie Pop. The more clicks it takes to get to the required clinical data, the more time spent away from the patient, and thus eventual loss of productivity, suboptimal patient care, and potentially total clinician frustration. If you speak to clinicians on the front line, many will tell you that the technologies are more of a hindrance than a help.

In reality, one wants the right data at the right time and in a comprehensible format without undue effort to retrieve it. Clinicians are yearning for “smart software” that knows what data to fetch and how to properly present it. Data needs to be automatically populated inside the clinician note. Additionally, more than ever clinicians need technology that supports workflow and provides the correct data with minimal effort on the clinician’s part. 

Bottom line, we need “smart software” that knows what data to present while simultaneously having other data immediately available with one click. As an example, if the software recognizes that the patient has diabetes, then certain labs — such as hemoglobin A1C, lipids, and renal function — should be automatically displayed in the note.
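
As a rough sketch of the kind of rule being described (purely illustrative; the condition-to-lab mapping, function names, and data shapes are assumptions, not any vendor’s actual logic), condition-driven lab display might look something like this:

```python
# Hypothetical sketch of condition-driven lab display: if the patient's problem
# list contains diabetes, surface the labs a clinician would expect in the note.

RELEVANT_LABS = {
    "diabetes": ["hemoglobin_a1c", "lipid_panel", "creatinine", "egfr"],
    "heart_failure": ["bnp", "basic_metabolic_panel"],
}

def labs_for_note(problem_list: list[str], available_results: dict[str, str]) -> dict[str, str]:
    """Select only the lab results relevant to the patient's active problems."""
    wanted = {lab for problem in problem_list for lab in RELEVANT_LABS.get(problem, [])}
    return {lab: value for lab, value in available_results.items() if lab in wanted}

results = {"hemoglobin_a1c": "8.2%", "lipid_panel": "LDL 130 mg/dL", "tsh": "2.1 uIU/mL"}
print(labs_for_note(["diabetes"], results))
# -> {'hemoglobin_a1c': '8.2%', 'lipid_panel': 'LDL 130 mg/dL'}
```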

Ask clinicians and they will tell you that they are drowning in unnecessary data. Data needs to be presented in a way that is easily understood by clinicians. We know far more about these technologies today than we did a few years ago, so clinicians should expect nothing less from their vendors than data that is useful, timely, and delivered in real time.

Clinicians require from their technology enablers the ability to aggregate data from multiple sources and present it in a comprehensible format. For a diabetic patient, the software needs to aggregate kidney function results from any laboratory source and plot and trend the data appropriately. Clinicians need to be more vocal about their requirements for appropriate data and need to collaborate with their IT departments to get the desired outcomes from their technologies. Clinicians need to engage more than ever before to ensure the software chosen for their organizations delivers what is needed on the front lines.
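
As a sketch of the aggregation step described above (again hypothetical; the lab sources and values are invented for illustration), results from multiple laboratories can be merged into one chronological series ready for trending:

```python
# Hypothetical sketch: merge kidney-function results from multiple lab sources
# into a single chronological series suitable for trending in the chart.
from datetime import date

hospital_lab = [(date(2013, 1, 5), 1.1), (date(2013, 4, 2), 1.3)]    # serum creatinine, mg/dL
reference_lab = [(date(2013, 2, 20), 1.2), (date(2013, 6, 11), 1.4)]

def merged_trend(*sources):
    """Combine results from any number of sources and sort them by collection date."""
    return sorted(result for source in sources for result in source)

for when, value in merged_trend(hospital_lab, reference_lab):
    print(f"{when.isoformat()}: creatinine {value} mg/dL")
```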

Right now, clinicians need easily customizable data presentation formats, smart order templates, and true data aggregators, along with evidence-based algorithms, from their vendors. Clinicians must have tools that actually work for them, not against them, and truly support patient care. True “smart software” should support what the clinician needs, not force the clinician to adapt to inept software just to retrieve data for patient care. IT experts need to continue engaging clinicians in collaboration, because right now it’s all about the data and how it is presented.

Organized, structured data is the paramount piece of the current healthcare puzzle. We have the answer to how many licks it takes to get to the center of a Tootsie Pop. Now it’s time to answer how many clicks it takes to get to the data that truly supports patient care.

Helen Figge, PharmD, MBA, CPHIMS, FHIMSS is advisor, clinical operations and strategies, for VRAI Transformation.

Readers Write: The Sequester’s Impact on Healthcare: Dangerous Unintended Consequences

July 22, 2013 Readers Write 1 Comment

The Sequester’s Impact on Healthcare: Dangerous Unintended Consequences
By Rich Temple

It has been three months since the sequester hit the healthcare industry, and the effects are more profound than they might seem. What’s most troubling is that the budget cuts in many cases will wind up costing the government more money and will have a particularly negative impact on cancer patients and those living in rural areas.

Cost of Caring for Unemployed

Across the healthcare spectrum, providers can anticipate about $11 billion in cuts. A joint study by the American Medical Association and the American Hospital Association estimates 330,127 healthcare job losses and 496,000 indirect job losses by 2021. Those who lose their jobs tend to require extra care to sustain their health and well-being while out of work, and the cost of these interventions may wipe out the perceived benefits of the sequester’s capricious cost-cutting.

Another Hit for Providers: Cuts in Medicare Reimbursement

For individual healthcare providers, the 2 percent across-the-board Medicare reimbursement cut will exacerbate challenges for providers who are already struggling to adapt to value-based purchasing and other mandated reimbursement cuts. Mercifully, Medicaid was exempted from this cut, but even Medicare Meaningful Use incentives will sustain the 2 percent reduction.

Particularly hard-hit will be rural hospitals, which, according to a study by iVantage Health Analytics, are twice as likely to be thrown into the red as a result of these cuts. That’s because rural hospitals treat older, poorer, and less-insured patients and are thus directly dependent on Medicare for their economic sustainability. This financial damage will ripple down to the communities they serve, since these organizations tend to be among the largest employers and a focal point of their local economies.

Cuts Disproportionately Affect Community-Based Cancer Clinics

Cancer care is the area most profoundly impacted by the sequester. Reimbursement cuts are making it financially untenable for community-based cancer clinics, one of the more cost-effective treatment sites, to continue to serve many patients, forcing those patients to either seek care in a more expensive hospital setting or forgo care altogether.

Historically, Medicare reimbursement for cancer drugs has been the average price of the drugs, plus a 6 percent administrative fee to cover the cost of providing care. The sequester reduces that fee to 4.3 percent for both drugs and services, which in essence translates to a 28 percent cut in actual reimbursement.
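
The 28 percent figure follows from the fee arithmetic: the reduction is measured against the 6 percent administrative fee rather than against the total drug payment.

\[
\frac{6.0\% - 4.3\%}{6.0\%} \;=\; \frac{1.7}{6.0} \;\approx\; 28\%
\]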

According to a study conducted by the actuarial firm Milliman, the sequester is already resulting in layoffs, closings, and cutbacks, and is driving patients into hospital settings. The study also estimates that the government could pay an average of $6,500 more per year for cancer patients treated in a hospital versus a community clinic.

Cuts to Cancer Research Mean Fewer Clinical Trials

Another area where cancer patients are hard hit involves cuts to research funding. Beyond the estimated loss of 20,500 research jobs, NIH research indicates that every $1 invested in cancer research yields over $2 in incremental economic activity; the research cuts thus translate to a $3 billion direct negative hit on overall economic activity.

Significant cuts to cancer research mean that fewer clinical trials will be available to help identify better treatments, and thus more protracted, costly, and painful care for patients will continue.

Most Vulnerable are Hardest Hit

In summary, the sequester’s effects are causing great pain on many levels to some of the most vulnerable segments of our population. The perceived cost-reduction benefits are not likely to be realized, since the unintended consequences of the sequester look likely to cost even more than the mandated cuts save. These consequences could take the form of:

  • More expensive, less efficient care due to patients losing access to primary care physicians
  • Incremental unemployment insurance for those who have lost their jobs
  • Protracted inpatient stays due to less readily available preventative research
  • Other forms of public assistance these individuals will require

The effects of the sequester on healthcare have not been discussed extensively in the media of late. However, there are unintended consequences that we will most likely pay for in the years ahead.


Rich Temple is national practice director for IT strategy at Beacon Partners.

Readers Write: The Enterprise Content Management Adoption Model

July 15, 2013 Readers Write 4 Comments

The Enterprise Content Management Adoption Model
By Eric Merchant

There have been numerous publications recently about the amount of unstructured content (80 percent of all content) that exists in a non-discrete format outside of the electronic medical record. This unstructured content takes the form of digital photos, scanned documents, clinical images, faxes, and e-mails.

Capturing this information as close to the source as possible, managing it effectively, and ultimately delivering it to the right physician, nurse, or other provider in a timely manner at the point of need is a continuous uphill battle. Organizations vary widely in their ability to manage unstructured content and make it available to decision makers in a meaningful way to improve patient care, drive operational efficiencies, and improve financial performance.

In developing a content strategy, the challenge is greater than simply buying a software suite and assuming your problems are over. As content grows in volume and complexity, the strategic plan needs to be flexible enough to grow and adapt accordingly.

To do this, a reference is needed to determine where we were, where we are now, and where we want to be. I began creating an Enterprise Content Management (ECM) adoption model as an internal point of reference, but also as a strategic guide for the industry. In practice, it would function similarly to the seven-stage EMR Adoption Model created by HIMSS Analytics.

ECM Adoption Model

Stage 10

Vendor Neutral Archive (VNA) Integration: Ability to seamlessly integrate with VNA.

Stage 9

Federated Search: Ability to search content across the enterprise.

Stage 8

Information Exchange: Ability to share/publish content with external entities, social media, etc.

Stage 7

Analytics: Meaningful use of content.

Stage 6

Image Lifecycle Management (ILM): Ability to purge and archive.

Stage 5

Capture, Manage and Render Digital Content: Ability to capture photos, videos, audio, etc.

Stage 4

Intelligent Capture: Ability to use OCR and other techniques to extract/use data.

Stage 3

Integration: Ability to render content inside ERP, EMR, etc.

Stage 2

Workflow: Ability to use automated workflow to streamline processes.

Stage 1

Capture and Render Documents: Ability to scan/upload and retrieve documents.

Stage 0

All Paper: No document management system (DMS).
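
For organizations that want to self-assess against the model, the ladder maps naturally onto an ordered enumeration. The sketch below is hypothetical (the stage names are abbreviated from the list above, and the assessment data is invented), showing how a team might record where it stands and how far it has to go.

```python
# Hypothetical sketch: the ECM adoption stages as an ordered enumeration, so an
# organization can record and compare where it stands against the model above.
from enum import IntEnum

class ECMStage(IntEnum):
    ALL_PAPER = 0
    CAPTURE_AND_RENDER_DOCUMENTS = 1
    WORKFLOW = 2
    INTEGRATION = 3
    INTELLIGENT_CAPTURE = 4
    DIGITAL_CONTENT = 5
    IMAGE_LIFECYCLE_MANAGEMENT = 6
    ANALYTICS = 7
    INFORMATION_EXCHANGE = 8
    FEDERATED_SEARCH = 9
    VNA_INTEGRATION = 10

# Example self-assessment for one fictional health system.
current_stage = ECMStage.INTEGRATION
target_stage = ECMStage.VNA_INTEGRATION
print(f"Currently at stage {current_stage.value} ({current_stage.name}); "
      f"{target_stage.value - current_stage.value} stages to reach {target_stage.name}.")
```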

This adoption model can serve the healthcare industry well by keeping us focused on the outcomes we want to achieve and the systems that would provide them. It also intertwines patient care initiatives (capturing content and delivering it within the EMR), operational efficiencies we need to achieve (federated search and analytics), and outcomes that will directly benefit healthcare organizations’ financial performance (intelligent capture, VNA, and image lifecycle management).

In addition, this strategy delivers on the commitment to support Meaningful Use and IHE data-sharing initiatives with the ability to share and publish unstructured content to information exchanges.

EMR systems have received the bulk of the attention over the past few years due to the value they bring and the public policy and reimbursement implications of getting them successfully implemented. However, as the healthcare market becomes more electronically mature, we cannot lose focus on the larger picture, the bigger challenge, and ultimately the patient. This picture is incomplete without bringing together both the unstructured content created outside the EMR and the discrete information within it.

To reach that goal, the ECM adoption model must be used in conjunction with the EMR adoption model as a roadmap. ECM vendors must take the same approach that EMR vendors have taken and work hand in hand with healthcare organizations to provide the solutions needed to achieve Stage 10 of the ECM adoption model and ultimately move closer to a complete patient record, which in turn creates better health outcomes delivered efficiently and in a financially solvent manner.

Eric Merchant is director of application services, health information technology, for NorthShore University HealthSystem of Skokie, IL.

Readers Write: Requirements Versus User Experience: The MU Design Impact on Today’s EHR Applications

July 15, 2013 Readers Write 3 Comments

Requirements Versus User Experience: The MU Design Impact on Today’s EHR Applications
By Tom Giannulli, MD, MS

Since the first electronic health record (EHR) applications appeared, the federal government has been looking for ways to leverage EHR technology to improve the quality and cost of healthcare delivery. A decade ago, President George W. Bush declared that every American should have an electronic health record within 10 years. While we’ve come a long way, almost half of all medical providers are currently searching for an EHR, installing one now, or looking to switch out the one they have in place.

This is an eye-opening situation given the investment of billions of dollars in EHR technology by healthcare providers, technology suppliers, and the government via incentive programs. Why is this? One contributing factor is that the government incentive programs have excessively focused on features over user experience and outcomes.

When the current EHR incentive programs emerged in 2009, EHR suppliers with existing products were faced with the challenge of meeting Meaningful Use (MU) requirements. It’s not easy to retrofit new functional requirements into an existing product, and it’s commonly understood that many suppliers had to focus on achieving functionality requirements however possible, given the potential impact of government incentives. The time-bound goal was simply to get feature X programmed in Y weeks so that a version update or hot fix could be applied to meet customer certification timelines.

Function ruled over form, often resulting in degraded user experience and sub-optimized workflows. In hindsight, it may have been better to have fewer incentive program requirements with broader definitions and simpler tests to validate compliance.

For example, assume a general requirement that physicians be able to share standardized clinical documents, with basic tests of compliance. With this more general goal, technology suppliers would have greater freedom in how to meet the requirement, resulting in a greater range of solutions, some of which would likely have superior usability. The market would then reward the company that best met both the requirement and the associated usability and user satisfaction.

The overall goals of MU are sound; it’s simply that in practice the extent and specificity of the requirements often overemphasize feature content and prescribed usage at the expense of user experience and the innovation that comes with flexibility. A doctor on HIStalk a few weeks ago highlighted this reality:

“When you’re used to using very clean designs—a MacBook, an iPhone, Twitter, Facebook—and you sit down on an EMR (electronic medical record system), it’s like stepping back in time 15 or 20 years.”

I had the opportunity to build an EHR after MU Stage 1 had been established. This allowed us to take a more comprehensive approach to meeting our overall design goals, including usability, as well as the MU requirements. We wanted the physician to be able to use the application to chart patient visits, with the required data and reporting generated as a by-product of normal use.

Now we are facing the changes for MU Stage 2, integrating them into an existing product and tying them to user needs in a way that makes sense. We have developed a process that relies heavily on user feedback and testing, and we try to iterate quickly, with releases at least monthly.

But the fact is that the specificity of MU and the rigorous testing don’t provide for the best user experience. Ironically, these highly specific requirements, a number of which dictate the user experience to a large degree, are supposed to be creating improved usability when in fact they are detracting from user-friendliness and improved workflow.

I believe that without MU, many EHR features would be similar, but there would be notable differences resulting from the focus on user feedback versus government direction. As a physician and an EHR designer, I would still want to track health maintenance and have tools to manage people’s care. The big change would be the ability to focus on some market-driven elements that we haven’t been able to spend as much time on because they aren’t MU requirements.

We would be spending more time looking at how we could use practice data to highlight workflow problems or areas where a practice isn’t following best practices. By leveraging our large pool of operational and clinical data, we could generate more recommendations for practice optimization and patient care. These are very high-level concepts that we are exploring, but they are at a lower priority given the resources required to implement MU2 in a way that is well integrated and results in a positive user experience.

In a perfect world, the current MU2 requirements would be replaced with just a few high-impact goals related to interoperability and communication. The current MU2 requirements have added little new incremental value while creating a significant burden for vendors and end users. The situation is even more challenging in that the requirements are becoming more specific and, in some cases, dictate user interaction. The structure is already in place to capture discrete data, measure quality, and communicate standardized data.

At this point, I believe the market should drive the process of advancing features and expanding on the valued features outlined in the MU requirements.

Tom Giannulli, MD, MS is chief medical information officer at Kareo.

Readers Write: All Vendors Exit Stage Left

July 10, 2013 Readers Write Comments Off

All Vendors Exit Stage Left
By Frank Poggio

Stage 1 product certifications end this year — September 30 for Inpatient products and December 31 for Ambulatory. In many of my conversations with systems suppliers who are considering the next step in ONC Certification, they refer to it as “Stage 2 Certification.” I can’t blame them. I’ve done it myself.

Remember, it all started with Stage 1 two years ago, so naturally you would expect Stage 2 to follow Stage 1. But with the feds and ONC, it could never be that simple.

When ONC issued the final Stage 2 rules last year, they made a very purposeful and distinct break between Stage 2 Meaningful Use and the vendor test criteria. Instead of referring to “Stage 2 Test Criteria,” they labeled them the 2014 Edition Test Criteria. Providers are subject to Meaningful Use Stage 2 rules, while vendors seeking certification come under the 2014 Edition Test Criteria. There are real differences, some pretty big ones.

What I usually see is that a software firm starts by carefully reviewing the provider MU Stage 2 attestation criteria, since they are all over the Web. Next, they try to translate the MU list into product test criteria. Then confusion follows.

Although the MU attestation criteria for Stage 2 resemble the certification test criteria, there are differences. One big difference is that a provider needs to attest to about 25 MU criteria and some Quality Measures to get the Stage 2 money, but you as a vendor need to pass about 40 certification test criteria and nine QMS elements to become 2014 Edition Certified.

Another example: under Stage 2, a provider would attest to completing a HIPAA compliance risk analysis. That’s just one question (the answer is ‘yes’, subject to audit, of course). But a vendor completing a certification test under the 2014 Edition must address eight very specific tests for privacy and security.

ONC now refers to your Stage 1 certification as the “2011 Edition Test Criteria.” No more Stage 1.

A related question ties back to what I said at the top of this piece. Your current Stage 1 certification ends this year. Actually, ONC says your 2011 Edition certification ends and you must test out on the new 2014 Edition to continue to sell certified software.

As of this week, only four vendors have been successful in achieving 2014 Edition Full EHR Inpatient Certifications. Under Stage 1, there were dozens. The 2014 testing is turning out to be a real challenge for many vendors, far more difficult than I think ONC expected.

Some think ONC will extend the Stage 1 vendor certifications if it does not get enough vendors through the 2014 tests by September. That would seem a likely solution. But given Dr. M’s pointed comments about vendors “gaming the system,” I doubt it.

The reason they made the break between certification test criteria and MU attestation criteria is that when they decided to extend Stage 1 of provider attestation into 2014 (originally it was to die in 2013), they did not want to extend the vendor certifications as well. Why? I guess they just wanted to keep your feet to the fire.

Which raises the next question: how can a provider attest to Stage 1 in 2014 when all the vendor certifications for Stage 1 die in three or six months? Simple. ONC now allows the provider to attest to MU Stage 1 using a 2014 Edition certified system. If you have clients or prospects that have not attested to Stage 1 and plan to do so in 2014, they must be running your 2014 Edition certified software for at least 90 days in 2014.

It seems that ONC has taken vendors off the Stage and reduced them to simply an old Edition.


Frank Poggio is president of The Kelzon Group.
