
Readers Write: Building an Accountable Care Organization? Consider Starting in Your Own Back Yard

August 5, 2013

By Claudia Blackburn


Explaining my healthcare IT profession to my parents and children has never been straightforward. Yet sometimes they are the ones who can boil it down to the essence of what we do, perhaps even better than we can. Before I became a consultant, my mom once told a family friend that I "paid people to be healthy so that the hospital I worked for didn’t have to pay as much for health insurance." The friend responded, "Where can I sign up?" They both clearly understood the value of population health management (PHM) programs.

With the CMS news released this month about those Pioneer Accountable Care Organizations (ACOs) that have demonstrated success and shared in the savings — and of those Pioneer ACOs that are not continuing the program — there’s healthy debate about the model and the key success factors.

Organizations considering starting an ACO should first test-drive the concept in their own back yard with their health plan member population.

The Opportunity: An Integrated Wellness Model

Several self-insured employers – both healthcare organizations and companies from other industries – have proven that an ROI is achievable through population health and wellness programs. A few shared their program experiences showing impressive return for their wellness dollars:

  • In 2011, Mercy Clinics, Inc. reported a four-to-one return on investment of wellness dollars spent. Mercy uses coaches within its practices to assist with coordination of care.
  • Franciscan Missionaries of Our Lady Health System decreased health plan expenses 13 percent, with a 21 percent decrease in medical claims alone in 2011. A four-to-one return over five years projected a savings of $37.3 million.
  • John Hancock’s Healthy Returns program increased savings per participant from $111 in 2009 to $261 in 2010, and preventive care increased 1 percent to 4 percent per year, with an overall 2.5-to-one ROI.

Like any other employer, hospitals face increasing healthcare costs for their employee and member population. However, hospitals can use their healthcare expertise to develop practice protocols that change habits, ultimately improving the health of their self-insured member population and decreasing employee benefit costs.

Strategic Elements of a Successful Population Health Management Program

Screening, prevention, and care management are all involved in population health improvement, but by far the most challenging part is changing the habits of individuals. Smartphone applications, portals, and information pushed out by payers and providers have not, on their own, engaged members.

To engage members for best outcomes with accountability and oversight, the health management program must be a combination of people, new processes, new technology, and much better use of the collective data. There are several essential elements of an integrated PHM model:

  • Claims data. Claims data define the healthcare services received across the continuum of care and the associated risk, which helps target program benefits and measure improvements in utilization and cost.
  • Health risk assessment (HRA). An HRA captures basic information to determine the population’s health status and support risk stratification; it is especially important for members with no claims.
  • Electronic medical record (EMR) / biometric screening. It’s important not to let members self-report weight, cholesterol, blood pressure, and glucose. Instead, a coach or nurse should measure these biometrics and chart them in the EMR. Patient data from a personal health record (PHR) can be useful and selectively imported into the EMR.
  • Aligned incentives. Incentives are important to move members towards participation and keep them active and accountable. Incentives such as reduced premiums, door prizes, or gift cards are helpful to encourage enrollment. Once enrolled, outcomes-based incentives can be used to keep the member working towards health goals.
  • Coaching. Successful PHM programs have coaches armed with full information from claims, HRA, and EMR to motivate members to change behaviors.
  • Consumer portal. The portal allows for better engagement between provider and consumer and monitoring of healthy habits, such as exercise.
  • Data warehouse /analytics. Armed with holistic information about the consumer, high-risk root causes can be identified, targeted with strategic program initiatives, and measured for success or rework as part of a feedback loop to assure data-driven increased quality and decreased cost.
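As a toy illustration of how these elements might come together, the sketch below combines claims counts, an HRA score, and coach-measured biometrics into a simple risk tier. All field names, thresholds, and tier cutoffs here are hypothetical and for illustration only, not from any actual PHM program:

```python
# Toy risk stratification combining the data sources described above.
# Field names, thresholds, and tier cutoffs are all hypothetical.

def risk_tier(member):
    """Assign a member to a coarse risk tier from claims, HRA, and biometrics."""
    score = 0
    # Claims data: heavy utilization suggests existing conditions.
    score += min(member.get("claims_last_year", 0), 10)
    # HRA: self-reported risk on a 0-10 scale (covers members with no claims).
    score += member.get("hra_risk_score", 0)
    # Biometrics measured by a coach or nurse, not self-reported.
    if member.get("a1c", 0) >= 6.5:
        score += 5
    if member.get("systolic_bp", 0) >= 140:
        score += 3
    if score >= 12:
        return "high"
    if score >= 6:
        return "moderate"
    return "low"
```

A data warehouse feeding a function like this is what enables the closed feedback loop: tiers drive coaching outreach, and re-scoring after interventions measures whether the program moved members to lower tiers.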

From the above list, clearly the “glue” for connecting the PHM program elements is a solid technology foundation. It provides a concise picture of population and individual holistic health. When combined with coaching, health systems are able not only to monitor but also to influence change. Additionally, the closed-loop feedback mechanism enables measurement of the success of strategies at the enterprise level and the member level, allowing for continuous, data-driven improvement in quality and cost.

Just as my mom and her friend understood, the value of population health and wellness programs can be substantial. Keeping members accountable through incentives increases healthy behaviors and reduces the self-insured health insurance cost of the employer.

Hospitals can take a leadership position in the move toward the IHI’s Triple Aim, both as employers and as healthcare providers, via PHM programs for their own self-insured member populations. The individual wins, the employer wins, the hospital wins, and the community wins.


Claudia Blackburn is a consultant with Aspen Advisors of Pittsburgh, PA.

Readers Write: Think Beyond the Text: Understanding HIPAA and Its Revisions

July 31, 2013

By Terry Edwards

Every day, an increasing number of physicians and other health care providers are exchanging clinical information through a wide range of modes, including smart phones, pagers, CPOE, e-mails, texts and messaging features in an EMR. It’s no surprise that hospital and health system leaders are increasingly focused on securing protected health information in electronic form (ePHI)—a trend that has certainly invoked some confusion across the industry.

As PHI data breaches increase in frequency, hospital executives must strategize ways to eliminate security threats and remain HIPAA compliant, especially since HIPAA violations can be extremely expensive, leaving these already-strapped organizations in an even more stressful financial situation.

In order to prioritize tangibles such as patient safety, physician satisfaction and overall efficiency across processes and hospitals, health care leadership must consider ways to tackle this confusion and maximize the benefits enabled by modern technology and electronic communications.

PHI can take a variety of paths in today’s complex healthcare environment and expose a health system to risk. But time and time again I see health systems looking to implement stop-gap measures and point solutions that address part—and not all—of the problem.

While texts are commonly sent between two individuals via their mobile phones, the communication “universe” into which a text enters is actually much bigger. It also includes creating ePHI and sending messages—in text and voice modalities—from mobile carrier web sites, paging applications, call centers, answering services and hospital switchboards.

For example, a 400+ bed hospital generates more than 50,000 communication transactions to physicians each and every month. Many of these communications contain ePHI. And if they were transmitted through unsecure networks and stored in unencrypted formats, they would represent a meaningful potential security risk to both the hospital and its medical staff.

In order to identify all potential areas of vulnerability, health care leaders need to consider all mechanisms by which ePHI is transmitted and the security of those mechanisms and processes. No mode of communication can be viewed in isolation. By failing to address all transmitted ePHI, organizations become vulnerable to security breaches with adverse legal and financial consequences, as well as loss of patient trust and reputation in the marketplace.

In addition, contrary to what many health leaders have been led to believe, HIPAA provisions do not call out any specific modes of communication. Text messaging is permissible under HIPAA. The law simply stipulates that a covered entity (CE) must perform a formal risk assessment; develop and implement an effective risk management strategy based upon sound policies and procedures; and monitor its risk on an ongoing basis. These regulations apply to providers communicating PHI in any electronic form.

As a result, there is no such thing as a “HIPAA-compliant app.”

HIPAA provisions emphasize the risk management process rather than the technologies used to manage risk. For hospitals and health systems, the pathway to safeguarding electronic communication of PHI lies in the creation of an overall risk management strategy.

Ideally, leaders of the CE will form an information security committee to develop and execute the strategy, with representatives from IT, operations, the medical staff, and nursing, as well as legal counsel. Leaders should also consider including an external security firm in the group. Once the committee is formed, the organization should take these four essential steps for protecting the security of ePHI:

  1. Organize and execute a formal risk analysis. A formal risk analysis should break down types of technology used for electronic communication as well as the transmission routes for all ePHI. To ensure HIPAA compliance, ePHI transmitted across all channels must be “minimally necessary,” which means it includes only the PHI needed for that clinical communication. This layer of complexity, which is common in clinical communication processes, underscores the need for a comprehensive security assessment and strategy appropriate for the organization, coupled with the resources necessary to implement that strategy.
  2. Establish an appropriate risk management strategy. The committee should develop a risk management strategy that’s specific to the needs and vulnerabilities of the organization and is designed to manage the risk of an information breach to a reasonable level. HIPAA does not specifically define “reasonable,” but in general, the risk management strategy should include policies and procedures that ensure the security of message data during transmission, routing, and storage. The strategy should also include specific administrative, physical, and technical safeguards for ePHI.
  3. Roll out these policies and procedures and train staff. Implementing new policies and procedures is the biggest challenge for organizational leaders, especially as a substantial proportion of reported security breaches are due in part to insufficient training of staff. As a result, appropriate individuals should be assigned specific implementation tasks for which they are held accountable, while leaders and committee members must carefully monitor the success of implementation. All staff with access to PHI must be educated about the specific policies and procedures, which will help ensure they are upheld across the organization.
  4. Monitor risk on an ongoing basis. To ensure continued compliance with security standards, organizations must conduct ongoing monitoring of their information security risk. Leaders should receive regular trend reports from the information security committee based on their ongoing assessment of ePHI security at the organization. Those reports should support the ongoing assessment of security needs as technology and health care delivery change, and act as a catalyst for changes that may need to be made to the policies and procedures over time.

In today’s increasingly complex healthcare environment, analyzing and implementing a broader policy around security across all forms of electronic communications—rather than focusing on a single mode of communication in isolation—is critical to any health system’s ability to avoid and mitigate the adverse consequences of a breach. By clarifying the confusion around electronic communications now, hospitals and health systems will be better prepared to minimize risk and maximize best-practice communication processes in the future.

Terry Edwards is president and CEO of PerfectServe of Knoxville, TN.

Readers Write: Seven Strategies for Optimizing the EHR

July 31, 2013

By Marcy Stoots MS, RN-BC


Healthcare organizations are making a mistake if they subscribe to the notion that once an EHR is successfully implemented, it no longer requires attention. Even the most carefully designed EHR will not work as intended in all situations, causing users to create workarounds that are counterproductive and inefficient. It’s important to develop and implement an ongoing strategy for fine-tuning the EHR so that users can input and access the data they need with fewer clicks and better outcomes, which will improve clinician satisfaction.

Besides improving usability and adoption, optimization will help with plans to achieve Meaningful Use Stage 2, which raises the bar significantly. Under the Stage 2 final rule, for example, hospitals must report on 16 of 29 clinical quality measures (CQMs) and Eligible Professionals must report on nine of 64 CQMs. Optimizing the EHR to properly capture this data and generate compliance reporting is crucial.

Finally, optimization is a key step to realizing the financial ROI of the EHR, in which a substantial investment has been made. In today’s landscape of cost containment and healthcare reform, an organization can ill afford to sacrifice financial ROI or be bogged down by inefficiencies.

Below are seven strategies for optimizing the EHR to increase efficiency, improve the ROI, drive adoption, and improve usability, with the ultimate goal of providing better outcomes.

1. Create a Governance Structure

Just as an organization needed a governance structure during planning and implementation of the EHR, it will need one for ongoing optimization. This will provide an avenue for making decisions and keeping the optimization plan moving forward. Problems will continue to arise, and solid governance will ensure that they are dealt with effectively. A process should be in place to manage variances when clinicians do not want to adhere to standardized documentation or workflows. When variances crop up, the governance group will need to decide upon appropriate action.

2. Create a Solid Informatics Structure

Many healthcare organizations struggle with the size and organization of the informatics team. From an optimization standpoint, it’s important to get this right. There is no standard answer here; every organization is different. Detailed descriptions of job roles and responsibilities should be created and appropriate resources budgeted.

3. Assign Responsibility

An individual at the leadership level should be designated as the responsible party for optimization. This function should be incorporated into that person’s job description. This is typically an informatics director, but could also be a CMIO or IT director, depending on the organizational structure. Assigning this responsibility will help ensure that optimization is an ongoing process, since it requires continual evaluation and modification. Ideally, for larger health systems, there should also be an optimization team in place that could include clinical leadership, operational leadership, informatics analysts, and super users. For smaller health systems, the team would be much smaller, but informaticists should have optimization as a core job function.

4. Measure

Determine the pain points of clinicians by interviewing stakeholders, examining service desk tickets, listening to input from IT and informatics staff, analyzing reports and metrics, and observing end-to-end workflows. Focus on the most important issues, collecting data at baseline and after 30, 60, and 90 days. Measuring is an ongoing process; use it to monitor progress and gauge success.

5. Create Scorecards

Scorecards are a powerful tool for demonstrating what has been achieved. They display the collected data and communicate improvements to the team and stakeholders. Managing workarounds starts with accountability; scorecards let users know where they stand and create a healthy competitive environment that encourages success. They can be used to compare units within a hospital or hospitals within a health system.

6. Provide a Quick Win

Clinicians can be easily frustrated by glitches in the EHR, so areas should be pinpointed that will quickly increase their satisfaction. These are issues that are important to them, yet easy to address, the low-hanging fruit that delivers the highest impact. Success breeds enthusiasm, setting the stage for better adoption.

7. Continue Refining

Optimization is never complete. It is an ongoing endeavor without an endpoint.

Workarounds are a reality. The organization should have an optimization plan to monitor and manage them, and should establish ownership of that plan. With proper planning and a roadmap in place, addressing problems and overcoming challenges will go smoothly. The end result will be satisfied users, healthier patients, and lower costs.

Marcy Stoots MS, RN-BC is a principal with CIC Advisory of Clearwater, FL.

Readers Write: How Many Licks to the Tootsie Pop Center Versus How Many Clicks to Relevant Clinical Data?

July 22, 2013

By Helen Figge, PharmD, MBA, CPHIMS, FHIMSS

A group of engineering students once reported that it took an average of about 364 licks to get to the center of a Tootsie Pop. For some reason, this was a very important scientific query that needed an answer.

A current healthcare query many are pondering today is: how many clicks are needed to get to the relevant clinical data necessary to support patient care? Clinicians using technologies like EHRs and HIEs for data retrieval often face nearly as many steps as it takes licks to get to the center of a Tootsie Pop. The more clicks it takes to get to the required clinical data, the more time is spent away from the patient, with eventual loss of productivity, suboptimal patient care, and potentially total clinician frustration. Speak to clinicians on the front line and many will tell you the technologies are more of a hindrance than a help.

In reality, one wants the right data at the right time and in a comprehensible format without undue effort to retrieve it. Clinicians are yearning for “smart software” that knows what data to fetch and how to properly present it. Data needs to be automatically populated inside the clinician note. Additionally, more than ever clinicians need technology that supports workflow and provides the correct data with minimal effort on the clinician’s part. 

Bottom line, we need “smart software” that knows what data to present while simultaneously having other data immediately available with one click. As an example, if the software recognizes that the patient has diabetes, then certain labs — such as hemoglobin A1C, lipids, and renal function — should be automatically displayed in the note.
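To make the idea concrete, here is a minimal sketch of such a condition-triggered display rule. The condition names, lab names, and data structures are purely hypothetical, not drawn from any particular EHR product:

```python
# Hypothetical sketch of "smart software" that decides which labs to
# auto-display in a note based on the patient's problem list.
# Condition and lab names are illustrative only.

DISPLAY_RULES = {
    "diabetes": ["hemoglobin A1C", "lipid panel", "renal function"],
    "heart failure": ["BNP", "basic metabolic panel"],
}

def labs_to_surface(problem_list, available_labs):
    """Return the labs a note template should auto-display,
    in rule order, limited to results that actually exist."""
    wanted = []
    for problem in problem_list:
        wanted.extend(DISPLAY_RULES.get(problem, []))
    seen = set()
    result = []
    for lab in wanted:
        # Keep rule order, drop duplicates, skip labs with no results on file.
        if lab in available_labs and lab not in seen:
            seen.add(lab)
            result.append(lab)
    return result
```

The point of the sketch is the design choice, not the code: the rule table decides what is displayed by default, while everything else stays one click away rather than cluttering the note.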

Ask clinicians and they will tell you that they are drowning in unnecessary data. Data needs to be presented in a way that is easily understood by clinicians. We know far more about these technologies today than we did a few years ago, so clinicians should expect nothing less from their vendors than data that is useful, timely, and delivered in real time.

Clinicians require from their technology enablers the ability to aggregate data from multiple sources and present it in a comprehensible format. For a diabetic patient, the software needs to aggregate kidney function from any laboratory source and plot and trend the data appropriately. Clinicians need to be more vocal in their desires for appropriate data and need to collaborate with the IT departments to get the desired outcomes from their technologies. Clinicians need to engage more than ever before to ensure the software chosen for their organizations delivers what is needed on the front lines.

Right now, clinicians need easily customizable data presentation formats, smart order templates and true data aggregators along with evidence-based algorithms from their vendors. Clinicians must have tools that actually work for them, not against them, and truly support patient care. True “smart software” should support what the clinician needs, not forcing the clinician to adapt to inept software to attempt data retrieval for patient care. IT experts need to continue engaging the clinician in collaborations because right now it’s all about the data and how it is presented.

Organized, structured data is the paramount piece of the current healthcare puzzle. We have the answer to how many licks it takes to get to the center of a Tootsie Pop. Now it’s time to answer how many clicks it takes to get to the data that truly supports patient care.

Helen Figge, PharmD, MBA, CPHIMS, FHIMSS is advisor, clinical operations and strategies, for VRAI Transformation.

Readers Write: The Sequester’s Impact on Healthcare: Dangerous Unintended Consequences

July 22, 2013

By Rich Temple


It has been three months since the sequester hit the healthcare industry, and the effects are more profound than they might seem. What’s most troubling is that the budget cuts in many cases will wind up costing the government more money and will have a particularly negative impact on cancer patients and those living in rural areas.

Cost of Caring for Unemployed

Across the healthcare spectrum, providers can anticipate about $11 billion in cuts. A joint study by the American Medical Association and the American Hospital Association estimates the loss of 330,127 healthcare jobs and 496,000 indirect job losses by 2021. Victims of job losses tend to require extra care to sustain their health and well-being while out of work, and the cost of these interventions may wipe out the perceived benefits of the sequester’s capricious cost-cutting.

Another Hit for Providers: Cuts in Medicare Reimbursement

For individual healthcare providers, the 2 percent across-the-board Medicare reimbursement cut will exacerbate challenges for providers who are already struggling to adapt to value-based purchasing and other mandated reimbursement cuts. Mercifully, Medicaid was exempted from this cut, but even Medicare Meaningful Use incentives will sustain the 2 percent reduction.

Particularly hard-hit will be rural hospitals, which according to a study by iVantage Health Analytics are twice as likely to be thrown into the red as a result of these cuts. That’s because rural hospitals treat older, poorer, and less-insured patients and are thus directly dependent on Medicare for their economic sustainability. This financial damage will ripple down to the communities they serve since these organizations tend to be among the largest employers and are likely to be a key focal point of much of the activity in their local economies.

Cuts Disproportionately Affect Community-Based Cancer Clinics

Cancer care is the area most profoundly impacted by the sequester. Reimbursement cuts are making it financially untenable for community-based cancer clinics — one of the more cost-effective treatment sites — to continue to serve many patients, forcing those patients either to seek care in a more expensive hospital setting or to forgo care altogether.

Historically, Medicare reimbursement for cancer drugs has been the average price of the drugs, plus a 6 percent administrative fee to cover the cost of providing care. The sequester reduces that fee to 4.3 percent for both drugs and services, which in essence translates to a 28 percent cut in actual reimbursement.
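The 28 percent figure follows from where the cut lands: it is measured against the 6 percent add-on fee, not against the total drug payment. A quick check of the arithmetic:

```python
# Administrative add-on fee over average drug price, before and after the sequester.
old_fee = 0.060
new_fee = 0.043

# The cut relative to the fee itself, which is the margin clinics operate on.
relative_cut = (old_fee - new_fee) / old_fee
print(f"{relative_cut:.0%}")  # about 28%
```

A 1.7-point reduction sounds small against the full drug price, but it removes more than a quarter of the fee that was meant to cover the cost of providing care.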

According to a study conducted by the actuarial firm Milliman, the sequester is already resulting in layoffs, closings, and cutbacks, and is driving patients into hospital settings. The study also says that the government could pay an average of $6,500 more per year for cancer patients treated in a hospital versus a community clinic.

Cuts to Cancer Research Mean Fewer Clinical Trials

Another area where cancer patients are hard hit involves cuts to research funding. Besides the estimated loss of 20,500 research jobs, NIH research indicates that every $1 invested in cancer research yields over $2 in incremental economic activity. This translates to a $3 billion direct negative hit on overall economic activity.

Significant cuts to cancer research mean that fewer clinical trials will be available to help identify better treatments and thus, more protracted, costly, and painful care for patients will continue.

Most Vulnerable are Hardest Hit

In summary, the sequester’s effects are causing great pain on many levels to some of the most vulnerable segments of our population. And the perceived cost-reduction benefits are actually not likely to be realized since the unintended consequences of the sequester look like they will cost even more than the mandated cuts. These consequences could take the form of:

  • More expensive, less efficient care due to patients losing access to primary care physicians
  • Incremental unemployment insurance for those who have lost their jobs
  • Protracted inpatient stays due to less readily available preventative research
  • Other forms of public assistance these individuals will require

The effects of the sequester on healthcare have not been discussed extensively of late in the media, but there are unintended consequences that we will most likely pay for in the years ahead.


Rich Temple is national practice director for IT strategy at Beacon Partners.

Readers Write: The Enterprise Content Management Adoption Model

July 15, 2013

By Eric Merchant


There have been numerous publications recently about the amount of unstructured content (an estimated 80 percent of all content) that exists in a non-discrete format outside of the electronic medical record. This unstructured content exists as digital photos, scanned documents, clinical images, faxes, and e-mails.

The challenge of capturing this information as close to the source as possible — managing it effectively and ultimately delivering it to the necessary physician, nurse, or other provider in a timely manner at the point of need — is a continuous uphill battle. There are varying degrees of being able to manage unstructured content and make it available to decision makers in a meaningful way to improve patient care, drive operational efficiencies, and improve financial performance in the healthcare market.

In developing a content strategy, the challenge is greater than simply buying a software suite and thinking your problems are over. As content grows in volume and complexity, the strategic plan needs to be flexible to be able to grow and adapt accordingly.

To do this, a reference is needed to determine where we were, where we are now, and where we want to be. I began creating an Enterprise Content Management (ECM) adoption model as an internal point of reference, but also as a strategic guide for the industry. In practice, it would function similarly to the staged (0-7) EMR Adoption Model created by HIMSS Analytics.

ECM Adoption Model

  • Stage 10. Vendor Neutral Archive (VNA) Integration: ability to seamlessly integrate with a VNA.
  • Stage 9. Federated Search: ability to search content across the enterprise.
  • Stage 8. Information Exchange: ability to share/publish content with external entities, social media, etc.
  • Stage 7. Analytics: meaningful use of content.
  • Stage 6. Image Lifecycle Management (ILM): ability to purge and archive.
  • Stage 5. Capture, Manage, and Render Digital Content: ability to capture photos, videos, audio, etc.
  • Stage 4. Intelligent Capture: ability to use OCR and other techniques to extract/use data.
  • Stage 3. Integration: ability to render content inside the ERP, EMR, etc.
  • Stage 2. Workflow: ability to use automated workflow to streamline processes.
  • Stage 1. Capture and Render Documents: ability to scan/upload and retrieve documents.
  • Stage 0. All Paper: no document management system (DMS).
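As a rough self-assessment aid, the ladder above can be encoded directly. The stage numbers and names come from the model itself; the code is only an illustrative sketch, not part of the proposal:

```python
from enum import IntEnum
from typing import Optional

class ECMStage(IntEnum):
    """Stages of the proposed ECM adoption model, lowest to highest."""
    ALL_PAPER = 0
    CAPTURE_AND_RENDER_DOCUMENTS = 1
    WORKFLOW = 2
    INTEGRATION = 3
    INTELLIGENT_CAPTURE = 4
    DIGITAL_CONTENT = 5
    IMAGE_LIFECYCLE_MANAGEMENT = 6
    ANALYTICS = 7
    INFORMATION_EXCHANGE = 8
    FEDERATED_SEARCH = 9
    VNA_INTEGRATION = 10

def next_goal(current: ECMStage) -> Optional[ECMStage]:
    """Return the next stage an organization should target, or None at the top."""
    if current is ECMStage.VNA_INTEGRATION:
        return None
    return ECMStage(current + 1)
```

Because the stages are ordered integers, an organization can compare its current stage against a target and plan the intermediate capabilities it still needs.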

This adoption model can serve the healthcare industry well by allowing us to keep focused on the outcomes we want to achieve and the systems that would provide them. The adoption model also intertwines patient care initiatives (capture content and deliver within the EMR), operational efficiencies we need to achieve (federated search and analytics) and outcomes that will directly benefit healthcare organizations’ financial performance (intelligent capture, VNA and Image Lifecycle management).

In addition, this strategy also delivers on the commitment to support Meaningful Use and IHE data-sharing initiatives with the ability to share and publish unstructured content to information exchanges.

EMR systems have received the bulk of the attention over the past few years due to the value they bring and the public policy and reimbursement implications of getting them successfully implemented. However, as the healthcare market becomes more electronically mature, we cannot lose sight of the larger picture, the bigger challenge, and ultimately the patient. That picture is incomplete without bringing together both the unstructured content created outside the EMR and the discrete information within it.

To do this, the ECM adoption model must be used in conjunction with the EMR adoption model as a roadmap to reach that goal. ECM vendors must take the same approach EMR vendors have taken and work hand in hand with healthcare organizations to provide the solutions to achieve Stage 10 of the ECM adoption model and ultimately move closer to a complete patient record, which in turn creates better health outcomes delivered efficiently and in a financially solvent manner.

Eric Merchant is director of application services, health information technology, for NorthShore University HealthSystem of Skokie, IL.

Readers Write: Requirements Versus User Experience: The MU Design Impact on Today’s EHR Applications

July 15, 2013

By Tom Giannulli, MD, MS


Since the first electronic health record (EHR) applications, the federal government has been looking for ways to leverage EHR technology to improve the quality and cost of healthcare delivery. A decade ago, President George W. Bush declared that every American should have an electronic health record within 10 years. While we’ve come a long way, almost half of all medical providers are currently searching for an EHR, installing one now, or looking to switch out the one they have in place.

This is an eye-opening situation given the investment of billions of dollars in EHR technology by healthcare providers, technology suppliers, and the government via incentive programs. Why is this? One contributing factor is that the government incentive programs have excessively focused on features over user experience and outcomes.

When the current EHR incentive programs emerged in 2009, EHR suppliers with existing products were faced with the challenge of meeting Meaningful Use (MU) requirements. It’s not easy to retrofit new functional requirements into an existing product, and it’s commonly understood that many suppliers had to focus on achieving functionality requirements however possible, given the potential impact of government incentives. The time-bound goal was simply to get X feature programmed in Y weeks so that a version update or hot fix could be applied to meet customer certification timelines.

Function ruled over form, often resulting in degraded user experience and sub-optimized workflows. In hindsight, it may have been better to have fewer incentive program requirements with broader definitions and simpler tests to validate compliance.

For example, assume a general requirement for physicians to be able to share standardized clinical documents with basic tests of compliance. With this more general goal, technology suppliers would have greater freedom in how to solve the requirement, resulting in a greater range of solutions—some of which likely would have superior usability. The market would then reward the companies that best met both the requirement and the associated usability and user satisfaction.

The overall goals of MU are sound; it’s simply that in practice the extent and specificity of the requirements often overemphasize feature content and prescribed usage at the expense of user experience and the innovation that comes with flexibility. A doctor on HIStalk a few weeks ago highlighted this reality:

“When you’re used to using very clean designs—a MacBook, an iPhone, Twitter, Facebook—and you sit down on an EMR (electronic medical record system), it’s like stepping back in time 15 or 20 years.”

I had the opportunity to build an EHR after MU Stage 1 had been established. This allowed us to take a more comprehensive approach to meeting our overall design goals, including usability, as well as the MU requirements. We wanted the physician to be able to use the application to chart patient visits, with the required data and reporting generated as a by-product of normal use.

Now we are facing the changes for MU Stage 2, integrating them into an existing product and tying them to user needs in a way that makes sense. We have developed a process that relies heavily on user feedback and testing, and we try to iterate quickly, with releases at least monthly.

But the fact is that the specificity of MU and the rigorous testing don’t provide for the best user experience. Ironically, these very specific requirements—a number of which dictate the user experience to a large degree—are supposed to be creating improved usability when in fact they are detracting from user-friendliness and improved workflow.

I believe that without MU, many EHR features would be similar, but there would be notable differences resulting from the focus on user feedback versus government direction. As a physician and an EHR designer, I would still want to track health maintenance and have tools to manage people’s care. The big change would be the ability to focus on some market-driven elements that we haven’t been able to spend as much time on because they aren’t MU requirements.

We would be spending more time looking at how we could use the practice data to highlight workflow problems or areas where the practice isn’t using best practices. By leveraging our large pool of operational and clinical data, we could generate more recommendations for practice optimization and patient care. These are very high level concepts that we are exploring, but are at a lower priority given the resources required to implement MU2 in a way that is well integrated and results in a positive user experience.

In a perfect world, current MU2 requirements would be replaced with just a few high-impact goals related to interoperability and communication. Current MU2 requirements have added little new incremental value while creating a significant burden for vendors and end users. The situation is even more challenging in that the requirements are becoming more specific and in some cases dictate user interaction. The structure is already in place to capture discrete data, measure quality, and communicate standardized data.

At this point, I believe the market should drive the process of advancing features and expanding on the valued features outlined by the MU requirements.

Tom Giannulli, MD, MS is chief medical information officer at Kareo.

Readers Write: All Vendors Exit Stage Left

July 10, 2013 Readers Write No Comments

All Vendors Exit Stage Left
By Frank Poggio

Stage 1 product certifications end this year — September 30 for Inpatient products and December 31 for Ambulatory. In many of my conversations with systems suppliers who are considering the next step in ONC Certification, they refer to it as “Stage 2 Certification.” I can’t blame them. I’ve done it myself.

Remember, it all started with Stage 1 two years ago, so naturally you would expect Stage 2 to follow Stage 1. But with the feds and ONC, it could never be that simple.

When ONC issued the final Stage 2 rules last year, they made a very purposeful and distinct break between Stage 2 Meaningful Use and the vendor test criteria. Instead of referring to “Stage 2 Test Criteria,” they labeled them the 2014 Edition Test Criteria. Providers are subject to Meaningful Use Stage 2 rules, while vendors seeking certification come under the 2014 Edition Test Criteria. There are real differences — some pretty big ones.

What I usually see is that a software firm starts by carefully reviewing the provider MU Stage 2 attestation criteria, since they are all over the Web. Next, they try to translate the MU list into product test criteria. Then confusion follows.

Although the MU attestation criteria for Stage 2 resemble the certification test criteria, there are differences. For example, one big difference is that a provider needs to attest to about 25 MU criteria and some Quality Measures to get the Stage 2 money. But you as a vendor need to pass about 40 certification test criteria and nine QMS elements to become 2014 Edition certified.

Another example: under Stage 2, a provider would attest to completing a HIPAA compliance risk analysis. That’s just one question (the answer is ‘yes’, subject to audit, of course). But for a vendor completing a certification test under the 2014 Edition, you address eight very specific tests for privacy and security.

ONC now refers to your Stage 1 certification as the “2011 Edition Test Criteria.” No more Stage 1.

A related point ties back to what I said at the top of this piece. Your current Stage 1 certification ends this year. Actually, ONC says your 2011 Edition certification ends, and you must test out on the new 2014 Edition to continue to sell certified software.

As of this week, only four vendors have been successful in achieving 2014 Edition Full EHR Inpatient Certifications. Under Stage 1, there were dozens. The 2014 testing is turning out to be a real challenge for many vendors, far more difficult than I think ONC expected.

Some think ONC will extend the Stage 1 vendor certifications if they do not get enough vendors through the 2014 tests by September. That would seem a likely solution. But given Dr. M’s pointed comments about vendors “gaming the system,” I doubt it.

The reason they made the break between certification test criteria and MU attestation criteria is that when they decided to extend Stage 1 of provider attestation into 2014 (originally it was to die in 2013), they did not want to extend the vendor certifications as well. Why? I guess they just wanted to keep your feet to the fire.

Which raises the next question: how can a provider attest to Stage 1 in 2014 when all the vendor certifications for Stage 1 die in three or six months? Simple. ONC now allows the provider to attest to MU under Stage 1 using a 2014 Edition certified system. If you have clients or prospects that have not attested to Stage 1 and plan to do so in 2014, they must be running your 2014 Edition certified software for at least 90 days in 2014.

It seems that ONC has taken vendors off the Stage and reduced them to simply an old Edition.


Frank Poggio is president of The Kelzon Group.

Readers Write: Asking the Right Questions: How to Find the Right Technology Development Partner

July 10, 2013 Readers Write No Comments

Asking the Right Questions: How to Find the Right Technology Development Partner
By Lee Farabaugh

7-10-2013 5-54-37 PM

We’ve all heard the stories. A hospital implements technology only to discover that it is so complex and confusing that it takes clinicians twice as long to get their work done as it did before, frustrating providers and patients. The hospital tries to work through the issues to no avail, and the organization ultimately abandons the software in pursuit of something else.

Money, time, and resources are wasted, and the organization is still no closer to effectively leveraging technology to improve patient care or streamline efficiencies. On the other hand, there are technology implementations that go smoothly, with providers fully embracing an application and using it appropriately.

What differentiates the good from the bad? User experience design, centered on end user input. Positive outcomes (increased user adoption, for example) occur when end users are actively involved in technology design, development and implementation.

To determine whether your technology partner incorporates user experience into its approach, there are some key questions you should ask. Getting answers to these questions can help you avoid disastrous technology roll-outs and ensure potential applications are a good fit for your organization.

Does your technology partner take a provider-centric approach by involving clinicians as key members of the development team?

These clinicians should be providers who have been actively involved in practicing medicine, so they are aware of the issues clinicians face in their day-to-day work. Getting direct input from providers who will use the system ensures that any potential roadblocks are addressed and resolved. Even if the technology you are considering is more patient-focused, clinicians should still be part of the development team. When people with medical expertise are involved in designing a patient-focused product, they can share the clinical perspective on what is possible and preferable for the technology.

How much of your technology partner’s research and development budget is devoted to garnering information about user experience?

This question can reveal the value your technology partner places on end user input. In other words, are they putting their money where their mouth is and dedicating resources to obtaining and leveraging user feedback?

Have you ever had a usability assessment on your application portfolio?

This puts hard data around your technology partner’s usability claims. By reviewing a usability assessment, you can clearly see whether providers or patients are actually using the software your partner developed on a long-term basis.

Does your technology partner have an end user group to provide ongoing feedback?

This type of forum can be a valuable source for transparent feedback about a solution. Not every software developer has the resources to sustain a user group for each of its clients, but those companies that do communicate their commitment to their customers and end user satisfaction. If your technology partner does have a user group, you may want to ask if you can attend a meeting. Although this may not be possible—some companies prefer to limit the number of attendees at a meeting—it would allow you to gain helpful information directly from other users.

Does your technology partner provide you with easy and intuitive training and support?

While some applications may be “plug and play,” most will require a certain level of training. Getting a sense of how user-friendly the training is can help provide insight around your technology partner’s commitment to user experience design across all of its materials. User-centered training may involve short videos, web-based modules or super-user mentoring. Ideally, you want to avoid day-long didactic training sessions that provide limited value and take providers away from patient care.

User experience design is the linchpin for technology adoption. Technology companies that don’t place value on user experience in the design and development process could offer products that aren’t fully usable and don’t meet the needs of your organization. As such, asking deliberate questions about your partner’s view on the value of user experience is time well spent.

Lee Farabaugh is the chief experience officer at PointClear Solutions.

Readers Write: What to Consider Before Accepting Your Next Healthcare IT Position

July 10, 2013 Readers Write No Comments

What to Consider Before Accepting Your Next Healthcare IT Position
By Frank Myeroff

7-10-2013 5-46-19 PM

In a competitive and growing job market like healthcare IT, it might be tempting to accept the next attractive job offer you receive. But before you do, take time to consider certain predictors that could determine whether you will be successful on the job or regretful that you took the job.

Is the organization in line with my values, attitudes, and goals?

You may have heard that people are hired for skill but fired for fit. It’s true. That’s why it’s so important to make sure you mesh with the culture of a healthcare organization. Its culture is a combination of values, visions, attitudes, beliefs, and habits. These collective behaviors are taught to new members of the organization and affect the way people interact with each other and the way business is done.

What are the workload expectations?

Ask the hiring manager to address the workload expectations. There’s no doubt if you take the job that your boss will expect you to complete all your tasks on time and accurately. New hires usually want to meet and exceed organizational expectations by going over and above the job. But consider and evaluate if you have the staff support and resources you need to be successful.

Can I handle the commute to this job?

Always consider the commute to your job. Is it too far? How much will it cost? Gas? Parking? Will you need to be a “super commuter,” in other words, fly back and forth? The number of super commuters has increased sharply over the past few years. Be sure to determine your tolerance level and that of your family regarding the job commute.

What is the boss like?

Your career depends on understanding what makes the boss tick. Having a positive relationship with the boss is key to your success, but having a bad boss is the ultimate morale buster. Find out if the boss is a micro-manager or hands-off boss. Know if he or she has realistic or unrealistic expectations for employees. Find out if they foster innovation or discourage it. It’s important to work for a boss who values your efforts and makes it worthwhile to come to work every day.

What are the people like?

There may be a good reason why the job is open. Are the people the kind you want to work with, or are they the type to push your buttons? For a workplace to be really great, it’s essential that you have a good relationship with your co-workers, since you will see them so often, work with them on projects, and interact with them on a daily basis. For an office to be truly productive, there has to be some sort of harmony and cohesiveness.

What is the career progression?

This is an exciting time in healthcare IT. The demand for talented IT professionals continues to grow, and the opportunities for advancement have never been better. The healthcare organization you join should be committed to meeting your current career aspirations as well as fostering your future career path.

What is the training offered?

There is clear, strategic value in continuously training and developing staff. Not only does it enable the healthcare organization to meet its mission, but it also allows its professional IT staff to stay current and ready for upcoming changes and trends. When considering your next IT position, make sure the organization places a strong emphasis on training and development at all IT levels. Training should focus on individual needs, such as job-related and specialized training, as well as collective needs, such as leadership and time management.

Before you jump to accept that job offer, remember that an offer is only half of the equation. The other part is performing your due diligence. Make sure the healthcare IT position and organization match your “must haves” both professionally and personally.

Frank Myeroff is managing partner and VP of business development and operations of Direct Consulting Associates of Solon, OH.

Readers Write: Uncovering the Unexplored Role and Benefits of Clinical Data Abstraction

July 1, 2013 Readers Write 1 Comment

Uncovering the Unexplored Role and Benefits of Clinical Data Abstraction
By George Abatjoglou

7-1-2013 8-35-33 PM

With the industry’s move to electronic health records (EHRs), healthcare information management (HIM) professionals as well as RNs must play a guiding role through implementation and beyond, seeking processes and solutions to manage the conversion of enormous amounts of historical health information into meaningful, structured data. One helpful and often underexplored strategy for getting organizations up and running smoothly on an EHR involves pre-go live clinical data abstraction.

Historically, clinical data abstraction has been used to retrieve meaningful information that exists in unstructured formats – paper or otherwise – to fill in the “holes” in the electronic chart. Typically, this occurs so an organization can perform better financial or quality reporting. However, with the current push to implement EHRs within the ambulatory setting under Meaningful Use, healthcare organizations often underutilize or completely overlook clinical data abstraction as a strategy for jump-starting the EHR rollout process. In short, by populating EHRs with these details from paper charts and unstructured legacy EHRs before organization-wide rollout, organizations from physician practices to health systems and ACOs can reap EHR benefits more quickly, while ensuring more optimal data quality and integrity.

Whether patient records are electronic or paper-based, most contain “legacy” health information that requires someone to pluck relevant data from unstructured content and incorporate it in a structured, representative history in the EHR. While you may think it sounds like copying and pasting, the medical and scientific nature of the information makes this more complicated than it seems. In other words, clinical abstraction, when done well, is best left to the experts, including trained and credentialed HIM professionals and RNs who are consistently focused on clinical data integrity in their day-to-day roles.
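To see why abstraction is more than copy-and-paste, consider a deliberately naive rule-based sketch. The note text, the single problem term, and the negation rule below are all hypothetical illustrations, not a production approach:

```python
import re

# Naive abstraction pass: pull "diabetes" mentions from free text into a
# structured problem list. Keyword matching alone would wrongly abstract
# "no history of diabetes" as an active problem, so a crude negation
# check is layered on top.
NEGATIONS = re.compile(r"\b(no|denies|negative for)\b[^.]*?\bdiabetes\b", re.I)
MENTION = re.compile(r"\bdiabetes\b", re.I)

def abstract_problem(note):
    """Return ['diabetes'] only if it is mentioned and not negated."""
    if MENTION.search(note) and not NEGATIONS.search(note):
        return ["diabetes"]
    return []

print(abstract_problem("Assessment: type 2 diabetes, well controlled."))
# → ['diabetes']
print(abstract_problem("Patient denies chest pain; no history of diabetes."))
# → []
```

Even this toy rule only scratches the surface: real notes add abbreviations, synonyms, family history, and ambiguous context that trained HIM professionals and RNs resolve routinely, which is exactly why the work is best left to them.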

Practice makes perfect, and experts in this arena are skilled magicians at identifying and pulling nuggets of information that will provide practitioners with the most valuable details moving forward, especially from a continuity of care perspective. Moreover, these individuals understand data in a broader way than a coder might, and as a result, take into consideration the different clinical components that shape the picture of a person’s whole health as they mine critical details for the new EHR.

Leveraging clinical data abstraction as a strategic step in EHR population can take two forms.

  • Existing resources. Healthcare organizations can facilitate the effort themselves by using staff clinicians and/or hiring additional nurses, medical assistants or students to abstract clinical data.
  • Outsourcing. Other healthcare organizations work with partners who embed clinical experts within the organization to facilitate the process.

Although it’s feasible for smaller healthcare organizations—community hospitals, critical access hospitals, and small practices, for example—to abstract and manage data internally, clinical data abstraction becomes increasingly complicated for larger physician groups and health systems that provide care to hundreds if not thousands of patients in a given day. When these large systems try to enlist internal staff to conduct data abstraction and enter historical data into EHRs, they are likely to run into roadblocks.

For example, relying on internal resources for data abstraction will further decrease the productivity of clinicians and HIM professionals already diminished by an EHR implementation and preparing for the ICD-10 deadline. Clinicians typically decrease the number of patients seen during the EHR implementation period in order to adjust to the new workflows demanded by the technology. A physician who normally sees 20 patients per day may need to decrease patient appointments to 12 per day and gradually work back to a normal activity level after several months. If paired with abstraction responsibilities as well, the productivity decline is often viewed as too steep.

Although it may seem counterintuitive, using internal resources also can lead to the generation of even more non-standard, unstructured data. With patient care being the top priority for clinicians, abstracting clinical data and entering historical information may not always be executed in the same way as by a full-time abstractor whose sole focus is that one task, guided by standardization across every record. While some data is better than none, unstructured data within an EHR offers benefits not much different from those of paper-based records, which defeats the value of EHR implementation.

An EHR implementation can only be as successful as the quality of its data. As the saying goes, “garbage in, garbage out.” Regardless of an organization’s decision to use internal or external resources, clinical data abstraction overseen by seasoned HIM professionals and supplemented by knowledgeable RNs offers several benefits—some of which weigh more heavily in favor of using outside consultants:

  • Improved data integrity. As healthcare organizations go live with EHRs, data need to be organized in a structured and sustainable format to provide consistent core medical content for clinicians across all patient records.
  • Increased patient safety. When data consistently and accurately reflect patient conditions in a streamlined, structured format, EHRs become easier to navigate from a decision-making and care management perspective, contributing to increased patient safety and care quality.
  • Enhanced productivity and satisfaction. By relying on outside experts rather than tapping internal resources to abstract and enter historical data, clinicians’ time is maximized and remains focused on providing patient care, while internal HIM professionals are able to focus on other mission-critical tasks like ICD-10 training.
  • Better patient experience. Tasking clinicians to enter data does not add value to the delivery of care, nor does it contribute to the clinician–patient interaction. Unfortunately, with the learning-curve that often accompanies EHR implementation, a patient appointment can become rather data-driven and impersonal if clinicians spend more time looking at the computer screen than their patients. Using data abstraction experts allows physicians to maintain a positive “human” interaction with the patient, a critical component to meeting patient expectations.
  • Higher return on investment. No matter who facilitates it, there is an absolute cost associated with abstracting clinical data. Outsourcing the process does carry an initial expense, which may then be recouped by physicians’ sustaining their activity loads. On the other hand, revenue lost through decreased provider productivity when clinicians are tasked with performing data abstraction may not be regained. Aided by the improved patient experience, outsourcing the clinical data abstraction effort may also lead to additional gains such as practice expansion and patient retention.

The number of provider choices for patients is multiplying and steering healthcare into a more consumer-driven model. The healthcare organizations that thrive into the future will be the ones that safeguard data integrity and use it to streamline the physician/patient interaction. Tapping into the data management expertise of HIM professionals in particular and using clinical data abstraction to improve data quality, patient safety, and clinician productivity is one key to providing a positive experience for patients and clinicians alike – both throughout and beyond an EHR implementation.


George Abatjoglou is CEO of IOD.

Readers Write: My Tradeshow High Horse

July 1, 2013 Readers Write 2 Comments

My Tradeshow High Horse
By Annie Oakley

Perhaps you’ve read HIStalk posts in the past – particularly after HIMSS – lambasting the poor showmanship of exhibitors at tradeshows. Eyes down, phones on, beckoning smiles nonexistent.

You may also have read subsequent reader comments from said exhibitors attempting to explain away their need to ignore attendees for the sake of an incoming service call. I get it. Everyone has multiple jobs to do while at a conference. I’d be surprised to find a healthcare professional – provider or vendor – who doesn’t wear multiple hats these days and isn’t taken advantage of by 24/7 connectivity.

But, like many others out there in HIStalk land, I say turn your phones off when you’re in the booth. If you need to take or make a call, exit the exhibit hall.

I’ve been on both sides of the booth at tradeshows over the years, so I feel qualified to stay up on my high horse for just a few more paragraphs about my recent trip to the HFMA ANI show. It left me optimistic about tradeshows overall, but with a bitter taste in my mouth on more than one occasion.

The HFMA staff and volunteers were incredibly helpful, always with smiles on their faces and good attitudes to back them up. The majority of exhibitors I had a chance to approach were pleasant to speak with. Some were downright engaging, leaving me with lasting positive impressions of their employees and brands. Most were extremely patient in explaining revenue cycle concepts and challenges – not easy to absorb completely on the first go-round.

The “booth babe” phenomenon is dying, though unfortunately only a slow death. I found out during an educational session that HFMA membership is 60 percent women. Do exhibitors really think they’ll attract female attendees with models dressed up in racing gear? I saw one male attendee look happy enough as he posed for pictures with them, and I shook my head in shame. Is that really how you want to get your leads? Is that really the impression you want to leave people – mostly female people – with?

Drew Brees was on the show floor for a time signing autographs, an attraction which drew a few dozen folks into a line that crisscrossed the exhibit floor. Now that’s a way to create buzz without alienating anyone.

One more comment, then I’ll get off my high horse. Exhibitors, please don’t be stingy with your giveaways. You and I both know that come the last day of the show, you’ll be moaning and groaning about having to ship them back. I approached one booth BECAUSE of their unique giveaway, but was immediately turned off when the rep, thinking I’d already been by, gave me the cold shoulder. I pleasantly explained to him that we had indeed conversed the day before, but I had not acquired any of his trinkets. He apologized – sort of – and actually said he hates having to talk to people twice! Buddy, if you don’t like talking to people twice, maybe you shouldn’t be in sales.

This particular conference was a great experience for me overall. The positives far outweighed the negatives. But, it’s true what they say: one bad apple can spoil the tradeshow bunch.

Readers Write: Health Data Analytics Provides Greater Value Over Big Data

June 26, 2013 Readers Write 9 Comments

Health Data Analytics Provides Greater Value Over Big Data
By Joe Crandall

6-26-2013 6-39-39 PM

Like you, I’m tired. I am tired of the latest buzzword in healthcare circles: “Big Data.”

The problem I see as a healthcare professional is that most experts are not offering solid, realistic ideas about how to leverage data at the decision-maker level. Most articles and experts are talking about using data to fundamentally change healthcare (genomics, population health, etc.). How many times have you heard that some new something was going to change healthcare forever? These experts are doing a disservice to the large majority of hospitals and health systems out there. I suggest you forget the term “Big Data” and begin to think about Health Data Analytics (HDA).

The truth is that most hospitals have been using health data analytics to some degree for a long time. Because of external and internal drivers, healthcare organizations are now being pushed to do more with less. That means leveraging their data and tools more efficiently. This isn’t about predictive analytics. It is about giving the clinical decision maker the information they need when they need it so they can make better decisions to drive better outcomes.

Six things to think about in regard to HDA:

  1. Ignore the hype. Don’t fall for the sales pitches and doom and gloom if you haven’t bought a business intelligence (BI) tool yet. About 90 percent of the hospitals out there are in the same boat as you. The hospitals giving the “Big Data” talks have been on that path for decades and have spent millions of dollars. Not surprisingly, they are only starting to leverage the data for research. You don’t need “Big Data” — you need analytics.
  2. Be realistic. Let me say that again: be realistic. You are not going to go from a data-averse culture to a data-driven culture overnight. You aren’t going to be able to convince everyone this is the right project to invest in. Buying the Best in KLAS BI vendor is not going to magically transform your organization. If you do decide to buy a BI tool, be realistic when setting expectations with a BI vendor. The implementation won’t be as easy as they say, and people won’t flock to the platform as quickly as they say. In fact, it is like every other platform IT has installed. Focus on the people rather than the technology for lasting success.
  3. Conduct an in-depth assessment. Before you start an HDA program, take an honest assessment of your current state of health data readiness. A readiness assessment saves money in the long run by clearly identifying any gaps in skills, tools, or process. Answer some basic questions first. Does our organization have a culture of sharing data? Do we have a good data governance program in place? Do we have data integrity issues? Do our people know how to use the information we can provide? Knowing where you are starting and your end goal is an important part of any project. A great assessment will help you plan to reach your goals with clearly laid out courses of action.
  4. Start small. HDA projects need to start small with scalable and sustainable processes that will allow the program to expand intelligently. While in the military, we used the “crawl, walk, run” methodology, and it applies to implementing an HDA program at your facility. Do not start running with “Let’s change the discharge process” as your first HDA project. A better and more focused choice could be to crawl with “On the labor and delivery floor, how do we discharge patients before 11 am?” Start small with big results. Then grow.
  5. Grow intelligently. Once that first project is a success, look into expanding under the guidance of a strong executive sponsor and a competent governance structure. Keep in mind that you don’t need to duplicate the first project throughout your facility – you need the ability to replicate it. Duplication implies a direct copy, while replication allows variances for each situation that might be encountered while implementing the new way of doing business. Once people start to see the benefit of a data-driven culture, requests for projects will pour in and the organization will need a plan to intelligently address all requests and aggressively pursue the best ones.
  6. Focus on your people. Most important is the focus on the people. Each person in your organization has a level of decision-making maturity that may or may not allow them to leverage the HDA program effectively. This is why certain programs are successful under the leadership of one person but flounder once that leader moves on. It is why one person can look at raw data, see patterns in the business, and make decisions that drive action. It is why a project can be run successfully by staff even when led by an inept leader. It is the maturity of each individual that will determine the success of the HDA program, not the tools or the platform.

The need to leverage data for a competitive edge is upon us. Healthcare organizations are being asked to improve outcomes as the main driver for improving the bottom line. A data-driven culture will transform an organization from volume-based to value-based, but it will take time and the right people. Focus on one project initially, guided by a strong executive sponsor and using a process that is scalable and sustainable.

If you do this, before you know it, your organization will be utilizing health data analytics to make more intelligent decisions that will ultimately improve outcomes. You will have created a data-driven culture.


Joe Crandall is director of client engagement solutions for Greencastle Associates Consulting.

Readers Write: The Case for One Source of Truth

June 26, 2013 Readers Write 4 Comments

The Case for One Source of Truth
By Deborah Kohn

The notion of managing and being accountable for the health status of defined populations requires much more sophisticated clinical data collection methods and skills than most healthcare organizations have today. However, for decades, numerous coded systems have been used successfully to capture clinical data for reporting purposes, such as quality initiatives and outcome measurements, as well as for reimbursement and myriad other purposes.

Such coded systems, which health information professionals categorize as either clinical classification systems[1] or clinical terminology systems[2], can continue to be used to assist in determining prospective, pre-emptive care management for covered populations. However, no single classification system meets all use cases. ICD-9-CM does not contain medications. ICD-10-CM does not address functional status. Likewise, no single terminology system meets all use cases. LOINC is used to encode laboratory data, SNOMED CT is used to encode clinical care data, and RxNorm is used to encode medications.

Consequently, using the existing or newer coded systems to meet any of the fast-growing clinical data collection and analysis initiatives presents a significant challenge: too many systems from which to choose, hindering efforts to turn the collected data into actionable information for interoperability and health information exchange. To resolve this challenge, one "source of truth" or central authority platform (CAP) for all clinical data capture systems, existing and new, allows all coded systems to be used to capture and exchange information.


© Deborah Kohn 2013

With one CAP, healthcare organizations need not be concerned about when to use which data collection system for which purpose. Organizations are able to capture required clinical, financial, and administrative data once and use it many times, such as for adjudication and information governance purposes. In addition, organizations are able to compare the data for data integrity purposes. More importantly, organizations are assured that electronic healthcare data input by different users is semantically interoperable, i.e. the data are understood and used while the original meaning of the data is maintained.

For example, for a typical diabetic patient, Reference Lab #1 might denote glycohemoglobin within the chemistry panel, Physician Office Lab #2 might denote glycohemoglobin as an independent test, HgbA1c, and Hospital Lab #3 might use the embedded LOINC code 4548-4. The central authority platform recognizes that all three laboratory information system inputs represent the same test: glycohemoglobin. Subsequently, the healthcare organization’s electronic health record (EHR) or business intelligence system makes use of the common meaning and, for example, generates a trend analysis of the patient’s glycohemoglobin results over time.
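
The cross-mapping step described above can be sketched in a few lines. This is a minimal illustration and not any real product's API: the source-system names, the `CROSS_MAP` table, and the `normalize()` function are invented, and only the LOINC code 4548-4 comes from the example above.

```python
# Minimal sketch of a CAP's normalization step: three local ways of
# denoting glycohemoglobin all resolve to one canonical concept.
CROSS_MAP = {
    # (source system, local term or code) -> canonical concept
    ("reference_lab_1", "glycohemoglobin"): "LOINC:4548-4",
    ("office_lab_2", "HgbA1c"): "LOINC:4548-4",
    ("hospital_lab_3", "LOINC:4548-4"): "LOINC:4548-4",
}

def normalize(source: str, local_term: str) -> str:
    """Resolve a source-specific term to the canonical concept."""
    try:
        return CROSS_MAP[(source, local_term)]
    except KeyError:
        raise ValueError(f"unmapped term: {source}/{local_term}")

# All three inputs resolve to the same concept, so a downstream EHR or
# BI system can query a single code when building the trend analysis.
concepts = {normalize(s, t) for (s, t) in CROSS_MAP}
```

Because every input resolves to one concept identifier, the trend analysis becomes a query against a single code rather than a reconciliation across three vocabularies.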

Developing a CAP requires considerable effort. The platform must be able to store all coded values, metadata, and all the content / terms. It must be able to normalize and catalog all the content / terms. It must be able to track all changes in content identifiers, watch for differences in terms, cross-map the content, route the content while preserving the data and context, and regenerate the data and content as it was stored. Finally, it must be able to manage all the content updates / releases. Today, both the public and private domains have been moderately successful in developing the platform.
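
The release-management requirement can be illustrated with a rough sketch: a CAP stores each content release so that data coded under an older version can still be resolved exactly as it was stored. The class names and release years below are assumptions for illustration only; the two ICD-10-CM codes and their descriptions are real.

```python
from dataclasses import dataclass, field

@dataclass
class TermRelease:
    """One published version of a code system's content."""
    version: str
    terms: dict  # code -> display name

@dataclass
class CodeSystem:
    name: str
    releases: list = field(default_factory=list)

    def publish(self, version: str, terms: dict) -> None:
        """Record a new content release without discarding older ones."""
        self.releases.append(TermRelease(version, dict(terms)))

    def lookup(self, code: str, version: str = None) -> str:
        """Resolve a code against a specific release (default: latest)."""
        release = self.releases[-1] if version is None else next(
            r for r in self.releases if r.version == version)
        return release.terms[code]

icd = CodeSystem("ICD-10-CM")
icd.publish("2013", {
    "E11.9": "Type 2 diabetes mellitus without complications"})
icd.publish("2014", {
    "E11.9": "Type 2 diabetes mellitus without complications",
    "E11.8": "Type 2 diabetes mellitus with unspecified complications"})
```

Keeping every release, rather than overwriting terms in place, is what lets the platform regenerate data and content "as it was stored" when an older record is retrieved.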

The Office of the National Coordinator for Health Information Technology (ONC) and the Centers for Medicare & Medicaid Services (CMS) collaborated with the National Library of Medicine (NLM) to provide the Value Set Authority Center (VSAC). VSAC is to become the public domain, central authority platform for the official versions of the value sets that support Meaningful Use’s 2014 Clinical Quality Measures (CQMs). However, currently VSAC does not go far enough to cover all use cases.

In the private domain, several health information technology vendors provide most of the required capabilities of the CAP. Interestingly, these vendors collaborated with clinical professionals to create categories of coded systems for describing their products that differ from the categories created decades ago by health information professionals. For example, the vendors refer to any coded system used for capturing and exchanging data as a “terminology” system, even though some of these systems are categorized by health information professionals as classification systems. In addition, the vendors categorize all “terminologies” as either standard[3] or local[4]. Some of these vendors go even further, categorizing all “terminologies” as either retrospective or point-of-care[5]. Consequently, today there are not only too many coded systems for data capture and exchange from which to choose, but too many categories of coded systems to make sense of it all.

Assuming that both public and private domain CAP options will prevail, healthcare organizations can expect widespread use of the platforms, allowing EHRs and other electronic records, such as financial records, to incorporate multiple coded systems for specified needs. In addition, workforce demands for the clinical informatics skills needed to manage all the coded data will continue to remain strong.

[1] Clinical classification systems, such as ICD-9-CM, ICD-10-CM, and ICD-10-PCS derive from epidemiology and health information management. These systems group similar diseases and procedures based on predetermined categories for body systems, etiology or life phases. As such, they organize related entities for easy retrieval. They are considered “output” rather than “input” systems and were never intended or designed for the primary documentation (or input) of clinical care.

[2] Clinical terminology systems (a.k.a., nomenclature or vocabulary systems), such as SNOMED CT and RxNorm derive from health informatics. These systems are expressed in “natural” language, and, typically, codify the clinical information captured in an electronic health record (EHR) during the course of patient care (because the number of items and level of detail cannot be effectively managed without automation). As such, they are considered “input” systems.

[3] Standard terminologies consist of “administrative” terminologies, such as ICD and CPT, and “reference” terminologies, such as SNOMED, LOINC, RxNorm, and UMLS.

[4] Local terminologies are those that healthcare providers, such as laboratories or physicians, use on a daily basis in their records, on the telephone, etc., to describe specific diagnoses and procedures.

[5] Retrospective terminologies consist of all standard terminologies (administrative and reference) and local terminologies, while point-of-care terminologies are those that are healthcare provider-friendly and used for specific documents.


Deborah Kohn, MPH, RHIA, FACHE, CPHIMS, CIP is a principal with Dak Systems Consulting.

Readers Write: My Notes On Last Week’s Senate Finance Committee Hearing

June 24, 2013 Readers Write 2 Comments

My Notes On Last Week’s Senate Finance Committee Hearing
By Data Nerd

In a rare twist of fate, I had some down time last week in between deadlines and got to choose between a variety of Congressional hearings to ridicule observe. While I’d really have loved to see Gen. Alexander prove that the NSA has foiled a legitimate terrorist threat, I decided to go with the Senate Finance Committee’s hearing on the dually-pressing grievances of high prices and low transparency in the health care industry as enumerated in Steve Brill’s Time piece, “Bitter Pill: Why Medical Bills are Killing Us.” The hearing lasted about as long as it took me to read the original article and unfortunately I couldn’t “observe” all of it, but here are the questions and responses I found most relevant on the topic.

Sen. Baucus kicked off the questioning by stating that disclosure alone may not be sufficient to bring down healthcare prices and asked each of the panelists to supply a solution to the problem. Mr. Brill pointed out that injecting competition into the insurance market alone doesn’t guarantee price reduction. He brought up the large amounts of campaign contributions made by the healthcare industry to each of the members of the committee, the least of whom accepted half a million dollars in the past five years. Suzanne Delbanco, executive director of Catalyst for Payment Reform, stated that consumers tend to assume that a higher price means higher quality, while Paul Ginsburg, president of the Center for Studying Health System Change, suggested changing benefit design so that consumers care which provider they see.

Sen. Hatch questions what type of data is being released and how reliable and useful it is to consumers. Dr. Ginsburg homes in on insurers and employers as the best sources of consumer healthcare pricing data, stating that the data must be customized to reflect the details of a particular health plan, and that these organizations are in the best position to provide that.

Sen. Hatch shifts focus to hospital chargemasters: “If they are only marginally relevant, what steps should we take to move away from these systems and replace them?” Dr. Delbanco responds by agreeing that the CMS pricing data release was a great education for all concerning price disparities, but that providers and consumers need to understand the costs of delivering care and the costs of delivering high-quality care.

Sen. Thune next takes the floor and cites some state measures to publish price lists. He asks Dr. Delbanco if published price lists for elective procedures are effective in putting market pressure on hospitals. Dr. Delbanco states that very little research has been done on whether consumers use this data, but that it is a beginning. She stresses the need for customization to make the data usable, namely connecting price data to health plan specifications.

Sen. Thune astutely acknowledges the role of recent regulations in pushing the industry towards more consolidation and asks what role this plays in pricing and whether antitrust laws need to be reevaluated in light of this shift. Dr. Ginsburg says that the best approach is to take steps to make the market more competitive despite its consolidated state. He mentions a need to revisit FTC Safe Harbor policy to require demonstrations of benefits for patients, and asserts that government can take a legislative approach to outlaw non-competitive contracting practices between health plans and providers.

Sen. Burr asserted that “seniors don’t like choice” and that, “faced with healthcare decisions, their [adult] children are increasingly being turned to rather than healthcare providers.” He also offered that it “would be a cheap shot” to characterize the donations that healthcare organizations have made to him as informing the healthcare legislation he has written. Mr. Brill pointed out that he had not accused him of such.

Sen. Rockefeller brought up the “public option” and the fact that everyone loved it but no one voted for it, so it was replaced with a “medical loss ratio” that resulted in private insurers being forced to issue rebates to consumers. He brings it all home by praising Congress on the establishment of IPAB to take the power of the purse away from lobbyists and Congress and give it to physicians who can make “wiser” decisions to save Medicare dollars. To this, Dr. Ginsburg responds that IPAB is “constrained,” with only the authority to squeeze money from reimbursement. Reimbursement, he says, is on autopilot, and Congress can still lower reimbursement amounts at will. Instead, he expects more savings to come from innovative reimbursement models.

Sen. Baucus highlights the price variations and states that “he saw a chart somewhere” showing that Medicare reimbursement amounts do not vary as much as private insurance reimbursement. He asks why this is so and whether CMS has access to private insurance reimbursement data. Dr. Ginsburg agrees with Sen. Baucus’s assessment and asserts that new reimbursement models should address price variances. He mentions regulating private prices as Maryland has done since the late 1970s. Brill asserts that a five-column list should be made public: what Medicare pays, what the chargemaster charges, and what the three largest insurers pay for the same service. Dr. Delbanco asserts the need for quality input. She states that it matters little what you pay for a service unless the quality is satisfactory.

Sen. Menendez quickly launched into an attack, stating that Mr. Brill’s article did little to acknowledge how healthcare reform is addressing price disparities. Brill interjects and refers the senator to a specific paragraph of the article, to which the senator tells him to wait until he is done stating his question. He then attempts to corner Mr. Brill into agreeing that Obamacare addresses price volatility by eradicating low-quality health insurance plans and expanding coverage for citizens. Mr. Brill maintains that, while beneficial in other areas, the ACA does not directly address price variation in the market. Menendez asks him if he believes prices should be controlled by the government. Mr. Brill states that he believes “patented, life-saving drugs” should be controlled, but not procedures, and that “some interference is needed to preserve a free market.”

Sen. Baucus asks why hospitals are so fancy and compares healthcare to education and insurance to student loans. Dr. Delbanco points out that patients do not have data on which to base their provider choice, so they generally go on perception of facilities. Dr. Ginsburg states that consumers are removed from cost.

Sen. Schumer points out that higher costs at teaching hospitals are justified because they typically treat more rare, last-resort patients.

Sen. Baucus proposes an entrepreneurial approach to itemizing costs at a hospital on any given procedure and making that data available to consumers. Ultimately, he asked “What data, if any, should be proprietary?”

Overall, the senators prepared meaningful questions for the panelists and received well-thought-out responses that convey the complex nature of this issue. Consumers do not want massive files of raw data to pore over; they want someone to provide the information in a way that is personalized, comprehensible, trustworthy, and ultimately actionable. Doing this will require a complex system of cost-to-quality analysis coupled with personal health and insurance policy parameters.

In my opinion, any true consumer solution will offer an element of predictive capability on which to base insurance and provider choices. To the entity (or entities) that can provide this in the least-intimidating way go the spoils. Who knows whether it will be insurance companies themselves, a joint venture between them and employers, or an entrepreneurial one-size-fits-all solution? 

I’m giddy to see the day when I can not only predict my tax burden six months in advance and strategize how to minimize it for free online, but also chart out a course for my family’s healthcare and make informed decisions about how much coverage we need and where we should go to get care.

Readers Write: Through a Different Lens

June 21, 2013 Readers Write 3 Comments

Through a Different Lens
By Kathy Krypel


In the end, it was hepatitis. Not some organized alphabetized version, but a quick, no-holds-barred attack from inside that would give me 10 days in the hospital and a look at healthcare from a very different perspective.

I am a clinician. I am also a healthcare IT expert. And now, I am a patient.

My induction into patient life was abrupt and unexpected. I, who had not been hospitalized in 30 years, was afflicted with sepsis in very short order. The trip to the emergency department, the 103 degree fever, and the 10 days spent in the hospital are all a bit of a blur.

Looking at it weeks later, from the slow recovery side of things, I offer these observations.

The Clinicians

I don’t know if they still teach something called “bedside manner,” but my experience with clinicians varied significantly. On the high end of the scale were the infectious disease doctor and hospitalist who coordinated care, modeled teamwork, and went out of their way to explain tests and procedures to me and my family. On the low end was the consulting physician, who referred to me as the “bile duct in 52” in a hallway conversation that I happened to overhear.

The nursing, lab, radiology, and transport staff will forever have my gratitude for the way they fiercely protected my modesty (even when I was too sick to care), kept me informed about test results, and treated me and my family with utmost kindness.

The Electronic Medical Record

Ironically, I actually helped build the EMR and train users at the hospital where I was admitted. It was astonishing and very impressive to see it in action. I was able to see how quickly blood test results came back, watch the multiple ultrasounds and CT scans, and even observe my own liver biopsy.

It was fascinating, but reminded me that the EMR is only a tool that offers safeguards and suggestions. The physicians on my case were dogged in their pursuit of this infection, but even with the best of electronic records, they could not grow a blood culture faster or obtain instantaneous results on lab draws. These just take time. As good as an EMR is, it can help with the diagnostic process, but cannot magically make it faster.

The Patient

At the end of the day, it’s the human things that I will remember most – the infectious disease doctor who held my hand in the ED, the hospitalist who sat on the end of my bed for 30 minutes, explained what was happening, and said that she would “tell us when to worry,” and the number of nurses who looked me in the eye and said, “I am so sorry this is happening.”

Despite advances in healthcare information technology, there’s still an inherent need for the personal connection – the relationship. That is the vehicle for healing. As the industry tackles the patient engagement challenge, the relationship – the patient experience – truly is at the center.

Kathy Krypel, LICSW, PMP is a master advisor for Aspen Advisors.

Readers Write: What’s in YOUR Medical Record?

June 21, 2013 Readers Write 4 Comments

What’s in YOUR Medical Record?
By Ken Schafer


If my wife were admitted to the hospital with diabetic ketoacidosis (DKA), I’m pretty sure I wouldn’t want her electronic record to erroneously record a leg amputation (BKA). I’m equally confident that if this documentation mistake were made, I wouldn’t care too much how it happened. I would just want it fixed.

And if incorrect documentation on my diabetic wife resulted in an incorrect treatment course, which resulted in her death? You might end up with a $140 million verdict like this one.

Inga’s post on The Atlantic’s “The Drawbacks of Data-Driven Medicine” (from Big Datty, on 6/12/13) illustrates something that we all know to be true. Our medical records often contain mistakes, and electronic errors perpetuate themselves embarrassingly quickly. But her comments – and the source article – miss two very important points.

Doctors are responsible for the content of the records they create. This is true regardless of the method used to document patient encounters. Blaming the speech recognition system for hearing “DKA” instead of “BKA” makes no more sense than blaming a keyboard for a typographical error. If the physician picked the wrong checkbox on an EHR interface, would that be the fault of the EHR? Of course not.

Speech recognition, keyboarding, and dropdown menus are all methods for data capture. For that matter, so is a more traditional transcription process. But all of these methods have one element in common: the final content should be reviewed and validated by the documenting clinician. Physicians who fail to do this put their patients at risk.

Doctors make mistakes. I know a radiologist who dictated “liver” when he meant “heart.” The transcriptionist dutifully returned the report with the word “liver,” and it was signed by the physician. When the mistake was discovered, the audio was retrieved. The doctor listened to himself dictate the wrong organ, and blamed the transcriptionist. The point? Doctors are people, and people make mistakes, whether they own up to them or not.

That same physician was convinced speech recognition would eliminate transcription errors, and he was right – sort of. What speech recognition systems really do is eliminate transcriptionists, not errors. If radiologists are involved, there will still be errors. There’s no speech recognition system that will hear the word “liver” and change it to “heart.”

In fact, in our DKA:BKA example, the doctor may have had a bad day and actually said BKA to the speech recognition system. No matter what, though, the doctor made a mistake – either in what he said, or in what he saw on the screen and failed to correct.

Those with experience greater than mine often post to HIStalk about the shortcomings of EHRs in terms of the data they contain, with usability and completeness being favorite topics. My concern for our records is more specific. Especially when speech recognition is involved, what metrics do we have in place to make sure that narrative data is recorded accurately? If doctors are responsible for the content of their documents, and we know they make mistakes, how do we monitor and improve the quality of the narrative components of our EHRs?
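
One possible answer to the metrics question, offered here as a sketch rather than an established practice, is to measure how much the clinician changes between the speech-recognized draft and the signed report. The sample texts below are invented, and the simplified word-difference metric stands in for a full Levenshtein-based word error rate; it uses only `difflib` from the Python standard library.

```python
import difflib

def word_error_rate(draft: str, final: str) -> float:
    """Fraction of words changed between draft and final (0.0 = identical).

    A simplified proxy for WER: words outside the longest matching
    blocks are counted as edits.
    """
    d, f = draft.split(), final.split()
    matcher = difflib.SequenceMatcher(a=d, b=f)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    edits = max(len(d), len(f)) - matched
    return edits / max(len(f), 1)

# In the DKA:BKA example, one word out of six differs between the
# recognized draft and the corrected, signed report.
draft = "patient admitted with BKA and dehydration"
final = "patient admitted with DKA and dehydration"
rate = word_error_rate(draft, final)
```

Tracked over time per physician or per document type, even a crude rate like this could flag documents that are being signed without correction, which is exactly the failure mode the DKA:BKA story describes.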

As the government, physicians, patients, and the free market determine what systems we are to use and how they should work, we should never lose sight of this one truth: no matter what’s in the record, it should be right.


Ken Schafer is executive vice president, industry relations for SpeechCheck.
