
Readers Write: Don’t Get Stuck in the Readmissions Penalty Box

November 9, 2016

By Lisa Lyons

The Hospital Readmissions Reduction Program (HRRP) requires the Centers for Medicare and Medicaid Services (CMS) to reduce payments to inpatient hospitals with relatively high 30-day readmission rates. CMS applies up to a three percent reduction for “excess” readmissions using a risk-adjusted ratio that compares a hospital’s performance to the national average for sets of patients with specified conditions.

Payment adjustments for FY 2017 (based on performance from July 2012 through June 2015) will be applied to all Medicare discharges starting October 1 of this year and running through September 30, 2017. Payment reductions for FY 2017 will be posted on the Hospital Compare website this October.

Total HRRP penalties are expected to reach $528 million for FY 2017, up sharply from about $420 million in FY 2016, with more than half of the nation’s hospitals affected, according to a Kaiser Health News analysis. The average penalty will spike in similar fashion, from 0.61 percent in FY 2016 to 0.73 percent in FY 2017.

The situation calls for a thorough understanding of the readmissions penalty environment and a strategic mindset for taking action.

Prior to FY 2017, CMS measured excess readmissions by dividing a hospital’s risk-adjusted number of 30-day readmissions for heart attack, heart failure, pneumonia, hip/knee replacement, and COPD by the number that would be expected based on an average hospital with similar patients.
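In code, the core of that comparison is just a ratio per cohort. The sketch below is illustrative only: CMS’s actual measure uses a hierarchical risk-adjustment model, and the cohort names and figures here are hypothetical.

```python
def excess_readmission_ratio(predicted: float, expected: float) -> float:
    """Ratio of risk-adjusted predicted readmissions to the number
    expected for an average hospital with a similar case mix."""
    return predicted / expected

# Hypothetical (predicted, expected) readmission counts for one hospital.
cohorts = {
    "heart_failure": (31.0, 28.0),
    "pneumonia":     (22.0, 24.0),
    "copd":          (15.0, 15.0),
}

# A ratio above 1.0 in any cohort counts toward the overall penalty.
flagged = {name: excess_readmission_ratio(p, e)
           for name, (p, e) in cohorts.items()
           if excess_readmission_ratio(p, e) > 1.0}
print(flagged)  # only heart_failure exceeds 1.0 here
```

Note that in this simplified view, a cohort exactly at its expected count contributes nothing to the penalty; only cohorts above 1.0 are flagged.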

For FY 2017, CMS expanded the list of cohorts to include coronary artery bypass graft (CABG) procedures. The agency also broadened the existing pneumonia cohort: the assignment criteria now include aspiration pneumonia, as well as cases where a principal diagnosis of non-severe sepsis is accompanied by a secondary diagnosis of pneumonia. This creates a bigger set of patients from which a hospital could have readmissions. In fact, it may expand the pneumonia cohort by 50 percent in many hospitals.

Complicating matters, excess readmissions found in any of the six cohorts will result in an overall penalty. A hospital gets no credit for making readmissions improvements along the way.

At the same time, all hospitals are working on readmissions, so the average of excess readmissions is decreasing. That means it’s harder than ever for hospitals to stay under the penalty bar.

Also, due to HRRP’s reporting cycle, an excess readmission stays in CMS’s data for three years.

These factors make it hard for hospitals to know if they have passed the tipping point for readmissions penalties before notification from CMS — which typically happens just four months prior to penalties being imposed. In practical terms, there’s not enough time to impact results.

Further, analyzing CMS data is challenging for most hospitals because:

  • CMS data is retrospective. CMS calculates fiscal year penalties by looking back at data over a range of two to five years. As such, current improvements to readmission reduction programs will not be seen right away.
  • CMS data includes readmissions from “non-same” hospitals. Most hospitals can’t view cases where a patient initially admitted to their facility ended up being readmitted in another facility.
  • CMS data only includes readmissions among the Medicare patient population. Many commercial payers have instituted pay-for-performance programs, which should also be analyzed. Limiting your view to the Medicare HRRP program will only reveal part of your overall readmissions.
  • CMS’s Measure Methodology for Readmissions can’t be easily replicated. CMS risk-adjusts each qualifying patient using Medicare Part A and Part B data for a full year prior to admission, and 30 days post-discharge. Since hospitals don’t have access to this information, they can’t replicate the methodology to calculate their excess readmissions.

Fortunately, with the right data, there’s a way to emulate the CMS methodology to help estimate the volume of excess readmissions that will be attributed to your hospital. You can do so well before receiving your hospital-specific reports from CMS.

Here are four ways advanced analytics can help position hospitals to be more proactive in managing their readmissions:

  1. Purchase de-identified Medicare Part A and B claims data from CMS. Advanced analytics makes it possible to match historic claims data with known patients in your hospital information systems. In this way you can see longitudinal care histories for the patients you are discharging today. Algorithms can also predict the rate of non-same hospitalization from current readmission data, effectively filling in the blanks on readmissions that occur outside your hospital. That may give you up to two years’ advance notice regarding which readmissions will be counted as excessive. With that knowledge, you can do something about readmissions before the end of the evaluation period.
  2. Know how many readmissions will put you in jeopardy of incurring penalties. This is the previously mentioned tipping point. Surprisingly, for many hospitals, only a few excess readmissions per month can send them to the penalty box. Predictive analytics identify patients at greatest risk for unplanned readmissions. Look for algorithms with a high degree of accuracy in matching the CMS dataset to your own database to single out cases that were identified in the assignment criteria. Once you’re able to identify trends, you can fix the issues.
  3. Since CMS measures readmission back to any hospital, partner with other hospitals in your region to which you commonly refer patients back and forth. Concentrate on areas of improvement in either coordination or quality of care.
  4. Analyze clinical conditions across the board among your hospital’s patient population, not just within the six CMS-defined cohorts. Taking a broader view establishes more effective data patterning to help determine if a systemic problem exists. Dashboards and pre-formatted reports signal where to drill down for more detail (for example, whether you discharged the patient to home or a different care setting).
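As a back-of-the-envelope illustration of the “tipping point” arithmetic in point 2, the hypothetical sketch below budgets how many readmissions per month keep a cohort at or below its expected count. This is a deliberate simplification: the real CMS measure is risk-adjusted, not a raw count.

```python
def monthly_headroom(expected_for_period: float, actual_to_date: int,
                     months_remaining: int) -> float:
    """Rough monthly budget of readmissions that keeps a cohort's actual
    count at or below its expected count for the measurement period.
    Simplification: treats raw counts as the risk-adjusted figures."""
    remaining = expected_for_period - actual_to_date
    return max(0.0, remaining / months_remaining)

# Hypothetical cohort: 54 expected over the period, 41 so far, 4 months left.
print(monthly_headroom(54, 41, 4))  # 3.25 readmissions/month of headroom
```

The small size of that monthly budget is the point: for many hospitals, only a few excess readmissions per month separate staying under the bar from incurring a penalty.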

Government policy statements clearly indicate that Medicare payments will become more heavily weighted toward quality and value measures, and HRRP will be part of that determination.

What’s more, CMS has proposed that the readmission measure itself be expanded to count excess days associated with readmissions — taking into account ED patients and those assigned to observation status — rather than singular readmission events for inpatients. Expect increased involvement of care management and quality teams in this area, and another layer of potential penalties.

Don’t wait to react to how these measures will impact your hospital’s operations and finances. Now’s the time to implement data analytics tools to intelligently manage your hospital’s readmission risk with a high degree of accuracy.

Lisa Lyons is director of advanced analytics and population health and interim VP of consulting at Xerox.

Readers Write: Address the Disruption in Provider Data Caused by Clinically Integrated Networks and Value-Based Care

October 31, 2016

By Tom White


Hospitals that became health systems and are now morphing into clinically integrated networks (CINs) face increasing struggles managing their expanding patchwork of providers, including credentialed and referring physicians, APRNs, nurses, and other licensed professionals. Their provider count has often grown five- to ten-fold.

Not only are there more providers, but they are also working in a wider variety of outpatient care settings. This has been a boon for consumers, as there are now many new retail healthcare locations on neighborhood street corners, but it poses an increasing challenge from a provider data perspective. Who is providing the service? What are their affiliations with ACOs, next-gen ACOs, CINs, or narrow networks? Are they sanctioned?

These problems arise from the emergence of the retail healthcare economy. The resulting growth in provider data is creating obvious and not-so-obvious consequences: disruptions in the provider data management process that degrade the accuracy of the data.

Poor provider data management tends to hurt healthcare organizations much more than they realize, especially in the context of today’s emerging retail healthcare economy and value-based reimbursement market. For hospitals and providers to succeed in these circumstances it is imperative to drive out unnecessary costs, and outdated or inaccurate provider data is a hidden source of significant costs.

As hospitals and health systems develop new alliances, it is critical to know what providers are included in a CIN, including their roles and affiliations. Efforts to collaborate over large patient populations and control value-based payments require in-depth and proprietary knowledge of provider affiliations, practice scope, and their economic models. This information is mission critical for success. Using a system that manages provider data in these areas should be a business imperative for every health system executive.

Licensed healthcare provider data management programs have historically been managed by numerous, fragmented systems across the healthcare ecosystem. Many healthcare leaders believe that electronic medical records (EMR) systems and their health information exchange (HIE) modules, credentialing, and other modern back-office IT systems have made provider data more accurate, secure, and accessible. Perhaps this is so with patient data, but this is not the case with provider data. These enterprise IT systems provide numerous benefits and may even provide a repository for some provider data, but they are not inherently designed for ongoing management of this business-critical data.

Let’s think for a minute about some specific areas in which provider data plays a vital role. Do CINs know who their providers are? How do they take these new provider networks and build the tools for consumers and providers to search and find them? Simple natural language searching (think Google searches) is how the entire world except healthcare works. Accurate data on which providers are in-network, paired with modern search tools, should be a goal for all health systems and CINs.

Accurate provider data is critical to ensure that provider search tools can be the foundation of a successful referral management program. Potential patients who visit the hospital website and search for a local, in-network doctor or specialist expect the information they are presented with to be accurate and current. If not, a bad customer experience could mean the loss of a patient, a loss of trust, and perhaps worst of all, a bad online review by the patient.

Physicians’ use of these search tools to identify specialists to whom they can refer their patients is a critical aspect of referral management. The range of critical data relied upon now goes beyond simple contact information and insurance plan participation. It might include physician communication preferences, licensing data, internal system IDs, exclusionary lists, and other sensitive internal information. This information changes frequently, but users don’t have time to ponder these facts. Inaccurate information wastes time and hurts patient satisfaction.

Inaccurate provider data causes billing delays that hurt cash flow and increase days in A/R. Invoices sent to the wrong location or faxed to the wrong office are common in healthcare, never mind issues stemming from inaccurate or incomplete address information.

Beyond the clinical and financial performance gains, more accurate provider information can be used in consumer and physician outreach programs across the health system, whether part of a CIN or ACO. Hospitals are businesses, too. Historically, many of their patients may have been admitted through the ED, but increasingly they are referred by in-network physicians or come through another outpatient service. The hospital’s marketing department may want to reach out to a network of physicians within a 200-mile radius to encourage them to refer patients to its facilities, or simply to promote a new piece of equipment or innovative procedure that’s now available. The marketing department might run searches to find these physicians and contact them. Having accurate provider data ensures that these efforts are productive and efficient.

A tool is required that makes it easy for the appropriate teams in the health system to curate and update their health system provider data to create a single source of truth. This should include all credentialed and referring providers from across the entire healthcare organization, including acute, post-acute, outpatient, and long-term care environments.
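As a rough illustration of what such curation looks like at the data level, the sketch below collapses provider records from two hypothetical departmental systems into one record per NPI. Real master data management tools also handle missing NPIs, name variants, addresses, and field-level survivorship rules; everything here (names, NPIs) is invented.

```python
def merge_sources(*sources):
    """Collapse provider records from multiple departmental systems into
    one record per NPI, a simplified 'single source of truth' sketch."""
    merged = {}
    for source in sources:
        for rec in source:
            # First system listed wins; real implementations apply
            # per-field survivorship rules instead of record-level ones.
            merged.setdefault(rec["npi"], rec)
    return merged

# Hypothetical extracts from a credentialing system and a billing system.
credentialing = [{"npi": "1234567890", "name": "Jane Smith", "specialty": "Cardiology"}]
billing       = [{"npi": "1234567890", "name": "J. Smith",   "specialty": "Cardiology"},
                 {"npi": "9876543210", "name": "Ann Lee",    "specialty": "Oncology"}]

roster = merge_sources(credentialing, billing)
print(len(roster))  # two unique providers across both systems
```

Even this toy version shows the core idea: downstream consumers (search tools, referral management, marketing) query one curated roster instead of reconciling departmental extracts themselves.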

While health systems can develop data governance models that require all departments to verify the accuracy of their provider data and to specify how it should be shared, this is seldom a success. Most organizations don’t know exactly who is in their pool of licensed providers and historically there has not been an IT system that can provide this comprehensive capability.

Healthcare leaders have to take a proactive approach to provider data management and can no longer afford to deny the critical role this information plays in today’s increasingly complex and challenging healthcare system. In a fee-for-service world where practitioners are paid for whatever work they perform, it may not be as critical to have accurate provider data. But in today’s value-based care market, accurate provider data is critical for running an efficient, competitive, and profitable healthcare system.

Thomas White is CEO of Phynd Technologies of Dallas, TX.

Readers Write: Ready or Not, ASC X12 275 Attachment EDI Transaction Is Coming

October 17, 2016

By Lindy Benton


As electronic as we are in many aspects of business – and life in general – oftentimes healthcare providers and payers are still using paper for claim attachment requests and responses. With the ASC X12 275 attachment electronic data interchange on the horizon, the need for utilizing secure, electronic transactions will soon be here.

Let’s look at the claim attachment process.

  1. A claim attachment arises when a payer requests additional information from a provider to adjudicate a claim. The attachment supplies supporting documentation or answers questions not covered by the original claim.
  2. In many instances, the process for sending and receiving attachments is still largely done via a manual, paper-based format.
  3. Paper-based transactions are slow, inefficient, and can bog down the revenue cycle. Additionally, paper transactions are prone to getting lost in transit and are difficult if not impossible to track.
  4. The ASC X12 275 transaction has been proposed as a secure, electronic (EDI) method of managing the attachment request while making it uniform across all providers and payers.

The ASC X12 275 can be sent either solicited or unsolicited. A solicited 275 arises when a claim is subjected to medical or utilization review during the adjudication process and the payer requests specific information to supplement or support the provider’s request for payment. That request may be service-specific or apply to the entire claim. The provider then uses the 275 to respond within the timeframe the payer specifies.
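Like other X12 transactions, the 275 is delimited text: segments separated by a terminator, elements by a separator. The generic parsing sketch below illustrates that structure; the fragment shown is simplified for illustration and is not a complete, valid 275 transaction set.

```python
def parse_x12(message: str, seg_term: str = "~", elem_sep: str = "*"):
    """Split a raw X12 message into segments and elements. Generic to any
    X12 transaction; in production the delimiters are declared in the ISA
    envelope, so real parsers read them from there rather than assuming."""
    segments = [s for s in message.split(seg_term) if s.strip()]
    return [seg.strip().split(elem_sep) for seg in segments]

# Simplified fragment: an ST header naming transaction set 275, a beginning
# segment, and an SE trailer. Element values here are placeholders.
raw = "ST*275*0001~BGN*02*ATTACH001*20161017~SE*3*0001~"
parsed = parse_x12(raw)
for seg in parsed:
    print(seg[0], seg[1:])
```

Production EDI handling adds envelope validation (ISA/GS), acknowledgments, and the binary attachment payload itself, but the segment/element grammar above is the common foundation.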

Both HIPAA and the Affordable Care Act are driving the adoption of these secure, electronic transaction standards. HIPAA requires the establishment of national standards for electronic healthcare transactions and national identifiers for providers, health insurance plans, and employers. In Section 1104(b)(2) of the ACA, Congress required the adoption of operating rules for the healthcare industry and directed the secretary of Health and Human Services to “adopt a single set of operating rules for each transaction” with the goal of creating as much uniformity in the implementation of the electronic standards as possible.

Providers and payers will be required to adopt these standards at some point and it will happen sooner rather than later, so it’s time to be prepared.

The final specifications and detail for the EDI 275 transaction were supposed to be finalized in January 2016, but that has yet to happen. Both the American Hospital Association and the American Medical Association have urged the Department of Health and Human Services to finalize and adopt the latest 275 standard, so with that kind of backing, it’s only a matter of time until the 275 transaction standard gains momentum and comes to fruition.

EDI 275 is coming. The question is, will you be ready?

Lindy Benton is president and CEO of Vyne of Dunwoody, GA.

Readers Write: Exploring the EMR Debate: Onus On Analytics Companies to Deliver Insights

October 17, 2016

By Leonard D’Avolio, PhD


Late last month, a great op-ed published in The Wall Street Journal called “Turn Off the Computer and Listen to the Patient” brought a critical healthcare issue to the forefront of the national discussion. The physician authors, Caleb Gardner, MD and John Levinson, MD, describe the frustrations physicians experience with poor design, federal incentives, and the “one-size-fits-all rules for medical practice” implemented in today’s electronic medical records (EMRs).

From the start, the counter to any criticism of the EMR was that the collection of digital health data will finally make it possible to discover opportunities to improve the quality of care, prevent error, and steer resources to where they are needed most. This is, after all, the story of nearly every other industry post-digitization.

However, many organizations are learning the hard way that the business intelligence tools that were so successful in helping other industries learn from their quantified and reliable sales, inventory, and finance data can be limited in trying to make sense of healthcare’s unstructured, sparse, and often inaccurate clinical data.

Data warehouses and reporting tools — the foundation for understanding quantified and reliable sales, inventory, and finance data of other industries — are useful for required reporting of process measures for CMS, ACO, AQC, and who knows what mandates are next. However, it should be made clear that these multi-year, multi-million dollar investments are designed to address the concerns of fee-for-service care: what happened, to whom, and when. They will not begin to answer the questions most critical to value-based care: what is likely to happen, to whom, and what should be done about it.

Rapidly advancing analytic approaches are well suited for healthcare data and designed to answer the questions of value-based care. Unfortunately, journalists and vendors alike have done a terrible job in communicating the value, potential, and nature of these approaches.

Hidden beneath a veneer of buzzwords including artificial intelligence, big data, cognitive computing, data science, data mining, and machine learning is a set of methods that have proven capable of answering the “what’s next” questions of value-based care across clinical domains including cardiothoracic surgery, urology, orthopedic surgery, plastic surgery, otolaryngology, general surgery, transplant, trauma, and neurosurgery, cancer prediction and prognosis, and intensive care unit morbidity. Despite 20+ years of empirical evidence demonstrating superior predictive performance, these approaches have remained the nearly exclusive property of academics.

The rhetoric surrounding these methods is bimodal and not particularly helpful. Either big data will cure cancer in just a few years, or clinicians proudly list the reasons they will not be replaced by virtual AI versions of themselves. Both are fun reads, but neither addresses the immediate opportunity to capitalize on painstakingly entered data to deliver care more efficiently today.

More productive is a framing of machine learning as what it actually is — an emerging tool. Like all tools, machine learning has inherent pros and cons that should be considered.

In the pro column is the ability of these methods to consider many more data points than traditional risk score or rules-based approaches. Also important for medicine is the fact that machine learning-based approaches don’t require that data be well formatted or standardized in order to learn from it. Combined with natural language processing, machine learning can consider the free text impressions of clinicians or case managers in predicting which patient is most likely to benefit from attention sooner. Like clinical care, these approaches learn with new experience, allowing insights to evolve based on the ever-changing dynamics of care delivery.

To illustrate, the organization I work with was recently enlisted to identify members of a health plan most likely to dis-enroll after one year of membership. This is a particularly sensitive loss for organizations that take on the financial responsibility of delivering care, as considerable investments are made in Year 1 stabilizing and maintaining the health of the member.

Using software designed to employ these methods, we consumed 30 file types, from case management notes, to claims, to call center transcripts. By comparing all of the data of members who dis-enrolled after one year versus those who stayed in the plan, we learned the patterns that most highly correlate with disenrollment. Our partner uses these insights to proactively call members before they dis-enroll. As their call center employs strategies to reduce specific causes of dissatisfaction, members’ reasons for wanting to leave change. So, too, do the patterns emerging from the software.
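A minimal sketch of that kind of workflow is shown below, using scikit-learn on synthetic data. The feature names and the signal generating the labels are invented for illustration; real inputs would be engineered from claims, case management notes, and call center transcripts, and the source describes proprietary software, not this code.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Hypothetical member features: call center contacts, claim denials,
# and PCP visits per year, drawn as counts.
X = rng.poisson(lam=[2.0, 1.0, 3.0], size=(n, 3)).astype(float)
# Synthetic ground truth: dissatisfaction signals drive disenrollment.
logits = 0.8 * X[:, 0] + 0.6 * X[:, 1] - 0.5 * X[:, 2] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Rank members by predicted disenrollment risk for proactive outreach.
risk = model.predict_proba(X_te)[:, 1]
top = np.argsort(risk)[::-1][:10]  # ten members to call first
```

Because the model is refit as new data arrives, the learned patterns shift along with members’ evolving reasons for leaving, which is the behavior the article describes.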

The result is greater member satisfaction, record low dis-enrollment rates, and a more proactive approach to addressing member concerns. It’s not the cure for cancer, but it is one of a growing number of questions that require addressing when the success of an organization is dependent on using resources efficiently.

The greatest limitation of machine learning to date has been inaccessibility. Like the mainframe before it, this new technology has remained the exclusive domain of experts. In most applications, each model is developed over the course of months using tools designed for data scientists. The results are delivered as recommendations, not HIPAA-compliant software ready to be plugged in when and where needed. Like the evolution of computing, all of that’s about to change.

Just hours after reading the Gardner and Levinson op-ed, I sat across from a primary care doc friend as she ended a long day of practice by charting out the last few patients. Her frustration was palpable as she fought her way through screen after screen of diabetes-related reporting requirements having “nothing to do with keeping [her] patients healthy.” Her thoughts on the benefits of using her organization’s industry-leading EMR were less measured than Drs. Gardner and Levinson: “I’d rather poke my eyes out.”

I agree fully with Drs. Gardner and Levinson. The answer isn’t abandoning electronic systems, but rather striking a balance between EMR usability and the valuable information that they provide. But I’ve been in healthcare long enough to know clinicians won’t be enjoying well-designed EMRs any time soon. In the meantime, it’s nice to know we don’t need to wait to begin generating returns from all their hard work.

Leonard D’Avolio, PhD is assistant professor at Harvard Medical School and CEO and co-founder of Cyft of Cambridge, MA.

Readers Write: ECM for Healthcare Advances to HCM (Healthcare Content Management)

October 17, 2016

by Amie Teske


Industry analysts project healthy market growth for enterprise content management (ECM) solutions across all industry sectors. Gartner’s 2016 Hype Cycle for Real-Time Health System Technologies places ECM squarely along the “plateau of productivity” at the far right-hand side of the hype cycle curve. This essentially means that ECM software has achieved its breakthrough in the market and is being actively adopted by healthcare providers.

This is good news for ECM users and technology suppliers, but what’s next for ECM in healthcare? To remain competitive and leading edge, ECM solutions at the plateau must continue to evolve for their customers and the marketplace. There is more good news here: ECM solutions are evolving to keep pace with healthcare’s changes and demands.

Up to 70 percent of the data needed for effective and comprehensive patient care management and decision-making exists in an unstructured format. This implies the existence of a large chasm between resources and effort expended by healthcare delivery organizations (HDOs) on EHR technology to manage discrete data and the work yet to be done to effectively automate and provide access to the remaining content. ECM solutions are evolving in a new direction that offers HDOs an opportunity to strategically build a bridge to this outstanding content.

Healthcare content management (HCM) is a new term that represents the evolution of ECM for healthcare providers. It is the modern, intelligent approach to managing all unstructured document and image content. The biggest obstacle we must overcome in this journey is the tendency to fall back on traditional thinking, which drives health IT purchases toward siloed, non-integrated systems. Traditional methods for managing patient content have a diminishing role in the future of healthcare. It’s time to set a new course.

An HCM Primer

  • HCM = documents + medical images (photos and video, too).
  • The 70 percent of patient content outside the EHR is primarily unstructured in nature, existing as objects that include not only DICOM (CT, MRI) but also tiff, pdf, mpg, etc.
  • ECM has proven effective for managing tiff, pdf and a variety of other file formats. It is not, however, a technology built to handle DICOM images, which represent the largest and most numerous of the disconnected patient objects in question.
  • Enterprise imaging (EI) technologies have traditionally been responsible for DICOM-based content. These include vendor neutral archives (VNA), enterprise/universal viewers, and worklist and connectivity solutions that are unique to medical image and video capture.
  • Leveraging a single architecture to intentionally integrate ECM and EI technologies — enabling HDOs to effectively capture, manage, access and share all of this content within a common ecosystem — is referred to as healthcare content management or HCM.
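At its simplest, the “common ecosystem” idea means routing each captured object to the right pipeline while indexing everything in one patient-centric catalog. The sketch below is a hypothetical simplification: real systems inspect content headers and DICOM metadata rather than relying on file extensions alone.

```python
# Illustrative routing of captured content objects into a unified index.
DICOM_LIKE = {".dcm"}                               # EI pipeline: VNA, viewers
DOCUMENT_LIKE = {".tif", ".tiff", ".pdf", ".mpg", ".jpg"}  # ECM pipeline

def route(filename: str) -> str:
    """Pick the managing pipeline for one object (extension-based toy rule)."""
    ext = filename[filename.rfind("."):].lower()
    if ext in DICOM_LIKE:
        return "enterprise-imaging"
    if ext in DOCUMENT_LIKE:
        return "ecm"
    return "review"  # unknown type: route to human triage

# Every object, DICOM or not, lands in the same catalog with its pipeline.
index = {f: route(f) for f in ["ct_head.dcm", "consent.pdf", "wound.jpg"]}
```

The design point is the shared index: both pipelines keep their specialized workflows, but access and search happen against one catalog.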

Although the market is ready for HCM and many HDOs are already moving in this direction, it is important to know what to look for.

Critical Elements of HCM

Although it is the logical first step, HCM encompasses much more than simply unifying ECM and EI technologies into a single architecture to enable shared storage and a single viewing experience for all unstructured content, DICOM and non-DICOM. Just as important is workflow: how all document and image content is orchestrated and handled prior to storage and access. This is essentially the secret sauce and the most difficult aspect of an HCM initiative.

ECM for healthcare workflow is geared to handle back office and clinical workflows associated with health information management, patient finance, accounts payable, and human resources, for example. The intricacies of these workflows must continue to cater to specific regulations around PHI, release of information, etc. All this to say that the workflow component of ECM is critical and must remain intact when converging ECM with EI technologies.

The same goes for enterprise imaging workflows. EI workflow is optimized to handle, for example, image orchestration from many modalities to the core VNA or various PACS systems, medical image tag mapping/morphing to ensure image neutrality, and downtime situations.

These workflow features should not be taken lightly as health systems endeavor to establish a true HCM strategy. Do not overlook the need for these capabilities to ease the complexities inherently involved and to fully capitalize on any investment made.

Guidance for HCM Planning

Consider the following recommendations as you plan an HCM approach and evaluate prospective vendors:

  • Be wary of an archive-only strategy. A clinical content management (CCM) approach is primarily an archive and access strategy; the critical element of workflow is fully or partly missing. Diligent buyers should ask the right questions about workflow and governance of unstructured document and image content before, during, and after storage and access.
  • Always require neutrality. Changing standards is a given in the healthcare industry. HCM should be in alignment with the new standards to ensure all document and image content can be captured, managed, accessed, shared, and migrated without additional cost due to proprietary antics by your vendor. An HCM framework must have a commitment to true neutrality and interoperability.
  • Think strategically. A deliberate HCM framework offered by any healthcare IT vendor should be modular in nature but also able to be executed incrementally and with the end in mind. Beginning with the end in mind is slightly more difficult. The modularity of your HCM approach should allow you to attack your biggest pain points first, solving niche challenges while preserving your budget and showing incremental success in your journey toward the end state.
  • Consider total cost of ownership (TCO). If a common architecture and its associated cost efficiencies are important in wrangling your outstanding 70 percent of disconnected patient content, you cannot afford to take a niche approach. It may seem easier and cheaper to select a group of products from multiple niche vendors to try to solve your most pervasive siloed document and image management problems. Take a careful look at the TCO over the life of these solutions. It is likely the TCO will be higher due to factors that include the number of unique skillsets and FTEs required for a niche strategy.
  • Demand solution flexibility and options. Your HCM approach should provide extensive flexibility and a range of options and alternatives that are adaptable to your unique needs. Software functionality is important, but not the only criterion.

Your HCM approach for strategically managing all unstructured patient content should allow you to:

  • Start small or go big, solving one challenge or many.
  • Establish a common architecture with a unified content platform and viewing strategy for all document and imaging content.
  • Enable unique ECM and EI workflows, not simply storage and access.
  • Hold one technology partner responsible – “one throat to choke” – for easier overall performance management and administration.

Providers of all shapes and sizes must take a thoughtful and deliberate approach when evaluating document and image management solutions. There is much more involved than simply capture and access. Because this category of technology can enable up to 70 percent of your disconnected patient and business information, you cannot afford to make a decision without carefully considering the impact of HCM on your healthcare enterprise, immediately and over time.

Amie Teske is director of global healthcare industry and product marketing for Lexmark Healthcare.

Readers Write: Guaranteeing MACRA Compliance at the Point of Care

October 5, 2016 Readers Write No Comments

Guaranteeing MACRA Compliance at the Point of Care
By David Lareau


MACRA will affect every physician and every clinical encounter. Current systems have been designed to produce transactions to be billed. MACRA will require that clinical conditions have been addressed and documented in accordance with quality care guidelines. The only way to ensure that happens is to do it at the point of care.

The challenge is that physicians need to address all conditions, not just those covered by a MACRA requirement. One approach is to just add another set of things to do, slowing doctors down and getting in their way. This is the transactional approach — just another task.

Most current systems have different tabs that list problems, medications, labs, etc. Users must switch back and forth looking for data. The data cannot be organized by problem since the systems lack any method for correlating information based on clinical condition. Adding another set of disconnected information to satisfy quality measures will only make it worse for users.

A better approach is to integrate quality care requirements for any condition with all the other issues the physician needs to address for a specific patient and to work it into a physician’s typical workflow. A well-designed EHR should have a process running in the background that keeps track of all applicable quality measures and guidelines for the patient being seen. The status of all quality measures must be available at any point in the encounter in a format that ties all information together for any clinical issue.

This requires actionable, problem-oriented views of clinical data, where all information for any clinical issue is available instantly. Physicians need to be able to view, react to, and document clinical information for every problem or issue addressed with the patient. This includes history and physical documentation, review of results, clinical assessments, and treatment plans as well as compliance with quality measures.
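To make the idea concrete, here is a minimal sketch of how a background quality-measure tracker might surface gaps per problem during an encounter. The measure definitions, condition names, and functions below are hypothetical illustrations, not Medicomp's or any vendor's actual implementation:

```python
# Hypothetical sketch: track which quality measures apply to a patient's
# active problems and whether each is already satisfied this encounter.
# Measure text and condition keys are illustrative only.

MEASURES = {
    "diabetes": ["HbA1c recorded in last 6 months", "Annual eye exam"],
    "heart_failure": ["Beta-blocker prescribed", "LVEF documented"],
}

def applicable_measures(active_problems):
    """Return every measure triggered by the patient's problem list."""
    return {p: MEASURES[p] for p in active_problems if p in MEASURES}

def encounter_status(active_problems, satisfied):
    """For each applicable measure, report whether it is already satisfied,
    so open gaps appear alongside the problem being addressed."""
    status = {}
    for problem, measures in applicable_measures(active_problems).items():
        status[problem] = [(m, m in satisfied) for m in measures]
    return status

# 'copd' has no measures defined here; diabetes shows one open gap.
gaps = encounter_status(["diabetes", "copd"], {"Annual eye exam"})
```

In a real system the measure catalog would come from CMS specifications and the satisfied set from the patient's chart; the point is that gaps surface per problem, inside the workflow, rather than as a separate task list.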

Guaranteeing MACRA compliance at the point of care can be accomplished by using a clinical knowledge engine that presents all relevant information for any clinical issue so that MACRA quality measures are seamlessly included as part of the patient’s overall clinical picture, not as just another task to be added on to the already burdensome workflows of current systems.

David Lareau is CEO of Medicomp Systems of Chantilly, VA.

Readers Write: Telemedicine Is Just Medicine

October 5, 2016 Readers Write 6 Comments

Telemedicine Is Just Medicine
By Teri Thomas


Telemedicine. mHealth. Remote healthcare. What’s the best term for a given use case? A large portion of my job is focused on it, yet my answer is, “I don’t much care what term you use.”

Well, I guess I care a little if I see confusion getting in the way of progress. Don’t get me wrong — I’m glad that nobody has been saying “mMedicine” yet (would that be “mmm…medicine” or “em-medicine”?). I don’t love “virtual health,” either: it makes me wonder whether, if I watched lots of exercise shows and raw food infomercials, I could get virtually healthy.

Defining telemedicine as a subset of telehealth (direct care at a distance, as opposed to the provision of healthcare-related services at a distance) may be correct, but who cares? Imagine if, when indoor plumbing was new, people had discussed “s-water” (out of a stream) vs. “i-water” (from in the home). I guess i-water would be better than p-water from pipes (it’s OK to giggle a little — be a middle-schooler for a minute). We care about perhaps three factors:

  • Is it modified/sparkling/flavored?
  • Do we have to pay for it (bottled water vs. tap water)?
  • Is it clean enough to drink?

Medicine is medicine. Healthcare is healthcare. It’s care: good, bad, and a ton in the middle. Yet I hear murmurs like, “Telemedicine isn’t good quality healthcare.” That’s like saying tap water isn’t good enough to drink because you’ve spent time in Flint.

Good quality care isn’t determined by the location of the provider or patient. Care can be done very well without requiring the patient and the clinician to be in the same room. It can also be done very poorly. Probably the majority of it — just like when the doctor and patient are together in a room — is not perfect, not bad, and mostly OK. 

Not every type of visit is appropriate over video, but many types are. In dermatology, providers have been using photos for decades. Camera costs and image resolution have improved so dramatically that even inexpensive systems can capture more image detail than a physician with the sharpest of vision can see. Stethoscopes, lights, cameras, video connections, telephones — all are tools to help us practice medicine better. Sometimes the tools work great and are helpful, and sometimes not.

If the Internet connection is slow or the battery dies, quality is impacted. But think for a minute about the impact on quality of care when a physician has had an extra-complex first appointment and is running an hour or more behind. The patients are stacking up and getting upset about their wait times. The clinic day is lengthening. The pressure to catch up mounts. Finally, consider the patient taking off work, driving to a clinic, parking, and sitting in a waiting room with Sally Pink Eye, feeling bored at best and anxious and angry at worst about the wait.

How high will the quality of that encounter be compared to one where the patient connects with the provider from home or work? The patient didn’t have to drive, and even if waiting, was likely in a more comfortable environment with other things to do.

Keep in mind that if the patient were physically there in the dermatology office and the lights went out or the dermatologist’s glasses were suddenly broken, it would be very hard to provide a quality exam. For a remote derm visit, if you can ensure reliable “tool” quality (history from the patient and/or GP, high enough resolution video/images, clear audio), why should there be a care quality concern? Yet these kinds of “visits” — heavily image-focused encounters — are still traditionally accomplished by asking the patient to come to the provider.

Thank you to Kaiser and other telemedicine leaders for providing us with the validating data: remote visits can be done with high quality, lower costs, and positive care and patient satisfaction outcomes. On behalf of patients who increasingly expect more convenient care: healthcare providers who are hesitant, please invest in video visit technology and seek opportunities to provide more convenient care for your patients. Payers, please recognize that this is in everyone’s best interest and start financially rewarding those providers.

Teri Thomas is director of innovation for an academic medical center.

Readers Write: What Hospitals Can Learn from the Insurance Industry About Privacy/Insider Threat Risk Mitigation

October 5, 2016 Readers Write No Comments

What Hospitals Can Learn from the Insurance Industry About Privacy/Insider Threat Risk Mitigation
By Robert B. Kuller


The drumbeat of hospital PHI breaches marches on. Every day there seems to be another news article about a hospital being hit with a ransomware attack. Hospital CEOs and boards are placing ever-increasing demands on their CIOs to pour technology and resources into preventing these perimeter attacks.

Who can blame them? They don’t want to have to appear before the media and explain why the attack wasn’t prevented given the current high threat environment, how many patients records were affected, and how they will deal with the aftermath of the breach.

Even though these perimeter attacks are no doubt high profile, there is a larger threat that receives far less attention from CEOs and their boards, and certainly not the same level of technology and resources — privacy and insider-borne threats. According to a recent study by Clearswift, 58 percent of all security incidents can be attributed to insider threats (employees, ex-employees, and trusted partners).

The primary causative factors were identified as inadvertent human error and lack of awareness or understanding. Only 26 percent of organizations are confident they can accurately determine the source of the incident. There are plenty more statistics to throw around, but suffice to say, insider threat is a major problem and represents a large part of hospital breaches even though they do not routinely get the same level of media coverage.

Let’s take a quick look at what the hospital landscape looks like in terms of dealing with insider threat today. Most privacy staffs are very small, usually about two people. They are charged with identifying potential breaches; investigating those potential breaches to determine which are actual breaches; interfacing with department heads; handling internal and regulatory reporting on actual breaches; putting together a breach reaction plan; assisting with staff education; and preventing future breaches. With a typical 400-bed hospital exceeding five million EHR transactions per day — all of which need to be reviewed — any reasonable person would conclude that is a very high set of expectations for such a small staff.

The vast majority of hospitals continue to use inferior, outdated technology because of the severe budget limitations applied to the privacy function, while tens of millions of dollars are spent on perimeter defenses. The capabilities of these systems are very limited; they basically dump tens of thousands of audit log entries into Excel spreadsheets that the privacy staff must review. Cutting-edge, behaviorally-based systems with advanced search engines, deep insight visualization, and proactive monitoring capabilities are available, but not regularly adopted.
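As a purely illustrative sketch (not any vendor's product), a behaviorally-based monitor might compare each user's record-access volume today against that user's own historical baseline, rather than asking staff to eyeball raw audit logs:

```python
# Illustrative behavioral baseline check over EHR audit-log counts.
# All data and names here are hypothetical.
from statistics import mean, pstdev

def flag_anomalies(history, today, threshold=3.0):
    """Flag users whose access count today exceeds their personal
    baseline by more than `threshold` standard deviations.
    history: {user: [daily access counts]}; today: {user: count}."""
    flagged = []
    for user, counts in history.items():
        mu, sigma = mean(counts), pstdev(counts)
        if sigma == 0:
            sigma = 1.0  # avoid divide-by-zero on a perfectly flat baseline
        if (today.get(user, 0) - mu) / sigma > threshold:
            flagged.append(user)
    return flagged

history = {"nurse_a": [40, 42, 38, 41], "clerk_b": [10, 12, 11, 9]}
# clerk_b's sudden spike stands out; nurse_a stays within baseline.
flagged = flag_anomalies(history, {"nurse_a": 44, "clerk_b": 300})
```

Real products layer on role, time-of-day, and patient-relationship context, but even this toy version shows why per-user behavioral baselines surface suspicious access that a spreadsheet of raw log entries would bury.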

Privacy/insider threat is primarily viewed as a compliance issue. Many hospital CEOs and boards justify giving low priority and resources to this area by looking at the potential fines that OCR will levy if their hospital’s PHI is breached. In fact, the fines are relatively low; breaches have to break the 500-record threshold (although OCR recently announced an effort to delve into breaches below this threshold); you have to be found guilty of not doing reasonable due diligence; and you are given multiple chances at correcting bad practices prior to fines being assessed. Combine this with an overreliance on cyber risk insurance and you have a potential for disaster.

The actual risk profile should start first and foremost with loss of hospital reputation. A hospital brand takes years and millions of dollars to build. One privacy breach can leave it in ruins. The second risk is patient loss and the associated costs of replacing those patients. A recent poll by Transunion showed that nearly seven in 10 respondents would avoid healthcare providers that had a privacy breach. The third major risk is lawsuits, legal costs, and settlements. Settlement costs are large and juries generally rule against institutions and for the damaged plaintiff. Fourth would be compliance.

There also seems to be a misunderstanding of cyber risk insurance. Like other insurance, it will not reward bad practices or flawed due diligence on behalf of the policyholder. Insurers will do a pre-audit to make sure that the risk they are undertaking is understood, that proper prevention technologies are in place, and that best practices are being documented and followed. Once a breach has been claimed, they will generally send out another team of investigators to determine if the items mentioned above were in place and best efforts were maintained during the breach. If they weren’t, this could lead to a denial or at least a prolonged negotiating process. Premium costs will also be reflective of level of preparedness and payouts generally do not cover anywhere near the full costs of the breach.

Prior to coming back to the hospital industry, I spent six years in the disability insurance industry, where top management and boards take both insider threat and the actual risk matrix of a PHI breach very seriously. I believe the hospital industry can learn a valuable lesson from the disability industry, which can be summarized as follows:

  1. Take the real risk matrix seriously.
  2. Put the proper amount of technological and human resources in place in alignment with the actual risk profile.
  3. Buy the best technology available, update it as frequently as possible, and get proactive rather than reactive.
  4. Educate and remind your staff constantly of proper behavior and the consequences of improper behavior (up to and including being terminated).
  5. Don’t overly rely on cyber risk insurance.
  6. Review the CISO’s reporting structure (avoid natural conflicts of interest with the CIO) and have them report to the board for an independent assessment of privacy/insider threat status on a regular basis.

As difficult and expensive as hospital data security is, it is both mandatory to protect patients and part of the price of admission to the market. Although we are in a constant battle to stay one step ahead of the bad guy, we often find ourselves one step behind. That, I’m afraid, is the nature of the beast.

Let’s place privacy/insider threat on an equal footing with the real risks associated with it. It simply makes sense to do so, from the patient, risk, financial, and fiduciary perspectives.

Robert B. Kuller is chief commercial officer for Haystack Informatics of Philadelphia, PA.

Readers Write: The Surgeon General’s Rallying Cry Against the Opioid Epidemic Must Also Be a Call to Arms for Healthcare IT

September 14, 2016 Readers Write 3 Comments

The Surgeon General’s Rallying Cry Against the Opioid Epidemic Must Also Be a Call to Arms for Healthcare IT
By Thomas Sullivan, MD


In a rare open letter to the nation’s doctors, US Surgeon General Vivek Murthy, MD, MBA sounded a rallying cry to engage their greater participation in the opioid-abuse crisis afflicting our country. Missing from the USSG’s commendable call to arms, though, was mention of the role technology plays in reducing drug diversion and doctor shopping and providing ready access to services to support patients.

Those of us in healthcare IT know that we are critical to this cause. The USSG is talking to our customers, and we know our customers aren’t adopting as quickly as they could the substance abuse-fighting technologies that are widely available to them. This includes a variety of technology solutions such as:

  • E-prescribing technology, particularly EPCS to support the electronic prescribing of controlled substances, which is key to helping providers more efficiently monitor which prescription medicines are being prescribed, or over-prescribed, across a practice.
  • Medication adherence monitoring technology that lets providers gauge in real time, at the point of encounter, a patient’s level of compliance with drug therapy and provide patients with evidence-based support and services for self-management.
  • Clinical decision support that helps doctors avoid adverse drug events and medication errors.
  • State-run prescription drug monitoring programs (PDMPs) designed to help law enforcement track the use of controlled substances and help prescribers identify doctor shoppers and others seeking illicit access to controlled substances.

Specific to the opioid abuse epidemic, the most important next step is for physicians to be able to check PDMPs within their normal workflow. Simply said, the integration and availability of PDMP data within e-medication management solutions — e-prescribing, medication history services, medication adherence tools and the like — will result in the greatest use of PDMP data and the best one-two tech-assisted punch we have in the opioid battle.

Over the past two years, policymakers have begun to take action using EPCS to address this crisis. This past March, New York State took a major step toward this goal when it began requiring e-prescriptions for all controlled substances as well as all non-controlled substances, frequently referred to as “legend drugs.” The law, known as I-STOP (the Internet System for Tracking Over-Prescribing Act) and originally passed in 2012, now makes New York’s experience a case study for other states that wish to modernize their prescribing infrastructure and address opioid abuse.

Maine, as well, will require opioid medications to be prescribed electronically via Drug Enforcement Administration-certified EPCS solutions beginning in July 2017. Several other states, including Massachusetts, Missouri, and Maryland, are also considering or working to pass mandatory EPCS requirements for prescribers.

Unfortunately, neither New York nor Maine PDMP data is currently accessible to health IT vendors for integration into the prescribing workflow of providers.

E-prescribing – the direct digital transfer of patient prescriptions from provider to pharmacy – is broadly recognized as an important tool in promoting patient safety, convenience, and overall efficiency for all stakeholders in the prescription process. E-prescribing is well understood to help prescribers and patients guard against medication errors such as drug-to-drug interactions, reduce common errors inherent in paper-based prescribing (including illegible handwriting, misinterpreted abbreviations, and unclear dosages), and provide critical decision support tools.

Nationwide, more than 70 percent of doctors transmit most prescriptions electronically, yet the vast majority of those prescriptions are for legend drugs only; fewer than 10 percent of prescribers use EPCS solutions to e-prescribe controlled substances. In New York, however, the I-STOP legislation has driven EPCS adoption to an estimated 70 percent or more. All indications are that the laws passed in New York and Maine mandating use of EPCS and PDMPs will prove helpful in curbing opioid abuse, fraud, and diversion, and will help prevent possible addiction down the line.

However, full adoption of PDMPs will likely never be achieved until the PDMP information is accessible in the doctor’s technology workflow. Ultimately, the opioid-abuse battle needs to be fought through states enabling their respective PDMP data to flow through doctors’ own workflows, as opposed to requiring that physicians and clinicians go outside their familiar software tool and interact with a separate portal in order to access their respective state PDMP databases.

In the case of New York State, the Medical Society of the State of New York conducted a survey that found a large percentage of prescribers believed that forcing mandatory compliance was placing an undue burden on their practices. No doubt, physicians feel overburdened with IT mandates. Improving integration between PDMPs and electronic health records will alleviate some of these burdens and allow for better compliance.

States must work more closely with the healthcare community to remove obstacles that will allow as close to 100 percent compliance as possible. Every state has the opportunity to learn from New York to smooth implementation and drive adoption to make a meaningful impact on the growing opioid abuse epidemic. Leadership in healthcare IT companies must be more vocal about our role and responsibilities in enabling doctors on the ground.

With the US Surgeon General weighing in, those of us in the healthcare IT community must rise up to make our voices heard. The importance of integrating e-medication management tools and EPCS solutions with PDMP data cannot be overstated. It is the best path toward helping our customers — the doctors — make the right decision, at the right time, with the right data, on the right platforms.

Thomas Sullivan, MD is chief strategy and privacy officer of DrFirst of Rockville, MD.

Readers Write: The Electronic Health Record and The Golden Spike

September 14, 2016 Readers Write 1 Comment

The Electronic Health Record and The Golden Spike
By Frank D. Byrne, MD


On May 10, 1869, at a ceremony in Utah, Leland Stanford drove the final spike to join the first transcontinental railroad across the US. Considered one of the great technological feats of the 19th century, the railroad would become a revolutionary transportation network that changed the young country.


For the past few years, the healthcare industry and the patients in its care have experienced a similar “Golden Spike Era” through the deployment of the electronic health record (EHR). Others have used this analogy, including author Robert Wachter, MD, in an excellent presentation at the American College of Healthcare Executives 2016 Congress on Healthcare Leadership.

Why is this comparison relevant? While the Utah ceremony marked the completion of a transcontinental railroad, it did not actually create a seamless coast-to-coast rail network. Key gaps remained; a true coast-to-coast link was not achieved until more than a year later, and it required ongoing improvements.

Similarly, while a recent study indicated that 96 percent of hospitals possessed certified EHR technology and 84 percent had adopted at least a basic EHR system in 2015, much more is still needed to optimize EHR deployment so that healthcare becomes better, safer, and more efficient and the health of our communities improves.

Nonetheless, the EHR is one of the major advances in healthcare in my professional lifetime. It is an essential tool in progress toward the Institute for Healthcare Improvement’s “Triple Aim for Healthcare” – better patient experience, lower per-capita cost, and improved population health. We cannot achieve those laudable goals without mining and analyzing the data embedded in the EHR to generate useful information to guide our actions. Advances in data science are enabling the development of meaningful predictive analytics, clinical decision support, and other tools that will advance quality, safety, and efficiency.

But there is much work to do. Christine Sinsky, MD, vice president of professional satisfaction for the American Medical Association, and others have written with concern about dissatisfied physicians, nurses, and other clinicians who feel the EHR is distracting them from patient care and meaningful interactions with their patients.

“Contemporary medical records are used for purposes that extend beyond supporting patient and caregiver … the primary purpose, i.e. the support of cognition and thoughtful, concise communication, has been crowded out,” Sinsky and co-author Stephen Martin, MD note in a recent article.

Perhaps you’ve also seen the sobering drawing by a seven-year-old girl depicting a doctor focused on the computer screen with his back to her, his patient.


Some of the EHR’s shortcomings may be the result of a lack of end-user input prior to implementation, possibly due to the implementing organization not incorporating the extensive research gathered by the EHR providers. Further, even if one gets end-user input prior to implementation, there are always challenges prior to go-live, and it seems to me that optimization after implementation has been under-resourced. And let’s not look at temporary “fixes” as the best and final answer. I was dismayed recently to see “hiring medical scribes” listed as one of the top 10 best practices in a recent Modern Healthcare poll.

Don’t get me wrong, to have a long game, you must have a successful plan to get through today, and if hiring scribes can mitigate physician dissatisfaction until the systems are improved, so be it. But scribes are a temporary work-around, not a system solution.

As an advisor to an early-stage venture capital fund, I’ve enjoyed listening to many interesting and inspiring pitches for new technology solutions. Initially, my algorithm used to rate these ideas was:

  • Is it a novel idea?
  • Will enough people or organizations pay for it?
  • Do they have the right customer?
  • Do they have the right revenue model?

Thanks to the input of physicians, nurses, therapists, and other clinicians, and the work of Dr. Sinsky and others, I quickly added a fifth, very important vital sign: Will it make the lives of those providing care better? Similarly, author, speaker and investor Dave Chase added a fourth element to the Triple Aim, caregiver experience, making it the Quadruple Aim.

When I was in training, we carried the “Washington Manual” and “Sanford’s Antimicrobial Guide” in the pockets of our white coats as references and thought we had most of the resources we needed to provide exceptional care. Now, caregivers suffer from information overload of both clinical data and academic knowledge. Some query Google right in front of their patients to find answers.

In healthcare today, we work within a community of diverse skills and backgrounds, including clinicians, non-clinicians, computer scientists, EHR providers, administrators, and others. To achieve our goal of improving health and healthcare for individuals and communities, we must work together to organize, structure, mine, and present the massive amounts of data accumulated in the EHR. To me, the concept of population health is meaningless unless you are improving health and outcomes for my family, my friends and me. Just as the placement of “The Golden Spike” was only the beginning of railroad transportation becoming a transformational force in American life, the fact that 96 percent of U.S. hospitals possess a certified EHR is just the beginning.

I have been accused of being a relentless optimist, but I firmly believe we can use the EHR to improve the caregiver and patient experience (I believe patients will and should have access to their entire medical record, for example), and fulfill the other necessary functions that Sinsky and Martin describe as distractions from the medical records’ primary purpose: “quality evaluations, practitioner monitoring, billing justification, audit defense, disability determinations, health insurance risk assessments, legal actions, and research.”

Lastly, there is one more similarity to “The Golden Spike.” In 1904 a new railroad route was built bypassing the Utah track segment that included that historic spot. It shortened the distance traveled by 43 miles and avoided curves and grades, rendering the segment obsolete. Already, many EHR tools, applications and companies have come and gone. Many of the tools we use now remain rudimentary compared with what we really need. We must use what we have to learn and continuously improve, and frankly, we need to pick up the pace. The patients, families and communities depending on us deserve no less.

Frank D. Byrne, MD is the former president of St. Mary’s Hospital and Parkview Hospital and a senior executive advisor to HealthX Ventures.

Readers Write: Moving Beyond the App: How to Improve Healthcare Through Technology Partnerships

August 24, 2016 Readers Write 1 Comment

Moving Beyond the App: How to Improve Healthcare Through Technology Partnerships
By Ralph C. Derrickson


As the pace of change in the US healthcare system increases, we are seeing inspiring progress in access and care delivery driven in part by the adoption of telemedicine and other technology-enabled care models. Health systems are embracing virtual medicine as a way to serve their patients and communities by meeting their budget and lifestyle needs. Health systems are trying to match the consumer experience of other Internet services by delivering new care models that give patients better care, save them time, are easier on their wallets, and keep them within the health systems they already know and trust.

While the prospects for technology are enormous, there are downsides that have to be avoided.

Healthcare isn’t an app. We all use apps to conduct business, purchase products, and get our entertainment fix from our favorite mobile games and streaming media services. The idea that we could put an app in a patient’s hand to diagnose or treat them is very appealing. When that app is offered as part of a comprehensive set of integrated treatment options, there are reasons to be very hopeful. But when it’s offered outside a local health system, it leads to fragmentation, excessive prescribing, and even worse, inappropriate treatment.

Simply aggregating providers using the Internet is bad medicine. App developers and their networks of doctors – who are paid on a per-visit basis – have used technology to bring out the worst of fee-for-service care. The data on telemedicine prescribing rates, visit durations, and management rates is in and it isn’t pretty. If the expectation is that the patient’s needs will be met with a telemedicine visit, it becomes a failure when the patient doesn’t get treatment or a prescription.

There’s no doubt the provider is doing their best to serve the patient, but without a place to send the patient for in-person care, they’re stuck trying their best to meet the patient’s needs. In fact, a study in JAMA Internal Medicine shows that the quality of urgent care treatment varies widely among commercial, direct-to-consumer virtual care companies. Their transactional models for medicine also offer no integrated next step for the patient and no connection to a broader spectrum of care.

Health systems need an approach that runs counter to telemedicine/app developer trends. An integrated virtual clinic enables health systems to extend the service offering in clinically appropriate situations and build on the trust they have earned from patients in years of service to their community. Payment models can come and go, but the patient’s reliance on a doctor in a time of need should never be compromised by the method of access or their payment system.

Health care is challenging. I’ve referred to it as the Three Hopes: I hope it’s not serious, I hope I can see my doctor, and I hope it’s paid for. Countless studies have shown that the proven, most cost-effective health care model is to have access to primary care doctors and great doctor-patient relationships, two qualities that are part and parcel of a strong health system. However, most app-centered telemedicine companies have no connection to a patient’s primary care provider, leading to care fragmentation instead of care continuity.

Through all of this, the greatest institutions of clinical excellence – our health systems – are losing the arms race for patients, especially as the healthcare market continues to consolidate and health systems face fierce competition from their peers to attract and retain patients. Health systems simply don’t have the marketing engines of app-centered telemedicine providers and pharmacies who are fighting tooth and nail for patient acquisition.

Many health systems have yet to figure out how to adapt to a consumer-directed model while continuing to provide quality care. The same patients who want convenience first and foremost are often unable to accurately judge the quality of care received through most telemedicine methods. For health systems and patients to succeed, virtual care must be part of a broader care continuum and tightly integrated within health systems.

Keeping patients within the systems they already know and trust provides invaluable convenience and creates opportunities to refer patients to appropriate care when an ailment cannot be treated virtually. Those referrals offer a chance to reconnect patients to their health systems rather than have them turn to a high-cost option like an emergency department or a quick fix like an app-centered retail clinic.

This is especially important as the industry shifts to fee-for-value reimbursement.

An approach that integrates virtual care within health systems ensures patients get the same quality of care that they would receive from an in-person visit. Patients have a better chance of understanding their own health, as trusted physicians give patients the information they need to become educated healthcare consumers. For health systems, integrated virtual care puts them in the driver’s seat on how care is delivered and managed, whereas an app-centered approach may meet neither quality metrics nor the needs of the patients they already serve.

App-centered telemedicine has no place in our health care system. This approach to addressing the changes in healthcare is robbing patients of the type of care they deserve.

There is no reason for the Three Hopes of healthcare to be points of uncertainty or stress for patients. I see great promise among leading hospitals and health systems who are alleviating this uncertainty with integrated virtual care. They realize they know how best to treat a patient – apps do not. Virtual care that’s integrated into a provider network is best equipped to put quality at the center of care now and in the future.

Ralph C. Derrickson is president and CEO of Carena, Inc.

Readers Write: Moving and Sharing Clinical Information Across Boundaries

August 24, 2016 Readers Write 3 Comments

Moving and Sharing Clinical Information Across Boundaries
By Sandra Lillie


In Gartner’s recent depiction of the Hype Cycle for Healthcare Technology, Integrating the Healthcare Enterprise (IHE) XDS has now progressed well past early adopters and rapidly toward productivity and optimization. In many regions outside the United States, it is the de facto standard for content management, and within the US, it is receiving increasing consideration for adoption in use cases supporting specialty images, standards-based image sharing and the like.

XDS is a suitable foundation for integration of clinical systems and, as noted earlier, is more widely adopted in EMEA for this purpose. It is capable of moving and sharing clinical information within and between organizations and of creating a patient-centric record based on multiple document types.

XDS centralizes registration of documents, reducing the problem of deciding which system holds “the truth.” Focusing on “standardizing the standards,” XDS supports the moving and sharing of clinical information across boundaries, both within and between enterprises. This is increasingly vital in delivering patient-centered care across the care continuum.

Today we also have XDS-I, the XDS profile for imaging. It is built upon the XDS.b profile with one key difference – the actual DICOM imaging study stays put in its original location until requested for presentation. This is accomplished by registering the location of the imaging study in the XDS registry while using a vendor-neutral archive (VNA) that is smart enough to serve as its own XDS-I repository.

DICOM is a standard format for the storage and communication of medical images, such as x-rays. Instead of publishing the document (which would be large in imaging) to the repository, however, the imaging document source (the VNA in this case) publishes a “manifest.” This manifest contains an index of all the images within a study, coupled with a path to the VNA where they can be retrieved. This reduces the amount of data that has to move around, allowing for more efficient image sharing while minimizing the complexity and costs of image storage.
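To make the manifest idea concrete, here is a minimal Python sketch of the kind of index an imaging document source might publish. The field names and URL layout are illustrative assumptions, not the actual DICOM Key Object Selection schema that real XDS-I manifests use:

```python
# Illustrative sketch only: real XDS-I manifests are DICOM Key Object
# Selection (KOS) documents; these field names are simplified stand-ins.

def build_manifest(study_uid, vna_base_url, image_uids):
    """Index a study's images with retrieval paths instead of pixel data."""
    return {
        "study_uid": study_uid,
        "images": [
            {"image_uid": uid,
             "retrieve_url": f"{vna_base_url}/studies/{study_uid}/instances/{uid}"}
            for uid in image_uids
        ],
    }

# The registry stores only this small index; the DICOM study stays on the VNA.
manifest = build_manifest("1.2.840.113619.2.55.1", "https://vna.example.org",
                          ["1.2.3.1", "1.2.3.2"])
```

The manifest carries pointers rather than pixel data, which is why it stays small no matter how large the underlying study is.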

What are the implications to healthcare organizations of using XDS?

  • Documents retain their native format, allowing ready viewing by applications.
  • Standards support interoperability and sharing of both documents and enterprise image studies.
  • IHE conducts annual Connectathons in the United States and Europe to validate interoperability and enable widespread ability for vendors to act as sources and suppliers of content.

Major benefits include:

  • XDS enables movement and sharing of clinical information across boundaries, both within and between enterprises. This capability is increasingly important in delivering patient-centered care across the continuum, supporting the organization of documents across time in a patient context, allowing clinicians to realize a more complete picture of the patient.
  • XDS offers a lower-cost method for implementing care coordination through a solution that can easily respond to queries for patient-centered documents and enterprise images.
  • Use of standards simplifies healthcare IT integrations, requiring less administrative overhead.

Now is the time for US healthcare providers to seriously consider the advantages of XDS. XDS profiles provide an effective alternative for managing clinical content exported from legacy (sunsetted) systems and for supporting healthcare information sharing.

Sandra Lillie is industry manager for enterprise imaging for Lexmark Healthcare.

Readers Write: Why Reverse Mentoring is Beneficial for HIT Employees

August 15, 2016 Readers Write 2 Comments

Why Reverse Mentoring is Beneficial for HIT Employees
By Frank Myeroff


Reverse mentoring pairs seasoned HIT professionals with younger Millennial employees, who tend to be extremely tech savvy, quick to adopt new technology, and unafraid of trying new things. It also helps to bridge the gap between generations.

Reverse mentoring was introduced in the 1990s by Jack Welch, chairman and CEO of General Electric at that time. While it’s not exactly new, it’s gaining popularity fast. More and more organizations are recognizing the value of reverse mentoring and are developing formalized programs to ensure best practices in order to yield success. They believe that Millennials are well suited as mentors to help maximize HIT use and adoption in order to move organizations forward in this digital age.

Additionally, with the ever-changing landscape of technology and tools used in the HIT field, reverse mentorship can be extremely beneficial:

  • Young, fresh talent has a chance to share their skills, knowledge, and fresh perspectives with more senior employees. Hospitals and health systems often look for their HIT professionals to use technology to improve patient care, lower costs, and increase efficiency. This means that the latest technology is routinely sought. Organizations know that tech savvy younger generations will catch on to this quickly, presenting an opportunity for them to share their knowledge with a different generation. Beyond HIT systems, technologies and platforms such as social media offer further topics on which Millennials can share information and ideas.
  • Creates a way for separate generations to build working relationships with one another. Reverse mentorship can help junior HIT employees feel more needed, confident and comfortable communicating with higher-up employees working together on projects or even in meetings. Additionally, this could create more cohesion in the workplace and begin to break down perceived barriers and stereotypes of each generation.
  • Gives junior employees a higher sense of purpose in the organization. Implementing a reverse mentorship program gives young HIT professionals a sense of empowerment and the idea that they are making an impactful contribution to the company. This, in turn, could help increase retention and help to shape future leaders in the organization.
  • Continues to provide ways for senior employees to share their knowledge as well. Although called reverse mentorship, this type of program offers a two-way street for employees of all ages to learn from one another. Experienced professionals in the HIT field are able to share their insights and knowledge, in addition to learning new things.

While reverse mentorship can be extremely beneficial in the HIT industry and especially any industry with a tech focus, there are several conditions this type of relationship depends upon:

  • Trust. Each person needs to trust the other and put effort into bettering both careers.
  • Open mindedness. In a reverse mentorship, both employees will act as a mentor and a mentee and need to show a willingness to teach, but also a willingness to learn.
  • Expectations and rules. It will be important for both parties in the mentorship to communicate what they are looking to get from the relationship and to stay committed to the process.

Reverse mentorship is an innovative way to bring together generations of employees to share knowledge. In addition, today’s Millennial mentors will be tomorrow’s chief healthcare officers. We will depend on them to lead IT departments, create strategies for handling healthcare’s growing volume of digital data, and find new ways to support technologically advanced patient care.

Frank Myeroff is president of Direct Consulting Associates of Cleveland, OH.

Readers Write: ACO, Heal Thyself

July 18, 2016 Readers Write 3 Comments

ACO, Heal Thyself
By Stuart Hochron, MD, JD


I was recently asked to comment on the success (or lack thereof) of Accountable Care Organizations (ACOs): why they haven’t lived up to expectations, and what additional incentives will be required for them to be successful – if, indeed, they ever will be.

The questions gave me pause. Certainly ACO performance to date has left much room for improvement. According to an analysis published by the Healthcare Financial Management Association, just over a quarter of ACOs were able to generate savings in an amount sufficient to make them eligible to receive a share of those savings.

But the implication that ACOs are biding their time until new incentives or perhaps a new business model emerges is alarming. This is not a situation where good things will necessarily come to those who wait.

I work with a number of ACOs, hospitals, and physician organizations. While I am not at liberty to share their financial performance data, I’ve distilled what I believe to be the best practices employed by those that will be successful.

It takes a platform

Fundamentally, ACOs require wide-scale patient-centric collaboration – that’s what underpins the hopes of achieving more-efficient, more-effective, less-wasteful, non-redundant care. But collaboration doesn’t just happen automatically, even when everyone on the team works in the same building. And for ACOs, composed of multiple entities that don’t necessarily have any prior joint operating experience or relationship of any kind, the challenge is greater still.

Based on extensive discussions with healthcare executives and real-world performance analysis, it is clear that successful ACOs must make an investment in robust groupware tools, the kind that professional services organizations have had in place for decades to ensure that members of a distributed workforce can collaborate and coordinate as easily as if they were in next-door offices.

In the healthcare context, these tools will facilitate everything from patient scheduling to real-time sharing of PHI to charge capture and invoicing. Far beyond secure messaging, such platforms underpin the ACO’s activities, giving providers a common workspace for all manner of collaboration and ensuring that all providers across the care continuum are aware of and working towards a single set of organizational imperatives. The ACOs that don’t invest in the transformation – that try to piggyback on existing infrastructure – will ultimately find that their people don’t make the transformation either.

Patients at the center

All healthcare systems need to become more patient-centric and this is particularly true of ACOs, whose compensation, of course, is based on how successfully they treat (and, ideally, reduce the need to treat) patients. Thus, successful ACOs will make patient-centric collaboration and communication the centerpiece of an organization-wide operating system. 

Ideally, collaboration and communication won’t stop there. ACOs will implement population health initiatives by empowering patients, giving them the ability to take a more active role in keeping themselves healthy. This will be accomplished via tools such as mobile apps that enable people to access care services before they get sick and enable ACOs to reach out to the community, helping guide patients towards good lifestyle choices and, if they have received acute treatment, helping patients follow post-discharge instructions. That same collaboration platform that helps care professionals work together better will need to extend seamlessly into the community as well.

Without aligned physicians, there’s no accountability

Technically, any organization that agrees to be “accountable for the quality, cost, and overall care of Medicare beneficiaries” can qualify under the definition of an ACO. But what all successful ACOs will have in common is tight alignment of physicians and care teams. I don’t simply mean financial alignment. Theoretically, all the physicians in an ACO are financially aligned. Nor do I just mean alignment around a patient.

True alignment means the physicians who form the core of the ACO understand the goals and priorities of the organization and feel invested in its success. Physicians make dozens of care decisions every day. They need to be making those decisions against the backdrop of the stated policies of the ACO. That requires being literally as well as figuratively connected to the organization, receiving regular communications such as educational materials, opinion, and thought leadership, being part of the daily give and take.

The financial incentives and disincentives under which ACOs operate change regularly, meaning the ACO’s organizational goals are updated all the time. The challenge is for providers to understand those incentives fully and to be able to adjust their practice methodologies and for that to happen on an organization-wide basis. Achieving and maintaining alignment requires an institution-wide collaboration platform. In a distributed entity such as an ACO, there’s no physician’s lounge. But with modern groupware, we can simulate one in a virtual environment and realize the same benefits.

Networks don’t build themselves

In my work with ACOs, one hurdle encountered by all is introducing and socializing the idea that the ACO constitutes a new network of providers to which cases should be referred. Intellectually it isn’t that hard to grasp. Changing ingrained referral habits is much more of a challenge – not least because providers have no way of knowing which other providers are also members of the ACO, nor how effectively any of those providers might contribute to the stated financial goals (savings as well as revenues) of the ACO.

The only way to keep referrals within the organization – to combat the challenge of referral leakage, which will sink an otherwise effective ACO – is to ensure that every physician in the ACO is connected to a physician referral directory that lists all providers by specialty. For good measure, it will include a rating quantifying each provider’s service.
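As a sketch of the idea, an in-network referral directory can be as simple as a specialty-keyed lookup with ratings. The data, function name, and rating scale below are hypothetical:

```python
# Hypothetical in-network referral directory (data made up for illustration).
# Maps specialty -> list of (provider, rating) pairs for ACO members only.
directory = {
    "cardiology": [("Dr. Ames", 4.8), ("Dr. Brook", 4.2)],
    "nephrology": [("Dr. Cole", 4.6)],
}

def best_in_network(specialty):
    """Return the highest-rated in-ACO provider for a specialty, if any."""
    providers = directory.get(specialty, [])
    return max(providers, key=lambda p: p[1], default=None)
```

A lookup that returns `None` for a specialty the ACO lacks is itself useful signal: it flags exactly where referrals are forced to leak out of network.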

Improving clinical documentation

In the minutely quantified world of ACO financial performance, every dollar counts. The ACO’s income is based, in part, on costs saved, along with other metrics. As is well known, incomplete clinical documentation leads to tens of billions of dollars in disallowed reimbursements every year, a situation that only grows worse in a distributed organization such as an ACO. 

While we are imagining the infrastructure of the successful ACO of the future, let’s not neglect to include capabilities for crisply identifying and documenting treatments and procedures and thus enabling the medical billing professionals – who may have no physical or organizational connection to the care delivery professionals – to complete the paperwork correctly and maximize reimbursement revenue.

Conceptually, ACOs are the heart of the Affordable Care Act. Accountability – enforced by incentives and penalties – is central to our concept of how healthcare ought to work. If ACOs aren’t delivering on their promise, then that has ominous implications for the healthcare system overall. With the right communications infrastructure used as directed, ACOs can lead the way to the bright healthcare future we all want. Rather than stand on the sidelines as spectators, waiting for new incentives to come down from on high, ACOs can and must take action now.

Stuart Hochron, MD, JD is the chief medical officer of Uniphy Health of Minneapolis, MN.

Readers Write: Why EHRs Will Have Different Documentation Requirements for Biosimilar Dispensing, Administration, and Outcomes

July 11, 2016 Readers Write No Comments

Why EHRs Will Have Different Documentation Requirements for Biosimilar Dispensing, Administration, and Outcomes
By Tony Schueth


While the recent approval of a second biosimilar in the United States does not a tsunami make, biosimilars are nonetheless expected to quickly become mainstream. In response, stakeholders are beginning to work out how to make them safe and useful within the parameters of today’s healthcare system because biosimilars – like biologics – are made from living organisms, which makes them very different from today’s conventional drugs.

In fact, biosimilars are separated into two categories: biosimilars and interchangeables, both of which are treated differently from a regulatory standpoint. These differences will create challenges and opportunities in how they are integrated in electronic health records (EHRs) and user workflows as well as how patient safety may be improved.

EHRs must treat biosimilars differently than generics. As a result, EHR system vendors will need to make significant changes to accommodate the unique aspects of biosimilar dispensing, administration and outcomes.

Patient safety is a priority for development and use of all medicines. Manufacturers must provide safety assessments and risk management plans as part of the drug approval process by the Food and Drug Administration (FDA). Even so, biologics and biosimilars are associated with additional safety considerations because they are complex medicines made from living organisms. Even small changes during manufacturing can create unforeseen changes in the biological composition of the resulting drug. These, in turn, have implications for treatment, safety, and outcomes. To address these issues, information about what was prescribed and administered, along with outcomes, must be documented in the patient’s medical record.

Substitution also is an issue because dispensed drugs may be very different from what was prescribed. As a result, it is important for physicians to know whether a substitution has been made and to capture information about the drug that was administered in the patient’s medical record, especially when it comes to biologics and biosimilars. This is important for treatment and follow-up care, as well as in cases where an adverse event (AE) or patient outcome occurs later on.

Four drivers make the unique documentation requirements of biosimilars in EHRs a priority.

  1. Utilization is expected to grow rapidly because biosimilars offer lower-cost treatment for such chronic diseases as cancer and rheumatoid arthritis. It is easy to envision four biosimilars each for 20 reference products being available in 2020, given projected market expansions. That amounts to 100 biologic products (80 biosimilars plus their 20 reference biologics) that will need to be addressed separately. As more biosimilars are approved and enter the market, it will become increasingly challenging and important to accurately identify and distinguish the source of adverse events (AEs) among a biosimilar, its reference biologic, and other biosimilars.
  2. Physicians will need this information once biosimilars come on line and their use becomes widespread. Adverse complications — particularly immunologic reactions caused by formation of anti-drug antibodies — may occur long after the drug was administered. Physicians report more than a third of adverse events to the FDA, but they need to know what was administered to the patient when the pharmacist performs a biosimilar substitution.
  3. Outcomes tracking and patient safety are growing priorities in healthcare. They are key pieces of the move toward value-based reimbursement and are a focus of public and private payers. Identifying, tracking, and reporting adverse events are expected to become key metrics for assessing care quality and pay-for-performance incentives.
  4. States are ahead of the curve when it comes to substitution. More than 30 are considering or have enacted substitution legislation for biosimilars, which creates urgency in how such information is captured and documented in EHRs. Some states require the pharmacy to communicate dispensing data to the prescriber’s EHR.

Because of the unique properties of biosimilar dispensing, administration and outcomes, many adjustments will be needed for documentation into EHRs used by physician offices in independent practices and integrated delivery systems (IDS). For example:

  • EHRs must be able to comprehensively record data on what was administered or dispensed for an individual patient, as well as what was prescribed. Modifications will be needed for tracking adverse event reports in various administration locations, including the physician’s office; an affiliated entity (e.g., practice infusion center); the patient’s home; or non-network providers.
  • Changes in drug data compendia will be needed to account for new naming conventions that soon will be put in place by the FDA and substitution equivalency.
  • Tracking the manufacturer and lot or batch numbers (similar to vaccine administration) can facilitate more accurate tracing of an AE back to the biologic. Fields will need to be added to record the NDC code, manufacturer, and lot number of biosimilars that have been dispensed.
  • NCPDP SCRIPT’s Medication History and RxFill transactions — already available for electronic prescribing in EHRs — can include the NDC and the recently added manufacturer and lot number as part of the notification to the prescriber. Although not widely used today, RxFill provides a compelling method to notify providers that a substitution occurred in the pharmacy.
  • EHRs will need to address barriers related to the use of biosimilars, such as creation of too many alerts; the usability of how the information is presented to the clinician; lack of consistency in the display of drugs and drug names; and conformance of screen features and workflow within and between systems.
  • IDS systems need to be interoperable and have a seamless transfer of information. This can be a challenge in trying to meld together multiple disparate health information technology systems and EHRs from different vendors.
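To illustrate the kind of documentation the points above call for, here is a minimal Python sketch of a dispensing record that captures what was dispensed (including manufacturer and lot number) alongside what was prescribed. The field names are assumptions for illustration, not an actual EHR or NCPDP SCRIPT schema:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative only: field names are assumptions, not a real EHR schema.
# The point is recording what was *dispensed* alongside what was *prescribed*.
@dataclass
class BiosimilarDispenseRecord:
    prescribed_drug: str   # what the physician ordered
    dispensed_drug: str    # what the pharmacy actually dispensed
    ndc_code: str          # NDC of the dispensed product
    manufacturer: str
    lot_number: str        # enables tracing an AE back to a batch
    dispense_date: date

    @property
    def substitution_occurred(self) -> bool:
        """True when the dispensed product differs from the prescription."""
        return self.prescribed_drug != self.dispensed_drug

rec = BiosimilarDispenseRecord(
    prescribed_drug="ReferenceBiologic-X",
    dispensed_drug="Biosimilar-X1",
    ndc_code="00000-0000-00",
    manufacturer="ExampleBio",
    lot_number="LOT-2016-042",
    dispense_date=date(2016, 7, 11),
)
```

Deriving the substitution flag from the prescribed/dispensed pair, rather than storing it separately, keeps the record internally consistent when either field is corrected later.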

The time is right for industry, hardware and software developers, and other stakeholders to address the opportunities and challenges posed by the entrance of biologics and biosimilars into the US market. As patient safety issues arise, the EHR community must be in a position to capture and exchange needed information. Otherwise, states and other regulators could develop alternative tracking methods. Examples include state vaccine registries or prescription drug monitoring programs, which track controlled substances dispensing and vary from state to state. These programs have become complicated mechanisms for healthcare providers to address.

Tony Schueth is CEO and managing partner of Point-of-Care Partners of Coral Springs, FL.

Readers Write: Election 2016

July 6, 2016 Readers Write 5 Comments

Election 2016
By Donald Trigg


A provocative Atlantic magazine cover this month headlines, “How American Politics Went Insane.” Jonathan Rauch explores our current reality where “chaos has become the new normal — both in campaigns and government itself.”

As we struggle to draw rational signal from the noise, one can’t help but wonder if Trumpian chaos is resident in our favorite podcasts, journals, and websites. Are byzantine rule-makings not regularly bemoaned on HIStalk?  Do we not hear classes of readers singled out (particularly for using HIPPA and HIMMS)? Are we not struck by the rather small hands on the original HIStalk graphic?  


HIStalk has been, all kidding aside, a thankful escape for many of us from a campaign that has been abysmal even by our diminished US standards. Fortunately, there are just 125 or so days left. And with few exceptions, these conversion dates hold.  

Here is the quadrennial cheat sheet.  

A proper understanding of the 2016 election starts with the massive advantage Democrats have in the Electoral College. The Democrats have a safe hold on 19 states (plus DC) representing 242 Electoral College votes. (Note: If you still are suffering under the delusion that the popular vote selects the president, let’s email about a couple of ideas for your trip to the Albert Gore Presidential Library). As a quick civics reminder, you only need 270 Electoral College votes to become president.  

So, with a probable shortfall of just 28 Electoral College votes to get to 270, the Democratic path is far easier. As an indicative example, a Republican could win every “swing” state from Ohio to Virginia, but lose Florida (29 EC) and thereby lose the presidency. It is not quite as challenging as running a health system with an antiquated MUMPS technical architecture, but it is still a daunting task for the GOP.         

The statistician-turned-blogger Nate Silver places the odds of a Hillary victory at 80 percent with one of his two models factoring in GDP (Q1 GDP was 1.1 percent) for a lower 75 percent chance. He probably has that about right and (spoiler alert) decisions like the Trump VP pick aren’t going to radically change that.

No matter the outcome at the top of the ticket, neither Democrats nor Republicans are likely to dominate the breadth of the electoral landscape. Republicans have a fairly solid grasp on the US House (247-188) and they also control 31 governorships. As Barron’s wrote over the long weekend, ongoing divided government will offer a muted welcome to any agenda this January.  

As for healthcare, the issue significantly trails the economy/jobs and terrorism when it comes to top voter concerns. Moreover, opinions are very settled and polarized. Forty-two percent favor the ACA, while 44 percent oppose it.  

Consequently, Clinton and Trump will use talking point level rhetoric, predominately to drive turnout. Hillary will take on big pharma, calling for caps on prescription drug costs. Trump will bemoan premium increases, call for ACA repeal, and assure us he is going to do something “fantastic.” You will feel like you are watching “Saturday Night Live.”

Notably, there is an important piece of emerging voter sentiment that we shouldn’t miss amid the posturing and platitudes. According to the June KFF poll, 90 percent are worried about the amount people pay for their healthcare premiums, while 85 percent are worried about increased cost of deductibles. Consternation over cost is growing and will be reinforced during open enrollment this fall. 

As we look out to the first 100 days of the new administration, we will see a level of change on health policy that is more incremental than historic. Importantly, MACRA will continue to advance at the agency level, buttressed by solid bipartisan opposition to fee-for-service. At the state level, ongoing programmatic Medicaid changes move forward. Finally, even with the Cadillac tax delay, employers will experiment further with wellness incentives and alternative (and narrower) network design.

In the Atlantic, Jonathan Rauch makes a lonely case for a renewed establishment that can impose some modicum of order. Few will like that treatment plan. His Chaos Syndrome diagnosis, however, is inarguable, as is his view that in the near term, “it will only get worse.”  

Donald Trigg is president of Cerner Health Ventures. In a previous life, he worked for President George W. Bush starting on the 2000 presidential campaign in Austin, Texas, and then after a brief Florida detour, in Washington, DC for the first half of Bush’s first term. 

Readers Write: Who’s On First? Baseball’s Lessons for Hospital Shift Scheduling

June 29, 2016 Readers Write 1 Comment

Who’s On First? Baseball’s Lessons for Hospital Shift Scheduling
By Suvas Vajracharya, PhD


A single MLB season includes over 1,200 players, 2,340 games, and 75 million fans in stadiums. In just 10 seasons, it’s possible to generate more baseball schedule options than there are atoms in the universe. Yet a full season of baseball scheduling is still far less complicated than just a single month of scheduling for 24/7 coverage shifts in a hospital emergency department. There’s good reason hospital operations teams are stressed about scheduling. Trying to do this manually with paper or a spreadsheet is an exercise in pure masochism.
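A quick back-of-envelope calculation shows why. Assuming, purely for illustration, one emergency department with three shifts a day, a 30-day month, and 10 physicians eligible for each shift, the raw assignment space already dwarfs the commonly cited estimate of roughly 10^80 atoms in the observable universe:

```python
# Back-of-envelope sketch (numbers are illustrative, not from the article):
# one ED month with 3 shifts/day, 30 days, 10 eligible physicians per shift.
physicians = 10
shifts_per_day = 3
days = 30

slots = shifts_per_day * days            # 90 shift slots to fill
naive_assignments = physicians ** slots  # 10 choices per slot -> 10**90

atoms_in_universe = 10 ** 80             # common rough estimate
print(naive_assignments > atoms_in_universe)  # True, before any rules apply
```

Real rules (rest requirements, fairness, subspecialty coverage) prune that space enormously, but deciding *which* assignments survive the pruning is exactly the hard combinatorial problem.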

First, a bit of history. Major League Baseball started out using a guy in the commissioner’s office to set up season schedules. Harry Simmons quickly found the task so overwhelming that he left the office and worked on the schedule as his full-time job (sound familiar?). In 1981, the league assigned the job to a husband-wife team named Henry and Holly Stephenson, who set the schedules for two decades using a mix of computers and manual scheduling.

Tech leaders at IBM, MIT, Stanford, and Carnegie Mellon all tried to unseat these scheduling gurus and failed until 2005, when the league switched to what is called “combinatorial optimization” technology to generate their schedules entirely by computer.

Today, the same applied mathematics technology is used in not just Major League Baseball, but in all sports leagues, airline schedules, and retail stores, too. Any time you’ve got a mix of teams, individuals, holidays, facilities, unpredictable weather patterns, changing demand, and lots and lots of rules that sound straight out of high school word problems … that’s a scheduling job for advanced computing.

Healthcare, as anyone with experience in the sector might guess, is behind the times when it comes to scheduling technology. The vast majority of hospital departments (an estimated 80 percent) are still setting schedules manually, like our poor old friend Harry Simmons. It’s a problem that can’t be ignored any longer. Not only is manual scheduling a major time sink for hospital operations staff, it also contributes to the already significant issues of professional burnout and physician shortages.

The MLB uses scheduling software in two distinct ways. First, they generate an established schedule for the season using set rules. These include rules designed to prevent player burnout, such as requiring a day off after a team flies west to east across the country or not playing on certain holidays. There are also operational rules, such as not having two home games in the same city the same night or making sure the weekend and weekday games are equally divided among teams.

In healthcare, these established schedule rules include things like not scheduling back-to-back night shifts for a physician, making sure weekend on-call time is fairly distributed, and ensuring key sub-specialists are available 24/7 for procedures. This rules-based schedule serves as the baseline.
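A baseline rule check of this kind can be sketched in a few lines. The rule, data, and function name below are hypothetical, not any vendor’s actual engine; the sketch validates one rule — no back-to-back night shifts — against a candidate month:

```python
# Hypothetical rules-based check: flag any physician assigned night shifts
# on consecutive calendar days. Schedule maps day-of-month -> night doctor.
def violates_back_to_back_nights(schedule):
    """Return the set of doctors scheduled for consecutive night shifts."""
    violations = set()
    for day in sorted(schedule):
        if day + 1 in schedule and schedule[day] == schedule[day + 1]:
            violations.add(schedule[day])
    return violations

night_shifts = {1: "Dr. A", 2: "Dr. B", 3: "Dr. B", 4: "Dr. C"}
print(violates_back_to_back_nights(night_shifts))  # {'Dr. B'}
```

A production scheduler runs dozens of such rules simultaneously and searches for an assignment that satisfies all of them at once, which is what makes the problem combinatorial rather than a simple checklist.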

After this, a second type of scheduling tool comes into play. These are requests that let the schedule flexibly adapt to changes. When a blizzard knocks out a week of MLB games or they need to cancel a series in Puerto Rico due to Zika concerns, it’s this second set of optimization technologies that reconfigures the schedule to get things back on track for the season.

In healthcare, schedule requests happen any time and all the time. Vacation, maternity, schedule swaps, requests for overtime, adding locum tenens, adding mandatory training sessions — hospital schedules change far more frequently than MLB schedules, adding to the complexity.

A recent study of over 5,500 real medical department physician shift schedules showed that scheduling complexity varies by specialty. Emergency medicine has by far the most complex process, with an average of 62 repeating scheduling rules and 276 monthly schedule change requests. Hospital medicine and OB-GYN follow behind, while office-based specialties such as nephrology are much simpler, though still beyond anything in the MLB. The number of possible schedules you could generate with that complexity and variability in emergency medicine is mind-boggling. That specialty also just happens to have the highest rate of physician burnout.
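To get a feel for why the math is mind-boggling, consider a back-of-the-envelope count. The numbers below are hypothetical, not drawn from the study; they just show how fast the raw search space explodes before any rules are applied.

```python
# Illustrative only: hypothetical department, not figures from the study.
physicians = 10
days = 30
shifts_per_day = 3

# If each of the 90 shift slots could go to any of the 10 physicians,
# the unconstrained search space is 10^90 candidate schedules.
combinations = physicians ** (days * shifts_per_day)
print(f"{combinations:.2e}")
```

Almost all of those candidates violate some rule, which is exactly why finding the good ones is a search problem for computers, not for a pencil and a wall calendar.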

It is time for hospital operations leaders to figure out what the MLB discovered way back in 1981: setting complex schedules is a job for computers. Using sophisticated machine learning to balance dozens of rules and to support flexibility for ongoing changes is good practice for baseball players, pilots, and physicians. With the help of technology, hospitals might already have the solutions they’re looking for when it comes to care coordination, physician retention, increasing patient volume, and preventing staff burnout. It’s time for hospital operations to play ball.

Suvas Vajracharya, PhD is founder and CEO of Lightning Bolt Solutions of South San Francisco, CA.
