
Readers Write: Ready or Not, ASC X12 275 Attachment EDI Transaction Is Coming

October 17, 2016 Readers Write No Comments

Ready or Not, ASC X12 275 Attachment EDI Transaction Is Coming
By Lindy Benton


As electronic as we are in many aspects of business – and life in general – oftentimes healthcare providers and payers are still using paper for claim attachment requests and responses. With the ASC X12 275 attachment electronic data interchange on the horizon, the need for utilizing secure, electronic transactions will soon be here.

Let’s look at the claim attachment process.

  1. A claim attachment arises when a payer requests additional information from a provider to adjudicate a claim. The attachment supplies information, or answers questions, not included in the original claim.
  2. In many instances, the process for sending and receiving attachments is still largely done via a manual, paper-based format.
  3. Paper-based transactions are slow, inefficient, and can bog down the revenue cycle. Additionally, paper transactions are prone to getting lost in transit and are difficult if not impossible to track.
  4. The ASC X12 275 transaction has been proposed as a secure, electronic (EDI) method of managing the attachment request while making it uniform across all providers and payers.

The ASC X12 275 can be sent either solicited or unsolicited. A solicited 275 arises when a claim is subjected to medical or utilization review during the adjudication process and the payer requests specific information to supplement or support the provider’s request for payment of the services. The payer’s request for additional information may be service-specific or may apply to the entire claim, and the 275 is used to transmit it. The provider then uses the 275 to respond within the time frame the payer specifies.
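Since the 275 is an EDI transaction, it may help to see the general shape of an X12 message. The following is a toy Python sketch, not a compliant 275: segment names like ST, TRN, and BIN do exist in X12, but the qualifiers, ordering, and envelope here are simplified assumptions, and a real implementation would follow the ASC X12 implementation guide.

```python
# Toy sketch of assembling X12-style segments for a 275 attachment
# response. The element separator ('*') and segment terminator ('~')
# are X12 conventions; the specific values below are illustrative only.

def segment(*elements):
    """Join elements with '*' and terminate with '~', X12-style."""
    return "*".join(str(e) for e in elements) + "~"

def build_275_sketch(control_num, payer_tracking_id, attachment_text):
    segments = [
        segment("ST", "275", control_num),        # transaction set header
        segment("TRN", "2", payer_tracking_id),   # trace: ties response to the payer's request
        segment("BIN", len(attachment_text), attachment_text),  # attachment payload
        segment("SE", 4, control_num),            # trailer: segment count + control number
    ]
    return "".join(segments)

print(build_275_sketch("0001", "REQ123", "Operative note ..."))
```

The point of the standard is exactly this uniformity: every payer and provider parses the same delimited structure instead of exchanging paper.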

Both HIPAA and the Affordable Care Act are driving the adoption of these secure, electronic transaction standards. HIPAA requires the establishment of national standards for electronic healthcare transactions and national identifiers for providers, health insurance plans, and employers. In Section 1104(b)(2) of the ACA, Congress required the adoption of operating rules for the healthcare industry and directed the secretary of Health and Human Services to “adopt a single set of operating rules for each transaction” with the goal of creating as much uniformity in the implementation of the electronic standards as possible.

Providers and payers will be required to adopt these standards at some point and it will happen sooner rather than later, so it’s time to be prepared.

The final specifications and detail for the EDI 275 transaction were supposed to be finalized in January 2016, but that has yet to happen. Both the American Hospital Association and American Medical Association have urged the Department of Health and Human Services to finalize and adopt the latest 275 standard, so with that kind of backing, it’s only a matter of time until the 275 transaction standard gains momentum and comes to fruition.

EDI 275 is coming. The question is, will you be ready?

Lindy Benton is president and CEO of Vyne of Dunwoody, GA.


Readers Write: Exploring the EMR Debate: Onus On Analytics Companies to Deliver Insights

October 17, 2016 Readers Write 1 Comment

Exploring the EMR Debate: Onus On Analytics Companies to Deliver Insights
By Leonard D’Avolio, PhD


Late last month, a great op-ed published in The Wall Street Journal called “Turn Off the Computer and Listen to the Patient” brought a critical healthcare issue to the forefront of the national discussion. The physician authors, Caleb Gardner, MD and John Levinson, MD, describe the frustrations physicians experience with poor design, federal incentives, and the “one-size-fits-all rules for medical practice” implemented in today’s electronic medical records (EMRs).

From the start, the counter to any criticism of the EMR was that the collection of digital health data will finally make it possible to discover opportunities to improve the quality of care, prevent error, and steer resources to where they are needed most. This is, after all, the story of nearly every other industry post-digitization.

However, many organizations are learning the hard way that the business intelligence tools that were so successful in helping other industries learn from their quantified and reliable sales, inventory, and finance data can be limited in trying to make sense of healthcare’s unstructured, sparse, and often inaccurate clinical data.

Data warehouses and reporting tools — the foundation for understanding the quantified and reliable sales, inventory, and finance data of other industries — are useful for required reporting of process measures for CMS, ACO, AQC, and whatever mandates come next. However, it should be made clear that these multi-year, multi-million dollar investments are designed to address the concerns of fee-for-service care: what happened, to whom, and when. They will not begin to answer the questions most critical to value-based care: what is likely to happen, to whom, and what should be done about it.

Rapidly advancing analytic approaches are well suited for healthcare data and designed to answer the questions of value-based care. Unfortunately, journalists and vendors alike have done a terrible job in communicating the value, potential, and nature of these approaches.

Hidden beneath a veneer of buzzwords including artificial intelligence, big data, cognitive computing, data science, data mining, and machine learning is a set of methods that has proven capable of answering the “what’s next” questions of value-based care across clinical domains including cardiothoracic surgery, urology, orthopedic surgery, plastic surgery, otolaryngology, general surgery, transplant, trauma, and neurosurgery, as well as cancer prediction and prognosis and intensive care unit morbidity. Despite 20+ years of empirical evidence demonstrating superior predictive performance, these approaches have remained the nearly exclusive property of academics.

The rhetoric surrounding these methods is bimodal and not particularly helpful. Either big data will cure cancer in just a few years or clinicians proudly list the reasons they will not be replaced by virtual AI versions of themselves. Both are fun reads, but neither addresses the immediate opportunity to capitalize on the painstakingly entered data to deliver care more efficiently today.

More productive is a framing of machine learning as what it actually is — an emerging tool. Like all tools, machine learning has inherent pros and cons that should be considered.

In the pro column is the ability of these methods to consider many more data points than traditional risk score or rules-based approaches. Also important for medicine is the fact that machine learning-based approaches don’t require that data be well formatted or standardized in order to learn from it. Combined with natural language processing, machine learning can consider the free text impressions of clinicians or case managers in predicting which patient is most likely to benefit from attention sooner. Like clinical care, these approaches learn with new experience, allowing insights to evolve based on the ever-changing dynamics of care delivery.
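As a rough illustration of the idea, here is a minimal sketch in pure Python of learning per-token log-odds from labeled free-text notes and scoring a new note. The notes, labels, and method are invented toy stand-ins; production systems layer far richer NLP and machine learning on top, but the core move is the same: let the text itself indicate which patients need attention.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(notes, labels):
    """Count tokens per class with add-one smoothing; return per-token log-odds."""
    pos, neg = Counter(), Counter()
    for note, label in zip(notes, labels):
        (pos if label else neg).update(tokenize(note))
    vocab = set(pos) | set(neg)
    n_pos = sum(pos.values()) + len(vocab)
    n_neg = sum(neg.values()) + len(vocab)
    return {t: math.log((pos[t] + 1) / n_pos) - math.log((neg[t] + 1) / n_neg)
            for t in vocab}

def score(weights, note):
    """Sum the log-odds of the note's tokens; higher means more positive-class-like."""
    return sum(weights.get(t, 0.0) for t in tokenize(note))

# Toy training data: case-management note snippets with invented labels.
notes = ["member upset about billing wait times",
         "routine wellness visit no concerns",
         "complaint about coverage denial frustrated",
         "annual checkup satisfied with care"]
labels = [1, 0, 1, 0]   # 1 = member later dis-enrolled (toy labels)

w = train(notes, labels)
print(score(w, "frustrated about billing"))   # higher score: more disenrollment-like
print(score(w, "routine visit satisfied"))
```

Note that nothing here requires the notes to be formatted or standardized, which is exactly the property that makes these approaches a fit for messy clinical data.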

To illustrate, the organization I work with was recently enlisted to identify members of a health plan most likely to dis-enroll after one year of membership. This is a particularly sensitive loss for organizations that take on the financial responsibility of delivering care, as considerable investments are made in Year 1 stabilizing and maintaining the health of the member.

Using software designed to employ these methods, we consumed 30 file types, from case management notes, to claims, to call center transcripts. Comparing all of the data of members that dis-enrolled after one year versus those that stayed in the plan, we learned the patterns that most highly correlate with disenrollment. Our partner uses these insights to proactively call members before they dis-enroll. As their call center employs strategies to reduce specific causes of dissatisfaction, members’ reasons for wanting to leave change. So, too, do the patterns emerging from the software.

The result is greater member satisfaction, record low dis-enrollment rates, and a more proactive approach to addressing member concerns. It’s not the cure for cancer, but it is one of a growing number of questions that require addressing when the success of an organization is dependent on using resources efficiently.

The greatest limitation of machine learning to date has been inaccessibility. Like the mainframe before it, this new technology has remained the exclusive domain of experts. In most applications, each model is developed over the course of months using tools designed for data scientists. The results are delivered as recommendations, not HIPAA-compliant software ready to be plugged in when and where needed. Like the evolution of computing, all of that’s about to change.

Just hours after reading the Gardner and Levinson op-ed, I sat across from a primary care doc friend as she ended a long day of practice by charting out the last few patients. Her frustration was palpable as she fought her way through screen after screen of diabetes-related reporting requirements having “nothing to do with keeping [her] patients healthy.” Her thoughts on the benefits of using her organization’s industry-leading EMR were less measured than Drs. Gardner and Levinson: “I’d rather poke my eyes out.”

I agree fully with Drs. Gardner and Levinson. The answer isn’t abandoning electronic systems, but rather striking a balance between EMR usability and the valuable information that they provide. But I’ve been in healthcare long enough to know clinicians won’t be enjoying well-designed EMRs any time soon. In the meantime, it’s nice to know we don’t need to wait to begin generating returns from all their hard work.

Leonard D’Avolio, PhD is assistant professor at Harvard Medical School and CEO and co-founder of Cyft of Cambridge, MA.


Readers Write: ECM for Healthcare Advances to HCM (Healthcare Content Management)

October 17, 2016 Readers Write No Comments

ECM for Healthcare Advances to HCM (Healthcare Content Management)
by Amie Teske


Industry analysts project healthy market growth for enterprise content management (ECM) solutions across all industry sectors. Gartner’s 2016 Hype Cycle for Real-Time Health System Technologies places ECM squarely on the “plateau of productivity” at the far right-hand side of the hype cycle curve. This essentially means that ECM software has moved past its market breakthrough and is being actively adopted by healthcare providers.

This is good news for ECM users and technology suppliers, but what’s next for ECM in healthcare? To remain competitive and leading edge, ECM solutions at the plateau must evolve for the sake of customers and the marketplace in order to maintain business success. There is more good news here in that ECM solutions are evolving to keep pace with healthcare changes and demands.

Up to 70 percent of the data needed for effective and comprehensive patient care management and decision-making exists in an unstructured format. This implies a large chasm between the resources and effort healthcare delivery organizations (HDOs) have expended on EHR technology to manage discrete data and the work yet to be done to effectively automate and provide access to the remaining content. ECM solutions are evolving in a new direction that offers HDOs an opportunity to strategically build a bridge to this outstanding content.

Healthcare content management (HCM) is a new term that represents the evolution of ECM for healthcare providers. It is the modern, intelligent approach to managing all unstructured document and image content. The biggest obstacle we must overcome in this journey is the tendency to fall back on traditional thinking, which drives health IT purchases toward siloed, non-integrated systems. Traditional methods for managing patient content have a diminishing role in the future of healthcare. It’s time to set a new course.

An HCM Primer

  • HCM = documents + medical images (photos and video, too).
  • The 70 percent of patient content outside the EHR is primarily unstructured in nature, existing as objects that include not only DICOM (CT, MRI) but also tiff, pdf, mpg, etc.
  • ECM has proven effective for managing tiff, pdf and a variety of other file formats. It is not, however, a technology built to handle DICOM images, which represent the largest and most numerous of the disconnected patient objects in question.
  • Enterprise imaging (EI) technologies have traditionally been responsible for DICOM-based content. These include vendor neutral archives (VNA), enterprise/universal viewers, and worklist and connectivity solutions that are unique to medical image and video capture.
  • Leveraging a single architecture to intentionally integrate ECM and EI technologies — enabling HDOs to effectively capture, manage, access and share all of this content within a common ecosystem — is referred to as healthcare content management or HCM.
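As an illustration of the unified-architecture idea in the primer above, here is a hedged sketch of routing incoming content to the EI or ECM side of an HCM platform by file type. The extensions, pipeline names, and routing rule are illustrative assumptions, not any vendor’s actual API.

```python
# Hypothetical routing step in an HCM ingestion pipeline: DICOM objects
# go to the enterprise imaging (EI) stack, common document/media formats
# go to the ECM stack, and anything else is flagged for review.

DICOM_EXTENSIONS = {".dcm", ".dicom"}
DOCUMENT_EXTENSIONS = {".tiff", ".tif", ".pdf", ".mpg", ".mp4", ".jpg"}

def route_content(filename):
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext in DICOM_EXTENSIONS:
        return "EI"        # enterprise imaging: VNA, universal viewer
    if ext in DOCUMENT_EXTENSIONS:
        return "ECM"       # document workflows: HIM, patient finance, AP, HR
    return "review"        # unknown type: flag for manual classification

print(route_content("chest_ct.dcm"))    # EI
print(route_content("consent.pdf"))     # ECM
```

In a real HCM deployment the routing decision would rest on content metadata and workflow rules, not file extensions, but the principle of one front door feeding two specialized pipelines is the same.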

Although the market is ready for HCM and many HDOs are already moving in this direction, it is important to know what to look for.

Critical Elements of HCM

Although it is the logical first step, HCM encompasses much more than simply unifying ECM and EI technologies into a single architecture to enable shared storage and a single viewing experience for all unstructured content, DICOM and non-DICOM. Just as important is workflow: how all document and image content is orchestrated and handled prior to storage and access. This is essentially the secret sauce and the most difficult aspect of an HCM initiative.

ECM for healthcare workflow is geared to handle back office and clinical workflows associated with health information management, patient finance, accounts payable, and human resources, for example. The intricacies of these workflows must continue to cater to specific regulations around PHI, release of information, etc. All this to say that the workflow component of ECM is critical and must remain intact when converging ECM with EI technologies.

The same goes for enterprise imaging workflows. EI workflow is optimized to handle, for example, image orchestration from many modalities to the core VNA or various PACS systems, medical image tag mapping/morphing to ensure image neutrality, and downtime situations.

These workflow features should not be taken lightly as health systems endeavor to establish a true HCM strategy. Do not overlook the need for these capabilities to ease the complexities inherently involved and to fully capitalize on any investment made.

Guidance for HCM Planning

Consider the following recommendations as you plan an HCM approach and evaluate prospective vendors:

  • Be wary of an archive-only strategy. A clinical content management (CCM) approach is primarily an archive and access strategy; the critical element of workflow is fully or partly missing. Diligent buyers should ask the right questions about workflow and governance of unstructured document and image content before, during, and after storage and access.
  • Always require neutrality. Changing standards is a given in the healthcare industry. HCM should be in alignment with the new standards to ensure all document and image content can be captured, managed, accessed, shared, and migrated without additional cost due to proprietary antics by your vendor. An HCM framework must have a commitment to true neutrality and interoperability.
  • Think strategically. A deliberate HCM framework offered by any healthcare IT vendor should be modular in nature but also able to be executed incrementally and with the end in mind. Beginning with the end in mind is slightly more difficult. The modularity of your HCM approach should allow you to attack your biggest pain points first, solving niche challenges while preserving your budget and showing incremental success in your journey toward the end state.
  • Consider total cost of ownership (TCO). If a common architecture and its associated cost efficiencies are important in wrangling your outstanding 70 percent of disconnected patient content, you cannot afford to take a niche approach. It may seem easier and cheaper to select a group of products from multiple niche vendors to try and solve your most pervasive siloed document and image management problems. Take a careful look at the TCO over the life of these solutions. It is likely the TCO will be higher due to factors which include the number of unique skillsets and FTEs required for a niche strategy.
  • Demand solution flexibility and options. Your HCM approach should provide extensive flexibility and a range of options and alternatives that are adaptable to your unique needs. Software functionality is important, but not the only criterion.

Your HCM approach for strategically managing all unstructured patient content should allow you to:

  • Start small or go big, solving one challenge or many.
  • Establish a common architecture with a unified content platform and viewing strategy for all document and imaging content.
  • Enable unique ECM and EI workflows, not simply storage and access.
  • Hold one technology partner responsible – “one throat to choke” – for easier overall performance management and administration.

Providers of all shapes and sizes must take a thoughtful and deliberate approach when evaluating document and image management solutions. There is much more involved than simply capture and access. Because this category of technology can enable up to 70 percent of your disconnected patient and business information, you cannot afford to make a decision without carefully considering the impact of HCM on your healthcare enterprise, immediately and over time.

Amie Teske is director of global healthcare industry and product marketing for Lexmark Healthcare.


Readers Write: Guaranteeing MACRA Compliance at the Point of Care

October 5, 2016 Readers Write No Comments

Guaranteeing MACRA Compliance at the Point of Care
By David Lareau


MACRA will affect every physician and every clinical encounter. Current systems have been designed to produce transactions to be billed. MACRA will require that clinical conditions have been addressed and documented in accordance with quality care guidelines. The only way to ensure that happens is to do it at the point of care.

The challenge is that physicians need to address all conditions, not just those covered by a MACRA requirement. One approach is to just add another set of things to do, slowing doctors down and getting in their way. This is the transactional approach — just another task.

Most current systems have different tabs that list problems, medications, labs, etc. Users must switch back and forth looking for data. The data cannot be organized by problem since the systems lack any method for correlating information based on clinical condition. Adding another set of disconnected information to satisfy quality measures will only make it worse for users.

A better approach is to integrate quality care requirements for any condition with all the other issues the physician needs to address for a specific patient and to work it into a physician’s typical workflow. A well-designed EHR should have a process running in the background that keeps track of all applicable quality measures and guidelines for the patient being seen. The status of all quality measures must be available at any point in the encounter in a format that ties all information together for any clinical issue.

This requires actionable, problem-oriented views of clinical data, where all information for any clinical issue is available instantly. Physicians need to be able to view, react to, and document clinical information for every problem or issue addressed with the patient. This includes history and physical documentation, review of results, clinical assessments, and treatment plans as well as compliance with quality measures.
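As a sketch of what such a problem-oriented view might look like, the following Python groups a patient’s applicable quality measures by clinical problem and shows which are done and which are outstanding. The measure names and data model are invented for illustration; a real clinical knowledge engine maps problems to measures from maintained guideline content.

```python
# Hypothetical problem-to-measure map (invented measure names).
QUALITY_MEASURES = {
    "diabetes": ["HbA1c in last 6 months", "annual eye exam"],
    "hypertension": ["BP recorded this visit"],
}

def problem_oriented_view(problems, completed_measures):
    """Return, per problem, which applicable measures are done vs. outstanding."""
    view = {}
    for problem in problems:
        applicable = QUALITY_MEASURES.get(problem, [])
        view[problem] = {
            "done": [m for m in applicable if m in completed_measures],
            "outstanding": [m for m in applicable if m not in completed_measures],
        }
    return view

view = problem_oriented_view(
    problems=["diabetes", "hypertension"],
    completed_measures={"HbA1c in last 6 months"},
)
print(view["diabetes"]["outstanding"])   # ['annual eye exam']
```

The point is that the measure status travels with the problem, so the clinician never has to leave the problem’s context to satisfy a MACRA requirement.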

Guaranteeing MACRA compliance at the point of care can be accomplished by using a clinical knowledge engine that presents all relevant information for any clinical issue so that MACRA quality measures are seamlessly included as part of the patient’s overall clinical picture, not as just another task to be added on to the already burdensome workflows of current systems.

David Lareau is CEO of Medicomp Systems of Chantilly, VA.


Readers Write: Telemedicine Is Just Medicine

October 5, 2016 Readers Write 6 Comments

Telemedicine Is Just Medicine
By Teri Thomas


Telemedicine. mHealth. Remote healthcare. What’s the best term for a given use case? A large portion of my job is focused on it, yet my answer is, “I don’t much care what term you use.”

Well, I guess I care a little if I see confusion getting in the way of progress. Don’t get me wrong — I’m glad that nobody has been saying “mMedicine” yet (would that be like “mmm…medicine” or “em-medicine?”) I don’t love “virtual health,” as it makes me wonder: if I watched lots of exercise shows and raw food infomercials, could I get virtually healthy?

Defining telemedicine as a subset of telehealth related to direct care at a distance vs. provision of healthcare-related services at a distance, while correct—who cares? Consider if when indoor plumbing was new, people discussed “s-water” (out of a stream), vs. “i-water” (from in the home). I guess i-water would be better than p-water from pipes (it’s OK to giggle a little — be a middle-schooler for a minute). We care about perhaps three factors:

  • Is it modified/sparkling/flavored?
  • Do we have to pay for it (bottled water vs. tap water)?
  • Is it clean enough to drink?

Medicine is medicine. Healthcare is healthcare. It’s care: good, bad, and a ton in the middle. Yet I hear murmurs like, “Telemedicine isn’t good quality healthcare.” That’s like saying tap water isn’t good enough to drink because you’ve spent time in Flint.

Good quality care isn’t determined by the location of the provider or patient. Care can be done very well without requiring the patient and the clinician to be in the same room. It can also be done very poorly. Probably the majority of it — just like when the doctor and patient are together in a room — is not perfect, not bad, and mostly OK. 

Not every type of visit is appropriate over video, but many types are. In dermatology, providers have been using photos for decades. Camera cost and image resolution have improved so dramatically that even inexpensive systems can capture more image detail than a physician with the sharpest of vision could see unaided. Stethoscopes, lights, cameras, video connections, telephones — all are tools to help us practice medicine better. Sometimes the tools work great and are helpful, and sometimes not.

If the Internet connection is slow or the battery dies, quality is impacted. But think for a minute about the impact on quality of care for the physician who had an extra-complex first appointment and is running an hour or more behind. The patients are stacking up and getting upset about their wait times. The clinic day is lengthening. The pressure to catch up mounts. Finally, consider the patient taking off work, driving to a clinic, parking, and sitting in a waiting room with Sally Pink Eye, feeling bored at best and anxious and angry about the wait at worst.

How high will the quality of that encounter be compared to the patient connecting with the provider from home or work? The patient didn’t have to drive, and even while waiting, they were likely in a more comfortable environment with other things to do.

Keep in mind that if the patient were physically there in the dermatology office and the lights went out or the dermatologist’s glasses were suddenly broken, it would be very hard to provide a quality exam. For a remote derm visit, if you can ensure reliable “tool” quality (history from the patient and/or GP, high enough resolution video/images, clear audio), why should there be a care quality concern? Yet these kinds of “visits” — heavily image-focused encounters — are still traditionally accomplished by asking a patient to come to the provider.

Thank you to Kaiser and other telemedicine leaders for providing us with the validating data: remote visits can be done with high quality, lower costs, and positive care and patient satisfaction outcomes. On behalf of patients, who increasingly expect more convenient care: hesitant healthcare providers, please invest in video visit technology and seek opportunities to provide more convenient care for your patients. Payers, please recognize that this is in everyone’s best interest and start financially rewarding those providers.

Teri Thomas is director of innovation for an academic medical center.


Readers Write: What Hospitals Can Learn from the Insurance Industry About Privacy/Insider Threat Risk Mitigation

October 5, 2016 Readers Write No Comments

What Hospitals Can Learn from the Insurance Industry About Privacy/Insider Threat Risk Mitigation
By Robert B. Kuller


The drumbeat of hospital PHI breaches marches on. Every day there seems to be another news article on a hospital being hit with a ransomware attack. Hospital CEOs and boards are placing ever-increasing demands on their CIOs to pour technology and resources into preventing these perimeter attacks.

Who can blame them? They don’t want to have to appear before the media and explain why the attack wasn’t prevented given the current high threat environment, how many patient records were affected, and how they will deal with the aftermath of the breach.

Even though these perimeter attacks are no doubt high profile, there is a larger threat that is not being given high enough attention by CEOs or their boards and certainly not the same level of technology and resources to deal with it — privacy and insider-borne threats. According to a recent study by Clearswift, 58 percent of all security incidents can be attributed to insider threats (employees, ex-employees, and trusted partners).

The primary causative factors were identified as inadvertent human error and lack of awareness or understanding. Only 26 percent of organizations are confident they can accurately determine the source of the incident. There are plenty more statistics to throw around, but suffice to say, insider threat is a major problem and represents a large part of hospital breaches even though they do not routinely get the same level of media coverage.

Let’s take a quick look at what the hospital landscape looks like in terms of dealing with insider threat today. Most privacy staffs are very small, usually about two people. They are charged with identifying potential breaches; investigating those potential breaches to determine which are actual breaches; interfacing with department heads; internal and regulatory reporting on actual breaches; putting together a breach reaction plan; assisting with staff education; and preventing future breaches. With a typical 400-bed hospital exceeding five million EHR transactions per day — all of which need to be reviewed — any reasonable person would conclude that is a very high set of expectations for such a small staff.

The vast majority of hospitals continue to use inferior, outdated technology because of severe budget limitations that are applied to the privacy function, while tens of millions of dollars are spent on perimeter defenses. The capabilities of these systems are very limited and basically dump tens of thousands of audit log entries into Excel spreadsheets that need to be reviewed by the privacy staff. Cutting-edge, behaviorally-based systems with advanced search engines, deep insight visualization, and proactive monitoring capabilities are available, but not regularly adopted.
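To make the contrast concrete, here is a minimal sketch of the kind of statistical screening a behaviorally-based system might start from: flagging users whose daily record-access volume sits far above their peers’. The log format and threshold are illustrative assumptions; real products model much more, such as time of day, role, department, and patient relationships.

```python
import statistics
from collections import Counter

def flag_outliers(access_log, z_cutoff=2.0):
    """access_log: list of (user, patient_id) events for one day.
    Returns users whose access count exceeds the mean by z_cutoff
    population standard deviations."""
    counts = Counter(user for user, _ in access_log)
    volumes = list(counts.values())
    if len(volumes) < 2:
        return []   # not enough users to establish a baseline
    mean = statistics.mean(volumes)
    stdev = statistics.pstdev(volumes)
    if stdev == 0:
        return []   # everyone looks the same; nothing to flag
    return [u for u, n in counts.items() if (n - mean) / stdev > z_cutoff]

# Toy day: ten users touch ~5 records each, one touches 500.
day_log = ([(f"user{i}", j) for i in range(10) for j in range(5)]
           + [("snoop", j) for j in range(500)])
print(flag_outliers(day_log))   # ['snoop']
```

Even a screen this crude turns millions of raw audit entries into a short worklist, which is the shift from reactive spreadsheet review to proactive monitoring that the newer systems enable.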

Privacy/insider threat is primarily viewed as a compliance issue. Many hospital CEOs and boards justify giving low priority and resources to this area by looking at the potential fines that OCR will levy if their hospital’s PHI is breached. In fact, the fines are relatively low; breaches have to break the 500-record threshold (although OCR recently announced an effort to delve into breaches below this threshold); you have to be found guilty of not doing reasonable due diligence; and you are given multiple chances at correcting bad practices prior to fines being assessed. Combine this with an overreliance on cyber risk insurance and you have a potential for disaster.

The actual risk profile should start first and foremost with loss of hospital reputation. A hospital brand takes years and millions of dollars to build. One privacy breach can leave it in ruins. The second risk is patient loss and the associated costs of replacing those patients. A recent poll by TransUnion showed that nearly seven in 10 respondents would avoid healthcare providers that had a privacy breach. The third major risk is lawsuits, legal costs, and settlements. Settlement costs are large and juries generally rule against institutions and for the damaged plaintiff. Fourth would be compliance.

There also seems to be a misunderstanding of cyber risk insurance. Like other insurance, it will not reward bad practices or flawed due diligence on behalf of the policyholder. Insurers will do a pre-audit to make sure that the risk they are undertaking is understood, that proper prevention technologies are in place, and that best practices are being documented and followed. Once a breach has been claimed, they will generally send out another team of investigators to determine if the items mentioned above were in place and best efforts were maintained during the breach. If they weren’t, this could lead to a denial or at least a prolonged negotiating process. Premium costs will also be reflective of level of preparedness and payouts generally do not cover anywhere near the full costs of the breach.

Prior to coming back to the hospital industry, I spent six years in the disability insurance industry, where top management and boards take both insider threat and the actual risk matrix of PHI breach very seriously. I believe the hospital industry can learn a valuable lesson from the disability industry. This lesson can be summarized as follows:

  1. Take the real risk matrix seriously.
  2. Put the proper amount of technological and human resources in place in alignment with the actual risk profile.
  3. Buy the best technology available, update it as frequently as possible, and get proactive rather than reactive.
  4. Educate and remind your staff constantly of proper behavior and the consequences of improper behavior (up to and including being terminated).
  5. Don’t overly rely on cyber risk insurance.
  6. Review the CISO’s reporting structure (avoid natural conflicts of interest with the CIO) and have them report to the board for an independent assessment of privacy/insider threat status on a regular basis.

As difficult and expensive as hospital data security is, it is both mandatory to protect patients and part of the price of admission to the market. Although we are in a constant battle to stay one step ahead of the bad guy, we often find ourselves one step behind. That, I’m afraid, is the nature of the beast.

Let’s place privacy/insider threat on an equal footing with the real risks associated with it. It simply makes sense to do so, from the patient, risk, financial, and fiduciary perspectives.

Robert B. Kuller is chief commercial officer for Haystack Informatics of Philadelphia, PA.

October 5, 2016 Readers Write No Comments

Readers Write: The Surgeon General’s Rallying Cry Against the Opioid Epidemic Must Also Be a Call to Arms for Healthcare IT

September 14, 2016 Readers Write 2 Comments

By Thomas Sullivan, MD


In a rare open letter to the nation’s doctors, US Surgeon General Vivek Murthy, MD, MBA, sounded a rallying cry urging their greater participation in fighting the opioid-abuse crisis afflicting our country. Missing from the USSG’s commendable call to arms, though, was any mention of the role technology plays in reducing drug diversion and doctor shopping and in providing ready access to services that support patients.

Those of us in healthcare IT know that we are critical to this cause. The USSG is talking to our customers, and we know our customers aren’t adopting the widely available substance abuse-fighting technologies as quickly as they could. These technologies include:

  • E-prescribing technology, particularly EPCS to support the electronic prescribing of controlled substances, which is key to helping providers more efficiently deploy and monitor prescription medicines being prescribed or over-prescribed across a practice.
  • Medication adherence monitoring technology that lets providers gauge in real time, at the point of encounter, a patient’s level of compliance with drug therapy and provide patients with evidence-based support and services for self-management.
  • Clinical decision support that helps doctors avoid adverse drug events and medication errors.
  • State-run prescription drug monitoring programs (PDMPs) designed to help law enforcement track the use of controlled substances and help prescribers identify doctor shoppers and others seeking illicit access to controlled substances.

Specific to the opioid abuse epidemic, the most important next step is for physicians to be able to check PDMPs within their normal workflow. Simply said, the integration and availability of PDMP data within e-medication management solutions — e-prescribing, medication history services, medication adherence tools and the like — will result in the greatest use of PDMP data and the best one-two tech-assisted punch we have in the opioid battle.

Over the past two years, policymakers have begun to take action in using EPCS to address this crisis. This past March, New York State took a major step toward this goal when it began requiring e-prescriptions for all controlled substances as well as all non-controlled substances, frequently referred to as “legend drugs.” The mandate stems from I-STOP, the Internet System for Tracking Over-Prescribing act, originally passed in 2012. New York’s experience now serves as a case study for other states that wish to modernize their prescribing infrastructure and address opioid abuse.

Maine will likewise require opioid medications to be prescribed electronically via Drug Enforcement Administration-certified EPCS solutions beginning in July 2017. Several other states, including Massachusetts, Missouri, and Maryland, are also considering or working to pass mandatory EPCS requirements for prescribers.

Unfortunately, neither New York nor Maine PDMP data is currently accessible to health IT vendors for integration into the prescribing workflow of providers.

E-prescribing – the direct digital transfer of patient prescriptions from provider to pharmacy – is broadly recognized as an important tool in promoting patient safety, convenience, and overall efficiency for all stakeholders in the prescription process. It helps patients and doctors guard against medication errors, such as drug-drug interactions; reduces common errors inherent in paper-based prescribing — including illegible handwriting, misinterpreted abbreviations, and unclear dosages — and provides critical decision support tools.

Although, nationwide, more than 70 percent of doctors transmit most prescriptions electronically, the vast majority of these prescriptions are for legend drugs only. By comparison, fewer than 10 percent of doctors use EPCS solutions to e-prescribe controlled substances. In New York, however, the I-STOP legislation has driven EPCS adoption to an estimated 70 percent or more. All indications are that the laws passed in New York and Maine mandating use of EPCS and PDMPs will almost certainly prove helpful in curbing opioid abuse, fraud, and diversion and in preventing possible addiction down the line.

However, full adoption of PDMPs is unlikely to be achieved until PDMP information is accessible within the doctor’s technology workflow. Ultimately, the opioid-abuse battle needs to be fought by states enabling their respective PDMP data to flow through doctors’ own workflows, rather than requiring physicians and clinicians to leave their familiar software tools and interact with a separate portal to access their state PDMP databases.

In the case of New York State, the Medical Society of the State of New York conducted a survey that found a large percentage of prescribers believed that forcing mandatory compliance was placing an undue burden on their practices. No doubt, physicians feel overburdened with IT mandates. Improving integration between PDMPs and electronic health records will alleviate some of these burdens and allow for better compliance.

States must work more closely with the healthcare community to remove obstacles so that compliance can approach 100 percent. Every state has the opportunity to learn from New York how to smooth implementation and drive adoption to make a meaningful impact on the growing opioid abuse epidemic. Leadership in healthcare IT companies must be more vocal about our role and responsibilities in enabling doctors on the ground.

With the US Surgeon General weighing in, those of us in the healthcare IT community must rise up to make our voices heard. The importance of integrating e-medication management tools and EPCS solutions with PDMP data cannot be overstated. It is the best path toward helping our customers — the doctors — make the right decision, at the right time, with the right data, on the right platforms.

Thomas Sullivan, MD is chief strategy and privacy officer of DrFirst of Rockville, MD.


Readers Write: The Electronic Health Record and The Golden Spike

September 14, 2016 Readers Write 1 Comment

By Frank D. Byrne, MD


On May 10, 1869, at a ceremony in Utah, Leland Stanford drove the final spike to join the first transcontinental railroad across the US. Considered one of the great technological feats of the 19th century, the railroad would become a revolutionary transportation network that changed the young country.


For the past few years, the healthcare industry and the patients in its care have experienced a similar “Golden Spike Era” through the deployment of the electronic health record (EHR). Others have used this analogy, including author Robert Wachter, MD, in an excellent presentation at the American College of Healthcare Executives 2016 Congress on Healthcare Leadership.

Why is this comparison relevant? While the Utah ceremony marked the completion of a transcontinental railroad, it did not actually mark the completion of a seamless coast-to-coast rail network. Key gaps remained, a true coast-to-coast rail link was not achieved until more than a year later, and further improvements continued long afterward.

Similarly, while a recent study indicated that 96 percent of hospitals possessed certified EHR technology and 84 percent had adopted at least a basic EHR system in 2015, much more is needed to achieve optimized deployment of the EHR to make healthcare better, safer, and more efficient, and to improve the health of our communities.

Nonetheless, the EHR is one of the major advances in healthcare in my professional lifetime. It is an essential tool in progress toward the Institute for Healthcare Improvement’s “Triple Aim for Healthcare” – better patient experience, lower per-capita cost, and improved population health. We cannot achieve those laudable goals without mining and analyzing the data embedded in the EHR to generate useful information to guide our actions. Advances in data science are enabling the development of meaningful predictive analytics, clinical decision support, and other tools that will advance quality, safety, and efficiency.

But there is much work to do. Christine Sinsky, MD, vice president of professional satisfaction for the American Medical Association, and others have written with concern about dissatisfied physicians, nurses, and other clinicians who feel the EHR is distracting them from patient care and meaningful interactions with patients.

“Contemporary medical records are used for purposes that extend beyond supporting patient and caregiver … the primary purpose, i.e. the support of cognition and thoughtful, concise communication, has been crowded out,” Sinsky and co-author Stephen Martin, MD note in a recent article.

Perhaps you’ve also seen the sobering drawing by a seven-year-old girl depicting a doctor focused on the computer screen with his back to her, his patient.


Some of the EHR’s shortcomings may result from a lack of end-user input prior to implementation, possibly because the implementing organization did not incorporate the extensive research gathered by EHR providers. Further, even when end-user input is gathered before implementation, there are always challenges prior to go-live, and it seems to me that optimization after implementation has been under-resourced. And let’s not treat temporary “fixes” as the best and final answer. I was dismayed recently to see “hiring medical scribes” listed as one of the top 10 best practices in a recent Modern Healthcare poll.

Don’t get me wrong: to have a long game, you must have a successful plan to get through today, and if hiring scribes can mitigate physician dissatisfaction until the systems are improved, so be it. But scribes are a temporary workaround, not a system solution.

As an advisor to an early-stage venture capital fund, I’ve enjoyed listening to many interesting and inspiring pitches for new technology solutions. Initially, my algorithm used to rate these ideas was:

  • Is it a novel idea?
  • Will enough people or organizations pay for it?
  • Do they have the right customer?
  • Do they have the right revenue model?

Thanks to the input of physicians, nurses, therapists, and other clinicians, and the work of Dr. Sinsky and others, I quickly added a fifth, very important vital sign: Will it make the lives of those providing care better? Similarly, author, speaker and investor Dave Chase added a fourth element to the Triple Aim, caregiver experience, making it the Quadruple Aim.

When I was in training, we carried the “Washington Manual” and “Sanford’s Antimicrobial Guide” in the pockets of our white coats as references and thought we had most of the resources we needed to provide exceptional care. Now, caregivers suffer from information overload of both clinical data and academic knowledge. Some query Google right in front of their patients to find answers.

In healthcare today, we work within a community of diverse skills and backgrounds, including clinicians, non-clinicians, computer scientists, EHR providers, administrators, and others. To achieve our goal of improving health and healthcare for individuals and communities, we must work together to organize, structure, mine, and present the massive amounts of data accumulated in the EHR. To me, the concept of population health is meaningless unless you are improving health and outcomes for my family, my friends and me. Just as the placement of “The Golden Spike” was only the beginning of railroad transportation becoming a transformational force in American life, the fact that 96 percent of U.S. hospitals possess a certified EHR is just the beginning.

I have been accused of being a relentless optimist, but I firmly believe we can use the EHR to improve the caregiver and patient experience (I believe patients will and should have access to their entire medical record, for example), and fulfill the other necessary functions that Sinsky and Martin describe as distractions from the medical records’ primary purpose: “quality evaluations, practitioner monitoring, billing justification, audit defense, disability determinations, health insurance risk assessments, legal actions, and research.”

Lastly, there is one more similarity to “The Golden Spike.” In 1904 a new railroad route was built bypassing the Utah track segment that included that historic spot. It shortened the distance traveled by 43 miles and avoided curves and grades, rendering the segment obsolete. Already, many EHR tools, applications and companies have come and gone. Many of the tools we use now remain rudimentary compared with what we really need. We must use what we have to learn and continuously improve, and frankly, we need to pick up the pace. The patients, families and communities depending on us deserve no less.

Frank D. Byrne, MD is the former president of St. Mary’s Hospital and Parkview Hospital and a senior executive advisor to HealthX Ventures.


Readers Write: Moving Beyond the App: How to Improve Healthcare Through Technology Partnerships

August 24, 2016 Readers Write 1 Comment

By Ralph C. Derrickson


As the pace of change in the US healthcare system increases, we are seeing inspiring progress in access and care delivery driven in part by the adoption of telemedicine and other technology-enabled care models. Health systems are embracing virtual medicine as a way to serve their patients and communities by meeting their budget and lifestyle needs. Health systems are trying to match the consumer experience of other Internet services by delivering new care models that give patients better care, save them time, are easier on their wallets, and keep them within the health systems they already know and trust.

While the prospects for technology are enormous, there are downsides that have to be avoided.

Healthcare isn’t an app. We all use apps to conduct business, purchase products, and get our entertainment fix from our favorite mobile games and streaming media services. The idea that we could put an app in a patient’s hand to diagnose or treat them is very appealing. When that app is offered as part of a comprehensive set of integrated treatment options, there are reasons to be very hopeful. But when it’s offered outside a local health system, it leads to fragmentation, excessive prescribing, and even worse, inappropriate treatment.

Simply aggregating providers using the Internet is bad medicine. App developers and their networks of doctors – who are paid on a per-visit basis – have used technology to bring out the worst of fee-for-service care. The data on telemedicine prescribing rates, visit durations, and management rates is in, and it isn’t pretty. If the expectation is that the patient’s needs will be met within a telemedicine visit, the visit is deemed a failure when the patient leaves without treatment or a prescription.

There’s no doubt the provider is doing their best to serve the patient, but without a place to send the patient for in-person care, they are left trying to meet the patient’s needs within the limits of a virtual visit. In fact, a study in JAMA Internal Medicine shows that the quality of urgent care treatment varies widely among commercial, direct-to-consumer virtual care companies. These transactional models of medicine also offer no integrated next step for the patient and no connection to a broader spectrum of care.

Health systems need an approach that runs counter to telemedicine/app developer trends. An integrated virtual clinic enables health systems to extend the service offering in clinically appropriate situations and build on the trust they have earned from patients in years of service to their community. Payment models can come and go, but the patient’s reliance on a doctor in a time of need should never be compromised by the method of access or their payment system.

Health care is challenging. I’ve referred to it as the Three Hopes: I hope it’s not serious, I hope I can see my doctor, and I hope it’s paid for. Countless studies have shown that the proven, most cost-effective health care model is to have access to primary care doctors and great doctor-patient relationships, two qualities that are part and parcel of a strong health system. However, most app-centered telemedicine companies have no connection to a patient’s primary care provider, leading to care fragmentation instead of care continuity.

Through all of this, the greatest institutions of clinical excellence – our health systems – are losing the arms race for patients, especially as the healthcare market continues to consolidate and health systems face fierce competition from their peers to attract and retain patients. Health systems simply don’t have the marketing engines of app-centered telemedicine providers and pharmacies who are fighting tooth and nail for patient acquisition.

Many health systems have yet to figure out how to adapt to a consumer-directed model while continuing to provide quality care. The same patients who want convenience first and foremost are often unable to accurately judge the quality of care received through most telemedicine methods. For health systems and patients to succeed, virtual care must be part of a broader care continuum and tightly integrated within health systems.

Keeping patients within the systems they already know and trust provides invaluable convenience and creates opportunities to refer patients to appropriate care when an ailment cannot be treated virtually. Those referrals offer a chance to reconnect patients to health systems rather than have them turn to a high-cost option like an emergency department or a quick fix like an app-centered retail clinic.

This is especially important as the industry shifts to fee-for-value reimbursement.

An approach that integrates virtual care within health systems ensures patients get the same quality of care that they would receive from an in-person visit. Patients have a better chance of understanding their own health, as trusted physicians give patients the information they need to become educated healthcare consumers. For health systems, integrated virtual care puts them in the driver’s seat on how care is delivered and managed, whereas an app-centered approach may meet neither quality metrics nor the needs of the patients they already serve.

App-centered telemedicine has no place in our health care system. This approach to addressing the changes in healthcare is robbing patients of the type of care they deserve.

There is no reason for the Three Hopes of healthcare to be points of uncertainty or stress for patients. I see great promise among leading hospitals and health systems who are alleviating this uncertainty with integrated virtual care. They realize they know how best to treat a patient – apps do not. Virtual care that’s integrated into a provider network is best equipped to put quality at the center of care now and in the future.

Ralph C. Derrickson is president and CEO of Carena, Inc.


Readers Write: Moving and Sharing Clinical Information Across Boundaries

August 24, 2016 Readers Write 3 Comments

By Sandra Lillie


In Gartner’s recent depiction of the Hype Cycle for Healthcare Technology, Integrating the Healthcare Enterprise (IHE) XDS has progressed well past early adopters and is moving rapidly toward productivity and optimization. In many regions outside the United States, it is the de facto standard for content management, and within the US, it is receiving increasing consideration for adoption in use cases supporting specialty images, standards-based image sharing, and the like.

XDS is a suitable foundation for the integration of clinical systems and, as noted earlier, is more widely adopted in EMEA for this purpose. It can move and share clinical information within and between organizations and create a patient-centric record based on multiple document types.

XDS centralizes registration of documents, reducing the problem of deciding which system holds “the truth.” Focusing on “standardizing the standards,” XDS supports the moving and sharing of clinical information across boundaries, both within and between enterprises. This is increasingly vital to delivering patient-centered care across the care continuum.

Today we also have XDS-I, Cross-Enterprise Document Sharing for Imaging. It is built upon the XDS.b profile with one key difference: the actual DICOM imaging study stays put in its original location until requested for presentation. This is accomplished by registering the location of the imaging study in the XDS registry while using a vendor-neutral archive (VNA) that is smart enough to serve as its own XDS-I repository.

DICOM is a standard format for the storage and communication of medical images, such as x-rays. Instead of publishing the document itself to the repository (which would be unwieldy for imaging), the imaging document source (here, the VNA) publishes a “manifest.” This manifest contains an index of all the images within a study, coupled with a path to the VNA where they can be retrieved. This reduces the amount of data that has to move around, allowing for more efficient image sharing while minimizing the complexity and costs of image storage.
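The manifest mechanism described above can be sketched in a few lines. To be clear, this is an illustrative toy, not the actual XDS-I format: a real manifest is a DICOM Key Object Selection (KOS) document registered with ebXML metadata, and the field names, UIDs, and URL below are invented for clarity.

```python
# Toy sketch of the XDS-I manifest idea (NOT the real KOS/ebXML format):
# the registry/repository holds only a small index plus a retrieval
# location, while the large DICOM pixel data stays in the VNA.

def build_manifest(study_uid, series, vna_base_url):
    """Return a small, serializable index of a study held in the VNA."""
    return {
        "studyInstanceUID": study_uid,
        # Consumers resolve this path against the VNA to fetch images later.
        "retrieveLocation": f"{vna_base_url}/studies/{study_uid}",
        "series": [
            {
                "seriesInstanceUID": s["uid"],
                "instanceUIDs": s["instances"],  # references only, no pixels
            }
            for s in series
        ],
    }

# Hypothetical UIDs and URL, for illustration only.
manifest = build_manifest(
    study_uid="1.2.840.99999.1.1",
    series=[{
        "uid": "1.2.840.99999.1.1.1",
        "instances": ["1.2.840.99999.1.1.1.1", "1.2.840.99999.1.1.1.2"],
    }],
    vna_base_url="https://vna.example.org",
)
print(manifest["retrieveLocation"])  # the study itself never moved
```

Publishing a structure like this keeps registry queries lightweight; only when a consumer follows the retrieval location does the VNA serve the actual DICOM instances.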

What are the implications to healthcare organizations of using XDS?

  • Documents retain their native format, allowing ready viewing by applications.
  • Standards support interoperability and sharing of both documents and enterprise image studies.
  • IHE conducts annual Connectathons in the United States and Europe to validate interoperability, enabling a broad range of vendors to act as sources and consumers of content.

Major benefits include:

  • XDS enables movement and sharing of clinical information across boundaries, both within and between enterprises. This capability is increasingly important in delivering patient-centered care across the continuum, supporting the organization of documents across time in a patient context, allowing clinicians to realize a more complete picture of the patient.
  • XDS offers a lower-cost method for implementing care coordination through a solution that can easily respond to queries for patient-centered documents and enterprise images.
  • Use of standards simplifies healthcare IT integrations, requiring less administrative overhead.

Now is the time for US healthcare providers to seriously consider the advantages of XDS. XDS profiles provide an effective alternative for managing clinical content exported from legacy (sunsetted) systems and for supporting healthcare information sharing.

Sandra Lillie is industry manager for enterprise imaging for Lexmark Healthcare.


Readers Write: Why Reverse Mentoring is Beneficial for HIT Employees

August 15, 2016 Readers Write 2 Comments

By Frank Myeroff


Reverse mentoring pairs seasoned HIT professionals with members of the younger Millennial generation, who act as mentors because they tend to be extremely tech savvy, quick to adopt new technology, and unafraid of trying new things. In addition, it helps to bridge the gap between generations.

Reverse mentoring was introduced in the 1990s by Jack Welch, chairman and CEO of General Electric at that time. While it’s not exactly new, it’s gaining popularity fast. More and more organizations are recognizing the value of reverse mentoring and are developing formalized programs to ensure best practices in order to yield success. They believe that Millennials are well suited as mentors to help maximize HIT use and adoption in order to move organizations forward in this digital age.

Additionally, with the ever-changing landscape of technology and tools used in the HIT field, reverse mentorship can be extremely beneficial:

  • Young, fresh talent has a chance to share their skills, knowledge, and fresh perspectives with more senior employees. Hospitals and health systems often look to their HIT professionals to use technology to improve patient care, lower costs, and increase efficiency. This means that the latest technology is routinely sought. Organizations know that tech-savvy younger generations will catch on quickly, presenting an opportunity for them to share their knowledge with a different generation. Beyond HIT systems, technologies and platforms such as social media are also topics on which Millennials can share information and ideas.
  • Creates a way for separate generations to build working relationships with one another. Reverse mentorship can help junior HIT employees feel more needed, confident and comfortable communicating with higher-up employees working together on projects or even in meetings. Additionally, this could create more cohesion in the workplace and begin to break down perceived barriers and stereotypes of each generation.
  • Gives junior employees a higher sense of purpose in the organization. Implementing a reverse mentorship program gives young HIT professionals a sense of empowerment and the feeling that they are making an impactful contribution to the company. This, in turn, could help increase retention and help to shape future leaders in the organization.
  • Continues to provide ways for senior employees to share their knowledge as well. Although called reverse mentorship, this type of program offers a two-way street for employees of all ages to learn from one another. Experienced professionals in the HIT field are able to share their insights and knowledge, in addition to learning new things.

While reverse mentorship can be extremely beneficial in the HIT industry and especially any industry with a tech focus, there are several conditions this type of relationship depends upon:

  • Trust. Each person needs to trust the other and put effort into bettering both careers.
  • Open mindedness. In a reverse mentorship, both employees will act as a mentor and a mentee and need to show a willingness to teach, but also a willingness to learn.
  • Expectations and rules. It will be important for both parties in the mentorship to communicate what they are looking to get from the relationship as well as staying committed to the process.

Reverse mentorship is an innovative way to bring together generations of employees to share knowledge. In addition, today’s Millennial mentors will be tomorrow’s chief healthcare officers. We will depend on them to lead the IT department and create strategies on how to handle the growing amount of digital data for healthcare workers and new ways to support technologically advanced patient care modalities.

Frank Myeroff is president of Direct Consulting Associates of Cleveland, OH.


Readers Write: ACO, Heal Thyself

July 18, 2016 Readers Write 3 Comments

By Stuart Hochron, MD, JD


I was recently asked to comment on the success (or lack thereof) of Accountable Care Organizations (ACOs): why I thought ACOs haven’t lived up to expectations and what additional incentives will be required for them to be successful – if, indeed, they ever will be.

The questions gave me pause. Certainly ACO performance to date has left much room for improvement. According to an analysis published by the Healthcare Financial Management Association, just over a quarter of ACOs were able to generate savings in an amount sufficient to make them eligible to receive a share of those savings.

But the implication that ACOs are biding their time until new incentives or perhaps a new business model emerges is alarming. This is not a situation where good things will necessarily come to those who wait.

I work with a number of ACOs, hospitals, and physician organizations. While I am not at liberty to share their financial performance data, I’ve distilled what I believe to be the best practices employed by those that will be successful.

It takes a platform

Fundamentally, ACOs require wide-scale patient-centric collaboration – that’s what underpins the hope of achieving more-efficient, more-effective, less-wasteful, non-redundant care. But collaboration doesn’t just happen automatically, even when everyone on the team works in the same building. And for ACOs, composed of multiple entities that don’t necessarily have any prior joint operating experience or relationship of any kind, the challenge is greater still.

Based on extensive discussions with healthcare executives and real-world performance analysis, it is clear that successful ACOs must make an investment in robust groupware tools, the kind that professional services organizations have had in place for decades to ensure that members of a distributed workforce can collaborate and coordinate as easily as if they were in next-door offices.

In the healthcare context, these tools will facilitate everything from patient scheduling to real-time sharing of PHI to charge capture and invoicing. Far beyond secure messaging, such platforms underpin the ACO’s activities, giving providers a common workspace for all manner of collaboration and ensuring that all providers across the care continuum are aware of and working towards a single set of organizational imperatives. The ACOs that don’t invest in the transformation – that try to piggyback on existing infrastructure – will ultimately find that their people don’t make the transformation either.

Patients at the center

All healthcare systems need to become more patient-centric and this is particularly true of ACOs, whose compensation, of course, is based on how successfully they treat (and, ideally, reduce the need to treat) patients. Thus, successful ACOs will make patient-centric collaboration and communication the centerpiece of an organization-wide operating system. 

Ideally, collaboration and communication won’t stop there. ACOs will implement population health initiatives by empowering patients, giving them the ability to take a more active role in keeping themselves healthy. This will be accomplished via tools such as mobile apps that let people access care services before they get sick and let ACOs reach out to the community, helping guide patients toward good lifestyle choices and, if they have received acute treatment, helping them follow post-discharge instructions. The same collaboration platform that helps care professionals work together better will therefore need to extend seamlessly into the community as well.

Without aligned physicians, there’s no accountability

Technically, any organization that agrees to be “accountable for the quality, cost, and overall care of Medicare beneficiaries” can qualify under the definition of an ACO. But what all successful ACOs will have in common is tight alignment of physicians and care teams. I don’t simply mean financial alignment. Theoretically, all the physicians in an ACO are financially aligned. Nor do I just mean alignment around a patient.

True alignment means the physicians who form the core of the ACO understand the goals and priorities of the organization and feel invested in its success. Physicians make dozens of care decisions every day. They need to be making those decisions against the backdrop of the stated policies of the ACO. That requires being literally as well as figuratively connected to the organization, receiving regular communications such as educational materials, opinion, and thought leadership, and being part of the daily give and take.

The financial incentives and disincentives under which ACOs operate change regularly, meaning the ACO’s organizational goals are updated all the time. The challenge is for providers to understand those incentives fully and to be able to adjust their practice methodologies and for that to happen on an organization-wide basis. Achieving and maintaining alignment requires an institution-wide collaboration platform. In a distributed entity such as an ACO, there’s no physician’s lounge. But with modern groupware, we can simulate one in a virtual environment and realize the same benefits.

Networks don’t build themselves

In my work with ACOs, one hurdle encountered by all is introducing and socializing the concept that the ACO establishes a new network of providers to which to refer cases. Intellectually it isn’t that hard to grasp. But changing ingrained habits is much more of a challenge – not least because providers have no way of knowing which other providers are also members of the ACO, nor how effectively any of those providers contributes to the stated financial goals (savings as well as revenues) of the ACO.

The only way to keep referrals within the organization – to combat the challenge of referral leakage, which will sink an otherwise effective ACO – is to ensure that every physician in the ACO is connected to a physician referral directory that lists all providers by specialty. For good measure, it will include a rating quantifying each provider’s service.
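A referral directory lookup of this kind can be sketched in a few lines. This is a hypothetical illustration; the provider names, fields, and rating scale below are invented for the example, not drawn from any real product.

```python
# Hypothetical in-ACO referral directory. All data and field names
# are illustrative assumptions.
providers = [
    {"name": "Dr. Adams", "specialty": "cardiology", "rating": 4.6},
    {"name": "Dr. Baker", "specialty": "cardiology", "rating": 4.1},
    {"name": "Dr. Chen",  "specialty": "nephrology", "rating": 4.8},
]

def refer(specialty):
    """Return in-network providers for a specialty, best-rated first."""
    matches = [p for p in providers if p["specialty"] == specialty]
    return sorted(matches, key=lambda p: p["rating"], reverse=True)

best = refer("cardiology")[0]["name"]  # "Dr. Adams"
```

Keeping the lookup keyed by specialty, with an out-of-network result simply absent from the list, is what nudges the referral to stay inside the ACO.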

Improving clinical documentation

In the minutely quantified world of ACO financial performance, every dollar counts. The ACO’s income is based, in part, on costs saved, along with other metrics. As is well known, incomplete clinical documentation leads to tens of billions of dollars in disallowed reimbursements every year, a situation that only grows worse in a distributed organization such as an ACO. 

While we are imagining the infrastructure of the successful ACO of the future, let’s not neglect to include capabilities for crisply identifying and documenting treatments and procedures and thus enabling the medical billing professionals – who may have no physical or organizational connection to the care delivery professionals – to complete the paperwork correctly and maximize reimbursement revenue.

Conceptually, ACOs are the heart of the Affordable Care Act. Accountability – enforced by incentives and penalties – is central to our concept of how healthcare ought to work. If ACOs aren’t delivering on their promise, then that has ominous implications for the healthcare system overall. With the right communications infrastructure used as directed, ACOs can lead the way to the bright healthcare future we all want. Rather than stand on the sidelines as spectators, waiting for new incentives to come down from on high, ACOs can and must take action now.

Stuart Hochron, MD, JD is the chief medical officer of Uniphy Health of Minneapolis, MN.

July 18, 2016 Readers Write 3 Comments

Readers Write: Why EHRs Will Have Different Documentation Requirements for Biosimilar Dispensing, Administration, and Outcomes

July 11, 2016 Readers Write No Comments

Why EHRs Will Have Different Documentation Requirements for Biosimilar Dispensing, Administration, and Outcomes
By Tony Schueth


While the recent approval of a second biosimilar in the United States does not a tsunami make, biosimilars are nonetheless expected to quickly become mainstream. In response, stakeholders are beginning to work on how to make them safe and useful within the parameters of today’s healthcare system because biosimilars – like biologics – are made from living organisms, which makes them very different from today’s conventional drugs.

In fact, biosimilars are separated into two categories: biosimilars and interchangeables, both of which are treated differently from a regulatory standpoint. These differences will create challenges and opportunities in how they are integrated in electronic health records (EHRs) and user workflows as well as how patient safety may be improved.

EHRs must treat biosimilars differently than generics. As a result, EHR system vendors will need to make significant changes to accommodate the unique aspects of biosimilar dispensing, administration and outcomes.

Patient safety is a priority for development and use of all medicines. Manufacturers must provide safety assessments and risk management plans as part of the drug approval process by the Food and Drug Administration (FDA). Even so, biologics and biosimilars are associated with additional safety considerations because they are complex medicines made from living organisms. Even small changes during manufacturing can create unforeseen changes in biological composition of the resulting drug. These, in turn, have implications for treatment, safety, and outcomes. In order to address these issues, information about what was prescribed, administered, and outcomes must be documented in the patient’s medical record.

Substitution also is an issue because the dispensed drug may be very different from what was prescribed. As a result, it is important for physicians to know whether a substitution has been made and to capture information about the drug that was administered in the patient’s medical record, especially when it comes to biologics and biosimilars. This is important for treatment and follow-up care, as well as in cases where an adverse event (AE) or patient outcome occurs later on.

Four drivers make the unique documentation requirements of biosimilars in EHR a priority.

  1. Utilization is expected to grow rapidly because biosimilars offer lower-cost treatment for such chronic diseases as cancer and rheumatoid arthritis. Given projected market expansions, it is easy to envision four biosimilars each for 20 reference products by 2020. That amounts to 100 biologics that will need to be addressed separately. As more biosimilars are approved and enter the market, it will become increasingly challenging and important to accurately identify and distinguish the source of an adverse event (AE) among a biosimilar, its reference biologic, and other biosimilars.
  2. Physicians will need this information once biosimilars come on line and their use becomes widespread. Adverse complications – particularly immunologic reactions caused by formation of anti-drug antibodies – may occur long after the drug was administered. Physicians report more than a third of adverse events to the FDA, but they need to know what was actually administered to the patient when the pharmacist performs a biosimilar substitution.
  3. Outcomes tracking and patient safety are growing priorities in healthcare. They are key pieces of the move toward value-based reimbursement and are a focus of public and private payers. Identifying, tracking, and reporting adverse events are expected to become key metrics for assessing care quality and pay-for-performance incentives.
  4. States are ahead of the curve when it comes to substitution. More than 30 are considering or have enacted substitution legislation for biosimilars, which creates urgency in how such information is captured and documented in EHRs. Some states require the pharmacy to communicate dispensing data to the prescriber’s EHR.

Because of the unique properties of biosimilar dispensing, administration and outcomes, many adjustments will be needed for documentation into EHRs used by physician offices in independent practices and integrated delivery systems (IDS). For example:

  • EHRs must be able to comprehensively record data on what was administered or dispensed for an individual patient, as well as what was prescribed. Modifications will be needed for tracking adverse event reports in various administration locations, including the physician’s office; an affiliated entity (e.g., practice infusion center); the patient’s home; or non-network providers.
  • Changes in drug data compendia will be needed to account for new naming conventions that soon will be put in place by the FDA and substitution equivalency.
  • Tracking the manufacturer and lot or batch numbers (similar to vaccine administration) can facilitate more accurate tracing of an AE back to the biologic. Fields will need to be added to record the NDC code, manufacturer, and lot number of biosimilars that have been dispensed.
  • NCPDP SCRIPT’s Medication History and RxFill transactions — already available for electronic prescribing in EHRs — can include the NDC and the recently added manufacturer and lot number as part of the notification to the prescriber. Although not widely used today, RxFill provides a compelling method to notify providers that a substitution occurred in the pharmacy.
  • EHRs will need to address barriers related to the use of biosimilars, such as creation of too many alerts; the usability of how the information is presented to the clinician; lack of consistency in the display of drugs and drug names; and conformance of screen features and workflow within and between systems.
  • IDS systems need to be interoperable and have a seamless transfer of information. This can be a challenge in trying to meld together multiple disparate health information technology systems and EHRs from different vendors.
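As one illustration of the first bullet’s documentation requirement, a dispensing record might capture what was prescribed alongside what was actually dispensed, plus the NDC code, manufacturer, and lot number. The class and field names below are assumptions for the sketch, not any EHR vendor’s schema.

```python
from dataclasses import dataclass, asdict

# Illustrative dispensing record covering the fields the bullets above
# call for. All names and values are hypothetical.
@dataclass
class BiosimilarDispenseRecord:
    prescribed_drug: str
    dispensed_drug: str
    ndc_code: str
    manufacturer: str
    lot_number: str

    @property
    def substitution_made(self) -> bool:
        # Flag for the prescriber when the pharmacy dispensed
        # something other than what was prescribed.
        return self.prescribed_drug != self.dispensed_drug

rec = BiosimilarDispenseRecord(
    prescribed_drug="ReferenceBiologic-X",
    dispensed_drug="Biosimilar-X1",
    ndc_code="00000-0000-00",
    manufacturer="ExampleBio",
    lot_number="LOT42A",
)
assert rec.substitution_made
```

Recording both drug names, rather than only the dispensed one, is what lets a later AE investigation distinguish the biosimilar from its reference biologic.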

The time is right for industry, hardware and software developers, and other stakeholders to address the opportunities and challenges posed by entrance of biologics and biosimilars into the US market. As patient safety issues arise, the EHR community must be in a position to capture and exchange needed information. Otherwise, states and other regulators could develop alternative tracking methods. Examples include state vaccine registries or prescription drug monitoring programs, which track controlled substances dispensing and vary from state to state. These programs have become complicated mechanisms for healthcare providers to address.

Tony Schueth is CEO and managing partner of Point-of-Care Partners of Coral Springs, FL.


Readers Write: Election 2016

July 6, 2016 Readers Write 5 Comments

Election 2016
By Donald Trigg


A provocative Atlantic magazine cover this month headlines, “How American Politics Went Insane.” Jonathan Rauch explores our current reality where “chaos has become the new normal — both in campaigns and government itself.”

As we struggle to draw rational signal from the noise, one can’t help but wonder if Trumpian chaos is resident in our favorite podcasts, journals, and websites. Are byzantine rule-makings not regularly bemoaned on HIStalk?  Do we not hear classes of readers singled out (particularly for using HIPPA and HIMMS)? Are we not struck by the rather small hands on the original HIStalk graphic?  


HIStalk has been, all kidding aside, a thankful escape for many of us from a campaign that has been abysmal even by our diminished US standards. Fortunately, there are just 125 or so days left. And with few exceptions, these conversion dates hold.  

Here is the quadrennial cheat sheet.  

A proper understanding of the 2016 election starts with the massive advantage Democrats have in the Electoral College. The Democrats have a safe hold on 19 states (plus DC) representing 242 Electoral College votes. (Note: If you still are suffering under the delusion that the popular vote selects the president, let’s email about a couple of ideas for your trip to the Albert Gore Presidential Library). As a quick civics reminder, you only need 270 Electoral College votes to become president.  

So, with a probable shortfall of just 28 Electoral College votes to get to 270, the Democratic path is far easier. As an indicative example, a Republican could win every “swing” state from Ohio to Virginia, but lose Florida (29 EC) and thereby lose the presidency. It is not quite as challenging as running a health system with an antiquated MUMPS technical architecture, but it is still a daunting task for the GOP.         

The statistician-turned-blogger Nate Silver places the odds of a Hillary victory at 80 percent; one of his two models, which factors in GDP (Q1 GDP was 1.1 percent), puts it lower at 75 percent. He probably has that about right and (spoiler alert) decisions like the Trump VP pick aren’t going to radically change that.

No matter the outcome at the top of the ticket, neither Democrats nor Republicans are likely to dominate the breadth of the electoral landscape. Republicans have a fairly solid grasp on the US House (247-188) and they also control 31 governorships. As Barron’s wrote over the long weekend, ongoing divided government will offer a muted welcome to any agenda this January.  

As for healthcare, the issue significantly trails the economy/jobs and terrorism when it comes to top voter concerns. Moreover, opinions are very settled and polarized. Forty-two percent favor the ACA, while 44 percent oppose it.  

Consequently, Clinton and Trump will use talking point level rhetoric, predominately to drive turnout. Hillary will take on big pharma, calling for caps on prescription drug costs. Trump will bemoan premium increases, call for ACA repeal, and assure us he is going to do something “fantastic.” You will feel like you are watching “Saturday Night Live.”

Notably, there is an important piece of emerging voter sentiment that we shouldn’t miss amid the posturing and platitudes. According to the June KFF poll, 90 percent are worried about the amount people pay for their healthcare premiums, while 85 percent are worried about increased cost of deductibles. Consternation over cost is growing and will be reinforced during open enrollment this fall. 

As we look out to the first 100 days of the new administration, we will see a level of change on health policy that is more incremental than historic. Importantly, MACRA will continue to advance at the agency level, buttressed by solid bipartisan opposition to fee-for-service. At the state level, ongoing programmatic Medicaid changes move forward. Finally, even with the Cadillac tax delay, employers will experiment further with wellness incentives and alternative (and narrower) network design.

In the Atlantic, Jonathan Rauch makes a lonely case for a renewed establishment that can impose some modicum of order. Few will like that treatment plan. His Chaos Syndrome diagnosis, however, is inarguable, as is his view that in the near term, “it will only get worse.”  

Donald Trigg is president of Cerner Health Ventures. In a previous life, he worked for President George W. Bush starting on the 2000 presidential campaign in Austin, Texas, and then after a brief Florida detour, in Washington, DC for the first half of Bush’s first term. 


Readers Write: Who’s On First? Baseball’s Lessons for Hospital Shift Scheduling

June 29, 2016 Readers Write 1 Comment

Who’s On First? Baseball’s Lessons for Hospital Shift Scheduling
By Suvas Vajracharya, PhD


A single MLB season includes over 1,200 players, 2,430 games, and 75 million fans in stadiums. In just 10 seasons, it’s possible to generate more baseball schedule options than there are atoms in the universe. Yet a full season of baseball scheduling is still far less complicated than just a single month of scheduling for 24/7 coverage shifts in a hospital emergency department. There’s good reason hospital operations teams are stressed about scheduling. Trying to do this manually with paper or a spreadsheet is an exercise in pure masochism.

First, a bit of history. Major League Baseball started out using a guy in the commissioner’s office to set up season schedules. Harry Simmons quickly found the task so overwhelming that he left the office and worked on the schedule as his full-time job (sound familiar?). In 1981, the league assigned the job to a husband-and-wife team, Henry and Holly Stephenson, who set the schedules for two decades using a mix of computers and manual scheduling.

Tech leaders at IBM, MIT, Stanford, and Carnegie Mellon all tried to unseat these scheduling gurus and failed until 2005, when the league switched to what is called “combinatorial optimization” technology to generate their schedules entirely by computer.

Today, the same applied mathematics technology is used in not just Major League Baseball, but in all sports leagues, airline schedules, and retail stores, too. Any time you’ve got a mix of teams, individuals, holidays, facilities, unpredictable weather patterns, changing demand, and lots and lots of rules that sound straight out of high school word problems … that’s a scheduling job for advanced computing.

Healthcare, as anyone with experience in the sector might guess, is behind the times when it comes to scheduling technology. The vast majority of hospital departments (an estimated 80 percent) are still setting schedules manually, like our poor old friend Harry Simmons. It’s a problem that can’t be ignored any longer. Not only is manual scheduling a major time sink for hospital operations staff, it also contributes to the already significant issues of professional burnout and physician shortages.

The MLB uses scheduling software in two distinct ways. First, they generate an established schedule for the season using set rules. These include rules designed to prevent player burnout, such as requiring a day off after a team flies west to east across the country or not playing on certain holidays. There are also operational rules, such as not having two home games in the same city the same night or making sure the weekend and weekday games are equally divided among teams.

In healthcare, these established schedule rules include things like not scheduling back-to-back night shifts for a physician, making sure weekend on-call time is fairly distributed, and ensuring key sub-specialists are available 24/7 for procedures. This rules-based schedule serves as the baseline.
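A toy sketch of one such baseline rule, assuming (purely for illustration) that a schedule is represented as a mapping from day to shift assignments — the representation and names are mine, not from any scheduling product:

```python
# Toy feasibility check for one rules-based constraint: no physician
# works night shifts on consecutive days. Schedule representation is
# an illustrative assumption: {day: {shift_name: physician}}.
def violates_back_to_back_nights(schedule, days):
    """Return True if any physician works consecutive night shifts."""
    for d in range(len(days) - 1):
        tonight = schedule[days[d]].get("night")
        tomorrow = schedule[days[d + 1]].get("night")
        if tonight is not None and tonight == tomorrow:
            return True
    return False

days = ["Mon", "Tue", "Wed"]
ok = {"Mon": {"night": "Dr. A"}, "Tue": {"night": "Dr. B"}, "Wed": {"night": "Dr. A"}}
bad = {"Mon": {"night": "Dr. A"}, "Tue": {"night": "Dr. A"}, "Wed": {"night": "Dr. B"}}

assert not violates_back_to_back_nights(ok, days)
assert violates_back_to_back_nights(bad, days)
```

A real combinatorial optimizer treats dozens of such predicates as hard or soft constraints and searches for an assignment that satisfies them all; this sketch shows only what a single rule looks like as code.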

After this, a second type of scheduling tool comes into play. These are requests that let the schedule flexibly adapt to changes. When a blizzard knocks out a week of MLB games or they need to cancel a series in Puerto Rico due to Zika concerns, it’s this second set of optimization technologies that reconfigures the schedule to get things back on track for the season.

In healthcare, schedule requests happen any time and all the time. Vacation, maternity, schedule swaps, requests for overtime, adding locum tenens, adding mandatory training sessions — hospital schedules change far more frequently than MLB schedules, adding to the complexity.

A recent study of over 5,500 real medical department physician shift schedules showed that medical department scheduling varies by specialty. Emergency medicine has by far the most complex process with an average of 62 repeating scheduling rules and 276 monthly schedule change requests. Hospital medicine and OB-GYN follow behind and office-based schedules such as nephrology are much simpler but still beyond anything in the MLB. The math on the number of schedules you could generate with that complexity and variability in emergency medicine is mind-boggling. That specialty also just happens to have the highest rate of physician burnout.
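A back-of-envelope calculation illustrates the scale. Assuming, purely for illustration, 10 physicians who can each be assigned independently to any of 90 monthly shifts (3 shifts a day for 30 days), the unconstrained search space is 10^90 candidate schedules — more than the roughly 10^80 atoms estimated in the observable universe:

```python
# Illustrative numbers only: 10 physicians, 3 shifts/day, 30 days.
# With no constraints, each shift can go to any physician independently.
physicians = 10
shifts = 3 * 30  # 90 shifts in the month
candidates = physicians ** shifts
print(len(str(candidates)) - 1)  # exponent of the search space: 90
```

Constraints prune this space enormously, but enumerating it by hand is hopeless, which is exactly why this is a job for combinatorial optimization rather than a spreadsheet.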

It is time for hospital operations leaders to figure out what the MLB discovered way back in 1981: setting complex schedules is a job for computers. Using sophisticated optimization technology to balance dozens of rules and to support flexibility for ongoing changes works for baseball players, pilots, and physicians alike. With the help of technology, hospitals might already have the solutions they’re looking for when it comes to care coordination, physician retention, increasing patient volume, and preventing staff burnout. It’s time for hospital operations to play ball.

Suvas Vajracharya, PhD is founder and CEO of Lightning Bolt Solutions of South San Francisco, CA.


Readers Write: Change Your Change Management

June 29, 2016 Readers Write 8 Comments

Change Your Change Management
By Tyler Smith


Does your organization have a solid change management system in place? Hopefully for most, the answer is yes. In large-scale IT projects, it is essential that a well-constructed system of checks and balances for each system-affecting change be in place, as well as a forum for the discussion of each change that has a material effect on other pieces of the project.

However, due to overhyped fears of errant build moves, change management often becomes an organizational behemoth, larger and more threatening than the worst government bureaucracy and capable of effectively killing the desire of any analyst to make any change.

When the change control warps to such a state that analysts dread getting up and going to work because they know that every software improvement they make will cost them an exorbitant amount of time in the approval process, the project runs the risk of losing talented staff (if not in body, then in mind).

Having worked on a variety of EMR projects over the years, I have seen everything from no change control to a change ticket process that required a PhD to navigate the nuances and still left no one feeling fulfilled when the update in question eventually reached the live environment. Many times it isn’t just the process — it’s the outdated change management software that is used at these organizations, which causes the confusion and lengthy timelines. I’m not going to name names but anyone who has worked in these projects knows what I am talking about. These ancient enterprise change management software suites make the worst-performing EMR seem user-friendly.

The real casualty of this dreaded combination of micromanagement and crappy software is productivity and creativity. If an analyst spends more time getting a change through than building it, that is not necessarily bad. Some simple changes require lots of analysis to see the broader system impact.

However, if every change requires 1.5 times or more the effort of performing the actual configuration, that is a serious issue. You are effectively sacrificing productivity out of a fear that your analysts are incompetent or too short-sighted to think through the effect of their changes. In effect, your organizational policy is stating, “We trust you to make changes in the system, but we do not think you have any comprehension of what those changes mean.”

Therefore, as organizations stabilize and try to determine how to get the best work from their full-time teams, I would highly suggest taking a look at your change management process and change management software vendor and see if the process and software really align with the other organizational initiatives you promote within your IT team.

Here are a few suggestions for moving forward:

  • Simplify. Cut down the change management process and software to the most necessary components. For example, do you really need seven different fields where a description is entered? Do the technical specifications ever need to be entered more than once? How long do these meetings need to be, and do all changes need to be presented in such a forum? How many people need to attend? Trim the digital and process components. Every step, whether in the software or in the change meeting/presentation process, is like the dreaded extra click for the provider. Eliminate documentation processes that are redundant, in addition to required fields that do not serve a purpose.
  • As you simplify the governing structures, try giving analysts more control and see how few processes you actually need in place to maintain order. If analysts have the mental capacity to perform build tasks, they can probably handle a degree of higher-level organizational thinking about the impact of their changes.
  • Do not allow the change control process to be constantly updated unless those changes are removing redundancies or irrelevant steps. Adding additional rules and processes often confuses analysts and these updates rarely serve their intended purpose.
  • Eliminate the standalone change control team altogether and form a committee of actual team members. It is OK to have a PM if the organization mandates such a structure. However, analysts who solely sit in a change control cube and who are not building in the system can never have a real-world view of the software. These team members are essentially reactive, which means that in order to feel they have a purpose, they need to make the jobs of others more difficult. It may be a stretch to say that a change control team is a form of featherbedding, but its roles should be examined with care as to the greater purpose they serve and whether they need to be full time.
  • Finally, if you can, scrap the medieval change control software and use the least time-invasive platform available to document changes, present them, and keep a record for the future. An Excel document may be enough. If change control is linked to the help desk ticketing software, this may not be possible without new help desk software, but add this to the analysis.
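As a sketch of how minimal that record can be, a flat file with a handful of columns covers most of what a reviewer later needs. The column names here are illustrative assumptions, not a prescribed schema:

```python
import csv
import io

# Deliberately minimal change log: one row per change, only the
# fields that actually get read later. Column names are illustrative.
FIELDS = ["id", "date", "analyst", "system", "description", "status"]

def log_change(buffer, row):
    """Append one change record to an open CSV buffer."""
    csv.DictWriter(buffer, fieldnames=FIELDS).writerow(row)

buf = io.StringIO()
csv.DictWriter(buf, fieldnames=FIELDS).writeheader()
log_change(buf, {"id": "1042", "date": "2016-06-29", "analyst": "T. Smith",
                 "system": "EMR orders", "description": "Add cancel reason field",
                 "status": "approved"})
```

In practice the buffer would be a shared file; the point is that six columns and an append operation may be all the “change management platform” a stabilized team needs.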

Reducing change control staff and processes may not be pleasant. However, the long-term gains in efficiency and creativity that you will see in return from your analysts will benefit the end users of the software far more than the negatives of a temporary overhaul.

Tyler Smith is a consultant with TJPS Consulting of Atlanta, GA.


Readers Write: Patient Privacy — A New Way Forward

June 20, 2016 Readers Write No Comments

Patient Privacy — A New Way Forward
By Robert Lord


Health data security and patient privacy are in a state of crisis. Electronic health records (EHRs) are in the process of being ubiquitously rolled out, providing access to as much patient data as possible, to as many users as possible, in as little time as possible. As a consequence, hundreds of millions of patient records have been made easily accessible to millions of health system employees and affiliates, with essentially no oversight of who is viewing what patient data in the EHR and if that access is appropriate.

However, this isn’t because of health system negligence – it’s about a collective lack of accountability among several key stakeholders. Due to the sheer volume and complexity of patient records accessed each day, it is impossible for privacy and security officers to efficiently detect breaches without new and practical solutions and standards.

Something needs to change. Despite promises of role-based access controls, training programs, and security templates, the problem just isn’t being solved, and HIPAA violations continue to affect hospitals on a daily basis. That critical human layer of access is the root of these problems, and that doesn’t have an easy solution.

A new report from the Brookings Institution details that the majority of recent healthcare data breaches are caused by theft or unauthorized access. Research also shows it takes more than 200 days to detect an insider threat, if it is detected at all. And the in-depth report from ProPublica last December helped bring into focus that small-scale violations of medical privacy — like the Walgreens pharmacist who snooped in the prescription records of her boyfriend’s ex — often cause the most harm.
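One simple heuristic for proactive monitoring, offered only as an illustrative sketch (real systems, as argued below, need far richer, clinically-aware models), is to flag users whose daily record-access volume is a statistical outlier relative to their peers:

```python
from collections import Counter
from statistics import mean, pstdev

# Illustrative insider-threat heuristic, not any vendor's method:
# flag users whose daily access count exceeds mean + k * stddev
# of all users' counts for that day.
def flag_outliers(access_log, threshold=3.0):
    """access_log: list of (user, patient_id) events for one day."""
    counts = Counter(user for user, _ in access_log)
    values = list(counts.values())
    mu, sigma = mean(values), pstdev(values)
    return [u for u, c in counts.items()
            if sigma > 0 and c > mu + threshold * sigma]

# Hypothetical day: 20 users touch 5 records each; one snoops on 500.
day_log = [(f"user{i}", f"pt{j}") for i in range(20) for j in range(5)]
day_log += [("snoop", f"pt{j}") for j in range(500)]
assert flag_outliers(day_log) == ["snoop"]
```

A volume threshold like this would miss the small-scale snooping described above (a single ex-partner’s record), which is precisely why the article calls for clinically-aware analytics rather than generic anomaly detection.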

We are now at an inflection point that will decide the future of patient privacy. The actions and decisions of four key stakeholders and their collective will to collaborate through an independent fifth apparatus will significantly advance or stall patient privacy protection and next-generation health data security.

Patient privacy technology vendors need to invest in their teams and products to take advantage of the significant advances made in big data analytics, clinical informatics, and cybersecurity. These advances have changed many other fields, but cybersecurity and compliance solutions built for non-healthcare industries are rarely effective in the complex and idiosyncratic healthcare environment.

Furthermore, the big data environments that define many modern hospitals also require big data solutions that are at the cutting-edge of technological possibility. Critically, vendors need to better listen to their customers to create clinically-aware, healthcare-first solutions that address patient privacy. Health systems cannot purchase what does not exist and rarely have the in-house bandwidth to create production-ready systems.

Hospitals and health systems are working hard to protect patient privacy, but their security and privacy teams are stuck in a reactive mode, having to put out fires with limited resources. It’s clear that CISOs and chief privacy officers need a seat at the boardroom table and their roles need to give them the breathing room to see into the future rather than just to react to challenges as they occur.

Furthermore, compliance and bare-minimum standards are no longer enough. To truly protect patient data, a close relationship between hospital security and privacy groups must be formed. This partnership must be augmented by the technology necessary to detect and remediate threats and their collective mission must be aligned with the board. Fundamentally, resources and C-suite support must be allocated to tackle the next generation of privacy and security challenges, as current efforts aren’t on the right trajectory.

The federal government, with privacy protection authorities like the Office for Civil Rights and standard-setting bodies like ONC, wants very earnestly to protect vulnerable populations and help hospitals protect patient data, and I have always been impressed by my interactions with them. However, there is no denying that these agencies are under-resourced and limited in the amount of time they can spend looking into better solutions that could serve as next-generation patient privacy platforms. As a result, they are not able to offer much substantive guidance on what hospitals should and shouldn’t do to keep patient data secure. While distance must be maintained between vendors and regulators, greater public-private partnerships, like those in national security, are critical.

All of us as patients are an important but (amazingly) often overlooked constituency when it comes to advancing the protection of health data. Just as we wouldn’t keep our money in a bank that didn’t use passwords for online accounts or locks on its vaults, patients should expect and ask for more details about a hospital’s security posture. When hospitals ask us to sign forms that let them use our data, we should request that our providers detail how they’re protecting our information. A basic set of criteria about data encryption, proactive patient privacy monitoring, dual-factor authentication, network security, and whether a CISO/CPO is part of the team can tell you a huge amount about a hospital’s stewardship of patient data. We are all patients, and I’m just as guilty of signing a HIPAA release form without thinking as anyone else. But if we’re to drive change, we have to think hard about what’s truly important to us and take a stand.

Ultimately, each of the above stakeholders has its own incentives, and I would contend, its own set of responsibilities and roles with respect to bringing about a new standard of patient privacy. In addition, while industry partnerships and bodies like the NH-ISAC are steps in the right direction in unifying these stakeholders, we need collective accountability and transparency regarding insider threats and HIPAA breaches beyond HHS’s “wall of shame.” Only through creating central, practical, collaborative bodies that bring all of these stakeholders to the table will we be able to move patient privacy forward and set a new standard for protecting our patients’ data.

Robert Lord is co-founder and CEO of Protenus in Baltimore, MD.

