
Readers Write: Can Appropriate Prescribing Practices Curb the Opioid Crisis?

May 16, 2018

By Victor Lee, MD

Victor Lee, MD is VP of clinical informatics at Clinical Architecture of Carmel, IN.


According to a 2014 report from the National Institute on Drug Abuse, the misuse and abuse of opioids are associated with a staggering number of emergency department visits, hospitalizations, overdose deaths, and other adverse outcomes. Altarum estimates the economic impact from 2001 to 2017 at more than $1 trillion, with a projected $500 billion of additional cost through 2020 at current rates. The White House Council of Economic Advisers estimates a burden of $504 billion in 2015 alone, stating that prior estimates of the economic costs of the opioid crisis undervalue overdose fatalities. On October 26, 2017, the United States Department of Health and Human Services declared the opioid crisis a nationwide public health emergency.

There are efforts to combat the opioid crisis at many levels, including government (federal, state, and local), professional societies, health systems, health plans, academic institutions, and health IT vendors. Let’s look at a few recent examples. The President’s Commission on Combating Drug Addiction and the Opioid Crisis provides a multifaceted set of 56 recommendations across categories that address federal funding, prevention, and treatment of opioid addiction. The Centers for Medicare & Medicaid Services issued a final rule that implements the Comprehensive Addiction and Recovery Act of 2016 and states, “a sponsor can limit at-risk beneficiaries’ access to coverage for frequently abused drugs beginning with the 2019 plan year. CMS will designate opioids and benzodiazepines as frequently abused drugs.” The Institute for Healthcare Improvement summarizes four main drivers to reduce opioid use, one of which is to limit the supply of opioids.

The Role of Opioid Prescribing as a Contributor

Why is it necessary to limit the supply of opioids? There is clear evidence that the prescription of opioids for pain management is a major driving force of the opioid crisis in the United States. A case-cohort study by Bohnert et al (2011) links higher opioid doses with opioid overdose death among US veterans. A retrospective cohort study by Brat et al (2018) shows that compared with opioid dosage, opioid prescription duration is even more strongly associated with misuse and overdose in a general surgery population. Findings from a series of structured interviews by Cicero et al (2017) reveal no qualitative differences in the onset and progression of opioid substance use disorder between medically treated patients and recreational opioid users. A review article by Compton et al (2016) provides further discussion of opioid prescriptions resulting in non-medical opioid and heroin use and cites numerous references.

Perhaps the most comprehensive review of risk factors for prescription drug misuse is provided in a 2017 publication by the Substance Abuse and Mental Health Services Administration. In summary, the body of research on prescription opioids shows a consistent link with resultant substance use disorder. This suggests that the demand side of the opioid crisis is critically important to address.

A Potential Solution

Prescribers of opioid medications are in an excellent position to fight the opioid crisis. While there are numerous evidence-based guidelines, a reasonable starting point would be to follow the “CDC Guideline for Prescribing Opioids for Chronic Pain” for appropriately selected patients. Recognizing that other opioid prescribing guidelines exist, the CDC guidelines are most commonly referred to by numerous organizations as part of a multifaceted approach to mitigating the opioid crisis.

While guidelines, clinical trials, reviews, and other literature may be widely available, they are not always translated into practice when applicable. This is where clinical decision support (CDS) may help. Kawamoto et al (2005) systematically reviewed the literature and found that the automatic provision of CDS as part of clinician workflow is 112.1 times more likely to improve clinical practice compared with control groups (P < 0.00001).

CDS can lower the barrier to adhering to certain CDC recommendations such as:

  • Calculating morphine milligram equivalents (MME) dosages and justifying decisions to use ≥ 50 MME/day or ≥ 90 MME/day
  • Identifying risk factors for opioid overdose and considering naloxone as part of an opioid management plan
  • Applying other prescribing best practices from the CDC’s 12 recommendations
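The MME calculation behind the first of these bullets is straightforward arithmetic: multiply each opioid's total daily dose in milligrams by its published conversion factor and sum across active prescriptions. A minimal sketch of how CDS might implement the check follows; the conversion factors shown are from the CDC's published MME table, while the function names and alert wording are illustrative, not any vendor's actual implementation:

```python
# Sketch of a CDC-style MME threshold check for CDS.
# Conversion factors are per-mg oral factors from the CDC's MME table;
# function names and messages are hypothetical.

MME_FACTORS = {
    "morphine": 1.0,
    "oxycodone": 1.5,
    "hydrocodone": 1.0,
    "hydromorphone": 4.0,
    "codeine": 0.15,
}

def daily_mme(prescriptions):
    """Sum morphine milligram equivalents per day across active opioid Rxs.

    prescriptions: list of (drug_name, mg_per_dose, doses_per_day) tuples.
    """
    return sum(MME_FACTORS[drug] * mg * n for drug, mg, n in prescriptions)

def mme_alert(prescriptions):
    """Return (total MME/day, CDC threshold tier crossed or None)."""
    total = daily_mme(prescriptions)
    if total >= 90:
        return total, "justify and consider specialist referral (>= 90 MME/day)"
    if total >= 50:
        return total, "reassess benefits and risks (>= 50 MME/day)"
    return total, None

# Example: oxycodone 10 mg four times daily = 60 MME/day
total, tier = mme_alert([("oxycodone", 10, 4)])
```

In a real system, the prescription list would come from the medication record, and methadone and transdermal fentanyl would need the special handling this sketch omits.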

We’re In This Together

While there are other ways to address the opioid crisis — such as national legislative / regulatory action, statewide technology implementation of prescription drug monitoring programs, and treatment of substance use disorder — there is also an opportunity to prevent opioid overutilization in the first place. If a bathtub is overflowing, the question is not whether to turn off the water, unplug the drain, or mop up the water; the question is how to do all three in the most expedient way.

Similarly, lawmakers, administrators, technologists, clinicians, and patients can work in concert to optimize pain management, minimize opioid overutilization, and effectively treat substance use disorders.

Readers Write: Five Best Practices for Care Programs for Members

May 16, 2018

By Jessica Schiller, RN, BSN

Jessica Schiller, RN, BSN is director of clinical programs at Wellframe of Boston, MA.


What if your members had all of the information they need and want? Medication regimen, social / lifestyle support, education for their conditions, access to a care manager — all the critical pieces related to their health and care in one place, right at their fingertips?

In many ways, this vision is becoming a reality as digital member engagement has become a high priority focus for care management. A crucial part of sustained engagement is the information that members receive about their health and that care managers utilize to structure interventions. Embracing a modern approach to engagement demands a new paradigm for care programs altogether — designed for members, delivered digitally, and personalized to meet each individual’s needs through digital and human support.

The application of personalized, interactive, member-facing care programs can amplify the medical risk reduction of care management by putting the right information in members’ hands at the right time, in the right format. With this in mind, let’s examine five best practices for care programs for health plan members.

One of the primary frameworks of care management is the care plan. In parallel to the medical record, the care plan is a collection of each member’s health history, diagnoses, problems, goals, and interventions, which evolves over time. Care plans function as decision support tools designed to help care managers structure interventions and methods for member support, typically delivered over the phone.
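As a data structure, the care plan described above might be sketched roughly as follows. This is a hypothetical illustration; the class and field names are not any vendor's actual schema:

```python
# Hypothetical sketch of a care plan as a data structure.
# Class and field names are illustrative, not a real product schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Intervention:
    goal: str             # the member goal this intervention supports
    action: str           # e.g., outreach call, education module
    completed: bool = False

@dataclass
class CarePlan:
    member_id: str
    history: List[str] = field(default_factory=list)    # health history
    diagnoses: List[str] = field(default_factory=list)
    problems: List[str] = field(default_factory=list)
    goals: List[str] = field(default_factory=list)
    interventions: List[Intervention] = field(default_factory=list)

    def open_interventions(self):
        """Interventions still pending; the plan evolves as these close."""
        return [i for i in self.interventions if not i.completed]
```

The point of the sketch is simply that the care plan is a living record: interventions accumulate and close over time, and the care manager works from the open set.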

While they have been effective to date, the transition to member engagement through mobile and digital channels highlights where care plans are deficient: they are only available to the care team. In the booming digital age, members should be allowed to engage with this information directly.

Multi-channel engagement methods present an opportunity to extend part of the care plan directly and digitally to members in a new format adapted for the audience and the channel. We call this new concept a “care program.”

There are five best practices for effective member-facing care programs. These strategies ensure members receive the information they need to stay on track with their health in a way that aligns with their needs. In addition, well-designed member-facing care programs have proven to dramatically increase care team efficiency by saving clinicians valuable time in relaying information to members.

1. Optimize for mobile

  • Create short, interactive content
  • Stick to 400 words or less for engaging clinical articles
  • Hold attention with under-two-minute video stories from peers or tips from their doctor

2. Meet health literacy standards

  • Deliver content at the lowest reading level possible for broad accessibility
  • Write in short sentences with basic structure and simple words
  • Provide definitions for medical terminology
  • Break complex concepts into digestible pieces

3. Be holistic

  • Support the whole person, not just the chronic condition
  • Give members the support they want for lifestyle factors like weight loss, nutrition, and exercise
  • Provide information on key areas of health maintenance like emotional health, safe alcohol use, and pneumococcal vaccinations, which also relate to HEDIS metrics

4. Deliver content over time

  • Start with foundational topics and build on them over time
  • Begin with must-know information, like what to do in an emergency, the importance of routine follow-ups, and red flags for the member’s condition
  • Progress to education on complications associated with their condition, what their medications do, and psychosocial / lifestyle factors that can impact their day-to-day

5. Enable personalization

  • Adjust care programs to meet the unique needs of each member
  • Ensure educational components are modular and easy to customize
  • Empower care teams to determine what information to send to members

The Outcome of Application

Adhering to these principles for member-facing care programs will generate a positive feedback loop for member engagement that is feasible, cost-effective, and scalable via mobile, particularly when compared to care managers repeating the same information many times over the phone.

With health education that is personalized, relevant, and accessible, members will engage more often, feel better supported (satisfied), and learn how to self-manage chronic conditions more effectively.

Further, in the context of a therapeutic relationship with their care team, members’ interaction with the care program provides the kernel of insight around which the relationship is able to thrive: everything the member does with the care program matters and informs better care. In turn, member-facing care programs advance the goals of care management and quality improvement overall, through effective health education to reduce complications, avoid readmissions, and improve outcomes.

Readers Write: HLTH 2018 Recap: A Transformation in Talking about Healthcare Transportation

May 11, 2018

By Travis Good, MD

Travis Good, MD is co-founder, CEO, and chief privacy officer of Datica of Madison, WI.


The inaugural, sold-out HLTH conference ended last week in Las Vegas, leaving a generally positive impression of its new style of healthcare conference. I, along with 3,500 attendees, laughed with Jonathan Bush, CEO of Athenahealth, as he entertained us with statements like, “All we do, all of us, is fail… And then we die!” We sat in stunned silence as Harold Paz, MD, executive vice-president and chief medical officer at Aetna, shared the disturbing facts of the opioid crisis — facts such as that 116 people die every day in America, where we consume more opioids than any other country on Earth, and that more Americans will die this year than died through the entire AIDS epidemic or the Vietnam War.

HLTH was different from many healthcare conferences I’ve attended, with its rapid-fire panel discussions in which the panelists didn’t waste time explaining high-level concepts like blockchain, but instead jumped right into the details of the emerging technologies. Numerous announcements and visionary ideas were also presented. The slick nature of the well-orchestrated HLTH event, likely made possible by the $5 million garnered in venture money, left an overwhelming impression for a first-time event.

The HLTH organizers did have one major miss: lack of strong representation of female healthcare leaders. Evidence of that agenda oversight gained audience criticism in social media and questions to panelists (including me) on why they thought few women graced the stage.

Two general themes prevailed throughout the conference. One centered on transforming the current healthcare business model to improve everything from interoperability, costs, and patient outcomes to physician burnout. The second focused on exploring entirely new business models that could transform the healthcare industry.

Announcements ranged from the splashy — like former CMS Acting Administrator Andy Slavitt’s launch of Town Hall Ventures, his shift from government to investing in technologies that facilitate real change in our communities, and Change Healthcare teaming up with Adobe and Microsoft to orchestrate better patient engagement — to the mundane, like Marcus Osborne, VP of healthcare transformation at Walmart, announcing, “Walmart isn’t going to stand for this” in describing the poor quality of care their associates have had to endure and Walmart’s push toward an evidence-based approach that ends physicians’ entitlements.

Topics around blockchain, genomics, artificial intelligence (AI) and machine learning, cloud, augmented reality, and interoperability prevailed. During a lively panel, the so-called “unicorns of healthcare” shared their predictions for the next generation of unicorns. Anne Wojcicki, CEO and co-founder at 23andMe, predicted that the next unicorn will be in AI or chatbots. Frank Williams, CEO at Evolent Health, said precision medicine. Jonathan Bush predicted new reimbursement models or therapeutics.

One theme woven throughout conference presentations is the idea that caring for health needs should extend beyond the walls of a treatment room and out into the community. On the first evening of the conference, David Feinberg, Geisinger president and CEO, described his vision of a new direction for healthcare for the communities Geisinger serves. The vision included not only traditional healthcare, but also feeding and housing people who need it.

Later in the conference, Lauren Steingold, head of strategy at Uber Health, described the company’s innovative new patient transportation offering, which could help eliminate the $150 billion yearly cost to the healthcare industry resulting from the 3.6 million Americans who miss appointments due to transportation issues. Steingold described her vision of expanding that model to encompass telemedicine patients who need a ride to the pharmacy or even surgery patients who need a ride home.

My favorite quote from the conference, which pretty much sums up the current state of healthcare transformation, came from Anne Wojcicki. “What happens in healthcare is you have people who really want to do the right thing, but the ships are pointed in the wrong direction.”

All in all, the conference left attendees more informed and energized. Now HLTH organizers are taking what they learned from the first conference and planning for expansion next year.

Readers Write: Will PDMPs Remain a Vital Tool in the Opioid Response, or a Costly Burden?

April 4, 2018

By David Finney

David Finney is a partner with Leap Orbit of Columbia, MD.


New battle lines are being drawn in an important corner of the nation’s broad fight to control the opioid epidemic. Health IT professionals should sit up and take notice.

Much quiet maneuvering has been taking place for months, particularly among a number of large and well-connected technology vendors sensing a windfall. But with the recent signing into law of the $1.3 trillion federal omnibus spending package, the debate about what the future should look like for prescription drug monitoring programs (PDMPs) has burst into the open.

PDMPs — which are state-based systems for tracking and analyzing the prescribing and dispensing of controlled substances — have existed in some form for a century. Over the last 10 years, they have become more technologically sophisticated and are frequently pointed to as a critical (and mostly non-controversial) tool in the opioid response. Today, 49 states, the District of Columbia, Puerto Rico, and Guam have established PDMPs, while in Missouri, a PDMP instituted by St. Louis County serves most of the state’s population.

In an increasing number of states—over 30—clinicians and pharmacists are required by law to check their PDMP prior to prescribing or dispensing any controlled substance. Though enforcement is so far minimal, failure to do so could result in suspension or loss of license. Among other emerging techniques, many states now also send unsolicited reports to prescribers, using PDMP data, demonstrating that their prescribing habits are outside the norms for their specialty.

The federal government has encouraged these policies with a steady and increasing stream of grant funding to states to cover software development, licenses, and IT staffing. Not surprisingly, the private sector recognized the opportunity. Appriss, a private equity-owned firm that got its start helping states monitor sex offenders, has been the chief beneficiary of this flow of government dollars, achieving a near monopoly in the state PDMP market by, among other things, acquiring its two largest competitors.

With 42 state contracts, Appriss has done what monopolists do, bidding up contract prices and seeking to monetize every aspect of the data it controls. Given the commitment by states and the federal government to “do whatever it takes” to address the opioid epidemic—including supporting PDMPs with ever-increasing grant funds—PDMP administrators may grumble, but otherwise few people have stopped and taken much notice.

Few, that is, except for several large healthcare and technology interests (increasingly those are one and the same) and the Washington lobbyists who work for them. Acting no doubt out of a genuine desire to positively impact the opioid epidemic, and also sensing a business opportunity, these interests have quietly been pushing Congress and the Trump administration to rethink the federal government’s traditional support of PDMPs and “modernize” them.

How to do this? By awarding tens, if not hundreds, of millions of dollars in new federal contracts to one or a small number of firms to facilitate the flow of PDMP data at a national level. This new network would leverage existing prescription data feeds that support e-prescribing and third-party payment. Initially, this network might complement and enhance state PDMPs, but in the longer term, it seems likely to make them redundant.

By all indications, the federal omnibus spending bill and subsequent signals from federal officials and lobbyists seem poised to deliver on this new model. Not surprisingly, Appriss is worried. In recent weeks, it has launched a marketing campaign of its own to highlight the benefits of the current state-based approach to PDMPs and the interstate gateway it developed in collaboration with the National Association of Boards of Pharmacy.

Why should health IT professionals care? Frankly (and functionally), whether the nation continues with a states-based model for PDMPs or a federal one probably won’t make a big difference to end users at hospitals, ambulatory practices, retail pharmacies, or other healthcare facilities. The more timely data offered by the federal model may offer some marginal benefit, but states have already been moving in that direction. In either case, though, the outcome is likely to hit the bottom lines of these organizations in a big way.

Already, as prescribers and dispensers are required by law to consult PDMP data, their IT departments face pressure to deliver the data to them in more workflow-friendly ways. Appriss has gladly obliged by presenting hospitals and health systems across the country with steep per-user, per-month fees to access the data it controls under its state contracts via APIs or single sign-on. These fees can reach seven figures per year for some health systems. A federally facilitated approach is likely to look no different — it would use established e-prescribing networks, whose business models are well known, to deliver PDMP data into the workflow. What all of these businesses likely understand is that the last mile into prescribers’ and dispensers’ workflows could be the most lucrative aspect of PDMPs.

A few states are attempting to buck these powerful forces. They take the view that PDMPs are a public utility, and as such, PDMP data should be widely and democratically made available to anyone who has an appropriate use for it. In Maryland, Nebraska, and Washington, this has meant collaborating with a statewide health information exchange to publish open APIs and support a range of standards-based integration techniques for bringing PDMP data into the workflow. California’s PDMP, with support from the legislature, is also in the midst of an ambitious initiative to make open APIs available to all of the state’s healthcare institutions.

These states support a nascent ecosystem of third-party technology providers and system integrators that are inventing new ways to present PDMP data to those who need it, when they need it. Companies—and I count my own among them—are demonstrating real innovation that can make a difference in fighting the opioid epidemic. The earnest competition also keeps us honest and hungry and should ultimately drive down cost. If more take notice, these states may present an alternative to the models being pitched by more powerful interests.

Readers Write: I Am More Than My Specialty: Physician Burnout and Individualism

March 21, 2018

By Erin Jospe, MD


Erin Jospe, MD is chief medical officer of Kyruus of Boston, MA.

While physician burnout is garnering more attention with a steady generation of articles and books both academic and lay, we have yet to see improvements despite our awareness of the problem. We have become facile at recognizing the symptoms of exhaustion, detachment, cynicism, and inefficiency as the hallmarks of burnout, but no better at treating the underlying causes.

Per Medscape, no specialty was spared an increase in self-reported burnout symptoms between 2013 and 2017, [1] and the prevalence is unsettling at almost 60 percent in some fields. [2] While there is no silver bullet for burnout, within their professional work environments, recognizing physicians as individuals and giving them the means to convey their unique areas of expertise to patients, fellow providers, and others within the health system can go a long way in paving a path to higher satisfaction and engagement.

We are as aware of the downstream ramifications of physician burnout as we are of its symptoms, with repeated studies demonstrating the negative impact on patient safety, quality of care, and the patient experience. By refocusing the context of care on the mission to improve patient lives, the “Triple Aim” in 2007 reminded us of the importance of how individual patients experience care. In the 10 years since, there has been a paradigm shift toward respecting the individual patient as having unique needs and values that must be addressed to achieve better health.

Physician burnout directly undermines our ability to deliver on this promise and has worsened in the same 10 years. It was innovative to say we needed to acknowledge the humanity of our patients to deliver better care, to recognize the individual and not view them as interchangeable with every other patient. And yet by creating a delivery system that only recognizes the humanity of those needing care and not of the care providers, we sully the sacredness of that patient-provider relationship and create the same negative environment of disrespect that results in so much dissatisfaction among both providers and their patients.

Though we rightly strive to see and address the individual needs of the patient, there is a widespread sense that physicians themselves are interchangeable. This is no less disrespectful than perceiving patients as such. As a physician, I am far more than my specialty, as are my colleagues. Yes, I have an expertise, and with it comes an expectation of an established skill set and standards of care. But I have a style, manner, and experience that is my own. I have defined niches of interest and excellence that make me better suited to the needs of some patients.

When we give physicians no means, no vocabulary, no voice with which to articulate what is unique about them, we do a disservice to the individual physician and to the community of patients and other providers who would seek them out. Our health systems and networks of physicians are growing exponentially larger, but as they grow, our awareness of individual contributors diminishes. We no longer have connections with one another as physicians and have no insight into where unique strengths and gifts might exist among us.

In the face of an exploding fund of medical knowledge, we cannot deny the necessity of understanding where unique expertise — and not just specialty — lives. It is hard enough for physicians to acknowledge the deficiencies in our knowledge base. Providing no means by which to uncover who within our community might help only furthers the tendency toward emotional and mental exhaustion.

Addressing burnout at an individual physician level is often too little, too late. Resiliency is important, but in and of itself, resiliency does not change the environment for which it is necessary, and too often will be insufficient to treat or prevent burnout.

Instead, consider the systemic and holistic organizational contributions to the environment which are causal. Rather than address the individual’s propensity to burnout, address the individual. Allow them to be acknowledged and appreciated as uniquely individual contributors. Give them the means to indicate to their networks what their clinical areas of focus are beyond merely specialty / subspecialty. Provide them with teams aligned in their mission to act in concert as exceptional people in the care of exceptional people. Facilitate their understanding of the excellence that exists within the community of providers.

Failure to do so diminishes the joy and satisfaction of relational patient care by converting those interactions into the merely transactional. Addressing the anonymity of our providers is not a panacea for physician burnout, but we must do it if we are to do justice to the promise of prioritizing the patient experience.

[1] Medscape Lifestyle Report 2017

[2] AMA, “Report reveals severity of burnout by specialty,” Jan. 31, 2017.

Readers Write: Continuous Clinical Surveillance: An Idea Whose Time Has Come

March 21, 2018

By Janet Dillione


Janet Dillione is CEO of Bernoulli Health of Milford, CT.

It’s no secret that the general acuity of hospitalized patients is increasing as the overall US population continues to age (hello, Baby Boomers). Many patients who would have been in an ICU in the past are now found in lower-acuity areas of the hospital. We foresee that the hospital of tomorrow, in terms of monitoring and surveillance capabilities, will need to be more like an enterprise-wide ICU.

A significant problem with such a transformation is that hospitals will not be able to staff their entire facility like an ICU. In most hospitals, there is simply no money to add more staff. Even if there were sufficient funds, doctors and nurses are in short supply. Hospitals will have no choice — they will need new technological tools to help clinicians manage these rising levels of acuity.

One type of technology that holds promise in this regard is continuous clinical surveillance. In contrast to electronic monitoring — which includes observation, measurement, and recording of physiological parameters — continuous clinical surveillance is a systematic, goal-directed process that detects physiological changes in patients early, interprets the clinical implications of those changes, and alerts clinicians so they can intervene rapidly. (1)

Just a few years ago, continuous clinical surveillance would have been impossible because there was no way to integrate data from different monitoring devices, apply analytics to that information in real time, and communicate alerts to physicians and nurses beyond the nearest nurse’s station. But today, medical device data can be aggregated and analyzed in a continuous stream, along with other relevant data such as patient data from the EHR. In addition, many clinicians now carry mobile devices that allow them to be alerted wherever they are.

Early Warning System

A continuous clinical surveillance system uses multivariate rules to analyze a variety of data, including real-time physiological data from monitoring devices, ADT data, and retrospective EHR data. When its surveillance analytics identify trends in a patient’s condition that indicate deterioration, the system sends a “tap on the shoulder” to the clinicians caring for the patient.

For example, opioid-induced respiratory depression accounts for more than half of medication-related deaths in care settings. (2) Periodic physical spot checks by clinical staff can leave patients unmonitored up to 96 percent of the time. (3) By connecting bedside capnographs and pulse oximeters to an analytic platform to detect respiratory depression and instantly alert the right clinicians, continuous surveillance can shorten the interval between a clinically significant change and treatment of the patient’s condition.
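As a rough illustration of how an analytic might combine capnograph and pulse oximeter data into a single respiratory depression alert, consider the sketch below. The thresholds, persistence window, and field names are hypothetical examples for illustration, not clinical guidance or any vendor's actual analytics:

```python
# Illustrative multivariate rule for opioid-induced respiratory depression.
# Thresholds and field names are hypothetical, not clinical guidance.

def oird_findings(sample):
    """Evaluate one synchronized sample of bedside device data.

    sample: dict with 'resp_rate' (breaths/min), 'spo2' (%), and
    'etco2' (mmHg) from the capnograph and pulse oximeter.
    """
    findings = []
    if sample["resp_rate"] < 8:
        findings.append("bradypnea")
    if sample["spo2"] < 90:
        findings.append("hypoxemia")
    if sample["etco2"] > 50 or sample["etco2"] < 15:
        findings.append("abnormal EtCO2")
    return findings

def should_alert(stream, min_findings=2, sustained=3):
    """Alert only when >= min_findings co-occur in `sustained`
    consecutive samples, rather than on any single excursion."""
    run = 0
    for sample in stream:
        run = run + 1 if len(oird_findings(sample)) >= min_findings else 0
        if run >= sustained:
            return True
    return False
```

Requiring multiple concurrent findings across several consecutive samples is what distinguishes this from a single-device threshold alarm, and it is also what suppresses transient artifacts.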

A recent study found that compared to traditional patient monitoring and spot checks, continuous clinical surveillance reduced the average amount of time it took for a rapid-response team to be deployed by 291 minutes in one clinical example. In addition, the median length of stay for patients who received continuous surveillance was four days less than that of similar patients who were not surveilled. (4)

Another condition that requires early intervention is severe sepsis, which accounts for more than 250,000 deaths a year in the US. (5) The use of continuous clinical surveillance can help predict whether a patient’s condition is going to get worse over time. By aggregating data from monitoring devices and other sources and applying protocol-driven measures for septicemia detection, a multivariate rules-based analytics engine can identify a potentially deteriorating condition and notify the clinical team.

Reduction in Alarm Fatigue

Repeated false alarms from multiple monitoring devices often cause clinicians to disregard these alerts or arbitrarily widen the alarm parameters. Continuous surveillance can significantly reduce the number of alarms that clinicians receive.

An underlying factor that produces alarm fatigue is that the simplistic threshold limits of physiologic devices — like patient monitors, pulse oximeters, and capnographs — are highly susceptible to false alarms. Optimizing the alarm limits on these devices and silencing non-actionable alarms is not enough to eliminate this risk. The challenge is achieving a balance between communicating essential patient information and minimizing non-actionable events.

Continuous clinical surveillance solutions that analyze real-time patient data can generate smart alarms. Clinically relevant trends, sustained conditions, recurrences, and combinatorial indications may signal a deteriorating patient before any individual parameter limit is violated. In addition, clinicians can leverage settings and adjustment data from bedside devices to evaluate adherence to, or deviation from, evidence-based care plans and best-practice protocols.
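To make the "sustained condition" idea concrete, here is a small sketch contrasting a naive per-sample threshold alarm with one that requires a sustained violation before firing. The limit, sample count, and data are hypothetical illustrations, not clinical recommendations.

```python
# Illustrative sketch: a "sustained condition" smart alarm versus a
# naive per-sample threshold. All numbers are hypothetical.

SPO2_LIMIT = 90        # percent
SUSTAIN_SAMPLES = 3    # consecutive low samples required before alerting

def naive_alarms(spo2_samples):
    """Fire on every sample below the limit (alarm-fatigue prone)."""
    return sum(1 for s in spo2_samples if s < SPO2_LIMIT)

def sustained_alarms(spo2_samples):
    """Fire once per run of >= SUSTAIN_SAMPLES consecutive low samples."""
    alarms, run = 0, 0
    for s in spo2_samples:
        run = run + 1 if s < SPO2_LIMIT else 0
        if run == SUSTAIN_SAMPLES:   # alert once when the run is confirmed
            alarms += 1
    return alarms

# A transient probe artifact (single dips) versus a true sustained event.
samples = [96, 89, 95, 96, 88, 87, 86, 85, 94]
print(naive_alarms(samples), sustained_alarms(samples))  # 5 1
```

The naive approach interrupts the nurse five times for this series; the sustained-condition approach interrupts once, for the event that actually matters.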

In a study done in a large eastern US health system, researchers sought to establish that continuous surveillance could alert clinicians about signs of OIRD more effectively than traditional monitoring devices connected to a nurse’s station without compromising patient safety. The results showed that a continuous surveillance analytic reduced the number of alerts sent to the clinical staff by 99 percent compared to traditional monitoring. No adverse clinical events were missed, and while several patients did receive naloxone to counter OIRD, all patients at risk were identified early enough by the analytic to be aroused and avoid the need for any rapid response team deployment. (6)

Clinical Workflow

When CIOs are considering a continuous clinical surveillance solution, they should look for a platform that fits seamlessly with their institution’s clinical workflow. This is especially important outside the ICU, where the staff-to-patient ratio is lower than in critical care. In these care settings, a solution that can be integrated with their mobile communication platform ensures that alerts will be received on a timely basis.

In addition, the continuous surveillance solution should have an open, standards-based architecture that integrates with the hospital’s EHR, clinical data repository, and other applications. Especially in these times, it must support strict security protocols as part of an overall cybersecurity strategy.

Clinicians are beginning to recognize that continuous clinical surveillance can help them deliver better, more consistent, more efficient, and safer patient care. In this respect, it reminds me of the timeframe after publication of the famous IOM report that highlighted the dangers of medication errors in the US healthcare system. Companies scrambled to provide a solution, and when automated medication administration was first introduced, the technology was unimaginably clunky. As many of us remember, COWs left the pastures and moved onto hospital floors.

I had the opportunity to watch clinicians who had significant doubts about bar coding and scanning try these new tools. It took only one patient encounter in which the technology helped them avoid administering an incorrect medication to turn a skeptic into an evangelist. They quickly realized their patients were safer because of the new technology. Clinicians will discover that continuous clinical surveillance offers the same type of patient safety benefits.

Eventually, hospitals will use continuous surveillance with acutely ill patients in all care settings. The ability of analytics to interpret objective physiological data in real time and enable clinical intervention for deteriorating patient conditions that could otherwise be missed is just too powerful to ignore.

REFERENCES

1. Giuliano, Karen K. “Improving Patient Safety through the Use of Nursing Surveillance.” AAMI Horizons. Spring 2017, pp 34-43.

2. Overdyk FJ, Carter R, Maddox RR, Callura J, Herrin AE, Henriquez C. Continuous Oximetry / Capnometry Monitoring Reveals Frequent Desaturation and Bradypnea During Patient-Controlled Analgesia. Anesth Analg. 2007;105:412-8.

3. Weinger MB and Lee LA. No patient shall be harmed by opioid-induced respiratory depression. APSF Newsletter. Fall 2011. Available at: www.apsf.org/newsletters/html/2011/fall/01_opioid.htm.

4. “Improving Patient Safety through the Use of Nursing Surveillance.”

5. Centers for Disease Control and Prevention, “Data & Reports: Sepsis.” https://www.cdc.gov/sepsis/datareports/index.html

6. Supe D, Baron L, Decker T, Parker K, Venella J, Williams S, Beaton L, Zaleski J. Research: Continuous surveillance of sleep apnea patients in a medical-surgical unit. Biomedical Instrumentation & Technology. May/June 2017; 51(3): 236-251. Available at: http://aami-bit.org/doi/full/10.2345/0899-8205-51.3.236?code=aami-site.

Readers Write: Analytics Optimization: Doing What It Takes

March 21, 2018 Readers Write 2 Comments

Analytics Optimization: Doing What It Takes
By Lee Milligan, MD

image

Lee Milligan, MD is VP/CIO of Asante of Medford, OR and a director of the governing boards of Asante, Oregon ACO, and Propel Health.

I recently surveyed a number of large and medium-sized integrated healthcare institutions in the Pacific Northwest with a focus on the analytics experience. I sought to answer one question: how do the operational and clinical end users perceive their experience of requesting and receiving information? I talked to CIOs, CMIOs, and directors of analytics.

Although the conversations touched on many concerns, three themes emerged that I now call the “Three Reporting D’s” – delay, distrust, and dissatisfaction. End users are just not getting what they need to do their jobs on time. Despite the adoption of sparkly analytics software products, the problems continue to fester.

We experienced a similar disconnect a few years back, and have, over the course of three years, re-architected our approach. Although we still have room for improvement, I’d like to share a bit about what we learned and how this reboot has led to a more satisfying end user experience.

We started the internal investigation by looking at the entire end-to-end experience from the customer’s perspective. Using a lean management technique known as value stream mapping, we drew out on a white board all of the steps that a typical end user would experience as they requested information from our analytics team. Surprisingly, this took quite a while and we ran out of white board space.

This was telling. Why does this process include so many steps? It reminded me of the 1990s Windows installations where the customer would continuously have to click “next” to move the process forward.

One of the keys of this lean technique is to identify the steps in the process that add value and eliminate the rest. We got stuck on the definition of value. What is valuable to the end user? When we honestly answered that question, a surprising number of steps were removed.

Next we asked, what’s missing? That question required us to walk in the shoes of our customer, like a doctor seeing the world through the patient’s lens. I also had the advantage of two additional frames of reference:

  • I personally requested that a report be built for me from scratch using the prior method, and
  • I asked the BI developers to CC me on all email communications between them and the customer.

Both experiences unearthed missing fragments, which ultimately informed our strategic BI architecture. Most of the changes we instituted were budget-neutral, process-related improvements. However, I would like to highlight two changes that cost a few bucks but delivered tremendous ROI.

Customer/BI Developer Partnership and Communication

We recognized fairly quickly that these relationships were in need of optimization. First, the customer rarely knows what they want. That’s not to say they can’t make a request. However, they frequently request what they don’t ultimately need or want.

Second, I discovered through those CC’d emails that customers request many additional discrete elements, far beyond the initial scope, usually as they learn more about what the information looks like. In other words, they didn’t fully understand what they were looking for, and we were unprepared to fully discover with them what they ultimately needed.

To plug that hole, we instituted a new position within our team: the clinical data analyst. Something akin to the business analyst in the corporate world, this role is responsible for working directly with the end user to accomplish two goals: (a) to fully understand the ask and detail it in the agreed-upon scope, and (b) to commit the requestor to actively participate in the data development process.

Also, our team of BI developers desired guidance on how they should communicate with our end users. We had naively taken that piece for granted. They requested clear direction on how to frame conversations, how to respond to specific requests outside of the agreed-upon scope, and how to ask better questions about the initial ask.

Teaching/Training

We surveyed our customers and discovered something astonishing. Many were not using the reports and data that we had delivered. When pressed, it became clear that many did not fully understand the information produced, and even fewer understood how to incorporate this data into their workflow to better inform operational decision-making.

We developed a new role as a principal trainer within ITS-Analytics. The goals of this role are twofold: (a) to work directly with end users to assure a full and practical understanding of the delivered information (i.e., how to read the report, what the symbols mean, how to navigate an analytics dashboard, etc.), and (b) to lead our self-service domain. The self-service aspect has significant potential to meet customers’ needs in a rapid, nimble fashion.

Putting it all together, our take-home lesson has been the criticality of performing regular internal assessments in order to verify that we are meeting our customers’ needs — from their point of view — both objectively and subjectively.

Readers Write: It’s Time for Drug Price Transparency

February 14, 2018 Readers Write 9 Comments

It’s Time for Drug Price Transparency
By Stanley Crane

image

Stanley Crane is the chief technology officer of InteliSys Health of San Diego, CA.

EHR vendors face a tough challenge in deciding which new features to develop and integrate for their next release and which ones to leave on the cutting room floor. The benefits of each potential enhancement must be weighed against the costs, usually measured in programming time. Moreover, features required for Meaningful Use and MIPS must be included, making the triage even more difficult.

However, EHR companies are missing the boat if they neglect to add a feature that could have a massive impact on their clients’ patients. I am speaking here of prescription drug price comparisons, built directly into the EHR workflow of prescribers.

We’ve heard a lot about drug price transparency lately. But the public discussion hasn’t come close to the truth.

There are vast differences in the prices pharmacies charge for the same drug from the same manufacturer within the same geographical area. For example, the price of generic Plavix (clopidogrel) ranges from $6.16 at one pharmacy in Aurora, CO to an amazing high of $150.33 at another pharmacy just a few steps away. That’s the equivalent of a gas station charging $72 per gallon for unleaded regular when a station across the street is asking $2.95. This is merely one of literally millions of examples of the absurd variation in retail drug prices.

Most doctors and patients are unaware that retail drug prices vary by so much. As a result, many patients go to the pharmacy, get hit with sticker shock, and walk out without picking up their medication. Others pay far more than they should for the drug because they’re unaware of widespread price variance.

A handful of companies now sell prescription drug price comparison tools directly to consumers. These haven’t had much impact, however. First, not many people know about them. Second, it’s too complicated for patients to move their prescriptions to another pharmacy.

Imagine how the situation would be different if a patient’s own doctor could tell him or her what their medications would cost at different pharmacies, regardless of whether the patient has insurance.

What our healthcare system needs today is a modern price comparison tool that is integrated with an e-prescribing tool, ideally within an EHR. The range of prices for a particular drug would appear on the prescribing screen within milliseconds of a physician selecting that medication. Using real-time pricing data from pharmacies, the software could show the cost of that drug at the closest pharmacies to the doctor’s office or the patient’s home or workplace. None of this information is available via EHRs on the market today.

Such a solution could use the patient’s insurance information in their doctor’s EHR, as well as search health plan databases to determine a patient’s out-of-pocket cost (after factoring in deductibles, co-payments, and out-of-pocket minimums). If the patient is on the hook for the cost — either because of a high deductible, a high co-pay, or because he or she is uninsured — the software could show the cash price of the medication. It could also indicate whether the cash price is lower than the co-payment under the patient’s plan, ensuring that the patient pays the lowest price each time.
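The out-of-pocket logic described above can be sketched in a few lines. The field names, simplified deductible handling, and all prices other than the article's Aurora, CO clopidogrel figures are hypothetical illustrations, not any actual plan's benefit rules.

```python
# Sketch of out-of-pocket comparison: compare each pharmacy's cash price
# with the insured cost and surface the lowest. Names are hypothetical.

def insured_cost(cash_price, copay, deductible_remaining):
    """Simplified: before the deductible is met, the patient pays the
    full price; after, they pay the plan co-payment."""
    return cash_price if deductible_remaining > 0 else copay

def best_option(pharmacies, copay, deductible_remaining):
    """Return (pharmacy, price, method) for the lowest patient cost."""
    options = []
    for name, cash in pharmacies:
        ins = insured_cost(cash, copay, deductible_remaining)
        if cash <= ins:
            options.append((name, cash, "cash"))
        else:
            options.append((name, ins, "insurance"))
    return min(options, key=lambda o: o[1])

pharmacies = [("Pharmacy A", 6.16), ("Pharmacy B", 150.33)]
print(best_option(pharmacies, copay=10.00, deductible_remaining=0))
# ('Pharmacy A', 6.16, 'cash') -- the cash price beats the $10 co-pay
```

Notice that the cheapest path here is paying cash rather than using insurance at all — exactly the kind of comparison a patient cannot make at the counter, but an EHR-integrated tool could surface at the moment of prescribing.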

At the patient’s choice, the doctor could then send the e-prescription to the most convenient pharmacy that charges the lowest price for that drug. If the price is still too high for the patient, the software could automatically analyze the selected drug against therapeutically equivalent alternatives, enabling the doctor to prescribe a lower-cost alternative, again comparing the prices at local drugstores.

Transparency in prescription drug pricing offers several benefits. Patients are likely to have better outcomes if they fill their prescriptions and adhere to their prescribed therapy. Physicians can garner higher quality scores if their patients take their meds and control their chronic conditions. Lastly, if price transparency becomes widespread, some pharmacy chains will be forced to lower their prices to avoid losing customers to lower-priced stores or chains. If that happens, the whole system benefits, including patients, plans, employers, and taxpayers.

Readers Write: In Defense of Bob Dolin

February 10, 2018 Readers Write 23 Comments

This comment was provided as a response to discussion about whether former Kaiser physician and HL7 chair Bob Dolin, MD should be allowed to return to industry work after serving a prison sentence for possession of child pornography.

I appreciate Mr. HIStalk’s comment that Bob should be able to work. Not only should he be allowed to work, he is obligated to return to being a useful, productive member of society. Not just from my perspective, but from the government’s perspective.

I know more intimately than any of you what the real situation is and was. I am his wife. So much for anonymity.

Many of you know me. I am a strong, independent woman, dependent on no one. Someone who not only hates child pornography and the implications of what that means for these children, but one who also despises “regular” pornography and the industry’s encouragement for participants to descend into child pornography (think “barely legal”). I also recognize that most men have participated in viewing pornography, especially men in unhappy marriages. But I don’t hate them or think they are sick — they are just unhappy.

Why have I stayed with Bob? Why do I encourage him to do the work he loves and to which he has made such great contributions? Let me tell you the reasons.

Bob is not a depraved, sick person. He never inappropriately touched any child. He is as far away from being a misogynist as any man I know. Likely, he has been far more monogamous and faithful than most of you.

While you might surmise that children were harmed because he downloaded a few zip files in one period in his life over 10 years ago, it is highly unlikely. There is no empirical evidence for that. Again, I am not asserting in any way that this was OK.

I have been with Bob nearly 24/7 since shortly after this was discovered. In fact, I believe our relationship and strong marriage has been a primary healer. Bob simply has the desire for the intimacy that only a special love such as ours provides.

What I am firmly asserting is that he is not sick or depraved. I am stating that 10 years ago, as his previous marriage was ending, he was in a bad spot. Did he go out and rape anyone or touch any child? Did he even have affairs? No. He withdrew into himself and escaped by viewing “regular” pornography, and unfortunately purposefully downloaded some child porn. There was no money exchanged.

In regards to “infants and toddlers being sadistically abused,” I challenge you to find an ICE arrest announcement (that’s the branch of government that deals with child pornography cases) that does not say, “XXX number of child pornography pictures were found, including infants and toddlers being sadistically abused.” Simply, that is what is in those zip files. How do I know? I was told that by lawyers who specialize in this area. The media (and ICE) love to emphasize this aspect. Whether or not the offender actually spent any time looking at these images is unknown in most cases.

I wish ICE would spend more time on finding and prosecuting the abusers and creators of child porn (usually family members) than on the easy targets of introverted adult males. For that matter, how is it possible that such pictures can even be uploaded? Surely we have the technology to recognize them and prevent it.

After extensive testing, examination, and interviews, Bob was not deemed a danger to society. Exam after exam has revealed him to have made a single mistake in an otherwise exemplary life. Not only that, it was about five years from when the forensics were done on his laptop to when the feds decided to prosecute. We assumed during this time there was nothing worth prosecuting for – they must have had far more pressing cases to deal with.

Bob’s friends, family, and many colleagues are happy to see Bob back contributing his brilliant mind to the industry. They recognize the price he has paid. For those of you who are appalled that he dare be a contributing member of society and HL7, and will quit if Bob continues to go to HL7 meetings after downloading child pornography 10 years ago, spending 2.5 years in federal prison, and losing his career, I encourage you to grow up, act like a mature adult, and think about the logic of that.

To quote a famous Rabbi, “And why beholdest thou the mote that is in thy brother’s eye, but considerest not the beam that is in thine own eye?” Are you perfect? Even if your sins are not as severe as you have judged Bob’s to be (as they may not be), and you have taken it upon yourselves to so severely damn him, I ask you to examine yourselves, your motives, and your personal issues.

Lastly, think of who you are hurting besides Bob. You are hurting me to the core. You are damaging my ability and desire to participate as a useful member of society. You are making me question nearly everyone at HL7 as to whether they have been two-faced to me these few past years, where I have remained a successfully contributing HL7 member by myself.

I won’t abandon Bob because of this. He is a good man, the best man I know, who made a bad mistake over 10 years ago.

Readers Write: Healthcare CIO Tenure Trends

February 5, 2018 Readers Write No Comments

Healthcare CIO Tenure Trends
By Ranae Rousse

image

Ranae Rousse is VP of sales for Direct Consulting Associates of Solon, OH.

Last year while supporting one of the many local HIMSS chapter events, a keynote speaker presented a statistic that caught my attention. The speaker was presenting on the rise of cybersecurity threats to healthcare. The first slide in his well-constructed PowerPoint presentation had a bolded “17 months” with a font size of about 200. The gentleman then shared with the attendees, most of whom were CIOs, that 17 months is now the average tenure for a chief information officer.

I asked for the source of the 17-month statistic and found that it was for CISOs rather than CIOs and it was also not specific to healthcare. I decided to do my own research with an independent survey of 1,500 healthcare CIOs. The results:

  • The average tenure for a healthcare CIO is 5.5 years, with the range from five months to 23 years.
  • 37 percent of respondents were not healthcare CIOs in their previous jobs. Those who were tended to have longer tenure in their previous CIO positions.
  • 44 percent of the respondents said they don’t have a succession plan. Those respondents also did not have a requirement to appoint a successor.
  • 69 percent intend to retire as a healthcare CIO, although 11 percent said they would pursue a COO/CEO role and the remaining 20 percent were split equally between moving to a consulting job and leaving healthcare.

Increases in mergers, acquisitions, and hospital closures between 2008 and 2017 reflect a loss of roughly 280 hospitals, so the number of CIO positions is decreasing. The perception of the CIO role itself has changed from being a senior IT leader to becoming a higher-level healthcare executive, opening the door for the role of the associate CIO in many large health systems.

Considering this ever-changing landscape, what trends can we anticipate for the future?

Readers Write: The Secret to Engaged Physicians at Go-Live: Personalize the EHR

February 5, 2018 Readers Write 1 Comment

The Secret to Engaged Physicians at Go-Live: Personalize the EHR
By Dan Clark, RN

image

Dan Clark, RN, MBA is senior vice president of consulting at Advisory Board.

I often compare an EHR implementation and go-live to getting a new smart phone. Out of the box, it’s a powerful tool, but it doesn’t truly become effective until you start to download applications, add your email and contacts, and pick a personal picture as your background.

Just like your new smart phone, EHRs aren’t ready to perform at their best out of the box and always require some degree of personalization. EHR personalization may sound like one more step in a long, multi-staged implementation and go-live, but it can often be the difference between adoption and rejection.

New technology will always be a disruption, but personalization can minimize a new EHR’s negative impact on patient care by matching new tech to existing clinical workflow, not vice versa. While it’s important to focus on “speed to value” with a new EHR, health systems that take the time to personalize workflows for specialties and individual providers typically see a much higher rate of adoption and a quicker return to pre-go-live productivity.

Health systems should consider a multi-layered approach to personalization. At the very least, health systems should design technology that aligns the EHR to serve high-level strategic goals, such as quality reporting and provider productivity expectations.

When it comes to the individual user level, almost every health system starts with didactic classroom trainings that may combine users from a variety of different clinical and administrative areas. While this is a good baseline, it’s challenging to teach a course that applies to doctors and nurses, front office staff, and revenue cycle alike. Physicians, specifically, report that these sessions take time away from their patients and don’t always provide the value they are hoping they will.

Because of this, one-on-one opportunities for personalization are most efficient and have the biggest impact. I typically see health systems tackle one-on-one personalization support in a couple of ways. The first is setting up a personalization lab. Prior to go-live, we set up a 24/7 personalization lab right in the physician’s office or hospital. This gives clinicians the opportunity to stop in with ad hoc questions, or better yet, make a formal appointment with a clinical EHR expert. These sessions are guided by an extensive checklist of EHR personalization options, fine-tuning everything to the clinician’s preference and specialty.

One orthopedic surgeon came back to the personalization lab four times, and that was after she had already completed the classroom training. We worked with her to personalize specific workflows, order sets, and even simple things like page setup in the EHR.

Personalization serves as just-in-time training and is usually well received by the clinicians. Sometimes this training takes the form of a mobile workstation in the hallways that caters to clinicians’ in-the-moment questions during their breaks and doesn’t pull them away from patients. This kind of assistance is also usually well received by clinicians since it gives them a chance to ask a question about a real patient scenario.

The trick to getting EHR and go-live training right, in any scenario, is to provide the right support—other clinicians who will stand at the elbow with the providers as they navigate real scenarios and issues. And staffing your personalization lab with clinicians will give you the best bang for your buck, providing your staff with clinical and technical expertise. Trainers that combine EHR acumen with clinical expertise and knowledge of appropriate workflows can help clinicians hard code best practices into the technology in a way a technical expert may not.

EHR go-live is an anxiety-ridden time for all health system staff, clinicians and non-clinicians alike. It’s important that all staff feel they have the support, training, and preparation to use the EHR to its fullest potential to impact patient care.

Readers Write: How IT Professionals Can Work More Effectively with Physicians

January 31, 2018 Readers Write 6 Comments

How IT Professionals Can Work More Effectively with Physicians
By Stephen Fiehler

image

Stephen Fiehler is IS service leader for imaging and interventional services at Stanford Children’s Health in Palo Alto, CA.

Be Agile – Work Around Their Schedule

Stop inviting orthopedic surgeons to your order set review meeting from 2:00 to 3:00 p.m. on Wednesday at your offsite IT department building. That is not a good use of their time. And good luck getting them to log in and pay attention to your GoToMeeting from 10:00 to 11:00 a.m. on Thursday.

Some electrophysiologists I work with are only available at the hospital at 7:00 a.m. on Tuesdays or Thursdays. I get there at 6:45 a.m. and have everything ready to go when they walk in the room so we can get through as much content as possible. The best time to meet with an invasive cardiologist is in the control room between cases. When I need to validate new content with them, I wear scrubs and work from a desk in the control room for half a day to get a cumulative 30 minutes of their time. This way, if cases run late, they can get home to their family at 8:00 p.m. instead of 9:00.

As long as I have my laptop, my charger, and an Internet connection, I can be productive from any location that works best for the physicians. Their time is more valuable than mine. Every hour I take them away from patient care means less revenue for the hospital and fewer kids getting the medical treatment they need.

There are physicians that have the bandwidth to spend more time with us on our projects, but it is imperative that we not expect it from them.

Be Brief – Keep Your Emails Short and Concise

Review your emails to physicians before sending them. You could probably communicate as much, if not more, with half the words.

When I was at Epic, one of the veteran members on the Radiant team had a message on his Intranet profile instructing co-workers to make emails short enough that they could be completely read from the Inbox screen of the iOS Mail app. Any longer, and you could assume he would not read or reply.

If an email has to be long, bold or highlight your main points or questions. Most physicians have little time to read their email. Show them you value their time and increase the likelihood that they will read or reply to your message by keeping it concise. Writing shorter emails helps you waste less of your own time as well.

Also, use screenshots with pointers or highlighted icons when appropriate. They might not know what a “toolbar menu item” or a “print group” is.

Be Service-Minded – Do Not Forget IT is a Service Department

The biggest mistake a healthcare IT professional can make is forgetting that we are a service department. The providers, staff, and operations are our customers. It is our job to provide them with the tools they need to deliver the best patient care possible. That is why the IT department exists.

Given the complexity of our applications, integration, and infrastructure, it is tempting to forget that we are not the main show. Whether we like it or not, we are the trainers, equipment managers, and first-down marker holders, whereas the providers are the quarterbacks, wide receivers, and running backs.

By focusing on providing the best service possible, you will implement better products and produce happier customers. At the end of the day, we want to be effective and to have a positive impact on the organization. The best way to do that is through being service-minded.

Readers Write: If I Were the Health IT King: A Royal Perspective on 2018 Trends

January 10, 2018 Readers Write 2 Comments

If I Were the Health IT King: A Royal Perspective on 2018 Trends
By Jay Anders, MS, MD

image

Jay Anders, MS, MD is chief medical officer of Medicomp Systems of Chantilly, VA.

If I were king of health IT, I would find great joy in sitting at the head of a banquet table before all my subjects, casting judgment on the most current health IT trends. Like the king in Bud Light’s recent commercial series, I’d love to lead a hearty “dilly dilly” cheer for innovations that make it easier for physicians to practice medicine, while banishing the less worthy trends to the “pit of misery.”

Health IT king or not, I see the following 2018 health IT-related trends falling into two distinct buckets.


Deserving Dilly Dilly Cheers

Interoperability

At long last, health systems seem to be accepting the inevitability of interoperability. Organizations are resigned to the fact that it’s no longer reasonable to refuse to share patients’ clinical records with cross-town competitors. Instead, everyone needs to work together to make systems talk. The growing acceptance of standards such as FHIR is also helping to advance interoperability efforts. I predict significantly more progress in this area over the next three to five years.

Collaboration with Physicians

More health IT companies are seeking input from physician users as they design, build, and test their solutions. Vendors are realizing that the creation of user-friendly clinical interfaces can no longer be an afterthought and that the delivery of physician-friendly solutions must be a priority. By collaborating with physicians, vendors better understand required clinician workflows, existing bottlenecks, and the processes that are critical to patient safety.

For example, physicians can provide insights into common clinician thought processes and clarify why one workflow may be preferred over another. Physicians understand what tasks are traditionally performed by a medical assistant, how long a particular procedure might take, and when and why a clinician cannot be looking at a computer screen. By embracing physician collaboration, health IT companies are better-equipped to create innovative solutions that work and think like physicians and enhance provider satisfaction.

Shared Chart Ownership

Not too many years ago, most people — including patients — believed that each physician owned his or her own patient charts. That mindset is changing, and today, most providers and patients realize that everyone involved in a patient’s care — including the patient’s family — needs to share clinical data. The growing recognition that information must flow seamlessly between caregivers is a huge step in the right direction and advances industry efforts to get the right information to the right person at the right time.


Banished to the Pit of Misery

Data Dumping

More data is being exchanged between providers thanks to better interoperability tools and growing enterprise acceptance. Unfortunately, many organizations continue to struggle to figure out what to do with all the data. More health systems have the ability to dump buckets of data on providers, yet few physicians have the tools to efficiently organize the data into actionable information that enhances patient care. Don’t look for any widespread fixes in the short term.

Administrative Burdens

Healthcare still has not figured out how to reduce the administrative burdens of practicing medicine. Physicians continue to be frustrated and disillusioned with their careers, thanks to ever-changing regulatory and reimbursement requirements that require adjustments to clinical workflows. Don’t expect big improvements any time soon, nor major legislation that streamlines existing healthcare policies and regulations. Instead, physicians will be forced to continue addressing numerous tasks that distract from the delivery of patient care.

AI Hype

Despite all the hype, don’t look to artificial intelligence and machine learning technologies to solve all the industry’s data and reporting problems. The bottom line is that these technologies are still insufficiently mature for healthcare applications. Providers would of course love solutions that leverage natural language processing (NLP). AI will have the ability to convert dictated chart notes to free text and free text to data that is actionable for clinicians. Unfortunately, the error rates for converting speech to text to data are, at best, between eight and 10 percent. Give these technologies at least two to three more years before they’re ready to truly enhance clinical decision-making at the point of care, move out of the pit of misery, and earn dilly dilly cheers.


Ah, if only I were the Health IT King and had the power to fix inefficient systems that impair clinician productivity. I cheer dilly dilly to all who seek to embrace the knowledge and expertise of physicians to deliver highly usable solutions. I am confident that their efforts will make physicians happier and more productive and enhance the delivery of quality patient care.

Readers Write: Finding the Elusive Insights to Improve Surgical Outcomes

December 20, 2017 Readers Write 1 Comment

Finding the Elusive Insights to Improve Surgical Outcomes
By Dennis Kogan


Dennis Kogan, MBA is co-founder and CEO of Caresyntax of Boston, MA.

America’s operating rooms have an international reputation for driving surgical innovation. But they are also the setting for high variation in performance, as evidenced by the fact that 10 percent to 15 percent of patients experience serious post-surgery complications. This means millions of patients are at risk, yet insight into the root causes of performance variation remains an elusive “black box.” In the absence of this understanding, some hospitals cite the uniqueness of their patient cohorts as the primary driver of variation.

That has the unsettling ring of blaming the patient for his or her subsequent complications. Further, it raises the question of whether the hospital has a reliable risk stratification methodology for its patient cohorts, and if not, why not? We can predict the reason, and it’s a valid one. Risk stratification at scale depends on data insights, and most perioperative data—a full 80 percent of it—is either uncaptured or unstructured.

To establish perioperative best practices, hospitals first need to harness the massive volume of data where actionable insights currently hide. With the convergence of IoT medical technology and healthcare analytics, they finally can.

Significant workflow enhancements can be made, for example, via performance analytics that consume structured preoperative and postoperative data from the EMR, surveys, and patient outcome assessments. But real actionability is made possible with the addition of point-of-care data acquired within the operating room itself, largely from various connected medical devices. Combined with structured preoperative and postoperative data, this provides clinicians with both aggregated and granular data visibility. Now enabled with the full clinical picture, clinicians can focus on putting the data into action.

Circling back to risk stratification, let’s take a closer look at how this works. First, providers must document an individual patient’s risk factors. Next, using a validated risk calculator, a personalized risk assessment can be created (and communicated to the patient). Finally, that assessment should be added to an aggregation of patient risk assessments. From this collection of data, along with other data sources that include data pulled during the patient’s surgery, automated risk stratification reports can be made immediately available to ICU managers to help prioritize and tailor recovery pathways. These reports could also indicate complication risk and compliance percentages versus targeted benchmarks.
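
To make the pipeline concrete, here is a minimal sketch of the scoring and stratification steps in Python. The risk factors, weights, and tier thresholds below are invented for illustration only; a real implementation would use a validated risk calculator, not these made-up numbers.

```python
# Hypothetical risk stratification sketch. Factor names, point weights,
# and tier cutoffs are illustrative assumptions, not a validated calculator.
def risk_score(patient):
    score = 0
    score += 2 if patient["age"] >= 75 else 0
    score += 3 if patient["diabetes"] else 0
    score += 4 if patient["emergency_case"] else 0
    return score

def stratify(patients):
    # Aggregate individual assessments into tiers an ICU manager could review.
    tiers = {"high": [], "medium": [], "low": []}
    for p in patients:
        s = risk_score(p)
        tier = "high" if s >= 6 else "medium" if s >= 3 else "low"
        tiers[tier].append(p["id"])
    return tiers

cohort = [
    {"id": "A", "age": 80, "diabetes": True, "emergency_case": False},
    {"id": "B", "age": 50, "diabetes": False, "emergency_case": False},
    {"id": "C", "age": 70, "diabetes": True, "emergency_case": True},
]
print(stratify(cohort))
# → {'high': ['C'], 'medium': ['A'], 'low': ['B']}
```

The automated report described above would be this aggregation step run continuously as new assessments and intraoperative data arrive.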

All patients are inherently unique, but that doesn’t mean most of the variation in surgical outcomes or costs is unavoidable. In fact, a significant amount of variation can be reduced by meeting targeted benchmarks—say, for reducing infection, readmissions, length of stay, or even amount of pain experienced post-surgery. These benchmarks and best practices can be crystalized after aggregating and analyzing procedure and surgical documentation, such as reports, vital charts, videos, images, and checklists.

One strategy used in operating rooms around the world is to automate the collection and aggregation of operating room video recordings with key procedure data, including some of the above-mentioned checklists and vitals data. Advanced technology can also retrieve surgical videos and images from any operating room integration system. Once surgery and vitals are recorded in a synchronized way, the ability now exists to identify and create a standard protocol that can go into a pre- or post-operative brief.

An additional use for this data includes streamlining post-operative report building, especially for payer reporting and internal quality initiatives. While there is a little time left to report 2017 data for the first official year of MACRA MIPS, this will be a continuing need.

Pre-operative risk scoring is sporadic at best, again due to the inability to harness the necessary data. But the same data aggregated to create benchmarks and best practices can be used to create robust, highly accurate risk scores that estimate the potential harm to a surgical patient. In parallel, protocols identified from the same data can help to mitigate this risk.

In a hypothetical example, perhaps in one hospital more than 11 percent of patients undergoing non-cardiac surgery experience post-op infection. Predictive analytics reveal that the number of times certain thresholds were reached during surgery correlated with outcome measures. Evidence from this research can be incorporated into a decision support system that monitors the patient’s score and sends alerts when care plans are veering off course. Reductions in infections—and corresponding length of stay and readmission—soon follow.
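
A decision support rule of the kind described might be sketched as follows. The vital sign, target range, and alert threshold here are hypothetical assumptions for illustration, not clinical guidance.

```python
# Illustrative sketch: count intraoperative threshold excursions and alert
# when a hypothetical limit is exceeded. Names and limits are assumptions.
def count_excursions(readings, low, high):
    """Count contiguous runs where a vital sign leaves the target range."""
    excursions, in_excursion = 0, False
    for value in readings:
        out = value < low or value > high
        if out and not in_excursion:
            excursions += 1
        in_excursion = out
    return excursions

# e.g., mean arterial pressure sampled each minute, assumed target 65-110 mmHg
map_readings = [80, 75, 62, 60, 70, 72, 58, 90]
n = count_excursions(map_readings, low=65, high=110)  # two excursions here
if n >= 2:
    print(f"ALERT: {n} hypotensive excursions; review care plan")
```

In the scenario above, the count of such excursions would feed the predictive model, and the alert would fire when the patient’s score indicates the care plan is veering off course.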

Persistent opacity into root causes of variation is untenable. Quality-based reimbursement programs such as MACRA MIPS rely heavily on analytics of surgical performance, with a full 60 percent weight given to quality. Meanwhile, patients are aging and becoming frailer, which could push post-surgery complications even higher than the current rate.

Clearly it is time to innovate not just how we perform surgery, but also how we improve performance.

Readers Write: Almost Real, But Not Quite: Synthetic Data and Healthcare

December 20, 2017 Readers Write No Comments

Almost Real, But Not Quite: Synthetic Data and Healthcare
By David Watkins

David Watkins, MS is a data scientist at PCCI in Dallas, TX.

We all want to make clinical prediction faster and better so we can rapidly translate the best models into the best outcomes for patients. At the same time, we know from experience that no organization can single-handedly transform healthcare. Valuable information hidden in data silos across sectors of the healthcare landscape could help demystify the complexities around cost and outcomes in the United States, but privacy and compliance concerns around those silos limit transparency and collaboration, making data access difficult, expensive, and resource-intensive for many innovators.

Until recently, the only way to share clinical research data has been de-identification, selectively removing the most sensitive elements so that records can never be traced back to the actual patient. This is a fair compromise, with some important caveats.

With any de-identified data, we are making a tradeoff between confidentiality and richness, and there are several practical approaches spanning that spectrum. The most automated and private method, so-called “Safe Harbor” de-identification, is also the strictest about what elements to remove. Records de-identified in this way can be useful for many research cases, but not time-sensitive predictions, since all date/time fields are reduced to the year only.
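
As a rough illustration of the Safe Harbor end of the spectrum, the sketch below drops direct identifiers, reduces dates to the year, and aggregates ages over 89. This covers only a subset of the rule’s 18 identifier categories, and the field names are hypothetical.

```python
from datetime import date

# Partial list of direct identifiers for illustration; the actual Safe Harbor
# rule enumerates 18 categories of identifiers to remove.
DIRECT_IDENTIFIERS = {"name", "mrn", "address", "phone", "email", "ssn"}

def deidentify(record: dict) -> dict:
    """Return a Safe Harbor-style copy of a patient record (illustrative sketch)."""
    clean = {}
    for key, value in record.items():
        if key in DIRECT_IDENTIFIERS:
            continue                 # drop direct identifiers entirely
        if isinstance(value, date):
            clean[key] = value.year  # all date fields reduced to year only
        elif key == "age" and value > 89:
            clean[key] = "90+"       # ages over 89 aggregated into one bucket
        else:
            clean[key] = value
    return clean

record = {"name": "Jane Doe", "mrn": "12345", "age": 93,
          "admit_date": date(2017, 6, 1), "diagnosis": "J18.9"}
print(deidentify(record))
# → {'age': '90+', 'admit_date': 2017, 'diagnosis': 'J18.9'}
```

The year-only dates are exactly why records de-identified this way cannot support the time-sensitive predictions mentioned above.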

At the other extreme, it is possible to share more sensitive and rich data as a “Limited Data Set” to be used for research. Data generated under this standard still contains protected health information and can only be shared between institutions that have signed an agreement governing its use. This model works for long-term research projects, but can require lengthy contracting up front and the data is still locked within partner institutions, too sensitive to share widely.

What’s a novel yet pragmatic solution for catalyzing analytics advancement in the healthcare industry? We are exploring “synthetic data”: data created from a real data set to reflect its clinical and statistical properties without exposing any identifying information.

Pioneering work is being done to create synthetic data that is clinically and statistically equivalent to a real data source without recreating any of the original observations. This notion has been around for a while, but its popularity has grown as we’ve seen impressive demonstrations that implement deep learning techniques to generate images and more. If it’s possible to generate endless realistic cat faces, could we also generate patient records to enable transparent, reproducible data science?

The deep learning approach works by setting up two competing networks: a generator that learns to create realistic records and a discriminator that learns to distinguish between real and fake records. As these two networks are trained together, they learn from their mistakes and the quality of the synthesized data improves. Newer approaches even allow us to further constrain the training of these networks to match specific properties of the input data, and to guarantee a designated level of privacy for patients in the training data.
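
The adversarial loop can be illustrated with a toy one-dimensional example. A real synthesizer would train deep networks over full patient records; here the generator is a simple affine transform, the discriminator is logistic regression, and the feature, parameters, and learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data: a toy one-dimensional feature standing in for a patient record.
real = rng.normal(loc=3.0, scale=1.0, size=1000)

# Generator g(z) = a*z + b; discriminator D(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0   # generator parameters (learned)
w, c = 0.1, 0.0   # discriminator parameters (learned)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.05
for step in range(1500):
    z = rng.standard_normal(256)
    fake = a * z + b
    x_real = rng.choice(real, size=256)

    # Discriminator ascent on log D(real) + log(1 - D(fake)).
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * x_real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator ascent on log D(fake) (non-saturating loss).
    d_fake = sigmoid(w * fake + c)
    grad_fake = (1 - d_fake) * w      # d/dfake of log D(fake)
    a += lr * np.mean(grad_fake * z)  # chain rule: dfake/da = z
    b += lr * np.mean(grad_fake)      # dfake/db = 1

# Sample "synthetic" records from the trained generator.
synthetic = a * rng.standard_normal(1000) + b
print(float(np.mean(real)), float(np.mean(synthetic)))
```

Even this toy version exhibits the known instability of adversarial training; the newer constrained and privacy-guaranteeing approaches mentioned above exist precisely to tame that behavior on real clinical data.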

We are investigating state-of-the-art methodologies to evaluate how effective the available techniques are at creating data sets. We are devising strategies for overcoming technological and scientific barriers to open up an easily accessible, realistic data platform that enables an exponential expansion of data-driven solutions in healthcare.


Can synthetic data be used to accelerate clinical research and innovation under strong privacy constraints?


In other data-intensive areas of research, new technologies and practices have enabled a culture of transparency and collaboration that is lacking in clinical prediction. The most impactful models are built on confidential patient records, so sharing data is vanishingly rare. Protecting patient privacy is an essential obligation for researchers, but privacy also creates a bottleneck for fast, open, and broad-based clinical data science. Synthetic data may be the solution healthcare has been waiting for.

Readers Write: Report from AWS Re:Invent

December 4, 2017 Readers Write No Comments

Report from AWS Re:Invent
By Travis Good, MD


Travis Good, MD, MS, MBA is co-founder and CEO of Datica of Madison, WI.

AWS Re:Invent has become one of the most important technology conferences of the year due to the sheer size of the Amazon Web Services cloud and the rate of technology innovation announced. The influence of the conference on health IT has grown over the years as well.

Cerner

There was not an industry-shattering Cerner announcement as was rumored in the CNBC article the week prior. Cerner held a session focused on a few interoperability topics that was well received by the deeply technical audience. But the announcement was made nowhere during its session, nor during the daily keynotes. We bumped into a few Cerner individuals at the event, all of whom commented that they are excited about the future capabilities of AWS’s international regions. International expansion has lately been a priority across many health IT vendors, and based on the Cerner conversations we had at the event, it appears both Cerner and AWS have similar ambitions.


The amount of money Cerner makes on managed services (which can be largely interpreted as hosting) and on support and maintenance (which one can presume includes a large amount of hosting-related support) dwarfs the company’s revenue from licenses and subscriptions. The international market is the greatest area of growth for its core revenue model, but international data centers are exponentially harder to build and maintain yourself than co-locating in the US. Cerner has built and maintained its own data centers domestically.

There are legs to the rumored CNBC story as well as credibility to the other rumors around population health-related partnerships, but the best insight from Re:Invent we can lend is that any rumored partnership is much more about hosting management than it is about APIs or cloud-based data interoperability.

Compliance

Without question, compliance and security were the two most important topics at the conference. Simply charting the messaging from vendors demonstrates the point: at least twice as many vendors were touting compliance and security management tools, while at least half as many vendors were there to market developer empowerment tools. It’s as if the cloud grew up into an enterprise option in the last 12 months.

This is also backed by our observation of the number of C-suite attendees at the event. Supposedly the attendee count jumped from 30,000 last year to 45,000 this year—a number and rumor that was floated often throughout the conference. If true, from our vantage point, the 15,000-person increase was a major jump in “suits” who were there to evaluate how to make this cloud thing work rather than developers who are already leveraging the cloud for projects.

As such, compliance and security were the buzz amongst serious enterprise and healthcare buyers, while the general zeitgeist amongst developers was machine learning and artificial intelligence. But, as we all know, health IT is always woefully behind!

HIPAA, HITRUST, GDPR, GxP, FedRAMP, and others were the topics we continuously heard discussed. Interestingly, there are surprisingly few options to help truly manage these complex compliance frameworks on AWS. Ultimately, the sentiment we gathered across the healthcare landscape is that no one is really helping, especially with HITRUST, GxP, or GDPR. No one had a true GDPR message or product. (Datica will be GDPR ready in Q1 2018.)

AI, ML, and AWS Services

John Moore from Chilmark Research once told us that he goes to Health 2.0 to see what’s going to happen and HIMSS to see what’s already happened. Re:Invent has similar characteristics as Health 2.0.

The pace of innovation and its accessibility to digital health developers are such that changes to health IT products will come ever more rapidly, despite the industry’s best efforts to slow them down. The sense that the AI revolution is just around the corner was one of the strongest impressions from Re:Invent. Because more AI tooling is being made available to health IT developers on AWS’s cloud, better products that more adeptly address patient care and reduce costs are going to arrive at an ever faster pace. It’s going to be an interesting next few years.

Readers Write: The Challenges (and Benefits) of Anesthesia Data Capture

November 29, 2017 Readers Write No Comments

The Challenges (and Benefits) of Anesthesia Data Capture
By Douglas Keene, MD


Douglas Keene, MD is chairman and founder of Recordation of Wayland, MA and an anesthesiologist and co-founder with Boston Pain Care Center.

As part of the American Recovery and Reinvestment Act of 2009, hospitals and clinics were required to demonstrate conversion to electronic medical records (EMRs) by the end of 2014. However, despite government incentive programs totaling billions of dollars, the program initially faced myriad hurdles and proved harder to implement than anticipated. Fast-forward nearly a decade and the initiative is back on track, with over 90 percent of healthcare facilities using EMRs as their universal standard.

With that said, one segment of the healthcare market has lagged in EMR adoption: anesthesia care providers and their adoption of anesthesia information management systems (AIMS). Despite the critically important role the operating room plays in a hospital’s ecosystem – typically the source of about 60 to 70 percent of a hospital’s revenue – the majority of healthcare facilities have been hesitant to make substantial monetary investments in AIMS.

To bring the EMR revolution out of the doctor’s office and into the OR, physicians must reflect on the factors that have slowed AIMS adoption and consider the key features and components that physicians and administrators need in order to overcome these implementation hurdles.

Anesthesiology departments have grappled with many of the same challenges initially faced by healthcare facilities looking to adopt EMRs. These include reluctance to share information with competitors, software from different vendors that can’t interoperate or communicate, lengthy and complex implementation phases, and the overall high price tag of such systems.

In addition to these obstacles, AIMS adoption faces an even more challenging hurdle: adoption inertia among anesthesia providers. While all EMR software faced some initial skepticism from healthcare providers in general, this aversion has been far more vehement among anesthesia care teams for several important reasons, most stemming from the complexity of real-time anesthesia-related documentation.

Early AIMS were difficult to learn, use, and implement. They relied on large, expensive computers with relatively low processing power and struggled to interface reliably with anesthesia equipment and hospital information systems. Anesthesia workflow and efficiency often worsened with the introduction of early AIMS technology.

Advances in computer technology and interface design have improved some aspects of the overall user experience. However, the drawbacks from early AIMS still linger in the minds of many anesthesia providers.

While many academic and larger surgical facilities have adopted AIMS made by the vendors of their existing hospital information systems, numerous community hospitals and ambulatory surgical centers have not yet transitioned to electronic anesthesia records, owing to their smaller size and budgetary constraints.

As a result, many of today’s anesthesiologists and CRNAs who underwent their initial training using AIMS in academic facilities ultimately enter practices that still rely on handwritten documentation.

As economic and regulatory forces increase pressure to consider the adoption of electronic anesthesia records, teams that include administrators, information management specialists, clinical managers, and anesthesia providers are sharing the decision-making process.

As a board-certified specialist in anesthesiology, pain management, and clinical informatics, I am certainly familiar with the complaints physicians have had about AIMS. In my opinion, with the modern technologies now on the market – many at more reasonable price points – there is no good reason for surgical facilities and anesthesia departments to hesitate to consider adopting anesthesia information technology. The benefits of AIMS and the potential perils of not adopting such a system are far too great to ignore.

In choosing an AIMS, buyers should consider the type of facility in which it will be implemented and select a system that reflects that facility’s characteristics. As an example, ambulatory surgery centers (ASCs), while among the slowest to adopt AIMS, are beginning to realize that their survival will depend upon information management.

ASCs must provide patient care with a focus on safety, quality, and operational efficiency, but often have smaller budgets to implement information technology. Therefore, a sensible approach would be choosing a cost-effective AIMS solution designed to facilitate perioperative documentation in a fast-paced anesthesia workflow environment that is focused on providing easily available data for process analysis and improvement.

ASCs also need to streamline the sharing of information from and with numerous sources, including primary care providers, surgeons, patients, and hospitals, and therefore should choose an AIMS solution that focuses on interoperability and that is easy to implement. These factors will benefit all of the ASC’s stakeholders and will lead to better patient care and assure the long-term financial viability of the facility.

From the point of view of the AIMS end users, the anesthesia care team must view the AIMS solution as a benefit rather than an obstacle. Instead of placing a barrier between physician and patient, as some feared AIMS would do, early adopters have found that well-designed AIMS empower physicians and CRNAs to be more vigilant with respect to direct patient care during surgery.

Instead of using handwriting to create what is sometimes partially illegible documentation during a surgical procedure, many AIMS are able to capture vital signs such as pulse oximetry, end-tidal CO2, volatile agent concentrations, and other numerics automatically, enabling providers to spend more time monitoring the patient and focusing on quality of care. The result: better data, accurate documentation of measurements, and improved patient outcomes.
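
Conceptually, the automated capture loop looks something like the sketch below. The simulated monitor interface and field names are assumptions, standing in for a real device feed (serial, network, or HL7), but the structure — poll, timestamp, append to the record — is the essence of it.

```python
import random
from datetime import datetime, timedelta

def read_monitor():
    # Stand-in for a real device interface; returns plausible random values
    # for pulse oximetry (SpO2, %) and end-tidal CO2 (mmHg).
    return {"spo2": random.randint(95, 100), "etco2": random.randint(33, 40)}

def capture(start, minutes):
    """Build a timestamped anesthetic record, one sample per minute."""
    record = []
    for i in range(minutes):
        sample = read_monitor()
        sample["time"] = (start + timedelta(minutes=i)).isoformat(timespec="minutes")
        record.append(sample)
    return record

chart = capture(datetime(2017, 11, 29, 8, 0), minutes=3)
for row in chart:
    print(row["time"], row["spo2"], row["etco2"])
```

Because every value arrives validated and timestamped, the same record can later be aggregated into the quality-analysis databases described below.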

Other improvements to modern-day AIMS include intuitive user experiences and interfaces, the ability to easily customize workflows, and increased interoperability with existing EMR systems. For AIMS users, and especially for ASCs, ease of use and system integration are of utmost importance, as the success of an ASC depends on the ability to seamlessly share information back to the host system of a hospital or provider during transfer of care.

In addition to interoperability, today’s AIMS solutions are designed to mimic traditional interfaces and workflows with which anesthesia providers are already familiar. In fact, adopters of well-designed AIMS can become comfortable with their use after just a few surgical procedures.

There will always be new documentation requirements, new monitoring data that must be recorded, and new information that will need to be shared with providers. Practices that adopt modern AIMS solutions will be able to weather these changes far more easily than those who continue to create handwritten anesthesia documentation, as well-designed clinical solutions respond to these changes and guidelines in anesthesia technology, monitoring, and standards of care.

In summary, a well-designed AIMS provides a cost-effective alternative to handwritten documentation in that anesthetic records can now be based upon high resolution electronic data capture, with computer-validated information that can be aggregated into databases that form the basis for continuing quality analysis and improvement studies.

In the end, with a relatively small investment in anesthesia information technology, even the smallest community hospitals and ambulatory surgical centers can implement technology that will empower the facilities to say with confidence, “We’re doing a great job and here’s the proof.”
