Readers Write: Continuous Clinical Surveillance: An Idea Whose Time Has Come

March 21, 2018 Readers Write 3 Comments

Continuous Clinical Surveillance: An Idea Whose Time Has Come
By Janet Dillione

Janet Dillione is CEO of Bernoulli Health of Milford, CT.

It’s no secret that the general acuity of hospitalized patients is increasing as the overall US population continues to age (hello, Baby Boomers). Many patients who would have been in an ICU in the past are now found in lower-acuity areas of the hospital. We foresee that the hospital of tomorrow, in terms of monitoring and surveillance capabilities, will need to be more like an enterprise-wide ICU.

A significant problem with such a transformation is that hospitals will not be able to staff their entire facility like an ICU. In most hospitals, there is simply no money to add more staff. Even if there were sufficient funds, doctors and nurses are in short supply. Hospitals will have no choice — they will need new technological tools to help clinicians manage these rising levels of acuity.

One type of technology that holds promise in this regard is continuous clinical surveillance. In contrast to electronic monitoring — which includes observation, measurement, and recording of physiological parameters — continuous clinical surveillance is a systematic, goal-directed process that detects physiological changes in patients early, interprets the clinical implications of those changes, and alerts clinicians so they can intervene rapidly. (1)

Just a few years ago, continuous clinical surveillance would have been impossible because there was no way to integrate data from different monitoring devices, apply analytics to that information in real time, and communicate alerts to physicians and nurses beyond the nearest nurse’s station. But today, medical device data can be aggregated and analyzed in a continuous stream, along with other relevant data such as patient data from the EHR. In addition, many clinicians now carry mobile devices that allow them to be alerted wherever they are.

Early Warning System

A continuous clinical surveillance system uses multivariate rules to analyze a variety of data, including real-time physiological data from monitoring devices, ADT data, and retrospective EHR data. When its surveillance analytics identify trends in a patient’s condition that indicate deterioration, the system sends a “tap on the shoulder” to the clinicians caring for the patient.

For example, opioid-induced respiratory depression accounts for more than half of medication-related deaths in care settings. (2) Periodic physical spot checks by clinical staff can leave patients unmonitored up to 96 percent of the time. (3) By connecting bedside capnographs and pulse oximeters to an analytic platform to detect respiratory depression and instantly alert the right clinicians, continuous surveillance can shorten the interval between a clinically significant change and treatment of the patient’s condition.
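
To make the idea concrete, here is a minimal sketch of such a multivariate rule in Python. The thresholds and field names are illustrative assumptions, not clinical guidance or any vendor's actual logic; the point is that an alert fires only when the combined picture across devices is concerning.

```python
from dataclasses import dataclass

@dataclass
class VitalSample:
    spo2: float        # % oxygen saturation, from the pulse oximeter
    etco2: float       # end-tidal CO2 in mmHg, from the capnograph
    resp_rate: float   # breaths per minute, from the capnograph

def respiratory_depression_alert(sample: VitalSample) -> bool:
    """Return True when the combined picture warrants a 'tap on the shoulder'."""
    low_sat = sample.spo2 < 90
    abnormal_co2 = sample.etco2 > 60 or sample.etco2 < 15
    bradypnea = sample.resp_rate < 8
    # Require at least two concurrent indicators so that a single noisy
    # reading from one device does not page the care team.
    return sum([low_sat, abnormal_co2, bradypnea]) >= 2

# Usage sketch: evaluate each sample streamed from the device gateway.
if respiratory_depression_alert(VitalSample(spo2=88, etco2=62, resp_rate=6)):
    print("Notify the assigned nurse and respiratory therapist")
```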

A recent study found that compared to traditional patient monitoring and spot checks, continuous clinical surveillance reduced the average amount of time it took for a rapid-response team to be deployed by 291 minutes in one clinical example. In addition, the median length of stay for patients who received continuous surveillance was four days less than that of similar patients who were not surveilled. (4)

Another condition that requires early intervention is severe sepsis, which accounts for more than 250,000 deaths a year in the US. (5) The use of continuous clinical surveillance can help predict whether a patient’s condition is going to get worse over time. By aggregating data from monitoring devices and other sources and applying protocol-driven measures for septicemia detection, a multivariate rules-based analytics engine can identify a potentially deteriorating condition and notify the clinical team.

Reduction in Alarm Fatigue

Repeated false alarms from multiple monitoring devices often cause clinicians to disregard these alerts or arbitrarily widen the alarm parameters. Continuous surveillance can significantly reduce the number of alarms that clinicians receive.

An underlying factor that produces alarm fatigue is that the simplistic threshold limits of physiologic devices, such as patient monitors, pulse oximeters, and capnographs, are highly susceptible to false alarms. Optimizing the alarm limits on these devices and silencing non-actionable alarms is not enough to eliminate this risk. The challenge is to communicate essential patient information while minimizing non-actionable events.

Continuous clinical surveillance solutions that analyze real-time patient data can generate smart alarms. These alarms identify clinically relevant trends, sustained conditions, recurrences, and combinatorial indications that may signal a deteriorating patient before any individual parameter threshold is violated. In addition, clinicians can use settings and adjustment data from bedside devices to evaluate adherence to, or deviation from, evidence-based care plans and best-practice protocols.
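
As a rough illustration of the "sustained condition" idea, the following sketch alarms only when a low reading persists across an entire rolling window rather than on every momentary dip. The threshold and window length are hypothetical.

```python
from collections import deque

class SustainedLowSpO2Alarm:
    def __init__(self, threshold: float = 90.0, window_samples: int = 30):
        self.threshold = threshold
        self.window = deque(maxlen=window_samples)  # e.g., 30 one-second samples

    def add_sample(self, spo2: float) -> bool:
        """Record a new reading; return True only if the entire window is low."""
        self.window.append(spo2)
        return (len(self.window) == self.window.maxlen
                and all(v < self.threshold for v in self.window))

# Usage sketch: a brief dip does not alarm, a sustained desaturation does.
alarm = SustainedLowSpO2Alarm()
readings = [93, 89, 94] + [88] * 30          # one transient dip, then sustained
fired = [alarm.add_sample(r) for r in readings]
print(any(fired[:3]), any(fired[3:]))        # False True
```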

In a study done in a large eastern US health system, researchers sought to establish that continuous surveillance could alert clinicians to signs of OIRD more effectively than traditional monitoring devices connected to a nurse's station, without compromising patient safety. The results showed that a continuous surveillance analytic reduced the number of alerts sent to the clinical staff by 99 percent compared to traditional monitoring. No adverse clinical events were missed, and while several patients did receive naloxone to counter OIRD, all at-risk patients were identified early enough by the analytic to be aroused, avoiding the need for any rapid response team deployment. (6)

Clinical Workflow

When CIOs are considering a continuous clinical surveillance solution, they should look for a platform that fits seamlessly with their institution’s clinical workflow. This is especially important outside the ICU, where the staff-to-patient ratio is lower than in critical care. In these care settings, a solution that can be integrated with their mobile communication platform ensures that alerts will be received on a timely basis.

In addition, the continuous surveillance solution should have an open, standards-based architecture that integrates with the hospital's EHR, clinical data repository, and other applications. Especially in these times, it must also support strict security protocols as part of an overall cybersecurity strategy.

Clinicians are beginning to recognize that continuous clinical surveillance can help them deliver better, more consistent, more efficient, and safer patient care. In this respect, it reminds me of the timeframe after publication of the famous IOM report that highlighted the dangers of medication errors in the US healthcare system. Companies scrambled to provide a solution, and when automated medication administration was first introduced, the technology was unimaginably clunky. As many of us remember, COWs left the pastures and moved onto hospital floors.

I had the opportunity to watch clinicians who had significant doubts about bar coding and scanning try these new tools. It took only one patient encounter in which the technology helped them avoid dispensing an incorrect medication to turn a skeptic into an evangelist. They quickly realized their patients were safer because of the new technology. Clinicians will discover that continuous clinical surveillance offers the same type of patient safety benefits.

Eventually, hospitals will use continuous surveillance with acutely ill patients in all care settings. The ability of analytics to interpret objective physiological data in real time and enable clinical intervention for deteriorating patient conditions that could otherwise be missed is just too powerful to ignore.

REFERENCES

1. Giuliano, Karen K. “Improving Patient Safety through the Use of Nursing Surveillance.” AAMI Horizons. Spring 2017, pp 34-43.

2. Overdyk FJ, Carter R, Maddox RR, Callura J, Herrin AE, Henriquez C. Continuous Oximetry / Capnometry Monitoring Reveals Frequent Desaturation and Bradypnea During Patient-Controlled Analgesia. Anesth Analg. 2007;105:412-8.

3. Weinger MB, Lee LA. No patient shall be harmed by opioid-induced respiratory depression. APSF Newsletter. Fall 2011. Available at: www.apsf.org/newsletters/html/2011/fall/01_opioid.htm.

4. “Improving Patient Safety through the Use of Nursing Surveillance.”

5. Centers for Disease Control and Prevention, “Data & Reports: Sepsis.” https://www.cdc.gov/sepsis/datareports/index.html

6. Supe D, Baron L, Decker T, Parker K, Venella J, Williams S, Beaton L, Zaleski J. Research: Continuous surveillance of sleep apnea patients in a medical-surgical unit. Biomedical Instrumentation & Technology. May/June 2017; 51(3): 236-251. Available at: http://aami-bit.org/doi/full/10.2345/0899-8205-51.3.236?code=aami-site.

Readers Write: Analytics Optimization: Doing What It Takes

March 21, 2018 Readers Write 2 Comments

Analytics Optimization: Doing What It Takes
By Lee Milligan, MD

Lee Milligan, MD is VP/CIO of Asante of Medford, OR and a director of the governing boards of Asante, Oregon ACO, and Propel Health.

I recently surveyed a number of large and medium-sized integrated healthcare institutions in the Pacific Northwest with a focus on the analytics experience. I sought to answer one question: how do the operational and clinical end users perceive their experience of requesting and receiving information? I talked to CIOs, CMIOs, and directors of analytics.

Although the conversations touched on many concerns, three themes emerged that I now call the “Three Reporting D’s” – delay, distrust, and dissatisfaction. End users are just not getting what they need to do their jobs on time. Despite the adoption of sparkly analytics software products, the problems continue to fester.

We experienced a similar disconnect a few years back, and have, over the course of three years, re-architected our approach. Although we still have room for improvement, I’d like to share a bit about what we learned and how this reboot has led to a more satisfying end user experience.

We started the internal investigation by looking at the entire end-to-end experience from the customer’s perspective. Using a lean management technique known as value stream mapping, we drew out on a white board all of the steps that a typical end user would experience as they requested information from our analytics team. Surprisingly, this took quite a while and we ran out of white board space.

This was telling. Why does this process include so many steps? It reminded me of the 1990s Windows installations where the customer would continuously have to click “next” to move the process forward.

One of the keys of this lean technique is to identify the steps in the process that add value and eliminate the rest. We got stuck on the definition of value. What is valuable to the end user? When we honestly answered that question, a surprising number of steps were removed.

Next we asked, what's missing? That question required us to walk in the shoes of our customer, like a doctor seeing the world through the patient's lens. I also had the advantage of two additional frames of reference:

  • I personally requested that a report be built for me from scratch using the prior method, and
  • I asked the BI developers to CC me on all email communications between them and the customer.

Both experiences unearthed missing fragments, which ultimately informed our strategic BI architecture. Most of the changes we instituted were budget-neutral, process-related improvements. However, I would like to highlight two changes which cost a few bucks that delivered tremendous ROI.

Customer/BI Developer Partnership and Communication

We recognized fairly quickly that these relationships were in need of optimization. First, the customer rarely knows what they want. That’s not to say they can’t make a request. However, they frequently request what they don’t ultimately need or want.

Second, I discovered through those CC'd emails that customers were requesting many additional discrete elements, far beyond the initial scope, usually as they learned more about what the information looked like. In other words, they didn't fully understand what they were looking for, and we were unprepared to fully discover with them what they ultimately needed.

To plug that hole, we instituted a new position within our team, the clinical data analyst. Akin to the business analyst in the corporate world, this role is responsible for working directly with the end user to accomplish two goals: (a) to fully understand the ask and capture it in the agreed-upon scope, and (b) to commit the requestor to actively participating in the data development process.

Also, our team of BI developers wanted guidance on how they should communicate with our end users. We had naively taken that piece for granted. They requested clear direction on how to frame conversations, how to respond to requests that fall outside the agreed-upon scope, and how to ask better questions about the initial request.

Teaching/Training

We surveyed our customers and discovered something astonishing. Many are not using the reports and data that we have delivered. When pressed, it became clear that many did not fully understand the information produced and even fewer understood how to incorporate this data into their workflow to better inform operational decision-making.

We developed a new role as a principal trainer within ITS-Analytics. The goals of this role are twofold: (a) to work directly with end users to assure a full and practical understanding of the delivered information (i.e., how to read the report, what the symbols mean, how to navigate an analytics dashboard, etc.), and (b) to lead our self-service domain. The self-service aspect has significant potential to meet customers' needs in a rapid, nimble fashion.

Putting it all together, our take-home lesson has been the criticality of performing regular internal assessments to verify that we are meeting our customers' needs, both objectively and subjectively, from their point of view.

Readers Write: It’s Time for Drug Price Transparency

February 14, 2018 Readers Write 9 Comments

It’s Time for Drug Price Transparency
By Stanley Crane

Stanley Crane is the chief technology officer of InteliSys Health of San Diego, CA.

EHR vendors face a tough challenge in deciding which new features to develop and integrate for their next release and which ones to leave on the cutting room floor. The benefits of each potential enhancement must be weighed against the costs, usually measured in programming time. Moreover, features required for Meaningful Use and MIPS must be included, making the triage even more difficult.

However, EHR companies are missing the boat if they neglect to add a feature that could have a massive impact on their clients' patients. I am speaking here of prescription drug pricing comparisons, built directly into the EHR workflow of prescribers.

We’ve heard a lot about drug price transparency lately. But the public discussion hasn’t come close to the truth.

There are vast differences in the prices pharmacies charge for the same drug from the same manufacturer within the same geographical area. For example, the price of generic Plavix (clopidogrel) ranges from $6.16 at one pharmacy in Aurora, CO to an amazing high of $150.33 at another pharmacy just a few steps away. That’s the equivalent of a gas station charging $72 per gallon for unleaded regular when a station across the street is asking $2.95. This is merely one of literally millions of examples of the absurd variation in retail drug prices.

Most doctors and patients are unaware that retail drug prices vary by so much. As a result, many patients go to the pharmacy, get hit with sticker shock, and walk out without picking up their medication. Others pay far more than they should for the drug because they’re unaware of widespread price variance.

A handful of companies now sell prescription drug price comparison tools directly to consumers. These haven't had much impact, however, partly because not many people know about them and partly because it's too complicated for patients to move their prescriptions to another pharmacy.

Imagine how the situation would be different if a patient’s own doctor could tell him or her what their medications would cost at different pharmacies, regardless of whether the patient has insurance.

What our healthcare system needs today is a modern price comparison tool that is integrated with an e-prescribing tool, ideally within an EHR. The range of prices for a particular drug would appear on the prescribing screen within milliseconds of a physician selecting that medication. Using real-time pricing data from pharmacies, the software could show the cost of that drug at the closest pharmacies to the doctor’s office or the patient’s home or workplace. None of this information is available via EHRs on the market today.

Such a solution could use the patient's insurance information in their doctor's EHR, as well as search health plan databases, to determine a patient's out-of-pocket cost (after factoring in deductibles, co-payments, and out-of-pocket maximums). If the patient is on the hook for the cost, whether because of a high deductible, a high co-pay, or because he or she is uninsured, the software could show the cash price of the medication. It could also indicate whether the cash price is lower than the co-payment under the patient's plan, ensuring that the patient pays the lowest price each time.
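
A simplified sketch of that selection logic could look like the following. The quote structure, prices, and pharmacy names are illustrative assumptions; a real implementation would pull them from pharmacy pricing feeds and the patient's benefit plan.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PharmacyQuote:
    name: str
    cash_price: float
    copay: Optional[float]   # None when the patient has no coverage for the drug

def out_of_pocket(quote: PharmacyQuote) -> float:
    """The patient pays the cash price or the plan copay, whichever is lower."""
    if quote.copay is None:
        return quote.cash_price
    return min(quote.cash_price, quote.copay)

quotes = [
    PharmacyQuote("Pharmacy A", cash_price=6.16, copay=10.00),
    PharmacyQuote("Pharmacy B", cash_price=150.33, copay=10.00),
]
# Sort so the prescriber can offer the cheapest convenient option first.
for quote in sorted(quotes, key=out_of_pocket):
    print(f"{quote.name}: ${out_of_pocket(quote):.2f}")
# Pharmacy A: $6.16   (the cash price beats the copay)
# Pharmacy B: $10.00
```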

At the patient’s choice, the doctor could then send the e-prescription to the most convenient pharmacy that charges the lowest price for that drug. If the price is still too high for the patient, the software could automatically analyze the selected drug against therapeutically equivalent alternatives, enabling the doctor to prescribe a lower-cost alternative, again comparing the prices at local drugstores.

Transparency in prescription drug pricing offers several benefits. Patients are likely to have better outcomes if they fill their prescriptions and adhere to their prescribed therapy. Physicians can garner higher quality scores if their patients take their meds and control their chronic conditions. Lastly, if price transparency becomes widespread, some pharmacy chains will be forced to lower their prices to avoid losing customers to lower-priced stores or chains. If that happens, the whole system benefits, including patients, plans, employers, and taxpayers.

Readers Write: In Defense of Bob Dolin

February 10, 2018 Readers Write 23 Comments

This comment was provided as a response to discussion about whether former Kaiser physician and HL7 chair Bob Dolin, MD should be allowed to return to industry work after serving a prison sentence for possession of child pornography.

I appreciate Mr. HIStalk's comment that Bob should be able to work. Not only should he be allowed to work, he is obligated to return to being a useful, productive member of society, not just from my perspective, but from the government's perspective.

I know more intimately than any of you what the real situation is and was. I am his wife. So much for anonymity.

Many of you know me. I am a strong, independent woman, dependent on no one. Someone who not only hates child pornography and the implications of what that means for these children, but one who also despises “regular” pornography and the industry’s encouragement for participants to descend into child pornography (think “barely legal”). I also recognize that most men have participated in viewing pornography, especially men in unhappy marriages. But I don’t hate them or think they are sick — they are just unhappy.

Why have I stayed with Bob? Why do I encourage him to do the work he loves and to which he has made such great contributions? Let me tell you the reasons.

Bob is not a depraved, sick person. He never inappropriately touched any child. He is as far away from being a misogynist as any man I know. Likely, he has been far more monogamous and faithful than most of you.

While you might surmise that children were harmed because he downloaded a few zip files in one period in his life over 10 years ago, it is highly unlikely. There is no empirical evidence for that. Again, I am not asserting in any way that this was OK.

I have been with Bob nearly 24/7 since shortly after this was discovered. In fact, I believe our relationship and strong marriage has been a primary healer. Bob desires only the intimacy that a special love such as ours provides.

What I am firmly asserting is that he is not sick or depraved. I am stating that, back 10 years ago, as his previous marriage was ending, he was in a bad spot. Did he go out and rape anyone or touch any child? Did he even have affairs? No. He withdrew into himself and escaped by viewing "regular" pornography, and unfortunately purposefully downloaded some child porn. There was no money exchanged.

In regards to “infants and toddlers being sadistically abused,” I challenge you to find an ICE arrest announcement (that’s the branch of government that deals with child pornography cases) that does not say, “XXX number of child pornography pictures were found, including infants and toddlers being sadistically abused.” Simply, that is what is in those zip files. How do I know? I was told that by lawyers who specialize in this area. The media (and ICE) love to emphasize this aspect. Whether or not the offender actually spent any time looking at these images is unknown in most cases.

I wish ICE would spend more time on finding and prosecuting the abusers and creators of child porn (usually family members) than on the easy targets of introverted adult males. For that matter, how is it possible that such pictures can even be uploaded? Surely we have the technology to recognize them and prevent it.

After extensive testing, examination, and interviews, Bob was not deemed a danger to society. Exam after exam has revealed him to have made a single mistake in an otherwise exemplary life. Not only that, it was about five years from when the forensics were done on his laptop to when the feds decided to prosecute. We assumed during this time there was nothing worth prosecuting for – they must have had far more pressing cases to deal with.

Bob’s friends, family, and many colleagues are happy to see Bob back contributing his brilliant mind to the industry. They recognize the price he has paid. For those of you who are appalled that he dare be a contributing member of society and HL7, and will quit if Bob continues to go to HL7 meetings after downloading child pornography 10 years ago, spending 2.5 years in federal prison, and losing his career, I encourage you to grow up, act like a mature adult, and think about the logic of that.

To quote a famous Rabbi, "And why beholdest thou the mote that is in thy brother's eye, but considerest not the beam that is in thine own eye?" Are you perfect? Even if your sins are not as severe as you have judged Bob's to be, as they may not be, and you have taken it upon yourselves to so severely damn him, I ask you to examine yourselves, your motives, and your personal issues.

Lastly, think of who you are hurting besides Bob. You are hurting me to the core. You are damaging my ability and desire to participate as a useful member of society. You are making me question nearly everyone at HL7 as to whether they have been two-faced to me these past few years, during which I have remained a successfully contributing HL7 member by myself.

I won’t abandon Bob because of this. He is a good man, the best man I know, who made a bad mistake over 10 years ago.

Readers Write: Healthcare CIO Tenure Trends

February 5, 2018 Readers Write No Comments

Healthcare CIO Tenure Trends
By Ranae Rousse

Ranae Rousse is VP of sales for Direct Consulting Associates of Solon, OH.

Last year, while I was supporting one of the many local HIMSS chapter events, a keynote speaker presented a statistic that caught my attention. The speaker was presenting on the rise of cybersecurity threats to healthcare. The first slide in his well-constructed PowerPoint presentation had a bolded "17 months" in a font size of about 200. The gentleman then shared with the attendees, most of whom were CIOs, that 17 months is now the average tenure for a chief information officer.

I asked for the source of the 17-month statistic and found that it was for CISOs rather than CIOs and it was also not specific to healthcare. I decided to do my own research with an independent survey of 1,500 healthcare CIOs. The results:

  • The average tenure for a healthcare CIO is 5.5 years, with the range from five months to 23 years.
  • 37 percent of respondents were not healthcare CIOs in their previous jobs. Those who were tended to have longer tenure in their previous CIO positions.
  • 44 percent of the respondents said they don’t have a succession plan. Those respondents also did not have a requirement to appoint a successor.
  • 69 percent intend to retire as a healthcare CIO, although 11 percent say they would pursue a COO/CEO role and the remaining 20 percent were split equally between moving to a consulting job and leaving healthcare.

Increases in mergers, acquisitions, and hospital closures between 2008 and 2017 reflect a loss of roughly 280 hospitals, so the number of CIO positions is decreasing. The perception of the CIO role itself has changed from being a senior IT leader to becoming a higher-level healthcare executive, opening the door for the role of the associate CIO in many large health systems.

Considering this ever-changing landscape, what trends can we anticipate for the future?

Readers Write: The Secret to Engaged Physicians at Go-Live: Personalize the EHR

February 5, 2018 Readers Write 1 Comment

The Secret to Engaged Physicians at Go-Live: Personalize the EHR
By Dan Clark, RN

Dan Clark, RN, MBA is senior vice president of consulting at Advisory Board.

I often compare an EHR implementation and go-live to getting a new smart phone. Out of the box, it’s a powerful tool, but it doesn’t truly become effective until you start to download applications, add your email and contacts, and pick a personal picture as your background.

Just like your new smart phone, EHRs aren’t ready to perform at their best out of the box and always require some degree of personalization. EHR personalization may sound like one more step in a long, multi-staged implementation and go-live, but it can often be the difference between adoption and rejection.

New technology will always be a disruption, but personalization can minimize a new EHR’s negative impact on patient care by matching new tech to existing clinical workflow, not vice versa. While it’s important to focus on “speed to value” with a new EHR, health systems that take the time to personalize workflows for specialties and individual providers typically see a much higher rate of adoption and a quicker return to pre-go-live productivity.

Health systems should consider a multi-layered approach to personalization. At the very least, health systems should design technology that aligns the EHR to serve high-level strategic goals, such as quality reporting and provider productivity expectations.

When it comes to the individual user level, almost every health system starts with didactic classroom trainings that may combine users from a variety of different clinical and administrative areas. While this is a good baseline, it's challenging to teach a course that applies to doctors and nurses, front office staff, and revenue cycle staff alike. Physicians, specifically, report that these sessions take time away from their patients and don't always provide the value they hope for.

Because of this, one-on-one opportunities for personalization are most efficient and have the biggest impact. I typically see health systems tackle one-on-one personalization support in a couple of ways. The first is setting up a personalization lab. Prior to go-live, we set up a 24/7 personalization lab right in the physician’s office or hospital. This gives clinicians the opportunity to stop in with ad hoc questions, or better yet, make a formal appointment with a clinical EHR expert. These sessions are guided by an extensive checklist of EHR personalization options, fine-tuning everything to the clinician’s preference and specialty.

One orthopedic surgeon came back to the personalization lab four times, and that was after she had already completed the classroom training. We worked with her to personalize specific workflows, order sets, and even simple things like page setup in the EHR.

Personalization serves as just-in-time training and is usually well received by the clinicians. Sometimes this training takes the form of a mobile workstation in the hallways that caters to clinicians’ in-the-moment questions during their breaks and doesn’t pull them away from patients. This kind of assistance is also usually well received by clinicians since it gives them a chance to ask a question about a real patient scenario.

The trick to getting EHR and go-live training right, in any scenario, is to provide the right support: other clinicians who will stand at the elbow with providers as they navigate real scenarios and issues. Staffing your personalization lab with clinicians will give you the best bang for your buck, providing your staff with both clinical and technical expertise. Trainers who combine EHR acumen with clinical expertise and knowledge of appropriate workflows can help clinicians hard-code best practices into the technology in a way a purely technical expert may not.

EHR go-live is an anxiety-ridden time for all health system staff, clinicians and non-clinicians alike. It’s important that all staff feel they have the support, training, and preparation to use the EHR to its fullest potential to impact patient care.

Readers Write: How IT Professionals Can Work More Effectively with Physicians

January 31, 2018 Readers Write 6 Comments

How IT Professionals Can Work More Effectively with Physicians
By Stephen Fiehler

Stephen Fiehler is IS service leader for imaging and interventional services at Stanford Children’s Health in Palo Alto, CA.

Be Agile – Work Around Their Schedule

Stop inviting orthopedic surgeons to your order set review meeting from 2:00 to 3:00 p.m. on Wednesday at your offsite IT department building. That is not a good use of their time. And good luck getting them to log in and pay attention to your GoToMeeting from 10:00 to 11:00 a.m. on Thursday.

Some electrophysiologists I work with are only available at the hospital at 7:00 a.m. on Tuesdays or Thursdays. I get there at 6:45 a.m. and have everything ready to go when they walk in the room so we can get through as much content as possible. The best time to meet with an invasive cardiologist is in the control room between cases. When I need to validate new content with them, I wear scrubs and work from a desk in the control room for half a day to get a cumulative 30 minutes of their time. This way, if cases run late, they can get home to their family at 8:00 p.m. instead of 9:00.

As long as I have my laptop, my charger, and an Internet connection, I can be productive from any location that works best for the physicians. Their time is more valuable than mine. Every hour I take them away from patient care means less revenue for the hospital and fewer kids getting the medical treatment they need.

There are physicians who have the bandwidth to spend more time with us on our projects, but it is imperative that we not expect it of them.

Be Brief – Keep Your Emails Short and Concise

Review your emails to physicians before sending them. You could probably communicate as much, if not more, with half the words.

When I was at Epic, one of the veteran members on the Radiant team had a message on his Intranet profile instructing co-workers to make emails short enough that they could be completely read from the Inbox screen of the iOS Mail app. Any longer, and you could assume he would not read or reply.

If an email has to be long, bold or highlight your main points or questions. Most physicians have little time to read their email. Show them you value their time and increase the likelihood that they will read or reply to your message by keeping it concise. Writing shorter emails helps you waste less of your own time as well.

Also, use screenshots with pointers or highlighted icons when appropriate. They might not know what a “toolbar menu item” or a “print group” is.

Be Service-Minded – Do Not Forget IT is a Service Department

The biggest mistake a healthcare IT professional can make is forgetting that we are a service department. The providers, staff, and operations are our customers. It is our job to provide them with the tools they need to deliver the best patient care possible. That is why the IT department exists.

Given the complexity of our applications, integration, and infrastructure, it is tempting to forget that we are not the main show. Whether we like it or not, we are the trainers, equipment managers, and first-down marker holders, whereas the providers are the quarterbacks, wide receivers, and running backs.

By focusing on providing the best service possible, you will implement better products and produce happier customers. At the end of the day, we want to be effective and to have a positive impact on the organization. The best way to do that is through being service-minded.

Readers Write: If I Were the Health IT King: A Royal Perspective on 2018 Trends

January 10, 2018 Readers Write 2 Comments

If I Were the Health IT King: A Royal Perspective on 2018 Trends
By Jay Anders, MS, MD

Jay Anders, MS, MD is chief medical officer of Medicomp Systems of Chantilly, VA.

If I were king of health IT, I would find great joy in sitting at the head of a banquet table before all my subjects, casting judgment on the most current health IT trends. Like the king in Bud Light's recent commercial series, I'd love to lead a hearty "dilly dilly" cheer for innovations that make it easier for physicians to practice medicine, while banishing the less worthy trends to the "pit of misery."

Health IT king or not, I see the following 2018 health IT-related trends falling into two distinct buckets.


Deserving Dilly Dilly Cheers

Interoperability

At long last, health systems seem to be accepting the inevitability of interoperability. Organizations are resigned to the fact that it's no longer reasonable to refuse to share patients' clinical records with cross-town competitors. Instead, everyone needs to work together to make systems talk. The growing acceptance of standards such as FHIR is also helping to advance interoperability efforts. I predict significantly more progress in this area over the next three to five years.

Collaboration with Physicians

More health IT companies are seeking input from physician users as they design, build, and test their solutions. Vendors are realizing that the creation of user-friendly clinical interfaces can no longer be an afterthought and that the delivery of physician-friendly solutions must be a priority. By collaborating with physicians, vendors better understand required clinician workflows, existing bottlenecks, and the processes that are critical to patient safety.

For example, physicians can provide insights into common clinician thought processes and clarify why one workflow may be preferred over another. Physicians understand what tasks are traditionally performed by a medical assistant, how long a particular procedure might take, and when and why a clinician cannot be looking at a computer screen. By embracing physician collaboration, health IT companies are better-equipped to create innovative solutions that work and think like physicians and enhance provider satisfaction.

Shared Chart Ownership

Not too many years ago, most people — including patients — believed that each physician owned his or her own patient charts. That mindset is changing, and today, most providers and patients realize that everyone involved in a patient’s care — including the patient’s family — needs to share clinical data. The growing recognition that information must flow seamlessly between caregivers is a huge step in the right direction and advances industry efforts to get the right information to the right person at the right time.


Banished to the Pit of Misery

Data Dumping

More data is being exchanged between providers thanks to better interoperability tools and growing enterprise acceptance. Unfortunately, many organizations continue to struggle to figure out what to do with all the data. More health systems have the ability to dump buckets of data on providers, yet few physicians have the tools to efficiently organize the data into actionable information that enhances patient care. Don’t look for any widespread fixes in the short term.

Administrative Burdens

Healthcare still has not figured out how to reduce the administrative burdens of practicing medicine. Physicians continue to be frustrated and disillusioned with their careers, thanks to ever-changing regulatory and reimbursement requirements that require adjustments to clinical workflows. Don’t expect big improvements any time soon, nor major legislation that streamlines existing healthcare policies and regulations. Instead, physicians will be forced to continue addressing numerous tasks that distract from the delivery of patient care.

AI Hype

Despite all the hype, don't look to artificial intelligence and machine learning technologies to solve all the industry's data and reporting problems. The bottom line is that these technologies are still insufficiently mature for healthcare applications. Providers would of course love solutions that leverage natural language processing (NLP). AI will have the ability to convert dictated chart notes to free text and free text to data that is actionable for clinicians. Unfortunately, the error rates for converting speech to text to data are, at best, between eight and 10 percent. Give these technologies at least two to three more years before they're able to truly enhance clinical decision-making at the point of care, move out of the pit of misery, and earn dilly dilly cheers.


Ah, if only I were the Health IT King and had the power to fix inefficient systems that impair clinician productivity. I cheer dilly dilly to all who seek to embrace the knowledge and expertise of physicians to deliver highly usable solutions. I am confident that their efforts will make physicians happier and more productive and enhance the delivery of quality patient care.

Readers Write: Finding the Elusive Insights to Improve Surgical Outcomes

December 20, 2017 Readers Write 1 Comment

Finding the Elusive Insights to Improve Surgical Outcomes
By Dennis Kogan

Dennis Kogan, MBA is co-founder and CEO of Caresyntax of Boston, MA.

America’s operating rooms have an international reputation for driving surgical innovation. But they are also the setting for high variation in performance, as evidenced by the fact that 10 percent to 15 percent of patients experience serious post-surgery complications. This means millions of patients are at risk, yet insight into the root causes of performance variation remains an elusive “black box.” In the absence of this understanding, some hospitals cite the uniqueness of their patient cohorts as the primary driver of variation.

That has the unsettling ring of blaming the patient for his or her subsequent complications. Further, it raises the question of whether the hospital has a reliable risk stratification methodology for its patient cohorts, and if not, why not? We can predict the reason, and it's a valid one: risk stratification at scale depends on data insights, and most perioperative data (a full 80 percent of it) is either uncaptured or unstructured.

To establish perioperative best practices, hospitals first need to harness the massive volume of data where actionable insights currently hide. With the convergence of IoT medical technology and healthcare analytics, they finally can.

Significant workflow enhancements can be made, for example, via performance analytics that consume structured preoperative and postoperative data from the EMR, surveys, and patient outcome assessments. But real actionability is made possible with the addition of point-of-care data acquired within the operating room itself, largely from various connected medical devices. Combined with structured preoperative and postoperative data, this provides clinicians with both aggregated and granular data visibility. Now equipped with the full clinical picture, clinicians can focus on putting the data into action.

Circling back to risk stratification, let's take a closer look at how this works. First, providers must document an individual patient's risk factors. Next, using a validated risk calculator, a personalized risk assessment can be created (and communicated to the patient). That assessment should then be included in an aggregation of patient risk assessments. From this collection of data, along with other sources that include data pulled during the patient's surgery, automated risk stratification reports can be made immediately available to ICU managers to help prioritize and tailor recovery pathways. These reports could also indicate complication risk and compliance percentages versus targeted benchmarks.
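
As a rough sketch of that aggregation step, the stratified report an ICU manager sees could be produced along these lines. The score bands and field names are assumptions, not a validated methodology.

```python
from collections import defaultdict
from typing import Dict, List

def stratify(score: float) -> str:
    """Map a 0-1 complication-risk score to a band (cut points are assumptions)."""
    if score >= 0.30:
        return "high"
    if score >= 0.10:
        return "medium"
    return "low"

def risk_report(assessments: List[Dict]) -> Dict[str, List[Dict]]:
    """Group patient risk assessments by band, highest risk first within each band."""
    report = defaultdict(list)
    for assessment in assessments:
        report[stratify(assessment["risk_score"])].append(assessment)
    for band in report:
        report[band].sort(key=lambda a: a["risk_score"], reverse=True)
    return dict(report)

assessments = [
    {"patient_id": "P1", "procedure": "colectomy", "risk_score": 0.42},
    {"patient_id": "P2", "procedure": "hernia repair", "risk_score": 0.05},
    {"patient_id": "P3", "procedure": "hip replacement", "risk_score": 0.18},
]
print(risk_report(assessments)["high"])   # P1 surfaces first for closer follow-up
```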

All patients are inherently unique, but that doesn’t mean most of the variation in surgical outcomes or costs is unavoidable. In fact, a significant amount of variation can be reduced by meeting targeted benchmarks—say, for reducing infection, readmissions, length of stay, or even amount of pain experienced post-surgery. These benchmarks and best practices can be crystalized after aggregating and analyzing procedure and surgical documentation, such as reports, vital charts, videos, images, and checklists.

One strategy used in operating rooms around the world is to automate the collection and aggregation of operating room video recordings with key procedure data, including some of the above-mentioned checklists and vitals data. Advanced technology can also retrieve surgical videos and images from any operating room integration system. Once surgery and vitals are recorded in a synchronized way, it becomes possible to identify and create a standard protocol that can go into a pre- or post-operative brief.

An additional use for this data includes streamlining post-operative report building, especially for payer reporting and internal quality initiatives. While there is a little time left to report 2017 data for the first official year of MACRA MIPS, this will be a continuing need.

Pre-operative risk scoring is sporadic at best, again, due to the lack of an ability to harness the necessary data. But the same data aggregated to create benchmarks and best practices can be used to create robust and highly accurate risk scoring to see what the possible harm could be to a surgical patient. In parallel, protocols also identified from the data can help to mitigate this risk.

In a hypothetical example, perhaps in one hospital more than 11 percent of patients undergoing non-cardiac surgery experience post-op infection. Predictive analytics reveal that the number of times certain thresholds were crossed during surgery correlates with outcome measures. Evidence from this research can be incorporated into a decision support system that monitors the patient's score and sends alerts when care plans are veering off course. Reductions in infections, and in the corresponding length of stay and readmissions, soon follow.

Persistent opacity into root causes of variation is untenable. Quality-based reimbursement programs such as MACRA MIPS rely heavily on analytics of surgical performance, with a full 60 percent weight given to quality. Meanwhile, patients are aging and becoming frailer, which could push post-surgery complication rates even higher than they are now.

Clearly it is time to innovate not just how we perform surgery, but also how we improve performance.

Readers Write: Almost Real, But Not Quite: Synthetic Data and Healthcare

December 20, 2017 Readers Write No Comments

Almost Real, But Not Quite: Synthetic Data and Healthcare
By David Watkins

David Watkins, MS is a data scientist at PCCI in Dallas, TX.

We all want to make clinical prediction faster and better so we can rapidly translate the best models into the best outcomes for patients. At the same time, we know from experience that no organization can single-handedly transform healthcare. A wealth of information hidden in data silos across the healthcare landscape could help demystify the complexities around cost and outcomes in the United States, but a lack of transparency and collaboration, driven by privacy and compliance concerns around those silos, has made data access difficult, expensive, and resource-intensive for many innovators.

Until recently, the only way to share clinical research data has been de-identification, selectively removing the most sensitive elements so that records can never be traced back to the actual patient. This is a fair compromise, with some important caveats.

With any de-identified data, we are making a tradeoff between confidentiality and richness, and there are several practical approaches spanning that spectrum. The most automated and private method, so-called “Safe Harbor” de-identification, is also the strictest about what elements to remove. Records de-identified in this way can be useful for many research cases, but not time-sensitive predictions, since all date/time fields are reduced to the year only.
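
A toy example shows why. Assuming hypothetical field names and only a small subset of the Safe Harbor identifier categories, date/time fields collapse to the year and the timing signal a time-sensitive model needs is gone.

```python
from datetime import datetime

DIRECT_IDENTIFIERS = {"name", "mrn", "address", "phone"}   # a small subset, for illustration

def safe_harbor_like(record: dict) -> dict:
    """Drop direct identifiers and reduce date/time fields to the year only."""
    deidentified = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    for field in ("admit_time", "discharge_time"):
        if field in deidentified:
            deidentified[field] = datetime.fromisoformat(deidentified[field]).year
    return deidentified

print(safe_harbor_like({
    "name": "Jane Doe", "mrn": "12345",
    "admit_time": "2017-03-02T14:30:00", "heart_rate": 96,
}))
# {'admit_time': 2017, 'heart_rate': 96} -- the hour-level timing needed for
# time-sensitive prediction is gone.
```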

At the other extreme, it is possible to share more sensitive and rich data as a “Limited Data Set” to be used for research. Data generated under this standard still contains protected health information and can only be shared between institutions that have signed an agreement governing its use. This model works for long-term research projects, but can require lengthy contracting up front and the data is still locked within partner institutions, too sensitive to share widely.

What’s a novel yet pragmatic solution for ensuring that analytics advancement is catalyzed in the healthcare industry? We are exploring “synthetic data”: data created from a real data set to reflect its clinical and statistical properties without exposing any identifying information.

Pioneering work is being done to create synthetic data that is clinically and statistically equivalent to a real data source without recreating any of the original observations. This notion has been around for a while, but its popularity has grown as we’ve seen impressive demonstrations that implement deep learning techniques to generate images and more. If it’s possible to generate endless realistic cat faces, could we also generate patient records to enable transparent, reproducible data science?

The deep learning approach works by setting up two competing networks: a generator that learns to create realistic records and a discriminator that learns to distinguish between real and fake records. As these two networks are trained together, they learn from their mistakes and the quality of the synthesized data improves. Newer approaches even allow us to further constrain the training of these networks to match specific properties of the input data, and to guarantee a designated level of privacy for patients in the training data.
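
For readers who want to see the shape of that approach, here is a minimal, generic sketch of a generator/discriminator pair in PyTorch. The feature dimensions and training data are stand-ins, not our models or our data.

```python
import torch
import torch.nn as nn

N_FEATURES = 8    # e.g., a handful of numeric vitals/labs per record (assumption)
NOISE_DIM = 16    # size of the random input the generator starts from

generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 64), nn.ReLU(),
    nn.Linear(64, N_FEATURES),
)
discriminator = nn.Sequential(
    nn.Linear(N_FEATURES, 64), nn.ReLU(),
    nn.Linear(64, 1), nn.Sigmoid(),   # probability that a record is real
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

def train_step(real_batch: torch.Tensor) -> None:
    """One adversarial round: update the discriminator, then the generator."""
    batch_size = real_batch.size(0)
    real_labels = torch.ones(batch_size, 1)
    fake_labels = torch.zeros(batch_size, 1)

    # Discriminator: learn to separate real records from generated ones.
    fake_batch = generator(torch.randn(batch_size, NOISE_DIM)).detach()
    d_loss = (loss_fn(discriminator(real_batch), real_labels)
              + loss_fn(discriminator(fake_batch), fake_labels))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator: learn to produce records the discriminator labels "real".
    fake_batch = generator(torch.randn(batch_size, NOISE_DIM))
    g_loss = loss_fn(discriminator(fake_batch), real_labels)
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# Usage sketch: real_data stands in for a normalized, de-identified feature matrix.
real_data = torch.randn(256, N_FEATURES)
for _ in range(100):
    train_step(real_data)
```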

We are investigating state-of-the-art methodologies to evaluate how effective the available techniques are at creating such data sets. We are also devising strategies for overcoming technological and scientific barriers in order to open up an easily accessible, realistic data platform that enables an exponential expansion of data-driven solutions in healthcare.

Can synthetic data be used to accelerate clinical research and innovation under strong privacy constraints?

In other data-intensive areas of research, new technologies and practices have enabled a culture of transparency and collaboration that is lacking in clinical prediction. The most impactful models are built on confidential patient records, so sharing data is vanishingly rare. Protecting patient privacy is an essential obligation for researchers, but privacy also creates a bottleneck for fast, open, and broad-based clinical data science. Synthetic data may be the solution healthcare has been waiting for.

Readers Write: Report from AWS Re:Invent

December 4, 2017 Readers Write No Comments

Report from AWS Re:Invent
By Travis Good, MD

Travis Good, MD, MS, MBA is co-founder and CEO of Datica of Madison, WI.

AWS Re:Invent has become one of the most important technology conferences of the year due to the sheer size of the Amazon Web Services cloud and the rate of technology innovation announced. The influence of the conference on health IT has grown over the years as well.

Cerner

There was not an industry-shattering Cerner announcement as was rumored in the CNBC article the week prior. Cerner held a session focused on a few interoperability topics that was well received by the deeply technical audience, but no such announcement was made during its session or the daily keynotes. We bumped into a few Cerner individuals at the event who all commented that they are excited about the future capabilities of AWS's international regions. International expansion is a priority lately across many health IT vendors, and based on the Cerner conversations we had at the event, it appeared both Cerner and AWS have similar ambitions.

The amount of money Cerner makes on managed services (which can be largely interpreted as hosting) and support and maintenance (which one can presume has a large amount of hosting-related support) dwarfs the company's revenue from licenses and subscriptions. The international market is the greatest area of growth for its core revenue model, but international data centers are exponentially harder to build and maintain yourself versus co-locating in the US. Cerner has built and maintained its own data centers nationally.

There are legs to the rumored CNBC story as well as credibility to the other rumors around population health-related partnerships, but the best insight from Re:Invent we can lend is that any rumored partnership is much more about hosting management than it is about APIs or cloud-based data interoperability.

Compliance

Without question, compliance and security were the two most important topics at the conference. Simply charting the messaging from vendors demonstrates the point: at least twice as many vendors were touting compliance and security management tools, while at least half as many vendors were there to market developer empowerment tools. It's as if the cloud grew up into an enterprise option in the last 12 months.

This is also backed by our observation of the number of C-suite attendees at the event. Supposedly the attendee count jumped from 30,000 last year to 45,000 this year, a figure that was floated often throughout the conference. If true, from our vantage point, the 15,000-person increase was largely a jump in "suits" who were there to evaluate how to make this cloud thing work rather than developers who are already leveraging the cloud for projects.

As such, compliance and security were the buzz among serious enterprise and healthcare buyers, while the general zeitgeist among developers was machine learning and artificial intelligence. But, as we all know, health IT is always woefully behind!

HIPAA, HITRUST, GDPR, GxP, FedRAMP, and others were the topics we continuously heard discussed. Interestingly, there are surprisingly few options to help truly manage these complex compliance frameworks on AWS. Ultimately, the sentiment we gathered across the healthcare landscape is that no one is really helping, especially with HITRUST, GxP, or GDPR. No one had a true GDPR message or product. (Datica will be GDPR ready in Q1 2018.)

AI, ML, and AWS Services

John Moore from Chilmark Research once told us that he goes to Health 2.0 to see what's going to happen and HIMSS to see what's already happened. Re:Invent has characteristics similar to Health 2.0.

The pace of innovation and accessibility to digital health developers is so fast that changes to health IT products are going to become ever more rapid despite the industry's best efforts to slow them down. The sense that the AI revolution is just around the corner was one of the strongest impressions from Re:Invent. Because more AI tooling is being made available to health IT developers on AWS's cloud, better products that more adeptly address patient care and reduce costs are going to arrive at an ever faster pace. It's going to be an interesting next few years.

Readers Write: The Challenges (and Benefits) of Anesthesia Data Capture

November 29, 2017 Readers Write No Comments

The Challenges (and Benefits) of Anesthesia Data Capture
By Douglas Keene, MD

Douglas Keene, MD is chairman and founder of Recordation of Wayland, MA and an anesthesiologist and co-founder of Boston Pain Care Center.

As part of the American Recovery and Reinvestment Act of 2009, hospitals and clinics were required to demonstrate conversion to electronic medical records (EMRs) by the end of 2014. However, despite government incentive programs totaling in the billions, the program initially faced a myriad of hurdles and proved harder to implement than initially anticipated. Fast-forward to nearly a decade later and the initiative is back on track, with over 90 percent of healthcare facilities using EMRs as their universal standard.

With that said, one segment of the healthcare market has lagged in EMR adoption: anesthesia care providers and anesthesia information management systems (AIMS). Despite the critically important role the operating room plays in a hospital's ecosystem (it is typically the source of about 60 to 70 percent of a hospital's revenue), the majority of healthcare facilities have been hesitant to make substantial monetary investments in AIMS.

To bring the EMR revolution out of the doctor's office and into the OR setting, physicians must reflect on the factors that have led to slow AIMS adoption and consider the key features and components needed for physicians and administrators to overcome these implementation hurdles.

Anesthesiology departments have grappled with many of the same challenges initially faced by healthcare facilities looking to adopt EMRs. These include reluctance to share information with competitors, software from different vendors that can’t interoperate or communicate, lengthy and complex implementation phases, and the overall high price tag of such systems.

In addition to these obstacles, AIMS adoption faces an even more challenging hurdle: adoption inertia among anesthesia providers. While all EMR software faced some initial skepticism from healthcare providers in general, this aversion has been far more vehement among anesthesia care teams for several important reasons stemming from the complexity of real-time anesthesia-related documentation.

Early AIMS were difficult to learn to use and implement. They relied upon larger, expensive computers with relatively lower processing power and faced challenges with interfacing reliably with anesthesia equipment and hospital information systems. Anesthesia workflow and efficiency often worsened with the introduction of early AIMS technology.

Advances in computer technology and interface design have improved some aspects of the overall user experience. However, the drawbacks from early AIMS still linger in the minds of many anesthesia providers.

While many academic and larger surgical facilities have adopted AIMS made by the vendors of the existing hospital information systems, there are numerous community hospitals and ambulatory surgical centers that have not yet transitioned to electronic anesthesia records, based upon their smaller sizes and budgetary constraints.

As a result, many of today’s anesthesiologists and CRNAs who underwent their initial training using AIMS in academic facilities ultimately enter practices that still rely on handwritten documentation.

As economic and regulatory forces increase pressure to consider the adoption of electronic anesthesia records, teams that include administrators, information management specialists, clinical managers, and anesthesia providers are sharing the decision-making process.

As a board-certified specialist in anesthesiology, pain management, and clinical informatics, I am certainly familiar with the complaints physicians have had about AIMS. In my opinion, with the modern technologies now available on the market – and many now available at more reasonable price points – there is no good reason for surgical facilities and anesthesia departments to hesitate to consider adopting anesthesia information technology. The benefits of AIMS and the potential perils of not adopting such a system are far too great to ignore.

In choosing an AIMS, buyers should consider the type of facility in which it will be implemented and make sure the system reflects that facility’s characteristics. As an example, ambulatory surgery centers (ASCs), while among the slowest to adopt AIMS, are beginning to realize that their survival will depend upon information management.

ASCs must provide patient care with a focus on safety, quality, and operational efficiency, but often have smaller budgets to implement information technology. Therefore, a sensible approach would be choosing a cost-effective AIMS solution designed to facilitate perioperative documentation in a fast-paced anesthesia workflow environment that is focused on providing easily available data for process analysis and improvement.

ASCs also need to streamline the sharing of information from and with numerous sources, including primary care providers, surgeons, patients, and hospitals, and therefore should choose an AIMS solution that focuses on interoperability and that is easy to implement. These factors will benefit all of the ASC’s stakeholders and will lead to better patient care and assure the long-term financial viability of the facility.

From the point of view of the end users, the anesthesia care team must view the AIMS solution as a benefit rather than an obstacle. Instead of placing a barrier between physician and patient, as some feared AIMS would do, early adopters have found that well-designed AIMS empower physicians and CRNAs to be more vigilant with respect to direct patient care during surgery.

Instead of relying on handwriting to create documentation that is sometimes partially illegible during a surgical procedure, many AIMS automatically capture pulse oximetry, end-tidal CO2, volatile agent concentrations, and other numerics, enabling providers to spend more time monitoring the patient and focusing on quality of care. The result: better data, accurate documentation of measurements, and improved patient outcomes.
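
To make that contrast concrete, here is a minimal, hypothetical sketch of automated capture (the record types, field names, and device names are invented for illustration and are not any vendor’s actual data model or interface):

  from dataclasses import dataclass, field
  from datetime import datetime, timezone
  from typing import List

  # Hypothetical record types for illustration only; a real AIMS uses
  # device-specific interfaces and standards such as HL7.
  @dataclass
  class VitalSample:
      recorded_at: datetime      # timestamp applied by the system, not the clinician
      parameter: str             # e.g., "SpO2", "EtCO2", "sevoflurane_pct"
      value: float
      unit: str
      source_device: str         # which monitor or anesthesia machine reported it

  @dataclass
  class AnesthesiaRecord:
      case_id: str
      samples: List[VitalSample] = field(default_factory=list)

      def capture(self, parameter: str, value: float, unit: str, device: str) -> None:
          """Append a device reading with a system-generated timestamp."""
          self.samples.append(
              VitalSample(datetime.now(timezone.utc), parameter, value, unit, device))

  # Readings arrive from the monitors with no manual transcription.
  record = AnesthesiaRecord(case_id="OR-3-case-0421")
  record.capture("SpO2", 98.0, "%", "pulse_oximeter_1")
  record.capture("EtCO2", 36.0, "mmHg", "capnograph_1")
  record.capture("sevoflurane_pct", 2.1, "%", "anesthesia_machine_1")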

Other improvements to modern-day AIMS include intuitive user experiences and interfaces, the ability to easily customize workflows, and increased interoperability with existing EMR systems. For AIMS users, and especially for ASCs, ease of use and system integration are of utmost importance, as the success of an ASC depends on its ability to seamlessly share information back to the host system of a hospital or provider during transfer of care.

In addition to interoperability, today’s AIMS solutions are designed to mimic traditional interfaces and workflows with which anesthesia providers are already familiar. In fact, adopters of well-designed AIMS can become comfortable with their use after just a few surgical procedures.

There will always be new documentation requirements, new monitoring data that must be recorded, and new information that will need to be shared with providers. Practices that adopt modern AIMS solutions will be able to weather these changes far more easily than those that continue to create handwritten anesthesia documentation, because well-designed clinical solutions adapt to changes in anesthesia technology, monitoring guidelines, and standards of care.

In summary, a well-designed AIMS provides a cost-effective alternative to handwritten documentation. Anesthetic records can now be based on high-resolution electronic data capture, with computer-validated information that can be aggregated into databases for continuing quality analysis and improvement studies.

In the end, with a relatively small investment in anesthesia information technology, even the smallest community hospitals and ambulatory surgical centers can implement technology that will empower the facilities to say with confidence, “We’re doing a great job and here’s the proof.”

Readers Write: Tell Me More: Documentation Support in Telemedicine

November 29, 2017 Readers Write No Comments

Tell Me More: Documentation Support in Telemedicine
By Patty Maynard

Patty Maynard is senior vice president of business development with Health Navigator of La Grange, IL.

A successful telemedicine platform provides value beyond the latest technology or reduced healthcare costs. The most effective platforms focus on workflow, from resource allocation to staff education. In fact, a recent REACH Health survey showed telemedicine can improve outcomes, access to care, and efficiency.

Clinical documentation support (CDS) facilitates reaching these goals. From the chief complaint to the pre-visit “tell us more” step, CDS can improve workflow. It captures shareable data for medical call centers, telemedicine providers, hospitals, and primary care providers. This data can simplify the pre-visit process, saving time and money. In addition, it provides patients with a familiar and comforting medical interaction, but in a digital format. CDS is part of the back-end content and workflow that make the digital health experience run smoothly.

The more information a healthcare professional has, the easier it is to make decisions. In telemedicine encounters, an easy-to-navigate questionnaire about the chief complaint or symptom can help move the process along.

Imagine knowing a patient’s chief complaint, symptoms, and demographic information before they reach the clinic. This may sound too good to be true, but modern platforms can provide a patient-facing checklist or Rapid Medical History that prompts patients to provide information. Clinicians can review a patient’s Rapid Medical History or use the CDS tool to record patient responses.

For example, a patient using a telehealth application may respond to two of five questions in a pre-visit checklist or Rapid Medical History. In a follow-up call, the clinician reviews the responses and asks any unanswered questions. The clinician then collects relevant information from a standardized CDS checklist and gives care advice.
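
As a rough illustration of that hand-off (the chief complaint, questions, and function names below are invented for this sketch, not Health Navigator’s actual content or code), a standardized template can be keyed by chief complaint, with the clinician’s follow-up driven by whichever questions remain unanswered:

  # Illustrative sketch only: an invented chief complaint and questions,
  # not an actual CDS vendor's template library.
  RAPID_MEDICAL_HISTORY = {
      "sore throat": [
          "When did the sore throat start?",
          "Do you have a fever?",
          "Is it painful to swallow?",
          "Have you been exposed to strep throat?",
          "Do you have a rash?",
      ],
  }

  def unanswered_questions(chief_complaint: str, patient_responses: dict) -> list:
      """Return template questions the patient has not yet answered, so the
      clinician can cover them during the follow-up call."""
      template = RAPID_MEDICAL_HISTORY.get(chief_complaint.lower(), [])
      return [q for q in template if q not in patient_responses]

  # The patient answered two of the five questions before the visit.
  responses = {
      "When did the sore throat start?": "Two days ago",
      "Do you have a fever?": "Yes, 101 F",
  }
  for question in unanswered_questions("sore throat", responses):
      print(question)   # the clinician asks these three during the follow-up call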

CDS checklists also help providers ensure staff follow safe, consistent processes with patients. Checklists are especially important in crisis or high-stress situations when staff may forget details. In the long run, checklists help:

  • Ensure consistent workflows
  • Improve communication
  • Reduce provider risk, and
  • Save time.

For every chief complaint, there is related information telemedicine providers need to know. The ideal telemedicine platform should have access to content that automatically links a chief complaint to a Rapid Medical History template. A platform that connects chief complaints to a standardized list of questions can save time and improve efficiency. These custom templates can also improve accuracy of care advice.

The traditional pre-visit process can take a significant amount of time, time that could be spent elsewhere. Incorporating CDS reduces the time spent gathering patient background information and allows staff to get to the root of the problem quickly. This leads to faster, more accurate diagnoses and care recommendations. It also creates an alternative to ER or urgent care visits for low-urgency conditions, which make up a large part of telemedicine encounters. CDS can also be used to augment EHRs with data that improve patient tracking.

A standardized clinical documentation support process can transform the telemedicine experience, creating a faster diagnostic process and reducing unnecessary visits. CDS can improve patient outcomes, safety, and satisfaction by delivering a consistent experience for patients and staff. This can help patients feel empowered and gives them tools to make appropriate healthcare decisions. In short, CDS is a building block of a better telemedicine experience with more valuable data.

Moving forward, the healthcare industry will see more of this data processed through artificial intelligence (AI) techniques such as natural language processing (NLP). NLP directly relates to CDS because this “narrow AI” produces the standardized follow-up templates for each chief complaint. Together, these technologies can improve all areas of telemedicine.
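
A heavily simplified stand-in for that narrow-AI step might look like keyword matching that normalizes a patient’s free-text description to a canonical chief complaint, which in turn selects the standardized follow-up template. Real NLP engines are of course far more sophisticated than this toy sketch, and the terms below are invented:

  from typing import Optional

  # Toy keyword matcher standing in for a real NLP engine; illustration only.
  COMPLAINT_KEYWORDS = {
      "sore throat": ["sore throat", "throat hurts", "painful to swallow"],
      "cough": ["cough", "coughing"],
  }

  def normalize_chief_complaint(free_text: str) -> Optional[str]:
      """Map free-text patient input to a canonical chief complaint."""
      text = free_text.lower()
      for complaint, keywords in COMPLAINT_KEYWORDS.items():
          if any(keyword in text for keyword in keywords):
              return complaint
      return None   # no match: route to a human or a broader questionnaire

  print(normalize_chief_complaint("My throat hurts and I keep coughing"))   # "sore throat"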

Some of the major areas of opportunity for telemedicine lie in services like tele-ICU, tele-psychology, and triage. CDS allows these services to deliver a richer, data-driven experience. These areas are only expected to grow, and CDS helps telemedicine providers meet patient and provider needs.

As telemedicine falls under new legislation and continues to evolve as a covered benefit, expect to see new guidance on standardization and use. CDS provides data that makes telemedicine visits valuable, fitting into value-based payment models. Telemedicine providers can expect to see increasing demand for these convenient services as employers and health systems work to provide cost-effective, accessible care.

Readers Write: HIT Talent Trends to Watch in 2018

November 29, 2017 Readers Write No Comments

HIT Talent Trends to Watch in 2018
By Frank Myeroff

Frank Myeroff is president of Direct Consulting Associates of Cleveland, OH.

What’s in store for 2018 when it comes to HIT talent? Here are eight talent trends that will help to shape the HIT workforce in the New Year.

Widespread Adoption of People Analytics

As Millennials move into HIT management roles, they’re turning to analytics much more than their predecessors did as a way to better understand the effectiveness of people practices, programs, and processes. Millennial HIT managers are adopting employee dashboards such as Microsoft’s MyAnalytics to help people better understand how their time is spent and to measure progress toward organizational HIT goals and initiatives.

Cybersecurity Needs to Improve

Cybersecurity needs to become a top priority in 2018. In 2017, the WannaCry outbreak brought serious attention to security in the healthcare industry. The security of digital health data has not kept up with its growth due to a lack of investment in people and technology, but that is starting to change. Healthcare IT hiring managers and HR executives could be in a good position to lure cybersecurity talent in 2018 because healthcare is one of the hottest areas for cybersecurity hiring.

Explosive Growth in Telemedicine Services

According to an IHS Technology report, the telemedicine services field is expected to grow to 7 million patient users, nearly twice what it was in 2016. Telemedicine is a huge change for healthcare because it can extend the reach of patient monitoring, consultation, and counseling to individuals who cannot make it to a doctor’s office. Plus, once it reaches its potential, telemedicine will allow doctors to help more patients in less time. According to a survey done by Becker’s Healthcare, only one percent of respondents had no plans to implement telemedicine in the future. This fast growth means that HIT professionals will play an even bigger role in developing telemedicine services. By helping to create the telehealth infrastructure, HIT professionals can help make telemedicine a fixture in healthcare delivery.

Robotics and AI Represent Greatest Transformation in Healthcare Services

While this has been a high-growth area in recent years, we see it skyrocketing in 2018 and beyond. The areas of healthcare that will benefit most from robotics and AI are direct patient care, such as surgery and prosthetics, and indirect patient care in areas such as pharmacy, medical goods delivery, home health, and disinfection of spaces occupied by people with known infectious diseases. This high demand for robotics and AI will add a plethora of new jobs for highly skilled data specialists, algorithm specialists, robotics engineers, software developers, and technicians.

Expected IoT Job Boom On Hold

The healthcare industry saw only an 11 percent increase in Internet of Things (IoT) network connections between 2016 and 2017. That ranks healthcare behind four other key industries – manufacturing, energy / utilities, transportation / distribution, and smart cities / communities – according to “The Verizon State of the Market” report. While IoT devices clearly offer new benefits for healthcare provider organizations, adoption remains limited due to concerns about IoT standards, security, interoperability, and cost. Therefore, developers, coders, and hardware professionals will not be needed to the extent previously thought.

Continued Rise of Freelance Economy

There’s high growth when it comes to freelancers, temporary workers, contractors, and independent consultants within the HIT space. New technologies, cost factors, and a whole new generation of HIT professionals wanting to work in a gig economy are fueling the growth. Organizations should, now more than ever, look at building new strategies or evaluating what is already in place to keep these workers motivated and engaged. If they don’t, they risk losing this highly skilled talent to their competition. By 2020, it is anticipated that 50 percent of all US workers within various industries will be contingent workers.

Candidate-Driven Job Market Continues

In most industries across the US, we’re experiencing a candidate-driven job market and the HIT industry is no exception. Those who do have the right skills are in a good position to find the best job offer. They have far more power and latitude to be very selective regarding opportunities and employers. In fact, HIT professionals tell us that they have a pipeline of opportunities to choose from and are getting up to 20 recruiting calls per day. There’s no doubt that healthcare organizations are feeling the impact of the heightened competition for their attention.

Diversity in Technology Still Needed

With the retirement of the baby boomer generation in full swing, worker shortages are of great concern. The fact that the information technology field can’t seem to attract a more diverse population doesn’t help the situation. The IT workforce is predominantly made up of white males. Even though many organizations announce diversity initiatives on a regular basis, hiring managers complain that they can only hire from the worker pool that is available. Introducing science, technology, engineering, and math (STEM) to minority and female students at an early age, plus having a diverse group of educators throughout their schooling, can increase the diversity of the field as a whole.

Readers Write: Preparing Nurses for Opioid-Addicted Patients

November 20, 2017 Readers Write No Comments

Preparing Nurses for Opioid-Addicted Patients
By Jennifer David, RN, BSN, MHA

Jennifer David, RN, BSN, MHA is vice-president of clinical operations for Avant Healthcare Professionals of Casselberry, FL.

President Donald Trump declared the opioid crisis a national emergency. As a reaction to this announcement, Intermountain Healthcare, a Utah-based hospital chain, pledged to decrease opioid prescriptions by 40 percent in 2018.

Making a commitment to addressing the crisis is a crucial first step. However, only addressing the patient-use side is not a holistic approach to the problem.

The mental health of the nurses and doctors who care for overdosed patients is not considered in the opioid equation, yet every day they feel the magnitude of the epidemic and are left alone to manage their pain. Without support in facing this problem, they may ultimately leave their job or the profession altogether.

Nurses are the frontline warriors in this epidemic. Several times each day, they’re responding to the screams of withdrawal, managing the inherent chaos of addiction, and dealing with family members who demand an immediate solution. For many patients, it’s the second, third, or tenth trip to the emergency department for the same problem. Families are desperate, angry, and looking for someone to blame, often defaulting to the nurse.

I’ve had nurses from hospitals around the country explain that they feel they’re enabling drug-addicted patients by administering pain medications. However, “managing pain” is an important aspect of HCAHPS. Nurses are conflicted between caring for a patient and adding to the problem. This conflict can lead to anger, stress, and frustration among nursing staff, and in some cases, could drive nurses to quit.

Some hospitals have taken steps to protect nurses from these patients with a patient code of conduct, which states that violence and verbal abuse against staff will not be tolerated. A large hospital in New York displays its patient code of conduct throughout its hallways, and another facility in Missouri strictly enforces a policy that its staff will not be disrespected by patients. When these rules are weighed against patients’ discretion over HCAHPS scores, however, they become harder to enforce.

The best thing hospital leadership can do is to mentally prepare nurses to care for these difficult patients. This will also reduce staff turnover and improve employee communication.

The first place to start is recognizing that a problem may exist. I make personal visits to our nurses on assignment and always ask them how they are dealing with opioid-addicted patients. It is not always easy or possible to give individual attention to every nurse on staff, but it is important to identify who is having issues. Surveying the nursing team to ask whether they feel respected at all levels and supported in their job challenges is a good place to begin.

Once honest communication begins, explore what support nurses want and need, then put a plan together. It should include a healthy dose of continuous learning intended to help build understanding and empathy for patients’ needs. Seeing how our nurses were affected, we now incorporate training on how to care for drug-addicted patients into our curriculum and provide consistent follow-up while nurses are on assignment. We want to prepare them for what they might face and be there for them when they face it.

There will likely be multiple tiers of support needed, varying from occasional discussions about a particularly challenging patient to more intense, personalized support from the human resources department. Everyone has different experiences and belief systems about addiction, so allow for that. One of the hardest messages to reinforce is that opioid-addicted patients should not be discriminated against.

Not all days are the same when dealing with these patients, and some days might be especially challenging. Consistent follow up is necessary to maintain a healthy staff and also allows for positive patient experiences. If nurses feel that their employer constantly empathizes with them, they will feel the support they need when caring for such patients.

Readers Write: Tips for Selecting EMR Training and Activation Support Vendors

November 20, 2017 Readers Write No Comments

Tips for Selecting EMR Training and Activation Support Vendors
By Kevin Smith

Kevin Smith is CEO of TrainingWheel of Fort Myers, FL.

The contracted EMR vendor often does not deliver experienced staff for the activation. The go-live is the first time some of the vendor’s elbow support resources enter a hospital without being a patient or family member.

Here are a few helpful tips based on lessons learned to save organizations time and money:

  • Know what the organization wants and what it is paying for. Consider more than the proposed training and support cost. For example, costs may differ because one vendor provides licensed clinicians while another provides non-clinical rounders with no hospital experience at a lower cost.
  • Insufficient planning can lead to less-experienced resources. What recourse does the contract include? If a bank teller or oil rigger joined the firm last week, are they prepared to help the clinicians? A warm body is not what matters to clinicians. They want someone who can actually help them as they learn how to do their work using new tools.
  • Ask the vendors to provide resumes, CVs, immunization records, background checks, and proof of experience in advance. Vendors often slide inexperienced people into a project and shuffle them around. They want to maintain high resource numbers, but the clinicians are not getting the support.
  • Does the vendor rely on one or more third-party companies to provide trainers and support staff? If so, it will be hard to know what type of resources are being provided. This is important because many vendors subcontract to the same companies. There may be two different bids, but the organization ends up with the same subcontracted company. If the primary vendor can’t answer basic support questions, the organization may already be in trouble. An experienced vendor will match clinical support personnel to support areas based on their clinical role and/or experience.
  • Can the vendor present a full project cost proposal with a support schedule, detailed expense projection, and a list of their proposed resources after one walk-through of the facility? Staffing ratios help, but are not always accurate (a simple illustration follows this list). If a vendor doesn’t understand the makeup of the organization’s staff and the layout of the facility, how can they give an accurate estimate of clinical support resources?
  • Does the vendor develop curriculum and clinical scenario-based training or does training simply cover system functionality? If training only covers functionality, then users will require more elbow support because they won’t be prepared to use the system for their real-time clinical workflow. The #1 complaint from clinicians in EMR training is that it only teaches them navigation and what each click does. This leaves clinicians anxious and also forces every clinician to come up with their own approaches and workflows.
  • Can the vendor recognize issues in the build and offer recommendations based on past client experiences? The training partner should be an asset to the team, identifying issues in the build that may come up during the training. Better to know this ahead of time and make corrections than during or after the go-live. Make sure the organization and the vendor have a joint commitment to be open and sharing in this regard.
  • Does the vendor pursue continued improvement and feedback? Are they as committed to quality as the organization?
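
To illustrate why a flat staffing ratio can mislead (the unit names, headcounts, and ratio below are invented assumptions, not a recommended standard), compare a house-wide ratio against a per-unit estimate that accounts for where support staff must physically stand:

  import math

  # Illustrative arithmetic only; counts and ratio are invented assumptions.
  UNITS = {"ED": 18, "ICU": 10, "Med/Surg A": 12, "Med/Surg B": 12, "OR/PACU": 9, "L&D": 11}
  USERS_PER_SUPPORT = 8   # assumed at-the-elbow ratio

  # A flat ratio over the whole house treats every clinician as one pool.
  flat_per_shift = math.ceil(sum(UNITS.values()) / USERS_PER_SUPPORT)

  # A per-unit estimate recognizes that support must be present on each unit.
  per_unit_per_shift = sum(math.ceil(staff / USERS_PER_SUPPORT) for staff in UNITS.values())

  print(flat_per_shift)       # 9  -- looks sufficient on paper
  print(per_unit_per_shift)   # 13 -- unit layout alone adds four more resources per shift

The specific numbers matter less than the point: facility layout, shift coverage, and staff mix all change the estimate, which is why a walk-through alone rarely produces an accurate proposal.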

Vendor involvement is an integral part of implementation success. As an organization, ask the necessary questions to guarantee the right vendors are selected.

Readers Write: How Hard Is It?

November 15, 2017 Readers Write No Comments

How Hard Is It?
By Frank Poggio

Frank Poggio is president and CEO of The Kelzon Group.

In the October 28, 2017 issue of HIStalk, Mr. H made this critical observation and raised an important question. He wrote (finishing with tongue in cheek):

For those with short memories or short healthcare IT careers, it’s time to relearn the oft-repeated lesson that big companies dip their toes into and out of the healthcare IT waters all the time with little loyalty to anyone except shareholders. McKesson bailed out this year and now GE is apparently mulling its exit after wrecking a slew of acquisitions over many years. Siemens is long gone. Nothing good ever comes from conglomerates licking their chops at what they naively think is easy money and higher growth than their other verticals (see also: Misys and Sage). How hard could this healthcare thing be?

GE of course isn’t alone, but they may hold the prize for most kicks at the can. This will be their third time since 1970 — three tries and billions later and nothing to show for it. Ironically, GE has had great success in medical devices, so one could assume they know more about the healthcare business than a Revlon, Apple, IBM, NCR, Martin Marietta, Lockheed, Oracle, SAP, Microsoft, et al.

After some 45 years working in the healthcare IT arena, I believe I have the answer to Mr. H’s query. My qualifications in support of my response are:

  • Over four decades, I was a hospital CFO and CIO at a major teaching hospital.
  • I spent two intermittent decades as an industry consultant working with healthcare providers and system vendors of all sizes.
  • In the middle of my career, after my CIO stint, I founded a HIT startup that built both clinical and administrative systems, went public, and was later acquired by one of today’s major vendors.
  • Most importantly, I have designed clinical and administrative software systems, led installations, and written more than my share of program code.

To summarize, I have seen it from all four sides: buyer, builder, advisor, and patient.

There are four reasons that make healthcare IT hard, really hard.


Organizational Structure

Many people new to healthcare readily compare it to commercial industry. Why can’t hospitals do as banks, or airlines, or Google, or…?

One reason is they are not organized like these entities. What other industry has as its primary customer the same person that sells and then performs the core services? That same person also defines the product and further determines how it is delivered and implemented. That person is the doctor. The PhDs at GE do not make the final decision on how to make a jet engine or how to deliver it. GE is run by a CEO and the buck stops there. Hospitals are run by a troika (or committee) of the board, the administrative CEO, and the chief medical officer.

In 1974, Professor William Dowling of the University of Washington published the book “Prospective Reimbursement for Hospitals,” based on his research into hospital operations. His studies showed that the CEO of a typical community hospital directly controls only 25 percent of its resources and operations. The other 75 percent is controlled by the doctors. They decide what tests to run, when to run them, and what happens next. The fastest way for a CEO to lose his job is to directly challenge the medical staff.

What other industry is organized like this? If you are in the business trying to build and sell million-dollar systems, you had better understand this organizational dynamic and accept the fact it will take years to generate an acceptable return on investment.

Regulatory Quagmire

All businesses are struggling with regulation. I submit that healthcare far exceeds all others.

Case in point: in what other industry does the payer define the structure and content of the bill down to the very last data element? The one that comes closest is the defense industry, and many of its idiosyncrasies are incorporated in healthcare regulations. In 1999, Price Waterhouse CPAs analyzed how many pages in the Federal Register addressed income tax law, then compared that against the number of regulatory pages needed to create a payable UB bill for all payers in a given state. The results: 11,000 pages of regulations for taxes and over 50,000 for a hospital bill.

A further complication is that the person receiving the care is not the one paying the bill. Sometimes the patient never sees the full bill, and when they do, they are inevitably confused.

Training, Structure, and Definition

Computer systems thrive on definition and structure. The easiest applications to develop are those where the target domain has a history and library of definition and structure. Lack of definition and structure is a programmer’s nightmare. Today there are many tools to help address gray areas, such as fuzzy logic and neural networks, yet learning and applying these tools significantly raises the complexity of the system, thereby increasing development time and costs.

A doctor’s adherence to medical terminology and structure is highly dependent on which medical school they attended. As an example, a study at the Milken Institute SPH at George Washington University found that physicians whose residencies were in higher-spending regions spent 29 percent more on average than their peers who had trained in lower-spending areas of the country. Different protocols for different regions based on training. The federal government spent $30 billion on EMRs and yet we still have wide gaps in medical lexicons, protocols, and the structure and content of EMRs.

Moving Targets

In IT, this is classically called a rolling design, again a developer’s nightmare. But the delivery of healthcare and the practice of medicine are rife with this burden. Medicine is in constant change, with new protocols, test procedures, quality measures, etc. presented every week. Old protocols are challenged on a routine basis, e.g., mammography screening, PSA testing, knee replacements, tonsillectomies, and more.

What if you were assigned to develop a production management system for an auto manufacturer and every month the manufacturing engineers told you that process A — which we coded last month — has now changed to process B? The solution in commercial industry is to freeze the design by freezing the process. Can’t do that in medicine — freeze your protocol and tomorrow it could be the basis of a malpractice suit.

Medicine has always been in constant change, and with personalized medicine around the corner, variation and complexity will grow by leaps and bounds. Scientists have been trying to reverse engineer the human body since the first autopsy a thousand years ago. If only when you were born your mother gave you a 5,000-page human spec sheet with schematics and diagrams, a user’s manual, a troubleshooting guide, and a 1-800 number to call when all else fails. They exist for every car, dishwasher, plane, and other device and sure make software development a lot easier.

When I was a CIO at the end of a difficult IT implementation, the dean of our medical school said to me, “There is a reason we called it the practice of medicine. If we practice long and hard enough, someday we’ll get it right.”


Many of these issues exist in other industries and disciplines. I submit that the depth to which they exist in medicine and healthcare, and the way they interact, is what makes IT development hard, very, very hard. All those big companies (and many small ones) that came into the healthcare industry failed because they did not allow for the depth and interaction of these challenges and hence did not prepare for them. They lost patience and millions, then chose to cut their losses and run.

From the outside looking in, healthcare is twenty percent of the gross national product, which could support a very attractive business opportunity. It’s a beguiling number that has proved to be a siren song for many a firm, big and small.
