
Readers Write: The Journey from Population Health Management to Precision Medicine

April 20, 2016

The Journey from Population Health Management to Precision Medicine
By David Bennett


Imagine a world where individuals receive custom-tailored healthcare. Patients are at the center of their own care, making key decisions themselves. They are supported by research and education, and their information is shared easily between caregivers and clinicians. Preventive care is more effective than ever, and medical interventions occur in record time.

With precision medicine, this world is not just within reach — it’s already happening.

Precision medicine (also known as personalized medicine) is the next step in population health management, transforming healthcare from being about the many to focusing on the one.

Population health serves as the “who” to identify cohorts of patients that are at risk and require attention. Precision medicine is the “what,” providing caregivers with the specific information they need to create effective prevention and treatment plans that are customized for each individual.

Having the widest possible variety of data sets optimizes therapeutic tracking of each patient’s care plan, helping clinicians make and refine diagnoses. This sets the stage to pursue the most personalized therapy possible by detecting patterns in clinical assessments, behavior, and outcomes.

Data is essential, but it’s only useful if you have the ability to make big data small in order to personalize care. Today’s technology platforms can do just that, by capturing vast amounts of health data and applying real-time analytics that provide information and tools that help healthcare professionals and health insurers make more effective, individualized treatment decisions.
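As a rough illustration of making big data small, the "who" of population health and the "what" of precision medicine can be sketched in a few lines. The patient fields, thresholds, and risk rules below are invented for the example, not drawn from any real analytics platform:

```python
# Hypothetical sketch: narrowing population-level data down to one
# patient's risk picture. Fields and thresholds are illustrative only.

patients = [
    {"id": "p1", "a1c": 9.1, "bmi": 33, "er_visits_12mo": 2, "smoker": True},
    {"id": "p2", "a1c": 5.4, "bmi": 24, "er_visits_12mo": 0, "smoker": False},
    {"id": "p3", "a1c": 7.8, "bmi": 29, "er_visits_12mo": 1, "smoker": False},
]

def risk_flags(p):
    """Return the individual factors that put this patient at risk."""
    flags = []
    if p["a1c"] >= 6.5:
        flags.append("uncontrolled glucose")
    if p["bmi"] >= 30:
        flags.append("obesity")
    if p["er_visits_12mo"] >= 2:
        flags.append("frequent ER use")
    if p["smoker"]:
        flags.append("smoking")
    return flags

# Population health answers "who": the at-risk cohort.
cohort = [p for p in patients if risk_flags(p)]

# Precision medicine answers "what": the factors driving each patient's risk.
for p in cohort:
    print(p["id"], risk_flags(p))
```

The same pass over the data yields both views: the cohort for outreach and, for each member of it, the specific drivers a care plan should target.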

Using this information to engage patients and guide care management makes the journey from population health management to precision medicine that much easier, paving the way for an era of truly personalized medicine that prevents the deterioration of health.

The timing couldn’t be better for precision medicine’s heyday, and here’s why: one-size care does not fit all.

Many factors are converging to make the adoption of precision medicine a reality:

  • A growing number of EMRs, EHRs, and HIEs are being connected and cover a significant number of individuals.
  • Patients are more interested in participating in their care, especially when they get access to their own data. There are myriad devices on the market today that are relevant — from wearable devices that measure activity and sleep quality, to wireless scales that integrate with smartphone apps, to medical devices that send alerts (such as pacemakers and insulin level trackers). The data from these devices contribute to a robust longitudinal patient record. The interactive nature of the technology is also an excellent way to engage patients.
  • MHealth advances allow us to easily capture consumer data using cellphone technology and to monitor patients remotely with telehealth and virtual consultations.
  • The ability to see how inherited genetic variations within families contribute both directly and indirectly to disease development. We can now adjust care plans when genetic mutations arise in response to the treatment in place.

If we look at healthcare outcomes in the United States, it’s clear that we need to anticipate patients’ needs with evidence- and knowledge-based solutions. Only then will we be able to identify a patient’s susceptibility to disease, predict how the patient will respond to a particular therapy, and identify the best treatment options for optimal outcomes. Precision medicine will get us there.

Precision medicine is about aggregating all forms of relevant data to enable different types of real-time data explorations. More concretely, specific areas of medicine are expected to make use of new sources of evidence, and the data types they leverage vary based on medical specialty. A good example would be the difference between the data sets used by oncologists versus immunologists.

There are two critical types of data explorations that both need a very large number of data sets to bring results:

  • Medical research with scientific modeling. Precision medicine can be leveraged to advance the ways in which large data sets are collected and analyzed, leading to new and better approaches to managing disease.
  • Clinical applications. Treatment plans and decisions can be greatly improved by identifying individuals at higher risk of disease, dependent on the prevalence and heritability of the disease. We call this cognitive support at the point of impact. To support this, more control is needed in real time over macro variables: genomics, proteomics, metabolism, medication, exercise, diet, stress, environmental exposure, social, etc. Precision medicine provides a platform that has an extensive number of data sets with the ability to easily create custom data sets to capture these types of variables.

Precision medicine not only means care tailored to the individual, it also gives the healthcare industry visibility into variability and the speed necessary to act on findings before health deteriorates. Not only does this enhance patients’ lives, it saves healthcare dollars and prevents waste.

Tailoring deliverables to the needs of individuals is nothing new, at least in other fields such as banking and retail. Pioneers in these industries have leveraged open-source technology on a solid data foundation to meet their markets’ challenges.

Surely we can do the same in healthcare, where it’s literally a matter of life and death. That’s why so many of us are working on a daily basis to accelerate the science behind precision medicine and to encourage its adoption. Precision medicine is nothing short of revolutionary, and together, we can all make it a reality.

David Bennett is executive vice president of product and strategy at Orion Health of Auckland, New Zealand.

Readers Write: Three Tips for Supporting a Population Health Management Program

April 20, 2016

Three Tips for Supporting a Population Health Management Program
By Brian Drozdowicz


Provider organizations have a lot of options when selecting population health management expertise and system support, including analytics, data aggregation, clinical workflow / care management, and patient engagement solutions. With the market for these solutions expected to reach $4.2 billion by 2018, it is not surprising that new vendors pop up practically daily, or that existing vendors are beefing up their solution portfolios to capitalize on the opportunity.

As providers’ wish lists continue to grow, driven in part by government initiatives and commercial payer programs, system selection starts to take on the overwhelming feel of a second EMR implementation. This is causing providers to hesitate just when they need to act. How can providers find the right path to effective population health management?

No matter what shape a program might take, the right team is a foundational imperative. Assuming risk for populations often means that provider organizations are learning and mastering a new set of skills while simultaneously balancing the demands of “business as usual.”

One frequently deployed tactic is to hire staff from payer environments. They bring the requisite knowledge to the table and can help incorporate proven payer techniques and processes that both build on and complement a provider’s current infrastructure. Team members are needed who “speak data” and are also representative of groups across an organization (e.g., clinicians, program managers, business leads, finance team members, IT staff) to best determine what program goals are, what is possible for the specific organization, and what actions should be taken along what timeframe.

Once the right team is in place, here are three tips to support the implementation of a population health management program:

  1. Recognize that data quality is more important than data quantity. The foundation of any population health management program is data. However, providers don’t need or want it all because each type of data has to be managed and maintained, often by separate people and according to different rules (e.g., privacy constraints). Focus on obtaining and properly maintaining the right data to drive population analysis, program structure, program management, and ongoing assessment.
  2. Learn to embrace claims data. Provider organizations need the longitudinal view that claims data provides to adequately assess utilization, total cost of care, and provider performance, and in turn to answer complex, multi-faceted questions about risk. Other benefits of claims data include that it is: (a) easier to manage and maintain; (b) more readily available and accepted than ever before; (c) controllable from a systems perspective; and (d) proven to yield accurate insights.
  3. Show physicians the numbers and what drives those numbers. Physician change is required to embrace the concept of value-based care. Comparative performance data can be a huge eye-opener. Physician leadership can help physicians be the champions of program performance assessment by making sure they can dig deep into the data, develop confidence in its findings, and understand what precisely needs to change. Complement performance data with compensation plans that reward participation, improvement, and outcomes. Start by placing the emphasis on participation, and then weight improvement and outcomes more heavily over time.
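The longitudinal view in tip 2 can be made concrete with a small sketch. The claim records and field names below are made up for illustration, but the idea — rolling up allowed amounts across every care setting a member touched — is what claims data uniquely supports:

```python
# Illustrative sketch of the longitudinal view claims data enables.
# Claim records and field names here are invented for the example.
from collections import defaultdict

claims = [
    {"member": "m1", "provider": "dr_a", "allowed_amount": 120.0},
    {"member": "m1", "provider": "dr_b", "allowed_amount": 950.0},
    {"member": "m2", "provider": "dr_a", "allowed_amount": 80.0},
]

# Total cost of care per member, across every provider they saw --
# a view no single provider's EMR can assemble on its own.
cost_per_member = defaultdict(float)
for c in claims:
    cost_per_member[c["member"]] += c["allowed_amount"]

print(dict(cost_per_member))
```

From the same roll-up, utilization and provider-performance cuts follow by grouping on other keys (provider, service category, and so on).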

Provider organizations must know what is essential versus nice to have before they go into the vendor evaluation process. In a new and volatile market, the number of vendors offering potential solutions is huge, and the allure of slick user interfaces that can perform every population health management function, while integrating all types of data, is understandable.

However, little is proven, and most organizations do not have the time to wait until it is. Solutions have a gestation period to build, test, and revise before they become accurate, produce valid results, and deliver actionable business value. Answers are needed now, so organizations should look for a track record of results in a similar setting.

What does an organization need to effectively manage risk and care for populations? Of course, the answer is, “it depends,” but if you build the right team and thoroughly research your options, these tips can help bring order to the chaos.

Brian Drozdowicz is executive vice president of product management at Verisk Health of Waltham, MA.

Readers Write: It’s Time to Get Doctors Out of EHR Data Entry

April 20, 2016

It’s Time to Get Doctors Out of EHR Data Entry 
By Marilyn Trapani


There was a day when medical transcription was neat and clean. A doctor dictated what happened during an exam and a transcriptionist accurately typed each detail into the patient’s record. Each future encounter built on that record, a detailed history meant to ensure quality care. It wasn’t a perfect system, but it worked.

Now doctors sit for hours each week in front of a computer screen entering patient encounter data into electronic health records (EHRs). These complex systems were meant to more efficiently and effectively track health data for hospitals, payers, and physicians alike. EHRs were promised to save physician practices, hospital systems, and other provider organizations millions of dollars in the long run. 

Reality shows something quite different. Placing documentation responsibilities on physicians is creating severe problems not only for doctors, but for patients and the hospitals and practices that serve them. Doctors are spending more time – in some cases, 43 percent of their day – entering data into EHRs, which means less time available for patients. This continual influx of data is bloating EHRs with unnecessary, repetitive, unintelligible information.

Doctors play an integral part in developing and maintaining medical records. But we are asking them to do too much and the entire healthcare system is suffering because of it. Instead of dictating information into the medical record, many physicians are required to type notes into their EHR, which is time-consuming and distracting.

That’s just one challenge they face when required to directly document into an EHR. Upon accessing the system, the doctor enters a patient’s medical number and their record pops up. There are boxes for history, medications, procedures, etc. This “structured data” methodology allows physicians to click radio buttons or check boxes to denote what was done, but too often allows for little or no free text. Physicians are presented options from which to choose, even if those options aren’t applicable. The structured data choices can’t be changed, and the patient’s record is built off what the doctor ultimately chooses as the lesser of evils.  

Most EHRs allow doctors to copy and paste information from one area of the record to another. This creates “note bloat,” a serious issue that’s resulting in junk data and unwieldy, unmanageable records. It’s not uncommon for information copied from one patient’s record to end up in a different person’s file.

Not only does that create note bloat, it also causes mistakes. One hospital was recently sued by a patient who suffered permanent kidney damage from an antibiotic given for an infection. The patient also had a uric acid kidney stone, which precludes antibiotic use. The EHR file was so convoluted, none of the attending physicians noticed the kidney stone. Printed out, the patient’s record was 3,000 pages. The presiding judge ruled the record inadmissible, in part because a single intravenous drip was repeated on almost every page.

In late January, Jay Vance, president of the Association for Healthcare Documentation Integrity (AHDI), testified to the US Senate Health, Education, Labor and Pensions Committee that EHR documentation burdens on physicians can be reduced by adding language to a draft bill aimed at improving the functionality and interoperability of EHR systems.

The move to pay providers based on the quality of the care they deliver instead of the volume of cases seen by physicians and specialists is driving much of the federal healthcare discussion. There’s a chance that work can help restore sanity to the interaction between doctor and document. The Medicare Access and CHIP Reauthorization Act of 2015 (MACRA), the bill that ended the onerous Sustainable Growth Rate, authorized the Centers for Medicare and Medicaid Services to pay physicians via value-based reimbursement. The law also called for a replacement for Meaningful Use.

One component of MACRA is the Merit-Based Incentive Payment System (MIPS) that, among other things, incentivizes providers for using EHR technology. The goal is to achieve better clinical outcomes, increase transparency and efficiency, empower consumers to engage in their care, and provide broader data on health systems. But there is more that can be done. 

This is progress, because at the end of the day, patient focus should always trump data entry by physicians. That’s not to say that physicians shouldn’t have a hand in documentation. According to AHDI, accurate, high-integrity documentation requires collaboration between physicians and the organization’s documentation team – highly skilled, analytical specialists who understand the importance of clinical clarity and care coordination. Certified documentation and transcription specialists can ensure accuracy and identify gaps, errors, and inconsistencies that may compromise patient health and compliance goals.

AHDI’s recommendation: include wording that expands the definition of “non-physician members of the care team” to include certified healthcare documentation specialists and certified medical transcriptionists.

There’s not a single documentation and transcription scenario to meet every organization’s needs. But there is common ground to be found where all functions – EHR vendors, documentation specialists, transcription experts, physicians, hospital administrators – can create a structure that results in clean, effective, understandable patient medical records. 

Step 1 – reduce doctors’ administrative burdens. A physician’s role in documentation should be focused on dictation, not data entry. EHR voice recognition software allows doctors to directly narrate into the system. Like any other text, narrated notes need to be reviewed for accuracy and then approved. In some cases, doctors are approving their entries without reviewing them. This increases the risk of inaccurate data and mistakes. 

Step 2 – find the balance of structured and unstructured EHR data. There is a place for both structured and unstructured data in the EHR. Structured data can be queried and reported on with much greater ease than free flow text. However, doctors complain there aren’t enough options to share narratives about encounters and what patients had to say about their visit. The goal of an EHR is to provide a complete and accurate view of patients’ conditions, treatments, and outcomes. It makes sense to use structured data for entries such as those required by CMS. Using dictation and expert transcription assistance, unstructured free-text narratives and information also can be a part of the EHR while maintaining accuracy and completeness. 

Step 3 — eliminate interface barriers. EHRs require interfaces to “talk” with other systems. Fees charged for said interfaces prevent providers from using outside documentation and transcription services. Interfaces are necessary, but should be part of the standard development of EHR structured data forms and information collection.

Step 4 – put the responsibility of document editing and transcription in expert hands. I believe there will be resurgence of transcription services in 2016. Streamlining data entry into an EHR will never replace the need for documentation and transcription experts. Providers will continue to need outside assistance in ensuring patient data is accurately and cleanly logged in the EHR. 

EHRs are here to stay. So are documentation and transcription experts. Provider organizations need both of us. When experts on both sides combine their strengths and expertise, we can put doctors and other healthcare professionals back where they belong: taking care of patients.

Marilyn Trapani is president and CEO of Silent Type of Englewood, NJ. 

Readers Write: Radiology Benefits Managers: An Inelegant Method for Managing the Use of Medical Imaging

April 13, 2016

Radiology Benefits Managers: An Inelegant Method for Managing the Use of Medical Imaging
By James A. Brink, MD, FACR


Doctors, lawmakers, and regulators are supposed to work together to make healthcare better. So why put a process in place that takes medical decisions out of the hands of doctors and patients, may delay or deny care, and often results in longer wait times to get care?

That is what insurance companies do by requiring preauthorization of advanced medical imaging (such as MRIs or CT scans) ordered for beneficiaries. A better way to ensure appropriate imaging is widely available and already in use.

In most cases, if your doctor thinks an imaging scan can improve your health, he or she has to ask a radiology benefits management company (RBM) whether the scan will be covered or not. This process can take days or even weeks. You may not be able to get the scan at all if the RBM says no, which happens a lot.      

In fact, a Patient Advocate Foundation (PAF) study of people who challenged coverage denials for scans found that 81 percent of the denials came from RBMs and that 90 percent of the reversed denials were for scans the patient’s health plan in fact covered. The U.S. Department of Health and Human Services (HHS) says there are no independent or peer-reviewed data that prove radiology benefit managers’ effectiveness. HHS also warned against the non-transparent coverage protocols that RBMs use.

What’s more, ensuring appropriate imaging is already being done in a more modern and efficient way. Clinical decision support (CDS) systems, embedded in electronic health records systems, allow providers to consult appropriate use criteria prior to ordering scans. American College of Radiology (ACR) Appropriateness Criteria, for instance, are transparent, evidence-based guidelines continuously updated by more than 300 doctors from more than 20 radiology and non-radiology specialty societies.

CDS systems — easily incorporated into a doctor’s normal workflow — reduce use of low-value scans, unnecessary radiation exposure, and associated costs. The systems educate ordering healthcare providers in choosing the most appropriate exam and suggesting when no scan is needed at all.
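The mechanics of such a check are simple enough to sketch. The ratings table below is invented for illustration; real systems consult maintained criteria such as the ACR's, which rate exams on a 1-9 appropriateness scale:

```python
# Toy sketch of a CDS appropriateness check at order entry.
# The ratings and indications here are illustrative, not clinical guidance.

RATINGS = {
    ("uncomplicated headache", "CT head"): 2,   # low-value scan
    ("uncomplicated headache", "no imaging"): 9,
    ("suspected stroke", "CT head"): 8,
}

def advise(indication, exam):
    """Look up an appropriateness score and translate it to advice."""
    score = RATINGS.get((indication, exam))
    if score is None:
        return "no guidance available"
    if score >= 7:
        return "appropriate"
    if score >= 4:
        return "may be appropriate"
    return "consider alternatives"

print(advise("uncomplicated headache", "CT head"))
print(advise("suspected stroke", "CT head"))
```

Because the lookup runs inside the ordering workflow, the feedback arrives before the order is placed rather than days later from a preauthorization call.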

An Institute for Clinical Systems Improvement study across Minnesota found that such ordering systems saved more than $160 million in advanced imaging costs vs. RBMs and other management methods over the course of the study. A major study by Massachusetts General Hospital and the University of Florida showed that these systems significantly reduced advanced imaging use and associated costs. This was done without delaying care or taking decisions out of the hands of patients and doctors.

In fact, the Protecting Access to Medicare Act — passed by Congress with the backing of the ACR and multiple medical specialty societies — will require providers to consult CDS systems prior to ordering advanced imaging scans for Medicare patients starting as soon as next year. This makes image ordering more transparent and evidence-based than any other medical service. The law would require preauthorization only if a provider’s ordering pattern consistently fails to meet appropriate use criteria.

In short, preauthorization is an antiquated approach to utilization management that disconnects doctors and patients from learning systems designed to improve patient care. Patients, together with the providers and legislators who serve them, should be demanding a more modern approach to prior authorization through the delivery of EMR-integrated imaging CDS.

James A. Brink, MD, FACR is vice chair of the American College of Radiology, radiologist-in-chief of Massachusetts General Hospital, and Juan M. Taveras Professor of Radiology at Harvard Medical School.

Readers Write: Why Can’t I Be Both Patient and Customer?

April 13, 2016

Why Can’t I Be Both Patient and Customer?
By Peter Longo


I love the clinicians at my local health system. However, I hate the bills from my local health system.

When the clinic staff helped last month with my knee, they were the best — rock stars. When I got their confusing bill, they were the worst. Is there any other industry where you love the service, but 30 days later, they go out of their way to take away all of your happy thoughts?

Yes, I did something stupid again. Over the holidays, I took some time off to go skiing with the family. Time with the family was not stupid; skiing in the trees was stupid. (Note to self: you are not in your 20s anymore and need to take it easy.) The ensuing tumble, spin, twist, and crash resulted in an injured knee.

I entered the local university health system in search of a cure. In total amazement, I walked into the office and the entire staff greeted me. Just like in the Gap, the entire front staff looked up and said “hello” loudly.

Over the next month, the medical group and hospital went out of their way to make me feel at home … until the bill came. Or should I say “bills” (plural). They should have stamped on the envelopes, “Screw you” in an effort to be more honest.

Most of the bills appeared to be for my knee, based on the dates of service. But for the record, they decided to add some of my wife’s medical charges into the mix on one statement.

Having spent 25 years working in the healthcare tech world plus having two graduate degrees, it still did not give me the skills to make any sense of the bills. I decided to call them at 4:50 one afternoon. The very nice recording said, “The billing office closes at 4 p.m. Monday through Friday.” Seriously? What about people who work and don’t have time to call until after work, or on the weekend? The Gap has greeters, but they are open nights and weekends. Seems my health system copied the Gap only on the greeters.

A few days later, I was able to talk to someone. I started the call by saying, “I want to pay all that I owe, so please provide a summary and explain the charges so I can pay you.” Surprisingly, they did not understand half the statements. They indicated they could not access the “other system that has more information,” so they would need to call me back.

A few days later, someone from the billing office called. Together we figured out where there were some discrepancies and determined the correct amount owed. She indicated she would clean everything up and send me a new statement. Thirty days later, I got the statement and paid right away. As I was writing that check, I had already forgotten about how they “cured” me, as it seemed so long ago.

The cost for the billing staff involved in my bill was probably more than what I owed, so I did feel bad for them. That sympathetic feeling only lasted a short time. Last night I got a call at the house. My 15-year-old handed the phone to me. I owe $25 and they sent it off to their collection agency.

Is it too much to ask that my health system treat me both as a patient and as a customer?

Peter Longo is SVP/chief revenue officer of Sirono of Berkeley, CA.

Readers Write: Three Reasons EHRs Need to Treat Biosimilars Differently from Generics

April 13, 2016

Three Reasons EHRs Need to Treat Biosimilars Differently from Generics
By Tony Schueth


Biosimilars are being introduced in the United States and are expected to quickly become more mainstream in the near future. In response, stakeholders are beginning to work on how to make them safe and useful within the parameters of today’s healthcare system.

The reason is that biosimilars, like biologics, are made from living organisms, which makes them very different from today’s conventional drugs. These differences will create challenges and opportunities in how they are integrated in electronic health records (EHRs) and user workflows as well as how patient safety may be improved.

Normally, there is a lot of lead time before EHR vendors must address such issues. Things are different with biosimilars. Here are some reasons.

There are powerful drivers

Several drivers will stimulate demand for EHRs to address biosimilars sooner rather than later because of the central role EHRs play in value-based care coordination and patient safety.

New biologics will be bursting on the healthcare scene. Although biosimilars have recently been approved for use in the US, they have been in use extensively in Europe and Asia for many years. More than 80 biosimilars are in development worldwide, and the global biosimilars market is expected to reach $3.7 billion. This will stimulate rapid adoption by payers and physicians in the US, which, in turn, will create the need for EHRs to capture and share a variety of information about biologics and biosimilars. It is easy to envision the availability of four biosimilars for 10 reference products in 2020, given projected market expansions.

Next, uptake in the US is expected to take off because biosimilars are lower-cost alternatives that will be used to treat the growing number of patients with such chronic diseases as arthritis, diabetes, and cancer. Rand has estimated savings from using biosimilars at $44.2 billion over 10 years. Money talks and payers will create demand for EHRs to fold biosimilars and biologics into EHR functionalities and workflows.

Payers and regulators also will demand enhanced tracking of biologics and biosimilars because they are key pieces of the move toward value-based reimbursement and are a focus of public and private payers. Identifying, tracking, and reporting adverse events that might be associated with biologics and biosimilars are expected to become key metrics for assessing care quality and pay-for-performance incentives.

Biosimilars are not generics

It would be a mistake to think of biosimilars as being synonymous with generics, which have been around for years and use mature substitution methodology. The reason begins with the fact that biologics and biosimilars are medications that are made from living organisms. Unlike generics, which have simple chemical structures, biosimilars are complex, “large molecule” drugs that are not necessarily identical to their reference products, thus the term “biosimilar,” not “bioequivalent.” In addition, biosimilars made by different manufacturers will differ from the reference product and from each other, making each biosimilar a unique therapeutic option for patients.

Furthermore, biologics and biosimilars have varying locations where they are administered, most commonly infused in physician offices, hospitals, or special ambulatory centers, or by patients at home. Given that administration location and type can vary, such information — along with the particulars of the drug that was administered — must get back to the physician and incorporated into the patient’s EHR record.

Getting this information into the patient’s record in the EHR also is important for improving patient safety. That is because it will help in identifying and distinguishing the source of the adverse drug events and patient outcomes from a biosimilar, its reference biologic, and other biosimilars.

Substitution laws are expanding and evolving

Developers of EHR systems will need to keep abreast of evolving state laws concerning substitution. In fact, many states already are considering substitution legislation or have enacted it. According to the National Conference of State Legislatures, as of early January 2016, bills or resolutions related to biologics and/or biosimilars were filed in 31 states. Keeping pace with these new laws is likely to be a challenge to ensure that EHRs are compliant, especially since requirements are apt to vary considerably from state to state. Given the rapid changes in the regulatory landscape, latency of updates to EHR systems is a problem that needs to be addressed.

Not only that, the drug that is dispensed may be very different from what was prescribed. As a result, it is important for physicians to know whether a substitution has been made and to capture information about the drug that was administered in the patient’s EHR record. Because of the differences from conventional medications, different, more granular information, such as lot number, will also be required. This is important for treatment and follow-up care as well as in cases where an adverse drug event or patient outcome occurs later on.
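A minimal sketch of what this more granular dispense record might hold follows. The fields and values are hypothetical, chosen only to show the substitution flag and lot-level traceability the text describes:

```python
# Hypothetical sketch of the extra granularity a dispense record needs
# for a biologic or biosimilar. Fields and values are illustrative only.
from dataclasses import dataclass

@dataclass
class DispenseRecord:
    prescribed_product: str
    dispensed_product: str   # may differ if a substitution was made
    manufacturer: str
    lot_number: str          # lets an adverse event be traced to its source

    @property
    def substituted(self) -> bool:
        return self.prescribed_product != self.dispensed_product

rec = DispenseRecord(
    prescribed_product="reference biologic X",
    dispensed_product="biosimilar X-1",
    manufacturer="Maker A",
    lot_number="LOT-2016-0417",
)

if rec.substituted:
    # Surface the substitution to the prescriber and store the lot number
    # in the patient's record for later adverse-event attribution.
    print("substitution:", rec.dispensed_product, "lot", rec.lot_number)
```

Capturing manufacturer and lot alongside the product name is what lets a later adverse event be attributed to a specific biosimilar rather than to the reference biologic or another manufacturer's version.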

All in all, EHRs will face a brave new world when it comes to adapting to biologics and biosimilars.

Tony Schueth is CEO and managing partner of Point-of-Care Partners of Coral Springs, FL.

Readers Write: The Future of Mobility and Cloud in Healthcare

April 6, 2016

The Future of Mobility and Cloud in Healthcare
By Joe Petro


For some time now, we’ve been hearing concerns voiced by physicians about how complicated their lives have become due to the mountainous documentation requirements. Among the most difficult is capturing the details a patient shares during a consultation and trying to fit that information into the structured template found in today’s EHRs.

How can we expect a patient’s story to be impactful when all its context and richness is lost to making sure we click and check the right boxes? This is a byproduct of all the initiatives coming out of the federal government. The EHRs are left with no choice but to force the structured capture of clinical documentation.

At the same time that requirements are changing, so is the technology physicians use. Physicians are increasingly mobile, and the right technologies can improve their experience by letting them capture the patient story across the multitude of devices they use throughout the day. Executed properly, this ultimately streamlines the documentation burden, as certain technologies, such as speech recognition and language understanding, let them capture the required documentation in a more natural way.

In parallel, we are seeing the emergence of a cottage industry of mHealth app vendors looking to bring innovative technologies to the healthcare workflow. We have reached a tipping point where technical tools put a large number of advanced capabilities within easy reach, making it possible for the entire industry to create solutions and applications that are immediately impactful. This is a unique time and place in our technological evolution in the healthcare space.

Cloud is an example of a set of technologies that makes things easier and has the potential to deliver high impact. The cloud makes it possible for technologies to meet physicians wherever they are, on any device, at any time. For example, physicians can enter data into their mobile devices/apps any time, anywhere, and on the go. The cloud will be there to broadcast this information far and wide to EHRs or other apps and tools in a more meaningful way no matter where it originated. Thanks to cloud enablement, mHealth apps and other innovations become more useful to the physicians who want to be mobile.

Mobile and cloud innovations are impacting patients as well. Mobile applications and wearable devices are allowing patients to manage their own health, capture their own health data, and turn this data into actionable insights. Our lives and our health are on the brink of being substantially instrumented. We are now tracking sleep and eating patterns and mobile devices are starting to capture valuable information from blood pressure to heart rate to weight and more.

This technology can help patients comply with the treatment plans that physicians prescribe by allowing them to report progress or other important details in real time. The cloud is connecting patients to their own personal health experience, enabling them with the tools they need to better look after and manage their own health. It also connects patients to their healthcare providers and institutions before they actually need to receive care, potentially keeping them out of the hospital in the first place. This evolution is taking place today.

We’re transitioning to a phase where we can truly call this “healthcare” instead of “sick care,” a phase where we are shifting to managing our health proactively instead of just managing a sickness after it has already happened. With all this data available via the cloud, EHRs and all health-oriented applications will evolve, making it easier for physicians to leverage the technology to increase productivity and improve quality of care. The value that the EHRs are promising to deliver will be delivered partly through this mechanism.

As we continue down this path, we move towards a setting that seems as if it’s almost from a futuristic movie where everything in healthcare is mobile, connected, and intelligent. We’re going to see patients surrounded by enabling technology in such a way that intelligent services in the cloud will help their mobile devices keep track of important information that can then be used during visits with their physicians or, more importantly, prior to visits.

Physicians will be primed for the visit with everything they need on a device, reducing the time patients spend having to tell the same thing to three different people upon entering a health system. Present-day documentation requirement problems will eventually fade into the background as technology advances and interacting with these systems become more human-like and natural. Physicians will be able to focus fully on what got them into medicine in the first place: caring for their patients.

Joe Petro is senior vice president of healthcare research and development of Nuance of Burlington, MA.

Readers Write: Tax Rebate? Insurance rebate!

April 6, 2016 Readers Write No Comments

Tax Rebate? Insurance rebate!
By Richard Gengler

image

Now that tax season is in full swing and the eventual rebate is around the corner, it is an ideal time to think about another kind of rebate. This one stems from the changes in healthcare policy under the Affordable Care Act (ACA), with its increasing push toward the triple aim of improving the patient experience, improving the health of populations, and reducing the per capita cost of healthcare.

With the individual markets becoming the fastest-growing part of the payer sector and increasingly competitive, payers are searching for any potential leverage to obtain, retain, and grow their membership base. There is more discussion on the importance of net promoter score (NPS), whereby payers can utilize their existing members to act as promoters.

By utilizing new innovations and alternative service modalities, insurance companies are able to hit all three parts of the triple aim. Almost daily we hear about innovations with greater than 90 percent user satisfaction rates that have a significantly positive impact on population health at potentially a fraction of the cost.

Health plans are required to have an 80 percent or 85 percent medical loss ratio (MLR), meaning that they must spend at least that share of the premiums they collect on medical expenses. The rest can be used for administration, profit, and marketing. Any shortfall below this percentage must be refunded to members, by law. Great idea, but does this actually work?

Looking back to 2014, plenty of insurers issued rebates to their members across the individual, small group, and large group markets. Take, for instance, Celtic Insurance Company in Arkansas, which paid $6,774,488 in rebates in its individual market. Or California Physicians Service, with an astounding $21,819,095 for its small group market. In the large group market, Cigna Health and Life Insurance Company of DC sent back $5,608,359.


One would think this is an opportunity to fully engage and grow membership. Data from the Kaiser Family Foundation shows that many insurance companies are not meeting the medical loss ratio standards. This signals a missed opportunity.


The math is quite simple.

Let’s take, for instance, a population of 3 million Americans using a service that traditionally costs $1,751 per person per year. If there were a clinically equivalent alternative service modality for $30, this would create a savings of $1,721 per person, a difference of 98 percent. If premiums and other elements remain the same, this could be extrapolated out to provide bountiful rebates to the members.
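The arithmetic is easy to check in a few lines of Python. The population, prices, and savings come straight from the example above; nothing else is assumed:

```python
# Savings from substituting a $30 alternative service for a $1,751 one.
traditional_cost = 1751   # dollars per person per year
alternative_cost = 30

savings_per_person = traditional_cost - alternative_cost        # 1,721
pct_difference = savings_per_person / traditional_cost * 100    # ~98%

print(f"Savings per person: ${savings_per_person:,}")
print(f"Percentage difference: {pct_difference:.0f}%")

# Across the 3 million people in the example:
population = 3_000_000
total_savings = savings_per_person * population
print(f"Total savings: ${total_savings:,}")   # $5,163,000,000
```

Spread across premiums, a savings pool of that size is what would flow back to members as MLR rebates.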

Next time you are thinking about innovative strategies to increase the NPS of your members while increasing membership, think about your taxes. Your members will thank you, tell their friends, and increase your membership.

Richard Gengler is founder and CEO of Prevail Health of Chicago, IL.

Readers Write: All Claim Attachments are Not Created Equal

April 6, 2016 Readers Write No Comments

All Claim Attachments are Not Created Equal
By Kent McAllister

image

According to the 2014 CAQH Index, responding health plans representing 103 million enrollees returned data on claim attachments. Those responses showed approximately one claim attachment for every 24 claims during 2013.

Interestingly, the vast majority of claim attachments were submitted manually via paper delivery or fax. CAQH counted approximately 46 million claim attachments processed among the plans reporting, which can be extrapolated to roughly 110 million claim attachments industry-wide.

CAQH also estimates another 10 million prior authorization attachments. This statistic suggests a total of 120 million attachments annually across healthcare.
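The extrapolation from 46 million reported attachments to roughly 110 million industry-wide implies scaling by total covered lives. The 246 million total-insured figure below is my assumption for illustration, chosen to reproduce the article's numbers; it is not stated in the CAQH Index:

```python
# Scale the reported attachment volume to the whole industry.
reported_attachments = 46_000_000    # among plans covering 103M enrollees
reported_enrollees = 103_000_000
total_insured = 246_000_000          # assumption; not stated in the CAQH Index

industry_attachments = reported_attachments * total_insured / reported_enrollees
prior_auth_attachments = 10_000_000

total = industry_attachments + prior_auth_attachments
print(f"~{industry_attachments / 1e6:.0f}M claim attachments industry-wide")  # ~110M
print(f"~{total / 1e6:.0f}M attachments annually overall")                    # ~120M
```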

There’s a clarification, however, that must be made when dealing with attachments. Electronic attachments, in and of themselves, are not always the same despite industry rhetoric claiming that there is little difference between the healthcare sectors.

When dealing with the substance of attachments, there are two major distinct segments that providers must accommodate. These two segments are vaguely similar at the highest level, but distinctly different at the business process level for a few reasons. These two segments align with respective accountable payer organizations:

  1. Health and dental plans: commercial health plans and federal and state fiscal agents and administrators,
  2. Workers compensation (WorkComp): property and casualty insurance carriers and third-party-administrators.

The majority of the 120 million attachments are processed by health plans. Dental plans also manage an essentially equivalent business process for handling attachments, often through the same technical channels and human resources with similar skills.

Workers compensation claims, on the other hand, while voluminous, have a notably different set of business processes because of a number of distinctions in both the property and casualty insurance business and in the nature of “claims” in WorkComp parlance.

A WorkComp claim is generally related to an individual injured on the job. That claim may have a life of many months, or, in some cases, years. Resulting from that claim are typically many bills (or e-bills) that usually have an attachment. The e-bill submission process is more similar to property and casualty processes — such as auto physical damage — than to traditional health and dental plan processes.

An interesting contributor to this distinction is that property and casualty insurers are not considered “covered entities” under the 1996 HIPAA legislation. This is important, and any industry observers not recognizing this are failing to accommodate a major consideration.

Just as not all claim attachments are equal, neither are all vendors. For example, some companies that are heavily involved in the P&C space don’t work with the medical side, while others focus almost exclusively on medical. Vendors usually serve one of the two often-unrelated markets.

Providers must be aware of the differences. P&C electronic attachments, even though they may sound as if they’re in the healthcare setting, just don’t carry the same weight as electronic claims actually exchanged to support patient claims generated within a health system. Likewise, those vendors that work almost entirely in healthcare have little claim, if any, to the P&C market.

In a market filled with healthcare claims-related vendors, healthcare organizations must be able to place their trust in partners that understand the complete landscape of the healthcare space. They should also know that even though WorkComp may appear on the surface to be medical, it requires an entirely different scope of work. In this burgeoning sector of healthcare administration, messages are often painted with too wide a brush, and healthcare leaders should be wary when entering into conversations that broach the subject of electronic attachments.

For the improvement of all parties involved, vendors should recognize and articulate the differences between health and dental attachment processes and WorkComp attachment processes in their public messages. The industry will be better served if vendors accept a mandate to clarify market confusion and to paint clearer lines as to their roles in electronic attachments.

Kent McAllister is chief development officer of MEA|NEA|TWSG of Dunwoody, GA.

Readers Write: Time for Providers to Lead the Price Transparency Revolution

March 23, 2016 Readers Write 5 Comments

Time for Providers to Lead the Price Transparency Revolution
By Jay Deady

image

With ICD-10 in the rear-view mirror, providers now face a new challenge – answering the public and media call for consumer price transparency. High-deductible plans now cover nearly a quarter of those Americans with commercial insurance, raising the ante on patient financial responsibility. Yet large numbers of patients remain confused about how much they will owe for hospital services—a full 36 percent, according to one survey.

This problem, unheard of in other consumer industries, not only endangers patient satisfaction scores, but threatens to increase the bad debt load of organizations already struggling with severely low margins.

While insurance companies and employers have deployed some pricing tools, they have done a poor job of accurately representing multiple providers’ fees within a geographic area. New technologies are available from a handful of companies that let providers take the price transparency bull by the horns and lead the charge themselves.

These technologies transcend the usual approach of mere compliance with a state’s price transparency laws. Posting a list of charges on a provider’s website may satisfy the letter of the law, but it fails to give consumers an accurate picture of what they will owe for services. Knowing this, providers have struggled to come up with an alternative that does not reveal proprietary information to their competitors. Most have concluded there is no easy way to accomplish this, so they refer patients’ questions to their insurance companies.

But it turns out the path to truly efficient, accurate, and accessible price transparency is one that healthcare consumers can take themselves—directly from the provider’s website.

Healthcare consumers want – and deserve – an accurate understanding of what they will owe for services before they are rendered. The operative word here is “accurate”—as in an estimate based on the consumer’s current levels of insurance coverage. Or, in the case of a self-pay patient, an estimate based on the provider’s discounted fees for consumers that pay fully out of pocket.

Either way, with self-service pricing, healthcare consumers generate the estimates themselves, typically from an online calculator on the provider’s website. The process is quick and hassle-free. A consumer simply inputs their name, insurance plan number, and perhaps two or three more data elements. Within 10 to 45 seconds, a complete and accurate estimate appears, giving consumers immediate, line-item insight into what they will owe.

The process is powered by rules-based engines that automatically query, retrieve, and combine data from payer portals with the hospital’s charge master data and payer contracts. Analytics plays a critical role in assuring the estimate is accurate, including analysis of previously adjudicated claims to identify variances.
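At its core, an estimate of this kind combines the payer's negotiated rate with the patient's current benefit accumulators. The sketch below is illustrative only; the parameter names and figures are mine, not any vendor's actual rules engine:

```python
def estimate_patient_responsibility(charge, contract_rate_pct,
                                    deductible_remaining, coinsurance_pct,
                                    out_of_pocket_remaining):
    """Rough sketch of a self-service price estimate.

    charge: hospital charge master amount for the service
    contract_rate_pct: payer's negotiated rate as a fraction of charges
    """
    allowed = charge * contract_rate_pct             # negotiated (allowed) amount
    deductible_part = min(allowed, deductible_remaining)
    coinsurance_part = (allowed - deductible_part) * coinsurance_pct
    estimate = deductible_part + coinsurance_part
    return min(estimate, out_of_pocket_remaining)    # OOP maximum caps the bill

# A $4,000 charge, 50% negotiated rate, $500 of deductible left, 20% coinsurance:
print(estimate_patient_responsibility(4000, 0.50, 500, 0.20, 3000))  # 800.0
```

The analytics described above — checking estimates against previously adjudicated claims — is what keeps inputs like the contract rate honest over time.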

Such a tool neatly solves one of the most persistent challenges of implementing price transparency: the pitfalls of making proprietary financial information public. Because it is a provider-facing solution and patient-unique information must be entered to generate an estimate, not just anyone can use the calculators. This is vastly preferable to putting a list of total charges or paid amounts out there for all competitors to see, which reflects neither negotiated rates with payers nor the patient’s accurate out-of-pocket costs.

At the same time, self-service price calculators appeal to today’s information-driven patients and nicely align with how they already seek pricing on other purchases, from airfare to mortgages.

One of the most promising advantages of a self-service price calculator is its potential to engage consumers in multiple ways. After generating a price estimate, for example, the calculator could prompt high-deductible and self-pay consumers to view payment plan options. It could even engage those patients with concerns about their ability to pay and schedule time with a financial counselor. Realistically, we can only expect such concerns to grow along with the increasing number of high-deductible health plans. Since these plans were introduced in 2006, their share of the commercially insured has increased from 4 percent to a whopping 24 percent.

A deductible payment and co-insurance spread out over a year, or whatever the time span the provider and patient agree on, is clearly more manageable than a lump sum payment. Armed with clear, accurate information about how much they will pay—and how—healthcare consumers can better plan for paying their medical bills. This in turn will help reduce a hospital’s bad debt or charity write-offs.

Most important, patients who clearly understand their financial responsibility are more likely to schedule rather than delay urgently needed care. This reason, above all others, is why providers would be wise to take control of the price transparency issue now.

Jay Deady is CEO of Recondo of Greenwood Village, CO.

Readers Write: Trend Watch: Innovation Forges On in the Provider Sector

March 7, 2016 Readers Write No Comments

Trend Watch: Innovation Forges On in the Provider Sector
By John Kelly

image

Provider organizations face tremendous innovation challenges. The success or failure of new systems and technology will depend on their ability to adapt and anticipate the impact of major industry changes. Looking ahead to a successful 2016, hospitals and provider organizations should still expect barriers to using EMR data, should be wary of the hype surrounding cognitive systems, and should prepare for a world of value-based care partnerships in which providers and payers share information in ways not imaginable until recently.

EMR data will not be fully liberated in 2016

The barriers to moving data in and out of EMRs will not abate in 2016, despite pressure. The business model of EMR vendors and real technological barriers will continue to thwart the interoperability goals sought under Meaningful Use.

The good news is that providers and payers are establishing pockets of innovation using edge technologies to support better care and risk sharing based upon shared data, and the public outcry over data blocking from EHRs will eventually force vendors to adopt standard APIs. We can expect the personal health data train to gain momentum with hundreds of new market entrants, but not in 2016.

Don’t trust the hype around cognitive systems

Technology-based cognitive systems in healthcare are not in our immediate future. There is a lack of clarity around the FDA’s rules regarding software that makes a medical decision — when does it have to be certified as a medical device? Without medical device certification, can the output of cognitive systems be loaded into an EMR? What about malpractice liability?

Analytics vendors and their customers have been tentative in applying the technology to direct patient care, and counter to what other prognosticators believe, this liability and the fear of the unknown will slow down the cognitive market in the US.

ACOs will invest in payer technology

Successful ACOs will require the technology to support all-payer data ingestion. They will need to see the patients as a single population, but within the context of separate payer contracts. These organizations are beginning to invest in the technology that payers have used for years to successfully acquire and integrate claims data with their population health registries.

If providers are to succeed in assuming risk, it will be by employing a highly focused health management approach that addresses the specific risks associated with specific populations of patients. Population and risk analytics infrastructure requires capital investment beyond the reach of many small and mid-size provider organizations. To encourage providers to assume greater risk for performance, payers will offer shared information exchange platforms that augment provider capabilities with analytic services.

Accountable care continues to evolve

Healthcare market transformation will gain momentum in 2016 and provider organizations should also consider the following:

  • Most first-generation ACOs will fail because they don’t know what it means to truly manage risk. They do not have the ability or will to modify how they treat patients. CMS, commercial payers, and the provider community have to figure out how to hold providers harmless for what they can’t control while rewarding them for the things they can do well, then help them bet on their ability to deliver consistently on their promises.
  • 2016 will see an assault on post-acute care providers, who until this point have long been profitable even as many provide little relative value. This will affect nursing homes, outpatient rehabs, and even vendors who sell to post-acute care providers. The release of Medicare data for public research, particularly in the area of Medicare fraud, combined with the high-profile budget line for post-acute care will accelerate the move to overhaul the post-acute care industry.
  • Finally, don’t expect a change in administration to affect CMS innovation. Regardless of the 2016 Presidential election outcome, payment reform will continue, primarily for macro-economic reasons, but also because of the political reality that both parties favor fundamental reform.

John Kelly is principal business advisor at Edifecs of Bellevue, WA.

Readers Write: The Many Flavors of Interoperability

March 7, 2016 Readers Write 9 Comments

The Many Flavors of Interoperability
By Niko Skievaski

image

As the shift towards value-based care persists, the demand for data is as hot as ever. That means the term “interoperability” will be thrown around a lot this year. Let’s describe the various flavors in which it will inevitably be discussed. I’ve seen many conversations become confused as the context for the buzzword is mixed. Here’s an attempt at outlining the various i14y use cases. (Can we start abbreviating it like we do i18n?)
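For the record, i14y follows the same numeronym rule as i18n: first letter, count of the letters in between, last letter. A throwaway function makes the abbreviation mechanical:

```python
def numeronym(word: str) -> str:
    """Abbreviate a word as first letter + middle-letter count + last letter."""
    if len(word) <= 3:
        return word  # too short to abbreviate meaningfully
    return f"{word[0]}{len(word) - 2}{word[-1]}"

print(numeronym("interoperability"))      # i14y
print(numeronym("internationalization"))  # i18n
```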

Interoperability for Care Continuity

This is the iconic use case that first comes to mind. Chronically ill patients with binders full of paper records and Ziplocs bulging with pill bottles. As patients bounce around town seeing specialists, they often need to repeat demographic data, med lists, allergies, problems, diagnoses, prior treatment, etc. The solution to this use case calls for ad hoc access to a patient’s data at the point of care. A provider’s chart doesn’t necessarily need to be synced to all other providers in the disjointed care team. Rather, the data needs to be available upon request from the relevant provider.

New payment models have fueled demand for this solution. In a fee-for-service world, redundant tests actually brought more income to the health system, whereas in value-based models, excessive costs are eaten by the organization. This aligns the provider and patient by incentivizing only the tests and treatments that have the highest likelihood of impacting the patient’s health. Understanding the value of any given treatment also requires looking across a wide set of patients. This brings us to the second use case.

Interoperability to Measure Value

In order to understand how to pay for healthcare based on value, we must make an attempt to measure the impacts to health: a patient’s health is a function of the healthcare they receive as well as a slew of other variables. Estimating this relationship requires a magnitude more data than we’ve traditionally measured. Beyond knowing the diagnosis and treatment, we’d need to control for behavior, family history, comorbidities, prior treatments, etc. Basically everything we can know about a patient’s health. And that’s for a single patient. To build a model, we’d need this information from a large sample of patients to determine the impact of each of these variables. But as treatments are provided to patients and we receive more results, we’ll need to be updating our models to refine their accuracy over time.
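The estimation problem described here is, at its simplest, a regression of outcomes on treatment plus controls. The sketch below is a deliberately toy illustration on simulated data — the variables, effect sizes, and model form are mine, not a clinically meaningful specification:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated patient covariates and a binary treatment indicator.
age = rng.normal(55, 12, n)
comorbidities = rng.poisson(1.5, n)
treated = rng.integers(0, 2, n)

# "True" outcome: treatment improves health by 4 points; confounders matter too.
outcome = 4.0 * treated - 0.1 * age - 2.0 * comorbidities + rng.normal(0, 3, n)

# Ordinary least squares: outcome ~ treated + age + comorbidities + intercept.
X = np.column_stack([treated, age, comorbidities, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"Estimated treatment effect: {coef[0]:.2f}")  # close to 4.0
```

The point of the toy: omit the controls and the treatment estimate drifts; widen the sample and it tightens. That is exactly why the author argues for "everything we can know" across a large population.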

Much of this data is stored in an electronic health record over the time period a patient was cared for by that health system. But it’s likely missing data from care outside of that health system. And beyond that patient, how could we combine this record with a sizable population to make a predictive (or even representative) model? Even at very large health systems, limiting their records down to the few who have a rare diagnosis for a given sex and age, the sample set can become insignificantly small.

This i14y use case requires large sets of longitudinal data, rather than single patient records in an ad hoc query. Current attempts at producing such data sets have been extremely resource intensive and normally centered around research efforts focused on a single diagnosis in a de-identified manner. We’ve also seen rampant consolidation in the industry, partially driven by the notion that taking care of larger and larger populations of patients will enable more accurate estimations of value.

Interoperability to Streamline Workflows

This i14y use case has been around since before the term garnered widespread adoption in healthcare. HL7 was created back in 1987 to develop a standard by which health data could be exchanged among the various systems deployed at a health system: electronic health records, lab information systems, radiology information systems, various devices, and pretty much everything else deployed in the data center. These systems are most often tied to a centralized interface engine that acts as a translation and filtering tool bouncing transactional messages between them.

So problem solved, right? Not quite. Over the past few decades, health systems have customized their HL7 deployments just as isolated communities evolve a language into a dialect. This proves problematic, as each new software application adopted by the health system requires extensive interface configuration and the precious FTEs that entails. Interface teams are increasingly the most backlogged tranche of the IT department. As health systems search for more efficient ways to deliver care, they’re more often turning to cloud-based software applications because of the dramatically reduced infrastructure costs and mobility.
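The dialect problem is easy to see in the raw format: an HL7v2 message is just pipe-and-caret delimited text, and which fields a site populates, and with what local codes, is where deployments diverge. A minimal parse of a PID (patient identification) segment, using a made-up example message:

```python
# HL7v2 segments are pipe-delimited fields; ^ separates components within a field.
pid_segment = "PID|1||12345^^^MRN||DOE^JANE^Q||19800101|F"

fields = pid_segment.split("|")
patient_id = fields[3].split("^")[0]           # PID-3: patient identifier list
family, given = fields[5].split("^")[:2]       # PID-5: patient name
birth_date = fields[7]                         # PID-7: date of birth

print(patient_id, family, given, birth_date)   # 12345 DOE JANE 19800101
```

Every interface engine in the building is doing some version of this, per sender, per receiver, with site-specific field conventions layered on top.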

This use case likely requires upgraded infrastructure that allows a health system to efficiently connect with and communicate with cloud applications. The customized HL7 dialects will need to be replaced or translated into something consistent and usable for cloud applications. HL7, the organization, is currently developing FHIR as a much needed facelift to a graying standard. In the coming years we look forward to seeing more FHIR adoption in the industry, and hope to avoid the level of customization we have seen with HL7v2 — although initial feedback and documentation from EHR vendors is not promising.
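By contrast, FHIR exchanges typed JSON resources over a REST API, so a Patient should look the same (in principle) everywhere. A minimal example of reading the standard fields from a pared-down FHIR R4 Patient resource, with illustrative values:

```python
import json

# A pared-down FHIR R4 Patient resource (illustrative values).
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"use": "official", "family": "Doe", "given": ["Jane"]}],
  "gender": "female",
  "birthDate": "1980-01-01"
}
"""

patient = json.loads(patient_json)
official = patient["name"][0]
print(f'{official["given"][0]} {official["family"]}, born {patient["birthDate"]}')
```

Whether vendors keep those structures consistent, rather than recreating the HL7v2 dialect problem through extensions, is exactly the concern raised above.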

Interoperability to Engage Patients

This is likely the most interesting need for i14y because of its potential. Patients don’t currently walk into a doctor’s office and demand that their health data be electronically sent to applications of their choosing. But then again, where are these applications? The inability of patients to authorize API access to their health data has undoubtedly stifled the development of innovative applications. Instead, new application creation has focused on the B2B space in search of enterprise revenue.

If a patient could download an app on their phone and authorize it to pull their medical history, an army of coders would mobilize in creating apps to engage patients as consumers. Application adoption would be holistically democratized and new apps would get to market instantaneously, as opposed to the usual 18-month B2B sales cycles. Applications would be developed to help patients decipher the complexities of care, track care plans and medication adherence, and benchmark against others with similar comorbidities. They could effortlessly download and store their records and be the source of truth. They could contribute their records to research banks that would be willing to pay for their use. Widespread adoption of patient authorized access to health data would almost make the other i14y use cases moot.
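The authorization piece of that vision already has a shape: SMART on FHIR layers OAuth2 on top of the FHIR API, so an app requests patient-scoped access and the patient approves it. Building the authorization request is ordinary OAuth2; the endpoints and client ID below are placeholders, not a real server:

```python
from urllib.parse import urlencode

# Hypothetical endpoints and client ID, for illustration only.
authorize_endpoint = "https://ehr.example.com/oauth/authorize"
params = {
    "response_type": "code",
    "client_id": "my-patient-app",
    "redirect_uri": "https://app.example.com/callback",
    "scope": "patient/*.read launch/patient",  # SMART patient-scoped read access
    "state": "xyz123",
    "aud": "https://ehr.example.com/fhir",     # the FHIR server being authorized
}

url = f"{authorize_endpoint}?{urlencode(params)}"
print(url)
```

The patient signs in at that URL, approves the scopes, and the app exchanges the returned code for a token — at which point the army of coders has what it needs.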

Luckily, we’re getting closer. There’s mention of its mandate in MU3. One of the challenges is solving the chicken-or-egg problem: we need enough widespread adoption of a single authentication framework and data standard to simultaneously sway the development community and health systems. MU3 seeks to force the health system side of that equation, but its current draft fails to mandate a prescriptive framework or standard and wavers on its timeline. As written, health systems could comply with differing technologies, making the problem only slightly better.

I’m optimistic, as accelerating demand has spurred i14y innovation across the sector. HL7 is rapidly organizing support around FHIR and SMART. Incumbent integration engines are stepping up their game and outside integrators are rapidly moving into healthcare. Startups are sprouting to tackle pieces of the problem. Some health systems are proactively standing up their own i14y strategies. EHR vendors are vowing to adopt standards and roll out tools to encourage application development. I don’t doubt that we’re beginning to see the fruits of the solutions that will be adopted in the years to come. But it’s on us — as providers, technologists, developers, and patients — to continue the rallying cry by demanding i14y now.

Niko Skievaski is co-founder of Redox.

Readers Write: HIMSS, Ice Cream, and the Law of Diminishing Returns (LoDR)

February 24, 2016 Readers Write 10 Comments

HIMSS, Ice Cream, and the Law of Diminishing Returns (LoDR)
By Mike Lucey

image

“Clearly the third scoop has fewer calories than the first and second. It is simply the law of diminishing returns.” This perverse application of the LoDR only returns a derisive, “You are pathetic” from my wife when used to justify the purchase of a large ice cream sundae. But I carry on and get the nuts on top — they are healthy.

It’s not that the LoDR doesn’t apply, just that I apply it to the wrong side of the counter. The medium at $2.75 (two scoops) and the large at $3.25 (three scoops) deliver less value per scoop to the ice cream lady. Extended far enough (five or six scoops?), it would reach the breaking point where the ice cream would cost more to scoop than it would return in cash.
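The counter-side arithmetic is easy to make explicit from the two menu prices above: each additional scoop costs the buyer less at the margin, which is exactly the seller's diminishing return.

```python
medium_price, medium_scoops = 2.75, 2
large_price, large_scoops = 3.25, 3

# Marginal price of the third scoop, and average price per scoop at each size.
marginal_price = (large_price - medium_price) / (large_scoops - medium_scoops)
print(f"Third scoop costs the buyer ${marginal_price:.2f}")        # $0.50
print(f"Average price per scoop: medium ${medium_price / medium_scoops:.2f}, "
      f"large ${large_price / large_scoops:.2f}")                  # $1.38 vs $1.08
```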

I wonder if some in our industry are confused as to which side of the counter they are on. More importantly, I wonder whether the LoDR will flip the counter when we are not looking. Are we effectively and consistently asking the question, “Am I getting more than giving, or giving more than getting, as I continue down this project path?”

Back in my days in financial services (maybe because our product was money), every project was systemically graded for current value. “Current” being the critical word. Not graded against the expected value we assigned at the start, but against the current costs, current value, and (here’s the kicker) current alternatives.

image

The Boston Globe recently published an article citing a Health Policy Commission study of the disparate cost of care in Boston-area hospitals. Using maternity services as an example, the study found large differences in what hospitals charge.

For us in the healthcare IT industry, it is notable that four of the five top hospitals are actively using or have recently installed Epic with a big price tag (three Partners hospitals, one UMass). This correlation raises the question: how much IT cost flows through the system, and are there effective checks against these rising costs? Did the LoDR flip the counter in these cases?

To Epic’s credit, there is a concerted effort on their part to control costs that are often embedded in questionable customization. In other words, the folks at Epic are applying the concern of LoDR against the impulse of the client to work toward the elusive “best” at an ever-growing expense.

As we head toward HIMSS, our annual festival of IT goodies, we get to see a whole new set of “current” alternatives. Can we review the new stuff through the filter of the LoDR? Stuff that is truly new for me: does it get me more than I need to give? And the stuff that is newer than what I have: does it keep me on (or get me back on) the right side of the counter?

And for me the ultimate question: who’s giving away free ice cream? Because free ice cream has no calories. Everyone knows that.

Mike Lucey is president of Community Hospital Advisors of Reading, MA.

Readers Write: Removing Tunnel Vision from Enterprise Imaging

February 24, 2016 Readers Write 2 Comments

Removing Tunnel Vision from Enterprise Imaging
By Karen Holzberger


I find the evolution of technology to be fascinating. Just think about music. Fifteen years ago, CDs were the most popular way to access music. Now you can listen to music anywhere, instantaneously, from tiny devices. The population has universally embraced the change. Why has accepting change in healthcare been so slow and difficult?

I’m not saying we all need to be on the bleeding edge of innovation, but it’s important to remove the tunnel vision and recognize advances not just in diagnostic medicine or medical research, but also in health IT innovations that make things faster, easier, and less costly.

I was surprised when I read a recent report on enterprise imaging whose research and results were limited to organizations with vendor-neutral archive (VNA) or universal viewer (UV) technologies.

The need to access and store medical images has been the most common demand of radiology departments for decades, but to think that in 2016 enterprise imaging is only done with these two approaches is like taking a Polaroid camera to the beach and waiting a week for the film to be developed.

Don’t get me wrong. This report got it half right, but VNA and UV solutions don’t fit the needs of every organization, and that can lead people down the wrong path. If healthcare facilities are going to succeed in advancing the quality of patient care, then it is time to accept new and nimble health IT solutions for enterprise imaging today that bring patient images to people’s fingertips as swiftly and securely as the cloud delivers your favorite song.

Over the last few years, cloud-based image exchanges have gained popularity as an option for enterprise imaging. A HIMSS Analytics Cloud Survey showed that 83 percent of healthcare organizations used cloud-based apps in 2014. While this simpler approach is not the same as a VNA, it allows facilities to achieve the same overall goals, often more efficiently. Facilities can be up and running on an image exchange in as little as two weeks and have central access to all necessary images via the cloud – anywhere, anytime.

VNAs are one of the oldest imaging technologies. When introduced, they finally allowed healthcare sites to collect data from all departments in one location and exchange that information with a broader audience. But what about patient care happening elsewhere and other types of patient data?

Today, it’s critical that facilities share information with other facilities, not just other departments within the same building. In addition, the shift to value-based care means facilities require quick, efficient technology that follows patients across a continuum, which takes more than just sending an image from point A to point B. Imagine only being able to listen to your favorite song on your iPod and not on any of your other connected devices.

VNAs can take up to two years to implement and can be horribly expensive. Further, since they don’t encapsulate all of a patient’s data, sites need to use them in connection with other solutions, like a picture archiving and communication system (PACS), to have a complete enterprise imaging strategy.

Cloud-based imaging, on the other hand, provides more than the seamless sharing of images. It delivers real value and efficiencies, like capturing and sharing all relevant patient data, just as the cloud allows you to access your music, videos, and playlists effortlessly between your phone, tablet, and laptop. That is why I’m perplexed that society openly welcomes this technology in our lives, while accepting technology that can make life-saving differences has proved so challenging.

The time to embrace it is now. If not, I fear that we will only continue to set back an industry that so desperately needs to move forward.

Karen Holzberger is VP/GM for diagnostics at Nuance of Burlington, MA.

Readers Write: Read This Before You Sponsor Another Hackathon

February 3, 2016 Readers Write 6 Comments

Read This Before You Sponsor Another Hackathon
By Niko Skievaski


Innovation is undoubtedly a hot topic right now in healthcare. For good reason: it’s said that one-third of spend is waste and payment models are shifting in an attempt to drive efficiency.

Technology is the obvious place to look for efficiency gains, and health systems around the country are getting creative with ways to better utilize it. We see rampant partnerships with startup accelerator programs, direct early-stage investments, innovation teams, and the advent of the “chief innovation officer,” whose primary role seems to be gating the army of entrepreneurs trying to make an impact.

We’ve directly participated in a dozen flavors of enterprise innovation programs over the past two years. With this experience, I’d like to ask health systems to try a different sort of program: just try our products.

That’s a lot easier said than done. Your organizations weren’t designed to adopt new technology. Over the past decades, data centers were constructed to house your intranet, EHR, ERP, LIS, MIS, and a slew of other acronyms. IT departments were invented to manage the onslaught of hardware and software subsequently installed on their machines. The systems are woven together in a web of interfaces managed by graying whizzes from their cubicles.

Each new piece of technology requires budget, a new install project to be prioritized, FTE to be allocated, and expertise to be acquired. Why would any IT head want to shake up their delicate game of Jenga with new software? Especially software from an unproven startup. Especially software in the cloud.

This is poles apart from the modern, tech-savvy organization. Other industries felt market pressures and profit motives to become agile and modernize incrementally. Meanwhile, health systems felt little market pressure as costs inched up year over year.

Pressure later came from well-meaning government subsidies to adopt adequate electronic health record software, which exacerbated rather than toppled the Jenga tower. While health systems upgraded their hardware, the rest of the world moved to SaaS-based tools that eliminated the need for designated IT departments to show you where to click.

The mounting inefficiencies observed in everyday healthcare interactions could cause any millennial to quit her job and start a digital health startup attempting to bring a modern Web experience and level of service to an industry worth saving. This is the core of my request. We don’t need help starting more startups. We don’t need accelerators. We don’t need strategic investments. We need feedback.

I’m not referring to conference panels of CIOs or experienced entrepreneurs tearing startups apart. The feedback required to build an effective product comes at the front lines in the real world. It needs to get all the way into the hands of the doctors, nurses, support staff, and patients.

The technology crisis in healthcare is rooted in the lack of adoption of technology, not in the lack of technology. Similarly, your innovation won’t be in the tech you help to create — it will be in your ability to more rapidly adopt the tech that already exists.

Focus enterprise innovation efforts on decentralizing technology adoption. Figure out ways to let departments choose how to manage their work. Decentralize new technology budgets to get that decision-making process as close to the front line as possible.

The vendors will figure out ways to make it cheap enough by eliminating upfront capital and installation projects. IT should invest in infrastructure that allows modern technology to work within your facilities: fast Wi-Fi, modern browsers and devices, API layers, and easy SSO.

Don’t partner with accelerators unless you plan on allowing them to outsource your technology selection process. The primary reason those companies participate is to sell to you. And don’t invest in digital health companies unless you’ve used the product. Put your money where your mouth is. Otherwise, your investment is not strategic; it’s just money.

This will also force the business development teams to work closely with clinical teams for product validation. You’re all on the same team — align incentives. You don’t need to depend on accelerators and suits with MBAs to help you figure out if a startup’s product will improve care or increase efficiency at your hospital. The front line will tell you in 10 minutes if you let them use the product.

Niko Skievaski is co-founder of Redox.

Readers Write: Dealing with the Aftermath of Hurricane ICD-10

February 3, 2016 Readers Write No Comments

Dealing with the Aftermath of Hurricane ICD-10
By Michael Nissenbaum


It seems only fitting to compare the October 1, 2015 transition from ICD-9 to ICD-10 to a hurricane. Like a hurricane, we tracked the pending event well in advance. The news media were filled with stories speculating whether ICD-10 would hit as expected and what the potential impact might be.

Even as we braced for the worst, ICD-10 made landfall with a great deal of noise and fury. But according to a Porter Research and Navicure survey, 99 percent of the 360 organizations who responded said they were ready for it and survived the event itself, meaning they were able to begin using ICD-10 when the deadline hit.

Yet Hurricane ICD-10 also shares another characteristic with its physical world counterparts: the aftermath may have a longer-term effect than the event itself. So while all the cork-popping and victory laps back in October may have been well deserved, providers are realizing the forecast is not all sunshine and light tropical breezes just yet.

As you address the ICD-10 aftermath, be wary of some of the issues that may still crop up and prepare in advance to deal with them.

  1. Be prepared to set up specialized “per payer” rules. While it would be great if every payer organization was now fully converted to ICD-10, that’s still not the case. For example, some smaller workers’ compensation carriers still aren’t accepting ICD-10 codes, so providers must process their claims differently. Ideally, providers can set up special rules in their electronic health record (EHR) and/or practice management (PM) systems to automate this conversion and avoid the need to make manual changes or, even worse, submit paper claims.
  2. Make sure your team is fully trained on the changes. While there is currently a grace period for unspecified ICD-10 codes, that leeway is scheduled to come to an end within the next year. Denials will then increase if you’re not prepared. The best approach is to act as if the grace period doesn’t exist. Ensure your team is trained to submit documentation that is specific enough to support the selected ICD-10 code in the event you’re ever audited. If your users are still struggling in this area, partnering with an expert third party for training may be a worthwhile investment.
  3. Become experts on your most common codes first. We have gone from 13,000 codes in ICD-9 to 69,000 in ICD-10. That’s a lot to learn, but your team doesn’t have to master all 69,000 at once. Identify your practice’s most commonly used codes and make sure you can get them right every time. Once those are in good order you can expand the training on a prioritized basis.
  4. Make the most commonly used codes available quickly through adaptive learning. Take advantage of technologies that “remember” the codes that are used most frequently and make them readily available without a lengthy search process. This will enhance user productivity and minimize user frustration.
  5. Consider technologies that take advantage of natural language search. Another way to improve productivity under ICD-10 is help providers find specific codes faster. Natural language search allows a user to type in “chest pain,” for example, and be presented with answers that match chest pain specifically, as well as related terms such as angina and other heart-related diagnoses. This significantly reduces the time it takes for providers to search for the right level of specificity, especially when first learning new codes.
  6. Take advantage of automated correlation between ICD-9 and ICD-10. Providers that are still learning ICD-10 may benefit from technologies that allow them to type in a familiar ICD-9 code and have the system narrow the choices to a closely related ICD-10 subset. While there will not be many one-to-one relationships between codes, trimming the options can be a huge time-saver.
  7. Speed the selection process with filters. Technologies that use filters to navigate the ICD-10 coding process can also enhance productivity. These solutions deliver a step-by-step approach to drill down to the correct category (e.g., diabetes or chest pain), followed by more precise options (e.g., left or right).
  8. Make sure your team understands the importance of these changes. It’s human nature to resist change and providers have had more than their share of changes thrust upon them in the last few years. But failure to comply with ICD-10 affects reimbursements for both the practice and the individual providers. Be as encouraging as possible and keep working to ease the transition.
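As an illustration of items 3 through 5 above, an adaptive, searchable pick list can be sketched in a few lines (the four codes shown are a tiny sample of the 69,000, and the simple keyword matching here is a stand-in for the real natural language search a vendor would ship):

```python
from collections import Counter

# A tiny, illustrative subset of ICD-10 codes; real systems load the
# full code set from official code files or a terminology service.
CODES = {
    "R07.9": "Chest pain, unspecified",
    "I20.9": "Angina pectoris, unspecified",
    "E11.9": "Type 2 diabetes mellitus without complications",
    "M54.5": "Low back pain",
}

usage = Counter()  # adaptive learning: remember how often each code is picked

def search(term):
    """Naive keyword match; production tools use NLP and synonym expansion."""
    term = term.lower()
    hits = [code for code, desc in CODES.items() if term in desc.lower()]
    # Rank frequently used codes first so common picks surface quickly.
    return sorted(hits, key=lambda code: -usage[code])

def pick(code):
    usage[code] += 1
    return code

pick("R07.9")
print(search("pain"))  # R07.9 ranks first once it has been picked
```

A real natural language search would also surface related diagnoses (the angina example in item 5) rather than relying on literal substring matches, but the ranking-by-usage idea carries over directly.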

While Hurricane ICD-10 may have passed through in October, there’s still work to be done. Many organizations are still suffering from productivity losses that could impact their financial success for a long time to come. If your organization is still not recovered from the ICD-10 aftermath, consider the implementation of time-saving technologies and partnerships with knowledgeable experts that can deliver the training and support you need.

Finally, it’s worthwhile to remember that the ICD-10 implementation date was pushed back twice, which is akin to giving providers 15 days’ warning for an impending storm versus a mere five. Take note, all you rule-making bodies, and consider how a more sensible implementation pace contributed to the relative success of the ICD-10 transition. Something to keep in mind the next time anyone considers cramming providers with a new round of arduous regulations on unreasonable timeframes.

Michael Nissenbaum is president and CEO of Aprima Medical Software of Carrollton, TX.

Readers Write: The Importance of HIT Succession Planning

February 3, 2016 Readers Write No Comments

The Importance of HIT Succession Planning
By Frank Myeroff


While getting ready for HIMSS 2016 Conference & Exhibition, I’ve had the opportunity to speak with many healthcare IT leaders about what’s on their priority list this year when it comes to acquiring, promoting, and retaining key HIT talent. One response that I heard over and over again was “Succession Planning.”

The HIT profession is seeing shortages of talent, making succession planning more important than ever. Having a well-developed and current strategic plan in place will help your organization prepare for the future in these vital areas:

  • Prevent vacancies when baby boomers retire. As senior HIT personnel begin to retire, including many CIOs, the industry will lose leadership, knowledge, and skills that won’t be easy to replace. Even with this advance notice, many organizations are still unprepared for their absence and will find themselves with many vacancies, which may cause them to make quick and rash hiring decisions.
  • Recognize and develop future leaders. As we face a leadership shortage in HIT and in just about every industry across the board, companies must identify and foster those individuals demonstrating leadership skills and abilities through mentoring, training, and stretch assignments so they are ready to take the helm when the time comes.
  • Prevent turnover and its associated costs. Employees at all levels are less likely to leave a company that is committed to providing meaningful work and opportunities to grow. A continuous flow of engaged people with defined career paths will stop the revolving door, which can be detrimental to any firm. High turnover is quite costly: according to the Wall Street Journal, experts estimate that it costs upwards of twice an employee’s salary to find and train a replacement.
  • Maximize organizational value. Healthcare organizations with an HIT succession plan in place are more attractive. Management teams having a strategy for when a key player exits protects the value, integrity, and longevity of the company. As a result, your company’s reputation stays positive and in turn, attracts top performers to your company.
  • Meet growing demands for high-quality, cost-effective patient care. Healthcare leaders face unprecedented pressure to meet the ambitious expectations of health reform, i.e., to reduce costs and simultaneously assure high-quality patient care. Therefore, the industry needs to better prepare its HIT professionals to manage the complex organizations that provide and finance care.
  • Guarantee the stability of business operations. HIT succession planning helps to mitigate risks and ensures business continuity. People are your greatest asset; they can also be your greatest downfall. If your company becomes overly dependent on the services of a few key individuals, it can lead to operational risks that cause damage when one or more of those key people are no longer there.

With HIT succession planning on the minds of so many organizations right now, there are a number of ways to find the HIT talent who can ultimately step-in to fill those current and future roles:

  • Hire more military technology veterans. Organizations are on a mission to find, hire, train, and accommodate US military veterans who possess the IT skills in high demand, such as cybersecurity. The military represents a large IT talent pool, even though military technology experts may not have civilian HIT certifications or experience. Savvy organizations are able to look past that when onboarding and later assist returning vets in obtaining those civilian credentials, including IT certifications. In addition, military veterans bring so much more to the job, such as leadership skills, the ability to perform under pressure, teamwork, respect for procedures, and integrity.
  • Implement college-level internships. More and more organizations are moving towards creating HIT internship programs at the college level. They consider it a year-round recruiting tool which means having an ongoing pipeline of future HIT talent. In addition, interns are an inexpensive resource while at the same time are some of the most highly motivated members of the workforce. Internships.com allows a company to post a profile free of charge. This way, a company gets exposure to top colleges and candidates without breaking their budget.
  • Re-hire retirees for expertise and training. One way organizations will succession plan is to pay retirees to come back. Many IT professionals are now returning as consultants operating under one- to two-year contracts for their help and expertise. In addition, they are being asked to train and mentor promising IT professionals. These seasoned workers have experience and a tremendous amount of knowledge to share.
  • Hire and promote from within. In many cases, organizations lay out an HIT career path that they use to retain quality people. This approach fosters loyalty and also positions your company as a place that career-minded individuals want to work. If you hire or promote from within, it also helps you to retain other key people.
  • Acquire talent from outside or competitors. If an organization does not have confidence that an internal candidate is ready for the position, they may have to recruit from outside and from the competition. Hiring from a rival firm can mean bringing aboard someone who already knows your industry, your HIT initiatives, and/or can bring valuable new project knowledge.

Healthcare IT succession planning should be a part of every company’s strategic plan. It’s vital for the vision of where your company will be going in the future and how it will get there.

Most importantly, succession planning in general will shape how your organization develops and nurtures its people, assures a continuing sequence and pipeline of qualified people to move up and take over when needed, and assures that key positions will be filled with the right people able to carry your company into the future.

Frank Myeroff is president of Direct Consulting Associates of Cleveland, OH.
