Readers Write: Giving Patients Access to Prior Mammograms: For Me, It’s Personal

May 11, 2016 Readers Write 2 Comments

By Kathryn Pearson Peyton, MD, Chair of the Women’s Health Advisory Board, LifeImage

I never imagined that I would be a radiologist advocating for patients in the healthcare tech world. The life pursuit of throwing open access to prior mammograms for women wasn’t on my career to-do list when I consulted my high school guidance counselor to narrow my college choices.

In due time, however, the career found me. Here’s my story.

I grew up in Northern California, in an area where breast cancer risk is doubled simply by virtue of being born there. Breast cancer had a strong history in my family. My great-grandmother died of it. In those days they didn’t screen. By the time they found her breast cancer, it was metastatic to the brain.

My grandmother had a mastectomy in her 40s. Her twin daughters had breast cancer, one in her 40s and the other developing three pathologically distinct breast cancers. Another aunt was diagnosed when she was 38 and passed away leaving two-year-old twins. My mom had breast cancer.

Breast cancer ravaged my family emotionally, starting with my grandmother, who was psychologically crippled from her surgery, which in those days was deforming. My aunts were terrified and anxious. By the time I came along, it was painfully obvious there was a genetic predisposition toward breast cancer in my family, and I wouldn’t be far behind.

Breast cancer found me, too

While I was in early medical training at the University of California, San Francisco in my mid-20s, I went through genetic counseling for breast cancer. A counselor looked at my family history, determined I had an 85 percent lifetime risk of developing breast cancer, and advised me not to get tested for the gene since, by law in California, that would assign me a pre-existing condition that would preclude me from qualifying for health insurance.

I followed their advice and did not get tested. What I did, however, was learn everything I possibly could about breast cancer. I became a radiologist, followed by a fellowship in breast imaging with Ed Sickles, MD, one of the fathers of mammography. I monitored myself, starting screening mammography at age 30.

I practiced high-volume breast imaging in San Francisco and Jacksonville, Florida, for 15 years. Every time I diagnosed a patient’s breast cancer, I thought, “This could be me … this will be me.”

Finally in my mid-40s, it was me. The signs of early bilateral breast cancer appeared on my own MRI screening: 6 cm of abnormal ductal enhancement in one breast and an entire lower inner quadrant in the other. A negative biopsy would not have reassured me, and the uncertain future of my extremely dense breast tissue was a ticking time bomb. The decision was easy. I don’t mind surgery. I do mind chemotherapy.

Without hesitation, I underwent a nipple-sparing bilateral mastectomy, which was unusual at the time – before Angelina Jolie raised awareness of the decision process that leads some women to choose preventive surgery.

That whole experience gave me a wake-up call. I was burning myself out practicing radiology 10 hours a day during the week and three to four weekend days a month. I stopped practicing.

Fixing mammography, one scan at a time

While I had stopped seeing patients, I still had a strong interest in helping women, and I certainly knew a lot about medicine and breast cancer. It was clear to me that this was an area in which we could improve medicine. Research shows that when prior exams are more readily available, the quality of patient care and outcomes improve. Breast cancer can be detected earlier, resulting in less traumatic and less costly treatment.

A UCSF study found that the risk of unnecessary additional examinations increases 260 percent when prior mammograms are not available for comparison. These high recall rates account for the majority of imaging costs related to breast cancer screening.

Because breast tissue is unique to each individual, archived images provide a benchmark for evaluating changes in tissue composition and assist in the early detection of cancer. When there is a perceived abnormality on a screening exam, the patient is called back for additional imaging. The great majority of the time, it is not cancer, making the callback a false positive. The average callback rate for screening mammography in the United States is approximately 10 percent, according to peer-reviewed studies.

Yet it is technically difficult to keep patients connected to their prior mammograms. Patients move between locales, health systems, or both. Some hospitals willingly share mammograms with patients. Others are hesitant, for fear of losing those patients.

I found the lack of access to priors a barrier for patients and launched Mammosphere to help solve the problem. The concept is a mammogram-sharing cloud that provides hospitals, imaging centers, and patients with electronic access to prior mammograms. It is most active in Jacksonville, Florida, where Mammosphere was founded. Now we’ve joined forces with LifeImage, and in the coming months, the reach of the network will open mammogram access to millions more women.

For patients, the health IT interoperability argument is real

Among the bits, bytes, and bottom lines of technological and financial considerations involved with health IT initiatives, we must never lose sight of the patients and their stories. They need to be at the center of all technology initiatives to improve care.

Physicians who are informaticists can lead the way in accomplishing care improvements. They comprehend not only the technology, but its usefulness in care paths, as well as the specific clinical justifications for using technology to overcome challenges that today create financial waste as well as angst, inconvenience, and sometimes pain for patients.

While it would have been impossible for me to foresee this career path, I now find myself in the health IT realm as a patient advocate. Like many others, I’m hoping to positively influence care quality while helping reduce costs for patients, providers, and payers. I believe technology is the tool to achieve it, and that breakthroughs on a national scale are right around the corner.

The top federal health IT leaders came to HIMSS16 pushing health data interoperability. It might sound geeky, but it’s not. It is foundational to helping the 60 million women in the United States who undergo regular mammograms, 39 million of whom screen annually. They need access to prior mammograms in a central cloud repository, and they need to maintain freedom of choice to see the healthcare practitioners best suited to their needs and personal circumstances.

How do I know all of this is true? Because I am that person. A radiologist who sees the potential power of health IT to fix broken care paths and take on breast cancer – which found me through my family tree. I will not rest until we stop this disease.

Kathryn Pearson Peyton, MD is chair of the Women’s Health Advisory Board of LifeImage.

Readers Write: Healthcare Consumerism

April 27, 2016 Readers Write 1 Comment

By Helen Figge

Everyone has at least one healthcare catastrophe to share. Mine is simple. My mother died of a mischievous breast cancer that disintegrated her bones, but only after it was missed, “buried” in a pile of papers, several years before.

One sentence in a scribbled office note tells it all: “current testing could not rule out malignancy — suggest follow up.” The problem was that no one ever informed my mother. We found out only incidentally, upon her death. An electronic health record with data exchange capabilities could have given her a temporary reprieve.

Technology, however, did enter her life before her untimely death. Mobile technology in her final days delivered every hospital amenity into her home, supporting her last wish “to die in the same room I was born in,” 64 years earlier. Innovative healthcare technologies do indeed play a role and can satisfy the healthcare consumer, but in this instance, they arrived too late to be her savior.

Technologies are gearing more toward self-monitoring, self-direction, and consumer empowerment. At least 52 percent of smartphone users gather health-related information directly on their devices, along with indications of how well or poorly they are living. Healthcare technologies are creating an opportunity for consumers to take total control of their own health destiny. But is this proactive or counterproductive? Is it a sustainable model for healthcare awareness?

Companies are offering technologies that give the consumer access to laboratory results via apps that are private, secure, fast, and viewable 24/7. In some instances, however, inaccurate results create self-doubt for the end user and clinicians alike. As the next chapters of technology dissemination unfold, vendors need to better understand what the end user is really looking for in order to support and sustain this new wave of healthcare consumerism.

Chronic diseases are often manageable and sometimes even preventable, yet the healthcare delivery system seems better at managing diseases than at preventing them. To swing the pendulum in healthcare delivery and disease prevention and finally make us all healthier, we need a technology solution set that is all-encompassing and that becomes second nature within life’s daily workflow.

Several healthcare studies show that most consumers, regardless of age, want to use digital services for healthcare, thanks to the success of Facebook and other social media platforms. The demand for mobile healthcare is definitely there and resonates across all age groups. Consumers also say that they do not want bells and whistles, but simple brick-and-mortar functionality in healthcare technologies that serves their basic needs with efficiency and accuracy, reinforcing the adage that going big is not always better.

Given the leveling off of healthcare technology spending, the industry needs to listen more closely to the healthcare consumer’s wishes and get back to basics. Our society is not short of technology solutions, but the healthcare consumer is realizing that, for health sustainability, sometimes a product’s reliability and usability might not be worth the effort of keeping it.

Providing solutions that allow self-diagnosis and self-reflection is the first step in acknowledging illness, empowering the next step of going to a clinician for an unbiased assessment.

Helen Figge, PharmD, MBA is senior vice president of LumiraDx of Waltham, MA.

Readers Write: Why Secure Messaging is Failing Hospitals

April 27, 2016 Readers Write 2 Comments

By Ben Moore

Healthcare communications are growing up. Where we were once reliant on interruptive, one-way message pushes; device juggling; and kludgy workflows driven by pager use, modern clinicians have a wealth of tools at their disposal to facilitate effective care coordination.

Yet despite a relatively crowded marketplace (some estimates put the number of secure healthcare messaging providers at over 70) and a market that is ripe for disruption (just ask anyone who still uses a pager if they enjoy it), healthcare messaging solutions still face relatively low adoption, with an estimated 85 percent of hospitals still eschewing smartphones in favor of pagers.

Secure messaging and pagers share a common thread. Neither was specifically designed to address the nuances of healthcare communications. Both were mass-market solutions that healthcare adopted because they were in the right place at the right time.

For pagers, adoption was spurred by the need to deliver around-the-clock care while also allowing providers to (occasionally) leave the hospital. For secure messaging solutions, it was a matter of encrypting PHI that clinicians were transmitting from unsecured personal smartphones, mitigating the risk that came with smartphone use in a clinical setting.

As smartphone use grew organically in healthcare workplaces, HIPAA pitfalls abounded:

  • Data remained resident on personal (and often unprotected) devices.
  • There was little control or policy enforcement.
  • There was no guarantee of SMS message receipt.
  • There was no visibility at an organizational level that any communication had occurred at all.
  • Clinicians became accustomed to utilizing shorthand codes or acronyms to communicate, increasing the propensity for error.

The end result was enormous financial risk from HIPAA violations, compromised care delivery, and confusion in the healthcare setting. Secure messaging vendors sought to correct these problems by handling data through a single vendor, implementing message self-destruction on personal devices, guaranteeing message delivery, supporting rich media such as images and video, and performing integrated directory lookup.

If security is the only concern (and don’t get me wrong—it should be a very big concern), these solutions fit the bill. But if the 85 percent of hospitals still utilizing pagers are any indication, healthcare providers are looking for much more when it comes to enabling mobile communications.

In application beyond HIPAA compliance, secure messaging is falling short in a big way. According to a survey conducted this year, 56 percent of providers felt that a lack of useful integrations with other software was the leading reason current solutions fell short; 44 percent felt the solutions lacked structure and policy; and 33 percent felt that low user adoption was the biggest hindrance.

Inclusion and integrations must be addressed by secure texting apps. Messages are data in their rawest form. If this information is siloed from other departments (for example, if nurses and physicians use different mediums) or different systems (such as scheduling, EMR, nurse call, and paging systems), it’s useless.

The Joint Commission ruling on secure texting states that mobile order entry is not permitted because basic secure messaging lacks the ability to verify the identity of the sender and record a copy of the original message against the EMR. Integrations with Active Directory and EMR software (in that order) ensure that mobile orders remain compliant. Ask any physician if they’re looking for another way to be awakened at 4 a.m. when they’re not on call and you may begin to understand why they’re not falling over themselves to try something new (see “adoption issues”). This can be easily mitigated by integrating with the on-call schedule to ensure that messages and notifications are automatically routed to the correct on-call party.

In the age of big data and informed decisions – and, we’re told, interoperability – there is no excuse for messaging applications not to pull and push relevant or necessary information from other systems to provide additional context, value, and insight.

Healthcare communications are, by and large, structure- and policy-based. Providers in a clinical setting know not only which information needs to be captured, but also to whom that information needs to be relayed and when. Basic messaging such as SMS or chat does absolutely nothing to address this (just look at a millennial’s messaging history to confirm).

For a healthcare communications application to succeed, it must be able to ensure that the relevant information is being captured, and then navigate a complex web of individual providers, care teams, departments, and schedules to deliver that information to the appropriate individuals. Further, secure communication solutions must provide an automated escalation policy and user confirmation of receipt of critical labs to ensure those results are delivered in a timely manner, according to JCAHO’s National Patient Safety Goals.

To address this, next-generation healthcare messaging solutions are building fail-safes into the software itself, including continuous multi-channel delivery attempts (by text and phone), automated escalation rules and message routing in the event that a recipient is unavailable, and delivery visibility so that senders can conclusively confirm a message has been received.
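To make those fail-safes concrete, here is a minimal sketch in Python of the routing loop they imply: attempt delivery to the on-call recipient across each of their channels, check for acknowledgment, and escalate to the next on-call party when none arrives. The structure, names, and acknowledgment handling are hypothetical illustrations of the pattern, not any vendor’s actual implementation.

```python
# Hypothetical sketch of a next-generation escalation policy: multi-channel
# delivery attempts, acknowledgment checks, and routing to the next on-call
# party. Names and structure are illustrative, not a real vendor API.
from dataclasses import dataclass

@dataclass
class Recipient:
    name: str
    channels: list  # ordered channel preference, e.g. ["app", "sms", "phone"]

def deliver(message, on_call_order, acked):
    """Try each on-call recipient in turn; escalate when no one acknowledges."""
    for recipient in on_call_order:
        for channel in recipient.channels:
            print(f"-> attempting {recipient.name} via {channel}: {message}")
            # Stand-in for a real ack listener with a timeout per attempt.
            if recipient.name in acked:
                print(f"   receipt confirmed by {recipient.name}")
                return True
        # No acknowledgment on any channel: escalate to the next party.
    return False  # Policy exhausted; surface to a human dispatcher.

# The first physician never acknowledges, so the message escalates.
schedule = [Recipient("Dr. OnCall", ["app", "sms"]),
            Recipient("Dr. Backup", ["phone"])]
deliver("Critical lab: K+ 6.8 mEq/L", schedule, acked={"Dr. Backup"})
```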

Lastly, in the world of healthcare technology, particularly communication applications, a product is only as good as the number of people who use it. It’s no surprise that a number of secure messaging implementations have been scrapped or cancelled in the face of low adoption. Concerns about device number privacy, a lack of time to learn a new product, or even, yes, pager attachment (a digital version of Stockholm Syndrome) can prevent secure messaging solutions from being successfully rolled out enterprise-wide.

To overcome these obstacles, solution providers must support dedicated number provisioning (a unique phone number that works exclusively for communications within the app), offer pager network integration and pager functionality via a smartphone app (for the pager holdouts), drive messaging through integration points (some hospitals use as many as 10 disparate systems, including call centers, scheduling solutions, and so on), and provide a user experience that is, at minimum, better than native SMS on a smartphone. Really, it’s not that difficult to do.

As a whole, secure healthcare messaging has a lot of room for improvement. However, with the willingness to listen to customers and the ambition to look beyond simply providing security as a service, the opportunity to transform how healthcare workers communicate, collaborate, and deliver care is there.

Ben Moore is founder and CEO of TelmedIQ of Seattle, WA.

Readers Write: The Journey from Population Health Management to Precision Medicine

April 20, 2016 Readers Write 1 Comment

By David Bennett

Imagine a world where individuals receive custom-tailored healthcare. Patients are at the center of their own care, making key decisions themselves. They are supported by research and education, and their information is shared easily between caregivers and clinicians. Preventive care is more effective than ever, and medical interventions occur in record time.

With precision medicine, this world is not just within reach — it’s already happening.

Precision medicine (also known as personalized medicine) is the next step in population health management, transforming healthcare from being about the many to focusing on the one.

Population health serves as the “who” to identify cohorts of patients that are at risk and require attention. Precision medicine is the “what,” providing caregivers with the specific information they need to create effective prevention and treatment plans that are customized for each individual.

Having the largest variety of data sets possible optimizes therapeutic tracking of each patient’s care plan to make and refine diagnoses. This sets the stage to pursue the most personalized therapy possible by detecting patterns in clinical assessments, behavior, and outcomes.

Data is essential, but it’s only useful if you have the ability to make big data small in order to personalize care. Today’s technology platforms can do just that, by capturing vast amounts of health data and applying real-time analytics that provide information and tools that help healthcare professionals and health insurers make more effective, individualized treatment decisions.

Using this information to engage patients and guide care management makes the journey from population health management to precision medicine that much easier, paving the way for an era of truly personalized medicine that prevents the deterioration of health.

The timing couldn’t be better for precision medicine’s heyday, and here’s why: one-size care does not fit all.

Many factors are converging to make the adoption of precision medicine a reality:

  • A growing number of EMRs, EHRs, and HIEs are being connected and cover a significant number of individuals.
  • Patients are more interested in participating in their care, especially when they get access to their own data. There are myriad devices on the market today that are relevant — from wearable devices that measure activity and sleep quality, to wireless scales that integrate with smartphone apps, to medical devices that send alerts (such as pacemakers and insulin level trackers). The data from these devices contribute to a robust longitudinal patient record. The interactive nature of the technology is also an excellent way to engage patients.
  • mHealth advances allow us to easily capture consumer data using cellphone technology and to monitor patients remotely with telehealth and virtual consultations.
  • The ability to see which inherited genetic variations within families contribute both directly and indirectly to disease development. We can now adjust care plans when genetic mutations occur as a reaction to the treatment in place.

If we look at healthcare outcomes in the United States, it’s clear that we need to anticipate patients’ needs with evidence- and knowledge-based solutions. Only then will we be able to identify a patient’s susceptibility to disease, predict how the patient will respond to a particular therapy, and identify the best treatment options for optimal outcomes. Precision medicine will get us there.

Precision medicine is about aggregating all forms of relevant data to enable different types of real-time data explorations. More concretely, specific areas of medicine are expected to make use of new sources of evidence, and the data types they leverage vary based on medical specialty. A good example would be the difference between the data sets used by oncologists versus immunologists.

There are two critical types of data explorations that both need a very large number of data sets to bring results:

  • Medical research with scientific modeling. Precision medicine can be leveraged to advance the ways in which large data sets are collected and analyzed, which will lead to new and better approaches to managing disease.
  • Clinical applications. Treatment plans and decisions can be greatly improved by identifying individuals at higher risk of disease, depending on the prevalence and heritability of the disease. We call this cognitive support at the point of impact. To support this, more real-time control is needed over macro variables: genomics, proteomics, metabolism, medication, exercise, diet, stress, environmental exposure, social factors, etc. Precision medicine provides a platform with an extensive number of data sets and the ability to easily create custom data sets to capture these types of variables.

Precision medicine not only means care tailored to the individual, it also gives the healthcare industry visibility into variability and the speed necessary to act expediently on findings to prevent the deterioration of health. Not only does this enhance patients’ lives, it saves healthcare dollars and prevents waste.

Tailoring deliverables to the needs of individuals is nothing new, at least in other fields such as banking and retail. Pioneers in these industries have leveraged open-source technology on a solid data foundation to meet their markets’ challenges.

Surely we can do the same in healthcare, where it’s literally a matter of life and death. That’s why so many of us are working on a daily basis to accelerate the science behind precision medicine and to encourage its adoption. Precision medicine is nothing short of revolutionary, and together, we can all make it a reality.

David Bennett is executive vice president of product and strategy at Orion Health of Auckland, New Zealand.

Readers Write: Three Tips for Supporting a Population Health Management Program

April 20, 2016 Readers Write 1 Comment

By Brian Drozdowicz

Provider organizations have a lot of options when selecting population health management expertise and system support, including analytics, data aggregation, clinical workflow / care management, and patient engagement solutions. With the market for these solutions expected to reach $4.2 billion by 2018, it is not surprising that new vendors pop up practically daily, or that existing vendors are beefing up their solution portfolios to capitalize on the opportunity.

As providers’ wish lists continue to grow, driven in part by government initiatives and commercial payer programs, system selection starts to take on the overwhelming feel of a second EMR implementation. This is causing providers to hesitate just when they need to act. How can providers find the right path to effective population health management?

No matter what shape a program might take, the right team is a foundational imperative. Assuming risk for populations often means that provider organizations are learning and mastering a new set of skills while simultaneously balancing the demands of “business as usual.”

One frequently deployed tactic is to hire staff from payer environments. They bring the requisite knowledge to the table and can help incorporate proven payer techniques and processes that both build on and complement a provider’s current infrastructure. Team members are needed who “speak data” and are also representative of groups across an organization (e.g., clinicians, program managers, business leads, finance team members, IT staff) to best determine what program goals are, what is possible for the specific organization, and what actions should be taken along what timeframe.

Once the right team is in place, here are three tips to support the implementation of a population health management program:

  1. Recognize that data quality is more important than data quantity. The foundation of any population health management program is data. However, providers don’t need or want it all because each type of data has to be managed and maintained, often by separate people and according to different rules (e.g., privacy constraints). Focus on obtaining and properly maintaining the right data to drive population analysis, program structure, program management, and ongoing assessment.
  2. Learn to embrace claims data. Provider organizations need the longitudinal view that claims data provides to adequately assess utilization, total cost of care, and provider performance, and in turn to answer complex, multi-faceted questions about risk. Other benefits of claims data include that it is: (a) easier to manage and maintain; (b) more readily available and accepted than ever before; (c) controllable from a systems perspective; and (d) proven to yield accurate insights.
  3. Show physicians the numbers and what drives those numbers. Physician change is required to embrace the concept of value-based care. Comparative performance data can be a huge eye-opener. Physician leadership can help physicians be the champions of program performance assessment by making sure they can dig deep into the data, develop confidence in its findings, and understand what precisely needs to change. Complement performance data with compensation plans that reward participation, improvement, and outcomes. Start by placing the emphasis on participation, and then weight improvement and outcomes more heavily over time.

Provider organizations must know what is essential versus nice to have before they go into the vendor evaluation process. In a new and volatile market, the number of vendors offering potential solutions is huge, and the allure of slick user interfaces that can perform every population health management function, while integrating all types of data, is understandable.

However, little is proven, and most organizations do not have the time to wait until it is. Solutions have a gestation period to build, test, and revise before they become accurate, produce valid results, and deliver actionable business value. Answers are needed now, so organizations should look for a track record of results in a similar setting.

What does an organization need to effectively manage risk and care for populations? Of course, the answer is, “it depends,” but if you build the right team and thoroughly research your options, these tips can help bring order to the chaos.

Brian Drozdowicz is executive vice president of product management at Verisk Health of Waltham, MA.

Readers Write: It’s Time to Get Doctors Out of EHR Data Entry

April 20, 2016 Readers Write 5 Comments

By Marilyn Trapani

There was a day when medical transcription was neat and clean. A doctor dictated what happened during an exam and a transcriptionist accurately typed each detail into the patient’s record. Each future encounter built on that record, a detailed history meant to ensure quality care. It wasn’t a perfect system, but it worked.

Now doctors sit for hours each week in front of a computer screen entering patient encounter data into electronic health records (EHRs). These complex systems were meant to more efficiently and effectively track health data for hospitals, payers, and physicians alike. EHRs were promised to save physician practices, hospital systems, and other provider organizations millions of dollars in the long run. 

Reality shows something quite different. Placing documentation responsibilities on physicians is resulting in severe problems not only for doctors, but for patients and the hospitals and practices who serve them. Doctors are spending more time – in some cases, 43 percent of their day – entering data into EHRs, which means less time available for patients. This continual influx of data is bloating EHRs with unnecessary, repetitive, unintelligible information. 

Doctors play an integral part in developing and maintaining medical records. But we are asking them to do too much and the entire healthcare system is suffering because of it. Instead of dictating information into the medical record, many physicians are required to type notes into their EHR, which is time-consuming and distracting.

That’s just one challenge they face when required to document directly into an EHR. Upon accessing the system, the doctor enters a patient’s medical record number and the record pops up. There are boxes for history, medications, procedures, etc. This “structured data” methodology allows physicians to click radio buttons or check boxes to denote what was done, but too often allows little or no free text. Physicians are presented with options from which to choose, even if those options aren’t applicable. The structured data choices can’t be changed, and the patient’s record is built from what the doctor ultimately chooses as the lesser of evils.

Most EHRs allow doctors to copy and paste information from one area of the record to another. This creates “note bloat,” a serious issue that’s resulting in junk data and unwieldy, unmanageable records. It’s not uncommon for information copied from one patient’s record to end up in a different person’s file.

Not only does that create note bloat, it also causes mistakes. One hospital was recently sued by a patient who suffered permanent kidney damage from an antibiotic given for an infection. The patient also had a uric acid kidney stone, which precludes antibiotic use. The EHR file was so convoluted that none of the attending physicians noticed the kidney stone. Printed out, the patient’s record was 3,000 pages. The presiding judge ruled the record inadmissible, in part because a single intravenous drip was repeated on almost every page.

In late January, Jay Vance, president of the Association for Healthcare Documentation Integrity (AHDI), testified to the US Senate Health, Education, Labor and Pensions Committee that EHR documentation burdens on physicians could be reduced by expanding the language of a draft bill aimed at improving the functionality and interoperability of EHR systems.

The move to pay providers based on the quality of the care they deliver instead of the volume of cases seen is driving much of the federal healthcare discussion. There’s a chance that work can help restore sanity to the interaction between doctor and document. The Medicare Access and CHIP Reauthorization Act of 2015 (MACRA), the bill that ended the onerous Sustainable Growth Rate formula, authorized the Centers for Medicare and Medicaid Services to pay physicians via value-based reimbursement. The law also called for a replacement for Meaningful Use.

One component of MACRA is the Merit-Based Incentive Payment System (MIPS) that, among other things, incentivizes providers for using EHR technology. The goal is to achieve better clinical outcomes, increase transparency and efficiency, empower consumers to engage in their care, and provide broader data on health systems. But there is more that can be done. 

This is progress because, at the end of the day, patient focus should always trump data entry by physicians. That’s not to say that physicians shouldn’t have a hand in documentation. According to AHDI, accurate, high-integrity documentation requires collaboration between physicians and the organization’s documentation team – highly skilled, analytical specialists who understand the importance of clinical clarity and care coordination. Certified documentation and transcription specialists can ensure accuracy and identify gaps, errors, and inconsistencies that may compromise patient health and compliance goals.

AHDI’s recommendation: include wording that expands the definition of “non-physician members of the care team” to include certified healthcare documentation specialists and certified medical transcriptionists.

There’s not a single documentation and transcription scenario to meet every organization’s needs. But there is common ground to be found where all functions – EHR vendors, documentation specialists, transcription experts, physicians, hospital administrators – can create a structure that results in clean, effective, understandable patient medical records. 

Step 1 – reduce doctors’ administrative burdens. A physician’s role in documentation should be focused on dictation, not data entry. EHR voice recognition software allows doctors to directly narrate into the system. Like any other text, narrated notes need to be reviewed for accuracy and then approved. In some cases, doctors are approving their entries without reviewing them. This increases the risk of inaccurate data and mistakes. 

Step 2 – find the balance of structured and unstructured EHR data. There is a place for both structured and unstructured data in the EHR. Structured data can be queried and reported on with much greater ease than free text. However, doctors complain there aren’t enough options to share narratives about encounters and what patients had to say about their visit. The goal of an EHR is to provide a complete and accurate view of patients’ conditions, treatments, and outcomes. It makes sense to use structured data for entries such as those required by CMS. Using dictation and expert transcription assistance, unstructured free-text narratives and information can also be part of the EHR while maintaining accuracy and completeness.

Step 3 — eliminate interface barriers. EHRs require interfaces to “talk” with other systems. Fees charged for said interfaces prevent providers from using outside documentation and transcription services. Interfaces are necessary, but should be part of the standard development of EHR structured data forms and information collection.

Step 4 – put the responsibility of document editing and transcription in expert hands. I believe there will be a resurgence of transcription services in 2016. Streamlining data entry into an EHR will never replace the need for documentation and transcription experts. Providers will continue to need outside assistance in ensuring patient data is accurately and cleanly logged in the EHR.

EHRs are here to stay. So are documentation and transcription experts. Provider organizations need both of us. When experts on both sides combine their strengths and expertise, we can put physicians and other healthcare professionals back where they belong: taking care of patients.

Marilyn Trapani is president and CEO of Silent Type of Englewood, NJ. 

Readers Write: Radiology Benefits Managers: An Inelegant Method for Managing the Use of Medical Imaging

April 13, 2016 Readers Write No Comments

By James A. Brink, MD, FACR

Doctors, lawmakers, and regulators are supposed to work together to make healthcare better. So why put a process in place that takes medical decisions out of the hands of doctors and patients, may delay or deny care, and often results in longer wait times to get care?

That is what insurance companies do by requiring preauthorization of advanced medical imaging (such as MRIs or CT scans) ordered for beneficiaries. A better way to ensure appropriate imaging is widely available and already in use.

In most cases, if your doctor thinks an imaging scan can improve your health, he or she has to ask a radiology benefits management company (RBM) whether the scan will be covered or not. This process can take days or even weeks. You may not be able to get the scan at all if the RBM says no, which happens a lot.      

In fact, a Patient Advocate Foundation (PAF) study of people who challenged coverage denials for scans found that 81 percent of the denials were issued by RBMs and that 90 percent of reversed denials were in fact covered by the patient’s health plan. The U.S. Department of Health and Human Services (HHS) says there are no independent or peer-reviewed data that prove radiology benefit managers’ effectiveness. HHS has also warned against the non-transparent coverage protocols that RBMs use.

What’s more, ensuring appropriate imaging is already being done in a more modern and efficient way. Clinical decision support (CDS) systems, embedded in electronic health records systems, allow providers to consult appropriate use criteria prior to ordering scans. American College of Radiology (ACR) Appropriateness Criteria, for instance, are transparent, evidence-based guidelines continuously updated by more than 300 doctors from more than 20 radiology and non-radiology specialty societies.

CDS systems — easily incorporated into a doctor’s normal workflow — reduce use of low-value scans, unnecessary radiation exposure, and associated costs. The systems educate ordering healthcare providers in choosing the most appropriate exam and suggesting when no scan is needed at all.
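As a toy illustration of where such a check sits in the ordering workflow, consider the following Python sketch. The exam-indication table, scores, and function are invented placeholders; only the 1-9 rating bands mirror the scale the ACR Appropriateness Criteria use.

```python
# Toy CDS check at order entry. Entries are invented placeholders, not
# actual ACR Appropriateness Criteria content; only the 1-9 scale matches.
APPROPRIATENESS = {
    # (exam, indication) -> score: 1-3 usually not appropriate,
    # 4-6 may be appropriate, 7-9 usually appropriate
    ("MRI lumbar spine", "acute low back pain, no red flags"): 2,
    ("MRI lumbar spine", "suspected cauda equina syndrome"): 9,
}

def check_order(exam, indication):
    score = APPROPRIATENESS.get((exam, indication))
    if score is None:
        return "No criteria on file; order proceeds, flagged for review."
    if score <= 3:
        return f"Usually not appropriate (score {score}); consider no imaging."
    if score <= 6:
        return f"May be appropriate (score {score}); review alternatives."
    return f"Usually appropriate (score {score}); order proceeds."

print(check_order("MRI lumbar spine", "acute low back pain, no red flags"))
```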

An Institute for Clinical Systems Improvement study across Minnesota found that such ordering systems saved more than $160 million in advanced imaging costs vs. RBMs and other management methods over the course of the study. A major study by Massachusetts General Hospital and the University of Florida showed that these systems significantly reduced advanced imaging use and associated costs. This was done without delaying care or taking decisions out of the hands of patients and doctors.

In fact, the Protecting Access to Medicare Act — passed by Congress with the backing of the ACR and multiple medical specialty societies — will require providers to consult CDS systems prior to ordering advanced imaging scans for Medicare patients starting as soon as next year. This makes image ordering more transparent and evidence-based than any other medical service. The law would require preauthorization only if a provider’s ordering pattern consistently fails to meet appropriate use criteria.

In short, preauthorization is an antiquated approach to utilization management that disconnects doctors and patients from learning systems designed to improve patient care. Patients, together with the providers and legislators who serve them, should be demanding a more modern approach to prior authorization through the delivery of EMR-integrated imaging CDS.

James A. Brink, MD, FACR is vice chair of the American College of Radiology, radiologist-in-chief of Massachusetts General Hospital, and Juan M. Taveras Professor of Radiology at Harvard Medical School.

Readers Write: Why Can’t I Be Both Patient and Customer?

April 13, 2016 Readers Write 7 Comments

By Peter Longo

I love the clinicians at my local health system. However, I hate the bills from my local health system.

When the clinic staff helped last month with my knee, they were the best — rock stars. When I got their confusing bill, they were the worst. Is there any other industry where you love the service, but 30 days later, they go out of their way to take away all of your happy thoughts?

Yes, I did something stupid again. Over the holidays, I took some time off to go skiing with the family. Time with the family was not stupid; skiing in the trees was stupid. (Note to self: you are not in your 20s anymore and need to take it easy.) The ensuing tumble, spin, twist, and crash resulted in an injured knee.

I entered the local university health system in search of a cure. To my total amazement, I walked into the office and the entire staff greeted me. Just like at the Gap, the entire front staff looked up and said “hello” loudly.

Over the next month, the medical group and hospital went out of their way to make me feel at home … until the bill came. Or should I say “bills” (plural). They should have stamped on the envelopes, “Screw you” in an effort to be more honest.

Most of the bills appeared to be for my knee, based on the dates of service. But for the record, they decided to add some of my wife’s medical charges into the mix on one statement.

Twenty-five years of working in the healthcare tech world plus two graduate degrees still did not give me the skills to make any sense of the bills. I decided to call them at 4:50 one afternoon. The very nice recording said, “The billing office closes at 4 p.m. Monday through Friday.” Seriously? What about people who work and don’t have time to call until after work or on the weekend? The Gap has greeters, but they are open nights and weekends. It seems my health system copied the Gap only on the greeters.

A few days later, I was able to talk to someone. I started the call by saying, “I want to pay all that I owe, so please provide a summary and explain the charges so I can pay you.” Surprisingly, they did not understand half the statements. They indicated they could not access the “other system that has more information,” so they would need to call me back.

A few days later, someone from the billing office called. Together we figured out where there were some discrepancies and determined the correct amount owed. She indicated she would clean everything up and send me a new statement. Thirty days later, I got the statement and paid right away. As I was writing that check, I had already forgotten about how they “cured” me, as it seemed so long ago.

The cost of the billing staff time involved in my bill was probably more than what I owed, so I did feel bad for them. That sympathetic feeling lasted only a short time. Last night I got a call at the house. My 15-year-old handed me the phone. I owe $25, and they sent it off to their collection agency.

Is it too much to ask that my health system treat me both as a patient and as a customer?

Peter Longo is SVP/chief revenue officer of Sirono of Berkeley, CA.

Readers Write: Three Reasons EHRs Need to Treat Biosimilars Differently from Generics

April 13, 2016 Readers Write No Comments

By Tony Schueth

Biosimilars are being introduced in the United States and are expected to become mainstream quickly. In response, stakeholders are beginning to work out how to make them safe and useful within the parameters of today’s healthcare system.

The reason is that biosimilars, like biologics, are made from living organisms, which makes them very different from today’s conventional drugs. These differences will create challenges and opportunities in how they are integrated into electronic health records (EHRs) and user workflows, as well as how patient safety may be improved.

Normally, there is a lot of lead time before EHR vendors must address such issues. Things are different with biosimilars. Here are some reasons.

There are powerful drivers

Several drivers will stimulate demand for EHRs to address biosimilars sooner rather than later, given the central role EHRs play in value-based care coordination and patient safety.

New biologics will be bursting onto the healthcare scene. Although biosimilars have only recently been approved for use in the US, they have been used extensively in Europe and Asia for many years. More than 80 biosimilars are in development worldwide, and the global biosimilars market is expected to reach $3.7 billion. This will stimulate rapid adoption by payers and physicians in the US, which, in turn, will create the need for EHRs to capture and share a variety of information about biologics and biosimilars. It is easy to envision the availability of four biosimilars for 10 reference products in 2020, given projected market expansions.

Next, uptake in the US is expected to take off because biosimilars are lower-cost alternatives that will be used to treat the growing number of patients with such chronic diseases as arthritis, diabetes, and cancer. Rand has estimated savings from using biosimilars at $44.2 billion over 10 years. Money talks and payers will create demand for EHRs to fold biosimilars and biologics into EHR functionalities and workflows.

Payers and regulators also will demand enhanced tracking of biologics and biosimilars because they are key pieces of the move toward value-based reimbursement and are a focus of public and private payers. Identifying, tracking, and reporting adverse events that might be associated with biologics and biosimilars are expected to become key metrics for assessing care quality and pay-for-performance incentives.

Biosimilars are not generics

It would be a mistake to think of biosimilars as being synonymous with generics, which have been around for years and use mature substitution methodology. The reason begins with the fact that biologics and biosimilars are medications that are made from living organisms. Unlike generics, which have simple chemical structures, biosimilars are complex, “large molecule” drugs that are not necessarily identical to their reference products, thus the term “biosimilar,” not “bioequivalent.” In addition, biosimilars made by different manufacturers will differ from the reference product and from each other, making each biosimilar a unique therapeutic option for patients.

Furthermore, biologics and biosimilars are administered in varying locations: most commonly infused in physician offices, hospitals, or special ambulatory centers, or administered by patients at home. Given that administration location and type can vary, such information, along with the particulars of the drug that was administered, must get back to the physician and be incorporated into the patient’s EHR record.

Getting this information into the patient’s EHR record also is important for improving patient safety, because it will help identify whether an adverse drug event or patient outcome stems from a biosimilar, its reference biologic, or another biosimilar.

Substitution laws are expanding and evolving

Developers of EHR systems will need to keep abreast of evolving state laws concerning substitution. In fact, many states already are considering substitution legislation or have enacted it. According to the National Conference of State Legislatures, as of early January 2016, bills or resolutions related to biologics and/or biosimilars had been filed in 31 states. Keeping pace with these new laws will be a challenge for ensuring that EHRs remain compliant, especially since requirements are apt to vary considerably from state to state. Given the rapid changes in the regulatory landscape, latency of updates to EHR systems is a problem that needs to be addressed.

Not only that, the drug that is dispensed may be very different from what was prescribed. As a result, it is important for physicians to know whether a substitution has been made and to capture information about the administered drug in the patient’s EHR record. Because these medications differ from conventional drugs, more granular information, such as lot number, will also be required. This is important for treatment and follow-up care, as well as in cases where an adverse drug event or unexpected patient outcome occurs later on.
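As one way to picture that extra granularity, here is a small Python sketch of what a dispensing record for a biologic or biosimilar might capture. The field names and structure are illustrative assumptions, not drawn from any standard or specific EHR.

```python
# Illustrative dispensing record for a biologic or biosimilar. Field names
# are assumptions for this sketch, not from any standard or specific EHR.
from dataclasses import dataclass
from datetime import date

@dataclass
class BiologicDispenseRecord:
    prescribed_product: str      # what the physician ordered
    dispensed_product: str       # what was actually dispensed
    manufacturer: str            # biosimilars differ by manufacturer
    lot_number: str              # granular detail needed for pharmacovigilance
    substitution_made: bool      # flag so the prescriber can be notified
    administration_setting: str  # office infusion, hospital, home, etc.
    administered_on: date

record = BiologicDispenseRecord(
    prescribed_product="Reference biologic X",
    dispensed_product="Biosimilar X-1",
    manufacturer="Example Biologics Co.",
    lot_number="LOT-2016-0413",
    substitution_made=True,
    administration_setting="physician office infusion",
    administered_on=date(2016, 4, 13),
)
print(record)
```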

All in all, EHRs will face a brave new world when it comes to adapting to biologics and biosimilars.

Tony Schueth is CEO and managing partner of Point-of-Care Partners of Coral Springs, FL.

Readers Write: The Future of Mobility and Cloud in Healthcare

April 6, 2016 Readers Write No Comments

By Joe Petro

For some time now, we’ve been hearing concerns voiced by physicians about how complicated their lives have become due to the mountainous documentation requirements. Among the most difficult is capturing the details a patient shares during a consultation and trying to fit that information into the structured template found in today’s EHRs.

How can we expect a patient’s story to be impactful when all its context and richness is lost to making sure we click and check the right boxes? This is a byproduct of all the initiatives coming out of the federal government. The EHRs are left with no choice but to force the structured capture of clinical documentation.

At the same time that these requirements are changing, we’re also seeing a change in the technology physicians use. Physicians are becoming increasingly mobile, and technology can improve the physician experience by allowing them to capture the patient story across the multitude of devices they use throughout the day. Executed properly, this ultimately offers physicians a way to streamline the documentation burden, as certain technologies, such as speech recognition and language understanding, let them capture the required documentation in a more natural way.

In parallel, we are seeing the emergence of a cottage industry of mHealth app vendors looking to bring innovative technologies to the healthcare workflow. We have reached a tipping point where technical tools make it easier to leverage a large number of advanced capabilities. This makes it easier for the entire industry to create solutions and applications that are immediately impactful. This is a unique time and place in the technological evolution of the healthcare space.

Cloud is an example of a set of technologies that makes things easier and has the potential to deliver high impact. The cloud makes it possible for technologies to meet physicians wherever they are, on any device, at any time. For example, physicians can enter data into their mobile devices/apps any time, anywhere, and on the go. The cloud will be there to broadcast this information far and wide to EHRs or other apps and tools in a more meaningful way no matter where it originated. Thanks to cloud enablement, mHealth apps and other innovations become more useful to the physicians who want to be mobile.

Mobile and cloud innovations are impacting patients as well. Mobile applications and wearable devices are allowing patients to manage their own health, capture their own health data, and turn this data into actionable insights. Our lives and our health are on the brink of being substantially instrumented. We are now tracking sleep and eating patterns, and mobile devices are starting to capture valuable information from blood pressure to heart rate to weight and more.

This technology can help patients comply with the treatment plans that physicians prescribe by allowing them to report progress or other important details in real time. The cloud is connecting patients to their own personal health experience, giving them the tools they need to better look after and manage their own health. It also connects patients to their healthcare providers and institutions before they actually need to receive care, potentially keeping them out of the hospital in the first place. This evolution is taking place today.

We’re transitioning to a phase where we can truly call this “healthcare” instead of “sick care,” a phase where we are shifting to managing our health proactively instead of just managing a sickness after it has already happened. With all this data available via the cloud, EHRs and all health-oriented applications will evolve, making it easier for physicians to leverage the technology to increase productivity and improve quality of care. The value that EHRs promise will be delivered partly through this mechanism.

As we continue down this path, we move towards a setting that seems as if it’s almost from a futuristic movie where everything in healthcare is mobile, connected, and intelligent. We’re going to see patients surrounded by enabling technology in such a way that intelligent services in the cloud will help their mobile devices keep track of important information that can then be used during visits with their physicians or, more importantly, prior to visits.

Physicians will be primed for the visit with everything they need on a device, reducing the time patients spend telling the same thing to three different people upon entering a health system. Present-day documentation problems will eventually fade into the background as technology advances and interacting with these systems becomes more human-like and natural. Physicians will be able to focus fully on what got them into medicine in the first place: caring for their patients.

Joe Petro is senior vice president of healthcare research and development of Nuance of Burlington, MA.

Readers Write: Tax Rebate? Insurance rebate!

April 6, 2016 Readers Write No Comments

By Richard Gengler

Now that tax season is in full swing and the eventual rebate is around the corner, it is an ideal time to think about another kind of rebate. This one stems from Affordable Care Act (ACA) healthcare policy changes and the increasing push toward the triple aim: improving the patient experience, improving the health of populations, and reducing the per capita cost of healthcare.

With the individual market becoming the fastest-growing and increasingly competitive part of the payer sector, payers are searching for any potential leverage to obtain, retain, and grow their membership base. There is growing discussion of the importance of net promoter score (NPS), whereby payers can utilize their existing members to act as promoters.

By utilizing new innovations and alternative service modalities, insurance companies can hit all three parts of the triple aim. Almost daily, we hear about innovations with user satisfaction rates greater than 90 percent and a significantly positive impact on population health, at potentially a fraction of the cost.

Health plans are required to maintain an 80 percent or 85 percent medical loss ratio (MLR), meaning that they must spend that share of the premiums they collect on medical expenses. The rest can be used for administration, profit, and marketing. By law, any shortfall below the required percentage must be refunded to members. Great idea, but does this actually work?

Looking back to 2014, plenty of insurers issued rebates to their members across a wide variety of markets, from individual to small group to large group. Take, for instance, Celtic Insurance Company in Arkansas, which paid $6,774,488 in rebates to its individual market. Or consider California Physicians Service, with an astounding $21,819,095 for its small group market. In the large group market, Cigna Health and Life Insurance Company of DC sent back $5,608,359.

image

One would think this is an opportunity to fully engage and grow membership. Data from the Kaiser Family Foundation shows that many insurance companies are not meeting the medical loss ratio standards. This signals a missed opportunity.

image

The underlying arithmetic is quite simple.

Take, for instance, a population of 3 million Americans using a service that traditionally costs $1,751 per person per year. If there were an alternative service modality that is clinically equivalent at $30, that would create savings of $1,721 per person, a 98 percent reduction. If premiums and other elements remain the same, this could be extrapolated out to provide bountiful rebates to the members.
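Here is that arithmetic as a minimal sketch in code, using the article's illustrative figures rather than real plan data:

```python
# Illustrative figures from the example above, not real plan data.
population = 3_000_000          # covered members
traditional_cost = 1_751.00     # traditional cost per person per year
alternative_cost = 30.00        # clinically equivalent alternative

savings_per_person = traditional_cost - alternative_cost
percent_reduction = savings_per_person / traditional_cost * 100
total_savings = savings_per_person * population

print(f"Savings per person: ${savings_per_person:,.2f}")        # $1,721.00
print(f"Percent reduction: {percent_reduction:.0f}%")           # 98%
print(f"Total potential savings: ${total_savings / 1e9:.2f}B")  # $5.16B
```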

Next time you are thinking about innovative strategies to increase your members’ NPS, think about your taxes. Your members will thank you, tell their friends, and grow your membership.

Richard Gengler is founder and CEO of Prevail Health of Chicago, IL.


Readers Write: All Claim Attachments are Not Created Equal

April 6, 2016 Readers Write No Comments

All Claim Attachments are Not Created Equal
By Kent McAllister

image

According to the 2014 CAQH Index, responding health plans representing 103 million enrollees returned data on claim attachments. Those responses showed approximately one claim attachment for every 24 claims during 2013.

Interestingly, the vast majority of claim attachments were submitted manually, via paper delivery or fax. CAQH counted approximately 46 million claim attachments processed among the reporting plans, which can be extrapolated to roughly 110 million claim attachments industry-wide.

CAQH also estimates another 10 million prior authorization attachments, suggesting a total of roughly 120 million attachments annually across healthcare.
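As a rough sketch of that extrapolation, scale the reported volume up from the reporting plans' 103 million enrollees to the full covered population. The roughly 245 million total covered lives below is my assumption for illustration; CAQH's own methodology may differ:

```python
reported_attachments = 46_000_000   # counted among reporting plans
reported_enrollees = 103_000_000    # enrollees those plans represent
total_covered_lives = 245_000_000   # assumed US covered population (illustrative)

industry_wide = reported_attachments * total_covered_lives / reported_enrollees
print(f"{industry_wide / 1e6:.0f} million attachments")  # ~109 million, in line with ~110M
```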

A clarification, however, must be made when dealing with attachments. Electronic attachments are not all the same, despite industry rhetoric claiming that there is little difference among the healthcare sectors.

When dealing with the substance of attachments, there are two major distinct segments that providers must accommodate. They are broadly similar at the highest level, but distinctly different at the business process level. Each aligns with a different set of accountable payer organizations:

  1. Health and dental plans: commercial health plans and federal and state fiscal agents and administrators.
  2. Workers compensation (WorkComp): property and casualty insurance carriers and third-party administrators.

The majority of the 120 million attachments are processed by health plans. Dental plans also manage an essentially equivalent business process for handling attachments, often through the same technical channels and human resources with similar skills.

Workers compensation claims, on the other hand, while voluminous, have a notably different set of business processes because of a number of distinctions in both the property and casualty insurance business and in the nature of “claims” in WorkComp parlance.

A WorkComp claim is generally related to an individual injured on the job. That claim may have a life of many months, or, in some cases, years. Resulting from that claim are typically many bills (or e-bills) that usually have an attachment. The e-bill submission process is more similar to property and casualty processes — such as auto physical damage — than to traditional health and dental plan processes.

An interesting contributor to this distinction is that property and casualty insurers are not considered “covered entities” under the 1996 HIPAA legislation. This is important, and industry observers who fail to recognize it are overlooking a major consideration.

Just as not all claim attachments are equal, neither are all vendors. For example, some companies that are heavily involved in the P&C space don’t work with the medical side, while others focus almost exclusively on medical. Vendors usually serve one of the two often-unrelated markets.

Providers must be aware of the differences. P&C electronic attachments, even though they may sound as if they belong to the healthcare setting, just don’t carry the same weight as electronic attachments exchanged to support patient claims generated within a health system. Likewise, vendors that work almost entirely in healthcare have little claim, if any, to the P&C market.

In a market filled with healthcare claims-related vendors, healthcare organizations must be able to place their trust in partners that understand the complete landscape of the healthcare space. They should also know that even though WorkComp may appear on the surface to be medical, it requires an entirely different scope of work. In this burgeoning sector of healthcare administration, messages are often painted with too broad a brush, and healthcare leaders should be wary when entering conversations that broach the subject of electronic attachments.

For the benefit of all parties involved, vendors should recognize and articulate in their public messages the differences between health and dental attachment processes and WorkComp attachment processes. The industry will be better served if vendors accept a mandate to clear up market confusion and to draw clearer lines around their roles in electronic attachments.

Kent McAllister is chief development officer of MEA|NEA|TWSG of Dunwoody, GA.


Readers Write: Time for Providers to Lead the Price Transparency Revolution

March 23, 2016 Readers Write 5 Comments

Time for Providers to Lead the Price Transparency Revolution
By Jay Deady

image

With ICD-10 in the rear-view mirror, providers now face a new challenge: answering the public and media call for consumer price transparency. High-deductible plans now cover nearly a quarter of Americans with commercial insurance, raising the ante on patient financial responsibility. Yet a full 36 percent of patients, according to one survey, remain confused about how much they will owe for hospital services.

This problem, unheard of in other consumer industries, not only endangers patient satisfaction scores, but also threatens to increase the bad debt load of organizations already struggling with thin margins.

While insurance companies and employers have deployed some pricing tools, those tools have done a poor job of accurately representing multiple providers’ fees within a geographic area. New technologies from a handful of companies now let providers take the price transparency bull by the horns and lead the effort themselves.

These technologies transcend the usual approach of mere compliance with a state’s price transparency laws. Posting a list of charges on a provider’s website may satisfy the letter of the law, but it fails to give consumers an accurate picture of what they will owe for services. Knowing this, providers have struggled to come up with an alternative that does not reveal proprietary information to their competitors. Most have concluded there is no way to accomplish this easily, so they refer questions to patients’ insurance companies.

But it turns out the path to truly efficient, accurate, and accessible price transparency is one that healthcare consumers can take themselves—directly from the provider’s website.

Healthcare consumers want, and deserve, an accurate understanding of what they will owe for services before those services are rendered. The operative word here is “accurate”: an estimate based on the consumer’s current levels of insurance coverage or, in the case of a self-pay patient, on the provider’s discounted fees for consumers who pay fully out of pocket.

Either way, with self-service pricing, healthcare consumers generate the estimates themselves, typically from an online calculator on the provider’s website. The process is quick and hassle-free. A consumer simply inputs their name, insurance plan number, and perhaps two or three more data elements. Within 10 to 45 seconds, a complete and accurate estimate appears, giving consumers immediate, line-item insight into what they will owe.

The process is powered by rules-based engines that automatically query, retrieve, and combine data from payer portals with the hospital’s charge master data and payer contracts. Analytics plays a critical role in assuring the estimate is accurate, including analysis of previously adjudicated claims to identify variances.
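As a minimal sketch of the kind of rules-based calculation involved, the following combines a charge master price, a contracted rate, and benefit data into a patient responsibility figure. Every name and number here is a hypothetical illustration, not any vendor's actual engine, and real systems also reconcile against adjudicated claims history:

```python
def estimate_out_of_pocket(charge_master_price: float,
                           contract_allowed_rate: float,
                           deductible_remaining: float,
                           coinsurance_rate: float,
                           oop_max_remaining: float) -> float:
    """Combine charge master, payer contract, and benefit data
    into a simplified patient responsibility estimate."""
    allowed = min(charge_master_price, contract_allowed_rate)
    # The patient pays down the deductible first ...
    deductible_portion = min(allowed, deductible_remaining)
    # ... then coinsurance on the remaining allowed amount.
    coinsurance_portion = (allowed - deductible_portion) * coinsurance_rate
    patient_owes = deductible_portion + coinsurance_portion
    # Benefit rules cap total out-of-pocket spending.
    return min(patient_owes, oop_max_remaining)

# Example: $4,000 charge, $2,500 negotiated rate, $1,000 deductible left,
# 20 percent coinsurance, $3,000 out-of-pocket maximum remaining.
print(estimate_out_of_pocket(4000, 2500, 1000, 0.20, 3000))  # 1300.0
```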

Such a tool neatly solves one of the most persistent challenges of implementing price transparency: the pitfalls of making proprietary financial information public. Because the solution is hosted by the provider and patient-unique information must be entered to generate an estimate, not just anyone can use the calculators. This is vastly preferable to putting a list of total charges or paid amounts out there for all competitors to see, which reflects neither the rates negotiated with payers nor the patient’s actual out-of-pocket costs.

At the same time, self-service price calculators appeal to today’s information-driven patients and nicely align with how they already seek pricing on other purchases, from airfare to mortgages.

One of the most promising advantages of a self-service price calculator is its potential to engage consumers in multiple ways. After generating a price estimate, for example, the calculator could prompt high-deductible and self-pay consumers to view payment plan options. It could even engage patients with concerns about their ability to pay and schedule time with a financial counselor. Realistically, we can only expect such concerns to grow along with the number of high-deductible health plans, which have grown from covering 4 percent of commercially insured Americans when they were introduced in 2006 to a whopping 24 percent today.

A deductible payment and co-insurance spread out over a year, or whatever time span the provider and patient agree on, is clearly more manageable than a lump-sum payment. Armed with clear, accurate information about how much they will pay, and how, healthcare consumers can better plan for paying their medical bills. This in turn will help reduce a hospital’s bad debt and charity write-offs.
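As a quick illustration, spreading the hypothetical $1,300 estimate from the sketch above over a year:

```python
estimate = 1300.00   # hypothetical patient responsibility from the earlier sketch
months = 12
print(f"${estimate / months:,.2f} per month")  # $108.33 per month
```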

Most important, patients who clearly understand their financial responsibility are more likely to schedule rather than delay urgently needed care. This reason, above all others, is why providers would be wise to take control of the price transparency issue now.

Jay Deady is CEO of Recondo of Greenwood Village, CO.


Readers Write: Trend Watch: Innovation Forges On in the Provider Sector

March 7, 2016 Readers Write No Comments

Trend Watch: Innovation Forges On in the Provider Sector
By John Kelly

image

Provider organizations face tremendous innovation challenges. The success or failure of new systems and technology will depend on their ability to adapt to and anticipate the impact of major industry changes. Looking ahead to a successful 2016, hospitals and provider organizations should still expect barriers to using EMR data, should be wary of the hype surrounding cognitive systems, and should prepare for a world of value-based care partnerships in which providers and payers share information in ways not imaginable until recently.

EMR data will not be fully liberated in 2016

The barriers to moving data in and out of EMRs will not abate in 2016, despite mounting pressure. The business model of EMR vendors and real technological barriers will continue to thwart the interoperability goals sought under the concept of Meaningful Use.

The good news is that providers and payers are establishing pockets of innovation using edge technologies to support better care and risk sharing based upon shared data, and the public outcry over data blocking from EHRs will eventually force vendors to adopt standard APIs. We can expect the personal health data train to gain momentum with hundreds of new market entrants, but not in 2016.

Don’t trust the hype around cognitive systems

Technology-based cognitive systems in healthcare are not in our immediate future. There is a lack of clarity around the FDA’s rules regarding software that makes a medical decision: when does it have to be certified as a medical device? Without medical device certification, can the output of cognitive systems be loaded into an EMR? What about malpractice liability?

Analytics vendors and their customers have been tentative in applying the technology to direct patient care. Counter to what other prognosticators believe, this liability and the fear of the unknown will slow the cognitive market in the US.

ACOs will invest in payer technology

Successful ACOs will require technology that supports all-payer data ingestion. They will need to see their patients as a single population, but within the context of separate payer contracts. These organizations are beginning to invest in the technology that payers have used for years to acquire and integrate claims data with their population health registries.

If providers are to succeed in assuming risk, it will be by employing a highly focused health management approach that addresses the specific risks associated with specific patient populations. Population and risk analytics infrastructure requires capital investment beyond the reach of many small and mid-size provider organizations. To encourage providers to assume greater risk for performance, payers will offer shared information exchange platforms that augment provider capabilities with analytic services.

Accountable care continues to evolve

Healthcare market transformation will gain momentum in 2016 and provider organizations should also consider the following:

  • Most first-generation ACOs will fail because they don’t know what it means to truly manage risk. They lack the ability or the will to modify how they treat patients. CMS, commercial payers, and the provider community have to figure out how to hold providers harmless for what they can’t control while rewarding them for the things they do well, then help them bet on their ability to deliver consistently on their promises.
  • 2016 will see an assault on post-acute care providers, which have long been profitable even as many provide little relative value. This will affect nursing homes, outpatient rehabs, and even vendors who sell to post-acute care providers. The release of Medicare data for public research, particularly in the area of Medicare fraud, combined with the high-profile budget line for post-acute care, will accelerate the move to overhaul the post-acute care industry.
  • Finally, don’t expect a change in administration to affect CMS innovation. Regardless of the 2016 Presidential election outcome, payment reform will continue, primarily for macroeconomic reasons, but also because of the political reality that both parties favor fundamental reform.

John Kelly is principal business advisor at Edifecs of Bellevue, WA.


Readers Write: The Many Flavors of Interoperability

March 7, 2016 Readers Write 9 Comments

The Many Flavors of Interoperability
By Niko Skievaski

image

As the shift toward value-based care persists, the demand for data is as hot as ever. That means the term “interoperability” will be thrown around a lot this year, so let’s describe the various flavors in which it will inevitably be discussed. I’ve seen many conversations become confused as the context for the buzzword gets mixed. Here’s an attempt at outlining the various i14y use cases. (Can we start abbreviating it that way, like we do i18n for internationalization?)
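For anyone puzzling over the numeronym, it follows the same rule as i18n: first letter, the count of the letters in between, last letter. A quick sanity check:

```python
word = "interoperability"
print(f"{word[0]}{len(word) - 2}{word[-1]}")  # i14y
```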

Interoperability for Care Continuity

This is the iconic use case that first comes to mind. Chronically ill patients with binders full of paper records and Ziplocs bulging with pill bottles. As patients bounce around town seeing specialists, they often need to repeat demographic data, med lists, allergies, problems, diagnoses, prior treatment, etc. The solution to this use case calls for ad hoc access to a patient’s data at the point of care. A provider’s chart doesn’t necessarily need to be synced to all other providers in the disjointed care team. Rather, the data needs to be available upon request from the relevant provider.

New payment models have fueled demand for this solution. In a fee-for-service world, redundant tests actually brought more income to the health system, whereas in value-based models, excessive costs are eaten by the organization. This aligns provider and patient by incentivizing only the tests and treatments with the highest likelihood of impacting the patient’s health. Understanding the value of any given treatment also requires looking across a wide set of patients, which brings us to the second use case.

Interoperability to Measure Value

In order to understand how to pay for healthcare based on value, we must attempt to measure the impacts on health: a patient’s health is a function of the healthcare they receive as well as a slew of other variables. Estimating this relationship requires an order of magnitude more data than we’ve traditionally measured. Beyond knowing the diagnosis and treatment, we’d need to control for behavior, family history, comorbidities, prior treatments, and basically everything else we can know about a patient’s health. And that’s for a single patient. To build a model, we’d need this information from a large sample of patients to determine the impact of each of these variables. And as treatments are provided and more results come in, we’d need to update our models to refine their accuracy over time.
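As a minimal sketch of that estimation problem, consider an ordinary least squares fit of outcome on treatment plus covariates over synthetic data. A real model would need far richer inputs and more careful causal methods; this only illustrates the shape of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
treatment = rng.integers(0, 2, n)       # did the patient receive the treatment?
covariates = rng.normal(size=(n, 3))    # stand-ins for behavior, history, comorbidities
true_effect = 1.5
outcome = (true_effect * treatment
           + covariates @ np.array([0.5, -0.3, 0.2])
           + rng.normal(size=n))

# Ordinary least squares: estimate the treatment effect while
# controlling for the covariates.
X = np.column_stack([np.ones(n), treatment, covariates])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"Estimated treatment effect: {beta[1]:.2f}")  # close to 1.5
```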

Much of this data is stored in an electronic health record covering the period a patient was cared for by that health system, but it is likely missing data from care delivered outside of that health system. And beyond that patient, how could we combine this record with a sizable population to make a predictive (or even representative) model? Even at very large health systems, once the records are narrowed to the few patients who share a rare diagnosis, sex, and age, the sample can become vanishingly small.

This i14y use case requires large sets of longitudinal data rather than ad hoc queries of single patient records. Current attempts at producing such data sets have been extremely resource-intensive, normally centered on de-identified research efforts focused on a single diagnosis. We’ve also seen rampant consolidation in the industry, partially driven by the notion that taking care of larger and larger populations of patients will enable more accurate estimations of value.

Interoperability to Streamline Workflows

This i14y use case has been around since before the term garnered widespread adoption in healthcare. HL7 was created back in 1987 to develop a standard by which health data could be exchanged among the various systems deployed at a health system: electronic health records, lab information systems, radiology information systems, various devices, and pretty much everything else in the data center. These systems are most often tied to a centralized interface engine that acts as a translation and filtering tool, bouncing transactional messages between them.
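For readers who have never seen one, here is a minimal sketch of the pipe-delimited HL7 v2 messages these engines route. The message content is invented for illustration:

```python
# A made-up ADT (admit/discharge/transfer) message: one segment per
# line, fields delimited by pipes.
sample_adt = (
    "MSH|^~\\&|REGISTRATION|HOSPITAL|EHR|HOSPITAL|20160307120000||ADT^A01|MSG00001|P|2.3\r"
    "PID|1||123456^^^HOSPITAL^MR||DOE^JANE||19700101|F\r"
)

for segment in sample_adt.strip().split("\r"):
    fields = segment.split("|")
    print(fields[0], "->", fields[1:5])
```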

So, problem solved, right? Not quite. Over the past few decades, health systems have customized their HL7 deployments just as isolated communities evolve a language into a dialect. This proves problematic, as each new software application the health system adopts requires extensive interface configuration and the precious FTEs that entails. Interface teams are increasingly the most backlogged tranche of the IT department. As health systems search for more efficient ways to deliver care, they are turning more often to cloud-based software applications because of the dramatically reduced infrastructure costs and added mobility.

This use case likely requires upgraded infrastructure that allows a health system to connect and communicate efficiently with cloud applications. The customized HL7 dialects will need to be replaced or translated into something consistent and usable by cloud applications. HL7, the organization, is currently developing FHIR as a much-needed facelift to a graying standard. In the coming years, we look forward to seeing more FHIR adoption in the industry and hope to avoid the level of customization we have seen with HL7v2, although initial feedback and documentation from EHR vendors is not promising.
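As a minimal sketch of what that consistency buys, FHIR exposes resources over plain REST with standard JSON. Here is a hypothetical patient read; the server URL is a placeholder, not a real endpoint:

```python
import requests

base = "https://fhir.example.org"   # placeholder FHIR server
resp = requests.get(f"{base}/Patient/123",
                    headers={"Accept": "application/fhir+json"})
patient = resp.json()
print(patient.get("resourceType"), patient.get("id"))  # Patient 123
```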

Interoperability to Engage Patients

This is likely the most interesting need for i14y because of its potential. Patients don’t currently walk into a doctor’s office and demand that their health data be electronically sent to applications of their choosing. But then again, where are these applications? The inability of patients to authorize API access to their health data has undoubtedly stifled the development of innovative applications. Instead, new application development has focused on the B2B space in search of enterprise revenue.

If a patient could download an app on their phone and authorize it to pull their medical history, an army of coders would mobilize in creating apps to engage patients as consumers. Application adoption would be holistically democratized and new apps would get to market instantaneously, as opposed to the usual 18-month B2B sales cycles. Applications would be developed to help patients decipher the complexities of care, track care plans and medication adherence, and benchmark against others with similar comorbidities. They could effortlessly download and store their records and be the source of truth. They could contribute their records to research banks that would be willing to pay for their use. Widespread adoption of patient authorized access to health data would almost make the other i14y use cases moot.
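A minimal sketch of what that authorization step might look like, using the OAuth 2.0 authorization code flow that SMART on FHIR builds on. All URLs, identifiers, and scopes here are placeholders:

```python
import urllib.parse

authorize_url = "https://ehr.example.org/oauth/authorize"   # placeholder endpoint
params = {
    "response_type": "code",
    "client_id": "my-patient-app",                          # placeholder app
    "redirect_uri": "https://app.example.org/callback",
    "scope": "patient/*.read",   # SMART-style scope: read this patient's data
    "state": "xyz123",
}

# The patient is sent to this URL to log in and approve access.
# The app then exchanges the returned code for an access token
# and pulls the record through the FHIR API.
print(authorize_url + "?" + urllib.parse.urlencode(params))
```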

Luckily, we’re getting closer. There’s mention of such a mandate in MU3. One of the challenges is the chicken-or-egg problem: we need widespread enough adoption of a single authentication framework and data standard to simultaneously sway the developer community and health systems. MU3 seeks to force the health system side of that equation, but its current draft fails to mandate a prescriptive framework or standard and wavers on its timeline. As written, health systems could comply with differing technologies, making the problem only slightly better.

I’m optimistic, as accelerating demand has spurred i14y innovation across the sector. HL7 is rapidly organizing support around FHIR and SMART. Incumbent integration engines are stepping up their game, and outside integrators are rapidly moving into healthcare. Startups are sprouting to tackle pieces of the problem. Some health systems are proactively standing up their own i14y strategies. EHR vendors are vowing to adopt standards and roll out tools to encourage application development. I don’t doubt that we’re beginning to see the fruits of the solutions that will be adopted in the years to come. But it’s on us, as providers, technologists, developers, and patients, to continue the rallying cry by demanding i14y now.

Niko Skievaski is co-founder of Redox.


Readers Write: HIMSS, Ice Cream, and the Law of Diminishing Returns (LoDR)

February 24, 2016 Readers Write 10 Comments

HIMSS, Ice Cream, and the Law of Diminishing Returns (LoDR)
By Mike Lucey

image

“Clearly the third scoop has fewer calories than the first and second. It is simply the law of diminishing returns.” This perverse application of the LoDR only returns a derisive, “You are pathetic” from my wife when used to justify the purchase of a large ice cream sundae. But I carry on and get the nuts on top — they are healthy.

It’s not that the LoDR doesn’t apply, just that I apply it to the wrong side of the counter. The medium at $2.75 (two scoops) and the large at $3.25 (three scoops) deliver less value to the ice cream lady with each added scoop. Extended further (five or six scoops?), it would reach the breaking point where the ice cream would cost more to scoop than it would return in cash.
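The counter-side arithmetic, as a throwaway sketch using the prices above:

```python
medium_price, medium_scoops = 2.75, 2
large_price, large_scoops = 3.25, 3

marginal = (large_price - medium_price) / (large_scoops - medium_scoops)
print(f"The third scoop brings the shop only ${marginal:.2f}")  # $0.50
```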

I wonder whether some in our industry are confused as to which side of the counter they are on and, more importantly, whether the LoDR will flip the counter when we are not looking. Are we effectively and consistently asking the question, “Am I getting more than giving, or giving more than getting, as I continue down this project path?”

Back in my days in financial services (maybe because our product was money), every project was systematically graded for current value. “Current” being the critical word: not graded against the expected value we assigned at the start, but against the current costs, current value, and (here’s the kicker) current alternatives.

image

The Boston Globe recently published an article citing a Health Policy Commission study of the disparate cost of care in Boston-area hospitals. Using maternity services as an example, the study found large differences in what hospitals charge.

For us in the healthcare IT industry, it is notable that four of the five highest-priced hospitals are actively using or have recently installed Epic, with its big price tag (three Partners hospitals, one UMass). This correlation raises the question: how much IT cost flows through the system, and are there effective checks against these rising costs? Did the LoDR flip the counter in these cases?

To Epic’s credit, there is a concerted effort on their part to control costs that are often embedded in questionable customization. In other words, the folks at Epic are applying the concern of the LoDR against the client’s impulse to chase the elusive “best” at ever-growing expense.

As we head toward HIMSS, our annual festival of IT goodies, we get to see a whole new set of “current” alternatives. Can we review the new stuff through the filter of the LoDR? For the stuff that is truly new to me, does it get me more than I need to give? And for the stuff that is newer than what I have, does it keep me on (or get me back on) the right side of the counter?

And for me the ultimate question: who’s giving away free ice cream? Because free ice cream has no calories. Everyone knows that.

Mike Lucey is president of Community Hospital Advisors of Reading, MA.


Readers Write: Removing Tunnel Vision from Enterprise Imaging

February 24, 2016 Readers Write 2 Comments

Removing Tunnel Vision from Enterprise Imaging
By Karen Holzberger

image

I find the evolution of technology to be fascinating. Just think about music. Fifteen years ago, CDs were the most popular way to access music. Now you can listen to music anywhere, instantaneously, from tiny devices. The population has universally embraced the change. Why has accepting change in healthcare been so slow and difficult?

I’m not saying we all need to be on the bleeding edge of innovation, but it’s important to remove the tunnel vision and recognize advances not just in diagnostic medicine or medical research, but also in health IT innovations that make things faster, easier, and less costly.

I was surprised when I read a recent report on enterprise imaging whose research and results were limited to organizations using vendor-neutral archive (VNA) or universal viewer (UV) technologies.

The need to access and store medical images has been the most common demand of radiology departments for decades, but to think that in 2016 enterprise imaging is done only with these two approaches is like taking a Polaroid camera to the beach and waiting a week for the film to be developed.

Don’t get me wrong. This report got it half right, but VNA and UV solutions don’t fit the needs of every organization, and that can lead people down the wrong path. If healthcare facilities are going to succeed in advancing the quality of patient care, then it is time to accept new and nimble health IT solutions for enterprise imaging that bring patient images to people’s fingertips as swiftly and securely as the cloud delivers your favorite song.

Over the last few years, cloud-based image exchanges have gained popularity as an option for enterprise imaging. A HIMSS Analytics Cloud Survey showed that 83 percent of healthcare organizations used cloud-based apps in 2014. While this simpler approach is not the same as a VNA, it allows facilities to achieve the same overall goals, often more efficiently. Facilities can be up and running on an image exchange in as little as two weeks and have central access to all necessary images via the cloud – anywhere, anytime.

VNAs are one of the oldest imaging technologies. When introduced, they finally allowed healthcare sites to collect data from all departments in one location and exchange that information with a broader audience. But what about patient care happening elsewhere and other types of patient data?

Today, it’s critical that facilities share information with other facilities, not just other departments within the same building. In addition, the shift to value-based care means facilities require quick, efficient technology that follows patients across a continuum, which takes more than just sending an image from point A to point B. Imagine only being able to listen to your favorite song on your iPod and not on any of your other connected devices.

VNAs can take up to two years to implement and can be horribly expensive. Further, since they don’t encapsulate all of a patient’s data, sites need to use them in conjunction with other solutions, such as a picture archiving and communication system (PACS), to have a complete enterprise imaging strategy.

Cloud-based imaging, on the other hand, provides more than the seamless sharing of images. It delivers real value and efficiencies, such as capturing and sharing all relevant patient data, just as the cloud lets you move your music, videos, and playlists effortlessly among your phone, tablet, and laptop. That is why I’m perplexed that society openly welcomes this technology in daily life, while accepting technology that can make life-saving differences has proved so challenging.

The time to embrace it is now. If not, I fear that we will only continue to set back an industry that so desperately needs to move forward.

Karen Holzberger is VP/GM for diagnostics at Nuance of Burlington, MA.

