
Readers Write: Medication Electronic Prior Authorization, the Next Big Thing for EHRs

July 16, 2014

By Tony Schueth


Electronic prescribing (ePrescribing) has surpassed the tipping point, where more prescriptions are being written electronically than on paper. Now the industry must start thinking about the next big thing that will take ePrescribing to the next level and address one of healthcare’s most inefficient processes: prior authorization (PA) of prescriptions.

With ePrescribing considered table stakes in an electronic health record (EHR), software developers should be thinking about innovations that will take ePrescribing from a humdrum utility to a must-have. Electronic prior authorization (ePA) for the pharmacy benefit offers that innovation opportunity.

EPA is the #1 ePrescribing capability desired by physicians, according to market research conducted by NCPDP’s ePA Task Group. In order to foster a standardized approach to satisfy this demand, NCPDP approved an electronic data interchange (EDI) standard for ePA last year.

By design, the ePA transaction can be integrated with the EHR ePrescribing work flow, enabling prescribers to complete the prior authorization process within two minutes as compared with the manual process, which involves many phone calls and faxes that can take days to weeks to complete (15 days, on average). Considering that specialty medications dominate the drug pipeline and require prior authorization up to 95 percent of the time, the need for ePA is urgent.

Seven states have mandated the use of ePA beginning in late 2014 and eight others are engaged in ePA regulatory activity. In May, the National Committee on Vital and Health Statistics (NCVHS) recommended that the Department of Health and Human Services adopt the NCPDP transaction as the standard for medication PAs. NCVHS recommendations regarding ePrescribing and related transactions often become requirements for payer participation in Medicare Part D.

The coming regulatory mandates afford EHR vendors the opportunity to be ahead of the curve. Rather than scrambling to meet multiple state regulatory deadlines at the last minute, vendors can take advantage of the interval between Meaningful Use (MU) Stages 2 and 3 to begin development of ePA functionality while there is still breathing room to concentrate on work flow enhancements.

The availability of ePA may sway some physicians in their EHR choice. Recently, Surescripts found that 28 percent of physicians surveyed would switch their EHR for one that supports ePA. While this percentage may be exaggerated based upon a single feature, there is no question that a robust replacement market for EHRs exists. Many physicians are looking to transition from early purchases of basic EHRs to more sophisticated solutions.

EDI networks such as Surescripts have begun offering ePA connectivity, while such established ePA services vendors as CoverMyMeds have introduced APIs to ease EHR integration. Some service providers offer connectivity for all PAs – even if a pharmacy benefit manager or other payer isn’t electronically enabled, electronically initiated PAs are delivered via fax.
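To make the integration concrete, here is a rough sketch of how an EHR might hand a prior authorization request to one of these connectivity services. Everything in it (the endpoint, field names, and authorization scheme) is invented for illustration, standing in for whatever a given vendor's API and the NCPDP ePA transactions actually specify; it is a sketch, not any vendor's real interface.

    # Hypothetical sketch of initiating an ePA request through a
    # connectivity service's REST API. Endpoint, fields, and auth are
    # invented for illustration; a real integration would follow the
    # vendor's documentation and the NCPDP SCRIPT ePA transactions.
    import requests

    EPA_SERVICE_URL = "https://epa.example-service.com/v1/requests"  # hypothetical

    def initiate_epa(patient, prescription, prescriber, api_key):
        """Send a PA initiation request; return the payer's question set."""
        payload = {
            "patient": patient,          # demographics and benefit details
            "medication": prescription,  # drug, quantity, days supply
            "prescriber": prescriber,    # NPI and contact information
        }
        resp = requests.post(
            EPA_SERVICE_URL,
            json=payload,
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

The EHR would render the returned question set inside the prescribing workflow, collect the prescriber's answers, and post them back for a determination, keeping the whole exchange within the two-minute window described above.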

The time is right. EPA is a logical and useful enhancement that physicians desire. A transaction standard that ensures compatibility is in place. Regulators are beginning to mandate its use. The number of PAs is growing. EDI networks and service vendors are eager to ease integration.

With the rare opportunity posed by the MU Stage 2 delay, vendors can roll out a new feature that is a “win-win-win-win” benefit for physicians, patients, payers, and EHR vendors.

Tony Schueth is founder, CEO and managing partner at Point-of-Care Partners of Coral Springs, FL.

Readers Write: Data Exchange with C-CDA: Are We There Yet?

July 16, 2014

By Brian Ahier


Do you think you have all the interoperability capabilities needed to meet current and future stages of the EHR Incentive Programs? A new study published in JAMIA found that most providers don’t.

The study concluded that providers are likely lacking critical capabilities. It found that some EHR systems still don’t exchange data correctly using the Consolidated Clinical Document Architecture (C-CDA), which may prevent providers from receiving future Meaningful Use (MU) incentives.

After sampling several platforms used to produce C-CDA files, the research team from the Substitutable Medical Applications and Reusable Technology (SMART) C-CDA Collaborative — funded by the ONC as part of the SHARP research program — found a number of technical problems and obstacles that prevented accurate data exchange between different EHR systems.

C-CDA documents are already being produced and exchanged at wide scale this year as providers work to meet Meaningful Use requirements under the EHR incentive program. Indeed, live production of C-CDAs is already underway for anyone using 2014 Certified EHR Technology (CEHRT). C-CDA documents enable several aspects of Meaningful Use, including transitions of care and patient-facing download and transmission.

Stage 2 Meaningful Use requires that providers be capable of producing C-CDA files, which contain both machine-readable and human-readable templates used to exchange patient data between EHRs during transitions of care. While all 2014 CEHRT must have the ability to create these files, some vendors are unfortunately not using the basic XML and HL7 technology correctly.

To find out how these variations affect providers and their participation in Stage 2, the researchers sampled 107 healthcare organizations using 21 EHR systems. They examined seven important elements of the documents: demographics, problems, allergies, medications, results, vital signs, and smoking status, all of which are required to be included in the C-CDA for Stage 2. They found errors in the XML that conflicted with HL7 standard usage, rendering the documents unable to meet the Stage 2 rules for interoperability.

One key takeaway from this research is that live exchange of C-CDA documents is likely to omit relevant clinical information and increase the burden of manual review for the provider organizations receiving the documents. Common challenges included omission or misuse of allergic reactions, omission of dose frequency, and omission of results interpretation. Unfortunately, only some of these errors can be detected automatically.


The team found 615 errors and data expression variations across 11 key areas. The errors included “incorrect data within XML elements, terminology misuse or omission, inappropriate or variable XML organization or identifiers, inclusion versus omission of optional elements, problematic reference to narrative text from structured body, and inconsistent data representation.”

"Although progress has been made since Stage 1 of MU, any expectation that C-CDA documents could provide complete and consistently structured patient data is premature," the researchers warned. The authors also note that more robust CEHRT testing and certification standards could prevent many of these troubling errors and variations in the technology and that the industry may also benefit from the implementation of data quality metrics in the real-world environment.

The researchers recommended several steps to improve interoperability: providing richer, more standardized samples in an online format; requiring EHR certification testing to include validation of codes and vocabulary; reducing the number of data elements that are optional; and improving monitoring to track real-world document quality.

The researchers make the case for using a lightweight, automated reporting mechanism to assess the aggregate quality of clinical documents in real-world use. They recommend starting with an existing assessment tool such as Model-Driven Health Tools or the SMART C-CDA Scorecard. This tool would form the basis of an open-source data quality service (a minimal sketch follows the list) that would:

  • Run within a provider firewall or at a trusted cloud provider
  • Automatically process documents posted by an EHR
  • Assess each document to identify errors and yield a summary score
  • Generate interval reports to summarize bulk data coverage and quality
  • Expose reports through an information dashboard
  • Facilitate MU attestation
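To make the core of such a service concrete, here is a minimal sketch of the kind of completeness check it might start with. It assumes a far simpler rule set than real assessment tools such as the SMART C-CDA Scorecard: it verifies only that the header carries patient demographics and that the body contains the coded sections the study sampled. The LOINC section codes are the standard ones C-CDA uses; the sample document and scoring formula are invented for illustration.

    import xml.etree.ElementTree as ET

    NS = {"hl7": "urn:hl7-org:v3"}

    # LOINC codes identifying the C-CDA sections the study examined
    # (smoking status travels in the social history section; demographics
    # live in the document header rather than in a body section).
    REQUIRED_SECTIONS = {
        "11450-4": "problems",
        "48765-2": "allergies",
        "10160-0": "medications",
        "30954-2": "results",
        "8716-3": "vital signs",
        "29762-2": "social history / smoking status",
    }

    def score_document(xml_text):
        """Return a crude completeness score plus a list of findings."""
        root = ET.fromstring(xml_text)
        findings = []
        if root.find(".//hl7:recordTarget/hl7:patientRole", NS) is None:
            findings.append("missing patient demographics in header")
        present = {c.get("code")
                   for c in root.findall(".//hl7:section/hl7:code", NS)}
        for code, name in REQUIRED_SECTIONS.items():
            if code not in present:
                findings.append(f"missing section: {name} ({code})")
        return 1.0 - len(findings) / (len(REQUIRED_SECTIONS) + 1), findings

    # A toy document with one coded (machine-readable) problems section
    # and its human-readable narrative text.
    SAMPLE = """<ClinicalDocument xmlns="urn:hl7-org:v3">
      <recordTarget><patientRole/></recordTarget>
      <component><structuredBody><component><section>
        <code code="11450-4" codeSystem="2.16.840.1.113883.6.1"/>
        <text>Hypertension, active</text>
      </section></component></structuredBody></component>
    </ClinicalDocument>"""

    score, findings = score_document(SAMPLE)
    print(round(score, 2), findings)  # 0.29 and five missing-section findings

A service built on this idea could run behind a provider firewall, score each document as the EHR posts it, and roll the results up into the interval reports and dashboard described in the list above.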

"However, without timely policy to move these elements forward, semantically robust document exchange will not happen anytime soon," the authors stated. “Future policy, market adoption and availability of widespread terminology validation will determine if C-CDA documents can mature into efficient workhorses of interoperability,” the report concludes. It would seem that if policy changes are not put in place, there could be risk in the Meaningful Use program not actually being all that meaningful.

This month, CMS released the proposed 2015 Physician Fee Schedule. Among other things, it proposes to revise the physician supervision requirements for Chronic Care Management (CCM) services and to require CCM practitioners to use EHRs certified to meet at least the 2014 Edition Meaningful Use criteria, which require the ability "to capture data and ultimately produce summary records according to the HL7 Consolidated Clinical Document Architecture standard."

Since this new proposed rule includes expanding the use of the certification program beyond Meaningful Use and specifically mentions the C-CDA standard, I thought I would ask Joshua Mandel, one of the authors of the study, for his thoughts.

"It’s not too surprising that CMS’s efforts to improve chronic care management would build on Meaningful Use requirements," he said. "In the section you’ve quoted, CMS, is simply saying that Eligible Providers would need to use MU-certified systems (just as they must use MU-certified systems to attest for MU incentive payments). And so C-CDA capabilities come along for the ride in that decision. I can certainly say C-CDA is better than nothing; and C-CDA 1.1 is a specification that exists and has been implemented today, so it’s a natural choice here."

While there are challenges in implementing and making good use of C-CDA documents, there is little doubt that HHS is continuing to drive the use of these standards forward through various policy levers. The ability to exchange relevant clinical information for transitions of care is a key enabler in transforming our healthcare system to paying for quality instead of quantity.

Despite these challenges, we are beginning to see success in the marketplace. Building on this success and continuing to improve content standards is critical if true interoperability is to become a reality.

Brian Ahier is director of standards and government affairs for Medicity, A Healthagen Business of Salt Lake City, UT.

Readers Write: Replace Your RFP with an RFS

June 20, 2014

By Patty Miller


I have been on both sides of an RFP, as a member of the organization issuing the RFP and the organization receiving the RFP. In one case, when on the issuing side, we did not select a vendor, as no traditional vendor solved our problem. We went back to the drawing board, so to speak.

From the receiving side, I have seen organizations spend hundreds of thousands to millions of dollars on software or services, only to realize little or none of their anticipated ROI because they implemented only part of the services or software or never completed the implementation.

What happened? Everything was spelled out in the RFP. All the vendors followed the steps and they had their evaluation process down.

The RFP is a wonderful tool to reduce price and squeeze an existing vendor for savings or to use when the solution is known. But what happens when there is a real problem to solve or when venturing into new territory?

That is where the RFS, or Request for Solution, comes into play. Entrepreneurs, innovators, and vendors are some of the most effective problem solvers we have in society today. How do we harness them? Bring these solution architects into our fold and share the insights we have into the problem we are trying to solve or the direction we would like to take. Then, we can create a solution that is used, solves challenges, and realizes the desired ROI.

Let’s use an example scenario. Our organization is looking to purchase analytics software. What is the desired outcome? Is this in line with our strategic goals? For this example, we would like to better understand how we are paid and ultimately be paid 10 percent faster.

Problem Definition Phase

In this phase, the solution architects are engaged as a group or one-on-one to hear what the problem and current state are. An NDA may be a good idea during this phase. To convey the problem and current state, a presentation, observation, or demonstration can be used. That’s it. Share the problem, nothing else.

In our case of looking to purchase analytics software, we find our DSO to be 89 days on average. Delays in cash collection inhibit our ability to reinvest in the company, thus delaying sales.
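As a quick refresher on the metric (with invented figures, since the article doesn't give the underlying numbers), DSO is conventionally computed as receivables divided by credit sales, scaled to the period:

    # Days sales outstanding (DSO), computed with made-up figures that
    # happen to reproduce the 89-day result used in this scenario.
    accounts_receivable = 2_670_000  # receivables outstanding at period end ($)
    credit_sales = 10_950_000        # credit sales over the period ($)
    period_days = 365

    dso = accounts_receivable / credit_sales * period_days
    print(f"DSO: {dso:.0f} days")  # DSO: 89 days

Every day shaved off that number frees cash that can be reinvested, which is the link to sales that this scenario draws.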

Ideal Future State Definition Phase

Describe and share with the solution architects what the desired outcomes look like when the problem is solved, such as: we will reduce defects or have zero defects; we can deliver 10 percent more of our product to the customer for the same price; we have reduced overtime by 10 percent.

Make these outcomes, along with any other outcomes they can deliver, part of the selection criteria, and convey this to the solution architects so that they will provide all the value they can. The ideal future state should align with organizational strategic goals as well.

In our scenario of looking for analytics software, we would like to bring DSO down from 89 days to 80 days and increase sales by 5 percent. Due to the high importance of cash in running a business, it is in our best interest to collect outstanding receivables as quickly as possible. By quickly turning sales into cash, we can put the cash to use again — ideally, to reinvest and make more sales.

Solution Definition Phase

During this phase, work collaboratively with each solution architect to refine the initial solution they provide. If this is a large project, this phase can take months. The payoff is implementing a solution or service that is used, solves challenges, and realizes the desired ROI.

In this phase, the solution architects are designing and proving, through POCs, demonstrations, references, or site visits, how their analytics software will help us reach our desired outcomes along with other benefits they can provide. We decided that the goals of this implementation will be to reduce DSO and to forecast sales more accurately, because the solution architects demonstrated that their solution could deliver the more accurate forecasting. They also demonstrated how that would lead to turning sales into cash.

Contracting Phase

Contracting should include deliverables and a timetable based on the collaboratively designed solution.

For our analytics software scenario, in contracting we ask for a guaranteed outcome, perhaps not as aggressive as our original goals. In some instances, it might be more aggressive, depending on what the solution architects have demonstrated they can deliver and whether our goals have changed in any way. Contracting can be difficult because everyone is trying to minimize risk, but ultimately everyone should have a little skin in the game, both solution architect and buyer. Include a change policy in the contract.

In our scenario of analytics software, we agreed that DSO would decrease by nine days in one year.

Implementation Phase

During this phase, the solution should be referenced frequently. If there are changes, they should be documented per the change policy agreed to in the contracting phase, especially if the anticipated outcomes change. Needs change during an implementation and, if additional value can be achieved, it may make sense to proceed in a slightly altered direction.

In our analytics software scenario, during implementation, some lean processes were put in place and DSO decreased by two days. We are now looking for the software to give us another eight days instead of the initial nine, although we think the benefits will be greater.

Solution Monitoring Phase

For a period of 3-12 months or more after the solution has been implemented, there should be frequent monitoring to ensure that the solution or service is utilized, has solved the challenge, and has realized the desired ROI. If not, the solution architects or the issuing organization should be held accountable per the contract.


If someone is truly interested in solving a problem, that person will have skin in the game. RFS issuing or receiving can be a scary process for an organization that is accustomed to the traditional RFP. The organizations interested in change will embrace the RFS; others may be resistant.

No organization can afford to spend its resources, time, people, and capital on a project that does not produce the desired outcome. I expect many vendors, entrepreneurs, and purchasers will welcome this collaborative solution design, but some will be resistant due to the insecurities associated with change.

Harnessing the power of the Request for Solution allows for bringing solution architects into your fold and implementing a solution that is used, solves challenges, and realizes a desired ROI.

Patty Miller is sales manager, health and sciences division, for TechDemocracy of Edison, NJ.

Readers Write: AMA Adopts New Recommendations on Telemedicine, Signaling Further Comfort with Telehealth

June 20, 2014

By Alexis Gilroy, JD


Earlier this month, the American Medical Association (“AMA”) approved recommendations regarding the provision of medical services using telecommunications technologies (commonly known as “telemedicine”). AMA’s report, on the heels of a policy adopted in April by the Federation of State Medical Boards, indicates the growth of telemedicine, an increased comfort level with telemedicine, and a desire to align legal and regulatory frameworks between medical services provided “in-person” and those provided using telemedicine.

In particular, AMA’s report provides an overview of key topics specific to telemedicine, including reimbursement, known practice guidelines, and telemedicine use cases, and it establishes a number of new AMA policies and recommendations regarding telemedicine services. The report is a significant departure from some of the AMA’s previous policies regarding the use of telemedicine, including a 1994 AMA opinion prohibiting physicians from providing clinical services via telecommunications (which, the report notes, “may no longer be consistent with the best ethical analysis”).

Most notably, perhaps, the AMA advocates equating the standard of care for services provided via telemedicine with the standard of care for in-person services. While this may seem like mere legal jargon to some, it has real potential to positively impact the digital health industry. This move signals an acknowledgement of telemedicine as an accepted delivery model akin to “in-person” delivery models. After all, we are talking about medical services in either context, with the difference merely being the venue of delivery.

Unfortunately, to date, regulators have not always viewed medical services provided “in-person” and via telemedicine similarly; adopted regulations in many states show a strong deference to traditional “in-person” services and, in some cases, a flat prohibition on services provided through telemedicine. For example, Texas and Alabama currently require an in-person exam prior to any services provided via telemedicine in a patient’s home, regardless of the patient’s illness or situation, creating significant roadblocks for telehealth providers in these states. This is especially frustrating for some home-based patients who could significantly benefit from engagement with a primary care or specialty physician using telemedicine.

With the AMA’s support through the new policy, similar to the Federation’s telemedicine policy, we may see state medical boards and other regulators rethink existing and proposed telehealth regulations that placed an across-the-board barrier on the delivery of some medical services merely because the provider chose to utilize telecommunications, rather than considering whether telemedicine could be used safely, and perhaps more effectively, for some patients and illnesses.

The AMA’s adopted recommendations about the delivery of health care services via telemedicine include the following items:

  • Establishing a valid patient-physician relationship. Telemedicine services should be based on a valid patient-physician relationship established prior to providing telemedicine services, which can be established through: (a) a face-to-face examination, where a face-to-face encounter would otherwise be required for providing the same service in-person; or (b) a consultation with another physician who has an ongoing patient-physician relationship with the patient and agrees to supervise the patient’s care; or (c) meeting standards of establishing a patient-physician relationship included as part of evidence-based clinical practice guidelines on telemedicine developed by major medical specialty societies, such as those of radiology and pathology. Although this recommendation does not explicitly describe what constitutes a “face-to-face examination,” the full report provides that “[t]he face-to-face encounter could occur in person or virtually through real-time audio and video technology.”
  • State licensure. Physicians and other health practitioners delivering telemedicine services must abide by state licensure and medical practice laws and requirements in the state where the patient receives services.
  • Choice of provider and information on provider credentials. Patients seeking care delivered via telemedicine must be offered a choice of provider. Further, patients receiving telemedicine services must have access to the licensure and board-certification qualifications of the health care practitioners providing care in advance of their visit.
  • Practice guidelines. The delivery of telemedicine services must follow evidence-based practice guidelines, to the degree they are available, to ensure patient safety, quality of care, and positive health outcomes.
  • Patient history and documentation. The patient’s medical history must be collected as part of the provision of any telemedicine service, the services provided must be properly documented, and a visit summary must be provided to the patient.
  • Care coordination. The provision of telemedicine services must include care coordination with the patient’s medical home and/or existing treating physicians, which includes, at minimum, identifying the patient’s existing medical home and treating physician(s) and providing such physician(s) with a copy of the medical record.
  • Emergency referral protocols. Providers must establish protocols for emergency referrals.
  • Privacy and security. Delivery of telemedicine services must abide by laws addressing the privacy and security of patients’ medical information.

Alexis Gilroy, JD is a partner with the Jones Day law firm of Washington, DC.

Readers Write: Six Ways to Capitalize on the ICD-10 Delay

June 9, 2014

By Dan Stewart


Most of the healthcare industry was taken by surprise when President Obama signed legislation that delayed the deadline to implement ICD-10 by at least a year. Now that there has been time to digest the new compliance date of October 1, 2015, healthcare providers may benefit by considering a more strategic approach for their transition to ICD-10.

Prior to the extension, many healthcare providers put patches in place to meet the quickly approaching October 1, 2014 compliance date. Process improvements and documentation training were put into high gear to meet the deadline and, in many cases, lacked strategic planning. With the additional time, providers can revisit their approach to implementation and potentially take advantage of other initiatives that directly impact the way their organization is evolving.

Here are six strategies to take advantage of the delay to be better positioned for post-transition success.

1. Increase clinical documentation and education

Providers now have an additional year to train their workforce. Nurses, physicians, coders, and even members of the C-suite need to understand the benefits for greater specificity in clinical documentation and how it applies to their role. Customized simulation training that addresses the specific educational needs of clinician groups can simplify the learning process and speed adoption. For example, customized simulation training can allow caregivers to practice documenting care in ICD-10 through their actual EHR application, which is critically important for learning workflow and gaining new knowledge about the system.

Any time and money invested in efforts like simulation training will be financially beneficial under ICD-9 and will also provide a smoother transition to ICD-10 with reduced risk of reimbursement issues. In addition, by continuing to engage staff with training, organizations can avoid losing the focus and interest created by the momentum leading up to the previous deadline.

2. Evaluate and improve the revenue cycle

Providers now have time to improve charge capture and billing and claims processing. Doing so will help to identify potential lost revenue and charge issues before claims are submitted and will improve compliance in anticipation of new denials and other post-transition challenges. Improved charge capture will also create a safety net to assist in identifying any potential ICD-10 process issues.

3. Implement computer-assisted coding (CAC) systems

Many hospitals have invested in CAC systems to aid coders in digesting physician documentation and determining which of the staggering 141,000 possible codes under ICD-10 is appropriate for each diagnosis and procedure. Now is the time to support the implementation of CAC and focus on coder workflow to optimize the benefits. Remote coding programs should also be evaluated. Incorporating tools like these not only reduces post-transition risk but also assists in the recruitment and retention of coders, who are in significantly increasing demand.

4. Begin dual coding

It is a reality that hospitals will need additional coders during the transition from ICD-9 to ICD-10. The extra time resulting from the delay creates an opportunity to begin dual coding sooner, providing physicians and coders additional practice before the compliance date. Prior to the transition, CAC systems can assist in the dual coding process by providing an automated crosswalk back to ICD-9 codes for submissions to payers, clearinghouses, and other third parties. The increased accuracy and efficiency of documentation and coding optimizes the post-transition period, mitigating the risk of compliance and reimbursement issues.
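For a sense of what the automated crosswalk step involves, here is a toy sketch loosely modeled on the CMS General Equivalence Mappings (GEMs) between ICD-10 and ICD-9. The two mapping rows are illustrative excerpts; real GEM files carry tens of thousands of rows plus approximate- and combination-mapping flags that a production crosswalk must honor, and unmapped codes still need a human coder.

    # Toy ICD-10-to-ICD-9 crosswalk in the spirit of the CMS GEMs.
    # Mapping rows are illustrative; real GEM files are far larger and
    # include flags that change how a row may be applied.
    GEM_ICD10_TO_ICD9 = {
        "E11.9": ["250.00"],    # type 2 diabetes w/o complications
        "J45.909": ["493.90"],  # unspecified asthma, uncomplicated
    }

    def crosswalk(icd10_codes):
        """Return ICD-9 candidates per ICD-10 code, flagging gaps for review."""
        return {
            code: GEM_ICD10_TO_ICD9.get(code, ["NO MAP - route to coder"])
            for code in icd10_codes
        }

    print(crosswalk(["E11.9", "A00.0"]))  # second code isn't in our toy map

In dual coding, the coder-assigned ICD-10 codes feed documentation and training feedback while the crosswalked ICD-9 codes continue to flow to payers, clearinghouses, and other third parties until the compliance date.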

5. Analyze the financial impact

Hospitals should take the time to perform an in-depth financial impact analysis to determine which codes have the highest impact on reimbursement, providing focus for operational remediation and training. Such analysis will additionally assist in identifying the reserves that may be needed to get through post-compliance stabilization.

6. Expand the implementation plan

The ICD-10 extension presents an opportunity to strategically link its transition with other initiatives like Meaningful Use, Patient-Centered Medical Home (PCMH), and Accountable Care. Combining plans to adopt all of these programs can help ensure they each work together as efficiently as possible.

Miami Children’s Hospital, for example, is working to deploy a revenue cycle management system in addition to working toward ICD-10 compliance. Now that there is less immediate pressure to have physicians trained as soon as possible on ICD-10, their training can occur after the new system modules are implemented to better reflect the healthcare provider’s specific system and workflow. Implementing both of these programs in tandem saves time and money and strengthens the success of each.


While it would be easy for healthcare providers to decide to pause their efforts to become ICD-10 compliant as a result of the recent delay, it would benefit them much more to view the extra time as an opportunity to take a more strategic approach. Continuing the process will position the provider for a more successful, efficient transition to ICD-10. 

Dan Stewart is vice president and partner of strategic consulting and advisory services with Xerox.

Readers Write: Al’s Story

May 27, 2014

Happy Memorial Day. Today’s article is dedicated with a special, heartfelt thank you to all of our veterans serving our country abroad and to those here at home. Many thanks to all of the family members of the soldiers currently serving in harm’s way and to those who have lost loved ones. You all truly demonstrate great courage on a daily basis.

Mr. HIStalk, thank you for being so supportive of the troops. I’ve been present at many events across the country where you have personally recognized and paid tribute to anyone who has served in the military.

I recently sat down with Captain Donna Rowe, who shared the story of her husband, Colonel Al Rowe.

Al’s Story
By Lisa Reichard, RN, BSN


Colonel Alvin G. “Al” Rowe

Al Rowe was born in Dubuque, IA in 1933. He became an Eagle Scout by the age of 12. He was a proud Iowa Hawkeye and graduated from the University of Iowa in 1956 with a bachelor’s degree in civil engineering. It was then that he entered the US Army as a Second Lieutenant through the university’s ROTC program. Al also received his master of science degree from Iowa State University. Like many soldiers, Al could have made six figures working in the private sector as a civil engineer, but instead he chose to serve his country and did so faithfully for 30 years.

In 1965, he was sent with the 82nd Airborne to quell a communist uprising in the Dominican Republic. He was in his Jeep with his comrades and battalion. Sniper fire from rooftops hit him in the head. His comrades saved his life. There would be no one left behind.


“Al [shown third from the right] loved his comrades and put them first. He was a soldier’s soldier who cared about his men,” said Donna.

My Sweetheart

According to Donna, “Al was treated for his injury at Fort Bragg, NC. This is how I came to meet him at Womack Army Hospital. He was my patient. I was a nurse supervisor at the time and we met briefly while he was recovering from surgery. Our first encounter was when I had to ask Al to quiet down. He was singing too loudly in the ward. Four days later when he was off duty, he asked to see me and if he could take me to dinner and I said OK. Although Al asked me for my number, I got busy and I walked off without giving it to him.”

“He called for three weeks to get my number, but since army policy is to never give out phone numbers, the ward would not release it. Finally, he called one of my friends, who got my permission to give Al my phone number. We finally had our dinner date, and when Al came to pick me up in a white T-Bird convertible, my Louisiana-native roommate at the time, Carol Burnett, said with a very southern accent, ‘Donna, he has come to pick you up in a white stallion and carry you away.’ We were married 18 months later in 1967.”

Newlyweds Sent to War

Al and Donna were sent to Vietnam during the peak of the war in 1968 and 1969. Donna served as a head nurse of the Third Field Hospital in Saigon, one of the largest shock-trauma-triage emergency rooms in Vietnam. Al served as an adviser and equipment supplier to soldiers in the field during combat.


“Al and I were married 47 years and 10 months. He was my best friend,” said Rowe.


Donna and Al in Vietnam, Christmas 1968: “We sent this photo home to our families.”

Remembering an American Soldier and War Hero

Donna explained Al was shot down five times in Vietnam, but survived. “The communities where Al served loved and respected him a great deal both here and abroad. The South Vietnamese awarded him the Vietnam Cross of Gallantry.”

Col. Rowe received many other military medals and decorations, including the Legion of Merit, the Bronze Star, the Meritorious Service Medal, the Joint Service Medal, the Army Commendation Medal, the Purple Heart, and the National Defense Service Medal. He was also a Master Parachutist.

After Vietnam, he went on to serve in the Pentagon, followed by the Army War College in Pennsylvania, before setting up forces command at Fort McPherson.


“Al [2nd from left] loved his comrades and put them first. He was a soldier’s soldier who cared about his men.”


Al’s promotion to colonel at Fort McPherson in Atlanta in 1974 with Donna and son Richard at far left

“Al was a wonderful family man, and he was very active in the community,” said Donna. “We have two wonderful sons. He was a father figure to many.” She continued, “The military life can be very tough on families. They make lots of sacrifices.”

Col. Rowe retired from the Army in 1983 and moved to Marietta, GA, where he worked for Lockheed as a research engineer. He also served as president of the Georgia Vietnam Veterans Alliance for four terms.

Another Battle

Col. Rowe contracted Lou Gehrig’s disease, a neurodegenerative condition that affects nerve cells in the brain and the spinal cord, and struggled with the debilitating disease for three to five years. Donna believes it was service-connected (US Department of Veterans Affairs – Agent Orange). “The journey with Lou Gehrig’s was difficult. It was another war that Al and I fought together.” She added, “The Department of Veterans Affairs in DC was wonderful during the illness. I really can’t say enough about how well we were treated.”


“Al served his country for 30 years, 10 months, and 22 days before he passed away on January 21, 2014. I miss him dearly. He was loved by many more friends and comrades-in-arms, and he will be dearly missed by everyone who knew him.”

Col. Rowe’s legacy lives on through many programs, including the Society of American Military Engineers (SAME), which provides scholarships.

Fast Forward to Telemedicine Possibilities

With the recent resignation of Robert Petzel, undersecretary for health for US Veterans Affairs, there is a lot of discussion around improving timely access to care. General Eric Shinseki, US Secretary of Veterans Affairs, recently said most veterans are satisfied with the quality of care they get, but more must be done to "improve timely access to that care." Telemedicine could help to improve compliance and provide specialized care while decreasing long appointment waits both in the fields and at home for veterans.

Donna was willing to share her thoughts on telemedicine. “I really think it would be great to have telemedicine for diabetes patient maintenance and for treatment of Post-Traumatic Stress Syndrome (PTSS). It would cut down on a lot of hassle around travel time, parking, and other logistics and could help to increase compliance with maintenance programs,” she emphasized. Donna said that telemedicine will be great for soldiers in the field and that email centers exist for communication.

Final Thoughts — Help a Veteran


Hire Heroes USA provides career placement assistance to all of our returning service men and women. Here are some vet-friendly employers, including several healthcare companies.

Thank a Veteran


Donna sharing stories with me from her personal memoirs.

Donna was candid and generous to share her photos for this article. This interview was a good reminder for me that, like Donna and Al, every soldier has their own unique story just waiting to be told. If you get a chance this Memorial Day or any day, talk to a veteran and thank them for their service to our country.

When I started the interview with Donna Rowe about her husband Al, I thought it would make her day. Instead, I left the interview knowing that she had made mine.


Lisa Reichard, RN, BSN is director of community relations at Billian’s HealthDATA. HIStalk also featured an interview with Donna Rowe on The Kathleen Story for Nurses Week in May 2012.

Readers Write: Narrow Networks: Blessing, Curse, Should You Care?

May 23, 2014

By Shawn Wagoner


Narrow networks = blessing. In its recommendations to improve the government’s ACO programs, the American Hospital Association is urging CMS to “create some financial incentive on the part of the beneficiary to choose to stay ‘in network’ so that their care can be coordinated.”

Narrow networks = curse. In Seattle and New Hampshire, healthcare organizations are taking legal action to prevent health plans from developing narrow networks.

Narrow networks = real. Regardless of where an organization falls on the blessing vs. curse spectrum, narrow networks are back and gaining momentum. McKinsey research finds that 70 percent of the plans sold on the individual exchanges created as part of the ACA are what they categorize as narrow and ultra-narrow hospital networks. There is also serious traction among the private sector companies that help finance health insurance for their employees. As evidence, a commercial health plan in Minneapolis now has roughly 30,000 members enrolled in private exchanges and over half of those enrolled have chosen a narrow network benefit product constructed around one of four available ACOs.

Former ONC Chief Dr. David Blumenthal recently wrote about narrow networks, suggesting that “by guaranteeing their chosen caregivers a certain volume of business, health plans acquire the leverage to negotiate better prices in future contracts.” The private exchange example from Minneapolis suggests that providers also agree to higher quality and patient experience standards in addition to the price concessions. In theory, these narrow networks have the potential to benefit all stakeholders:

  • Health plans pay lower prices to providers and can package those lower prices into lower cost and higher quality benefit products to attract consumers and members.
  • Consumers pay lower premiums to the health plans for higher-quality care.
  • Providers are assured that the members will use their services when the need arises. Additionally, more people than before will use their services because the lower-priced narrow network benefit products attract new patients.

Chances are that most organizations have a strategic plan that includes some form of a narrow network, whether a clinically integrated network, an ACO, or in many cases, both. Given their strategic importance and operational complexities, now is the time to start thinking about how to operate a narrow network effectively.

Recall the advent of high-deductible health plans a decade ago and how quickly patient responsibility grew as a percentage of revenue and the amount of process and technological change required in response. Likewise, narrow networks bring forth new yet similar challenges that will require a great deal of process change and technological advancement. Here are some thoughts to help assess the readiness of an organization:

Challenge #1: Patient transitions require improved coordination to track patient status in order to deliver on the higher quality standards and realize the financial benefit of ensuring patients are transitioned to in-network providers.

Operational considerations:

  1. Can pertinent portions of chart notes be shared among all in-network providers?
  2. Does an automated workflow exist to book follow-on appointments with in-network providers, both employed and affiliated?

Challenge #2: Narrow networks typically incent patients to stay in network for care by making it more expensive for them to receive treatment from an out-of-network provider.

Operational considerations:

  1. Is a system in place to respond to patient inquiries about whether a given provider or facility is in their network? (A toy lookup is sketched after this list.)
  2. Can providers easily determine who is in and out of network when they are recommending follow-on care?
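At their simplest, both considerations reduce to a fast directory lookup keyed by the patient's benefit product. The sketch below invents a product ID and provider NPIs purely for illustration; a real directory would be fed by payer and network rosters and kept current as the network changes.

    # Toy in-network lookup keyed by benefit product. Product IDs and
    # NPIs are invented for illustration only.
    NETWORK_DIRECTORY = {
        "acme-narrow-2014": {"1234567890", "9876543210"},  # provider NPIs
    }

    def is_in_network(product_id, provider_npi):
        return provider_npi in NETWORK_DIRECTORY.get(product_id, set())

    print(is_in_network("acme-narrow-2014", "1234567890"))  # True
    print(is_in_network("acme-narrow-2014", "5555555555"))  # False

A production version would sit behind the call center tools and the providers' referral workflow rather than a standalone script, but the underlying check is this simple.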

Challenge #3: Patients who choose narrow network products are cost conscious and expect their clinicians to be as well.

Operational considerations:

  1. Are clinical protocols broadly adopted that address the appropriateness of care so that patients are not faced with medical bills for unnecessary care?
  2. Are workup requirements established so that patients do not arrive at an appointment to find out that key steps were not completed and additional appointments are necessary?

Challenge #4: Patients have traded broad access via a wide open network of every provider and facility for a limited access option. However, limited access only refers to the number of physicians and facilities, not the ability to be seen in a timely manner.

Operational considerations:

  1. Are the individuals who handle inbound requests able to quickly view availability for all services within the narrow network to ensure the patient can get a timely appointment?
  2. Is this the time to start allowing patients themselves to book their own appointment online?

By no means is this an exhaustive list, but it should help quickly determine how prepared an organization is to support a narrow network strategy.


Shawn Wagoner is president of Proximare Health of Savannah, GA.

Readers Write: ATA Conference Recap: My Impressions of the Show

May 23, 2014

By Norman Volsky

After attending and walking the exhibit hall of the 19th Annual American Telemedicine Conference in Baltimore Monday and Tuesday, I walked away with several conclusions (besides Baltimore having the world’s most delicious crab cakes).

  • Telemedicine is a very exciting space. This market has the potential to help hospitals, patients, employers, and health plans reduce cost. There are also solutions out there which simultaneously improve quality and outcomes. This is a market that is poised for some tremendous growth.
  • The telehealth / telemedicine / telepresence (these all have different definitions) space could become commoditized very soon if it hasn’t already. There were a ton of companies that sold mobile carts, each with their own differentiators. Some were focused on providing their services at the lowest cost while others focused on quality and value. Either way, this market seems to be moving in the same direction that HIE and, more recently, EMR have gone in the past couple of years: toward consolidation and commoditization.


  • Telemedicine is geared towards multiple customers. There were some companies, like HealthSpot and American Well, that were showing off kiosks or pods designed for the retail sector, including pharmacies, large corporate headquarters, and supermarkets, as well as hospitals. American Well had impressive solutions geared towards tablets and smartphones. This is a market that could have some significant growth.
  • Remote patient monitoring software companies are poised for growth. Some focus on home health, while others focus on post-acute and more broadly, the entire continuum of care. The companies that collect data from wearable devices are particularly cool. Many of these companies have patient engagement capabilities, secure texting, and outbound or proactive phone calls to patients to make sure they are following their care plans. This segment of HIT helps hospitals qualify for Meaningful Use by reducing readmissions. ACOs and health plans are leveraging these types of software systems to reduce cost, risk, and readmissions (the holy HIT trinity). The majority of these companies are focused on high-risk populations which include chronic care patients, the elderly, and patients who have had a recent major operation or episode. Others are focused on wellness for population management. I was particularly impressed with the exhibits of CareVia, AMC Health, Ideal Life, and Tactio Health.
  • Unique software caught my eye. Specific companies that caught my eye had unique offerings such as iMDsoft (clinical information systems software geared towards perioperative and critical care) and MediSprout (a telemedicine platform that runs entirely on tablets and leverages existing HIT apps.)
  • Smaller vendors need additional funding. I asked a lot of companies about their revenue model and some of them didn’t have great answers. There was also some ambiguity as to who the economic buyer would be (patients, hospitals, payers, etc.). Many companies threw out buzzwords like population health management and care coordination, but it seemed to me that they need to better articulate why these types of solutions are important to providers and health plans. If these companies can show how their solutions connect to the larger healthcare picture, they will have a better chance of obtaining the funding they require.
  • This is a very sheltered segment of the industry. The majority of the companies whose booths I visited had no knowledge of HIStalk. Most were unfamiliar with the site, and many of these companies did not have a vast knowledge of the software world. At least half of the exhibiting companies were hardware focused, selling, for example, mobile carts with videoconferencing capabilities customized for healthcare.
  • The telemedicine segment should become more in tune with how their products and solutions fit within the broader healthcare IT market. With the previous conclusions in mind, these companies would be wise to keep abreast of blogs like HIStalk. They need to understand where hospitals are spending their money and what types of products and solutions will get the attention of hospital C-Level executives. With a better understanding of their competition for dollars, they would be more successful in articulating the right message to potential buyers. I also believe that partnering with some pure software companies could give them a more comprehensive and marketable offering to sell.

Overall, telemedicine is an area of healthcare that will have incredible growth over the next several years. There is a lot of competition in the telemedicine and remote patient monitoring segments and there will undoubtedly be some winners and losers. However, once the dust settles and consolidation occurs, the healthcare space will be better off. The ability to have doctor visits remotely and be able to monitor patients while they are at home is powerful. With this technology, hospitals and health plans will be able to reduce cost, risk and readmissions and, most importantly, save lives.

In conclusion, I feel this market is too siloed and needs a better understanding and exposure to the rest of the healthcare IT market. My advice for companies in this space would be to attend next year’s HIMSS conference in Chicago. I think doing so would be an eye-opening experience that would be extremely beneficial to this market’s inevitable growth. The better companies in this space understand how they fit into the bigger picture of healthcare, the better chance they will have to make it in both the short and long term.


Norman Volsky is director of mobile healthcare IT practice for Direct Recruiters, Inc. of Solon, OH.

Readers Write: EHR Usability – Whose Job Is It?

May 16, 2014

By Michael Burger


Near misses, good catches, or serious reportable events – how many of these could stem from a design flaw in the EHR used? This was an underlying question in a recently published article entitled “Poor Prescription documentation in EHR jeopardizes patient safety at VA hospital.” The article caught my eye because I thought perhaps it would contain information on a design flaw that might need to be addressed in ePrescribing software.

The article referred to a Department of Veterans Affairs Office of Inspector General report from December that cited a variety of documentation lapses regarding opioid prescriptions at the VA Medical Center in San Francisco. The EHR was a factor in the report primarily because the EHR is the place from which the documentation was missing.

From the headline of this article, the reader assumes that the EHR figures prominently in the patient safety hazard. In all probability, the same lapse in documentation would have occurred in a paper chart environment. The report found that 53 percent of opioid renewals didn’t have documentation of a provider’s assessment. I’d lay a sizable wager that the percentage would be the same or higher were the hospital to be using paper charts versus an EHR.

It seems to be sport these days to throw daggers at (dare I say beleaguered) EHRs and EHR vendors. Studies are published showing the levels of dissatisfaction with EHRs. ONC responds by introducing EHR usability requirements in the Meaningful Use standards and certification criteria. Inevitably, the focus of these activities centers on the notion that vendors purposely build EHRs that aren’t usable, are inept at training, and are uncooperative (or even sinister) about working together.

In reality, vendors are anything but purposefully uncooperative, inept, or builders of unusable products. Logically, how could a vendor stay in business if it were uncooperative, sold things that didn’t work, and failed at teaching people how to use its products? In the world of EHRs, there are forces at play that help to explain these perceptions.

EHR vendors, like creators of any other product, build software features based upon demand. The limitations to a development budget are time, scope, and resources. While any feature could be built, priorities must be set as to what to build and in what order, given the limitations.

Meaningful Use has disrupted this prioritization process by inserting requirements that have become high priority because they are necessary to pass the certification test but for which there is little or no customer demand. For example, no EHR user is asking for a way to document patient ethnicity. But there are plenty of requests for workflows that don’t require dozens of clicks. The challenge vendors face is that Meaningful Use requires focus on marginally useful features, such as tracking patient ethnicity, and doesn’t leave bandwidth to eliminate clicks in the workflow.

Ineptitude in training is an interesting claim. One very successful vendor is renowned for their “our way or the highway” mentality when it comes to training. Very effective, to be certain, though not a lot of fun for those receiving the training. But this method does set an appropriate expectation that workflow modification is required for successful EHR adoption. Other vendors are renowned for their mostly failed attempts to “make the software accommodate your workflow so you won’t have to change a thing.” The reality is that it’s not possible to insert a computer into a manual process like clinical workflow and expect not to have to change a thing. It’s not that a failing vendor is inept; it’s that expectations aren’t being set correctly.

Meaningful Use has inserted a perverse twist into this already unpleasant reality by forcing vendors to train clients to perform workflows that are out of context of what doctors would typically do but are now required to be able to attest.

The uncooperative accusation is the most laughable of all. Interfaces have been around since before there were EHRs – HL7 was founded in 1987. It’s a question of supply and demand. When customers demand an ability to connect disparate systems, vendors build interfaces. It’s true that vendors have built products using proprietary architectures, because until now no one was asking for common standards. Even today, with the availability and mandated use of common standards, less than 30 percent of doctors regularly access HIE data. There’s not a lot of demand for all of that external data. It’s not that vendors don’t build interfaces because they’re being uncooperative; it’s that providers aren’t asking for them.

The principle of supply and demand is a fundamental market driver. It’s disappointing that Meaningful Use has sidetracked the natural evolution of the market by creating artificial demand for EHR functions that aren’t being asked for by actual consumers. MU has had the unintended consequence of stifling innovation of the functionality being asked for by users, which would have spurred widespread organic adoption. We’ve not (yet) seen the iPod of electronic health records because vendors have been too busy writing code to pass the MU test.

Rather than introducing a voluntary 2015 Edition EHR certification, CMS and ONC should give vendors the year gained from the deferred start of MU Stage 3 to innovate the features customers really want, instead of adding more features and another certification to continue a harsh cycle.

Michael Burger is senior consultant with Point-of-Care Partners of Coral Springs, FL.

Readers Write: Liberating Data with Open API

May 16, 2014

By Keith Figlioli


Today, people all over the world use Twitter as a means of everyday communication. But how useful would the application be if you had to contact the company and get a custom code each time you wanted to post a thought? As ludicrous as this seems in the social media space, it’s reality in healthcare information technology.

For all the hype around electronic health records (EHRs), healthcare providers still lack the ability to easily access the data in them. In essence, this means that developers can’t simply build applications that meet a use-case need, because each system is closed behind a proprietary wall that requires custom coding to unlock it for add-on workflow applications. If you want to marry EHR data with pharmacy data so that doctors can be alerted when a medication hasn’t been refilled, for instance, health systems must contact their EHR vendor and pay to have that application developed to their specs.
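As a thought experiment, here is what that refill-alert app might look like if both the EHR and the pharmacy system exposed open REST APIs. Every endpoint and field name below is invented for illustration; the point is that, with open APIs, an app like this needs no vendor-specific custom build.

    # Hypothetical refill-alert app written against invented open APIs.
    # Endpoints and field names are illustrative only.
    from datetime import date, timedelta
    import requests

    EHR_API = "https://ehr.example.org/api"            # hypothetical
    PHARMACY_API = "https://pharmacy.example.org/api"  # hypothetical

    def overdue_refills(patient_id, grace_days=7):
        """Flag active prescriptions with no fill recorded past their due date."""
        rxs = requests.get(f"{EHR_API}/patients/{patient_id}/prescriptions",
                           timeout=30).json()
        alerts = []
        for rx in rxs:
            fills = requests.get(f"{PHARMACY_API}/fills",
                                 params={"rx_number": rx["rx_number"]},
                                 timeout=30).json()
            last = max((date.fromisoformat(f["filled_on"]) for f in fills),
                       default=None)
            due = (last + timedelta(days=rx["days_supply"]) if last
                   else date.fromisoformat(rx["written_on"]))
            if date.today() > due + timedelta(days=grace_days):
                alerts.append(f"{rx['drug_name']}: no fill since {last}")
        return alerts  # surfaced to the physician inside the EHR

With published APIs like these, a third-party developer could build and ship this alert without asking either vendor for a custom interface, which is precisely the kind of crowdsourced innovation the smartphone platforms unlocked.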

These walls around data have real consequences. Not only are healthcare providers spending millions on one-off applications, but they are missing innovation opportunities by requesting custom builds. In the case of smartphones, both Apple and Google released their application programming interfaces (API) for any developer to leverage, creating thousands of apps, many of which users would not have imagined on their own. In healthcare, these APIs don’t exist, meaning that apps are only developed if they are imagined by either the provider or the vendor, with all potential for crowdsourced innovation completely cut off.

Although it’s hard to put a price tag on missed opportunity, a McKinsey & Company report found that the US loses between $300 billion and $450 billion in annual economic potential because of closed data systems.[1] With more “liquid” data, McKinsey predicts new applications that close information gaps, enable best practice sharing, enhance productivity, support data-driven decision making, pinpoint unnecessary variation, and improve process reliability — all sorely lacking in today’s healthcare environment.

There’s also a price for patients. According to a recent Accenture poll, 69 percent of people believe they have a right to access all of their healthcare data in order to make decisions about their personal care. Yet more than three-quarters of these patients (76 percent) have never accessed their EHR, chiefly because they don’t know how to and because they have no way to integrate EHR data with other applications, such as those that track weight, diet, or exercise via a smartphone or home computer.

Two forces need to align in order to facilitate change. First, in the marketplace, healthcare providers and patients both need to advocate for open APIs and liquid data in order to get the most out of healthcare applications. With increased demand for open access, market forces will be unleashed to prevent closed systems from being introduced for a single vendor’s financial gain. Moreover, with open systems and free access to development platforms, EHR vendors can differentiate themselves with the diversity and utility of the apps built to work with their systems, creating added value for end users.

Second, we need a policy environment that enables innovation. One way to achieve this would be for the Office of the National Coordinator to require open APIs for health data. In an optimal environment, vendors would have to demonstrate that data can be extracted via open API and leveraged by third-party software developers.

The business of healthcare should not be predicated on keeping data trapped behind proprietary walls. Given the critical need to use data to better predict, diagnose, and manage population health, the truly differentiated vendor is one that allows open access and third-party application development in order to create systems that providers and patients truly value. It’s time to liberate information and unleash innovation in healthcare.

[1] McKinsey & Company, “Open Data: Unlocking Innovation and Performance with Liquid Information,” October 2013, p. 11.

Keith Figlioli is senior vice president of healthcare informatics for Premier, Inc. of Charlotte, NC.

Readers Write: FDASIA and Healthcare’s Moon Shot Goal of ICU Safety

May 15, 2014 Readers Write 7 Comments

FDASIA and Healthcare’s Moon Shot Goal of ICU Safety
By Stephanie Reel


Preparing for the FDASIA panel was an energizing opportunity. It allowed me to spend a little time thoughtfully considering the roles of government and private industry in the future of health IT integration and interoperability. It gave me an opportunity to think a great deal about the important role ONC has played over the past few years, and it made me question why we haven’t achieved some of the goals we set.

As I was preparing my remarks, I reflected on the great work being done by my colleagues at Johns Hopkins and our vendor partners. We have the distinct privilege of having the Armstrong Institute at Hopkins focused on patient safety and quality, generously funded by Mr. Mike Armstrong, former chairman of the board of trustees for Johns Hopkins Medicine. It is unequaled and a part of our fabric and our foundation. The Armstrong Institute is inspirationally led by Dr. Peter Pronovost, an incredibly well-respected leader in the field of patient safety as well as a trusted colleague and a good friend.

We in IT at Hopkins receive exceptional support from our leadership – truly. We also have amazingly strong partnerships with our school of medicine faculty, our nurses, and our enterprise-wide staff. I suspect we are the envy of most academic health systems. The degree of collaboration at Hopkins is stunning – in our community hospitals, physician offices, and across our academic medical centers. Our systems’ initiatives derive direct qualitative and quantitative benefit from these relationships. Our CMIO, Dr. Peter Greene, and our CNIO, Dr. Stephanie Poe, are the best of the best in their roles. The medical director of our Epic deployment, Dr. John Flynn, is a gift.  

We are luckier than most. We could not do what we do without them. But despite this impressive and innovative environment, we still have significant challenges that are not unique to Hopkins. 

Despite huge investments and strong commitments to Meaningful Use, we have challenges across all health IT initiatives. They aren’t new ones, and they aren’t being adequately addressed by our current commitment to Meaningful Use criteria. We are still not operating in a culture adequately committed to safety and patient- and family-centered care. We are still not sufficiently focused on technologies, processes, and environments that are consistently oriented toward doing everything in the context of what’s best for the patient.

We decided to try harder. Across Johns Hopkins Medicine, we published a set of guiding principles for our approach to the deployment of information technology solutions. These guiding principles reduce ambiguity and provide constancy of purpose. They drive the way we make decisions, prioritize our work, and choose among alternatives: investment alternatives, deployment alternatives, vendor alternatives, integration tactics, and deployment strategies. They provide a “true north” that promotes the culture we are hoping to create.

Our first guiding principle expects us to always do what is best for the patient. No question, no doubt, no ambiguity. We will always do what is best for the patient and for the patient’s family and care partners. We are committed to patient safety and it is palpable. This is our true north.

Our second guiding principle allows us to extend our commitment even further. We commit to always doing what is best for the people who take care of patients as well. So far, we have never found this to be in conflict with our first guiding principle. We view the patient and the patient’s family as our partners. Together, we are the team. Our environment, our work flow, our processes, and our technologies need to do what is best for all members of the team and all of the partners in the process of disease prevention, prediction, and treatment.

Our remaining guiding principles deal with our commitment to integration, standardization, and best practices. We know that unmanaged complexity is dangerous. We know that there are opportunities to improve our processes and our systems if we are always focused on being a learning healthcare system. We know we can achieve efficiencies and more effective solutions if we also achieve some degree of standardization and data and system integration. This is essential, critically important, and huge. It is something FDASIA (the FDA, FCC, and ONC) and the proposed Safety Center may be able to help us address.

Is this the best role for government?

Government has an important role and government has the power to convene, which is often critical. But I also feel strongly that market forces are compelling and must be tapped to help us better serve our patients and the people who care for our patients. Health systems and hospitals have tremendous purchasing power. We should ensure we define our criteria for device and system selection based upon the vendor’s commitment to integration, standardization, and collaboration around best practices. We must find a way to promote continuous learning if we are to achieve the triple aim. 

We need to step up. We need to say we will not purchase devices, systems, and applications if the vendors are not fully and visibly committed to interoperability and continuous learning. This must be true for software, hardware, and medical devices. It must be true for our patients and for the people who care for our patients.

Moon shot goal

This relates to my plea that we define a moon shot goal for our nation. We must commit to having the safest healthcare delivery system in the world. We should start with our intensive care units. We must ensure that our medical devices, smart pumps, ventilators, and glucometers are appropriately and safely interoperable. We must make a commitment to settle for nothing less. We must agree that we will not purchase devices or systems that do not integrate, providing a safe, well-integrated solution for our patients and for the people taking care of our patients.

Let’s decide as a nation that we will place as much emphasis on safety as we have on Meaningful Use. Or perhaps we can redefine Meaningful Use so that its criteria, goals, and objectives ensure we meet our moon shot goals. We will ensure that we have the safest hospitals in the world, and we will start with our ICUs, where we care for the most vulnerable patients. We might even want to start with our pediatric ICUs, where we treat the most vulnerable patients of all.

More than 10 years ago, I was given an amazing opportunity to “adopt a unit” at The Johns Hopkins Hospital as a part of a safety program invented at Hopkins by Dr. Peter Pronovost. Each member of our leadership team was provided with an opportunity to adopt an ICU. We were encouraged to work with our ICU colleagues to focus on patient safety. We were educated and trained to be “preoccupied with failure” and focused on any defects that might contribute to patient harm. We didn’t realize it at the time, but we were learning how to become a High Reliability Organization.  

I learned quickly that our ICUs are noisy, chaotic, extremely busy, and not comforting places for our patients or their families. I learned that our PICU was especially noisy. Some of our patients had many devices at their bedside, nearly none of which were interoperable. They beeped, whirred, buzzed, and sent alarms – many of which were false alarms — all contributing to the noise, complexity, and feeling of chaos. They distracted our clinicians, disturbed our patients, and worried our family partners. 

Most importantly, they didn’t talk to one another. So much sophisticated technology, in the busiest places in our hospitals, all capable of providing valuable data, yet not integrated – not interoperable – and sometimes not even helpful.

I realized then, and many times since I adopted the PICU, that we all deserve better. Our patients and the people who care for our patients deserve better. We must build quiet ICUs where our care team can listen and learn and where our patients can receive the care they need from clinicians who can collaborate, leveraging well-integrated solutions and fully integrated information to provide the safest possible care. Many of these principles influenced the construction of our new clinical towers that opened two years ago. Again, we are fortunate, but huge challenges remain.

What about Quality Management Systems? Are we testing and measuring quality appropriately?

In many ways, I think we may focus too much on the back end. Perhaps we spend too much time on testing and not enough on leading affirmatively. A commitment to excellence, to high reliability, might lessen the complexity of our testing techniques. I am very much committed to sophisticated quality assurance testing, but it seems far better to create and promote a culture that is committed to doing it right the first time. It will also be important that we affirmatively lead the design and deployment of our systems rather than relying only on testing our solutions.

With that in mind, I would prefer to see an additional focus or strategy that embraces High Reliability at the front end in addition to using quality management techniques. We undoubtedly need both. 

As I have recently learned, most High Reliability Organizations have much in common related to this dilemma. We all operate in unforgiving environments. Mistakes will happen, defects will occur, and we need to be  attentive. But we must also have aspirational goals that cause us to relentlessly focus on safety at the front end. We must remain passionate about our preoccupation with failure. We must recognize that our interventions are risky. We must have a sense of our own vulnerabilities and ensure we recognize we are ultimately responsible and accountable despite our distributed and decentralized models. We must continue to ask ourselves, “How will the next patient be harmed?” and then do everything possible to prevent harm at the front end as well as during testing.  We must create a culture that causes us to think about risk at the beginning.  And of course, we must be resilient, reacting appropriately when we do recognize errors, defects, or problems.

I should note that many of these ideas related to High Reliability are well documented in Karl Weick and Kathleen Sutcliffe’s book, Managing the Unexpected. They encourage “collective mindfulness” and a shared understanding of the situation at hand. Their processes center on five principles: a preoccupation with failure, reluctance to simplify interpretations, sensitivity to operations, commitment to resilience, and deference to expertise.

Why the moon shot goal?

As Dr. Pronovost at Johns Hopkins Armstrong Institute often says, “Change travels at the speed of trust.” We need to learn from one another. We need to be transparent, focused, and committed to doing what is best for our patients and for the people who care for our patients. We must commit to reducing patient harm. We must improve the productivity and effectiveness of our healthcare providers. We must have faith in our future and trust our partners. We need to make a commitment to no longer expect or accept mediocrity. 

From a recent study performed at the Armstrong Institute under Dr. Pronovost’s leadership, we know that patients around our country continue to die needlessly from preventable harm. Healthcare has little tangible improvement to show for its $800 billion investment in health information technology. Productivity is flat. Preventable patient harm remains the third leading cause of death in the US.

In addition, the costs of care continue to consume ever-larger and unsustainable fractions of the economy in all developed countries. While cutting payments may slightly decrease the cost per unit of service, improving productivity could deflate costs far more significantly. Other industries have dramatically improved productivity, largely through investments in technology and in the systems engineering needed to obtain maximal value from that technology. Yet healthcare productivity has not improved. Our nurses respond to alarms — many of them false — on average every 94 seconds. This would be unthinkable in many other environments.

Despite my view that we must encourage market forces, we have a long way to go before we have an ICU designed to prevent all patient harm while also reducing waste. Clinicians are often handed technologies that manufacturers designed with limited clinician usability testing. These technologies often do not support the goals clinicians are trying to achieve, often hurt rather than help productivity, and have a neutral or negative impact on patient safety.

Moreover, neither regulators nor the market has applied sufficient pressure on electronic health record vendors or device manufacturers to integrate technologies that reduce harm. No one has yet designed a unit that prevents all patient harm, optimizes patient outcomes and experience, and reduces waste. Hospitals continue to buy technologies that do not communicate.

It is hard to imagine Bloomberg News succeeding if there were no standards for sharing financial and market data. It would be unthinkable for Boeing to continue partnering with a landing gear manufacturer that refused to incorporate a signal to the cockpit informing the pilot whether the landing gear was up or down. We need the same expectations of engineering, medical, and clinical trans-disciplinary collaboration in healthcare.

Back to the moon shot….

An ideal ICU is possible if we decide it matters enough. If we agree to combine trans-disciplinary collaboration with broad stakeholder participation and demand authentic collaborations, we can get there in less than five years. But it won’t be trivial. It will require a public/private partnership.

The cultural and economic barriers to such collaborations are profound. Engineers and physicians use different language, apply different theories and methods, and employ different performance measures. We must take a holistic approach to create the ideal ICU and the ideal patient and family experience.

A safe, productive system is possible today. Technology is not the barrier. Let’s make it happen. Let’s have a goal for 2020 that we will have the safest ICUs (and the safest hospitals) on the planet – focused on patient- and family-centered care, disease prevention, and personalized and individualized healthcare.

Stephanie L. Reel is CIO and vice-provost for information technology at Johns Hopkins University and vice-president for information services for Johns Hopkins Medicine of Baltimore, MD.

Readers Write: What is a Patient Safety Organization and Should You Join One?

May 7, 2014 Readers Write Comments Off

What is a Patient Safety Organization and Should You Join One?
By Brenda Giordano, RN, MN


Can they really say that?

In 2011, the government asked Walgreens for information about two of its pharmacists. Walgreens said “no” to the request. There was nothing the government could do about it — Walgreens belonged to a Patient Safety Organization (PSO).

If you are a provider and are unfamiliar with PSOs, take six minutes to read this article. You’ll not only learn how to have a more just and fair culture of safety, but also how to have stronger legal protections for the work your teams do with safety events.

Nine years ago this July, Congress passed the Patient Safety and Quality Improvement Act of 2005, also called the Patient Safety Act. This law created a system of voluntary reporting to Patient Safety Organizations (PSOs) of safety events, near misses, and unsafe conditions, similar to what is available within aviation. At the same time, a Network of Patient Safety Databases (NPSD) was established so data could be analyzed and we could all learn why safety events occur and how to avoid them. 

The ultimate aim is to improve safety, but in a manner that also creates environments where working through the nitty gritty of what happened and why it happened can be done with legal protection and confidentiality. This freedom to fully explore safety events and safety data should foster a Just Culture, where reporting an event does not result in punishment, but rather in learning.

Let me take a pause here to lay this out very plainly. Provider organizations (hospitals, skilled nursing facilities, pharmacies, home health, ambulatory care, physician and dentist offices, laboratories, renal centers, ambulance and paramedic services, and so forth) can receive legal protections from discovery in the case of a civil lawsuit if they belong to a PSO and put together a Patient Safety Evaluation System. This means that if, heaven forbid, you as a provider find yourself being sued, there are strict limits on what can be “discovered” (think “uncovered”).

Two things can be discovered: the facts of the case (what is in the medical record) and the billing and discharge information. Everything else is legally protected, with a few sensible exceptions. That protection covers, for example, the committee meetings where specific safety events are discussed and the information gained from root cause analysis.

If you hang around a hospital, clinic, or any of the above-mentioned care areas, you probably know that after an event, the Risk people often rush in and tie people’s hands on what is documented. They are afraid that a lawsuit will uncover all kinds of things that the facility would be liable for, that would make them look bad, or that would hurt their reputation.

This is a logical approach, but sometimes the end result is that little is learned and progress on safety is slow, since everyone’s mode is CYA (the only acronym I decided not to spell out). I really wish it were not like this, because I truly believe that complete transparency is the better road to take.

The reality is that few organizations have the guts to be fully transparent. The legal protection provided by this law tries to break up that bad cycle of “burying our mistakes” and remove the fear so that honest work on safety improvement can happen.

Comparative safety information is hard to obtain. To address this, the Agency for Healthcare Research and Quality (AHRQ) created a common format so that event information from any safety reporting system can be placed into 10 categories. Research can then be done on falls, medication errors, and so forth. PSOs send de-identified information to AHRQ in this common format for addition to the Network of Patient Safety Databases.
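To illustrate the idea only (this is not AHRQ’s actual specification), a safety reporting system might tag each event with a category and strip direct identifiers before submission to a PSO:

    # Illustrative sketch: the categories and field names below are a
    # made-up subset, not AHRQ's actual Common Formats specification.
    EVENT_CATEGORIES = {"fall", "medication_error", "other"}
    IDENTIFYING_FIELDS = {"patient_name", "mrn", "reporter_name"}

    def prepare_for_pso(event):
        # Force unknown categories into "other" and drop direct identifiers.
        if event.get("category") not in EVENT_CATEGORIES:
            event["category"] = "other"
        return {k: v for k, v in event.items() if k not in IDENTIFYING_FIELDS}

    report = prepare_for_pso({
        "category": "medication_error",
        "description": "Wrong dose dispensed; caught before administration.",
        "patient_name": "Jane Doe",
        "mrn": "123456",
    })
    # report now carries the category and narrative, but no identifiers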

Here are a few reasons for joining a PSO.

  • It encourages a healthy culture of safety. It’s hard to learn when you are worried that you’ll be punished or found out in a public way. A PSO helps to remove the “whack of the ruler across the knuckles” attitude that does not help anything. The intent of the law is to foster learning, not place blame. We all want to improve safety and quality for our patients. A healthy Just Culture of safety can foster this.
  • Do it while it’s voluntary (unless you have really bad readmission rates). Joining a PSO is voluntary, but in the future, hospitals with 50 beds or more will need a Patient Safety Evaluation System in place to participate in state insurance exchanges (the exact date is not set). By joining a PSO now, hospitals can be prepared for this eventuality with a good system in place. No one knows if the PSO program will ever be mandatory, but knowing the government… As for the readmission exception, courtesy of the Affordable Care Act: if the Secretary of HHS has determined you are eligible, well, you probably know who you are and why you need to be part of a PSO.
  • Remove wasteful costs that come with poor safety. Safety-related lawsuits are costly to defend. In addition, liability carriers increase premiums when they have to defend you a lot. Imagine having your carrier tell you, “Your premiums will be going down because we’ve had so few cases where we needed to defend you.” Wouldn’t that be nice?
  • Compare and collaborate with other organizations. PSOs can provide de-identified regional and national safety benchmarks. Knowing where you stand can help you focus your improvement efforts and see where to give praise. PSOs can also broker collaboration among their members so they can share what they have learned. It’s great to have a buddy outside your own system where mutual learning is not just allowed, but encouraged.

There are around 80 PSOs. Some are specialty based, others are state based, and many will cover multiple types of providers across the US. I hope you will consider joining one.

Brenda Giordano, RN, MN is operations manager of the Quantros Patient Safety Center, a federally-listed PSO serving 4,000 facilities, of Milpitas, CA.

Readers Write: The Engaged Healthcare Consumer is a Myth

The Engaged Healthcare Consumer is a Myth
By Tom Meinert

Although I am a reader, this may be more appropriately titled “Patient Writes.” I have been feeling cynical lately regarding healthcare. It seemed to culminate with a recent experience with my health insurer.

My daughter had to get an MRI. Shortly thereafter, I received the bill for my portion and it was a nice, whopping $1,000. Aside from the sticker shock, I was surprised because I had an MRI about a year and a half ago and only had to pay $400. It’s the same insurance plan for both of us, and although the MRIs were done at different places, the difference in sticker price between the two was $250. I was expecting to pay a bit more, but somehow a $250 difference actually cost me $600 more out of pocket.

I called my insurer. After 40 minutes, I learned the following.

  • It’s not just what the hospital charges, but how they bill it. That may significantly change how much I have to pay out of pocket.
  • The people I call don’t have the information about the true cost to me.
  • Even if they have this information, they can’t share it with me.

This call confirms what has bothered me all along. Despite all the talk and hype about patient engagement, the consumerization of healthcare, and mHealth ushering in a whole new world along with the ACA, the concept of an engaged and informed consumer of healthcare is a myth.

I have worked my entire professional career in healthcare IT. I believe technology can help change healthcare for the better. Admittedly, most of my time has been spent on projects that improve care within a hospital.

But the more articles I read, conferences I attend, and apps I play around with, the less impressed I am when I compare them against my own experiences as a patient. Worse, I can’t see how healthcare is going to fundamentally change for the better.

I am frustrated and powerless. Over the past four years, my health has improved greatly. I have lost weight. My cholesterol is way down, along with my blood pressure and resting heart rate. Yet over this time, my personal out-of-pocket costs and premiums have increased.

I live in Massachusetts, which is at the forefront of healthcare reform. I can’t see a PCP without booking nine months in advance. I can see them only once every two years because that’s what insurance covers for a well visit.

Like more and more of us who have been pushed into high-deductible plans, I can only begin to empathize with patients with complex problems who must navigate the world of billing.

I hardly feel like a consumer of healthcare. So far, the definition of a healthcare consumer seems to be simply proportional to the amount of cost pushed onto me: the more out-of-pocket costs I bear, the more of an engaged consumer I am.

However, pushing costs onto me certainly does not make me an empowered and informed consumer. And it certainly doesn’t incentivize me to be healthy.

Going back to my phone call, it ended as expected. There was no real resolution. I still have to pay the bill. 

Those calls are recorded “for quality,” so I left one last comment out of frustration. Despite this company telling me that it has no real way to get at this information, someone there has it readily available at all times: the person who sends out all those bills that are accurate down to the last cent.

Readers Write: Can Intuitive Software Design Support Better Health?

April 16, 2014 Readers Write Comments Off

Can Intuitive Software Design Support Better Health?
By Scott Frederick


Biometric technology is the new “in” thing in healthcare, allowing patients to monitor certain health characteristics—blood pressure, weight, activity level, sleep pattern, blood sugar—outside of the healthcare setting. When this information is communicated to providers, it can help with population health management and long-term chronic disease care. For instance, when patients monitor their blood pressure using a biometric device and upload that information to their physician’s office, the physician can monitor the patient’s health remotely and tweak the care plan without having to physically see the patient.

For biometric technology to be effective, patients must use it consistently so that it captures a realistic picture of the health characteristics being monitored. Without regular use, it is hard to tell whether a reading is an anomaly or part of a larger pattern. The primary way to ensure consistent use is to design user-friendly biometric tools: it is human nature to avoid things that are too complicated, and individuals won’t hesitate to abandon a biometric device that is onerous or complex.

Let’s look at an example.

An emerging growth area for healthcare biometrics is wireless activity trackers—like Fitbit—that can promote healthier lifestyles and spur weight loss. About three months ago, I started using one of these devices to see if monitoring metrics like the number of steps I walked, the calories I consumed, and the hours I slept would make a difference in my health.

The tool is easy to use and convenient. I can monitor my personal metrics any time, anywhere, allowing me to make real-time adjustments to what I eat, when I exercise, and so on. For instance, at any given time, I can tell how many steps I’ve taken and how many more I need to take to meet my daily fitness goal. This shows me whether I need to hit the gym on the way home from work or whether my walk at lunch was sufficient. I can even make slight changes to my routine, choosing to stand up during conference calls or take the stairs instead of the elevator.

I download my data to a website, which provides easy-to-read and customizable dashboards, so I can track overall progress. I find I check that website more frequently than I look at Facebook or Twitter.

Now, imagine if the tool was bulky, slow, cumbersome, and hard to navigate, or if the dashboard where I view my data was difficult to understand. I would have stopped using it a while ago—or never have started in the first place.

As with any hot technology, several wireless activity trackers are infiltrating the market, each one promising to be the best. In reality, only the most well-designed applications will stand the test of time. These will be completely user-centric, designed to easily and intuitively meet user needs.

For example, a well-designed tracker will facilitate customization so users can monitor only the information they want and change settings on the fly. Such a tool will have multiple data entry points, so a user can upload his or her personal data any time and from anywhere. People will also be able to track their progress over time using clear, easy-to-understand dashboards.
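As a sketch of what those design goals imply technically (the endpoint and payload below are hypothetical, not any real tracker’s API), multiple entry points reduce to one simple upload call that any device or app can make:

    import json
    import urllib.request  # standard library only

    TRACKER_API = "https://tracker.example.com/v1/entries"  # hypothetical

    def log_entry(user_token, **metrics):
        # Upload only the metrics this user chose to track; the server
        # treats every field as optional, which is what enables the
        # on-the-fly customization described above.
        body = json.dumps({"metrics": metrics}).encode()
        req = urllib.request.Request(
            TRACKER_API, data=body,
            headers={"Authorization": "Bearer " + user_token,
                     "Content-Type": "application/json"})
        urllib.request.urlopen(req)

    # The same call works from a phone app, a web form, or a device sync job:
    # log_entry("token", steps=8431, sleep_hours=6.5)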

Going forward, successful trackers may also need to keep providers’ needs in mind. While physicians have hesitated to embrace wireless activity monitors—encouraging patients to use the technology but not leveraging the data to help with care decisions—that perspective may be changing. It will be interesting to see whether physicians start looking at this technology in the future as a way to monitor their patients’ health choices. Ease of obtaining the data and having it interface with existing technology will drive provider use and acceptance.

While biometric tools are becoming more common in healthcare and stand to play a major role in population health management in the future, not every tool will be created equal. Those designed with the patient and provider in mind will rise to the top and improve the overall health of their users.

Scott Frederick, RN, BSN, MSHI is director of clinical insight for PointClear Solutions of Atlanta, GA.

Readers Write: Addressing Data Quality in the EHR

April 16, 2014 Readers Write 1 Comment

Addressing Data Quality in the EHR
By Greg Chittim


What if you found out that you might have missed out on seven of your 22 ACO performance measures, not because of your actual clinical and financial performance, but because of the quality of data in your EHRs? It happens, but it’s not an intractable problem if you take a systematic approach to understanding and addressing data quality in all of your different ambulatory EHRs.

In HIStalk’s recent coverage of HIMSS14, an astute reader wrote:

Several vendors were showing off their “big data” but weren’t ready to address the “big questions” that come with it. Having dealt with numerous EHR conversions, I’m keenly aware of the sheer magnitude of bad data out there. Those aggregating it tend to assume that the data they’re getting is good. I really pushed one of the major national vendors on how they handle data integrity and the answers were less than satisfactory. I could tell they understood the problem because they provided the example of allergy data where one vendor has separate fields for the allergy and the reaction and another vendor combines them. The rep wasn’t able to explain how they’re handling it even though they were displaying a patient chart that showed allergy data from both sources. I asked for a follow up contact, but I’m not holding my breath.

All too often as the HIT landscape evolves, vendors and their clients are moving too quickly from EHR implementation to population health to risk-based contracts, glossing over (or skipping entirely) a focus on the quality of the data that serves as the foundation of their strategic initiatives. As more provider organizations adopt population health-based tools and methodologies, a comprehensive, integrated, and validated data asset is critical to driving effective population-based care.

Health IT maturity can be defined as four distinct steps:

  1. EHR implementation
  2. Achievement of high data quality
  3. Reporting on population health
  4. Transformation into a highly functioning PCMH or ACO

High-quality data is a key foundational piece required to manage a population and drive quality. When the quality of the data matches the quality of care physicians are providing, one can leverage that data as an asset across the organization. Quality data can provide detailed insight that allows pinpointing opportunities for intervention — whether around provider workflow, data extraction, or patient follow-up and chart review. Understanding the origins of compromised data quality helps organizations recognize how to boost measure performance, maximize reimbursements, and lay the foundation for effective population health reporting.

It goes without saying that reporting health data across an entire organization is not an easy task. However, there are steps that organizations must take to ensure they are extracting sound data from their EHR systems.

Outlined below are the key issues that contribute to poor data quality impacting population health programs, how they are typically resolved, and more optimal ways organizations can resolve them.

 

Variability across disparate EHRs and other data sources

EHRs are inconsistent. Data feeds are inconsistent. Despite good intentions, standardized message types such as HL7 and CCDs still show a great deal of variability among sources. Even when they meet the letter of national standards, they rarely meet the true spirit of those standards when you try to use them.

Take diagnoses, for example. Patient diagnoses can often be recorded in three different locations: on the problem list, as an assessment, and in medical history. Problem lists and assessments are both structured data, but generally only diagnoses recorded on the problem list are transported to the reports via the CCD. This translates to underreporting on critical measures that require records of DM, CAD, HTN, or IVD diagnoses. Accounting for this variability is critical when mapping data to a single source of truth.

Standard approach: Most organizations try to use consistent mapping and normalization logic across all data sources. Validation is conducted by doing sanity checks, comparing new reports to old.

Best practice approach: To overcome the limitations of standard EHR feeds like the CCD, reports need to pull from all structured data fields in order to achieve performance rates that reflect the care physicians are rendering. Either workflow needs to be standardized across providers, or reporting tools need to be comprehensive and flexible in the data fields they pull from.

The optimal way to resolve this issue is to tap into the back end of the EHR. This allows you to see what data is structured vs. unstructured. Once you have an understanding of the back-end schema, data interfaces and extraction tools can be customized to pull data where it is actually captured, as well as where it should be captured. In addition, validation of individual data elements needs to happen in collaboration with providers, to ensure completeness and accuracy of data.
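As a minimal sketch of what “pulling data where it is actually captured” can mean in practice, assume a simplified relational back end with the three diagnosis locations described above. The table and column names are invented; real EHR schemas are far more complex and vendor-specific.

    import sqlite3

    # Hypothetical, simplified tables; every EHR vendor's schema differs.
    QUERY = """
    SELECT patient_id, icd_code, 'problem_list'    AS source FROM problem_list
    UNION
    SELECT patient_id, icd_code, 'assessment'      AS source FROM assessments
    UNION
    SELECT patient_id, icd_code, 'medical_history' AS source FROM medical_history
    """

    def all_diagnoses(db_path):
        # Pull diagnoses from every structured location, not just the
        # problem list that a standard CCD export would carry.
        with sqlite3.connect(db_path) as conn:
            return conn.execute(QUERY).fetchall()

The point of the union is that a diabetic patient documented only in medical history still counts toward the DM measure, instead of silently dropping out of the denominator.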

 

Variability in provider workflows

EHRs are not perfect and providers often have their own ways of doing things. What may be optimal for the EHR may not work for the providers or vice versa. Within reason, it is critical to accommodate provider workflows rather than forcing them into more unnatural change and further sacrificing efficiency.

Standard approach: Most organizations ignore this and go to one extreme or the other: (1) use consistent mapping and normalization logic across all data sources and user workflows, assuming that all providers use the EHR consistently, or (2) let workflows dictate everything and fight the losing battle of making the data integration infinitely adaptable. Again, validation is conducted using sanity checks, comparing new reports to old.

Best practice approach: Understand how each provider uses the system and identify where the provider is capturing each data element. Building in a core set of workflows and standards dictated by an on-the-ground clinical advisory committee, with flexibility for effective variations, is critical. With a standard core, data quality can be enhanced by tapping into the back end of the EHR to fully understand how data is captured and by spending time with care teams to observe their variable workflows. To avoid disrupting provider workflows, interfaces and extraction tools can be configured to map data correctly, regardless of how and where it is captured. Robust validation of individual data elements needs to happen in collaboration with providers to ensure that the completeness and accuracy of the data matches the quality of care being delivered.

 

Build provider buy-in/trust in system and data through ownership

If providers do not trust the data, they will not use population health tools. Without these tools, providers will struggle to effectively drive proactive, population-based care or quality improvement initiatives. Based on challenges with EHR implementation and adoption over the last decade, providers are often already skeptical of new technology, so getting this right is critical.

Standard approach: Many organizations simply conduct the data validation process by doing a sanity test comparing old reports to new. Reactive fixes are made to correct errors in data mapping, but often too late, after provider trust in the system has been lost.

Best practice approach: Yet again, it is important to build out a collaborative process to ensure every single data element is mapped correctly. First meetings to review data quality usually begin with a statement akin to “your system must be wrong — there’s no way I am missing that many patients.” This is OK. Working side by side with providers so they understand where data is coming from and how to modify both workflow and calculations ensures that they are confident that reports accurately reflect the quality of care they are rendering. This confidence is a critical success factor for the eventual adoption of these population health tools in a practice.

 

Missed incentive payments under value-based reimbursement models

An integrated data asset that combines data from many sources should always add value and give meaningful insight into the patient population. A poorly mapped and validated data asset can actually compromise performance, lower incentive reimbursements, and ultimately result in a negative ROI.

Standard approach: A lackluster data validation process can result in lost revenue opportunities, as data will not accurately reflect the quality of care delivered or the risk of the patient population.

Best practice approach: Using the previously described approach when extracting, mapping, and validating data is critical for organizations that want to see a positive ROI in their population health analytics investments. Ensuring data is accurate and complete will ensure tools represent the quality of care delivered and patient population risk, maximizing reimbursement under value-based payments.

 

We have worked with a sample ACO physician group of over 50 physicians to assess the quality of data being fed from multiple EHRs within their system into an existing analytics platform via CCDs and pre-built feeds. Based on an assessment of 15 clinically sensitive ACO measures, it was discovered that the client’s reports were under-reporting on 12 of the 15 measures, based only on data quality. Amounts were under-reported by an average of 28 percentage points, with the maximum measure being under-reported by 100 percentage points.

Reports erroneously showed that only six of the 15 measures met 2013 targets, while a manual chart audit revealed that 13 of the 15 did, indicating that data was not being captured, transported, and reported accurately. By simply addressing these data quality issues, the organization could see additional financial returns through quality incentive reimbursements as well as a reduced need for labor-intensive chart audits.

As the industry continues to shift toward value-based payment models, the need for an enterprise data asset that accurately reflects the health and quality of care delivered to a patient population is increasingly crucial for financial success. Providers have suffered enough with drops in efficiency since going live on EHRs. Asking them to make additional significant changes in their daily workflows to make another analytics tool work is not often realistic.

Analytics vendors need to meet providers where they are to add real value to their organizations. Working with providers and care teams not only to validate the integrity of the data, but also to instill trust and give them the confidence they need to adopt these analytics tools into their everyday workflows, is extremely valuable and often overlooked. These critical steps allow providers to begin driving population-based care and quality improvement in their practices, positioning them for success in the new era of healthcare.

Greg Chittim is senior director of Arcadia Healthcare Solutions of Burlington, MA.

CMIO Rant with … Dr. Andy

April 9, 2014 Readers Write 5 Comments

CMIO Rant with … gives CMIOs a place to air their thoughts or gripes. Yours are welcome.

The Great Prescription Pad Race
By Andy Spooner, MD


Which is more usable: a prescription pad or a computer?

That’s a no-brainer. For writing a prescription, the pad wins, hands down. Consider its features:

  • Instant-on. No booting up. Just reach in your pocket and you are ready to go.
  • Compact, lightweight. Did I mention your pocket?
  • Self-documenting. No need to print a summary describing the prescription.
  • No irritating pop-ups with irrelevant alerts.
  • Patient-centered. The pharmacist can fill in missing information (liquid or tablet or capsule? brand or generic?) based on patient preferences.
  • Flexible. Can be taken to any pharmacy. No need to route it to a specific place, or even to ask the patient about a preferred pharmacy.
  • Streamlined. No need to worry about pharmacy benefit management rules. The pharmacist can sort all that stuff out.
  • Information-persistent. If the family has a question about an apparently erroneous prescription, they can read the details right off the prescription when talking to the after-hours nurse.
  • No record-keeping clutter. Patients can just tell us about their prescriptions next time we see them. They could just bring in the bottle or something.

With all of these advantages, surely only the geekiest of pencil-necked CMIOs would advocate an electronic method of prescribing, right?

Of course not.

The prescription pad is easier only if we define the work as the minimum possible activity that a doctor can do to get a prescription into a patient’s hands. The truth is, we are not done with the task of prescribing when we hand the slip of paper to the patient. If we think we are, then the pad seems far easier to use—more usable—than any electronic health record or e-prescribing system.

The above competition is absurd, of course, in an era when, according to the CDC’s National Ambulatory Medical Care Survey, over 80 percent of office-based physicians used electronic prescribing in 2013, up from less than 60 percent three years earlier. E-prescribing is here to stay.

But we still hear about how unusable electronic medical record systems are. In The Atlantic this month, we read that a doctor who sees 14 patients a day spends “1-3 hours” each day entering orders. Assuming that each patient needs some orders for health maintenance (screening lab work), prescription renewals, and maybe a few diagnostic tests and referrals, it’s hard to take that statistic seriously. It’s clear that the writer is irritated at his EMR, and there may be some legitimate design or implementation issues with it. But 1-3 hours of ordering per day? C’mon.

Somewhere between the slapdash paper prescription and the three hours of daily ordering is the truth. Managing clinical information takes some amount of time, and some of it should be done directly by physicians. Some of this activity serves a “compliance” goal that you may not like, but all of it is a part of building a system of healthcare that serves a worthy goal.

If we insist that all clicks are wasted time, then we can’t have a conversation about usability, because under the prescription pad scenario, the only usable computer is one you don’t have to use at all.

On the other hand, if we insist that our current systems are bad because of hyperbolic, data-free assertions about how the EMR is making our lives miserable, we are similarly blocked from making productive plans to improve usability because, well, it’s just too darn much fun to complain.

My thesis, then, is that EMR usability is not as much about design as about expectations. When different perspectives disagree about what those expectations ought to be, the result is unproductive conversations (or no conversations at all) about what it means to have an EMR that’s easy to use.

All I know for sure as a CMIO is that physicians want all of this stuff to be easier to use. We also want these systems to read our minds, but that’s at least a couple of versions away, if I am understanding the vendor presentations at HIMSS correctly.


Andy Spooner, MD, MS, FAAP is CMIO at Cincinnati Children’s Hospital Medical Center. A general pediatrician, he practices hospital medicine when he’s not enjoying the work involved in keeping the integrated electronic health record system useful for the pediatric specialists, primary care providers, and other child health professionals in Cincy.

Readers Write: Advanced Interoperability: Leveraging Technology to Improve Patient Lives and Provider Workflows

April 2, 2014 Readers Write 1 Comment

Advanced Interoperability: Leveraging Technology to Improve Patient Lives and Provider Workflows
By Justin Dearborn


There’s an increasing need for all of healthcare to be integrated in its approach to accessing, sharing, and storing information. It’s not just patients who could stand to benefit from more advanced interoperability. It’s also healthcare providers who want to meet legislative requirements such as Meaningful Use Stage 2 and Stage 3, as well as reduce costs and improve care quality.

Consider what typically happens in today’s medical imaging environment—often partway between a traditional manual environment and a fully interoperable one—when a patient presents to his primary care physician (PCP) complaining of shoulder pain, for example:

After receiving a comprehensive clinical exam, a patient named Dave heads home with a hand-scribbled order for a shoulder MRI. Before the exam can take place, however, the imaging center must get the order pre-certified by Dave’s health insurer. After receiving the insurer’s faxed approval days later, the imaging center schedules the patient for his exam. Days after that, the radiologist faxes his report to the PCP, who then calls Dave to set another appointment to discuss his torn rotator cuff. Once the decision to seek surgical treatment is made, Dave is asked to bring a CD of his radiology images to the orthopedic specialist.

If this process sounds cumbersome, time consuming, and inefficient, that’s because it is. It’s also the rule with respect to today’s medical imaging processes.

While it’s true that anywhere from 10 to 20 percent of imaging orders issued today are processed electronically, the vast majority are still processed manually via paper and/or fax. According to the Centers for Disease Control and Prevention (CDC), approximately 12 percent of all PCP visits result in a referral for diagnostic imaging—some 44 million imaging exams each year—which equates to a lot of wasted time and paper, not to mention money.

The payer-approval process only adds to that burden. Roughly 122 million imaging exams are processed manually by radiology benefits management companies each year, at a cost of about $75 per exam. That adds up to roughly $9 billion of waste a year.

So the question is this: What would happen in an environment of advanced interoperability, where existing electronic health records (EHR) and other technologies are fully leveraged? Take Dave’s scenario again:

After receiving a comprehensive clinical exam, Dave’s PCP electronically orders a shoulder MRI and schedules an imaging appointment for later in the day. Before the exam takes place, the imaging center receives electronic pre-certification. Once the MRI is complete, the PCP automatically is alerted that an image-enabled report is available. Before he leaves his office for the evening, the PCP calls Dave to discuss his torn rotator cuff and to electronically refer him to an orthopedic specialist who already has secure automated access to the image-enabled radiology report.

As this simple scenario illustrates, the entire patient-imaging process can be streamlined by enabling five key services: 1) electronic referrals and ordering; 2) automated pre-certification and approval using clinical decision support; 3) electronic patient navigation and scheduling; 4) image-enabled reporting; and 5) analytics.
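As a rough sketch of the first two services (the endpoint, payload, and response shape below are hypothetical, invented for illustration), an electronic order with automated pre-certification can reduce, in the simplest case, to a structured order posted to an intermediary that returns a decision synchronously:

    import json
    import urllib.request

    ORDERS_API = "https://rbm.example.net/orders"  # hypothetical intermediary

    order = {
        "patient_id": "12345",
        "procedure": "MRI shoulder, left",
        "indication": "suspected rotator cuff tear",
        "ordering_npi": "9999999999",
    }

    req = urllib.request.Request(
        ORDERS_API, data=json.dumps(order).encode(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        # e.g., {"precert": "approved", "auth_id": "A-001"} in this sketch;
        # clinical decision support rules would run on the intermediary.
        decision = json.load(resp)

Contrast that round trip, measured in seconds, with the days of faxing it replaces in Dave’s first scenario.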

Such advanced interoperability provides Dave, his PCP, and his orthopedic specialist with near-instantaneous exam ordering, approval, and scheduling. Ease of access to reports, results, and images is dramatically increased.

By creatively leveraging EHRs and other technologies, healthcare organizations can maximize their interoperability with internal and external providers. All these services, moreover, can be provided without costly point-to-point HL7 interfaces.

With payment reform, it is clear that the days of disjointed, manual image processing are numbered. Indeed, advanced interoperability like that described here not only addresses the challenges that impact physicians, but also pays handsome dividends for patient care.

Justin Dearborn is CEO of Merge Healthcare of Chicago, IL.
