
Readers Write: From Rice Fields to Big Data

July 30, 2014 Readers Write 1 Comment

From Rice Fields to Big Data
By Ping Zhang


My journey into technology was a long road. The first 15 years of my life were spent in Hunan province in rural southern China. My family had no running water, and more often than not, we went to bed hungry.

At five, I started working with my father in the rice paddies. I planted rice seeds while my father manually built rice rows and dug irrigation canals. Everything was done by hand. It wasn’t until I was 11 that I saw my first technological advancement — a tractor — on my way to school.

At age 15, I rode on a train for the first time on my way to college. It was only then that I realized the promise of technology and how it could save my father’s back and hands from the brutal years of manual labor.

My passion for mathematics helped me earn my bachelor’s degree at 19 in China. After the 1989 events in Tiananmen Square, I decided to try to immigrate to the United States. In 1990, I landed in Fayetteville, Arkansas with only my bags and a hundred American dollars to pursue my PhD at the University of Arkansas at Fayetteville. My wife followed soon after.

Eighteen months after moving to the US, I had my first experience with the American healthcare system. Early one Friday evening in 1992, my wife suddenly felt a sharp pain in her stomach. We rushed to the emergency room. We waited and waited – and waited some more. Three hours later, she was finally seen by an OB/GYN doctor.

It turned out that she had an ectopic pregnancy. She had been pregnant, but the fertilized egg had become lodged in one of her Fallopian tubes. Two liters of blood had accumulated as she waited for treatment. She had come close to losing her life.

The next day, the doctor cleared her for discharge with a clean bill of health, leaving us with a bill of a few thousand dollars. One of her Fallopian tubes had been torn open and the other had become so clogged with lost blood that it would likely permanently block any egg. We were told the chances of her ever having a child were slim.

The experience was shocking, scary, and life altering. Thankfully, after years of infertility treatments, she was able to give birth to two beautiful boys.

That horrible experience was over 20 years ago, but I still remember it like it was yesterday. Part of the reason is that I have spent many of those past 20 years working within the healthcare system to change it myself. I want to share three key lessons I have learned over this long journey.

The quality of healthcare is too low

The state of service in American healthcare is far below where it needs to be and where it could be, especially given its skyrocketing costs. But what if our healthcare system operated under a free competition model, much like the retail industry? No department store would ever make its customers regularly wait in line for hours to buy its products, because no one would go to that store anymore (they would book it in the other direction with haste).

Under a similar system for healthcare, providers would have to work much harder and more effectively to attract and adequately serve consumers. Open competition would lead to greater efficiency, lower cost, better quality of service, and more choices for consumers.

More innovation and disruption

Innovation and disruption must be encouraged. Over the past two decades, I learned about the underlying principles of world-class innovation from Silicon Valley. I had mentors who constantly encouraged me to break out of the box, experiment, and try something new and different. Healthcare is clearly not where it should be. We must find a better solution for something so vital to societal and individual wellbeing. Healthcare still needs a Steve Jobs and Apple-like innovation revolution to make it more clinically effective for the consumer and cost effective for all.

For example, what would healthcare look like if we could receive updates and monitor our health through a Fitbit device or health app the same way we receive ESPN notifications on an iPhone today? What if technology motivated us to pay the same amount of attention to our health as we do to our social media networks, and with the same ease? Simple concepts like these could help us all live longer and save lives.

Top-down is not enough; consumers need to become more invested

Consumers themselves must do more to control their own outcomes. As an immigrant to the US, I knew that I had to work much harder than my peers to succeed. Consumers today should adopt a similar drive and approach to their health. Rather than waiting for doctors to treat and prescribe “fixer” medications, we need to work more diligently to lead healthier, more proactive lifestyles – and we have the information and technology to do so at our fingertips.

What was once monopolized by professionals with years of training (and came at extreme cost) is now available at the nearest Best Buy or app store for just a couple of hundred dollars, or nothing at all. Look to wearable biometric devices as an example: those gadgets can monitor an individual’s health, flag risky behaviors based on behavioral research drawn from big data, and provide information on how to live a healthier, lower-risk life.

I am thrilled that my sons get to reap the benefits of a wonderfully innovative country that is slowly, but surely, transforming its healthcare system for the better. Sooner rather than later, as my boys grow into husbands and fathers, we will move past the times when emergency care is almost as painful as the medical ailment that necessitates the visit. And if we don’t, we’re doing something very wrong.

Ping Zhang, PhD is SVP of product innovation and chief technology officer of MedeAnalytics of Emeryville, CA.

Readers Write: 20th Century Man

July 30, 2014 Readers Write 4 Comments

20th Century Man
By Barry Wightman


"I work in healthcare during the day, then I go home to the 21st century."

Dave Levin, MD, founder/CEO of Tres Rios Group (@DaveLevinMD), uttered these stinging words at last week’s OnBase + Epic User Forum in Cleveland. Thanks to our friends at Nordic Consulting (@Nordicwi), who tweeted this in real time on the jungle network, raising, I’m sure, a knowing smile on many a grey-haired regular Joe technology type such as myself in health IT-land. Now, I’m not here to comment on Dr. Levin’s presentation, but with that tasty, snarky comment, he hit on something I’ve been wanting to get off my chest for a while now.

It’s all déjà vu all over again.

And it starts with an old Kinks song.

No, really.


Good old Ray Davies, on the fine Muswell Hillbillies elpee of 1971, sang, “I’m a 20th century man but I don’t want to be here….” (cue chiming guitars and thumping drums.)

And here we are in healthcare’s 2014 and those words still ring true. And we’re faced with vast data centers that don’t interoperate well, processes and workflows that haven’t changed much since, well, let’s say the ‘90s.

Some folks complain about EHRs, saying they’ll never work – which is like attacking the telephone as a useless gadget in 1910. Yes, healthcare is conservative, slow to change, and it is a vast, terribly complex industry. But still.


See, I come out of the high-end computing and communications world of the last 40 years and we all know about the technology revolutions that came out of that time.

And it all had to do with the decentralization of power.

Power to the people.

Here’s what happened:

  • Big iron, mainframe computing was overthrown or, at least, changed forever by personal computing. The data center data processing IT gatekeepers of the ‘60s/’70s, the men in the white coats (not docs), lost their complete control. Big mainframe data centers were then called “glass houses” – complex and unknowable, sealed safely away from the actual user rabble. Then user peasants with pitchforks began throwing rocks.
  • Distributed processing was invented in the ‘70s/‘80s – in which mini computers and personal computers were bought by company departments, who, fed up by the long delays and general stick-in-the-mudness of corporate IT, took matters into their own hands. Lotus 1-2-3! Excel! WordPerfect! Soon, IT was surrounded. Some bit of chaos ensued. Luddites were outraged.
  • Interoperability became an issue. Can I get my 1980s IBM 3081 mainframe or Cray supercomputer to talk with all these blazingly fast minicomputers and those crazy PCs and their newfangled servers? Can I get this application to communicate with that application on a batch or maybe even real-time basis? Will this data jibe with that data? Oh man. There was even a huge Silicon Valley trade show called Interop. Big business. You can look it up.

And, over time, it all began to work. Standards emerged: TCP/IP, the Open Systems Interconnection (OSI) Model, designed to facilitate application-to-application communications. Forums and user groups were founded. Requests for comments solicited. Hardware and software vendors cooperated. Open systems mostly won. Proprietary systems mostly lost. Not bad.

And new business empires were built. Microsoft, Cisco, Apple, and an endless army of startups who followed in their wake disrupted markets as they went, changing the world.

Fast forward to now. Healthcare. We’ve been there before.

Thing is, we’re still there. Maybe about 1990. EMR/EHR monsters have emerged: Epic, athena, Cerner, et al. The big payors – you know the list. Crazy startups in new markets are bubbling under – population health, mhealth. And there are user revolutionaries out there – visionary clinics and system departments who are moving ahead on their own, confounding CIOs and IT (with whom I have much sympathy – which reminds me of another old tune).

And just as in 1990, with the World Wide Web just around the corner, still gestating in the gov, mil, and edu domains, everything is about to change.

And we won’t have to live in the 20th century with the Kinks. (Not that that’s a bad thing.)

And Dr. Dave Levin will be happy.

And so will we.

It’s gonna happen.

Barry Wightman is VP of marketing of Forward Health Group of Madison, WI and the author of Pepperland.

Readers Write: Bench, Bonus and Bondage: The Sorry Side of IT Consulting

July 23, 2014 Readers Write 3 Comments

Bench, Bonus and Bondage: The Sorry Side of IT Consulting
By Mike Lucey


If I could lose 20 pounds, I would be ready to model swimwear. That’s a nasty image for those who know me, but if I were serious, hiring a personal trainer would make sense. Or better yet, a personal exerciser!

Why not both? One person to tell me what to do and another to go and do it. I might not get the results I want, but with much less effort. Think of what I would save in sneakers and tee shirts!

This wacky logic seems to be in play in our industry when it comes to hiring consultants. When I moved into consulting, it was because I figured I had some unique smarts and skills that a hospital would need. Once my smarts became their smarts or my skills were no longer needed, off I would go to the next guy. For this I would get a nice rate and the fun of doing new projects.

But what I am finding is hospitals have some consultants who offer guidance, and then other “consultants” who do the work, work that hospitals really need to be doing themselves. Part of why this happens can be found in the way consulting companies market their services.

Bench: To start a consulting company, scrape up a pile of resumes, format them nicely, and throw them at every hospital problem you hear about until some of them stick. Now you have consultants working. As these consultants roll off projects, they go to the Bench. Yikes! Good news: you now have consultants ready for the next project. Bad news: every hour they sit on the bench they cost money (until you pull the bench out from under them). A way companies can lighten the bench is to give bonuses to the consultants who are still working to find work for the benchwarmers.

Bonus: Let your working consultants know that they will get a bonus for every benchwarmer they place. This is where the worm turns. Now those consultants you hired to solve a problem are to some degree degraded or distracted by the incentive to be a sales guy. The inclination to teach a hospital employee how to solve the next problem conflicts with an inclination to pull in a colleague from the company. Good for these companies, maybe not so good for the hospital.

Bondage: With each additional placement, each incremental bump in the billable hours (and bump in that bonus income), the idea of ending the engagement becomes uglier and the motivation to extend it more attractive. It is stressful to see a project end and face the uncertainty of the next job, stress that is magnified with the addition of each colleague and the bonus income they represent. Suddenly, maintaining my value as a smart guy may depend on maintaining a certain amount of client ignorance, and so client dependence: knowledge bondage.

This is how you end up with a consultant who is not just the captain of your hospital softball team, but the batting champ three years running.

We consultants have a great part to play as our industry continues to change. We bring real value helping hospitals make decisions, helping them act on those decisions, and providing resources when big projects need extra hands. That value is based on smarts, skills, and experience that hospitals don’t yet have, but can gain with our input. 

When that value wanes, not to worry — I’m off to the next project. Or I always have the modeling gig to fall back on. (note to self: find my Ab-Master.)

Mike Lucey is president of Community Hospital Advisors of Reading, MA.

Readers Write: Is DIY Network Security a Good Idea?

July 23, 2014 Readers Write No Comments

Is DIY Network Security a Good Idea?
By Jason Riddle


Patients and clients count on healthcare providers, payers, and business associates to protect their electronic health records. For optimal care, patients need to feel comfortable divulging personal information that could cause them injury—financially, emotionally, and/or physically—should it be illegally accessed or corrupted by hackers or malware.

Additionally, covered entities are required by HIPAA/HITECH laws to maintain a certain level of network security. Violation of these regulations could result in stiff fines, a disruption in operations, and a general loss of goodwill among the people who do business with them.

Many small to medium-sized organizations are managing some if not all of their network security on their own. Here is one question they often ask:

Do we have enough protection for our patients’ data, or do we need to hire outside professionals to do the job for us?

While there is no right or wrong answer to this, there are a few factors that need to be considered.

HIPAA/HITECH was designed with built-in flexibility so that organizations could make their own decisions about their level of investment in network security. For example, a large organization may choose to hire an outside cyber security firm to monitor their networks around the clock, but a three-person doctor’s office might be hard pressed to put such an aggressive solution in place. Office for Civil Rights (OCR) auditors who are responsible for monitoring HIPAA compliance recognize that organizations of various sizes make decisions based on practical constraints.

As covered entities make decisions for (or against) increasing security, the reasoning and conclusions should systematically be written down. OCR auditors generally take into consideration all well-documented justification.

One way to think about whether or not to hire an outside vendor to assist with network security is to recognize that a solution doesn’t have to be all or nothing. For example, some companies will hire an independent third party to conduct an initial security risk analysis. This gives them the objectivity where it counts—identifying vulnerable areas and obtaining guidance on how to address them.

Once the fix-it plan is set, the internal IT team can assume responsibility for maintaining the network’s security from there on out. This hybrid solution can often save money. Cyber security professionals will likely identify problems faster and can point to tools that are free or low cost.

If an organization is committed to a DIY network security solution — whether starting out with the help of professionals or taking it all on independently — it takes more than someone who is just an IT whiz to manage a network security program. There are six main areas that a security officer must be well versed in to carry out the required responsibility:

  1. Understanding HIPAA compliance. A security officer must understand the HIPAA/HITECH regulations and what compliance really means. This includes (but is not limited to) regular security risk analyses, documenting all security measures, and reporting any breaches that may have occurred.
  2. Securing the data. Firewalls and antivirus software are a must, but that’s just the minimum. Some of the other areas to be addressed are data encryption, regularly scheduled reviews of all logs (on the firewall and the server), restricted access, and regular data backups.
  3. Securing the facility and equipment. Physical access to computer equipment must be controlled at all times. Doors to the server room should be locked. When appropriate, screens should be protected from nosy passers-by. The security officer should have an eye for the logistics of the facility and areas that might pose a risk to keeping patient data secure.
  4. Monitoring mobile access. Decisions need to be made about how employees are able to access data from mobile devices. Types of data that can be obtained wirelessly might need to be limited, and employees will need to be aware of the whereabouts of their mobile devices at all times.
  5. Training the staff. A lot of security breaches are the result of human error. Everyone in the organization needs regular reminders that they are handling sensitive data and to be aware of actions that might jeopardize it.
  6. Understanding relationships with business associates. Responsibility for protecting client and patient data extends to everyone that has access to it. If a third party does the billing, for example, it’s critical that they are compliant as well.

A DIY network security solution for healthcare organizations is not necessarily a bad idea. But it does need to be well thought out. Patients and clients are counting on it.

Jason Riddle is practice leader with LBMC Managed Security Services of Nashville, TN.

Readers Write: EMR vs. EHR

July 23, 2014 Readers Write 2 Comments

By Steve Blumenthal, JD


HIStalk has asked me to explain the difference between an EMR (electronic medical record) and an EHR (electronic health record). Clearly, HIStalk needs to get out more. But I’m a nerdy lawyer and analyzing defined terms ranks up there with reading blogs about who’s being cast in the “Star Wars” reboot.

Let’s start with the source of most healthcare IT terminology, the feds—specifically, ONC. ONC’s website (healthIT.gov) says that an EMR is “a digital version of a paper chart that contains all of a patient’s medical history from one practice.” On the other hand, an EHR is “a digital version of a patient’s paper chart.” So, clearly an EMR and EHR are differ…. Wait a sec. Is it me, or do those definitions look remarkably similar?

I think I’ve figured it out. An EMR and EHR are both a digital version of a patient’s paper chart, but an EMR only has one practice’s patient chart. So, if I never see a physician other than my internist at Vanderbilt, Vandy’s electronic record system is an EMR with respect to me. However, my daughter has seen two doctors in different practices within Vandy’s health system, so Vandy’s electronic record system would be an EHR (not an EMR) with respect to her. No, that can’t be right.

Wait, ONC has more to say. An EHR is “more than just a computerized version of a paper chart in a provider’s office.” Whew, that clears up everything. An EHR is more than an EMR. Now I can go home and finally hang the curtains in the guest bedroom.

On second thought, that didn’t clear up anything. The curtains will just have to wait another year.

ONC continues: “EHR systems are built to share information with other health care providers and organizations—such as laboratories, specialists, medical imaging facilities, pharmacies, emergency facilities, and school and workplace clinics—so they contain information from all clinicians involved in a patient’s care.” I think we’ve found something. “Built to share information” is the key. I feel an analogy coming on.

An EMR is an earthworm, a useful creature that burrows into the earth, carrying organic material down into lower levels, breaking down dead plant material, and aerating the soil. But an earthworm is not transformative. Its life is spent toiling in the soil as an earthworm (and usually ending underneath a person’s shoe or in a bird’s gullet). On the other hand, an EHR is a caterpillar, a worm-like larva that will eventually transform into a beautiful butterfly (or somewhat less attractive moth or fruit fly). An EHR is designed for great things—collecting and distributing data from EMRs and other sources like butterflies cross-pollinating fields of flowers.

The difference between an EMR and an EHR isn’t what they are today. Let’s face it, given the interoperability issues with most EHRs today, they’re pretty much toiling in the same soil as EMRs. The difference lies in what an EHR is designed to become. That’s why the “Base EHR” definition in ONC’s EHR certification regulations says that an EHR must, in addition to including patient health information, have the capacity to do more—to provide clinical decision support, support physician order entry, capture and query information relevant to health care quality, and exchange electronic health information with other sources.

It’s actually kind of inspirational when you think about it. If you’ve got a kid, you’ve read “The Very Hungry Caterpillar.” Sure, the caterpillar eats a couple tons of food that could otherwise have been used to feed impoverished children, but then he spins a cocoon and, a short time later, becomes a beautiful butterfly. So maybe we’re spending a lot of resources on EHRs right now, but the payoff will be amazing in the end.

Unfortunately, the process of changing from a caterpillar into a butterfly is, well, disgusting. As “Scientific American” puts it, “First, the caterpillar digests itself, releasing enzymes to dissolve all of its tissues. If you were to cut open a cocoon or chrysalis at just the right time, caterpillar soup would ooze out.”


(For those of you wanting to double-check me on my quotes from ONC’s website, see here and here.)

Steven E. Blumenthal is an attorney with Bone McAllester Norton PLLC of Nashville, TN.

Readers Write: Medication Electronic Prior Authorization, the Next Big Thing for EHRs

July 16, 2014 Readers Write 3 Comments

Medication Electronic Prior Authorization, the Next Big Thing for EHRs
By Tony Schueth


Electronic prescribing (ePrescribing) has surpassed the tipping point, where more prescriptions are being written electronically than on paper. Now the industry must start thinking about the next big thing that will take ePrescribing to the next level and address one of healthcare’s most inefficient processes: prior authorization (PA) of prescriptions.

With ePrescribing considered table stakes in an electronic health record (EHR), software developers should be thinking about innovations that will take ePrescribing from a humdrum utility to a must-have. Electronic prior authorization (ePA) for the pharmacy benefit offers that innovation opportunity.

EPA is the #1 ePrescribing capability desired by physicians, according to market research conducted by NCPDP’s ePA Task Group. In order to foster a standardized approach to satisfy this demand, NCPDP approved an electronic data interchange (EDI) standard for ePA last year.

By design, the ePA transaction can be integrated with the EHR ePrescribing workflow, enabling prescribers to complete the prior authorization process within two minutes as compared with the manual process, which involves many phone calls and faxes that can take days to weeks to complete (15 days, on average). Considering that specialty medications dominate the drug pipeline and require prior authorization up to 95 percent of the time, the need for ePA is urgent.

Seven states have mandated the use of ePA beginning in late 2014 and eight others are engaged in ePA regulatory activity. In May, the National Committee on Vital Health Statistics (NCVHS) recommended that the Department of Health and Human Services adopt the NCPDP transaction as the standard for medication PAs. NCVHS recommendations regarding ePrescribing and related transactions often become requirements for payer participation in Medicare Part D.

The coming regulatory mandates afford EHR vendors the opportunity to be ahead of the curve. Rather than scrambling to meet multiple state regulatory deadlines at the last minute, vendors can take advantage of the interval between Meaningful Use (MU) Stages 2 and 3 to begin development of ePA functionality while there is still breathing room to concentrate on workflow enhancements.

The availability of ePA may sway some physicians in their EHR choice. Recently, Surescripts found that 28 percent of physicians surveyed would switch their EHR for one that supports ePA. While this percentage may be exaggerated based upon a single feature, there is no question that a robust replacement market for EHRs exists. Many physicians are looking to transition from early purchases of basic EHRs to more sophisticated solutions.

EDI networks such as Surescripts have begun offering ePA connectivity, while such established ePA services vendors as CoverMyMeds have introduced APIs to ease EHR integration. Some service providers offer connectivity for all ePAs – even if a pharmacy benefit manager or other payer isn’t electronically enabled, electronically initiated ePAs are delivered via fax.

The time is right. EPA is a logical and useful enhancement that physicians desire. A transaction standard that ensures compatibility is in place. Regulators are beginning to mandate its use. The number of PAs is growing. EDI networks and service vendors are eager to ease integration.

With the rare opportunity posed by the MU Stage 2 delay, vendors can roll out a new feature that is a “win-win-win-win” benefit for physicians, patients, payers, and EHR vendors.

Tony Schueth is founder, CEO and managing partner at Point-of-Care Partners of Coral Springs, FL.

Readers Write: Data Exchange with C-CDA: Are We There Yet?

July 16, 2014 Readers Write 8 Comments

Data Exchange with C-CDA: Are We There Yet?
By Brian Ahier


Do you think you meet all the interoperability criteria for current and future stages of the EHR Incentive Programs? A new study published in JAMIA found that most providers don’t.

The study concluded that providers likely are lacking critical capabilities. It found that some EHR systems still don’t exchange data correctly using Consolidated Clinical Document Architecture (C-CDA), which may prevent providers from receiving future Meaningful Use (MU) incentives.

After sampling several platforms used to produce Consolidated Clinical Document Architecture (C-CDA) files, the research team from the Substitutable Medical Applications and Reusable Technology (SMART) C-CDA Collaborative — funded by the ONC as part of the SHARP research program — found a number of technical problems and obstacles which prevented accurate data exchange between different EHR systems.

There is already wide-scale production and exchange of C-CDA documents among healthcare providers this year, driven by the EHR incentive program and its Meaningful Use requirements. Indeed, live production of C-CDAs is already underway for anyone using 2014 Certified EHR Technology (CEHRT). C-CDA documents enable several aspects of Meaningful Use, including transitions of care and patient-facing download and transmission.

Stage 2 Meaningful Use requires that providers be capable of producing C-CDA files, which contain both machine-readable and human-readable templates used to exchange patient data between EHRs during transitions of care. While all 2014 CEHRT must have the ability to create these files, some vendors are unfortunately not using the basic XML and HL7 technology correctly.
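To make that dual machine-readable/human-readable structure concrete, here is a stripped-down, illustrative sketch in Python of the kind of paired content a C-CDA section carries: a narrative block a clinician can read and a coded entry a receiving EHR can parse. The section and observation codes shown are examples for illustration, not a complete or certified C-CDA template.

```python
import xml.etree.ElementTree as ET

# A minimal, C-CDA-style allergies section: <text> holds the human-readable
# narrative, <entry> holds the machine-readable coded observation.
SECTION = """
<section xmlns="urn:hl7-org:v3">
  <code code="48765-2" codeSystem="2.16.840.1.113883.6.1"
        displayName="Allergies"/>
  <text>
    <list><item>Penicillin: hives</item></list>
  </text>
  <entry>
    <observation classCode="OBS" moodCode="EVN">
      <code code="419199007" codeSystem="2.16.840.1.113883.6.96"
            displayName="Allergy to substance"/>
    </observation>
  </entry>
</section>
"""

ns = {"hl7": "urn:hl7-org:v3"}
root = ET.fromstring(SECTION)

# What a human reviewer sees versus what a receiving system parses.
narrative = root.find("hl7:text/hl7:list/hl7:item", ns).text
coded = root.find(".//hl7:observation/hl7:code", ns).get("code")
print(narrative)  # the human-readable rendering
print(coded)      # the coded value a receiving EHR would process
```

The interoperability failures the study describes arise when these two halves disagree, or when the coded half is malformed or missing, leaving only the narrative for manual review.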

To find out how these variations affect providers and their participation in Stage 2, the researchers sampled 107 healthcare organizations using 21 EHR systems. They examined seven important elements of the documents: demographics, problems, allergies, medications, results, vital signs, and smoking status, all of which are required to be included in the C-CDA for Stage 2. They found errors in the XML that conflicted with HL7 standard usages, rendering the document ineligible to meet the Stage 2 rules for interoperability.

One key takeaway from this research is that live exchange of C-CDA documents is likely to omit relevant clinical information and increase the burden of manual review for provider organizations receiving the documents. Common challenges included omission or misuse of allergic reactions, omission of dose frequency, and omission of results interpretation. Unfortunately, only some of these errors can be detected automatically.


The team found 615 errors and data expression variation across 11 key areas. The errors included “incorrect data within XML elements, terminology misuse or omission, inappropriate or variable XML organization or identifiers, inclusion versus omission of optional elements, problematic reference to narrative text from structured body, and inconsistent data representation.”

"Although progress has been made since Stage 1 of MU, any expectation that C-CDA documents could provide complete and consistently structured patient data is premature," the researchers warned. The authors also note that more robust CEHRT testing and certification standards could prevent many of these troubling errors and variations in the technology and that the industry may also benefit from the implementation of data quality metrics in the real-world environment.

The researchers recommended several steps to improve interoperability: providing richer, more standardized samples in an online format; requiring EHR certification testing to include validation of codes and vocabulary; reducing the number of data elements that are optional; and improving monitoring to track real-world document quality.

The researchers make the case for using a lightweight, automated reporting mechanism to assess the aggregate quality of clinical documents in real-world use. They recommend starting with an existing assessment tool such as Model-Driven Health Tools or the SMART C-CDA Scorecard. This tool would form the basis of an open-source data quality service that would:

  • Run within a provider firewall or at a trusted cloud provider
  • Automatically process documents posted by an EHR
  • Assess each document to identify errors and yield a summary score
  • Generate interval reports to summarize bulk data coverage and quality
  • Expose reports through an information dashboard
  • Facilitate MU attestation
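As a rough illustration of what such a lightweight, automated check might look like (this is a sketch, not the SMART C-CDA Scorecard or Model-Driven Health Tools), the fragment below scans a C-CDA document for the structured sections Stage 2 expects, identified by LOINC section codes, and reports a simple coverage score. The code list and scoring are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}

# LOINC codes commonly used to identify C-CDA sections; illustrative only.
REQUIRED_SECTIONS = {
    "11450-4": "problems",
    "48765-2": "allergies",
    "10160-0": "medications",
    "30954-2": "results",
    "8716-3":  "vital signs",
    "29762-2": "social history (smoking status)",
}

def score_ccda(xml_text):
    """Return (found, missing, score) for the required sections above."""
    root = ET.fromstring(xml_text)
    codes = {c.get("code") for c in root.findall(".//hl7:section/hl7:code", NS)}
    found = sorted(n for c, n in REQUIRED_SECTIONS.items() if c in codes)
    missing = sorted(n for c, n in REQUIRED_SECTIONS.items() if c not in codes)
    return found, missing, len(found) / len(REQUIRED_SECTIONS)

# A toy document containing only two of the six required sections.
sample = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <component><structuredBody>
    <component><section>
      <code code="11450-4" codeSystem="2.16.840.1.113883.6.1"/>
      <title>Problems</title>
    </section></component>
    <component><section>
      <code code="48765-2" codeSystem="2.16.840.1.113883.6.1"/>
      <title>Allergies</title>
    </section></component>
  </structuredBody></component>
</ClinicalDocument>"""

found, missing, score = score_ccda(sample)
print("found:", found)      # problems and allergies are present
print("missing:", missing)  # the other four required sections
print("coverage: %.0f%%" % (score * 100))
```

A real service would go further, validating vocabulary bindings and template structure rather than mere section presence, but even a coverage report like this could feed the dashboard and interval reports the authors envision.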

"However, without timely policy to move these elements forward, semantically robust document exchange will not happen anytime soon," the authors stated. “Future policy, market adoption and availability of widespread terminology validation will determine if C-CDA documents can mature into efficient workhorses of interoperability,” the report concludes. It would seem that if policy changes are not put in place, there could be risk in the Meaningful Use program not actually being all that meaningful.

This month CMS released the proposed 2015 Physician Fee Schedule. Among other things, it proposes to revise the physician supervision requirements for Chronic Care Management (CCM) services and to require CCM practitioners to use EHRs certified to meet at least the 2014 Edition Meaningful Use criteria, which require the ability "to capture data and ultimately produce summary records according to the HL7 Consolidated Clinical Document Architecture standard."

Since this new proposed rule includes expanding the use of the certification program beyond Meaningful Use and specifically mentions the C-CDA standard, I thought I would ask Joshua Mandel, one of the authors of the study, for his thoughts.

"It’s not too surprising that CMS’s efforts to improve chronic care management would build on Meaningful Use requirements," he said. "In the section you’ve quoted, CMS is simply saying that Eligible Providers would need to use MU-certified systems (just as they must use MU-certified systems to attest for MU incentive payments). And so C-CDA capabilities come along for the ride in that decision. I can certainly say C-CDA is better than nothing; and C-CDA 1.1 is a specification that exists and has been implemented today, so it’s a natural choice here."

While there are challenges in implementing and making good use of C-CDA documents, there is little doubt that HHS is continuing to drive the use of these standards forward through various policy levers. The ability to exchange relevant clinical information for transitions of care is a key enabler in transforming our healthcare system to paying for quality instead of quantity.

Despite these challenges, we are beginning to see success in the marketplace. Building on this success and continuing to improve content standards is critical if true interoperability is to become a reality.

Brian Ahier is director of standards and government affairs for Medicity, A Healthagen Business of Salt Lake City, UT.

Readers Write: Replace Your RFP with an RFS

June 20, 2014 Readers Write 2 Comments

Replace Your RFP with an RFS
By Patty Miller


I have been on both sides of an RFP, as a member of the organization issuing the RFP and the organization receiving the RFP. In one case, when on the issuing side, we did not select a vendor, as no traditional vendor solved our problem. We went back to the drawing board, so to speak.

From the receiving side, I have seen organizations spend hundreds of thousands to millions of dollars on software or services, only to realize little or none of their anticipated ROI because they implemented only part of the services or software or never completed the implementation.

What happened? Everything was spelled out in the RFP. All the vendors followed the steps and they had their evaluation process down.

The RFP is a wonderful tool for reducing price, for squeezing savings out of an existing vendor, or for cases where the solution is already known. But what happens when there is a real problem to solve or when venturing into new territory?

That is where the RFS, or Request for Solution, comes into play. Entrepreneurs, innovators, and vendors are some of the most effective problem solvers we have in society today. How do we harness them? Bring these solution architects into our fold and share the insights we have into the problem we are trying to solve or the direction we would like to take. Then, we can create a solution that is used, solves challenges, and realizes the desired ROI.

Let’s use an example scenario. Our organization is looking to purchase analytics software. What is the desired outcome? Is this in line with our strategic goals? For this example, we would like to better understand how we are paid and ultimately be paid 10 percent faster.

Problem Definition Phase

In this phase, the solution architects are engaged as a group or one-on-one to hear what the problem and current state are. An NDA may be a good idea during the problem definition phase. To convey the problem and current state, a presentation, observation, or demonstration can be used. That’s it. Share the problem, nothing else.

In our case of looking to purchase analytics software, we find our DSO (days sales outstanding) to be 89 days on average. Delays in cash collection inhibit our ability to reinvest in the company, thus delaying sales.
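For reference, DSO is conventionally computed as accounts receivable divided by credit sales, scaled by the number of days in the period. A quick sketch, using made-up figures chosen to match the 89-day scenario:

```python
def days_sales_outstanding(receivables: float, credit_sales: float, days: int = 365) -> float:
    """DSO = (accounts receivable / total credit sales) * days in period."""
    return receivables / credit_sales * days

# Illustrative figures only: $2.44M outstanding against $10M in annual credit sales.
dso = days_sales_outstanding(2_440_000, 10_000_000)
print(round(dso))  # 89
```

The nine-day reduction targeted later in this scenario corresponds to collecting roughly a quarter million dollars of those receivables faster at this sales volume.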

Ideal Future State Definition Phase

Describe and share with the solution architects what the desired outcome looks like when the problem is solved, such as: we will reduce defects or have zero defects; we can deliver 10 percent more of our product to the customer for the same price; we have reduced overtime by 10 percent.

Make these outcomes, along with any additional outcomes the solution architects can deliver, part of the selection criteria, and convey this to them so that they will provide all the value they can. The ideal future state should align with organizational strategic goals as well.

In our scenario of looking for analytics software, we would like to bring DSO down from 89 days to 80 days and increase sales by 5 percent. Due to the high importance of cash in running a business, it is in our best interest to collect outstanding receivables as quickly as possible. By quickly turning sales into cash, we can put the cash to use again — ideally, to reinvest and make more sales.

Solution Definition Phase

During this phase, work collaboratively with each solution architect to refine the initial solution they provide. If this is a large project, this phase can take months. The payoff is implementing a solution or service that is used, solves challenges, and realizes the desired ROI.

In this phase, the solution architects are designing and proving, through POCs, demonstrations, references, or site visits, how their analytics software will help us reach our desired outcomes, along with other benefits they can provide. We decided that the goals of this implementation would be to reduce DSO and to forecast sales more accurately, because the solution architects demonstrated that their solution would improve sales forecasting and showed how that would speed the conversion of sales into cash.

Contracting Phase

Contracting should include deliverables and a timetable based on the collaboratively designed solution.

For our analytics software scenario, in contracting we ask for a guaranteed outcome, perhaps not as aggressive as our original goals. In some instances, it might be more aggressive, depending on what the solution architects have demonstrated they can deliver and whether our goals have changed in any way. Contracting can be difficult because everyone is trying to minimize risk, but ultimately everyone should have a little skin in the game, both solution architect and buyer. Include a change policy in the contract.

In our scenario of analytics software, we agreed that DSO would decrease by nine days in one year.

Implementation Phase

During this phase, the solution should be referenced frequently. If there are changes, they should be documented per the change policy agreed to in the contracting phase, especially if the anticipated outcomes change. Needs change during an implementation and, if additional value can be achieved, it may make sense to proceed in a slightly altered direction.

In our analytics software scenario, during implementation, some lean processes were put in place and DSO decreased by two days. We are now looking for the software to give us another eight days instead of the initial nine, although we think the benefits will be greater.

Solution Monitoring Phase

For a period of 3-12 months or more after the solution has been implemented, there should be frequent monitoring to ensure that the solution or service is utilized, has solved the challenge, and has realized the desired ROI. If not, the solution architects or the issuing organization should be held accountable per the contract.


If someone is truly interested in solving a problem, that person will have skin in the game. Issuing or receiving an RFS can be a scary process for an organization accustomed to the traditional RFP. The organizations interested in change will embrace the RFS; others may be resistant.

No organization can afford to spend its resources, time, people, and capital on a project that does not produce the desired outcome. I expect many vendors, entrepreneurs, and purchasers will welcome this collaborative solution design, but some will be resistant due to the insecurities associated with change.

Harnessing the power of the Request for Solution allows for bringing solution architects into your fold and implementing a solution that is used, solves challenges, and realizes a desired ROI.

Patty Miller is sales manager, health and sciences division, for TechDemocracy of Edison, NJ.

Readers Write: AMA Adopts New Recommendations on Telemedicine, Signaling Further Comfort with Telehealth

June 20, 2014 Readers Write No Comments

AMA Adopts New Recommendations on Telemedicine, Signaling Further Comfort with Telehealth
By Alexis Gilroy, JD


Earlier this month, the American Medical Association (“AMA”) approved recommendations regarding the provision of medical services using telecommunications technologies (commonly known as “telemedicine”). AMA’s report, on the heels of a policy adopted in April by the Federation of State Medical Boards, indicates the growth of telemedicine, an increased comfort-level with telemedicine, and a desire to align legal and regulatory frameworks between medical services provided “in-person” and those provided using telemedicine.

In particular, AMA’s report provides an overview of key topics specific to telemedicine, including reimbursement, known practice guidelines, and telemedicine use cases, and it establishes a number of new AMA policies and recommendations regarding telemedicine services. The report is a significant departure from some of the AMA’s previous policies regarding the use of telemedicine, including a 1994 AMA opinion prohibiting physicians from providing clinical services via telecommunications (which the report notes "may no longer be consistent with the best ethical analysis").

Most notably, perhaps, the AMA advocates equating the standard of care for services provided via telemedicine with the standard of care for in-person services. While this may seem like legal jargon to some, it has real potential to positively impact the digital health industry. This move signals an acknowledgement of telemedicine as an accepted delivery model akin to "in-person" delivery models. After all, we are talking about medical services in either context; the difference is merely the venue of delivery.

Unfortunately, to date, regulators have not always viewed "in-person" and telemedicine services similarly: adopted regulations in many states show strong deference to traditional "in-person" services and, in some cases, flatly prohibit services provided through telemedicine. For example, Texas and Alabama currently require an in-person exam prior to any services provided via telemedicine in a patient’s home, regardless of the patient’s illness or situation, creating significant roadblocks for telehealth providers in those states. This is especially frustrating for home-based patients who could significantly benefit from engagement with a primary care or specialty physician via telemedicine.

With the AMA’s support through the new policy, similar to the Federation’s telemedicine policy, we may see state medical boards and other regulators rethink existing and proposed telehealth regulations that imposed across-the-board barriers on the delivery of some medical services merely because the provider chose to use telecommunications, rather than considering whether telemedicine could be used safely, and perhaps more effectively, for some patients and illnesses.

The AMA’s adopted recommendations about the delivery of health care services via telemedicine include the following items:

  • Establishing a valid patient-physician relationship. Telemedicine services should be based on a valid patient-physician relationship established prior to providing telemedicine services, which can be established through: (a) a face-to-face examination, where a face-to-face encounter would otherwise be required for providing the same service in-person; or (b) a consultation with another physician who has an ongoing patient-physician relationship with the patient and agrees to supervise the patient’s care; or (c) meeting standards of establishing a patient-physician relationship included as part of evidence-based clinical practice guidelines on telemedicine developed by major medical specialty societies, such as those of radiology and pathology. Although this recommendation does not explicitly describe what constitutes a “face-to-face examination,” the full report provides that “[t]he face-to-face encounter could occur in person or virtually through real-time audio and video technology.”
  • State licensure. Physicians and other health practitioners delivering telemedicine services must abide by state licensure and medical practice laws and requirements in the state where the patient receives services.
  • Choice of provider and information on provider credentials. Patients seeking care delivered via telemedicine must be offered a choice of provider. Further, patients receiving telemedicine services must have access to the licensure and board-certification qualifications of the health care practitioners providing care in advance of their visit.
  • Practice guidelines. The delivery of telemedicine services must follow evidence-based practice guidelines, to the degree they are available, to ensure patient safety, quality of care, and positive health outcomes.
  • Patient history and documentation. The patient’s medical history must be collected as part of the provision of any telemedicine service, the services provided must be properly documented, and a visit summary must be provided to the patient.
  • Care coordination. The provision of telemedicine services must include care coordination with the patient’s medical home and/or existing treating physicians, which includes, at minimum, identifying the patient’s existing medical home and treating physician(s) and providing such physician(s) with a copy of the medical record.
  • Emergency referral protocols. Providers must establish protocols for emergency referrals.
  • Privacy and security. Delivery of telemedicine services must abide by laws addressing the privacy and security of patients’ medical information.

Alexis Gilroy, JD is a partner with the Jones Day law firm of Washington, DC.

Readers Write: AHIP Institute 2014 Recap: My Impressions of the Show

June 18, 2014 Readers Write 1 Comment

AHIP Institute 2014 Recap: My Impressions of the Show
By Kasey Fahey


After attending the America’s Health Insurance Plans (AHIP) Institute 2014 in Seattle, I walked away with a greater knowledge of the current state of the healthcare industry as a whole. AHIP is the national trade association representing the health insurance industry. It is one of the most effective trade associations in Washington, DC.

The choice of Seattle as the hosting city was excellent. It is a top technology hub in the country and I partially credit the incredible coffee for all the innovative thinking.

  • The payer industry is poised for huge growth, as are many other sectors of the healthcare IT world. As we shift from a fee-for-service industry to pay-for-performance, the collaboration between payers and providers will continue to evolve and blossom in a shared-risk model. Health plans are now just as concerned with outcomes, readmission rates, and population health as the health systems are. Many providers are trying to break into the payer space and compete with companies that have already established themselves. There is room for potential collaboration between the two.
  • At the show, there were companies of all sizes, from startups in growth mode to large, established players. Many companies focused on claims processing and core administrative platforms, as well as on big data and analytics. Vendors catering to both payers and providers seem heavily focused on big data and analytics.
  • With the population health initiative, I see a lot of potential partnership opportunities for payers and mobile companies to monitor patients at home to reduce readmissions, improve the quality of care, and lower costs simultaneously (the holy trinity of healthcare IT). My colleague Norm Volsky recently attended the ATA conference in Baltimore and wrote a recap. He mentioned that many of the telehealth / telemedicine vendors are in startup mode with innovative technology but lack funding or a concrete revenue plan. Many vendors at AHIP mentioned they’d like to create mobile / remote solutions within the upcoming year. With this in mind, a collaboration between the two spaces would allow health plans to use remote patient monitoring without having to develop proprietary solutions, and would allow the mobile companies to find the funding they need while sharing their technology with patients.
  • This show was significantly smaller than the annual HIMSS conference, where vendor booths are as big as city blocks and there are miles upon miles of fancy displays and creative advertising ploys. Institute was more low key, which was refreshing. There were a little under 200 vendors, and they focused on the meaningful benefits of their solutions without wildly expensive spinning signs hanging from the ceiling. That said, it was not as visually appealing, nor was it as easy to tell the huge players in the industry from the rookies based on booth size. I did, however, enjoy Practice Fusion’s creative "Jeopardy" game. The takeaway is that payer vendors seem to focus more on hard ROI than on marketing. Their messages and analyses are tight, and they seem to have more of a B2B sale than vendors in the provider space.
  • I was surprised at the abundance of C-level executives at the show and the lack of actual health plans. Attendees were primarily vendors, who mentioned that they had expected to meet with health plans for business opportunities. In the provider space, health system and hospital C-level executives attend shows like HIMSS in high numbers, so perhaps there is a lack of participation by health plans at this event.

Overall, the show was nicely done with many speakers and sessions on a variety of topics. As far as the future, the healthcare industry as a whole relies heavily on public policy. However, there is huge growth and partnership potential across the industry and everyone has the same goals in mind: lowering costs, improving care, and reducing readmission rates.

Kasey Fahey is director of payer practice for healthcare IT at Direct Recruiters, Inc. of Cleveland, OH.

Readers Write: Six Ways to Capitalize on the ICD-10 Delay

June 9, 2014 Readers Write No Comments

Six Ways to Capitalize on the ICD-10 Delay
By Dan Stewart


Most of the healthcare industry was taken by surprise when President Obama signed legislation that delayed the deadline to implement ICD-10 by at least a year. Now that there has been time to digest the new compliance date of October 1, 2015, healthcare providers may benefit by considering a more strategic approach for their transition to ICD-10.

Prior to the extension, many healthcare providers put in patches to meet the previous and quickly approaching October 1, 2014 compliance date. Process improvements and documentation training were put into high gear to meet the deadline, and in many cases, lacked strategic planning. With the additional time, providers can revisit their approach to implementation and potentially take advantage of other initiatives that directly impact the way their organization is evolving.

Here are six strategies to take advantage of the delay to be better positioned for post-transition success.

1. Increase clinical documentation and education

Providers now have an additional year to train their workforce. Nurses, physicians, coders, and even members of the C-suite need to understand the benefits for greater specificity in clinical documentation and how it applies to their role. Customized simulation training that addresses the specific educational needs of clinician groups can simplify the learning process and speed adoption. For example, customized simulation training can allow caregivers to practice documenting care in ICD-10 through their actual EHR application, which is critically important for learning workflow and gaining new knowledge about the system.

Any time and money invested in efforts like simulation training will be financially beneficial in ICD-9 and will also provide a smoother transition to ICD-10 with reduced risk of reimbursement issues. In addition, by continuing to engage staff with training, organizations can avoid losing the focus and interest that was created by the momentum leading up to the previous deadline.

2. Evaluate and improve the revenue cycle

Providers now have time to improve charge capture and billing and claims processing. Doing so will help to identify potential lost revenue and charge issues before claims are submitted and will improve compliance in anticipation of new denials and other post-transition challenges. Improved charge capture will also create a safety net to assist in identifying any potential ICD-10 process issues.

3. Implement computer-assisted coding (CAC) systems

Many hospitals have invested in CAC systems to aid coders in digesting physician documentation and determining which of the staggering 141,000 possible codes under ICD-10 is appropriate for each diagnosis and procedure. Now is the time to support the implementation of CAC and focus on coder workflow to optimize the benefits. Remote coding programs should also be evaluated. Incorporating tools like these not only reduces post-transition risk but also assists in the recruitment and retention of coders, who are in increasingly high demand.

4. Begin dual coding

It is a reality that hospitals will need additional coders during the transition from ICD-9 to ICD-10. The extra time resulting from the delay creates an opportunity to begin dual coding sooner, providing physicians and coders additional practice before the compliance date. Prior to the transition, CAC systems can assist in the dual coding process by providing an automated crosswalk back to ICD-9 codes for submissions to payers, clearinghouses, and other third parties. The increased accuracy and efficiency of documentation and coding optimizes the post-transition period, mitigating the risk of compliance and reimbursement issues.
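Conceptually, such a crosswalk is a lookup from each ICD-10 code to its candidate ICD-9 equivalents, in the spirit of CMS's General Equivalence Mappings (GEMs). The tiny table below is an illustrative sketch, not real GEM data; production systems load the full CMS GEM files, and many ICD-10 codes map to multiple (or no) ICD-9 equivalents.

```python
# Sketch of an ICD-10 -> ICD-9 crosswalk lookup for dual-coding submissions.
# The table is a toy example; real systems load CMS GEM files.
ICD10_TO_ICD9 = {
    "E11.9": ["250.00"],  # type 2 diabetes without complications
    "I10":   ["401.9"],   # essential (primary) hypertension
}

def crosswalk(icd10_code: str) -> list[str]:
    """Return candidate ICD-9 codes, or an empty list if unmapped."""
    return ICD10_TO_ICD9.get(icd10_code, [])

print(crosswalk("I10"))     # ['401.9']
print(crosswalk("Z99.99"))  # [] -- unmapped codes get flagged for coder review
```

In a dual-coding workflow, an empty result is the interesting case: it flags documentation that a human coder must resolve before the claim goes out, which is exactly the practice the extra year makes room for.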

5. Analyze the financial impact

Hospitals should take the time to perform an in-depth financial impact analysis to determine the highest-impact codes on reimbursement to provide focus on operational remediation and training. Such analysis will additionally assist in identifying the reserves that will potentially be needed to get through post-compliance stabilization.

6. Expand the implementation plan

The ICD-10 extension presents an opportunity to strategically link its transition with other initiatives like Meaningful Use, Patient-Centered Medical Home (PCMH), and Accountable Care. Combining plans to adopt all of these programs can help ensure they each work together as efficiently as possible.

Miami Children’s Hospital, for example, is working to deploy a revenue cycle management system in addition to working toward ICD-10 compliance. Now that there is less immediate pressure to have physicians trained as soon as possible on ICD-10, their training can occur after the new system modules are implemented to better reflect the healthcare provider’s specific system and workflow. Implementing both of these programs in tandem saves time and money and strengthens the success of each.


While it would be easy for healthcare providers to decide to pause their efforts to become ICD-10 compliant as a result of the recent delay, it would benefit them much more to view the extra time as an opportunity to take a more strategic approach. Continuing the process will position the provider for a more successful, efficient transition to ICD-10. 

Dan Stewart is vice president and partner of strategic consulting and advisory services with Xerox.

Readers Write: Al’s Story

May 27, 2014 Readers Write 3 Comments

Happy Memorial Day. Today’s article is dedicated with a special, heartfelt thank you to all of our veterans serving our country abroad and to those here at home. Many thanks to all of the family members of the soldiers currently serving in harm’s way and to those who have lost loved ones. You all truly demonstrate great courage on a daily basis.

Mr. HIStalk, thank you for being so supportive of the troops. I’ve been present at many events across the country where you have personally recognized and paid tribute to anyone who has served in the military.

I recently sat down with Captain Donna Rowe who shared the story of her husband, Colonel Al Rowe.

Al’s Story
By Lisa Reichard, RN, BSN


Colonel Alvin G. “Al” Rowe

Al Rowe was born in Dubuque, IA in 1933. He became an Eagle Scout by the age of 12. He was a proud Iowa Hawkeye and graduated from the University of Iowa in 1956 with a bachelor’s degree in civil engineering. It was then that he entered the US Army as a Second Lieutenant through the university’s ROTC program. Al also received his master of science degree from Iowa State University. Like many soldiers, Al could have made six figures working as a civil engineer in the private sector, but instead he chose to serve his country and did so faithfully for 30 years.

In 1965, he was sent with the 82nd Airborne to quell a communist uprising in the Dominican Republic. While riding in his Jeep with his battalion, he was struck in the head by sniper fire from the rooftops. His comrades saved his life. There would be no one left behind.


“Al [shown third from the right] loved his comrades and put them first. He was a soldier’s soldier who cared about his men,” said Donna.

My Sweetheart

According to Donna, “Al was treated for his injury at Fort Bragg, NC. This is how I came to meet him at Womack Army Hospital. He was my patient. I was a nurse supervisor at the time and we met briefly while he was recovering from surgery. Our first encounter was when I had to ask Al to quiet down. He was singing too loudly in the ward. Four days later when he was off duty, he asked to see me and if he could take me to dinner and I said OK. Although Al asked me for my number, I got busy and I walked off without giving it to him.”

“He called for three weeks to get my number, but since army policy is to never give out phone numbers, the ward would not release it. Finally, he called one of my friends who got my permission to give Al my phone number. We finally had our dinner date and when Al came to get me, my Louisiana-native roommate at the time, Carol Burnett, said with a very southern accent when Al picked me up in a white T-Bird convertible, ‘Donna, he has come to pick you up in a white stallion and carry you away.’ We were married 18 months later in 1967.”

Newlyweds Sent to War

Al and Donna were sent to Vietnam during the peak of the war in 1968 and 1969. Donna served as a head nurse of the Third Field Hospital in Saigon, one of the largest shock-trauma-triage emergency rooms in Vietnam. Al served as an adviser and equipment supplier to soldiers in the field during combat.


“Al and I were married 47 years and 10 months. He was my best friend,” said Rowe.


Donna and Al in Vietnam, Christmas 1968: “We sent this photo home to our families.”

Remembering an American Soldier and War Hero

Donna explained Al was shot down five times in Vietnam, but survived. “The communities where Al served loved and respected him a great deal both here and abroad. The South Vietnamese awarded him the Vietnam Cross of Gallantry.”

Col. Rowe received other military medals and decorations, including the Legion of Merit, the Bronze Star, Meritorious Service Medal, Joint Service Medal, Army Commendation Medal, Purple Heart, and the National Defense Service Medal, and many more. He was also a Master Parachutist.

After Vietnam, he went on to serve in the Pentagon, followed by the Army War College in Pennsylvania, before setting up forces command at Fort McPherson.


“Al [2nd from left] loved his comrades and put them first. He was a soldier’s soldier who cared about his men.”


Al’s promotion to colonel at Fort McPherson in Atlanta in 1974 with Donna and son Richard at far left

“Al was a wonderful family man, and he was very active in the community,” said Donna. “We have two wonderful sons. He was a father figure to many.” She continued, “The military life can be very tough on families. They make lots of sacrifices.”

Upon his retirement from the Army, Al moved to Marietta, GA where he worked for Lockheed as a research engineer. Col. Rowe retired from the Army in 1983 as a colonel and was president of the Georgia Vietnam Veterans Alliance for four terms.

Another Battle

Col. Rowe contracted Lou Gehrig’s disease, a neurodegenerative condition that affects nerve cells in the brain and the spinal cord, and struggled with the debilitating disease for three to five years. Donna believes it was service-connected (US Department of Veterans Affairs – Agent Orange). “The journey with Lou Gehrig’s was difficult. It was another war that Al and I fought together.” She added, “The Department of Veterans Affairs in DC was wonderful during the illness. I really can’t say enough about how well we were treated.”


“Al served his country for 30 years, 10 months, and 22 days before he passed away on January 21, 2014. I miss him dearly. He was loved by many more friends and comrades-in-arms, and he will be dearly missed by everyone who knew him.”

Col Rowe’s legacy lives on through many programs, including the Society of American Military Engineers (SAME), which provides scholarships.

Fast Forward to Telemedicine Possibilities

With the recent resignation of Robert Petzel, undersecretary for health for US Veterans Affairs, there is a lot of discussion around improving timely access to care. General Eric Shinseki, US Secretary of Veterans Affairs, recently said most veterans are satisfied with the quality of care they get, but more must be done to "improve timely access to that care." Telemedicine could help improve compliance and provide specialized care for veterans while decreasing long appointment waits, both in the field and at home.

Donna was willing to share her thoughts on telemedicine. “I really think it would be great to have telemedicine for diabetes patient maintenance and for treatment of Post-Traumatic Stress Syndrome (PTSS). It would cut down on a lot of hassle around travel time, parking, and other logistics and could help to increase compliance with maintenance programs,” she emphasized. Donna said that telemedicine will be great for soldiers in the field and that email centers exist for communication.

Final Thoughts — Help a Veteran


Hire Heroes USA provides career placement assistance to all of our returning service men and women. Here are some vet-friendly employers, including several healthcare companies.

Thank a Veteran


Donna sharing stories with me from her personal memoirs.

Donna was candid and generous to share her photos for this article. This interview was a good reminder for me that, like Donna and Al, every soldier has their own unique story just waiting to be told. If you get a chance this Memorial Day or any day, talk to a veteran and thank them for their service to our country.

When I started the interview with Donna Rowe about her husband Al, I thought it would make her day. Instead, I left the interview knowing that she had made mine.


Lisa Reichard, RN, BSN is director of community relations at Billian’s HealthDATA. HIStalk also featured an interview with Donna Rowe on The Kathleen Story for Nurses Week in May 2012.

Readers Write: Narrow Networks: Blessing, Curse, Should You Care?

May 23, 2014 Readers Write 1 Comment

Narrow Networks: Blessing, Curse, Should You Care?
By Shawn Wagoner


Narrow networks = blessing. In its recommendations to improve the government’s ACO programs, the American Hospital Association is urging CMS to “create some financial incentive on the part of the beneficiary to choose to stay ‘in network’ so that their care can be coordinated.”

Narrow networks = curse. In Seattle and New Hampshire, healthcare organizations are taking legal action to prevent health plans from developing narrow networks.

Narrow networks = real. Regardless of where an organization falls on the blessing vs. curse spectrum, narrow networks are back and gaining momentum. McKinsey research finds that 70 percent of the plans sold on the individual exchanges created as part of the ACA are what they categorize as narrow and ultra-narrow hospital networks. There is also serious traction among the private sector companies that help finance health insurance for their employees. As evidence, a commercial health plan in Minneapolis now has roughly 30,000 members enrolled in private exchanges and over half of those enrolled have chosen a narrow network benefit product constructed around one of four available ACOs.

Former ONC Chief Dr. David Blumenthal recently wrote about narrow networks, suggesting that “by guaranteeing their chosen caregivers a certain volume of business, health plans acquire the leverage to negotiate better prices in future contracts.” The private exchange example from Minneapolis suggests that providers also agree to higher quality and patient experience standards in addition to the price concessions. In theory, these narrow networks have the potential to benefit all stakeholders:

  • Health plans pay lower prices to providers and can package those lower prices into lower cost and higher quality benefit products to attract consumers and members.
  • Consumers pay lower premiums to the health plans for higher-quality care.
  • Providers are assured that the members will use their services when the need arises. Additionally, more people than before will use their services because the lower-priced narrow network benefit products attract new patients.

Chances are that most organizations have a strategic plan that includes some form of a narrow network, whether a clinically integrated network, an ACO, or in many cases, both. Given their strategic importance and operational complexities, now is the time to start thinking about how to operate a narrow network effectively.

Recall the advent of high-deductible health plans a decade ago: patient responsibility quickly grew as a percentage of revenue, and a great deal of process and technological change was required in response. Likewise, narrow networks bring new yet similar challenges that will require significant process change and technological advancement. Here are some thoughts to help assess the readiness of an organization:

Challenge #1: Patient transitions require improved coordination to track patient status, both to deliver on the higher quality standards and to realize the financial benefit of ensuring patients are transitioned to in-network providers.

Operational considerations:

  1. Can pertinent portions of chart notes be shared among all in-network providers?
  2. Does an automated workflow exist to book follow-on appointments with in-network providers, both employed and affiliated?

Challenge #2: Narrow networks typically incent patients to stay in network for care by making it more expensive for them to receive treatment from an out-of-network provider.

Operational considerations:

  1. Is a system in place to respond to patient inquiries about whether a given provider or facility is in their network?
  2. Can providers easily determine who is in and out of network when recommending follow-on care?
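At its core, the in-network check behind both of these considerations is a lookup from a member's plan to the set of participating providers. Here is a minimal, purely illustrative sketch; the plan and provider IDs are invented, and a production system would draw this index from payer-supplied network files rather than a hard-coded table:

```python
# Hypothetical in-network eligibility check: an index from plan ID to the
# set of participating provider IDs, with a helper that a scheduler or a
# referring physician's workflow could query.

NETWORKS = {
    "acme-narrow-2014": {"prov-101", "prov-102", "prov-205"},
    "acme-broad-2014": {"prov-101", "prov-102", "prov-205", "prov-310"},
}

def is_in_network(plan_id, provider_id):
    """Return True if the provider participates in the member's plan."""
    return provider_id in NETWORKS.get(plan_id, set())

print(is_in_network("acme-narrow-2014", "prov-310"))  # False: excluded from the narrow plan
print(is_in_network("acme-broad-2014", "prov-310"))   # True
```

The hard part operationally is not the lookup itself but keeping the index current as network files change and surfacing it at the moment of referral or scheduling.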

Challenge #3: Patients who choose narrow network products are cost conscious and expect their clinicians to be as well.

Operational considerations:

  1. Are clinical protocols broadly adopted that address the appropriateness of care so that patients are not faced with medical bills for unnecessary care?
  2. Are workup requirements established so that patients do not arrive at an appointment only to find that key steps were not completed and additional appointments are necessary?

Challenge #4: Patients have traded broad access via a wide-open network of every provider and facility for a limited access option. However, limited access refers only to the number of physicians and facilities, not to the ability to be seen in a timely manner.

Operational considerations:

  1. Are the individuals who handle inbound requests able to quickly view availability for all services within the narrow network to ensure the patient can get a timely appointment?
  2. Is this the time to start allowing patients themselves to book their own appointment online?

By no means is this an exhaustive list, but it should help quickly determine how prepared an organization is to support a narrow network strategy.

Shawn Wagoner is president of Proximare Health of Savannah, GA.

Readers Write: ATA Conference Recap: My Impressions of the Show

May 23, 2014 Readers Write 2 Comments

ATA Conference Recap: My Impressions of the Show
By Norman Volsky

After attending and walking the exhibit hall of the 19th Annual American Telemedicine Conference in Baltimore Monday and Tuesday, I walked away with several conclusions (besides Baltimore having the world’s most delicious crab cakes).

  • Telemedicine is a very exciting space. This market has the potential to help hospitals, patients, employers, and health plans reduce cost. There are also solutions out there which simultaneously improve quality and outcomes. This is a market that is poised for some tremendous growth.
  • The telehealth / telemedicine / telepresence space (these terms all have different definitions) could become commoditized very soon if it hasn’t already. A ton of companies sold mobile carts, each with their own differentiators. Some focused on providing their services at the lowest cost, while others focused on quality and value. Either way, this market seems to be heading where HIE and, more recently, EMR have gone over the past couple of years: toward consolidation and commoditization.


  • Telemedicine is geared towards multiple customers. Companies like Healthspot and American Well showed off kiosks or pods designed for hospitals as well as the retail sector, including pharmacies, supermarkets, and large corporate headquarters. American Well’s tablet and smartphone solutions were impressive. This is a market that could see significant growth.
  • Remote patient monitoring software companies are poised for growth. Some focus on home health, while others focus on post-acute and more broadly, the entire continuum of care. The companies that collect data from wearable devices are particularly cool. Many of these companies have patient engagement capabilities, secure texting, and outbound or proactive phone calls to patients to make sure they are following their care plans. This segment of HIT helps hospitals qualify for Meaningful Use by reducing readmissions. ACOs and health plans are leveraging these types of software systems to reduce cost, risk, and readmissions (the holy HIT trinity). The majority of these companies are focused on high-risk populations which include chronic care patients, the elderly, and patients who have had a recent major operation or episode. Others are focused on wellness for population management. I was particularly impressed with the exhibits of CareVia, AMC Health, Ideal Life, and Tactio Health.
  • Unique software caught my eye. Specific companies that caught my eye had unique offerings such as iMDsoft (clinical information systems software geared towards perioperative and critical care) and MediSprout (a telemedicine platform that runs entirely on tablets and leverages existing HIT apps.)
  • Smaller vendors need additional funding. I asked a lot of companies about their revenue model, and some of them didn’t have great answers. There was also some ambiguity as to who the economic buyer would be (patients, hospitals, payers, etc.). Many companies threw out buzzwords like population health management and care coordination, but it seemed to me that they need to better articulate why these types of solutions matter to providers and health plans. If these companies can show how their solutions connect to the larger healthcare picture, they will have a better chance of obtaining the funding they require.
  • This is a very sheltered segment of the industry. The majority of the booths I visited were unfamiliar with HIStalk, and many of these companies did not have a vast knowledge of the software world. At least half of the exhibiting companies were hardware focused, for example mobile carts with videoconference capabilities customized for healthcare.
  • The telemedicine segment should become more in tune with how their products and solutions fit within the broader healthcare IT market. With the previous conclusions in mind, these companies would be wise to keep abreast of blogs like HIStalk. They need to understand where hospitals are spending their money and what types of products and solutions will get the attention of hospital C-Level executives. With a better understanding of their competition for dollars, they would be more successful in articulating the right message to potential buyers. I also believe that partnering with some pure software companies could give them a more comprehensive and marketable offering to sell.

Overall, telemedicine is an area of healthcare that will have incredible growth over the next several years. There is a lot of competition in the telemedicine and remote patient monitoring segments and there will undoubtedly be some winners and losers. However, once the dust settles and consolidation occurs, the healthcare space will be better off. The ability to have doctor visits remotely and be able to monitor patients while they are at home is powerful. With this technology, hospitals and health plans will be able to reduce cost, risk and readmissions and, most importantly, save lives.

In conclusion, I feel this market is too siloed and needs a better understanding of, and greater exposure to, the rest of the healthcare IT market. My advice for companies in this space would be to attend next year’s HIMSS conference in Chicago. I think doing so would be an eye-opening experience that would be extremely beneficial to this market’s inevitable growth. The better companies in this space understand how they fit into the bigger picture of healthcare, the better chance they will have of making it in both the short and long term.


Norman Volsky is director of mobile healthcare IT practice for Direct Recruiters, Inc. of Solon, OH.

Readers Write: EHR Usability – Whose Job Is It?

May 16, 2014 Readers Write 4 Comments

EHR Usability – Whose Job Is It?
By Michael Burger


Near misses, good catches, or serious reportable events – how many of these could stem from a design flaw in the EHR used? This was an underlying question in a recently published article entitled “Poor Prescription documentation in EHR jeopardizes patient safety at VA hospital.” The article caught my eye because I thought perhaps it would describe a design flaw that might need to be addressed in ePrescribing software.

The article referred to a Department of Veterans Affairs Office of Inspector General report from December that cited a variety of documentation lapses regarding opioid prescriptions at the VA Medical Center in San Francisco. The EHR was a factor in the report primarily because the EHR is the place from which the documentation was missing.

From the headline of this article, the reader assumes that the EHR figures prominently in the patient safety hazard. In all probability, the same lapse in documentation would have occurred in a paper chart environment. The report found that 53 percent of opioid renewals didn’t have documentation of a provider’s assessment. I’d lay a sizable wager that the percentage would be the same or higher were the hospital to be using paper charts versus an EHR.

It seems to be sport these days to throw daggers at (dare I say beleaguered) EHRs and EHR vendors. Studies are published showing the levels of dissatisfaction with EHRs. ONC responds by introducing EHR usability requirements in the Meaningful Use standards and certification criteria. Inevitably, the focus of these activities centers on the notion that vendors purposely build EHRs that aren’t usable, are inept at training, and are uncooperative (or even sinister) about working together.

In reality, vendors are anything but purposefully uncooperative, inept, or builders of unusable products. Logically, how could a vendor stay in business if it were uncooperative, sold things that didn’t work, and failed at teaching people how to use its products? In the world of EHRs, there are forces at play that help explain these perceptions.

EHR vendors, like creators of any other product, build software features based upon demand. The limitations to a development budget are time, scope, and resources. While any feature could be built, priorities must be set as to what to build and in what order, given the limitations.

Meaningful Use has disrupted this prioritization process by inserting requirements that have become high priority because they are necessary to pass the certification test but for which there is little or no customer demand. For example, no EHR user is asking for a way to document patient ethnicity. But there are plenty of requests for workflows that don’t require dozens of clicks. The challenge vendors face is that Meaningful Use requires focus on marginally useful features, such as tracking patient ethnicity, and doesn’t leave bandwidth to eliminate clicks in the workflow.

Ineptitude in training is an interesting claim. One very successful vendor is renowned for their “our way or the highway” mentality when it comes to training. Very effective to be certain, though not a lot of fun for those receiving the training. But this method does set an appropriate expectation that workflow modification is required for successful EHR adoption. Other vendors are renowned for their mostly failed attempts to “make the software accommodate your workflow so you won’t have to change a thing.” The reality is that it’s not possible to insert a computer into a manual process like clinical workflow and expect not to have to change a thing. It’s not that a failing vendor is inept, it’s that expectations aren’t being set correctly.

Meaningful Use has inserted a perverse twist into this already unpleasant reality by forcing vendors to train clients to perform workflows that are out of context with what doctors would typically do, but are now required in order to attest.

The uncooperative accusation is the most laughable of all. Interfaces have been around since before there were EHRs – HL7 was founded in 1987. It’s a question of supply and demand. When customers demand an ability to connect disparate systems, vendors build interfaces. It’s true that vendors have built products using proprietary architectures, because until now no one was asking for common standards. Even today, with the availability and mandated use of common standards, less than 30 percent of doctors regularly access HIE data. There’s not a lot of demand for all of that external data. It’s not that vendors don’t build interfaces because they’re being uncooperative; it’s because providers aren’t asking for them.

The principle of supply and demand is a fundamental market driver. It’s disappointing that Meaningful Use has sidetracked the natural evolution of the market by creating artificial demand for EHR functions that aren’t being asked for by actual consumers. MU has had the unintended consequence of stifling innovation of the functionality being asked for by users, which would have spurred widespread organic adoption. We’ve not (yet) seen the iPod of electronic health records because vendors have been too busy writing code to pass the MU test.

Rather than introducing a voluntary 2015 Edition EHR certification, CMS and ONC should give vendors the year gained by deferring the start of MU Stage 3 to innovate the features customers really want, instead of adding more features and another certification that continue a harsh cycle.

Michael Burger is senior consultant with Point-of-Care Partners of Coral Springs, FL.

Readers Write: Liberating Data with Open API

May 16, 2014 Readers Write 4 Comments

Liberating Data with Open API
By Keith Figlioli


Today, people all over the world use Twitter as a means of everyday communication. But how useful would the application be if you had to contact the company and get a custom code each time you wanted to post a thought? As ludicrous as this seems in the social media space, it’s reality in healthcare information technology.

For all the hype around electronic health records (EHRs), healthcare providers still lack the ability to easily access the data in them. In essence, developers can’t simply build applications that meet a use-case need, because each system is closed behind a proprietary wall that requires custom coding to be unlocked for add-on workflow applications. If a health system wants to marry EHR data with pharmacy data so that doctors can be alerted when a medication hasn’t been refilled, for instance, it must contact its EHR vendor and pay to have that application developed to its specs.
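The refill alert above is exactly the kind of add-on an open API would make easy to build. As a minimal sketch, assuming (hypothetically) that such an API returned medication records as simple JSON-style dicts with name, active, last_fill, and days_supply fields; the field names and logic here are illustrative, not any vendor’s actual API:

```python
from datetime import date, timedelta

def overdue_refills(medications, today, grace_days=7):
    """Flag active prescriptions whose last fill is older than the days
    supplied plus a grace period -- the signal that a refill was missed."""
    alerts = []
    for med in medications:
        due = med["last_fill"] + timedelta(days=med["days_supply"] + grace_days)
        if med["active"] and today > due:
            alerts.append(med["name"])
    return alerts

# Sample records of the shape an open API might return.
meds = [
    {"name": "lisinopril", "active": True,
     "last_fill": date(2014, 3, 1), "days_supply": 30},
    {"name": "metformin", "active": True,
     "last_fill": date(2014, 4, 20), "days_supply": 30},
]
print(overdue_refills(meds, today=date(2014, 5, 1)))  # ['lisinopril']
```

The logic fits in a dozen lines; the barrier is not the application code but getting the EHR and pharmacy data out from behind the proprietary wall in the first place.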

These walls around data have real consequences. Not only are healthcare providers spending millions on one-off applications, but they are missing innovation opportunities by requesting custom builds. In the case of smartphones, both Apple and Google released their application programming interfaces (API) for any developer to leverage, creating thousands of apps, many of which users would not have imagined on their own. In healthcare, these APIs don’t exist, meaning that apps are only developed if they are imagined by either the provider or the vendor, with all potential for crowdsourced innovation completely cut off.

Although it’s hard to put a price tag on missed opportunity, a McKinsey & Company report found that the US loses between $300-$450 billion in annual economic potential because of closed data systems.[1] With more “liquid” data, McKinsey predicts new applications that close information gaps, enable best practice sharing, enhance productivity, support data-driven decision making, pinpoint unnecessary variation, and improve process reliability — all sorely lacking in today’s healthcare environment.

There’s also a price for patients. According to a recent Accenture poll, 69 percent of people believe they have a right to access all of their healthcare data in order to make decisions about their personal care. Yet most of these patients (76 percent) have never accessed their EHR, chiefly because they don’t know how, and because they lack the ability to integrate EHR data with other applications, such as those that track weight, diet, or exercise via a smartphone or home computer.

Two forces need to align in order to facilitate change. In the marketplace, healthcare providers and patients both need to advocate for open API and liquid data in order to get the most out of healthcare applications. With increased demand for open access, market forces will be unleashed to prevent closed systems from being introduced for a single vendor’s financial gain. Moreover, with open systems and free access to development platforms, EHR vendors can differentiate themselves with the diversity and utility of the apps that are built to work with their systems, creating an added value to end users.

Secondly, we need a policy environment that enables innovation. One way this could be achieved would be for the Office of the National Coordinator to require open API for health data. In an optimal environment, vendors should have to demonstrate that data can be extracted via open API and leveraged by third-party software developers.

The business of healthcare should not be predicated on keeping data trapped behind proprietary walls. Given the critical need to use data to better predict, diagnose, and manage population health, the truly differentiated vendor is one that allows open access and third-party application development in order to create systems that providers and patients truly value. It’s time to liberate information and unleash innovation in healthcare.

[1] McKinsey & Company, “Open Data: Unlocking innovation and performance with liquid information,” October 2013, p. 11.

Keith Figlioli is senior vice president of healthcare informatics for Premier, Inc. of Charlotte, NC.

Readers Write: FDASIA and Healthcare’s Moon Shot Goal of ICU Safety

May 15, 2014 Readers Write 7 Comments

FDASIA and Healthcare’s Moon Shot Goal of ICU Safety
By Stephanie Reel


Preparing for the FDASIA panel was an energizing opportunity. It allowed me to spend a little time thoughtfully considering the role of government and the role of private industry in the future of health IT integration and interoperability. It gave me an opportunity to think a great deal about the important role ONC has played over the past few years and it made me question why we haven’t achieved some of the goals we had hoped to achieve.

As I was preparing my remarks, I reflected on the great work being done by my colleagues at Johns Hopkins and our vendor partners. We have the distinct privilege of having the Armstrong Institute at Hopkins focused on patient safety and quality, generously funded by Mr. Mike Armstrong, former chairman of the Board of Trustees for Johns Hopkins Medicine. It is unequaled and a part of our fabric and our foundation. The Armstrong Institute is inspirationally led by Dr. Peter Pronovost, an incredibly well-respected leader in the field of patient safety as well as a trusted colleague and a good friend.

We in IT at Hopkins receive exceptional support from our leadership – truly. We also have amazingly strong partnerships with our school of medicine faculty, our nurses, and our enterprise-wide staff. I suspect we are the envy of most academic health systems. The degree of collaboration at Hopkins is stunning – in our community hospitals, physician offices, and across our academic medical centers. Our systems’ initiatives derive direct qualitative and quantitative benefit from these relationships. Our CMIO, Dr. Peter Greene, and our CNIO, Dr. Stephanie Poe, are the best of the best in their roles. The medical director of our Epic deployment, Dr. John Flynn, is a gift.  

We are luckier than most. We could not do what we do without them. But despite this impressive and innovative environment, we still have significant challenges that are not unique to Hopkins. 

Despite huge investments and strong commitments to Meaningful Use, we have challenges across all health IT initiatives. They aren’t new ones and they aren’t being adequately addressed by our current commitment to Meaningful Use criteria. We are still not operating in a culture adequately committed to safety and patient- and family-centered care. We are still not sufficiently focused on technologies, processes, and environments that are consistently focused on doing everything in the context of what’s best for the patient.

We decided to try harder. All across Johns Hopkins Medicine, we published a set of guiding principles that shape our approach to the deployment of information technology solutions. These guiding principles reduce ambiguity and provide constancy of purpose. They drive the way we make decisions, prioritize our work, and choose among alternatives – investment alternatives, deployment alternatives, vendor alternatives, integration tactics, and deployment strategies. They provide a “true north” that promotes the culture we are hoping to create.

Our first guiding principle expects us to always do what is best for the patient. No question, no doubt, no ambiguity. We will always do what is best for the patient and for the patient’s family and care partners. We are committed to patient safety and it is palpable. This is our true north.

Our second guiding principle allows us to extend our commitment even further. We commit to also always doing what is best for the people who take care of patients. So far, we have never found this to be in conflict with our first guiding principle. We view the patient and the patient’s family as our partners. Together, we are the team. Our environment, our work flow, our processes, and our technologies need to do what is best for all members of the team and all of the partners in the process of disease prevention, prediction, and treatment.

Our remaining guiding principles deal with our commitment to integration, standardization, and best practices. We know that unmanaged complexity is dangerous. We know that there are opportunities to improve our processes and our systems if we are always focused on being a learning healthcare system. We know we can achieve efficiencies and more effective solutions if we also achieve some degree of standardization and data and system integration. This is essential, critically important, and huge. It is something FDASIA (the FDA, FCC, and ONC) and the proposed Safety Center may be able to help us address.

Is this the best role for government?

Government has an important role and government has the power to convene, which is often critical. But I also feel strongly that market forces are compelling and must be tapped to help us better serve our patients and the people who care for our patients. Health systems and hospitals have tremendous purchasing power. We should ensure we define our criteria for device and system selection based upon the vendor’s commitment to integration, standardization, and collaboration around best practices. We must find a way to promote continuous learning if we are to achieve the triple aim. 

We need to step up. We need to say we will not purchase devices, systems, and applications if the vendors are not fully and visibly committed to interoperability and continuous learning. This must be true for software, hardware, and medical devices. It must be true for our patients and for the people who care for our patients.

Moon shot goal

This relates to my plea that we define a moon shot goal for our nation. We must commit to having the safest healthcare delivery system in the world. We should start with our intensive care units. We must ensure that our medical devices, smart pumps, ventilators, and glucometers are appropriately and safely interoperable. We must make a commitment to settle for nothing less. We must agree that we will not purchase devices or systems that do not integrate, providing a safe, well-integrated solution for our patients and for the people taking care of our patients.

Let’s decide as a nation that we will place as much emphasis on safety as we have on Meaningful Use. Or perhaps we can redefine Meaningful Use to specify the criteria, goals, and objectives that must be achieved to ensure we meet our moon shot goals. We will ensure that we have the safest hospitals in the world, and we will start with our ICUs, where we care for the most vulnerable patients. We might even want to start with our pediatric ICUs, where we treat the most vulnerable patients of all.

More than 10 years ago, I was given an amazing opportunity to “adopt a unit” at The Johns Hopkins Hospital as a part of a safety program invented at Hopkins by Dr. Peter Pronovost. Each member of our leadership team was provided with an opportunity to adopt an ICU. We were encouraged to work with our ICU colleagues to focus on patient safety. We were educated and trained to be “preoccupied with failure” and focused on any defects that might contribute to patient harm. We didn’t realize it at the time, but we were learning how to become a High Reliability Organization.  

I learned quickly that our ICUs are noisy, chaotic, extremely busy, and not comforting places for our patients or their families. I learned that our PICU was especially noisy. Some of our patients had many devices at their bedside, nearly none of which were interoperable. They beeped, whirred, buzzed, and sent alarms – many of which were false alarms — all contributing to the noise, complexity, and feeling of chaos. They distracted our clinicians, disturbed our patients, and worried our family partners. 

Most importantly, they didn’t talk to one another. So much sophisticated technology, in the busiest places in our hospitals, all capable of providing valuable data, yet not integrated – not interoperable – and sometimes not even helpful.

I realized then, and many times since I adopted the PICU, that we all deserve better. Our patients and the people who care for our patients deserve better. We must build quiet ICUs where our care team can listen and learn and where our patients can receive the care they need from clinicians who can collaborate, leveraging well-integrated solutions and fully integrated information to provide the safest possible care. Many of these principles influenced the construction of our new clinical towers that opened two years ago. Again, we are fortunate, but huge challenges remain.

What about Quality Management Systems? Are we testing and measuring quality appropriately?

In many ways, I think we may focus too much on the back end. Perhaps we spend too much time testing and not enough leading affirmatively. A commitment to excellence – to high reliability – might lessen the complexity of our testing techniques. I am very much committed to sophisticated quality assurance testing, but it seems far better to create and promote a culture that is committed to doing it right the first time. It will also be important that we affirmatively lead our design and deployment of systems rather than relying only on testing our solutions.

With that in mind, I would prefer to see an additional focus or strategy that embraces High Reliability at the front end in addition to using quality management techniques. We undoubtedly need both. 

As I have recently learned, most High Reliability Organizations have much in common related to this dilemma. We all operate in unforgiving environments. Mistakes will happen, defects will occur, and we need to be attentive. But we must also have aspirational goals that cause us to relentlessly focus on safety at the front end. We must remain passionate about our preoccupation with failure. We must recognize that our interventions are risky. We must have a sense of our own vulnerabilities and ensure we recognize we are ultimately responsible and accountable despite our distributed and decentralized models. We must continue to ask ourselves, “How will the next patient be harmed?” and then do everything possible to prevent harm at the front end as well as during testing. We must create a culture that causes us to think about risk at the beginning. And of course, we must be resilient, reacting appropriately when we do recognize errors, defects, or problems.

I should note that many of these ideas related to High Reliability are well documented in Karl Weick and Kathleen Sutcliffe’s book, Managing the Unexpected. They encourage “collective mindfulness” and a shared understanding of the situation at hand. Their processes center on five principles: a preoccupation with failure, reluctance to simplify interpretations, sensitivity to operations, commitment to resilience, and deference to expertise.

Why the moon shot goal?

As Dr. Pronovost at Johns Hopkins Armstrong Institute often says, “Change travels at the speed of trust.” We need to learn from one another. We need to be transparent, focused, and committed to doing what is best for our patients and for the people who care for our patients. We must commit to reducing patient harm. We must improve the productivity and effectiveness of our healthcare providers. We must have faith in our future and trust our partners. We need to make a commitment to no longer expect or accept mediocrity. 

From a recent study performed at the Armstrong Institute under Dr. Pronovost’s leadership, we know that patients around our country continue to die needlessly from preventable harm. Healthcare has little tangible improvement to show for its $800 billion investment in health information technology. Productivity is flat. Preventable patient harm remains the third leading cause of death in the US.

In addition, costs of care continue to consume increasingly larger and unsustainable fractions of the economy in all developed countries. While cutting payments may slightly decrease the cost per unit of service, improving productivity could more significantly deflate costs. Other industries have significantly improved productivity, largely through investments in technology and in systems engineering to obtain the maximal value from technology. Yet healthcare productivity has not improved. Our nurses respond to alarms, many of them false, on average every 94 seconds. This would be unthinkable in many other environments.

Despite my view that we must encourage market forces, we know that we have a long way to go to have an ICU that has been designed to prevent all patient harm while also reducing waste. Clinicians are often given technologies that were designed by manufacturers with limited usability testing by clinicians. These technologies often do not support the goals clinicians are trying to achieve, often hurt rather than help productivity, and have a neutral or negative impact on patient safety.

Moreover, neither regulators nor the market has applied sufficient pressure on electronic health record vendors or device manufacturers to integrate technologies that reduce harm. The market has not helped integrate systems or design a unit that prevents all patient harm, optimizes patient outcomes and experience, and reduces waste. Hospitals continue to buy technologies that do not communicate.

It is as if we believe Bloomberg News could have succeeded without any standards for sharing financial and market data. It would be unthinkable for Boeing to continue to partner with a landing gear manufacturer that refused to incorporate a cockpit signal informing the pilot whether the landing gear was up or down. We need the same expectations of trans-disciplinary collaboration among engineers, physicians, and clinicians to ensure the same is true for healthcare.

Back to the moon shot…

An ideal ICU is possible if we decide it matters enough. If we agree to combine trans-disciplinary collaboration with broad stakeholder participation and demand authentic collaborations, we can get there in less than five years. But it won’t be trivial. It will require a public/private partnership.

The cultural and economic barriers to such collaborations are profound. Engineers and physicians use different language, apply different theories and methods, and employ different performance measures. We must take a holistic approach to create the ideal ICU and the ideal patient and family experience.

A safe, productive system is possible today. Technology is not the barrier. Let’s make it happen. Let’s have a goal for 2020 that we will have the safest ICUs (and the safest hospitals) on the planet – focused on patient- and family-centered care, disease prevention, and personalized and individualized healthcare.

Stephanie L. Reel is CIO and vice-provost for information technology at Johns Hopkins University and vice-president for information services for Johns Hopkins Medicine of Baltimore, MD.
