
Readers Write: Narrow Networks: Blessing, Curse, Should You Care?

May 23, 2014

By Shawn Wagoner


Narrow networks = blessing. In its recommendations to improve the government’s ACO programs, the American Hospital Association is urging CMS to “create some financial incentive on the part of the beneficiary to choose to stay ‘in network’ so that their care can be coordinated.”

Narrow networks = curse. In Seattle and New Hampshire, healthcare organizations are taking legal action to prevent health plans from developing narrow networks.

Narrow networks = real. Regardless of where an organization falls on the blessing vs. curse spectrum, narrow networks are back and gaining momentum. McKinsey research finds that 70 percent of the plans sold on the individual exchanges created as part of the ACA use what it categorizes as narrow or ultra-narrow hospital networks. There is also serious traction among the private-sector companies that help finance health insurance for their employees. As evidence, a commercial health plan in Minneapolis now has roughly 30,000 members enrolled in private exchanges, and over half of those enrolled have chosen a narrow network benefit product constructed around one of four available ACOs.

Former ONC Chief Dr. David Blumenthal recently wrote about narrow networks, suggesting that “by guaranteeing their chosen caregivers a certain volume of business, health plans acquire the leverage to negotiate better prices in future contracts.” The private exchange example from Minneapolis suggests that providers also agree to higher quality and patient experience standards in addition to the price concessions. In theory, these narrow networks have the potential to benefit all stakeholders:

  • Health plans pay lower prices to providers and can package those lower prices into lower cost and higher quality benefit products to attract consumers and members.
  • Consumers pay lower premiums to the health plans for higher-quality care.
  • Providers are assured that members will use their services when the need arises. Additionally, more people than before will use their services because the lower-priced narrow network benefit products attract new patients.

Chances are that most organizations have a strategic plan that includes some form of a narrow network, whether a clinically integrated network, an ACO, or in many cases, both. Given their strategic importance and operational complexities, now is the time to start thinking about how to operate a narrow network effectively.

Recall the advent of high-deductible health plans a decade ago, how quickly patient responsibility grew as a percentage of revenue, and how much process and technological change was required in response. Likewise, narrow networks bring new yet similar challenges that will require a great deal of process change and technological advancement. Here are some thoughts to help assess the readiness of an organization:

Challenge #1: Patient transitions require improved coordination to track patient status, both to deliver on the higher quality standards and to realize the financial benefit of ensuring patients are transitioned to in-network providers.

Operational considerations:

  1. Can pertinent portions of chart notes be shared among all in-network providers?
  2. Does an automated workflow exist to book follow-on appointments with in-network providers, both employed and affiliated?

Challenge #2: Narrow networks typically incent patients to stay in network for care by making it more expensive for them to receive treatment from an out-of-network provider.

Operational considerations:

  1. Is a system in place to respond to patient inquiries about whether a given provider or facility is in their network?
  2. Can providers easily determine who is in and out of network when they are recommending follow-on care?

Challenge #3: Patients who choose narrow network products are cost conscious and expect their clinicians to be as well.

Operational considerations:

  1. Are clinical protocols broadly adopted that address the appropriateness of care so that patients are not faced with medical bills for unnecessary care?
  2. Are workup requirements established so that patients do not arrive at an appointment only to find that key steps were not completed and additional appointments are necessary?

Challenge #4: Patients have traded broad access via a wide open network of every provider and facility for a limited access option. However, limited access refers only to the number of physicians and facilities, not the ability to be seen in a timely manner.

Operational considerations:

  1. Are the individuals who handle inbound requests able to quickly view availability for all services within the narrow network to ensure the patient can get a timely appointment?
  2. Is this the time to start allowing patients themselves to book their own appointment online?

By no means is this an exhaustive list, but it should help quickly determine how prepared an organization is to support a narrow network strategy.


Shawn Wagoner is president of Proximare Health of Savannah, GA.

Readers Write: ATA Conference Recap: My Impressions of the Show

May 23, 2014

By Norman Volsky

After attending the 19th Annual American Telemedicine Conference in Baltimore Monday and Tuesday and walking the exhibit hall, I came away with several conclusions (besides the fact that Baltimore has the world’s most delicious crab cakes).

  • Telemedicine is a very exciting space. This market has the potential to help hospitals, patients, employers, and health plans reduce cost. There are also solutions out there which simultaneously improve quality and outcomes. This is a market that is poised for some tremendous growth.
  • The telehealth / telemedicine / telepresence space (these all have different definitions) could become commoditized very soon if it hasn’t already. There were a ton of companies selling mobile carts, each with their own differentiators. Some focused on providing their services at the lowest cost, while others focused on quality and value. Either way, this market seems to be moving in the same direction that HIE and, more recently, EMR have gone over the past couple of years: toward consolidation and commoditization.


  • Telemedicine is geared towards multiple customers. Companies like Healthspot and American Well were showing off kiosks or pods designed for the retail sector, including pharmacies, large corporate headquarters, and supermarkets, as well as hospitals. American Well had impressive solutions geared towards tablets and smartphones. This is a market that could see significant growth.
  • Remote patient monitoring software companies are poised for growth. Some focus on home health, while others focus on post-acute care and, more broadly, the entire continuum of care. The companies that collect data from wearable devices are particularly cool. Many of these companies have patient engagement capabilities, secure texting, and outbound or proactive phone calls to patients to make sure they are following their care plans. This segment of HIT helps hospitals qualify for Meaningful Use by reducing readmissions. ACOs and health plans are leveraging these types of software systems to reduce cost, risk, and readmissions (the holy HIT trinity). The majority of these companies are focused on high-risk populations, which include chronic care patients, the elderly, and patients who have had a recent major operation or episode. Others are focused on wellness for population management. I was particularly impressed with the exhibits of CareVia, AMC Health, Ideal Life, and Tactio Health.
  • Unique software caught my eye. Specific companies that caught my eye had unique offerings such as iMDsoft (clinical information systems software geared towards perioperative and critical care) and MediSprout (a telemedicine platform that runs entirely on tablets and leverages existing HIT apps.)
  • Smaller vendors need additional funding. I asked a lot of companies about their revenue model and some of them didn’t have great answers. There was also some ambiguity as to who the economic buyer would be (patients, hospitals, payers, etc.). Many companies threw out buzzwords like population health management and care coordination, but it seemed to me that they need to better articulate why these types of solutions are important to providers and health plans. If these companies can show how their solutions connect to the larger healthcare picture, they will have a better chance of obtaining the funding they require.
  • This is a very sheltered segment of the industry. The majority of the booths I visited had no knowledge of HIStalk, and many of these companies did not have a vast knowledge of the software world. At least half of the exhibiting companies were hardware focused, for example mobile carts with videoconference capabilities customized for healthcare.
  • The telemedicine segment should become more in tune with how their products and solutions fit within the broader healthcare IT market. With the previous conclusions in mind, these companies would be wise to keep abreast of blogs like HIStalk. They need to understand where hospitals are spending their money and what types of products and solutions will get the attention of hospital C-Level executives. With a better understanding of their competition for dollars, they would be more successful in articulating the right message to potential buyers. I also believe that partnering with some pure software companies could give them a more comprehensive and marketable offering to sell.

Overall, telemedicine is an area of healthcare that will have incredible growth over the next several years. There is a lot of competition in the telemedicine and remote patient monitoring segments and there will undoubtedly be some winners and losers. However, once the dust settles and consolidation occurs, the healthcare space will be better off. The ability to have doctor visits remotely and be able to monitor patients while they are at home is powerful. With this technology, hospitals and health plans will be able to reduce cost, risk and readmissions and, most importantly, save lives.

In conclusion, I feel this market is too siloed and needs a better understanding of, and exposure to, the rest of the healthcare IT market. My advice for companies in this space would be to attend next year’s HIMSS conference in Chicago. I think doing so would be an eye-opening experience that would be extremely beneficial to this market’s inevitable growth. The better companies in this space understand how they fit into the bigger picture of healthcare, the better chance they will have to make it in both the short and long term.


Norman Volsky is director of the mobile healthcare IT practice for Direct Recruiters, Inc. of Solon, OH.

Readers Write: EHR Usability – Whose Job Is It?

May 16, 2014

By Michael Burger


Near misses, good catches, or serious reportable events: how many of these can be traced to a design flaw in the EHR used? This was an underlying question in a recently published article entitled “Poor Prescription documentation in EHR jeopardizes patient safety at VA hospital.” The article caught my eye because I thought perhaps it would contain information on a design flaw that might need to be addressed in ePrescribing software.

The article referred to a Department of Veterans Affairs Office of Inspector General report from December that cited a variety of documentation lapses regarding opioid prescriptions at the VA Medical Center in San Francisco. The EHR was a factor in the report primarily because the EHR is the place from which the documentation was missing.

From the headline of this article, the reader assumes that the EHR figures prominently in the patient safety hazard. In all probability, the same lapse in documentation would have occurred in a paper chart environment. The report found that 53 percent of opioid renewals didn’t have documentation of a provider’s assessment. I’d lay a sizable wager that the percentage would be the same or higher were the hospital using paper charts instead of an EHR.

It seems to be sport these days to throw daggers at (dare I say beleaguered) EHRs and EHR vendors. Studies are published showing the levels of dissatisfaction with EHRs. ONC responds by introducing EHR usability requirements in the Meaningful Use standards and certification criteria. Inevitably, the focus of these activities centers on the notion that vendors purposely build EHRs that aren’t usable, are inept at training, and are uncooperative (or even sinister) about working together.

In reality, vendors are anything but purposefully uncooperative, inept, or builders of unusable products. Logically, how could a vendor stay in business if it were uncooperative, sold products that didn’t work, and failed at teaching people how to use them? In the world of EHRs, there are forces at play that help explain these perceptions.

EHR vendors, like creators of any other product, build software features based upon demand. The limitations to a development budget are time, scope, and resources. While any feature could be built, priorities must be set as to what to build and in what order, given the limitations.

Meaningful Use has disrupted this prioritization process by inserting requirements that have become high priority because they are necessary to pass the certification test but for which there is little or no customer demand. For example, no EHR user is asking for a way to document patient ethnicity. But there are plenty of requests for workflows that don’t require dozens of clicks. The challenge vendors face is that Meaningful Use requires focus on marginally useful features, such as tracking patient ethnicity, and doesn’t leave bandwidth to eliminate clicks in the workflow.

Ineptitude in training is an interesting claim. One very successful vendor is renowned for its “our way or the highway” mentality when it comes to training. Very effective, to be certain, though not a lot of fun for those receiving the training. But this method does set an appropriate expectation that workflow modification is required for successful EHR adoption. Other vendors are renowned for their mostly failed attempts to “make the software accommodate your workflow so you won’t have to change a thing.” The reality is that it’s not possible to insert a computer into a manual process like clinical workflow and expect not to have to change a thing. It’s not that a failing vendor is inept; it’s that expectations aren’t being set correctly.

Meaningful Use has inserted a perverse twist into this already unpleasant reality by forcing vendors to train clients to perform workflows that are out of context with what doctors would typically do but are now required for attestation.

The uncooperative accusation is the most laughable of all. Interfaces have been around since before there were EHRs; HL7 was founded in 1987. It’s a question of supply and demand. When customers demand the ability to connect disparate systems, vendors build interfaces. It’s true that vendors have built products using proprietary architectures, because until now no one was asking for common standards. Even today, with the availability and mandated use of common standards, less than 30 percent of doctors regularly access HIE data. There’s not a lot of demand for all of that external data. It’s not that vendors don’t build interfaces because they’re being uncooperative; it’s that providers aren’t asking for them.

The principle of supply and demand is a fundamental market driver. It’s disappointing that Meaningful Use has sidetracked the natural evolution of the market by creating artificial demand for EHR functions that actual consumers aren’t asking for. MU has had the unintended consequence of stifling innovation in the functionality users are asking for, which would have spurred widespread organic adoption. We’ve not (yet) seen the iPod of electronic health records because vendors have been too busy writing code to pass the MU test.

Rather than introducing a voluntary 2015 Edition EHR certification, CMS and ONC should give vendors the year gained from deferring the start of MU Stage 3 to innovate the features customers really want, rather than adding more features and another certification to continue a harsh cycle.

Michael Burger is senior consultant with Point-of-Care Partners of Coral Springs, FL.

Readers Write: Liberating Data with Open API

May 16, 2014

By Keith Figlioli


Today, people all over the world use Twitter as a means of everyday communication. But how useful would the application be if you had to contact the company and get custom code written each time you wanted to post a thought? As ludicrous as this seems in the social media space, it’s reality in healthcare information technology.

For all the hype around electronic health records (EHRs), healthcare providers still lack the ability to easily access the data in them. In essence, this means that developers can’t simply build applications that meet a use case, because each system is closed behind a proprietary wall that requires custom coding to be unlocked for add-on workflow applications. If you want to marry EHR data with pharmacy data so that doctors can be alerted when a medication hasn’t been refilled, for instance, your health system must contact its EHR vendor and pay to have that application developed to its specs.
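To make the contrast concrete, here is a minimal sketch of what that refill alert could look like if EHR and pharmacy data were reachable through an open, RESTful API. The endpoint paths and field names are invented for illustration; no vendor exposes this exact interface.

    import datetime
    import requests

    BASE = "https://ehr.example.org/api"   # hypothetical open API endpoint

    def overdue_refills(patient_id, grace_days=7):
        """Return drug names whose expected refill date has passed with no new fill."""
        prescriptions = requests.get(
            f"{BASE}/patients/{patient_id}/prescriptions",
            params={"status": "active"},
        ).json()
        today = datetime.date.today()
        overdue = []
        for rx in prescriptions:
            last_fill = datetime.date.fromisoformat(rx["last_fill_date"])
            due = last_fill + datetime.timedelta(days=rx["days_supply"] + grace_days)
            if today > due:
                overdue.append(rx["drug_name"])
        return overdue

With an interface like this, a developer could build the alerting app in an afternoon and any health system could adopt it, with no vendor-specific custom build required.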

These walls around data have real consequences. Not only are healthcare providers spending millions on one-off applications, but they are missing innovation opportunities by requesting custom builds. In the case of smartphones, both Apple and Google released their application programming interfaces (API) for any developer to leverage, creating thousands of apps, many of which users would not have imagined on their own. In healthcare, these APIs don’t exist, meaning that apps are only developed if they are imagined by either the provider or the vendor, with all potential for crowdsourced innovation completely cut off.

Although it’s hard to put a price tag on missed opportunity, a McKinsey & Company report found that the US loses between $300 billion and $450 billion in annual economic potential because of closed data systems.[1] With more “liquid” data, McKinsey predicts new applications that close information gaps, enable best practice sharing, enhance productivity, support data-driven decision making, pinpoint unnecessary variation, and improve process reliability, all sorely lacking in today’s healthcare environment.

There’s also a price for patients. According to a recent Accenture poll, 69 percent of people believe they have a right to access all of their healthcare data in order to make decisions about their personal care. Yet most of these patients (76 percent) have never accessed their EHR, chiefly because they don’t know how to, nor do they have the ability to integrate EHR data with other applications, such as those that track weight, diet, or exercise via a smartphone or home computer.

Two forces need to align in order to facilitate change. In the marketplace, healthcare providers and patients both need to advocate for open API and liquid data in order to get the most out of healthcare applications. With increased demand for open access, market forces will be unleashed to prevent closed systems from being introduced for a single vendor’s financial gain. Moreover, with open systems and free access to development platforms, EHR vendors can differentiate themselves with the diversity and utility of the apps that are built to work with their systems, creating an added value to end users.

Second, we need a policy environment that enables innovation. One way this could be achieved would be for the Office of the National Coordinator to require open APIs for health data. In an optimal environment, vendors would have to demonstrate that data can be extracted via an open API and leveraged by third-party software developers.

The business of healthcare should not be predicated on keeping data trapped behind proprietary walls. Given the critical need to use data to better predict, diagnose, and manage population health, the truly differentiated vendor is one that allows open access and third-party application development in order to create systems that providers and patients truly value. It’s time to liberate information and unleash innovation in healthcare.

[1] McKinsey & Company, “Open Data: Unlocking Innovation and Performance with Liquid Information,” October 2013, p. 11.

Keith Figlioli is senior vice president of healthcare informatics for Premier, Inc. of Charlotte, NC.

Readers Write: FDASIA and Healthcare’s Moon Shot Goal of ICU Safety

May 15, 2014

By Stephanie Reel


Preparing for the FDASIA panel was an energizing opportunity. It allowed me to spend a little time thoughtfully considering the role of government and the role of private industry in the future of health IT integration and interoperability. It gave me an opportunity to think a great deal about the important role ONC has played over the past few years and it made me question why we haven’t achieved some of the goals we had hoped to achieve.

As I was preparing my remarks, I reflected on the great work being done by my colleagues at Johns Hopkins and our vendor partners. We have the distinct privilege of having the Armstrong Institute at Hopkins focused on patient safety and quality, generously funded by Mr. Mike Armstrong, former chairman of the Board of Trustees for Johns Hopkins Medicine. It is unequaled and a part of our fabric and our foundation. The Armstrong Institute is inspirationally led by Dr. Peter Pronovost, an incredibly well-respected leader in the field of patient safety as well as a trusted colleague and a good friend.

We in IT at Hopkins receive exceptional support from our leadership – truly. We also have amazingly strong partnerships with our school of medicine faculty, our nurses, and our enterprise-wide staff. I suspect we are the envy of most academic health systems. The degree of collaboration at Hopkins is stunning – in our community hospitals, physician offices, and across our academic medical centers. Our systems’ initiatives derive direct qualitative and quantitative benefit from these relationships. Our CMIO, Dr. Peter Greene, and our CNIO, Dr. Stephanie Poe, are the best of the best in their roles. The medical director of our Epic deployment, Dr. John Flynn, is a gift.  

We are luckier than most. We could not do what we do without them. But despite this impressive and innovative environment, we still have significant challenges that are not unique to Hopkins. 

Despite huge investments and strong commitments to Meaningful Use, we have challenges across all health IT initiatives. They aren’t new ones, and they aren’t being adequately addressed by our current commitment to Meaningful Use criteria. We are still not operating in a culture adequately committed to safety and patient- and family-centered care. We are still not sufficiently focused on technologies, processes, and environments that are consistently oriented toward doing everything in the context of what’s best for the patient.

We decided to try harder. All across Johns Hopkins Medicine, we published a set of guiding principles for our approach to the deployment of information technology solutions. These guiding principles reduce ambiguity and provide constancy of purpose. They drive the way we make decisions, prioritize our work, and choose among alternatives: investment alternatives, deployment alternatives, vendor alternatives, integration tactics, and deployment strategies. They provide a “true north” that promotes the culture we are hoping to create.

Our first guiding principle expects us to always do what is best for the patient. No question, no doubt, no ambiguity. We will always do what is best for the patient and for the patient’s family and care partners. We are committed to patient safety and it is palpable. This is our true north.

Our  second guiding principle allows us to extend our commitment even further. We commit to also always doing what is best for the people who take care of patients. So far, we have never found this to be in conflict with our first guiding principle. We view the patient and the patient’s family as our partners. Together, we are the team. Our environment, our work flow, our processes, and our technologies need to do what is best for all members of the team and all of the partners in the process of disease prevention, prediction, and treatment.

Our remaining guiding principles deal with our commitment to integration, standardization, and best practices. We know that unmanaged complexity is dangerous. We know that there are opportunities to improve our processes and our systems if we are always focused on being a learning healthcare system. We know we can achieve efficiencies and more effective solutions if we also achieve some degree of standardization and data and system integration. This is essential, critically important, and huge. It is something FDASIA (the FDA, FCC, and ONC) and the proposed Safety Center may be able to help us address.

Is this the best role for government?

Government has an important role and government has the power to convene, which is often critical. But I also feel strongly that market forces are compelling and must be tapped to help us better serve our patients and the people who care for our patients. Health systems and hospitals have tremendous purchasing power. We should ensure we define our criteria for device and system selection based upon the vendor’s commitment to integration, standardization, and collaboration around best practices. We must find a way to promote continuous learning if we are to achieve the triple aim. 

We need to step up. We need to say we will not purchase devices, systems, and applications if the vendors are not fully and visibly committed to interoperability and continuous learning. This must be true for software, hardware, and medical devices. It must be true for our patients and for the people who care for our patients.

Moon shot goal

This relates to my plea that we define a moon shot goal for our nation. We must commit to having the safest healthcare delivery system in the world. We should start with our intensive care units. We must ensure that our medical devices, smart pumps, ventilators, and glucometers are appropriately and safely interoperable. We must make a commitment to settle for nothing less. We must agree that we will not purchase devices or systems that do not integrate, providing a safe, well-integrated solution for our patients and for the people taking care of our patients.

Let’s decide as a nation that we will place as much emphasis on safety as we have on Meaningful Use. Or perhaps we can redefine Meaningful Use to specify the criteria, goals, and objectives to be achieved to ensure that we meet our moon shot goals. We will ensure that we have the safest hospitals in the world, and we will start with our ICUs, where we care for the most vulnerable patients. We might even want to start with our pediatric ICUs, where we treat the truly most vulnerable patients.

More than 10 years ago, I was given an amazing opportunity to “adopt a unit” at The Johns Hopkins Hospital as a part of a safety program invented at Hopkins by Dr. Peter Pronovost. Each member of our leadership team was provided with an opportunity to adopt an ICU. We were encouraged to work with our ICU colleagues to focus on patient safety. We were educated and trained to be “preoccupied with failure” and focused on any defects that might contribute to patient harm. We didn’t realize it at the time, but we were learning how to become a High Reliability Organization.  

I learned quickly that our ICUs are noisy, chaotic, extremely busy, and not comforting places for our patients or their families. I learned that our PICU was especially noisy. Some of our patients had many devices at their bedside, nearly none of which were interoperable. They beeped, whirred, buzzed, and sent alarms – many of which were false alarms — all contributing to the noise, complexity, and feeling of chaos. They distracted our clinicians, disturbed our patients, and worried our family partners. 

Most importantly, they didn’t talk to one another. So much sophisticated technology, in the busiest places in our hospitals, all capable of providing valuable data, yet not integrated – not interoperable – and sometimes not even helpful.

I realized then, and many times since I adopted the PICU, that we all deserve better. Our patients and the people who care for our patients deserve better. We must build quiet ICUs where our care team can listen and learn and where our patients can receive the care they need from clinicians who can collaborate, leveraging well-integrated solutions and fully integrated information to provide the safest possible care. Many of these principles influenced the construction of our new clinical towers that opened two years ago. Again, we are fortunate, but huge challenges remain.

What about Quality Management Systems? Are we testing and measuring quality appropriately?

In many ways, I think we may focus too much on the back end. Perhaps we spend too much time on testing and not enough on leading affirmatively. A commitment to excellence, to high reliability, might lessen the complexity of our testing techniques. I am very much committed to sophisticated quality assurance testing, but it seems far better to create and promote a culture that is committed to doing it right the first time. It will also be important that we affirmatively lead the design and deployment of systems rather than relying only on testing our solutions.

With that in mind, I would prefer to see an additional focus or strategy that embraces High Reliability at the front end in addition to using quality management techniques. We undoubtedly need both. 

As I have recently learned, most High Reliability Organizations have much in common related to this dilemma. We all operate in unforgiving environments. Mistakes will happen, defects will occur, and we need to be  attentive. But we must also have aspirational goals that cause us to relentlessly focus on safety at the front end. We must remain passionate about our preoccupation with failure. We must recognize that our interventions are risky. We must have a sense of our own vulnerabilities and ensure we recognize we are ultimately responsible and accountable despite our distributed and decentralized models. We must continue to ask ourselves, “How will the next patient be harmed?” and then do everything possible to prevent harm at the front end as well as during testing.  We must create a culture that causes us to think about risk at the beginning.  And of course, we must be resilient, reacting appropriately when we do recognize errors, defects, or problems.

I should note that many of these ideas related to High Reliability are very well documented in Karl Weick and Kathleen Sutcliffe’s book, Managing the Unexpected. They encourage “collective mindfulness” and a shared understanding of the situation at hand. Their processes are centered on five principles: a preoccupation with failure, reluctance to simplify interpretations, sensitivity to operations, commitment to resilience, and deference to expertise.

Why the moon shot goal?

As Dr. Pronovost at Johns Hopkins Armstrong Institute often says, “Change travels at the speed of trust.” We need to learn from one another. We need to be transparent, focused, and committed to doing what is best for our patients and for the people who care for our patients. We must commit to reducing patient harm. We must improve the productivity and effectiveness of our healthcare providers. We must have faith in our future and trust our partners. We need to make a commitment to no longer expect or accept mediocrity. 

From a recent study performed at the Armstrong Institute under Dr. Pronovost’s leadership, we know that patients around our country continue to die needlessly from preventable harm. Healthcare has little tangible improvement to show for its $800 billion investment in health information technology. Productivity is flat. Preventable patient harm remains the third leading cause of death in the US.

In addition, costs of care continue to consume increasingly larger and unsustainable fractions of the economy in all developed countries. While cutting payments may slightly decrease the cost per unit of service, improving productivity could more significantly deflate costs. Other industries have significantly improved productivity, largely through investments in technology and in systems engineering to obtain the maximal value from technology. Yet healthcare productivity has not improved. Our nurses respond to alarms — many of them false alarms – on average, every 94 seconds. This would be unthinkable in many other environments.

Despite my view that we must encourage market forces, we know that we have a long way to go to have an ICU that has been designed to prevent all patient harm while also reducing waste. Clinicians are often given technologies that were designed by manufacturers with limited usability testing by clinicians. These technologies often do not support the goals clinicians are trying to achieve, often hurt rather than help productivity, and have a neutral or negative impact on patient safety.

Moreover, the market has not yet integrated technologies to reduce harm. Neither regulators nor the market has applied sufficient pressure on electronic health record vendors or device manufacturers to integrate technologies to reduce harm. The market has not helped integrate systems or designed a unit that prevents all patient harm, optimizes patient outcomes and experience, and reduces waste. Hospitals continue to buy technologies that do not communicate.

It is as if we expected Bloomberg News to have been successful with no standards for sharing financial and market data. It would be unthinkable for Boeing to continue to partner with a landing gear manufacturer that refused to incorporate a signal to the cockpit informing the pilot whether the landing gear was up or down. We need the same expectations of engineering, medical, and clinical trans-disciplinary collaboration in healthcare.

Back to the moon shot….

An ideal ICU is possible if we decide it matters enough. If we agree to combine trans-disciplinary collaboration with broad stakeholder participation and demand authentic collaborations, we can get there in less than five years. But it won’t be trivial. It will require a public/private partnership.

The cultural and economic barriers to such collaborations are profound. Engineers and physicians use different language, apply different theories and methods, and employ different performance measures. We must take a holistic approach to create the ideal ICU and the ideal patient and family experience.

A safe, productive system is possible today. Technology is not the barrier. Let’s make it happen. Let’s have a goal for 2020 that we will have the safest ICUs (and the safest hospitals) on the planet – focused on patient- and family-centered care, disease prevention, and personalized and individualized healthcare.

Stephanie L. Reel is CIO and vice provost for information technology at Johns Hopkins University and vice president for information services for Johns Hopkins Medicine of Baltimore, MD.

Readers Write: What is a Patient Safety Organization and Should You Join One?

By Brenda Giordano, RN, MN


Can they really say that?

In 2011, the government asked Walgreens for information about two of its pharmacists. Walgreens said “no” to the request. There was nothing the government could do about it — Walgreens belonged to a Patient Safety Organization (PSO).

If you are a provider and are unfamiliar with PSOs, take six minutes to read this article. You’ll not only learn how to have a more just and fair culture of safety, but also how to have stronger legal protections for the work your teams do with safety events.

Nine years ago this July, Congress passed the Patient Safety and Quality Improvement Act of 2005, also called the Patient Safety Act. This law created a system of voluntary reporting to Patient Safety Organizations (PSOs) of safety events, near misses, and unsafe conditions, similar to what is available within aviation. At the same time, a Network of Patient Safety Databases (NPSD) was established so data could be analyzed and we could all learn why safety events occur and how to avoid them. 

The ultimate aim is to improve safety, but in a manner that also creates environments where working through the nitty gritty of what happened and why it happened can be done with legal protection and confidentiality. This freedom to fully explore safety events and safety data should foster a Just Culture, where reporting an event does not result in punishment, but rather in learning.

Let me take a pause here to lay this out very plainly. Provider organizations (hospitals, skilled nursing facilities, pharmacies, home health, ambulatory care, physician and dentist offices, laboratories, renal centers, ambulance and paramedic services, and so forth) can receive legal protection from discovery in the case of a civil lawsuit if they belong to a PSO and put together a Patient Safety Evaluation System. This means that if, heaven forbid, you as a provider find yourself being sued, there are strict limits on what can be “discovered” (think “uncovered”).

Only two things can be discovered: the facts of the case (what is in the medical record) and the billing and discharge information. Everything else, like the committee meetings where specific safety events are discussed or the information gained from root cause analysis, is legally protected, with a few exceptions that make sense.

If you hang around a hospital, clinic, or any of the above-mentioned care areas, you probably know that after an event, the Risk people often rush in and tie people’s hands on what is documented. They are afraid that a lawsuit will uncover all kinds of things that the facility would be liable for, that would make them look bad, or that would hurt their reputation.

This is a logical approach, but sometimes the end result is that few things are learned and progress on safety is slow, since everyone’s mode is CYA (the only acronym I decided not to spell out). I really wish it were not like this because I truly believe that complete transparency is the better road to take.

The reality is that few organizations have the guts to be fully transparent. The legal protection provided by this law tries to break up that bad cycle of “burying our mistakes” and remove the fear so that honest work on safety improvement can happen.

Comparative information in safety is hard to obtain. To that end, the Agency for Healthcare Research and Quality (AHRQ) created a common format so that event information from any safety reporting system can be placed into 10 categories. Research can then be done on falls, medication errors and so forth. PSOs send de-identified information to AHRQ in this common format for addition to the Network of Patient Safety Databases.
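As a rough illustration of what that submission pipeline involves, the sketch below checks that an event has been coded to a category and strips direct identifiers before the record leaves the Patient Safety Evaluation System. The field names and category list are assumptions for the example, not the actual AHRQ Common Formats specification.

    # Illustrative only: field names and categories are assumed,
    # not taken from the actual AHRQ Common Formats specification.
    CATEGORIES = {"fall", "medication_error", "pressure_ulcer", "device_event"}
    IDENTIFIERS = {"patient_name", "mrn", "ssn", "date_of_birth", "reporter_name"}

    def prepare_for_npsd(event):
        """De-identify a coded safety event before submission."""
        if event.get("category") not in CATEGORIES:
            raise ValueError("event must first be coded to a recognized category")
        return {k: v for k, v in event.items() if k not in IDENTIFIERS}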

Here are a few reasons for joining a PSO.

  • It encourages a healthy culture of safety. It’s hard to learn when you are worried that you’ll be punished or found out in a public way. A PSO helps to remove the “whack of the ruler across the knuckles” attitude that does not help anything. The intent of the law is to foster learning, not place blame. We all want to improve safety and quality for our patients. A healthy Just Culture of safety can foster this.
  • Do it while it’s voluntary (unless you have really bad readmission rates). Joining a PSO is voluntary, but in the future, hospitals with 50 beds or more will need to have a Patient Safety Evaluation System in place to participate in state insurance exchanges (the exact date is not set). By joining a PSO now, hospitals can be prepared for this eventuality with a good system in place. No one knows if the PSO program will ever be mandatory, but knowing the government… About the readmission exception: courtesy of the Affordable Care Act, if the Secretary of HHS has determined you are eligible, well, you probably know who you are and why you need to be part of a PSO.
  • Remove wasteful costs that come with poor safety. Safety-related lawsuits are costly to defend. In addition, liability carriers increase premiums when they have to defend you a lot. Imagine having your carrier tell you, “Your premiums will be going down because we’ve had so few cases where we needed to defend you.” Wouldn’t that be nice?
  • Compare and collaborate with other organizations. PSOs can provide de-identified regional and national safety benchmarks. Knowing where you stand can help you focus your improvement efforts and recognize where to give praise. PSOs can also broker collaboration among their members so they can share what they have learned. It’s great to have a buddy outside your own system where mutual learning is not just allowed, but encouraged.

There are around 80 PSOs. Some are specialty based, others are state based, and many will cover multiple types of providers across the US. I hope you will consider joining one.

Brenda Giordano, RN, MN is operations manager of the Quantros Patient Safety Center, a federally-listed PSO serving 4,000 facilities, of Milpitas, CA.

Readers Write: The Engaged Healthcare Consumer is a Myth

By Tom Meinert

Although I am a reader, this may be more appropriately titled “Patient Writes.” I have been feeling cynical lately regarding healthcare. That cynicism culminated with a recent experience with my health insurer.

My daughter had to get an MRI. Shortly thereafter, I received the bill for my portion and it was a nice, whopping $1,000. Aside from the sticker shock, I was surprised because I had an MRI about a year and a half ago and only had to pay $400. It’s the same insurance plan for both of us, and although the MRIs were done at different places, the difference in sticker price between the two MRIs was $250. I was expecting to pay a bit more, but somehow a $250 difference actually cost me $600 more out of pocket.

I called my insurer. After 40 minutes, I learned the following.

  • It’s not just what the hospital charges, but how it bills it. That may significantly change how much I have to pay out of pocket.
  • The people I call don’t have the information about the true cost to me.
  • Even if they have this information, they can’t share it with me.

This call confirmed what has bothered me all along. Despite all the talk and hype about patient engagement, the consumerization of healthcare, and mHealth ushering in a whole new world along with the ACA, the concept of an engaged and informed healthcare consumer is a myth.

I have worked my entire professional career in healthcare IT. I believe IT can help change healthcare for the better. Admittedly, most of my time has been spent working on projects that help improve care within a hospital.

But the more articles I read, conferences I attend, and apps I play around with, and the more I compare them against my own experiences as a patient, the less impressed I am. Even more, I can’t see how healthcare is going to fundamentally change for the better.

I am frustrated and powerless. Over the past four years, my health has improved greatly. I have lost weight. My cholesterol is way down, along with my blood pressure and resting heart rate. Yet over this time, my personal out-of-pocket costs and premiums have increased.

I live in Massachusetts, which is at the forefront of healthcare reform. I can’t see a PCP without booking nine months in advance. I can see them only once every two years because that’s what insurance covers for a well visit.

Like more and more of us who were pushed into a high-deductible plan, I can only begin to empathize with patients who have complex problems who try to navigate the world of billing.

I hardly feel as though I am a consumer of healthcare. The truth so far seems to be that being a healthcare consumer is simply proportional to the amount of cost pushed onto me: the more out-of-pocket costs I have, the more of an “engaged consumer” I am.

However, pushing costs on to me certainly does not make me an empowered and informed consumer. And it certainly doesn’t incentivize me to be healthy.

Going back to my phone call, it ended as expected. There was no real resolution. I still have to pay the bill. 

Those calls are recorded “for quality,” so out of frustration I included one last comment. Despite this company telling me that it has no real way to get at this information, someone there has it readily available at all times: the person who sends out all those bills that are accurate down to the last cent.

Readers Write: Can Intuitive Software Design Support Better Health?

April 16, 2014

By Scott Frederick


Biometric technology is the new “in” thing in healthcare, allowing patients to monitor certain health characteristics (blood pressure, weight, activity level, sleep pattern, blood sugar) outside of the healthcare setting. When this information is communicated to providers, it can help with population health management and long-term chronic disease care. For instance, when patients monitor their blood pressure using a biometric device and upload that information to their physician’s office, the physician can monitor the patient’s health remotely and tweak the care plan without having to physically see the patient.
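As a simple illustration of that remote-monitoring loop, the sketch below flags uploaded blood pressure readings that fall outside a care-plan threshold so the office knows which ones deserve a look. The record format and thresholds are hypothetical.

    # Hypothetical record format: (systolic, diastolic) readings in mmHg.
    def readings_to_review(readings, sys_limit=140, dia_limit=90):
        """Return uploaded readings outside the care-plan thresholds."""
        return [(s, d) for (s, d) in readings if s >= sys_limit or d >= dia_limit]

    week = [(128, 82), (151, 94), (135, 88), (147, 91)]
    print(readings_to_review(week))   # [(151, 94), (147, 91)]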

For biometric technology to be effective, patients must use it consistently in order to capture a realistic picture of the health characteristics they are monitoring. Without regular use, it is hard to tell whether a reading is an anomaly or part of a larger pattern. The primary way to ensure consistent use is to design user-friendly biometric tools. It is human nature to avoid things that are too complicated, and individuals won’t hesitate to stop using a biometric device that is onerous or complex.

Let’s look at an example.

An emerging growth area for healthcare biometrics is wireless activity trackers, like Fitbit, that can promote healthier lifestyles and spur weight loss. About three months ago, I started using one of these devices to see if monitoring metrics like the number of steps I walked, the calories I consumed, and the hours I slept would make a difference in my health.

The tool is easy to use and convenient. I can monitor my personal metrics any time, anywhere, allowing me to make real-time adjustments to what I eat, when I exercise, and so on. For instance, at any given time, I can tell how many steps I’ve taken and how many more I need to take to meet my daily fitness goal. This shows me whether I need to hit the gym on the way home from work or whether my walk at lunch was sufficient. I can even make slight changes to my routine, choosing to stand up during conference calls or take the stairs instead of the elevator.

I download my data to a website, which provides easy-to-read and customizable dashboards, so I can track overall progress. I find I check that website more frequently than I look at Facebook or Twitter.

Now, imagine if the tool was bulky, slow, cumbersome, and hard to navigate, or if the dashboard where I view my data was difficult to understand. I would have stopped using it a while ago, or might never have started using it in the first place.

Like other hot technologies, wireless activity trackers are flooding the market, each one promising to be the best. In reality, only the most well-designed applications will stand the test of time. These will be completely user-centric, designed to easily and intuitively meet user needs.

For example, a well-designed tracker will facilitate customization so users can monitor only the information they want and change settings on the fly. Such a tool will have multiple data entry points, so a user can upload his or her personal data any time and from anywhere. People will also be able to track their progress over time using clear, easy-to-understand dashboards.

Going forward, successful trackers may also need to keep providers’ needs in mind. While physicians have hesitated to embrace wireless activity monitors—encouraging patients to use the technology but not leveraging the data to help with care decisions—that perspective may be changing. It will be interesting to see whether physicians start looking at this technology in the future as a way to monitor their patients’ health choices. Ease of obtaining the data and having it interface with existing technology will drive provider use and acceptance.

While biometric tools are becoming more common in healthcare and stand to play a major role in population health management in the future, not every tool will be created equal. Those designed with the patient and provider in mind will rise to the top and improve the overall health of their users.

Scott Frederick, RN, BSN, MSHI is director of clinical insight for PointClear Solutions of Atlanta, GA.

Readers Write: Addressing Data Quality in the EHR

April 16, 2014

By Greg Chittim


What if you found out that you might have missed out on seven of your 22 ACO performance measures, not because of your actual clinical and financial performance, but because of the quality of data in your EHRs? It happens, but it’s not an intractable problem if you take a systematic approach to understanding and addressing data quality in all of your different ambulatory EHRs.

In HIStalk’s recent coverage of HIMSS14, an astute reader wrote:

Several vendors were showing off their “big data” but weren’t ready to address the “big questions” that come with it. Having dealt with numerous EHR conversions, I’m keenly aware of the sheer magnitude of bad data out there. Those aggregating it tend to assume that the data they’re getting is good. I really pushed one of the major national vendors on how they handle data integrity and the answers were less than satisfactory. I could tell they understood the problem because they provided the example of allergy data where one vendor has separate fields for the allergy and the reaction and another vendor combines them. The rep wasn’t able to explain how they’re handling it even though they were displaying a patient chart that showed allergy data from both sources. I asked for a follow up contact, but I’m not holding my breath.
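The allergy example in that comment is a good illustration of the mapping work involved. Below is a minimal normalization sketch, assuming made-up record shapes for the two vendors: one sends allergy and reaction as separate fields, the other as a single combined string.

    def normalize_allergy(record):
        """Map either vendor's record shape to one canonical form."""
        if "reaction" in record:                    # vendor A: separate fields
            return {"substance": record["allergy"], "reaction": record["reaction"]}
        # vendor B: one combined string, e.g. "penicillin - hives"
        substance, _, reaction = record["allergy"].partition(" - ")
        return {"substance": substance, "reaction": reaction or None}

    print(normalize_allergy({"allergy": "penicillin", "reaction": "hives"}))
    print(normalize_allergy({"allergy": "penicillin - hives"}))

Both calls yield the same canonical record, which is what an aggregator needs before it can report across sources. Real feeds are messier, but this is the kind of per-element decision an aggregation vendor has to make, and validate, thousands of times.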

All too often as the HIT landscape evolves, vendors and their clients are moving too quickly from EHR implementation to population health to risk-based contracts, glossing over (or skipping entirely) a focus on the quality of the data that serves as the foundation of their strategic initiatives. As more provider organizations adopt population health-based tools and methodologies, a comprehensive, integrated, and validated data asset is critical to driving effective population-based care.

Health IT maturity can be defined as four distinct steps:

  1. EHR implementation
  2. Achievement of high data quality
  3. Reporting on population health
  4. Transformation into a highly functioning PCMH or ACO.

High-quality data is a key foundational piece required to manage a population and drive quality. When the quality of the data equals the quality of the care physicians are providing, an organization can leverage that data as an asset. Quality data can provide detailed insight that allows pinpointing opportunities for intervention, whether around provider workflow, data extraction, or patient follow-up and chart review. Understanding the origins of compromised data quality helps an organization recognize how to boost measure performance, maximize reimbursements, and lay the foundation for effective population health reporting.

It goes without saying that reporting health data across an entire organization is not an easy task. However, there are steps that organizations must take to ensure they are extracting sound data from their EHR systems.

Outlined below are the key issues that contribute to poor data quality impacting population health programs, how they are typically resolved, and more optimal ways organizations can resolve them.


Variability across disparate EHRs and other data sources

EHRs are inconsistent. Data feeds are inconsistent. Despite their intentions, standardized message types such as HL7 messages and CCDs still vary a great deal from source to source. Even when they meet the letter of national standards, they rarely meet the true spirit of those standards when you try to use the data.

Take diagnoses, for example. Patient diagnoses can often be recorded in three different locations: on the problem list, as an assessment, and in medical history. Problem lists and assessments are both structured data, but generally only diagnoses recorded on the problem list are transported to the reports via the CCD. This translates to underreporting on critical measures that require records of DM, CAD, HTN, or IVD diagnoses. Accounting for this variability is critical when mapping data to a single source of truth.

Standard approach: Most organizations try to use consistent mapping and normalization logic across all data sources. Validation is conducted by doing sanity checks, comparing new reports to old.

Best practice approach: To overcome the limitations of standard EHR feeds like the CCD, reports need to pull from all structured data fields in order to achieve performance rates that reflect the care physicians are rendering: either workflow needs to be standardized across providers, or reporting tools need to be comprehensive and flexible in the data fields they pull from.

The optimal way to resolve this issue is to tap into the back end of the EHR. This allows you to see which data is structured vs. unstructured. Once you have an understanding of the back-end schema, data interfaces and extraction tools can be customized to pull data from where it is actually captured, as well as from where it should be captured. In addition, validation of individual data elements needs to happen in collaboration with providers to ensure completeness and accuracy of the data. A sketch of the idea follows.
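Here is a minimal sketch of what pulling diagnoses from every structured location might look like once the back-end schema is understood. The location keys and record shapes are invented for illustration; every real EHR schema differs.

    def all_structured_diagnoses(chart):
        """Union diagnosis codes from every structured location, not just the
        problem list that a typical CCD export carries."""
        locations = ("problem_list", "assessments", "medical_history")  # assumed keys
        return {dx["icd_code"] for loc in locations for dx in chart.get(loc, [])}

    chart = {
        "problem_list":    [{"icd_code": "I10"}],     # hypertension
        "assessments":     [{"icd_code": "E11.9"}],   # diabetes, never promoted
        "medical_history": [{"icd_code": "I25.10"}],  # coronary artery disease
    }
    print(all_structured_diagnoses(chart))   # all three; the CCD alone would carry one

A CCD-only feed would see just the hypertension diagnosis here, which is precisely how measures that depend on DM or CAD documentation end up under-reported.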


Variability in provider workflows

EHRs are not perfect and providers often have their own ways of doing things. What may be optimal for the EHR may not work for the providers or vice versa. Within reason, it is critical to accommodate provider workflows rather than forcing them into more unnatural change and further sacrificing efficiency.

Standard approach: Most organizations ignore this and go to one extreme or the other: (1) using consistent mapping and normalization logic across all data sources and user workflows, assuming that all providers use the EHR consistently, or (2) allowing workflows to dictate everything and fighting the losing battle to make the data integration infinitely adaptable. Again, validation is conducted using sanity checks, comparing new reports to old.

Best practice approach: Understand how each provider uses the system and identify where the provider captures each data element. Building in a core set of workflows and standards dictated by an on-the-ground clinical advisory committee, with flexibility for effective variations, is critical. With a standard core in place, data quality can be enhanced by tapping into the back end of the EHR to fully understand how data is captured and by spending time with care teams to observe their variable workflows. To avoid disrupting provider workflows, interfaces and extraction tools can be configured to map data correctly regardless of how and where it is captured. Robust validation of individual data elements needs to happen in collaboration with providers to ensure that the completeness and accuracy of the data (that is, the quality of the data) matches the quality of care being delivered.

 

Build provider buy-in/trust in system and data through ownership

If providers do not trust the data, they will not use population health tools. Without these tools, providers will struggle to effectively drive proactive, population-based care or quality improvement initiatives. Based on challenges with EHR implementation and adoption over the last decade, providers are often already skeptical of new technology, so getting this right is critical.

Standard approach: Many organizations' data validation consists of a sanity test comparing old reports to new. Reactive fixes are made to correct errors in data mapping, but often too late, after provider trust in the system has been lost.

Best practice approach: Yet again, it is important to build out a collaborative process to ensure every single data element is mapped correctly. First meetings to review data quality usually begin with a statement akin to "your system must be wrong — there's no way I am missing that many patients." This is OK. Working side by side with providers to ensure they understand where the data comes from and how to modify both workflows and calculations ensures that they are confident the reports accurately reflect the quality of care they are rendering. This confidence is a critical success factor for the eventual adoption of these population health tools in a practice.

 

Missed incentive payments under value-based reimbursement models

An integrated data asset that combines data from many sources should always add value and give meaningful insight into the patient population. A poorly mapped and validated data asset can actually compromise performance, lower incentive reimbursements, and ultimately result in a negative ROI.

Standard approach: A lackluster data validation process results in lost revenue opportunities, as the data will not accurately reflect the quality of care delivered or the risk of the patient population.

Best practice approach: Using the previously described approach to extracting, mapping, and validating data is critical for organizations that want to see a positive ROI on their population health analytics investments. Ensuring data is accurate and complete will ensure that tools represent the quality of care delivered and the risk of the patient population, maximizing reimbursement under value-based payments.

 

We worked with an ACO physician group of more than 50 physicians to assess the quality of data being fed from multiple EHRs within their system into an existing analytics platform via CCDs and pre-built feeds. Based on an assessment of 15 clinically sensitive ACO measures, we discovered that the client's reports were under-reporting on 12 of the 15 measures due to data quality alone. Measures were under-reported by an average of 28 percentage points, with the worst measure under-reported by 100 percentage points.

Reports erroneously showed that only six of the 15 measures met 2013 targets, while a manual chart audit revealed that 13 of the 15 did, indicating that data was not being captured, transported, and reported accurately. By simply addressing these data quality issues, the organization could see additional financial returns through quality incentive reimbursements as well as a reduced need for labor-intensive chart audits.
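
For clarity, "percentage points" here means the absolute gap between the audited performance rate and the rate the reports showed. A quick illustration with made-up numbers, not the group's actual figures:

  # Illustrative only: under-reporting in percentage points is the audited
  # rate minus the rate the CCD-fed reports showed.
  reported_rate = 0.55   # hypothetical rate shown by the reports
  audited_rate = 0.83    # hypothetical rate found by manual chart audit
  gap = (audited_rate - reported_rate) * 100
  print(f"Under-reported by {gap:.0f} percentage points")   # 28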

As the industry continues to shift toward value-based payment models, the need for an enterprise data asset that accurately reflects the health and quality of care delivered to a patient population is increasingly crucial for financial success. Providers have suffered enough with drops in efficiency since going live on EHRs. Asking them to make additional significant changes in their daily workflows to make another analytics tool work is not often realistic.

Analytics vendors need to meet providers where they are to add real value to their organizations. Working with providers and care teams, not only to validate the integrity of the data but to instill trust and give them the confidence they need to adopt these analytics tools into their everyday workflows, is extremely valuable and often overlooked. These critical steps allow providers to begin driving population-based care and quality improvement in their practices, positioning them for success in the new era of healthcare.

Greg Chittim is senior director of Arcadia Healthcare Solutions of Burlington, MA.

CMIO Rant with … Dr. Andy

April 9, 2014 Readers Write 5 Comments

CMIO Rant with … gives CMIOs a place to air their thoughts or gripes. Yours are welcome.

The Great Prescription Pad Race
By Andy Spooner, MD


Which is more usable: a prescription pad or a computer?

That’s a no-brainer. For writing a prescription, the pad wins, hands down. Consider its features:

  • Instant-on. No booting up. Just reach in your pocket and you are ready to go.
  • Compact, lightweight. Did I mention your pocket?
  • Self-documenting. No need to print a summary describing the prescription.
  • No irritating pop-ups with irrelevant alerts.
  • Patient-centered. The pharmacist can fill in missing information (liquid or tablet or capsule? brand or generic?) based on patient preferences.
  • Flexible. Can be taken to any pharmacy. No need to route it to a specific place, or even to ask the patient about a preferred pharmacy.
  • Streamlined. No need to worry about pharmacy benefit management rules. The pharmacist can sort all that stuff out.
  • Information-persistent. If the family has a question about an apparently erroneous prescription, they can read the details right off the prescription when talking to the after-hours nurse.
  • No record-keeping clutter. Patients can just tell us about their prescriptions next time we see them. They could just bring in the bottle or something.

With all of these advantages, surely only the geekiest of pencil-necked CMIOs would advocate an electronic method of prescribing, right?

Of course not.

The prescription pad is easier only if we define the work as the minimum possible activity that a doctor can do to get a prescription into a patient’s hands. The truth is, we are not done with the task of prescribing when we hand the slip of paper to the patient. If we think we are, then the pad seems far easier to use—more usable—than any electronic health record or e-prescribing system.

The above competition is absurd, of course, in an era when, according to the CDC's National Ambulatory Medical Care Survey, over 80 percent of office-based physicians used electronic prescribing in 2013, up from less than 60 percent three years earlier. E-prescribing is here to stay.

But we still hear about how unusable electronic medical record systems are. In The Atlantic this month, we read that a doctor who sees 14 patients a day spends “1-3 hours” each day entering orders. Assuming that each patient needs some orders for health maintenance (screening lab work), prescription renewals, and maybe a few diagnostic tests and referrals, it’s hard to take that statistic seriously. It’s clear that the writer is irritated at his EMR, and there may be some legitimate design or implementation issues with it. But 1-3 hours of ordering per day? C’mon.

Somewhere between the slapdash paper prescription and the three hours of daily ordering is the truth. Managing clinical information takes some amount of time, and some of it should be done directly by physicians. Some of this activity serves a “compliance” goal that you may not like, but all of it is a part of building a system of healthcare that serves a worthy goal.

If we insist that all clicks are wasted time, then we can’t have a conversation about usability, because under the prescription pad scenario, the only usable computer is one you don’t have to use at all.

On the other hand, if we insist that our current systems are bad because of hyperbolic, data-free assertions about how the EMR is making our lives miserable, we are similarly blocked from making productive plans to improve usability because, well, it’s just too darn much fun to complain.

My thesis, then, is that EMR usability is not as much about design as about expectations. When different stakeholders hold different expectations, conversations about what it means to have an EMR that's easy to use become unproductive, or never happen at all.

All I know for sure as a CMIO is that physicians want all of this stuff to be easier to use. We also want these systems to read our minds, but that’s at least a couple of versions away, if I am understanding the vendor presentations at HIMSS correctly.


Andy Spooner, MD, MS, FAAP is CMIO at Cincinnati Children’s Hospital Medical Center. A general pediatrician, he practices hospital medicine when he’s not enjoying the work involved in keeping the integrated electronic health record system useful for the pediatric specialists, primary care providers, and other child health professionals in Cincy.

Readers Write: Advanced Interoperability: Leveraging Technology to Improve Patient Lives and Provider Workflows

April 2, 2014 Readers Write 1 Comment

Advanced Interoperability: Leveraging Technology to Improve Patient Lives and Provider Workflows
By Justin Dearborn


There’s an increasing need for all of healthcare to be integrated in its approach to accessing, sharing, and storing information. It’s not just patients who could stand to benefit from more advanced interoperability. It’s also healthcare providers who want to meet legislative requirements such as Meaningful Use Stage 2 and Stage 3, as well as reduce costs and improve care quality.

Consider what typically happens in today’s medical imaging environment—often partway between a traditional manual environment and a fully interoperable one—when a patient presents to his primary care physician (PCP) complaining of shoulder pain, for example:

After receiving a comprehensive clinical exam, a patient named Dave heads home with a hand-scribbled order for a shoulder MRI. Before the exam can take place, however, the imaging center must get the order pre-certified by Dave’s health insurer. After receiving the insurer’s faxed approval days later, the imaging center schedules the patient for his exam. Days after that, the radiologist faxes his report to the PCP, who then calls Dave to set another appointment to discuss his torn rotator cuff. Once the decision to seek surgical treatment is made, Dave is asked to bring a CD of his radiology images to the orthopedic specialist.

If this process sounds cumbersome, time consuming, and inefficient, that’s because it is. It’s also the rule with respect to today’s medical imaging processes.

While it's true that somewhere between 10 and 20 percent of imaging orders issued today are processed electronically, that still means the vast majority are processed manually via paper and/or fax. According to the Centers for Disease Control and Prevention (CDC), approximately 12 percent of all PCP visits result in a referral for diagnostic imaging—some 44 million imaging exams each year—which equates to a lot of wasted time and paper, not to mention money.

The payer-approval process only adds to that burden. Roughly 122 million imaging exams are processed manually by radiology benefits management companies each year, at a cost of about $75 per exam. That adds up to more than $9 billion of waste a year.

So the question is this: What would happen in an environment of advanced interoperability, where existing electronic health records (EHR) and other technologies are fully leveraged? Take Dave’s scenario again:

After receiving a comprehensive clinical exam, Dave's PCP electronically orders a shoulder MRI and schedules an imaging appointment for later in the day. Before the exam takes place, the imaging center receives electronic pre-certification. Once the MRI is complete, the PCP is automatically alerted that an image-enabled report is available. Before he leaves his office for the evening, the PCP calls Dave to discuss his torn rotator cuff and to electronically refer him to an orthopedic specialist who already has secure automated access to the image-enabled radiology report.

As this simple scenario illustrates, the entire patient-imaging process can be streamlined by enabling five key services: 1) electronic referrals and ordering; 2) automated pre-certification and approval using clinical decision support; 3) electronic patient navigation and scheduling; 4) image-enabled reporting; and 5) analytics.
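
Purely as an illustration, the five services can be thought of as stages in a single pipeline that each annotate the order as it moves along. A minimal sketch in Python; every function name and field is hypothetical rather than any vendor's actual API:

  # Hypothetical sketch of the five services chained as one pipeline.
  def electronic_order(order):
      order["status"] = "ordered"              # 1) electronic referral and ordering
      return order

  def pre_certify(order):
      # 2) automated pre-certification using clinical decision support rules
      order["pre_certified"] = order["indication"] == "suspected rotator cuff tear"
      return order

  def schedule(order):
      order["appointment"] = "today 4:00 PM"   # 3) patient navigation and scheduling
      return order

  def image_enabled_report(order):
      order["report"] = "report with embedded images"  # 4) image-enabled reporting
      return order

  def record_analytics(order):
      print("analytics:", order["exam"], "completed same day")  # 5) analytics
      return order

  order = {"exam": "shoulder MRI", "indication": "suspected rotator cuff tear"}
  for stage in (electronic_order, pre_certify, schedule, image_enabled_report, record_analytics):
      order = stage(order)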

Such advanced interoperability provides Dave, his PCP, and his orthopedic specialist with near-instantaneous exam ordering, approval, and scheduling. Ease of access to reports, results, and images is dramatically increased.

By creatively leveraging EHRs and other technologies, healthcare organizations can maximize their interoperability with internal and external providers. All these services, moreover, can be provided without costly point-to-point HL7 interfaces.

With payment reform, it is clear that the days of disjointed, manual image processing are numbered. Indeed, advanced interoperability like that described here not only addresses the challenges that impact physicians, but also pays handsome dividends for patient care.

Justin Dearborn is CEO of Merge Healthcare of Chicago, IL.

Readers Write: Competing for Talent in Healthcare IT – Remember, Candidates are Interviewing You, Too

April 2, 2014 Readers Write No Comments

Competing for Talent in Healthcare IT – Remember, Candidates are Interviewing You, Too
By Mike Silverstein


The healthcare IT market is as hot and competitive as ever and the battle for the industry’s top talent is on. If your firm has gone through any recent hiring waves, I probably don’t have to tell you that the buyer’s market of 2008-2012 is over. Strong candidates, regardless of specialty, who have good work records, great performance reviews, and above-average soft skills are being flooded with lucrative and enticing opportunities as soon as they dip their toe in the market.

These concrete, actionable items can help win over a candidate who is evaluating offers from multiple firms.


Tighten Up the Recruitment Process

Companies are making decisions and hiring faster than I have experienced in the past five years. In order to be competitive, make sure your process is swift and efficient and that the proper decision makers are involved. Nothing kills the chances of landing a great candidate faster than being unable to get time on a hiring manager's calendar for weeks. Talent is the lifeblood of an organization. Make sure managers block off appropriate time on their calendars so they do not become the bottleneck that kills the process.

Also, make sure there is a good rhythm between calls with the candidate. If the last phone interview was two weeks ago, don’t expect the candidate to be as excited about your job as they were 12 hours after their initial call.


Make Sure Messaging is Consistent

Nothing spooks a candidate more than hearing different things about a position from different people. Make sure everyone involved in the recruitment process understands the reason you are hiring for the position, who it reports to, what the time frame is, and what is expected from this individual. If anyone is going to bring up the compensation associated with the position, make sure it's consistent with what your HR team and your recruiter are saying.


Present the Company in the Best Light

From a convenient travel itinerary (even if it costs a few extra bucks) to having a “Welcome Joe Smith” sign on the door, it is important to pay attention to the details. Have a well-organized itinerary of meetings with the hiring team. Schedule a meeting with those same executives within 48 hours to make a go/no-go decision. 

Be prepared to produce a succinct written offer within 24 hours of that decision, including a comprehensive benefits summary and an explanation of compensation: (competitive) salary, bonus, and equity. Include a breakdown of how to earn 100 percent of the bonus.

Virtually none of this advice will cost you any more money. It is all about making the candidate recruitment experience more attractive and enjoyable. 

Mike Silverstein is partner and director of healthcare IT at Direct Consulting Associates of Solon, OH.

Readers Write: Below the Waterline: Is Your Network Population-Health Ready?

April 2, 2014 Readers Write No Comments

Below the Waterline: Is Your Network Population-Health Ready?
By Nancy Ham


Historically, health information exchange (HIE) implied the tactical, the plumbing and pipes that enable movement of health-related information among organizations according to national standards. Today an HIE network is a strategic asset vital to population health management.

Health organizations must supply more than bricks and mortar as our industry moves from what was once a conceptual model of healthcare to reality. They must provide a network solution for powering appropriate population health management capabilities.

HIE capabilities are evolving. Existing competencies are being coupled with workflow and care management processes that are essential for analyzing and managing populations of patients, a shift from the traditional retrospective model of care to real-time, preventive care. Today's care management needs to be informed and powered by high-quality, real-time discrete data from myriad sources across the continuum of care.

To affect population health, the entire healthcare ecosystem from acute to ambulatory to long-term and beyond needs to be connected, moving beyond the traditional reach and capabilities that current health information exchanges offer.

We're all familiar with the phrase "the tip of the iceberg." The tip of the iceberg is visible. It glints and shines. This iceberg principle applies nicely to many population health management solutions with flashy dashboards and snazzy visualization methods. They look really good on the surface, but what matters is what lies beneath the waterline. Is the foundation, the data asset from which the analysis is conducted, a solid one?

Before adding population health visualizations, ensure that your foundation is complete. Ask yourself the questions below (a simple self-assessment sketch follows the list):

  • Are your patient records correctly and accurately matched?
  • Do you have a sophisticated privacy and security infrastructure?
  • Do you have pointers established to access clinical data regardless of where it exists?
  • Can you manage granular patient consent?
  • Do you have a sophisticated mechanism for driving role-based access, including new network participants such as payers?
  • Is your solution able to scale to bring more and more participants into your network?
  • Does your system represent the entire healthcare community across care settings?
  • Are your referrals managed and communicated among providers?
  • Do you have alerts to notify providers when a patient experiences a health event so they can make informed and timely decisions for that patient’s care?
  • Are your EHR interfaces bi-directional?
  • Do you have patient engagement tools such as patient portals and personal health records?
  • Can you aggregate claims and billing data in conjunction with clinical data?
  • Are you using data standardization methods to furnish mineable data?
  • Can you share patient care plans?
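
Purely for illustration, the checklist can be turned into a simple self-assessment. The capability names below paraphrase a subset of the list above, and the values are an example:

  # Illustrative self-assessment of a subset of the checklist above.
  readiness = {
      "accurate patient matching": True,
      "privacy and security infrastructure": True,
      "record locator pointers": False,
      "granular patient consent": False,
      "role-based access, including payers": True,
      "bi-directional EHR interfaces": False,
      "claims aggregated with clinical data": False,
      "shareable care plans": True,
  }

  gaps = [capability for capability, ready in readiness.items() if not ready]
  print(f"{len(readiness) - len(gaps)} of {len(readiness)} foundation capabilities in place")
  for capability in gaps:
      print("below the waterline:", capability)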

Your HIE should do all of this. Your HIE partner should have a track record of linking hospitals with the entire community of providers.

When you have a sophisticated HIE network to enable clinicians to manage their patient population, you have a scalable foundation for improving the quality and cost of care. The foundation is key. From there you can snap on population health analytics solutions, whether from your HIE vendor or from one or more third-party vendors. Now you have evolved your HIE to a strategic network, curating the data flowing through the network to provide contextual, real-time information that engages both clinicians and patients.

Nancy Ham is CEO of Medicity of Salt Lake City, UT.

Readers Write: Doctor’s Day and HIMSS

March 31, 2014 Readers Write 5 Comments

Doctor’s Day and HIMSS
By Dr. Wellbeing

On Doctor’s Day, I am reflecting on the morale of my profession. I hope to get away with being the kid who says, "The emperor has no clothes."

I looked long and hard for inspirational writings about doctors, particularly on this day, and there were none. Maybe the closest to a worthy read was Malcolm Gladwell’s article in Forbes about the state of American healthcare, followed by his interview called, "Tell People What It’s Really Like to Be a Doctor." It came as a surprise considering he is not an insider, but he is one of the most insightful thinkers out there. As such, I was happy to see a voice in defense of doctors.

Alas, he is the exception that proves the rule, because it seems doctors are blamed for everything that is wrong with the healthcare system these days.

CMS is making physician payment data public. Insurance companies are placing doctors in straightjackets and requiring pages of paperwork just to get one medication approved. Patients have unrealistic expectations of, "I can hardly wait to chat with my doctor at my leisure online."

No wonder the morale of doctors is at an all-time low. It is the equivalent of a big bully chasing the weak kid around the playground because he is small and cannot fend for himself. Physicians historically have not been well organized, and their representatives, such as the AMA, abandoned them a long time ago.

I just returned from my first HIMSS conference, where everybody was busy giving advice on how practices should get ready for ACOs, MU, ICD-10, PCMH, population health etc. I am thinking that maybe the ones who should have their data made public are the HIMSS folks themselves, since just to exhibit there for one week costs more than my salary for a year.

I am not returning anytime soon. Even though I was warned by close friends that HIMSS is physician-unfriendly, I was not prepared for the CEO and chairman to open the conference by insulting physicians after asking them to stand up in a full room. One IT person who is equally disenchanted with HIMSS and vows not to return suggested that we invite "60 Minutes" to one of those conferences. Or maybe some patients, for that matter.

The healthcare IT industry has become a bit like the tail that wags the dog. It has lost its sense of purpose and meaning. One of my favorite lines is from "Jurassic Park": "Just because we could does not mean we should."

One vendor was shocked to find out that I am not paid for “population health.” Another one dressed in a white lab coat could not explain to me who makes the ultimate decision in telemedicine (one genie that left the bottle) when the patient with congestive heart failure whom we try to keep out of the hospital cannot breathe. Who calls 911 — the patient, his doctor, or the “doc in the box?” Nor could he answer whose liability that is. I have yet to see a tele-intubation.

If we are to have some foresight, then maybe we should have some hindsight as well and understand how we got here in the first place. The sickest patients, the elderly, the chronically ill, and nursing home residents are not technology savvy. Nursing homes have been so beaten down by litigation that the fear of being sued is ingrained into their psyche; they cannot even give a Tylenol or a laxative without a doctor's approval. That is why every decision in healthcare goes through a doctor's office whether we like it or not, why my inbox is full every day, and why I have endless hours of mind-numbing work.

Another EMR vendor could not articulate why a hospital should buy its care coordination solution when hospitals are not being paid for care coordination. Little do they know that very few hospitals actually participate in ACOs or plan to do so. Their solutions rest on the assumption that we will all be in an ACO one day, an experiment that has yet to show results, and they ignore the fact that 60 percent of doctors have expressed no desire to join one.

The whole premise of the HIMSS technology offerings relies very heavily on the Assume a Can Opener theory. It reflects very little understanding of how healthcare is delivered and paid for, which I found particularly disturbing.

The shortage of physicians and their burnout are real. On this day, let us all remember that what we are really paying docs to do is make decisions. That is among the most complex of human endeavors, yet it is the most intruded upon, most regulated, and most distrusted of them all. In Greek mythology, the Hydra grew two heads each time one was cut off, just as healthcare grows more complex by the day. Yet the ones who should be consulted first seem to be consulted last.

It seems the harder we try to fix the "broken American healthcare system," the farther we are from fixing it. The recent proof is the ineptitude of Congress, which is not capable of understanding the complexity of what it votes on and always approves short-term fixes to long-term problems. The sad part is not that fee-for-service is broken, or that HMO / capitation / ACO models are better, or that they kicked the can down the road on the SGR for the umpteenth time, but that we are forced to practice in a dual, ambivalent environment where both are equally approved.

While hospitals are paid by DRG, doctors are paid per day. In Medicare Advantage plans, primary care gets paid capitation while specialists get paid fee-for-service. We have created a de facto "divide and conquer" system.

Too often the term "physician alignment" is used along with "patient engagement" and "accountable care." How about if, for one day a year (March 30), we use "patient alignment," "patient accountability," and "physician engagement"? Because returning from HIMSS, my EMR inbox was full of messages like, "Can the doctor write me a note for early dinner sitting on my cruise since I am a diabetic?" or "Can the doctor write me a note for the airline to allow me to take my dog with me on the plane since I suffer from anxiety?" or "What kind of fiber should I take?"

I do believe strongly that patients should have access to their records and that healthcare records should be digitized, but I do not subscribe to the assumption that this will resolve our healthcare woes or lower healthcare costs. Just giving patients their data does not mean that they will know what to do with it, nor that they will make wiser decisions. Giving them an app does not mean that they will start eating spinach.

The same is true of physicians. Inundating them with data does not mean they will make better decisions, either. We are in essence data-rich but information-poor.

While I enjoyed networking and meeting people, it was difficult to separate the signal from the noise. I saw what billionaires look like, from Judy Faulkner to Dr. Patrick Soon-Shiong, who was gracious and took a picture with me so I can show my kids that the richest man in LA is a physician and not a movie mogul. But I cannot ignore the fact that not all hospitals can afford Epic and that so many have had their credit ratings downgraded due to Epic implementation budget overruns, nor the fact that the orange-clad NantHealth staffers had no idea what their company was about. I also had the opportunity to explain to some why social media is a legal minefield for many physicians and why many MDs shy away and do not want to live in the town where they practice.

One economist said that when the barn is on fire, the farmer by instinct goes inside and saves the cat, the dog, and the livestock and leaves the rest behind. I believe it is time for the healthcare IT industry to do the same. If healthcare is on the verge of a cliff, then healthcare IT can either throw us a rope or give us the final push.

I keep hoping that someday the pendulum will swing back to doctors being respected and trusted and that they themselves rediscover the meaning and calling of their profession amid all the chaos. I should have faith because Mr. H told me so.

Readers Write: What Is Population Health Management, Exactly?

March 12, 2014 Readers Write 1 Comment

What Is Population Health Management, Exactly?
By Steven Merahn, MD


While at HIMSS, I stopped by the KLAS booth and ended up revisiting the October 2013 KLAS report on population health management. I was both impressed and concerned about its findings. Impressed because of the level of market commitment to population health-related solutions, but concerned because I still don’t think the market gets it when it comes to population health management.

The real power of population health is the opportunity it offers those delivering care to disintermediate those we now call payers — removing or disrupting a layer that insulates patients from their providers — or at least to put physicians and provider networks in a position of strength in negotiations with those contracting for care. (Unfortunately, it also puts hospitals at risk of similar disruption, like what happened to the railroads when airline travel began to get traction.)

HIMSS was full of vendors hawking analytics and care management platforms, but population health is really not at its heart a technology play. In the executive summary of the KLAS report, author Mark Wagner tried to address this issue when he said, "…automation is a misnomer for vendor solutions and PHM remains largely a manual process."

However, the use of the phrase manual process is itself a misnomer. It presumes that automation is even possible for population health management. Elements of a technology stack can enable (and may be necessary for) population management, but these elements – individually and collectively – are wholly insufficient for successful implementation of a population management infrastructure.

Wagner again alludes to this in his reference to the value of "collaborating with physicians early," but there's more to this than simply physician engagement. It's far more fundamental, as physician leaders, provider networks, and healthcare delivery systems are discovering. In successful population management, the databases, software analytics, and care planning platforms — whether EHR-based or independent but interoperable — are largely subordinate to a more dominant factor: the human factor.

If there's one thing that has been consistently affirmed to me in the 30+ years since medical school graduation, it is that health and healing are impossible without the human connection. I submit that the value in value-based care – improving quality of care and quality of health based on more efficient use of effective healthcare resources across a cohort or defined population – is more powerfully achieved through reconsideration of the organizational principles and operating relationships among the people, programs, platforms, and partners that comprise healthcare delivery and care management.

Population health management transcends the technological elements that may fulfill some of its specific functional requirements. Products, services, and channels may be necessary, but they are insufficient to truly influence the trajectory and quality of a person's health. That influence occurs at more tactile and emotive levels in people's lives: "tactile" referring to the responsiveness, reliability, consistency, and convenience of care; "emotive" referring to the sincerity, authenticity, integrity, and dignity associated with the experience.

I am reminded here of Dr. Lipton, our family physician in the 1950s and 1960s. For him, what we now call population health was just the way he practiced medicine. If my grandfather, who had his first heart attack in his mid-30s, missed his quarterly blood pressure check, we would get a call. After my grandmother's sigmoidoscopy — then done in an operating room as an inpatient — he stopped by the house.

His technology for this: the work of worry — and a weekly index card tickler file. But despite what would seem to us some technological limitations, time and time again he demonstrated to us that we were very present for him even when we were absent from his waiting room.

He did get paid in cash for services rendered, on a fee schedule and sliding scale, but he also worked to earn our trust. There was no doubt that this was an important form of compensation for him. His value proposition was threefold: mastery of his craft, demonstrable commitment, and genuine consideration. As such, his responsibilities for our health extended beyond the doors of his office.

For our family, he provided comfort and a safe harbor, despite some looming health threats, because there was a person, and not just a person but an expert, who worried along with us. That was in many ways a more powerful influencer of our healthcare quality than the medicines he prescribed. His recommendations were followed, even when there was intellectual resistance, because we could not imagine letting him down.

Our current approach to technology is focused on "managing measurable variables," but the real challenge is that quality of health is based on a different set of variables than quality of care. Our technology may allow us to identify and attempt to control dozens of evidence-based clinical factors, but it is still not powered by the factors that represent the capacity to influence a patient in ways that truly matter.

Which means that if we truly want to transform care delivery with technology, we need to shift our focus from the meaning of the data to what we mean to each other.

Healthcare technologies should be instruments of human expression in service of health and healing, with a fundamental mission to provide the patient and their family the same sense of comfort, safety, and reliability provided by the Dr. Liptons of the world – where professionals are valued for their commitment to mastery and human service and patients are helped to find the meaning of health in the context of their relationship with themselves and others.

This will require us to reconsider what we mean by population health: designing systems of care that amplify the humanness in care delivery, where technology supports goal-directed collaboration between humans and machines, and where people are allowed to find meaning and value within themselves and from each other.

Steven Merahn, MD is senior vice president and director of the Center for Population Health Management at Clinovations of Washington, DC.

Readers Write: Why a Unique Patient Identifier is Critical to Improve Patient Matching

March 12, 2014 Readers Write 4 Comments

Why a Unique Patient Identifier is Critical to Improve Patient Matching
By Barry Hieb, MD

image

In a recent HIStalk article entitled “National Patient Identifier: Why Patient Matching Technology May Be a Better Solution,” Vicki Wheatley argues that, “… healthcare organizations should instead focus on strengthening their existing enterprise matching strategies” rather than work to implement a national patient identifier (NPI). The article makes several valid points that contribute to the ongoing debate about an NPI:

  • No solution, including an NPI, can solve all patient matching problems.
  • Patient matching errors and healthcare fraud will continue to require special attention.
  • Accurate tracking of an individual’s information across healthcare silos is becoming increasingly important.
  • Any proposed patient matching solutions must not negatively influence privacy, security, or clinical outcomes.
  • Accurate patient matching is essential for activities ranging from clinical care to healthcare analytics to population health management.

In these and several other areas, Ms. Wheatley’s article makes a valid contribution to the ongoing debate concerning a national unique patient identifier.

There are a few areas, however, where we have a somewhat different viewpoint. The first is the implied assumption that healthcare organizations must choose between having an EMPI and having a national patient identifier. We believe that this is a false dichotomy.

Clearly, healthcare organizations must continue to improve their existing EMPI systems as much as possible. However, years of analysis and experience indicate that this will not allow them to achieve the levels of patient matching accuracy that are being required going forward. Those requirements include identification of individuals across disparate healthcare systems, the need for matching against ever-increasing patient populations, and the fact that patient demographic data has known variability and ambiguities.

These represent just three of the reasons why unassisted EMPI demographic matching cannot represent the sole patient matching strategy. Rather, the EMPI approach will need to be supplemented by techniques such as the use of an NPI, biometrics, digital certificates, and other technologies.

Virtually every EMPI system uses a patient's Social Security number as a data element to improve the performance of its demographic matching algorithm. I was puzzled by the statement, "… even in theory, every single potential patient in the country would need to be assigned one…" as a condition for an NPI to work. Ms. Wheatley acknowledges that there are many people in the US who require healthcare but do not have an SSN. Despite this gap, the use of the SSN clearly adds value in those situations where it is accurately available. Similarly, an NPI would benefit each patient who chooses to use one.

An important point to keep in mind is that there is no mechanism to check for data entry errors in most of the data elements currently used for demographic matching. This includes the SSN, names, and addresses. For example, there is no reliable way to detect transposition of digits when a SSN is manually entered. Nor is there an easy way to automate the capture of a patient’s SSN.

Contrast that with a well-designed national patient identifier system. In most situations, the NPI would be read using automated technology such as a barcode reader or a smart chip, which would virtually eliminate entry errors. Even when the NPI is typed in manually, embedded check digits ensure that common data entry errors, such as a single mistyped digit or transposed adjacent digits, are immediately detected and the operator is prompted to re-enter the NPI. When added to a person's demographic profile, the NPI thus becomes the single demographic element that can lead to accurate patient identification on its own. These proposals represent a major advance over the current situation, i.e., an error rate of 8 percent or more in EMPI matches.
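
The article does not name the check digit scheme, but the Luhn mod-10 algorithm, the same scheme used in payment card numbers and in the existing ten-digit National Provider Identifier issued to providers, illustrates the idea: it detects every single-digit typo and all adjacent transpositions except 09/90. A minimal sketch in Python:

  # Minimal Luhn check digit sketch. Illustrative only; a real national
  # patient identifier scheme would be specified by the issuing authority.
  def luhn_check_digit(payload: str) -> int:
      """Compute the mod-10 check digit for a string of digits."""
      total = 0
      # Double every second digit, starting from the rightmost payload digit
      for i, ch in enumerate(reversed(payload)):
          d = int(ch)
          if i % 2 == 0:
              d *= 2
              if d > 9:
                  d -= 9
          total += d
      return (10 - total % 10) % 10

  def is_valid(identifier: str) -> bool:
      """Validate an identifier whose final digit is its Luhn check digit."""
      return identifier[-1].isdigit() and luhn_check_digit(identifier[:-1]) == int(identifier[-1])

  base = "123456789"                        # hypothetical 9-digit identifier
  nid = base + str(luhn_check_digit(base))  # append check digit -> "1234567897"
  print(is_valid(nid))                      # True
  print(is_valid("21" + nid[2:]))           # False: transposed digits caught at entry

A check like this lets a registration desk reject a mistyped identifier on the spot, before a wrong-patient match can occur.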

It is very clear that healthcare organizations will continue their use of EMPI systems for the foreseeable future. That fact, however, should not blind us to the reality that these EMPI systems need to be augmented by additional capabilities going forward if they are going to meet the patient matching accuracy needs that are emerging in healthcare.

The use of a national patient identifier, even if it is initially only chosen by a subset of providers (or patients, on a voluntary basis), will enhance the patient matching accuracy for those patients and help avoid the medical errors that are associated with patient matching errors.

Barry Hieb, MD is chief scientist with Global Patient Identifiers, Inc. of Tucson, AZ.

Readers Write: The Data Problem

March 5, 2014 Readers Write No Comments

The Data Problem
By Randy Thomas


Dr. Jayne asked important questions in her Curbside Consult about big data, EHR conversions, the "sheer magnitude of bad data out there," and how best to ensure the integrity of health data.

The best way to address the issue of bad data is to follow the old adage, "Begin with the end in mind." Implementing an enterprise-wide EHR is a massive, complex undertaking. It involves considering the needs of many stakeholders when defining the build requirements. For example, workflow must support ease of use and not interfere with patient care delivery and related work processes. Furthermore, many implementation decisions focus on driving clinician adoption to ensure that both quality and efficiency objectives are met (not to mention regulatory requirements related to Meaningful Use).

With all the multi-threaded work streams and decision processes involved in planning and executing an EHR implementation, the re-usability of captured data frequently falls out of scope. That leads to the bad data problem.

Re-usability means using data captured in any source system (EHR, ADT, materials management, patient accounting, registration, operating room, emergency department, etc.) for reporting, measurement, and analytics. Re-using the data captured in these source systems accelerates the value realized from implementing such systems and supports a virtuous cycle of performance improvement across an enterprise.

It all relates to, “You can’t manage what you don’t measure.” That is, you can’t measure something if you don’t have the right data. This leads back to the decisions made in implementing EHRs and other systems. You need to start with what data is required to measure and analyze what’s important to the organization and ensure that data can be consistently, reliably, and accurately captured at the point of origin (e.g., at registration or in the care process).

It’s not realistic, however, to expect that every bit of data about a patient should be captured in a discrete form for re-use. What’s required is a balance between supporting ease of use in the appropriate workflows and the availability of data for reusability.

An effective way to strike this balance is to create a list of data elements the organization agrees are necessary for analytics. Some detective work is required: tracing the journey of that data back to the source system and ensuring that each data element is captured as expected in the intended workflow. This requires collaboration across a multi-disciplinary team — one involving experts in quality reporting, data analysis, and clinical (or operational) workflow.

The inventory of data elements can be used to identify where each data element can be captured in the source system (e.g., EHR, ADT, etc.). This is the “data chain of trust.” Team discussion and compromise are required to design workflows that both support ease of use and capture data reliably and consistently.

With a documented inventory of data elements married to how that data will be captured in the source systems, data can start flowing into an analytics environment. Applying sound data governance principles and implementing a data profiling discipline will ensure data consistency and reliability.
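
As an illustration of that chain of trust, here is a minimal sketch of a data element inventory in Python; the elements, source systems, and capture points are hypothetical:

  from dataclasses import dataclass

  @dataclass
  class DataElement:
      """One row in a hypothetical data element inventory."""
      name: str                  # what the organization agreed to measure
      source_system: str         # where it originates (EHR, ADT, registration, ...)
      capture_point: str         # the workflow step where it is recorded
      captured_discretely: bool  # structured field rather than free text
      validated: bool            # confirmed to flow intact into analytics

  inventory = [
      DataElement("smoking status", "EHR", "nursing intake assessment", True, True),
      DataElement("discharge disposition", "ADT", "discharge processing", True, False),
      DataElement("primary payer", "patient accounting", "registration", True, True),
  ]

  # Flag elements whose chain of trust is not yet verified end to end
  for e in inventory:
      if not (e.captured_discretely and e.validated):
          print(f"Review needed: {e.name} from {e.source_system}")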

Organizations don’t have to begin with a large set of discrete data, but they must recognize that any level of measurement, reporting, and analytics requires consistent, reliable, accurate data starting at the point of capture in the source systems. They should begin with the data most important to each organization and ensure that data can flow from origin to analytics in a chain of trust that is known and transparent.

From there, health systems can incrementally increase the available data as they come to understand why it’s important to capture data discretely and accurately and as more stakeholders benefit from access to that data. With the increasing value realized comes the understanding that, “It’s all about the data.”

Randy Thomas is associate partner of health analytics with Encore Health Resources.
