
June 16, 2015

Defining Our Terms: Does Anyone Know What an "Open EHR" Really Is?
By Dean F. Sittig, PhD and Adam Wright, PhD


Adapted from “What makes an EHR ‘open’ or interoperable?” J Am Med Inform Assoc 2015. Available at: http://jamia.oxfordjournals.org/content/early/2015/06/13/jamia.ocv060.

There’s been a lot of talk lately about “open” EHRs, ranging from Congressional hearings to industry buzz. Last summer, Mr. H challenged his readers with, “What core set of published standards or capabilities must a given EHR support to be considered open?” We thought this was a great question, so we decided to give it a try.

First, “open” does not mean “open source.” Although open source software is of great value, an EHR can certainly be open without being open source.

We’ve also noticed that some commentators equate open with the platform the software is built on, and specifically, that systems that use relational databases and support SQL (structured query language) are inherently more open than those that use hierarchical databases (e.g., Cache). We think this is a distraction, too – you can make closed systems on SQL or open systems on Cache.

Regardless of the database technology (relational, hierarchical, object-oriented), data exchange with another application requires significant effort to transform the data into an agreed-upon format with agreed-upon meaning. This transformation must take into account the data’s syntax (the format), semantics (the meaning), and pragmatics (the way the data are used in context to create a meaningful clinical application). The internal representation of the data, in either the sending or receiving EHR, is largely immaterial.
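As a sketch of the three layers described above (field names and the local-to-LOINC mapping below are hypothetical, not drawn from any particular EHR), a single lab result might be transformed like this:

```python
# Sketch of the three transformation layers for a single lab result.

# Semantics: map the sending system's local code to a shared vocabulary.
LOCAL_TO_LOINC = {"GLU-SER": "2345-7"}  # hypothetical local code for serum glucose

def transform_result(raw):
    """Convert a sender's internal record into an agreed-upon format."""
    # Syntax: restructure the sender's fields into the shared layout.
    result = {
        "code": LOCAL_TO_LOINC[raw["local_code"]],
        "value": raw["val"],
        "units": raw["units"],
    }
    # Pragmatics: add the context a receiving application needs to act,
    # e.g., an abnormal flag derived from the sender's reference range.
    low, high = raw["ref_range"]
    result["abnormal"] = not (low <= raw["val"] <= high)
    return result

transformed = transform_result(
    {"local_code": "GLU-SER", "val": 142, "units": "mg/dL", "ref_range": (70, 100)}
)
```

The point of the sketch is that none of these steps depend on whether the sender stores the record in a relational or hierarchical database.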

We decided to organize our definition of open around five use cases, which we refer to as the EXTREME criteria (short for EXtract, TRansmit, Exchange, Move, Embed):

EXTREME Use Cases

An organization can securely extract patient records while maintaining granularity of structured data.

  • Secure login and role-based access controls.
  • Structured data importable programmatically into another database (unstructured formats, such as PDF, do not suffice).
  • Audits of extracted records.
  • Sufficient metadata included in the extract to ensure interpretability, e.g., units and normal ranges for lab results.
  • Freely available data dictionary that indicates where data are stored and what they mean.
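A minimal sketch of what these extract criteria imply in practice (the record layout, roles, and field names are illustrative only): each value travels with the metadata a receiving database needs, access is role-restricted, and every extraction is audited.

```python
# Sketch of a structured, auditable patient-record extract.
import json
from datetime import datetime, timezone

AUDIT_LOG = []

def extract_patient(patient_id, records, user_role):
    # Role-based access control: only authorized roles may extract.
    if user_role not in {"data_analyst", "hie_admin"}:
        raise PermissionError("role not authorized to extract records")
    # Audit every extraction.
    AUDIT_LOG.append({
        "patient": patient_id,
        "role": user_role,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    # Structured output (JSON) importable programmatically -- not a PDF.
    return json.dumps({"patient": patient_id, "results": records[patient_id]})

records = {"p1": [{"loinc": "2345-7", "value": 95,
                   "units": "mg/dL", "ref_range": [70, 100]}]}
extract = extract_patient("p1", records, "data_analyst")
```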

An authorized user can transmit all or a portion of a patient record to another clinician who uses a different EHR or to a personal health record of the patient’s choosing without losing the existing structured data.

  • Data selection methods that allow users to identify which data to include or exclude.
  • Standard method to structure data (e.g., C-CDA) or portions thereof (e.g., DICOM, e-prescribing).
  • Standard methods used to describe the meaning of the data (i.e., a controlled clinical vocabulary). Note: conversion of structured data to an unstructured format such as PDF would not meet these requirements.

An organization in a distributed/decentralized health information exchange (HIE) can accept programmatic requests for copies of a patient record from an external EHR and return records in a standard format.

  • EHR infrastructure capable of responding to queries 24 hr/day, 7 days/week.
  • Record-locator service functionality available and in use.
  • Standard method used to structure data (e.g., C-CDA).
  • Sending EHR’s data dictionary available to receiving EHR.
  • “Internet robustness principle” respected (be liberal in what you accept and conservative in what you send).
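The HIE use case above can be sketched as a record-locator lookup followed by return of records in a standard structure. The locator table and document format below are illustrative only, not a real HIE interface:

```python
# Sketch of a distributed-HIE query handler: consult a record-locator
# service, then return only well-formed, standard-format documents
# ("be conservative in what you send").

RECORD_LOCATOR = {  # patient identifier -> facilities holding records
    "pt-001": ["hosp-a", "clinic-b"],
}

DOCUMENTS = {
    ("pt-001", "hosp-a"): {"format": "C-CDA", "sections": ["problems", "meds"]},
    ("pt-001", "clinic-b"): {"format": "C-CDA", "sections": ["labs"]},
}

def handle_query(patient_id):
    """Respond to a programmatic record request from an external EHR."""
    facilities = RECORD_LOCATOR.get(patient_id, [])
    return [DOCUMENTS[(patient_id, f)] for f in facilities
            if (patient_id, f) in DOCUMENTS]

docs = handle_query("pt-001")
```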

An organization can move all its patient records to a new EHR.

  • Standard method to structure key clinical data (e.g., laboratory results, medications, problems, admission history), such as HL7 v2.x or v3.
  • Data dictionary used to define clinical and administrative data.
  • Existing metadata (e.g., timestamps, source, and authors) exported to the new system.
  • Transaction history of data items (e.g., renewals and dose changes for a medication) preserved.

An organization can embed encapsulated functionality within its EHR using an application programming interface (API). Goals: access specific data items, manipulate them, and then store a new value.

  • External applications have “read” and “write” access to clinical and administrative data, including metadata, from the EHR (e.g., using the SMART app platform or HL7’s Fast Healthcare Interoperability Resources (FHIR) services).
  • Programmatic method to embed external applications (either code or presentation, i.e., an embedded web application, e.g., Cerner’s mPages) with which the user can interact via the EHR’s user interface without re-compiling the existing EHR’s codebase.
  • Appropriate support and maintenance to ensure that encapsulated functionality will continue to work and meet user needs following system configuration changes or upgrades.
  • HIPAA-compliant protection of newly created data items, like all other patient-related data (e.g., accessible only to authorized users and backed up with all other patient data).
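To illustrate the “write” half of this use case, here is a minimal sketch of an external app handing a newly computed value back to the EHR as a FHIR Observation. The resource shape follows FHIR’s Observation resource; the server URL in the comment is hypothetical.

```python
# Sketch: an embedded app writes a derived value (here, a BMI) back to
# the EHR through a FHIR-style API.

def build_observation(patient_id, loinc_code, value, units):
    """Assemble a FHIR Observation resource as a plain dict."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"coding": [{"system": "http://loinc.org",
                             "code": loinc_code}]},
        "subject": {"reference": f"Patient/{patient_id}"},
        "valueQuantity": {"value": value, "unit": units},
    }

obs = build_observation("123", "39156-5", 27.3, "kg/m2")  # LOINC 39156-5 = BMI

# A SMART/FHIR-embedded app would then POST this to the EHR's FHIR
# endpoint (hypothetical URL), e.g.:
#   requests.post("https://ehr.example.org/fhir/Observation", json=obs)
```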

These use cases were designed to address the needs of patients, so they can access their personal health information no matter where they receive their healthcare; clinicians, so they can provide safe and effective healthcare; researchers, so they can advance our understanding of disease and healthcare processes; administrators, so they can reduce their reliance on a single-source EHR developer; and software developers, so they can develop innovative solutions to address limitations of current EHR user interfaces and create new applications to improve the practice of medicine.

In addition to the specific features and functions required to implement these use cases, we also note that many developers limit access to their systems by requiring:

  • Special training and certification by the developer before users can extract data from the system or integrate an application.
  • Users to sign a non-disclosure agreement.
  • Users to pay an additional license fee to access data or integrate an application.
  • Customized programming that only the developer can do.
  • Access to documentation that requires special permission or additional fees.

While we understand that developers need to maintain a degree of control over access to their software for financial, security, intellectual property, and reliability reasons, we question whether a system subject to such constraints can be considered truly open.

In addition to these use cases, open EHRs should be subjected to stringent conformance testing to ensure that receiving systems are able to import and parse the structured data and store it in the appropriate location within the receiving EHR, while maintaining the metadata and transaction history from the sending system.

Widespread access to open EHRs that implement at least the five EXTREME use cases we propose is necessary if we are to realize the enormous potential of an EHR-enabled healthcare system. Healthcare delivery organizations must require these capabilities in their EHRs. EHR developers must commit to providing them. Healthcare organizations must commit to implementing and using them.

In addition to having all EHRs meet these technical requirements, we must also begin addressing the myriad socio-legal barriers (e.g., lack of a unique patient identifier, information blocking, high margin, fee-for-service clinical testing) to widespread health information exchange required to transform the modern EHR-enabled healthcare delivery system.

Dean Sittig, PhD is professor of biomedical informatics at the University of Texas Health Science Center at Houston. Adam Wright, PhD is senior scientist in the Division of General Medicine of Brigham and Women’s Hospital, a senior medical informatician with Partners HealthCare, and assistant professor of medicine at Harvard Medical School.


June 10, 2015

The Learning Healthcare System Starts with the Vendor-Neutral Archive
By Larry Sitka


The Office of the National Coordinator for Health Information Technology, commonly referred to as ONC, recently released “Connecting Health and Care for the Nation, A Shared Nationwide Interoperability Roadmap (DRAFT Version 1.0).” Inside the 166-page framework description, ONC introduces the need for a platform called a Learning Health System, which it defines as “an environment that links the care delivery system with communities and societal supports in ‘closed loops’ of electronic health information flow, at many different levels, to enable continuous learning and improved health.”

The ONC document is designed to be a 10-year roadmap that describes barriers to interoperability across the current health IT landscape, including a description and proposal for a desired future state of healthcare IT. It introduces an architecture overview for a learning healthcare system and what is required of such a system.

In the report, ONC states that “by 2024, individuals, care providers, communities and researchers should have an array of interoperable health IT products and services that support continuous learning and improved health. This ‘learning health system’ should also result in lower health care costs (by identifying and reducing waste and preventable events), improved population health, empowered consumers and ongoing technological innovation” through coordinated care plans.

The report states that in the future, “all individuals, their families and health care providers should be able to send, receive, find and use electronic health information in a manner that is appropriate, secure, timely and reliable. Individuals should be able to securely share electronic health information with care providers and make use of the electronic health information to support their own health and wellness through informed, shared decision-making.”

While the vision and future state put forth by the ONC is sound, as healthcare professionals, we must ask ourselves, “Where do we begin?” and, “What can we do today to begin reaping some of the benefits of interoperability and providing the foundation for the next 10 years?”

As with any technology revolution, certain technologies mature faster than others and begin to provide a glimpse of the future landscape. In the case of interoperability, the vendor-neutral archive (VNA) is a mature technology that is already playing a leading role in evolving the current healthcare ecosystem toward a learning healthcare system and providing a means for real-time healthcare delivery.

The foundation for a learning healthcare system is the basis of what a VNA provides today. Thinking of a VNA as merely an imaging storage tool is shortsighted. Why not envision the VNA as providing the pathway and functionality for a patient-centered healthcare discovery tool? The VNA already has the capability to provide an IT interoperability framework that enables many applications to work in unison to learn the context of a patient, inside or outside the current healthcare organization. By leveraging a VNA in this context, suggestive results can be provided to the healthcare organization’s clinicians, physicians, and, most importantly, the patient in a passive or real-time manner.

The VNA is an effective means for improving patient outcomes through interoperability and for moving healthcare organizations beyond the traditional product sell. The ONC report states, “Consumers are increasingly expecting their electronic health data to be available when and where it matters to them, just as their data is in other sectors. New technology is allowing for a more accessible, affordable and innovative approach. However, barriers remain to the seamless sharing and use of electronic health information.” The VNA has all the elements necessary to establish a learning health system foundation.

In the construction of a building, every project begins with the foundation. A solid and stable foundation is critical and must be carefully planned. It is the most difficult structural element to change. The foundation of a learning healthcare system is built around two key components—patient context and the healthcare delivery organization (HDO) context. Taking ownership of the data and focusing on HDO interoperability through standards are essential pillars that must be cemented into this foundation.

From an HDO perspective, ownership of clinical content on behalf of the patient is a mandatory requirement. An assumed role of the HDO, on behalf of the patient, is the holding of collected patient content for future use in the continuum of care. The HDO must define and build a foundation by which secure sharing of patient content is inherent. This environment must be capable of not just storing content but also dynamically finding, moving, and distributing content in real time.

This content is linked and possibly moved into a learning healthcare system independent of the organization’s affiliation. The content is linked either on demand or in the background as information is discovered, further extending the patient’s longitudinal record. The goal of content aggregation is to provide suggestive access to patient information for the healthcare worker responsible for delivering a better patient outcome. That outcome is the evidence on which the HDO will be paid.

From the patient perspective, ownership of the data by the patient is now something we vendors must enable and that HDOs are legally bound to steward. HIPAA, for example, can appear to vendors as restricting and controlling. It attempts to define who and what content can be accessed along with the purpose of accessing that content. However, it is actually HIPAA that finally gives ownership of the content back to the patient. It is the first piece of legislation specifying to the HDO and its vendors that true ownership of results and supporting documentation belongs to the patient and not the healthcare organization, the insurance company, or the product vendors.

Once the foundation of a learning healthcare system is created, the framing comes next. Framing requires exact measurements and sizing using standards-based products. With the cutting and coercion of the materials comes a custom fit per the requirements in a blueprint. Such is the case of a learning healthcare system, where the HDO must begin by demanding standardization of not only structured content but also unstructured content. Standardization assures interoperability and a canonical data model that is based on industry standards and site-specific requirements, not proprietary vendor specifics. Standardization or canonicalization of the metadata to be used and exchanged in a learning healthcare system is exactly what a true VNA platform provides.

In these cases, seemingly simple problems come with very complex solutions. For example, patient names, IDs, and study descriptions have become as complex for the HDO as the Y2K problem was. Can you imagine the chaos that would ensue from an IT infrastructure not based on wireless or Ethernet standards for physical connectivity? Simply put, what if we all drove on an Interstate without painted lines? What if the map we used for guidance did not include a legend?

Such is the case for the HDO when it comes to delivering a standards-based form of patient content. Of course, there are DICOM standards, HL7 standards, and the XDS framework, but HDOs must demand that vendors actually support and utilize these standards, participating in annual Connectathons to validate their ability to interoperate. More importantly, HDOs must contractually demand interoperability following those exact standards. In short, an HDO must stop purchasing solutions that are unique to its own internal, proprietary standards.

The deployment of the electronic medical record (EMR) to capture and attempt to hold unstructured content, at least inside a data warehouse application, is a step in the right direction. Unfortunately, the EMR only solves half of the problem by providing a collection point. To test this, try to share the unstructured content between EMRs and between organizations. This has become a next-to-impossible task. EMR providers that claim to be able to share unstructured content typically come up far short of expectations.

[Chart: VNA capabilities mapped against the three-year and six-year interoperability objectives in the ONC report]

The idea of sharing an electronic record is what initially drove EMR adoption. But now we have a large volume of unstructured content that must feed the learning healthcare system. The VNA is a capable platform for achieving this goal. The chart above indicates where the VNA is already meeting three-year and six-year interoperability objectives set forth in the ONC report.

The final steps in a construction process are completed by selecting the best products, with the best look and feel, to meet the needs of the owner. Such is the case in creating a learning healthcare system, which demands the ability to select the best products and functionality to deliver the best patient outcomes. Different departments and healthcare settings, much like physicians, have different needs and requirements. Why be limited to only one selection? More importantly, don’t be forced into “one size fits all” in the selection of applications. Give HDO users the flexibility to select the applications that best suit their workflow and objectives. For example, a radiology-centric viewer will not work very efficiently for wound care or treatment planning.

When connecting the building to the outside world, each location typically has its own utility providers that are part of a grid. The same is true for a learning healthcare system, where existing healthcare information exchanges (HIEs) are the on-ramps. The HIE and image or content exchange, which are typically not profitable today, are expected to evolve into much more in the future. Difficulties often arise when seeking cooperation among different, unaffiliated organizations for patient informational access. Vendors, of course, find it difficult to build any product today around something that is not profitable, not to mention being a very difficult sell to HDO executive teams. Tomorrow’s HIE technology inside the learning healthcare system, however, will not only be a necessity but will be integral in making sure image and content exchange is included in the VNA as an embedded feature. Sharing patient content across the private sector, HIEs and government organizations will become commonplace within the next decade, all driven by patient outcomes.

More important still is the business and legal perspective. The VNA selected should support an HIE inherently. An image/content exchange is a mandatory requirement of a VNA and is the basis of a learning healthcare system for moving released content in a secure manner. It is also critical that an image/content exchange within a learning healthcare system provide the business process and verification steps, including automation of steps such as BAA approval and appropriate patient release form access and approval.

The data demands of a learning healthcare system will far exceed anything an HDO has seen to date. Typically, the sizing of a VNA is done by traffic volumes requested by concurrent users, or study volumes. However, the oncoming big data analytics applications (a necessity inside a learning healthcare system) will far exceed any current traffic volumes requested by humans. A learning healthcare system will be in a continuous mode of finding, aggregating, and coercing information relevant to the patient in context. This is also necessary for building out the patient record.

Once found, the information is persisted in the learning healthcare system whereby the analytics and other applications, including natural language processing (NLP), will access the information. NLP will give the data better context and perception around the patient, allowing the healthcare worker to have better informational access and decision processing through new clinical support applications. Support for these demanding applications will require an infrastructure that can scale on-demand, both horizontally and vertically. These applications will leverage your VNA for more than just “basement storage,” where content becomes cluttered and inefficient while never being used again.

The learning healthcare system will be an integral part of improving the way the healthcare ecosystem works and how patients, providers, and payers interact within that ecosystem. Achieving the complete vision of the learning healthcare system will be a gradual process and lessons will be learned throughout the journey. There are important actions we can initiate today, however, to begin building the necessary foundation for this vision. VNA technology is the foundational cornerstone mature enough to begin solving some of the greatest challenges and to remove some of the obstacles to a fully interoperable healthcare system.

Larry Sitka is principal solution architect with Lexmark Healthcare of Lexington, KY.


May 28, 2015

The Internet of Things Can Revolutionize Healthcare, But Security is Key
By David Ting


The Internet of Things (IoT) holds tremendous promise in healthcare, potentially enabling a digital health revolution and supporting the future of care delivery.

Gartner estimates that approximately 3.9 billion connected things were in use in 2014. This number is expected to increase to 25 billion by 2020, a growth trajectory that will surely impact the healthcare industry, which is already being flooded with devices for generating valuable patient data.

However, the transformative potential of the IoT won’t be realized for healthcare unless data integrity and security are built into the foundations of the IoT movement.

The IoT’s network of IP-connected computers, sensors, and devices allows care providers and patients to share information to a transformative degree by:

  • Giving care providers access to a greater number of devices for accessing protected health information (PHI).
  • Allowing patients to generate real-time biometric data with low-cost devices and applications.
  • Changing the nature of encounters with care givers from episodic to real time.

For clinical staff, the ability to interact with EMRs or other applications containing PHI from any device is invaluable, especially in creating a push vs. pull dynamic for access to patient information and health records. Today’s care providers are highly mobile and the IoT can provide the ability to seamlessly use connected devices within a single session.

For patients, the IoT offers the ability to participate in their own care. Specific patient opportunities include:

  • Generating valuable health information from wearables and home health devices.
  • Allowing real-time voice, video, and data streaming for telemedicine.
  • Enabling more active patient engagement. Instead of requiring patients to take the initiative to look up records or set appointments, messages can be proactively sent to patients informing them about updates or other relevant information.

Some of these changes are already taking place on a small scale. But for the IoT to reach its full potential in healthcare, identity and data integrity will become critical as PHI moves from the hospital to the edge of patient care delivery, especially to assuage consumer concerns about privacy and security.

The data generated by a series of connected devices can only be captured, aggregated, analyzed, and put to meaningful use on a broad scale if the identities of providers and patients are verified. The data being generated, collected, and shared through networked devices must be protected with strong, usable authentication methods.

For providers, authentication is required to meet compliance and privacy regulations. If security considerations are baked into the IoT infrastructure, wearables or other devices can be assigned to particular users and leveraged to verify their identity. Similarly, proximity awareness technologies can simplify the user authentication process to access various devices and applications.

Patient authentication is also essential in the IoT paradigm because it ensures the correct information is being generated by and shared with the correct patient. Creating a one-to-one link between patients and their medical records can establish a foundation for additional forms of patient identification. As with providers, devices will become part of the digital credential set for patients, necessitating a secure enrollment process to bind one or more devices to unique patient identities.

Constructing the necessary infrastructure to properly manage and optimize the proliferation of connected devices in healthcare starts with security. A strong security strategy includes authentication technologies and processes to verify patient and provider identities to ensure that devices can only be used by authorized users. The communications channels between the devices within the IoT must also be secure to ensure the integrity of the information passing through them.
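One simple, standard way to implement the device-binding and verification described above is an HMAC-signed credential: enrollment binds a device to an identity, and every subsequent reading can be checked against that binding. The sketch below is illustrative only (keys and IDs are invented; a real deployment would provision per-device keys).

```python
# Sketch: bind a device to a patient identity at enrollment, then
# verify that data claiming that identity comes from the bound device.
import hashlib
import hmac

SERVER_SECRET = b"demo-secret-key"  # illustrative; use per-device provisioned keys

def enroll_device(device_id, patient_id):
    """Bind a device to a patient identity; return its credential."""
    msg = f"{device_id}:{patient_id}".encode()
    return hmac.new(SERVER_SECRET, msg, hashlib.sha256).hexdigest()

def verify_device(device_id, patient_id, token):
    """Check a reading's claimed identity against the enrolled binding."""
    expected = enroll_device(device_id, patient_id)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(expected, token)

token = enroll_device("wearable-42", "pt-001")
```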

Putting these security building blocks in place will help create a closed-loop system in which patients and providers can securely interact in a more engaging, meaningful way. 

David Ting is chief technology officer for Imprivata.


May 28, 2015

Trusted Data Is the Foundation for Advanced Analytics
By Vicky Mahn-DiNicola RN


Much has been said about using advanced predictive analytics to improve the quality of healthcare. But one thing not receiving the attention it deserves is the prerequisite of trusted data woven into the fabric of the healthcare organization. Every organization has data at its fingertips, but the full value of that data can only be realized if it is properly understood and trusted.

Take a relatively straightforward data element like a patient’s weight. While it is a simple, basic element, it can create havoc for analytics teams who discover there are upwards of 17 different places in their HIT systems where weight is captured. Weight is recorded in the emergency department flow sheets, nursing assessment intake forms, pharmacy profiles, ambulatory clinic records, and daily critical care flow sheets, just to name a few. Determining which weight field is the most reliable and appropriate to use is a difficult, lengthy process and one that is multiplied by hundreds of data variables required in advanced analytics projects.
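The weight example above can be made concrete. The sketch below (source system names and the "most recent value wins" rule are illustrative, standing in for a governance decision each organization must make) shows the kind of reconciliation an analytics team performs: normalize units, then select one authoritative value.

```python
# Sketch: reconcile a patient's weight recorded in multiple source
# systems by normalizing units and picking the most recent value.
from datetime import date

def to_kg(value, unit):
    """Normalize pounds to kilograms; pass kilograms through."""
    return value * 0.453592 if unit == "lb" else value

def best_weight(observations):
    """observations: list of (source, date, value, unit) tuples."""
    normalized = [(d, to_kg(v, u), src) for src, d, v, u in observations]
    # Governance rule encoded as code: most recent observation wins.
    return max(normalized)[1]

obs = [
    ("ed_flowsheet",   date(2015, 5, 1), 176.0, "lb"),
    ("nursing_intake", date(2015, 5, 3),  80.2, "kg"),
    ("pharmacy",       date(2015, 4, 20), 178.5, "lb"),
]
weight_kg = best_weight(obs)
```

Multiplied across hundreds of variables and seventeen-odd source fields, this is exactly the work that a data governance body has to standardize.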

Healthcare organizations are excited by the brilliant technology coming our way in the form of genomics, mobile health, and telemedicine. But too often, the cart is put before the horse. Just as bad ingredients guarantee a bad meal for even the best of chefs, unreliable data in healthcare will inform inaccurate, even dangerous decisions.

Effective use of analytics is not something you can buy off the shelf from a vendor. Rather it is an organizational strategy, structure, and culture that have to be developed over time. While the technical and tactical execution is delegated to others, the chief executive in a healthcare organization is responsible for determining and overseeing this direction and progress.

The executive also needs to align the organization with data cooperatives and national groups that promote data standardization. National standards have historically been ambiguous, so it is important for providers to ensure they are not working in a vacuum, but have a common understanding of national guidance.

Diversity of systems and processes breeds confusion. Because there are many ways to express any given concept, there is a need for robust crosswalk, data mapping, and standardization to ensure data integrity within, between, and across organizations. This body of work is the responsibility of a designated data governance body within an organization.

Data governance implies far more than the maintenance of documents that describe measurement plans and reporting outputs. It is a comprehensive process of data stewardship that is adopted by all data stakeholders across the organization, from the board room to the bedside. Data governance is critical in order to standardize data entry procedures, reporting outputs, clinical alerts, or virtually any information that is used in clinical and business decision-making. In the era of pay-for-performance and risk-based care, data standardization is mission critical for a true, accurate comparison to take place when evaluating an organization’s performance against external benchmarks and determining reimbursement based on value.

A final step toward creating robust data governance structures is to create a data validation process. Data cleansing and maintenance should be automated, centralized, and transparent across the organization and should be designed to accommodate the needs of both clinical and business stakeholders.

A “data librarian” should be appointed to catalogue and oversee data elements across the healthcare system. The most mature organizations will implement a master data hub that is fully integrated into their application system environments so that changes are made simultaneously to all systems that need the same data. By doing so, a simple element like a patient’s weight will always be consistent in HIT systems.

Organizations need to recognize that the advanced analytics of tomorrow will only be achieved if the data we have today can be trusted. Those who succeed in establishing proper data governance will unlock the full value data can provide in our industry, beyond regulatory reporting and retrospective benchmarking initiatives to the more exciting prospects of predictive and prescriptive analytics.

Vicky Mahn-DiNicola RN, MS, CPHQ is VP of research and market insights with Midas+ Solutions, A Xerox Company.


May 13, 2015

Demystifying Population Health
By Jeff Wu


Population health was once again a major topic of this year’s HIMSS conference. We saw even more vendors offering products, services, and solutions aimed at helping organizations deal with the challenges population health management presents.

Unfortunately, population health is such a broad domain that no single solution really encompasses all of it. As a result, vendor offerings tend to address only a specific challenge. The wide and varied offerings across vendors add confusion to the topic.

Population health shouldn’t be an industry buzzword that’s approached with trepidation. Instead, we need to understand the categories of challenges we are trying to address and the process for developing interventions to solve them. Let’s start by taking a look at the three categories that population health management interventions fall into.

  • Government or mandated interventions. For many organizations, this is the primary (and perhaps only) component of their population health strategy. Some initiatives, like becoming an accountable care organization, encompass requirements that address items that will be discussed below. For many organizations, this may be enough.
  • Enterprise population health interventions. These encompass interventions that are applied to the full population of an organization’s patients. Immunization and vaccination interventions or physical activity interventions are broadly applied to an organization’s full patient population. As organizations begin to try to standardize care, interventions aimed at variation reduction are also encompassed here.
  • Cohort, group, or sub-population health interventions. This class of interventions is the most varied and covers any intervention that addresses a sub-population of patients. Some examples of interventions in this category include health maintenance for diabetes patients, preventative care efforts like breast cancer screening in women over 50, and depression/PTSD screening for military veterans.

Population health management evolves linearly in three stages that borrow some classical tools from epidemiological tracking.

  1. Passive surveillance. Passive surveillance involves the retrospective analysis of a specific issue: the evaluation of data that already exists. It addresses questions like, "How many of our diabetic patients got a glucose test in the last six months?" or, "How many of our patients got flu vaccines last month?" Most analysis starts from this level of surveillance. It’s important to note that the majority of organizations are just getting to this point in their analytical journey; implementation of the EHR tools necessary to do this level of surveillance is finally settling into a state that allows it to happen. To date, many "organized" population health initiatives focus only on this type of surveillance. CMS’s MSSP ACO initiative is a classic example, where an organization participating in the MSSP ACO need only report its measures for the first year to receive its financial incentive.
  2. Active surveillance. The next evolution is active surveillance. If passive surveillance identified how many patients got flu vaccines last month, active surveillance would try to answer how many of our patients got a flu vaccine last week or yesterday. If passive surveillance told us which of our diabetic patients got a glucose test in the last six months, active surveillance would try to address which ones are being well controlled. In the epidemiological world, passive surveillance relies on existing data, while active surveillance implies a program that generates more recent and/or new data. For simple cases, this could be as straightforward as querying the medical record or running a report more frequently; for more complex cases, it could mean designing a whole new workflow and new data elements to monitor.
  3. Prescriptive intervention. Once a population or initiative is identified, prescriptive intervention is what an organization uses to address the problem. This is where the art of evidence-based medicine comes in. We now have much more data with which to develop more finely tuned and effective interventions. Things like smoking cessation no longer have to be just a pamphlet, a discussion with a provider, and then a check box in the medical record. Full care teams can be coordinated, and patients can then be monitored to help them with compliance.
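
Passive surveillance, in other words, amounts to retrospective queries over data the organization already has. Here is a minimal sketch of the diabetic-glucose-test question, using a hypothetical reporting schema — the table and column names are illustrative, not from any particular EHR:

```python
import sqlite3
from datetime import date, timedelta

# In-memory stand-in for an EHR reporting database (hypothetical schema).
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE patients (id INTEGER PRIMARY KEY, has_diabetes INTEGER);
    CREATE TABLE lab_results (patient_id INTEGER, test TEXT, taken_on TEXT);
""")
recent = (date.today() - timedelta(days=30)).isoformat()
stale = (date.today() - timedelta(days=400)).isoformat()
db.executemany("INSERT INTO patients VALUES (?, ?)", [(1, 1), (2, 1), (3, 0)])
db.executemany("INSERT INTO lab_results VALUES (?, ?, ?)", [
    (1, "glucose", recent),  # diabetic, tested recently
    (2, "glucose", stale),   # diabetic, but the test is too old to count
    (3, "glucose", recent),  # tested recently, but not diabetic
])

# "How many of our diabetic patients got a glucose test in the last six months?"
cutoff = (date.today() - timedelta(days=183)).isoformat()
count = db.execute("""
    SELECT COUNT(DISTINCT p.id) FROM patients p
    JOIN lab_results r ON r.patient_id = p.id
    WHERE p.has_diabetes = 1 AND r.test = 'glucose' AND r.taken_on >= ?
""", (cutoff,)).fetchone()[0]
print(count)  # 1
```

Active surveillance, by contrast, would run a query like this daily instead of quarterly, or extend the schema with new data elements (say, a control-status column) to answer the "well controlled" question.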

As the industry and technology continue to advance, so do the tools at our disposal. Sentinel surveillance and predictive analytics offer some exciting opportunities to do more earlier. Additionally, the increased volume of data allows us to start taking a more in-depth look at cost-effectiveness and variation reduction between treatments for diseases.

It’s imperative to remember that every organization’s population health strategy will necessarily be different, because each organization’s population of patients is different. Vendors often approach organizations with packaged solutions when, in reality, it’s almost impossible for these solutions to be "one size fits all." Even a product geared to a specific population health goal will require nuanced configuration to be effective for an individual organization.

Here in Madison, Wisconsin, population health interventions for UW Health are drastically different from those of Dean St. Mary’s or Group Health Co-op. UW is an academic medical center that draws high-acuity patients from across Wisconsin, while Dean has the region’s only obstetrics practice and GHC handles only primary care needs. While these organizations may benefit from adopting collaborative population health initiatives like the MSSP ACO (which both Dean and UW are a part of), their intervention focuses differ significantly based on their unique patient populations. Seldom can a product or solution apply to both, and even more rarely will it work for both.

As the industry continues to shift care delivery to encompass a population-based perspective, we are constantly introducing changes to our workflows, our assumptions, and most importantly, our expectations. These changes introduce uncertainty and apprehension, but they are also our greatest opportunity. It’s important to realize that population health management isn’t actually anything new. We’ve been here before—we’re just upping the scale.

Jeff Wu is a population health researcher at the University of Wisconsin-Madison.

Readers Write: New Discoveries in Health IT Diagnoses

May 13, 2015 Readers Write Comments Off on Readers Write: New Discoveries in Health IT Diagnoses

New Discoveries in Health IT Diagnoses
By Niko Skievaski


Over the past decade, we’ve spent billions to digitize healthcare. Health IT was to bring us the same exponential efficiency gains that computers and the Internet brought nearly every other industry. But now that rooms of paper have transitioned into rooms of servers and swarms of software vendors attempt to surf the wakes of legacy EHRs, the acute impact of this stoic transition begins to appear. Some of these newly diagnosed ailments are approaching epidemic proportions.

I am writing to discuss our findings from a 300-vendor study attempting to understand the root causes and, most importantly, the preventive measures individuals can take when confronted with known early symptoms.

Type 1 and 2 MU (further mutations into Type 3)

An early-stage MU diagnosis was a catalyst for many of the following conditions. In 2009, it first appeared in populations incentivized to spread it via certified EHR technology. Although not curable, it could have been contained and controlled if caught early. However, it soon became chronic and was subsequently categorized as type 2. And it now looks as though a more progressive mutation is afoot, growing beyond incentivized attestation to penalized attestation.

Hyperactive Click Finger

Most commonly affecting the right index finger, hyperactive click finger (HCF) resulted from premature adoption of EHRs as spurred by type 1 MU. Market-driven adoption would have controlled click counts to safe levels, as sovereign end users would have chosen vendors based on efficiency gains rather than subsidy. A regimen of optimization efforts led by EHR therapists is a potential solution that some patients have found effective. However, these therapies are usually administered at extremely high hourly costs, and repeated consults are inevitable.

Acute Alert Fatigue

As MU progressed to type 2, clinical decision support combined with CPOE brought on acute alert fatigue in provider populations. This is commonly misdiagnosed as bipolar disorder or mild Tourette’s. Comorbidities frequently include HCF. EHR vendors have backed off heavy alerts, and peripheral vendors are beginning to set a precedent with FDA clearance for forceful support. Additionally, alerts are normally hard-coded based on known errors and omissions, leaving no opportunity for proactive machine learning.

I14Y Virus

An infectious disease has been uncovered: I14Y Virus (interoperability influenza). Red blood cells clump together and bind the virus to infected cells, making it extremely difficult to share data between inhabitants. Additionally, the inconsistencies in data models create often insurmountable barriers for new software entrants that could otherwise bring increased efficiency and quality. New therapies, including acronyms like FHIR and SMART, are beginning to change public perception of the disease, yet it is still unclear to most of us what the heck they actually mean. Private middle layers are starting up to tackle known I14Y opportunities, and a race to the cure is upon us. The cure standard will be defined by what is adopted, not what is agreed upon in committees.

Hyperportalitis

Hyperportalitis affects both patients and providers, but it manifests quite differently in each population. When symptoms surface, patients simply disengage, causing aggravated MU. Affected providers, under mandate to comply, simply write usernames and passwords on sticky notes under keyboards or, in severe cases, on the frames of their computer screens. This exacerbates conditions, leading to potential risk of HIPAAppendicitis.

HIPAAppendicitis

Despite repeat training videos depicting hospital elevators polluted with oral PHI leaks, we still run a high population risk of HIPAAppendicitis. This creates risk-averse symptoms of committee meeting purgatory and sluggish adoption of innovative cloud-based software therapies.

 

This is by no means a comprehensive study. I welcome review from my distinguished peers who subscribe to this journal, as well as subsequent research and inquiry. There will be an open comment period prior to the amendment of ICD-10.

Niko Skievaski is co-founder of Redox.

Readers Write: Is Health IT Guilty of Being a Worm in Horseradish?

May 6, 2015 Readers Write Comments Off on Readers Write: Is Health IT Guilty of Being a Worm in Horseradish?

Is Health IT Guilty of Being a Worm in Horseradish?
By Nick van Terheyden, MD


A survey conducted at HIMSS15 found that patient satisfaction and patient engagement rank among the top priorities for CIOs. In fact, they rank above improving care coordination, streamlining operational efficiencies, and achieving Meaningful Use.

The tides are clearly changing. We’ve all been talking about what the shift to a value-based care model means for healthcare organizations. What we haven’t been talking about is how this shift is transforming our patients into “prosumers.”

There’s a saying, "To a worm in horseradish, the world is horseradish," meaning we are predominantly aware of that which surrounds us on a daily basis. Health IT, in all its intricacies and expansiveness, has become hyper-focused on making sense of its nebulous infrastructures, working hard to prepare healthcare organizations for the next new wave of regulations. Our world, while not horseradish, is composed of goals and milestones that are 100 percent contingent upon these systems.

But, as yet one more unintended consequence of this pursuit, we have become myopic. The business of healthcare is no longer simply confined to a hospital or an IDN site map.

Patients are reaching for their phones, not to call their doctors, but to research their symptoms. They’re educated buyers, looking up reviews before seeing a new specialist, just as they would before buying the latest gadget on Amazon. And, as we enter the era of the Internet of Everything (IoE), they want their wearable devices to meaningfully connect as simply as when they use their phones to play songs from the playlist on their laptop.

The challenge becomes sustaining the momentum of the moment. As the wearable trend continues to grow, it is not enough merely to count steps or measure the amount of UV rays absorbed. That won’t keep patients engaged. We need statistics and personal health trends that can be used to foster a richer, ongoing dialogue between patients and their physicians.

Consider the positive health implications for a patient who receives a treatment plan from her doctor, which is entered into the EMR during the visit and connected to a three-question daily check-in for three months via a mobile device. The patient could provide a thumbs-up, thumbs-down, or neutral rating (think Pandora playlist) on how the treatment is working, with perhaps an option to enter free text should she choose to expound upon her responses. These daily reports could be aggregated into trends and reviewed by a clinician to make adjustments to the treatment plan as needed, extending patient care beyond the confines of the four walls and the 12 minutes of an office visit.
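
The aggregation step of such a check-in could be quite simple. As a hedged sketch (the response format and scoring are assumptions for illustration, not any real product’s design), the thumbs ratings might map to +1/0/-1 and roll up into weekly averages for the clinician to scan:

```python
from collections import defaultdict
from datetime import date

# Hypothetical daily check-in responses: (day, rating), where rating is
# "up", "neutral", or "down" (think Pandora-style thumbs).
responses = [
    (date(2015, 5, 4), "up"), (date(2015, 5, 5), "up"),
    (date(2015, 5, 6), "neutral"), (date(2015, 5, 11), "down"),
    (date(2015, 5, 12), "down"), (date(2015, 5, 13), "up"),
]
score = {"up": 1, "neutral": 0, "down": -1}

# Roll daily ratings up by ISO week so the trend, not each data point,
# is what the clinician reviews.
weekly = defaultdict(list)
for day, rating in responses:
    weekly[day.isocalendar()[1]].append(score[rating])
trend = {week: sum(vals) / len(vals) for week, vals in sorted(weekly.items())}
print(trend)  # week number -> average rating for that week
```

A falling weekly average could flag the chart for review, so the treatment plan gets adjusted between visits rather than only at the next one.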

Connectivity and personalization are the zeitgeist. CIOs know this. We are all unique snowflakes, and as more and more people submit their genes for analysis and mapping, we’re demonstrating an increased drive for individuality. While the industry is pushing for population health (a laudable vision indeed), patients are looking not to be considered in aggregate, but to be treated with the same personalized attention they experience when they go to a favorite restaurant where the wait staff recalls their usual order or when they go to a website that remembers all their previous preferences. It’s about not starting from square one every time.

Patients aren’t going to tolerate the disconnect in healthcare forever. And as digital natives, some generations won’t tolerate it at all. The day is coming when a patient will ask her doctor, "Did you notice that my headaches seemed to lessen on those days I go to the gym? I’m wondering if there’s a connection?" If her physician isn’t paying attention to her, she will find a physician, or perhaps even an intelligent medical assistant, who will.

Nick van Terheyden, MD is CMIO at Nuance Communications.

Readers Write: Big Data, Small Data, Meta Data, See Ya Latah

Big Data, Small Data, Meta Data, See Ya Latah
By Jim Fitzgerald


It’s the RESTful, object store, file and block make me snore, it’s still bits and bytes to me… (sorry, Billy)

I just got back from HIMSS. Big data, like savoir faire, is everywhere. The cynical side of me says that technology vendors just want to sell more disk or flash drives. The analytical technical businessperson somewhere inside me says that the real play for the people trying to sell you and me on big data is in the tool suites for managing, monitoring, sorting, searching, and processing big data. We will be lured in with open source tools like Hadoop, and then when the hook is deep enough, the vendor community will point out to us why we need their quasi-proprietary toolkit to enhance the “limited feature set” and “programmer required” aspects of Hadoop.

Don’t read me wrong. I think I am a fan of this. Why the qualification? Big data, taken to its logical extreme and paired with some artificial intelligence, can help my doctor process all the environmental, social, and lifestyle data related to me and correlate it with the highly structured “small data” in my electronic health record to zero in on, and advise on, the real underlying issues behind my health that go well beyond the “sick care symptom” I am presenting that day.

The vague and slowly clarifying healthcare zeitgeist around population health and “well care” probably won’t be realized without employing big data management techniques as an everyday tool. This apparent service to humankind will be aided and abetted by small and large chunks of data streaming up to the cloud from the “personal Internet of things” that I already own and the things I am considering, like Apple Watch.

The cautionary note comes from my informed-paranoid fear of Big Brother. I have Orwellian visions of the healthcare police showing up at my house and herding me into the quarantine van for a stint of “voluntary rehab” after some warehouse full of seemingly disconnected Facebook posts, Yelp reviews, sensor numbers, and Whole Foods Market receipts mistakenly puts me on a high-risk list for the next pandemic. I won’t even go off into the potential side rant on all my voluntary and involuntary surrenders of my privacy rights along the way, although I do think the court system should brace itself for the onslaught.

Let’s hope my paranoia amounts to nothing more than the receptionist not being a bit surprised that I showed up in the doctor’s office that day because the data-lake-fed-AI predicted I would and had already authorized my insurance and sucked all the available fresh data on me into a useful visualization for my clinicians.

What’s the difference between big data and small data? The short version is that big data is generally considered to be an unstructured collection of data objects. Unstructured in this usage implies that there is no classic structured database format imposed on the data. The unstructured data could be a song captured as MP3 or AAC, a simple list of my last 20 temperatures stored in my Apple Watch, or a photo just taken in the ED of the festering wound on my right leg.

Big data is generally big because it is a vast collection of objects. Sometimes it is big because the individual objects are prodigious on their own; these are known as BLOBs, or binary large objects (for example, your favorite "Breaking Bad" episodes that are still sitting on your iPad). It could really be anything, including a file that has a structure and order of its own but is being considered as part of a greater set of data molecules in a "data lake."

Storing data as objects, most commonly done on the Internet with RESTful storage protocols, is an increasingly normal trick in the world of data storage and management. When we store data as objects, we don’t care all that much about structure, about the nature of the data, or about its accessibility by a particular file system or operating system. That problem is shifted from its traditional place in the OS or the storage array and moved to the app. (Notice I did not say "application.")

To the extent that we care about the objects in an object store (an allegedly safe place to put objects), we may tag them as they go in with meta data, which everyone who has followed the Edward Snowden story knows is "data about the data." In fact, the object might get multiple tags. One might be a lookup address or unique ID in the object store, and one or more others might be some common descriptor of what is in the object itself. Hence the chaos of unstructured data may, in fact, have some external structure imposed on it by a rules-based system ingesting the data objects.
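
As a toy illustration of that ingest-and-tag idea (everything here is hypothetical; no particular object-store API is assumed), each object gets a unique ID on the way in, plus descriptor tags that let us find the otherwise opaque blobs later:

```python
import uuid

# A toy in-memory object store: opaque bytes in, unique ID plus tags out.
store = {}  # object_id -> {"data": ..., "tags": ...}

def put(data: bytes, **tags) -> str:
    """Ingest an opaque object and attach meta data ("data about the data")."""
    object_id = uuid.uuid4().hex  # the lookup address within the store
    store[object_id] = {"data": data, "tags": tags}
    return object_id

def find(**wanted) -> list:
    """Return IDs of objects whose tags match every given key/value pair."""
    return [oid for oid, obj in store.items()
            if all(obj["tags"].get(k) == v for k, v in wanted.items())]

wound_photo = put(b"\x89PNG...", kind="photo", site="ED", patient="12345")
temperatures = put(b"98.6,99.1,98.4", kind="vitals", patient="12345")

print(len(find(patient="12345")))           # 2
print(find(kind="photo") == [wound_photo])  # True
```

The store neither knows nor cares what the bytes are; whatever external structure exists lives entirely in the tags the ingesting rules engine chose to apply.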

In truth, small data is still where the rubber meets the road in today’s healthcare information systems. The organization or structure of that data by the HCIS in a pre-defined database provides the accuracy and confidence clinicians need to treat me and administrators need to bill me. It also generates the endless arguments and the grossly inefficient cottage industry that has sprung up around HIEs. (Do we really need to argue about what the "first name" field means?)

Big data can provide inferential context for small data, but it cannot supplant the precise articulation or definitive metrics collected and presented, in context, to help treat me. Small data is so important that we protect it not only in the context of its integral structure in a database, but in some cases also at the file system, operating system, and storage subsystem levels. In many cases, via RAID technology, backups, and replicas, we have so many copies of the same small data that it is really not very small at all. But hey, in the days of petabyte and zettabyte data lakes, a few terabytes look more like a data puddle.

There is, however, an economic force in play here. Depending on whose numbers you believe, big data on object stores is four to 20 times cheaper to manage than an equivalent amount of small data being managed by a production application in a Tier 1 SAN. The “apps” which are slowly arriving in healthcare (and may continue to arrive) may be happy just to slam a bunch of tags on an object and call it a day. Then we will have “tag oceans” and “tag bagging” toolsets with cute animal logos, and the circle of data will continue to self-perpetuate.

Jim Fitzgerald is technology strategist and EVP at Park Place International.

Startup CEOs and Investors: Bruce Brandes

May 4, 2015 Readers Write Comments Off on Startup CEOs and Investors: Bruce Brandes

All I Needed to Know to Disrupt Healthcare I Learned from “Seinfeld”: Part V – Yada Yada Yada
By Bruce Brandes


Most every company talks about its elevator pitch, which is intended to be a brief summation of the business that intrigues one to want to learn more. My question is this: exactly how long are the elevator rides some people are taking? More broadly, in any sort of business interaction, how do you best balance brevity vs. meaty detail?

The Webster’s definition of the phrase "yada yada" is "boring or empty talk often used interjectionally, especially in recounting words regarded as too dull or predictable to be worth repeating." Anyone still recovering from the HIMSS conference can likely recall many conversations where yada yada would have been a very welcome interjection.


Our old friend George Costanza once dated a woman who often filled in her stories with the expression yada-yada, leaving out much of the detail. Jerry praised her for being so succinct (like dating USA Today) but not knowing the full picture drove George crazy. So opens the debate: is yada yada good, or is yada yada bad?

As discussed in an earlier column, most pitches are too long and generic. A little yada yada to help you explain your company in 60 seconds or less is very good. In calculating how to consolidate your elevator pitch, reread the Webster’s definition above and be sure to yada yada overused, now almost meaningless buzzwords like “patient engagement,” “big data analytics,” or “telemedicine.”

Instead, focus on concisely describing why your company exists, what problem you solve, and how you deliver that solution in a way that is clearly superior to, or simpler than, the masses. Even 60 seconds might seem like a long elevator ride to your audience if you do not make a compelling initial impression in the first 15. Without the yada yada, you are not getting a first meeting.

Better yet, if your solution is as vastly unique and compelling as you may perceive, perhaps its simplicity speaks for itself. Did Apple need to yada yada when it introduced the iPad? In his book "Insanely Simple," Ken Segall describes the cultural foundation that led to Apple’s development of transformational products so simple and obvious that a two-year-old or a 90-year-old could just intuitively understand them.

For real game-changing solutions, an unspoken yada yada is implicit. For example, in philanthropy, the Human Fund’s mission statement, "money for people," enticed Mr. Kruger with its understated stupidity.

However, buyers of and investors in healthcare technology solutions are remiss if they do not press for the substantive details and validation of claims glossed over by the yada yada. How many HIStalk readers have been burned by extrapolating assumptions from high-level vendor assertions, only to later recognize in the fine print that some important information was omitted by a yada yada?

  • Q: Where does your system get all the data you are showing in your demo?
  • A: Once you sign the contract … yada yada yada … we integrate seamlessly with your EMR.

  • Q: How do you achieve your revenue projection of growing 20x in two years?
  • A: We had meetings with people at both HCA and Ascension about doing pilots … yada yada yada …. we forecast 300 hospitals next year.

Let’s try to yada yada some of the memorable events in healthcare IT history.

  • We acquired five more companies which will be integrated by next quarter … yada yada yada … we beat our forecasted revenue numbers. (every HBOC quarterly earnings call in the 1990s)
  • We closed on our acquisition of HBOC … yada yada yada … our market cap dropped $9 billion today. (McKesson 1999)

 

  • We are putting out an RFP to evaluate vendors and purchase a new enterprise electronic medical records system … yada yada yada … we bought Epic. (any academic medical center in the past 10 years)
  • We are making great progress on our successful Epic rollout … yada yada yada … we are announcing major budget cuts to protect our bond rating. (that same academic medical center three years later)

I contend that yada yada is both good and bad. Mastery of this notion leads to knowing when to use the figurative yada yada to establish appropriate interest, rapport, and trust. It is equally important to know how and when to effectively press for critical information which the symbolic phrase may be concealing.  

Bruce Brandes is managing director at Martin Ventures, serves on the board of advisors at AirStrip and Valence Health, and is entrepreneur in residence at the University of Florida’s Warrington College of Business.

Startup CEOs and Investors: Michael Burke

May 4, 2015 Readers Write Comments Off on Startup CEOs and Investors: Michael Burke

The Shifting Incentives of Startups
By Michael Burke


Mr. H asked a few startup CEOs to give his readers an “inside baseball view into a world that a lot of us will never see as employees” — the world of starting and running a startup company. In this post, I’ll try to honor the spirit of that request by describing how incentives in an early-stage startup create an environment that is simultaneously thrilling, rewarding, and terrifying. We’ll then discuss the challenge of maintaining a startup’s culture while these incentives change.

I’ll start first with a sweeping generalization:

An early-stage startup company’s incentives are more purely aligned with its customers’ incentives than those of a business of any other size, stage, or structure.

Think about it. At this stage, it really doesn’t matter whether the founders want to build a great company, make the world a better place, or make a big pile of cash. They can’t do any of these things if they don’t focus exclusively on the success of their early customers. This singular focus is a luxury not afforded to companies of other stages. These purely aligned incentives create an environment of productivity and creativity like no other.

Does this alignment of incentives guarantee success? Absolutely not. I’ve noted in an earlier article that the odds of success for a startup are low. There are a million things that can go wrong. The alignment of incentives does, however, mitigate the risks to some degree.

Now, I know that most companies at various stages consider their customers important and would assume, on the surface, that their interests are aligned with those of their customers. But until the founders have pledged their house and savings to guarantee a loan for working capital, they don’t know what a real incentive feels like. That’s the terrifying part.

Shifting Incentives and OPM

Incentives often change as a startup grows. The really great companies find a way to maintain the positive elements of their culture during these periods of change. It’s not easy to do.

There’s a phenomenon in the startup world that is repeated time and time again. A scrappy startup that was efficient with the little bit of capital it had gets a big chunk of money from a VC. Then they start to suffer from OPM (Other People’s Money) syndrome. They start to think that they really need those golf bags emblazoned with the company logo. They over-hire. They move away from making small, responsible bets to Vegas-style gambles. It’s not entirely their fault. Their incentives have shifted.

Because of their new outside investors (who may now have a controlling interest but almost certainly have preferential exit terms), they now have to hit a grand slam. The fund needs to generate a 10X return in 3-5 years. A base hit, double, or triple might cover the VC’s vig, but it won’t put any money in the founders’ pockets.

In order to generate this sort of return, companies are strongly incented to focus exclusively on short-term revenue growth and to ignore long-term investments in people, product, and process. In a parallel universe, big public corporations often find that their incentives diverge from those of their customers in the obsession with quarterly earnings, sometimes at the expense of similarly necessary investments in people, product, or process.

Some companies manage to maintain their focus and keep their culture intact through these and other changes. As a result, they often deliver exceptional value to their customers.

Freedom and Responsibility

Most successful startups are characterized by a culture with freedom and responsibility at its foundation. The freedom isn’t just a cultural choice; it’s a requirement. Top-down management structures just don’t work in a startup. The glacial speed of command-and-control environments lacks the requisite flexibility, productivity, and creativity. Distributed, self-organizing environments are required in the early stages to learn quickly, fail quickly, and adapt quickly.

Responsibility is the opposite side of the freedom coin in a startup. It makes the selection of the startup team absolutely critical. Folks who are attracted to working in an early-stage startup seem energized by this environment of responsibility. There’s just no place to hide in a startup, and nearly every decision is important. You need folks who are willing to act and to take responsibility for their actions.

In the early days, this culture of freedom and responsibility often emerges organically as a byproduct of the nature of the work and the requirements placed on the team. As a company grows, however, it needs to be much more intentional if it wants to keep the magic going. When we were a few founders in a room, we didn’t have to worry about vacation policy. No one planned to go anywhere until the work was done anyway. Now, when we hire a new employee, we need to have an intelligent answer to the question. So our answer is: take whatever time you want. We care about results, not about punching the clock.

One of the really great things about a startup is that you get to collectively define a culture with a relatively small group of folks. That’s a very exciting and fulfilling process. Contrary to popular belief, this definition of culture doesn’t come from the top down. Don’t get me wrong — a founder/CEO can single-handedly screw up a company’s culture, but the CEO can’t define it unilaterally. A founder/CEO can be a part of the process of a company’s emerging culture, but only a part. In my view, the most influential part a CEO can play in the intentional cultivation of culture is in hiring decisions. Secondarily, a CEO can make sure the policies of the company appropriately support the required culture of freedom and responsibility. Policies are fine, but in a startup, it matters much more what you do than what you say.

No Shortcuts

The bottom line is that startups can’t focus on the finish line if they want to be successful. They have to find a way to set aside the numerous distractions and shifting incentives of fund raises and exit strategies and simply focus on building a great company that delivers great value to customers. Protecting their company’s culture is a big part of this. If they can maintain this focus, they increase their odds of long-term success dramatically.

Michael Burke is an Atlanta-based healthcare technology entrepreneur. He previously founded Dialog Medical and formed Lightshed Health (which offers Clockwise.MD) in September 2012.

Readers Write: Chicken or Egg?

April 27, 2015 Readers Write 3 Comments

Chicken or Egg?
By Niko Skievaski


HIStalk recently released these poll results: "Which #1 reason would cause you to avoid doing business with a startup?" (n=350):

  • Fears that the company isn’t financially viable (47 percent)
  • Offering a product that solves a non-strategic problem (21 percent)
  • Lack of integration with existing IT systems (17 percent)
  • Lack of comparable reference sites (10 percent)
  • A CEO who doesn’t have poise, polish, or healthcare experience (5 percent)

These embody many of the technology adoption barriers facing healthcare. Startups are perceived as being unable to commit to long-term contracts, and they lack the reference sites needed to build confidence in buyers, just as the first chicken couldn’t have been hatched without the egg from which it came. These things combined make for a very difficult landscape for healthcare technology startups to thrive in. So who lays the egg?

Fears that the company isn’t financially viable. It’s extremely costly for a health system to adopt new technology. Beyond the price tag, there are real costs associated with implementation and training necessary to successfully go live. The last thing they want is to be left hanging if your company goes under. Jason Bornhorst, who exhibited the last two years in the HIMSS startup neighborhood, said, “I’d estimate that about two-thirds of the companies that were here last year aren’t around any more.” The fact of the matter is that you need to have the resilience to ride the bone-breaking sales cycle. They’ve been practicing medicine without your software for 100 years; they can wait another 1-2. How do you bootstrap financial viability to last the long sales cycles and combat this perception? Raise more money, find alternative revenue sources, join an accelerator or two to buddy up with health systems, and surf the cycle efficiently.

Offering a product that solves a non-strategic problem. This isn’t so much a market failure as it is a customer development failure. Start a better startup. Don’t start a bakery because you’re a good baker. Start a bakery because there is excess demand for baked goods. I just hope that the buyers at health systems are delivering this intuition directly to the startup in addition to anonymous HIStalk polls.

Lack of integration with existing IT systems. Integration is a must. It’s not enough to say, “Use our product in a standalone capacity during the pilot and we’ll figure out integration later.” Providers hate double documenting and clicking. Forget switching windows. Their complaints bog down IT teams. Both of these groups will throw a block at your pitch if you don’t have a solid answer for interoperability with existing systems, from both a technical and an implementation perspective. There’s a new wave of startups out there providing modern integration strategies for startups attempting to interoperate with the EHR.

Lack of comparable reference sites. One of the mantras I learned back at Epic is that every single customer should be able to be considered a reference site. It’s that level of customer service and do-anything-ness that makes them stand apart as a vendor. The space is too small to simply write off any customer as a lost cause. If a health system chooses to work with us, we need to do everything possible to make sure they’re a good reference site for future customers.

A CEO who doesn’t have poise, polish, or healthcare experience. Have you met Judy Faulkner, Neal Patterson, or Jonathan Bush? Just a few examples of eccentric, throw-caution-to-the-wind personalities who oversaw successful EHR startups. But you need to know the audience of decision makers. If you’re new to healthcare, welcome to the wild world of buzzword bingo. Get conversational stat (yep, that’s a healthcare word). Read books, blogs, HIStalk. Listen to podcasts. Go to HIMSS and actually listen to some sessions that relate to your domain. You wouldn’t buy a car from a guy who didn’t know the difference between a carburetor and a catalytic converter. Be sure that you can demonstrate that this isn’t your first rodeo. Manufacture your “experience” by becoming an expert in the domain.

Mr. H, maybe a survey on top reasons to work with a startup next time?

Niko Skievaski is co-founder of Redox.

Readers Write: The Journey to Value-Based Care: Lessons Learned from Aviation

April 27, 2015 Readers Write 1 Comment

The Journey to Value-Based Care: Lessons Learned from Aviation
By David Nace, MD

The Affordable Care Act (ACA) and healthcare reform have impacted providers in all aspects, from the way they are and will be paid to how they engage patients. To meet the deadlines and demands of an industry shifting to value-based care (VBC), physicians must change their thinking from independent to team-oriented in order to succeed in this new world.

VBC is empowering an evolution within the overall healthcare community, especially among physicians, enabling a focus on delivering high-quality care to patients. The Meaningful Use and EHR certification programs have helped all provider organizations get closer to the more meaningful use of information technology, but the requirements also pose many challenges for providers.

These challenges should not be met with resistance; the physician community should embrace the call for change, much as the aviation industry did when reform required pilots to adopt new methods of communication and technology to ensure safer flights.

Traditionally, physicians are independent and competitive in nature. They didn’t go through rigorous selection and testing over nearly eight years of higher education to merely coast by – they have an innate drive to be successful and help people. Value-based care, in theory, plays to their personality traits and gives them the motivation to achieve even higher goals.

However, physicians have a hard time trusting data or measures that they do not understand, especially when they have no control over, or input into, how they are evaluated. For example, a 2014 survey of 4,000 physicians found that 78 percent reported patient satisfaction ratings moderately or severely affected their job satisfaction, and 28 percent had considered quitting their jobs or leaving the medical profession.

Compounding this, most organizations have not put in place the communication, technologies, and data collection sources and processes needed to understand the measurements being imposed on them. To tackle this challenge, hospital executives and physicians need to improve physician communication and transparency with regard to measurement.

Pilots faced a similar disconnect during the 1980s. Training occurred in an apprenticeship model: a pilot learned from a “master” and absorbed that mentor’s personal techniques and strategies. It really was a “master craftsman” mentality of mentorship.

This method of training led to variations in practice and contributed to aviation’s high accident and death rates. The practice was not based on teamwork or on leveraging technology for standard operating procedures. The Global Positioning System (GPS) and Cockpit Resource Management (CRM) were not yet utilized; everything rested on the techniques and approach of individual pilots. To understand the technologies introduced into the cockpit and to improve the quality of flight, pilot training changed to a team-based approach that focused heavily on communication and transparency, data, and standard operating procedures.

There is a similar revolution coming to the world of medicine. Many of the physicians of tomorrow are beginning to prepare through team-based, information-driven training. Young physicians in training are being proactive in understanding the methodologies and technologies of today and starting grassroots movements — for instance, Primary Care Progress — to inform and inspire newcomers to the industry. Medical students are increasingly being trained in groups (versus one-on-one) to leverage the concept of teamwork and to better understand the evolving healthcare industry and their role in the transformation.

Change is inevitable in any organization. New rules, methods, and technologies will always cause a shift. These transformations should not and cannot be met with resistance, but with an open mind, as everyone needs to work together toward the end goal.

Pilots needed to adapt and alter their training and methodologies during flight to fly in a safer, more efficient manner. Similarly, providers must do the same with value-based care. The more collaboration, the smoother the ride will be.

David Nace, MD is vice president and medical director of McKesson Technology Solutions.

Readers Write: Why Some Physicians are Opting Out of Meaningful Use Attestation

April 3, 2015 Readers Write 3 Comments

Why Some Physicians are Opting Out of Meaningful Use Attestation
By Charles Settles

Since its inception, the Meaningful Use Incentive Program (MUIP) has paid out nearly $30 billion worth of incentives, but a rising number of physicians are opting out. Why?

2011, the first year of the MUIP, saw widespread interest. Nearly 200,000 eligible providers (EPs) and over 3,000 hospitals completed registration for either the Medicare or Medicaid version of the program, according to the latest summary report from CMS. However, much of that original momentum appears to have been lost: 2014 saw fewer than 73,000 EPs and just 108 hospitals register across both programs.

Altogether, 515,158 registrations have been completed by EPs across both programs, with 415,550 unique EPs receiving an average of $25,190 in incentive payments. According to CMS’s latest data, just over half of eligible providers have received an incentive payment. But what about the other (at least) 40 percent?

It can’t simply be a question of eligibility. According to Medscape’s 2014 EHR Report, only 22 percent of physicians are abandoning or have never supported the MUIP, but examining the CMS summary report suggests a much higher rate of attrition — only 23 percent of the 260,900 physician EPs who received a payment in 2013 received one in 2014. That translates to an attrition rate of just under 77 percent.
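
The attrition arithmetic above can be sketched in a few lines. The inputs are the figures quoted in this article, not a fresh pull of the CMS source data:

```python
# Attrition arithmetic from the CMS figures quoted above. These inputs
# come from the article's own numbers, not from the CMS source files.
paid_2013 = 260_900        # physician EPs who received a payment in 2013
retained_share = 0.23      # share of those who also received one in 2014

paid_both_years = paid_2013 * retained_share
attrition_rate = 1 - retained_share

print(f"Paid in both years: {paid_both_years:,.0f}")   # 60,007
print(f"Attrition rate: {attrition_rate:.0%}")         # 77%
```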

When considering the payments by stage of the program, the numbers for physicians are even worse — only 5.7 percent of those physicians who received a payment for the first stage have received one for the second. More physicians will complete Stage 2 eventually, but the odds of reaching CMS’s 75 percent adoption rate target by 2018 appear to be growing longer. The carrot simply hasn’t been enough.

The stick may not be enough either. If the 75 percent adoption rate target is not met, reimbursements stand to be cut by up to five percent. The average family physician, arguably the primary focus of the MUIP, receives about $100,000 per year from Medicare reimbursements, according to Dr. Jason Mitchell, former director of the American Academy of Family Physicians’ Center for Health IT. Since the penalties increase by one percent per year beginning with a one percent penalty in 2015, a physician receiving $100,000 annually could lose up to $10,000 in reimbursements through 2018. For some, the penalty is a small price to pay to not have to deal with requirements that they feel prevent them from delivering better patient care.
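
A quick check of that penalty math, assuming (as the paragraph above does) a flat $100,000 in annual Medicare reimbursements and penalties of one through four percent across 2015-2018:

```python
# Cumulative MU penalty sketch using the article's assumptions: $100,000
# in annual Medicare reimbursements and a penalty that starts at 1 percent
# in 2015 and grows by one point per year through 2018.
annual_medicare_revenue = 100_000
penalty_rates = {2015: 0.01, 2016: 0.02, 2017: 0.03, 2018: 0.04}

total_lost = sum(rate * annual_medicare_revenue for rate in penalty_rates.values())
print(f"Total lost through 2018: ${total_lost:,.0f}")  # $10,000
```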

Dr. S. Steve Samudrala, medical director of America’s Family Doctors, was an early proponent of electronic health records, patient engagement, and other medical software systems. It seems ironic that Dr. Samudrala does not participate in the MUIP. Though his EHR (eClinicalWorks) is fully certified through Stage 2, Dr. Samudrala feels the reporting requirements for primary care physicians would prevent him from delivering the high quality, personal care his patients have come to expect.

He does acknowledge, though, that many independent primary care physicians have little choice in the matter — the incentives can make or break some smaller practices. Payments are shrinking, competition from hospital-owned groups is increasing, and medical practice brokers keep calling. Dr. Samudrala’s bet isn’t on incentives — he and a growing number of primary care physicians are proponents of what’s coming to be known as “direct primary care.”

The idea behind direct primary care, sometimes called “concierge medicine,” is to remove the expensive bureaucracy and processes associated with billing insurance or government programs and offer services directly to patients for a monthly or annual fee, supplemented by small co-pays. Though the number of successful direct primary care practices is small, and the trend doesn’t solely explain the number of physicians opting out of the MUIP, rising interest in the concept makes it worth mentioning.

Ultimately, the MUIP will likely be viewed as a success if widespread adoption of health IT was the goal. Adoption doubled between 2009 and 2013. Even if physicians don’t meet all the requirements to receive incentives, the benefits of health IT to providers, payers, and most importantly patients cannot be denied. We’ll likely see even more attrition from the MUIP with the announcement of the Stage 3 rules, but despite the growing disillusionment with the program, EHRs and other health IT are here to stay.

Charles Settles is a product analyst at TechnologyAdvice.

Readers Write: Your Interfaces Suck Because You Want Them To

March 30, 2015 Readers Write 7 Comments

Your Interfaces Suck Because You Want Them To
By T. Ruth Hertz

Your interfaces suck because you want them to. Yup, that’s the stone-cold reality.

I am looking at you, Mr./Ms. CIO. You may talk all day about interoperability, data normalization, HIEs, standards, etc., but unless the right data in the right format gets to the right place at the right time, you are wasting time and money and possibly risking patient safety.

But wait, you say. We insist that all applications have HL7 interfaces – we even put it in the contract! Yes, maybe you do, but do you take the time to get and review detailed specifications before you sign the contract? Do you require the vendors to demonstrate interfacing their application with the ones you already have? Not just give you a list of other clients that have “the same systems as you,” but actually connect their system to your engine and downstream application test environments? How well would the physicians at your institution react to being given a list instead of a demo and/or site visit?
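
To make the “right data in the right format” point concrete, here is a toy sketch. The segment below is a simplified, hypothetical HL7 v2 PID fragment, not a complete or validated message; real interfaces trip over exactly these field-position and component-order assumptions, which is why spec review and a live demo matter:

```python
# Toy illustration of why "we have an HL7 interface" guarantees so little.
# Two vendors can both speak HL7 v2 yet disagree on field positions or
# component order. This PID segment is a simplified, hypothetical example.
raw_pid = "PID|1||123456^^^MRN||DOE^JANE||19700101|F"

fields = raw_pid.split("|")
mrn = fields[3].split("^")[0]            # one vendor puts the MRN here...
last, first = fields[5].split("^")[:2]   # ...another may order name parts differently

print(mrn, first, last)  # 123456 JANE DOE
```

If the sending system shifts a field or swaps name components, the “standard” interface silently delivers the wrong data, and your engine team inherits the cleanup.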

Do you let your interface experts ask the tough questions during due diligence? If you do, does it matter when the answers are wrong or evasive? Or do you just accept it when the vendor says, “You can fix it in the engine?” Do the interface experts get to go on the site visit, see the interface in action, and talk to the folks that have had to actually make the interface work?

Let’s face the facts:

  • It is in the application software vendor’s best interest to not interface well with other vendors’ apps. Selling a suite of apps that work well together but not well with others makes buying their products as a set look like the smart thing to do.
  • Application software vendors can make their interfaces work. They have the source code and the underlying database. They just need a very good reason to do so – like “no sale” if they don’t.
  • Your interface staff time isn’t free. All the time spent on analyzing, designing, and building workarounds to compensate for deficiencies in the sending and/or receiving applications costs hard money. That time is also time lost from other projects.

It’s time that the decision-makers who buy healthcare apps put a stop to this madness and insist that true interoperability be delivered by the software vendors – or no sale.

Readers Write: A Prescription for Getting Face Time with Doctors at HIMSS

March 30, 2015 Readers Write 1 Comment

A Prescription for Getting Face Time with Doctors at HIMSS
By Chris Lundgren

It’s no secret that it’s getting harder and harder to get face time with doctors. But I’m a sales guy, so I always see a silver lining in everything.

In this scenario, the silver lining is that doctors are just like you and me. They can’t live without their gadgets. Recent studies have shown that 75 percent of doctors own a smartphone and 55 percent use both a tablet and smartphone in their daily work. So while you may have a more difficult time connecting in person with doctors, you can still be very much connected.

The key to engaging doctors today is to use technology when the time is ripe. The upcoming HIMSS conference is the perfect example. It has a huge audience and nearly 60 percent of the attendees are healthcare providers. Let me repeat: thousands and thousands of doctors gathered in one space to live and breathe technology for five days. If that isn’t a jackpot waiting to happen, I don’t know what is.

Problem is, doctors are going to be running from one panel to another at HIMSS, so you can’t expect to get face time with them if you haven’t engaged them prior to the conference. And the way to do that is – you guessed it – through their gadgets. Here’s what I recommend:

  1. Ask them how they’re doing. Doctors are always asking others how they’re doing. Now that the pressure is on doctors to improve patient outcomes and reduce healthcare costs in measurable ways, it’s time to ask doctors how they’re doing and what’s on their minds. A quick and easy way to do that is a survey that asks 1-3 questions such as, “What topic(s) are you most interested in at HIMSS?”, “What do you hope to gain from learning about that topic?”, and “What other concerns are on your mind?”
  2. Make them a HIMSS-only offer. Limited time offers work time and time again because they create a sense of urgency. If you want to ensure that you get some face time at HIMSS, make sure you’re prepared to make offers that will only be available at the conference. Use the results of the survey to develop the offer or gather qualitative insights from your sales reps – what have their conversations revealed about doctors’ needs right now? Take advantage of email marketing for its quick response, analytics, and segmentation capabilities.
  3. Impress them with knowledge. A recent study showed that doctors are always hungry for new research, case studies, and other clinical knowledge that can help them in their work. But here’s the catch (and also the opportunity): they’re often too busy to look for it on their own. Do the work for them by delivering valuable content. Remember, they’re busy, so don’t deluge them with a library of links. Try a short list of statistics or a link to an article to get the conversation started. Tip: Information related to patients is hot right now and there’s a treasure trove of relevant content with a quick search.

Digital engagement is an essential component of any physician communication strategy. However, to maximize the results of such a strategy, the focus should be on quality rather than quantity. In addition, integrating a quality digital campaign with the right mix of print, mail, and telemarketing can optimize any effort. Be sure to get your reps to follow up with doctors on the phone or via email after a campaign goes out. Using this multi-channel approach can boost revenue by more than 10 percent. Good luck at the conference.

Chris Lundgren is VP of strategic sales for Healthcare Data Solutions of Lincoln, NE.

Readers Write: Twenty Things Vendors Need to Know About ONC’s New 2015 (Stage 3) Certification Program, But Were Afraid to Ask

March 24, 2015 Readers Write 7 Comments

Twenty Things Vendors Need to Know About ONC’s New 2015 (Stage 3) Certification Program, But Were Afraid to Ask
By Frank Poggio

On March 20, late on a Friday afternoon, ONC published two drafts of the proposed revisions to the 2015 Test Criteria along with new Stage 3 provider MU attestation requirements. Two separate large documents were published:

  • Electronic Health Record Incentive Program, Stage 3 Draft Rule, (300+ page PDF)
  • 2015 Edition Health Information Technology (Health IT) Certification Criteria, ONC Health IT Certification Program Modifications (400+ page PDF)

The first covers the proposed rules for MU Attestation for Providers under Stage 3. The second addresses proposed test criteria and requirements for vendors and revised operating rules for the Accredited Certification Bodies (ACB).

There has already been a great deal of discussion of the first document, since the MU requirements impact all providers, while the second document, aimed at vendors and system developers, has received little attention. I commented on the MU provider piece on HIStalk earlier this week and will focus now on the impact on vendors and system developers. Some of my vendor clients have been calling and emailing me asking, “What’s changed for us?” Others are afraid to ask.

Suffice it to say there are some major additions and revisions to the test criteria and process that will give system developers heartburn, or maybe a K51.914 (ICD10=ulcer).

Before I dive into the document, let’s remember that back in 2013, ONC disconnected the MU Stages from the certification test versions. The concept that a vendor is Stage 2 or Stage 3 certified is almost meaningless, since a provider could attest to MU Stage 2 using either the modified 2011 test criteria or the 2014 criteria. With the eventual issuance of these new 2015 criteria, for a short period providers can attest to Stage 2 using a vendor’s 2014-certified product or, if available, the vendor’s 2015-certified product.

All 2015 Test Criteria are now referred to as the 170.315 regulations. At this time, these are just draft proposals that will be formally published in the Federal Register on March 30, 2015. Then after a 90-day comment period, some revisions will be made, with the final regulations issued in the July-August timeframe.

Using the last two cycles of draft rules versus final issued regulations, I predict that some 90 percent of what is now proposed will be adopted into law. So fasten your seat belts — here we go. Some highlights (or lowlights?) are:

  1. Privacy and Security (170.315 d1-d7). There are some minor changes in several of these tests, such as access, time-outs, integrity, device encryption, and audit logs. But under 2015 testing, they have become mandatory if a vendor wants to test out on other criteria, such as Demographics. The P&S tests were mandatory under 2011 (Stage 1), then ONC made them optional for 2014, and now they are back in the mandatory column. To paraphrase ONC, it’s all due to the never-ending march of data breaches. An added P&S requirement, stated in the MU regs but not in any specific test criterion, is that vendors must now attest to having completed a HIPAA risk analysis of their product whenever they install new releases or updates. Here’s why: in order for providers to be compliant with MU and HIPAA, they will have to get an attestation from the vendor before they install any update. The provider MU regulations state on page 64: “EPs, eligible hospitals, and CAHs must conduct the security risk analysis upon installation of CEHRT or upon upgrade to a new Edition of certified EHR Technology.”
  2. Demographics (170.315 a4). ONC wants coding for ethnicity and language to support all 900 OMB race and ethnicity codes and the RFC 5646 language codes. But ONC acknowledges that a drop-down list of 900 data elements might cause workflow problems, so it has said a full drop-down list is not required. You just need to show in a test that you support all the codes and can tailor the list for each provider client.
  3. Vital Signs 170.315 a6. All values must have LOINC codes. Data elements have been expanded and pediatric vitals have separate criteria.
  4. Advance Directive (170.315 a17). Now you have to electronically capture and track the AD. No more just check a box and who cares what file drawer it’s in.
  5. Medical Implants (170.315 a20). Must now be tracked and reported.
  6. Social, Psychological, and Behavioral data must now be captured and tracked using LOINC and SNOMED coding. (170.315 a21).
  7. Clinical Decision Support tools must be linked to Knowledge Artifacts formatted in the HeD standard Release 1.2. (170.315 a22).
  8. New “decision support – service” (170.315 g6) certification criterion requires technology to electronically make an information request with patient data and receive in return electronic clinical guidance in accordance with an HeD 1.2 standard.
  9. New CDA standard (170.315 b1). The C-CDA standard is now the single standard permitted for certification and the representation of summary care records. An updated version, HL7 Implementation Guide for CDA Release 2: Consolidated CDA Templates for Clinical Notes (US Realm), Draft Standard for Trial Use, Release 2.0, includes the following changes: new structural elements (new document sections and data entry templates); new document templates for Care Plan, Referral Note, and Transfer Summary; and new sections for Goals, Health Concerns, Health Status Evaluation/Outcomes, Mental Status, Nutrition, Physical Findings of Skin, etc.
  10. CDA system performance (170.315 g6). As part of the focus on interoperability, ONC is requiring performance standards for data transfers of C-CDA/CCR documents. Data transmission of CDAs will be tested for volume and response times.
  11. XDM packaging of View/Download/Transmit and CCR/CCD, with incorporation of industry APIs using the IHE IT Infrastructure standard.
  12. Data Portability has been broken out into Send/Receive as separate components (170.315 b6).
  13. Care plans (170.315 b9). ONC proposes to include the “assessment and plan of treatment,” “goals,” and “health concerns” in the “Common Clinical Data Set” for certification to the 2015 Edition. The “assessment and plan of treatment,” “goals,” and “health concerns” are intended to replace the concept of the “care plan field(s), including goals and instructions” which is part of the “Common MU Data Set” in the 2014 Edition.
  14. CQM (170.315 c1). Has been expanded into separate segments: filters, create, import, and calculate.
  15. Quality Management System (170.315 g4-g5). Now includes an “accessibility technical component” in accordance with the ADA. The QMS must be mapped to a federal guideline or industry standard. (No more home-grown QMS processes/tools.)
  16. Safety Enhanced Design – SED (170.315 g3). Expanded, and requires specific and detailed usability test documentation. ONC recommends following NISTIR 7804, “Technical Evaluation, Testing, and Validation of the Usability of Electronic Health Records,” for human factors validation testing of the final product to be certified. They recommend a minimum of 15 representative test participants for each category of anticipated clinical end users who conduct critical tasks where the user interface design could impact patient safety.
  17. Authorized Testing Bodies (testing agencies) are now required to conduct surveillance (audits) on at least 5 percent of vendor installs (or max of 10) every year to verify that the certified system in fact meets each certified test criteria.
  18. Attestation for price transparency. ONC wants vendors to disclose material system limitations on their websites and in marketing materials. The vendor must also disclose any material add-on costs, such as transaction fees to support interfaces/interoperability, and supply any requesting entity with a reasonably accurate estimate of total system cost. That’s ANY requesting entity, not just prospects or bid requests.
  19. ONC wants monthly reports from the testing agencies on provider complaints and counts of vendor updates and modifications. If the number of updates/modifications exceeds a set number, the ACB is to call the vendor back in for re-testing.
  20. ONC predicts the rules and test criteria will be finalized by mid-summer and vendors will work “aggressively” in 2016-17 to modify products and meet the target date of 2018 to support Stage 3 provider attestations, which will require a full year of calendar data from providers.

ONC estimates that all vendors together will have to invest approximately $300 to $400 million to effect all these changes. They calculate there are 81 unique vendors with certified products, hence an average cost of $4-5 million each, which does not include the time and cost to go through the test process.
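
As a sanity check on ONC’s averages, using only the figures quoted above (the article rounds the result to $4-5 million):

```python
# Per-vendor average of ONC's estimated industry-wide cost, using the
# figures quoted above ($300-400 million spread across 81 vendors).
total_low, total_high = 300e6, 400e6
certified_vendors = 81

per_vendor_low = total_low / certified_vendors
per_vendor_high = total_high / certified_vendors
print(f"${per_vendor_low/1e6:.1f}M to ${per_vendor_high/1e6:.1f}M per vendor")  # $3.7M to $4.9M
```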

ONC states they will continue with the “Gap” test process, meaning that if you passed a test criterion under 2014 and there were no (or minimal) changes for the 2015 criteria, you get a bye. Given the preceding, my advice is that if you’re a vendor that is not yet 2014-certified, you really want to get it done sooner rather than later. My experience tells me that being 2014-certified for as many criteria as you can before the 2015 criteria are cast in stone will be a better place to be.

Lastly, ONC states that the 2015 Test Criteria and Stage 3 Provider MU Attestation rules will be the last stage for MU, but that the rules and test requirements will continue to be revised and expanded as ONC deems necessary. I guess we can next expect Stage 3.1, along with revised test criteria 2015 dot 1, dot 2 … can anyone see a light at the end of this tunnel?

Frank Poggio is president of The Kelzon Group.

Readers Write: Ignorance of the Major EMR Software Vendors is Not Bliss

March 23, 2015 Readers Write 10 Comments

Ignorance of the Major EMR Software Vendors is Not Bliss
By Tyler Smith

We in healthcare IT have found ourselves in a pretty sexy industry. You know that is true when Silicon Valley is practically banging down the doors to get in and KPCB’s John Doerr states that he would really like to see an open source competitor to Epic created. Damn, so Valley money admits it is losing to a slowly built behemoth in Madison – not a brand spankin’ new startup it missed an angel round on.

Needless to say, HIStalk’s Startup columns are a timely addition to the blog. I particularly enjoyed reading Marty Felsenthal’s explanation of the elite JPM conference. Having heard about the conference from banker friends (not HIT colleagues), his column removed much of the mystique. Being a fellow Atlanta resident and having visited the Atlanta Tech Village before, I have also greatly appreciated Michael Burke’s articles on the experiences of an HIT founder in Atlanta.

I recently co-founded a startup that aimed to bring efficiency to the Epic staffing arena by using very simple tools already in place in other industries. I do not want to call it the Uber of Epic staffing – for fear of sounding like a hack – but the basic idea was a connection platform with ratings for Epic certified consultants. While we have put the project on hold due to some shakeups on our technical team and also due to slow buy-in from provider organizations (our target clients), the pause in the action has given me time to reflect on the current state of HIT startups – particularly those looking to nibble on the enterprise EMR vendors’ scope of services.

Along with Mr. H and most readers here, I am usually skeptical when anybody from the outside brings a new idea to the HIT table. For starters, most entrants do not understand the complexity of the hospital/provider organization buyer or the provider organization’s importance in the system. In theory, I love the idea of patient advocacy and patient-centric apps, but if providers or the systems that house them aren’t buying it, you had better have something that patients see as life or death (read: an HIV-curing drug, not a sleep-tracking app) if you want them to fight the entrenched stakeholders for you, or with you, to make your startup relevant enough to truly create positive clinical outcomes.

Secondly, and most importantly, many of these outsiders do not understand the current state of the EMR vendor landscape, and if they do, they arrogantly think they can steal market share while the enterprise systems watch from the sidelines. True, Epic’s and Cerner’s UX can appear very basic from an end-user standpoint, and the enterprise systems do not come close to covering all the functions that could be automated in a hospital or healthcare delivery organization. However, it would be naïve to think that these vendors have no big plans to tackle the remaining un-automated functions in the near future. When they do, unlike many of the new startups, these vendors will be able to simply make an additional sale to their already large client lists instead of having to undergo the arduous process of breaking down doors just to get on the approved software vendor list at a major healthcare system.

The truth is that healthcare IT is a B2B market, not a consumer market. Organizations do not make purchasing decisions overnight. An app may actually do something better than an organization's EMR, but it had better be a lot better before a healthcare provider organization will even consider meeting with the startup's sales team.

This is not to say that clinical apps that could lead to improved clinical outcomes should not be attempted. What I am really saying is that before delving into development, HIT startup founders should take a much harder look at the current state of the EMR market.

Even more importantly, startups should consider the logical next steps vendors will take in their product offerings and research timelines as the massive implementation phase winds down and optimization becomes a priority for the vendors' in-house development teams. If a startup truly has a competitive advantage over these behemoths in developing an EMR-related application, then by all means go for it. If not, it is probably best to develop something far outside the current or near-future EMR vendor scope.

Easy for me to say, I know, as I sit on the sidelines and consult on EMR projects. You can object and say I'm siding with the status quo. Regardless, it pays to do your homework on the massive vendors. They aren't going to crumble, and they certainly aren't going to let their clients adopt products that encroach on their turf without a very solid battle.

In closing, I would ask any hopeful HIT entrepreneur: what is your startup doing that an established EMR vendor could not accomplish with a system update or by adding a new application that seamlessly integrates with its current lineup?

Tyler Smith is a consultant with TJPS Consulting and co-founder of Hitop.co.
