
Readers Write: Future Health Solution

February 1, 2017

By Toby Samo, MD

Health information technology (HIT) has made significant advances over the last two decades. While adoption is not necessarily a good marker of successful EHR usage, EHR adoption among office-based physicians has grown from about 20 percent to over 80 percent, and more than 95 percent of all non-federal acute care hospitals possess certified health IT. HIT implementation has led to improvements in quality and patient safety.

However, many of the goals of increased HIT implementation have been stymied by social and technical roadblocks. A “one type fits all” approach may help reduce training and configuration costs, but there are many approaches to patient care and unique workflows between specialties and among individual users.

Most EHRs are burdened with three major legacy issues:

  1. Technology. Present EHR systems are mostly built on what would now be considered old technology. Some of the ambulatory products and small acute care products have moved onto cloud-based architecture, but most are client-server. While hosting instances of a product reduces the technical expertise needed by the client and can lead to better standardization of implementation, it does not necessarily deliver the advantages of a native, cloud-based architecture.
  2. Encounter-based. EHRs have been built on the concept that interactions with patients (or members or clients) are associated with a specific encounter. This functions well for face-to-face visits and for specific events, but is limiting where longitudinal care is required.
  3. User experience. The user experience has for the most part taken a back seat to functionality in HIT software development. A quick look at most HIT systems reveals a cluttered interface that does not draw the user's eye to the areas that most need attention. Most users access only a small percentage of the functionality that is present within the system, but vendors continue to add functionality rather than clean up the interface.

Platforms have revolutionized the way business is conducted in many industries. Numerous examples have made household names out of companies like Airbnb, Uber, Facebook, YouTube, Amazon, and many more. A platform is not just a technology, but also “a new business model that uses technology to connect people, organizations, and resources in an interactive ecosystem.”

There is a need for a HIT platform that would support the multitude of components necessary to move the delivery of HIT into the next generation. The future health solution needs to use contemporary technology that will have the flexibility to adapt to ever-changing requirements and use cases of modern healthcare. Some of the characteristics of the future health solution are:

  • Open. One of the biggest complaints of users and regulators is the closed nature of many HIT systems. The future health solution needs to be built as a platform that is able to share and access not only data, but also workflows and functionality through APIs.
  • Apps and modules. A modular structure will enable components to be reused in different workflows and encourage innovation and specialization.
  • True, cloud-based architecture. Cloud computing delivers high performance, scalability, and accessibility. Upfront costs are reduced or eliminated, and the technical resources needed by the client are minimized. Management, administration, and upgrading of solutions can be centralized and standardized.
  • Multi-platform. Users expect access to workflows on their smartphones and tablets. Any solution must develop primary workflows for the mobile worker and ensure that the user interface supports these devices.
  • Scalable (up and down). To meet the needs of small and large organizations, the future health solution will need to scale to accommodate changes in client volumes.
  • Analytics, reporting, and big data. HIT systems have collected massive amounts of data. The challenge is not just mining that data, but presenting the information in a way that can be quickly absorbed by the individual user.
  • Searchable at the point of use. All the data that is being collected needs to be readily accessible. Universal search capabilities and the ability to filter and sort on the fly will facilitate easy access to information at the point of care.
  • Privacy and security. The core platform will need to be primarily responsible for the security and privacy of the data. The other modules built on the platform will need to comply with the platform's security and privacy practices, but will not need to primarily manage these issues.
  • Interoperable. The solution needs to adopt present and future data-sharing standards such as FHIR. The open nature of the platform will facilitate access to data (a sketch following this list illustrates this).
  • Internationalization and localization. Internationalization ensures that the system is structured in such a way that supports different languages, keyboards, alphabets, and data entry requirements. Localization uses these technical underpinnings to ensure that the cultural and scientific regional differences are addressed to help with implementation and adoption.
  • Workflow engine. Best practices can change and can be affected by national and regional differences. An easy-to-use workflow engine will be a necessity to help make changes to the workflow as needed by the clients.
  • Task management. Every user has tasks that need to be identified, prioritized, and addressed. Therefore, a task management tool that extends beyond a single module or workflow will be needed.
  • Clinical decision support. Increasingly sophisticated decision support needs to be supported, including CDS, artificial intelligence, and diagnostic decision support. These capabilities need to be embraced by the platform, allowing external decision support engines to interface easily with the other modules.
  • Adaptable on the fly by the end user. Allowing the end user with proper security to make changes to templates and workflows would help improve adoption.
  • User experience. Probably the most significant barrier to adoption of HIT is the user experience. Other industries are way ahead of healthcare in the adoption of clean, easy-to-use interfaces. It is vital that a team of user experience experts be integrally involved in the development process. All user-facing interactions, screens, and workflows need to be evaluated by user experience experts who can recommend innovative ways the user interacts with the system and how information is displayed.
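
To make the open, API-driven sharing described in this list concrete, here is a minimal sketch of how an external app or module might pull data from such a platform over a standard FHIR REST API. The endpoint URL and patient identifier are illustrative assumptions rather than any specific vendor's API, and a production integration would also need SMART on FHIR / OAuth2 authorization and error handling.

  import requests

  # Hypothetical FHIR endpoint exposed by the platform (illustrative only)
  FHIR_BASE = "https://platform.example.com/fhir"

  def fetch_active_medications(patient_id):
      """Query the platform for a patient's active medication orders via a FHIR search."""
      response = requests.get(
          f"{FHIR_BASE}/MedicationRequest",
          params={"patient": patient_id, "status": "active"},
          headers={"Accept": "application/fhir+json"},
          timeout=10,
      )
      response.raise_for_status()
      bundle = response.json()
      # A FHIR search returns a Bundle; each entry wraps one MedicationRequest resource
      return [entry["resource"] for entry in bundle.get("entry", [])]

  for med in fetch_active_medications("example-patient-id"):
      print(med.get("medicationCodeableConcept", {}).get("text", "unnamed medication"))

Because the platform exposes data and workflow through the same open interfaces, the identical few lines would serve a specialty app, an analytics module, or another EHR equally well.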

The HIT industry has hit a wall that is preventing it from developing innovative products that use the newest technology and have an exemplary user experience. A new platform has the potential to support a robust, flexible, and innovative series of products that can adapt to meet the needs of the various healthcare markets globally. Such a project would have to build slowly over time, as does any disruptive technology. The legacy systems and other HIT systems that exist do not have to be excluded, but rather can be integrated into this new platform.

Identifying technology that has privacy, security, data management, and an open structure at its core could lead to the next generation of healthcare management systems. While some of these characteristics are obvious to developers and users alike, it is the sum of the parts that is important. Integrating most if not all of these characteristics into a single model is what can enhance the value of HIT and the delivery of care.

Toby Samo, MD is chief medical officer of Excelicare of Raleigh, NC.

Readers Write: Are You Ready for the Quality Payment Program?

January 18, 2017

By Kory Mertz

With the start of the New Year, the first performance period for the Quality Payment Program (QPP) has officially started. The QPP, part of the MACRA legislation, was passed with strong bipartisan support in Congress and sends a clear signal of the federal government’s accelerating effort to move to value-based payments.

QPP creates two new tracks for Eligible Clinicians (ECs), as program participants are called: the Merit-based Incentive Payment System (MIPS) and the Alternative Payment Models Incentive Program.

MIPS

MIPS consolidates and sunsets three programs focused on ambulatory providers: the Physician Quality Reporting Program, the Value-Based Payment Modifier, and the Medicare EHR Incentive Program for eligible professionals. In 2017, ECs can receive a maximum payment adjustment of plus or minus 4 percent based on their performance in four categories. ECs who are new to Medicare or who bill less than $30,000 or see fewer than 100 Medicare beneficiaries during a year will be exempt from MIPS.
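
As a rough illustration of the low-volume exclusion described above, the sketch below simply encodes the 2017 thresholds as stated (less than $30,000 in Medicare billing or fewer than 100 Medicare beneficiaries, or clinicians new to Medicare). The function name and inputs are illustrative; CMS's actual determination uses official claims data and additional rules.

  def exempt_from_mips_2017(medicare_billing, medicare_beneficiaries, new_to_medicare=False):
      """Return True if a clinician falls under the 2017 MIPS low-volume exclusion."""
      return (
          new_to_medicare
          or medicare_billing < 30_000
          or medicare_beneficiaries < 100
      )

  # Example: a clinician billing $42,000 across 85 Medicare beneficiaries is exempt
  # because the beneficiary count is below 100
  print(exempt_from_mips_2017(42_000, 85))  # True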


In response to significant feedback from the provider community, the Centers for Medicare and Medicaid Services (CMS) has simplified the requirements and made 2017 a transition year to help ECs get used to participating in MIPS. Under this “pick your pace” approach, providers have three general options: submit a minimal amount of test data to avoid a negative adjustment, report for part of the year for a possible small positive adjustment, or report for the full year to maximize their potential positive adjustment.

Alternative Payment Models Incentive Program

The second track of QPP is focused on increasing EC participation in Alternative Payment Models (APMs) (i.e., Accountable Care Organizations, bundled payments, etc.) by offering a 5 percent bonus and exemption from MIPS for ECs who participate in an Advanced APM and meet certain participation thresholds. In 2017, ECs must have at least 25 percent of their Medicare payments or 20 percent of their Medicare patient panel in a CMS Advanced APM to receive the bonus and MIPS exemption. ECs who meet lower payment or patient thresholds have the option to be exempt from MIPS. CMS maintains a published list of qualifying Advanced APMs.
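
A similarly minimal sketch of the 2017 Advanced APM thresholds cited above (25 percent of Medicare payments or 20 percent of the Medicare patient panel through an Advanced APM) is shown below. It simply restates the two tests and is not CMS's official qualification logic.

  def qualifies_for_apm_track_2017(apm_payments, total_medicare_payments,
                                   apm_patients, total_medicare_patients):
      """Check the 2017 thresholds for the 5 percent bonus and MIPS exemption."""
      payment_share = apm_payments / total_medicare_payments
      patient_share = apm_patients / total_medicare_patients
      return payment_share >= 0.25 or patient_share >= 0.20

  # Example: 30 percent of payments flowing through an Advanced APM meets the payment test
  print(qualifies_for_apm_track_2017(300_000, 1_000_000, 150, 900))  # True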

Moving Forward

The overarching framework created in the legislation and initial rulemaking completed by the Obama Administration will continue unchanged in 2017. The Trump Administration will have a chance to put its own twist on the QPP in 2017 by filling in the program implementation details through sub-regulatory guidance (much like CMS has done with the Meaningful Use program) and in 2018 and beyond through rulemaking to establish future program requirements. If Representative Tom Price is confirmed as the Secretary of the Department of Health and Human Services, he may accelerate efforts to reduce provider burden and simplify the QPP.

As providers prepare to participate in the first year of QPP and HIOs prepare to support providers’ success, they should keep the following in mind.

  • While APMs have gained significant attention in recent years, CMS anticipates that the vast majority of providers will participate in MIPS in the early years of the QPP.
  • Providers just beginning to think about the QPP requirements should generate reports to determine which providers are likely to be ECs during the performance period and which will fall under the low-volume exclusion; map out the organization's existing TIN/NPI structure to support decision-making around group versus individual reporting; and scan across the organization to determine existing Advanced APM participation by ECs. If an organization participates in an Advanced APM, a report should be generated covering all participating providers to determine whether they will qualify for a bonus and MIPS exemption under the APM track.

HIOs have the opportunity to position themselves to support providers’ success in QPP. HIOs should ensure they have functionality that aligns with program requirements, including:

  • Implement certified tools to collect and submit electronic quality measures to CMS to support ECs and help them achieve bonus points for the quality performance category.
  • Support ECs' success with a variety of ACI measures, including HIE (send and receive); view, download, and transmit; and submitting information to public health and clinical data registries. Key considerations in determining which measures to support include the exchange environment the HIO operates in, whether certified technology is required to meet the measure, whether the HIO's technology meets the requirements (e.g., providing machine-readable C-CDAs), and the ability to provide ECs with necessary audit documentation.
  • Support improvement activities. For example, “Ensure that there is bilateral exchange of necessary patient information to guide patient care that could include one or more of the following: Participate in a Health Information Exchange if available; and/or use structured referral notes.” A key consideration for supporting improvement activities is whether the HIO has the ability to provide ECs with necessary audit documentation.

Kory Mertz is senior manager of Audacious Inquiry of Baltimore, MD.

Readers Write: Integrating EHRs and PDMPs: A Trend for 2017

December 21, 2016

By Connie Sinclair, RPh

The opioid epidemic will continue to be a big story in 2017 and the statistics get grimmer by the minute. We just learned from the government that more than 33,000 people died from opioid overdoses in 2015, making it the deadliest year ever.

In response, states will continue to enact legislation mandating that prescribers use their state Prescription Drug Monitoring Programs (PDMPs) and will encourage making electronic health records (EHRs) more interoperable with PDMPs by integrating access into prescriber workflows. For example, Massachusetts and Ohio are subsidizing statewide projects to facilitate the integration of the state PDMP into EHR solutions used by providers. PDMP usage has been associated with fewer overdose deaths, and lack of integration into prescriber workflow has been shown to be a barrier to utilization, so we anticipate more states will follow suit.

While PDMP and EHR integration is an important policy goal, making it a reality has been easier said than done. PDMPs are independent, state-run databases of controlled substance prescriptions that have been reported from pharmacy dispensers. They are operational in all states except Missouri. Because PDMP systems have evolved outside the health IT ecosystem, significant barriers to interoperability have resulted. In contrast to electronic prescribing, for example, there is not a standard method to exchange and integrate the prescription drug data available in PDMPs into EHRs.

That is changing. In 2013, the Office of the National Coordinator (ONC) created a pilot initiative to bring together the PDMP and health IT system communities. The goal was to standardize the data format, transport, and security protocols to exchange controlled substance history information between PDMPs and EHRs as well as pharmacy systems. 

These actions are beginning to bear fruit. These pilots have recently concluded, and seven of the 10 participating vendors are now moving PDMP functionality into production, leveraging the pilot's final implementation guide. Appriss has indicated that many EHRs are indeed integrating with its PDMP gateway.

It is clear that 2017 will see increased legislative movement to require EHRs to integrate with PDMPs and prescriber workflows. The ONC pilots have shown a technical path forward. Now is the time for forward-thinking EHRs to capitalize on that progress and get ahead of the legislative curve. It will create competitive advantage, serve as a tremendous value-add to prescribers, act as a proactive means to improve patient care, and potentially save lives.

Connie Sinclair, RPh is director of the Regulatory Resource Center of  Point-of-Care Partners of Coral Springs, FL.

Readers Write: Seven HIT Talent Trends to Watch in 2017

December 21, 2016

By Frank Myeroff

Here are seven talent trends that are shaping the HIT workforce.

  1. C-level title of chief robotics officer rises. Expect more than half of healthcare organizations to have a chief robotics officer (CRO) by 2025. Since healthcare is an industry where robotics and automation play a significant role, the CRO will have a similar status to that of the CIO today within the next few years. The CRO and their team will manage the new set of challenges that comes with Robotics and Intelligent Operational Systems (RIOS). They will translate how to use this technology and how it is linked to customer-facing activities, and ultimately, to organizational performance.
  2. Talent raids to acquire HIT leaders. Top-tier HIT talent is a core factor in the success of any healthcare organization. Yet there is an insufficient talent pool from which to acquire IT leadership. This labor shortage is causing those on the front lines to talent poach from other healthcare organizations. Right now, the competition for highly qualified and experienced leaders is at an all-time high due to several factors including an underinvestment in leadership development and tighter operating margins that influence workforce strategies.
  3. Videoconferencing for telehealth grows in popularity and jobs. While not exactly new, videoconferencing is gaining popularity in healthcare due to the advances in HIT infrastructure and communication as well as the need to serve the aging population and those residing in remote areas. Healthcare practitioners are increasingly adopting these interactive video applications to offer better access to healthcare as well as deliver improved patient care at reduced prices. Additionally, patients are finding benefits to using this real-time, two-way interaction since it enables healthcare providers to extend their reach of patient monitoring, consultation, and counseling. The most popular HIT professionals sought after in videoconferencing are implementation specialists and telehealth directors.
  4. Burgeoning cybersecurity job market. Healthcare organizations of all sizes are in the hunt for skilled cybersecurity professionals. Just about every day there’s a story regarding a data breach incident within the healthcare industry. Many of these incidents could be attributed to unfilled cybersecurity jobs. Since the current demand is greater than the supply, a career in this sector can mean a six-figure salary, job security, and upward mobility. The cybersecurity industry as a whole is expected to grow from $75 billion in 2015 to $170 billion by 2020, according to Forbes.com. In addition, the demand for the cybersecurity workforce is expected to rise to 6 million by 2019 with a projected shortfall of 1.5 million.
  5. Working remotely fully takes off. Working from anywhere and at any time will become the norm. By 2020, it is expected that 50 percent of workers in the US will be working either from home or another remote location. Having virtual employees is not only a way to get things done around the clock, without commuting, and with hard-to-find skill sets, but is also a way to meet the needs of employees who don't live near the organization.
  6. Boomerang employees more common. Boomerang employees are employees who leave an organization only to return to that same employer sometime later. Rehiring these former workers is on the rise. With HIT talent at a premium, it only makes sense. HIT managers know that hiring back someone they know is easier than recruiting new blood, plus it saves money on training and development. In addition, there's an immediate ROI.
  7. 3D technology careers wide open. Everyone is talking about 3D printing these days. It is expected to be the top medical innovation in 2017 because it could change everything for transplants and prosthetics through customization. As the 3D industry continues to evolve in 2017, the job market is wide open. In fact, jobs are appearing faster than candidates can be recruited. Young HIT professionals, especially software developers, should see this market as having huge potential for beginning a new career.

Frank Myeroff is president of Direct Consulting Associates of Cleveland, OH.

Readers Write: The Six Bedrocks in a Post-Trump Healthcare Landscape

December 12, 2016

By Steve Levin

With a Trump administration and Republican-led Congress on the horizon, a shift in the direction of national healthcare policy is a near certainty. But the exact nature and timing of that change might be, unfortunately, less clear. Based on the principles outlined by Trump’s team themselves, the history of appointees, and conversations with clients and industry pundits, it feels as if there are some bedrock themes to orient efforts while Washington turns over and argues its way forward.

  1. Expect more creativity from payers. Multiple factors are at play here. Moving the locus of health insurance requirements from the federal level to individual states will promote flexibility. With the pullback on the individual mandate, the Bronze, Silver, Gold, and Platinum plans will go the way of the floppy disk drive. Couple this with increased incentives for consumers to set up HSAs and take control of their health insurance purchases, and payers can let loose their product design teams on new solutions to meet the range of consumer challenges.
  2. Consumers will end up paying for a larger share of their healthcare. There is simply no money left in the checking accounts of government (federal, state, or city) or employers to fund the growth in healthcare costs. Add on more plan innovations, the disappearance of the individual mandate, and Medicaid expansion being reined in, and the future for the consumer is pretty clear. If we have insurance, we are going to be paying more in the form of co-payments, co-insurance, and deductibles. More procedures will go from covered to uncovered. Many consumers will end up on the far end of the insured continuum, namely uninsured.
  3. Bundles and risk-based reimbursement will march forward. Over the past several years there have been pilots, tests, and more pilots and tests comparing and contrasting fee-for-service to something along the lines of pay-for-value. CMS has led the charge. While the incoming leadership has historically been less bullish on all the pilots and innovations, the results to date do suggest bundles can create positive care integration and control total costs. Readmission penalties, while still rough, are raising an issue that organizations know they need to tackle. Certainly the current risk programs are not polished and perfect, but they are driving integration around the patient and toward higher value at an overall lower cost. So build out those teams of contract modeling talent; continue the march toward building your own insurance solution; and figure out how you can process those contracts amid clinical workflows and revenue cycle in volume.
  4. Time to become patient relationship experts. Combine items 1, 2, and 3 and a fourth bedrock principle emerges—specifically, figuring out how providers manage the patient relationship both clinically and financially before, during, and after treatment. This relationship will become of paramount importance. Moving forward, the patient is going to control a great deal of our cost structure and cash flow. Providers need to be proactive to shape patient decisions.
  5. Extracting more value from every budget dollar will be table stakes. Every scenario comes back to the same operational mandate: lower operating costs and improve the impact of every activity. Eliminate the 20 to 30 percent of processing work that is predictably of no value or impact. The double whammy in my reading of the future is that every activity is more expensive when the counterparty is the patient and not a commercial or government payer. It is simply more expensive to manage patients than a large business partner. So regardless of how Washington reshuffles the ACA, healthcare processes need to be more efficient at every turn.
  6. Time to get more ROI from those EHR investments. Organizations spent millions on big-iron electronic health records and went through the agony of stabilizing processes. Now it is time to actually optimize those platforms using the higher quality information at hand. Using predictive analytics to reduce low and no-value efforts (see point five), optimizing insourcing and outsourcing logic, and targeting high-cost patient engagement processes are just examples of how these bedrock systems can begin to finally drive financial improvement.

Only time will tell what Washington actually decides and when those decisions truly have bearing on the thousands of hospitals and millions of patients. However, while the exact policies and processes are TBD, the six bedrock items listed here will most likely remain enabling and contributing factors regardless of the final rules and regulations.

Steve Levin is CEO of Connance.

Readers Write: How Trumpcare Could Win Big

December 12, 2016

By E. Todd Bennett

Government involvement in the healthcare industry has increased under HITECH, the Affordable Care Act (ACA), and now MACRA. The phrase, “large-scale change happens when customers demand it, suppliers agree on it, or the government mandates it,” certainly applies to healthcare and has played out in these legislative acts. These federal government initiatives, except MACRA (since the quasi-final rule was only recently published), have failed to improve quality and bend the cost curve in a broad and dramatic way to put the United States healthcare system unequivocally in a worldwide leadership position.

On the cusp of a new administration, it’s important to understand why these legislative acts aren’t dramatically improving healthcare quality and reducing costs.

Overall, incentives seem misaligned with the healthcare industry's goals related to cost and quality. In fact, the definitions of those goals seem fuzzy or missing altogether. For instance, we do not know the specific cost and quality goals to target for a total knee replacement or the defined cost and quality outcomes related to lifestyle-related chronic disease.

Instead of incenting attainment of specific cost and quality outcomes, existing regulation has incented the intermediate activities, behaviors, and organizational structures that some legislators and industry leaders believe will aid in reaching the outcomes. Even when the intermediate actions seem productive, the lack of compelling results leads to a conclusion that the actions are, at best, incomplete. The right combination of processes to achieve the desired cost and quality outcomes is not always clear, and in the absence of evidence-based clarity, practitioners need maximum flexibility to act in accordance with their training and experience.

By shifting to incentives based on optimal quality and cost outcomes, the Trump administration has an opportunity to reduce administrative burden from government agencies, reduce the compliance burden from healthcare organizations and practitioners, and create a competitive and innovative environment that is truly driven to achieve world-leading healthcare quality and cost-of-care goals.

Let me explain with some examples.

HITECH

While a digitized and connected ecosystem and at least aspects of electronic health records (EHRs) are surely part of the long-term solution to higher quality and lower costs, incenting adoption of EHRs and telling providers what stepwise features constitute Meaningful Use is an industry-wide micro-management mandate. This movement to automate so many processes may be ineffective, inefficient, or both. The EHR is a tool, a complicated and expensive one, and like other tools available to providers, it has the potential to enhance certain clinical and administrative activities and/or become a source of frustration and waste.

Shifting incentives from Meaningful Use of EHRs to attainment of a desired combination of higher quality outcomes for care and lower cost gives providers the option to select and de-select the technologies that impact cost and outcomes the most. Providers who use EHRs or certain features may have a clear advantage, and if so, competition among providers would spur increased adoption of those features. In this scenario, the government defines the optimal quality/cost outcome at population and/or episode levels along with incentives for attainment and foregoes defining which EHR functionality is most important; the market will decide which technological features should be meaningfully used to help them achieve the goal.

ACA

Take the ACA’s formulation of Accountable Care Organizations (ACOs). ACOs use incentives and penalties to drive a more coordinated care delivery environment with the potential to reduce unnecessary care, increase patient safety, and lead to higher quality outcomes. An ACO has the best opportunity to impact quality and cost when patients get their care within the ACO network, but when patients go outside the ACO network of practitioners, care coordination wanes, reducing the opportunity to optimize quality and cost.

Unless incentives to coordinate care extend to every doctor who cares for a given member, and not only to doctors who participate in the constrained provider organization, ACOs will continue to have blind spots that keep them from having the desired impact. The structure of the ACO and the incentives to coordinate care are not the ultimate goals, and even brilliantly coordinated care in the absence of other behaviors will fail to produce higher quality and lower cost. If healthcare providers are convinced of the benefits of coordinating care, they will facilitate care coordination regardless of whether the patient sees an in- or out-of-network provider, using whatever technology they deem appropriate. Once again, this reduces government involvement in managing care, reduces administrative and technical complexity for providers to what the provider deems appropriate, and creates a competitive and innovative environment where reaching the ultimate goal is rewarded.

MACRA

Incenting practitioners who treat Medicare patients with a potential bonus valued at less than a tenth of their total reimbursement from Medicare, using quality metrics reported two years prior to the incentive payment, and thinking that it will change practitioner behavior seems aspirational. Incentivizing process metrics and clinical practice improvement activities seems to have merit, but clinicians seem better positioned to define the process metrics and improvement activities themselves and incent their care delivery teams to operationalize them. Meanwhile, the federal government seems best suited to craft a measurement system for an optimal combination of quality and cost outcomes and a timely incentive program to reinforce those behaviors.

Resetting legislation and the associated rules to motivate our nationwide healthcare system to be the world-recognized leader requires understanding of granular outcome goals, prescribing fewer actions around how provider organizations function to give room for innovation, and aligning incentives that facilitate competition and reward successful attainment of the ultimate cost and quality goals.

If Trumpcare — whether a revision of Obamacare or something wholly different — can shift the role of the federal government to defining targets and driving the healthcare industry with incentives to reach them, American ingenuity, resourcefulness, and competitiveness will take over like never before and attainment of quality and cost containment goals will follow.

E. Todd Bennett is healthcare market leader for LexisNexis Risk Solutions.

Readers Write: Not Just Ransomware: Common EHR Threats You Need to Know

December 12, 2016

By Robert Lord

It is no secret that data breaches are becoming more common and more expensive. New threats to patients' electronic health records (EHRs) are constantly emerging, forcing healthcare organizations to be on the lookout for potential dangers so they can eliminate threats quickly. It is important for organizations to understand the array of potential threats to the EHR, allowing them to make decisions on how best to protect this sensitive data.

After talking with healthcare stakeholders inside hospital systems, the federal government, and elsewhere, and distilling themes that continually come up, I thought it would be useful to share what I've learned.

Think Twice Before Opening That Email — Phishing and Social Engineering

Phishing scams represent a very real danger to EHRs, but they are often overlooked by healthcare organizations because those organizations assume such threats cannot break through their security. Phishing scams are email or social engineering attacks that try to appear legitimate in order to get healthcare employees to release patients' sensitive medical information. Such attacks often use email or website scams to either target patients' information directly or to obtain an employee's username and password, thereby gaining access to that organization's entire EHR.

Just recently, a phishing email disguised as official OCR Audit communication about Phase 2 Audits went out to healthcare organizations. Thankfully, it was only a misguided attempt at marketing for a cybersecurity firm, but it could have been much worse. In December 2014, an employee of Seton Healthcare Family opened a scam email. The resulting breach released the medical record numbers, Social Security numbers, insurance information, demographic information, and clinical data of 39,000 patients.

Nevertheless, even if phishing attacks are not the cause of a breach, they can still represent a threat. After the massive breach of Anthem Inc., for example, affected patients began receiving scam emails that promised them free credit monitoring, thus demonstrating that phishing attacks remain a threat even in the wake of a data breach.

Star-Studded HIPAA Violations Can Be Costly — VIP Patient Privacy

The temptation to peek at the medical record of a celebrity or public figure represents a real threat to patient privacy. VIP patients deserve the same right to privacy as the general public, and steps need to be put in place to guarantee that their sensitive information is kept safe and the treating medical facilities stay out of the headlines.

In 2011, UCLA Health System came to a settlement with the federal government, agreeing to pay $865,000 after two unnamed celebrities alleged that UCLA employees had viewed their medical records without authorization. Two years before that, in 2009, California health regulators fined Kaiser Permanente $250,000 after some of its employees looked at the medical record of Nadya Suleman, the famous mother of octuplets. Unfortunately, there are many other examples of employees being fired or healthcare organizations being fined because they did not protect the privacy of their VIP patients.

The Family Doesn’t Need to Know Everything — Snooping Threat

The desire of relatives, friends, or even co-workers to snoop into patients' records often results in messy – and costly – data breaches. In 2013, a nurse accessed the records of her nephew's partner without authorization and saw that her nephew's partner had given birth to a baby and put the child up for adoption five years earlier. The nurse then announced the news at a family funeral. After the victim sent a complaint to the hospital, the nurse was terminated and gave up her Florida nursing license.

A similar lawsuit involving Aspen Valley Hospital District and a former employee is currently ongoing. A former employee of the hospital, who was also a patient there, alleged that several employees of the hospital violated his privacy when they disclosed that he had HIV “as a piece of conversational gossip over drinks.” The unnamed patient is currently seeking an apology, compensatory damages, punitive damages, and attorney fees from the hospital. These are but two examples of how devastating these seemingly small breaches can be to the affected patients.

The Biggest Threat to Patient Privacy is Hiding in Plain Sight — Insider Threat

Some of the most dangerous threats to EHRs are criminal insiders. In this type of attack, an employee of a healthcare organization steals patient information from the inside, using his or her access to do so. Earlier this year, Jackson Healthcare Systems found out the hard way how dangerous these threats can be. In February, the health system reported that one of its employees had gone “rogue” and stolen the information of 24,000 patients over the course of five years. The stolen information included names, birth dates, home addresses, and Social Security numbers. As the Jackson Healthcare Systems example demonstrates, these breaches are so dangerous because they are so difficult to detect. In this case, it took five years before the organization was able to identify and eliminate the insider threat.

Business Associates and Contractors

Business associates and contractors within healthcare organizations represent a growing vulnerability for the EHR, especially in recent years. The US Department of Health and Human Services (HHS) established the Omnibus Rule in 2013, which required the business associates of healthcare organizations to adhere to the HIPAA Rules. Unfortunately, there is still much work to be done to address this vulnerability.

In July of this year, Catholic Health Care Services, a business associate for six skilled nursing facilities, agreed to pay $650,000 for HIPAA violations after a mobile device was stolen. The data breach affected 412 patients. Moreover, this is not an isolated incident; according to a report from Protenus and DataBreaches.net, 30 percent of all data breaches in the first eight months of this year involved a business associate of a healthcare organization. In other words, 4.5 million patients have been affected by data breaches of third parties thus far in 2016.

Lost and Stolen Devices

One final threat to EHRs is lost and stolen devices, including laptops and mobile devices. If the information on the lost device is not encrypted or the encryption is not working, all someone has to do is open the device and look at the information for a breach to occur. And if the device was stolen, the criminals do not even have to decrypt the information to be able to use it.

One example from this year involves Seim Johnson, an accounting and consulting services company. In February 2016, Seim Johnson reported to HHS that a laptop had been stolen. The encryption on the laptop malfunctioned, exposing the private information of almost 31,000 patients. And these types of breaches are becoming increasingly frequent, with Verizon’s 2015 Data Breach Investigation Report stating that 45 percent of all healthcare data breaches are the result of stolen devices.

Knowledge is Power

As more and more healthcare organizations make the switch from paper to electronic health records, it will become increasingly important for organizations to be able to protect their patient records. Of course, this also means that threats to EHR will become more varied and more sophisticated. Healthcare organizations must be well informed about the different types of threats that exist so they can put security measures in place to effectively combat them, and ultimately protect the privacy of their patients.

Robert Lord is co-founder and CEO of Protenus of Baltimore, MD.

Readers Write: 5 Common Clinical Information Blind Spots

November 28, 2016

By Sandra Lillie

The volume of data moving into VNAs (vendor neutral archives) is exploding – expected to reach 1.4 billion objects by 2017 – and approximately 75 percent of these objects will be non-DICOM assets. To date, many hospitals don't have a formal strategy addressing how to identify, import, and manage non-DICOM images and video as part of core image management and security efforts. This puts these organizations at risk of exposing PHI (protected health information).

Moreover, these assets often aren’t included in or accessible from the EHR (electronic health record). These holes in the health record provide clinicians with an incomplete picture of the patient that can negatively impact diagnoses, treatment plans, and ultimately, outcomes.

With increased scrutiny being placed on healthcare organizations to tighten up security efforts to protect patient data, and an industry-wide movement toward greater interoperability and patient-centered care, the need to establish centralized insight and control of non-DICOM assets has never been more important. However, this can be a significant challenge because of all the systems, devices, and media throughout an HDO (healthcare delivery organization) on which these images reside.

The departmental nature of care delivery in the past has created a plethora of locked and blocked silos that contain critical clinical images an organization may be unaware even exist. Identifying and consolidating these assets as part of an enterprise imaging strategy allows for the deployment of a more complete EHR while reducing the costs locked in departmental system solutions. The key is to identify areas throughout the HDO where the largest numbers of unconnected and potentially valuable non-DICOM images are likely to reside. Bringing these images into the fold first can address some of the biggest risk areas while adding the most clinically relevant patient information to the health record.

The following are five of the biggest sources of non-DICOM blind spots in hospitals and health systems.

1. Visible light images and video. This source is fairly convoluted because of all the areas of the hospital where visible light images and video are captured and stored. However, they are all important, whether they’re endoscopy or colonoscopy images from gastroenterology; ureteroscopy or cystoscopy images from urology; or laparoscopy images from OR/surgery. It’s vital to identify all of the producers of visible light images and video throughout the hospital and implement technology solutions that allow those assets to be captured and imported in their native formats from a wide range of video scope systems and processors.

2. Dermatology and plastic surgery. Many dermatology and plastic surgery departments have specialized imaging systems that capture high-definition (and sometimes 3D-rendered images) of everything from routine skin conditions to complex reconstructive surgery. These images are important pieces of the clinical narrative that are often missing from a patient’s electronic health record because of the isolated and proprietary nature of many of these systems.

3. Ophthalmology. Ophthalmology departments also routinely leverage specialty systems that capture images of the retina, cornea, and other features of the eye. A complete picture of a patient’s eye health can only be obtained by including images from these specialty systems in an overall enterprise imaging strategy.

4. Mobile devices. The healthcare industry today is increasingly mobile. Clinicians at the point of care (especially in emergency rooms) routinely capture images of wounds, allergic reactions, skin anomalies, and more in the exam room on their smartphones and tablet devices. Capturing, consolidating, and managing these photos as part of an enterprise imaging strategy can be challenging, particularly in healthcare environments that have adopted a BYOD (bring your own device) mobile policy. A technology that can be installed on mobile devices to encrypt and route medical images from these devices to a central PACS, VNA, or EHR while ensuring no image data is saved to the device camera roll is essential (a minimal sketch of this capture-encrypt-route pattern follows this list).

5. CD/DVD media. This is another convoluted source of non-DICOM (and potentially even DICOM) images and video. Practically any medical department that leverages imaging in some way, shape, or form has (at one point or another) stored old patient images on CDs or DVDs. These images are likely rarely, if ever, accessed by clinicians and are completely disconnected from the EHR. It is important that the pertinent historical imaging data contained on this media is imported into an enterprise imaging platform and reintroduced to the patient record.
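
As a purely illustrative sketch of the capture-encrypt-route pattern called for in item 4 above, the snippet below encrypts an image in memory and posts it to a hypothetical imaging ingest endpoint without writing anything to local storage. The endpoint URL, key handling, and metadata fields are assumptions; a real product would add user authentication, DICOM wrapping or FHIR Media creation, and audit logging.

  import requests
  from cryptography.fernet import Fernet

  # Hypothetical enterprise imaging ingest endpoint (illustrative only)
  INGEST_URL = "https://imaging.example.org/api/mobile-capture"

  def encrypt_and_route(image_bytes, patient_mrn, key):
      """Encrypt a captured image in memory and send it to the imaging platform.

      The raw image stays in memory; nothing is saved to the device camera roll.
      """
      token = Fernet(key).encrypt(image_bytes)  # symmetric encryption of the raw bytes
      response = requests.post(
          INGEST_URL,
          files={"image": ("capture.enc", token, "application/octet-stream")},
          data={"patient_mrn": patient_mrn, "source": "mobile"},
          timeout=15,
      )
      response.raise_for_status()
      return response.json()

  # In practice the key would be provisioned and managed by the organization,
  # and image_bytes would come straight from the camera API rather than a file.
  key = Fernet.generate_key()
  print(encrypt_and_route(b"...raw image bytes...", "MRN-0001", key))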

These five sources of medical imaging clinical blind spots are just a sample of the areas to keep in mind in pursuing an end-to-end enterprise imaging strategy. As the industry moves further down the path toward delivering true personalized medicine, other emerging areas – such as pathology and genomics – will be important to consider in an effort to produce and maintain a comprehensive patient record for clinical use.

Furthermore, HDOs also sometimes forget that additional unstructured information (such as documents) exists within other departmental systems and provides another source of important clinical information. A well-articulated and focused enterprise imaging and healthcare content management (HCM) strategy with a reputable partner capable of delivering the necessary interoperability requirements can put an HDO on the path to delivering a truly comprehensive EHR.

Sandra Lillie is industry manager for enterprise imaging for Lexmark Healthcare.

Readers Write: The Next Phase for Recovery Audits

November 16, 2016

By Nicole Smith

Healthcare providers have reveled in the abatement of audits by recovery audit contractors, which have been silent during the last two years of legal challenges and a procurement process that resulted in a tremendous reduction — and in some instances, a pause — in recovery audits. During this down time, the Centers for Medicare and Medicaid Services (CMS) has been working to procure new audit contracts (which it has now done) while dealing with post-award protests and growing concerns from the provider community about the administrative burden audits impose, as well as the methodology by which the contractors had been auditing.

CMS has said multiple times that it is committed to maintaining the integrity of the Medicare program, but its latest priority has been reducing provider burden. With contracts finally awarded to Cotiviti LLC, Performant Recovery, Inc. and HMS Federal Solutions performing post-payment audit reviews for Medicare Part A and Part B, CMS added a new fifth region that will be dedicated to identifying improper payments for durable medical equipment and home health and hospice providers. The fifth region was awarded to Performant Recovery, Inc.

Providers can expect to see some program enhancements that will improve the provider experience once the new contractors resume auditing. Providers should familiarize themselves with the upcoming changes and revise their workflow to efficiently handle Medicare audits.

While recovery audits can impose a tremendous administrative burden on a provider and can have a negative financial impact on a health organization, developing a plan to manage the audit process may prove to be beneficial for providers. For a process that has been largely paper-based up to this point, CMS implemented changes over the past two years to streamline the audit submission process after contractors issued more than 2 million requests annually. CMS recognized the need to develop an electronic process so that providers and health systems could respond to audit contractors electronically, without paper.

The Electronic Submission of Medical Documentation (esMD) program was developed as part of a strategic plan to transform business operations and uphold CMS's commitment to modernize business processes, streamline medical documentation submissions, and sustain enrollment gains in the Medicare program.

Providers have long felt that the contingency fee basis on which recovery auditors were reimbursed encouraged auditors to target and deny a high volume of high-dollar claims, resulting in false denials and leaving the burden on the provider to appeal the decision – all while the monies paid were recouped. The appeals process can take years and tremendously impacts organizational revenue. CMS has revised the way in which auditors will be reimbursed.

Now, recovery auditors will not receive their contingency fee until after the second level of appeal is completed. Additionally, auditors are required to maintain a 95 percent accuracy rate and an overturn rate of less than 10 percent at the first level of appeal. Failure to comply will result in corrective action for the recovery auditor. This is one of the most notable changes that directly addresses concerns of the provider community.

Further testament to CMS's apparent commitment to minimizing provider burden is the ability for providers to electronically file level one and level two appeals through a CMS Certified Health Information Handler (HIH) for esMD. These new esMD use cases relieve providers of the overwhelming costs of printing, mailing, and tracking supporting audit documentation while also helping to ensure timely filing, the lack of which has historically contributed to denials for providers as well.

Through the updated RAC contract, CMS also will require recovery auditors to provide detailed information about current recovery audit issues. This information is expected to be posted and reviewable on the auditor website for all to see, creating an added level of transparency for the entire process. Providers can proactively prepare for the identified issues by reviewing Medicare billing rules and making sure they are billing in compliance and have all the necessary supporting documentation in the event of an audit. If providers remain focused on compliance and timely filing, recovery audits should have little impact on the provider – at least that's the hope.

In addition to the administrative burden of managing Medicare audits, providers have often felt that they had no direct line of communication with CMS regarding the audit process if they encountered an issue related to an audit. Frustration often grew quickly as providers tried in vain to contact someone at CMS while attempting to address any issues they may have had. From my experience with the program, providers often felt bounced around when trying to locate the appropriate person to speak with. To alleviate this problem, CMS created a new position, a provider relations coordinator, designated as the single point of contact for the provider community. The provider relations coordinator is meant to create a streamlined communication outlet for concerns with the recovery audit program.

With the return of recovery audits on the horizon, providers should use this time to review their internal processes for handling audits and closely monitor regulatory requirements and changes in compliance policies and procedures to develop best practices for their audit program. The program, based on the developments described here, is meant to ensure a more democratic, effective audit process for every party. It is my belief that the program will be less combative, less of a financially-driven attack on health systems by audit contractors, and more of a process designed to right any accidental billing wrongs and return legitimate overpayments to CMS, an equitable approach for all.

Based on the program updates, health systems will have a voice now and will be able to engage CMS directly, if needed, to mitigate any potential overzealousness the previous iteration of the program seemed to create. Perhaps now the audit process will more resemble the image of a negotiating table rather than one where an aggressive takeover seems to be occurring, as was an often-expressed sentiment of those working in the care space.

While program changes may continue, and with all signs indicating that the recovery audit program is here to stay, having a solid plan with proven best practices will minimize the administrative burden. Overall, the news from Washington is good and likely portends better things to come.

Nicole Smith is VP of operations and government services for Vyne of Dunwoody, GA.

The Election Lesson Learned is to be Healthily Skeptical of Analytics

November 9, 2016

By Mr. HIStalk

It was a divisive, ugly election more appropriate to a third-world country than the US, but maybe we can all have a Kumbaya-singing moment of unity in agreeing on just one thing – the highly paid and highly regarded pollsters and pundits had no idea what they were talking about. They weren’t any smarter than your brother-in-law whose political beliefs get simpler and louder after one beer too many. The analytics emperors, as we now know, had no clothes.

The experts told us that Donald Trump was not only going to get blown out, but he also would drag the down-ballot candidates with him and most likely destroy the Republican party. Hillary Clinton’s team of quant geeks had it all figured out, telling her to skip campaigning in sure-win states like Wisconsin and instead focus her energy on the swing states. The TV talking heads simultaneously parroted that Clinton had a zillion “pathways to 270” while Trump had just one, an impossible long shot. The actual voting results would be anticlimactic, no more necessary to watch than a football game involving a 28-point underdog.

The (previously) respected poll site 538 pegged Trump’s chances at 28 percent as the polls began to close. Within a handful of hours, they gave him an 84 percent chance of winning. Presumably by Wednesday morning their finely tuned analytics apparatus took into account that Clinton had conceded and raised his chances a bit more, plus or minus their sampling error.

This morning, President-Elect Trump is packing up for the White House and the Republicans still control the Senate. Meanwhile, political pollsters and statisticians are anxiously expunging their election-related activities from their resumes. They had one job to do and they failed spectacularly. Or perhaps more accurately, their faulty analytics were misinterpreted as reality by people who should have known better.

Apparently we didn’t learn anything from the Scottish referendum or Brexit voting. Toddling off to bed early in a statistics-comforted slumber can cause a rude next-day awakening. Those darned humans keep messing up otherwise impressive statistics-powered predictions.

We talk a lot in healthcare about analytics. Being scientists, we’re confident that we can predict and maybe even control the behavior of humans (patients, plan members, and providers) with medical history questionnaires, clinical studies, satisfaction surveys, and carefully constricted insurance risk pools. But the election provides some lessons learned about analytics-powered assumptions.

  • It’s risky to apply even rigorous statistical methods to the inherently unpredictable behavior of free-will humans.
  • Analytics can reduce a maddeningly complex situation into something that is more understandable even when it’s dead wrong.
  • Surveyors and statisticians are often encouraged to deliver conclusions that are loftier than the available data supports. We humans like to please people, especially those paying us, and sometimes that means not speaking up even when we should. “I don’t know” is not only a valid conclusion, but often the correct one.
  • Be wary of smoke-blowing pundits who suggest that they possess extra-special insight and expertise that allow them to draw lofty conclusions from a limited set of data that was assembled quickly and inexpensively.
  • Sometimes going with your gut works better than developing a numbers-focused strategy, like it did for Donald Trump and for doctors who treat the patient rather than their ICD-10 code or lab result.
  • Confirmation bias is inevitable in research, where new evidence can be seen as proving what the researcher already believes. The most dangerous bias is the subconscious one since it can’t be statistically weeded out.
  • A study’s design and its definition of a representative sample already contains some degree of uncertainty and bias.
  • Sampling errors have a tremendous impact. We don’t know how many “hidden voters” the pollsters missed. We don’t know how well they selected their tiny sampling of Americans, each of whom represented thousands of us who weren’t surveyed. Not very, apparently.
  • Response rates and method of outreach matter. Choosing respondents by landline, cell phone, email, or regular mail and even choosing when to contact them will skew the results in unknown ways. Most importantly, a majority of people refuse to participate at all, making it likely that whatever cohort they belong to is underrepresented in the results.
  • You can’t necessarily believe what poll respondents or patients tell you since they often subconsciously say what they think the pollster or society wants to hear. The people who vowed that they were voting for Clinton might also claim that they only watch PBS and on their doctor’s social history questionnaire declare their unfamiliarity with alcohol, drugs, domestic violence, and risky sexual behaviors.
  • Not everybody who is surveyed shows up, and not everybody who shows up was surveyed. It’s the same problem as waiting to see who actually visits a medical practice or ED. Delivering good medical services does not necessarily mean effectively managing a population.
  • Prediction is best compared with performance in fine-tuning assumptions. The experts saw a few states go against their predictions early Tuesday evening and only then, too late, applied that newfound knowledge to create better predictions. Real-time analytics deliver better results, and even an incompetent meteorologist can predict a hurricane’s landfall right before it hits.
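
As a quick, purely hypothetical illustration of the sampling point above, the Python sketch below simulates polls in which one candidate's supporters are slightly less likely to respond. The numbers are invented; the only point is that a systematic response gap is invisible to the usual margin of error.

import random

# Hypothetical electorate: 52 percent favor candidate A, 48 percent favor B.
TRUE_SUPPORT_A = 0.52

# Hypothetical non-response: A's supporters are slightly less likely to answer.
RESPONSE_RATE = {"A": 0.08, "B": 0.10}

def run_poll(sample_size=1000):
    """Simulate one poll with differential non-response."""
    responses = []
    while len(responses) < sample_size:
        voter = "A" if random.random() < TRUE_SUPPORT_A else "B"
        if random.random() < RESPONSE_RATE[voter]:
            responses.append(voter)
    return responses.count("A") / sample_size

if __name__ == "__main__":
    polls = [run_poll() for _ in range(100)]
    avg = sum(polls) / len(polls)
    # A reported +/- 3 point margin of error never covers the true value here,
    # because the error is systematic rather than random.
    print(f"True support for A: {TRUE_SUPPORT_A:.1%}")
    print(f"Average of 100 simulated polls: {avg:.1%}")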

It’s tempting to hang our healthcare hat on piles of computers running analytics, artificial intelligence, and other binary systems that attempt to dispassionately impose comforting order on the cacophony of human behavior. It’s not so much that it can’t work, it’s that we shouldn’t become complacent about the accuracy and validity of what the computers and their handlers are telling us. We are often individually and collectively as predictable as the analytics experts tell us, but sometimes we’re not.

Readers Write: Don’t Get Stuck in the Readmissions Penalty Box

November 9, 2016 Readers Write No Comments

Don’t Get Stuck in the Readmissions Penalty Box
By Lisa Lyons

The Hospital Readmissions Reduction Program (HRRP) requires the Centers for Medicare and Medicaid Services (CMS) to reduce payments to inpatient hospitals with relatively high 30-day readmission rates. CMS applies up to a three percent reduction for “excess” readmissions using a risk-adjusted ratio that compares a hospital’s performance to the national average for sets of patients with specified conditions.

Payment adjustments for FY 2017 (based on performance from July 2012 through June 2015) will be applied to all Medicare discharges starting October 1 of this year and running through September 30, 2017. Payment reductions for FY 2017 will be posted on the Hospital Compare website this October.

Total HRRP penalties are expected to reach $528 million for FY 2017, up sharply from about $420 million in FY 2016, with more than half of the nation’s hospitals affected, according to a Kaiser Health News analysis. The average penalty will spike in similar fashion, from 0.61 percent in FY 2016 to 0.73 percent in FY 2017.

The situation calls for a thorough understanding of the readmissions penalty environment and a strategic mindset for taking action.

Prior to FY 2017, CMS measured excess readmissions by dividing a hospital’s number of risk-adjusted, predicted 30-day readmissions for heart attack, heart failure, pneumonia, hip/knee replacement, and COPD by the number that would be expected, based on an average hospital with similar patients.
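
To make that ratio concrete, here is a minimal illustrative sketch of the arithmetic. The figures are invented, and the real CMS calculation relies on hierarchical risk-adjustment models built from Medicare claims that hospitals cannot fully reproduce.

# Illustrative only: CMS's actual model is a hierarchical, risk-adjusted
# regression that hospitals cannot fully replicate from their own data.

def excess_readmission_ratio(predicted_readmits, expected_readmits):
    """Ratio of risk-adjusted predicted readmissions to the number expected
    for an average hospital with a similar patient mix (>1.0 means excess)."""
    return predicted_readmits / expected_readmits

# Hypothetical heart failure cohort for one hospital
ratio = excess_readmission_ratio(predicted_readmits=62, expected_readmits=55)
print(f"Excess readmission ratio: {ratio:.3f}")   # ~1.127, i.e. penalty territory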

For FY 2017, CMS expanded the list of cohorts to include coronary artery bypass graft (CABG) procedures. The agency also broadened the existing pneumonia cohort: the assignment criteria now include cases with a principal diagnosis of aspiration pneumonia, as well as cases with a principal diagnosis of non-severe sepsis and a secondary diagnosis of pneumonia. This creates a bigger set of patients from which a hospital could have readmissions — in fact, it may expand the pneumonia cohort by 50 percent in many hospitals.

Complicating matters, excess readmissions found in any of the six cohorts will result in an overall penalty. A hospital gets no credit for making readmissions improvements along the way.

At the same time, all hospitals are working on readmissions, so the average of excess readmissions is decreasing. That means it’s harder than ever for hospitals to stay under the penalty bar.

Also, due to HRRP’s reporting cycle, an excess readmission stays in CMS’s data for three years.

These factors make it hard for hospitals to know if they have passed the tipping point for readmissions penalties before notification from CMS — which typically happens just four months prior to penalties being imposed. In practical terms, there’s not enough time to impact results.

Further, analyzing CMS data is challenging for most hospitals because:

  • CMS data is retrospective. CMS calculates fiscal year penalties by looking back at data over a range of two to five years. As such, current improvements to readmission reduction programs will not be seen right away.
  • CMS data includes readmissions from “non-same” hospitals. Most hospitals can’t view cases where a patient initially admitted to their facility ended up being readmitted to another facility.
  • CMS data only includes readmissions among the Medicare patient population. Many commercial payers have instituted pay-for-performance programs, which should also be analyzed. Limiting your view to the Medicare HRRP program will only reveal part of your overall readmissions.
  • CMS’s Measure Methodology for Readmissions can’t be easily replicated. CMS risk-adjusts each qualifying patient using Medicare Part A and Part B data for a full year prior to admission, and 30 days post-discharge. Since hospitals don’t have access to this information, they can’t replicate the methodology to calculate their excess readmissions.

Fortunately, with the right data, there’s a way to emulate the CMS methodology to help estimate the volume of excess readmissions that will be attributed to your hospital. You can do so well before receiving your hospital-specific reports from CMS.

Here are four ways advanced analytics can help position hospitals to be more proactive in managing their readmissions:

  1. Purchase de-identified Medicare Part A and B claims data from CMS. Advanced analytics makes it possible to match historic claims data with known patients in your hospital information systems. In this way, you can see longitudinal care histories for the patients you are discharging today. Algorithms can also predict the rate of non-same hospitalization from current readmission data, effectively filling in the blanks on readmissions that occur outside your hospital. That may give you up to two years’ advance notice regarding which readmissions will be counted as excessive. With that knowledge, you can do something about readmissions before the end of the evaluation period.
  2. Know how many readmissions will put you in jeopardy of incurring penalties. This is the previously mentioned tipping point (a rough way to estimate it is sketched after this list). Surprisingly, for many hospitals, only a few excess readmissions per month can send them to the penalty box. Predictive analytics identify patients at greatest risk for unplanned readmissions. Look for algorithms with a high degree of accuracy in matching the CMS dataset to your own database to single out cases that were identified in the assignment criteria. Once you’re able to identify trends, you can fix the issues.
  3. Since CMS measures readmission back to any hospital, partner with other hospitals in your region to which you commonly refer patients back and forth. Concentrate on areas of improvement in either coordination or quality of care.
  4. Analyze clinical conditions across the board among your hospital’s patient population, not just within the six CMS-defined cohorts. Taking a broader view establishes more effective data patterning to help determine if a systemic problem exists. Dashboards and pre-formatted reports signal where to drill down for more detail (for example, whether you discharged the patient to home or a different care setting).
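
As referenced in item two, a rough and deliberately simplified way to estimate the tipping point from your own cohort numbers might look like the sketch below. The real CMS measurement spans three years of risk-adjusted data, so treat this strictly as a directional planning aid with invented inputs.

import math

def readmissions_to_tipping_point(expected_per_year, predicted_so_far, months_elapsed):
    """Roughly estimate how many more readmissions in the remainder of the year
    would push one cohort's ratio above 1.0 (illustrative, not CMS methodology)."""
    allowed_total = math.floor(expected_per_year)      # readmissions before the ratio exceeds 1.0
    remaining_budget = allowed_total - predicted_so_far
    months_left = 12 - months_elapsed
    per_month = remaining_budget / months_left if months_left else 0
    return remaining_budget, per_month

budget, per_month = readmissions_to_tipping_point(
    expected_per_year=55, predicted_so_far=40, months_elapsed=8)
print(f"Readmissions left before exceeding expected: {budget} (~{per_month:.1f}/month)")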

Government policy statements clearly indicate that Medicare payments will become more heavily weighted toward quality and value measures, and HRRP will be part of that determination.

What’s more, CMS has proposed that the readmission measure itself be expanded to count excess days associated with readmissions — taking into account ED patients and those assigned to observation status — rather than singular readmission events for inpatients. Expect increased involvement of care management and quality teams in this area, and another layer of potential penalties.

Don’t wait to react to how these measures will impact your hospital’s operations and finances. Now’s the time to implement data analytics tools to intelligently manage your hospital’s readmission risk with a high degree of accuracy.

Lisa Lyons is director of advanced analytics and population health and interim VP of consulting at Xerox.

Readers Write: Address the Disruption in Provider Data Caused by Clinically Integrated Networks and Value-Based Care

October 31, 2016 Readers Write No Comments

Address the Disruption in Provider Data Caused by Clinically Integrated Networks and Value-Based Care
By Tom White


Hospitals that became health systems and are now morphing into clinically integrated networks (CINs) are facing increasing struggles managing their expanding patchwork of providers, including credentialed and referring physicians, APRNs, nurses, and other licensed professionals. Their provider count has often grown by five to 10 times.

Not only are there more providers, but they are also working in a wider variety of outpatient care settings. This has been a boon for consumers, as there are now many new retail healthcare locations on neighborhood street corners, but it poses an increasing challenge from a provider data perspective. Who is providing the service? What is their affiliation in the ACOs, next gen ACOs, CINs, or narrow networks? Are they sanctioned?

These problems arise from the emergence of the retail healthcare economy. The resulting growth in provider data creates obvious and not-so-obvious consequences, as disruptions in the provider data management process erode the accuracy of the data.

Poor provider data management tends to hurt healthcare organizations much more than they realize, especially in the context of today’s emerging retail healthcare economy and value-based reimbursement market. For hospitals and providers to succeed in these circumstances it is imperative to drive out unnecessary costs, and outdated or inaccurate provider data is a hidden source of significant costs.

As hospitals and health systems develop new alliances, it is critical to know what providers are included in a CIN, including their roles and affiliations. Efforts to collaborate over large patient populations and control value-based payments require in-depth and proprietary knowledge of provider affiliations, practice scope, and their economic models. This information is mission critical for success. Using a system that manages provider data in these areas should be a business imperative for every health system executive.

Licensed healthcare provider data management programs have historically been managed by numerous, fragmented systems across the healthcare ecosystem. Many healthcare leaders believe that electronic medical records (EMR) systems and their health information exchange (HIE) modules, credentialing, and other modern back-office IT systems have made provider data more accurate, secure, and accessible. Perhaps this is so with patient data, but this is not the case with provider data. These enterprise IT systems provide numerous benefits and may even provide a repository for some provider data, but they are not inherently designed for ongoing management of this business-critical data.

Let’s think for a minute about some specific areas in which provider data plays a vital role. Do CINs know who their providers are? How do they take these new provider networks and build the tools for consumers and providers to search and find them? Simple natural language searching (think Google searches) is how the entire world except for healthcare works. Accurate data on which providers are in-network, paired with modern search tools, should be a goal for all health systems and CINs.

Accurate provider data is critical to ensure that provider search tools can be the foundation of a successful referral management program. Potential patients that visit the hospital website and search for a local, in-network doctor or a specialist expect that the information they are presented with is accurate and current. If not, a bad customer experience could mean the loss of a patient, a loss of trust, and perhaps worst of all, a bad online review by the patient.

Physicians’ use of these search tools to identify specialists to whom they can refer their patients is a critical aspect of referral management. The range of critical data relied upon now goes beyond simple contact information and insurance plan participation. It might include physician communication preferences, licensing data, internal system IDs, exclusionary lists, and other sensitive internal information. This information changes frequently, and users don’t have time to verify it on the fly. Inaccurate information wastes time and hurts patient satisfaction.

Inaccurate provider data causes billing delays that hurt cash flow and increase days in A/R. Invoices sent to the wrong location or faxed to the wrong office are common in healthcare, never mind issues stemming from inaccurate or incomplete address information.

Beyond the clinical and financial performance gains, more accurate provider information can also be used in consumer and physician outreach programs across the health system, whether part of a CIN or ACO. Hospitals are businesses, too. Historically, many of their patients may have been admitted through the ED, but increasingly they are referred by in-network physicians or come through another outpatient service. The hospital’s marketing department may want to reach out to a network of physicians within a 200-mile radius to encourage referring patients to its facilities or simply promote a new piece of equipment or innovative procedure that’s now available. The marketing department might do searches to find these physicians and contact them. Having accurate provider data ensures that these efforts are productive and efficient.

A tool is required that makes it easy for the appropriate teams in the health system to curate and update their health system provider data to create a single source of truth. This should include all credentialed and referring providers from across the entire healthcare organization, including acute, post-acute, outpatient, and long-term care environments.
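
As a rough sketch of what curation toward a single source of truth can look like, the following Python example merges hypothetical provider records from several source systems keyed on NPI. The field names and matching rule are illustrative only and do not describe any particular vendor's product.

from dataclasses import dataclass, field

@dataclass
class ProviderRecord:
    npi: str                                        # national provider identifier
    name: str
    specialty: str = ""
    locations: set = field(default_factory=set)
    networks: set = field(default_factory=set)      # CIN / ACO affiliations
    source_systems: set = field(default_factory=set)

def merge(records):
    """Collapse provider records from EMR, credentialing, and claims feeds
    into one curated record per NPI (illustrative matching rule only)."""
    master = {}
    for rec in records:
        current = master.setdefault(rec.npi, ProviderRecord(rec.npi, rec.name))
        current.specialty = current.specialty or rec.specialty
        current.locations |= rec.locations
        current.networks |= rec.networks
        current.source_systems |= rec.source_systems
    return master

feeds = [
    ProviderRecord("1234567890", "Jane Smith, MD", "Cardiology",
                   {"Main Campus"}, {"ACO-1"}, {"credentialing"}),
    ProviderRecord("1234567890", "Jane Smith, MD", "",
                   {"Northside Clinic"}, {"CIN-A"}, {"EMR"}),
]
print(merge(feeds)["1234567890"])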

While health systems can develop data governance models that require all departments to verify the accuracy of their provider data and to specify how it should be shared, this is seldom a success. Most organizations don’t know exactly who is in their pool of licensed providers and historically there has not been an IT system that can provide this comprehensive capability.

Healthcare leaders have to take a proactive approach to provider data management and can no longer afford to deny the critical role this information plays in today’s increasingly complex and challenging healthcare system. In a fee-for-service world where practitioners are paid for whatever work they perform, it may not be as critical to have accurate provider data. But in today’s value-based care market, accurate provider data is critical for running an efficient, competitive, and profitable healthcare system.

Thomas White is CEO of Phynd Technologies of Dallas, TX.

Readers Write: Ready or Not, ASC X12 275 Attachment EDI Transaction Is Coming

October 17, 2016 Readers Write No Comments

Ready or Not, ASC X12 275 Attachment EDI Transaction Is Coming
By Lindy Benton


As electronic as we are in many aspects of business – and life in general – oftentimes healthcare providers and payers are still using paper for claim attachment requests and responses. With the ASC X12 275 attachment electronic data interchange on the horizon, the need for utilizing secure, electronic transactions will soon be here.

Let’s look at the claim attachment process.

  1. A claim attachment arises when a payer requests additional information from a provider to adjudicate a claim. The attachment supplies information or answers questions not included in the original claim.
  2. In many instances, the process for sending and receiving attachments is still largely done via a manual, paper-based format.
  3. Paper-based transactions are slow, inefficient, and can bog down the revenue cycle. Additionally, paper transactions are prone to getting lost in transit and are difficult if not impossible to track.
  4. The ASC X12 275 transaction has been proposed as a secure, electronic (EDI) method of managing the attachment request while making it uniform across all providers and payers.

The ASC X12 275 can be sent either solicited or unsolicited. In the solicited scenario, the claim is subjected to medical or utilization review during adjudication and the payer requests specific information to supplement or support the provider’s request for payment of the services. That request for additional information may be service-specific or apply to the entire claim, and the 275 is used to transmit it. The provider then uses the 275 to respond to the request within the time specified by the payer.
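
Without reproducing actual X12 segment syntax, the solicited request-and-response flow described above can be sketched as plain data structures. Everything below (class names, document types, deadlines) is an illustrative assumption, not part of the 275 specification.

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class AttachmentRequest:
    """Payer's solicited request for additional information on a claim."""
    claim_id: str
    requested_docs: list            # e.g. ["operative note", "discharge summary"]
    response_due: date

@dataclass
class AttachmentResponse:
    """Provider's electronic response carrying the requested documentation."""
    claim_id: str
    documents: dict                 # document type -> payload reference

def respond(request: AttachmentRequest, repository: dict) -> AttachmentResponse:
    # Pull the requested documentation from the provider's document repository
    docs = {name: repository[name] for name in request.requested_docs if name in repository}
    return AttachmentResponse(request.claim_id, docs)

req = AttachmentRequest("CLM-001", ["operative note"], date.today() + timedelta(days=15))
print(respond(req, {"operative note": "doc://op-note-123"}))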

Both HIPAA and the Affordable Care Act are driving the adoption of these secure, electronic transaction standards. HIPAA requires the establishment of national standards for electronic healthcare transactions and national identifiers for providers, health insurance plans, and employers. In Section 1104(b)(2) of the ACA, Congress required the adoption of operating rules for the healthcare industry and directed the secretary of Health and Human Services to “adopt a single set of operating rules for each transaction” with the goal of creating as much uniformity in the implementation of the electronic standards as possible.

Providers and payers will be required to adopt these standards at some point and it will happen sooner rather than later, so it’s time to be prepared.

The final specifications and detail for the EDI 275 transaction were supposed to be finalized in January 2016, but that has yet to happen. Both the American Hospital Association and American Medical Association have urged the Department of Health and Human Services to finalize and adopt the latest 275 standard, so with that kind of backing, it’s only a matter of time until the 275 transaction standard gains momentum and comes to fruition.

EDI 275 is coming. The question is, will you be ready?

Lindy Benton is president and CEO of Vyne of Dunwoody, GA.

Readers Write: Exploring the EMR Debate: Onus On Analytics Companies to Deliver Insights

October 17, 2016 Readers Write 1 Comment

Exploring the EMR Debate: Onus On Analytics Companies to Deliver Insights
By Leonard D’Avolio, PhD


Late last month, a great op-ed published in The Wall Street Journal called “Turn Off the Computer and Listen to the Patient” brought a critical healthcare issue to the forefront of the national discussion. The physician authors, Caleb Gardner, MD and John Levinson, MD, describe the frustrations physicians experience with poor design, federal incentives, and the “one-size-fits-all rules for medical practice” implemented in today’s electronic medical records (EMRs).

From the start, the counter to any criticism of the EMR was that the collection of digital health data will finally make it possible to discover opportunities to improve the quality of care, prevent error, and steer resources to where they are needed most. This is, after all, the story of nearly every other industry post-digitization.

However, many organizations are learning the hard way that the business intelligence tools that were so successful in helping other industries learn from their quantified and reliable sales, inventory, and finance data can be limited in trying to make sense of healthcare’s unstructured, sparse, and often inaccurate clinical data.

Data warehouses and reporting tools — the foundation for understanding quantified and reliable sales, inventory, and finance data of other industries – are useful for required reporting of process measures for CMS, ACO, AQC, and who knows what mandates are next. However, it should be made clear that these multi-year, multi-million dollar investments are designed to address the concerns of fee-for-service care: what happened, to whom, and when. They will not begin to answer the questions most critical to value-based care: what is likely to happen, to whom, and what should be done about it.

Rapidly advancing analytic approaches are well suited for healthcare data and designed to answer the questions of value-based care. Unfortunately, journalists and vendors alike have done a terrible job in communicating the value, potential, and nature of these approaches.

Hidden beneath a veneer of buzzwords including artificial intelligence, big data, cognitive computing, data science, data mining, and machine learning is a set of methods that have proven capable of answering the “what’s next” questions of value-based care across clinical domains including cardiothoracic surgery, urology, orthopedic surgery, plastic surgery, otolaryngology, general surgery, transplant, trauma, and neurosurgery, cancer prediction and prognosis, and intensive care unit morbidity. Despite 20+ years of empirical evidence demonstrating superior predictive performance, these approaches have remained the nearly exclusive property of academics.

The rhetoric surrounding these methods is bimodal and not particularly helpful. Either big data will cure cancer in just a few years or clinicians proudly list the reasons they will not be replaced by virtual AI versions of themselves. Both are fun reads, but neither addresses the immediate opportunity to capitalize on the painstakingly entered data to deliver care more efficiently today.

More productive is a framing of machine learning as what it actually is — an emerging tool. Like all tools, machine learning has inherent pros and cons that should be considered.

In the pro column is the ability of these methods to consider many more data points than traditional risk score or rules-based approaches. Also important for medicine is the fact that machine learning-based approaches don’t require that data be well formatted or standardized in order to learn from it. Combined with natural language processing, machine learning can consider the free text impressions of clinicians or case managers in predicting which patient is most likely to benefit from attention sooner. Like clinical care, these approaches learn with new experience, allowing insights to evolve based on the ever-changing dynamics of care delivery.

To illustrate, the organization I work with was recently enlisted to identify members of a health plan most likely to dis-enroll after one year of membership. This is a particularly sensitive loss for organizations that take on the financial responsibility of delivering care, as considerable investments are made in Year 1 stabilizing and maintaining the health of the member.

Using software designed to employ these methods, we consumed 30 file types, from case management notes, to claims, to call center transcripts. Comparing all of the data of members that dis-enrolled after one year versus those that stayed in the plan, we learned the patterns that most highly correlate with disenrollment. Our partner uses these insights to proactively call members before they dis-enroll. As their call center employs strategies to reduce specific causes of dissatisfaction, members’ reasons for wanting to leave change. So, too, do the patterns emerging from the software.
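
As a thumbnail illustration (not the actual pipeline described above, which consumed roughly 30 feed types), the general pattern of learning disenrollment signals from free-text notes might look something like this scikit-learn sketch, with invented example data.

# Illustrative sketch only: one free-text field standing in for many feed types.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

notes = [
    "member upset about copay, asked how to switch plans",
    "routine wellness call, no concerns",
    "complained twice about finding an in-network specialist",
    "thanked case manager for help coordinating care",
]
disenrolled = [1, 0, 1, 0]   # label: left the plan after year one

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(notes, disenrolled)

# Score this month's call-center notes so outreach can target likely leavers
print(model.predict_proba(["member asked about other plans in the area"])[:, 1])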

The result is greater member satisfaction, record low dis-enrollment rates, and a more proactive approach to addressing member concerns. It’s not the cure for cancer, but it is one of a growing number of questions that require addressing when the success of an organization is dependent on using resources efficiently.

The greatest limitation of machine learning to date has been inaccessibility. Like the mainframe before it, this new technology has remained the exclusive domain of experts. In most applications, each model is developed over the course of months using tools designed for data scientists. The results are delivered as recommendations, not HIPAA-compliant software ready to be plugged in when and where needed. Like the evolution of computing, all of that’s about to change.

Just hours after reading the Gardner and Levinson op-ed, I sat across from a primary care doc friend as she ended a long day of practice by charting out the last few patients. Her frustration was palpable as she fought her way through screen after screen of diabetes-related reporting requirements having “nothing to do with keeping [her] patients healthy.” Her thoughts on the benefits of using her organization’s industry-leading EMR were less measured than Drs. Gardner and Levinson: “I’d rather poke my eyes out.”

I agree fully with Drs. Gardner and Levinson. The answer isn’t abandoning electronic systems, but rather striking a balance between EMR usability and the valuable information that they provide. But I’ve been in healthcare long enough to know clinicians won’t be enjoying well-designed EMRs any time soon. In the meantime, it’s nice to know we don’t need to wait to begin generating returns from all their hard work.

Leonard D’Avolio, PhD is assistant professor at Harvard Medical School and CEO and co-founder of Cyft of Cambridge, MA.

Readers Write: ECM for Healthcare Advances to HCM (Healthcare Content Management)

October 17, 2016 Readers Write 1 Comment

ECM for Healthcare Advances to HCM (Healthcare Content Management)
by Amie Teske


Industry analysts project healthy market growth for enterprise content management (ECM) solutions across all industry sectors. Gartner’s 2016 Hype Cycle for Real-Time Health System Technologies places ECM squarely along the “plateau of productivity” at the far right-hand side of the hype cycle curve. This essentially means that ECM software has moved past its market breakthrough and is being actively adopted by healthcare providers.

This is good news for ECM users and technology suppliers, but what’s next for ECM in healthcare? To remain competitive and leading edge, ECM solutions at the plateau must evolve for the sake of customers and the marketplace in order to maintain business success. There is more good news here in that ECM solutions are evolving to keep pace with healthcare changes and demands.

Up to 70 percent of the data needed for effective and comprehensive patient care management and decision-making exists in an unstructured format. This implies the existence of a large chasm between resources and effort expended by healthcare delivery organizations (HDOs) on EHR technology to manage discrete data and the work yet to be done to effectively automate and provide access to the remaining content. ECM solutions are evolving in a new direction that offers HDOs an opportunity to strategically build a bridge to this outstanding content.

Healthcare content management (HCM) is a new term that represents the evolution of ECM for healthcare providers. It is the modern, intelligent approach to managing all unstructured document and image content. The biggest obstacle we must overcome in this journey is the tendency to fall back on traditional thinking, which drives health IT purchases toward siloed, non-integrated systems. Traditional methods for managing patient content have a diminishing role in the future of healthcare. It’s time to set a new course.

An HCM Primer

  • HCM = documents + medical images (photos and video, too).
  • The 70 percent of patient content outside the EHR is primarily unstructured in nature, existing as objects that include not only DICOM (CT, MRI) but also tiff, pdf, mpg, etc.
  • ECM has proven effective for managing tiff, pdf and a variety of other file formats. It is not, however, a technology built to handle DICOM images, which represent the largest and most numerous of the disconnected patient objects in question.
  • Enterprise imaging (EI) technologies have traditionally been responsible for DICOM-based content. These include vendor neutral archives (VNA), enterprise/universal viewers, and worklist and connectivity solutions that are unique to medical image and video capture.
  • Leveraging a single architecture to intentionally integrate ECM and EI technologies — enabling HDOs to effectively capture, manage, access and share all of this content within a common ecosystem — is referred to as healthcare content management, or HCM. (A minimal sketch of such a unified model follows this list.)
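
As promised above, here is a minimal, hypothetical sketch of what a unified content envelope spanning DICOM and non-DICOM objects might look like. The field names and routing rule are illustrative only, not a description of any product.

from dataclasses import dataclass

# Illustrative only: a unified metadata envelope that lets DICOM and
# non-DICOM patient content live in one repository with one viewing strategy.
@dataclass
class ContentObject:
    patient_id: str
    encounter_id: str
    content_type: str       # "DICOM", "PDF", "TIFF", "MP4", ...
    source_system: str      # modality, ECM capture, mobile photo app, ...
    storage_uri: str

def route(obj: ContentObject) -> str:
    """Pick a viewer by content type while keeping a single archive."""
    return "enterprise DICOM viewer" if obj.content_type == "DICOM" else "document viewer"

ct_study = ContentObject("MRN-42", "ENC-9", "DICOM", "CT scanner", "vna://study/1")
consent  = ContentObject("MRN-42", "ENC-9", "PDF", "ECM capture", "ecm://doc/77")
print(route(ct_study), "|", route(consent))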

Although the market is ready for HCM and many HDOs are already moving in this direction, it is important to know what to look for.

Critical Elements of HCM

Although it is the logical first step, HCM encompasses much more than simply unifying ECM and EI technologies together into a single architecture to enable shared storage and a single viewing experience for all unstructured content, DICOM and non-DICOM. Just as important is workflow and how all document and image content is orchestrated and handled prior to storage and access. This is essentially the secret sauce and the most difficult aspect of an HCM initiative.

ECM for healthcare workflow is geared to handle back office and clinical workflows associated with health information management, patient finance, accounts payable, and human resources, for example. The intricacies of these workflows must continue to cater to specific regulations around PHI, release of information, etc. All this to say that the workflow component of ECM is critical and must remain intact when converging ECM with EI technologies.

The same goes for enterprise imaging workflows. EI workflow is optimized to handle, for example, image orchestration from many modalities to the core VNA or various PACS systems, medical image tag mapping/morphing to ensure image neutrality, and downtime situations.

These workflow features should not be taken lightly as health systems endeavor to establish a true HCM strategy. Do not overlook the need for these capabilities to ease the complexities inherently involved and to fully capitalize on any investment made.

Guidance for HCM Planning

Consider the following recommendations as you plan an HCM approach and evaluate prospective vendors:

  • Be wary of an archive-only strategy. A clinical content management (CCM) approach is primarily an archive and access strategy in which the critical element of workflow is fully or partly missing. Diligent buyers should ask the right questions about workflow and governance of unstructured document and image content before, during, and after storage and access.
  • Always require neutrality. Changing standards is a given in the healthcare industry. HCM should be in alignment with the new standards to ensure all document and image content can be captured, managed, accessed, shared, and migrated without additional cost due to proprietary antics by your vendor. An HCM framework must have a commitment to true neutrality and interoperability.
  • Think strategically. A deliberate HCM framework offered by any healthcare IT vendor should be modular in nature but also able to be executed incrementally and with the end in mind. Beginning with the end in mind is slightly more difficult. The modularity of your HCM approach should allow you to attack your biggest pain points first, solving niche challenges while preserving your budget and showing incremental success in your journey toward the end state.
  • Consider total cost of ownership (TCO). If a common architecture and its associated cost efficiencies are important in wrangling your outstanding 70 percent of disconnected patient content, you cannot afford to take a niche approach. It may seem easier and cheaper to select a group of products from multiple niche vendors to try to solve your most pervasive siloed document and image management problems. Take a careful look at the TCO over the life of these solutions. It is likely the TCO will be higher due to factors that include the number of unique skillsets and FTEs required for a niche strategy.
  • Demand solution flexibility and options. Your HCM approach should provide extensive flexibility and a range of options and alternatives that are adaptable to your unique needs. Software functionality is important, but not the only criterion.

Your HCM approach for strategically managing all unstructured patient content should allow you to:

  • Start small or go big, solving one challenge or many.
  • Establish a common architecture with a unified content platform and viewing strategy for all document and imaging content.
  • Enable unique ECM and EI workflows, not simply storage and access.
  • Hold one technology partner responsible – “one throat to choke” – for easier overall performance management and administration.

Providers of all shapes and sizes must take a thoughtful and deliberate approach when evaluating document and image management solutions. There is much more involved than simply capture and access. Because this category of technology can enable up to 70 percent of your disconnected patient and business information, you cannot afford to make a decision without carefully considering the impact of HCM on your healthcare enterprise, immediately and over time.

Amie Teske is director of global healthcare industry and product marketing for Lexmark Healthcare.

Readers Write: Guaranteeing MACRA Compliance at the Point of Care

October 5, 2016 Readers Write No Comments

Guaranteeing MACRA Compliance at the Point of Care
By David Lareau


MACRA will affect every physician and every clinical encounter. Current systems have been designed to produce transactions to be billed. MACRA will require that clinical conditions have been addressed and documented in accordance with quality care guidelines. The only way to ensure that happens is to do it at the point of care.

The challenge is that physicians need to address all conditions, not just those covered by a MACRA requirement. One approach is to just add another set of things to do, slowing doctors down and getting in their way. This is the transactional approach — just another task.

Most current systems have different tabs that list problems, medications, labs, etc. Users must switch back and forth looking for data. The data cannot be organized by problem since the systems lack any method for correlating information based on clinical condition. Adding another set of disconnected information to satisfy quality measures will only make it worse for users.

A better approach is to integrate quality care requirements for any condition with all the other issues the physician needs to address for a specific patient and to work it into a physician’s typical workflow. A well-designed EHR should have a process running in the background that keeps track of all applicable quality measures and guidelines for the patient being seen. The status of all quality measures must be available at any point in the encounter in a format that ties all information together for any clinical issue.
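
As a thumbnail of what such a background process might look like, here is a small illustrative sketch. The measure names and criteria are invented for the example and are not actual MACRA measure definitions.

# Hypothetical sketch of a background quality-measure tracker keyed to the
# patient's active problems; measure names and criteria are illustrative only.
MEASURES = {
    "diabetes": [("HbA1c in last 6 months", lambda chart: "hba1c" in chart["recent_labs"]),
                 ("Annual eye exam", lambda chart: chart.get("eye_exam_done", False))],
    "hypertension": [("BP recorded this visit", lambda chart: "bp" in chart["vitals"])],
}

def open_measures(chart):
    """Return unmet measures grouped by problem, ready to surface in a
    problem-oriented view during the encounter."""
    gaps = {}
    for problem in chart["problems"]:
        unmet = [name for name, satisfied in MEASURES.get(problem, []) if not satisfied(chart)]
        if unmet:
            gaps[problem] = unmet
    return gaps

chart = {"problems": ["diabetes", "hypertension"],
         "recent_labs": ["hba1c"], "vitals": ["bp"], "eye_exam_done": False}
print(open_measures(chart))   # {'diabetes': ['Annual eye exam']}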

This requires actionable, problem-oriented views of clinical data, where all information for any clinical issue is available instantly. Physicians need to be able to view, react to, and document clinical information for every problem or issue addressed with the patient. This includes history and physical documentation, review of results, clinical assessments, and treatment plans as well as compliance with quality measures.

Guaranteeing MACRA compliance at the point of care can be accomplished by using a clinical knowledge engine that presents all relevant information for any clinical issue so that MACRA quality measures are seamlessly included as part of the patient’s overall clinical picture, not as just another task to be added on to the already burdensome workflows of current systems.

David Lareau is CEO of Medicomp Systems of Chantilly, VA.

Readers Write: Telemedicine Is Just Medicine

October 5, 2016 Readers Write 6 Comments

Telemedicine Is Just Medicine
By Teri Thomas


Telemedicine. MHealth. Remote healthcare. What’s the best term for a given use case? A large portion of my job is focused on it, yet my answer is, “I don’t much care what term you use.” 

Well, I guess I care a little if I see confusion getting in the way of progress. Don’t get me wrong — I’m glad that nobody has been saying “mMedicine” yet (would that be like, “mmm…medicine” or “em-medicine?”) I don’t love “virtual health,” as it makes me wonder: if I watch lots of exercise shows and raw food infomercials, could I get virtually healthy? 

Defining telemedicine as a subset of telehealth (direct care at a distance versus provision of healthcare-related services at a distance) may be correct, but who cares? Consider if, when indoor plumbing was new, people had discussed “s-water” (out of a stream) vs. “i-water” (from in the home). I guess i-water would be better than p-water from pipes (it’s OK to giggle a little — be a middle-schooler for a minute). We care about perhaps three factors:

  • Is it modified/sparkling/flavored?
  • Do we have to pay for it (bottled water vs. tap water)?
  • Is it clean enough to drink?

Medicine is medicine. Healthcare is healthcare. It’s care: good, bad, and a ton in the middle. Yet I hear murmurs like, “Telemedicine isn’t good quality healthcare.” That’s like saying tap water isn’t good enough to drink because you’ve spent time in Flint.

Good quality care isn’t determined by the location of the provider or patient. Care can be done very well without requiring the patient and the clinician to be in the same room. It can also be done very poorly. Probably the majority of it — just like when the doctor and patient are together in a room — is not perfect, not bad, and mostly OK. 

Not every type of visit is appropriate over video, but many types are. In dermatology, providers have been using photos for decades. Camera cost and image resolution have dramatically improved so that even inexpensive systems can provide more image detail than a physician with the sharpest of vision. Stethoscopes, lights, cameras, video connections, telephones—all are tools to help us practice medicine better.  Sometimes the tools work great and are helpful and sometimes not.

If the Internet connection is slow or the battery dies, quality is impacted. But think for a minute about the impact on quality of care for the physician who had an extra-complex first appointment and is running an hour or more behind. The patients are stacking up and getting upset about their wait times. The clinic day is lengthening. The pressure to catch up mounts. Finally, consider the patient taking off work, driving to a clinic, parking, sitting in a waiting room with Sally Pink Eye, feeling bored at best and anxious and angry about the wait at worst.

How good will the quality of that encounter be compared to one in which the patient connects with the provider from home or work? The patient didn’t have to drive, and even if kept waiting, was likely in a more comfortable environment with other things to do.

Keep in mind that if the patient were physically there in the dermatology office and the lights went out or the dermatologist’s glasses were suddenly broken, it would be very hard to provide a quality exam. For a remote derm visit, if you can ensure reliable “tool” quality (history from the patient and/or GP, high enough resolution video/images, clear audio), why should there be a care quality concern? Yet these kinds of “visits” — heavily image-focused encounters — are still traditionally accomplished by asking the patient to come to the provider. 

Thank you to Kaiser and other telemedicine leaders for providing us with the validating data: remote visits can be done with high quality, lower costs, and positive outcomes for both care and patient satisfaction. On behalf of patients who increasingly expect more convenient care: hesitant healthcare providers, please invest in video visit technology and seek opportunities to provide more convenient care for your patients. Payers, please recognize that this is in everyone’s best interest and start financially rewarding those providers.

Teri Thomas is director of innovation for an academic medical center.
