
Readers Write: Top Technologies in Private Practice for 2015 and Beyond

February 4, 2015

By Arman Samani


As 2015 begins, healthcare is entering an era in which it must compete for the patient’s time and attention. Mobile and cloud computing are now pervasive enablers of other technologies that physicians can and should be leveraging; they are attributes of other technologies rather than technologies or trends in themselves.

As payment reform progresses and we switch from fee-for-service to fee-for-value models, it is critical that private practices take steps in 2015 to prepare for this new reimbursement model. This preparation will steer practices toward doing the right things for their patients as well as their businesses. Practice management, EHR, patient relationship management, actionable analytics, and interoperability are broad categories that a private practice should evaluate carefully to be prepared for long-term growth.

Health watcher technologies are enablers of proactive patient engagement. According to a recent IDTechEx report on the wearables market for healthcare, the market is projected to grow from $14 billion in 2014 to more than $70 billion in 2024. This booming market is an opportunity for physicians to shift into the role of “health watcher” for their patients. The industry can no longer function reactively, waiting for patients to initiate visits. Both Apple and Samsung have introduced health tracking frameworks and data repositories in their mobile devices, and as a result, consumers will soon be wearing devices such as the Apple Watch and Samsung wearables.

Not only can someone track how many steps or even floors they have walked, but health statistics like heart rate and blood pressure can be measured as well. Private practices should think about integrating this patient-generated data into their EHRs in order to take proactive, preventive action for their patient populations. Not only is this the right thing to do for patients, it increases practice revenue, enhances reputation, and decreases healthcare costs. While appointment reminder technology is now mainstream, health reminder communications such as email, text, and phone calls will become mainstream in 2015 and beyond.

With the rise of mobile computing, convenience will be an important factor. A 2014 study from Manhattan Research found that two in five physicians agreed that using digital technology to communicate with patients will improve patient outcomes. Starting in January 2015, CMS will begin paying for chronic care management, wellness visits, and psychotherapy services. The telemedicine cash business has been growing for a few years, and now that CMS has expanded reimbursement for telemedicine, private practices need to start putting business processes and technologies together to take advantage of this growing market and offer their patients a convenient, time-saving way to get readily accessible preventive care.

With industry regulations such as ICD-10 imminent, practice management software has to be ready to support the increased complexity in coding. However, the effects of an expanded code base aren’t all about technology. Patient visits and the associated workflows, from the moment a patient arrives through to receiving a claim payment, need to change in fundamental ways. The questions that are asked and the data that is collected right at the point of care are also affected. Practices need to stop thinking of coding as data entry and make it a proactive process that happens in real time. Practices that don’t plan for this shift may see a rise in claim denials, and the resulting aftermath may overwhelm staff and burden the business. Practices should perform an ICD-10 risk assessment now.

With cloud technologies, big data is no longer only for large health systems. Everything from patient health monitoring to quality measures, accounts receivable, and payer reimbursement should be presented as easy, actionable analytics. In addition to actionable analytics for the different aspects of the business, it is important to benchmark a practice against other practices or the industry as a whole. Otherwise, a practice might never know how well it is doing and what new goals it should set. Benchmarking tells the practice manager where the business stands compared to other practices. It helps answer questions such as how much the practice is getting paid relative to other practices and whether it needs to start collecting more for certain services. Analytics can measure these factors relative to the practice’s goals and in comparison to other practices. Benchmarking is complex and time consuming, but cloud providers of EHR and practice management technologies are especially well positioned to provide these benchmarking services.
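As a simple illustration of the benchmarking idea, the sketch below (with entirely hypothetical payment figures and service codes) shows how a cloud vendor might compute where one practice’s average reimbursement for a service falls among its peers:

```python
# Hypothetical sketch: benchmark one practice's average payment for a
# service against peer practices. All figures are made up for illustration.

def percentile_rank(value, peers):
    """Percentage of peer values at or below this practice's value."""
    return 100.0 * sum(1 for p in peers if p <= value) / len(peers)

# Average payment each peer practice receives for a hypothetical visit code
peer_payments = [68.0, 72.5, 75.0, 79.0, 81.5, 84.0, 88.0, 90.5, 95.0]
our_payment = 74.0

rank = percentile_rank(our_payment, peer_payments)
print(f"Practice is at percentile {rank:.0f} of peer payments")
if rank < 25:
    # A low rank suggests reviewing payer contracts or coding practices
    print("Consider collecting more for this service")
```

The value of the cloud model here is that only the vendor, aggregating across many practices, can supply the `peer_payments` distribution that makes the comparison meaningful.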

Practices are overwhelmed with data, and technology providers need to move beyond providing dashboards and monitoring trends. Big data must now be a driving force for actionable alerts that trigger automated staff or even patient actions. Physicians might be asking what treatment plans or medications other doctors are prescribing for the same diagnosis. Analytics can also help match up patients who share the same condition so they can compare notes and even create support groups.

All practices play a role in the healthcare ecosystem. Most practices receive patient referrals from or give patient referrals to other practices or care settings. It is important to have a seamless transition of care among entities to save time and money and provide patients with excellent and convenient service. Interoperability will enable sending and receiving summary of care documents and other necessary information across the patient care continuum. Interoperable systems will be able to store patient information such as discrete data points within the EHR automatically. This will spare practices from asking patients the same questions multiple times and will expedite care, increase care quality, and decrease costs by avoiding unnecessary procedures and tests.

The pace of innovation in healthcare has immense potential to advance the quality of care in 2015 and beyond. Smart practices need to prepare for and adapt to upcoming healthcare reforms and provide proactive, preventive care for their patients. This is not only good business but also the right thing to do for patients and communities.

Arman Samani is CTO of ADP AdvancedMD of South Jordan, UT. 

Readers Write: EHR Ease of Use is Not Easy

February 4, 2015

By Lee Farabaugh


Usability shows no signs of losing its luster as a buzzword in health IT. Coverage of a usability collaborative involving the efforts of the Electronic Health Record Association, the American Medical Association (AMA), and the American College of Physicians to improve user-centered design of EHRs in the context of the Meaningful Use program has certainly escalated.

It’s no secret that EHR usability is, generally speaking, pretty abysmal. There are standouts in the realm of interface design excellence – think of the award-winning PracticeFusion and athenahealth. But the overwhelming response to EHRs from the physician community is a groundswell of complaints over poor design, longer patient encounters, time-consuming documentation, and slow information retrieval.

The AMA recently published an article entitled “Improving Care: Priorities to Improve Electronic Health Record Usability” that identifies eight EHR usability principles, including supporting team-based care, promoting care coordination, and reducing cognitive workload through a user-centered design (UCD) approach. But even the AMA admits that while “some vendors have implemented user-centered design … their results have been inconsistent and many other vendors have not [even] implemented UCD.”

Apparently it’s not as simple as just applying the UCD process of user research, iterative design, and usability testing to the field of EHR design. Mary Kate Foley, VP of user experience at athenahealth, perhaps says it best: “Our industry has been talking about EHRs for years now, and if it were simple to make EHRs easy to use, we’d be done by now.”

EHR interface design is still subject to the design choices of individual interaction, visual, and user experience designers. While we’ve become used to the new flat UI convention on our iPhones, the vast majority of EHRs still look like snapshots from the past. In short, we don’t typically look to EHRs to be on the cutting edge, whether in terms of visual design conventions or adherence to UI design best practices.

The AMA calls for “the development of a common style guide – designed through collaboration between physicians and vendors – so physicians who practice in different care settings can move from one EHR to another.” But it’s not just physicians who stand to benefit. This type of common design framework frees organizations to make changes to their toolset because they don’t have to fear a steep learning curve for providers on a new interface.

How can we as designers support these efforts?

  • Remember that EHR design affects not only physicians, but patients, too. Patient tools, while separate from the EHR itself, both push information to and pull information from the EHR, making patients de facto EHR users.
  • Acknowledge existing efforts to reach a common design language in EHR interface design. Juhan Sonin, Jeff Belden, and Catherine Plaisant, among others, have created a nice start towards an EHR style guide for the industry at InspiredEHR.org. Their work includes medication lists, allergy lists, and drug alerts.
  • Continue to push forward with additional design patterns. One area where common design vocabulary is needed is the patient banner. EHRs should employ common conventions for elements such as patient name, gender, date of birth, allergies, etc. that typically appear in this space, and balance information communication with respect for screen real estate.
  • Educate our colleagues in industry about the importance of understanding and designing for the way real humans think and work. In my course on user-centered design for healthcare at UAB’s Masters Program in Health Informatics, my students (nurses, business analysts, and EHR vendors) are learning about how humans process information, think irrationally, and act according to behavior patterns that point the way towards more intuitive design.

EHR usability isn’t easy. It involves a complex interplay of care teams, workflows, the legacy of paper charts, and the promise of a design language we can all speak. But the need is real, and as the focus on “checking the box” for MU fades away, we’ll get down to the real business of not just using EHRs in a meaningful way, but in a delightful way.

Lee Farabaugh is chief experience officer at PointClear Solutions of Atlanta, GA.

Readers Write: EHR Go-Live Activation – Big-Bang or a Phased Approach?

January 30, 2015

By Zack Tisch


After completing the RFP process and determining which vendor and products will be part of the implementation, the real fun begins. Should the organization deploy this change in a single event — typically referred to as a big-bang go-live — or would a methodical, phased approach be a better fit?

At first glance, a big bang can feel aggressive, particularly in a healthcare environment where risk can mean significant consequences, not only to organizational financial health, but potentially to quality and patient safety. This surface analysis can be misleading, however, and more detailed consideration often reveals challenges to a phased approach that can be even more significant, particularly for multi-hospital organizations that may be on different core clinical or financial software platforms. The following considerations are a start toward determining which approach may be best for a given organization.

Carefully categorizing likely risks and how to manage them is a major factor in determining a go-live activation approach. A successful go-live is one where known risks are decisively and quickly managed and unknown risks are quickly analyzed and attacked. Both activation approaches can be equally successful, but there are specific tasks and processes that should be put in place prior to go-live to help support the approach.

For example, with a big-bang go-live, technical considerations become primary due to the volume of users and equipment that will be interacting with the system at the same time. Is security configured correctly? Can all users log in? Have they verified this in the production environment prior to go-live? With a phased install consisting of a smaller initial pilot, security, login, printing, and hardware issues may not be as pressing.

On the other hand, with a large-scale big bang featuring potentially thousands of users and workstations, the first few days or week of go-live can easily be spent just resolving technical issues that could have been sorted out with a thorough pre-live plan. This is a known risk and I would strongly advocate as much testing in production with real hardware and actual end users as possible, regardless of the chosen go-live approach.

Outside of technical issues, another key risk for most EHR go-lives is operational change and how well clinicians, front desk, and back-office staff accept and adopt the new workflow changes and tools. With a phased install, there is the luxury of being able to portion this change over time, reducing end-user anxiety and the amount of information they need to process and retain from training. However, one major drawback is that with a phased go-live, there will often be interim workflows, requiring end users to learn a new process and then unlearn aspects of that process shortly thereafter.

One key area in the organization to evaluate for potential risks is physician coding, particularly on the outpatient side. Physician coding is a highly integrated process, beginning with appointment scheduling and patient registration through clinical support staff rooming, physician documentation and order entry, charge generation, coder review, and ultimately claims submission. When implementing a new system, it is important that there is clarity and consistency on who is performing what task, particularly for the charge generation and coding review steps.

Will physicians or clinical support staff be entering or reviewing charges? What about evaluation and management (E&M) codes for level of service? How do coders work with providers to get clarification or update documents? When considering a phased approach (for example, bringing outpatient clinical modules live prior to a separate billing go-live), will these workflows change? Each change to this workflow introduces key elements of risk, primarily of missing or delayed documentation and charges. This is an area that can quickly spiral out of control, and if not well understood and managed prior to go-live, it can lead to significant financial risk for an organization, the kind of failure that unfortunately dominates headlines rather than the many highly successful projects.

My suggestion would be to take the time to perform a detailed risk analysis or partner with industry experts to assist with this. Also, work closely with organizational senior leadership to evaluate the benefits of having a phased install versus a big bang. Going through this process in the past, I have seen highly risk-averse organizations that initially wanted to move forward with a very phased install transition to a big-bang approach because the interim workflows and frequent system changes of a phased approach posed a higher risk of failure.

Another key factor to consider is the current state of the legacy EHR data. If the health system has multiple ADT or EHR systems, with multiple patient MRNs, a phased go-live can be much more difficult. A detailed analysis and thorough testing of how this will impact your downstream systems must be performed. One of my clients who had two separate clinical and registration systems initially desired a phased approach. However, upon further analysis, there was significant crossover for orders and results between the two. As a result, it would have been extremely difficult to keep all systems in sync. While the new EHR could handle these multiple MRNs, a number of key integrated systems could not handle interfaced merge messages or multiple patient identifiers. We would have had to pursue a major parallel project to implement an additional patient identity management application or merge and update MRNs across the entire organization.

One other example that is often identified late or overlooked is the ability of a new system to run alongside the legacy system during a phased install. There are often significant compatibility issues between vendors related to the versions of Internet Explorer, Java, and other critical Windows / Web architecture components necessary for a system to function correctly. With a thin-client deployment, it may be possible to get around this with separate setups on the individual servers, but this is not always possible.

Lastly, as someone who has experienced many implementations in a variety of roles — from analyst through project leadership — I would highly advocate considering the health, effectiveness, and well-being of the project team as it relates to the go-live approach. These implementations are challenging, requiring significant hours and brainpower, often well above and beyond a 40-hour work week. With a big bang go-live, the team has a single mission and a single event. Team members can see the light at the end of the tunnel and this is particularly critical as they work through the challenging build completion and testing phases of the project. Having an event to rally around can be significant for motivation and keeping everyone on the same page.

The downside is that one large go-live means only one chance to get it right. This can introduce significant anxiety, particularly for team members who have not previously worked on a similar project. It’s important for leadership to direct time and energy with the project team and end users to understand why a big-bang approach was selected and the significant steps and thousands of hours of hard work the team is putting in to ensure the go-live will be successful.

The benefit of a phased approach is each individual go-live is more approachable for the project team. The smaller scope and scale makes it easier for team members to wrap their heads around the effort and the amount of support required for the go-lives to be successful. However, by having multiple go-lives, the team now has to get up for more “showtime” events and more weekends and late nights performing pre-live cutover and go-live support. It can also make it more difficult to define when the project can be considered a success.

It is especially important to limit the number of phases and space them out appropriately. If they are too close together, it can feel like one very large and extended go-live, particularly if the initial phase is challenging and it is difficult to stabilize and move to support on time. I’ve also seen challenges where go-lives are spaced too far apart, and the project team and end users have become apathetic. If the amount of change at any one time is too little to be felt broadly across the organization, or too spread out, it can become difficult for staff to understand the benefits from the project and why the organization undertook this significant and expensive process. If choosing a phased approach, work carefully with the project team and vendor to make sure there is a realistic timeline with enough time between phases to appropriately stabilize and shift focus.

These considerations are just a small subset of the topics that are critical to discuss with the leadership team when deciding on a go-live approach. There are benefits and drawbacks to both approaches, and one size certainly does not fit all. With appropriate foresight and planning, either approach can be highly successful. There are a multitude of expert resources and organizations that can share lessons learned to help others follow in their footsteps.

Zack Tisch, PMP is director of strategic solutions with Nordic Consulting Partners.

Readers Write: Information Blocking: Don’t Blame the EHR

January 30, 2015

By Michael Burger


Healthcare IT seems to be getting some attention in Washington these days, and not necessarily in a positive way. As a case in point, a statement which affects healthcare IT was included in an explanatory statement by the chairman of the House Committee on Appropriations regarding the house amendment to the recently passed government spending bill.

Information Blocking. The Office of the National Coordinator for Health Information Technology (ONC) to use its authority to certify only those that … do not block health information exchange. The agreement requests a detailed report from ONC … regarding the extent of the information blocking problem, including an estimate of the number of vendors or eligible hospitals or providers who block information.

This is clear evidence that Congress is frustrated by the relative lack of data exchange despite an investment of $30 billion for healthcare IT. As the explanatory statement states, “ONC should take steps to decertify products that proactively block the sharing of information because those practices frustrate congressional intent, devalue taxpayer investments in CEHRT, and make CEHRT less valuable and more burdensome for eligible hospitals and eligible providers to use.”

No question, information blocking is a significant factor in the lack of data exchange. It is appropriate for Congress to expect a return on taxpayers’ investment. What concerns me is the prevailing but erroneous perception that EHR vendors have conspired to block information.

In the nascent HIT business of 20 years ago, there was a notion of a “closed system,” where data was only accessible by those using that system. In those days, the closed system was certainly used to sell additional software by controlling the flow of data. That business model was ideal for a marketplace many years ago with few competitors and no real demand for interoperability.

However, such a strategy no longer exists in today’s HIT marketplace, if for no other reason than that, to meet the certification requirements for Meaningful Use (MU), EHRs must be capable of interoperability with other EHRs. A claim that a company’s EHR “doesn’t work well when you mix and match vendors” would not be a smart selling tactic, since it openly defies the very premise of MU and because there are many, many competitors.

There are fees from EHR vendors for interoperability, data extraction, and conversion from one system to another. These cover the vendor’s cost to do the work plus a profit margin. (Let us not forget that these are, in fact, for-profit businesses.) While the marginal cost of extracting the data may be small, it is not a provider’s inalienable right to have their vendor provide services for free.

One form of information blocking is called a “walled garden.” In Joel White’s recent blog post regarding Information Blocking, he says, “Information blocking [in a walled garden] occurs not because different technologies or standards prevent data transfer between EHRs, but because EHR vendors or health care providers engage in this activity as a business practice. This is not a technology problem, but a competition one.”

I disagree that EHR vendors in recent times conspire to strategically erect walled gardens, but I do see that healthcare providers routinely engage in this activity as a business practice. The following example illustrates my point.

Let’s say that there are two integrated delivery networks (IDNs) in a given market. Each IDN has acquired ambulatory practices and positioned itself to offer a full spectrum of care, from pediatrics through geriatrics. Each advertises to its potential customers (patients) that it offers the highest quality, most convenient care in town. There is a competitive and profit incentive to keep patients within the network.

Now let’s say a patient is treated at IDN A and then receives treatment at IDN B. From a public health perspective, the patient’s records should flow from one to the other. But from a business perspective, there is no incentive in making it easier for a patient to go out of network and seek treatment at the other IDN. All IDNs use EHRs that are capable of exchanging clinical data in some capacity, but they do so grudgingly because of competitive concerns.

It’s appropriate for Congress to expect a return on our $30 billion healthcare IT investment. It’s refreshing to see that the authors of the spending bill understand the existence of information blocking. Let’s hope, however, that our new Congress doesn’t take the easy way out and blame EHR vendors for this phenomenon when it is really a result of competition among healthcare providers in the free market.

Michael Burger is a senior consultant with Point-of-Care Partners of Coral Springs, FL.

Readers Write: Oh, To Be a Dog in Boulder

January 21, 2015

By Bonny Roberts (and Juneau and Lily)


I adopted my standard American bulldog, Lily, from the Humane Society of Boulder Valley (HSBV) about two years ago. Due to her excessive and exuberant tail-wagging that resulted in a chronically open wound at the tip of her tail, we decided that, while the burgundy, Jackson Pollock-esque wainscoting that now decorated our home was provocative, we needed to dock Lily’s tail.

HSBV’s veterinary clinic agreed to do the procedure “at cost” since she had so recently been adopted. The bill was $300. Since that experience, I have taken both our dogs to their veterinary clinic for annual checkups and emergent needs. The staff is friendly and responsive and the veterinarian has a strong bedside manner and always calls post-procedure. 

My other dog, a Siberian Husky named Juneau, required two surgeries in 2014 to remove mast cell tumors, the first near her shoulder, the second on her hip. The latter healed poorly because of its location and required both an after-hours urgent care visit and one additional “observation day” at the clinic. I paid for everything out of pocket – prep, anesthesia, supplies, vet time, OR time, meds, the “Cone of Shame,” recovery time in post-op, urgent care, observation day, suture removal. The total was $875, not to mention the complimentary pedicure they had given her while under anesthesia.

A couple of weeks ago, the HSBV veterinary clinic sent me a link to a Pet Portal. After an easy enrollment process, consisting mainly of creating a login ID and password, I instantly had access to both my dogs’ vaccination, visit, and medication history. I also had the option to:

  • Set alerts and reminders for vaccinations and appointments (medical management).
  • Join community groups (social).
  • Read care guidelines on everything from behavior and aging to safety and disease (education).
  • Review diet details, if applicable (education).
  • Create a customized care instruction document which, after detailed investigation, could only be compared to discharge instructions, inclusive of a pre-populated med list and exercise routine (care transition).
  • Complete a customer survey (satisfaction and quality improvement).
  • Schedule our next visit based on visit categories, such as vaccinations, sick exam, follow-up, blood work, etc. 


I couldn’t help but compare my own lack of portal access with my primary care provider or the fact that my children’s pediatrician used to charge to fax over immunization records. If only I were a dog in Boulder County, my information would be far more accessible. 

Interesting facts about the Humane Society of Boulder Valley, a non-profit facility based in Boulder, Colorado:

  • HSBV is a no-kill, or live release, shelter that also offers behavioral training and medical services to stray and relinquished dogs and cats.
  • On average, dogs are adopted in seven days, cats in 12. In 2013 alone, they facilitated 5,698 adoptions. Relevant to that volume, their annual report notes, “Medical rehabilitation mends the bodies and spirits of more than 40 percent of the animals in the shelter annually. In concert with our medical care, we are dedicated to the mental health of our animals as well. Techniques and protocols developed at our facility are now being used by shelters to save more lives all across the country.”
  • The HSBV has ~700 volunteers and each dog gets three walks a day, while each cat is played with or stroked four times a day.
  • Fifty percent of the organization’s income comes from investment and trust income and contributions, with 55 percent of that going to healthcare for the sheltered animals.

While this comparison is apples to oranges on many levels, I do think there is relevance and value in it. What can we learn from organizations serving other verticals with similar missions, much as we do from foreign healthcare systems? In addition, I am convinced that the innovation and technology we have developed and are promoting here in the US has incited progress and more encompassing services to meet the holistic needs of more than just humankind. Here is to small victories.

Bonny Roberts is director of sales operations for Aventura of Denver, CO.

Readers Write: Death to the Dinosaurs! DRGs and the Legacy of What it Means Under the Affordable Healthcare Act

January 5, 2015

By Matthew B. Smith


You could really also call this piece “History Repeats Itself.”

For those of you who recall the reimbursement transformation of the healthcare industry from a cost-plus formula (no institution in the field lost money under this approach) to the DRG Era (October 1983), life is about to repeat itself.

Payment caps were placed on 466 diagnostic and therapeutic procedures based upon the type and place where a procedure was performed. The Old Guard of Dinosaur Hospital Administrators couldn’t adapt and the nearly 6,000 U.S. hospitals at the time underwent a financial operation that affected their health.

Prior to the enactment of the federal regulations, less than 1 percent of all inpatient facilities (which funneled about 80 percent of all healthcare dollars) were in financial straits. Within seven years, well more than half were suffering asset declines and nearly 20 percent were facing cash flow dilemmas that threatened their very existence as going concerns. New York State had more than half its hospitals in financial difficulty.

A new breed of administrator — drawn from outside the industry, by and large, and with MBAs, not the soon-to-be outdated MHA — had to find their way into the industry (along with substantially larger salaries and performance structures) to reformulate how these institutions play the game.

Now the Affordable Healthcare Act (AHCA) portends yet another drastic (read: draconian) change to our beloved industry. The current crop of dinosaurs will need to be replaced yet again.

The AHCA will drive reimbursement toward direct links with patient outcomes and be a distinct report card on the deliverers of care. Penalties for not achieving population catchment healthcare levels (too many readmissions; too many specific conditions with below-average status; higher costs per unit of service relative to the patient’s achievement level; mistakes in medication administration; higher-than-normal nosocomial infection rates) will cause the bottom 3-5 percent of providers to lose payments and have them redirected to the top 3-5 percent. This is a “taking from Peter to pay Paul”* concept so that net/net healthcare payments are flat.
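To make the budget-neutral mechanics concrete, here is a toy sketch of that redistribution. The provider counts, payments, scores, and percentages are all hypothetical, and the even split of the penalty pool is a simplifying assumption, not the actual CMS methodology:

```python
# Toy model of a budget-neutral quality adjustment: the bottom slice of
# providers by outcome score forfeits a share of payment, which is
# redistributed to the top slice. Net payments stay flat.

def redistribute(payments, scores, slice_frac=0.05, penalty_frac=0.02):
    n = len(payments)
    k = max(1, int(n * slice_frac))           # size of bottom/top slices
    order = sorted(range(n), key=lambda i: scores[i])
    bottom, top = order[:k], order[-k:]
    pool = sum(payments[i] * penalty_frac for i in bottom)
    adjusted = list(payments)
    for i in bottom:
        adjusted[i] -= payments[i] * penalty_frac   # Peter pays...
    for i in top:
        adjusted[i] += pool / k                      # ...Paul collects
    return adjusted

payments = [100.0] * 20          # 20 hypothetical providers, equal pay
scores = list(range(20))         # provider 0 has the worst outcomes
adjusted = redistribute(payments, scores)
assert abs(sum(adjusted) - sum(payments)) < 1e-9    # net/net flat
```

The final assertion is the point of the passage: the total paid out never changes, only its distribution across performers.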

The ability to achieve this and measure it so that it can be implemented (along with the act’s other mandated factors) will give rise to a new healthcare administrator extremely well versed in IT, data accumulation, and data mining. The accent on secured and incontrovertible healthcare information adhering to the concepts of confidentiality, integrity, and availability (CIA) to make an institution’s case will demand new management with a decided proficiency not only in amassing data, but in organizing it and in clinically and financially proving that the provider organization has successfully delivered care.

Failure to be a top performer or even a middling-level participant will have excruciating financial impact as it did when DRGs came into effect. These new breed leaders will look at industry and non-industry solutions to accumulate and manage the massive amount of data that the HITECH Act is encouraging. The New Breed will master it as other industries have shown they can, but the healthcare field will, once again, be strewn with the fossils of dinosaurs among providers and vendors who didn’t listen to the changing reimbursement and care outcome winds that are blowing.

*For those of you not familiar with the origins of this phrase, it arose when the Church of England separated from the Roman Catholic Church and the English King (Henry) levied taxes on the Cathedral of St. Peter (Catholic) in London to pay for the construction of the Church of St. Paul (which Henry headed as the Church of England) also in London.

Matthew B. Smith is president and CEO of SecLingua of Shelton, CT.

Readers Write: Leveraging Technology to Create Payer-Provider Collaboration

January 5, 2015 Readers Write No Comments

Leveraging Technology to Create Payer-Provider Collaboration
By Andrew Underhill

image

Although providers and payers play a critical role in elevating US healthcare, these two entities have not traditionally worked collaboratively, especially when it comes to sharing information. Up to this point, the two groups have shied away from exchanging data, with providers holding on to patient clinical information and payers protecting patient financial information. However, the walls are slowly but surely coming down as more providers and payers begin to partner in delivering accountable care.

There are both operational and patient care benefits to smoother provider-payer information exchange. For example, when providers are given access to claims data, they are able to garner a more longitudinal perspective of the patient’s health and treatment to date. This enables more informed care decisions and also reduces the likelihood of duplicative or unnecessary tests. It also limits the reliance on patient memory and perspective, ensuring care decisions are based on facts rather than educated guesses.

On the payer side, having access to clinical data allows the organization to adjudicate claims more effectively, improving efficiency and ensuring the most appropriate care for members. Clinical data also helps payers proactively manage their members’ care rather than responding to issues after the fact.

Despite the advantages, there are some significant roadblocks to payer-provider information exchange. Although some would suggest technology shortfalls have been the primary hurdle, it is the business barriers that present the greater obstacle. Providers in particular have been hesitant to share data with payers because it includes competitive and pricing information they would prefer to keep internal. Moreover, some physicians are concerned that making this kind of data available to payers may open the physician up to criticism on how he or she treated the patient.

To reap the benefits of information exchange while still acknowledging provider and payer qualms, organizations should take a well-considered approach to any data sharing arrangements. Following are a few tips to ensure these agreements meet all parties’ goals.

  • Appoint an advocate. To truly realize provider-payer data exchange, organizations must have an advocate who will push the idea forward and raise awareness of the strategic, financial and patient care benefits.
  • Develop a strategic roadmap. The advocate should work closely with organization leadership to identify the business drivers for data exchange and craft a strategic plan to lay the cultural and operational groundwork. Basically, this plan should be a roadmap that underscores the importance of provider-payer information sharing and defines how to achieve success.
  • Establish trading partner agreements. Before actually exchanging information, providers and payers must set up trading partner agreements that define the types of data to be shared, the appropriate data-sharing standards, and how the data will be used. This should be a customized agreement that reflects the unique needs and characteristics of the institutions involved. By setting the parameters upfront, both parties can be confident any data exchange will meet their needs and not violate internal strategic goals.
  • Define an exchange framework. Once organizations have a forward path, they can search out solutions to enable seamless information sharing. Providers and payers may want to consider using health information exchanges—such as those provided by the state or a local entity—to securely share data. A commercial product that facilitates payer-provider information exchange can also be a good option.

Greater transparency will ultimately drive better healthcare, and a key to transparency is robust data exchange between providers and payers. Organizations that embrace this idea now can be on the forefront of collaborative care, improving the patient experience and driving better health outcomes.

Andrew Underhill is a chief technologist at Systems Made Simple of Syracuse, NY.

Readers Write: The Eve of War

December 29, 2014 Readers Write 3 Comments

The Eve of War
By John Gomez

Steve Lewis arrived at his office at 7:03 a.m., draining the last remains of his grande mocha as he finished chewing on his blueberry scone. These were his last few minutes of peace before the day started. He did all he could to savor them as his laptop booted. He began the login to his corporate network.

Username:
Password:

WHAT THE HECK?

image

There on the screen in front of him was an image of a red skeleton and the words “Hacked by #GOP.”

Steve pressed Escape, F1, ALT-TAB, CTRL-ALT-DELETE. Nothing. The skeleton just stared back at him. Power off. No luck — the skeleton remained. He closed the laptop and opened it. The skeleton was still there.

The sudden ringing of the phone made Steve jump. He noticed that every line on his phone was lit up with inbound calls. He randomly chose one and answered, “Sony Pictures network support, Steve speaking …”

Steve would handle hundreds of calls that morning, as would his colleagues. Everyone reported that their computer bore the image of a skeleton. Within minutes, word had spread across the corporation of the computer attack.

Managers scrambled to calm employees and asked them to remain, though many decided to take immediate time off as they didn’t feel safe. If you were to have asked Steve’s colleagues that morning, not one of them would have said, “I feel safe and secure.” 

In the coming days, Sony Pictures executives would make a gutsy choice and agree to the demands of the company’s attackers. Meanwhile, several hundred miles away, members of the Department of Defense Cyber Command were spending their time analyzing cybermunitions and strategies to provide the President of the United States with options in the event he ordered a cyberattack on North Korea.

As the dawn of 2015 appears on the horizon, the United States is poised to engage in the first cyberwar in the history of mankind. If there is any irony to all of this, it would be that it all reads very much like a Tom Clancy script. Unfortunately, all of the events and the situation we find ourselves in as the year comes to an end are all too real.

The attacks on Sony Pictures by North Korea are interesting, and studying what happened is critical to protecting our own infrastructure and systems. The key takeaway is that although the attacks were not sophisticated or highly technical, the strategy of those who executed them was advanced.

We now know that Sony was being probed and scanned for months, with the sole purpose being to gather massive amounts of intelligence that could be used to formulate escalating attack strategies. We also know that as a result of this intelligence gathering, the attackers were able to carefully and selectively control the attacks and the resulting damage.

We should also keep in mind that because the attacks themselves were not highly advanced, proactive security hardening measures could have helped Sony minimize or defend against them.

What do we do now? We as an industry and nation have never had to prepare for a cyberwar. The battle is now all of ours. The actions we take in the coming days and weeks will be critical to how we navigate and survive whatever may occur on the cyberfront.

The top three targets for cyberterrorism and warfare are finance, utilities, and healthcare. Attacking any of those areas creates extreme consequences for citizens. Of the three, the most damaging would be healthcare. The worst case would be affecting patient outcomes in some form or manner. In my eyes, this could be done.

My prescription is as follows.

Top-Down Education

Educate the C-suite and board of directors to provide clarity in terms of what occurred and the reality of the attack types and strategy. Clarify the resources and support needed to harden systems.

Little Things Matter

The technically simple attacks on Sony were effective because Sony didn’t do the little things: it used old technology like Windows XP, didn’t enforce security policies and procedures, and gave in to the screaming user or privileged executive while compromising the overall welfare of the organization.

Holistic Approach

Fight as a team. Cyberattacks aren’t about singling out one system. They involve finding a vulnerability anywhere and exploiting that for all it’s worth. If someone can exploit security cameras to gather compromising information that leads to greater exploits, they win. Think of the entire organization, physical and digital, as a single entity and then consider the possible risks and threats. What if someone shut down the proximity readers? What if they disabled the elevators? What if biometric devices or medical devices running Linux were infected with malware?

Monthly War Games

This is a fun way to build a security-minded culture. Once a month, gather the security team (which should represent the physical and digital world) and start proposing attacks and how the organization would respond or defend. Invite someone from outside.

Fire The Professionals

Organizations rely on those who help them feel good by saying all the right things – clean-cut consultants with cool pedigrees and fancy offices. Those might be the right people to review financials, but for security, look for crazy, go-for-broke, “been there, done that” people. The ones who leave you a little scared when you meet them, wondering whether they bugged your office while you stepped out for a minute. When it comes to testing systems and infrastructure, be liberal with the rules of engagement and highly selective in who to engage. Get someone who makes everybody uncomfortable but who can also provide guidance.

Admit You Need Help

For most people, cybersecurity is not something they do day and night. Even a dedicated team won’t see everything outsiders see because they are exposed only to a single organization. Consider getting help from people who do this every second of the day, whether that help entails remote monitoring, managed services, surprise attacks on a subscription basis, or quarterly educational workshops. The SEAL teams of cybersecurity exist.

Education Matters

Cybersecurity education is as critical as that for infection control and privacy. It could be that last line of defense before becoming the next Sony, Target, Kmart, Staples, or Sands Casino. Also consider providing ongoing education for the in-house technologists.

Integrate Business Associates

Don’t let business associates do whatever they want. Set standards and insist that they be followed. Minimize the data shared with them, enforce strong passwords, require surprise security assessments, and get the board and C-suite to understand that business associates are often the weakest link.

The Technology Vendor Exposure

Hardware or software doesn’t matter — most vendors do not design or engineer secure systems. Not because they don’t want to, but because they overlook things when trying to get hundreds of features to market while dealing with client issues and priorities. Not to mention that many of today’s HIT systems were designed and developed decades ago, well before the words “buffer overflow,” “SQL injection,” or “cyberwarfare” were known. Push vendors hard to demonstrate how they are designing and developing highly secure systems that keep customers and patients safe and secure.

Security Service Level Agreement

Do this if nothing else – it will make sure the other stuff gets done. Set a clear and aggressive Security Service Level Agreement (SSLA). This should be a critical success factor that holds the CIO, CISO, COO, and CEO accountable. Defining what is part of the SSLA should be a joint venture between the C-suite and the board, but it should clearly dictate the level of security to be maintained and how it will be measured.
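What “clearly dictate the level of security to be maintained and how it will be measured” might mean in practice is a short list of quantified targets, each with an accountable owner. A hypothetical sketch (every metric, threshold, and owner below is invented for illustration):

```python
# Hypothetical SSLA: measurable security indicators, each with a
# target and an accountable executive. All metrics/thresholds invented.
SSLA = {
    "endpoints_patched_within_30_days_pct": {"target": 98, "owner": "CIO"},
    "mean_time_to_detect_hours": {"target": 24, "owner": "CISO",
                                  "lower_is_better": True},
    "staff_security_training_pct": {"target": 100, "owner": "COO"},
}

def scorecard(measured):
    """Compare measured values against SSLA targets.

    Returns {metric: (met?, accountable_owner)} for board-level reporting.
    """
    out = {}
    for name, spec in SSLA.items():
        value = measured[name]
        if spec.get("lower_is_better"):
            met = value <= spec["target"]
        else:
            met = value >= spec["target"]
        out[name] = (met, spec["owner"])
    return out

results = scorecard({
    "endpoints_patched_within_30_days_pct": 95,   # missed target
    "mean_time_to_detect_hours": 12,              # met target
    "staff_security_training_pct": 100,           # met target
})
print(results)
```

The point of the structure is accountability: every miss maps directly to a named executive, which is what makes the SSLA enforceable rather than aspirational.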

These aren’t earth-shattering suggestions. However, had someone from Sony read this last year, they would have said, “We already do this,” yet Sony may very well end up being a case study for cybersecurity (and depending what happens in the coming days, a key part of our history lessons for centuries to come).

The bottom line is that HIT is an insecure industry that has not done enough to pull forward and become the standard of cybersecurity that everyone outside the industry expects (and thinks we are already doing).

Now is the time to set a standard, fight back, and take things to a new level. Sony provides an opportunity to educate the board, create a partnership with the CEO, reexamine trusted partnerships, and push vendors to step up their game. Let’s hope that Sony is more than enough to be a call to action for our industry.

John Gomez is CEO of Sensato of Asbury Park, NJ.

Readers Write: EHR Vendors: Barriers to Interoperability

December 29, 2014 Readers Write 2 Comments

EHR Vendors: Barriers to Interoperability
By King Coal

As a patient and taxpayer, I encourage everyone to contact their Congressional members about this topic. Mention that the barriers to EHR interoperability are not just technical — they are contractual as well.

EHR vendors that enjoy the benefit of our tax dollars under the HITECH Act are preventing interoperability — and innovation around the edges of their EHR products by third-party developers — by placing limitations and threats in their contracts with clients. The vendors who are engaged in this antitrust behavior can point to their technology and say, "See? We can share data. We follow data sharing technical standards. Quit criticizing us."

But when you look at these vendors’ contracts, the license fees associated with interoperability are cost prohibitive. In addition, the interoperability clauses are surrounded by onerous contractual obstacles that are veiled to protect the vendors’ intellectual property, but are actually ensuring the vendors’ continued monopoly and preventing innovation around their products.

This behavior on the part of some EHR vendors is strikingly ironic given the enormous success of open source and easily accessible APIs that benefit interoperability. The more open a product is from a software architecture perspective, the more value accretes to its intellectual property. Open, transparent APIs create a larger dependence and ecosystem around products, not less.

Several years ago, I sponsored a meeting with senior executives from three large EHR vendors, lobbying them to open their APIs and migrate from tightly coupled, difficult-to-modify message-oriented architectures to loosely coupled, flexible, service-oriented architectures with open, published APIs so that my development teams could write innovative products around the edges of these EHR products.

I will never forget the response from one of those EHR vendor’s senior executives: “We see ourselves as more than a database vendor.” Meaning, of course, “Our closed APIs are a market advantage.” 

Bill Gates and Microsoft used to think the same thing about Windows, Office, and Internet Explorer. You can see how that worked out for them when you compare what’s happened with the openness of Android, iOS, the browser market, and office suite products. Salesforce.com is the supreme example of business success based upon an open API and open culture.

A colleague described his thoughts in an email:

Current interoperability standards selected by the ONC and required by MU-S2 do not contain an adequate amount of data/data types to support the quality measurement requirements of the same MU-S2 program. This gap in data is what enables the EHR suppliers to continue the veil of interoperability while still protecting their proprietary intellectual property, serving the interests of the owners of these companies with little regard to what may be best for care, providers, patients, or consumers.

Several EHR vendors are banding together around a new magic-bullet technical standard called HL7 FHIR, spurred by the JASON task force recommendations. While this new standard is great from a technical perspective (XML, JSON, REST, etc.), in its current form, based largely on existing HL7 v2, v3, and CDA concepts, it does not improve the accessibility of proprietary EHR data types, and those data types are needed for quality and cost performance improvement in healthcare. While FHIR could be expanded to include this type of data, it appears the first efforts are focused on reinventing the technology for currently defined interoperability data types.
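To make the technical side concrete: FHIR exposes resources such as Patient as plain JSON (or XML) over REST. A minimal sketch of parsing one such resource, using an invented example payload rather than any real server response:

```python
import json

# A minimal FHIR-style Patient resource, shaped like the body of a
# GET [base]/Patient/example response. The data here is invented.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

patient = json.loads(patient_json)

def display_name(resource):
    """Join the first listed name into a display string."""
    name = resource["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")])

print(display_name(patient))  # Peter James Chalmers
```

Easy parsing at this layer is exactly the issue raised above: the transport is open and pleasant to work with, but the clinically rich, proprietary data types may simply never appear in the payload at all.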

I’m not sure what, if anything, Congress can do at this point to fix the ills of Meaningful Use Stage 1, which rewarded existing vendors with billions of dollars in tax money to maintain those vendors’ closed and proprietary APIs. Decertification by ONC will become a bureaucratic mess, but I appreciate the symbolic stance taken by Congress around decertification nonetheless.

One thing must happen, and maybe our legal courts are the only option for it: the contractual threats and barriers in EHR vendor contracts that stand in the way of interoperability and innovation must be removed.

Interoperability and innovation in healthcare IT are suffering, both technically and contractually, from old-fashioned, old-school thinking on the part of EHR vendors. As a consequence, our healthcare system and patient care are suffering, too.

Readers Write: What Physicians Want From Their Medical Software

December 29, 2014 Readers Write No Comments

What Physicians Want From Their Medical Software
By Charles Settles

image

Physicians looking for medical software have many options. With hundreds of healthcare IT vendors and bloated feature sets, making a decision can be difficult, especially when purchasing a system for the first time.

Many physicians are skeptical of vendor claims (especially regarding workflow efficiency) and healthcare IT in general. Additionally, learning a new system can be a daunting task for busy providers who have spent years managing patient encounters with paper charts. Some providers are opting out of healthcare IT entirely and are accepting reimbursement reductions or taking early retirement in order to avoid electronic health records and other systems.

Conventional wisdom (and the marketing material from vendors) would lead healthcare IT buyers to believe that Meaningful Use incentives are the number one reason to buy medical software. Based on responses we’ve received, fewer than 10 percent of physicians care whether or not their electronic health records system is certified for Meaningful Use. The latest data from CMS would seem to confirm this; less than 1.5 percent of physicians and organizations that attested for Stage 1 of the program have successfully attested for Stage 2.

The biggest factor for most physicians is effective document management. This should come as no surprise. It is difficult to achieve the goal of a paperless office without such tools. Despite requirements for health information exchange, interoperability between medical systems remains difficult. Many providers still use fax machines to coordinate care and share notes. An electronic health records system with built-in fax capabilities allows providers to bypass this. Additionally, the role- and user-based access capabilities provided by these systems keep health information secure in a HIPAA-compliant manner.

The second-most requested feature for medical software is template-based progress notes and orders. Despite concerns with upcoding or indecipherable template-based notes, most physicians want to be able to use customized templates to save time during encounters. One otolaryngologist said he performed “the same three procedures for over 90 percent of patients.” Using a template makes the most sense for providers who find themselves in a similar situation. Primary care providers were the only specialists to show an aversion to template-based notes, which makes sense, as a primary care provider is likely to deliver a much wider variety of care than a specialist.

Other features are less of a surprise: a patient portal, e-prescribing, and tablet or mobile-based access round out the top five most-requested features by providers using our service. Also, despite security and uptime concerns with cloud-delivered systems, it’s worth noting that fewer than 15 percent of providers asked for medical software that could be installed on their own server; 56 percent of providers requested cloud-based software; and the rest had no preference.

Despite the trend of providers opting out of the Meaningful Use Incentive Program, the market for electronic health records and other medical software systems remains significant. With estimates of healthcare IT adoption rates rising above 80 percent, many of these purchasers are replacing an existing system. This could explain some of the feature preferences, especially the significant preference for strong electronic document management capabilities.

Charles Settles is a product analyst at TechnologyAdvice.

Readers Write: Review of the mHealth Summit

December 17, 2014 Readers Write No Comments

Review of the mHealth Summit
By Norman Volsky

image

Last week I attended the mHealth Summit in Washington, DC. I met with 25 vendors over a two-day period and came away with several takeaways regarding mHealth industry trends and the companies that correspond to each.

 

Changing Patient Behavior

Many companies are trying to change a patient’s behavior and inspire them to participate in their own rescue. If you can change a patient’s behavior that resulted in them getting sick (eating unhealthy, not exercising, smoking, etc.), that patient has a better chance of staying healthy.

  • Telcare. Mobile diabetes solution that allows patients to manage their condition more effectively by providing them with timely and actionable information. Their enterprise glucose monitoring solution enables the entire care team to be connected so they can make a difference.
  • Propeller Health. FDA-cleared asthma and COPD management vendor that helps patients and physicians better manage chronic respiratory conditions. Digital products that have therapeutic benefit.

Reducing Cost and Readmissions

These two themes go hand in hand. Health plans, ACOs, and employers are looking to treat patients outside of the four walls of a hospital. Telehealth in the form of online doctor visits is helping reduce cost significantly for the healthcare system. Monitoring patients remotely and making sure the entire continuum of care is informed can prevent readmissions, reduce the cost of care by treating patients in the appropriate care setting, and prevent catastrophic events.

  • MDLive. Consumer-focused telehealth vendor that provides concierge connected care for customers of all socioeconomic backgrounds. MDLive has the potential to become the Uber of telehealth by providing a fully integrated end-to-end solution to its customers.
  • Twiage. Communication platform that improves clinical workflow and outcomes by allowing first responders to deliver real-time data from an ambulance to an emergency department physician.
  • Ideal Life. End-to-end remote patient monitoring vendor that has been in the space for 12 years. Allows patients to self-monitor using a wearable device.
  • TruClinic. Medical Skype on steroids. Telemedicine services vendor that allows physicians to use the same workflows they are already using daily.
  • Wellpepper. Patient engagement vendor that provides personalized mobile care plans to patients. Reduces cost by using video capabilities that reduce need for multiple physical therapy visits.
  • SnapMD. Telemedicine vendor that enables doctors (particularly specialists) to develop a digital practice in addition to their core business. Leverages built-up trust with a patient’s personal physician.
  • Lively. Personal emergency response vendor that provides a non-invasive wearable device and activity sensors that monitor an elderly person’s behavior and alert family members if that behavior changes, helping to prevent falls and emergencies.

Managing Risk Effectively

Government regulation has changed how patient care is being paid for. The healthcare industry is morphing from a fee-for-service to a pay-for-performance environment. If a health system can effectively manage risk, they are much better positioned in the new environment.

  • Wellbe. Guided episode management vendor that helps organizations manage risk more effectively and transition into value-based care and bundled payment environment.
  • Acupera. True population health management vendor that created a unique workflow engine that guides physicians on a minute-by-minute basis and assigns tasks to the appropriate care team members.

Communication, Interoperability, and Secure Messaging

Patient information is extremely sensitive and confidentiality is paramount. HIPAA compliance is required. Companies have used secure texting, communication, and interoperability to improve medication adherence, referral management, clinical workflows, and many other issues in the healthcare market.

  • CareSync. Facebook for your health. Mobile health platform that helps build a unified patient record and a common care plan. Allows doctors, family members, and friends to monitor a patient’s chronic condition and overall health.
  • Memotext. Medication adherence vendor using a secure messaging platform and behavioral questionnaires to improve patient compliance to medication regimens.
  • Health123. Patient engagement platform that allows HIPAA-compliant communication.
  • Carevia. Telecommunication platform that helps organizations with interoperability.
  • Doc Halo. True mobile health platform that improves workflows and reduces readmissions by enabling secure communication throughout the continuum of care.
  • Mobile Health One. Communication platform that allows validation at the point of registration. Solution has real-time fluidity that improves clinical workflows.
  • Shift Health. Mobile patient engagement platform that addresses survey fatigue by customizing surveys for a healthcare facility.
  • Zoeticx. Sells a middleware solution that addresses patient medical information flow. They help improve outcomes and workflows by overcoming the problems of effective health information exchange and poor EHR interoperability. Their mobile platform has care coordination tools as well as a secure messaging platform that is triggered based on events.

Miscellaneous Emerging Technology

There were several vendors I met with that were doing some unique things that did not fit into the above industry trends.

  • VisualDx. Specializes in diagnostic clinical decision support. They differentiate from other clinical decision support vendors by using visual diagnostics to help physicians arrive at the correct diagnosis. They also have a search tool to isolate common infectious diseases by specific country. The recent news surrounding misdiagnosed cases of Ebola has moved this type of technology to top of mind for hospital C-level executives.
  • Validic. Industry-leading digital health platform that delivers easy access and actionable data that healthcare companies can analyze effectively. It is a back-end solution that provides maintenance and integration for the entire digital health ecosystem.
  • J Street Technology. Scheduling software that automates the process of backfilling cancelled appointments. Securely texts patients to confirm appointments and makes sure doctors’ schedules are optimally filled.
  • Care Connectors. Back-end integration vendor that provides bi-directional communication and coordinated care solutions to enable the healthcare ecosystem.

Overall, I saw a lot of awesome technology. This is a growing, exciting space and I am very fortunate to talk to interesting people throughout the industry daily. It is not surprising that private equity and venture capital firms are investing heavily in the mHealth market and I think they will continue to do so for many years to come.

Norman Volsky is director of the mobile healthcare IT practice of Direct Recruiters, Inc. of Solon, OH.

Readers Write: Automate Your Informed Consent Process: Lessons Learned from the Joan Rivers Tragedy

December 10, 2014 Readers Write 4 Comments

Automate Your Informed Consent Process: Lessons Learned from the Joan Rivers Tragedy
By Tim Kelly

image

A number of errors have recently come to light in the investigation of the tragic death of Joan Rivers. The endoscopy clinic that treated the 81-year-old comedian was cited by the New York State Department of Health for numerous deficiencies, including failing to obtain informed consent for each procedure performed. Organizations should review the following processes and ensure that they are in place to avoid deficiencies such as those cited at Yorkville Endoscopy.

  • Append the consent to the electronic medical record at the time it is executed. A recent study published in JAMA Surgery found that signed consents were missing for 66 percent of patients at the time of surgery, resulting in delays for 14 percent of the cases. It is clear that Ms. Rivers agreed to a specific treatment when she presented at Yorkville Endoscopy on August 28. It also appears that the documentation of that consent may not have been adequate to address all aspects of the procedures that were ultimately attempted.
  • Ensure that the informed consent document states the exact procedure(s) or treatment(s) to be performed. Many hospital consent forms are one-size-fits-all or fill-in-the-blank documents. The former are of little value in verifying the patient’s understanding of the planned procedure if the document is reviewed retrospectively. The latter are frequently flawed by illegible handwriting or abbreviations. An analysis of the Rivers case suggests that consent may have been obtained for an esophagogastroduodenoscopy (EGD) but not for the two nasolaryngoscopy procedures that may have resulted in complications that in turn may have contributed to her death. Automated systems can force the clear delineation of planned procedures while also documenting possible treatments and interventions that may be pursued intraoperatively.
  • Identify and confirm the providers who will perform the treatment or procedure. Many organizations employ electronic credentialing systems to identify which providers have privileges to perform certain procedures. Yorkville Endoscopy was cited for allowing a physician who was not privileged at the facility to participate in the treatment of Ms. Rivers. Automating the consent process, and integrating that process with a credentialing system, ensures that only providers authorized to perform the contemplated procedures are documented on the consent form. This practice can mitigate the potential for deviations involving non-credentialed providers.
  • Obtain the patient’s permission for observers and photography. It is vital for teaching organizations to allow the presence of observers and sometimes the recording of surgical procedures. It is also essential that the patient give his or her permission for the presence of observers and the use of photography. It appears in the Rivers case that unauthorized observers were present and unauthorized photographs were taken during the procedure. Automating documentation of consent, including allowance for observers, authorization for photography, preferred disposition of tissue samples, and similar permissions, allows for those preferences to be communicated to other HIT systems. This practice can help ensure that patients’ wishes are followed.
  • Leverage the consent in the time out. Yorkville Endoscopy was cited for not following an acceptable time out procedure. Review of the consent form immediately prior to the start of a surgical procedure is a key component of the Joint Commission’s Universal Protocol. Significantly, verification of informed consent documentation – documentation that lists the procedures as well as the surgical site – has been found to be the most effective mechanism for avoiding wrong person / wrong procedure / wrong site surgery.
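Automation makes these checks mechanical rather than dependent on memory. As a rough illustration only — the data structures and names here are hypothetical, not any vendor's product — a consent system integrated with a credentialing system could run checks like these at the time out:

```python
def verify_time_out(consent, scheduled, privileges):
    """Sketch of the automated checks a consent system could run at the
    surgical time out: every scheduled procedure must appear on the
    signed consent, and every scheduled provider must hold privileges
    for those procedures. Returns a list of problems (empty = cleared)."""
    problems = []
    for proc in scheduled["procedures"]:
        if proc not in consent["procedures"]:
            problems.append(f"procedure not on consent: {proc}")
    for provider in scheduled["providers"]:
        allowed = privileges.get(provider, set())
        missing = [p for p in scheduled["procedures"] if p not in allowed]
        if missing:
            problems.append(f"{provider} not privileged for: {', '.join(missing)}")
    return problems
```

In the Rivers scenario, a check like this would have flagged both a procedure absent from the consent and a provider without privileges before the case proceeded.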

It should be noted that informed consent documentation alone cannot correct all of the procedural deficiencies that were identified by the Department of Health in the Joan Rivers case. However, a well-prepared, procedure-specific consent can serve as both a contract and a roadmap for how a procedure or course of treatment should be performed. When the consent process is facilitated electronically and that process is integrated with other HIT systems, including the EHR, the risk of deviations or errors may be minimized.

Many of the findings in the New York State Department of Health report were not that policies were lacking, but that established policies were not followed. Automation, by its nature, helps ensure compliance with an organization’s policies and procedures.

An excellent policy on automating the informed consent process has been developed by the Department of Veterans Affairs.

Tim Kelly is director of marketing of Standard Register Healthcare of Dayton, OH.

Readers Write: The Case for Smarter Clinical Workflows

December 10, 2014 Readers Write 2 Comments

The Case for Smarter Clinical Workflows
By Sean Kelly, MD

The practice of evidence-based medicine can promote patient safety, increase quality of care, and improve clinical outcomes. Providers are increasingly being held accountable to abide by regulatory standards, Meaningful Use guidelines, and Centers for Medicare and Medicaid Services incentive and penalty programs.

The move toward measuring quality and patient safety as key performance indicators in healthcare makes sense, but accomplishing these goals relies in large part upon improving efficiency. Unfortunately, inefficiency is inherent in many of today’s clinical workflows, which detracts from the patient care process by bogging down providers and disrupting the care team’s collective thought process.

The answer is to implement technologies and processes to enable smarter clinical workflows that promote efficiency while also improving quality of care.

Take, for instance, clinical communication. As an emergency physician, I see firsthand the need for faster, more effective communications. If I am able to quickly receive information, share it with colleagues, and coordinate next steps, I can better care for patients. Unfortunately, relying on pagers and other outdated technologies creates barriers that can delay care and have a significant impact on patients, especially in critical care situations.

Consider a heart attack patient. It is essential that providers are able to diagnose and treat the patient as quickly as possible to ensure that no permanent damage occurs. In cases of ST elevation myocardial infarctions (STEMIs), streamlining clinical workflows to speed the time from door to balloon — the time from patient arrival to catheterization of the coronary arteries to alleviate the occlusion—can mean the difference between complete recovery and a life of struggling with congestive heart failure … or worse.

Cath lab activation is a coordinated effort which may involve many different care providers and care teams. This makes the workflows vulnerable to the negative impacts of inefficient communications. In this situation, invaluable time is potentially wasted from step to step, time that could substantially impact the patient outcome.

This scenario highlights the need for—and benefits of—a smarter clinical workflow. For example, if providers could use a secure communications solution to send group messages to the care team, coordination and activation of the cath lab would be far more efficient. In this scenario, the smarter clinical workflow includes technology that allows:

  • Immediate, synchronous, bi-directional secure messaging with the ability to send high definition images to assist in rapid diagnosis and collaboration over best treatment option (resuscitate and open up the cath lab).
  • Direct integration into scheduling and on-call systems to facilitate tracking of team members, complete with read receipts, send receipts, and auditability to enable accurate, rapid messaging capabilities (ensure that the correct people are on call, aware they are on call, and rapidly respond when called, complete with escalation if any delays in response).
  • Group messaging capabilities to send code team activation directly to multiple devices so team members get alerted more quickly, simultaneously, and messages and responses are easily tracked and acted upon, instead of multiple pages (and waiting for callbacks).
  • Multi-site communication systems to allow the notification of other clinicians needed for complete care delivery, such as the patient’s primary care physician, specialist, or case manager, to provide notifications about the patient’s condition and follow-up instructions for care (which could also prevent unnecessary readmissions).
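The fan-out, receipt-tracking, and escalation behavior described above can be sketched in a few lines. This is a toy model under my own assumptions — the class and field names are invented for illustration, not any vendor's API:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Message:
    text: str
    recipients: list
    acks: set = field(default_factory=set)
    sent_at: float = field(default_factory=time.time)

class CodeTeamMessenger:
    """Toy fan-out messenger: one send reaches every on-call member,
    acknowledgements are tracked for auditability, and recipients who
    have not acknowledged within the timeout are flagged for escalation."""

    def __init__(self, escalation_timeout=120):
        self.escalation_timeout = escalation_timeout  # seconds
        self.audit_log = []

    def send_group(self, text, on_call_roster):
        msg = Message(text=text, recipients=list(on_call_roster))
        self.audit_log.append(("sent", text, tuple(on_call_roster)))
        return msg

    def acknowledge(self, msg, member):
        msg.acks.add(member)
        self.audit_log.append(("ack", member))

    def needs_escalation(self, msg, now=None):
        now = now if now is not None else time.time()
        if now - msg.sent_at < self.escalation_timeout:
            return []
        return [m for m in msg.recipients if m not in msg.acks]
```

The point of the sketch is the contrast with paging: one group send replaces serial pages and callbacks, and the audit log and escalation list make delays visible instead of silent.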

This is just one of many examples of how more efficient communication can impact the healthcare continuum. Giving physicians, nurses, and other care providers the tools to do their jobs more effectively can help hospitals meet quality and patient safety goals, support accountability, and most importantly, improve the overall quality of patient care.

Sean Kelly, MD is chief medical officer at Imprivata and emergency physician at Beth Israel Deaconess Medical Center in Boston.

Readers Write: Summary of RSNA and My Takeaways

December 8, 2014 Readers Write 2 Comments

Summary of RSNA and My Takeaways
by Mike Silverstein

I just returned from the 100th Radiological Society of North America (RSNA) conference at McCormick Place in Chicago. It was my fifth time attending this show. It is always well attended given the core importance of diagnostic medical imaging within the healthcare provider community.

I was particularly paying attention to the messaging of the vendors in the room and the value propositions they put forward given the budget constraints within healthcare IT.

  • RSNA is international. As opposed to HIMSS, AHIMA, MGMA, etc., RSNA is populated by vendors from all over the world. As such, the attendees include large contingents of representatives specifically from hospitals in Europe and Asia in addition to North America.
  • If you have never attended the show, half of the exhibits (if not more) are focused on large pieces of capital diagnostic equipment: MRI, CT, monitoring, etc. As a result, some of the booths (Siemens, GE, Agfa, Fujifilm, etc.) are huge. I’m talking multiple city blocks.
  • Unlike HIMSS, where there is an annual influx of new companies with net new technologies, RSNA is similar from an exhibitor perspective year over year. There is still a tremendous number of companies talking about PACS, RIS, and CVIS, although when I spoke with a number of the executives at those booths, the market for standalone imaging systems is stagnant.
  • The buzz in the room was primarily centered around image sharing technologies like vendor-neutral archiving, enterprise imaging, cloud-based image storage, multi-site reading interoperability, and other technologies focused on breaking down silos and disparate systems. The focus of these firms is helping hospitals, imaging centers, and the like to leverage and get more usability and flexibility out of their existing PACS, RIS, and CVIS systems. Vendors such as Mach7 Technologies, SCImage, Merge, Agfa, Acuo Technologies (now a part of Perceptive Software), Accelerad (aka seemyradiology.com, now a part of Nuance), and others were among the groups focused on flexible image interoperability systems.
  • There was a good deal of activity as well at the TeraRecon and Vital Images (now part of Toshiba) booths. Both of these vendors have historically been known for their capabilities in 3D and 4D imaging, but both are trying to educate the market on some of their new enterprise imaging capabilities.
  • There were other workflow vendors focused on speech recognition and other complementary diagnostic tools such as MModal with its Fluency product, Nuance with its Powerscribe 360 product set, and Dolbey with its Fusion product, which was Best in KLAS the last couple of years. These booths had good activity too.
  • Another well-represented area that should continue to grow is the teleradiology segment. Reading of remote images has been going on for years, but as the industry focuses on providing better quality of care to remote areas and as telemedicine as a whole is on the rise, these companies in my opinion are still a good bet.
  • Lastly, there was a new vendor that I thought was very interesting called MedCPU, which recently deployed at the Cleveland Clinic. They have a solution that operates behind the scenes of an EMR, RIS, or any other clinical documentation system that can read and comprehend unstructured notes, text, test results, speech (from a Nuance or MModal system), and any other clinical information. The solution analyzes this information, cross-checks it against compliance guidelines and clinical best practices, and identifies variances in real time to alert the clinician to potential medical errors. They incorporate a combination of natural language processing and other homegrown technologies. After viewing their demo, I think they are a company to watch.
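The variance-checking idea can be illustrated with a toy rule engine. This is a simplified sketch of the general concept, not MedCPU's actual technology; the rules and concept strings below are hypothetical stand-ins for what an NLP layer would extract from notes and results:

```python
# Each rule pairs a triggering condition with an expected action;
# a variance is any encounter where the condition was documented but
# the expected action does not appear in the extracted facts.
GUIDELINES = [
    ("stemi_suspected", "aspirin_given"),
    ("sepsis_suspected", "lactate_ordered"),
]

def find_variances(encounter_facts):
    """encounter_facts: set of concept strings that an NLP layer has
    extracted from unstructured notes, test results, and dictation.
    Returns the expected actions that appear to be missing."""
    return [expected
            for condition, expected in GUIDELINES
            if condition in encounter_facts and expected not in encounter_facts]
```

The hard part in a real system is the extraction step — turning free text into reliable concepts — not the rule comparison itself.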

All in all, RSNA was well attended this year, but I think that the general consensus is that the large vendors need to figure out how to move the needle while helping CIOs keep costs down and get more out of their existing imaging systems. This will be a challenge for some of the big, publicly traded players, but the future looks bright for the nimble enterprise imaging interoperability companies who are gearing up for Meaningful Use Stages 3 and 4 that require the incorporation of medical images into the EMR.

Mike Silverstein is a managing partner of Direct Consulting Associates of Solon, OH.

Readers Write: 10 Talent Trends to Watch in 2015

December 3, 2014 Readers Write 1 Comment

10 Talent Trends to Watch in 2015
By Anthony Caponi

The entirety of my career has been spent in the healthcare staffing industry, and over that time I have seen both ends of the spectrum. There were tough times in 2008 and 2009 as the nation’s economic recession spilled into healthcare hiring. Then, as part of the American Recovery and Reinvestment Act of 2009, numerous jobs were created with the promotion of EHR adoption.

The healthcare IT industry is absolutely on the rise. However, we will also see some obstacles, including a talent and skills gap. Below is a list of 10 trends to watch in 2015.

Increasing Mergers and Acquisitions

Healthcare reform is becoming a powerful catalyst for the consolidation and integration trend in the hospital industry. A study conducted by Kaufman Hall found that hospital mergers and acquisitions increased 10 percent in the first quarter of 2014 compared with the same time frame the previous year. Studies indicate this trend will continue, and these mergers and acquisitions are putting a number of highly qualified CIOs into the job market.

Big Data Employment Boom

The data economy needs dedicated people — 4.4 million of them by 2015 in the IT field alone, according to a Gartner Research analysis. In the U.S., a McKinsey & Company report projects a shortfall of between 140,000 and 190,000 big data professionals with deep analytical skills by 2018. Additionally, the impact of big data on employment goes far deeper than the deep analytics and IT fields. Companies need professionals at all levels who are not necessarily trained in deep analytics but are nevertheless big data-savvy.

New C-Level Positions

The chief data officer (CDO) is a new position coming into play in the healthcare IT industry. Hospitals are using the role to try to "leverage data as a strategic institutional asset … It’s about how to transform data into information, how to transform information into better-informed decisions," according to Seattle Children’s Hospital CDO Eugene Kolker.

Another position that is becoming more popular in the healthcare IT space is the chief nursing information officer (CNIO). According to a Modern Healthcare report, about 30 percent of hospitals and health systems now have a CNIO and that number is expected to grow. CNIOs are helping hospitals implement their EHRs and other healthcare IT projects because of their expertise in how nurses use patient data.

Growing Job Market

The healthcare sector is poised to add 5 million jobs by 2020, according to a report by AMN Healthcare. The increased use of technology for healthcare applications is the primary factor for the growing job market. Healthcare job growth averaged 26,000 positions per month between March and September of this year, jumping significantly in the second quarter and continuing into the third quarter, according to the Altarum Institute’s Center for Sustainable Health Spending.

More Interim Executives

Demand for interim executive talent is growing and will become a larger part of the employment picture, especially in healthcare IT roles such as CIO and CMIO. With a sizable number of baby boomers expected to retire, combined with the number of integrated delivery networks and hospitals in the U.S., a shortage of experienced healthcare executives is likely in 2015, and demand for interim healthcare executives will only grow.

Talent Shortage

As baby boomers retire in record numbers, the healthcare IT industry is feeling the pain of a talent shortage. In an article in InformationWeek.com, Asal Naraghi, director of talent acquisition for healthcare services company Best Doctors, says she “absolutely” sees an IT talent shortage. Tracy Cashman, senior VP and partner in the IT search practice of WinterWyman, also says she sees a genuine talent shortage. "There are more jobs than people who are skilled," she says. While she’s starting to see an uptick in engineering graduates, "we’ve been feeling this since the [dot-com] bubble burst," Cashman says, when college students were worried that all IT jobs would move to India. "And we’re still fighting that," she says.

Universities Offering Healthcare IT Degrees

Cloud computing, big data, mobile technology — three of the biggest trends in IT are changing the way the healthcare industry deals with information and creating a big need for trained healthcare IT professionals. Thus, colleges and universities have started offering healthcare IT as a major, where students learn what it takes to function as a fully capable software developer in any professional environment, but specifically tailor their skills to the rapidly expanding healthcare IT field.

Specialists in Demand

Today’s IT shops don’t just want experience, they want deep experience. “IT organizations are under intense pressure to deliver projects faster than before — and that need for speed necessarily influences IT hiring. The IT generalists, and even some topic generalists, such as infrastructure managers, have found their roles left by the side of the road, as project leaders hire for deep experience in specific niches, such as cloud security, DevOps, and data analysis and architecture.”

McGraw-Hill Education CIO David Wright says, "More and more, the hands-on coders, we’re looking for people who are just really deep in whatever discipline we’re trying to hire." And he isn’t the only one advocating for specialization; Asal Naraghi, director of talent acquisition for healthcare services company Best Doctors, also says, “The trend has gone into more specialized skill sets."

Video Interviewing and Skype More Popular

The use of remote yet face-to-face interactions such as video interviewing and Skype is on the rise. Advanced technology is giving people a way to present themselves with depth and personality to hiring managers and recruiters. In addition, new hires can meet the team before they even step into the office.

Interview Process Becoming Lengthier

The interview and hiring process has become longer in recent years, a trend that we can expect to see more of in 2015. According to Anne Kreamer, a journalist who specializes in business and work/life balance, “Data compiled for the New York Times by Glassdoor found that an average interview process in 2013 lasted 23 days versus an average of 12 days in 2009. And time-consuming assignments and auditions for candidates … are the new normal.”

Anthony Caponi is vice president of healthcare IT of Direct Consulting Associates of Solon, OH.

Readers Write: HIE Encounter Notification Solutions and Meaningful Use

November 19, 2014 Readers Write No Comments

HIE Encounter Notification Solutions and Meaningful Use
By Rob Horst

I joined esteemed colleagues from Johns Hopkins Community Physicians (JHCP) in presenting an HIStalk webinar on November 12 titled “3 Ways to Improve Care Transitions Using an HIE Encounter Notification Service.” Some of the attendee questions during and after the webinar required more insight into how ENS helps Eligible Hospitals (EHs) meet Meaningful Use Stage 2 (MU2) and the Transitions of Care (TOC) Measure.

By way of background, EHs and critical access hospitals (CAHs) that transition or refer a patient to another setting of care are required to provide a summary of care record for more than 50 percent of transitions of care and referrals. This MU2 measure has proven challenging for many organizations to achieve. The method of getting a summary of care record to the right destination and then calculating the number of summary of care records that are actually received is imprecise.

On September 22, CMS issued FAQ 10660, clarifying that a third-party organization that plays a role in determining the next provider of care and that ultimately delivers the summary of care document can count in the measure’s numerator for EHs.

Part of the challenge of meeting the TOC measure is that EHs/CAHs and providers must clearly identify the intended recipient of the transition or referral and verify that the summary of care was received by the intended recipient via one of the allowed transport methods. ENS has a unique capability that can help EHs/CAHs meet the TOC measure.

ENS is capable of sending a C-CDA summary record using the same logic that it uses to send EHs/CAHs encounter notifications to subscribers. Using the patient demographic information in the header of the C-CDA, ENS is able to match the patient with the subscriber’s patient panel and send the document with the same accuracy and predictability that it does with encounter notifications. Once the C-CDA is sent to the subscriber, ENS logs the acknowledgement of when it was accessed and is able to provide a report back to the C-CDA sender with the critical metric needed to calculate the numerator for this measure.
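The panel-matching step can be illustrated with a toy demographic match. This is a deliberate simplification under my own assumptions — production matching uses many more fields and typically probabilistic techniques — but it shows the mechanic of matching a C-CDA header against submitted rosters:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Demographics:
    last_name: str
    first_name: str
    dob: str  # YYYYMMDD, as demographics appear in a C-CDA header

def _normalize(d):
    return Demographics(d.last_name.upper(), d.first_name.upper(), d.dob)

def match_subscribers(patient, panels):
    """Return the subscribers whose panel contains this patient.
    `panels` maps subscriber name -> set of Demographics records,
    standing in for the patient rosters that practices submit."""
    key = _normalize(patient)
    return [sub for sub, roster in panels.items()
            if key in {_normalize(d) for d in roster}]
```

Once a match is found, the document can be routed to that subscriber and the receipt logged, which is what makes the downstream reporting possible.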

We received these questions during and after the webinar that might provide clarity for those considering their options.

How does ENS help EHs/CAHs satisfy the TOC requirement?

EHs/CAHs, primary care physicians, and specialists submit panels (patient rosters) to ENS. When a patient is discharged from the EH/CAH, the EH/CAH generates a C-CDA from their Certified Electronic Health Record Technology (CEHRT) and sends the C-CDA to ENS via one of the allowed transport methods. ENS uses the patient data in the C-CDA header and the patient rosters to identify the correct PCP or specialist and automatically send a summary of care document to the receiving provider.

How does ENS help provide relevant metrics for the EH/CAH to use in its numerator calculation?

ENS will provide a report to the EH/CAH that includes data elements such as the patient identifiers, receiving subscribers, and time of receipt of the C-CDA. These data elements can be used in calculating the numerator.
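Those data elements map directly onto the measure arithmetic. A minimal sketch, assuming the report can be reduced to (patient, recipient) pairs confirming receipt:

```python
def toc_measure(transitions, receipt_report):
    """transitions: list of (patient_id, intended_recipient) pairs, one
    per transition or referral (the denominator).
    receipt_report: set of (patient_id, recipient) pairs for which the
    notification service logged receipt of a summary-of-care C-CDA.
    Returns (numerator, denominator, passes) against the >50% threshold."""
    denominator = len(transitions)
    numerator = sum(1 for t in transitions if t in receipt_report)
    passes = denominator > 0 and numerator / denominator > 0.5
    return numerator, denominator, passes
```

The value of the report is that the numerator is no longer estimated: each counted transition is backed by a logged receipt.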

Does ENS have to be CEHRT?

No. ENS is not the technology that is creating and transmitting the C-CDA and therefore does not need to be CEHRT.

Rob Horst is a principal with Audacious Inquiry of Baltimore, MD.

Readers Write: Leveraging Technology for Communicable Disease Care

November 19, 2014 Readers Write No Comments

Leveraging Technology for Communicable Disease Care
By Paul J. Caracciolo

The Ebola crisis has been another wake-up call for healthcare providers to get prepared for national and global medical emergencies. Experts agree that it is only a matter of time before the world experiences another pandemic, such as the 1918 flu, which killed tens of millions worldwide.

The recent outbreak of Ebola in West Africa and its subsequent spread to the US has caused providers to re-examine how they handle sick (and potentially infected) patients, but we don’t have to use Ebola as the example. The seasonal flu still has a significant impact on health and many deaths occur each year. This past year has also seen the rise of enterovirus D68, sickening hundreds of children across the country and resulting in several deaths.

The proper care of patients with communicable disease is a concern. We want to ensure that patients receive appropriate care, but at the same time, we need to take precautions around the containment and spread of disease. CNN recently reported that approximately 4.5 percent of reported Ebola cases in West Africa were infected caregivers. In the case of Ebola, disease management is further complicated by the 21-day incubation period, with possible imposed isolation and continuous monitoring of potentially infected patients during this time.

Solutions can be implemented now that could make a huge difference in not only increasing the quality of patient care, but also protecting caregivers from prolonged or unnecessary exposure to sick patients.

Telehealth / telemedicine. It would be beneficial to have this capability in sick patient rooms to control access. This would allow remote consults with disease specialists, primary care providers, ancillaries, or whoever needs direct access to these patients and their caregivers. This solution could be expanded to include two-way audio and video with nursing staff and HD video conferencing between the patient and their families. Or in the case of isolation for potential infection, patients could communicate with their loved ones, employers, benefits providers, or anyone else on the outside.

Virtual patient observation. This solution includes video equipment, network integration with nurse call, and intelligent software that can be configured to be sensitive to patient movement. A monitoring console can be presented at a nurse station computer or accessed remotely from tablets. Several patients can be monitored from one station, or select rooms can be monitored. Coupled with two-way voice communication, this can be a powerful tool.

Alert and alarm management, workflow enhancement. This middleware captures relevant patient data from monitoring devices and lab results and presents it to caregivers on mobile devices. Staying with the theme of patient and caregiver safety and more efficient workflows, this technology can streamline communications. Alarms from biomedical equipment in a patient’s room can be triaged by the configured system, thus preventing alarm fatigue for caregivers and focusing attention on critical alarms. Additionally, these applications can use push notification technology to send critical lab test results, with related information, to the mobile devices of clinicians. Secure text messaging, typically another feature, can streamline communications, record the information, and send it to the EMR to complete the care record and maintain compliance.
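The triage step at the heart of such middleware can be sketched as a simple priority filter. The alarm types below are illustrative placeholders, not a clinical standard, and a real system would apply configurable rules rather than a fixed set:

```python
# Illustrative set of alarm types configured as page-worthy.
CRITICAL = {"asystole", "ventricular_fibrillation", "apnea", "spo2_low"}

def triage_alarms(alarms):
    """Split a stream of device alarms into those that should page a
    clinician immediately and those to log quietly — the basic idea
    behind alarm-fatigue middleware. Each alarm is a (room, type) pair."""
    page, log = [], []
    for room, alarm_type in alarms:
        (page if alarm_type in CRITICAL else log).append((room, alarm_type))
    return page, log
```

Even this crude split captures the value proposition: the clinician's device only sounds for the alarms worth interrupting for, while everything else remains auditable.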

Care team collaboration applications. Having the ability to share patient-related data is key to keeping care teams on the same page. Access to the EMR may not be feasible for all caregivers involved. The ability to share documents, notes, lab results, and images among care team members wherever they may be is powerful. Even caregivers who are suspected of being infected (and in isolation) could still be part of a productive care team with these applications. Cloud applications could be used on demand and are easily scalable to fit emergency scenarios.

Hospitals can take action now to be better prepared to deal with outbreaks. Although many hospitals may not have formal isolation rooms, they may want to designate and prepare certain rooms that could be used in a more formal manner if needed in emergencies. For instance, specific nurse wards, floors, or groups of rooms could be outfitted with these technologies. In a time of emergency, the emergency protocol would kick in, with technology in place and the workforce trained. These technologies can also be used on demand for triage or isolation tents, with portable versions of telemedicine and virtual patient observation solutions.

Paul J. Caracciolo is chief healthcare officer of Nexus – A Dimension Data Company of Valencia, CA.
