
Readers Write: Moving the Adoption Needle on Electronic Prior Authorization: What Stakeholders Can Do

July 5, 2017

By Tony Schueth

Tony Schueth is CEO and managing partner of Point-of-Care Partners of Coral Springs, FL.

Prior authorization (PA) is a major pain point for both prescribers and payers (health plans and PBMs). That is because there are significant administrative costs and patient-safety issues associated with today’s antiquated paper-phone-fax PA processes. The number of PAs is increasing, causing stakeholders to look for a solution to keep ahead of the PA curve. The answer is electronic PA (ePA), which is available today. Yet, while headed in the right direction, uptake isn’t where we’d like it to be.

Manual prior authorization and utilization review burden providers and payers alike. According to a recent article in Health Affairs, physicians spend $37 billion annually ($83,000 per doctor) hashing out PA and formulary issues with payers. According to another estimate, doctors spend 868.4 million hours on PA each year, not counting time devoted by other staff members. Payers incur costs of up to $25-$40 per PA, plus risk to downstream CMS rebates and medical cost savings.

Perhaps most importantly, the difficulties inherent in trying to obtain a PA significantly affect the quality of care and patient safety. According to a survey by the American Medical Association (AMA), most physicians experience a delay in excess of a week for their PA request to be processed. Some 70 percent of prescriptions rejected at the pharmacy require PA; of those, 40 percent are eventually abandoned due to the complex, paper-based PA process. The PA process impacts more than 185 million prescriptions each year and results in nearly 75 million abandoned prescriptions.  

These issues will only be exacerbated as demand for PA increases, due to several factors. First is the robust pipeline of new specialty medications, the majority of which require PA. Second, demand for specialty medications is rapidly increasing, in large part because of the rising number of chronically ill patients who rely on them.

Because of increased specialty medication utilization — coupled with the reliance on today’s antiquated paper-based processes — prescribers, pharmacies, and payers will be unable to keep pace with the anticipated flood of new specialty prescriptions and related PA requirements. Patients will have delays in obtaining needed therapies or will forego them altogether if prescriptions are abandoned due to administrative delays. Quality and patient safety are at stake.

Now that the need for process improvement and efficiencies is imperative, stakeholders are beginning to coalesce around the promise of ePA. This solution is in keeping with the trend toward automation of healthcare and the widespread adoption of electronic prescribing, which is used by 75 percent of ambulatory physicians. Standards to support the ePA transaction are in place. Vendors are emerging that can handle the transaction. States are jumping on the ePA bandwagon, with several requiring use of ePA in the near future and others expected to follow suit. The federal government is expected to mandate ePA before long as well.

That said, use of ePA has not reached the point of bringing robust value for all stakeholders. Electronic health records (EHRs) can support the ePA process, but not all of them have that capability. Physicians may not know that their EHR supports ePA or that it can be integrated into their workflow.

What can be done to move the ePA adoption needle? Here are some things stakeholders can do now.

Prescribers should:

  • Ask their EHR vendor about their system’s ePA functionality. If it is not available, push for it as an enhancement request.
  • If integrated ePA is available, start using it as a way to improve return on investment (ROI) in the EHR. Electronic PAs get adjudicated much more quickly than prior authorizations submitted on paper via fax. ePA also can reduce costs and improve the quality and safety of patient care. These metrics are increasingly important ingredients of value-based care, alternative payment models, and related reporting requirements.

EHR vendors should:

  • Make prescribers aware of their product’s ePA functionality and how it is used. This educational component provides value to the buyer and also could be a differentiator in the market.
  • Get ahead of regulatory mandates from either the federal government or the states. Savvy vendors will not wait for the regulatory shoes to drop and then play catch-up, which can be costly and affect market share.
  • Build competitive advantage. While some of the major EHRs have incorporated ePA, many of the small and medium-sized EHRs have not. Practices and integrated delivery networks are overwhelmed by the growing number of PAs. They are demanding relief, which can only be solved through the availability of ePA. This also is an attractive selling and retention point.

Health plans and PBMs should:

  • Adopt ePA functionality that goes beyond the basics. Research shows that many PBMs have minimal ePA processing capabilities. Improving ePA processes and question sets will improve efficiency, reduce costs, and add value.
  • Push prescribers to adopt and use ePA. A legitimate ROI can be demonstrated, but only if functionality is used.
  • Take full advantage of the feature set included in the NCPDP SCRIPT standard for ePA. Supporting more of the features available in the standard will reduce the need for attachments, thus reducing turnaround times and increasing efficiencies. This translates into cost savings.
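To make the question-set idea concrete, here is a minimal sketch of an ePA exchange in which the payer's question set drives an automated determination. The field names and approval rules are illustrative assumptions, not the actual NCPDP SCRIPT schema.

```python
def build_question_set(drug):
    """Payer side: return the PA question set for a drug (hypothetical rules)."""
    return [
        {"id": "Q1", "text": f"Has the patient tried a generic alternative to {drug}?"},
        {"id": "Q2", "text": "Diagnosis code supporting this therapy?"},
    ]

def adjudicate(answers):
    """Payer side: approve automatically when the question set is satisfied,
    avoiding the paper-phone-fax round trip."""
    if answers.get("Q1") == "yes" and answers.get("Q2"):
        return "Approved"
    return "Pended for manual review"

# Prescriber side: answers captured within the EHR workflow
answers = {"Q1": "yes", "Q2": "M05.79"}
print(adjudicate(answers))  # -> Approved
```

A richer question set, as the standard supports, lets more requests resolve automatically instead of pending for manual review.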

Advancing ePA requires focused effort by all stakeholders, but the time is right and the technology is ready.

Readers Write: Why Daily Clinical Analysis May Be A Game-changer In Patient Outcomes

June 28, 2017

By Benjamin Yu, MD, PhD

Benjamin Yu, MD, PhD is vice president of medical informatics and genomics at Interpreta of San Diego, CA.

A missing piece in population health is real-time data and its real-time and continuous analysis. It’s a key ingredient that can help streamline health delivery, improve outcomes, and manage a dynamic patient population. Real-time interpretation is a keystone in many service industries, especially finance and e-commerce. However, its value is often overlooked in healthcare.

Instead, the health industry typically relies on monthly or quarterly business reports to spotlight health needs (e.g., gaps in care, medication management, etc.) and to find regional deficiencies. Using this information, groups plan and carry out campaigns to target and improve care through a variety of outreach mechanisms such as care managers, call centers, and provider network contacts.

However, during the laborious process of assessing static reports, millions of conditions change. It’s analogous to using turn-by-turn instructions for driving based on outdated information. Normally, turn-by-turn directions help the driver navigate through unknown roads and emerging traffic conditions in real time. However, if the system is not current, it might alert the driver long after he/she has missed a turn.

Similarly, in healthcare, by the time an outreach takes place, the member’s medications may have changed, a new refill may have been missed, a vaccine or screening may have been completed without the knowledge of the campaign, or a patient could have become ill or hospitalized before discovery of his/her risk. Thus, in addition to being expensive, the discover-and-campaign approach can be disjointed and too slow to adapt to the ever-changing landscape of a patient population.

Despite their potential benefits, real-time clinical solutions have been hampered in population health for several reasons. Many groups fear that real-time clinical data means too many alerts. While this may be true of some clinical information systems, it is not inherently true. In fact, the opposite may be the case: one of the major efficiencies provided by real-time data is reduced noise.

Because data is up to date, resolved issues should quickly disappear from the clinical workflow. For example, when a health plan calls a patient, instead of reviewing a long list of care initiatives — many of which are already complete — the clinician or plan can focus on future needs that are of the highest priority. Using up-to-date information ultimately can reduce alert fatigue and provide a more satisfying and impactful patient experience. In summary, real-time analysis is a noise reducer.
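The noise-reducer idea can be sketched in a few lines: with current data, care gaps already satisfied by a recent clinical event drop out of the outreach list before anyone sees them. The data shapes and measure names below are hypothetical.

```python
def open_care_gaps(gaps, completed_events):
    """Return only the gaps not yet satisfied by a recent clinical event,
    ordered by priority (1 = highest)."""
    done = {event["measure"] for event in completed_events}
    remaining = [gap for gap in gaps if gap["measure"] not in done]
    return sorted(remaining, key=lambda gap: gap["priority"])

gaps = [
    {"measure": "flu_vaccine", "priority": 3},
    {"measure": "a1c_screening", "priority": 1},
    {"measure": "statin_refill", "priority": 2},
]
# The real-time feed shows the vaccine was already given
events = [{"measure": "flu_vaccine"}]

for gap in open_care_gaps(gaps, events):
    print(gap["measure"])  # a1c_screening, then statin_refill
```

With a stale monthly report, the outreach call would still open with the already-completed vaccine.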

Indeed, the fear of ‘too much information’ often stems from the design of current health information systems, which rely heavily on clinicians and staff to sort through printouts, inboxes, notifications, and data reports to resolve issues in the clinic. Notably, real-time data should not be considered synonymous with an increase in graphs and decisions. Using the driving analogy, data is constantly changing in a turn-by-turn application. However, these applications natively interpret incoming data and only alert the user with upcoming turns or changes to the route. With respect to healthcare, real-time systems also need to be designed to interpret real-time data with actions and prioritizations of the clinician in mind.

The value of real-time data is underestimated. While some inherently accept that real-time clinical information is better than outdated information, real-time data and its immediate interpretation impacts far more than today’s era of business reports. Real-time data and analysis enables feedback interactions and behavioral modifications that cannot be derived from periodic reports.

In the consumer market, real-time responses enable end-to-end services such as ride share, routing, and many financial transactions. In healthcare, real-time clinical information enables better predictive technology and thus an ability to identify trends much earlier. In an increasingly connected world, new clinical services and technologies require instantaneous feedback and timely actions for members and users, enabled by real-time clinical information. If the rapid growth of consumer health devices like wearable monitors is an indicator of upcoming trends, real-time clinical data in population health is just around the corner. Leading healthcare institutions and technology providers need to make sure they don’t miss the turn.

Readers Write: Procuring Sustainable Success with Value-Based Care Models

June 28, 2017

By Dustyn Williams, MD

Dustyn Williams, MD is a hospitalist at Baton Rouge General Medical Center (LA) and founder of DoseDr.

All healthcare providers want the same things: better health for their patients and lower costs. Conceptually, value-based care achieves this shared goal by creating the incentives for all involved to provide better care and secure improved outcomes. Yet this approach lacks the appropriate framework and tools that enable and equip clinicians to achieve value-based outcomes.

Adding to this dilemma is the lack of an appropriate definition of “value” that would enable healthcare organizations to truly comprehend what constitutes “value-based care” and how to implement a successful, sustainable value-based model. True value is realized when efforts are focused on reducing costs and achieving enhanced outcomes rather than simply on attaining quality metrics.

Although the utilization and achievement of these metrics is a step in the right direction to positively impact care quality and outcomes, it’s not enough. Checking off boxes indicating that best-practice protocols are being followed does not necessarily equate to better outcomes or improved financials. Closing this gap between incentives and outcomes requires clinical care to evolve to reflect proactive management of chronic disease and promotion of patient wellness. Incentives alone are not enough; clinicians must also have access to the appropriate tools to achieve those quality goals.

The good news is that value-based payment models are providing the necessary impetus for the creation of radically disruptive practice patterns and new models of care. For instance, uptake of Internet-based care delivery models that enable more proactive treatment is on the rise, particularly for chronic illness.

Value-based care is also a significant catalyst of advancements in telehealth solutions. These interventions are effectively disrupting traditional care models by providing the necessary best-practice based infrastructures and tools needed to proactively and effectively address chronic health conditions while seamlessly integrating into provider workflows.

Consider diabetes management. Despite the challenges of self-managing their condition, diabetic patients spend an average of just two hours per year with their primary care provider. Further, while physicians strive to provide patients with best-practice knowledge for controlling A1c levels, poor retention of medical information and the rapidly changing effects of diabetes put patients at risk for serious health conditions and preventable hospitalizations. The clinical and financial impacts stemming from uncontrolled diabetes greatly influence the steep costs of the condition, which average $176 billion nationwide each year. Patients and providers must have access to tools that enable enhanced collaboration and ongoing care monitoring to improve outcomes and expenditures for diabetes, as well as other chronic conditions.

Telehealth solutions fill this gap. Features such as smartphone-enabled provider feedback loops can now rapidly deliver easily understandable, actionable information to patients to facilitate engagement, compliance, and sustainable improved outcomes. By empowering patients to effectively self-manage their chronic conditions, long-term care costs to health plans and risk-based entities are significantly reduced, along with the steep costs associated with emergency room visits and hospital admissions.
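As a rough illustration of such a feedback loop, a patient-reported glucose reading could be triaged into an actionable message. The thresholds and messages below are illustrative assumptions only, not clinical guidance.

```python
def triage_glucose(mg_dl):
    """Map a patient-reported glucose value (mg/dL) to a feedback message."""
    if mg_dl < 70:
        return "urgent: possible hypoglycemia - contact your provider now"
    if mg_dl > 250:
        return "alert: high reading - your provider will review your insulin dose"
    return "in range: keep following your current plan"

print(triage_glucose(65))   # urgent path
print(triage_glucose(140))  # in-range path
```

The point is the immediacy: the patient gets actionable feedback between office visits rather than at the next two-hour-per-year encounter.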

Improvements in the health management of high-risk patient populations secure enhanced Healthcare Effectiveness Data and Information Set (HEDIS) performance measures and Star ratings for health plans, along with improved Medicare Access and CHIP Reauthorization Act (MACRA) and Merit-Based Incentive Payment System (MIPS) outcomes for providers.

Additional issues impacting the efficiency and success of value-based care include resistance to change and slow adoption of innovative care models. Industry laggards continue to stunt the progress made by early adopters of value-based care as they consume more resources than are saved. Ultimately, payers and providers must be willing to accept and adhere to new models, which will be helped along by the evolution of technology and processes, such as telehealth, capable of truly impacting care quality, outcomes and expenditures.

When risk is shared and incentives are aligned, value-based care models can enable providers to ultimately reduce expenditures and enhance patient care. If healthcare facilities provide quality care and cost-effective treatments that yield optimal outcomes, both patients and the healthcare system, as a whole, will benefit. Conversely, if there is no alignment, value-based care will collapse under the weight of a reimbursement structure that continues rewarding utilization. For instance, hospitals may continue to benefit from prolonged lengths of stays, while patients are buried under a mountain of medical bills and struggle with uncontrolled chronic diseases.

By delivering proactive, trusted information directly to patients, disruptive technologies fill a critical gap in population health and care management. The key is ensuring that information has been carefully vetted by a physician capable of making necessary adjustments based on the monitoring of a patient’s health in real time along with additional environmental factors such as food intake. This ensures that these interventions enable improved patient care outcomes while strengthening revenues by avoiding penalties and increasing profitability through performance-based bonuses.

Readers Write: Why Online Provider Search and Referral Management Programs Demand High-Quality Provider Data

June 28, 2017

By Thomas White

Thomas White is CEO of Phynd Technologies of Dallas, TX.

Healthcare systems, like any business, are competing for customers (patients) and referrals. In many respects, this competition has increased as patients are either forced to, or opt to, take more control over their own healthcare. The rise of consumerism is pressing healthcare systems to improve their online presence. Physicians and healthcare systems must fully leverage web tools to grow their customer base by empowering patients with the high-quality information they need to make important healthcare decisions.

The Internet has made it much easier for patients to search beyond their local area for the most qualified providers who meet their needs, participate in their insurance plan, and offer the highest-quality services. As a result of this new paradigm, healthcare systems must prioritize the quality and ongoing maintenance of the provider data that feeds their online “Find A Doctor” search and referral tools. Simply put, a poor search experience is a major turn-off. Patients may go elsewhere, referrals (and revenue) are lost, and reputation is damaged.

Patients and referring physicians alike expect the same online experience they have with Google and other search engines: instantly and easily finding what they are looking for. Healthcare consumers’ satisfaction grows (and referrals are gained) when they can quickly find a doctor via a simple process that gives them useful information in easily understood terms. Accuracy is assumed.

Patients expect to see provider demographic, practice, insurance, and contact information with a few keystrokes. That’s a given. And when they are presented with more data than expected — such as the provider’s availability, ratings, languages spoken, clinical focus, research interests, treatments provided, and travel directions — even better.
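A simplified sketch of how a “Find A Doctor” search might filter on fields like these (specialty, insurance plan, language). The provider records and field names are hypothetical.

```python
def find_doctors(providers, specialty=None, plan=None, language=None):
    """Return providers matching every supplied criterion."""
    matches = []
    for provider in providers:
        if specialty and provider["specialty"] != specialty:
            continue
        if plan and plan not in provider["plans"]:
            continue
        if language and language not in provider["languages"]:
            continue
        matches.append(provider)
    return matches

providers = [
    {"name": "Dr. Lee", "specialty": "cardiology",
     "plans": ["PlanX"], "languages": ["English", "Korean"]},
    {"name": "Dr. Ruiz", "specialty": "cardiology",
     "plans": ["PlanY"], "languages": ["English", "Spanish"]},
]
result = find_doctors(providers, specialty="cardiology", plan="PlanY")
print([p["name"] for p in result])  # ['Dr. Ruiz']
```

The filter is only as good as the underlying data: a stale `plans` field here is exactly the inaccuracy that sends patients elsewhere.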

This search process can be further enhanced if the provider’s data includes videos and other multimedia information. Video profiles personalize information and instructional videos can simplify patient visits and improve customer satisfaction and engagement. It’s kind of like online dating and hoping for the perfect match. In both cases, as they say nowadays, a picture can be worth a thousand words, and a video is worth a thousand pictures.

Patients are more likely to book an appointment if their search results direct them to a provider who meets their needs. High-quality data can seal the deal.

Online provider search tools are not just for patients. Physicians use them to identify the most appropriate in-network referral options for their patients. If the information from a referral management website is inaccurate or out of date, it can result in referral leakage, lost revenue, and wasted time. If there’s a delay in the delivery of urgently needed care, then patient well-being and satisfaction may suffer. This can hurt reimbursement, particularly in today’s value-based care environment. Value-based payments emphasize evidence-based medicine and efficient delivery of care. These basic tenets should be supported by the information from any “Find A Doctor” search tool by ensuring patients see the most appropriate caregiver the first time.

None of this, however, can be achieved without a holistic approach spanning the enterprise (clinical, financial, and marketing systems) to capture, manage, and share high-quality provider data. A unified approach to provider data management is critical to meet the rising tide of healthcare consumerism and value-based care initiatives, never mind remaining competitive. Providing effective online provider search tools to healthcare consumers and providers is an investment that can quickly pay for itself through referrals that keep patients in network and improve overall satisfaction.

While online provider search tools are certainly not new, they must serve the demands and expectations of increasingly savvy and demanding online healthcare consumers and of harried referring physicians trying to balance conflicting demands on their time and attention. Healthcare system leaders should assess how well their organizations’ online physician referral and outreach programs are meeting these end-user needs and determine relevant ROI measures to improve their effectiveness with an enterprise provider data management approach.

Readers Write: Top 10 Takeaways From the EClinicalWorks Settlement

June 15, 2017

By Colette Matzzie

1. The federal False Claims Act provides an effective way to hold EHR vendors accountable for failing to meet Meaningful Use standards.

Many customers had complained to EClinicalWorks about major problems with its software, but little changed. It took a knowledgeable healthcare IT implementation specialist and the might of the US government to get the software problems fixed. They used a powerful whistleblower law known as the False Claims Act, which encourages whistleblowers to fight fraud by filing “qui tam” lawsuits, to force ECW to take action. Anyone who “causes” false claims to be submitted to the government is liable under the False Claims Act. Customers of ECW relied on representations that ECW’s EHR technology was properly certified and therefore unknowingly submitted tens of thousands of claims for government incentive payments that falsely attested that MU requirements had been met.

2. The federal Anti-Kickback Statute forbids EHR vendors from paying or rewarding users to promote or refer others as customers.

Many healthcare providers, pharmaceutical companies, and medical device manufacturers have been penalized for violating the Anti-Kickback Statute, but the ECW case is the first time it has been applied in the EHR industry. The government cited payments totaling almost $300,000 through ECW’s “referral program,” “site visit program,” and “reference program,” in addition to unknown amounts for consulting and speaker fees paid to influential users, as evidence of alleged violations of the Anti-Kickback Statute. The law prohibits providing money, gifts, or other remuneration intended to get referrals for services or items paid for by federal healthcare dollars except under very limited circumstances.

3. The accuracy of representations made to certifying bodies will be a factor when the DoJ reviews the liability of an EHR vendor under the False Claims Act.

Certification by a government Authorized Testing and Certification Body has been a prerequisite to successful sales because buyers can obtain federal incentive payments only for certified EHR technology. The government cited EClinicalWorks’ decision to modify its software to “hard code” the drug codes needed for testing without meeting the certification criteria as evidence that ECW had “falsely obtained” its certification. This gave rise to its liability under the False Claims Act. Accurate and truthful information will remain a requirement for certification, despite the debate over whether the certifications adequately ensure software reliability and patient safety.

4. The Office of Inspector General crafted an innovative Corporate Integrity Agreement requiring ECW to fix deficiencies, notify its customers, provide customers with free upgrades, and permit customers to transfer clinical data without penalty.

As part of the settlement, ECW signed an expansive, state-of-the-art Corporate Integrity Agreement that the OIG put together to ensure that providers and patients are protected going forward. ECW is required to take significant remedial steps, which include sending a series of notifications and advisories to customers that advise them of patient safety risks with its software; giving customers an opportunity to obtain updated (and presumably remediated) software free of charge; and offering the opportunity to transfer clinical data to another vendor free of onerous penalties or other restrictions. Software vendors should consider the agreement a guide to understanding the risks they will face if their software does not meet federal requirements or if other misconduct occurs.

5. The government deems data portability and audit log requirements to be essential to proper EHR functioning.

EHR systems are required by the government to be able to export clinical information on patients electronically, including by batch exports, and reliably and accurately record user actions in an audit log. In its complaint-in-intervention, the government faulted ECW for allegedly misrepresenting these capabilities, and made clear that these omissions from the software were not acceptable.

6. EHR vendors need to respond in a timely and effective manner to customer reports of software defects, usability problems, or other issues that may present a risk to patient safety or that may violate federal law.

The Corporate Integrity Agreement requires ECW to notify OIG of certain reportable events that involve patient safety, certification, or a matter that a reasonable person would consider to be a violation of law. The government wants all EHR vendors to report significant problems or violations of law, especially when patient health or safety may be at issue.

7. EHR vendors should have persons and procedures in place to ensure compliance with federal law, just as healthcare companies do.

ECW’s Corporate Integrity Agreement requires the company to establish a compliance program with a compliance officer and a written code of conduct, similar to what many healthcare companies have. That’s something all EHR vendors should consider doing, as it’s wise to offer employees clear avenues to report concerns internally. Most employees prefer to address concerns internally before blowing the whistle by filing a qui tam lawsuit – unless the company has shown it is not responsive to legitimate concerns or will retaliate against employees who speak up.

8. The government will hold managers personally responsible for activities of the EHR vendor company.

The ECW settlement holds both the company and its three founders (the CEO, CMO, and COO) liable for payment. The settlements reinforce the DoJ’s commitment to individual accountability for corporate decisions in a very tangible way.

9. EHR vendors must ensure that all contracts and agreements with their customers do not restrict disclosures of information about the performance of the software or reporting of patient safety concerns (the “anti-gag” rule).

ECW’s Corporate Integrity Agreement requires that contracts between the company and its customers do not restrict customers from disclosing concerns about the performance of its software. This includes concerns related to patient safety, public health, and product quality. Other vendors should consider adopting similar “anti-gag” practices.

10. Whistleblower rewards may be paid for information that leads to successful resolution of a federal qui tam action against an EHR vendor.

The ECW case shows the government welcomes whistleblowers who have information about significant problems with EHR software. Under the False Claims Act, the government will pay whistleblowers a reward of 15 to 25 percent of the proceeds recovered by the government as damages and civil penalties, if the government joins the “qui tam” case filed by the whistleblower. The government awarded the ECW whistleblower $30 million.

Colette Matzzie is a whistleblower attorney and partner at Phillips & Cohen LLP, which represented the whistleblower in the EClinicalWorks case.

Readers Write: Technology Can Lift the Veil of Secrecy on Drug Prices

May 24, 2017

By Thomas Borzilleri

Thomas Borzilleri is CEO of InteliScript.

The recent story about the rift over prescription drug prices between insurer Anthem and its pharmacy benefits manager Express Scripts should anger — and frankly, befuddle — any physician or electronic health record (EHR) vendor. Providers and IT vendors should be fed up with payers and patients getting ripped off by inflated drug prices, which take a disproportionate share of the healthcare dollar. They also ought to be puzzled about why, with all of our advances, we are still living in a marketplace where no one knows what drugs really cost.

It’s particularly absurd because technology exists that can put an end to the opacity, overpayment, and oligarchy that characterize prescription drug purchasing today. Providers deserve, and EHR vendors can offer, tools that deliver the prices for any drug at the five cheapest pharmacies nearby. Doctors can have this data at their fingertips, within a few seconds, at the point of care, integrated into their existing workflow. These technology solutions can also track prescriptions to make sure they are picked up and refilled on a regular basis to gain new insight into which patients are at risk for adverse events due to medication non-adherence.
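The point-of-care lookup boils down to ranking price quotes and surfacing the cheapest handful. A minimal sketch, assuming the quote data is already gathered; a real tool would pull live pharmacy pricing.

```python
def five_cheapest(quotes):
    """quotes: list of (pharmacy_name, price) pairs for a single drug."""
    return sorted(quotes, key=lambda quote: quote[1])[:5]

quotes = [
    ("Pharmacy A", 42.50), ("Pharmacy B", 18.99), ("Pharmacy C", 25.00),
    ("Pharmacy D", 61.25), ("Pharmacy E", 15.75), ("Pharmacy F", 33.10),
]
for name, price in five_cheapest(quotes):
    print(f"{name}: ${price:.2f}")  # cheapest first
```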

For years, insurers and patients have just accepted that the price they are getting is the best price, or the only price. However, allegations like Anthem’s — that Express Scripts overcharged the insurer by $3 billion — should make everyone in the healthcare ecosystem skeptical about the fairness of drug prices. But truly lifting the veil on drug prices will take a concerted effort by many stakeholders in the provider and IT vendor communities to take on the PBM juggernaut.

Strangely enough, when PBMs gained widespread popularity in the 1980s, there was an understanding that they worked on behalf of payers to lower prices, both by securing discounts and by steering patients toward lower-cost drugs. The truth, however, is that PBM “discounts” have always included heavy padding in the form of ingredient spreads and per-prescription fees. In fact, while PBMs typically pay manufacturers 96 percent off the Average Wholesale Price (AWP), the “sticker price” for drugs, the prices they charge insurers and employers are between 70 percent and 85 percent off the AWP. PBMs are skimming 10-25 percent off each prescription.
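The spread arithmetic is easy to verify: if a PBM acquires a drug at 96 percent off AWP but bills the payer at 70 to 85 percent off AWP, the difference is retained as spread.

```python
def pbm_spread(awp, acquisition_discount, payer_discount):
    """Dollars of spread per prescription on a drug with the given AWP."""
    pbm_cost = awp * (1 - acquisition_discount)    # what the PBM pays
    payer_price = awp * (1 - payer_discount)       # what the payer is billed
    return round(payer_price - pbm_cost, 2)

awp = 100.00
print(pbm_spread(awp, 0.96, 0.85))  # 11.0 -> 11 percent of AWP retained
print(pbm_spread(awp, 0.96, 0.70))  # 26.0 -> 26 percent of AWP retained
```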

Insurers and employers have had little recourse, both because they did not know the true price of prescription drugs and because they did not have a way to easily shop around between competing pharmacies to get the best price on every medication. Instead, complex, opaque package deals with PBMs mean the payer might be getting good deals on some drugs and getting raked over the coals on others.

Drug price transparency and shopping tools are essential for payers to rein in costs and keep both premiums and co-pays from spiking. The urgent need for this data has also intensified recently because an increasing share of prescription drug costs are borne by consumers themselves. Patients simply won’t take their drugs properly, or at all, if they are out of reach financially. Affordability is now the number one reason for non-adherence to medications, which leads to poor outcomes, including avoidable hospital readmissions. A lack of medication adherence is estimated to cause approximately 125,000 deaths, at least 10 percent of hospitalizations, and cost between $100 billion and $289 billion a year.

In the past, some patients have looked to Canadian or other foreign mail-order pharmacies to try to lower drug costs. But these transactions are usually outside the doctor-patient relationship and may cause more harm than good to the patient, either by exposing him or her to dangerous drug formulations or by causing rifts in care continuity.

Doctors and patients, together, must come to the best decision about the right drug for their condition, and price must be a part of that equation. We need technology solutions that enable doctors to find the best price on any drug, at local pharmacies that are convenient to the patient. Tools exist to address these concerns. The key is to embed these tools into existing EHR systems. By doing so, we can avoid disrupting doctors’ workflow and can ensure that all e-prescribing information is captured in the patient record.

These solutions must achieve savings for both the payer and the consumer. First, the solution must provide the lowest possible retail price while consumers are still paying off their deductibles, and then provide the lowest negotiated payer price to the insurer or employer once they start picking up the tab. These solutions can also be used to circumvent common PBM strategies, such as excluding low-cost brand and generic drugs from formularies to artificially increase co-pays on these cheaper drugs, which costs insurers and self-insured consumers billions of dollars each year.

Typically, consumers don’t realize that the cash price is in many instances lower than their adjusted co-pay, with the excess going right into the pocket of the PBM. Drug price transparency and shopping solutions should crunch the numbers for the doctor and patient, letting them know when it’s better to pay the cash price and when it’s more cost-effective to pay the co-pay.
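The number-crunching described above can be sketched in a few lines. This is a hypothetical illustration of the comparison logic only, not a real pharmacy or PBM API, and the prices are invented:

```python
# Hypothetical sketch of the cash-price vs. co-pay check described above.
# Function name and all prices are illustrative.
def best_payment_option(cash_price: float, copay: float) -> str:
    """Return which payment route costs the patient less out of pocket."""
    if cash_price < copay:
        return "cash"
    return "copay"

# A $4 generic whose plan-adjusted co-pay is $10: paying cash wins,
# and the patient avoids padding the PBM's margin.
print(best_payment_option(cash_price=4.00, copay=10.00))   # cash
print(best_payment_option(cash_price=25.00, copay=10.00))  # copay
```

Embedded in the prescribing workflow, a check like this would let the doctor and patient see the cheaper route before the prescription is ever sent.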

Health IT solutions are typically geared towards one healthcare user: hospitals, doctors, patients, insurers, or employers. But drug price transparency technology is one of those rare innovations that will benefit each of those audiences. Doctors and patients, together, will be able to make the best decisions about medication management, at the point of care, during the prescribing process. Hospitals will enjoy better population health management through better medication adherence. Insurers and employers will be able to wring more value from each healthcare dollar.

What we need now is a commitment from EHR vendors to adopt this type of technology. The bottom line is that we can’t succeed in bending the cost curve in healthcare if we don’t know what the costs are in the first place. That includes prescription drugs. We in the health IT industry have the insight and ingenuity to draw the curtain back on drug price secrecy and we have a real obligation to do so.

Readers Write: Celebrate the Milestones, But Keep Your Eye on the Road Ahead

May 17, 2017 Readers Write No Comments

Celebrate the Milestones, But Keep Your Eye on the Road Ahead
by Tonguç Yaman


Tonguç Yaman is CIO of Advocate Community Providers of New York, NY.

I will turn 50 this year. A few days after my birthday in May, my daughter will graduate from Yale, the second of my two children to earn that distinction. In October, I will graduate from Columbia University’s Executive Master of Public Health program.

I guess you could say it’s a watershed year for me, one of the biggest of my life so far. We all have them. And once the celebrations are over, I imagine we’re all faced with the same question. What’s next?

Here is what I learned as I looked for answers to that question.

It’s never too late for a new beginning

Some of us might feel inclined to stop and take a breather at 50, especially once the kids are out of school. We may think we have reached a high point that we’ll never exceed in our time on earth. As for me, I’m viewing it as a new beginning.

It’s a simple construct. Fifty is half of my life. Sure, it’s a milestone, but it doesn’t scare me. I’ll admit I am tickled to be at a point in my life where I have no dependents. My son and daughter are well on their way to taking their places in the workforce and the world. While I have strong relationships with my children and see them often, there is a level of excitement, a feeling of freedom now that they are adults.

I don’t want to waste that feeling of freedom. I want to channel it in constructive ways and put it to good use.

The opportunity to focus is a gift

There is an even greater excitement in the fact that I recently began a new phase of my career, a phase that I have envisioned for a very long time. I am no longer simply an IT guy, but a healthcare professional, a CIO for a very exciting organization in NYC brimming with possibilities.

When I was a kid growing up in Turkey, I dreamed of becoming a medical doctor, so this move into healthcare feels as if I have come full circle. Our dreams get tweaked as we get older, but I like the way this one has turned out. Though not an exact match, I am still able to use my skills and experience to effect change in the healthcare sector, and probably on a much larger scale.

The transition wouldn’t mean half as much if I weren’t confident that I did everything I could to prepare myself for its challenges. That’s one of the benefits of maturity. They say good things come to those who wait, but I also believe that good things come to those who are prepared. Now I have the time, the skills, and the experience to give my new healthcare position the total focus it demands. This opportunity is a gift and I am eager to embrace it with all the dedication and energy I have.

Maturity and passion are not mutually exclusive

I’ve attended HIMSS healthcare IT conferences in previous years, but at this year’s event, something was different. Instead of observing from the sidelines, I was involved. I was invited to participate in sessions. I was a contributor. I felt respected and connected and I was able to help others make connections, too. One of these connections resulted in CHIME welcoming a new foundation member.

This ability to find the things two professionals might have in common and make a connection happen for their mutual benefit is probably the thing I am best at. While others are inspirational leaders, effective organizers, or impeccable planners, I’m a connector.

In my work with colleagues and partners, I can find win-win solutions, shape commitments between parties, challenge others to exercise their own good judgment, and solidify their trust in one another. That is very exciting to me. I heard it time and again at the HIMSS conference this year: people notice my passion. It is gratifying to be able to say that at this point in my career.

Here’s to 50

There’s a saying now that 50 is the new 30. I’m not so sure I agree. Physically, I don’t feel that much different from the way I felt at 30. But in terms of what I have learned about my industry and about myself over the past two decades, I’ll take 50 over 30 any day.

Readers Write: Blockchain’s Missing Link

May 17, 2017 Readers Write 1 Comment

Blockchain’s Missing Link
By Frank Poggio


Frank Poggio is president and CEO of The Kelzon Group.

The IT concept that you hear most about today is blockchain systems design and technology. If you have not, you will very soon. It’s a concept that relies heavily on core Internet communication tools and shared information.

When you mention blockchain, some people automatically think of bitcoin, but bitcoin is just one application of blockchain concepts and tools — it is not blockchain proper. HIStalk posted a good synopsis of blockchain last year.

Blockchain in its simplest form is a virtual ledger, one that is available to all participants instantaneously via the Internet.

Let’s look at an example. Say I borrow $100 from my office buddy Joe. If it’s just the two of us involved and no one else cares, then he notes on his paper ledger an asset of $100, with an offsetting entry of his cash decreasing by $100. On my ledger, I note an increase in my cash balance of $100 and a liability to Joe of $100. If an auditor were to check our ledgers, they would see all four entries and all things would be kosher, or in accounting terminology, in balance.

Now assume everyone in our office cares and we all have electronic ledgers and all our ledgers communicate with one another via the Internet, thereby creating one big virtual ledger. Every time one ledger changes, they all change instantaneously. Everyone in our office would see I owe Joe $100. As I make payments on the loan (or fail to), all ledgers would reflect the subsequent activity.

Blockchain software maintains a universal virtual ledger by maintaining constant communication among all participants. All updates and transactions — whether add, change, or delete — are stored forever. Hence the full provenance of any activity is easily viewable. The number of participants is not limited and could be in the billions, limited only by agreed-to privacy and security constraints.
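The office-ledger idea above can be sketched in a few lines of code. This is a bare-bones illustration of a shared, append-only, double-entry ledger with hash chaining; a real blockchain adds consensus, cryptographic signatures, and distribution, and all names and amounts here are hypothetical:

```python
# Minimal sketch of the shared-ledger concept described above: an
# append-only list of transactions, each chained to the previous entry's
# hash so the full provenance is tamper-evident. Illustration only.
import hashlib
import json

ledger = []  # every participant would hold an identical copy

def add_entry(debtor: str, creditor: str, amount: float) -> dict:
    """Append a transaction, chained to the hash of the previous entry."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"debtor": debtor, "creditor": creditor,
             "amount": amount, "prev_hash": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    ledger.append(entry)
    return entry

add_entry("me", "Joe", 100.00)   # I owe Joe $100; everyone sees it
add_entry("me", "Joe", -25.00)   # a $25 repayment, also visible to all

balance = sum(e["amount"] for e in ledger)  # what I still owe Joe: 75.0
```

Because each entry embeds the hash of the one before it, altering any past transaction would break the chain for every participant, which is the property that makes auditors and clearinghouses less necessary.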

The implications of blockchain technology are enormous. For example: since the blockchain is always in balance (no single-entity entries allowed), goodbye auditors, bookkeepers, accountants, financial intermediaries, clearinghouses, many regulators, and so on.

Some industry pundits are predicting that blockchain will transform healthcare and make the interoperability mountain into a mole hill. A deeper understanding of the healthcare landscape with its many non-technical issues makes me a skeptic.

On the business side of healthcare, the impact should be pretty much the same as in commercial business. You can expect a big impact on finance and supply chain management. Operations such as scheduling and resource management will also see significant changes, many within the next decade. Legacy systems will have a tougher time adapting than they did to the Internet, but slowly they will adjust.

On the clinical / medical side, I predict a much longer runway before we see any real impact. There are two reasons. First, blockchain is highly dependent on definition and structure. Terminology must be consistent and procedures must be standardized. Generally Accepted Accounting Principles (GAAP) have been in place for decades, and as they have changed, multiple oversight groups have hashed out agreed-to revisions. On the supply side, UPC codes have been around for almost 50 years and go down to almost the molecular level.

Not so in medicine. A practitioner’s understanding and use of terminology and protocols is highly dependent upon where they went to medical school and who they trained under. Studies have shown that even today, after the federal government has paid out over $30 billion in EHR incentives, over 70 percent of a patient’s medical record is entered into the EHR as free text.

The second reason is that blockchain cannot work without absolutely accurate identification of the transaction initiator and the information target. Identifying the initiator is easy. The target is the person / patient, and that’s another matter. Even today, after decades and trillions of dollars spent on healthcare IT, we do not have a unique person / patient identifier.

As I have noted in my past writings on HIStalk and other blogs, this is not a technology problem, but a political one. If blockchain is to be the savior of healthcare interoperability, as the technocrats suggest, then it’s Congress that will have to forge the most critical first link in the chain.

My prediction is that systems developers will continue to jury-rig solutions around this missing link. Providers would do well to remember that a chain is only as strong as its weakest link.

Readers Write: The Value Proposition of Optimizing Clinical Communication

May 10, 2017 Readers Write No Comments

The Value Proposition of Optimizing Clinical Communication
By James Jones and Wayne Manuel


James Jones, BSN, MBA, MSN, NEA-BC is VP of patient care services and nursing operations at University of Washington Medicine’s Valley Medical Center. Wayne Manuel is senior VP of strategic services at University of Washington Medicine’s Valley Medical Center.

A few years ago, Wayne was on an airplane when he came across a magazine article about how Texas Children’s Hospital switched to Apple iPhones to improve clinical communication and reduce noise. With some due diligence, he found that Cedars-Sinai Medical Center and several other hospitals had also switched from old ways of communicating to iPhones, and they experienced similar positive results. As our senior VP of strategic services, Wayne recognized the opportunity for UW Medicine’s Valley Medical Center to replace our old, noisy phones with smartphones.

Around the same time, James attended a dinner event for chief nursing officers in Seattle. Again, smartphones were a main topic of the discussion, representing a solution to some common clinical communication challenges.

With both of us having technical backgrounds, we started sharing ideas on how to transition from our disparate communication systems to a more modern solution. We approached our CNO and CMO with research on the value proposition of implementing a mobile communication strategy. It was easy to see how a new way of communicating would bring us additional value. Some of the improvements we hoped to achieve included:

  • Improving the clinician and patient experience.
  • Reducing interruptions.
  • Gaining workflow efficiencies.
  • Saving time for clinicians.
  • Improving communication between interdisciplinary teams.
  • Meeting The Joint Commission’s National Patient Safety Goals for alarm management.

At that time, we had recently deployed a new electronic health record (EHR), which gave us the opportunity to improve many other systems and workflows. Our senior leadership team felt that to get the most out of our EHR, we needed a mobile app to close the gap and provide real-time access to clinical information, allow for mobile documentation, and offer an easy way for nurses and other staff to communicate.

Our staff were already using smartphones in their personal lives and were frustrated with the multiple communication devices they were juggling (two-way radios, legacy phones, pagers, and overhead paging). We met with many of our nurses to get their input, and one said, “Anything you can do to lighten the load would be greatly appreciated.”

We started with a phased approach, rolling out iPhones to one pilot unit, then to all inpatient units and several ancillary departments for calling; secure text messaging; and notification of alarms and alerts from patient monitoring, patient elopement, and the nurse call system. This was done via Voalte and Connexall applications.

We conducted before-and-after analysis so we could measure the outcomes from the new clinical workflows. One area we looked at was hospital-acquired pressure ulcers and skin integrity events. Using the iPhones, our wound care nurses saw an immediate improvement in workflow by using the Epic Rover application to take a photo of the wound, which is uploaded into the patient medical record for documentation. The physician or wound care nurses can see it immediately and even show it to the patient and family when rounding.

With only two dedicated wound care nurses on our team, their time is extremely limited. Rather than spending time walking around looking for a physician or nurse to discuss a patient, they can now find the appropriate physician in the smartphone directory, send a photo via Rover, and ask the physician to call when he or she is available to discuss treatment. The result has been better communication among our interdisciplinary teams, more efficient use of time for our wound care team, real-time documentation to the medical record, and improved communication with patients and families.

Another area where we have made great headway with the iPhones is in reducing medication errors. Using our new workflow, a nurse changes his or her status in the directory from “available” to “busy” and types in a status message, such as “administering meds.” This lets the rest of the care team know not to interrupt that nurse until their status changes back to “available.” New workflows escalate alerts to a backup while that nurse is busy.

Today, we are using iPhones for communication on all clinical inpatient units for nurses, physicians, respiratory therapists, discharge planners, environmental services managers, and administration. We are communicating more efficiently, with about 70 percent of all communication now taking place via text message versus 30 percent via voice calls. Our very tech-savvy staff loves the new solution and has adapted well to the workflow changes. One nurse said her unit is much quieter and that the hospital “feels like a hotel, so patients can get some rest.”

In our first year using smartphones, we are still learning where we can make adjustments in our workflows to make the most of our new way of communication. Going forward, we will be analyzing workflow efficiencies, adjusting alarm settings, and managing notifications from nurse call, physiological monitors, and the EHR.

The authors presented an HIStalk webinar titled “Improving patient outcomes with smartphones: UW Medicine Valley Medical Center’s story.”

Readers Write: An Uncomfortable Truth About Hospital Revenue and an Overlooked Way to Gain It Back

May 10, 2017 Readers Write 4 Comments

An Uncomfortable Truth About Hospital Revenue and an Overlooked Way to Gain It Back
By Crystal Ewing


Crystal Ewing is manager of data integrity at ZirMed.

In a video message from last year that he surely never intended for public and regulatory scrutiny, Mayo Clinic CEO John Noseworthy, MD appeared to advise employees to prioritize patients with commercial insurance in order for the famed hospital to remain financially strong.

Months later, Mayo is still explaining exactly what Dr. Noseworthy meant. Many healthcare leaders need no further explanation, even if they personally dislike any suggestion of favoring the commercially insured over Medicare and Medicaid patients. With government reimbursement continuing its decline, most hospitals are straining to hold on to their profitability.

Still, placing hope in commercial insurance to make up the difference is misguided, especially with the rising dominance of health plans that are not only high deductible, but also require high co-payments and high co-insurance. Touted as a means of covering more Americans, these plans often put more of the financial burden on patients than simply paying for healthcare in cash at a discount.

As such, many patients with these plans may claim they have no coverage when it comes time to pay for a procedure or service. It’s hard not to empathize with their motivation for doing so, but it’s a practice that can put the hospital in a precarious position.

With self-pay patients, things become more complicated, especially since there can be a lag of 30 or more days between the time that they are treated and the time the invoice comes due. When faced with a choice between paying for housing, utilities, food for their families, auto repairs, etc. – all of which affect the present and future – or paying a hospital bill for an event that occurred in the past, the decision is easy.

When this thinking is spread across a large patient population, bad debt accumulates quickly. Additionally, patients are unlikely to pay medical bills that are greater than 5 percent of household income, according to the Advisory Board, a consulting firm for hospitals. Median household income in the U.S. is about $53,000, suggesting that when out-of-pocket charges exceed $2,600, hospitals can forget about collecting, according to Spencer Perlman, an analyst with Height Securities in Washington.

Given the above realities, more hospitals are using automated coverage detection technology, which also finds insurance coverage that patients legitimately aren’t aware of or are unable to communicate. When patients are brought to the hospital in the grips of a heart attack, for example, or while unconscious, they’re hardly able to convey their levels of coverage. Some fully conscious patients even may forget they have coverage, or provide information on secondary rather than primary coverage, or become confused about which carrier covers them. This isn’t uncommon with elderly patients.

Whatever the reason for the missed coverage, it is imperative that coverage verification become a more streamlined process at our nation’s hospitals. It can be done in a way that respects the patient and in a timely fashion that protects the hospital’s finances. The most feasible method is to pair automated coverage detection with automated eligibility verification, the latter of which is already in place at many hospitals. However, coverage detection can also be an independent, standalone process. Either way, it makes quick work of checking with thousands of healthcare payers to determine if any are the primary or secondary insurer for a given patient.

Superior processes and technology often uncover as many as 15 percent more instances of billable insurance. Quick mental math shows how that could recoup millions of dollars for a large hospital system. It is also significantly higher than the 1 to 5 percent rate achieved by manual and legacy coverage detection.

Much of this improvement is due to the huge data sets that now power some business intelligence engines, encompassing billions of historical health insurance transactions for millions of Americans. As these insights are tested against a pre-identified set of payers, algorithms can match the key data attributes that confirm coverage and the information needed to file the claim.
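At its core, the matching step above can be pictured as a lookup of a patient's key demographic attributes against a payer's records. The sketch below is greatly simplified and entirely hypothetical (field names, data, and the matching rule are invented); real systems run standardized eligibility transactions, such as X12 270/271, against thousands of payers:

```python
# Greatly simplified, hypothetical sketch of coverage detection:
# match a patient's demographic attributes against payer records.
# Real systems use standardized eligibility transactions and far
# more sophisticated probabilistic matching.
payer_records = [
    {"payer": "Acme Health", "last": "SMITH", "dob": "1948-03-12",
     "zip": "98055", "role": "primary"},
    {"payer": "StateCare",   "last": "SMITH", "dob": "1948-03-12",
     "zip": "98055", "role": "secondary"},
]

def find_coverage(last: str, dob: str, zip_code: str) -> list:
    """Return any payer records whose key attributes match the patient."""
    return [r for r in payer_records
            if r["last"] == last.upper()
            and r["dob"] == dob
            and r["zip"] == zip_code]

matches = find_coverage("Smith", "1948-03-12", "98055")
# -> both the primary and secondary coverage records for this patient
```

Even this toy version shows why the approach surfaces coverage an unconscious or confused patient could never report: the match runs on attributes the hospital already has.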

What has yet to be quantified but surely exists is the reduction in future collections activity with patients. Despite jargon that describes these patients as “empowered consumers,” the reality is they are struggling to pay their bills and rely on hospitals to help them navigate this uncertain terrain. In turn, hospitals must be fully informed about all of a patient’s sources of payment, including if commercial insurance coverage exists.

There is nothing unethical about seeking such information, only for using it to prioritize patients who it turns out are commercially covered. Clearly hospitals should be setting their sights on treating all patients, regardless of source of payment. The ability to do so is greatly enhanced when hospitals can identify all sources.

Readers Write: A Prescription for Poor Clinician Engagement with Health IT: Stop Communicating and Start Marketing

April 26, 2017 Readers Write 12 Comments

A Prescription for Poor Clinician Engagement with Health IT: Stop Communicating and Start Marketing
By David Butler, MD


David Butler, MD is associate CMIO of the Epic/GO project of NYC Health + Hospitals of New York, NY. 

My first lesson in healthcare marketing came in the spring semester of my junior year at Texas A&M University, when I accepted a prestigious internship with a little company called Merck Pharmaceuticals. Believe it or not, I hadn’t even heard of this company, but I soon found out one of the many reasons for their meteoric rise.

That summer, Merck was releasing a new prostate drug. They posed the question to their young crop of interns: where should we market this drug? Field & Stream! Men’s Health! Cigar Aficionado! We shouted rapid-fire.

Wrong, wrong, and wrong again. Our instructor basked in our ignorance for a moment before he uttered the answer: Good Housekeeping. Targeting the significant others of the drug’s target audience was actually the smarter way to go. They were more likely to notice changes in their partner’s behavior and push them to go to the doctor.

Fast-forward 25 years, and healthcare is approaching physicians and nurses with the same non-WIIFM, non-behavioral-economics approaches my intern class suggested.

We spend hundreds of millions of dollars implementing technology for our best and brightest to use in caring for patients, yet we continue to let transformative software changes enter their workflows without rollout efforts that match the investment or the desired results.

Healthcare needs to stop communicating and start marketing new health IT projects and improvements to existing provider-facing solutions. Too many initiatives fail not on the merit of the technology, but because the organization failed to successfully relay the value to the end users.

Here are five ways to help launch a full-fledged marketing campaign to capture your end users’ attention and effectively roll out new technology and important updates to current systems:

Change the mindset.

Health IT project teams need to think of their communication differently. It should not only inform, it should persuade. If you were going to sell something to physicians to get them to actually buy it, how would you change your communication? That question should be asked during the creation of every piece of project collateral. How do you find the Good Housekeeping equivalent from my opening example?

Get docs and nurses to want to do your desired action, or even better in some cases, understand why it would hurt not to do it.

Spotlight the value.

Too often healthcare organizations spend a bunch of R&D resources creating or improving something really cool, and then communicate that in an email with a laundry list of other changes that aren’t as meaningful. If you’ve added technology that will help save lives or otherwise have a profound impact on clinician efficiency, give it the spotlight it deserves.

For example, it used to be a policy at Sutter Health (my former organization) that if a nurse gave a patient insulin, a second nurse had to log in to double-check the dose. The organization finally changed the policy so that the second nurse and verification were no longer needed. Some genius asked how many nursing clicks, and how much time and money, this change would save. We actually took the time to figure it out.

After factoring in the size of the organization and the number of insulin doses given each day, we figured that policy change resulted in $400,000 in savings of nurses’ time—and that’s the value we marketed. Not only to the nurses, but also to the board. We told the nurses how much of their time we were giving back to them and told the board about the significant cost savings for the organization.
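As a sketch of that kind of ROI calculation (every input below is hypothetical, chosen only to show how a figure in that neighborhood is derived; these are not the actual Sutter Health numbers):

```python
# Hypothetical back-of-the-envelope for a double-check policy change.
# None of these inputs are the actual Sutter Health figures.
doses_per_day = 1500          # insulin doses given system-wide per day
minutes_per_check = 1.0       # second nurse: find a terminal, log in, verify
nurse_cost_per_hour = 45.00   # fully loaded hourly nursing cost

hours_per_year = doses_per_day * minutes_per_check / 60 * 365
annual_savings = hours_per_year * nurse_cost_per_hour
# ~9,125 nursing hours and ~$410,000 per year at these assumptions
```

The point isn't the exact figure; it's that a one-minute task, multiplied across an organization and a year, becomes a number big enough to market to nurses and to the board.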

Once you find the value to spotlight, think about what that value means to different parties and market that ROI.

Devise a catchphrase.

If you want end user attention, you’re going to have to earn it. There are too many competing priorities for a busy physician’s or nurse’s attention. Have some fun and get some eyeballs by devising a catchphrase for your campaign.

For example, when I was helping roll out a secure messaging solution to thousands of physicians, we could have promoted it with “New! Secure Messaging” or even “Pagers to Smartphones” messaging. Instead, we used “Safe Text.” It was fun and catchy—there were plenty of good-natured jokes and buzz around the campaign—and it also tapped into their own motivation to protect PHI. Make your catchphrase not only descriptive, but also memorable. That’s marketing.

Include a call to action.

What do you want your audience—physicians, nurses, or whichever group it may be—to actually do after they’ve read your communication? Good marketing always includes a call to action, or CTA. After you create marketing for the group, ask yourself what the CTA should be. Do you want them to download an app or an update? Submit their feedback? Add an event to their calendar? Always make the CTA big, bold, and if possible, frictionless.

For example, include a link that can automatically add the event to their calendar, or seamlessly forward it to a friend or colleague. You can also think about the tools you already have and how you might get innovative with them to drive follow-through.

One prominent health system in the Pacific Northwest used their EHR alerts to creatively capture clinician attention at various workflow points within the EHR. They were greeted by a respected physician leader — their CMO — whose image and quote reminded them to complete certain crucial activities within the EHR. Having his face staring at the clinicians alongside that CTA made it much more influential.

Rinse and repeat.

If a company you already like and engage with introduces a new product, they’re going to be marketing that to you on every channel they can: Direct mail, email, TV commercials, social media ads, display ads. Follow a similar approach for internal projects: Emails, flyers, reader boards, table tents in the cafeteria, digital banners on internal websites, announcements at town halls, free tchotchkes—anything you can think of where your end users might see it.

Physicians rarely understood why drug companies would provide free prescription pads, pens, and other items. “It doesn’t affect my prescribing patterns,” they insisted. Years of research, however, show that it actually does. So let’s wise up and borrow marketing examples from other verticals to keep the messaging in front of them. It may take several exposures for the message to resonate, but you can keep it fresh by switching up the format, colors, and graphics.

Finally, don’t forget to ask for help if you need it. Most healthcare organizations have talented marketing teams that are consumer-facing, but may be willing to help out with internal initiatives. They’re just not always asked.

With these five strategies, you can help your organization’s IT team pivot from announcing new technologies in boring emails to running full-fledged campaigns that truly market the value to doctors and nurses and successfully bring them on board.

Readers Write: Deep Neural Networks: The Black Box That’s Changing Healthcare Decision Support

April 26, 2017 Readers Write 1 Comment

Deep Neural Networks: The Black Box That’s Changing Healthcare Decision Support
By Joe Petro


Joe Petro is SVP of research and development with Nuance Communications.

Don’t look now, but artificial intelligence (AI) is quietly transforming healthcare decision-making. From improving the accuracy and quality of clinical documentation to helping radiologists find the needle in the imaging haystack, AI is freeing clinicians to focus more of their brain cycles on delivering effective patient care. Many experts believe that the application of AI and machine learning to healthcare is reaching a crucial tipping point, thanks to the impact of deep neural networks (DNN).

What is a Neural Network?

Neural networks are designed to work in much the same way the human brain works. An array of simple algorithmic nodes—like the neurons in a human brain—analyze snippets of information and make connections, assembling complex data puzzles to arrive at an answer.

The “deep” part refers to the way deep neural networks are organized in many layers, with the intermediate (or “hidden”) layers focused on identifying elemental pieces (or “features”) of the puzzle and then passing what they have learned to deeper layers in the network to develop a more complete understanding of the input, which ultimately produces a valid answer. For example, a diagnostic image is submitted to the network and the output may be a prioritized worklist and the identification of a possible anomaly.

Like us humans, the network is not born with any real knowledge of a problem or a solution; it must be trained. Also known as “machine learning,” this is achieved by feeding the network large amounts of input data with known answers, effectively teaching the network how to interpret and understand various inputs or signals. Just like showing your child, “This is a car, this is a truck, this is a horse,” the network needs to be trained to interpret an input and convert it to an output.
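The train-on-known-answers loop just described can be illustrated with the simplest possible "network": a single artificial neuron learning logical AND via the classic perceptron rule. This is a toy sketch of the learning idea only; production deep networks stack many layers of such units and use gradient-based training:

```python
# Toy illustration of training on inputs with known answers:
# one neuron learning the logical AND function (perceptron rule).
def step(x):
    """Fire (1) if the weighted input exceeds the threshold, else 0."""
    return 1 if x > 0 else 0

# The "truth set": inputs paired with known correct answers.
truth_set = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # connection weights, learned from data
b = 0.0          # bias term
lr = 0.1         # learning rate

for _ in range(100):                # repeated passes over the truth set
    for (x1, x2), target in truth_set:
        out = step(w[0] * x1 + w[1] * x2 + b)
        err = target - out          # when wrong, nudge toward the truth
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in truth_set]
# predictions now match the known answers: [0, 0, 0, 1]
```

Each wrong answer nudges the weights toward the correct one, which is the same correct-when-wrong loop that, scaled up to many layers and vastly larger truth sets, trains a deep network.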

For example, training a DNN for medical transcription might involve feeding it billions of lines of spoken narrative. The resulting textual output forms a truth set consisting of spoken words connected with transcribed text. This truth set expands over time as the DNN is subjected to more and more inputs. Over time, errors are corrected and the network’s ability to deliver the correct answer becomes more robust.

A key feature of a neural network is that when it gets something wrong, it is corrected. Just like a child, it becomes smarter over time.

The Black Box

Here’s where it gets interesting. Once the DNN has that baseline training and it begins to analyze problems correctly, its neural processes become a kind of black box. The DNN takes over the sophisticated, multi-step intelligence process and figures out how the inputs are connected or related to the outputs. This is a very powerful concept because we may not fully understand exactly how the network is making every little decision to arrive at an output, but we know it is getting it right.

This black box effect frees us from having to contemplate—and generate code for—all the complex intermediate variables and countless analytical steps required to get to a result. Instead, the DNN figures out all intermediate steps within the network, freeing the technologist from having to worry about every single one. And with every new problem we give it, we provide additional truth sets and the neural network gets a little bit smarter as it trains itself, just like a child learning its way in the world.

How smart is smart? One of the biggest challenges with speech recognition is accommodating language and acoustic models, the specific and very individual aspects of the way a person speaks—including accent, dialects, and personal speech anomalies. Traditionally, this has required creating many different language and acoustic models to cover a diverse range of speakers to ensure accurate speech recognition and improve the user experience across a large population of speakers.

When we started using special-purpose neural networks for speech recognition, we discovered something surprising. We didn’t need as many models as before. A single neural network proved robust enough to handle a wider range of speech patterns. The network essentially leveraged what it learned from the massive amounts of speech data we used as a training set to improve its accuracy and understand people across the entire speaker population, reducing the word error rate by nearly 30 percent.
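Word error rate (WER), the metric cited above, is the standard way speech recognition accuracy is measured: the minimum number of word substitutions, insertions, and deletions needed to turn the recognizer's output into the reference transcript, divided by the reference length. A minimal sketch (the sample sentences are made up for illustration):

```python
# Word error rate via the classic edit-distance dynamic program over words.

def wer(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = min edits to turn the first j hypothesis words
    # into the first i reference words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("patient denies chest pain", "patient denies chest pains"))  # → 0.25
```

A 30 percent relative reduction means, for example, a system making 10 word errors per 100 reference words now makes about 7.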

Anecdotally, I’ve heard from people seated across from a physician dictating with such a thick accent at such high speed that they could not comprehend what was said, yet DNN-driven speech recognition technology understood and got it right the first time.

It’s important to note that neural networks are not magic. DNNs require problems that have clear answers. If a team of trained humans agrees with no ambiguity, and can repeat that agreement across a large set of inputs, this is the kind of problem that neural nets may help to solve. However, if the truth set has grey areas or ambiguity, the DNN will struggle to produce consistent results. The problems we choose and the availability of strong training data are key to the successful application of this technology.

Putting DNNs to Work in Healthcare

So how are DNNs changing the way healthcare is practiced? Neural networks have been used in advanced speech recognition technology for years, and that’s just the beginning. The potential applications are nearly endless, but let’s look at two: clinical documentation improvement (CDI) and diagnostic image detection.

Clinical documentation includes a wide range of inputs, from speech-generated or typed physician notes to labs, medications, and other patient data. Traditionally, CDI involves having domain experts review the documentation to ensure an accurate representation of a patient’s condition and diagnosis. This second set of eyes helps ensure patients receive the appropriate treatment and that conditions are properly coded so the hospital receives appropriate reimbursement. The CDI process requires time and resources and can be disruptive to physicians’ workflow, since the questions coming from CDI specialists are generally asynchronous with the documentation input.

Technology is used to augment the CDI process. Applications exist that capture and digitize CDI processes and domain expertise, creating a CDI knowledge base at the core. This involves processing clinical documentation, applying natural language processing (NLP) technology to extract key facts and evidence, and then running these artifacts through the knowledge base. The output of this complicated process is a context-specific query that fires for the physician in real time as she is entering patient documentation, linking, say, a relevant lab value with key facts and evidence from the case to indicate the possibility of an undocumented infection. This approach to addressing a common documentation gap is a technically arduous and complex processing task.
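The extract-then-match pipeline described above can be sketched in miniature. Everything here is hypothetical and heavily simplified: the fact extraction, the rule, the lab threshold, and the query wording are illustrative stand-ins, not taken from any real CDI product.

```python
# A simplified, hypothetical sketch of the rule-based CDI approach:
# extract facts from the note and labs, run them against a knowledge
# base of documentation-gap rules, and fire a real-time query.

def extract_facts(note, labs):
    """Stand-in for the NLP step: pull key terms and lab values."""
    facts = {"mentions_infection": "infection" in note.lower()}
    facts.update(labs)
    return facts

def run_rules(facts):
    """Stand-in for the knowledge base: each rule may fire a physician query."""
    queries = []
    # Illustrative rule: elevated white count with no documented infection
    if facts.get("wbc", 0) > 11.0 and not facts["mentions_infection"]:
        queries.append("WBC is elevated; is there an undocumented infection?")
    return queries

note = "Patient admitted with cough and fever. Started on IV fluids."
labs = {"wbc": 14.2}  # white blood cell count, thousands per microliter
print(run_rules(extract_facts(note, labs)))
```

Every fact, rule, and threshold in a real system has to be hand-built and maintained this way, which is exactly the complexity the neural-network approach in the next paragraph sidesteps: the DNN infers the note-to-query mapping from historical examples instead.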

What if we applied neural networks to change the paradigm? Many institutions have been doing CDI manually for years, and we can leverage not only the existing clinical documentation (the input), but also the queries generated (the output) from those physician notes to create a truth set for training the neural network with a repeatable, deterministic process. The application of neural networks allows us to skip over the complexity of digitizing domain expertise and processing the inputs through a multi-step process. Remember the black box concept? The DNN essentially determines the intermediate steps, based on what it learned from the historical truth set. In the end, this helps improve documentation by having AI figure out the missing pieces or connections to advise physicians in real time while they’re still charting.

The applications of neural networks are not limited to speech or language processing. DNNs are also changing the game for evaluating visual data, including radiological images. Reading the subtle variations in signal strength associated with identification of an anomaly requires a highly trained eye in a given specialty. With neural networks, we can leverage this deep experience by training the network with thousands of radiological images with known diagnoses. This enables the network to detect the subtle differences between a positive finding and a negative finding. The more images we feed through it, the more experienced and accurate the DNN becomes. This technology will streamline the busy workflow of radiologists and truly amplify their knowledge and productivity.

Augmenting, Not Replacing

While the possibilities for neural networks are incredibly exciting, it’s important to note that they should be viewed as powerful tools for augmenting human expertise rather than replacing it. In the case of diagnostic image detection, for example, a DNN can serve as a first line review of films, helping prioritize them so radiologists focus first on those that are most critical. Or it might serve as an automated second opinion, possibly spotting something that might have been overlooked.

Today, AI in healthcare decision support is still in its infancy. But with the exciting possibilities created by DNNs, that infant is poised to transition from crawling to walking and even running in the foreseeable future. That’s good news for providers and patients alike.

Readers Write: Top Health IT Marketing Trends From #HITMC

April 10, 2017 Readers Write No Comments

Top Health IT Marketing Trends from #HITMC
By John Trader

John Trader is VP of communications at RightPatient in Atlanta.

I had the opportunity to attend the Health IT Marketing & PR Conference in Las Vegas last week, and thought I’d share some of my top health IT marketing takeaways.

Content, Content, Content

Content was certainly king in terms of session topics. What works. What doesn’t work. How to establish a sound content-marketing strategy (even if you’re a small company with a shoestring budget). My biggest takeaway on content is that marketers need to start with the end in mind. Understand what content resonates with the demographic you target by listening first, and then developing a strategy that addresses customer needs and is strategically presented to them as they make their way down the sales funnel.

I enjoyed Sarah Davelaar’s (from The Signal Center for Health Innovation) session where she outlined the key elements of content strategy. I also enjoyed a panel discussion featuring four physicians who shared their content consumption habits – where they go to find information, what content resonates with them, and what they like versus what they ignore. The million-dollar question for any health IT marketer is: What influences their decision to buy? Most docs said that conferences are a great place for them to discover new products. Those docs on social platforms like Twitter do pay attention to who shares their posts and who interacts with them. Catchy headlines are important, and most of them look for unique perspectives on issues rather than content that extols the virtues of a product.

Innovation Versus Value

Conference organizer and Netspective founder Shahid Shah’s opening presentation on day two was excellent (although the amount of information on his slides was a tad overwhelming). There was a lot of discussion at the conference about whether marketers should position themselves as innovators, since nothing we do is truthfully going to "disrupt" healthcare. The truth is, customers care a lot more about value than innovation. One of the best quotes from his presentation was, “Do customers care about what you think is innovation or will they care more about you when you care about what their innovation needs are?” 

Social

Although I didn’t attend any sessions dedicated to social media use or strategy, there were a few that addressed how to navigate the online universe and how to develop and execute effective social media strategies. “Go where your customers are” seemed to be the general takeaway from attendees of those sessions. Don’t chase the latest shiny social platform just for the sake of having a presence. Again, start with an end goal in mind (create leads and eventually sales), and make sure you are measuring your results (how will you be able to tell if your efforts are successful?). There was also some discussion on how to effectively measure social efforts to better understand what works versus what doesn’t, along with a lot of chatter about moving beyond brand awareness toward how social efforts create leads and sales.

Leveraging the Customer

A recurring theme was how to leverage existing customers to create new business. Kathy Sucich of Dimensional Insight delivered an excellent presentation, where she provided a case study on how she increased her own company’s “share of voice” (a term that was new to me), and gave sound advice on how to successfully leverage customers to create new content and increase brand visibility and messaging. The key takeaway for me here was that capturing and then bringing the customer’s voice to your messaging requires personal relationships with customers. You simply must spend the time to cultivate these relationships by establishing a set of expectations at the outset of the relationship that outlines your plan to work with your customers and get their story in front of others.

Video

There was a lot of buzz about creating more video as part of an effective marketing strategy. It continues to be a hot topic of interest because it’s clear that people want to consume more of it. The key is making it resonate: keep it simple, short, and focused on addressing a problem instead of extolling the virtues of a product. Christine Slocumb’s (of Clarity Quest Marketing) session was excellent in reiterating the point that in this day and age, videos have to be personalized to be effective.

SEO Isn’t Dead

Kristine Schachinger of The Vetters Agency presented an excellent session covering modern SEO practices, soup to nuts. We talked about ways to analyze SEO performance, online SEO resources, ranking factors, inbound link tactics, do’s and don’ts for SEO, how to add Google Search Console to your site, how content affects SEO, and keyword research – just to name a few topics. There was a great deal of interaction between the presenter and the audience, and directly between audience members, which, in my opinion, is what makes this conference excellent. Questions were asked and topics brought up that were a great supplement to Kristine’s curriculum. This is perhaps what I like best about HITMC. It has a more intimate setting than most conferences I attend.

About That Other Conference

The buzz around the conference seemed to be the forthcoming HIMSS marketing conference (which, by the way, I don’t anticipate will be able to offer the intimate setting I mentioned above). Many have said they heard through the rumor mill that it may be frowned upon by the marketing community to attend in lieu of supporting HITMC’s more grassroots efforts. I talked to several people who have already signed up for the HIMSS event but seem to be keeping that information to themselves. Other buzz has been the quality of HITMC – most people agree that it’s an excellent conference and gets better each year by addressing the most relevant topics to marketers.

The only drawbacks I found, aside from freezing temps in the conference rooms, were that the few tough questions I asked during Q&As weren’t answered as thoroughly as I would have liked, and there was a lack of substantial, real-world case studies to back up presenter assertions. Overall, I think the conference was a great investment. It’s always helpful for me to be around likeminded professionals eager to gain insight and tips on how we can do our jobs more effectively.

Readers Write: In a Fog About the Cloud?

March 27, 2017 Readers Write No Comments

In a Fog About the Cloud?
By Alan Dash

Alan Dash is senior advisor with Impact Advisors of Naperville, IL.

The use of a cloud to symbolize some magical spot where all the answers to the world’s questions are housed and an infinite amount of storage exists has been around since the 1970s. I reflect on my own career, while programming for the US Air Force in the early 1980s, drawing clouds in my diagrams to show that somewhere, out there, beneath the pale moonlight, someone’s thinking of me, and filling the void symbolized by my cloud with meaningful data.

Not exactly how Linda Ronstadt and James Ingram sang it, but that was my visual. Back then we called it what it was – centralized computing; output devices received data from centrally-located applications.

Then came PCs, placing those applications out onto the edge of the computing environment and away from the monster in the data center that threw off a dragon’s amount of heat and occasionally an equal amount of fire and brimstone. We called that de-centralized computing; everyone was free to process at their desk.

PCs became smaller, applications bigger. Soon we needed gigabytes of storage to hold the very applications that were to be fed with an obese amount of data. Ultimately PCs couldn’t handle the power and space needed, so centralized computing came back, only this time we called it “The Cloud” and it was good – good because we learned new acronyms like SaaS, DaaS, IaaS, NaaS, and RaaS.

So now that we understand the Cloud, kinda, manufacturers have introduced something new: very small sensors that can communicate and intercommunicate in ways that justify a new name, The Internet of Things (IoT), or The Internet of Everything (IoE).

Ostensibly, these little sensors and devices communicate with the Cloud in a two-way format, providing data and receiving instruction. Examples of IoT devices include sensors that control lights and blinds, HVAC systems and appliances, and security and energy-efficiency systems. More recent additions include wearable medical technologies, wildlife movement monitoring, urban infrastructure monitoring (roads and bridges), and even intelligent collision-avoidance sensors in automobiles (both with driver and without).

Back to the Cloud. Servers located remotely (in the Cloud) can, and do, communicate with IoT devices out on the edge of the network; centralized computing works for IoT devices. However, propagation delay (another ‘old’ term) has become a serious factor. Propagation delay is the time it takes a signal to travel from a sender to a receiver and, for a reply, to come back. Under normal circumstances, while we are affected by this delay, we don’t really notice it because of our reference point.

Here’s an example. You call a friend who you are meeting at a restaurant, you ask where they are, and then you see them walking around the corner. You see their mouth move, then you hear their voice in your phone. We always have this delay, but our reference point is such that we do not realize it, so it does not bother us.

Not so for IoT devices. These devices need to communicate instantly with one another, and routing that communication through the Cloud, while technically possible, adds far too much propagation delay to the mix. They become ineffective.
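A back-of-envelope calculation shows why the round trip to a distant cloud adds up for chatty IoT devices. The distances and per-hop processing times below are illustrative assumptions, not measurements; the point is the shape of the math, not the exact numbers.

```python
# Why cloud round trips hurt device-to-device IoT traffic: even at the
# speed of light in fiber (roughly 200 km per millisecond), distance plus
# server-side processing dominates a nearby "fog" hop.

SPEED_IN_FIBER_KM_PER_MS = 200  # approximate signal speed in optical fiber

def round_trip_ms(distance_km, processing_ms):
    # out and back, plus time spent in queues and servers at the far end
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS + processing_ms

cloud = round_trip_ms(distance_km=2000, processing_ms=20)  # remote data center
fog = round_trip_ms(distance_km=0.1, processing_ms=2)      # device down the hall

print(f"cloud: {cloud:.1f} ms, fog: {fog:.3f} ms")  # → cloud: 40.0 ms, fog: 2.001 ms
```

A sensor-to-sensor exchange that makes dozens of such hops per second simply cannot afford the cloud path, which is the case for fog computing made quantitatively.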

This brings a new (old) concept back into play – de-centralized computing. Ahh, remember that? But we can’t call it de-centralized computing because it’s an old term that we were told does not work any longer, so for IoT to IoT device communication a new name had to be created. That name is … The Fog.

And yes, it makes sense. A fog is a cloud at ground level. A billion droplets of water vapor floating around at a low level, not relying on the cloud for existence. And that’s what the idea of intercommunicating IoT devices is. A billion little sensors bouncing around, intercommunicating, and not relying on the Cloud to perform that communication.

In healthcare, IoT is already here, located within wearable technologies monitoring biometric data, in the RFID systems used to track supplies and locate staff, and in mechanical controls for building automation. For hospitals, the growth of wearable tech will be the next step, and it will be the first place IoT reshapes their architecture.

Already we are seeing program space being set aside by hospitals to blend clinical engineering, clinical care providers, and IT departments who will work together to choose, fit, configure, and remotely monitor patients wearing sensors, smart clothing, even implants and prosthetics that will communicate back into the hospital network.

While large leaps into IoT and Fog Computing won’t be seen in the typical hospital for a few years, forthcoming IoT devices will route alarms from equipment to care providers, warn of fall risks, automate re-supply of equipment and meds, track clinical process flow, mitigate queuing, and heighten the use of autonomous robots for specimen collection, supply delivery, and remote telemedicine visits. Beyond that, as driverless cars make their way into the mainstream, hospital garages and wayfinding systems will ultimately communicate directly with these vehicles, perhaps even routing cars to appropriate entry points based on the current biometric readings of the passengers within.

The possibilities are, well, still foggy.

Readers Write: What Healthcare Can Learn From My Roofer

March 27, 2017 Readers Write 6 Comments

What Healthcare Can Learn From My Roofer
By Phelps Jackson

Phelps Jackson is CEO of Sirono of Berkeley, CA.

I had a leaky roof over my kitchen. In the dry season, it wasn’t a problem, but it was something I needed to take care of. I kept putting off the repairs because I dreaded the hassle of bids, estimates, and surprise expenses.

When the rainy season finally came, I started using my pots more for catching drips than for cooking. I had to do something. I looked online for the highest-rated roofing company in my area, got an estimate for repairs, and gave the go-ahead for the work.

About 30 minutes into the job, I got a call from the roofer. The wood beneath the shingles was ruined. It would add $1,200 to the repairs. When I asked why that cost wasn’t included in the initial estimate, he politely reminded me that he had warned about the possibility of additional costs.

When I asked why the price was so high, what I got was modern, high-quality customer service: on-the-spot pictures of the rotten sheathing, an email with the price breakdown, a follow-up phone call to see if I had any billing questions, and more pictures of progress as the repairs went on. Actual pictures!

In the end, I was comfortable paying the higher cost because I understood the real value of the service. Best of all, he kept me well informed throughout the whole process even though I was 1,000 miles away on a business trip.

So, if a guy standing on top of my house can offer omni-channel customer service and high-level billing support, why can’t a multimillion-dollar hospital with teams of representatives do the same?

That’s exactly what frustrated patients ask themselves every day. They don’t care about the complexity of medical claim processes. They just want to know how much they will owe and why. The reality is that 61 percent of patients find themselves surprised by out-of-pocket expenses because they were never told that pre-service estimates aren’t 100 percent accurate, or, more likely, because they never got an estimate in the first place.

In contrast to the customer billing support I was offered, what if three months after the repair I had gotten a roofing bill $1,200 higher than the estimate? I would have assumed that I was being ripped off, disputed the charges, and most likely left negative online reviews so others could avoid a similar experience.

It’s no different when patients receive medical bills that are unexpectedly higher than estimated, which is so often the case. They become suspicious of the additional charges, question their own financial liability, and delay payment or refuse to pay altogether. Even if patients are happy with their medical care and would be willing to accept additional fees, they will probably assume that there was an error.

Proactive outreach to explain balance changes shows patients that they are valued and respected. It clarifies the quality of the care received, expedites payment, and inspires customer loyalty. Fifty-seven percent of patients say their medical bills are confusing.

Improving the patient billing experience is a must for every hospital. Utilizing the patient’s preferred methods of communication makes the process easier and far more patient-centered. In healthcare, as in every other industry, consumers want to interact with businesses the way they prefer, whether it is online, email, text, phone, or through the mail.

The ease of online shopping and service-oriented local businesses have raised customer service expectations and the average hospital doesn’t come close. As patient payments become increasingly critical to the revenue cycle, smart health systems will adapt and prosper. Those who don’t—won’t.

Readers Write: Data Security Comparison: Healthcare vs. Retail, Finance, and Government

March 15, 2017 Readers Write No Comments

Data Security Comparison: Healthcare vs. Retail, Finance, and Government
By Robert Lord

Robert Lord is co-founder and CEO of Protenus of Baltimore, MD.

In 2016, the healthcare industry experienced, on average, more than one health data breach per day, and these breaches resulted in 27,314,647 affected patient records. Clearly, criminals are targeting patients’ medical information with great frequency and success.

How has the healthcare industry responded to this continuing epidemic? Data suggests there is still a lot of work for healthcare organizations to do in order to improve the security of their patient data. It’s important to look closely at how healthcare organizations’ security practices and spending compare to retail, finance, and government — three industries known to have proactively advanced their security posture to protect their sensitive data.

Compared to the retail and finance industries, the state of healthcare data security is sorely lacking. Since 2015, 140 million patient records have been compromised, equivalent to one in three Americans having their health data inappropriately accessed. Ransomware attacks hit the healthcare industry especially hard, as 88 percent of all ransomware attacks target a healthcare organization.

Criminals are increasingly targeting healthcare because patients’ medical information is incredibly profitable on the black market and it’s more easily accessible than in more protected industries, such as finance. Within the finance industry, if a customer’s credit card or bank account number is stolen, that information can simply be changed, rendering it useless to the criminal. Patient data, on the other hand, is a repository of information that can be used to steal an individual’s identity – Social Security numbers, dates of birth, and addresses.

When combined with sensitive medical information like diagnoses, claims history, and medications, it can create the perfect storm for wreaking havoc in a patient’s life. This kind of information cannot be easily changed, and because of the lagging security in the healthcare industry, this data is incredibly easy to obtain and increasingly vulnerable to criminals’ sophisticated attacks.

There is no question that when compared to other industries, healthcare falls short when it comes to data security. A 2015 survey found that only 31 percent of healthcare organizations used extensive methods of encryption to protect sensitive data and 20 percent used no encryption at all. Another study found that 58 percent of organizations in the financial sector used encryption extensively. These results are concerning because the information healthcare organizations must protect is far more sensitive and potentially damaging than what retail and finance organizations gather, yet those industries are more proactive about keeping their information safe.

Retail and financial service organizations have more experience protecting customer data from cyber criminals. This gives them an advantage over healthcare organizations, which are relatively new to the game and whose unique security challenges require specially designed solutions. It’s past time for healthcare organizations to invest substantially in protecting patient data. Sadly, according to KPMG, this has not yet occurred at the necessary scale, as IT security spending in the healthcare industry is just 10 percent of what other industries spend on security.

Incentives exist for healthcare organizations to improve their security posture because the cost of a healthcare breach is significantly higher than in other industries. The average cost per lost or stolen record is $158 across all industries. In the retail sector, the cost is $200 per record lost or stolen. In the financial sector, the cost is $264 per record.

Compare this to the healthcare industry, where the average cost per record lost or stolen is $402, double that of the retail sector. Why are healthcare data breaches so much more expensive? In the aftermath of a breach in a heavily regulated industry like healthcare, the breached organization must conduct a forensics investigation and notify any affected patients. These organizations must also pay any HIPAA fines or penalties incurred because of failure to comply with federal or state regulations. This is in addition to legal fees, lawsuits, and, most importantly, long-term damage to the affected organization’s brand reputation and lost patient revenue.
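The per-record figures above make the exposure easy to quantify. Using the article's own numbers, a quick calculation of the direct cost of a hypothetical 10,000-record breach in each sector (the record count is illustrative):

```python
# Total direct breach cost by sector, using the per-record costs cited
# in the text ($158 cross-industry average, $200 retail, $264 finance,
# $402 healthcare) for an illustrative 10,000-record breach.

cost_per_record = {
    "all industries (avg)": 158,
    "retail": 200,
    "finance": 264,
    "healthcare": 402,
}

records_breached = 10_000
for sector, cost in cost_per_record.items():
    print(f"{sector}: ${cost * records_breached:,}")
```

At $402 per record, even a modest 10,000-record incident costs a healthcare organization about $4 million before counting reputational damage and lost patient revenue.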

However, it’s important to note that healthcare is not the only industry to have fallen behind when it comes to data security. The US government has also struggled to institute effective data security practices. A study by SecurityScoreCard examined the security posture of 600 local, state, and federal government organizations and compared them to other industries. The study found that government organizations had some of the lowest security scores, trailing behind transportation, retail, and healthcare industries. It also found that there were 35 major data breaches of the surveyed organizations from April 2015 to April 2016.

In the summer of 2015, the Office of Personnel Management (OPM) announced that it had suffered a massive data breach. The sensitive information of over 21 million people had been stolen, including fingerprints, Social Security numbers, and sensitive health information. A report from the House Committee on Oversight and Government Reform alleged that poor security practices and inept leadership enabled hackers to steal this enormous amount of sensitive data. OPM immediately began to implement changes aimed at improving its security posture and preventing another breach of that magnitude. However, one can’t help but consider how much less damage would have been done if OPM had made these changes as a proactive data security measure instead of a reactive one.

While healthcare organizations have had their fair share of data breaches, the OPM breach must serve as a lesson to the industry. Since that incident, the government has prioritized cybersecurity and focused on finding solutions to protect our nation’s sensitive information, data, and assets. Healthcare organizations must follow suit.

Here are five things healthcare organizations can do now to improve their health data security:

  1. Treat security risk assessments as an ongoing process rather than a once-per-year event; at a minimum, ensure they are done annually.
  2. Encrypt data stored in portable devices.
  3. Assess other third-party security risks.
  4. Proactively monitor patient data for inappropriate access.
  5. Educate and retrain staff on how to properly handle sensitive data.

Healthcare must make privacy and security top priorities, learning from the past, applying knowledge from other industries, and creating unique solutions specifically designed for the complicated healthcare clinical environment. This will ultimately provide healthcare organizations with the tools to keep sensitive patient information safe, maintain the organization’s brand reputation, and most importantly, increase patient trust.

Readers Write: Beyond the Buzzword: Survey Shows What EHR Optimization Means to Providers

March 15, 2017 Readers Write 3 Comments

Beyond the Buzzword: Survey Shows What EHR Optimization Means to Providers
By David Lareau

David Lareau is CEO of Medicomp Systems of Chantilly, VA.

I was intrigued by this recent KPMG CIO survey that found “EMR system optimization” to be the top investment priority for CIOs. The survey, which was based on the responses of 112 CHIME members, revealed that over the next three years, 38 percent of the CIOs plan to spend the majority of their capital investment on EHR/EMR optimization efforts.

The key word here is “optimization,” since over 95 percent of hospitals already have an EHR/EMR, according to the Office of the National Coordinator (ONC). Given the high level of provider dissatisfaction with their EHRs/EMRs, it’s not surprising that CIOs are seeking ways to make their doctors happier with existing solutions, since starting over with a new system would require a major capital investment that few hospitals are willing or able to afford.

In the KPMG report, the authors suggested a few ways CIOs could optimize their EMRs/EHRs, including providing effective user training and making more technology available remotely and via mobile devices.

Coincidentally, at HIMSS this year, we conducted our own survey to get a better understanding of what providers find most frustrating about working in their EHR/EMR. I am the first to admit our survey wasn't the most scientific – the primary reason that almost 700 people agreed to participate was that it entered them in our drawing for a vacation cruise – but nevertheless, the results were compelling.

We asked HIMSS attendees the following question: What is most frustrating about working in your EHR? We then offered the following response choices:

  1. Relevant clinical information is hard to find
  2. Documentation takes too long
  3. Doesn’t fit into my existing workflow
  4. Negatively impacts patient encounters
  5. Doesn’t frustrate me
  6. My organization doesn’t use an EHR

A whopping 44 percent selected the response, “Documentation takes too long.” For the sake of comparison, the next-highest response was, “Relevant information is hard to find” (18 percent), followed by, “My organization doesn’t use an EHR” (13 percent).

What I glean from these results – aside from the fact that CIOs would be well served to invest in solutions that improve documentation speed – is that CIOs and other decision makers may not be focused on the right solutions.

I am a big proponent of user training, but let’s be realistic: if you have a propeller-driven airplane, it’s never going to perform like a jet aircraft. CIOs must accept that even with all the training in the world, the documentation process within some legacy EHR systems will never be significantly faster, nor will it be particularly user friendly.

Rather than investing resources in trying to teach users how to make more efficient use of an inefficient system, why not consider investing in a solution that can easily be plugged into legacy systems and give clinicians the fast documentation tools they desire? CIOs can find technologies that work in conjunction with existing EHRs to alleviate provider frustration because they work the way doctors think, do not get in their way, and do not slow them down.

The KPMG survey confirms what most of us in healthcare IT have long known: EHRs have not yet achieved their full potential, providers are weary of the inefficiencies, and more resources must be spent to optimize the original investments. As CIOs and other decision-makers consider their next steps, I encourage them to assess what they now have and look for solutions that give clinicians what they want and need at the point of care.
