
Readers Write: Removing Tunnel Vision from Enterprise Imaging

February 24, 2016

By Karen Holzberger


I find the evolution of technology to be fascinating. Just think about music. Fifteen years ago, CDs were the most popular way to access music. Now you can listen to music anywhere, instantaneously, from tiny devices. The population has universally embraced the change. Why has accepting change in healthcare been so slow and difficult?

I’m not saying we all need to be on the bleeding edge of innovation, but it’s important to remove the tunnel vision and recognize advances not just in diagnostic medicine or medical research, but also in health IT innovations that make things faster, easier, and less costly.

I was surprised when I read a recent report on enterprise imaging whose research and results were limited to organizations with vendor-neutral archive (VNA) or universal viewer (UV) technologies.

The need to access and store medical images has been the most common demand of radiology departments for decades, but to think that in 2016 enterprise imaging can only be done with these two approaches is like taking a Polaroid camera to the beach and waiting a week for the film to be developed.

Don’t get me wrong. This report got it half right, but VNA and UV solutions don’t fit the needs of every organization, and that can lead people down the wrong path. If healthcare facilities are going to succeed in advancing the quality of patient care, then it is time to accept new and nimble health IT solutions for enterprise imaging today that bring patient images to people’s fingertips as swiftly and securely as the cloud delivers your favorite song.

Over the last few years, cloud-based image exchanges have gained popularity as an option for enterprise imaging. A HIMSS Analytics Cloud Survey showed that 83 percent of healthcare organizations used cloud-based apps in 2014. While this simpler approach is not the same as a VNA, it allows facilities to achieve the same overall goals, often more efficiently. Facilities can be up and running on an image exchange in as little as two weeks and have central access to all necessary images via the cloud – anywhere, anytime.

VNAs are one of the oldest imaging technologies. When introduced, they finally allowed healthcare sites to collect data from all departments in one location and exchange that information with a broader audience. But what about patient care happening elsewhere and other types of patient data?

Today, it’s critical that facilities share information with other facilities, not just other departments within the same building. In addition, the shift to value-based care means facilities require quick, efficient technology that follows patients across a continuum, which takes more than just sending an image from point A to point B. Imagine only being able to listen to your favorite song on your iPod and not on any of your other connected devices.

VNAs can take up to two years to implement and can be horribly expensive. Further, since they don’t encapsulate all of a patient’s data, sites need to use them in connection with other solutions, like a picture archiving and communication system (PACS), to have a complete enterprise imaging strategy.

Cloud-based imaging, on the other hand, provides more than the seamless sharing of images. It delivers real value and efficiencies, like capturing and sharing all relevant patient data, just as the cloud allows you to access your music, videos, and playlists effortlessly across your phone, tablet, and laptop. That is why I’m perplexed that society openly welcomes this technology in our daily lives, yet accepting technology that can make life-saving differences has proved so challenging.

The time to embrace change is now. If not, I fear that we will only continue to set back an industry that so desperately needs to move forward.

Karen Holzberger is VP/GM for diagnostics at Nuance of Burlington, MA.

Readers Write: Read This Before You Sponsor Another Hackathon

February 3, 2016

By Niko Skievaski


Innovation is undoubtedly a hot topic right now in healthcare, and for good reason: it’s estimated that one-third of healthcare spending is waste, and payment models are shifting in an attempt to drive efficiency.

Technology is the obvious place to look for efficiency gains, and health systems around the country are getting creative with ways to better utilize it. We see rampant partnerships with startup accelerator programs, direct early-stage investments, innovation teams, and the advent of the “chief innovation officer,” whose primary role seems to be serving as a gating mechanism for the army of entrepreneurs trying to make an impact.

We’ve directly participated in a dozen flavors of enterprise innovation programs over the past two years. With this experience, I’d like to ask health systems to try a different sort of program: just try our products.

That’s a lot easier said than done. Your organizations weren’t designed to adopt new technology. Over the past decades, data centers were constructed to house your intranet, EHR, ERP, LIS, MIS, and a slew of other acronyms. IT departments were invented to manage the onslaught of hardware and software subsequently installed on their machines. The systems are woven together in a web of interfaces managed by graying whizzes from their cubicles.

Each new piece of technology requires budget, a new install project to be prioritized, FTE to be allocated, and expertise to be acquired. Why would any IT head want to shake up their delicate game of Jenga with new software? Especially software from an unproven startup. Especially software in the cloud.

This is poles apart from the modern, tech-savvy organization. Other industries felt market pressures and profit motives to become agile and modernize incrementally. Meanwhile, health systems felt little market pressure as costs inched up year over year.

Pressure later came from well-meaning government subsidies to adopt adequate electronic health record software, which exacerbated rather than toppled the Jenga tower. While health systems upgraded their hardware, the rest of the world moved to SaaS-based tools that eliminated the need for designated IT departments to show you where to click.

The mounting inefficiencies observed in everyday healthcare interactions could cause any millennial to quit her job and start a digital health startup attempting to bring a modern Web experience and level of service to an industry worth saving. This is the core of my request. We don’t need help starting more startups. We don’t need accelerators. We don’t need strategic investments. We need feedback.

I’m not referring to conference panels of CIOs or experienced entrepreneurs tearing startups apart. The feedback required to build an effective product comes at the front lines in the real world. It needs to get all the way into the hands of the doctors, nurses, support staff, and patients.

The technology crisis in healthcare is rooted in the lack of adoption of technology, not in the lack of technology. Similarly, your innovation won’t be in the tech you help to create — it will be in your ability to more rapidly adopt the tech that already exists.

Focus enterprise innovation efforts on decentralizing technology adoption. Figure out ways to let departments choose how to manage their work. Decentralize new technology budgets to get that decision-making process as close to the front line as possible.

The vendors will figure out ways to make it cheap enough by eliminating upfront capital and installation projects. IT should invest in infrastructure that allows modern technology to work within your facilities: fast Wi-Fi, modern browsers and devices, API layers, and easy SSO.

Don’t partner with accelerators unless you plan to allow them to outsource your technology selection process. The primary reason those companies participate is to sell to you. And don’t invest in digital health companies unless you’ve used the product. Put your money where your mouth is. Otherwise, your investment is not strategic; it’s just money.

This will also force the business development teams to work closely with clinical teams for product validation. You’re all on the same team — align incentives. You don’t need to depend on accelerators and suits with MBAs to help you figure out if a startup’s product will improve care or increase efficiency at your hospital. The front line will tell you in 10 minutes if you let them use the product.

Niko Skievaski is co-founder of Redox.

Readers Write: Dealing with the Aftermath of Hurricane ICD-10

February 3, 2016

By Michael Nissenbaum


It seems only fitting to compare the October 1, 2015 transition from ICD-9 to ICD-10 to a hurricane. Like a hurricane, we tracked the pending event well in advance. The news media were filled with stories speculating whether ICD-10 would hit as expected and what the potential impact might be.

Even as we braced for the worst, ICD-10 made landfall with a great deal of noise and fury. But according to a Porter Research and Navicure survey, 99 percent of the 360 organizations who responded said they were ready for it and survived the event itself, meaning they were able to begin using ICD-10 when the deadline hit.

Yet Hurricane ICD-10 also shares another characteristic with its physical world counterparts: the aftermath may have a longer-term effect than the event itself. So while all the cork-popping and victory laps back in October may have been well deserved, providers are realizing the forecast is not all sunshine and light tropical breezes just yet.

As you address the ICD-10 aftermath, be wary of some of the issues that may still crop up and prepare in advance to deal with them.

  1. Be prepared to set up specialized “per payer” rules. While it would be great if every payer organization were now fully converted to ICD-10, that’s still not the case. For example, some smaller workers’ compensation carriers still aren’t accepting ICD-10 codes, so providers must process their claims differently. Ideally, providers can set up special rules in their electronic health record (EHR) and/or practice management (PM) systems to automate this conversion and avoid the need to make manual changes or, even worse, submit paper claims.
  2. Make sure your team is fully trained on the changes. While there is currently a grace period for unspecified ICD-10 codes, that leeway is scheduled to come to an end within the next year. Denials will then increase if you’re not prepared. The best approach is to act as if the grace period doesn’t exist. Ensure your team is trained to submit documentation that is specific enough to support the selected ICD-10 code in the event you’re ever audited. If your users are still struggling in this area, partnering with an expert third party for training may be a worthwhile investment.
  3. Become experts on your most common codes first. We have gone from 13,000 codes in ICD-9 to 69,000 in ICD-10. That’s a lot to learn, but your team doesn’t have to master all 69,000 at once. Identify your practice’s most commonly used codes and make sure you can get them right every time. Once those are in good order you can expand the training on a prioritized basis.
  4. Make the most commonly used codes available quickly through adaptive learning. Take advantage of technologies that “remember” the codes that are used most frequently and make them readily available without a lengthy search process. This will enhance user productivity and minimize user frustration.
  5. Consider technologies that take advantage of natural language search. Another way to improve productivity under ICD-10 is help providers find specific codes faster. Natural language search allows a user to type in “chest pain,” for example, and be presented with answers that match chest pain specifically, as well as related terms such as angina and other heart-related diagnoses. This significantly reduces the time it takes for providers to search for the right level of specificity, especially when first learning new codes.
  4. Take advantage of automated correlation between ICD-9 and ICD-10. Providers that are still learning ICD-10 may benefit from technologies that allow them to type in a familiar ICD-9 code and have the system narrow the choices to a closely related ICD-10 subset. While there will not be many one-to-one relationships between codes, trimming the options can be a huge time-saver.
  7. Speed the selection process with filters. Technologies that use filters to navigate the ICD-10 coding process can also enhance productivity. These solutions deliver a step-by-step approach to drill down to the correct category (e.g., diabetes or chest pain), followed by more precise options (e.g., left or right).
  8. Make sure your team understands the importance of these changes. It’s human nature to resist change and providers have had more than their share of changes thrust upon them in the last few years. But failure to comply with ICD-10 affects reimbursements for both the practice and the individual providers. Be as encouraging as possible and keep working to ease the transition.
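To make a couple of these tips concrete, here is a minimal sketch of how a “per payer” rule (tip 1) and an adaptive, frequency-ranked code list (tips 2 and 3) might look in code. The payer name, the two-entry code map, and the class are purely illustrative assumptions, not any vendor's actual product:

```python
from collections import Counter

# Tiny illustrative ICD-10-to-ICD-9 map (hypothetical; real systems
# would use a full crosswalk such as the CMS GEM files).
ICD10_TO_ICD9 = {
    "R07.9": "786.50",  # chest pain, unspecified
    "E11.9": "250.00",  # type 2 diabetes without complications
}

# Hypothetical example of a payer still requiring ICD-9 codes.
PAYERS_STILL_ON_ICD9 = {"SmallWorkersCompCo"}

def code_for_claim(payer: str, icd10_code: str) -> str:
    """Per-payer rule: send ICD-9 only to payers that still require it."""
    if payer in PAYERS_STILL_ON_ICD9:
        return ICD10_TO_ICD9.get(icd10_code, icd10_code)
    return icd10_code

class AdaptiveCodePicker:
    """Remember which codes a practice uses most and rank them first."""
    def __init__(self):
        self.usage = Counter()

    def record(self, code: str):
        """Count each code as it is used on a claim."""
        self.usage[code] += 1

    def suggest(self, candidates):
        """Return candidates with the most frequently used codes first."""
        return sorted(candidates, key=lambda c: -self.usage[c])
```

The point of the sketch is that both behaviors are simple lookups layered on top of the coding workflow, which is why they can be automated rather than handled manually at claim time.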

While Hurricane ICD-10 may have passed through in October, there’s still work to be done. Many organizations are still suffering from productivity losses that could impact their financial success for a long time to come. If your organization is still not recovered from the ICD-10 aftermath, consider the implementation of time-saving technologies and partnerships with knowledgeable experts that can deliver the training and support you need.

Finally, it’s worth remembering that the ICD-10 implementation date was pushed back twice, which is akin to giving providers 15 days’ warning of an impending storm instead of a mere five. Take note, all you rule-making bodies, and consider how a more sensible implementation pace contributed to the relative success of the ICD-10 transition. It’s something to keep in mind the next time anyone considers cramming a new round of arduous regulations onto providers in unreasonable timeframes.

Michael Nissenbaum is president and CEO of Aprima Medical Software of Carrollton, TX.

Readers Write: The Importance of HIT Succession Planning

February 3, 2016

By Frank Myeroff


While getting ready for HIMSS 2016 Conference & Exhibition, I’ve had the opportunity to speak with many healthcare IT leaders about what’s on their priority list this year when it comes to acquiring, promoting, and retaining key HIT talent. One response that I heard over and over again was “Succession Planning.”

The HIT profession is seeing shortages of talent, making succession planning more important than ever. Having a well-developed and current strategic plan in place will help your organization prepare for the future in these vital areas:

  • Prevent vacancies when baby boomers retire. As senior HIT personnel begin to retire, including many CIOs, the industry will lose leadership, knowledge, and skills that won’t be easy to replace. Despite this advance notice, many organizations are still unprepared for their absence. They will find themselves with many vacancies, which may push them into quick, rash hiring decisions.
  • Recognize and develop future leaders. As we face a leadership shortage in HIT and in just about every industry across the board, companies must identify and foster those individuals demonstrating leadership skills and abilities through mentoring, training, and stretch assignments so they are ready to take the helm when the time comes.
  • Prevent turnover and its associated costs. Employees at all levels are less likely to leave a company that is committed to providing meaningful work and opportunities to grow. A continuous flow of engaged people with defined career paths will stop the revolving door, which can be detrimental to any firm. High turnover is quite costly: according to the Wall Street Journal, experts estimate that it costs upwards of twice an employee’s salary to find and train a replacement.
  • Maximize organizational value. Healthcare organizations with an HIT succession plan in place are more attractive. A management team with a strategy for when a key player exits protects the value, integrity, and longevity of the company. As a result, your company’s reputation stays positive and, in turn, attracts top performers.
  • Meet growing demands for high-quality, cost-effective patient care. Healthcare leaders face unprecedented pressure to meet the ambitious expectations of health reform, i.e., to reduce costs and simultaneously assure high-quality patient care. Therefore, the industry needs to better prepare its HIT professionals to manage the complex organizations that provide and finance care.
  • Guarantee the stability of business operations. HIT succession planning helps to mitigate risks and ensures business continuity. People are your greatest asset. They can also be its greatest downfall. If your company becomes overly dependent on the services of a few key individuals, it can lead to operational risks that can cause damage when one or more of those key people are no longer there.

With HIT succession planning on the minds of so many organizations right now, there are a number of ways to find the HIT talent who can ultimately step in to fill those current and future roles:

  • Hire more military technology veterans. Organizations are on a mission to find, hire, train, and accommodate US military veterans who possess the IT skills in high demand, such as cybersecurity. The military represents a large IT talent pool, even though military technology experts may not have civilian HIT certifications or experience. Savvy organizations are able to look past that when onboarding and later assist returning vets in obtaining those civilian credentials, including IT certifications. In addition, military veterans bring much more to the job, such as leadership skills, the ability to perform under pressure, teamwork, respect for procedures, and integrity.
  • Implement college-level internships. More and more organizations are moving towards creating HIT internship programs at the college level. They consider it a year-round recruiting tool which means having an ongoing pipeline of future HIT talent. In addition, interns are an inexpensive resource while at the same time are some of the most highly motivated members of the workforce. Internships.com allows a company to post a profile free of charge. This way, a company gets exposure to top colleges and candidates without breaking their budget.
  • Re-hire retirees for expertise and training. One way organizations will succession plan is to pay retirees to come back. Many IT professionals are now returning as consultants operating under one- to two-year contracts for their help and expertise. In addition, they are being asked to train and mentor promising IT professionals. These seasoned workers have experience and a tremendous amount of knowledge to share.
  • Hire and promote from within. In many cases, organizations lay out an HIT career path that they use to retain quality people. This approach fosters loyalty and also positions your company as a place that career-minded individuals want to work. If you hire or promote from within, it also helps you to retain other key people.
  • Acquire talent from outside or competitors. If an organization does not have confidence that an internal candidate is ready for the position, they may have to recruit from outside and from the competition. Hiring from a rival firm can mean bringing aboard someone who already knows your industry, your HIT initiatives, and/or can bring valuable new project knowledge.

Healthcare IT succession planning should be a part of every company’s strategic plan. It’s vital for the vision of where your company will be going in the future and how it will get there.

Most importantly, succession planning in general will shape how your organization develops and nurtures its people, assures a continuing sequence and pipeline of qualified people to move up and take over when needed, and assures that key positions will be filled with the right people able to carry your company into the future.

Frank Myeroff is president of Direct Consulting Associates of Cleveland, OH.

Readers Write: What EHR Vendors Need to Know About Implementing Minnesota’s Electronic Prior Authorization Law

January 20, 2016

By Tony Schueth


It’s January 2016 and electronic prior authorization (ePA) is now “required” by law in Minnesota. There has been surprisingly little fanfare about this deadline, and it’s my observation that most electronic health records (EHRs) and providers are not ready to comply. Here’s what EHR vendors need to know about implementing Minnesota’s law, also known as MS §62J.497.

It’s not necessarily a mandate. Minnesota really wants clinicians to do PAs electronically using standards from the National Council for Prescription Drug Programs (NCPDP), but there are no penalties for non-compliance. According to a state fact sheet:

  • “Starting January 1, 2016, prescription drug authorizations – including prior authorizations (PA) and formulary exception requests – must be exchanged electronically, using the NCPDP SCRIPT Standard version 2013101.”
  • “The law does not require prescription drug PA and/or formulary exceptions. However, for those entities subject to the law, if PA requests and responses and/or formulary exception requests and responses are exchanged, starting January 1, 2016, they must be exchanged electronically based on the NCPDP SCRIPT Standard version 2013101.”
  • “No later than January 1, 2016, drug prior authorization requests must be accessible and submitted by health care providers, and accepted by group purchasers, electronically through secure electronic transmissions. Facsimile shall not be considered electronic transmission.”

While the language is very strong, the statute doesn’t definitively say that every single PA must be done electronically.

State officials acknowledge they may be out in front of everyone else: “technological updates to enable this functionality can take time, and manual methods for prior authorization may need to be used until electronic functionality is available with all partners.” Kudos to Minnesota for showing leadership.

Should you wait?

Despite Minnesota’s lack of a true mandate, I wouldn’t recommend waiting for the regulatory axe to fall in Minnesota. The paper-, fax-, and phone-based prior authorization (PA) process is time-consuming and burdensome to physicians and expensive for payers. In contrast, ePA promises efficiencies.

It’s early in the adoption cycle, kinks need to be worked out, and implementation isn’t uniform. That said, pharmacies and prescribers ultimately will prefer ePA over current processes to help keep pace with the PA requirements associated with the increasing number of drugs used to treat the rising number of chronically ill patients. Furthermore, large integrated delivery networks will select EHRs that are compliant with the statutes and regulations in their service area. These EHRs must be able to handle transactions such as ePA regardless of site of care.

What about e-prescribing of controlled substances?

Minnesota’s 62J.497 has mandated e-prescribing for all prescriptions since 2011, and the state has some of the strongest e-prescribing adoption in the country. We have heard anecdotally that Minnesota has a goal of having all controlled substance prescriptions prescribed electronically by the end of 2016. While that appears to be just a goal, there are two aspects of controlled substance prescribing that should be kept in mind.

The first is that e-prescribing of controlled substances (EPCS) is permitted both at the federal and state level. Even so, the facts about the legality of EPCS are often surrounded by confusion. Because of this misperception and the fact that there are no penalties for non-compliance, demand by prescribers is just beginning to appear. But that is changing.

The second is interfacing with the state’s prescription monitoring program (PMP), which is up and running under the auspices of the state’s board of pharmacy. All dispensers (pharmacies or providers that dispense from their offices) licensed by the State of Minnesota must report daily all Schedule II-V controlled substance and butalbital prescriptions that were dispensed. To satisfy the reporting requirements, all EHRs should be able to interface with the PMP to provide the necessary information.
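As a rough illustration of that daily reporting step, the selection logic might look like the sketch below. All field, class, and function names are hypothetical, and a real submission would follow the PMP's specified file format rather than returning Python objects:

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class Dispensation:
    """One dispensed prescription, as recorded by the pharmacy or office."""
    drug_name: str
    dea_schedule: Optional[int]  # 2-5 for controlled substances, None otherwise
    contains_butalbital: bool
    dispensed_on: date

def reportable(rx: Dispensation, day: date) -> bool:
    """True if this dispensation belongs in the day's PMP report."""
    if rx.dispensed_on != day:
        return False
    scheduled = rx.dea_schedule is not None and 2 <= rx.dea_schedule <= 5
    return scheduled or rx.contains_butalbital

def daily_pmp_batch(dispensations: List[Dispensation],
                    day: date) -> List[Dispensation]:
    """Select the day's reportable dispensations for submission."""
    return [rx for rx in dispensations if reportable(rx, day)]
```

The key point the statute makes is the filter itself: Schedule II-V plus butalbital, reported every day, which is why the EHR needs an automated interface rather than a manual process.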

New York takes a different approach

What is interesting is the contrast between Minnesota’s electronic prescribing and ePA “mandate” and New York’s I-STOP. The spirit (and language) of the rules is very similar in both states, except, of course, that I-STOP doesn’t mention ePA. The key difference is that I-STOP articulates the penalties for non-compliance. New York has the right to impose professional misconduct penalties (including fines and possible license revocation) for non-compliance with I-STOP.

As a result, it appears that most EHR vendors with clients in New York have enabled their products to handle e-prescribing – including EPCS – and have emphasized their readiness.

I applaud both states’ efforts to lead and urge EHR vendors not to wait until the last minute to roll out products in either state. Your customers will appreciate it. Furthermore, your competitors will have solutions ready for customers who aren’t yet prescribing electronically or facilitating ePA.

Tony Schueth is CEO of Point-of-Care Partners of Coral Springs, FL.

Readers Write: Industry Trends Impacting RCM in 2016

January 6, 2016

By Patrick Hall


Higher out-of-pocket costs, new reimbursement models, and rising operating costs are just a few of the trends that will impact provider revenue cycles in 2016. These industry developments will force providers to evaluate existing RCM strategies and possibly implement new technologies and workflows to simultaneously maintain financial health and address evolving consumer and regulatory demands.

Consider some of the more significant trends and their potential impact:

Higher out-of-pocket costs for patients

As the cost of insurance continues to rise, patients are shouldering higher out-of-pocket costs for deductibles and co-insurance. As a result:

  • Consumers are paying more attention to the cost of care and requesting greater price transparency prior to receiving services.
  • Providers need efficient tools to estimate a patient’s out-of-pocket costs. This includes accurate eligibility and co-pay details, up-to-date information on a patient’s deductible status, and specifics on what services are included in a patient’s coverage.
  • Consumers may need help managing the cost of their care. Providers may require automation tools to facilitate any special payment arrangements.
  • Existing workflows may need to be altered. For example, in the past a practice may not have verified insurance information until the patient arrived in the office. Administrators may now elect to verify insurance details in advance of scheduled appointments and advise patients when a large out-of-pocket cost is anticipated.
  • Providers face greater financial risk. When patients struggle to pay for services, providers risk losing revenue and must dedicate additional resources for collection efforts.
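The out-of-pocket estimate mentioned above reduces to simple arithmetic once the eligibility details are in hand. Here is a minimal sketch; the function name and inputs are illustrative, and a real estimate would be driven by the payer's eligibility response:

```python
def estimate_patient_cost(allowed_amount: float,
                          deductible_remaining: float,
                          coinsurance_rate: float,
                          copay: float = 0.0) -> float:
    """Rough out-of-pocket estimate for a single service."""
    # Amount the patient owes toward an unmet deductible.
    deductible_portion = min(allowed_amount, deductible_remaining)
    # Coinsurance applies to what remains after the deductible.
    coinsurance_portion = (allowed_amount - deductible_portion) * coinsurance_rate
    return round(deductible_portion + coinsurance_portion + copay, 2)

# Example: $500 allowed, $200 deductible left, 20% coinsurance, $25 copay
# -> 200 + 0.2 * 300 + 25 = 285.00
```

Even a rough figure like this, produced before the visit, lets the front desk advise patients of a large expected balance and set up payment arrangements in advance.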

Overhead costs are rising, but not necessarily reimbursements

In order to preserve financial health, providers must:

  • Remain diligent in controlling costs and managing the revenue cycle.
  • Consider technologies that automate RCM processes and increase efficiencies.
  • Make sure staff is well trained in order to maximize the benefits of technologies.

Provider reimbursement models are shifting from traditional fee-for-service to models that include incentives for the efficient delivery of quality outcomes

RCM is no longer as simple as sending out a claim after a patient office visit. Instead, providers must:

  • Be proactive in managing the health of their patients.
  • Implement workflows and technologies to help track patient outcomes.
  • Improve care coordination to minimize test duplication, manage costs, and enhance outcomes.
  • Prioritize efforts to collect patient payments at the point of care or in advance of procedures.

Meaningful Use and the transition to ICD-10 are less of a priority

Meaningful Use and ICD-10 have been top priorities for providers for the past several years. Today, however, most organizations have implemented certified EMRs and achieved some degree of Meaningful Use success, as well as made the transition to ICD-10. Providers now have more time and resources to address revenue cycle needs, possible platform upgrades, and the addition of apps to increase operational efficiencies.

In recent years, RCM has taken a back seat to competing priorities, but the changing healthcare landscape is forcing providers to evaluate existing strategies and technologies. This could be the year that RCM emerges from the shadows and perhaps into the spotlight.

Patrick Hall is EVP of business development for e-MDs.

Readers Write: Why Do Digital Health Startups Need So Much VC Investment?

January 6, 2016

By Tom Furr


It is projected that by the end of 2015, $4.3 billion in funds will have been raised to bolster digital health startups. The first half of 2015 saw $2.1 billion invested in this area.

To clarify, a digital health company is one that could not exist without broadly available digital technology (like the Internet) and serves the healthcare market exclusively — pure plays. A company that publishes software and happens to have a health application in its portfolio is not considered a digital health company. Providers and partners are also not in this area – they are service organizations.

It’s been reported that digital health represents eight percent of all venture funding. However, many companies that have been in business for nearly 10 years and have raised more than $50 million are not experiencing anything approaching “Google-like growth.”

Even someone who reads business news occasionally understands that mega-startups like Uber are raising huge amounts of money to build a legal war chest to contend with suits and regulatory pressures around the world. There must be an obvious reason for the continued big raises for the Ubers out there as well as digital health companies.

So why are we seeing such robust funding for digital health outfits? Is it that digital health companies require large amounts of capital to contend with regulatory challenges akin to what Uber faces? Is it that digital health companies are started by people from healthcare who don’t really understand how to build scalable technology solutions that have great usability? Is healthcare continuing to be an industry that lags in the adoption of new technology? Are consumers, when they’re the ultimate end-user, unaware of or unimpressed with the new offerings?

What’s going on? Companies may be bulking up with funds so they can last through a long, slow adoption cycle or the twisted path to regulatory acceptance. However, here’s my take on these important questions.

1. Is it that digital health companies need large amounts of capital to contend with regulatory challenges in healthcare?

I suggest that most established, large healthcare IT companies have been built on government subsidies and regulations that force providers to purchase their products. In some cases, that mandate is the only reason the company succeeds, in stark contrast to software companies that build products customers choose because they add value, not because they are required.

Buying because you are forced to, rather than because you recognize value, has clearly built up lots of apathy in the industry toward change and innovative products. That is why healthcare is the last major industry still sending so many paper statements, compared with every other sector, including banking and financial services.

2. Is it that digital health companies are started by people from healthcare who don’t really understand how to build scalable technology solutions that have great usability?

In my examination of currently available products, none strike me as user-friendly or innovative. Even Athenahealth, arguably one of the top established healthcare IT innovators in digital health history, says it’s time for a usability upgrade across all products, including its own.

I come from the payments space, which resembles healthcare in the entrenchment of its established companies. Yet it has seen massive innovation from companies like Square, which is shaking up the status quo with a solution so easy to get up and running that even my young son can use it, validating the importance of usability.

3. Is healthcare continuing to be an industry that lags in the adoption of new technology?

I think most will agree that healthcare is at the back of the line when it comes to applying innovative technology to operations, as opposed to clinical advances like the latest heart procedure. I challenge any of the established players to say they’ve kept their back-end systems or user interfaces fresh. At least Athenahealth’s Jonathan Bush is willing to call a spade a spade, which I respect.

No established player wants to see their company become a Yahoo (which did not value contextual ads the way Google did) or an IBM (which did not value a PC operating system the way Microsoft did). Yet when life gets comfortable and profits are healthy, that is exactly when companies crush innovation.

4. Are consumers, when they’re the ultimate end user, unaware or unimpressed with the new offerings?

If you look at any number of healthcare-related portals, it appears that usability was the last thing in mind when they were created. In fact, if it were not for Meaningful Use, would any of those vendors have even created portals? This gets back to my second point: any new company is going to have to spend substantial amounts of money to educate patients on the value of its offering. In all likelihood, the same thing has to be done with provider solutions.

To be clear, I am not advocating a radical change in the way the healthcare market behaves. It is an industry whose product is positive outcomes for people with, or prone to, health issues, and that must remain its focus. But it needs to realize that there are ways to do things better, faster, and cheaper, with designs that are easy for providers and patients to use.

Let me restate my central question: why is this happening? I ask you to join the conversation with your thoughts. Getting to the core reason will help not only healthcare investors but, more importantly, anyone providing or receiving healthcare.

Tom Furr is founder and CEO of PatientPay of Durham, NC.

Readers Write: Inventing the Mid-Cycle with Patient Self-Service

December 21, 2015 Readers Write No Comments

Inventing the Mid-Cycle with Patient Self-Service
By Janie Tremlett


It’s no surprise that payment models in healthcare are transforming. What may come as a surprise, though, is how quickly these models are transforming. Healthcare is rapidly moving to incorporate measures of value into payment models, with more than two-thirds of payments expected to be based on value measurement in five years, up from just one-third today, according to a study conducted by ORC International.[1]

The study goes on to state that most of the key obstacles to this shift are technology-related, with one of the biggest problem areas being data collection, access, and analytics. The financial health of hospitals and health systems now depends on capturing and accessing accurate, current data, documenting it, and ultimately delivering good clinical outcomes.

Defining the Mid-Cycle

The mid-cycle is where revenue cycle meets clinical interactions and patient access. A value-based reimbursement system requires tighter integration of clinical records and other systems with providers’ financial systems. Today, however, a key bottleneck for many hospital revenue cycles occurs in the link with the clinical side. Identifying this bottleneck area and learning how to optimize it is critical for healthy financial performance, solid clinical performance, and for patient satisfaction and engagement. 

How can we optimize it? Start small.

Capturing Data Pre-Service

There is an opportunity to engage patients in a pre-registration workflow before they have even set foot on the premises, using a personal device such as a laptop, smartphone, or tablet. Several items go into a pre-registration workflow, including confirming or updating a patient’s demographics and insurance information, completing forms and questionnaires, and paying bills. Capturing that information beforehand pays off for anything that later needs to be sent to the patient, such as appointment reminders, billing, mail-order prescriptions, and lab results.

In addition to confirming demographic information, you have the opportunity to administer clinical screening and intake questionnaires relevant to a patient’s appointment as part of the pre-registration process. These questionnaires can surface any number of issues, such as a patient being scheduled for the wrong appointment, before they ever show up on site.

For example, a patient may be scheduled to receive a procedure at a certain location, but his or her questionnaire answers reveal that the patient uses a wheelchair and the original location is inappropriate because its spacing and equipment can’t accommodate one. Actually going to the appointment could have been a disaster: a waste of the patient’s time, a wasted time slot with expensive equipment for the hospital, and extra work to find a new appointment time and location. A questionnaire given pre-service finds and flags this in advance.
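
As a sketch of how such a pre-service check might work, the rule below flags an accessibility mismatch from questionnaire answers. The function and field names (`flag_appointment`, `uses_wheelchair`, the location list) are illustrative, not from any particular vendor's product:

```python
# Hypothetical pre-registration rule check: flag an appointment whose
# location can't accommodate a need reported in the patient's questionnaire.
ACCESSIBLE_LOCATIONS = {"Main Campus Imaging"}  # locations with wheelchair access

def flag_appointment(appointment: dict, answers: dict) -> list:
    """Return a list of issues to resolve before the visit."""
    issues = []
    if answers.get("uses_wheelchair") and appointment["location"] not in ACCESSIBLE_LOCATIONS:
        issues.append(f"Location '{appointment['location']}' is not wheelchair accessible")
    return issues

# A mismatch is caught days before the visit, while rescheduling is still easy.
issues = flag_appointment(
    {"location": "Satellite Clinic B", "procedure": "MRI"},
    {"uses_wheelchair": True},
)
print(issues)
```

Each additional questionnaire answer becomes another cheap rule evaluated long before the patient arrives, which is the whole point of moving intake pre-service.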

Facilitating Data Capture On-site

Just like the airline industry, where travelers can check in for a flight at an airport kiosk, patients can do the same at a hospital or health system. The big advantage of an on-site registration and check-in solution is that healthcare facilities can capture data on patients whom they aren’t expecting, like walk-ins, and thus can’t ask to complete a pre-registration workflow.

Instead of registering and checking in face to face with a member of the hospital staff, patients can use kiosks (free-standing, wall-mounted, table-top, or tablet) designed for a quick two-minute interaction. They’re an effective way to identify patients on-site, give them questionnaires, take them through relevant workflows, and triage them. Even the most basic question, “Are you here for a scheduled appointment or are you here as a walk-in?,” can allow healthcare facilities to optimize their patient flow.

If you take it a step further and ask questions like, “What are your symptoms? What is your pain level?” healthcare facilities have the opportunity to prioritize patients and get them to the right place in a timely fashion. Kiosks also can be used to educate or inform patients. For example, if healthcare facilities want to encourage their patient population to get flu shots or to think about getting tested for a certain disease, they could display notifications or reminders on these kiosks.

Automating Clinical Intake Documentation on the Front-End

There is a lot of clinical intake documentation that we can pull out of the clinician workflow because it amounts to simply interviewing patients. We can hand those parts to the patient to complete electronically, then feed the answers directly into the EMR. We call this concept of patient-directed digital questionnaires a “virtual clipboard”: a tablet or kiosk presenting the same questionnaires patients would have been given on paper, now automated.

The virtual clipboard is a practical, low-cost way to save time and start providing relief to clinicians during their clinical workflow. Specific areas that can be automated using a virtual clipboard include:

  • History of present illness
  • Medication reconciliation
  • Chief complaint
  • HIV, drug, alcohol screening
  • Behavioral and mental health screening
  • Antibiotic over-prescription screening
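
A minimal sketch of the virtual clipboard idea: patient-entered answers are mapped into a structured intake record that could then be handed to an EMR interface (for example, via HL7 or FHIR). All names here (`build_intake_record`, the section keys and labels) are hypothetical:

```python
# Hypothetical mapping of virtual-clipboard answers to a structured intake
# record; in practice this record would feed an EMR integration layer.
INTAKE_SECTIONS = {
    "chief_complaint": "Chief complaint",
    "hpi": "History of present illness",
    "medications": "Medication reconciliation",
}

def build_intake_record(patient_id: str, answers: dict) -> dict:
    """Collect only the sections the patient actually completed."""
    record = {"patient_id": patient_id, "sections": {}}
    for key, label in INTAKE_SECTIONS.items():
        if key in answers:
            record["sections"][label] = answers[key]
    return record

record = build_intake_record("MRN-001", {
    "chief_complaint": "Persistent cough",
    "medications": ["lisinopril 10 mg daily"],
})
print(record)
```

The clinician then reviews structured data rather than transcribing a paper form, which is where the time savings come from.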

Identifying and Optimizing the Mid-Cycle

These improvements are certainly within reach. By taking a small step in extending patients the ability to enter their own data, healthcare systems can strengthen their documentation initiatives, which will ultimately optimize their revenue cycle and bolster their bottom line.

[1] The 2014 State of Value-Based Reimbursement, ORC International, 2014.

Janie Tremlett is GM of patient solutions at Vecna Technologies of Cambridge, MA.

Readers Write: Inefficiencies Lost, Productivity Gained: Healthcare Communication Systems the Key

December 21, 2015 Readers Write No Comments

Inefficiencies Lost, Productivity Gained: Healthcare Communication Systems the Key
By Lindy Benton


Healthcare providers continue to seek a unified electronic view of a patient’s health data, one that is comprehensive and fully accessible across the enterprise. Considered the health IT holy grail for many hospitals, its benefits are far-reaching. A unified platform for communication and data collection facilitates better collaboration among care teams, increased productivity, and improved performance across the healthcare continuum.

Progress toward this objective has been a challenge for providers because of the lack of effective methods to capture and manage unstructured data that typically resides outside the EHR. Hospitals struggle to integrate information needed from ancillary systems as well as from disparate sources, such as paper files and even verbal exchanges. Only when data from these sources are aggregated and accessible through a single repository can a truly comprehensive view of the patient’s path across the spectrum be achieved.

Recent advancements in health information exchange and integration, however, have positioned the industry closer than ever to meeting this goal. With new options for solutions that facilitate the secure exchange of health information and management of healthcare communication, providers are within reach of a single integrated platform to view all patient data.

Providers can now document all interactions – phone calls, faxes, web visits, medical records, and even face-to-face conversations – tie them to the patient record, and centrally store them for viewing, processing and retrieval. By combining the capture and management of all communication types, hospitals are able to close gaps in documentation processes and create for themselves a complete view of patient information available and exchangeable across the organization.

Taking such steps has implications not only at the point of care, but also in the revenue cycle. As hospitals continue to invest countless resources to ensure full and accurate reimbursement for services, there are ever-present nuances and variables affecting this process. Documentation of the hospital’s communication surrounding payment, therefore, can protect the investment being made to secure these hard-earned dollars.

Of course, this is just one example of many improvements realized from a unified view of patient data. The following are additional opportunities for health systems to better leverage health information and communication management systems for improved performance, workflow and quality outcomes.

Financial Performance

Performance in denials remains an area of concern for many hospitals. Nearly two-thirds of the errors leading to initial denials originate in patient access departments, in issues associated with eligibility, authorization, or demographic information. Systems that document hospital efforts to secure authorization (verbal, fax, and electronic) can be leveraged to prevent and overturn denials, shorten appeals, and reduce the cost to collect. Documentation of the agreed-upon level of care also helps hospitals avoid retro denials for lack of medical necessity, translating into countless thousands in savings for the hospital.

Physician and Staff Alignment

Lost physician orders cause frustration among physicians, increased wait times for patients, and bottlenecks throughout the hospital. Routing fax and electronic exchanges through a central platform allows providers to receive and manage all orders in one location. In so doing, the hospital improves workflow, process times, and service to both physicians and patients. For example, with fax and electronic orders in a single location – searchable by patient and available enterprise-wide – physicians and staff can confirm in advance that orders are complete and accurate for all patients prior to service. Whenever and wherever the patient arrives, staff are able to immediately locate and process the order, reducing delays and cancellations that can result from missing orders.

Centralizing fax and electronic communication also gives providers the opportunity to reduce costs and risks associated with standalone fax machines. By converting to an electronic process, providers gain the benefit of a digital audit trail of individuals who have accessed each record, reducing the risk of a HIPAA violation caused by unauthorized access to paper files. Converting to an electronic fax process can also reduce document delivery costs such as maintenance and paper by up to 90 percent.

Patient Experience

Studies have shown that more than 50 percent of patients say that good communication is the primary reason they chose a hospital or clinic. Similarly, a patient’s rating of provider communication skills has been shown to be the strongest predictor of overall HCAHPS scores. Creating a positive patient experience means managing the hospital’s message from the first point of contact to the last. With systems available to capture and centralize all patient encounters – phone, electronic and in-person – providers can review interactions to improve quality, conduct service recovery, and reinforce communication best practices across departments for a better overall patient experience.


Hospitals need convenient access to patient records while ensuring that protected health information remains secure. Secure sharing of records between systems and team members can eliminate time-wasting and error-prone processes. A central point of access to patient data reduces duplication, rework, and back-and-forth between departments.

Patient Safety and Quality

Movement toward a value-based delivery model has placed even greater emphasis on care coordination. Systems that streamline the process of getting the right information to the right people mean faster response times, better care transitions, and possibly improved continuity of care. With quality assurance programs to ensure compliance with hospital policies and procedures, providers can better protect patient safety and promote better outcomes.

Hospitals can now close several gaps in documentation through an enterprise-accessible patient record. Real-time, seamless access to critical patient information fosters an environment for better care outcomes and improved revenue cycle performance. There are clear benefits to aggregating communication and capture systems and pairing them with the electronic health record to tell the patient’s full story from the moment of entry to the time of exit.

Lindy Benton is president and CEO of MEA|NEA|TWSG of Norcross, GA.

Readers Write: 501(r) — Are You Ready for This?

December 2, 2015 Readers Write 1 Comment

501(r) — Are You Ready for This?
By Jonathan Wiik


Last time I checked, hospitals have a lot on their plates. Remember October? ICD-10 ring any bells?

In case you haven’t heard, another new set of regulations — under section 501(r) of the Affordable Care Act — is set to take effect in 2016 for all 501(c)(3) non-profit hospitals. The implications: comply or lose your tax-exempt status.

It’s a hard truth, but the healthcare industry faces more regulations than nuclear power (look it up). These new regulations are far from straightforward. Compliance with 501(r) can be incredibly complex, and the entire process can take anywhere from several months to a year, depending on how smart your people are.

Not to mention expensive. Staff, signage, documentation, training, etc. are all crucial elements of effective 501(r) compliance. What’s more, you may need to hire a new employee or two just to manage the task.

In a nutshell, 501(r) requires that you satisfy new regulations around CHNA, FAP, ECTP, AGB, and EAC. (Go ahead and look those up after you read up on nuclear power, or just read on.)

Here’s what you need to know about 501(r):

  • Congress passed the Patient Protection and Affordable Care Act (PPACA) in 2010.
  • Prior to this, some non-profit hospitals had been engaging in aggressive billing and collections efforts that brought their “charitable” status into question.
  • This led to the enactment of section 501(r), which requires non-profit hospitals to demonstrate the benefits they provide to their patients and community from a financial standpoint.

As part of 501(r), non-profit hospitals must now meet four specific requirements in order to maintain their tax-exempt status:

  1. Conduct a periodic community health needs assessment (CHNA).
  2. Provide written financial assistance and emergency care policies (FAP, ECTP).
  3. Establish limitations on charges for emergency or medically necessary care (amounts generally billed or AGB).
  4. Set policies and procedures related to billing and collection activities (extraordinary collection actions, or EAC).
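
For requirement 3, the 501(r) regulations describe a "look-back" method for computing AGB: divide the amounts allowed by Medicare and/or private insurers on claims over a prior 12-month period by the gross charges for those same claims, then apply that percentage to cap what a FAP-eligible patient may be charged. A simplified sketch with purely illustrative figures (not legal guidance):

```python
# Simplified 501(r) "look-back" AGB calculation: the AGB percentage is
# total allowed amounts divided by gross charges for the same claims.
def agb_percentage(allowed_amounts: list, gross_charges: list) -> float:
    """AGB % over a prior 12-month claims window."""
    return sum(allowed_amounts) / sum(gross_charges)

def agb_cap(gross_charge: float, pct: float) -> float:
    """Most a FAP-eligible patient may be charged for the care."""
    return gross_charge * pct

# Illustrative: $750K allowed on $1.5M of gross charges gives a 50% AGB rate,
# so a $12,000 gross charge is capped at $6,000 for a FAP-eligible patient.
pct = agb_percentage([400_000.0, 350_000.0], [1_000_000.0, 500_000.0])
print(round(agb_cap(12_000.0, pct), 2))
```

The real rule has choices (which payers to include, annual recalculation timing) that your finance and legal teams need to make, but the arithmetic itself is this simple.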

There are three basic approaches you can take when it comes to compliance:

  1. Ignore it, do nothing, and assume that you’ll handle it when something happens.
  2. Check with the experts in your organization to see where you stand.
  3. Take a proactive approach: put a team together, perform an assessment, and establish an action plan. (Hint: choose option 3 if you want to bolster your charity status, prevent poor patient experiences, ensure your tax-exempt status, and maybe even reduce future expenditures.)

Here’s how to get 501(r) right:

  • Measure the pros and cons, risks and rewards of tax-exempt status against the costs of 501(r) compliance. Is the juice worth the squeeze? Personally, I think it is for a variety of reasons, but it’s still helpful to understand what you’re in for.
  • Document, document, document. Proper documentation is a crucial requirement of 501(r), but it can also be used to show that you’re making a good-faith effort to comply with the rest of the requirements.
  • CHNA. This may actually help support your strategic plan. Are the programs offered by your hospital meeting most of the needs of your community? Are all your resources in sync with the community? Has community wellness, and health in general, gotten better, gotten worse, or stayed the same?
  • Reputation insulation. Compliance can actually help you avoid negative patient experiences and minimize bad press. Beyond the “worth” of your non-profit status in the community, the fines, in hard and soft costs, can pile up quickly.
  • Use third-party data to presumptively determine eligibility for FAP. 501(r) clearly permits the use of presumptive eligibility, which enables you to assess a patient’s financial health early in the revenue cycle. By streamlining this process with third-party data, you can realize increased or accelerated cash flow as well as save time and money by converting manual workflows into automated processes.
  • Documented standard work. The use of third-party data can help facilitate a consistent (unbiased) and efficient method for identifying which patients are eligible for financial assistance, effectively taking the guesswork out of the equation. Additionally, 501(r) requires that you thoroughly screen patients for eligibility before sending them to collections or initiating extraordinary collections actions. Again, by using third-party data, you can identify which self-pay accounts can be pursued for collections and which accounts can be presumptively qualified for charity care. This allows for an accelerated segmentation of aged self-pay accounts into payment, charity, and bad debt buckets.
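
One way the segmentation described above might look in practice, assuming a third-party propensity-to-pay score and a presumptive-charity flag on each account; the thresholds and field names are purely illustrative, not drawn from the 501(r) rules or any vendor:

```python
# Hypothetical segmentation of aged self-pay accounts into payment,
# charity, and bad-debt buckets using third-party data; the 0.6 threshold
# and field names are illustrative only.
def segment_account(account: dict) -> str:
    if account.get("presumptive_charity"):
        return "charity"        # presumptively FAP-eligible; never send to collections
    if account.get("propensity_to_pay", 0.0) >= 0.6:
        return "payment"        # worth pursuing for payment
    return "bad_debt"           # low likelihood of payment

accounts = [
    {"id": "A1", "propensity_to_pay": 0.8},
    {"id": "A2", "presumptive_charity": True},
    {"id": "A3", "propensity_to_pay": 0.2},
]
buckets = {a["id"]: segment_account(a) for a in accounts}
print(buckets)
```

A consistent, rule-based pass like this is what makes the screening "documented standard work": every account goes through the same unbiased test before any extraordinary collection action is even considered.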

At the end of the day, it’s important to evaluate your patient mix and adjust policies to fit any changes, as well as track and measure your results. Be sure to establish measurable goals to ensure that your FAP-reporting processes meet 501(r) standards and match your patient population mix. Setting a specific number of goals will keep the focus on high-priority tasks, ensuring that your processes can be measured more effectively. Are you ready for this?

Jonathan Wiik is principal, revenue cycle management, with TransUnion of Chicago, IL.

Readers Write: Clinical Decision Support: Are We Ready to Invest?

December 2, 2015 Readers Write 1 Comment

Clinical Decision Support: Are We Ready to Invest?
By Jaffer Traish


Sometimes great ideas are just ahead of their time. Microsoft launched a smart watch in 2004. Digital currency received $100 million in venture funding, but collapsed in the dotcom era. Google Glass has come and gone – or has it?

Evidence-based medicine and the marriage with technology is another open playground. Opportunities abound to create interactive, engaging clinician workflows to support real-time decision-making and enhance not just clinical outcomes, but the patient experience and revenue integrity.

The Hearst Corporation’s portfolio includes efforts to improve real-time medication decision support, keep order sets and care plans current, and drive care team and care transition communication. Wolters Kluwer and Elsevier are working on the same areas in their respective product portfolios. CMS value-based purchasing and other HITECH Act incentives provide some soft carrots to push things forward.

EHR vendors also provide significant clinical content (sometimes including specialties as well) that provide a very practical head start, though with no assessment of evidentiary integrity. Some startups like Stanson Health are also tackling niche areas of decision support.

The meta-analysis, categorization, and dissemination of evidentiary information is not a hard science. Teams of clinicians and coders together can review hundreds of articles and publish findings relatively quickly. Most healthcare systems have enterprise subscriptions to evidentiary libraries to consume these findings. Even as there is disagreement among communities over studies and trials, that very disagreement is the impetus for further study.

Some EHR vendors support communities of clinicians coming together to bridge the gaps in knowledge and best practice findings, especially in pediatric care.

Healthcare systems aren’t software development shops. Most don’t build teams to tackle this opportunity. Instead, they hire analysts to manually manage the change (painful and expensive). The evidence subscription vendors have been trying, but they aren’t the EHR experts, and their integration approach has been flawed. Groups like OpenCDS are refreshing and bring attention to standards development and process, though they are still ahead of their time. Last but not least, implementations, rollouts, ICD-10, and other priorities have taken the spotlight.

Clinicians are adjusting to their systems. Are they ready to collaborate in a focused way on, say, their 200 order sets with evidentiary depth?

EHRs are maturing their decision support tools. Are they ready to participate more fully in sharing public specifications for standard decision support ingestion?

Evidence vendors have grown revenue streams on non-integrated IT tools. Are they willing to wipe the slate and start fresh with new API models?

Revenue cycle teams have been focused on SBO models, centralization, and patient satisfaction, but there is a strong link to revenue integrity with the reduction of unnecessary tests and improved standards of care. Is the CFO ready to demand this value?

In talking with many CXOs, some truly want to insource this activity, while others would prefer to pay to have content and evidence managed externally and reap the lessons and value from others. Both models could prove effective. Today, the costs are high-dollar subscriptions. Perhaps these costs should be part of a risk strategy, not paid out without a successful implementation.

Today, if an African American would have better outcomes with a different antibiotic, the clinician should have this information at his/her fingertips in the workflow.

Today, if a drug is removed from the market, it should be removed from the clinician’s selection swiftly, without much manual intervention.

Today, if several major children’s hospitals wanted to jump online and compare their pediatric specialty order sets, they should be able to do so with ease and share the best.

Today, if there are 500 opportunities to improve clinical content with evidence supported changes within an organization, the CFO should know what the patient outcomes and related costs/savings may be.

The list goes on, and we can do all of this today – manually.

The challenge is not dissimilar from the interoperability debate. Just as we need a national patient identifier, adopted patient security measures, and implementation cost-sharing that includes practices, hospitals, patients and providers, the same theme can be found here. We need public specifications through collaboration, a change in the way evidentiary information is so proprietary today and closer partnerships with innovation teams.

Organizations each pay $50K to $1.2M for decision support systems today out of existing budgets. Various market analyses project decision support to be a $550M market by 2018, and upwards of $2B in the future. Let us demand more for our healthcare dollars.

Jaffer Traish is VP of the Epic practice at Culbert Healthcare Solutions of Woburn, MA.

Readers Write: Eight IT Talent Trends to Watch for 2016

November 25, 2015 Readers Write 1 Comment

Eight IT Talent Trends to Watch for 2016
By Frank Myeroff


What’s in store for the New Year when it comes to IT talent? Here are eight talent trends that are shaping the IT workforce in 2016.

  1. Internet of Things (IoT). Talk about a technology revolution! IoT is emerging as the next technology mega-trend across the business spectrum. This means a job boom for developers, coders, and hardware professionals. However, organizations hiring for IoT roles want candidates with specific technology skill sets and experience. Consequently, an IoT talent shortage is expected.
  2. New C-level title. Chief privacy officer (CPO) is a senior-level executive title and position that was created as a result of consumer concerns over the use of personal information, including medical data and financial information. Organizations have had to rethink IT security due to recent breaches. According to InfoWorld, while most organizations already have a CSO (chief security officer) and/or a CISO (chief information security officer), there’s a need for a CPO, a dedicated privacy advocate responsible for keeping personal information safe.
  3. Gen Z will enter the workforce in greater numbers in May. Generation Z, those born between 1994 and 2004 (although there’s no general agreement on exact years), is the most digitally connected generation yet. They have no concept of life before the Internet, mobile devices, digital games, or iTunes. They are tech savvy and even more entrepreneurial than Millennials. They will choose career opportunities that provide quick advancement and work-life balance over salary, and they want mentors to help them achieve their goals.
  4. Big data becomes even bigger data. Big data is increasing the need for a new breed of engineers who specialize in massive databases. While the skills required aren’t necessarily new, there is a significant amount of knowledge needed in the areas of math and scientific analysis. Typical high-level skills expected for a position in this field include data analysis, data warehousing, data transformation, and data collection.
  5. Longer hiring process continues. According to the Wall Street Journal, the time it takes to fill a job in the US is lengthening. In April 2015, the average job was vacant for 27.3 days before being filled, nearly double the 15.3 days it took prior to 2009. The long hiring process can be attributed to fewer qualified candidates per job opening as well as the increased number of background screenings and drug tests ordered. The WSJ also notes that the many portals and databases used to source and find candidates have become more involved. While better hires are coming out of the process, it moves slowly.
  6. Hybrid IT talent in demand. The IT hybrid employee is on the rise. They are considered a generalist and a specialist all in one. A generalist tends to be someone who knows quite a few technologies, but only at an average level. A specialist knows only one or two, but at an expert level. A hybrid knows about a great many things at an advanced level and can adapt to any type of project. With a hybrid employee, employers are basically getting two people in one.
  7. Project work and consultant roles are abundant. Project work and consulting roles are most likely to remain abundant through 2016 and beyond. Increasing business demands are prompting many companies to invest in new technologies, along with upgrades and migration projects around tools such as enterprise resource planning (ERP) systems. Candidates who have knowledge of both new and legacy business systems are highly sought after by employers.
  8. Hottest industries hiring IT. The following industries are the top industries that will be hiring more IT professionals in 2016: healthcare, financial services, managed services, mobile technologies, telecommunications, and hospitality.

Frank Myeroff is president of Direct Consulting Associates of Cleveland, OH.

Readers Write: Sitting In the Shopping Cart: IT Tips for RSNA 2015

November 25, 2015 Readers Write No Comments

Sitting in the Shopping Cart: IT Tips for RSNA 2015
By Michael J. Cannavo


Most IT and C-suite people are about as excited about going to RSNA as a child is about going to the grocery store with mom. They hope mom buys them some candy to make the trip worthwhile, but often have no choice but to sit in the cart and watch as items are piled in.

That doesn’t need to be the case at RSNA, and shouldn’t be. IT folks and the C-suite have a responsibility to make sure the products and services being purchased make sense from a technical, operational, and financial standpoint. Following these tips should help make the trip more productive and lead to a better overall solution for the facility.

  1. Ask pointed, directed questions. Don’t be shy. Have questions ready that you will ask of all vendors and that require more than a simple yes or no answer. Ask how they do it, not just whether they do it.
  2. Be consistent. Apples to apples is key, with each vendor getting asked the same questions. If you uncover something that may require further elaboration, go back and ask the others the same question.
  3. Lead, don’t follow. It is very easy for a vendor to take you down the path that best showcases their products, but that may not necessarily be the one that best meets your needs. The Yellow Brick Road was good for Dorothy, but isn’t for you. Take control of the discussion.
  4. Interoperability. One of the biggest buzzwords in IT today is interoperability. Don’t just ask where a vendor has connected to an EHR. Find out where and how they have done it and who you can talk to there about it. What resources were required (internal and external as well as financial)? How much time did it take?
  5. Support. Does the vendor provide a data dashboard or allow you to integrate with one? How much support can you provide internally, and what can and can’t you have access to? These are crucial questions.
  6. Facts, not fiction. Where have you done it with an EHR like we have in place? Don’t fall for a simple “yes, we can.” Pretend you are from Missouri, the Show Me state. Who can I talk to who has done it?
  7. Talk to engineers. If you want the unfiltered truth, talk with a systems engineer. They are easy to spot — the wrinkled shirt that just came out of the Walmart bag and the loose 1980s vintage tie they borrowed from their dad. They are also the ones who talk nonstop about anything and everything <laugh>.
  8. Bail on the demo. RSNA is the absolute worst place to get a full product demo unless you just want a quick and dirty overview. Do the demo at your facility, where you can examine the product in detail, walk it through its paces, and ask the questions to get the answers you want and need.
  9. Get contacts. Your IT counterparts are the best source of information. Get names, phone numbers, and e-mails of those who are similar to you.
  10. Relax. Consider this a first date, not an “I do” situation. Don’t hesitate to cut your losses early. Trust your gut. If it doesn’t feel right, it usually isn’t.

Michael J. Cannavo, aka The PACSMan, is owner of Image Management Consultants of Winter Springs, FL.

Dr. Herzenstube Goes to AMIA–Monday

November 17, 2015 Readers Write 1 Comment

Dr. Herzenstube is a practicing family physician who can make nothing of it.

The first session I attended today was a panel on ICD-11 given by representatives of WHO, IHTSDO, and academic organizations involved in developing ICD-11. ICD-11 will be the next version of ICD. The general idea behind it is to harmonize ICD with SNOMED to facilitate the use of SNOMED’s polyhierarchy while retaining ICD’s capability to meet the needs of epidemiologic analysis.

Bedirhan Ustun, a psychiatrist who manages terminology work for the WHO, was the first presenter. He explained that, unlike prior versions of ICD, ICD-11 will have an explicit content model. This means that each ICD-11 code will have underlying definitional modeling (as do SNOMED concepts). The work to build this has been initiated in collaboration with IHTSDO. 

Jim Case of NLM and IHTSDO came next and explained that the goal of ICD-11 is to link SNOMED CT and ICD so that data can be captured once at the point of care, avoiding the need for duplicate coding effort. He explained one important point about ICD: as a classification system, its categories are mutually exclusive. This is important to support the use cases of epidemiology and statistics, and explains why “other” codes are needed in ICD (something that never really made sense to me until now).

Chris Chute followed with a discussion of the SNOMED-ICD common ontology, which will provide the semantic anchoring of ICD-11. Jim Campbell from the University of Nebraska discussed some of the areas where the SNOMED CT and ICD-11 hierarchies are at odds and need to be reconciled, and Harold Solbrig discussed the process of building the links between ICD and SNOMED, either through equivalence maps (A = B) or by expressing an ICD code as a compositional SNOMED statement, along with automated testing for potential disconnects in the respective hierarchies.

This panel provided a really helpful degree of clarity on ICD-11 from the people at the very center of building it. It will likely be years before this gets used in the US, but it is good to have a sense of where things may be heading.

I also attended a presentation on the Clinical Quality Framework (CQF), an effort to harmonize standards for clinical decision support with those for quality measurement (nope, they’re not already harmonized; yep, they definitely should have been from the beginning; hindsight is 20/20, etc.).

Dr. Julia Skapik from ONC kicked off the presentation by describing a bit of the regulatory context around clinical quality measurement and clinical decision support and the need for a unified way of representing the underlying logic that expresses the standard of care involved. The holy grail toward which this work strives is that, if a provider organization configures its system to measure quality using a particular quality measure, it can enable clinical decision support functionality based on the same underlying logic without any additional logic editing work.

Marc Hadley from MITRE described current standards for CQM and CDS and the output of ongoing work under the umbrella of CQF to harmonize them. One such output is Clinical Quality Language (CQL), which has been issued as an HL7 draft standard for trial use (DSTU). CQL is a human-readable way of expressing clinical rules that is also machine-computable (via an XML-based machine representation) and agnostic to data model.

In addition, Quality Improvement and Clinical Knowledge (QUICK) has been developed as a data model for use along with CQL, automatically derived from FHIR Base Resources and FHIR Quality Improvement Core (QICore) profiles. Kensaku Kawamoto described several pilots using data artifacts based on these standards, which were able to represent rules for things like chlamydia screening and routine immunization. Tom Oniki discussed the Clinical Information Modeling Initiative (CIMI), a community of interest that has become an HL7 working group. While this work is not yet ready for prime time, the amount of progress that has been made is really impressive and the momentum seems substantial. The large lecture hall was filled to capacity, an indication of how vital the need is for a solution to this thorny problem.

The first session of the afternoon I attended was on ACOs, moderated by Gil Kuperman of New York Presbyterian. David Bates of Brigham and Women’s Hospital discussed the use of claims data to identify patients at high risk for hospitalization, who then get an assigned care manager. They have seen a significant reduction in hospitalizations in this population since starting their work.

The most interesting part of his presentation, to me at least, was the use of what he calls Standardized Clinical Assessment And Management Plans (SCAMPs). Basically, SCAMPs consist of a small set of data elements clinicians are asked to document in particular clinical situations: for distal radial fractures, for example, a few details on the fracture type and whether or not the fracture was treated surgically. After a few weeks of data collection, the data is shared with the physicians and collection continues.

What he found was that the practice patterns at the start were highly divergent from one physician to another. After sharing the data, the variances all but disappeared without any attempt to coerce or persuade any of them to change their practice patterns. A remarkable example of the Hawthorne effect. 

David Dorr from OHSU described the state of Oregon’s experiments with developing approaches to coordinate healthcare for vulnerable populations. His research involves figuring out how to help medical practices perform medical home-related activities such as establishing care management plans, ensuring close follow-up from hospitalizations, and doing clinical quality measurement. While he and his colleagues have developed a population management tool, they have observed something that most practicing clinicians will be familiar with — clinicians need point-of-care reminders, care management workflow tools, etc. within the same system they use to manage other patient information (within the EHR, in other words).

David Kaelber of MetroHealth spoke of some of the real-world challenges of meeting payors’ rules around ACO payments, including the fact that different payors often have slightly different requirements around data collection, population definitions, and quality measurement, requiring duplicate work for what amounts to very similar quality measurements. 

David Bates described his work at NYP with the Delivery System Reform Incentive Payment (DSRIP) program, an ACO-like program operated by the New York State Medicaid program. NYP’s programs include everything from patient navigation services in the ED to an HIV chronic care program to a program to deliver palliative care. They did a formal analysis of IT requirements, such as the ability to trigger notifications when key events occur, like a patient being hospitalized or new patient status values in their EHR. Among the lessons learned were that not all of the information flow can be EHR-based since many of the providers they are collaborating with don’t have EHRs.

One of the other highlights of the day was the poster session. The posters were fairly varied, and as is typical for any scientific conference, a bit hit or miss. One that I found amazing was by Matthew Rioth and Jeremy Warner, two physicians at Vanderbilt, titled “Visualizing High Dimensional Clinical and Tumor Genotyping Data.” When understanding data requires looking at it and two dimensions just aren’t enough, innovative data visualization is necessary. While the examples they provided were primarily research-focused, such as generating new hypotheses regarding what genes are important in cancer behavior, some applied directly to clinical practice, like one that showed patterns of ordering of molecular profiling tests across multiple clinics in their organization.

As with earlier days of the conference, the accidental conversations with other attendees were as valuable as the presentations. One memorable such encounter was with Lisa, an epidemiologist working in a reproductive health program at a state health department. She is becoming an informaticist by necessity since to support her research, she needs to figure out how to get more and better data from the clinical practices that her team funds.

To get data to the health department, these clinics currently either complete paper forms (!) or enter data manually through a Web-based portal. A few clinics have set up data entry forms within their EHRs to capture the necessary data, but it still requires duplicate data entry since these forms can’t pull in data from elsewhere in the patient record. So if the patient has been screened for chlamydia, even if that data is in the EHR, it needs to be entered a second time into the data element that will be sent to the health department.

It was a sobering moment, amidst the promise of future progress all around us at AMIA, to realize how pedestrian the current state is in so many ways. It also drove home to me the ever-increasing burden we’re putting on practicing clinicians to engage in data-entry activities that, while they may serve a noble goal, make it harder and harder to focus on the immediate needs of the patient in front of them.

Readers Write: Health Data Security – Who Do You Trust?

November 16, 2015 Readers Write No Comments

Health Data Security – Who Do You Trust?
By Jeff Thomas, MS, CISSP


I don’t know about you, but I certainly don’t want to be associated with the next health data breach in the headlines. But we all likely rely on outside vendors for a variety of services and products, entrusting them with data and information. A recent report by Gartner Inc., “Trust and Resilience – the Future of Digital Business Risk,” lays out the stark reality: “malicious actors and increasing complexity create systemic threats to trust and resilience.”

Like the old 1950s game show “Who Do You Trust?,” choosing a vendor can feel like a guessing game. Care to roll the dice? Use that old dartboard?

Say you’re looking at new SaaS applications, mission-critical stuff. Naturally vendors are going to tell you that your data is safe with them. That’s what you want. But how can you tell if they are telling you the truth or not? Is there some “truthiness” going on? How can you tell those that are competent from those that are not?

Gartner predicts that IT spending on security and risk will double in the next five to 10 years, going from about 15 percent of overall IT spending to 30 percent. That’s huge. You’ve got to wonder – is your vendor keeping pace with their security needs or are they perhaps cutting a few corners, exposing your data to risk to save a buck?

You’re going to need some help. An important tool to get an insider’s view is a third-party audit report. Has your potential vendor had their data security procedures audited?

Everyone claims to be “HIPAA compliant.” But that gives you no real assurance that your vendor truly knows data security. Let’s look at one of the most widely-used and rigorous audits available, the SOC 2 Type II.

The SOC (Service Organization Controls) series of reports is governed by the American Institute of Certified Public Accountants (AICPA). These reports are designed to build trust and confidence between service organizations that operate information systems and their customers by having service delivery processes evaluated by an independent auditing organization.

The SOC 2 is relevant for companies handling sensitive data, as it reviews controls related to AICPA’s trust principles of Security, Availability, Processing Integrity, Confidentiality, and Privacy. (Controls may range from technical measures to manual processes.) If those areas are of interest to you when choosing a vendor, reviewing their report is something you will likely wish to do.

A common question I hear is: if a SOC 2 is good, isn’t a SOC 1 better? In reality, it’s an apples-to-oranges comparison. SOC 1 revolves around financial reporting and is often used as part of Sarbanes-Oxley compliance. If you’re selecting a vendor to handle your sensitive patient data, it’s not the right fit.

Or how about a SOC 3? A SOC 3 report is a summary report that does not have the detail of a SOC 2 report. It is generally used as a marketing tool, whereas the SOC 2 is a restricted document. If you want to see what controls are in place and how these controls are tested, the SOC 2 report is what you will want to read. To do so, you’ll likely need to sign a non-disclosure agreement.

So you’ve signed the vendor’s NDA and have the report. Now what?

If you’re comparing vendors, it’s important to know that not all SOC 2 reports are the same. For starters, the biggest difference is that there are two types: a Type I and a Type II. A Type I reviews the vendor’s system and the suitability and design of the controls in place. Think of it as a point-in-time review indicating that the design of the controls was deemed to be reasonable on a specific day. A Type II goes further and tests the operating effectiveness of the controls over a period of time. Accepted testing periods range from six to 12 months.

Once you have the report, what should you look for? First, there will be a summary, in which the auditor describes the scope of the engagement and gives their opinion of the controls audited. This is a good place to see if there are any overall concerns.

Another section will be the vendor’s description of their controls. This will be a lengthy description of all the controls in place to meet the SOC 2 principles. After this, you will find a description of the tests for each control and the results of each test. This will map each of the vendor’s controls to the different SOC criteria and list the test performed and whether any exceptions were noted. Ideally, you will find controls that meet your needs, along with a report of the tests finding “no exceptions noted.”

A SOC 2 report, especially the Type II, will not be a quick read. The time spent reading it will give you good insight into what measures a vendor uses to protect and process your data. The best part is that you don’t have to take their word for it—it’s coming from a trusted third party.

Don’t roll the dice or use darts when it comes to security. Insist on an industry-accepted, third-party audit or attestation. In this day and age of increasing digital business risk, you’ll be glad you did.

Jeff Thomas, MS, CISSP is chief technology officer of Forward Health Group of Madison, WI.

Readers Write: The Complexity of Maintaining Compliance

November 16, 2015 Readers Write No Comments

The Complexity of Maintaining Compliance
By Megan Tenboer


Clinical research presents a unique challenge when it comes to billing compliance. Often it’s left to clinical staff to understand Medicare and third-party guidelines, Clinical Trial Policies and other internal and external regulations, and to stay current in a fluid regulatory environment. Non-compliance puts the institution’s financial and ethical well-being at risk.

Two timely illustrations of just how complex compliance can be for research institutions came into play earlier this year. One revises the submission process for investigational device exemption (IDE). The other is the introduction of Condition Code 53 (CC-53).

Not satisfied with simply expanding criteria for coverage of IDE studies, the Centers for Medicare and Medicaid Services (CMS) also decided to centralize the review and approval process.

Previously, research institutions were responsible for submitting the required documents to their respective Medicare Administrative Contractor (MAC)[i] for device trials. Now CMS requires the sponsoring organization to secure approval of coverage for IDE device trials that obtained an FDA approval letter dated January 1, 2015 or later.

If this change is overlooked, it could have a devastating financial impact on the study and could delay treatment for patients in critical need. Failure to seek coverage approval through appropriate channels will delay or negate reimbursement for expenses related to the use of an FDA-approved device—even the device itself depending upon whether it is a Category A (Experimental) or Category B (Non-experimental) IDE study (Category A devices are statutorily excluded from coverage[ii]).

Another layer of complexity hit research institutions on July 6, 2015, with the introduction of Condition Code 53 (CC-53), an updated code that details the process and requirements for generating a claim to local MACs. This code is designed to identify and track medical devices that are provided to a hospital by the manufacturer at no cost or with full credit due, whether for a clinical trial or as a free sample.[iii]

Previously, hospitals used either CC-49 (Product Replacement within Product Lifecycle) or CC-50 (Product Replacement for Known Recall of a Product) along with value code “FD” (Credit Received from the Manufacturer for a Replaced Medical Device). However, these codes described only procedures surrounding replacement devices and not a reduced cost for non-replacement devices. The latter may be provided to Medicare beneficiaries as part of medical device trials.

It seems straightforward, and its intent was to fill the void by describing initially implanted medical devices that are not replacements. However, critics have been vocal about the lack of clarity about the new code. This new code adds to an already overflowing cache of device-related services that must be reported.
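Taken together, the code choice described above reduces to a small decision rule. The following Python sketch is purely illustrative (the function name and boolean parameters are hypothetical, and this is not billing guidance; the authoritative rules are in the CMS transmittals cited in the footnotes):

```python
# Hypothetical sketch of the NUBC condition-code choice for a
# manufacturer-provided medical device, as described in the text.
# Illustration only -- consult CMS guidance for actual billing rules.
def device_condition_code(replacement: bool, recall: bool,
                          no_cost_or_full_credit: bool):
    """Pick the condition code for a device claim, per the text's summary."""
    if replacement:
        # CC-50: replacement for a known product recall
        # CC-49: replacement within the product lifecycle
        return "50" if recall else "49"
    if no_cost_or_full_credit:
        # CC-53: initial placement at no cost or full credit
        # (clinical trial or free sample)
        return "53"
    return None  # no device condition code applies


print(device_condition_code(replacement=False, recall=False,
                            no_cost_or_full_credit=True))  # prints 53
```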

These two mandates may appear to be obscure regulations that impact only a small fraction of the overall healthcare market, but that’s not the case. According to business intelligence provider Visiongain, the worldwide market for clinical trials over the next five years will experience a cumulative growth of more than 50 percent.

Further, global clinical research organization revenues are expected to reach $32.73 billion in 2015 and to exceed $65 billion in 2021. Add the growing number of strategic alliances between full-service clinical research organizations and big pharma organizations that have outsourced drug development, and the impact of errors skyrockets.

The best defense is to assign one individual to become the “regulatory mandate” expert tasked with staying up-to-date on proposed and finalized changes to ensure timely compliance.

Megan Tenboer is director of strategic site operations at PFS Clinical of Middleton, WI.

[i] Centers for Medicare and Medicaid Services: Medicare Coverage Related to Investigational Device Exemption (IDE) Studies. Available at: http://www.cms.gov/Medicare/Coverage/IDE/

[ii] Department of Health and Human Services Health Care Financing Administration: Medicare Carriers Manual Part 3 – Claims Process. Transmittal 1701. May 25, 2001. Available at: https://www.cms.gov/Regulations-and-Guidance/Guidance/Transmittals/downloads/R1704B3.pdf

[iii] Centers for Medicare and Medicaid Services. “Implementation of New National Uniform Billing Committee (NUBC) Condition Code “53” – “Initial placement of a medical device provided as part of a clinical trial or a free sample.” MLN Matters. Medicare Learning Network. Available at: http://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNMattersArticles/downloads/MM8961.pdf

Dr. Herzenstube Goes to AMIA–Saturday and Sunday

November 16, 2015 Readers Write 1 Comment

Dr. Herzenstube is a practicing family physician who can make nothing of it.


AMIA is the professional society for health informatics. The AMIA annual symposium is the largest scientific informatics conference in the US. It brings together researchers, policymakers, industry leaders, and practitioners of health informatics from dozens of countries. I have been attending regularly since 2000 and it has been amazing to see the attendees and conference content grow in diversity as clinical information systems become more widespread.

AMIA always offers tutorial learning sessions before the official start of the conference and I have always tried to attend at least one. The chance to take a full day to participate in a structured, deep learning activity, taught by experts in the field, is a rare joy.

In line for coffee, I struck up a conversation with another tutorial attendee, a neonatologist at a major medical center. He also has a degree in informatics and spends “as much time as they’ll let me” on applied informatics projects in his institution, though there is no dedicated informatics department or team. Much of his time is spent working on their Epic system, into which he says they have “shoehorned” their neonatology workflows. 

He is here to attend a CMIO workshop, hoping to learn ways to elevate his level of influence within his organization. It is heartening to see someone dedicated enough to the promise of informatics to push on against the headwind of an organization that doesn’t yet know how to effectively use him, but it is a shame that those headwinds are still so prevalent.

At the tutorial, I found myself sitting next to one of the luminaries of the informatics field, someone who has occupied most of the leadership positions at AMIA and is now a senior executive of a very large academic medical center. To my surprise, he explained that he, too, makes a point of attending at least one tutorial at every AMIA conference. There are few things as impressive to me as someone at the top of their field who still thinks they have something to learn.

At the morning break, I chatted with Jose, another tutorial attendee. Jose is an internist and part of the clinical informatics team at a large East Coast medical system. His interests include population health and chronic care management. One of the projects he’s working on is development of a homegrown application for their health coaches. Among the workflows this application will support is capture of PHQ-9 questionnaire results. 

Jose recognized that there are LOINC codes for both PHQ-9 questions and answers and has been working with his development team to make sure that those codes are stored along with the questionnaire, increasing its ability to be re-used for reporting, decision support, interoperability, etc. Another great example of how informatics knowledge can make a difference in how health care organizations operate.
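As a rough illustration of the approach Jose described, here is a minimal Python sketch of capturing questionnaire answers alongside their LOINC codes. The panel and total-score codes shown are the commonly cited PHQ-9 LOINC codes (verify against the current LOINC release); the per-item codes and the `record_phq9` helper are hypothetical:

```python
# Minimal sketch: store PHQ-9 answers with their LOINC codes so the data
# can be reused for reporting, decision support, and interoperability.
# Panel/total-score codes are the commonly cited PHQ-9 LOINC codes;
# per-item codes below are placeholders.
PHQ9_PANEL = "44249-1"        # PHQ-9 quick depression assessment panel
PHQ9_TOTAL_SCORE = "44261-6"  # PHQ-9 total score

def record_phq9(answers):
    """Build a coded result from (item_loinc_code, score) pairs."""
    total = sum(score for _, score in answers)
    return {
        "panel": PHQ9_PANEL,
        "items": [{"question": code, "score": score}
                  for code, score in answers],
        "total": {"code": PHQ9_TOTAL_SCORE, "value": total},
    }

# Two answered items (placeholder item codes)
result = record_phq9([("item-loinc-1", 2), ("item-loinc-2", 1)])
print(result["total"]["value"])  # 3
```

Because each answer carries its code, a downstream report or decision-support rule can query the coded elements directly rather than parsing free text.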

The NLP tutorial itself was certainly worthwhile, with instructors who were very knowledgeable and well-prepared. At the same time, it illustrated one of the challenges that faces AMIA and the field of informatics in general. Informatics is a “big tent” field whose adherents come from a wide variety of professional backgrounds and are working to solve a wide variety of problems. While this is a tremendous strength, it also creates challenges. In some cases, informaticists assume that their colleagues share a particular body of knowledge or set of priorities and interests.

This was evident in the NLP tutorial. The presenters spent much more time describing the steps in using a set of open-source tools to create NLP engines (including the mechanics of setting up the processing queue for new documents in a data repository) than they did describing the logic by which NLP engines work and how that can be optimized. It would have been a great introduction for a grad student considering building an NLP engine for their dissertation. The clinician attendees, hoping to learn how NLP could help manage clinical information and patient care at their organizations, seemed less well served. Still, without AMIA, the “in the trenches” folks and the “in the ivory tower” folks would rarely come into contact. I believe that both benefit from the interaction.


AMIA officially opened today with a plenary session with a keynote from Avi Rubin, an information security expert from Johns Hopkins, who gave a widely-viewed TED talk back in 2011 pointing out some serious security vulnerabilities of modern technology, including medical devices. His keynote today expanded on this landscape, which has only worsened. It was a very unsettling talk to hear and a cautionary tale to those who develop IT-enabled implantable devices or take care of people who have them.

After the keynote, the first set of conference sessions began. I attended a paper session on “Deep Phenotyping.” AMIA paper sessions fit four brief presentations into 90 minutes with a few minutes at the end for questions. If you’re not already very familiar with the topic and current research in the area, it’s tricky to keep up. 

Phenotyping refers simply to solving the problem of identifying the phenotype of a person, i.e. classifying them according to some biological or health-related category, such as determining whether they’re diabetic or not diabetic. It’s an important problem if you are trying to do something that requires knowing the phenotype of individuals in a population (for population management, knowledge discovery, etc.) 

The most interesting paper in the session, in my opinion, described “semi-supervised” machine learning for phenotype identification from free-text notes. In traditional (“supervised”) machine learning, a system is given a set of documents and manually-applied labels as to their contents (the “answers”). Based on its analysis of the associations between the contents of the documents and the labels, it develops an algorithm that it can use to infer the appropriate labels for an unlabeled document.

In semi-supervised machine learning, following the supervised process, the system refines its algorithm based on its own inferences on the contents of the data. To my knuckleheaded family physician brain, it’s as if you teach someone that an AMIA attendee with a backpack is more likely to be a grad student than one without a backpack, and then they notice that the AMIA attendees with backpacks are more likely to be wearing sneakers than those without backpacks, and then that person starts inferring that AMIA attendees wearing sneakers are more likely to be grad students. In other words, after learning from being taught explicitly, the computer starts to be able to learn just from what it’s seeing. Intriguing stuff.
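The self-training loop described above can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the paper's method: a real phenotyping pipeline would start from features extracted from clinical notes rather than two made-up numeric features.

```python
# Semi-supervised "self-training": fit on the few labeled samples, then
# iteratively pseudo-label the unlabeled samples the model is confident
# about and refit. Synthetic data stands in for note-derived features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(0)
# 200 two-feature samples from two clusters (the two "phenotypes")
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Hide most labels: -1 marks an unlabeled sample
y_partial = y.copy()
unlabeled = rng.choice(200, size=180, replace=False)
y_partial[unlabeled] = -1

# Train on the 20 labels, then pseudo-label confident unlabeled samples
clf = SelfTrainingClassifier(LogisticRegression(), threshold=0.8)
clf.fit(X, y_partial)
print(clf.score(X, y))  # accuracy against the full, true labels
```

With only 10 percent of the labels available, the classifier still recovers the two clusters well, which is the whole appeal for phenotyping, where hand-labeling charts is expensive.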

Following the session was the welcome reception in the exhibit hall. Among the folks I chatted with was John, the medical director of quality for the Medicaid program of a Midwestern state. It was his first AMIA.  He was excited about the potential of sophisticated data analysis for assessing quality, but also mentioned that at present, the only data he has to work with is claims data — he has no way to get any data from EHRs.  While we’re making great strides in thinking about how we might use healthcare data in positive ways, the options for much of the real world are limited.

Stepping outside the San Francisco Hilton, the realities of human misery are stark and obvious. The Hilton is right in the middle of the Tenderloin district, full of individuals who are clearly mentally ill and/or intoxicated. It is an important reminder of the urgent need to expand knowledge about human health and how to improve it, in which informatics has a critical role to play. As we dive deep into the intellectual challenges of our field, we must never lose sight of whom we’re doing all this for.
