
Readers Write 10/6/10

October 6, 2010 Readers Write 13 Comments

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

EMR: One Size Does Not Fit All
By Evan Steele


A recent comment on HIStalk by a hospital CIO, identifying what he considered the best EMRs for enterprise systems and their physicians, highlights a problematic and all-too-prevalent misconception. The fact is, it is impossible to satisfy both hospitals and community ambulatory physicians with the same EMR product. Furthermore, even the ambulatory market cannot be viewed as a whole. EMRs designed for primary care physicians respond to a set of needs that are very different from those of specialists.

Enterprise EMRs simply do not work in high-volume ambulatory practices. This is particularly true for specialists’ practices. Many hospitals have had some success with Epic and other hospital-focused EMRs, but success has been limited when these same hospitals ask physicians — again, particularly the specialists — to implement these systems in their practices. A monolithic enterprise product cannot possibly support equally well such different workflows, patient care scenarios, and providers’ needs.

Within the ambulatory market itself, it is time to bifurcate the EMR discussion into two groups: EMRs for primary care physicians and those for specialists.

Industry analysts typically lump all EMRs into one category, which does not adequately differentiate the market segments or their distinct needs. The major EMR vendors have massive footprints in the marketplace, yet a small company like SRSsoft has the lion’s share of referenceable high-volume, prominent specialty practices in areas like orthopaedics and ophthalmology. Why? Because one size does not fit all, and it is impossible to satisfy the needs of both groups without compromising the needs of one.

The American Academy of Orthopaedic Surgeons (AAOS) acknowledged this issue in its recently released EMR Position Statement, pointing out that “Many systems are geared toward primary care medical practice, which can limit the utility of EHRs for specialty surgical practice.” It correctly suggests that “the different needs and uses of EHR by disparate medical specialties should be recognized.”

Specialists represent approximately 50% of the physician market, a sizeable segment that is largely being ignored. How are specialists to determine which EMRs are designed for their needs?

KLAS, the closest our industry has to a J.D. Power-type rating source, does not break out its ratings by specialty. This means that if an EMR vendor does well in the ambulatory primary care market and has high KLAS ratings, an unsuspecting specialty practice might purchase its product based on those ratings, only to find out that the product does not fit the practice's unique needs.

Exacerbating the situation is the fact that KLAS only surveys practices that have actually installed the EMRs. It does not survey practices with failed implementations. Since specialists represent a disproportionate number of the failures, the information is even further biased.

The result is that there are thousands of specialists who purchase EMRs from highly rated and/or household name vendors, but who end up with failed implementations and significant financial loss.

One size does not fit all. There are good EMR solutions available for every type of physician. It is incumbent upon the individual physician to research and identify the product that best suits his/her practice’s needs.

Evan Steele is CEO of SRSsoft of Montvale, NJ.

ClickFreeMD Comment Response
By Bob Gordon

Note: Mr. H here. I’m breaking my “no commercial pitch” rule this one time because Inga had questioned the business model of ClickFreeMD, which offers practice systems including billing for a flat monthly fee rather than the traditional model of a percentage of collections. Inga’s point was that the percentage model encourages the billing company to collect. CEO Bob Gordon was nice enough to e-mail Inga an explanation and we thought his response might interest some readers even though it is hardly unbiased. I’m not endorsing their product and I have no connection to ClickFreeMD.

ClickFreeMD leapfrogs the percentage-based provider business model. Consider the following:

  • No start-up, implementation or training charges.
  • The flat fee is lower on an equivalent percentage basis than most practices would pay for outsourced medical billing alone and far less than in-house options.
  • If the practice improves its revenue or we boost it (which we often can do), the equivalent percentage drops through the floor.
  • The breadth, quality, and integrated end-to-end nature of our software, services, and support are unrivaled. Physicians are paying twice as much elsewhere for much less elegant solutions today.
  • The flat fee sticks. If encounter or charge values increase, the flat fee stays the same and the practice captures cost-free revenue. If volume drops outside the ordinary seasonal range, the rate is adjusted down pro-rata so our physicians’ earning power is fully protected.
  • Importantly, the flat fee is backed by a performance guarantee that ensures we work every claim or we rebate half of the flat fee. There is no equivalent protection in a percentage-based model. In fact, any claim that takes more than 15 minutes to resolve in a percentage system probably costs the billing company more than it earns, so at some point in the collection continuum, billing company profitability is inversely correlated with increasing practice collections.
  • Our contracts all have 90-day outs and low price match guarantees for comparable services.

You may ask how we do this. We have deep domain expertise from running billing companies, back offices, and technology companies for decades and have organized a Southwest Airlines-like discount-fee, high-result business model that is very scalable. We expect that ongoing volume will feed a virtuous cycle for all, continuing to allow us to offer more for less while achieving top results.

One of the most striking things we are doing is the least recognized — giving the practice its flat-fee price and its included services, online and instantly, without asking it to give us any information. Try this anywhere else, like Athena, and what we do in 30 seconds becomes a multi-day process that involves e-mail, telephone, and online discussions and/or meetings and requires the practice to undress for the vendor. We are completely ONE-WAY transparent. That’s because we want the practice to decide if it wants to contact us — after it is satisfied that this is a superior value, and only then. We aren’t interested in lead-nurturing them to death.

This is about "more dollars for doctors" and great news in the group practice fight to sustain their independence. We are doing our part to create a reversal of fortune in the group practice community with a unique business model that raises revenues faster than costs, delivers immediate and ongoing savings, and provides the tools and support that allow them to be ready for tomorrow.  

Like the boiled frogs of lore, physicians have been nickel-and-dimed by payers, billing companies, and others, overpaying to under-produce for so long that they find themselves working much, much harder for less and less. We’re changing that and we’re passionate about it! Thank you for your consideration.

Bob Gordon is CEO of ClickFreeMD of Chevy Chase, MD.

It’s Official: The Rush for Talent Has Begun
By Tiffany Crenshaw


In recent weeks, a number of existing and prospective clients have called me for a pulse on the healthcare IT recruitment marketplace and thoughts on how to attract quality resources. After a number of such calls, I decided to put my thoughts in writing and share.

Let’s start with the good news. Industry hiring is definitely picking up, and employed candidates are now less afraid to make a career change than they were three to six months ago.

As for hot products, it’s no secret that Epic is hot, hot, hot. Hospitals are purchasing Epic left and right. Honestly, there are simply not enough Epic resources, especially Epic-certified resources, to go around, so the talent war is raging. Cerner recruitment remains modest but steady, while McKesson needs are starting to rebound after quite a lull.

In the ambulatory market, we are seeing more and more requests for eClinicalWorks and Allscripts. New names like Sage and Greenway are coming to light. And occasional needs for Meditech, Siemens, IDX/GE and Eclipsys are surfacing.

On the integration side, Cloverleaf and e-Gate skills are still in demand, but we are seeing more requests for Web-based and lesser known products like Ensemble, Symphony, and Rhapsody.

The hiring demand is highest by far for hands-on resources to design, build, and install EMR applications. However, there is a fair amount of activity for sales, project management, and training professionals, including go-live support.

CPOE, clin doc, pharmacy, oncology, and HIM are generating the most recruitment activity within the applications. Based on new client requests, we foresee growing needs for business intelligence, security, and report-writing resources.

In addition to employers’ desire for one or more of the skill sets mentioned above, most are adding clinical designation to the requirements. Over 50% of our job requisitions right now require clinicians. Pharmacists, nurses, and physicians with healthcare IT experience are in great demand.

However, post-recession hiring is creating challenges previously unheard of in my 12-year history recruiting in this industry. The process is now fraught with excruciatingly slow interview scheduling, shrinking employee benefits packages, little to no relocation assistance, and financially conservative offers, resulting in more and more frustrated candidates.

Things have changed drastically since the lowest points of the recession. After the release of Meaningful Use requirements, recruiting mania has taken off. Everyone seems to have hiring needs. Candidates are getting called left and right by internal and external recruiters. Just check out a few of the job boards if you don’t believe me — you’ll see countless job postings. Furthermore, check out all of the recruiting firms with no previous healthcare IT experience trying to break into this market as experts claim abundant need for resources.

If your organization is currently or will be in the market soon for these in-demand resources, you may want to evaluate your hiring process, recognize that your competition is fierce, and take note of a few trends our candidates and clients have shared with us quite candidly over recent months.

  • New car syndrome. Candidates are migrating to new implementations. Who can blame them? It’s more exciting to be on the ground level and see a project through from A to Z.
  • Red carpet treatment. Employers who roll out the red carpet win. When weighing decisions between job offers, candidates almost always choose the employer who provided quickest response time and showed sincere interest in them. (Both response time and sincerity are simple and no-cost ways to roll out that red carpet.)
  • Relocation blues. Relocation is a HUGE issue right now. Even if candidates want to move, they can’t do so because of the housing market. Kudos to all of the organizations willing to work around this by providing remote work, commuting, or coverage of interim living expenses.
  • Communicate. Many, many candidates are feeling jerked around by potential employers because of lack of communication in the interview process. Here’s what they are thinking: “If I don’t feel valued as a candidate, how are they going to treat me as an employee?” On the flip side, these candidates are communicating with plenty of their peers. Too many hospitals and consulting firms are getting bad reputations as being lousy places to interview and to work.
  • Too much is not always a good thing. In the quest for resources, too many organizations are panicking and calling in all of the troops — internal recruiters, employee recruiting bonuses, dozens of external recruiters and advertisements. Candidates get called multiple times by different sources all looking to fill the same positions. Not only do they end up confused, but all the activity makes candidates suspicious. They wonder what’s wrong with an organization that has such a hard time attracting and retaining talent.
  • Get on board. We are hearing more and more horror stories about candidates showing up on the first day only to find their new employer is not ready for them. This gets them off to a bad start from the get-go. Employees stay longer and perform better when they feel welcomed and the transition process is smooth. The period of time between offer acceptance and start date can also be a black hole, when candidates are most vulnerable. Employers are losing candidates this far into the game because they aren’t communicating with them. If you don’t have a formal on-boarding program, now is probably a good time to look into it.
  • Disconnect between human resources and hiring managers. As an outside firm, we work with both HR representatives and hiring managers. We hear complaints on both sides about the other on a regular basis — namely due to lack of response. The hiring managers want candidates fast. And HR wants answers fast. Throw candidates in the mix who get frustrated as well and it’s a nasty situation. However, we find that employers who really engage the final decision-maker in the process from beginning to end and set response expectations up front have the least amount of frustrations and the most successful outcomes.

In summary, you can safely say that the industry is quickly changing to a candidate-driven market and that the market is impacted heavily by post-recession recovery and Meaningful Use. It is official. The rush for talent really has begun.


Tiffany Crenshaw is president and CEO of Intellect Resources of Greensboro, NC.

The Coming Speed Bump in the EMR Market
By Jon Shoemaker

It’s no secret that there is currently a mad rush occurring, not unlike the Oklahoma Land Rush of the 1800s, where hundreds of companies both new and old are getting into the business of healthcare information technology. Some come with industry expertise. Others come to take advantage of the financial opportunity. Consider Best Buy, the consumer electronics giant, which will install your EMR using its Geek Squad. So much for needing clinical expertise!

I believe this climate of frenetic activity will cause the EMR market to encounter a large, steep speed bump in the next 10 years. The bump won’t come from all of the EMR installations or from supporting all of these systems, as this will create thousands of jobs and supporting infrastructure that currently does not exist. It will come when all of these new digital silos must talk to each other, as required in Phase II of Meaningful Use (MU). It is the very selling point of these systems — simple communication and usability — that becomes their Achilles’ heel.

EMRs to date are not installed with a common code structure for identifying exams, studies, or services, all of which will need to be exchanged outside of the office in Phase II of MU. The reason for this lack of standardization has nothing to do with EMR functionality or capability — it is that everyone is still thinking locally, not globally.

To ensure true interoperability and exchange of patient health information, EMRs must be installed to satisfy the local requirements, but also with the forethought that they will integrate to larger systems. This requires standards and standardization. The absence of a standard will require the use of translation services so that HIE repositories use the same codes for exams performed across the region.

Translation services, while a viable alternative to standardization, require one-off knowledge of the database structure and logic of each customized local EMR, as well as that of the destination repository. This level of granularity creates layers of complexity for maintenance and mapping. Any change to a local system will mandate updates to the translation engine. The support nightmare of constant mapping modifications to ensure the proper codes are sent outbound or received inbound will be effectively unsustainable.
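To make the maintenance burden concrete, here is a minimal sketch of a code-translation layer in Python. All site names and exam codes below are invented for illustration; they are not real LOINC or SNOMED values, and real translation engines are far more elaborate.

```python
# Illustrative only: mapping each site's local exam codes to a shared
# regional standard. Every local system change forces an edit here.

LOCAL_TO_STANDARD = {
    "clinic_a": {"XR-KNEE-2V": "STD-1001", "MRI-LSPINE": "STD-2002"},
    "clinic_b": {"KNEE XRAY": "STD-1001", "LUMBAR MRI": "STD-2002"},
}

def translate(site: str, local_code: str) -> str:
    """Map a site-specific exam code to the regional standard code."""
    try:
        return LOCAL_TO_STANDARD[site][local_code]
    except KeyError:
        raise ValueError(f"No mapping for {local_code!r} at {site!r}")

# Two different local codes resolve to the same standard code:
assert translate("clinic_a", "XR-KNEE-2V") == translate("clinic_b", "KNEE XRAY")
```

Note that any exam a site adds or renames silently breaks lookups until someone updates the table — exactly the unsustainable maintenance cycle described above, multiplied across every EMR in the region.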

Once all of the paper silos are replaced by digital silos, it will become clear which EMRs were installed incorrectly, don’t address the clinical workflows of the office, and don’t communicate outside of the office with a standard communication protocol using standard coding methods. This will lead to a second phase of the EMR revolution that includes translation services and reinstallation of EMRs to address workflow and data gaps. This will have to be resolved before integration with a larger HIE repository can take place.

If we begin now with standardization of workflow and codes and ensure they are addressed in current EMR installations, we will be in a better place in five years and users will see the true benefits of these systems. With our current strategy of “every man for himself,” we risk losing users’ confidence once these systems are installed and fail to address workflow and physician concerns. Once we lose the users’ confidence, they will stop using the system and re-adoption efforts will prove Herculean.

As you begin planning your EMR implementation, there are hundreds of questions to ask. When it comes to meeting the long-term requirements of MU as well as realization of the true benefits of an EMR, here are a few to begin with:

  1. Have we reviewed and documented our office workflow?
  2. Are we using the new SNOMED codes?
  3. Are we following standardized codes for services rendered?
  4. Does the installation team understand clinical workflow or do they look glassy-eyed when we discuss medical terms?
  5. Is our vendor of choice an IT company trying to cash in on the HIT initiative without clinical experience and knowledge, which could place our business at risk?
  6. How will this EMR connect us in the future to larger integrated systems?

Jon Shoemaker is senior consultant with Ascendian Healthcare Consulting of Sacramento, CA.

Readers Write 9/30/10

September 29, 2010 Readers Write 7 Comments


"Granularity" — A Detailed Analysis
By Robert Lafsky, MD

“Granular” is turning into a buzzword. And that’s not a good thing.

It was a perfectly respectable, albeit not very useful, term in the analog days, referring usually to a physical material composed of — you know, little granules. You’ll actually see it used sometimes as a descriptive term in pathology and endoscopy reports, and in general use it describes something’s particular type of grainy texture. But then, of course, computer people got hold of it and gave it a much more specific, albeit metaphorical, meaning, which I’ll get to in a minute.

Recently, writers in the mainstream media, with their ears always pressed to the ground and desperate for novelty, have picked up on this word and are starting to use it to describe more abstract things, in a way that fails to grasp the IT meaning at all. For instance, the other day political pundit Michael Gerson described a Karl Rove critique of Christine O’Donnell as “granular and well informed.” If you substituted “detailed” for “granular” in that sentence, you wouldn’t have changed the meaning a bit.

But IT people don’t use “granular” to mean just “detailed.” Hard copy or scanned documents can, of course, be very detailed. I remember a couple of old docs from my training days who would sit with pen and paper and write beautiful two- or three-page, single-spaced handwritten reports on their patients, with every bit of the history, physical, and labs on them. It was an impressive effort, very detailed, but even if you found those reports now and scanned them into your EMR, the information in them wouldn’t be granular.

No, for a computer, detail is necessary for granularity, but it’s far from sufficient. The computer has to be able to do something with the details so that it can store them in an orderly way and then use them for searches and reports. That sort of thing, of course, is the “use” that at least has the potential to be “meaningful.”  

So if, say, a particular drug for hypertension is found to be dangerous for everybody over 60 with diabetes, I don’t have to go manually through a thousand records. They are recorded in a yes-or-no fashion in a database. I can query my system and get an immediate list of all my patients who meet those criteria, with their addresses and phone numbers.  
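The difference is easy to show with a toy database. The table, field names, and patients below are invented for illustration, but the point is the author's: once the facts live in discrete, typed fields rather than scanned prose, the query takes one statement.

```python
# Toy illustration of granular data: discrete, typed fields that a
# computer can query directly. All names and values are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE patients (
    name TEXT, age INTEGER, has_diabetes INTEGER, on_drug_x INTEGER)""")
conn.executemany("INSERT INTO patients VALUES (?, ?, ?, ?)", [
    ("Adams", 72, 1, 1),   # over 60, diabetic, on the drug -> flagged
    ("Baker", 55, 1, 1),   # on the drug, but under 60
    ("Clark", 68, 0, 1),   # over 60, but not diabetic
])

# The question a scanned handwritten note can never answer directly:
rows = conn.execute(
    "SELECT name FROM patients "
    "WHERE age > 60 AND has_diabetes = 1 AND on_drug_x = 1").fetchall()
print([r[0] for r in rows])  # ['Adams']
```

A scanned image of even the most beautiful handwritten report supports none of this; the detail is there, but no field exists for the computer to test.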

That’s granularity. Facts have to be detailed, but in a fashion where computers can take advantage of them.

Maybe this is obvious to the IT business readers out there, but I sure spend a lot of time in the doctor’s lounge painstakingly explaining this to medical colleagues. And granularity is at the heart of all the arguing about workflow issues in EMRs, as well as interoperability and the coherence of automated reports that rages in the comment sections of this website and elsewhere.

I can’t offer a resolution of any of these arguments. But to get anywhere, we need commonly defined terms, and granularity is a pretty useful one. General media people out there, if you mean “detailed,” say “detailed.” Leave “granular” for those that really need it.

Robert Lafsky is a gastroenterologist in Lansdowne, VA.



A HCIT System Architecture for Cloud Computing
By Mark Moffitt

Note: This article uses a fictional story about Google and Meditech as a backdrop to describe a healthcare IT (HCIT) system architecture for cloud computing.

(Oct. 1, 2020) Today marks the eighth anniversary of Google’s purchase of an obscure private company known then as Meditech, a purchase that marked the beginning of the transformation of the HCIT industry into what it looks like today.

At the time, the purchase shocked everyone. Over the years, Meditech had repeatedly rejected any notion of a buyout by another company. Then Google offered $1.5 billion, more than a 50% premium over the estimated valuation of the company. The offer, it turns out, was too good to turn down. Neil Pappalardo of Meditech walked away with a $400 million payout. Google’s market cap at the time: $168 billion.

The Vision

Google’s vision for the future of HCIT was straightforward: provide all IT services to healthcare systems as a cloud computing service at a price much lower than market rates, as a strategy to capture 60% of the worldwide market by 2020. Google’s service included applications, data management, and integration. Google architected the system from the ground up for cloud computing, so it was able to offer the service at a much lower price while realizing higher margins than competitors.

Google bought Meditech for its customer base and use case models that had been hardened by use over many years. Google took Meditech’s functional specs and enhanced and implemented them in a new architecture. In addition, Google purchased several other HCIT vendors and integrated them to provide a total HCIT solution to customers.

Data Storage

Google did not use a relational database management system (RDBMS), as was common at the time, and instead used schema-less, key-value, non-relational, distributed data stores, aka NoSQL.

An RDBMS scales well, but usually only as long as that scaling happens on a single server node. When the capacity of that single node is reached, you need to scale out and distribute the load across multiple server nodes. This is when the complexity of relational databases starts to bump up against their potential to scale.

Google’s key-value data store model improves scalability by eliminating the need to join data from multiple tables. As a result, commodity multi-core servers can be used that are far less expensive than high-end multi-CPU servers and expensive SANs. The overall reduction in cost, due to savings in database license fees, maintenance, and hardware, is around 70% compared to using an RDBMS. Database sharding and the “shared-nothing” approach are ideal for managing large amounts of data at a low cost.
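The join-free idea can be sketched in a few lines. This is a plain in-memory stand-in for a distributed key-value store, with invented key and field names; it is meant only to show how a whole denormalized record retrieves in one read where a relational design would join several tables.

```python
# Sketch of the key-value model: store one denormalized record per key.
# An in-memory dict stands in for the distributed store; names invented.
import json

store = {}

def put(key: str, record: dict) -> None:
    store[key] = json.dumps(record)  # the value is an opaque blob

def get(key: str) -> dict:
    return json.loads(store[key])

# One read returns everything a relational schema would join together
# from patient, order, and medication tables:
put("encounter:42", {
    "patient": "12345",
    "orders": [{"code": "CBC", "by": "dr_smith"}],
    "meds": ["lisinopril 10mg"],
})
assert get("encounter:42")["orders"][0]["code"] == "CBC"
```

Because each record is self-contained, records can be spread across many cheap nodes without cross-node joins, which is what makes the commodity-hardware economics above possible.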

Three Data Types

Another concept introduced by Google was segregating data into three buckets — transaction data, results data, and analytic data — and managing each differently. Competitors at the time combined all three into one big, complex RDBMS.


Transaction data — what was ordered, when and by whom, what tests were performed, or what meds were given to a patient — are persisted to a transaction data store. At some point, all of the transactions related to a patient encounter are collected in a single electronic medical record file and compressed to about 10% of original size. Results are also contained in this file but not images, due to size, as was the case with the original paper medical record and film file.

The compressed medical record file provides an interactive view of the patient’s encounter to satisfy legal and payment inquiries. These electronic medical record files are stored securely in the cloud. Records are never transferred between organizations; rather, access is authorized and the record is viewed from the cloud.

Data is purged from the transaction data store once the electronic medical record file is created. The transaction data store remains a constant size and, as a result, it retrieves data faster and is easier to manage than if the transaction data store grew in size. Transaction data is concurrently stored in a separate analytic data store and is not purged.
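The three-bucket lifecycle described above can be sketched as follows. This is an illustrative toy, not Google's design: events land in a bounded transaction store and an append-only analytic store, and closing an encounter compresses its events into a record file and purges the transaction store.

```python
# Sketch of the three-bucket idea: bounded transaction store,
# never-purged analytic store, compressed archival record files.
# All structures and field names are invented for the example.
import json, zlib

transaction_store = {}   # encounter_id -> list of events (kept small)
analytic_store = []      # append-only; feeds reporting and BI
record_files = {}        # encounter_id -> compressed medical record

def log_event(encounter_id, event):
    transaction_store.setdefault(encounter_id, []).append(event)
    analytic_store.append(event)          # concurrent copy for analytics

def close_encounter(encounter_id):
    events = transaction_store.pop(encounter_id)      # purge the bucket
    blob = json.dumps(events).encode()
    record_files[encounter_id] = zlib.compress(blob)  # archived record

log_event("e1", {"type": "order", "code": "CBC"})
log_event("e1", {"type": "med", "name": "heparin"})
close_encounter("e1")
assert "e1" not in transaction_store      # transaction store stays bounded
assert len(analytic_store) == 2           # analytic copy survives the purge
```

Keeping the transaction store at a constant size is what preserves its retrieval speed; the analytic copy absorbs the unbounded growth instead.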

Google partnered with several business intelligence vendors to offer advanced analytical services from the cloud using the customer’s analytic data store.

Results such as images, labs, reports, and waveforms are also stored in schema-less, key-value, non-relational, distributed data stores.

The three buckets — transaction data, results data, and analytic data — are each stored across multiple commodity server hardware using a “shared-nothing” approach. Scaling any individual bucket for a customer is almost as simple as adding server hardware.
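A minimal sketch of that shared-nothing sharding, with invented key names: each key hashes to exactly one node, so no node shares state with any other and capacity grows by adding nodes.

```python
# Sketch of shard-by-hash placement across commodity nodes.
# Illustrative only; real systems use consistent hashing.
import hashlib

def shard_for(key: str, num_nodes: int) -> int:
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % num_nodes

nodes = [dict() for _ in range(4)]        # four commodity servers

def put(key: str, value) -> None:
    nodes[shard_for(key, len(nodes))][key] = value

put("patient:12345:labs", {"Hgb": 13.2})
# The key lives on exactly one node; no cross-node coordination needed.
assert sum("patient:12345:labs" in n for n in nodes) == 1
```

One caveat worth noting: naive modulo hashing like this remaps most keys whenever the node count changes, which is why production stores use consistent hashing so that adding a server moves only a fraction of the data.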

Integration

Google used a derivative of their search engine technology to integrate a patient’s records and results across multiple providers and systems.

Application Development Framework

Google used an application development framework that made it easier to build and deploy software. With an RDBMS, application changes and database schema changes have to be managed as one complicated change unit. Google’s key-value data store allows the application to store virtually any structure it wants in a data element. Application changes can be made independently of the database.

In addition, Google used a scripting language for code that changes most often — user-facing code. Both of these features combined to make software development easier and allowed applications to iterate faster. In software development, the rate of innovation is directly related to the rate of iteration.

Mark Moffitt, MBA, BSEE is the former CIO at GSMC in Texas and is working as an independent consultant while he searches for his next opportunity.


Software Upgrades – To Be or Not to Be? That is the Question
By Ron Olsen

The day your facility installs a new piece of software, you rarely think about the upgrades that will inevitably come later. You probably ask if such upgrades are included in the maintenance agreement, and then shuffle away that information for future use … or not.

Many times an upgrade is more than just a requirement from the vendor — it’s a welcome relief that offers bug fixes, provides additional functionality, and many times, increases productivity, which equates to money-saving. Hey, any time we humble IT/IS guys and girls can do something to keep the CIO happy, we’ve got to jump on it! That’s what IT should be all about — increasing the ability to save money and/or help other departments increase revenue streams.

Most of us have been caught in the XP vs. Vista vs. Windows 7 debate. The old adage, ‘If it ain’t broke, don’t fix it’ seems applicable here. XP works fine. Vista is, well, Vista. Windows 7 has generated a lot of hype. Windows 7 offers many enhancements, but if your organization’s PCs aren’t up to it, the new bells and whistles aren’t available. To get the full feel of the new Internet Explorer 9 beta release, Windows 7 is now required.

This is just one example of how an upgrade is never a simple, single-issue vote. There are dozens of interrelated concerns that an IT department must evaluate before pulling the trigger on a software upgrade.

And then there are the software compatibility issues. How many times have we heard from a vendor, “It’s not certified for (fill in any number of OS versions) yet!” This causes a push-me-pull-you effect. Some vendors are pushing you to move forward, and others you have to pull along with you.

Things to consider before upgrading your software:

1) Can you adjust your current processes to take advantage of new functionality? Many times we take an upgrade and claim there is not enough time to do a full evaluation prior to going live. Then, we certainly do not have time to go back and look again. This could actually cost your company money in the long run, instead of delivering the benefits of a well-planned project.

2) Downtime can be a deal-breaker for upgrades. No department ever wants to experience downtime unless it’s unavoidable. How will each department test the new upgrade? Do they have a full test system to work with? If all of the issues are thought out beforehand and these questions answered, upgrading shouldn’t be that painful.

3) Does hardware need to be replaced? Could this be a great opportunity to replace some old PCs and servers? Is this the catalyst that moves your facility to server virtualization … finally?!

4) What vendor software (enterprise forms management, ECM/EDM, etc.) will need to be upgraded simultaneously?

Thoughtful software and hardware upgrades are usually embraced by end users and the C-level alike. Personnel get new PCs that increase productivity, which keeps the Powers that Be happy once they’ve overcome the initial sticker shock. Just the idea of new PCs gets most staff members feeling like the hospital is moving forward technologically.

Server virtualization condenses the physical footprint of the server room, decreases power and cooling costs, and in most cases, reduces server administrative duties. And with your software running faster with full functionality from vendors’ latest compatible releases, IT/IS will (hopefully) get fewer end user complaints. Hey, it sounds good in theory!

Just make sure you plan well in advance; get buy-in from department heads, super users and (if you’re lucky) an enthusiastic executive; and communicate openly with vendors and you’ll be good to go.

Ron Olsen is a product specialist at Access.

Readers Write 9/15/10

September 15, 2010 Readers Write 6 Comments


Document Management is Good for Business
By Shubho Chatterjee, PhD, PE

Enterprise content management (ECM), also referred to as document management, is a capability with significant potential to centralize content and document storage, streamline and automate processes, and integrate smoothly with other enterprise systems. The business benefits are improved operational efficiency, reduced manual labor, reduced paper consumption, and improved process quality.

ECM consists of a central content or document repository, with indexing and searching capabilities, integrated with automated workflow that routes documents to the appropriate processes and processors. Use of the system is controlled by access policies at the individual and group levels. Examples of use include, but are not limited to, patient admissions, medical records management, invoice and payment processing, finance and accounts management, and contract management.

A rigorous selection process is critical to choosing the appropriate vendor. It should begin with an evaluation of the functions and workflows where ECM is expected to have the greatest impact. Additional selection parameters include, but are not limited to, the total cost of ownership of the proposed system; projected process improvements and labor reductions; current material consumption and storage costs; and product functionality, deployment options, and scalability. These parameters should be used to construct ROI scenarios for different options, and both objective and subjective factors should be integrated into the decision making.

Deployment options can be in-house (client server) or SaaS. While the in-house option provides for greater control, it also requires dedicated resources to manage, maintain, and upgrade the environment. SaaS deployment enables access to the system on a subscription basis with the vendor managing and operating the system and associated infrastructure in its data center.

The SaaS option frees IT staff to focus on more strategic tasks that add value to the organization while avoiding the expense of adding more IT infrastructure and resources to manage the system. Key factors to consider here are Internet connectivity, bandwidth, and information security. Implementation is also quicker because the vendor completes the system build, configuration, and installation at its data center.

Collaborating to build a solution requires a thorough examination of the current processes across the organization with supporting process turnaround time data collection. This forms a baseline from which process improvements can be tracked in the future. To maximize the impact of the solution, this in-depth, step-by-step process analysis should be used to re-engineer and automate processes using ECM.

Creating efficiencies with this solution is feasible in many areas. After implementing ECM in the admissions department, Miami Jewish Health Systems has a central repository for patient documents. Seamless integration with the EMR application allows authorized users from any location to instantly access the associated patient’s documents from their workstation, eliminating time-consuming manual searches.

Routing documents electronically to employees’ workflow queues allows for faster processing and greater security. Eliminating the need to search for documents or make paper copies frees the admissions staff from tedious tasks to focus on patient care. Medical records management workflow has also improved, with easy, instant, and effective collaboration across the organization. Medical personnel receive automated alerts to complete charts and associated notes and to resolve deficiencies; previously, this required a visit to the medical records office.

Back-office departments, such as accounting and finance, have a high volume of paper flow and manual processes, making them susceptible to lost invoices, missed bills, and overpayment or underpayment.

ECM deployment at MJHS is automating invoice processing. Invoices are now indexed to the payments made and are easily searchable. Invoice approval is also automated, eliminating manual inter-office mailing, and payments are completed in a timely manner.

As with any technology solution implementation, ECM must be well planned with a cross-functional team. Integration aspects with other enterprise applications must be well thought out. Baseline process documentation and re-engineered processes are also critical for success and before-after comparisons.

Shubho Chatterjee is chief information officer of Miami Jewish Health Systems of Miami, FL.

Regaining Control of Disaster Recovery
By Tony Cotterill

While working with our clients in hospital IT departments, we come across a variety of data backup scenarios. Some hospitals do full backups nightly, while others rely on an incremental/full backup strategy. Some sites exclude specific applications from their nightly backup simply because the volume is too great to complete in a 24-hour period.

Although there’s no ‘typical’ approach to backup and disaster recovery, a hospital’s data is a vital asset that must be protected. Before deciding how to protect it, however, first you must understand it.

The data landscape in the healthcare industry is more complex than in many other sectors, primarily because of the varied data types (structured, unstructured, and semi-structured) generated by both clinical and administrative systems. The type of data being secured and protected is inextricably linked to how that data needs to be recovered.

Structured data comes from database-driven applications, such as the hospital information system, radiology information system, electronic health record, and accounting systems. These applications typically generate hundreds of GBs, possibly a few TBs in larger facilities.

Unstructured data comes from applications that produce discrete files that are not associated with a database. Examples include word processing and spreadsheet files, which are routinely created by administrative staff and then stored on file servers. Many TBs of unstructured file data can be a challenge to back up and recover.

Semi-structured data is produced most commonly by picture archiving and communication systems and document management and imaging systems. Both maintain a database of information (structured data) that references large quantities of discrete files (unstructured data). A PACS database may run on Oracle or SQL Server, and its size may be relatively small in relation to the many TBs of DICOM images that the database references.

Once you understand the three categories of hospital data, you can determine how much is dynamic vs. static. The dynamic data, which typically comprises 20-30 percent of overall healthcare information, is accessed regularly, and therefore changes constantly. This is the data you should be replicating every day.

Static data, which probably makes up the other 70-80 percent of your storage, should be treated differently. This unstructured and semi-structured data never changes and much of it will never be recalled again. Nevertheless, regulations and/or institutional policies compel hospitals to store it for five years, ten years, perhaps even the life of the patient.

So here’s the good news: once you’ve identified your static data, you can replicate it and move it to a self-protecting archive. Then there’s no need to include it in your backups.
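The dynamic/static split can be approximated with nothing fancier than a last-modified scan. Here is a minimal Python sketch; the 90-day cutoff and the idea of walking a file share are illustrative assumptions on my part, not anything from the article:

```python
import os
import time

AGE_THRESHOLD_DAYS = 90  # assumed cutoff; tune to your retention policy

def classify_files(root):
    """Walk a directory tree and split files into dynamic (recently
    modified, belongs in the nightly backup) and static (unchanged,
    a candidate for a self-protecting archive)."""
    cutoff = time.time() - AGE_THRESHOLD_DAYS * 86400
    dynamic, static = [], []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mtime = os.path.getmtime(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            (dynamic if mtime >= cutoff else static).append(path)
    return dynamic, static
```

A real archiving tool would also weigh access times and institutional retention rules, but the principle is the same: recently touched files go in the nightly backup, and everything else is a candidate for the archive.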

This combination of backup and archiving provides an optimal strategy for treating each data type with the right method. By understanding the nature of the data in the critical clinical systems, the IT team can deliver both realistic and acceptable data recovery objectives to the business. In the event of a disaster, the organization can rest assured that the data can be recovered in a reasonable timeframe, minimizing the disruption to patient care.

Tony Cotterill is president and CEO of BridgeHead Software of Ashtead, Surrey, UK.

RTLS and Temperature Monitoring Mania
By Fed Up with the Fever

Would someone please tell me what real-time locating systems in healthcare have to do with environmental monitoring? I keep seeing all these temperature monitoring requirements pop up in RFPs and press releases. It concerns me that the healthcare CIO (or whoever is making these decisions) doesn’t realize that temperature monitoring of refrigerators has nothing to do with real-time locating, and even worse, is willing to saddle their Wi-Fi system with this function, putting QoS-sensitive systems such as PoE and VoIP at risk.

Sure, real-time alerts of out-of-range or variable temperatures are important, but unless you’re subject to that old Bart Simpson joke where he calls up the bar and says, “Is your refrigerator running?” followed by Moe’s inevitable “Yes” and Bart’s “Well, then you better go catch it!” — well, your refrigerator is not mobile! There’s no need to locate it, and certainly not in real-time.

The real-time alerts and reports that healthcare needs related to temperatures of refrigeration units can be easily achieved with over-the-counter probes. Then, just as it would with any other DCC-based system (“dry contact closure” devices such as security cameras, alarms, doors, or nurse call lights), the RTLS would respond to certain pre-established conditions (e.g., temperature out of range). These other systems do not rely on real-time location except to “trigger” an event condition. That is, if you want a security camera to come on when a certain tagged piece of equipment enters the egress zone, you need the RTLS because the trigger depends on the real-time location of that tagged equipment.

Temperature monitoring requires no such “trigger.” It requires only that you “push” an alert to an individual (or group) when a particular event is recognized within the event software. No location changes are recognized or recorded. If healthcare organizations could recognize this, they would save a tremendous amount of money and not be subject to the heartache of a low-grade RTLS that does only one thing (wholly unrelated to real-time locating) well.
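To see just how little "locating" is involved, the whole event amounts to a threshold check plus a push to a notify list. A hypothetical Python sketch (the 2-8 C range and the `send_alert` gateway are assumptions for illustration, not any vendor's API):

```python
SAFE_RANGE_C = (2.0, 8.0)  # typical vaccine-fridge range; assumed here

def send_alert(recipient, message):
    # Stand-in for a pager, e-mail, or SMS gateway call.
    print(f"ALERT to {recipient}: {message}")

def check_reading(unit_id, temp_c, notify):
    """Push an alert to everyone on the notify list if the probe
    reading is out of range. Returns True when an alert was sent.
    Note there is no location anywhere in this logic."""
    low, high = SAFE_RANGE_C
    if temp_c < low or temp_c > high:
        for recipient in notify:
            send_alert(recipient, f"{unit_id}: {temp_c:.1f} C out of range")
        return True
    return False
```

That is the entire "event software" for a refrigerator: a fixed unit, a probe, a range, and a recipient list.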

So I ask what RTLS has to do with temperature monitoring even as I understand why temperature monitoring is so prominent in the RTLS space. It’s an easy way for vendors to make money. So long as the company can write some basic rules, they can provide an alert when temperatures are out of range. They can also record temperatures at regularly scheduled intervals without staff ever having to physically approach the unit.

There’s no doubt it’s an important time and money saver for the hospital. And it’s a money maker for the RTLS vendor. They get to solve a problem for the customer and appear wholly competent on this level, so that when it comes to delivering their RTLS with any level of accuracy, there will be a certain level of trust pre-instilled.

Unfortunately, too many hospitals fall prey to the belief that environmental monitoring is a function of RTLS, so if the vendor can do that well, surely they can locate assets and automate patient flow, right? Sorry, folks, but it’s just not so.

Readers Write 8/31/10

August 30, 2010 Readers Write 4 Comments

Meaningful Use:  Specialists Still Not a Priority
By Evan Steele

The HIT Policy Committee’s creation of a new Quality Measures Workgroup last week is the most recent in a string of actions confirming that meaningful use has not been defined meaningfully for specialists — and that it is not likely to be. Despite the fact that this new workgroup is charged with prioritizing Stage 2 quality measures and analyzing gaps in Stage 1 criteria, the 18 physicians selected for the 24-member group are primary care providers* — a fact that surely raises concerns among specialists.

The appointment of this workgroup comes on the heels of a growing response to the Stage 1 definition of Meaningful Use from specialists and their professional organizations, commenting on its lack of fit with the way specialists routinely practice. Every one of the six core clinical quality measures is primary care-related, and many specialists will be hard-pressed to identify three of the 38 additional clinical quality measures that are relevant to their practices.

At last month’s HIT Policy Committee meeting, committee member Gayle Harrell commented that much of the input gathered from the specialist panels convened last October seems to have been ignored. She contended that in the final rule, CMS made it more — rather than less — difficult for many specialists to comply with Meaningful Use.

This position was echoed by Thomas C. Barber, MD, EMR project team leader of the American Academy of Orthopaedic Surgeons (AAOS), in discussing the Academy’s EMR position statement in the most recent issue of AAOS Now:

Orthopaedic surgeons will have great difficulty in meeting the current 25 Meaningful Use standards. Orthopaedics would derive greater benefits from standards promulgated by our medical specialty society rather than a set of generic requirements that mostly do not apply to musculoskeletal patient care.

This is not a new issue. The primary care focus of the legislation and regulations has been intentional from the outset. President Obama appointed an internist, David Blumenthal MD, to spearhead the program. There was only one private-practicing specialist among the committee members that crafted the recommendations to CMS.

It is not surprising that the Meaningful Use criteria do not reflect the practice patterns of specialists. Federal funding to assist physicians with EMR adoption has been directed towards primary care. The $357 million allocated for Regional Extension Centers, for example, was earmarked to “provide outreach and support services to at least 100,000 primary care providers and hospitals.”

The definition of Meaningful Use is not the only obstacle for specialists. The EMR products themselves are not tailored to the needs of specialists. The AAOS EMR Position Statement correctly suggests that in developing certification standards, it is essential that “the different needs and uses of EHR by disparate medical specialties should be recognized. In particular, the differences between surgical specialties and primary care specialties should be acknowledged.”

Unfortunately, because the certification criteria are linked to the Meaningful Use requirements, they are similarly primary care-driven. The EMRs most likely to be certified for Meaningful Use are predominantly those that were created and developed for primary care physicians — those of vendors that, from 2004 until recently, have devoted their development resources to meeting CCHIT’s 467 largely primary care-focused criteria. The AAOS statement continues: “Many systems are geared toward primary care medical practice, which can limit the utility of EHRs for specialty surgical practice.”

Specialists are no different from other physicians in their desire to participate actively in the evolution of the country’s medical care delivery system. But until Meaningful Use is defined in a way that is applicable to the way they deliver that care, they will participate on their own terms — adopting specialist-focused EMR technology that increases their productivity and enables them to provide the highest quality care and service to their patients.

* Includes internists, family practitioners, pediatricians, preventive medicine physicians, an internist/hematologist, and a psychiatrist.


Evan Steele is CEO of SRSsoft of Montvale, NJ.

Safeguarding EMRs Against System Failure or Downtime
By Arthur Young

Using time-saving information technology and automated patient records management ensures clinicians have faster access to the most up-to-date patient information, enabling timely diagnoses and treatment and maintaining a high quality of care. However, if the network goes down, the system fails, or a planned or unplanned system downtime occurs, clinicians are unable to access critical patient information.

Whether they are planned or unplanned, system downtimes can occur for any length of time — from a few minutes to a few days. Downtimes can also be very costly if there is no system for preserving and accessing up-to-date patient information and maintaining uninterrupted patient care. Healthcare organizations have implemented systems for recovering from disasters, but not for protecting data and continuing operations during downtime. Without such a system, downtime can become more than an annoyance — it can be a life-threatening event.

Distinct from disaster recovery — which helps get systems back up when they go down due to a power outage or property damage — business continuance keeps vital business operations running at or near normal capacities in the event of any network or system downtime. That includes the downtime that occurs while disaster recovery mechanisms are being executed.

There are various solutions available that can help healthcare organizations remain functional during downtimes. However, they have drawbacks. Redundant or fault-tolerant systems can keep computers running and available during a system failure or power outage, but if they are the only system being used for business continuance and the network also goes down, clinicians will not be able to access patient data. Printing patient reports periodically allows clinicians to have current data on hand; however, it is a time-consuming and cumbersome task that diminishes data security, not to mention a waste of paper, ink, and other resources.

To maintain access to patient information from the location where it’s needed, healthcare organizations need to select a business continuance approach that provides the most protection in the most circumstances. Ideally, a business continuance solution should enable healthcare organizations to do the following:

  • Identify critical information and automatically distribute it to the areas where it will be needed in the event the HCIS is unreachable;
  • Ensure the information is secured but available on local machines;
  • Maintain seamless operation in the background, notifying administrators of any interruptions; and
  • Eliminate the storage of data in paper form, saving paper, ink, and printers.

Intelligent report generation and distribution decentralizes data ahead of any downtime by sending the latest reports from the HCIS to a separate system and creating secure databases in multiple locations. The information is indexed in the database so clinicians can search for and find the data they need whenever they need it.
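One way to picture this approach: the distributed copy can be as simple as an indexed local database that each department queries when the HCIS is unreachable. A rough sketch, where SQLite and the three-column schema are my own assumptions rather than any vendor's design:

```python
import sqlite3

def build_local_index(db_path, reports):
    """Load the latest HCIS reports into a local, searchable store so a
    department can still look up patient documents during downtime.
    `reports` is a list of (patient_id, report_type, body) tuples."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS reports
                   (patient_id TEXT, report_type TEXT, body TEXT)""")
    con.executemany("INSERT INTO reports VALUES (?, ?, ?)", reports)
    con.commit()
    return con

def find_reports(con, patient_id):
    """Index-style lookup a clinician would run during an outage."""
    cur = con.execute(
        "SELECT report_type, body FROM reports WHERE patient_id = ?",
        (patient_id,))
    return cur.fetchall()
```

A production system would add encryption at rest and access controls on each local store, but the indexing idea is the same.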

With access to critical data during periods of system failure or extended downtime, clinicians can provide uninterrupted care and healthcare organizations can mitigate risks to patient safety. Patients can be assured that their health records are up to date and secure, and confident that they are receiving the best possible care.

Arthur Young is the president of Interbit Data of Natick, MA.

Ode to the Dumbwaiter and Robo-Butt
By Frustrated Farmacist (Female)

I saw your blurb about the Aethon TUG delivery robots installed at El Camino Hospital. The old ECH had an awesome dumbwaiter delivery system in place.

It is rumored that the Aethon TUG delivery robot solution was something of an afterthought that came six agonizing months after the grand opening of the $470 million hospital. Apparently earlier plans to integrate a delivery system may have been (ahem) overlooked. You can see from ECH documents that the robot contract was drafted in January 2010, several months after the new hospital opened. Early reports said the robots didn’t work for all departments and some ended up using volunteers, auxiliary staff, temporary workers, and other solutions to get medications and lab materials delivered.

$470 million and NOBODY initially planned an integrated medication delivery and lab transport system for a brand new, ultimate-in-high-tech 400-bed hospital! It doesn’t take too much imagination to extrapolate how important timely medication delivery is in the patient care scheme of things and why it’s the top complaint and employee satisfaction issue for nurses.

ECH’s competitor down the road has been using a similar robot system from Pyxis for the PAST 22 YEARS. It’s on its third generation, fondly named Robo-Butt. He travels in elevators and down halls to six floors and 15 departments. He is guided by sensors in the walls and speaks aloud to nurses to alert them when he arrives and when he commandeers the elevator. He steers around obstacles. His compartments are locked and secure, requiring a numbered password to open. He’s powered by six car batteries that are recharged and swapped for backups 2-3 times daily.

He breaks down every now and then. The elevators break down more often, grounding him on the first floor. Pyxis no longer supports these robots, so parts were scrounged from the basements of other hospitals and from a hospital supply house in Hawaii. But Robo-Butt WORKS. Here’s a picture of this bad boy:

[Photo: Robo-Butt, the Pyxis delivery robot]

The average hospital pharmacy department dispenses at least 40% more meds than were ordered because of late deliveries or items that are misplaced. The overhead and amount of wasted labor and supplies is unacceptable and frustrating for everyone involved, including the patient, nurse, doctor, pharmacy, and departments like lab and surgery that are held up because of medication delays.

With pneumatic tubes, you place meds in padded bullets and shoot them to the receiving department. Fragile ampules and vials can be broken — think about Epogen, a blood booster whose fragile proteins are destroyed by a violent trip in the tube system. It used to cost $6,000/vial and is still in the $600-900 range. Can I tell you how many bullets have exploded inside the pneumatic tube tunnels? Can I tell you what I think about the ER department tubing patient urine and blood samples to the lab inside this system?

We still have bullets that wind up in the basement due to malfunctioning suction or drivers. It’s hilarious when it’s a $45,000 rattlesnake venom antidote. But the bad part is that sometimes meds lie in piled up bullets in the tube receiving bin. Worse, the staff goes to send an “empty” tube to lab or ER and accidentally send a bullet filled with meds. The worst part is there is no “track-ability” or accountability — we can never tell whether someone received a bullet. If they “say” it never arrived, we have to send it again.

Here’s why I think dumbwaiters may be the ultimate smart medication delivery system with the fastest turnaround times, the least amount of waste, and minimal lost meds and lost charges. The pharmacy staff places labeled / bagged medications into little sectioned trays (like your silverware drawer’s insert) and leaves them in the little locked elevator. The nurse who needs meds comes to her department’s locked elevator door, calls up the tray, and REMOVES ONLY HER PATIENT’S MEDS. She leaves the other meds in their little slots.

Think about this. You don’t have one nurse removing meds for her entire department and then misplacing them, storing them improperly (oops, that expensive IVIG that cost $20,000 belonged in the refrigerator?) or just putting them in her POCKET and forgetting to put them in the med room altogether. Then the Pharmacy can call the dumbwaiter down later and retrieve unused meds, credit them back, and recycle them. 

You can imagine that the number of missing med phone calls drops by half. Anyone in the pharmacy can check the dumbwaiter and see if missing meds are there before re-making them. Can I tell you how much time I waste every day re-doing the same missing med that simply gets misplaced or misdelivered with no way to track it? Sometimes it’s cheaper to bag up another blood pressure pill with the patient’s name and send it again. And then we get to retrieve all the duplicates, sort them, and restock them in the pharmacy bins…

I have nightmares about this.

A reliable well-planned medication delivery system is worth $$$ millions and makes up 80% of the nursing / customer satisfaction basis. I swear this is true! Any healthcare organization that builds a state-of-the-art facility without planning a delivery system is completely ignorant.

Done with my rant. 😉

Readers Write 8/17/10

August 16, 2010 Readers Write 3 Comments

How Would You Define a Secure Database?
By Robert J. Rogers, MD

While driving to work in late June, my phone rang. I saw it was my office manager calling. For those of us who own our practices, an early morning call from the office manager is rarely good news.

“Dr. Rogers, someone broke into the office last night and stole the computers!”

Thus my partners and I began our saga of learning the ins and outs of dealing with a potential breach of protected data. We are the “Texas allergy clinic” referenced by Mr. H in the Monday Morning Update of 8/9/10.

(Let me briefly mention now that our database was purely for our practice management system. We do not use an EMR yet. More on that later).

Selfishly, my first thought was, “I hope our backup is good”. A few years ago, we experienced a server crash and learned that our backup was corrupted, requiring a manual rebuild of our database. Fortunately, we learned the backup was fine when the new computers were installed. I naively thought our biggest challenge was behind us.

We decided to check with the Texas Medical Association regarding our reporting responsibilities. We were directed to the AMA’s summary of the HIPAA Data Breach Notification Rule, which was enacted in September of last year. It was at this point that we learned the very important distinction between password protection and encryption.

As I suspect is true in most offices, we were under the impression that our database was secure since we needed a username / password combination to gain access to it. We use a well-known practice management system supported by a local reseller. Password protection was the only security measure discussed.

However, we learned that the database was considered vulnerable because it was not encrypted, thus triggering our reporting responsibilities (a first-class letter to each affected party, and notification of the local media if more than 500 individuals are affected). I will leave it to your imagination to consider the logistics of sending letters to 25,000 patients.
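For readers unclear on the distinction the practice discovered: a password controls who may log in, while encryption renders the files themselves unreadable without a key kept elsewhere. A small sketch using the widely used Python `cryptography` library (the record contents are obviously hypothetical):

```python
from cryptography.fernet import Fernet

# Generate a key once and store it separately from the database --
# for example, in a key-management service, never on the server
# that could be stolen along with the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"DOE, JANE | DOB 1970-01-01 | policy 123456"  # hypothetical row
token = cipher.encrypt(record)          # what actually sits on disk
assert token != record                  # unreadable without the key
assert cipher.decrypt(token) == record  # recoverable only with the key
```

With encryption at rest like this, a stolen server yields ciphertext, which is exactly why the breach rule treats unencrypted databases differently.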

This was a nightmare until we learned that commercial printers and mailing services can handle everything — stuffing, addressing, stamping, and mailing (for a fee, of course). [Mr. H — I didn’t actually complain about the cost of this process. I just responded to the reporter’s question regarding the cost of the mailing.]

Being victims of this crime has triggered a number of questions that I hope some of you may be able to answer. Now that I have learned the importance of encryption, I wonder why it is not provided automatically by all vendors. Is it complicated or expensive?

In an informal survey of my physician friends, none of them understood the importance of encryption. None had asked their vendors about encryption. Many of these doctors host their own servers.

Our potential data breach was important mainly due to the potential for identity theft since we don’t use an EMR (fortunately, in this case). That’s bad enough, but I worry even more about the thousands of physicians who use EMRs and may not use data encryption, thus making sensitive medical information potentially accessible.

As a patient, should you ask your doctor about the security measures in place?

The Data Breach Rule requires notification of the local media if more than 500 patients are affected. I wonder about the wisdom of that requirement. Might the thief be unaware of the importance of the stolen server until learning about it from media reports?

Because of our experience, we elected to change to an ASP model for our software, using an off-site server accessed through an encrypted virtual private network. We think this is an adequate level of security, but we thought our previous system was secure, too. Is our database now secure?

As we rush to encourage all physicians to use EMRs, how can we make sure that all involved understand these important security issues?

Robert J. Rogers, MD is a physician with Fort Worth Allergy and Asthma Associates of Fort Worth, TX.

The Business Associate “Relationship”
By Stephanie Crabb

We are working with many customers who are looking to implement Data Loss Prevention (DLP) as part of their information security and compliance programs. The best-practice deployment of these solutions requires collaboration with the HIS application vendors that contribute to the ePHI data life cycle so that the DLP solutions can efficiently and effectively content fingerprint targeted data “at the source,” in the applications themselves. To some, this might fall under the rubric of “integration” as we have come to define it in healthcare, a common practice.
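For readers unfamiliar with the technique: content fingerprinting at the source typically means hashing the targeted field values so the DLP engine can later match them in outbound traffic without storing the ePHI itself. A toy sketch in Python, where the salt, field values, and whitespace tokenization are illustrative assumptions rather than how any particular DLP product works:

```python
import hashlib

SALT = b"site-specific-salt"  # assumed; would be kept secret in practice

def _h(value):
    """Normalize and hash one field value."""
    return hashlib.sha256(SALT + value.strip().lower().encode()).hexdigest()

def fingerprint(values):
    """Hash targeted field values pulled 'at the source' so the DLP
    engine can match them later without holding the ePHI itself."""
    return {_h(v) for v in values}

def contains_fingerprinted(text, prints):
    """Flag outbound text if any whitespace-delimited token matches
    a fingerprinted value."""
    return any(_h(tok) in prints for tok in text.split())
```

This is exactly why the vendor's cooperation matters: without a sanctioned path to the source fields, the covered entity cannot build the fingerprint index in the first place.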

One small client of ours, a hospital of just 20 beds with an unwavering commitment to patient privacy and data security, approached its core HIS vendor, Meditech, with a formal request to connect directly with the database (aka “dictionaries” in Meditech-speak) to accomplish the implementation of its DLP solution. The data set was minimal — six fields of basic, and I mean basic, data to start. 

This request, surprisingly to us, was met with a firm “no” from Meditech. Why? They consider this “customization.”

Respecting Meditech’s longstanding position in this area, I personally worked with our customer to develop the business case to present to Meditech as to why the request warranted reconsideration. We cited areas around breach notification, uses and disclosures, and the like to inspire cooperation from Meditech and to put into clear context that DLP was a technology being adopted specifically to demonstrate compliance with HITECH and meaningful use.

The Meditech account representative acknowledged that they would need to do better in the future, but until they had a “critical mass of requests” from their clients to work with another vendor (like the client’s selected DLP vendor), their answer was still no. Understanding that our client’s Meditech account rep only has so much authority, my CEO and the client CEO requested a personal meeting with Howard Messing, only to be told that Mr. Messing could not accommodate their request. 

This is about a simple permission that the vendor could absolutely grant, one that requires little to no effort on its part whatsoever. It is a permission that other HIS vendors have eagerly provided. Oh, the vendor did offer to sell our client a module that would make this “easier” to the tune of $40K, even though what the client needs to access is already present in its implementation.

DLP is not the only emerging technology that holds tremendous promise for organizations looking to reduce their data loss / data breach risk, enhance the controls around their data and its uses, and protect patient privacy. Unfortunately, covered entities cannot accomplish the implementation of these technologies alone. They need the business associate to collaborate, facilitate, and, sometimes, participate. And let’s face it, the rise of technologies like DLP that offer compensatory controls for privacy and security has resulted, in part, because the HIS vendors have been slow to respond with their own system capabilities.

I really do not mean to single out Meditech here. There are certainly other vendors who subscribe to similar operational models. This is, however, precisely the client service mindset that needs to change and that HITECH is requiring, particularly when technology is not the barrier. If these implementations are technically possible and largely resource-neutral to the vendor "business associate," why delay or deny their clients the opportunity to close the privacy and security gaps that are requisite to achieving meaningful use?

While the content of the NPRM may set off a chain of events whereby business associates become even more conservative in their commitments to privacy and security collaboration with their covered entity partners, there really is nowhere to hide, regardless of how ambiguously HIPAA and HITECH may be written. If you are in this space and in the business of touching ePHI in any way, you have to be “all in” — technically, operationally, and in the way you serve your clients and the industry at large.

It is simply not acceptable to relegate privacy and security considerations to the back burner or, worse yet, leave your client holding the bag. OCR recently clarified that “willful neglect” includes failure to take action when one recognizes a risk. Business associates who fail to respond when covered entities ask them to address a perceived risk could find themselves in an uncomfortable and costly position if a breach occurs that could have been avoided.

Stephanie Crabb is VP of client services with CynergisTek of Austin, TX.


Why Are Lab Orders from Ambulatory EHRs So Hard?
By Ken Willett


While hospitals with integrated inpatient EHR systems are claiming high adoption rates for CPOE (in some cases 100%), most providers in ambulatory settings are still creating lab orders outside their EHR. What makes it so much harder?

An integrated HIS system, which includes lab and radiology ordering, can present the provider with the correct choices for that hospital’s services. In the ambulatory world, the EHR is much more limited (being less expensive), yet the variety of external service providers is larger. There are generally multiple labs, with ordering rules governed by insurance contracts, and each has its own test catalog, data requirements for the order, HL7 dialect, and requisition formatting requirements.

The provider wants to quickly capture what tests to order and why. Their manual process is a few strokes on a preprinted superbill or order slip, and it is very hard for an automated system to compete with that. Ordering functions in the EHR are often cumbersome, yet at the same time too generic to capture the specifics needed by the lab or radiology provider.

The lab, on the other hand, needs an order which is complete, with billing information, the lab’s order codes, appropriate Ask at Order Entry (AOE) questions answered, and the correct requisition and specimen labels printed. To try to assure that orders coming to the lab are accurate and complete, the lab will generally provide an order entry application and workstation to the practice, for use by a phlebotomist or other staff person.

Having a lab-specific ordering application addresses the problem of making sure the order reflects the most current compendium data of that lab (test codes, AOE questions, specimen requirements, and medical necessity rules), at the expense of having a separate ordering application for each lab (and in many cases, due to Stark laws, separate workstations, printers, etc.). Re-entry of order data, together with the need to use multiple ordering applications, significantly increases the likelihood of error.

Improving lab ordering within the ambulatory EHR is difficult because ordering rules need to be configured for each lab and compendium data needs to be constantly updated. This is a significant burden for each practice to undertake.

Given that we are now in an environment with much more seamless connectivity between applications (with web services and other technologies), I believe a better solution is to move the ambulatory ordering function out of the EHR itself and instead provide orders via a connected SaaS application. This can allow compendium data management to be done in one place for multiple practices and multiple labs while still giving the provider and phlebotomist direct access to a universal ordering interface.
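To make the idea concrete, here is a minimal sketch of what such a centrally managed compendium service might do: translate a universal test request into a lab-specific order while enforcing that lab's Ask at Order Entry (AOE) questions. All lab names, test codes, and AOE questions below are invented for illustration.

```python
# Hypothetical central compendium: per-lab test codes and required AOE questions.
COMPENDIA = {
    "acme_lab": {
        "lipid_panel": {"code": "80061", "aoe": ["fasting"]},
        "urine_culture": {"code": "87086", "aoe": ["collection_method"]},
    },
    "beta_lab": {
        "lipid_panel": {"code": "LIP-1", "aoe": []},
    },
}

def resolve_order(lab, test, aoe_answers):
    """Translate a universal test name into a lab-specific order,
    verifying that all of that lab's AOE questions are answered."""
    entry = COMPENDIA[lab][test]
    missing = [q for q in entry["aoe"] if q not in aoe_answers]
    if missing:
        raise ValueError(f"Unanswered AOE questions: {missing}")
    return {"lab": lab, "code": entry["code"], "aoe": aoe_answers}

order = resolve_order("acme_lab", "lipid_panel", {"fasting": "yes"})
print(order["code"])  # prints the lab-specific order code: 80061
```

The point of centralizing this lookup is that when a lab updates its compendium, the change is made once for all connected practices, rather than in each practice's ordering application.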

Only some EHRs have the necessary integration capabilities to allow this sort of user interface extension. Still, this seems like a promising direction to improve provider adoption of electronic orders.

Ken Willett is president and CEO/CTO of Ignis Systems of Portland, OR.

Readers Write 8/9/10

August 9, 2010 Readers Write 4 Comments


EHR Exit Strategy
By Robert Doe, JD

While negotiating license agreements for my clients, we typically focus on functionality issues, warranties, uptime representations, support issues, etc. However, with all the distractions in the world of incentive payments and penalties, one thing I suggest my clients give some additional thought to is: what will happen when the relationship with the EHR vendor ends? What will be the “exit strategy”? What do you need to include in the contract with the vendor to ensure the transition to a new system?

One significant concern is what will happen to the data your organization has entered into the EHR system after the license agreement terminates or expires. If the system is being utilized by multiple organizations, will you be required to leave a copy of the information in the system? You will also want a clear plan for how your organization will take its data to a new system upon termination of the relationship with the EHR vendor. Can records be easily copied and/or exported electronically with the current EHR system? It is my understanding that this is not always a simple task.

In addition, what issues might arise if the license agreement terminates abruptly, as may happen in the event of a breach of contract? The main concern becomes business continuity. You may want to consider including a provision in the license agreement obligating the vendor to provide transition services while you transition to a new system. Ideally, you should be able to fully use the EHR system during this period. Typically, the user pays for these services.

While most organizations are focused on finding and implementing an EHR system, I would suggest giving some thought to the life cycle of the system and devising an “exit strategy” for the time when the license is terminated or expires. Your license agreement should include appropriate provisions to allow you to carry out a smooth transition.

Bob Doe is a founding member of BSSD, an information technology law firm located in Minneapolis, MN.

Our Organization’s Comparison: Cerner vs. Epic
By Roe Coulomb

You asked in a previous post why Epic beats Cerner for every important deal. I previously worked at an organization that did a side-by-side comparison between Cerner and Epic, eventually choosing Epic. I thought it might be helpful to your readers to know the factors that went into that decision.

Corporate Culture

What’s not to like about an organization that has a CEO as accessible as Judy? One year at HIMSS, I observed her rearranging the waste baskets in their booth to make it more user-friendly for her staff servicing customers. That’s servant leadership!

Even before they moved to Verona, their corporate headquarters felt like a college campus. That was partly due to the age of the employees, but also their dress code and the eccentric artwork they’ve acquired over the years.

Does Neal or his top echelon of VPs even attend HIMSS? I’ve never seen him. If the suits do attend, I bet you won’t see them at the booth rubbing shoulders with the average Joe.

Their corporate headquarters is your typical Fortune 500. Lots of suits. Stuffy. Need I mention the bad PR from Neal’s e-mail tirade that was leaked a few years back?

Integration of Ambulatory and Inpatient Records

This is a significant factor for any organization that has a large employed physician group and wants an integrated database for their billing/ADT and EMR data. There are huge opportunities for streamlining things like medication reconciliation from physician office to the inpatient setting and back to the primary care doc.  

Not to mention that providers who practice in an office and also do inpatient work have only one application to learn. Once they are doing order entry and documentation in their office, implementing CPOE and clin doc in the hospital is far simpler (for those physicians, anyway).

Epic’s got this nailed! Cerner, not so much.

Implementation Philosophy 

Your 7/28 post said, “A CIO reader who knows both systems says Cerner requires clients to take ownership of the design and use outside consultants, while Epic offers a more turnkey implementation at a higher price."

That’s true to some extent (the Epic turnkey statement), but it wasn’t always that way. Epic got lots of feedback from customers that there were too many options and decisions about how to implement a specific function, so they picked best practices and built them into their model system.

Nevertheless, there are still a lot of consulting firms out there with Epic practices and I am not aware of a major medical center that has installed Epic without using consultants.

Software Usability

At the end of the day, all that other stuff doesn’t really matter if the software sucks. How usable it is for employees and docs is what counts. 

During the evaluation we did at my former employer, Epic was simply easier to use. Cerner’s screens were very busy with all kinds of tabs and lots of clicks and keystrokes. I recall one screen where there were 30 “chart-like” tabs across the top of the screen.

I recently viewed a Cerner demo at my current employer of how a nurse would change one piece of data, like a heart rate, acquired through a monitor device interface. First, click on the cell with the data, click Clear, click Sign to accept the remaining data, go back to the cell, click Edit, enter the new results, click Sign again.

It’s been awhile since I saw this on Epic, but my recollection is that it was like making a change to data in a spreadsheet. First, highlight the data to be changed, over-type with the new results, click Accept. Far simpler!

You said, "… which would seem to indicate that Millennium isn’t up to the task. In other words, a $6 billion market cap company with a single, fairly low-rated product line that’s getting hammered by a smaller and much higher-rated competitor should think about developing a better product."

Isn’t that what Millennium was supposed to do for Classic? I recall reading about the hit Cerner took to its bottom line in the years it poured resources into developing the new Millennium architecture. I guess one measure of how successful that was is how many Cerner customers are still using Classic.

Finally, I think you hit the nail on the head: “… Cerner has built a business that could weather Neal’s transition or sale to another organization, but we don’t know that with Epic.” Judy won’t be running Epic forever. What happens when she’s gone? Can Carl or whoever replaces her continue to run it as a private company, or will they be forced to sell?

Readers Write 7/28/10

July 28, 2010 Readers Write 3 Comments


How to Use Meaningful Use Measures to Improve Internal Processes
By Shubho Chatterjee, PhD, PE


The final rule on Meaningful Use was released by the Centers for Medicare and Medicaid Services in July of this year after a year of comments and revisions. To be eligible for incentive payments, Eligible Professionals (EPs) are required to submit to CMS, starting October 2011, 20 objective measures: 15 core objectives plus an additional five from a menu of 10. For hospitals and Critical Access Hospitals (CAHs), the corresponding measures are 14 core objectives plus five from a menu of 10.

There are various efforts, dialogues, and debates underway regarding the ability of EPs, hospitals, and CAHs to meet the reporting requirements, whether the cost justifies the incentives, and the sheer human and technical capacity needed. I will not add to those discussions, but will instead focus on how the MU criteria can be used to improve the care delivery process, make it more efficient, and positively impact the operating margin. After all, a measure reflects the output of a process; beyond simply being met, it can be used to home in on the process and sub-processes for improvement.

Let us consider some of these Stage 1 measures and how the underlying processes supporting each measure can be identified and improved — raising the measure itself while also improving care delivery and the operating margin.


Stage 1 Measure
More than 30% of unique patients with at least one medication in their medication list seen by the EP or admitted to an eligible hospital’s or CAH’s ED have at least one medication order entered using CPOE.

Implication
Let’s assume that the provider meets the 30% threshold for the reporting period. A logical follow-through is to examine why the remaining orders are not entered via CPOE, and what barriers were overcome to reach this threshold. Is it because data entry is manual for the remaining patient population (other care locations are not CPOE-enabled)? Because CPOE is available but under-utilized? Or because manual data entry is required to move data between systems and consolidate it into one final measure?

Each of these barriers points to a different challenge. The first is system unavailability (a business decision). The second is change management (a people challenge). The third is technical and process automation, requiring an interface or other electronic inputs, such as document management and integration.

Stage 2 and Stage 3 measures will increase the threshold. Thus the underlying process or system gaps should be identified not only to meet later Stage measures, but to improve process efficiencies as well.
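The arithmetic behind the measure is simple, but writing it out makes the denominator explicit: only unique patients with at least one medication in their list count. The patient records below are invented for illustration.

```python
# Toy patient records for illustrating the Stage 1 CPOE measure calculation.
patients = [
    {"id": "p1", "has_med_list": True,  "cpoe_med_order": True},
    {"id": "p2", "has_med_list": True,  "cpoe_med_order": False},
    {"id": "p3", "has_med_list": True,  "cpoe_med_order": True},
    {"id": "p4", "has_med_list": False, "cpoe_med_order": False},
]

def cpoe_measure(patients):
    """Share of unique patients with a medication list who have
    at least one medication order entered via CPOE."""
    denom = [p for p in patients if p["has_med_list"]]
    numer = [p for p in denom if p["cpoe_med_order"]]
    return len(numer) / len(denom)

rate = cpoe_measure(patients)
print(f"{rate:.0%}, meets 30% threshold: {rate > 0.30}")
# prints: 67%, meets 30% threshold: True
```

Examining the patients who fall outside the numerator (p2 here) is precisely the follow-through described above: each one represents a location, adoption, or integration gap.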


Stage 1 Measure
More than 40% of all permissible prescriptions written by the EP are transmitted electronically using certified EHR technology.

Implication
Assuming the 40% threshold is met, what is necessary to increase the measure? Is it the volume of data entry from single or multiple locations, a system that is not fully utilized, or a receiving pharmacy that is unable to absorb additional volume from its customers? Again, the barriers are similar to those above and need to be analyzed and overcome.

Stage 1 Measure
More than 10% of all unique patients seen by the EP are provided timely electronic access to their health information (available to the patient within four business days of being updated in the certified EHR technology), subject to the EP’s discretion to withhold certain information.

Implication
This requirement has procedural, technical, and operational implications. The procedural requirements lie in providing HIPAA-compliant health information, while the technical requirements lie in the mode of providing it. For example, will a secure patient portal be created? Will the information be provided on memory sticks or other portable devices, and if so, what is the encryption or data protection policy?

Note that, depending on the technical solution selected, there are supply chain and purchasing requirements as well, to maintain and increase the measure threshold.
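The "four business days" window in this measure is itself a small operational calculation worth pinning down, since weekends don't count. A minimal sketch, with invented dates:

```python
# Sketch of the "timely access" check: information must be available to
# the patient within four business days of being updated in the EHR.
from datetime import date, timedelta

def add_business_days(start, days):
    """Advance a date by the given number of Monday-Friday business days."""
    d = start
    while days > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday (0) through Friday (4)
            days -= 1
    return d

def is_timely(updated, available):
    return available <= add_business_days(updated, 4)

# An update made on Friday 2010-07-02 is due the following Thursday.
print(add_business_days(date(2010, 7, 2), 4))           # 2010-07-08
print(is_timely(date(2010, 7, 2), date(2010, 7, 9)))    # False
```

Note that a real implementation would also need a holiday calendar; this sketch counts weekdays only.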


Summary
While MU provides financial incentives for healthcare organizations, those incentives end in 2015. It is important for healthcare organizations to use this opportunity not only to prepare for, apply for, and receive the incentives, but to examine their organizations deeply from a people, process, and systems perspective so they can meet and then build upon the measures.

Only when these three supports are robust and reliable will Meaningful Use be truly meaningful to the healthcare system, where improving the quality of care is the most important objective; operational improvements and business growth will likely follow.

Shubho Chatterjee is chief information officer of Miami Jewish Health Systems of Miami, FL.

 

Bringing Medical Terminology Management into the 21st Century — Just in Time for ICD-10
By George Schwend


ICD-10 promises to improve patient safety, the granularity of diagnosis codes, and diagnostic and treatment workflows as well as billing processes. Sounds like a dream, right? But close to three years from the mandated switch on October 1, 2013, most hospitals and health systems are still thinking of it as a nightmare, dreading the massive amount of time, effort, and money the transition will require.

What many fail to grasp is that ICD-10 is just one step on an endless road. There are already dozens of code sets that will probably eventually need to be integrated with each other — from SNOMED-CT and LOINC to RxNorm to local terminologies and proprietary knowledge bases — and all of them are constantly evolving. Look down the road and you can see ICD-11, already in alpha phase in Europe.

Instead of tackling each new iteration as if they were setting off on a major road trip through uncharted territory, providers, payers, and IT vendors need to ditch the proverbial roadmaps and get themselves a GPS unit. That way, they can simply enter each new destination as it comes along and travel there automatically.

And automation is what true semantic interoperability requires. Our metaphorical GPS could either be embedded in proprietary HIT software or plugged into a hospital’s or payer’s information system and triggered by specific events such as an update or the need to create new maps. It would allow users to automatically:

  • update, map, search, browse, localize, and extend content
  • incorporate and map local content to standards
  • update standard terminologies and local content
  • generate easy-to-use content sets to meet the needs of patients, physicians, and customer support professionals
  • reference the latest terminology in all IT applications
  • codify free text
  • set the stage for converting data into actionable intelligence
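The "map local content to standards" bullet is the heart of the matter. In miniature, it amounts to maintaining a curated map from a site's local dictionary to standard terminologies; the local codes below are invented, though the LOINC targets are real published codes:

```python
# Toy terminology map: local lab dictionary entries -> standard codes.
# The local codes are hypothetical; real maps come from curated content
# that must be continuously updated as terminologies evolve.
LOCAL_TO_STANDARD = {
    "GLU-SER": ("LOINC", "2345-7"),   # serum/plasma glucose
    "NA-SER":  ("LOINC", "2951-2"),   # serum/plasma sodium
}

def map_local(code):
    """Return (terminology, standard_code) for a local code,
    or None so unmapped codes can be flagged for human review."""
    return LOCAL_TO_STANDARD.get(code)

print(map_local("GLU-SER"))  # ('LOINC', '2345-7')
print(map_local("XYZ"))      # None -> route to a terminologist
```

The "GPS" argument above is that this map should live in one continuously maintained service rather than being rebuilt by every hospital at every code set release.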

Happily, software that fits the bill is already available, in use today at more than 4,000 sites on five continents. It provides mapping and terminology for leading HIT vendors, for health ministries like the UK National Health Service, and for standards organizations such as the IHTSDO, owner of SNOMED-CT, allowing them not only to implement new codes but to synchronize codes throughout an enterprise, be it a physician practice or a country.

If you are still having nightmares about ICD-10, this is your wake-up call. The ability to merge and manage diverse content from multiple sources — including free text from physician dictation — is what will turn ICD-10 from a frantic, one-off billing upgrade into one in a series of opportunities seized: to move clinical diagnosis to a new level, to optimize EMRs, to meet meaningful use requirements, to satisfy quality initiatives such as the Physician Quality Reporting Initiative, and to support robust analytics and reporting.

Can a roadmap do all that? Hardly.

George Schwend is president and CEO of Health Language, Inc. of Denver, CO.

 

HIE Market, A Shot in the Arm
By Tim Remke

The HIE market finally got a shot in the arm with the passage of the federal stimulus. This and other tailwinds will send hundreds of millions of dollars toward the HIE market over the next few years. From this point on, though, the HIE market gets muddled. Who is marketing which solutions to which markets? Which deployed use cases are functional, let alone operating at a high level? What differences exist between multi-stakeholder, state, and private HIEs? These questions are mixed among many other multi-faceted ones.

The loose definition of a health information exchange has diluted the significance of surveys and their results, particularly those that seek to understand what types of data are exchanged, the number of HIEs in the market and their respective operational capacity, and technological and governance structures. Simply put, too many results are self-reported and produce statistically insignificant, inaccurate, or misleading data points.

Of particular concern, several market surveys and reports have commingled data by combining statistics from provider organizations whose solutions were developed as basic hospital portals — a far cry from a broader HIE platform. Finally, HIEs may be private, multi-stakeholder, or statewide entities, and payer systems and public health further delineate the market. The idea of a “community HIE” is limiting and does not tier the HIE market appropriately.

With this perspective and understanding, we assess a few basic aspects of the current state of the HIE market.

Target Markets
A tremendous amount of friction exists over which specific HIE markets are accelerating faster than others, and which companies target each market. For example, a few vendors are persistent in their belief that the private HIE market is the right first go-to-market target. They look for localized geographies or a few hospitals to install an HIE platform as an overlay solution that acts as a “buffer” to a larger regional or statewide exchange.

Within the same HIE market, but counter to this strategy, are vendors who seek larger contracts from statewide or vast regional, multi-stakeholder exchanges. These two approaches produce variations, some small and some significant, in solution focus and offerings. However, the data indicates an expected consistency: all vendors will market to almost any market. Slicing through the data, though, we see vendors that are targeted. All focus on hospital-to-hospital environments, approximately 85 percent also focus on providing an acute-to-ambulatory framework, and less than 40 percent offer a platform that readily integrates physician groups.


In addition, and somewhat paradoxically, many solutions are simply not designed to operate as platforms for vast geographic or state exchanges. For the multi-stakeholder market, then, the field of viable HIE solutions narrows considerably, and a gap emerges between the markets vendors target and the ability of their solutions to serve those markets. HIE vendors seem as conflicting in their details as they are synchronized in their direction — characteristics of a market still nascent relative to the past few years.

Critical Minimal Requirements
In recent months, we have seen a number of RFPs containing long lists of demands. These mask a serious issue in the HIE market: most HIEs are ill-equipped to take on the sophisticated and complex solutions, use cases, and technical architectures they desire. Furthermore, over 65 percent stated that even minimal exchange of data among information systems was posing “mission critical problems” for their exchange and would cause “serious delays.” The table below compares minimum versus preferred requirements for an exchange structure.

[Table: minimum vs. preferred requirements for an exchange structure]

Conclusion
The HIE market is dynamic and has hit full stride. Companies that have weathered the storm seek potential exits (i.e., mergers and acquisitions) while others are ramping up their solutions for the future. The market will likely sustain an unusually high growth rate for the next one to two years.

However, many questions will remain unanswered. Business models, measured quality improvements, and funding, among other items, persist as open questions. For example, initial stimulus funds will jump-start statewide HIEs, but after these funds are depleted, real concerns about long-term viability and funding sources will endure.

Tim Remke is vice president of business development for HealthcareCIO, which produced the Health Information Exchange (HIE) Comprehensive Analysis & Insight report from which aspects of the above article were taken.

Readers Write 7/15/10

July 14, 2010 Readers Write 8 Comments

Achieving EMR Usability in Today’s Complex Technology Market
By Odell Tuttle

Recognizing the importance of human/computer interaction, the HIMSS EHR Usability Task Force developed 11 principles of usability — a framework that provides methods of usability evaluation for measuring efficiency and effectiveness, including patient safety. This framework is invaluable because many of today’s clinical systems fail to support clinicians adequately due to poor interface design.

From multiple data interchange and reporting standards, to formatting and encoding standards, to clinical processes and procedures — not to mention the government organizations and legislation — the EMR domain is vast and complex. For hospitals looking to implement an EMR, it is important to choose a technology partner with proven, tested systems in live use. For rural community hospitals, this becomes critical, because their needs are so distinct.

The HIMSS 11 principles of usability are a valuable tool in the EMR selection process. In summary, the principles are:

Simplicity
Everything from lack of visual clutter and concise information display to inclusion of only functionality that is needed to effectively accomplish tasks.

Naturalness
This refers to how automatically “familiar” and easy to use the application feels to the user.

Consistency
External consistency primarily has to do with how much an application’s structure, interactions, and behaviors match a user’s experience with other software applications. An internally consistent application uses concepts, behavior, appearance, and layout consistently throughout.

Minimizing Cognitive Load
Clinicians in particular are almost always performing under significant time pressure and in environments bursting with multiple demands for their attention. Presenting all the information needed for the task at hand reduces cognitive load.

Efficient Interactions
One of the most direct ways to facilitate efficient user interactions is to minimize the number of steps it takes to complete tasks and to provide shortcuts for use by frequent and/or experienced users.

Forgiveness and Feedback
Forgiveness means that a design allows the user to discover it through exploration without fear of disastrous results. Good feedback to the user supports this goal by informing them about the effects of the actions they are about to take.

Effective Use of Language
All language used in an EMR should be concise and unambiguous.

Effective Information Presentation – Appropriate Density
While density of information on a screen is not commonly measured (though it can be), it is a very important concept to be cognizant of when designing EMR screens.

Meaningful Use of Color
Color is one of several attributes of visual communication. First and foremost, color should be used to convey meaning to the user.

Readability
Screen readability also is a key factor in objectives of efficiency and safety. Clinical users must be able to scan information quickly with high comprehension.

Preservation of Context
This is a very important aspect of designing a “transparent” application. In practical terms, this means keeping screen changes and visual interruptions to a minimum during completion of a particular task.

Reliable usability rating schemes offer product purchasers a tool for comparing products before purchase or implementation.

Making complex things appear simple is a very difficult job.  However, by utilizing the HIMSS 11 usability principles, healthcare providers are armed with a powerful tool in the EMR selection process.

Odell Tuttle is chief technology officer at Healthland.

Tech Talk and Market Strategy – Smart Phones
Mark Moffitt and Chris Reed

Tech Talk – Dictating Reports within an iPhone App

Good Shepherd Medical Center developed an iPhone app that has achieved a very high rate of adoption by physicians (95%) by providing a high degree of customization. The second most popular feature of the app is accessing and playing radiology dictation when a report has not been transcribed and is not available for viewing. Viewing lab data is first.

One reason this feature is popular is that it eliminates the need for a physician to call a dictation system and enter an ID, medical record number, etc., on a telephone keypad. Using the iPhone app, they simply press a virtual button to play a dictation on the iPhone. That’s one less gadget a physician has to futz with.

It seemed logical that physicians would appreciate being able to record a dictation and view clinical results on the iPhone simultaneously without calling a dictation system and entering information on a telephone keypad.

Initially, we planned to integrate our iPhone app with a native dictation app. Unfortunately, that configuration requires multitasking to dictate while viewing clinical information on the iPhone. About one-half of the physicians using the app have 3G phones, and while iPhone OS 4 supports multitasking, it runs slowly on the 3G.

iPhone OS 3.1.3, the latest OS designed for the 3G and 3GS, supports viewing Web pages while talking on the phone. We used this capability to provide the ability to dictate reports and view clinical results from an iPhone. Our iPhone web app uses the URL scheme “tel” to send commands to the iPhone’s phone app.

tel: <1>, <2>, <3>, <4>, <5> # note: “,” instructs phone to pause

Where:

1. Telephone number of the dictation system.

2. Physician ID.

3. Site ID (hospital).

4. Job type (H&P, discharge summary, progress note, etc.).

5. Medical record number.

The shortcoming of this approach is that the iPhone dials the entries after the initial phone number slowly. Still, it is a big improvement over having a physician call the dictation system and enter the information manually.
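For illustration, assembling the "tel" URL described above is just string concatenation with comma-separated pause fields. The field values below are hypothetical placeholders, not real identifiers.

```python
# Sketch of building the dictation "tel" URL. Each comma instructs the
# phone to pause before sending the next field as touch tones.
def build_dictation_url(number, physician_id, site_id, job_type, mrn):
    """Assemble tel:<number>,<physician id>,<site id>,<job type>,<mrn>."""
    fields = [number, physician_id, site_id, job_type, mrn]
    return "tel:" + ",".join(str(f) for f in fields)

url = build_dictation_url("5551234567", "4021", "01", "2", "000123456")
print(url)  # tel:5551234567,4021,01,2,000123456
```

The web app would place this URL behind a virtual button, so a single tap dials the dictation system and navigates its prompts automatically.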

This is not our final solution. Sometime late this year or early next year when most physicians are using a 3GS or iPhone 4, we will switch to using a native app to dictate a report. If we had more resources, we would provide a version for iPhone OS3.1.3 and one for OS4.


Market Strategy – Smart Phones and EMRs

If the battlefield for winning the hearts and minds of physicians using electronic medical record (EMR) systems is shifting to smart phones and iPad-like devices, and I think it is, this trend may open the door for vendors like Meditech, Cerner, etc. to derail the Epic juggernaut.

Older systems lag newer systems like Epic in usability and user interface design, but smart phone software that operates on top of an underlying system can hide those flaws. It is possible, I contend, to neutralize Epic’s usability advantage among physicians by using an “agile” smart phone software model. An agile model is one that puts in the hands of the customer the ability to rapidly modify and deploy smart phone software to fit the specific needs of an organization. This approach does not change the functionality of the underlying system.

Customers using agile smart phone software can:

1. Configure the app in different ways to greatly improve workflow for different kinds of users (e.g., hospitalists, specialists, and surgeons) and for different types of smart phones.

2. Add data to the user interface to guide users toward a specific objective. For example, display house census, length of stay, observation patients and hours since admission, pending discharges, a one-touch icon for a pending discharge alert, etc.

3. Add features that make physicians' work easier. Examples include a one-touch icon to call the patient's unit or nurse; playing a recording or dictating on the smart phone while viewing clinical results; accessing a medication list directly from a PPM EMR without a patient master index between systems; and receiving clinical alerts.
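The "agile" configuration idea in the list above can be sketched as a per-role configuration table that drives the app's screens without touching the underlying EMR. The role names, screen names, and one-touch actions below are illustrative assumptions, not any vendor's actual product:

```python
# Hypothetical per-role configuration for an "agile" smart phone front end.
ROLE_CONFIG = {
    "hospitalist": {"screens": ["census", "pending_discharge", "labs"],
                    "one_touch": ["call_unit", "dictate"]},
    "surgeon":     {"screens": ["or_schedule", "labs"],
                    "one_touch": ["call_nurse", "play_recording"]},
}

def build_menu(role):
    """Return the screen list for a role; unknown roles get a safe default.

    Changing ROLE_CONFIG redeploys a new workflow to the phone without
    modifying the underlying EMR system at all.
    """
    cfg = ROLE_CONFIG.get(role, {"screens": ["labs"], "one_touch": []})
    return cfg["screens"]
```

The point of the sketch is that the customer edits a table, not vendor source code, which is what makes rapid per-organization tailoring possible.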

To compete, smart phone software must be core to your business. Give Epic credit for recognizing the strategic value of its smart phone software. However, Epic's smart phone software is "rigid," and that leaves the company vulnerable to smart phone software that is agile.

Mark Moffitt is CIO and Chris Reed is Manager at Good Shepherd Medical Center in Longview, Texas.

Readers Write 6/14/10

June 14, 2010 Readers Write 20 Comments


The EHR Manifesto
By Recently RIFed

A spectre is haunting America — the spectre of Meaningful Use. All the powers of traditional vendors have entered into a holy alliance to exorcise this spectre: Executive Office and ONC, Allscripts and Eclipsys, Epic, Cerner, McKesson, and Meditech.

Where is the software vendor that has not been decried as unusable by its opponents in power? Where is the software vendor that has not hurled back the branding reproach of unusable software, against the more integrated vendors, as well as against its reactionary adversaries? (My apologies to Karl and Friedrich).

10 Point Program to Improve EHR software

  1. Less configurable. The Demotivators® said it best: "When people are free to do as they please, they usually imitate each other." Every hospital or physician practice is unique — they uniquely solve the exact same problems everyone else is facing.
  2. Better designed. End-user input and UI design should be part of the specs, not the pilot.
  3. Customer-prioritized enhancements. Fifty percent vendor-driven (sales and demo feedback, regulatory requirements, infrastructure, etc.), 50% prioritized by customers. A yearly process: projects grouped into roughly equal numbers of hours, one vote per licensed bed, and the top-voted projects roadmapped to fill the customers' 50% of time.
  4. Consensus-driven standard content and configuration. Vendor designed, large group customer editing — majority rules, everyone uses.
  5. Remote hosted. 99.999% uptime, capacity and response time are key requirements.
  6. Rapid install. If you’ve followed 1-5, training the end-users should be the most time-intensive phase of the implementation.
  7. Qualified buyers. We'll sell to you if you agree to follow our standard workflows, use our standard build, and participate (end-user input, content design, and prioritization). You must agree to mandate adoption! Better to support 50 involved, committed customers than 100 unhappy, non-standard, partially implemented, low-adoption targets.
  8. Equitable pricing. Low upfront, subscription-based. Every customer pays the same, scaled by size or volume.
  9. Play nice with other vendors. Integration > Interfacing > Interoperating.
  10. Record portability. Remove vendor lock-in. The intersection of the NHIN and CCDs, with the market transitioning to replacement systems, will make this a necessity. You know it will be mandated eventually.
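Point 3's voting process is concrete enough to sketch. The following is a rough illustration under stated assumptions (one project pick per customer, vote weight equal to licensed beds, a fixed hour budget for the customer half of the roadmap); the manifesto does not prescribe these details:

```python
def prioritize(projects, votes, licensed_beds, hour_budget):
    """Pick customer-prioritized projects until the hour budget is full.

    projects: {project_name: estimated_hours}
    votes: {customer: chosen_project} (simplified to one pick each)
    licensed_beds: {customer: beds} -- one vote per licensed bed
    """
    # Tally bed-weighted votes per project.
    tally = {}
    for customer, choice in votes.items():
        tally[choice] = tally.get(choice, 0) + licensed_beds[customer]

    # Greedily roadmap the top vote-getters that still fit the budget.
    roadmap, used = [], 0
    for name, _ in sorted(tally.items(), key=lambda kv: -kv[1]):
        hours = projects[name]
        if used + hours <= hour_budget:
            roadmap.append(name)
            used += hours
    return roadmap
```

Example: three hospitals voting on three 100-hour projects with a 200-hour budget yields the two projects with the most bed-weighted votes.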

I can’t think of a single vendor that would get a passing grade on my 10-point scale (even the industry darling would only receive a 40%). But please, prove me wrong and post comments. As I review my RIF package and dust off my resume, I’d love to be proven wrong (and find out they’re hiring) …

Personally, I’d love to see a new breed of vendors emerge. Maybe someone will submit a FOIA request and hire a team of developers and clinicians to polish and fill in missing functionality. Maybe even someone willing to follow my manifesto and explore a co-op or non-profit corporate structure. Forget the socialization of medicine, let’s socialize the vendors. Until that happens, I’ll continue to remain anonymous and try to work from within.

Jump-Start HIEs with Integrated Health Records
By Ravi Sharma


One of the challenges that most EHR systems will have in satisfying the government’s Meaningful Use requirements will be to establish connectivity and interoperability with other providers’ systems and ancillary services. Disparate data from multiple providers must come together as a more complete patient-centric record to achieve this goal, and not all providers are ready for it. These and other business and logistical issues are some of the challenges that health information exchanges (HIEs) have encountered.

One solution is to use technology to leverage data generated through existing business relationships. This can be done through a Web-based, patient-centric “Integrated Health Record” (IHR) that integrates data from multiple sources and institutions. An IHR provides up-to-date, community-wide, patient-centric data such as lab and imaging orders and results, incorporating both hospital and reference labs.

It also can be used for ordering prescription drugs and leverage the patient’s allergies, drug history, and even lab data to prevent adverse events. Physicians can even follow the inpatient encounters for patients admitted in connected hospitals, along with outpatient data, from anywhere over the Web.

IHRs also improve the ability of patient care teams — physicians who must collaborate to provide comprehensive care — to coordinate care and share patient records. Today, such clinical information is shared between referring physicians via fax, mail, or phone. Even when practices have EHRs, they're often unable to send key patient data electronically to other physicians who may be using different EHR systems.

The Meaningful Use criteria require such exchanges to occur using standards such as Continuity of Care Document (CCD) and the Continuity of Care Record (CCR), but few systems are capable of using such standards. That’s partly because EHRs aren’t designed for information exchange and also because, in the absence of HIEs, the transmittal of CCDs requires point-to-point interfaces. An IHR that already can create connections to multiple EHRs can act as a link to exchange CCDs or CCRs.

The IHR is not designed to replace EHRs or CPOE systems, but rather to collaborate with them and connect them with other information sources. In that sense, the IHR unifies and facilitates patient-centric data exchange between various entities to realize the formation of HIEs. The IHR further facilitates the integration of data from multiple sources by normalizing data from disparate sources using standards specified in the Meaningful Use criteria, such as LOINC for discrete lab data.
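The LOINC normalization step mentioned above amounts to maintaining mapping tables from each source's local lab codes to a common vocabulary. A minimal sketch, in which the source names and local codes are hypothetical (the LOINC code 2345-7 is the real code for serum/plasma glucose):

```python
# Hypothetical mapping from (source, local code) to LOINC. A real IHR
# would maintain a vendor-specific mapping table per interface.
LOCAL_TO_LOINC = {
    ("hospital_lab", "GLU"):   "2345-7",  # Glucose [Mass/volume] in Serum or Plasma
    ("reference_lab", "GLUC"): "2345-7",
}

def normalize(result):
    """Attach a LOINC code so results from disparate labs compare cleanly.

    Leaves the code as None when no mapping exists, flagging the result
    for manual mapping rather than guessing.
    """
    key = (result["source"], result["local_code"])
    result["loinc"] = LOCAL_TO_LOINC.get(key)
    return result
```

Once both labs' glucose results carry the same LOINC code, trending and duplicate-checking across institutions become straightforward.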

Rather than upfront investments in MPI and other expensive technologies, HIE pilots can greatly benefit from the use of technologies like the IHR. The IHR can not only serve as basic HIE, but facilitate HIE participation by providing key information where and when it’s needed on the front lines of patient care.

Ravi Sharma is president and CEO of 4medica.

Thoughts on Eclipsys-Allscripts
By Tim Elliott

The coming together of two heavyweights in the healthcare IT industry, Allscripts and Eclipsys, has the potential to open doors for their existing and future customers, third-party developers, and patients. There will be some challenges, too — including helping current customers integrate legacy Allscripts and Eclipsys systems alongside new modules — but this can be considered another opportunity for outside vendors whose technologies bridge the gaps between Eclipsys and Allscripts applications.

Detractors may be lampooning the Allscripts / Eclipsys “One network, one platform, one patient” slogan, but in truth, the merger does create a cohesive, cradle-to-grave care solution by uniting pre-acute, acute, and post-acute care information, as well as simplifying financial and performance management with non-clinical data.

The use of a common .NET technology stack offers the possibility of seamless integration and increased usability for clinicians and administrative staff. It also makes it easier for third-party software providers to deliver bolt-on solutions that further enhance Allscripts / Eclipsys offerings in physician practices, hospitals, home health, and other care environments. These external vendors will be crucial if Allscripts / Eclipsys is to succeed in bringing together previously disparate patient populations, which will require capturing and managing data from multiple sources in a centralized manner.

Tim Elliott is CEO of Access.

Readers Write 5/17/10

May 17, 2010 Readers Write 6 Comments


Medical Image Sharing: The Future is In the Cloud
By Eric Maki


Is the world coming to an end — the healthcare IT world of proprietary silos, that is? When it comes to the sharing of radiology images and report files, the answer appears to be an emphatic YES.

My facility, the Great Falls Clinic in Great Falls, Montana, is just one of dozens I know about that now share full-resolution images and reports via cloud-based technology.

The approach works seamlessly. Both uploading and downloading aren't much more complicated than sending an e-mail with an attachment. No one needs to babysit the process, which, at a leanly staffed rural clinic like ours, is a big advantage. And there are no requirements to establish and maintain the link, unlike the VPNs that were our workaround until recently.

There are advantages to proprietary healthcare IT technology. But when it comes to sharing images, proprietary IT has posed challenges throughout my entire state. Because nearly all of Montana’s medical facilities are less than full-service, we often have to transport patients with major issues to a large hospital in the nearest big city. The docs there, of course, want to see whatever imaging studies and accompanying info we generated at our facility. Proprietary IT forced us to use VPNs or other workarounds like burning and sending CDs.

There was also a major expense involved in all the time we spent maintaining our VPNs every time we installed an IT upgrade, such as a beefier firewall. Some of my colleagues in Montana who relied on CDs for file sharing were having other frustrations. Sometimes the CDs couldn't be read on the recipient hospital's computers. Sometimes the CDs were damaged, couldn't be read anywhere, or worse, were lost and never found.

We were fed up with this situation in our state, so 30 of our facilities formed an organization to search for a better solution. We called it Image Movement of Montana, or IMOM. We asked several PACS vendors for ideas and, fortunately, one had just developed a cloud-based service that met our needs. It required no new capital acquisition of hardware or software and bypassed all the proprietary hurdles that had plagued us to this point.

The Great Falls Clinic was one of the six facilities that tested the system on behalf of all 30 IMOM members. It worked pretty much without a hitch. A problem that vexed us for many years was suddenly solved, just like that.

The system we use is called eMix, but there are other players in this game — LifeImage and SeeMyRadiology, for example. From what I'm reading, there may soon be more cloud-based image-sharing services available. It's clear to me that medical image sharing's future is in the clouds.

Eric Maki is manager of information technology at the Great Falls Clinic, Great Falls, MT.

 

NHIN CONNECT Code-a-thon
By iReporter


ONC sponsored what it called an NHIN CONNECT Code-a-thon, held in Miami a few weeks back. Like the IHE Connect-a-thon held earlier this year in Chicago, this forum's attendees were primarily hands-on senior software architects and engineers who, refreshingly, are working together to tackle our industry's connectivity woes.

This meeting had three components. The main one was two days of in-depth collaborative sessions to discuss a variety of technical topics regarding the current CONNECT version as well as group planning for future version features. The second was the CCD template competition won by Georgia Tech that you highlighted here.

The third and most important component in terms of potential long-term impact on the industry was the creation of the Electronic Health Record Interoperability Special Interest Group (EHRI-SIG). To a standing room only audience (and 60 online participants), the CONNECT team presented their ideas and reached out to the private sector for help in establishing a group committed to advancing the state of practice involving medical record interoperability. 


One unique idea presented involved XMPP, a protocol underneath applications like instant messaging. The idea was to exploit this protocol to implement new communication and exchanges between doctors, patients, personal health records, laboratories, and pharmacies. Another interesting discussion revolved around the CONNECT team's desire to implement no-click solutions and to stop the phone from ringing in the doctor's office.

The meeting video, audio, and presentations can be found here.

This modest event could very well signal the beginning of how health information exchange will fundamentally be changed and accelerated in this country. By combining the best of the NHIN CONNECT industrial-strength "trust fabric" with some of the same concepts being considered within NHIN Direct, this effort is positioned to provide a "sweet spot" that will likely appeal broadly to healthcare industry stakeholders as they tackle Meaningful Use under Stages 2 and 3.

EHRI-SIG will be making specific decisions on how to move forward at its second meeting in DC on June 2. Because it is a true working meeting, attendees are required to submit short use case descriptions and to represent EHR, lab, pharmacy, PHR, and other vendors so that the outcome of the discussion can potentially translate into enhancing their own products' capabilities. Information can be found here.

This initiative is an open challenge to the healthcare industry vendor community to demonstrate true leadership at a critical time in order to improve outcomes by getting the right information to the right person at the right time. It will be interesting indeed to see who steps up and who does not.

Creating Efficiencies through Enhanced Communications: Alerts and Notifications
By Jenny Kakasuleff


With the recent passage of health care reform and the 30 million newly insured individuals estimated to enter the marketplace, providers are under increasing pressure to improve productivity and efficiencies to meet increasing demand. These challenges must be met while simultaneously improving the quality of care patients receive.

Historically, providers of health care services have taken a piecemeal approach to implementing health information technologies. This has resulted in a number of disparate systems that do not communicate with one another, and contribute to a growing army of devices that health care providers must haul around with them, or have at their disposal in a largely mobile environment.

The alerting and notification systems still in use at many hospitals today are a conglomeration of proprietary systems and devices utilized to perform one particular function — a bedside monitor that sends an alert to the central nursing station to report a change in a patient’s vitals; a tracking system that allows any provider with computer access to locate a device; or a lab information system that sends an e-mail to indicate an abnormal lab result.

While this approach provides many individual solutions to overcome past inefficiencies, it has been uncoordinated, and as a result, creates its own set of problems. The responding provider is saddled with a number of different communication devices to perform a range of non-standardized tasks.

Most professionals today have the ability to perform all of their business-related (and personal) activities via a single mobile device. We make phone calls, check our e-mail, manage our calendar, pay our bills, locate people and places using GPS, listen to music, connect with friends and family through SMS text and instant messaging, or through social media networking — all through one multi-functional device. It is amazing that the same demand is not pervasive in the medical sector.

Health IT solutions now exist that not only address the problems of the past, but work to streamline the disparate systems currently in use into a single, standardized messaging system that delivers a range of alerts and notifications of varying importance to the appropriate recipient. Also, with the integration of an enterprise-class communication solution, providers now have the ability to receive alerts from each proprietary system — electronic medical record (EMR), hospital information system (HIS), nurse assignment, lab information system, etc. — via a single device powered through a unified communications system.

Different messages are delivered based upon their level of importance and escalated until receipt is acknowledged. The HIS is then updated, and audit trails create a measure of quality tracking and control. The recipient can then respond to the relevant options generated without locating a phone, computer, or another staff member.
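The escalate-until-acknowledged behavior described above is, at its core, a loop over a priority-specific recipient chain. A minimal sketch follows; the role names, priority levels, and chain ordering are illustrative assumptions, not any specific product's configuration:

```python
# Hypothetical escalation chains keyed by alert priority.
ESCALATION = {
    "critical": ["assigned_nurse", "charge_nurse", "physician"],
    "routine":  ["assigned_nurse"],
}

def deliver(alert, acknowledged):
    """Walk the escalation chain, recording every delivery attempt for
    the audit trail, and stop at the first recipient who acknowledges.

    acknowledged(recipient, alert) stands in for waiting on a device
    acknowledgment within a timeout.
    """
    trail = []
    for recipient in ESCALATION[alert["priority"]]:
        trail.append(recipient)
        if acknowledged(recipient, alert):
            break
    return trail
```

In a real system, each step would carry a timeout before escalating, and the returned trail is what feeds the audit and quality-tracking functions mentioned above.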

As the American Recovery and Reinvestment Act (ARRA) forces health care professionals to evaluate how best to implement and utilize their EMR systems to qualify for meaningful use incentives, their approach should be holistic; cognizant of current and future challenges; and focused on gaining as much mileage as possible from the investment.

Jenny Kakasuleff is government liaison with Extension, Inc. of Fort Wayne, IN.

Readers Write 5/10/10

May 10, 2010 Readers Write 12 Comments


Thoughts About athenahealth
By Deborah Peel, MD


Another misguided, uninformed EHR vendor will discount the price of EHR software for doctors willing to sell their patients’ data!

How is it possible to be so unaware of what the public wants? The public doesn’t want anything new or earth-shattering, just the restoration of the right to control who can see and use their medical records in electronic systems.

Not only is the practice of selling your patients' data illegal and unethical, but the new protections in the stimulus bill require that patients give informed consent before their protected health information can be sold. So selling patient data without consent is now a federal crime.

Quotes from the story:

  • athena’s EHR customers who opt to share their patients’ data with other providers would pay a discounted rate to use athena’s health record software.
  • athena would be able to make money with the patient data by charging, say, a hospital a small fee to access a patient’s insurance and medical information from athena’s network.
  • Caritas Christi [Health Care] initially launched athena’s billing software and service in October and then revealed in January that it decided to offer the company’s EHR to physicians.
  • How many patients would agree to sell their health records to help their doctor’s bottom line AND at the same time put their jobs, credit, and insurability at risk?

Health information is an extremely valuable commodity, so people are always thinking of new ways to use it.

What will athena’s informed consent for the sale of patients’ health data look like? Will athena lay out all the risks of harm? Will athena lay out the fact that once personal health data is sold, the buyer can resell it endlessly to even more users? Will athena caution patients that once privacy is lost or SOLD, it can never be restored?

I guess some people are so out of it they do not realize what a barrier the lack of privacy and lack of trust is to healthcare. HHS reports 600,000 people a year refuse to get early diagnosis and treatment for cancer because they know the information won’t stay private. Another 2,000,000 refuse early diagnosis and treatment for mental illness for the same reasons.

Check out slides from a recent conference at the UT McCombs Business School on the subject of patient expectations, privacy and consent.

Deborah C. Peel, MD is a practicing physician and the founder of Patient Privacy Rights.

Thoughts About athenahealth
By Truth Seeker

Um, I think we need to settle down here, folks. I may be wrong, but I believe when athena refers to athenaCommunity and the exchange of information, they are referring to the following hypothetical scenario:

A patient whose primary physician is an athena customer needs to be admitted to the hospital. athena delivers to that hospital a clean, clinically accurate, and up-to-date record of that patient’s medical history and charges the hospital a few bucks. athena can charge the primary care physician a lower fee for the EMR service because it is shifting some of the financial burden to the hospital. And intuitively, this makes sense for a couple of reasons:

The push towards electronic medical records is to enable greater exchange of information and better coordination of care, etc. So when athena talks about athenaCommunity, I’m fairly certain that they’re not talking about a sinister plot to share info with hospitals so they can refuse to admit high-risk or expensive patients. (Seriously, the conclusions people draw from articles like this without doing their homework can be completely ridiculous, but I suppose that casting baseless aspersions is just the nature of informed discussion in the Internet era.)

They’re just talking about handing the patient over to another provider, making sure that the new provider has a completely accurate and up-to-date record of that patient’s medical history, and shifting the financial burden of the handover away from the primary care physician. What a "privacy disaster" … a sheer outrage!

And second, I’m no healthcare economist, but I’m pretty sure that a) the hospital really wants and needs that patient’s medical history and that athena is probably better positioned to deliver it in a more useful format than a lot of their competitors; and b) it’s probably worth a lot more to the hospital than a few bucks. 

I’m not an athena employee or other stakeholder, but I do think that they continue to think of innovative new solutions to problems, bottlenecks, and inefficiencies in the healthcare system. Unfortunately, they seem to have a bulls eye on their backs right now. I for one am happy that we have smart people like Jonathan Bush out there coming up with creative new solutions. 

Why Emergency Physicians Prefer Best-of-Breed IT Systems
By John Fontanetta, MD, FACEP


According to a recent report from KLAS, some hospitals are replacing standalone, best-of-breed (BoB) emergency department information systems (EDIS) with enterprise solutions that are leaving ED clinicians — and often their patients — unsatisfied. Why unsatisfied? Because the clinical functionality in enterprise solutions is both less comprehensive and less efficient for the ED environment, and the systems are simply harder to use.

This report has re-energized the debate over the benefits of the two kinds of systems. IT professionals prefer the seamless interoperability supposedly offered by single systems, but the fact is that many large vendors have simply bought a separate ED system and shoehorned it in. The resulting systems have their own interface issues.

Like many of my fellow ED physicians, I have found that a first-class BoB system tailored specifically to the needs of the ED, in our case EDIMS, offers a number of advantages. For example:

  • Workflow in the ED is measured in seconds and minutes rather than hours or days. The fewer clicks required, the faster the care. At Clara Maass Medical Center, we can issue complete sets of orders in as few as three clicks, enabling our physicians to be more productive.
  • Trying to retrofit an inpatient IT system to the ED is difficult because the ED is just so different from the floors. Customized ED order sets with a linked charge capture system mean less delay between treatment and billing, not to mention more accurate capture of charges, which has dramatically increased our per-patient revenue.
  • In the same way, customized alerts that tell the ED staff what they’re forgetting to document cut back on the number of claims denied due to missing or inaccurate information. At Clara Maass, we have slashed such denials by 75%.

One of the most important things about a good ED system vendor is responsiveness. The vendor should be able to quickly accommodate the ongoing changes in standards and regulations. For example, at Clara Maass, when the H1N1 virus first appeared in 2009, we had templates for recommended care and discharge instructions built into our system by our BoB vendor within 24 hours. And when we decided to create an observation area, they promptly responded with observation-specific templates and order sets and created a secondary note option for the observation physicians.

The EMR system has enabled us to make a number of other improvements in our ED. For example, we have reduced the average patient turnaround time by over 30%. We have boosted the number of EKGs we perform within five minutes of a patient coming through the door from 46% to more than 90%.

Overall, my specialty has been slow to adopt EHRs, not because we don’t see their importance, but because they have a reputation for being unwieldy and unresponsive to the requirements of the ED. With more and more EDs adopting BoB systems that are designed to support ED clinicians’ intricate and demanding workflows, physicians are starting to realize that an EHR can actually be an advantage in our fast-paced environment, rather than a burden. 

CIOs are finding that these BoB systems can offer the same integration capabilities as a single enterprise solution, if not better. While many of the HIS vendors are inflexible when it comes to working with other systems, BoB systems have always had to offer integration solutions, and many pride themselves on their ability to integrate with almost any system.

John Fontanetta MD, FACEP, is chairman of the department of emergency medicine at Clara Maass Medical Center, Belleville, NJ and chief medical officer of EDIMS.

Digging for Gold in your HIT Applications
By Ron Olsen

Over the past few years, hospitals have focused IT budgets and resources on purchasing applications to enhance their HIS. Many facilities have spent tens or hundreds of thousands — millions for the larger hospitals — on licensing, maintenance, and ongoing professional services.

In the feeding frenzy to continually acquire and implement the latest healthcare information technology, most IT/IS teams are neglecting to ask basic but important questions about their existing applications, such as:

  • Are we using the software to its fullest extent?
  • Have we turned on every feature we’re currently licensed for?
  • Are HIT products meeting the needs we identified when planning the deployment?
  • Have we asked users what they’d like to see added to the product, and if so, has that been communicated to the vendor so they can include it in a future version?

Asking questions does not cost anything, and end users are usually very vocal about what they’d really like software to do for them. But their invaluable real-world input goes to waste if there’s no feedback mechanism, or if your team fails to incorporate it into product roadmap discussions with vendors.

In a time when hospitals’ funds are tighter and IT budgets frozen or cut, it’s worth doubling back to review the products you have purchased and their capabilities. Consider re-presenting each product to different areas of the facility, explaining existing functionality again and introducing new features added since the initial implementation. Once users have had a refresher, they may identify functionality that was not implemented initially but would now prove useful.

Healthcare technology vendors are always eager to showcase new features and theoretical uses for these at sales presentations, but IT/IS admins often overlook “hidden gems” in the software that other hospitals are actually using. If the vendor has a user group, listservs, or an online forum, these are great places to start, not to mention that they cost nothing and consume very little time.

These collaborative tools may enable your team to discover use cases that even your vendors have not thought of. There are a lot of people in the healthcare IT trenches creating workarounds every day. There may be capabilities within current products that can join with other systems in your tool bag to create a new or improved process that is, again, a freebie.

One of the most over-used buzzwords in healthcare IT is “interoperability,” which is really a big word that self-important people use to describe data transfer. Thinking about data transfer at a basic level, almost every HIT product can output to a printer, and a printer can easily be set up to print to a file. So now you have data in a file format.

Scripting tools can manipulate those files, turning them into almost any format imaginable. With the correct format, data can be transferred to disparate systems, individually or concurrently, via a data stream. This could be a raw text file, compressed zip file, encrypted e-mail file, FTP, or an HL7 file.
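As a concrete illustration of the print-to-file trick, the sketch below parses a report that an HIT product "printed" to a text file and re-emits selected fields as pipe-delimited output that a downstream interface could consume. The report layout and field names are hypothetical:

```python
def parse_printed_report(text):
    """Turn 'Label: value' lines from a printed report into a dict."""
    record = {}
    for line in text.splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            record[key.strip().lower().replace(" ", "_")] = value.strip()
    return record

def to_delimited(record, fields, sep="|"):
    """Emit chosen fields in order, pipe-delimited, ready for an interface."""
    return sep.join(record.get(f, "") for f in fields)

# Hypothetical printed report captured to a file
report = "Patient Name: DOE, JANE\nMRN: 000123\nOrdering MD: SMITH"
rec = parse_printed_report(report)
# to_delimited(rec, ["mrn", "patient_name"]) -> "000123|DOE, JANE"
```

A real HL7 feed needs proper segment structure and an interface engine, but this is the essence of the "printer as universal export" workaround.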

This method is easily applied to an enterprise forms management system. If it has a decision engine, you could print a form set from it and have the engine write the data to a database for audit trails (you should be able to choose the data points). The engine can then send the data to a file and launch an application that texts the ordering physician that the patient has just presented, based on the data in the text file.

If you’re a budget-conscious healthcare IT professional who wants to better meet the needs of your user community, I implore you to take another look at the systems you’re already working with. In my many years as a system admin at a community hospital, getting more out of the tools available to me (instead of just relying on new purchases) helped me deliver more effective tech solutions to my users, positively impact patient service, and keep decision makers happy by saving money.

You, too, have gold nuggets hidden in your existing software. It’s up to you to find and use them.

Ron Olsen is a product specialist with Access.

Readers Write 5/3/10

Goodbye Data Warehouse and Cubes, Hello AQL
By Mark Moffitt

For the last two years, I have been researching systems to replace the data warehouse used for report-writing in our organization. This effort has been driven by the desire to provide better service to other departments that rely heavily on data reporting for day-to-day operations.

The idea is to push data to users so they can perform in-memory analysis and display of large amounts of data, replacing the current process of requesting custom reports and spreadsheets from the information services (IS) department. That process consumes considerable IS resources, and requests can take several days when the report queue grows.

The requirements for a new system are straightforward, but somewhat daunting:

1. Put data into users’ hands so they can perform business intelligence.

2. The cost of the system, including license, hardware, and consulting, must be offset by the direct costs of shutting down existing systems.

At GSMC we operate Meditech Magic and use a data warehouse for analytics and business intelligence. The data warehouse stores about nine years of financial data in about 650 GB. The data in the warehouse is updated nightly. SQL reports have been developed to provide reporting across the organization.

IS at GSMC is bombarded with requests for new reports. These come as specialized requests for data that often require modifying an existing SQL query or writing a new one. The process is iterative: it starts with gathering requirements, continues with modifying or writing SQL queries, and ends with generating a report and sending it to the customer.

Typical turnaround times are variable and are highly dependent on the number of reports in the queue to be developed. Best case scenario is four hours, typical is two to four days. Often the customer will, upon review of the report, ask to include or exclude specific data. This back-and-forth typically occurs several times until the report meets the customer’s needs.

The IS department at GSMC has several analysts who spend a good part of their time responding to requests for data. It is a never-ending demand.

We researched the use of OLAP (online analytical processing) cubes to provide data to users. The advantages of cubes are well documented and include the ability to drill down to details and analyze data in ways simply not possible with reports or spreadsheets. The disadvantage of cubes is that data must first be aggregated; if a user needs data not included in the cube, the cube must be rebuilt. A data warehouse is also required. Finally, building and maintaining cubes requires personnel with specialized skills.

About seven months ago, I read on HIStalk about QlikTech and its QlikView software. I researched it and it sounded too good to be true. However, I was intrigued that QlikTech doubled revenues in 2008, not an especially good year for selling enterprise software, as the national economy was in a major recession.

On the surface, QlikView is a business intelligence solution that consists of a data source integration module, an analytics engine, and a user interface. QlikView is based on AQL (Associative Query Logic) and is completely different from other OLAP tools.

Through AQL, QlikView eliminates the need for OLAP cubes and a data warehouse, replacing the cube structure with a Data Cloud. A Data Cloud does not contain any pre-aggregated data but instead builds non-redundant tables and keeps them in memory at all times. Queries are then created on the fly and are run against the Data Cloud’s in-memory data store.

Under AQL, all data is stored only once and all data associations are stored as pointers, so a Data Cloud database retrieves records more efficiently than an OLAP database. A Data Cloud database is also much smaller, since records are not repeated through aggregation and its structure never has to change. The architecture allows for a flexible end-user experience because, unlike data cubes, it requires neither aggregation nor pre-canned queries that try to cover every analytical scenario a user might create. (1)

Data Clouds run in memory and AQL reduces in-memory storage requirements by about 75% as compared to source data. In-memory Data Clouds can be stored as AQL files for archiving. AQL disk files are 90% smaller than source data. Think of an AQL file like an Excel file where data can be added and deleted and the file saved with different names for archiving purposes.
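The pointer idea can be illustrated with a toy column store. This is not QlikTech's actual implementation, just a demonstration of why storing each distinct value exactly once and keeping small integer pointers per row shrinks in-memory size:

```python
# Toy illustration of pointer-based, de-duplicated column storage.
# Not QlikTech's implementation; invented for demonstration only.

class Column:
    def __init__(self):
        self.values = []      # each distinct value stored exactly once
        self.index = {}       # value -> pointer into self.values
        self.pointers = []    # one small integer pointer per row

    def append(self, value):
        if value not in self.index:
            self.index[value] = len(self.values)
            self.values.append(value)
        self.pointers.append(self.index[value])

    def row(self, i):
        return self.values[self.pointers[i]]

payer = Column()
for v in ["Medicare", "Aetna", "Medicare", "Medicare", "Aetna"]:
    payer.append(v)

print(len(payer.values))   # 2 distinct values backing 5 rows
print(payer.row(3))        # Medicare
```

With millions of rows and a handful of distinct payers, the savings from replacing repeated strings with integers is substantial, which is consistent with the roughly 75% in-memory reduction claimed above.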

The price point for the software is about $150,000 (one-time fee) for our health system. Hardware costs are about $15,000 for a server with 98 GB of memory. We expect consulting fees to total $150,000 for a SME in hospital financial data with QlikView experience. We worked with RSM McGladrey on a consulting proposal as they have well-qualified personnel in this space.

If you know much about the BI/analytics space, you may question the low cost of the software and consulting services. This has everything to do with the AQL model. RSM McGladrey quoted a revenue cycle effort at eight weeks, with a scope that includes:

  • Transfer data from existing systems to QlikView
  • Data validation
  • Census analysis
  • AR analysis
  • Insurance contract analysis
  • Hindsight analysis
  • Train IS staff on data extraction

The revenue cycle statement of work is only one component of the $150,000 quote for consulting services from RSM McGladrey for implementing QlikView at our organization.

The total cost for QlikView at GSMC is $315,000. That will be directly offset by shutting down a data warehouse, savings from using QlikView for analytics versus another system where the cost of consulting services had already been quoted and budgeted, and other savings. We expect additional direct benefits from having deep analytic capabilities with our revenue cycle data.

QlikView has a number of healthcare customers. I believe you will be hearing more about the company in healthcare in the years ahead as it builds market awareness of QlikView’s capabilities and price point.

We have not yet purchased the package. If we do, I’ll write a follow-up article on our experience.

(1) Mario Morejon, “Qliktech, IBM Provide New View Of OLAP,” ChannelWeb, July 18, 2003, http://www.crn.com/software/18839582

Mark Moffitt is CIO at Good Shepherd Medical Center in Longview, TX.

Humpty Dumpty Leaves Wonderland to Visit Health Information Technology
By Jim Kretz

Suppose I told you that “voting” henceforth would mean you would only be shown a ballot, period. No more selecting your preferred candidate.

Now suppose I told you that your consent to disseminating clinical information did not mean your granting permission, but only your acknowledgement that you saw my information policy — take or leave it. This may remind you of Humpty Dumpty’s scornful assertion, “When I use a word it means just what I choose it to mean — neither more nor less.”

Surprisingly, the insanity of “…use the term ‘Consent’ to mean the acknowledgement of a privacy policy, also known as an information access policy. In this context the privacy policy may include constraints and obligations.” comes from an IHE (Integrating the Healthcare Enterprise) policy paper, “IHE IT Infrastructure Supplement 2009,” that was taken up (line 157) by the IT Standards Advisory Committee Privacy Workgroup, April 23, 2010.

The authors of this paper — the American College of Cardiology, the Healthcare Information and Management Systems Society, and the Radiological Society of North America — are not mean-spirited, uninformed, or confused. What could result in their clearly having tumbled into a conceptual rabbit hole?

Jim Kretz is project officer at the Substance Abuse and Mental Health Services Administration of the Department of Health and Human Services. His comments should not be construed to reflect the official position of SAMHSA.

Massachusetts HIT Conference Thoughts
By Bill O’ Toole

I had the pleasure of attending a national conference hosted by Massachusetts Governor Deval Patrick in Boston last week. The conference was billed as Health Information Technology: Creating Jobs, Reducing Costs and Improving Quality.

Keynote addresses were provided by David Blumenthal, MD, National Coordinator for HIT, and Vice Admiral Regina M. Benjamin, MD, MBA, Surgeon General of the United States. Health IT policies and standards were addressed by a panel that included John Halamka, MD (CIO, CareGroup and Harvard Medical School); Marc Overhage, MD (CEO, Indiana Health Information Exchange); Paul Tang, MD (CMIO, Palo Alto Medical Foundation); and Micky Tripathi, PhD (CEO, Massachusetts eHealth Collaborative), with Tim O’Reilly (President, O’Reilly Communications) moderating.

Another panel discussion on Health IT, Business Opportunities and Job Creation featured leading Massachusetts vendor executives Girish Kumar Navani (eClinicalWorks), Howard Messing (Meditech), Richard Reese (Iron Mountain), Bradley J. Waugh (NaviNet) moderated by Chris Gabrieli (Bessemer Venture Partners).

I could go on and on, but the list would be too long. I mentioned those above to give readers a sense of magnitude and to perhaps share in this small article the profound comfort I felt that "we" are doing this right. Many other highly qualified participants shared their knowledge on all things HIT- and ARRA-related.

What impressed me most was the overwhelming sense of momentum. The stimulus package and its future incentives have so far done exactly what was intended, serving as the spark that has set this massive project in motion. Remaining at the forefront of it all, though, is the goal of better medical care for all. That theme was never lost and was frequently repeated.

As one who until now has found certain parts of most conferences to be extraneous (ok, boring), I felt obliged to inform the far-flung readership of HIStalk that I was extremely impressed with every minute of this two-day conference. If the energy, knowledge, and sincere interest and enthusiasm expressed by those involved in this conference are carried forward to the project at large, then we are truly in for a remarkable change in our industry.

Congratulations to the Massachusetts Technology Collaborative and its Massachusetts eHealth Institute, the Massachusetts Health Data Consortium, and Governor Patrick for organizing this special event. It should serve as the model and be repeated whenever possible throughout the country.

William O’Toole is the founder of O’Toole Law Group of Duxbury, MA.

Readers Write 4/22/10

April 22, 2010 Readers Write 2 Comments

License Rights in Your Software License Agreements
By Robert Doe, JD

Each software license agreement contains a provision which grants specific use rights with regard to the software you are licensing. However, software manufacturers’ standard contract documents may not take into account your organization’s specific use requirements.

As a result, unless your organization has a relatively simple legal structure, you should pay particular attention to this language to ensure the software can be used as you intend it to be used. The extra effort is well worth the time when you consider that without the proper license grant, you may be asked to pay additional, unanticipated fees down the road.

If you don’t alter the standard contract language, typically, the license grant is given only to the legal entity signing the contract. For example, a typical software vendor’s license grant provision might read as follows: “Licensor grants Customer a perpetual, nontransferable, nonexclusive license for the number of concurrent users set forth in Exhibit A to use the computer program listed in Exhibit A (the "Software") at the installation site set forth in Exhibit A for Customer’s internal business purposes.”

In this example, the license grant is given to “Customer,” which is typically defined as the legal entity signing the agreement, which may not encompass all the actual individuals that will use the software. Getting the license rights correct in your contract requires that you know how your organization is structured and who the individuals are that you want to be able to access and use the software, both at the current time and in the future.

If your organization has a parent corporation, or has one or more legal entities that are owned or controlled by your organization or are under common control with your organization, the typical vendor license grant provision will technically not allow any use by the employees of these “affiliate” organizations.

Another example of a situation that is not technically covered in most license agreements is use by contracted providers that are not employees of your organization. In addition, some organizations may have other independent contractors that will need access to the software at various times, such as computer consultants.

With more and more frequency, healthcare organizations are licensing software not only for their own use, but to use on behalf of other smaller healthcare organizations in the community. Similarly, some healthcare organizations are considering re-licensing their systems to smaller organizations at a reduced rate.

In the example license grant language above, use of the software is limited to the “internal business purposes of the Customer.” If the software is to be used, in part, for the benefit of an affiliated or unrelated organization, or re-licensed to such an organization, the license grant will need to be significantly modified to allow for such actions.

When licensing software, it may be worth the extra time to put some thought into how you intend to use the software, both internally within your organization and, if applicable, externally. As part of your analysis, you will need to understand the legal structure of your organization. This information will help you to make sure you have the appropriate license grant in your software license agreements to allow for the use rights you require.

Bob Doe is a founding member of BSSD, an information technology law firm located in Minneapolis, MN.

Readers Write 04/05/10

April 5, 2010 Readers Write 14 Comments

Has Meaningful Use Already Lost All Meaning?
By Cynthia Porter

The release earlier this year of expanded meaningful use requirements has gotten the healthcare IT community in quite a tizzy. The phrase was on everyone’s lips before, during, and after HIMSS. It was obvious to me that:

  1. Everyone has a strong opinion about it;
  2. Not everyone understands it; and
  3. The recent passing of healthcare reform has left providers extremely anxious about how they and the vendors they do business with will comply with “it”, depending on what “it” ultimately turns out to be.

I know for a fact that hospital executives’ concerns about their institutions’ ability to meet the requirements and the overly aggressive timetables released as part of the expanded meaningful use requirements have increased exponentially since 2009, when the HITECH Act was initially released.

Nearly 80% of the 150 hospital executives recently surveyed by Porter Research noted an increased rate of adoption for e-prescribing, patient portals, and EHRs. That’s a 20% increase from 2009, before the expanded requirements were published. So it’s safe to say that providers are jumping on the bandwagon.

Most, however, are worried that the wheels are going to fall off because vendors won’t have enough qualified employees and/or up-to-date resources to meet demand and requirements. One CIO we interviewed believes vendors “will be forced to spend more programming hours around the interoperability and security of their software versus the primary function, which is taking care of patients and making it easier for clinicians to utilize.”

And there’s the rub. Sure, the healthcare IT community will probably benefit from the political machinations going on in Washington, but will the patients? Will vendors rush to provide hospitals with technologies that could have used a few more months of development and trial? Will hospital staff have time to adequately train their IT people to use these new technologies? Will patients pay the price for a rush job?

It’s unfortunate that time will tell, because time is one thing patients don’t have.

Cynthia Porter is president of Porter Research.


Health Reform, Schmealth Reform – Freakin’ Pay Me
By Gregg Alexander

Down here in the primary care trenches, where the pudding meets the pavement (or some such mixed analogy), no matter how much we may want it to, health reform doesn’t seem like it will ever really get to addressing our needs.

What do I mean by that? Simple enough: it is getting virtually impossible to justify staying in traditional primary care any more and, health reform or no, HITECH or no, Congress just walked away and forgot about me and mine.

Despite our efforts to help bring the best we can to those we serve, what do we get? People, be they private insurers or Medicaid, self-pay or no-pay, hospitals, and even IT folks, all telling us what we can and can’t do. We’re told when we are and aren’t allowed to make medical recommendations based upon our knowledge and experience and then we’re told just how much we’re allowed to charge for our expertise. (Disregard whether or not we’ll even get paid anything at all for our time and trouble.)

We fight to get what we believe is appropriate care for our patients, regardless of their insurance or lack thereof. We struggle to make ends meet so that we can offer the advantages of a quality medical home and, perhaps, digital healthcare information management to our patients. We work far too many hours, away from our family and friends, just so we can feel good enough to sleep at night knowing we have done our best to help those who come to us for care.

And then…and then…Congress goes on break before postponing a 21.3% cut in Medicare payments. (Thank you, Senator Tom Coburn, R-OK.) Whether or not they repeal it when they return, CMS will likely withhold payments for at least 10 days before beginning to process those 21.3% reduced payments. For those affected, continuing this Sustainable Growth Rate (SGR) formula is anything but sustainable and quite the opposite of growth.

Ladies and gentlemen, if you’re not already aware, we have a shortage of primary care providers in America. Pushing us toward expensive technology adoption that may or may not truly be ready to meet OUR needs, while reducing the bottom-of-the-barrel payments with which we already struggle, is not going to solve any little piece of our giant healthcare crisis. It will make it much worse as more and more of us leave for less stressful and less beyond-our-control professional lives. All the while, we’ll leave little encouragement for the med school up-and-comers, who are unlikely to choose to join the ranks of careworn primary care.

Let us worry about dealing with the pressures of making medical decisions and allow us a reasonable income which doesn’t add to the strain. Elsewise…well…how long would you stick around after a 21% pay cut?

From the (weary) trenches…

“Pay me for my work, but I don’t do it for the money.” – Vanna Bonta

Dr. Gregg Alexander, a grunt in the trenches pediatrician, directs the “Pediatric Office of the Future” exhibit for the American Academy of Pediatrics and is a member of the Professional Advisory Council for ModernMedicine.com. More of his blather…er, writings…can be found at his blog, practice web site or directly from doc@madisonpediatric.com.

This Just In: HIRE Bill Signed! Could Hiring Tax Breaks Benefit Your Organization?
By Tiffany Crenshaw

On March 18, 2010, President Obama signed into law the Hiring Incentives to Restore Employment Act (HIRE). HIRE is a $17.5 billion jobs bill that the President says will bolster hiring and incent business owners, creating approximately 250,000 new jobs.

The bill was dramatically scaled back as it passed through the House and Senate, from $150 billion to less than $20 billion. Still, lawmakers say it is the first step in a series of bills designed to encourage job growth.

The Act offers two tax breaks to companies that hire recently unemployed workers, one in the form of a payroll tax exemption and the other in the form of a tax credit. Beginning March 19 and through the remainder of 2010, employers will not have to pay the 6.2% Social Security payroll tax on qualifying new hires. In addition, companies are entitled to a credit equal to 6.2% of an eligible employee’s total salary (up to $1,000) if that new hire is retained for at least 52 consecutive weeks.

To qualify for the tax breaks, new employees must be hired between February 3, 2010, and January 1, 2011. Each new hire must verify in writing that he or she was unemployed for a minimum of 60 consecutive days just prior to being hired. If the worker is replacing an employee in the same job role, he or she is not eligible, unless the previous employee was terminated for cause or voluntarily quit.
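As a back-of-the-envelope illustration of the two incentives described above, the combined savings for a single qualifying hire can be sketched as follows. The figures are invented for illustration and this is not tax advice:

```python
# Illustrative arithmetic for the two HIRE incentives: the 6.2% payroll tax
# exemption on qualifying 2010 wages, plus the retention credit of 6.2% of
# salary capped at $1,000. Example figures are hypothetical.

def hire_savings(qualifying_2010_wages: float, annual_salary: float,
                 retained_52_weeks: bool) -> float:
    exemption = 0.062 * qualifying_2010_wages
    credit = min(0.062 * annual_salary, 1000.0) if retained_52_weeks else 0.0
    return exemption + credit

# A hire earning $60,000/year with $45,000 of qualifying 2010 wages:
savings = hire_savings(45000, 60000, retained_52_weeks=True)
print(savings)  # 2790.0 exemption + 1000.0 capped credit = 3790.0
```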

These incentives certainly won’t help every company, but HIRE is a start that could benefit your organization as well as the nation’s unemployment rate. Companies still experiencing depressed revenues due to the economic slowdown may not benefit from the tax incentives, but others may find the tax savings a valuable advantage toward savings and growth.

Tiffany Crenshaw is CEO of Intellect Resources.

Readers Write 3/24/2010

March 24, 2010 Readers Write 9 Comments

Digital Information is Great, but Only if it’s Accurate
By Deborah Kohn

I am a patient at two local healthcare provider organizations that use the Epic suite of clinical information system modules as their base EHR. Both organizations must not yet have installed Epic’s CareEverywhere because, currently, the two Epic systems do not talk to one another (or even look/act like one another). But with time, the installation of CareEverywhere should occur at both.

However, the reason I write this article is that either there is a flaw in Epic’s MyChart, the organizations do not know how to correctly configure MyChart, or there remains an important Epic user training issue. When I visit my providers at both organizations, I receive a hardcopy summary of my visit, which I must assume is generated by MyChart because I can also view the data online via MyChart. Among the many items listed on the summary are Current/Future/Recurring Orders.

1) Orders listed on the summary and in the system cannot be corrected easily by an organization user, even the provider. I don’t know whether this is a user training issue (e.g., how to easily DC or cancel electronic orders that have been performed but, for some reason, not automatically canceled as Future Orders), a system flaw, or a poor implementation of the function. But for one set of lab orders, I was repeatedly asked for lab work to be performed when the lab work was performed months ago and I had the documentation to support this. Unfortunately, it took several handwritten notes and phone calls from me to the provider to finally update and delete the already performed lab orders from the system.

2) If orders listed on the patient’s hardcopy visit summary are incorrect (e.g., numbers of milligrams, duplicate orders, q 4 months not q 2 months, etc.), again these orders cannot be easily corrected by an organization user. That’s because, according to the organization’s users, these orders come from a different “database” than the “real” orders, which are correct in the system, but don’t print to the hardcopy correctly!

3) Either the Epic clinical system does not include or the provider organizations have yet to install or know how to install the following clinical decision support function: Recently, when my provider at one organization ordered a routine TB test, there was nothing in the system to alert the provider that the same, routine TB test was performed at this organization in July 2009. Consequently, this test was repeated in February 2010 at a cost of $398. When I complained about this, the provider organization commented that it is the provider’s responsibility to look back at all the orders in the system to see if a TB test had been performed within the last several years. I don’t blame the provider for not wanting to scroll through several years of past orders to determine this. And I was sorry I didn’t have my “paper” PHR, which I have kept for at least 30 years, with me at the time to double check this.

Now that electronic PHRs and visit summaries are appearing and patients are beginning to “use” (indirectly) organizational EHRs, not only will the organization’s internal users be complaining about system flaws, poor configurations, or outstanding training issues — but external users, the patients and recipients of health information exchanges, will be added to the lists. Consequently, it’s time our industry professionals address the management of the information, not just the technical and operational mechanisms for the sending and receiving of the information. Because it’s great to receive digital PHRs and visit summaries from provider organizations, but only when the information is accurate! Just ask ePatient Dave!

Deborah Kohn is a HIM professional and power user of EHR systems who not only makes sure her analog and digital health record information is correct, but remains dumbfounded that she need not do same with her bank record information.


We Are In the Business of Letting Clinicians Treat Patients
By Jef Williams

While riding the shuttle to my hotel at HIMSS in Atlanta, I overheard two strangers behind me comparing stories of the conference to one another. Their short exchange encapsulated for me both the HIMSS event and the climate in which we are now living. The conversation went something like this:

Woman: “I attended a session today conducted by an IT expert. You won’t believe what I heard.”

Man: “Really?”

Woman: “Oh yes. The presenter was talking about successful EMR and IT implementations and actually said, ‘The physicians are the ones who have received the education. They are the ones who treat patients. So they must be the focus of our implementation.’”

Man: “You’re kidding.”

Woman: “No! I was so offended I nearly walked out.”

Man: “That’s ridiculous.”

Whether one agrees with the federal stimulus package and the push toward EHRs, the fact remains that it has created a significant impact on the business of healthcare IT. Clinicians, administration, and IT each play an important role in running the healthcare organization. Administration and IT serve, however, in support roles to the mission of providing an environment that allows clinicians to do what they do best: treat patients.

Over the past decade, the role of IT has grown significantly as healthcare has played catch-up to most other industries in moving away from paper and manual systems to electronic and automated systems. This shift has had its share of challenges, and most organizations can list a number of tragic stories of failed or messy implementations. Difficult workflow, poor user adoption, and meaningless data are all symptomatic of the problem of letting IT professionals make critical decisions regarding system procurement, design, and implementation sans clinical input.

It appears we have not learned our lesson. By introducing federally subsidized funding and reimbursement into the business model of clinical information systems, the federal government has shifted focus to management and IT, leaving clinicians in the trailing position. The idea that caregivers come last could not be more backward given the true value proposition of healthcare. This industry is, and will remain, primarily about providing healthcare. No matter how advanced EHRs, widgets, and handheld devices become, patients will continue to measure satisfaction by whether a doctor knows what she’s doing, has the right tools to treat, and whether they ultimately are healthy.

So to that presenter at HIMSS, I am not offended. It seems in this climate we have forgotten that we are in the business of letting clinicians treat patients. No EHR, HIS, PACS, eMAR, or any other system can provide better patient care without a doctor reaching out a stethoscope and asking her patient to breathe deeply. We in administration and IT get to play a valuable role in providing the tools and support to help our physicians provide better patient care. But we are just that — support.

Let’s not let the promise of a few dollars and the lure of a few vendor-hosted parties blind us to that fact.

Jef Williams is vice president of Ascendian Healthcare Consulting of Sacramento, CA.

Readers Write 2/24/10

February 24, 2010 Readers Write 7 Comments

Imaging Decisions Demand Up to Date Information
By Michael J. Cannavo

Five years ago, I was approached by a PACS vendor to put together a presentation for IT folks at HIMSS. We did the presentation titled Everything IT Needs to Know About PACS* (but is afraid to ask) off-site and had 75 people there.

Why off-site? The vendor tried to get HIMSS to sponsor the session, but was continuously rebuffed. Exasperated, yet needing to get potential IT clients the information they wanted, the vendor saw an off-site session as the only way to get that information out.

Five years later, where is PACS at HIMSS? Still ostensibly persona non grata. Of the 300+ presentations being given at HIMSS this year, only two deal with PACS. What is most fascinating is that, in spite of this seemingly ongoing denial of the importance of PACS in the IT community, over 200 of the 900 vendors showing at HIMSS are directly involved in PACS and imaging.

An entry-level PACS at a small community hospital can cost $250-300K, while a larger facility can easily spend several million dollars. IT needs much more information than just the hardware, O/S, and potential network impact, yet it has few resources for this from its own society.

The dynamics of PACS decision making have also changed significantly in the past few years. Where radiology once stood apart from other departments in how vendor decisions were made, now nearly half (and in some cases more) of the final decision on the PACS vendor of choice falls to the IT department. And where does IT go to get its information? Largely from HIMSS.

With so much geared toward meeting the EHR initiative by 2014, and with it each facility’s share of the $20B in ARRA dollars set aside for healthcare IT, one has to question why PACS isn’t part of the HIMSS educational equation. This is especially important since radiology is second only to cardiology in overall revenue generation.

HIMSS should be commended for its role in ongoing education through virtual conferences and expos, but PACS needs to play a much larger role in this. Vendor Neutral Archives are a hot topic not just from a PACS perspective, but enterprise-wide as well. PACS also plays a huge role in the delivery of images both to the desktop and via the Web, and will play a massive role in the rollout of an EHR.

Some might say that radiology has SIIM as its show, but SIIM doesn’t attract nearly the number of IT professionals or vendors that HIMSS does. Since these IT professionals are already at HIMSS, wouldn’t it make sense if SIIM were a subset of HIMSS? The two entities already work together, and this way everything radiology/imaging related could be seen at one trade show instead of two, providing IT with access to radiology-specific educational sessions as well. It’s worth a try…

Michael J. Cannavo is the president and founder of Image Management Consultants and is a 26-year veteran in the imaging community as a PACS consultant. He has authored over 350 papers on PACS and given over 125 presentations on the subject as well.

 

Something Wonderful
By Mark Moffitt

In this article I’ll discuss the potential future of smart phone operating systems and the impact these changes might have on clinical healthcare IT systems.


Enhancements to Smart Phone Operating Systems

The underlying operating systems of smart phones will become more robust, with improved multitasking and inter-app communication. This will allow developers to integrate native apps built by others that interact with the underlying hardware of a smart phone (phone, microphone, speaker, etc.) with web apps. Web apps seem best for getting and displaying data and consuming services that are unique to a healthcare system, because changes to web apps can be made and pushed out to users much faster than changes to a native app.

These enhancements will enable developers to build hybrid smart phone apps that use, for example, the device’s phone app, one vendor’s dictation or voice-to-text app, and another vendor’s secure messaging app. Developers at health systems will spend most of their time writing web apps that get and display data and consume services unique to the system. Inter-app communication will minimize data entry by users as they switch between native and web apps. The user experience will be similar to using a single app.

Android (Google’s phone OS) has these capabilities, but with limitations. Apple’s iPhone/iPod Touch/iPad does not, but will, I predict, within a year. Microsoft Windows Phone 7 is similar to Apple. Make no mistake: these three vendors (Microsoft, Apple, and Google) are going to drive innovation that will benefit healthcare IT users.
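As a minimal illustration of the hybrid model described above (all record names, fields, and values here are invented for the sketch, not any real system’s API), a health system’s web-app layer might serve chart data as JSON that the phone’s web view renders, while a native dictation app hands its output back through inter-app communication, so the clinician never retypes anything:

```python
import json

# Hypothetical web-app layer of a hybrid app. The identifiers and
# vital-sign values below are invented for illustration only.
VITALS = {"mrn-1001": {"hr": 72, "bp": "118/76", "spo2": 98}}

def get_vitals(mrn):
    """Return a JSON payload for the smart phone's web view to display."""
    return json.dumps(VITALS.get(mrn, {}))

def attach_dictation(note_store, mrn, text):
    """Inter-app handoff: a native dictation app hands back its text and
    the web app files it to the chart, with no retyping by the user."""
    note_store.setdefault(mrn, []).append(text)
    return len(note_store[mrn])

notes = {}
payload = get_vitals("mrn-1001")      # JSON string for the web view
attach_dictation(notes, "mrn-1001", "Patient resting comfortably.")
```

Because the data-display logic lives in the web layer, a change to `get_vitals` reaches every user immediately, while the native dictation app updates on its own schedule.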

From 2010: Odyssey Two:

Floyd: “What’s gonna happen?”
Bowman: “Something wonderful.”
Floyd: “What?”
Bowman: “I understand how you feel. You see, it’s all very clear to me now. The whole thing. It’s wonderful.”

See: http://www.youtube.com/watch?v=OqSml40nwCE&feature=related – start at the 2:08 mark

“Something wonderful” is what physicians, nurses, and other care providers have to look forward to once the use case models of smart phone technology are fully realized. “It’s all very clear to me now.” It will bring software with features that make your work much easier and you more productive, while automatically generating the data needed for reimbursement, decision support, and the legal record.

See: http://histalk2.com/2010/01/18/readers-write-11810/ – second article down

I predict physicians will use smart phones, rather than a computer and keyboard, for 80-90% of their work with electronic medical records: viewing clinical data, real-time waveforms, vitals, medication lists, notes, and critical results notifications; dictation; and order entry.

Disruptive technology (see: http://en.wikipedia.org/wiki/Disruptive_technology) is a term used in business and technology literature to describe innovations that improve a product or service in ways that the market does not expect.

Disruptive technologies are particularly threatening to the leaders of an existing market because they are competition coming from an unexpected direction. A disruptive technology can come to dominate an existing market in several ways including offering feature and price point improvements that incumbents do not match, either because they can’t or choose not to provide them. When incumbents choose not to compete it’s often because the incumbent’s business model blocks them from reacting, aka “feet in cement” syndrome.

Smart phone technology alone is not a disruptive technology in clinical healthcare IT. When you mix smart phone technology with web services for integration and messaging and a virtual database model, you get a disruptive technology.

Smart phone and web services technology will bring nearly an order-of-magnitude change in the price-to-feature relationship of clinical healthcare IT systems or, simply stated, many more features at a much lower cost.

Vendors that offer large integrated clinical systems, such as Epic, Cerner, and McKesson, charge a large premium for an integrated system because the market will pay it. These vendors have built their business model to capture and defend that premium. That premium will shrink to zero over the next decade due to these disruptive technologies. I predict the premium won’t go down without a fight from these very same vendors.

By then, the justification for large, monolithic, integrated, single-vendor systems will have vanished, taking with it a number of vendors encased in obsolete business models. From the ashes of the fallen will rise a new pack of healthcare IT vendors leading the industry.


This process is called creative destruction (http://en.wikipedia.org/wiki/Creative_destruction) and is a by-product of radical innovation, something the USA does better than any other country. While painful for some caught up in the destructive wave, that pain is more than offset by the gains realized by society as a whole during the creative wave of innovation.

Surf’s up!: http://www.youtube.com/watch?v=1j7ID47Nng8


Mark Moffitt is CIO at Good Shepherd Health System in Longview, TX where his team is developing innovative software using the iPhone, a web services infrastructure, and a virtual clinical data repository.


EHR Adoption and Meaningful Use
By Glenn Laffel, MD, PhD


No matter how you approach the issue, it is clear that a serious information technology gap has developed in healthcare. From restaurant reservations to banking records, American information resides electronically across nearly every sector … except healthcare.

Where do we stand? Are the adoption reports accurate? And how will these figures be impacted by the US Government’s economic stimulus investment in health IT? Let’s take a closer look at the numbers.

First, the challenge of tracking EHR use in the US: there are currently varied and discordant definitions of what constitutes an EHR. Here is the reported EHR use from a few different sources.


CDC reports: The Centers for Disease Control and Prevention released a study in 2009 reporting that 44% of office-based physicians were using some kind of EHR system and only 6% were using a fully functional system.

Harvard reports: A Harvard study reported that 46% of hospital Emergency Departments adopted EHRs. The figures dropped dramatically in rural and Midwestern emergency departments.

Patients report: A Practice Fusion survey conducted by GfK Roper in January 2010 found that 48% of patients reported that their doctor used a computer in the exam room during their last visit.

So where do these numbers leave us? We know from these three studies that approximately half of US doctors have started to use some kind of computer system in their practice. An indication that healthcare has taken a major step toward closing the digital divide? Yes. A flawed and limited statistic? Also yes. We don’t know exactly how these physicians are using the reported EHRs and computers. Practices may be doing nothing more than scheduling or billing with their electronic systems, two features that don’t contribute significantly toward improving quality of care.

Meaningful Use will set the bar high. Starting in 2011, we should have a much more detailed perspective on how doctors use EHR technology. The 25 Meaningful Use criteria (currently still in draft with HHS) require demonstrated use of e-prescribing, CPOE, charting, lab connectivity, and more. As the name suggests, the new HHS guidelines will help us understand “meaningful” EHR use meaningfully.

Improving these adoption rates. EHR adoption has been slow in the past due to several factors: high upfront costs for traditional health IT programs ($50,000 or more per user), the high level of IT infrastructure needed for installation and maintenance, and concerns over changing workflows. The $44,000 stimulus for EHR adoption under ARRA removes some of the cost barrier of legacy EHR systems. It also creates a dynamic market in which doctors can compare prices and find affordable solutions to fit their needs.

As the start of the incentive program approaches, it will be interesting for those of us in the sector to track changing EHR adoption rates and see if the government’s hope for exponential EHR adoption growth becomes a reality.

Glenn Laffel, MD, PhD is senior VP of clinical affairs for Practice Fusion.

Readers Write 2/15/10

February 15, 2010 Readers Write 13 Comments

Data Entry and Quality Health Care
By Al Davis, MD


The enthusiasm generated for EHRs by the 2009 ARRA legislation is almost palpable and hospitals across the country are scrambling to install systems at a breakneck pace. Behind the enthusiasm, however, are two issues, related yet disparate, that have been the confounding factors of EHR adoption in the past and will continue to be so in the foreseeable future.

EHRs offer the promise of data aggregation, which can be used to refine clinical treatments for both improved quality and, possibly, lower costs, but this aggregation is dependent upon standardized dictionaries and, importantly, standardized data entry. EHRs currently offer standardized data via the use of templates, boilerplates, and pre-defined order structures. But the standardized data entry model often (usually?) does not completely and precisely conform to the observed signs, symptoms, and problems displayed by patients in the physician’s office, and therein lies the rub.

Patient care, especially when dealing with complex problems, requires the clinician to differentiate subtle distinctions among less than obvious alterations from normal physiology. Shortness of breath, one of the most common problems encountered in the emergency room, can result from problems with the lungs, with the heart, with the vascular system, with the blood, from medications, or simply from pollution or toxins breathed in by the patient, and those are the direct causes. Indirect causes such as intra-abdominal pathology, skeletal deformity or muscle weakness must also be considered.

While there is a high statistical likelihood that shortness of breath will result from one of a relatively small number of potential pathologies, assuming a diagnosis based on statistical likelihood will lead to poor or even dangerous patient care. The reason a pulmonologist trains for 12 or 13 years, and a nurse practitioner for six or seven, is so that the pulmonologist learns not only the underlying basis of the rarer causes of disease, but also how to discern the subtle differences those more unusual pathologies may display. The use of template- or boilerplate-driven clinical notes negates the benefits of the more refined knowledge and experience of the pulmonologist. Requiring the use of such standardized data inputs is antithetical to quality medicine, yet allowing free text entry is equally antithetical to the as-yet-unrealized potential of the EHR. It is this contradiction which has slowed adoption of EHRs and will continue to hinder their use.

The challenge is for IT designers to work out a way for experienced clinicians to be able to commit to the record the sometimes subtle thought processes and observations that lead to their diagnoses, while maintaining enough control and/or discipline over the input to allow the potential of data aggregation to be realized. Monetary issues, regulatory compliance, and usability are important as well, but the paramount concern of the EHR must be to ensure that the best quality patient care can be delivered. If the cost of the input restrictions needed to allow data aggregation is the loss of ability to place nuance and subtlety into the record, the EHR fails that most primary of tasks.

Al Davis, MD is in private practice in Elmhurst, IL.


A Meaningful Ruse?
By Frank Poggio



At the risk of being called a Cassandra, or at best a contrarian, I will attempt to explain why the federal government’s HITECH Act and Meaningful Use (MU) incentive program is a wolf in sheep’s clothing, and why the better response for a provider would be to run, not walk, from this wolf.

First, let’s review the basics. When a hospital or physician practice purchases and implements an electronic medical record (EMR) or computerized physician order entry (CPOE) system before 2011 and files with the federal Department of Health and Human Services (DH&HS) the yet-to-be-developed regulatory documentation to declare its meaningful use (MU), then starting in 2011 that provider will be potentially eligible for an MU bonus payment. For physician practices, that could amount to a total of $44,000 over three years. For hospitals, depending on the number of discharges, somewhere between $2 million and $3.8 million total. These incentive amounts are to be paid over three stages, or years, starting in 2011.

On the other hand, if a provider does not implement an EMR or CPOE, or purchases and implements a system but cannot show meaningful use, then a penalty will be incurred on Medicare payments in years 2015 through 2017. This penalty takes the form of a reduction to the legislated increase in Medicare payments for that year. Note: this is not a reduction in overall Medicare payments, but a reduction of the yearly Medicare inflationary adjustment factor. The first year, the penalty is a 33% reduction of the adjustment; the second, 66%; the third, 100% (in effect, you will get no adjustment at all).

Before I explain why I believe there is a wolf at your door, let me say I am a believer in the benefits of EMRs and CPOE. Both can deliver significant benefits, but not unless they incorporate sound workflow re-engineering processes prior to installation. Unfortunately, few if any of the MU criteria are workflow-focused.

There are at least four major reasons why I believe your facility will never see an MU bonus.

1) MUs are, by the DH&HS’s own admission, a moving target. As stated in the Interim Final Rule (IFR) published in the Federal Register, December 30, 2009, on page 314, “We expect to issue definitions of meaningful use on a bi-annual basis beginning in 2011.” Hence, MUs will evolve over time. That will allow DH&HS to make them as easy or as onerous as they choose. How can you predict you will hit a moving target that you can’t even describe today? And if you believe the Feds may try to make it easier to foster participation, read on.

2) If you hit all but one MU, will you get the full bonus, or 95%, or 50%? Nobody knows and the question is not addressed in any IFR or other documents. I am willing to wager you will get nothing, and my reasoning follows.

3) The federal government has stated they are funding the HITECH program with $34 billion for MU bonuses. They also have stated repeatedly they expect to save over $200 billion to help fund the new national health plan. That’s about a seven-to-one expected payback in only a few years. When was the last time you had a seven-to-one ROI on any IT project over three years? If the feds do not see the seven-to-one payback in time, how many providers do you think will get to cash an MU check?

4) Our government is under extreme pressure to cut the federal deficit. In his recent State of the Union address, the President stated he will freeze the government budget for ‘non-essential’ items to save $250 billion, to alleviate the trillions of dollars in deficits predicted by the OMB. Essential is currently defined as Social Security payments, interest payments on debt, entitlement programs, Medicare benefits, and the defense budget. Taken together, these make up over 80% of total government expenditures. So the freeze has to come from ‘non-essential’ departments and programs. Medicare payments to providers are not considered part of Medicare benefits; they come under the DH&HS/CMS department operating budget. So, although benefits to seniors will not be reduced, payments to providers are fair game. And therein lies our wolf.

I noted earlier that if you fail to purchase and install an EMR/CPOE, you will be penalized by a reduction in the Medicare inflationary adjustment increase in future years. Based on the above reasons, I believe there will be little or no adjustment increase in future years. If you don’t think this will happen, look at what Congress and DH&HS had allocated for the adjustment ‘increase’ in 2010 for physician Medicare payments. DH&HS wants to apply a -21% adjustment to physician payments. Yes, that’s minus twenty-one percent. Then, to get the AMA on board with the national health initiative, the Administration and Congress were going to delay this adjustment, but now even that agreement is up in the air.

On the hospital side of the world, look at what the Medicare adjustment increases have been over the last five years. The most they have been is 2%, and the average is around 1%. If you run those numbers for a typical 200-bed community hospital with Medicare utilization of 50%, the one percent increase amounts to about $300,000. Reduce it by a third, and you will miss out on $100,000 that year. And again, that’s assuming there is any increase at all in future years.
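The arithmetic above, and the point about a zero adjustment, can be sketched directly. The figures come from the example in the text: a base adjustment of roughly $300,000 for a 200-bed hospital at 50% Medicare utilization, and the 33%/66%/100% penalty phase-in.

```python
# Penalty phase-in: the reduction applies to the yearly Medicare
# inflationary adjustment increase, not to overall Medicare payments.
PENALTY_SCHEDULE = {2015: 0.33, 2016: 0.66, 2017: 1.00}

def adjustment_lost(base_adjustment, year):
    """Dollars of adjustment increase forfeited in a given penalty year."""
    return base_adjustment * PENALTY_SCHEDULE[year]

base = 300_000  # ~1% adjustment, 200-bed hospital, 50% Medicare utilization
year_one = adjustment_lost(base, 2015)   # about $99,000: the ~$100K in the text
year_three = adjustment_lost(base, 2017) # the full $300,000 adjustment
no_raise = adjustment_lost(0, 2015)      # 0: a 33% cut of nothing is nothing
```

The last line is the article’s punch line in miniature: if the adjustment increase itself is frozen at zero, the penalty for skipping an EMR/CPOE costs nothing.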

Lastly, let history be your guide. I have worked in the healthcare world for 35 years as a CFO, a CIO, and in a multitude of other roles. As a CFO, I saw Medicare renege on many case mix adjustments, TEFRA adjustments, and DRG adjustments, all in the name of national budget deficits and health care cost controls. At one point, they set up a Medicare Payment Advisory Committee, then disbanded it when the Committee disagreed with too many DH&HS adjustment policies. I doubt the future will be much different; in fact, it will probably be worse.

So, run the numbers again. If in future years the Medicare adjustment increase is zero (because the feds and DH&HS say we can’t afford an increase due to overall deficits and budget freezes), then reducing that zero adjustment increase by 33% will incur how many penalty dollars?

What’s a shepherd to do?

The bottom line: there is no need to “horse in” a new EMR/CPOE, regardless of what vendors say. Secondly, horsing in a system as complex and far-reaching as an EMR/CPOE while hitting the expected glitches along the way is going to cost you far more than any Medicare adjustment penalty.

My advice … take your time, do it right, and install the components that will give you the most ROI the fastest. And watch out for the wolves.

Frank L. Poggio is president of The Kelzon Group.

Accurate Patient Identification and Privacy Protection – Not an “Either/Or” Proposition
By Barry Hieb, MD


Whether you support federal government funding of HIT or not, it can’t be denied that healthcare is undergoing a major revolution as more and more clinical automation capability is being adopted. Funding of HIE projects, building toward the Nationwide Health Information Network (NHIN), will further these efforts. And clearly progress is being made, as noted in the recent KLAS report that verifies 89 active HIEs across the US. The ultimate vision of regional clinical information exchange crosses political, operational, and geographic boundaries using the NHIN’s network of health information exchanges.

However, we’re not addressing one of the most significant challenges that must be overcome for this scenario to work: the ability to accurately identify patients whose information may be scattered across a number of providers using disparate HIT applications and platforms.

The current state-of-the-art approach for patient identification centers on EMPIs that identify patients using demographic matching techniques. But industry experience indicates that EMPI matching techniques are only accurate 90 to 95% of the time, introducing a variety of potential errors in care delivery within and across provider organizations.
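To make the failure mode concrete, here is a toy sketch of demographic matching. This is not any vendor’s actual EMPI algorithm; the fields, weights, and threshold are invented for illustration. The point is that two records for the same person that differ in a single field can fall below the match threshold:

```python
# Illustrative weights only; real EMPIs use far more sophisticated
# probabilistic and phonetic matching than this exact-match toy.
WEIGHTS = {"last": 0.4, "first": 0.2, "dob": 0.3, "zip": 0.1}

def match_score(a, b):
    """Naive weighted exact-match score between two demographic records."""
    return sum(w for field, w in WEIGHTS.items() if a.get(field) == b.get(field))

rec1 = {"last": "Smith", "first": "Jon",  "dob": "1970-02-01", "zip": "60126"}
rec2 = {"last": "Smith", "first": "John", "dob": "1970-02-01", "zip": "60126"}

score = match_score(rec1, rec2)  # about 0.8: "Jon" vs "John" costs the name weight
is_match = score >= 0.9          # False: same patient, missed by the matcher
```

A missed match like this splits one patient’s history across two records; an overly loose threshold instead merges two different patients. Either error propagates across every provider in the exchange, which is why the 90-95% accuracy figure matters.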

We know the answer to the problem — issue each patient a unique identifier that would be used to label their information across all participating provider locations. In fact, the 1996 HIPAA legislation mandated just such an individual healthcare identifier. But in 1998, Congress reversed itself on the patient identification issue based on valid concerns about the inability to protect the privacy of this data, and forbade the expenditure of federal funds on further pursuit of this essential component of accurate patient identification and data exchange. Since that time, there has been virtually no progress on this issue at the federal level, although recently a number of states have begun to pursue statewide identifiers to support their HIE projects.

Since leaving Gartner in 2008, I’ve been working with Global Patient Identifiers Inc. to build out the Voluntary Universal Healthcare Identifier (VUHID) system under the umbrella of a non-profit, private enterprise. The VUHID system is based on over 20 years of patient identification standards work done by the ASTM International E31 medical informatics group, and proposes a solution that is both inexpensive and effective.

The VUHID system communicates with the EMPI system at the heart of each HIE. It issues identifiers upon request and maintains a directory indicating the sites that have information for each identifier. The VUHID system has been specifically designed to enhance the privacy of clinical information because it has no identifiable patient data — only the locations where each identifier is recognized.

VUHID identifiers are globally unique and are designed to support activities that the patient or others indicate need to be handled with privacy. The VUHID system represents a secure, cost-effective, currently available solution to enable error-free patient identification that extends across political and organizational boundaries.
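The privacy property described above can be sketched in a few lines (the identifiers and site names are hypothetical, and this is an illustration of the design idea, not the actual VUHID implementation): the directory maps each identifier only to the sites that hold data for it, never to demographics or clinical content.

```python
# The directory holds no identifiable patient data:
# it maps identifier -> set of sites, nothing more.
directory = {}

def register(identifier, site):
    """Record that a site holds information for this identifier."""
    directory.setdefault(identifier, set()).add(site)

def sites_for(identifier):
    """Answer the only question the directory can: where to look."""
    return sorted(directory.get(identifier, set()))

register("VUHID-0001", "hie-midwest")
register("VUHID-0001", "hie-coastal")
sites = sites_for("VUHID-0001")  # ['hie-coastal', 'hie-midwest']
```

Even a full breach of such a directory reveals only which sites recognize an opaque identifier; linking that identifier to a person requires data the directory never stores.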

Barry Hieb, MD is chief scientist for Global Patient Identifiers, Inc.
