Readers Write 11/1/10

November 1, 2010 Readers Write 6 Comments

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

A Step Towards the Cloud
By Mark Moffitt

People tend to use the terms SaaS and cloud interchangeably, when in fact, they are two different things.

Software as a Service (SaaS) delivers applications over the Internet as a hosted service, eliminating the need to install and run them on the customer’s own computers and simplifying maintenance and support.

Cloud computing is about using economies of scale and sharing cheap, commoditized computing resources to lower overall costs. To realize these economies of scale, large data centers are built and managed to protect and secure customer data at the lowest possible cost. These data centers are huge (see photo below).

Cloud software takes full advantage of the cloud paradigm by being service-oriented with a focus on statelessness, low coupling, modularity, and semantic interoperability. Cloud storage uses shared-nothing, distributed data stores so that low-cost, commodity storage technology can be utilized. Traditional relational database management systems (RDBMS) don’t fit these storage models because they rely on joining data from multiple tables, a requirement that is incompatible with the distributed storage configuration found in cloud storage services.
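
As a purely illustrative sketch (the record fields, key scheme, and in-memory “store” below are invented for this example, not taken from any particular product), a clinical result can be denormalized into a single document and fetched from a key-value store by one key, with no joins:

```python
import json

# A lab result denormalized into one self-contained document so it can be
# retrieved by a single key, with no joins across patient or provider tables.
result_document = {
    "patient_id": "12345",
    "patient_name": "Jane Doe",          # duplicated from the patient record
    "ordering_provider": "Dr. Smith",    # duplicated from the provider record
    "test": "Basic metabolic panel",
    "collected": "2010-10-28T08:15:00",
    "values": {"Na": 140, "K": 4.1, "Cl": 102, "Glucose": 95},
}

# In a key-value store, the "schema" is just a key naming convention.
key = "results/{patient_id}/{collected}/{test}".format(**result_document)

store = {}                               # stand-in for a cloud object store
store[key] = json.dumps(result_document)

# Retrieval is a single lookup by key.
print(json.loads(store[key])["values"]["K"])
```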

Google’s Dalles, OR data center on the Columbia River

On the banks of the windswept Columbia River, Google is working on a secret weapon in its quest to dominate the next generation of Internet computing. But it is hard to keep a secret when it is a computing center as big as two football fields, with twin cooling plants protruding four stories into the sky. New York Times, June 8, 2006

Few HCIT vendors have architected their systems for the cloud. The good news is that healthcare systems don’t have to wait for HCIT vendors. They can take advantage of cloud computing today by storing and archiving clinical results such as lab results, transcribed reports, images, and waveforms in the cloud.

Clinical results are well suited to take advantage of cloud storage for reasons such as:

  • Results do not require a schema or other features of an RDBMS to store and access data. Yes, that includes lab results.
  • Key-value (object) stores are better suited for storing results than RDBMS.
  • Key-value data stores can use cloud storage technologies that are less expensive than using a vendor’s RDBMS to store and archive data.
  • Clinical results often need to be shared beyond the walls of an organization and are, therefore, ideally suited to being stored in the cloud.

Amazon’s S3 cloud storage prices run about $18,000 per year for 10 terabytes of data. These prices include storage, archiving, and security. 500 terabytes is priced around $800,000 per year. There are additional fees related to access, but these numbers give the reader a ballpark estimate of the price for the service. Other vendors such as Google and Rackspace offer similar services at about the same price.
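
Working backward from those ballpark figures (illustrative arithmetic only; actual cloud pricing is tiered, changes over time, and excludes the access fees mentioned above), the quoted prices come out to roughly 13 to 15 cents per gigabyte per month:

```python
# Back-of-the-envelope conversion of the article's annual figures to a
# per-gigabyte monthly rate.
def per_gb_per_month(annual_cost_usd, terabytes):
    gigabytes = terabytes * 1024          # TB -> GB
    return annual_cost_usd / gigabytes / 12

print(round(per_gb_per_month(18000, 10), 3))    # ~0.146 USD per GB per month
print(round(per_gb_per_month(800000, 500), 3))  # ~0.130 USD per GB per month
```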

Other potential costs include deploying a system to provide local caching of frequently used cloud data. This is accomplished by deploying a hybrid cloud that includes local storage, as depicted in the diagram below.

[Diagram: hybrid cloud combining local caching with cloud storage]
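
A minimal sketch of the caching idea, assuming a local disk cache sitting in front of a generic object store (fetch_from_cloud is a hypothetical stand-in, not a real vendor API):

```python
import os

CACHE_DIR = "/var/cache/clinical-results"

def fetch_from_cloud(key):
    # In a real deployment this would call the cloud storage service
    # (for example, an HTTP GET against an object store). Stubbed here.
    return b"...result bytes..."

def get_result(key):
    local_path = os.path.join(CACHE_DIR, key.replace("/", "_"))
    if os.path.exists(local_path):            # cache hit: serve from local disk
        with open(local_path, "rb") as f:
            return f.read()
    data = fetch_from_cloud(key)              # cache miss: go to the cloud
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(local_path, "wb") as f:         # populate the local cache
        f.write(data)
    return data
```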

Savings are real and immediate when an organization pursues a cloud storage strategy for clinical results while replacing hardware, such as when moving from MEDITECH Magic to 6.0, migrating from MEDITECH Magic to Cerner, or upgrading an image archive system. Cloud storage can eliminate the need for hardware and software that would otherwise be needed to store and archive existing and future clinical results.

It seems to me that cloud storage is a better model for an HIE than depositing clinical results into yet another fixed-schema RDBMS. The reasons are:

  1. Providers must maintain a copy of results to meet legal and reimbursement obligations.
  2. Providers save money by storing and archiving clinical results in the cloud.
  3. HIE organizations can use clinical results stored in the cloud and focus their efforts on services unique to an HIE, such as electronic opt-in/opt-out functionality, security, and record locator services for clinical results, as a way to offer personalized EHRs to patients.

The transition to cloud computing in HCIT will take years as the business case becomes financially and operationally attractive compared to alternatives and as customers understand and accept the new paradigm of cloud computing and cloud storage. The transition will not be a waterfall event, but rather a gradual diffusion of the technology into HCIT. Storing, archiving, sharing, and securing clinical results in the cloud may be the first step in moving HCIT to the cloud.

Mark Moffitt, MBA, BSEE, is a former CIO and is working as a consultant while looking for his next opportunity.

Why IT Can Never Be Irrelevant
By Shubho Chatterjee

Over the last few years, journals, trade magazine articles, editorials, and even a book (Does IT Matter? by Nicholas Carr) have prognosticated the irrelevance and strategic demise of IT. Many thought-provoking articles and blogs have debated the pros and cons of this prognostication.

I am going to add one more and argue that IT can never be irrelevant to an organization, strategically or operationally. Here is my argument.

Firstly, IT is a discipline, much like engineering, finance, marketing, and others. Within engineering and finance exist multiple disciplines. As long as the world exists, both disciplines will exist. IT is similarly an assembly of different disciplines providing a very important outcome. Do we come across arguments that engineering or finance is irrelevant? No. The same reasoning negates the IT “demise” argument.

Secondly, following from the first argument, can we imagine today any organization operating without technology and IT? Take out IT — ERP, EMR, CRM, data networks, Web sites, ad infinitum — from any organization and the entire organization will collapse. Who plans for this, who should strategically plan for this, and who operates these systems? IT. IT is probably the most critical component of a functioning organization.

Thirdly, let’s examine IT functions and how they provide context to this debate. At the lowest level, Tier 1, is basic infrastructure support such as help desk, network management, telecommunications support, and others. These activities are very commoditized and often outsourced, on-shore or off-shore. Outsourcing has also lent support to the “IT irrelevance” thought camp. But let us examine what happens and who does it.

Even when such functions are outsourced, somebody in IT still has to do the work, even though it is done by another organization. Often the outsourced employees are absorbed into the outsourcing organization. Therefore, in this case, we cannot say that IT is irrelevant — the function and activity have shifted organizationally and are also managed by the vendor’s IT. The outsourced vendor relationship is also managed by the customer’s IT organization. Similar arguments hold for application development and support activities. For off-shored activities, the job losses are a fact, but they do not make IT irrelevant.

At the middle level (Tier 2) of IT operations, let’s say at the business analyst, project management, vendor management, or network operations management levels, the IT aspects are critical. For example, business analysts are key to developing the requirements and pipelines for IT product or service development and delivery, and IT vendor managers are key to selecting, evaluating, and managing vendor relationships. Can any other discipline perform these functions? No. Why? Because these activities require domain knowledge and experience. For example, who other than IT can plan how a wireless network will integrate with a wired network to provide point-of-service use of an EMR for medication management at a patient’s bedside?

Finally, no other function can be responsible for performing and meeting the strategic technology requirements. Here, IT leadership is key in determining and ensuring the alignment of organization business strategies with technology strategies.

Consider the example of Miami Jewish Health Systems, which runs its EMR, HR, Enterprise Content Management, and other applications in a cloud (SaaS) environment. The strategic planning and business case for moving to a cloud environment were completed by IT leadership, in collaboration with executive management, as were the tactical and operational aspects.

IT is uniquely positioned to provide results-oriented technology and process leadership to an organization. The future also holds enormous significance for IT, not only in healthcare, but in all industries. Let’s think about the healthcare landscape and the technology leadership requirements. For example, how will Accountable Care Organizations (ACOs) function, who will plan and implement the strategic ACO technology requirements, how will cloud computing change service delivery, and how will data security be impacted at all levels? These are some of the many strategic questions that require deep IT involvement.

I believe IT can never be irrelevant. The discussion, while sensational, is moot.

These opinions are mine and do not reflect current or previous employer views.

Shubho Chatterjee, PhD, PE, was formerly chief information officer of Miami Jewish Health Systems of Miami, FL.

What Tom Munnecke Is Thinking About Today

I exchanged e-mails with Tom Munnecke after mentioning his VistA-related Congressional testimony. I was fascinated with his 1998 HealthSpace concept paper and asked him if he had updated it or what he was thinking about twelve years later. Here is his reply.

My thinking now largely deals with the deeper implications of time. Here’s a talk I gave at the International Society for the Study of Time and some more in this interview from 2005 for the Pew Internet Visionaries. 

I’ve also been very interested in the physics of anticipation. As this relates to health IT: a deeper understanding of what is sometimes called the placebo effect, but in a broader sense is the self-referential feedback loop between our anticipation of the system and its net effect on us. Also, the need to support the notion of flow or state in our communication systems.

The Web was built on a stateless protocol, but health information is very stateful, linking things over time. So, I think a "diachronic" model (the flow of things over time) is a critical addition to our current "synchronic" model (everybody synchronizes their transactions, protocols, interfaces, and standards to the present).

VistA was designed to be an evolutionary approach from the get-go. We created a "good enough" seed system and planted it to see it grow. As I’ve learned in my studies of complex adaptive systems (Stu Kauffman in particular), the most critical factor shaping evolution is the fitness function, the metric by which "survival of the fittest" is determined.

In VistA, this fitness function was user acceptance. If people didn’t like or use a module, then it wasn’t fit and fell off the evolutionary path. The finer the granularity of these experiments and the quicker you can get a lot of feedback, the faster you can accomplish the error-making and error-correcting evolutionary process. When you try to do a $100 million centrally-planned change, you lose this graceful process and end up in front of a Senate panel asking what happened when it inevitably crashes.

I think we need to come to grips with the notion of personalization (see my 1999 "personalizing health" paper) beyond just today’s Facebook craze. While the HHS/ONC focus is weighted to the enterprise-centric (aka the Disease Industrial Complex), turning patients into "consumers," I think we need to turn the healthcare system upside down, putting the patient at the top and the providers as supporting elements. I talked about this a bit in the opening chapter (co-authored with Rob Kolodner) of Person-Centered Health Records: Toward HealthePeople.

What we are seeing now is a heroic battle between rigid, hierarchical top-down control (Blumenthal telling vendors, for example, that it is "imperative" that they support less-insured populations) and grassroots, peer-to-peer, Net-based activities (Facebook, Patients Like Me, Cure Together). Looking at the evolutionary fitness functions, I think that the grassroots will eventually win out, but only if the proper constraints can be applied (Tim Berners-Lee constrained the evolution of the web to TCP/IP, for example, a "good fence" that made "good neighbors").

So, I think we need to rethink health IT as a "space" rather than a "system." Perhaps people think that we can keep adding thousands of pages of legislation per year to the 125,000 we already have and end up with a "more perfect" health care system, but sooner or later we are going to have to declare a complexity crisis and admit that our intellectual paraphernalia for dealing with health care is inadequate.

It’s a bit like Tim Berners-Lee trying to create the Web by going to the UN and asking the UN High Commission on Innovation to create a Web subcommittee, which would then create global subcommittees and standards for specific applications. The sub-subcommittee of the high commission would meet with all the auction houses to collect all the stakeholders (Christie’s, Sotheby’s, etc.) to create an integrated approach respectful of all parties and complying with all international regulations, UN regulations, etc. The very thought that Pierre Omidyar would write a simple program to auction off a broken laser pointer and turn it into eBay would be totally beyond belief 🙂

Yes, I’ve been doing a lot of thinking about the future of health and health care IT and dropping notes into my blog. Try the tags for VistA and AHLTA. You can read some of my early thinking at the bottom of this page. And here is some of my early thinking on the personal health record.

Tom Munnecke is a leading expert on healthcare IT, having been involved in the creation of both the VA’s VistA and the DoD’s CHCS and having served as VP and chief scientist of SAIC. He is a consultant, entrepreneur, and board member of several health IT startups. He holds frequent workshops, salons, and networking events in a cabana at his home in Encinitas, CA.

Dreaming IT to Reality
By Ron Olsen

For years as a hospital IS manager, I had the tag-line of ‘Dream It To Reality’ in my e-mail signature. I meant that. You dream it and I, the humble IT guy, will do my best to bring it to reality.

Einstein once said, “Innovation is not the product of logical thought, although the result is tied to logical structure.” Thinking about that quote, I realized that to truly innovate, you don’t necessarily have to think illogically, but you do have to think outside the sandbox you play in every day.

To meet the ever-changing needs of your organization, you have to empower your IS/IT team to approach problems from different angles — every day — and to not be afraid of failing once in a while. The logical structure is all around us, so when looking at processes, give everyone the freedom to question what you’re doing, at all levels.

With the many masters a hospital IT staff serves, what was once good enough for yesterday will never be good enough for tomorrow.

Ron Olsen is a product specialist at Access.

Readers Write 10/20/10

October 20, 2010 Readers Write 7 Comments

Note: we asked two consulting company executives to respond to a reader’s question: “Most physician offices say they are waiting for their EMR vendor to let them know how their systems will handle ICD-10 before they do any of their own prep work. Is this common? Vendors seem to be quiet on the subject.”

Preparation for ICD-10
By Peter Butler

From what we are seeing and hearing from the healthcare organizations we work with, the larger IDNs and health systems are, in general, addressing ICD-10 readiness through appointed committees that head up ICD-10 planning. It is the smaller physician medical groups that are taking a “wait and see” approach to ICD-10 and vendor readiness.

In one conversation with a medical group CEO who is also a practicing pediatrician, I was told that his concerns related to ICD-10 were minimal. His view was that it is mostly an IT issue. There is a small subset of ICD-9 codes he uses regularly today, and with ICD-10 that list will grow slightly, but nothing that will require a major amount of education or training.

We’ve seen many of the major IT vendors saying they are investing in ICD-10 readiness today. They are still doing their own due diligence internally before communicating details and specific plans to their customers, which is why your reader may not be hearing much from the vendors.

I was recently visiting with a vendor who has made ICD-10 and Meaningful Use their top priorities and slowed other R&D efforts to focus more resources on these two initiatives. We believe that the majority of vendors will deliver ICD-10 compliant upgrades in reasonable timeframes.

For providers, taking a “wait and see” position is dangerous, as ICD-10 codes will affect all services in all settings, and therefore all reimbursement. Providers must begin to inventory all of their vendor systems to determine their ability to accommodate the EDI v5010 enhancements and expanded character sets. Workflows need to be inventoried so organizations can understand where testing and mitigation need to be planned. There are many constituents (i.e., insurance companies, labs, etc.) that also need to be managed, and contracts must be reviewed to minimize the impact on provider reimbursement.
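
As one illustration of what that inventory involves: ICD-9-CM diagnosis codes run three to five characters, while ICD-10-CM codes run three to seven alphanumeric characters beginning with a letter, so any field length or validation rule sized for ICD-9 needs to be checked. The patterns below are simplified approximations for illustration only, not complete validators:

```python
import re

# Simplified format checks (not exhaustive): ICD-9-CM diagnosis codes are numeric
# or V/E-prefixed, 3-5 characters; ICD-10-CM codes start with a letter and can
# run to 7 characters with an alphanumeric extension.
ICD9_DX = re.compile(r"^(V\d{2}|E\d{3}|\d{3})(\.\d{1,2})?$")
ICD10_DX = re.compile(r"^[A-TV-Z]\d[0-9A-Z](\.[0-9A-Z]{1,4})?$")

for code in ["250.00", "E11.9", "S72.001A"]:
    print(code, "ICD-9:", bool(ICD9_DX.match(code)),
          "ICD-10:", bool(ICD10_DX.match(code)))
```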

ICD-10 needs to be viewed more broadly than just complying with government regulations. The ICD-10 code set provides organizations with new opportunities to enhance their revenue stream. The key is to begin now and prepare a clearly defined transition plan.

Peter Butler is president of Hayes Management Consulting of Newton Center, MA.

Preparation for ICD-10
By David Vreeland

I’d say that the burden of implementing CMS V5010 and ICD-10 is largely going to be borne by the vendors, but it’s always the responsibility of the organization’s leaders to ensure that the organization is compliant with such regulations.

In a hospital, there are typically many more information systems in production, so the burden on the organization is larger: it needs to ensure that it has a handle on all of those vendors and determine what the plan is for accommodating the change to these new code sets across the various IT providers. It also will likely need a testing plan in place for interfaces, downstream system compatibility, etc.

On the ambulatory / physician practice side, I’d say that the approach is the same, but the complexity is likely significantly lower. But simply waiting until the vendor takes action is ill-advised.

As a physician, I’d be requesting information from my vendors about the development plan and timeline for these enhancements, and if the practice operates a practice management system that is provided by a different vendor than the EMR solution, I’d be looking at moving to an integrated solution. Most vendors we work with have a plan and timeline either in mind or on paper by this point, and it’s appropriate to ask for it.

David Vreeland is a partner with Cumberland Consulting Group of Brentwood, TN.

Back to School – For a Master of Biomedical Informatics Degree
By Jeremy Harper

With the recent influx of government funding in healthcare, educational opportunities abound. I have been lucky enough to receive a scholarship to Oregon Health & Science University’s (OHSU) healthcare informatics program. My passion is working with healthcare organizations to ensure patient safety and advance innovative technologies. This article will cover how I found the OHSU opportunity and why I decided to attend a program that required moving to the other side of the country.

An elevator story about my personal background: I attended The Ohio State University’s business program from August 2003 to December 2006, receiving a BS in General Business. I worked at Epic Systems from February 2007 to April 2009 and discovered a passion for healthcare IT.

After I left that company, I took the GRE and scored well, but not brilliantly. I applied and was accepted to three graduate schools for healthcare informatics (Capella, UIC, UW Milwaukee), but I failed to procure funding, so I accepted a full-time position at an amazing hospital as a systems analyst. A month after starting, I transitioned to being the secondary interface engineer, and over the past year I spanned both positions.

While I found my work environment to be an amazing experience, I had a passion for education and furthering my career. To do this, I needed either to gain further professional experience or to consider specific degree programs. I decided to find an online program that would allow me to continue to gain real-world experience while furthering my education and qualifications.

To this end, I researched available programs online and sent a letter to Mr. HIStalk to ask which programs he could recommend. He came back with a number of programs, among them OHSU as one of the leading online programs. I went to their website (along with the others) and found that OHSU had received a grant that would fully fund a one-year online certificate program and a few master’s degree students. I applied and was accepted to the master’s degree program.

That application was not instantaneous, nor was it free. However, spending $358.72 (including the A&P online course I took to be eligible for the program, but not the sunk cost of my GRE from 2009) was a small price to pay for a fully funded degree program with a stipend. The program itself will take six quarters, two of which will include an internship. This fall, I am taking courses in Java, scientific writing, and introductions to biomedical informatics, biostatistics, and healthcare. The scholarship provides me the freedom to focus solely on my education rather than needing to balance it with work.

There have been opportunity costs. Nothing is free, even on a scholarship. The highest costs are the same ones anyone attempting a work/life balance will have to face. My personal situation means I have a fiancée 2,800 miles away in Ohio, and I will have limited time to see her until we are married next year. I left a position where I enjoyed my co-workers and found the work itself exciting and fun. I moved with only what could fit in my Toyota Corolla (far more and far less than you’d expect). I have needed to find a roommate because of my budget, and I have had to budget my funds closely to ensure I will be able to attend school. All are the kinds of sacrifices anyone going back to school will have to consider.

If you are considering further education, now is the time to look into opportunities. OHSU, for example, will be funding hundreds more certificate program students, leaving those students halfway to receiving an MBI degree from the program. If you visit the ONC website, you can research and find additional schools that have been funded. The ONC has also funded community colleges around the nation for a workforce re-education model that will put folks through an approximately nine-month program educating them on healthcare informatics.

Jeremy Harper is a student at Oregon Health & Science University of Portland, OR.

Note: the following original article exceeds the usual word limit, but was valuable enough for its content and citations that I thought it was worth running intact.

Customer Relationship Management in Healthcare
By Lindsey P. Jarrell

Consumerism is playing an increasingly important role in healthcare, one that hospitals need to address in order to deliver the level of service that patients are starting to expect. In fact, according to a 2009 survey of healthcare consumers by the Deloitte Center for Health Solutions, consumerism is such a powerful force in healthcare that it is a “defining characteristic between its past and its future that will impact every stakeholder’s value proposition and business models. Consumerism is not a fad; it is a trend of enormous significance.”[1]

Today’s consumers are highly attuned to the level of service in healthcare and their attitudes and behavior reflect this. Roughly one in four has switched or has considered switching hospitals, clinics, or doctors because of a negative customer service experience.[2] Slightly more than half of customers report that they choose hospitals “based on whether they believe employees understand their needs.”[3]

Consumers have many choices when selecting their healthcare providers and they are beginning to exercise their options. Almost one-third report comparing doctors before choosing one and 15% compared hospitals.[4] Unfortunately, healthcare consumers believe the system is performing poorly: 76% grade the system as “C” or below.[5]

Customer relationship management (CRM) is an approach used in many industries that focuses on addressing the unique needs of customers to increase value for both the customers and the organization.[6] CRM software is currently used in only about 15% of hospitals, but it is a growing trend.[7] It can help streamline operations to handle the multi-headed juggernaut of attempting to compete for lucrative customers, control costs, improve profitability, and foster a customer-focused cultural climate.

Today’s Healthcare Consumer

A growing number of consumers want to be actively engaged in their health. They compare doctors, hospitals, medications, devices, and health plans; explore alternatives to conventional approaches; and spend money to achieve their health goals.[8] They want to control their health information and prefer providers who use Internet-based tools to augment care.

The 2009 Deloitte survey showed that 57% want a secure Internet site that would enable them to access their medical records, schedule office visits, refill prescriptions, and pay medical bills. Forty-two percent of health care consumers want access to an online personal health record connected to their doctor’s office, one in five would switch physicians to obtain such access, and consumers are less concerned about privacy and security issues than in the past.[9],[10] Many (62%) believe that hospitals vary with respect to quality.

Because they are increasingly sensitive to errors, poor service, and the lack of useful tools that would enable them to navigate the system more effectively, consumers are receptive to innovations that offer greater value, better service, higher quality, and lower costs. What’s more, they embrace innovations that enhance convenience, personalization, and control of their personal health information. Consumers, especially those who are younger, are willing to try new services and change providers in order to obtain better value. They are highly receptive to technology that eliminates redundant paperwork and unnecessary tests and saves time and money.[11]

Why CRM?

It’s not surprising that consumers are often dissatisfied with their healthcare experience. Today’s healthcare environment is fragmented and complex, with numerous entities controlling access to information that exists, yet is inaccessible to both providers and patients. A lack of integration and workflow impedes the ability to deliver complete, accurate patient information, which has a negative impact on patient satisfaction and quality of care.

In seeking better tools to manage patients across the continuum of care, healthcare providers are turning to CRM software because it offers several components to address these issues. It provides integrated business systems that serve the medical staff, the administrative staff, and hospital stakeholders while also directly serving customers, giving them easy access to their healthcare history and on-demand knowledge of potential remedies.

Effective CRM systems are starting to integrate personal health records with the hospital’s data to provide a system for managing care-related activities, costs, and benefits, and enabling patients to have better online access to enhance the management of their healthcare. The benefits of this approach include:

  • The ability to analyze the performance of routine processes over time (such as admissions, discharges, transfers and referrals) in order to eliminate unnecessary steps and increase patient satisfaction.[12]
  • Developing customized workflows to automate care coordination activities between provider organizations (e.g. physician office, hospital and home health) which can lead to improved patient outcomes, increased operational efficiency, and reduced costs.[13]
  • Proactively managing chronically ill patients (e.g., diabetes and congestive heart failure) to target them with communications about educational offerings and remind them of ways to manage their illness.[14]
  • The ability to improve care coordination and reduce the risk of patient readmission.
  • Reducing costs by consolidating systems and pooling resources to obtain economies of scale, improving utilization of appropriate healthcare resources, and understanding the cost of treatments to drive business planning.
  • Preventing and mitigating medical errors by integrating CRM data with medical history and clinical data.[15]
  • Generating marketing campaigns targeted at specific patient types by combining a knowledge base with scientific analytics and feedback mechanisms.[16]

With the advent of electronic medical records and the infusion of federal stimulus money that is helping to drive the widespread adoption of technology, CRM software may well be the next logical step in the increasing reliance and utilization of IT in healthcare.

CRM Components for Healthcare

Companies such as Siebel Systems, Salesforce, HealthForce and SalesBoom offer CRM solutions that are tailored to the needs of large and small providers. These systems often include the following components:

Integrated Data

In many industries, the majority of revenue comes from existing customers and healthcare is no different: about 80% of annual revenue comes from patients who have previously used the system.[17] Integrating enterprise-wide data is therefore a key component to improving customer service.

An integrated database allows hospitals to collect data, analyze individual needs and preferences, develop relevant messages based on these needs and preferences, and deliver communication through preferred channels (e.g., text messages, e-mails and phone calls). It requires an integrated combination of data and application programs to support analysis, opportunity identification, data mining, and communications management.[18]

Such a system is equipped, for example, to determine which patients are at greatest risk for disease or complications, allowing the hospital to provide appropriate interventions and communications at the right time. It can also help track and improve other processes, such as check-in procedures. The result is a more personalized relationship between providers and patients that increases patient satisfaction.

Customer Care and Recovery

The trend toward consumerism in healthcare means that patients expect to be treated as customers. One in four patients say poor experiences at hospitals or clinics have caused them to use or think about using walk-in centers as an alternative.[19] In its 2008 Hospital Pulse Report, Press Ganey found that the larger the hospital, the lower the overall patient satisfaction rate.[20] Coupled with the fact that the majority of hospital revenue is from repeat business, this means that hospitals need to find ways of increasing customer satisfaction — including rectifying mistakes — so that revenue is not lost.

CRM software solutions can facilitate the collection of patient-related information from a consumer perspective, facilitate complaint management by allowing hospitals to capture, review, approve, and access information about solutions to existing and past problems, and collect feedback data that can be used to improve operations. Feedback also helps mitigate risk in an environment in which government agencies are continually monitoring hospital performance.[21]

Predictive Modeling

CRM software can allow hospitals to predict which patients are at risk for developing certain conditions and to identify those already diagnosed who are likely to develop complications, creating an opportunity for preventive interventions instead of the more expensive treatments that may otherwise be required for acute episodes or chronic disease.[22] For example, predictive modeling can take into account co-morbidity, severity, frequency, physician, and specialty data to predict the likelihood of a patient with diabetes developing heart disease or the chance of a patient with hypertension developing glaucoma. This translates to earlier disease discovery, better management, improved intervention, and more relevant communications.[23]
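
A toy sketch of the kind of model behind such predictions, using entirely made-up features and labels (a real system would draw on much richer clinical and claims history, and the library choice here is just one common option):

```python
from sklearn.linear_model import LogisticRegression

# Each row is a fabricated patient profile: [comorbidity count, severity score,
# visits per year]; the label marks whether a complication later occurred.
X = [
    [0, 1, 2],
    [1, 2, 4],
    [3, 3, 8],
    [4, 3, 10],
    [0, 1, 1],
    [2, 2, 6],
]
y = [0, 0, 1, 1, 0, 1]

model = LogisticRegression().fit(X, y)

# Estimated probability of a complication for a new (made-up) patient profile,
# which could drive targeted outreach or preventive interventions.
print(model.predict_proba([[2, 3, 7]])[0][1])
```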

Marketing

CRM-driven marketing can allow hospitals to deliver the right message to the right person at the right time. A comprehensive CRM database and analytical software can predict the likelihood of patients to require specific preventive interventions or develop certain health conditions. By leveraging CRM data, hospitals can implement customer-specific outreach to educate both diagnosed and undiagnosed patients.

For example, one hospital implemented a campaign targeted at diabetes patients. This involved mailings that included offers for free glucose screening and nutrition classes, as well as discounted diabetes and cholesterol screenings. The campaign resulted in incremental patients in three categories: patients with a first-ever diabetes diagnosis, patients who used services who had been undiagnosed, and patients who used services who had been previously diagnosed.[24]

CRM software is complementary to both revenue cycle applications and electronic medical records within physicians’ offices and hospitals. One has only to think of the type and frequency of e-mails from retailers (e.g., hotel chains announcing special deals at exotic locations) that are carefully placed marketing campaigns based on a specific customer’s previous buying experience and profile. The power of CRM lies in combining data collection, information management, and market targeting vehicles to create a proactive marketing approach that can increase the customer base.[25]

CRM Making a Difference

CRM has been successfully used to help hospitals capitalize on their data to increase patient satisfaction and boost earnings. Today, many hospitals are demonstrating a substantial ROI from implementing a CRM program. Below are a few examples of CRM at work:

  • Children’s Hospital and Research Center at Oakland faced declining referrals and revenue, incomplete process follow-through, and decreased patient satisfaction. Using a contact center CRM strategy, the hospital saw a 22% increase in overall referrals and a 50% improvement in both patient and referring physician satisfaction levels.[26]
  • A group of six Florida hospitals used CRM tools to launch a direct mail campaign that generated $1.9 million in new revenue in three months.[27]
  • CRC Health required a platform to manage patient intake, track Web entities, and streamline operations to increase revenue. A CRM system enabled the company’s Web-generated revenue to jump from 4% to 26%. The company can now tie revenue to referral performance, boosting its growth potential. A tool to track web marketing effectiveness indicates to the dollar what is performing and what is not and the system even provides patients with available beds faster. As a result, CRC Health can serve a larger population.[28]
  • Cedars-Sinai Medical Center wanted to improve low call-to-appointment conversion rates and patient satisfaction. The hospital designed a comprehensive contact center-based CRM strategy that improved appointment conversion rates from 22% to 48% and patient satisfaction by 42% over the first year. During that time, more than $3 million was generated in incremental revenue.[29]

Challenges, Tips and Insights

Implementing CRM software can be challenging. It needs to incorporate a variety of security safeguards that address patient confidentiality and privacy as well as HIPAA compliance. A CRM system can also be costly and time-consuming to get up and running.

Naysayers may point to past efforts of hospitals to implement CRM systems that have failed. But the landscape of healthcare is changing, and CRM can be a valuable tool to help hospitals adapt to the trend toward consumerism and transparency. IT capabilities and technological advances have paved the way for more sophisticated second-generation software-as-a-service platforms and CRM has become both more affordable and more user-friendly.[30]

As with any widespread organizational change, enterprise-wide system compliance can be difficult to achieve. Internal resistance can be significant from top executives and administrators at the outset and from medical staff once implementation begins and the system is in place. It is important to develop strategies to assist team members at all levels in the organization in adopting a new CRM program.

When considering the implementation of a CRM program, hospitals should keep in mind that:

  • Converting from a patient orientation to a customer orientation requires a cultural re-orientation.
  • CRM is not a campaign or a one-time event, but rather an all-out approach to dealing with customers.
  • Modifications in the language used in all customer encounters — even billing — can have a profound impact on the perceived quality of services.
  • Quality is defined by the customer, not the provider.

Conclusion

Information is the fuel on which hospitals run, and they must harness it both to continually improve performance and to measure their record against competitors. During the next decade, the healthcare environment is likely to see an emphasis on improving, measuring, and reporting the quality and safety of care; linking provider reimbursement to care performance; and demanding greater levels of patient service.[31]

CRM technology gives hospitals the tools they need to thrive in today’s increasingly consumer-oriented healthcare market, while improving outcomes and reducing costs. While its implementation poses a number of challenges, installing CRM programs is an undertaking worth pursuing.

Lindsey P. Jarrell, FACHE is co-founder of Source88.


References

[1] Deloitte Center for Health Solutions. 2009 Survey of health care consumers. http://www.deloitte.com/view/en_US/us/industries/US-federal-government/center-for-health-solutions/60ea5a1264001210VgnVCM100000ba42f00aRCRD.htm. Accessed April 13, 2010

[2] Datamonitor. Addressing the challenges of consumer-driven healthcare. Published January 26, 2007.

[3] Datamonitor. ibid.

[4] Deloitte Center for Health Solutions. ibid.

[5] Deloitte Center for Health Solutions. 2010 U.S. healthcare consumerism survey. http://www.deloitte.com/view/en_US/us/Insights/centers/center-for-health-solutions/consumerism/2010-survey-health-consumers/index.htm?id=USGoogle%20Consumerism%20_HC_510&gclid=CO6Premo3qECFYNd5Qod9DjKIw Accessed May 17, 2010.

[6] Glaser J, Foley T. The future of healthcare IT. Healthcare Financial Management. November 2008.

[7] Higgins, JK. Rx for hospitals: a big dose of CRM. CRM Buyer. http://www.crmbuyer.com/story/healthcare/68758.html?wlc=1274277431. Published November 20, 2009. Accessed April 8, 2010.

[8] Deloitte Center for Health Solutions. 2009 Survey of health care consumers. ibid.

[9] Deloitte Center for Health Solutions. 2009 Survey of health care consumers. ibid.

[10] Deloitte Center for Health Solutions. 2010 U.S. healthcare consumerism survey. ibid.

[11] Deloitte Center for Health Solutions. 2009 Survey of health care consumers. ibid.

[12] Smolke P, Virmani S. Why customer relationship management in healthcare? Presented at: Healthcare Information and Management Systems Society annual conference; February 24, 2008; Orlando, FL. http://www.mshug.org/docs/techforumOrlando2008/Smolke_P_Vimani_S_Closing.pdf. Accessed April 13, 2010.

[13] Smolke P, Virmani S. ibid.

[14] Smolke P, Virmani S. ibid.

[15] Healthcare industry CRM software solutions. www.crmforecast.com. http://www.crmforecast.com/healthcare.htm. Accessed April 13, 2010.

[16] Higgins, JK. ibid.

[17] Healthcare relationship management depends on tailored database. www.healthcareitnews.com. http://www.healthcareitnews.com/news/healthcare-relationship-management-depends-tailored-database. Published May 13, 2004. Accessed April 8, 2010.

[18] Healthcare relationship management depends on tailored database. ibid.

[19] Healthcare industry CRM software solutions. ibid.

[20] McKay L. Healing the sick. www.destinationcrm.com. http://www.destinationcrm.com/Articles/Editorial/Magazine-Features/Healing-the-Sick-55461.aspx. Published August 1, 2009. Accessed April 7, 2010.

[21] McKay L. ibid.

[22] Schumacher S. Patient relationship management: streamlined approaches for defragmenting healthcare. Health Management Technology. June 2001; 22(6).

[23] Healthcare relationship management depends on tailored database. ibid.

[24] Hallick J. CRM saves lives. www.destinationcrm.com. http://www.destinationcrm.com/Articles/Web-Exclusives/Viewpoints/CRM-Saves-Lives-60149.aspx. Published January 25, 2010. Accessed April 7, 2010.

[25] Higgins, JK. ibid.

[26] Young T. Hospital CRM: unexplored frontier of revenue growth? Healthcare Financial Management. October 1, 2007.

[27] Higgins, JK. ibid.

[28] CRC health builds custom solutions on force.com to streamline intake process and increase web-generated revenue. www.salesforce.com. http://www.salesforce.com/customers/healthcare/crchealth.jsp. Accessed May 18, 2010.

[29] Young T. ibid.

[30] Young T. ibid.

[31] Glaser J, Foley, T. ibid.

Readers Write 10/6/10

October 6, 2010 Readers Write 13 Comments

EMR: One Size Does Not Fit All
By Evan Steele

A recent comment on HIStalk by a hospital CIO, about what he identified as the best EMRs for enterprise systems and their physicians, highlights a problematic and all-too-prevalent misconception. The fact is, it is impossible to satisfy both hospitals and community ambulatory physicians with the same EMR product. Furthermore, even the ambulatory market cannot be looked at as a whole. EMRs designed for primary-care physicians respond to a set of needs that are very different from those of specialists.

Enterprise EMRs simply do not work in high-volume ambulatory practices. This is particularly true for specialists’ practices. Many hospitals have had some success with Epic and other hospital-focused EMRs, but success has been limited when these same hospitals ask physicians — again, particularly the specialists — to implement these systems in their practices. A monolithic enterprise product cannot possibly support equally well such different workflows, patient care scenarios, and providers’ needs.

Within the ambulatory market itself, it is time to bifurcate the EMR discussion into two groups: EMRs for primary care physicians and those for specialists.

Industry analysts typically lump all EMRs into one category, which does not adequately differentiate the market segments or their distinct needs. The major EMR vendors have massive footprints in the marketplace, yet a small company like SRSsoft has the lion’s share of referenceable high-volume, prominent specialty practices in areas like orthopaedics and ophthalmology. Why? Because one size does not fit all, and it is impossible to satisfy the needs of both groups without compromising the needs of one.

The American Academy of Orthopaedic Surgeons (AAOS) acknowledged this issue in its recently released EMR Position Statement, pointing out that “Many systems are geared toward primary care medical practice, which can limit the utility of EHRs for specialty surgical practice.” It correctly suggests that “the different needs and uses of EHR by disparate medical specialties should be recognized.”

Specialists represent approximately 50% of the physician market, a sizeable segment that is largely being ignored. How are specialists to determine which EMRs are designed for their needs?

KLAS, the closest thing our industry has to a J.D. Power–type rating source, does not break out its ratings by specialty. This means that if an EMR vendor does well in the ambulatory primary care market and has high KLAS ratings, an unsuspecting specialty practice might purchase its product based on those ratings, only to find out that the product does not fit the practice’s unique needs.

Exacerbating the situation is the fact that KLAS only surveys practices that have actually installed the EMRs. It does not survey practices with failed implementations. Since specialists represent a disproportionate number of the failures, the information is even further biased.

The result is that there are thousands of specialists who purchase EMRs from highly rated and/or household name vendors, but who end up with failed implementations and significant financial loss.

One size does not fit all. There are good EMR solutions available for every type of physician. It is incumbent upon the individual physician to research and identify the product that best suits his/her practice’s needs.

Evan Steele is CEO of SRSsoft of Montvale, NJ.

ClickFreeMD Comment Response
By Bob Gordon

Note: Mr. H here. I’m breaking my “no commercial pitch” rule this one time because Inga had questioned the business model of ClickFreeMD, which offers practice systems including billing for a flat monthly fee rather than the traditional model of a percentage of collections. Inga’s point was that the percentage model encourages the billing company to collect. CEO Bob Gordon was nice enough to e-mail Inga an explanation and we thought his response might interest some readers even though it is hardly unbiased. I’m not endorsing their product and I have no connection to ClickFreeMD.

ClickFreeMD leapfrogs the percentage-based provider business model. Consider the following:

  • No start-up, implementation or training charges.
  • The flat fee is lower on an equivalent percentage basis than most practices would pay for outsource medical billing alone and far less than in-source options.
  • If the practice improves its revenue or we boost it (which we often can do), the equivalent percentage drops through the floor.
  • The breadth, quality, and integrated end-to-end nature of our software, services, and support are unrivaled. Physicians are paying twice as much elsewhere for much less elegant solutions today.
  • The flat fee sticks. If encounter or charge values increase, the flat fee stays the same and the practice captures cost-free revenue. If they drop outside the ordinary seasonality range, the rate is adjusted down pro rata so our physicians’ earning power is fully protected.
  • Importantly, the flat fee is backed by a performance guarantee that makes sure we work every claim or we rebate half of the flat fee. There is no equivalent protection in a percentage-based model. In fact, any claim that takes more than 15 minutes to resolve in a percentage system is probably costing the billing company more than it makes on that claim, so at some point in the collection continuum, billing company profitability becomes inversely correlated with increasing practice collections.
  • Our contracts all have 90-day outs and low price match guarantees for comparable services.

You may ask how we do this. We have deep domain expertise from running billing companies, back offices, and technology companies for decades and have organized a Southwest Air-like discount fee, high-result business model that is very scalable. We expect that ongoing volume will feed a virtuous cycle for all, continuing to allow us to offer more for less while achieving top results.

One of the most striking things we are doing is the least recognized — giving the practice their flat-fee price, online and instantly, as well as their included services, without asking them to give us any information. Try this anywhere else, like Athena, and what we do in 30 seconds becomes a multi-day process that involves e-mail / telephone / online discussions and/or meetings and requires the practice to undress for the vendor. We are completely ONE-WAY transparent. That’s because we want the practice to decide if they want to contact us — after they are satisfied that this is a superior value for them, and only then. We aren’t interested in lead-nurturing them to death.

This is about "more dollars for doctors" and great news in the group practice fight to sustain their independence. We are doing our part to create a reversal of fortune in the group practice community with a unique business model that raises revenues faster than costs, delivers immediate and ongoing savings, and provides the tools and support that allow them to be ready for tomorrow.  

Like the boiled frogs of lore, physicians have been nickel-and-dimed by payers, billing companies, and others, overpaying to under-produce for so long that they find themselves working much, much harder for less and less. We’re changing that, and we’re passionate about it! Thank you for your consideration.

Bob Gordon is CEO of Click4Free of Chevy Chase, MD.

It’s Official: The Rush for Talent Has Begun
By Tiffany Crenshaw

In recent weeks, a number of existing and prospective clients have called me for a pulse on the healthcare IT recruitment marketplace and thoughts on how to attract quality resources. After a number of such calls, I decided to put my thoughts in writing and share.

Let’s start with the good news. Industry hiring is definitely picking up, and employed candidates are now less afraid to make a career change than they were three to six months ago.

As for hot products, it’s no secret that Epic is hot, hot, hot. Hospitals are purchasing Epic left and right. Honestly, there are simply not enough Epic resources, especially Epic-certified resources, to go around, so the talent war is raging. Cerner recruitment remains modest but steady, while McKesson needs are starting to rebound after quite a lull.

In the ambulatory market, we are seeing more and more requests for eClinicalWorks and Allscripts. New names like Sage and Greenway are coming to light. And occasional needs for Meditech, Siemens, IDX/GE and Eclipsys are surfacing.

On the integration side, Cloverleaf and e-Gate skills are still in demand, but we are seeing more requests for Web-based and lesser known products like Ensemble, Symphony, and Rhapsody.

The hiring demand is highest by far for hands-on resources to design, build, and install EMR applications. However, there is a fair amount of activity for sales, project management, and training professionals, including go-live support.

CPOE, clin doc, pharmacy, oncology, and HIM are generating the most recruitment activity within the applications. Based on new client requests, we foresee growing needs for business intelligence, security, and report-writing resources.

In addition to employers’ desire for one or more of the skill sets mentioned above, most are adding clinical designation to the requirements. Over 50% of our job requisitions right now require clinicians. Pharmacists, nurses, and physicians with healthcare IT experience are in great demand.

However, post-recession hiring is creating challenges previously unheard of in my 12-year history of recruiting in this industry. The process is now fraught with excruciatingly slow interview scheduling, shrinking employee benefits packages, little to no relocation assistance, and financially conservative offers, resulting in more and more frustrated candidates.

Things have changed drastically since the lowest points of the recession. After the release of Meaningful Use requirements, recruiting mania has taken off. Everyone seems to have hiring needs. Candidates are getting called left and right by internal and external recruiters. Just check out a few of the job boards if you don’t believe me — you’ll see countless job postings. Furthermore, check out all of the recruiting firms with no previous healthcare IT experience trying to break into this market as experts claim abundant need for resources.

If your organization is currently or will be in the market soon for these in-demand resources, you may want to evaluate your hiring process, recognize that your competition is fierce, and take note of a few trends our candidates and clients have shared with us quite candidly over recent months.

  • New car syndrome. Candidates are migrating to new implementations. Who can blame them? It’s more exciting to be on the ground level and see a project through from A to Z.
  • Red carpet treatment. Employers who roll out the red carpet win. When weighing decisions between job offers, candidates almost always choose the employer who provided quickest response time and showed sincere interest in them. (Both response time and sincerity are simple and no-cost ways to roll out that red carpet.)
  • Relocation blues. Relocation is a HUGE issue right now. Even if candidates want to move, they can’t do so because of the housing market. Kudos to all of the organizations willing to work around this by providing remote work, commuting, or coverage of interim living expenses.
  • Communicate. Many, many candidates are feeling jerked around by potential employers because of lack of communication in the interview process. Here’s what they are thinking: “If I don’t feel valued as a candidate, how are they going to treat me as an employee?” On the flip side, these candidates are communicating with plenty of their peers. Too many hospitals and consulting firms are getting bad reputations as being lousy places to interview and to work.
  • Too much is not always a good thing. In the quest for resources, too many organizations are panicking and calling in all of the troops — internal recruiters, employee recruiting bonuses, dozens of external recruiters and advertisements. Candidates get called multiple times by different sources all looking to fill the same positions. Not only do they end up confused, but all the activity makes candidates suspicious. They wonder what’s wrong with an organization that has such a hard time attracting and retaining talent.
  • Get on board. We are hearing more and more horror stories about candidates showing up on the first day only to find their new employer is not ready for them. This gets them off to a bad start from the get-go. Employees stay longer and perform better when they feel welcomed and the transition process is smooth. The period of time between offer acceptance and start date can also be a black hole, when candidates are most vulnerable. Employers are losing candidates this far into the game because they aren’t communicating with them. If you don’t have a formal on-boarding program, now is probably a good time to look into it.
  • Disconnect between human resources and hiring managers. As an outside firm, we work with both HR representatives and hiring managers. We hear complaints on both sides about the other on a regular basis — namely due to lack of response. The hiring managers want candidates fast. And HR wants answers fast. Throw candidates in the mix who get frustrated as well and it’s a nasty situation. However, we find that employers who really engage the final decision-maker in the process from beginning to end and set response expectations up front have the least amount of frustrations and the most successful outcomes.

In summary, you can safely say that the industry is quickly changing to a candidate-driven market and that the market is impacted heavily by post-recession recovery and Meaningful Use. It is official. The rush for talent really has begun.


Tiffany Crenshaw is president and CEO of Intellect Resources of Greensboro, NC.

The Coming Speed Bump in the EMR Market
By Jon Shoemaker

It’s no secret that there is currently a mad rush occurring, not unlike the Oklahoma Land Rush of the 1800s, where hundreds of companies both new and old are getting into the business of healthcare information technology. Some come with industry expertise. Others come to take advantage of the financial opportunity. Consider Best Buy, the consumer electronics giant, which will install your EMR using its Geek Squad. So much for needing clinical expertise!

I believe this climate of frenetic activity will cause the EMR market to encounter a large, steep speed bump in the next 10 years. It won’t come from all of the EMR installations or from supporting all of these systems, as this will create thousands of jobs and supporting infrastructure that currently does not exist. The bump in the road will come when all of these new digital silos must talk to each other, as required in Phase II of Meaningful Use (MU). It is the very selling point of these systems — simple communication and usability — that will become their Achilles’ heel.

EMRs to date are not installed with a common code structure for identifying exams, studies, or services, all of which will need to be exchanged outside of the office in Phase II of MU. The reason for this lack of standardization has nothing to do with EMR functionality or capability — it is that everyone is still thinking locally, not globally.

To ensure true interoperability and exchange of patient health information, EMRs must be installed to satisfy local requirements, but also with the forethought that they will integrate with larger systems. This requires standards and standardization. The absence of a standard will require the use of translation services so that HIE repositories use the same codes for exams performed across the region.

Translation services, while a viable alternative to standardization, require one-off knowledge of the database structure and logic of each customized local EMR, as well as that of the destination repository. This level of granularity creates layers of complexity for maintenance and mapping. Any change to a local system will mandate updates to the translation engine. The support nightmare of constant mapping modifications to ensure the proper codes are sent outbound or received inbound will be effectively unsustainable.
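To make the mapping burden concrete, here is a minimal Python sketch of the kind of lookup table a translation service has to maintain for every local EMR. The exam codes are invented placeholders, not real standard codes, and a real service would hold thousands of entries per site.

    LOCAL_TO_STANDARD = {
        # local EMR exam code : regional standard code (placeholders)
        "XR_CHEST_2V": "STD-1001",
        "CBC_AUTO":    "STD-2001",
        "ECHO_TTE":    "STD-3001",
    }

    def translate(local_code: str) -> str:
        """Map one site's home-grown exam code to the HIE repository's code."""
        try:
            return LOCAL_TO_STANDARD[local_code]
        except KeyError:
            # Every new or renamed local code lands here and becomes another
            # manual mapping change, which is the maintenance burden described above.
            raise ValueError(f"no standard mapping for {local_code!r}")

    print(translate("CBC_AUTO"))   # STD-2001

Multiply that table by every exam, every local EMR, and every destination repository, and the maintenance problem becomes obvious.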

Once all of the paper silos are replaced by digital silos, it will become clear which EMRs were installed incorrectly, don’t address the clinical workflows of the office, and don’t communicate outside of the office with a standard communication protocol using standard coding methods. This will lead to a second phase of the EMR revolution that will include translation services and reinstallation of EMRs to address workflow and data gaps. This will have to be resolved before integration with a larger HIE repository can take place.

If we begin now with standardization of workflow and codes and ensure they are addressed in current EMR installations, we will be in a better place in five years and users will see the true benefits of these systems. With our current strategy of “every man for himself,” we risk losing users’ confidence once these systems are installed and fail to address workflow and physician concerns. Once we lose the users’ confidence, they will stop using the system and re-adoption efforts will prove Herculean.

As you begin planning your EMR implementation, there are hundreds of questions to ask. When it comes to meeting the long-term requirements of MU as well as realization of the true benefits of an EMR, here are a few to begin with:

  1. Have we reviewed and documented our office workflow?
  2. Are we using the new SNOMED codes?
  3. Are we following standardized codes for services rendered?
  4. Does the installation team understand clinical workflow or do they look glassy-eyed when we discuss medical terms?
  5. Is our vendor of choice an IT company trying to cash in on the HIT initiative without clinical experience and knowledge which could place our business at risk?
  6. How will this EMR connect us in the future to larger integrated systems?

Jon Shoemaker is senior consultant with Ascendian Healthcare Consulting of Sacramento, CA.

Readers Write 9/30/10

September 29, 2010 Readers Write 7 Comments


"Granularity" — A Detailed Analysis
By Robert Lafsky, MD

“Granular” is turning into a buzzword. And that’s not a good thing.

It was a perfectly respectable, albeit not very useful term in the analog days, referring usually to a physical material composed of — you know, little granules. You’ll actually see it used sometimes as a descriptive term in pathology and endoscopy reports, and in general use it describes something’s particular type of grainy texture. But then, of course, computer people got hold of it and gave it a much more specific, albeit metaphorical, meaning, which I’ll get to in a minute.

Recently, writers in the mainstream media, with their ears always pressed to the ground and desperate for novelty, have picked up on this word and are starting to use it to describe more abstract things, in a way that fails to grasp the IT meaning at all. For instance, the other day political pundit Michael Gerson described a Karl Rove critique of Christine O’Donnell as “granular and well informed.” If you substituted “detailed” for “granular” in that sentence, you wouldn’t have changed the meaning a bit.

But IT people don’t use “granular” to mean just “detailed.” Hard copy or scanned documents can, of course, be very detailed. I remember a couple of old docs from my training days who would sit with pen and paper and write beautiful two- or three-page, single-spaced handwritten reports on their patients, with every bit of the history, physical, and labs. It was an impressive effort, very detailed, but even if you found those reports now and scanned them into your EMR, the information in them wouldn’t be granular.

No, for a computer, detail is necessary for granularity, but it’s far from sufficient. The computer has to be able to do something with the details so that it can store them in an orderly way and then use them for searches and reports. That sort of thing, of course, is the “use” that at least has the potential to be “meaningful.”  

So if, say, a particular drug for hypertension is found to be dangerous for everybody over 60 with diabetes, I don’t have to go manually through a thousand records. Those facts are recorded in a yes-or-no fashion in a database. I can query my system and get an immediate list of all my patients who meet those criteria, with their addresses and phone numbers.

That’s granularity. Facts have to be detailed, but in a fashion where computers can take advantage of them.
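For the curious, here is roughly what that looks like under the hood. This is a toy Python/SQLite sketch of the hypertension example above; the table and field names are hypothetical, but the point is that the facts live as discrete, queryable values rather than as narrative text.

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE patients (
        name TEXT, phone TEXT, age INTEGER,
        has_diabetes INTEGER,   -- recorded yes/no, not buried in a scanned note
        on_drug_x INTEGER)""")
    db.executemany("INSERT INTO patients VALUES (?,?,?,?,?)", [
        ("A. Jones", "555-0101", 67, 1, 1),
        ("B. Smith", "555-0102", 45, 1, 1),
        ("C. Patel", "555-0103", 72, 0, 1),
    ])

    # The "immediate list": everyone over 60 with diabetes on the drug.
    print(db.execute("""SELECT name, phone FROM patients
                        WHERE age > 60 AND has_diabetes = 1
                          AND on_drug_x = 1""").fetchall())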

Maybe this is obvious to the IT business readers out there, but I sure spend a lot of time in the doctor’s lounge painstakingly explaining this to medical colleagues. And granularity is at the heart of all the arguing that rages in the comment sections of this website and elsewhere about workflow issues in EMRs, as well as interoperability and the coherence of automated reports.

I can’t offer a resolution of any of these arguments. But to get anywhere, we need commonly defined terms, and granularity is a pretty useful one. General media people out there, if you mean “detailed,” say “detailed.” Leave “granular” for those that really need it.

Robert Lafsky is a gastroenterologist in Lansdowne, VA.



A HCIT System Architecture for Cloud Computing
By Mark Moffitt

Note: This article uses a fictional story about Google and Meditech as a backdrop to describe a healthcare IT (HCIT) system architecture for cloud computing.

(Oct. 1, 2020) Today marks the eighth anniversary of Google’s purchase of an obscure private company then known as Meditech, the deal that began the transformation of the HCIT industry into what it looks like today.

At the time, the purchase shocked everyone. Over the years, Meditech had repeatedly rejected any notion of a buyout by another company. Then Google offered $1.5 billion, more than a 50% premium on the estimated valuation of the company. The offer, it turns out, was too good to turn down. Neil Pappalardo of Meditech walked away with a $400 million payout. Google’s market cap at the time: $168 billion.

The Vision

Google’s vision for the future of HCIT was straightforward: provide all IT services to healthcare systems as a cloud computing service at a price much lower than market rates, as a strategy to capture 60% of the worldwide market by 2020. Google’s service included applications, data management, and integration. Google architected the system from the ground up for cloud computing, so they were able to offer the service at a much lower price while realizing higher margins than competitors.

Google bought Meditech for its customer base and use case models that had been hardened by use over many years. Google took Meditech’s functional specs and enhanced and implemented them in a new architecture. In addition, Google purchased several other HCIT vendors and integrated them to provide a total HCIT solution to customers.

Data Storage

Google did not use a relational database management system (RDBMS), as was common at the time, and instead used schema-less, key-value, non-relational, distributed data stores, aka NoSQL.

An RDBMS scales well, but usually only when that scaling happens on a single server node. When the capacity of that single node is reached, you need to scale out and distribute the load across multiple server nodes. This is when the complexity of relational databases starts to bump up against their potential to scale.

Google’s key-value data store model improves scalability by eliminating the need to join data from multiple tables. As a result, commodity multi-core server hardware can be used that is far less expensive than high-end multi-CPU servers and expensive SANs. The overall reduction in cost due to savings in database license fees, maintenance, and hardware is around 70% when compared to using an RDBMS. Database sharding and the “shared-nothing” approach are ideal for managing large amounts of data at a low cost.
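A minimal Python sketch of the general idea follows; it is an illustration, not Google’s actual implementation, and the key scheme and values are hypothetical. Each record is addressed by a single composite key and routed to a node by hashing, so no joins and no shared state are needed.

    import hashlib, json

    NODES = [{}, {}, {}]   # stand-ins for commodity, shared-nothing storage nodes

    def node_for(key: str) -> dict:
        """Pick a node by hashing the key (a real store would use consistent hashing)."""
        h = int(hashlib.sha1(key.encode()).hexdigest(), 16)
        return NODES[h % len(NODES)]

    def put(key: str, value: dict) -> None:
        node_for(key)[key] = json.dumps(value)   # stored as an opaque, schema-less blob

    def get(key: str) -> dict:
        return json.loads(node_for(key)[key])

    put("patient:12345:lab:2012-10-01:cbc",
        {"test": "CBC", "wbc": 6.1, "hgb": 13.8})
    print(get("patient:12345:lab:2012-10-01:cbc")["hgb"])   # 13.8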

Three Data Types

Another concept introduced by Google was segregating data into three buckets — transaction data, results data, and analytic data — and managing each differently. Competitors at the time combined all three into one big, complex RDBMS.


Transaction data — what was ordered, when and by whom, what tests were performed, or what meds were given to a patient — is persisted to a transaction data store. At some point, all of the transactions related to a patient encounter are collected in a single electronic medical record file and compressed to about 10% of the original size. Results are also contained in this file, but not images, due to their size, just as was the case with the original paper medical record and film file.

The compressed medical record file provides an interactive view of the patient’s encounter to satisfy legal and payment inquiries. These electronic medical record files are stored securely in the cloud. Records are never transferred between organizations; rather, access is authorized and the record is viewed from the cloud.

Data is purged from the transaction data store once the electronic medical record file is created. The transaction data store remains a constant size and, as a result, retrieves data faster and is easier to manage than it would if it kept growing. Transaction data is concurrently stored in a separate analytic data store and is not purged.
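A rough Python sketch of that flow, with made-up transactions, shows how an encounter’s activity can be rolled into one compressed record file before the transaction store is purged:

    import gzip, json

    # Hypothetical transactions accumulated for one encounter.
    transactions = [
        {"type": "order",  "item": "CBC",             "by": "Dr. A"},
        {"type": "result", "item": "CBC",             "value": "WBC 6.1"},
        {"type": "med",    "item": "lisinopril 10 mg"},
    ] * 200   # padded out so the compressor has something to chew on

    record = json.dumps({"encounter": "E-98765",
                         "transactions": transactions}).encode()
    archive = gzip.compress(record)   # this file is what gets stored in the cloud
    print(f"{len(archive) / len(record):.0%} of original size")

    # With the compressed record safely stored, the encounter's rows can be
    # purged from the transaction store, keeping it a constant size.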

Google partnered with several business intelligence vendors to offer advanced analytical services from the cloud using the customer’s analytic data store.

Results such as images, labs, reports, and waveforms are also stored in schema-less, key-value, non-relational, distributed data stores.

The three buckets — transaction data, results data, and analytic data — are each stored across multiple commodity servers using a “shared-nothing” approach. Scaling any individual bucket for a customer is almost as simple as adding server hardware.

Integration

Google used a derivative of their search engine technology to integrate a patient’s records and results across multiple providers and systems.

Application Development Framework

Google used an application development framework that made it easier to build and deploy software. With an RDBMS, application changes and database schema changes have to be managed as one complicated change unit. Google’s key-value data store allows the application to store virtually any structure it wants in a data element. Application changes can be made independent of the database.

In addition, Google used a scripting language for code that changes most often — user-facing code. Both of these features combined to make software development easier and allowed applications to iterate faster. In software development, the rate of innovation is directly related to the rate of iteration.
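A tiny illustration of why that matters, with plain Python dictionaries standing in for the key-value store: a new field is simply written by the newer application code, with no schema migration and no change to existing records.

    store = {}

    # Version 1 of the application writes allergy entries like this:
    store["patient:12345:allergy:1"] = {"substance": "penicillin",
                                        "reaction": "rash"}

    # Version 2 starts recording severity; older records are untouched and the
    # reader simply treats the new field as optional.
    store["patient:12345:allergy:2"] = {"substance": "latex",
                                        "reaction": "hives",
                                        "severity": "moderate"}

    for key, entry in store.items():
        print(key, entry.get("severity", "unrecorded"))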

Mark Moffitt, MBA, BSEE is the former CIO at GSMC in Texas and is working as an independent consultant while he searches for his next opportunity.


Software Upgrades – To Be or Not to Be? That is the Question
By Ron Olsen

The day your facility installs a new piece of software, you rarely think about the upgrades that will inevitably come later. You probably ask if such upgrades are included in the maintenance agreement, and then shuffle away that information for future use … or not.

Many times an upgrade is more than just a requirement from the vendor — it’s a welcome relief that offers bug fixes, provides additional functionality, and many times increases productivity, which equates to money saved. Hey, any time we humble IT/IS guys and girls can do something to keep the CIO happy, we’ve got to jump on it! That’s what IT should be all about — increasing the ability to save money and/or help other departments increase revenue streams.

Most of us have been caught in the XP vs. Vista vs. Windows 7 debate. The old adage, ‘If it ain’t broke, don’t fix it’ seems applicable here. XP works fine. Vista is, well, Vista. Windows 7 has generated a lot of hype. Windows 7 offers many enhancements, but if your organization’s PCs aren’t up to it, the new bells and whistles aren’t available. To get the full feel of the new Internet Explorer 9 beta release, Windows 7 is now required.

This is just one example of how an upgrade is never a simple, single-issue vote. There are dozens of interrelated concerns that an IT department must evaluate before pulling the trigger on a software upgrade.

And then there are the software compatibility issues. How many times have we heard from a vendor, “It’s not certified for (fill in any number of OS versions) yet!” This causes a push-me-pull-you effect. Some vendors are pushing you to move forward, and others you have to pull along with you.

Things to consider before upgrading your software:

1) Can you adjust your current processes to take advantage of new functionality? Many times we take an upgrade and claim there is not enough time to do a full evaluation prior to going live. Then, we certainly do not have time to go back and look again. This could actually cost your company money in the long run, instead of delivering the benefits of a well-planned project.

2) Downtime can be a deal-breaker for upgrades. No department ever wants to experience downtime unless it’s unavoidable. How will each department test the new upgrade? Do they have a full test system to work with? If all of the issues are thought out beforehand and these questions answered, upgrading shouldn’t be that painful.

3) Does hardware need to be replaced? Could this be a great opportunity to replace some old PCs and servers? Is this the catalyst that moves your facility to server virtualization … finally?!

4) What vendor software (enterprise forms management, ECM/EDM, etc.) will need to be upgraded simultaneously?

Thoughtful software and hardware upgrades are usually embraced by end users and the C-level alike. Personnel get new PCs that increase productivity, which keeps the Powers that Be happy once they’ve overcome the initial sticker shock. Just the idea of new PCs gets most staff members feeling like the hospital is moving forward technologically.

Server virtualization condenses the physical footprint of the server room, decreases power and cooling costs, and in most cases, reduces server administrative duties. And with your software running faster with full functionality from vendors’ latest compatible releases, IT/IS will (hopefully) get fewer end user complaints. Hey, it sounds good in theory!

Just make sure you plan well in advance; get buy-in from department heads, super users and (if you’re lucky) an enthusiastic executive; and communicate openly with vendors and you’ll be good to go.

Ron Olsen is a product specialist at Access.

Readers Write 9/15/10

September 15, 2010 Readers Write 6 Comments


Document Management is Good for Business
By Shubho Chatterjee, PhD, PE

Enterprise content management (ECM), also referred to as document management, is a capability with significant potential to centralize content and document storage, streamline and automate processes, and integrate smoothly with other enterprise systems. The business benefits are improved operational efficiency, reduced manual labor, reduced paper consumption, and improved process quality.

ECM consists of a central content or document repository, with indexing and searching capabilities, integrated with automated workflow that allows documents to be routed to the appropriate processes and processors. The usage of the system is controlled by access policies at the individual and group levels. Examples of uses of this system include, but are not limited to, patient admissions, medical records management, invoice and payment processing, finance and accounts management, and contract management.
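As a rough illustration of that routing idea (a Python sketch with invented document types and queue names, not any particular product’s design), the workflow piece boils down to a lookup from document type to the queue that owns the next step:

    from collections import defaultdict

    ROUTES = {                       # document type -> workflow queue (hypothetical)
        "admission_consent": "admissions_review",
        "invoice":           "accounts_payable",
        "physician_note":    "medical_records",
    }
    queues = defaultdict(list)

    def ingest(doc_id: str, doc_type: str, metadata: dict) -> None:
        """Index the document's metadata and route it to its process queue."""
        queue = ROUTES.get(doc_type, "manual_triage")   # unknown types go to a person
        queues[queue].append({"id": doc_id, "type": doc_type, **metadata})

    ingest("DOC-001", "invoice", {"vendor": "Acme Supply", "amount": 1250.00})
    ingest("DOC-002", "admission_consent", {"patient": "12345"})
    print(dict(queues))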

A rigorous vendor selection process is critical to selecting the appropriate vendor. This should include an initial evaluation of the functions and workflows where ECM is expected to have the most impact. Additional selection parameters include, but are not limited to, the total future cost of ownership of the proposed system; the projected process improvements and labor reductions; current material consumption and storage costs; product functionality; deployment options; and scalability. These parameters should be used to construct ROI scenarios for different options. Both objective and subjective factors should be integrated into the decision making.

Deployment options can be in-house (client server) or SaaS. While the in-house option provides for greater control, it also requires dedicated resources to manage, maintain, and upgrade the environment. SaaS deployment enables access to the system on a subscription basis with the vendor managing and operating the system and associated infrastructure in its data center.

The SaaS option frees IT staff to focus on more strategic tasks that add value to the organization while avoiding the expense of adding more IT infrastructure and resources to manage the system. Key factors to consider here are Internet connectivity and bandwidth and information security. Implementation is also quicker as the vendor completes the system build, configuration, and installation at their data center.

Collaborating to build a solution requires a thorough examination of the current processes across the organization with supporting process turnaround time data collection. This forms a baseline from which process improvements can be tracked in the future. To maximize the impact of the solution, this in-depth, step-by-step process analysis should be used to re-engineer and automate processes using ECM.

Creating efficiencies with this solution is feasible in many areas. After implementing ECM in the admissions department, Miami Jewish Health Systems has a central repository for patient documents. Seamless integration with the EMR application allows authorized users from any location to instantly access the associated patient’s documents from their workstation, eliminating time-consuming manual searches.

Routing documents electronically to employees’ workflow queues allows for faster processing and greater security. Eliminating the need to search for documents or make paper copies frees the admissions staff from tedious tasks to focus on patient care. Medical records management workflow has also improved, with easy, instant, and effective collaboration across the organization. Medical personnel receive automated alerts for completing charts and associated notes and deficiencies. Previously, this required a visit to the medical records office.

Back-office departments, such as accounting and finance, have a high volume of paper flow and manual processes, making them susceptible to lost invoices, missed bills, overpayments, or underpayments.

ECM deployment at MJHS is automating invoice processing. Invoices are now indexed to the payments made and are easily searchable. With this technology, invoice approval is also automated and does not require manual inter-office mailing and completion. Payments are also completed in a timely manner.

As with any technology solution implementation, ECM must be well planned with a cross-functional team. Integration aspects with other enterprise applications must be well thought out. Baseline process documentation and re-engineered processes are also critical for success and before-after comparisons.

Shubho Chatterjee is chief information officer of Miami Jewish Health Systems of Miami, FL.

Regaining Control of Disaster Recovery
By Tony Cotterill


While working with our clients in hospital IT departments, we come across a variety of data backup scenarios. Some hospitals do full backups nightly, while others rely on an incremental/full backup strategy. Some sites exclude specific applications from their nightly backup simply because the volume is too great to complete in a 24-hour period.

Although there’s no ‘typical’ approach to backup and disaster recovery, a hospital’s data is a vital asset that must be protected. Before deciding how to protect it, however, first you must understand it.

The data landscape in the healthcare industry is more complex than in many other sectors, primarily because of the varied data types – namely, structured, unstructured and semi-structured — that are generated by both clinical and administrative systems. The type of data being secured and protected is inextricably linked to how that data needs to be recovered.

Structured data comes from database-driven applications, such as the hospital information system, radiology information system, electronic health record, and accounting systems. These applications typically generate hundreds of GBs, possibly a few TBs in larger facilities.

Unstructured data comes from applications that produce discrete files that are not associated with a database. Examples include word processing and spreadsheet files, which are routinely created by administrative staff and then stored on file servers. Many TBs of unstructured file data can be a challenge to back up and recover.

Semi-structured data is produced most commonly by picture archiving and communication systems and document management and imaging systems. Both maintain a database of information (structured data) that references large quantities of discrete files (unstructured data). A PACS database may run on Oracle or SQL, and its size may be relatively small in relation to the many TB of DICOM images that database references.

Once you understand the three categories of hospital data, you can determine how much is dynamic vs. static. The dynamic data, which typically comprises 20-30 percent of overall healthcare information, is accessed regularly, and therefore changes constantly. This is the data you should be replicating every day.

Static data, which probably makes up the other 70-80 percent of your storage, should be treated differently. This unstructured and semi-structured data never changes and much of it will never be recalled again. Nevertheless, regulations and/or institutional policies compel hospitals to store it for five years, ten years, perhaps even the life of the patient.

So here’s the good news: once you’ve identified your static data, you can replicate it and move it to a self-protecting archive. Then there’s no need to include it in your backups.
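One simple way to find candidates for that archive is to look at how long a file has gone untouched. The Python sketch below assumes a six-month threshold (an arbitrary choice for illustration) and splits a file tree into the dynamic set that stays in the nightly backup and the static set that can be replicated once to the archive.

    import os, time

    ARCHIVE_AFTER_DAYS = 180   # assumption: untouched for ~6 months means static

    def classify(root: str):
        cutoff = time.time() - ARCHIVE_AFTER_DAYS * 86400
        dynamic, static = [], []
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                (static if os.path.getmtime(path) < cutoff else dynamic).append(path)
        return dynamic, static

    # Point this at the unstructured file shares in practice; "." is just a demo.
    dynamic, static = classify(".")
    print(f"{len(dynamic)} files stay in the backup; {len(static)} can be archived")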

This combination of backup and archiving provides an optimal strategy for treating each data type with the right method. By understanding the nature of the data in the critical clinical systems, the IT team can deliver both realistic and acceptable data recovery objectives to the business. In the event of a disaster, the organization can rest assured that the data can be recovered in a reasonable timeframe, minimizing the disruption to patient care.

Tony Cotterill is president and CEO of BridgeHead Software of Ashtead, Surrey, UK.

RTLS and Temperature Monitoring Mania
By Fed Up with the Fever

Would someone please tell me what real-time locating systems in healthcare have to do with environmental monitoring? I keep seeing all these temperature monitoring requirements pop up in RFPs and press releases. It concerns me that the healthcare CIO (or whoever is making these decisions) doesn’t realize that temperature monitoring of refrigerators has nothing to do with real-time locating, and even worse, is willing to saddle their Wi-Fi system with this function, risking QoS-sensitive systems such as PoE and VoIP.

Sure, real-time alerts of out-of-range or variable temperatures are important, but unless you’re subject to that old Bart Simpson joke where he calls up the bar and says, “Is your refrigerator running?” followed by Moe’s inevitable “Yes” and Bart’s “Well, then you better go catch it!” — well, your refrigerator is not mobile! There’s no need to locate it, and certainly not in real-time.

The real-time alerts and reports that healthcare needs related to temperatures of refrigeration units can be easily achieved with over-the-counter probes. Then, just as it would with any other DCC-based system (i.e., “dry contact closure” such as security cameras, alarms, doors, or nurse call lights), the RTLS would respond to certain pre-established conditions (i.e., temperature out of range). These other systems do not rely on real-time location except to “trigger” an event condition. That is, if you want a security camera to come on if a certain tagged piece of equipment enters the egress zone, you need the RTLS as it relates to the real-time location of the tagged piece of equipment.

Temperature monitoring requires no such “trigger.” It requires only that you “push” an alert to an individual (or group) when a particular event is recognized within the event software. No location changes are recognized or recorded. If healthcare organizations could recognize this, they would save a tremendous amount of money and not be subject to the heartache of a low-grade RTLS that does only one thing (wholly unrelated to real-time locating) well.
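That event logic really is this small. Here’s a toy Python version (simulated probe readings, a print statement standing in for the page or e-mail, and an assumed allowed range), and notice that nothing in it needs to know where the refrigerator is:

    import random, time

    LOW_C, HIGH_C = 2.0, 8.0          # assumed allowed refrigerator range for the demo

    def read_probe() -> float:
        """Stand-in for an over-the-counter temperature probe."""
        return round(random.uniform(1.0, 9.0), 1)

    def push_alert(message: str) -> None:
        """Stand-in for paging or e-mailing the responsible group."""
        print("ALERT:", message)

    for _ in range(5):                # a real monitor would loop continuously
        temp = read_probe()
        if not LOW_C <= temp <= HIGH_C:
            push_alert(f"Refrigerator out of range: {temp} C")
        time.sleep(0.1)               # shortened polling interval for the demo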

So I ask what RTLS has to do with temperature monitoring even as I understand why temperature monitoring is so prominent in the RTLS space. It’s an easy way for vendors to make money. So long as the company can write some basic rules, they can provide an alert when temperatures are out of range. They can also record temperatures at regularly scheduled intervals without staff ever having to physically approach the unit.

There’s no doubt it’s an important time and money saver for the hospital. And it’s a money maker for the RTLS vendor. They get to solve a problem for the customer and appear wholly competent on this level, so that when it comes to delivering their RTLS with any level of accuracy, there will be a certain level of trust pre-instilled.

Unfortunately, too many hospitals fall prey to the belief that environmental monitoring is a function of RTLS, so if the vendor can do that well, surely they can locate assets and automate patient flow, right? Sorry, folks, but it’s just not so.

Readers Write 8/31/10

August 30, 2010 Readers Write 4 Comments


Meaningful Use:  Specialists Still Not a Priority
By Evan Steele


The HIT Policy Committee’s creation of a new Quality Measures Workgroup last week is the most recent in a string of actions confirming that meaningful use has not been defined meaningfully for specialists — and that it is not likely to be. Despite the fact that this new workgroup is charged with prioritizing Stage 2 quality measures and analyzing gaps in Stage 1 criteria, the 18 physicians selected for the 24-member group are primary care providers* — a fact that surely raises concerns among specialists.

The appointment of this workgroup comes on the heels of a growing response to the Stage 1 definition of Meaningful Use from specialists and their professional organizations, commenting on the lack of fit with the way specialists routinely practice. Every one of the six core clinical quality measures is primary care-related, and many specialists will be hard-pressed to identify three of the 38 additional clinical quality measures that are relevant to their practices.

At last month’s HIT Policy Committee meeting, committee member Gayle Harrell commented that much of the input gathered from the specialist panels convened last October seems to have been ignored. She contended that in the final rule, CMS made it more — rather than less — difficult for many specialists to comply with Meaningful Use.

This position was echoed by Thomas C. Barber MD, EMR project team leader of the American Academy of Orthopaedic Surgeons (AAOS) in discussing the Academy’s EMR position statement in the most recent issue of AAOS Now:

Orthopaedic surgeons will have great difficulty in meeting the current 25 Meaningful Use standards. Orthopaedics would derive greater benefits from standards promulgated by our medical specialty society rather than a set of generic requirements that mostly do not apply to musculoskeletal patient care.

This is not a new issue. The primary care focus of the legislation and regulations has been intentional from the outset. President Obama appointed an internist, David Blumenthal MD, to spearhead the program. There was only one private-practicing specialist among the committee members that crafted the recommendations to CMS.

It is not surprising that the Meaningful Use criteria do not reflect the practice patterns of specialists. Federal funding to assist physicians with EMR adoption has been directed towards primary care. The $357 million allocated for Regional Extension Centers, for example, was earmarked to “provide outreach and support services to at least 100,000 primary care providers and hospitals.”

The definition of Meaningful Use is not the only obstacle for specialists. The EMR products themselves are not tailored to the needs of specialists. The AAOS EMR Position Statement correctly suggests that in developing certification standards, it is essential that “the different needs and uses of EHR by disparate medical specialties should be recognized. In particular, the differences between surgical specialties and primary care specialties should be acknowledged.”

Unfortunately, because the certification criteria are linked to the Meaningful Use requirements, they are similarly primary care-driven. The EMRs most likely to be certified for Meaningful Use are predominantly those that were created and developed for primary care physicians — those of vendors that, from 2004 until recently, have devoted their development resources to meeting CCHIT’s 467 largely primary care-focused criteria. The AAOS statement continues: “Many systems are geared toward primary care medical practice, which can limit the utility of EHRs for specialty surgical practice.”

Specialists are no different from other physicians in their desire to participate actively in the evolution of the country’s medical care delivery system. But until Meaningful Use is defined in a way that is applicable to the way they deliver that care, they will participate on their own terms — adopting specialist-focused EMR technology that increases their productivity and enables them to provide the highest quality care and service to their patients.

* Includes internists, family practitioners, pediatricians, preventive medicine physicians, an internist/hematologist, and a psychiatrist.


Evan Steele is CEO of SRSsoft of Montvale, NJ.

Safeguarding EMRs Against System Failure or Downtime
By Arthur Young


Using time-saving information technology and automated patient records management ensures clinicians have faster access to the most up-to-date patient information, enabling timely diagnoses and treatment and maintaining a high quality of care. However, if the network goes down, the system fails, or a planned or unplanned system downtime occurs, clinicians are unable to access critical patient information.

Whether they are planned or unplanned, system downtimes can occur for any length of time — from a few minutes to a few days. Downtimes can also be very costly if there is no system for preserving and accessing up-to-date patient information and maintaining uninterrupted patient care. Healthcare organizations have implemented systems for recovering from disasters, but not for protecting data and continuing operations during downtime. Without such a system, downtime can become more than an annoyance — it can be a life-threatening event.

Distinct from disaster recovery — which helps get systems back up when they go down due to a power outage or property damage — business continuance keeps vital business operations running at or near normal capacities in the event of any network or system downtime. That includes the downtime that occurs while disaster recovery mechanisms are being executed.

There are various solutions available that can help healthcare organizations remain functional during downtimes. However, they have drawbacks. Redundant or fault-tolerant systems can keep computers running and available during a system failure or power outage, but if they are the only system being used for business continuance and the network also goes down, clinicians will not be able to access patient data. Printing patient reports periodically allows clinicians to have the current data on hand; however, it is a time-consuming and cumbersome task that diminishes data security, not to mention a waste of paper, ink, and other resources.

To maintain access to patient information from the location where it’s needed, healthcare organizations need to select a business continuance approach that will provide the most protection in the most circumstances. Ideally, a business continuance solution should enable healthcare organizations to do the following:

  • Identify critical information and automatically distribute it to the areas where it will be needed in the event the HCIS is unreachable;
  • Ensure the information is secured but available on local machines;
  • Maintain seamless operation in the background, notifying administrators of any interruptions; and
  • Eliminate the storage of data in paper form, saving paper, ink, and printers.

Intelligent report generation and distribution decentralizes data in advance of any downtime by sending the latest reports from the HCIS to secure databases created in multiple locations. The information is indexed in those databases so clinicians can search and find the data they need whenever they need it.
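A bare-bones sketch of such a local cache, using Python and SQLite (the table layout and file name are hypothetical, not any vendor’s product): reports pushed from the HCIS are stored and indexed locally so a clinician can still pull them up by patient during an outage.

    import sqlite3

    local = sqlite3.connect("downtime_cache.db")   # hypothetical local database
    local.execute("""CREATE TABLE IF NOT EXISTS reports (
        patient_id TEXT, report_type TEXT, updated TEXT, body TEXT)""")
    local.execute("CREATE INDEX IF NOT EXISTS idx_patient ON reports (patient_id)")

    def receive_report(patient_id, report_type, updated, body):
        """Called each time the HCIS distributes a fresh report."""
        local.execute("INSERT INTO reports VALUES (?,?,?,?)",
                      (patient_id, report_type, updated, body))
        local.commit()

    receive_report("12345", "lab_summary", "2010-08-30T18:00", "K 4.1, Na 139 ...")

    # During downtime, the latest data is still searchable by patient.
    print(local.execute("SELECT report_type, updated FROM reports WHERE patient_id = ?",
                        ("12345",)).fetchall())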

With access to critical data during periods of system failure or extended downtime, clinicians can provide uninterrupted care and healthcare organizations can mitigate risks to patient safety. Patients can be assured that their health records are up-to-date and secure and be confident that they are receiving the best possible care.

Arthur Young is the president of Interbit Data of Natick, MA.

Ode to the Dumbwaiter and Robo-Butt
By Frustrated Farmacist (Female)

I saw your blurb about the Aethon TUG delivery robots installed at El Camino Hospital. The old ECH had an awesome dumbwaiter delivery system in place.

It is rumored that the Aethon TUG delivery robot solution was something of an afterthought that came six agonizing months after the grand opening of the $470 million hospital. Apparently earlier plans to integrate a delivery system may have been (ahem) overlooked. You can see from ECH documents that the robot contract was drafted in January 2010, several months after the new hospital opened. Early reports said the robots didn’t work for all departments and some ended up using volunteers, auxiliary staff, temporary workers, and other solutions to get medications and lab materials delivered.

$470 million and NOBODY initially planned an integrated medication delivery and lab transport system for a brand new, ultimate-in-high-tech 400-bed hospital! It doesn’t take too much imagination to extrapolate how important timely medication delivery is in the patient care scheme of things and why it’s the top complaint and employee satisfaction issue for nurses.

ECH’s competitor down the road has been using a similar robot system from Pyxis for the PAST 22 YEARS. It’s on its third generation, fondly named Robo-Butt. He travels in elevators and down halls to six floors and 15 departments. He is guided by sensors in the walls and speaks aloud to nurses to alert them when he arrives and when he commandeers the elevator. He steers around obstacles. His compartments are locked and secure, requiring a numbered password to open. He’s powered by six car batteries that are recharged and swapped for backups 2-3 times daily.

He breaks down every now and then. The elevators break down more often, grounding him on the first floor. Pyxis no longer supports these robots, so parts were scrounged from the basements of other hospitals and from a hospital supply house in Hawaii. But Robo-Butt WORKS. Here’s a picture of this bad boy:

[Photo: Robo-Butt]

The average hospital pharmacy department dispenses at least 40% more meds than were ordered because of late deliveries or items that are misplaced. The overhead and amount of wasted labor and supplies is unacceptable and frustrating for everyone involved, including the patient, nurse, doctor, pharmacy, and departments like lab and surgery that are held up because of medication delays.

With pneumatic tubes, you place meds in padded bullets and shoot them to the receiving department. Fragile ampules and vials can be broken — think about Epogen, a blood booster whose fragile proteins are destroyed by a violent trip in the tube system. It used to cost $6,000/vial and is still in the $600-900 range. Can I tell you how many bullets have exploded inside the pneumatic tube tunnels? Can I tell you what I think about the ER department tubing patient urine and blood samples to the lab inside this system?

We still have bullets that wind up in the basement due to malfunctioning suction or drivers. It’s hilarious when it’s a $45,000 rattlesnake venom antidote. But the bad part is that sometimes meds lie in piled-up bullets in the tube receiving bin. Worse, the staff goes to send an “empty” tube to lab or ER and accidentally sends a bullet filled with meds. The worst part is there is no “track-ability” or accountability — we can never tell whether someone received a bullet. If they “say” it never arrived, we have to send it again.

Here’s why I think dumbwaiters may be the ultimate smart medication delivery system with the fastest turnaround times, the least amount of waste, and minimal lost meds and lost charges. The pharmacy staff places labeled / bagged medications into little sectioned trays (like your silverware drawer’s insert) and leave them in the little locked elevator. The nurse that needs meds comes to her department’s locked elevator door, calls up the tray, and REMOVES ONLY HER PATIENT’S MEDS. She leaves the other meds in their little slots.

Think about this. You don’t have one nurse removing meds for her entire department and then misplacing them, storing them improperly (oops, that expensive IVIG that cost $20,000 belonged in the refrigerator?) or just putting them in her POCKET and forgetting to put them in the med room altogether. Then the Pharmacy can call the dumbwaiter down later and retrieve unused meds, credit them back, and recycle them. 

You can imagine that the number of missing med phone calls drops in half. Anyone in the pharmacy can check the dumbwaiter and see if missing meds are there before re-making them. Can I tell you how much time I waste every day re-doing the same missing med that simply gets misplaced or misdelivered and there’s no way to track it? It’s cheaper to bag up another blood pressure pill with the patient’s name and send it again sometimes. And then we get to retrieve all the duplicates, sort them, and restock them in the pharmacy bins…

I have nightmares about this.

A reliable well-planned medication delivery system is worth $$$ millions and makes up 80% of the nursing / customer satisfaction basis. I swear this is true! Any healthcare organization that builds a state-of-the-art facility without planning a delivery system is completely ignorant.

Done with my rant. 😉

Readers Write 8/17/10

August 16, 2010 Readers Write 3 Comments


How Would You Define a Secure Database?
By Robert J. Rogers, MD


While driving to work in late June, my phone rang. I saw it was my office manager calling. For those of us who own our practices, an early morning call from the office manager is rarely good news.

“Dr. Rogers, someone broke into the office last night and stole the computers!”

Thus my partners and I began our saga of learning the ins and outs of dealing with a potential breach of protected data. We are the “Texas allergy clinic” referenced by Mr. H in the Monday Morning Update of 8/9/10.

(Let me briefly mention now that our database was purely for our practice management system. We do not use an EMR yet. More on that later).

Selfishly, my first thought was, “I hope our backup is good”. A few years ago, we experienced a server crash and learned that our backup was corrupted, requiring a manual rebuild of our database. Fortunately, we learned the backup was fine when the new computers were installed. I naively thought our biggest challenge was behind us.

We decided to check with the Texas Medical Association regarding our reporting responsibilities. We were directed to the AMA’s summary of the HIPAA Data Breach Notification Rule, which was enacted in September of last year. It was at this point that we learned the very important distinction between password protection and encryption.

As I suspect is true in most offices, we were under the impression that our database was secure since we needed a username / password combination to gain access to it. We use a well-known practice management system supported by a local reseller. Password protection was the only security measure discussed.

However, we learned that the database was considered vulnerable if it was not encrypted, thus triggering our reporting responsibilities (first class letter to each affected party, and notification of local media if more than 500 individuals are affected). I will leave it to your imagination to consider the logistics of sending letters to 25,000 patients.

This was a nightmare until we learned that commercial printers and mailing services can handle everything — stuffing, addressing, stamping, and mailing (for a fee, of course). [Mr. H — I didn’t actually complain about the cost of this process. I just responded to the reporter’s question regarding the cost of the mailing.]

Being victims of this crime has triggered a number of questions that I hope some of you may be able to answer. Now that I have learned the importance of encryption, I wonder why encryption is not automatically provided by all vendors. Is it complicated and/or expensive?
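From what I have since learned, the encryption step itself does not appear to be the hard part. Here is a minimal sketch using the open-source Python cryptography library (the patient record below is fabricated); the real work seems to be key management, since the key must never live on the same stolen server as the data.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # in practice, held in a key vault,
    fernet = Fernet(key)             # never alongside the data it protects

    record = b"DOE, JANE | DOB 1957-03-14 | SSN 123-45-6789"   # made-up example
    token = fernet.encrypt(record)   # this is what would sit on the disk

    print(token[:40], b"...")        # unreadable without the key
    print(fernet.decrypt(token))     # recoverable only with the key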

In an informal survey of my physician friends, none of them understood the importance of encryption. None had asked their vendors about encryption. Many of these doctors host their own servers.

Our potential data breach was important mainly due to the potential for identity theft since we don’t use an EMR (fortunately, in this case). That’s bad enough, but I worry even more about the thousands of physicians who use EMRs and may not use data encryption, thus making sensitive medical information potentially accessible.

As a patient, should you ask your doctor about the security measures in place?

The Data Breach Rule requires notification of the local media if more than 500 patients are affected. I wonder about the wisdom of that requirement. Might the thief be unaware of the importance of the stolen server until learning about it from media reports?

Because of our experience, we elected to change to an ASP model for our software, using an off-site server accessed through an encrypted virtual private network. We think this is an adequate level of security, but we thought our previous system was secure, too. Is our database now secure?

As we rush to encourage all physicians to use EMRs, how can we make sure that all involved understand these important security issues?

Robert J. Rogers, MD is a physician with Fort Worth Allergy and Asthma Associates of Fort Worth, TX.

The Business Associate “Relationship”
By Stephanie Crabb


We are working with many customers who are looking to implement Data Loss Prevention (DLP) as part of their information security and compliance programs. The best-practice deployment of these solutions requires collaboration with the HIS application vendors that contribute to the ePHI data life cycle, so that the DLP solutions can efficiently and effectively content fingerprint targeted data “at the source” in the applications themselves. To some, this might fall under the rubric of “integration” as we have come to define it in healthcare, a common practice.

One small client of ours, a hospital of just 20 beds with an unwavering commitment to patient privacy and data security, approached its core HIS vendor, Meditech, with a formal request to connect directly with the database (aka “dictionaries” in Meditech-speak) to accomplish the implementation of its DLP solution. The data set was minimal — six fields of basic, and I mean basic, data to start. 

This request, surprisingly to us, was met with a firm “no” from Meditech. Why? They consider this “customization.”

Respecting Meditech’s longstanding position in this area, I personally worked with our customer to develop the business case to present to Meditech as to why the request deserved reconsideration. We cited areas around breach notification, uses and disclosures, and the like to inspire cooperation from Meditech and to put into clear context that DLP was a technology being adopted specifically to demonstrate compliance with HITECH and MU.

The Meditech account representative acknowledged that they would need to do better in the future, but until they had a “critical mass of requests” from their clients to work with another vendor (like the client’s selected DLP vendor), their answer was still no. Understanding that our client’s Meditech account rep only has so much authority, my CEO and the client CEO requested a personal meeting with Howard Messing, only to be told that Mr. Messing could not accommodate their request. 

This is about a simple permission that the vendor could absolutely grant and that requires little to no effort on its part whatsoever. It is a permission that other HIS vendors have eagerly provided. Oh, the vendor did offer to sell our client a module that would make this “easier” to the tune of $40K, even though what the client needs to access is already present in its implementation.

DLP is not the only emerging technology that holds tremendous promise for organizations looking to reduce their data loss / data breach risk, enhance the controls around their data and its uses, and protect patient privacy. Unfortunately, covered entities cannot accomplish the implementation of these technologies alone. They need the business associate to collaborate, facilitate, and, sometimes, participate. And let’s face it, the rise of technologies like DLP that offer compensatory controls for privacy and security has resulted, in part, because the HIS vendors have been slow to respond with their own system capabilities.

I really do not mean to single out Meditech here. There are certainly other vendors who subscribe to similar operational models. This is, however, precisely the client service mindset that needs to change and that HITECH is requiring, particularly when technology is not the barrier. If these implementations are technically possible and largely resource-neutral to the vendor "business associate," why delay or deny their clients the opportunity to close the privacy and security gaps that are requisite to achieving meaningful use?

While the content of the NPRM may set off a chain of events whereby business associates become even more conservative in their commitments to privacy and security collaboration with their covered entity partners, there really is nowhere to hide, however ambiguously HIPAA and HITECH may be written. If you are in this space and in the business of touching ePHI in any way, you have to be “all in” — technically, operationally, and in the way you serve your clients and the industry at large.

It is simply not acceptable to relegate privacy and security considerations to the back burner, or worse yet, leave your client holding the bag. OCR recently clarified that “willful neglect” includes failure to take action when one recognizes a risk. Business associates who fail to respond when requested by covered entities to address a perceived risk could find themselves in an uncomfortable and costly position if a breach occurs that could have been avoided.

Stephanie Crabb is VP of client services with CynergisTek of Austin, TX.


Why Are Lab Orders from Ambulatory EHRs So Hard?
By Ken Willett


While hospitals with integrated inpatient EHR systems are claiming high adoption rates for CPOE (in some cases 100%), most providers in ambulatory settings are still creating lab orders outside their EHR. What makes it so much harder?

An integrated HIS system, which includes lab and radiology ordering, can present the provider with the correct choices for that hospital’s services. In the ambulatory world, the EHR is much more limited (being less expensive), yet the variety of external service providers is larger. There are generally multiple labs, with ordering rules governed by insurance contracts, and each has its own test catalog, data requirements for the order, HL7 dialect, and requisition formatting requirements.

The provider wants to quickly capture what tests to order and why. Their manual process is to make a few strokes on a preprinted superbill or order slip. It’s very hard for an automated system to compete with that. Ordering facilities in the EHR are often cumbersome, while at the same time too generic to capture the specifics needed by the lab or radiology provider.

The lab, on the other hand, needs an order which is complete, with billing information, the lab’s order codes, appropriate Ask at Order Entry (AOE) questions answered, and the correct requisition and specimen labels printed. To try to assure that orders coming to the lab are accurate and complete, the lab will generally provide an order entry application and workstation to the practice, for use by a phlebotomist or other staff person.

Having a lab-specific ordering application addresses the problem of making sure the order reflects the most current compendium data of that lab (test codes, AOE questions, specimen requirements, and medical necessity rules), at the expense of having a separate ordering application for each lab (and in many cases, due to Stark laws, separate workstations, printers, etc.). Re-entry of order data, together with the need to use multiple ordering applications, significantly increases the likelihood of error.

Improving lab ordering within the ambulatory EHR is difficult because ordering rules need to be configured for each lab and compendium data needs to be constantly updated. This is a significant burden for each practice to undertake.

Given that we are now in an environment with much more seamless connectivity between applications (with web services and other technologies), I believe a better solution is to move the ambulatory ordering function out of the EHR itself and instead provide orders via a connected SaaS application. This can allow compendium data management to be done in one place for multiple practices and multiple labs while still giving the provider and phlebotomist direct access to a universal ordering interface.
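As a sketch of what the EHR would hand off (a Python illustration in which every field name and code is an invented placeholder, not any vendor’s actual API), the order leaves the EHR as a small structured payload and the connected ordering service validates it against the destination lab’s current compendium:

    import json

    order = {
        "ordering_provider": "NPI-0000000000",             # placeholder identifier
        "patient": {"id": "12345", "dob": "1960-05-02", "plan": "PLAN-A"},
        "performing_lab": "lab_x",
        "tests": [{"local_code": "CBC", "diagnosis_code": "DX-PLACEHOLDER",
                   "aoe": {"fasting": "no"}}],             # Ask-at-Order-Entry answers
    }

    def submit(order: dict) -> None:
        """In a real deployment this would be sent to the ordering service, which
        maps codes, checks medical necessity, and prints the requisition."""
        print(json.dumps(order, indent=2))

    submit(order)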

Only some EHRs have the necessary integration capabilities to allow this sort of user interface extension. Still, this seems like a promising direction to improve provider adoption of electronic orders.

Ken Willett is president and CEO/CTO of Ignis Systems of Portland, OR.

Readers Write 8/9/10

August 9, 2010 Readers Write 4 Comments


EHR Exit Strategy
By Robert Doe, JD

While negotiating license agreements for my clients, we typically focus on functionality issues, warranties, uptime representations, support issues, etc. However, with all the distractions in the world of incentive payments and penalties, one thing I suggest my clients give some additional thought to is: what will happen when the relationship with the EHR vendor ends? What will be the “exit strategy”? What do you need to include in the contract with the vendor to ensure a smooth transition to a new system?

One significant concern to consider is what will happen to the data contained in the medical records that your organization has entered into the EHR system after the license agreement has terminated or expired. If the system is being utilized by multiple organizations, will you be required to leave a copy of the information in the system? You will also want to have a clear plan as to how your organization will take the data to a new system upon termination of the relationship with the EHR vendor. Can records be easily copied and/or exported electronically with the current EHR system? It is my understanding that this may not always be a simple task.

In addition, what issues might arise if the license agreement terminates abruptly, as may happen in the event of a breach of contract? The main concern becomes business continuity. You may want to consider including a provision in the license agreement obligating the vendor to provide transition services while you transition to a new system. Ideally, you should be able to fully use the EHR system during this period. Typically, the user pays for these services.

While most organizations are focused on finding and implementing an EHR system, I would suggest giving some thought to the life cycle of the system and devising an “exit strategy” for the time when the license is terminated or expires. Your license agreement should include appropriate provisions to allow you to carry out a smooth transition.

Bob Doe is a founding member of BSSD, an information technology law firm located in Minneapolis, MN.

Our Organization’s Comparison: Cerner vs. Epic
By Roe Coulomb

You asked in a previous post why Epic beats Cerner for every important deal. I previously worked at an organization that did a side-by-side comparison between Cerner and Epic, eventually choosing Epic. I thought it might be helpful to your readers to know the factors that went into that decision.

Corporate Culture

What’s not to like about an organization that has a CEO as accessible as Judy? One year at HIMSS, I observed her rearranging the waste baskets in their booth to make it more user-friendly for her staff servicing customers. That’s servant leadership!

Even before they moved to Verona, their corporate headquarters felt like a college campus. That was partly due to the age of the employees, but also their dress code and the eccentric artwork they’ve acquired over the years.

Does Neal or his top echelon of VPs even attend HIMSS? I’ve never seen him. If the suits do attend, I bet you won’t see them at the booth rubbing shoulders with the average Joe.

Their corporate headquarters is your typical Fortune 500. Lots of suits. Stuffy. Need I mention the bad PR from Neal’s e-mail tirade that was leaked a few years back?

Integration of Ambulatory and Inpatient Records

This is a significant factor for any organization that has a large employed physician group and wants an integrated database for their billing/ADT and EMR data. There are huge opportunities for streamlining things like medication reconciliation from physician office to the inpatient setting and back to the primary care doc.  

Not to mention that providers who practice in an office and also do inpatient work have only one application to learn. Once they are doing order entry and documentation in their office, implementing CPOE and clin doc in the hospital is far simpler (for those physicians, anyway).

Epic’s got this nailed! Cerner, not so much.

Implementation Philosophy 

Your 7/28 post said, “A CIO reader who knows both systems says Cerner requires clients to take ownership of the design and use outside consultants, while Epic offers a more turnkey implementation at a higher price."

That’s true to some extent (the Epic turnkey statement), but it wasn’t always that way. Epic got lots of feedback from their customers that there were too many options and decisions about how to implement a specific function. They picked best practices and turned them into their model system.

Nevertheless, there are still a lot of consulting firms out there with Epic practices and I am not aware of a major medical center that has installed Epic without using consultants.

Software Usability

At the end of the day, all that other stuff doesn’t really matter if the software sucks. How usable it is for employees and docs is what counts. 

During the evaluation we did at my former employer, Epic was simply easier to use. Cerner’s screens were very busy with all kinds of tabs and lots of clicks and keystrokes. I recall one screen where there were 30 “chart-like” tabs across the top of the screen.

I recently viewed a Cerner demo at my current employer showing how a nurse would change one piece of data, like a heart rate, acquired through a monitor device interface. First, click on the cell with the data, click Clear, click Sign to accept the remaining data, go back to the cell, click Edit, enter the new result, and click Sign again.

It’s been a while since I saw this on Epic, but my recollection is that it was like making a change to data in a spreadsheet: highlight the data to be changed, over-type the new result, and click Accept. Far simpler!

You said, "… which would seem to indicate that Millennium isn’t up to the task. In other words, a $6 billion market cap company with a single, fairly low-rated product line that’s getting hammered by a smaller and much higher-rated competitor should think about developing a better product."

Isn’t that what Millennium was supposed to do for Classic? I recall reading about the hit Cerner took to its bottom line in the years they put a lot of resources into developing the new Millennium architecture. I guess one measure of how successful that was is how many Cerner customers are still using Classic.

Finally, I think you hit the nail on the head: “… Cerner has built a business that could weather Neal’s transition or sale to another organization, but we don’t know that with Epic.” Judy won’t be running Epic forever. What happens when she’s gone? Can Carl or whoever replaces her continue to run it as a private company, or will they be forced to sell?

Readers Write 7/28/10

July 28, 2010 Readers Write 3 Comments

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

How to Use Meaningful Use Measures to Improve Internal Processes
By Shubho Chatterjee, PhD, PE

7-28-2010 7-02-57 PM 

The final rule on Meaningful Use was released by the Centers for Medicare and Medicaid Services in July of this year after a year of comment periods and revisions. According to the final rule, to be eligible for incentive payments, Eligible Professionals (EPs) are required to submit to CMS, starting October 2011, 20 measures: 15 core objectives plus an additional five chosen from a menu of 10. For hospitals and Critical Access Hospitals (CAHs), the corresponding requirement is 14 core objectives plus five from a menu of 10.

There are various efforts, dialogues, and debates underway regarding the ability of EPs, hospitals, and CAHs to meet the reporting requirements, whether the cost justifies the incentives, and the sheer human and technical capacity needed. I will not add to those discussions, but will instead focus on how the MU criteria can be used to improve the care delivery process, make it more efficient, and positively impact the operating margin. After all, a measure reflects the output of a process, and while a measure can be met, it can also be used to home in on the process and sub-processes for improvement.

Let us consider some of these Stage 1 measures and how the underlying processes that support reporting of each measure can be identified and improved, strengthening the measure, care delivery, and the operating margin.


Stage 1 Measure
More than 30% of unique patients with at least one medication in their list seen by the EP or admitted to eligible hospital’s or CAH’s ED have at least one medication order entered using CPOE.

Implication
Let’s assume that the provider meets the 30% threshold for the reporting period. A logical follow-through is to examine why the remaining orders are not entered through CPOE and what barriers were overcome to reach this threshold. For the remaining unique patients, is data entry manual because other care locations are not CPOE enabled? Is CPOE available but under-utilized? Or must data be entered manually into and between various systems and then consolidated into one final measure?

Each of these barriers points to a different challenge. The first is system unavailability (a business decision). The second is change management (a people challenge). The third is a technical and process automation challenge requiring an interface or other electronic inputs, such as document management and integration.

Stage 2 and Stage 3 measures will increase the threshold. Thus the underlying process or system gaps should be identified not only to meet later Stage measures, but to improve process efficiencies as well.
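
As a rough illustration of how a reporting team might track this measure and flag the remainder for process review, here is a minimal sketch. The field names (patient_id, has_medication, cpoe_order_count) are hypothetical placeholders, not CMS-defined data elements.

# Sketch: compute the CPOE measure and list the patients left to investigate.
def cpoe_measure(patients):
    """patients: dicts with patient_id, has_medication, cpoe_order_count."""
    denominator = [p for p in patients if p["has_medication"]]
    numerator = [p for p in denominator if p["cpoe_order_count"] > 0]
    remainder = [p["patient_id"] for p in denominator if p["cpoe_order_count"] == 0]
    pct = 100.0 * len(numerator) / len(denominator) if denominator else 0.0
    return pct, remainder

patients = [
    {"patient_id": "A1", "has_medication": True, "cpoe_order_count": 2},
    {"patient_id": "A2", "has_medication": True, "cpoe_order_count": 0},
    {"patient_id": "A3", "has_medication": False, "cpoe_order_count": 0},
]
pct, remainder = cpoe_measure(patients)
print(f"CPOE measure: {pct:.1f}% (threshold 30%); follow up on {remainder}")

The remainder list is where the process questions above get answered: which locations, which users, and which manual steps account for the orders not entered through CPOE.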


Stage 1 Measure
More than 40% of all permissible prescriptions written by the EP are transmitted electronically using certified EHR technology.

Implication
Assuming the 40% threshold is met, what is necessary to increase the measure? Is it the volume of data entry from single or multiple locations, a system that is not fully utilized, or could it be that the receiving pharmacy is unable to manage additional volume from its customers? Again, the barriers are similar to the above and need to be analyzed and overcome.

Stage 1 Measure
More than 10% of all unique patients seen by the EP are provided timely (available to the patient within four business days of being updated in the certified EHR technology) electronic access to health information subject to EP’s discretion to withhold certain information.

Implication
This requirement has procedural, technical, and operational implications. The procedural requirements are in providing HIPAA-compliant health information, while the technical requirements are in the mode of providing the information. For example, will a secure patient portal be created, or will the information be provided on memory sticks or other portable devices, and if so, what is the encryption or data protection policy?

Note that, depending on the technical solution selected, there are supply chain and purchasing requirements as well, to maintain and increase the measure threshold.


Summary
While MU provides financial incentives for healthcare organizations, those incentives end in 2015. It is important for healthcare organizations to use this opportunity not only to prepare for, apply for, and receive the incentives, but to examine their organizations deeply from a People, Process, and Systems perspective so they can meet and then build on the measures.

Only when these three supports are robust and reliable will Meaningful Use be truly meaningful to the healthcare system, where improving the quality of care is the most important objective; operational improvements and business growth will likely follow.

Shubho Chatterjee is chief information officer of Miami Jewish Health Systems of Miami, FL.

 

Bringing Medical Terminology Management into the 21st Century — Just in Time for ICD-10
By George Schwend

7-28-2010 6-40-42 PM 

ICD-10 promises to improve patient safety, the granularity of diagnosis codes, and diagnostic and treatment workflows as well as billing processes. Sounds like a dream, right? But close to three years from the mandated switch on October 1, 2013, most hospitals and health systems are still thinking of it as a nightmare, dreading the massive amount of time, effort, and money the transition will require.

What many fail to grasp is that ICD-10 is just one step on an endless road. There are already dozens of code sets that will probably eventually need to be integrated with each other — from SNOMED-CT and LOINC to RxNorm to local terminologies and proprietary knowledge bases — and all of them are constantly evolving. Look down the road and you can see ICD-11, already in alpha phase in Europe.

Instead of tackling each new iteration as if they were setting off on a major road trip through uncharted territory, providers, payers, and IT vendors need to ditch the proverbial roadmaps and get themselves a GPS unit. That way, they can simply enter each new destination as it comes along and travel there automatically.

And automation is what true semantic interoperability requires. Our metaphorical GPS could either be embedded in proprietary HIT software or plugged into a hospital’s or payer’s information system and triggered by specific events such as an update or the need to create new maps. It would allow users to automatically:

  • update, map, search, browse, localize, and extend content
  • incorporate and map local content to standards
  • update standard terminologies and local content
  • generate easy-to-use content sets to meet the needs of patients, physicians, and customer support professionals
  • reference the latest terminology in all IT applications
  • codify free text
  • set the stage for converting data into actionable intelligence
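
As a purely illustrative sketch of the kind of automation in that list, here is a toy local-to-standard map with an update step and a lookup. The local codes are invented; the ICD-10 codes shown are real but serve only as examples, and a production terminology service would be far richer.

# Toy sketch: a local-to-standard code map with an update step and a lookup.
LOCAL_TO_ICD10 = {"DX_CHF": "I50.9", "DX_HTN": "I10"}   # local code -> ICD-10-CM

def apply_update(mapping, changes):
    """Apply a published terminology update (additions and retirements)."""
    mapping.update(changes.get("add", {}))
    for retired in changes.get("retire", []):
        mapping.pop(retired, None)
    return mapping

def codify(local_code, mapping):
    """Translate a local code to its standard equivalent, if one is mapped."""
    return mapping.get(local_code)  # None flags unmapped content for review

apply_update(LOCAL_TO_ICD10, {"add": {"DX_DM2": "E11.9"}})
print(codify("DX_DM2", LOCAL_TO_ICD10))   # -> E11.9

The "GPS" value is that these updates and lookups happen continuously and centrally rather than as one-off projects inside each application.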

Happily, software that fits the bill is already available, in use today at more than 4,000 sites on five continents. It provides mapping and terminology for leading HIT vendors, for health ministries like the UK National Health Service, and for standards organizations such as the IHTSDO, owner of SNOMED-CT, allowing them not only to implement new codes but to synchronize codes throughout an enterprise, be it a physician practice or a country.

If you are still having nightmares about ICD-10, this is your wake-up call. The ability to merge and manage diverse content from multiple sources — including free text from physician dictation — is what will turn ICD-10 from a frantic, one-off billing upgrade into one in a series of opportunities seized: to move clinical diagnosis to a new level, to optimize EMRs, to meet meaningful use requirements, to satisfy quality initiatives such as the Physician Quality Reporting Initiative, and to support robust analytics and reporting.

Can a roadmap do all that? Hardly.

George Schwend is president and CEO of Health Language, Inc. of Denver, CO.

 

HIE Market, A Shot in the Arm
By Tim Remke

The HIE market finally got a shot in the arm with the passage of the federal stimulus. This and other tailwinds will send hundreds of millions of dollars toward the HIE market over the next few years. From this point on, however, the picture gets muddled. Which vendors are marketing their solutions to which markets, which deployed use cases actually function or operate at a high level, and what differences exist between multi-stakeholder, state, and private HIEs are just a few of the many multi-faceted open questions.

The loose definition of a health information exchange has diluted the significance of surveys and their results, particularly when they seek to understand what types of data are exchanged, the number of HIEs in the market and their respective operational capacity, and technological and governance structures. Simply put, too many results are self-reported and produce statistically insignificant, inaccurate, or misleading data points.

Of particular concern, several market surveys and reports related to the HIE market have commingled data by combining statistics from provider organizations that use solutions developed for basic hospital portals — a far cry from a broader HIE platform. Finally, HIEs may be private, multi-stakeholder, or statewide entities, and payer systems and public health play a role in that delineation as well. The idea of a ‘community HIE’ is limiting and does not appropriately tier the HIE market.

With this perspective and understanding, we assess a few basic aspects of the current state of the HIE market.

Target Markets
A tremendous amount of friction exists over which specific HIE markets are accelerating faster than others, and which companies target each market. For example, a few vendors persist in their belief that the private HIE market is the right first go-to-market target. They look for localized geographies or a few hospitals on which to install an HIE platform as an overlay solution that acts as a ‘buffer’ to a larger regional or statewide exchange.

Within the same HIE market, but counter to this strategy, are the vendors who seek larger contracts from statewide or vast regional, multi-stakeholder exchanges. The two approaches produce variation, some of it small and some more significant, in solution focus and offerings. However, the data indicates an expected consistency: all vendors will market to almost any market. Slicing through the data, though, we see vendors that are targeted. All focus on hospital-to-hospital environments, approximately 85 percent also focus on providing an acute-to-ambulatory framework, and less than 40 percent offer a platform that readily integrates physician groups.

clip_image004

In addition, and somewhat paradoxically, many solutions are simply not designed to operate as platforms for vast geographic or state exchanges. Therefore, for the multi-stakeholder market, the field of viable HIE solutions narrows considerably. A gap arises between the markets vendors target and the ability of their solutions to match those specific markets. Unlike other segments, HIEs seem as conflicted in the details as they are in step on the whole — characteristics of a nascent market (relative to the past few years).

Critical Minimal Requirements
In recent months, we have seen a number of RFPs that contain a significant number of demands. However, they mask a serious issue in the HIE market: most HIEs are ill-equipped to take on the sophisticated and complex solutions, use cases, and technical architectures they greatly desire. Furthermore, over 65 percent stated that even minimal exchange of data from information systems was posing “mission critical problems” for their respective exchanges and would lead to “serious delays.” The table below looks at minimum versus preferred requirements for an exchange structure.

7-28-2010 6-49-03 PM 

Conclusion
Finally, the HIE market is dynamic and has hit full stride. Companies that have weathered the storm seek potential exits (i.e., mergers and acquisitions), while others are ramping up their solutions for the future. The market will likely sustain an above-normal growth rate for the next one to two years.

However, many unanswered questions remain. Business models, measured quality improvements, and funding, among other items, persist as open questions. For example, initial stimulus funds will jump-start statewide HIEs, but after these funds have been depleted, real concerns about long-term viability and funding sources will endure.

Tim Remke is vice president of business development for HealthcareCIO, which produced the Health Information Exchange (HIE) Comprehensive Analysis & Insight report from which aspects of the above article were taken.

Readers Write 7/15/10

July 14, 2010 Readers Write 8 Comments

Achieving EMR Usability in Today’s Complex Technology Market
By Odell Tuttle

As HIMSS began recognizing the importance of human/computer interaction, its EHR Usability Task Force developed the 11 principles of usability — a framework that provides methods of usability evaluation to measure efficiency and effectiveness, including impacts on patient safety. This framework is invaluable because many of today’s clinical systems do not provide adequate support due to poor interface design.

From multiple data interchange and reporting standards, to formatting and encoding standards, to clinical processes and procedures — not to mention the government organizations and legislation — the EMR domain is vast and complex. For hospitals looking to implement an EMR, it is important they choose a technology partner experienced with proven, tested, and used systems. For rural community hospitals, it becomes critical, because their needs are so unique.

The HIMSS 11 principles of usability are a valuable tool in the EMR selection process. A summary of the HIMSS usability principles follows:

Simplicity
Everything from lack of visual clutter and concise information display to inclusion of only functionality that is needed to effectively accomplish tasks.

Naturalness
This refers to how automatically “familiar” and easy to use the application feels to the user.

Consistency
External consistency primarily has to do with how much an application’s structure, interactions, and behaviors match a user’s experience with other software applications. An internally consistent application uses concepts, behavior, appearance, and layout consistently throughout.

Minimizing Cognitive Load
Clinicians in particular are almost always performing under significant time pressure and in environments bursting with multiple demands for their attention. Presenting all the information needed for the task at hand reduces cognitive load.

Efficient Interactions
One of the most direct ways to facilitate efficient user interactions is to minimize the number of steps it takes to complete tasks and to provide shortcuts for use by frequent and/or experienced users.

Forgiveness and Feedback
Forgiveness means that a design allows the user to discover it through exploration without fear of disastrous results. Good feedback to the user supports this goal by informing them about the effects of the actions they are about to take.

Effective Use of Language
All language used in an EMR should be concise and unambiguous.

Effective Information Presentation – Appropriate Density
While density of information on a screen is not commonly measured (though it can be), it is a very important concept to be cognizant of when designing EMR screens.

Meaningful Use of Color
Color is one of several attributes of visual communication. First and foremost, color should be used to convey meaning to the user.

Readability
Screen readability is also a key factor in meeting efficiency and safety objectives. Clinical users must be able to scan information quickly with high comprehension.

Preservation of Context
This is a very important aspect of designing a “transparent” application. In practical terms, this means keeping screen changes and visual interruptions to a minimum during completion of a particular task.

Reliable usability rating schemes offer product purchasers a tool for comparing products before purchase or implementation.

Making complex things appear simple is a very difficult job.  However, by utilizing the HIMSS 11 usability principles, healthcare providers are armed with a powerful tool in the EMR selection process.

Odell Tuttle is chief technology officer at Healthland.

Tech Talk and Market Strategy – Smart Phones
By Mark Moffitt and Chris Reed

Tech Talk – Dictating Reports within an iPhone App

Good Shepherd Medical Center developed an iPhone app that has achieved a very high rate of adoption by physicians (95%) by providing a high degree of customization. The second most popular feature of the app is accessing and playing radiology dictation when a report has not been transcribed and is not available for viewing. Viewing lab data is first.

One reason this feature is popular is that it eliminates the need for a physician to call a dictation system and enter an ID, medical record number, etc. on a telephone keypad. Using the iPhone app, they simply press a virtual button to play a dictation on the iPhone. One less gadget a physician has to futz with.

It seemed logical that physicians would appreciate being able to record a dictation and view clinical results on the iPhone simultaneously without calling a dictation system and entering information on a telephone keypad.

Initially, we planned to integrate our iPhone app with a native dictation app. Unfortunately, this configuration requires multitasking to dictate while viewing clinical information on the iPhone. About one-half of the physicians using the app have the 3G phone, and iPhone OS4 (operating system) supports multitasking but runs slowly on 3G phones.

iPhone OS3.1.3, the latest OS designed for the 3G and 3GS, supports viewing Web pages while talking on the phone. We used this configuration to provide the ability to dictate reports and view clinical results from an iPhone. Our iPhone web app uses the URL scheme “tel” to send commands to the iPhone phone app.

tel: <1>, <2>, <3>, <4>, <5> # note: “,” instructs phone to pause

Where:

1. Telephone number for the dictation system.

2. Physician id.    

3. Site id (hospital).  

4. Job type (H&P, discharge summary, progress note, etc.).

5. Medical record number.

The shortcoming of this approach is that the iPhone dials the entries after the initial phone number slowly. However, it is a big improvement over having a physician call the dictation system and enter information manually.
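
For illustration, this is roughly how the web app assembles the pause-delimited link; the phone number and IDs below are made-up placeholders, not Good Shepherd’s actual values.

# Sketch: build the pause-delimited tel: URL for a dictation call.
def dictation_tel_url(system_number, physician_id, site_id, job_type, mrn):
    # Each "," tells the phone app to pause before sending the next digits.
    return "tel:" + ",".join([system_number, physician_id, site_id, job_type, mrn])

href = dictation_tel_url("9035551234", "4321", "01", "2", "000123456")
print(href)   # the web page renders this as the link behind the "Dictate" button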

This is not our final solution. Sometime late this year or early next year when most physicians are using a 3GS or iPhone 4, we will switch to using a native app to dictate a report. If we had more resources, we would provide a version for iPhone OS3.1.3 and one for OS4.


Market Strategy – Smart Phones and EMRs

If the battlefield for winning the hearts and minds of physicians using electronic medical record (EMR) systems is shifting to smart phones and iPad-like devices, and I think it is, this trend may open the door for vendors like Meditech, Cerner, etc. to derail the Epic juggernaut.

Newer systems like Epic hold an advantage over older systems in terms of usability and user interface design. Software written for smart phones that operates on top of an underlying system can hide those flaws. It is possible, I contend, to neutralize Epic’s usability advantage over older systems among physicians by using an “agile” smart phone software model. An agile model is one that puts in the customer’s hands the ability to rapidly modify and deploy smart phone software to fit the specific needs of an organization. This approach does not change the functionality of the underlying system.

Customers using agile smart phone software can:

1. Configure the app in different ways to greatly improve flow for different kinds of users, e.g. hospitalists, specialists, and surgeons; and for different types of smart phones.

2. Add data to the user interface to guide users toward a specific objective. For example, display house census, length of stay, observation patients and hours since admission, pending discharge, one touch icon for pending discharge alert, etc.

3. Add features that make the physician’s work easier. Examples include a one-touch icon to call the patient’s unit or nurse, playing a recording or dictating on the smart phone while viewing clinical results, accessing the medication list directly from a PPM EMR without a patient master index between systems, receiving clinical alerts, etc.
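
As a hedged sketch of what “agile” means in practice, the configuration below drives different home screens and one-touch actions per physician role; the role names, screens, and actions are invented examples, not any vendor’s actual options.

# Sketch: role-driven configuration for an agile smart phone front end.
ROLE_CONFIG = {
    "hospitalist": {
        "home_screens": ["house_census", "pending_discharges", "lab_results"],
        "one_touch": ["call_unit", "dictate"],
    },
    "surgeon": {
        "home_screens": ["or_schedule", "lab_results"],
        "one_touch": ["call_nurse"],
    },
}

def config_for(role):
    """Return the configuration the app should render for a physician role."""
    return ROLE_CONFIG.get(role, {"home_screens": ["lab_results"], "one_touch": []})

print(config_for("hospitalist")["home_screens"])

Because the change lives in configuration rather than in the underlying EMR, the organization can adjust it and redeploy quickly.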

To compete, smart phone software must be core to your business. Give credit to Epic for recognizing the strategic value of their smart phone software. However, Epic’s smart phone software is “rigid” and that leaves them vulnerable to smart phone software that is agile.

Mark Moffitt is CIO and Chris Reed is Manager at Good Shepherd Medical Center in Longview, Texas.

Readers Write 6/14/10

June 14, 2010 Readers Write 20 Comments

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

The EHR Manifesto
By Recently RIFed

A spectre is haunting America — the spectre of Meaningful Use. All the powers of traditional vendors have entered into a holy alliance to exorcise this spectre: Executive Office and ONC, Allscripts and Eclipsys, Epic, Cerner, McKesson, and Meditech.

Where is the software vendor that has not been decried as unusable by its opponents in power? Where is the software vendor that has not hurled back the branding reproach of unusable software, against the more integrated vendors, as well as against its reactionary adversaries? (My apologies to Karl and Friedrich).

10 Point Program to Improve EHR software

  1. Less configurable. The Demotivators® said it best “When people are free to do as they please, they usually imitate each other”. Every hospital or physician practice is unique — they uniquely solve the exact same problems everyone else is facing.
  2. Better designed. End-user input and UI design should be part of the specs, not the pilot.
  3. Customer-prioritized enhancements. Fifty percent vendor-driven (sales and demo feedback, regulatory requirements, infrastructure, etc.), 50% prioritized by customers. Yearly process, projects grouped to be equal number of hours, one vote per licensed bed, top x projects will be roadmapped to fill 50% time.
  4. Consensus-driven standard content and configuration. Vendor designed, large group customer editing — majority rules, everyone uses.
  5. Remote hosted. 99.999% uptime, capacity and response time are key requirements.
  6. Rapid install. If you’ve followed 1-5, training the end-users should be the most time-intensive phase of the implementation.
  7. Qualified buyers. We’ll sell to you if you agree to: follow our standard workflows, use our standard build and participate (end-user input, content design, and prioritization). Must agree to mandate adoption! Better to support 50 involved, committed customers than 100 unhappy, non-standard, partially-implemented, low-adoption targets.
  8. Equitable pricing. Low upfront, subscription-based. Every customer pays the same, scaled by size or volume.
  9. Play nice with other vendors. Integration > Interfacing > Interoperating.
  10. Record portability. Remove vendor lock-in. The intersection of the NHIN and CCDs with the market transitioning to replacement will make this a necessity. You know it will be mandated eventually.

I can’t think of a single vendor that would get a passing grade on my 10-point scale (even the industry darling would only receive a 40%). But please, prove me wrong and post comments. As I review my RIF package and dust off my resume, I’d love to be proven wrong (and find out they’re hiring) …

Personally, I’d love to see a new breed of vendors emerge. Maybe someone will submit a FOIA request and hire a team of developers and clinicians to polish and fill in missing functionality. Maybe even someone willing to follow my manifesto and explore a co-op or non-profit corporate structure. Forget the socialization of medicine, let’s socialize the vendors. Until that happens, I’ll continue to remain anonymous and try to work from within.

Jump-Start HIEs with Integrated Health Records
By Ravi Sharma

 ravisharma

One of the challenges that most EHR systems will have in satisfying the government’s Meaningful Use requirements will be to establish connectivity and interoperability with other providers’ systems and ancillary services. Disparate data from multiple providers must come together as a more complete patient-centric record to achieve this goal, and not all providers are ready for it. These and other business and logistical issues are some of the challenges that health information exchanges (HIEs) have encountered.

One solution is to use technology to leverage data generated through existing business relationships. This can be done through a Web-based, patient-centric “Integrated Health Record” (IHR) that integrates data from multiple sources and institutions. An IHR provides up-to-date, community-wide, patient-centric data such as lab and imaging orders and results, incorporating both hospital and reference labs.

It also can be used for ordering prescription drugs, leveraging the patient’s allergies, drug history, and even lab data to prevent adverse events. Physicians can even follow the inpatient encounters of patients admitted to connected hospitals, along with outpatient data, from anywhere over the Web.

IHRs also improve the ability for patient care teams — physicians who must collaborate to provide comprehensive care — to coordinate care and share patient records. Today, such clinical information between referring physicians is shared via fax, mail, or phone. Even when practices have EHRs, they’re often unable to send key patient data electronically to other physicians who may be using different EHR systems.

The Meaningful Use criteria require such exchanges to occur using standards such as Continuity of Care Document (CCD) and the Continuity of Care Record (CCR), but few systems are capable of using such standards. That’s partly because EHRs aren’t designed for information exchange and also because, in the absence of HIEs, the transmittal of CCDs requires point-to-point interfaces. An IHR that already can create connections to multiple EHRs can act as a link to exchange CCDs or CCRs.

The IHR is not designed to replace EHRs or CPOE systems, but rather to work alongside them, connecting them with other information sources. In that sense, the IHR unifies and facilitates patient-centric data exchange between various entities to support the formation of HIEs. The IHR further facilitates the integration of data from multiple sources by normalizing data from disparate sources using standards specified in the Meaningful Use criteria, such as LOINC for discrete lab data.
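
A toy sketch of that normalization step: map each source system’s local test code to a LOINC code so results from different labs can be merged into one patient-centric view. The local codes are invented; 2345-7 is the real LOINC code for serum/plasma glucose, used here only as an example.

# Sketch: normalize lab results from two sources to LOINC before merging.
SOURCE_MAPS = {
    "hospital_lab": {"GLU": "2345-7"},
    "reference_lab": {"GLUC_SER": "2345-7"},
}

def normalize(result):
    """result: {"source": ..., "local_code": ..., "value": ..., "units": ...}"""
    loinc = SOURCE_MAPS.get(result["source"], {}).get(result["local_code"])
    return {**result, "loinc": loinc}   # None means unmapped content to review

merged = [normalize(r) for r in [
    {"source": "hospital_lab", "local_code": "GLU", "value": 98, "units": "mg/dL"},
    {"source": "reference_lab", "local_code": "GLUC_SER", "value": 101, "units": "mg/dL"},
]]
print(merged)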

Rather than making upfront investments in an MPI and other expensive technologies, HIE pilots can greatly benefit from technologies like the IHR. The IHR can not only serve as a basic HIE, but also facilitate HIE participation by providing key information where and when it’s needed on the front lines of patient care.

Ravi Sharma is president and CEO of 4medica.

Thoughts on Eclipsys-Allscripts
By Tim Elliott

The coming together of two heavyweights in the healthcare IT industry, Allscripts and Eclipsys, has the potential to open doors for their existing and future customers, third-party developers, and patients. There will be some challenges, too — including helping current customers integrate legacy Allscripts and Eclipsys systems alongside new modules — but this can be considered another opportunity for outside vendors whose technologies bridge the gaps between Eclipsys and Allscripts applications.

Detractors may be lampooning the Allscripts / Eclipsys “One network, one platform, one patient” slogan, but in truth, the merger does create a cohesive, cradle-to-grave care solution by uniting pre-acute, acute, and post-acute care information, as well as simplifying financial and performance management with non-clinical data.

The use of a common .NET technology stack offers the possibility of seamless integration and increased usability for clinicians and administrative staff. It also makes it easier for third-party software providers to deliver bolt-on solutions that further enhance Allscripts / Eclipsys offerings in physician practices, hospitals, home health, and other care environments. These external vendors will be crucial if Allscripts / Eclipsys is to succeed in bringing together previously disparate patient populations, which will require capturing and managing data from multiple sources in a centralized manner.

Tim Elliott is CEO of Access.

Readers Write 5/17/10

May 17, 2010 Readers Write 6 Comments

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

Medical Image Sharing: The Future is In the Cloud
By Eric Maki

eric_maki

Is the world coming to an end — the healthcare IT world of proprietary silos, that is? When it comes to the sharing of radiology images and report files, the answer appears to be an emphatic YES.

My facility, the Great Falls Clinic in Great Falls, Montana is just one of dozens I know about that now share full-resolution images and reports via cloud-based technology.

The approach works seamlessly. Both uploading and downloading aren’t much more complicated than sending an e-mail with an attachment. No one needs to babysit the process, which at a leanly staffed rural clinic like ours, is a big advantage. And there are no requirements to establish and maintain the link, unlike the VPNs that were our workaround until recently.

There are advantages to proprietary healthcare IT technology. But when it comes to sharing images, proprietary IT has posed challenges throughout my entire state. Because nearly all of Montana’s medical facilities are less than full-service, we often have to transport patients with major issues to a large hospital in the nearest big city. The docs there, of course, want to see whatever imaging studies and accompanying info we generated at our facility. Proprietary IT forced us to use VPNs or other workarounds like burning and sending CDs.

There was also a major expense involved in all the time we spent to maintain our VPNs every time we installed an IT upgrade such as a beefier firewall. Some of my colleagues in Montana who relied on CDs for file sharing were having other frustrations. Sometimes the CDs couldn’t be read on the recipient hospital’s computers. Sometimes the CDs were damaged, couldn’t be read anywhere, or worse, were lost and never found.

We were fed up with this situation in our state, so 30 of our facilities formed an organization to search for a better solution. We called it Image Movement of Montana, or IMOM. We asked several PACS vendors for ideas and, fortunately, one had just developed a cloud-based service that met our needs. It required no new capital acquisition of hardware or software and bypassed all the proprietary hurdles that had plagued us to this point.

The Great Falls Clinic was one of the six facilities that tested the system on behalf of all 30 IMOM members. It worked pretty much without a hitch. A problem that vexed us for many years was suddenly solved, just like that.

The system we use is called eMix, but there are other players in this game — LifeImage and SeeMyRadiology, for example. From what I’m reading, there may soon be more cloud-based image-sharing services available. It’s clear to me that the future of medical image sharing is in the cloud.

Eric Maki is manager of information technology at the Great Falls Clinic, Great Falls, MT.

 

NHIN CONNECT Code-a-thon
By iReporter

connectbanner

ONC sponsored what it called an NHIN CONNECT code-a-thon in Miami a few weeks back. Like the IHE Connectathon held earlier this year in Chicago, this forum drew attendees who were primarily hands-on senior software architects and engineers, refreshingly working together to tackle our industry’s connectivity woes.

This meeting had three components. The main one was two days of in-depth collaborative sessions to discuss a variety of technical topics regarding the current CONNECT version as well as group planning for future version features. The second was the CCD template competition won by Georgia Tech that you highlighted here.

The third and most important component in terms of potential long-term impact on the industry was the creation of the Electronic Health Record Interoperability Special Interest Group (EHRI-SIG). To a standing room only audience (and 60 online participants), the CONNECT team presented their ideas and reached out to the private sector for help in establishing a group committed to advancing the state of practice involving medical record interoperability. 

connectteam

One unique idea presented involved the use of XMPP, the protocol that underlies many instant messaging applications. The idea was to exploit this protocol to implement new communication and exchanges between doctors, patients, personal health records, laboratories, and pharmacies. Another interesting discussion revolved around the CONNECT team’s desire to implement no-click solutions and to stop the phone from ringing in the doctor’s office.
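
To make the XMPP idea a bit more concrete, here is a minimal sketch of the kind of message stanza such an exchange could carry, built with Python’s standard library. The addresses and payload are hypothetical, and a real deployment would send the stanza over an authenticated XMPP session rather than just serializing it.

# Sketch: build a bare-bones XMPP message stanza (hypothetical addresses and body).
import xml.etree.ElementTree as ET

def result_notification(to_jid, from_jid, body_text):
    msg = ET.Element("message", attrib={"to": to_jid, "from": from_jid, "type": "chat"})
    ET.SubElement(msg, "body").text = body_text
    return ET.tostring(msg, encoding="unicode")

print(result_notification(
    "dr.smith@clinic.example.org",
    "results@lab.example.org",
    "New potassium result available for MRN 000123456",
))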

The meeting video/audio and presentations can be found here.

This modest event could very well signal the beginning of how health information exchange will fundamentally be changed and accelerated in this country. By combining the best of the NHIN CONNECT industrial-strength “trust fabric” with some of the same concepts being considered within NHIN Direct, this effort is positioned to provide a “sweet spot” that will likely appeal broadly to healthcare industry stakeholders as they tackle meaningful use under Stages 2 and 3.

EHRI-SIG will be making specific decisions on how to move forward at its second meeting in DC on June 2. Because it is a true working meeting, attendees are required to submit short use case descriptions and to represent EHR, lab, pharmacy, PHR, and similar vendors so that the outcome of the discussion can potentially translate into enhanced product capabilities. Information can be found here.

This initiative is an open challenge to the healthcare industry vendor community to demonstrate true leadership at a critical time in order to improve outcomes by getting the right information to the right person at the right time. It will be interesting indeed to see who steps up and who does not.

Creating Efficiencies through Enhanced Communications: Alerts and Notifications
By Jenny Kakasuleff

jk

With the recent passage of health care reform and the 30 million newly insured individuals estimated to enter the marketplace, providers are under increasing pressure to improve productivity and efficiencies to meet increasing demand. These challenges must be met while simultaneously improving the quality of care patients receive.

Historically, providers of health care services have taken a piecemeal approach to implementing health information technologies. This has resulted in a number of disparate systems that do not communicate with one another, and contribute to a growing army of devices that health care providers must haul around with them, or have at their disposal in a largely mobile environment.

The alerting and notification systems still in use at many hospitals today are a conglomeration of proprietary systems and devices utilized to perform one particular function — a bedside monitor that sends an alert to the central nursing station to report a change in a patient’s vitals; a tracking system that allows any provider with computer access to locate a device; or a lab information system that sends an e-mail to indicate an abnormal lab result.

While this approach provides many individual solutions to overcome past inefficiencies, it has been uncoordinated, and as a result, creates its own set of problems. The responding provider is saddled with a number of different communication devices to perform a range of non-standardized tasks.

Most professionals today have the ability to perform all of their business-related (and personal) activities via a single mobile device. We make phone calls, check our e-mail, manage our calendar, pay our bills, locate people and places using GPS, listen to music, connect with friends and family through SMS text and instant messaging, or through social media networking — all through one multi-functional device. It is amazing that the same demand is not pervasive in the medical sector.

Health IT solutions now exist that not only address the problems of the past, but work to streamline the disparate systems currently in use into a single, standardized messaging system that delivers a range of alerts and notifications of varying importance to the appropriate recipient. Also, with the integration of an enterprise-class communication solution, providers now have the ability to receive alerts from each proprietary system — electronic medical record (EMR), hospital information system (HIS), nurse assignment, lab information system, etc. — via a single device powered through a unified communications system.

Different messages are delivered based upon their level of importance and escalated until their receipt is acknowledged. The HIS is then updated, and audit trails create a measure of quality tracking and control. The recipient can then respond to the relevant options generated without locating a phone, computer, or other staff member.
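
The escalation pattern is simple to describe. The sketch below is only an illustration; the recipients, timings, and the send and audit functions stand in for whatever the unified communications platform and HIS actually provide.

# Sketch: deliver an alert through an escalation chain until someone acknowledges it.
import time

def send_to_device(recipient, alert):
    print(f"Delivering to {recipient}: {alert}")      # stand-in for the UC system's send

def audit(event, recipient, alert):
    print(f"AUDIT {event}: {recipient} / {alert}")    # stand-in for the HIS audit trail

def escalate(alert, chain, wait_seconds, is_acknowledged):
    for recipient in chain:
        send_to_device(recipient, alert)
        deadline = time.time() + wait_seconds
        while time.time() < deadline:
            if is_acknowledged(recipient, alert):
                audit("acknowledged", recipient, alert)
                return recipient
            time.sleep(0.5)
        audit("escalated past", recipient, alert)
    return None  # nobody acknowledged; hand off to a supervisor or charge nurse

# Example run: the charge nurse acknowledges after the bedside nurse does not respond.
escalate("Critical K+ 6.8 for MRN 000123456",
         ["RN Jones", "Charge RN", "Dr. Smith"],
         wait_seconds=2,
         is_acknowledged=lambda recipient, alert: recipient == "Charge RN")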

As the American Recovery and Reinvestment Act (ARRA) forces health care professionals to evaluate how best to implement and utilize their EMR systems to qualify for meaningful use incentives, their approach should be holistic; cognizant of current and future challenges; and focused on gaining as much mileage as possible from the investment.

Jenny Kakasuleff is government liaison with Extension, Inc. of Fort Wayne, IN.

Readers Write 5/10/10

May 10, 2010 Readers Write 12 Comments

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

Thoughts About athenahealth
By Deborah Peel, MD

 dpeel

Another misguided, uninformed EHR vendor will discount the price of EHR software for doctors willing to sell their patients’ data!

How is it possible to be so unaware of what the public wants? The public doesn’t want anything new or earth-shattering, just the restoration of the right to control who can see and use their medical records in electronic systems.

Not only is the practice of selling your patients’ data illegal and unethical, but the new protections in the stimulus bill require that patients give informed consent before their protected health information can be sold. So selling patient data without consent is now a federal crime.

Quotes from the story:

  • athena’s EHR customers who opt to share their patients’ data with other providers would pay a discounted rate to use athena’s health record software.
  • athena would be able to make money with the patient data by charging, say, a hospital a small fee to access a patient’s insurance and medical information from athena’s network.
  • Caritas Christi [Health Care] initially launched athena’s billing software and service in October and then revealed in January that it decided to offer the company’s EHR to physicians.
  • How many patients would agree to sell their health records to help their doctor’s bottom line AND at the same time put their jobs, credit, and insurability at risk?

Health information is an extremely valuable commodity, so people are always thinking of new ways to use it.

What will athena’s informed consent for the sale of patients’ health data look like? Will athena lay out all the risks of harm? Will athena explain that once personal health data is sold, the buyer can resell it endlessly to even more users? Will athena caution patients that once privacy is lost or SOLD, it can never be restored?

I guess some people are so out of it they do not realize what a barrier the lack of privacy and lack of trust is to healthcare. HHS reports 600,000 people a year refuse to get early diagnosis and treatment for cancer because they know the information won’t stay private. Another 2,000,000 refuse early diagnosis and treatment for mental illness for the same reasons.

Check out slides from a recent conference at the UT McCombs Business School on the subject of patient expectations, privacy and consent.

Deborah C. Peel, MD is a practicing physician and the founder of Patient Privacy Rights.

Thoughts About athenahealth
By Truth Seeker

Um, I think we need to settle down here, folks. I may be wrong, but I believe when athena refers to athenaCommunity and the exchange of information, they are referring to the following hypothetical scenario:

A patient whose primary physician is an athena customer needs to be admitted to the hospital. athena delivers to that hospital a clean, clinically accurate, and up-to-date record of that patient’s medical history and charges the hospital a few bucks. athena is able to charge the primary care physician a lower fee for their EMR service because they are shifting some of the financial burden to the hospital. And intuitively, this makes sense for a couple of reasons:

The push towards electronic medical records is to enable greater exchange of information and better coordination of care, etc. So when athena talks about athenaCommunity, I’m fairly certain that they’re not talking about a sinister plot to share info with hospitals so they can refuse to admit high-risk or expensive patients. (Seriously, the conclusions people draw from articles like this without doing their homework can be completely ridiculous, but I suppose that casting baseless aspersions is just the nature of informed discussion in the Internet era.)

They’re just talking about handing the patient over to another provider, making sure that the new provider has a completely accurate and up-to-date record of that patient’s medical history, and shifting the financial burden of the handover away from the primary care physician. What a “privacy disaster” … a sheer outrage!

And second, I’m no healthcare economist, but I’m pretty sure that a) the hospital really wants and needs that patient’s medical history and that athena is probably better positioned to deliver it in a more useful format than a lot of their competitors; and b) it’s probably worth a lot more to the hospital than a few bucks. 

I’m not an athena employee or other stakeholder, but I do think that they continue to think of innovative new solutions to problems, bottlenecks, and inefficiencies in the healthcare system. Unfortunately, they seem to have a bulls eye on their backs right now. I for one am happy that we have smart people like Jonathan Bush out there coming up with creative new solutions. 

Why Emergency Physicians Prefer Best-of-Breed IT Systems
By John Fontanetta, MD, FACEP

johnf

According to a recent report from KLAS, some hospitals are replacing standalone, best-of-breed (BoB) emergency department information systems (EDIS) with enterprise solutions that are leaving ED clinicians — and often their patients — unsatisfied. Why unsatisfied? Because the clinical functionality in enterprise solutions is both less comprehensive and less efficient for the ED environment, and the systems are simply harder to use.

This report has re-energized the debate over the benefits of the two kinds of systems. IT professionals prefer the seamless interoperability supposedly offered by single systems, but the fact is that many large vendors have simply bought and shoehorned in a separate ED system. The resulting systems have their own interface issues.

Like many of my fellow ED physicians, I have found that a first-class BoB system tailored specifically to the needs of the ED, in our case EDIMS, offers a number of advantages. For example:

  • Workflow in the ED is measured in seconds and minutes rather than hours or days. The fewer clicks required, the faster the care. At Clara Maass Medical Center, we can issue complete sets of orders in as few as three clicks, enabling our physicians to be more productive.
  • Trying to retrofit an inpatient IT system to the ED is difficult because the ED is just so different from the floors. Customized ED order sets with a linked charge capture system mean less delay between treatment and billing, not to mention more accurate capture of charges, which has dramatically increased our per-patient revenue.
  • In the same way, customized alerts that tell the ED staff what they’re forgetting to document cut back on the number of claims denied due to missing or inaccurate information. At Clara Maass, we have slashed such denials by 75%.

One of the most important things about a good ED system vendor is responsiveness. The vendor should be able to quickly accommodate the ongoing changes in standards and regulations. For example, at Clara Maass, when the H1N1 virus first appeared in 2009, we had templates for recommended care and discharge instructions built into our system by our BoB vendor within 24 hours. And when we decided to create an observation area, they promptly responded with observation-specific templates and order sets and created a secondary note option for the observation physicians.

The EMR system has enabled us to make a number of other improvements in our ED. For example, we have reduced the average patient turnaround time by over 30%. We have boosted the number of EKGs we perform within five minutes of a patient coming through the door from 46% to more than 90%.

Overall, my specialty has been slow to adopt EHRs, not because we don’t see their importance, but because they have a reputation for being unwieldy and unresponsive to the requirements of the ED. With more and more EDs adopting BoB systems that are designed to support ED clinicians’ intricate and demanding workflows, physicians are starting to realize that an EHR can actually be an advantage in our fast-paced environment, rather than a burden. 

CIOs are finding that these BoB systems can offer the same integration capabilities as a single enterprise solution, if not better. While many of the HIS vendors are inflexible when it comes to working with other systems, BoB systems have always had to offer integration solutions, and many pride themselves on their ability to integrate with almost any system.

John Fontanetta MD, FACEP, is chairman of the department of emergency medicine at Clara Maass Medical Center, Belleville, NJ and chief medical officer of EDIMS.

Digging for Gold in your HIT Applications
By Ron Olsen

Over the past few years, hospitals have focused IT budgets and resources on purchasing applications to enhance their HIS. Many facilities have spent tens or hundreds of thousands — millions for the larger hospitals — on licensing, maintenance, and ongoing professional services.

In the feeding frenzy to continually acquire and implement the latest healthcare information technology, most IT/IS teams are neglecting to ask basic but important questions about their existing applications, such as:

  • Are we using the software to its fullest extent?
  • Have we turned on every feature we’re currently licensed for?
  • Are HIT products meeting the needs we identified when planning the deployment?
  • Have we asked users what they’d like to see added to the product, and if so, has that been communicated to the vendor so they can include it in a future version?

Asking questions does not cost anything and end users are usually very vocal about what they’d really like to see software do for them. Their invaluable real-world input is useless if there’s no feedback mechanism, or if your team refuses to incorporate it into product roadmap discussions with vendors.

In a time when hospitals’ funds are tighter and IT budgets are frozen or cut, it’s time to double back and review the products you have purchased and their capabilities. Maybe re-present the product to different areas of the facility, explaining existing functionality again and introducing new features that have been added since the initial implementation. Once users have had a refresher, they may identify functionality that was not implemented initially and would now prove useful.

Healthcare technology vendors are always eager to showcase new features and theoretical uses for these at sales presentations, but IT/IS admins often overlook “hidden gems” in the software that other hospitals are actually using. If the vendor has a user group, listservs, or an online forum, these are great places to start, not to mention that they cost nothing and consume very little time.

These collaborative tools may enable your team to discover other use cases that even your vendors have not thought of. There are a lot of people in the healthcare IT trenches creating workarounds every day. There may be capabilities within your current products that can be combined with other systems in your tool bag to create a new or improved process that is, again, a freebie.

One of the most over-used buzzwords in healthcare IT is “interoperability,” which is really just a big word that self-important people use to describe data transfer. When thinking about data transfer at a basic level, almost every HIT product can output to a printer. A printer can be easily set up to print to a file. So now you have data in a file format.

Scripting tools can manipulate those files, turning them into almost any format imaginable. With the correct format, data can be transferred to disparate systems, individually or concurrently, via a data stream. This could be a raw text file, compressed zip file, encrypted e-mail file, FTP, or an HL7 file.
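
As a small, purely illustrative example of that scripting step, the snippet below parses an invented print-to-file report into named fields and re-emits it as a delimited record; the same fields could just as easily be written out as HL7 or XML.

# Sketch: turn a printed report file into a structured, delimited record.
import csv, io, re

printed_report = """PATIENT: DOE, JANE    MRN: 000123456
TEST: POTASSIUM       RESULT: 3.9 mmol/L
"""

def report_to_fields(text):
    fields = {}
    for line in text.splitlines():
        for chunk in re.split(r"\s{2,}", line):   # columns are separated by runs of spaces
            if ":" in chunk:
                key, value = chunk.split(":", 1)
                fields[key.strip()] = value.strip()
    return fields

fields = report_to_fields(printed_report)
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=list(fields))
writer.writeheader()
writer.writerow(fields)
print(out.getvalue())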

This method is easily applied to an enterprise forms management system. If it has a decision engine, you could print a form set from it and then have the engine input the data to a database for audit trails (you should be able to choose the data points). Next, the engine sends the data to a file and launches an application to text the ordering physician that the patient just presented, based on the data in the text file.

If you’re a budget-conscious healthcare IT professional who wants to better meet the needs of your user community, I implore you to take another look at the systems you’re already working with. In my many years as a system admin at a community hospital, getting more out of the tools available to me (instead of just relying on new purchases) helped me deliver more effective tech solutions to my users, positively impact patient service, and keep decision makers happy by saving money.

You, too, have gold nuggets hidden in your existing software. It’s up to you to find and use them.

Ron Olsen is a product specialist with Access.

Readers Write 5/3/10

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

Goodbye Data Warehouse and Cubes, Hello AQL
By Mark Moffitt


For the last two years, I have been researching systems to replace the data warehouse used for report-writing in our organization. This effort has been driven by the desire to provide better service to other departments that rely heavily on data reporting for day-to-day operations.

The idea is to push data to users so they can perform in-memory analysis and display of large amounts of data, replacing the current process of requesting custom reports and spreadsheets from the information services (IS) department. That process consumes considerable IS resources, and requests can take several days when the report queue grows large.

The requirements for a new system are straightforward, but somewhat daunting:

1. Put data into users’ hands so they can perform business intelligence.

2. The cost of the system, including license, hardware, and consulting, must be offset by the direct costs of shutting down existing systems.

At GSMC we operate Meditech Magic and use a data warehouse for analytics and business intelligence. The data warehouse stores about nine years of financial data in about 650 GB. The data in the warehouse is updated nightly. SQL reports have been developed to provide reporting across the organization.

IS at GSMC is bombarded with requests for new reports. These are specialized requests for data that often require modifying an existing SQL query or writing a new one. The process is iterative: gather requirements for the report, modify or write the SQL, generate the report, and send it to the customer.

Turnaround times vary and depend heavily on the number of reports already in the queue. The best case is four hours; two to four days is typical. Often the customer will, upon review of the report, ask to include or exclude specific data. This back-and-forth typically repeats several times until the report meets the customer’s needs.

The IS department at GSMC has several analysts who spend a good part of their time responding to requests for data. It is a never-ending demand.

We researched the use of OLAP (online analytical processing) cubes to provide data to users. The advantages of cubes are well documented and include the ability to drill down to details and analyze data in ways simply not possible with reports or spreadsheets. The disadvantage of cubes is that data must first be aggregated. If a user needs data not included in the cube, the cube must be rebuilt. A data warehouse is also required. Finally, building and maintaining cubes requires personnel with specialized skills.
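To make that trade-off concrete, here is a toy illustration (hypothetical rows, plain Python, nothing to do with any particular vendor): the pre-aggregated summary has to be rebuilt whenever a user asks for a dimension it doesn’t contain, while an ad-hoc query simply groups the detail rows on demand.

    # Illustrative only: toy rows standing in for warehouse data.
    from collections import defaultdict

    rows = [
        {"dept": "Lab",     "payer": "Medicare", "month": "2010-01", "charges": 1200.0},
        {"dept": "Lab",     "payer": "Aetna",    "month": "2010-01", "charges":  800.0},
        {"dept": "Imaging", "payer": "Medicare", "month": "2010-02", "charges": 2500.0},
    ]

    # "Cube" style: aggregate ahead of time on a fixed set of dimensions.
    # If a user later wants charges by payer, this structure must be rebuilt.
    cube = defaultdict(float)
    for r in rows:
        cube[(r["dept"], r["month"])] += r["charges"]

    # Ad-hoc style: keep the detail rows and group on whatever the user asks for.
    def group_by(rows, dimension):
        totals = defaultdict(float)
        for r in rows:
            totals[r[dimension]] += r["charges"]
        return dict(totals)

    print(cube[("Lab", "2010-01")])   # 2000.0, from the pre-built aggregate
    print(group_by(rows, "payer"))    # {'Medicare': 3700.0, 'Aetna': 800.0}, computed on demand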

About seven months ago, I read on HIStalk about a new company named QlikView. I researched the software and it sounded too good to be true. However, I was intrigued that QlikView doubled revenues in 2008, not an especially good year for selling enterprise software as the national economy was in a major recession.

On the surface, QlikView is a business intelligence solution that consists of a data source integration module, an analytics engine, and a user interface. QlikView is based on AQL (Associative Query Logic) and is completely different from other OLAP tools.

Through AQL, QlikView eliminates the need for OLAP cubes and a data warehouse, replacing the cube structure with a Data Cloud. A Data Cloud does not contain any pre-aggregated data but instead builds non-redundant tables and keeps them in memory at all times. Queries are then created on the fly and are run against the Data Cloud’s in-memory data store.

Under AQL, all data is stored only once, and all data associations are stored as pointers, so a Data Cloud database becomes more efficient at retrieving records than do OLAP databases. A Data Cloud database is also much smaller since records are not repeated through aggregation and its structure never has to change. The architecture allows for a flexible end-user experience because it doesn’t require aggregation or pre-canned queries that try to cover every possible analytical scenario a user can create, unlike data cubes that require both. (1)

Data Clouds run in memory and AQL reduces in-memory storage requirements by about 75% as compared to source data. In-memory Data Clouds can be stored as AQL files for archiving. AQL disk files are 90% smaller than source data. Think of an AQL file like an Excel file where data can be added and deleted and the file saved with different names for archiving purposes.
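Published descriptions of AQL are thin on internals, but the general idea described above (every distinct value stored once, with rows holding pointers into those value lists) can be sketched as a small dictionary-encoded, in-memory column store. Everything below is an assumption for illustration, not QlikView’s actual engine.

    # Toy dictionary-encoded column store: each distinct value is kept once and
    # rows store integer pointers into the value lists. Illustration only.

    class Column:
        def __init__(self):
            self.values = []     # each distinct value stored exactly once
            self.index = {}      # value -> pointer
            self.pointers = []   # one pointer per row

        def append(self, value):
            if value not in self.index:
                self.index[value] = len(self.values)
                self.values.append(value)
            self.pointers.append(self.index[value])

        def row(self, i):
            return self.values[self.pointers[i]]

    class Table:
        def __init__(self, names):
            self.columns = {name: Column() for name in names}
            self.length = 0

        def insert(self, **row):
            for name, col in self.columns.items():
                col.append(row[name])
            self.length += 1

        def select(self, where_col, where_val, pick_col):
            # Query on the fly against the in-memory store; no pre-aggregation.
            return [self.columns[pick_col].row(i)
                    for i in range(self.length)
                    if self.columns[where_col].row(i) == where_val]

    t = Table(["payer", "charges"])
    t.insert(payer="Medicare", charges=1200.0)
    t.insert(payer="Medicare", charges=2500.0)
    t.insert(payer="Aetna", charges=800.0)
    print(sum(t.select("payer", "Medicare", "charges")))   # 3700.0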

The price point for the software is about $150,000 (one-time fee) for our health system. Hardware costs are about $15,000 for a server with 98 GB of memory. We expect consulting fees to total $150,000 for a SME in hospital financial data with QlikView experience. We worked with RSM McGladrey on a consulting proposal as they have well-qualified personnel in this space.

If you know much about the BI/analytics space, you may question the low cost of the software and consulting services. That has everything to do with the AQL model. RSM McGladrey quoted a revenue cycle effort at eight weeks that includes:

  • Transfer data from existing systems to QlikView
  • Data validation
  • Census analysis
  • AR analysis
  • Insurance contract analysis
  • Hindsight analysis
  • Train IS staff on data extraction

The revenue cycle statement of work is only one component of the $150,000 quote for consulting services from RSM McGladrey for implementing QlikView at our organization.

The total cost for QlikView at GSMC is $315,000. That cost will be directly offset by shutting down the data warehouse, by savings from using QlikView for analytics instead of another system whose consulting costs had already been quoted and budgeted, and by other savings. We expect additional direct benefits from having deep analytic capabilities with our revenue cycle data.

QlikView has a number of healthcare customers. I believe you will be hearing more about the company in healthcare in the years ahead as awareness of the software’s capabilities and price point grows.

We have not yet purchased the package. If we do, I’ll write a follow-up article on our experience.

1. Mario Morejon, “QlikTech, IBM Provide New View Of OLAP,” ChannelWeb, July 18, 2003, http://www.crn.com/software/18839582

Mark Moffitt is CIO at Good Shepherd Medical Center in Longview, TX.

Humpty Dumpty Leaves Wonderland to Visit Health Information Technology
By Jim Kretz

Suppose I told you that “voting” henceforth would mean you would only be shown a ballot, period. No more selecting your preferred candidate.

Now suppose I told you that your consent to disseminating clinical information did not mean your granting permission, but only your acknowledgement that you saw my information policy — take or leave it. This may remind you of Humpty Dumpty’s scornful assertion, “When I use a word it means just what I choose it to mean — neither more nor less.”

Surprisingly, the insanity of “…use the term ‘Consent’ to mean the acknowledgement of a privacy policy, also known as an information access policy. In this context the privacy policy may include constraints and obligations.” comes from an IHE (Integrating the Healthcare Enterprise) policy paper, “IHE IT Infrastructure Supplement 2009,” that was taken up (at line 157) by the IT Standards Advisory Committee Privacy Workgroup on April 23, 2010.

The authors of this paper — the American College of Cardiology, the Healthcare Information and Management Systems Society, and the Radiological Society of North America — are not mean-spirited, uninformed, or confused. What could result in their clearly having tumbled into a conceptual rabbit hole?

Jim Kretz is project officer at the Substance Abuse and Mental Health Services Administration of the Department of Health and Human Services. His comments should not be construed to reflect the official position of SAMHSA.

Massachusetts HIT Conference Thoughts
By Bill O’ Toole

I had the pleasure of attending a national conference hosted by Massachusetts Governor Deval Patrick in Boston last week. The conference was billed as “Health Information Technology: Creating Jobs, Reducing Costs and Improving Quality.”

Keynote addresses were provided by David Blumenthal, MD, National Coordinator for HIT and Vice Admiral Regina M. Benjamin, MD, MBA, Surgeon General of the United States. Health IT Policies and Standards were addressed by a panel that included John Halamka, MD (CIO, CareGroup and Harvard Medical School), Marc Overhage, MD (CEO, Indiana Health Information Exchange), Paul Tang, MD (CMIO, Palo Alto Medical Foundation), Micky Tripathi, Ph.D (CEO Massachusetts eHealth Collaborative) with Tim O’Reilly (President, O’Reilly Communications) moderating.  

Another panel discussion on Health IT, Business Opportunities and Job Creation featured leading Massachusetts vendor executives Girish Kumar Navani (eClinicalWorks), Howard Messing (Meditech), Richard Reese (Iron Mountain), Bradley J. Waugh (NaviNet) moderated by Chris Gabrieli (Bessemer Venture Partners).

I could go on and on, but the list would be too long. I mentioned those above to give readers a sense of magnitude and to perhaps share in this small article the profound comfort I felt that "we" are doing this right. Many other highly qualified participants shared their knowledge on all things HIT- and ARRA-related.

What impressed me most was the overwhelming sense of momentum. The stimulus package and its future incentives have so far done exactly what was intended, serving as the spark that has set this massive project in motion. Remaining at the forefront of it all, though, is the goal of better medical care for all. That theme was never lost and was frequently repeated.

As one who until now has found certain parts of most conferences to be extraneous (ok, boring), I felt obliged to inform the far-flung readership of HIStalk that I was extremely impressed with every minute of this two-day conference. If the energy, knowledge, and sincere interest and enthusiasm expressed by those involved in this conference are carried forward to the project at large, then we are truly in for a remarkable change in our industry.

Congratulations to the Massachusetts Technology Collaborative and its Massachusetts eHealth Institute, the Massachusetts Health Data Consortium, and Governor Patrick for organizing this special event. It should serve as the model and be repeated whenever possible throughout the country.

William O’Toole is the founder of O’Toole Law Group of Duxbury, MA.

Readers Write 4/22/10

April 22, 2010 Readers Write 2 Comments

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

License Rights in Your Software License Agreements
By Robert Doe, JD

Each software license agreement contains a provision which grants specific use rights with regard to the software you are licensing. However, software manufacturers’ standard contract documents may not take into account your organization’s specific use requirements.

As a result, unless your organization has a relatively simple legal structure, you should pay particular attention to this language to ensure the software can be used as you intend it to be used. The extra effort is well worth the time when you consider that without the proper license grant, you may be asked to pay additional, unanticipated fees down the road.

If you don’t alter the standard contract language, typically, the license grant is given only to the legal entity signing the contract. For example, a typical software vendor’s license grant provision might read as follows: “Licensor grants Customer a perpetual, nontransferable, nonexclusive license for the number of concurrent users set forth in Exhibit A to use the computer program listed in Exhibit A (the "Software") at the installation site set forth in Exhibit A for Customer’s internal business purposes.”

In this example, the license grant is given to “Customer,” which is typically defined as the legal entity signing the agreement, which may not encompass all the actual individuals that will use the software. Getting the license rights correct in your contract requires that you know how your organization is structured and who the individuals are that you want to be able to access and use the software, both at the current time and in the future.

If your organization has a parent corporation, or has one or more legal entities that are owned or controlled by your organization or are under common control with your organization, the typical vendor license grant provision will technically not allow any use by the employees of these “affiliate” organizations.

Another example of a situation that is not technically covered in most license agreements is use by contracted providers that are not employees of your organization. In addition, some organizations may have other independent contractors that will need access to the software at various times, such as computer consultants.

With more and more frequency, healthcare organizations are licensing software not only for their own use, but to use on behalf of other smaller healthcare organizations in the community. Similarly, some healthcare organizations are considering re-licensing their systems to smaller organizations at a reduced rate.

In the example license grant language above, use of the software is limited to the “internal business purposes of the Customer.” If the software is to be used, in part, for the benefit of an affiliated or unrelated organization, or re-licensed to such an organization, the license grant will need to be significantly modified to allow for such actions.

When licensing software, it may be worth the extra time to put some thought into how you intend to use the software, both internally within your organization and, if applicable, externally. As part of your analysis, you will need to understand the legal structure of your organization. This information will help you to make sure you have the appropriate license grant in your software license agreements to allow for the use rights you require.

Bob Doe is a founding member of BSSD, an information technology law firm located in Minneapolis, MN.

Readers Write 04/05/10

April 5, 2010 Readers Write 14 Comments

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

Has Meaningful Use Already Lost All Meaning?
By Cynthia Porter


The release earlier this year of expanded meaningful use requirements has gotten the healthcare IT community in quite a tizzy. The phrase was on everyone’s lips before, during, and after HIMSS. It was obvious to me that:

  1. Everyone has a strong opinion about it;
  2. Not everyone understands it; and
  3. The recent passing of healthcare reform has left providers extremely anxious about how they and the vendors they do business with will comply with “it”, depending on what “it” ultimately turns out to be.

I know for a fact that hospital executives’ concerns about their institutions’ ability to meet the requirements and the overly aggressive timetables in the expanded meaningful use rules have increased dramatically since 2009, when the HITECH Act was initially released.

Nearly 80 percent of 150 hospital executives recently surveyed by Porter Research noted an increased rate of adoption for e-prescribing, patient portals, and EHRs. That’s a 20 percent increase from 2009, before the expanded requirements were published. So it’s safe to say that providers are jumping on the bandwagon.

Most, however, are worried that the wheels are going to fall off because vendors won’t have enough qualified employees and/or up-to-date resources to meet demand and requirements. One CIO we interviewed believes vendors “will be forced to spend more programming hours around the interoperability and security of their software versus the primary function, which is taking care of patients and making it easier for clinicians to utilize.”

And there’s the rub. Sure, the healthcare IT community will probably benefit from the political machinations going on in Washington, but will the patients? Will vendors rush to provide hospitals with technologies that could have used a few more months of development and trial? Will hospital staff have time to adequately train their IT people to use these new technologies? Will patients pay the price for a rush job?

It’s unfortunate that time will tell, because time is one thing patients don’t have.

Cynthia Porter is president of Porter Research.


Health Reform, Schmealth Reform – Freakin’ Pay Me
By Gregg Alexander

Down here in the primary care trenches, where the pudding meets the pavement (or some such mixed analogy), no matter how much we may want it to, health reform doesn’t seem like it will ever really get to addressing our needs.

What do I mean by that? Simple enough: it is getting virtually impossible to justify staying in traditional primary care any more and, health reform or no, HITECH or no, Congress just walked away and forgot about me and mine.

Despite our efforts to bring the best we can to those we serve, what do we get? People, whether private insurers or Medicaid, self-pay or no-pay, hospitals, or even IT folks, are all telling us what we can and can’t do. We’re told when we are and aren’t allowed to make medical recommendations based upon our knowledge and experience, and then we’re told just how much we’re allowed to charge for our expertise. (Disregard whether or not we’ll even get paid anything at all for our time and trouble.)

We fight to get what we believe is appropriate care for our patients, regardless of their insurance or lack thereof. We struggle to make ends meet so that we can offer the advantages of a quality medical home and, perhaps, digital healthcare information management to our patients. We work far too many hours, away from our family and friends, just so we can feel good enough to sleep at night knowing we have done our best to help those who come to us for care.

And then…and then…Congress goes on break before postponing a 21.3% cut in Medicare payments. (Thank you, Senator Tom Coburn, R-OK.) Whether or not they repeal it when they return, CMS will likely withhold payments for at least 10 days before beginning to process those 21.3% reduced payments. For those affected, continuing this Sustainable Growth Rate (SGR) formula is anything but sustainable and quite the opposite of growth.

Ladies and gentlemen, if you’re not already aware, we have a shortage of primary care providers in America. Pushing us toward expensive technology adoption, which may or may not truly be ready to meet OUR needs, while reducing the bottom-of-the-barrel payments with which we already struggle is not going to solve any piece of our giant healthcare crisis. It will make it much worse as more and more of us leave for less stressful professional lives with fewer things beyond our control. All the while, there will be little encouragement for the med school up-and-comers, who will be unlikely to choose to join the ranks of careworn primary care.

Let us worry about dealing with the pressures of making medical decisions and allow us a reasonable income which doesn’t add to the strain. Elsewise…well…how long would you stick around after a 21% pay cut?

From the (weary) trenches…

“Pay me for my work, but I don’t do it for the money.” – Vanna Bonta

Dr. Gregg Alexander, a grunt in the trenches pediatrician, directs the “Pediatric Office of the Future” exhibit for the American Academy of Pediatrics and is a member of the Professional Advisory Council for ModernMedicine.com. More of his blather…er, writings…can be found at his blog, practice web site or directly from doc@madisonpediatric.com.

This Just In: HIRE Bill Signed! Could Hiring Tax Breaks Benefit Your Organization?
By Tiffany Crenshaw


On March 18, 2010, President Obama signed into law the Hiring Incentives to Restore Employment (HIRE) Act. HIRE is a $17.5 billion jobs bill that the President says will bolster hiring and give business owners an incentive to add staff, creating approximately 250,000 new jobs.

The bill was dramatically scaled back as it passed through the House and Senate, from $150 billion to less than $20 billion. Still, lawmakers say it is the first step in a series of bills designed to encourage job growth.

The Act offers two tax breaks to companies that hire recently unemployed workers: a payroll tax exemption and a tax credit. Beginning March 19 and through the remainder of 2010, employers will not have to pay their 6.2% Social Security payroll tax on qualifying new hires. In addition, companies are entitled to a credit equal to 6.2% of an eligible employee’s total salary (up to $1,000) if that new hire is retained for at least 52 consecutive weeks.

To qualify for the tax breaks, new employees must be hired between February 3, 2010, and January 1, 2011. Each new hire must verify in writing that he or she was unemployed for a minimum of 60 consecutive days just prior to being hired. If the worker is replacing an employee in the same job role, he or she is not eligible, unless the previous employee was terminated for cause or voluntarily quit.
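Purely as a back-of-the-envelope illustration of how the two breaks described above stack up for one qualifying hire (the salary, wage figures, and simplifications are assumptions, not tax guidance):

    # Rough illustration of the two HIRE Act breaks as described above.
    # The salary, qualifying-wage figure, and lack of proration are assumptions.
    EMPLOYER_SS_RATE = 0.062

    def payroll_tax_exemption(qualifying_wages: float) -> float:
        """Employer's 6.2% Social Security share waived on wages paid 3/19-12/31/2010."""
        return EMPLOYER_SS_RATE * qualifying_wages

    def retention_credit(annual_salary: float, weeks_retained: int) -> float:
        """Credit of 6.2% of salary, capped at $1,000, if retained 52 consecutive weeks."""
        if weeks_retained < 52:
            return 0.0
        return min(1000.0, EMPLOYER_SS_RATE * annual_salary)

    # Hypothetical hire: $60,000 per year, $45,000 of wages paid in the
    # qualifying window, retained the full 52 weeks.
    print(payroll_tax_exemption(45000.0))  # 2790.0
    print(retention_credit(60000.0, 52))   # 1000.0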

These incentives won’t help every company, but HIRE is a start that could benefit your organization as well as the nation’s unemployment rate. Companies still experiencing depressed revenues due to the economic slowdown may see little benefit from the tax incentives, but others may find the tax savings a valuable boost toward savings and growth.

Tiffany Crenshaw is CEO of Intellect Resources.

Readers Write 3/24/2010

March 24, 2010 Readers Write 9 Comments

Submit your article of up to 500 words in length, subject to editing for clarity and brevity (please note: I run only original articles that have not appeared on any Web site or in any publication and I can’t use anything that looks like a commercial pitch). I’ll use a phony name for you unless you tell me otherwise. Thanks for sharing!

Digital Information is Great, but Only if it’s Accurate
By Deborah Kohn

I am a patient at two local healthcare provider organizations that use the Epic suite of clinical information system modules for their base EHR. Both organizations must not yet have installed Epic’s CareEverywhere, because the two Epic systems currently do not talk to one another (or even look or act like one another). In time, though, CareEverywhere should be installed at both.

However, the reason I write this article is that either there is a flaw in Epic’s MyChart, the organizations do not know how to configure MyChart correctly, or there remains an important Epic user training issue. When I visit my providers at both organizations, I receive a hardcopy summary of my visit, which I assume is generated by MyChart because I can also view the same data online via MyChart. Among the many items listed on the summary are Current/Future/Recurring Orders.

1) Orders listed on the summary and in the system cannot be corrected easily by an organization user, even the provider. I don’t know whether this is a user training issue (e.g., how to easily DC or cancel electronic orders that have been performed but, for some reason, not automatically canceled as Future Orders), a system flaw, or a poor implementation of the function. But for one set of lab orders, I was repeatedly asked to have lab work performed when it had been performed months earlier, and I had the documentation to prove it. Unfortunately, it took several handwritten notes and phone calls from me to the provider to finally update and delete the already performed lab orders from the system.

2) If orders listed on the patient’s hardcopy visit summary are incorrect (e.g., numbers of milligrams, duplicate orders, q 4 months not q 2 months, etc.), again these orders cannot be easily corrected by an organization user. That’s because, according to the organization’s users, these orders come from a different “database” than the “real” orders, which are correct in the system, but don’t print to the hardcopy correctly!

3) Either the Epic clinical system does not include the following clinical decision support function, or the provider organizations have yet to install it or learn how to: recently, when my provider at one organization ordered a routine TB test, there was nothing in the system to alert the provider that the same routine TB test had been performed at this organization in July 2009. Consequently, the test was repeated in February 2010 at a cost of $398. When I complained, the provider organization commented that it is the provider’s responsibility to look back at all the orders in the system to see whether a TB test had been performed within the last several years. I don’t blame the provider for not wanting to scroll through several years of past orders to determine this. And I was sorry I didn’t have my “paper” PHR, which I have kept for at least 30 years, with me at the time to double-check.
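The kind of alert being asked for here is not exotic. A minimal sketch of a duplicate-order check (with a hypothetical order history, a made-up 12-month lookback, and nothing Epic-specific) might look like this:

    # Sketch only: the order history shape and 365-day lookback are assumptions.
    from datetime import date

    order_history = [
        {"code": "TB-PPD", "description": "TB skin test", "performed": date(2009, 7, 14)},
        {"code": "CBC", "description": "Complete blood count", "performed": date(2010, 1, 5)},
    ]

    def duplicate_alert(new_code: str, order_date: date, history, lookback_days: int = 365):
        """Return the prior order if the same test was performed within the lookback window."""
        for prior in history:
            if prior["code"] == new_code and (order_date - prior["performed"]).days <= lookback_days:
                return prior
        return None

    prior = duplicate_alert("TB-PPD", date(2010, 2, 10), order_history)
    if prior:
        print(f"Alert: {prior['description']} already performed on {prior['performed']}")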

Now that electronic PHRs and visit summaries are appearing and patients are beginning to “use” (indirectly) organizational EHRs, it won’t be just the organization’s internal users complaining about system flaws, poor configurations, or outstanding training issues. External users, the patients and recipients of health information exchanges, will be added to the list. Consequently, it’s time our industry’s professionals addressed the management of the information, not just the technical and operational mechanisms for sending and receiving it. It’s great to receive digital PHRs and visit summaries from provider organizations, but only when the information is accurate. Just ask ePatient Dave!

Deborah Kohn is a HIM professional and power user of EHR systems who not only makes sure her analog and digital health record information is correct, but remains dumbfounded that she need not do the same with her bank record information.


We Are In the Business of Letting Clinicians Treat Patients
By Jef Williams


While riding the shuttle to my hotel at HIMSS in Atlanta, I overheard two strangers behind me comparing stories of the conference to one another. Their short exchange encapsulated for me both the HIMSS event and the climate in which we are now living. The conversation went something like this:

Woman: “I attended a session today conducted by an IT expert. You won’t believe what I heard.”

Man: “Really?”

Woman: “Oh yes. The presenter was talking about successful EMR and IT implementations and actually said, ‘The physicians are the ones who have received the education. They are the ones who treat patients. So they must be the focus of our implementation.’”

Man: “You’re kidding.”

Woman: “No! I was so offended I nearly walked out.”

Man: “That’s ridiculous.”

Whether one agrees with the federal stimulus package and the push toward EHRs, the fact remains that it has created a significant impact on the business of healthcare IT. Clinicians, administration, and IT each play an important role in running the healthcare organization. Administration and IT serve, however, in support roles to the mission of providing an environment that allows clinicians to do what they do best: treat patients.

Over the past decade, the role of IT has grown significantly as healthcare has played catch-up to most other industries in moving away from paper and manual systems toward electronic and automated systems. This shift has had its share of challenges, and most organizations can list a number of tragic stories of failed or messy implementations. Difficult workflow, poor user adoption, and meaningless data are all symptoms of letting IT professionals make critical decisions about system procurement, design, and implementation without clinical input.

It appears we have not learned our lesson. By introducing federally subsidized funding and reimbursement into the business model of clinical information systems, the federal government has shifted the focus to management and IT, leaving clinicians in the trailing position. The idea that caregivers come last could not be more backward from the true value proposition of healthcare. This industry is, and will remain, primarily about providing healthcare. No matter how advanced EHRs, widgets, and handheld devices become, patients will continue to measure satisfaction by whether a doctor knows what she’s doing, has the right tools to treat, and whether they ultimately are healthy.

So to that presenter at HIMSS, I am not offended. It seems in this climate we have forgotten that we are in the business of letting clinicians treat patients. No EHR, HIS, PACS, eMAR, or any other system can provide better patient care without a doctor reaching out a stethoscope and asking her patient to breathe deeply. We in administration and IT get to play a valuable role in providing the tools and support to help our physicians provide better patient care. But we are just that — support.

Let’s not let the promise of a few dollars and the lure of a few vendor-hosted parties blind us to that fact.

Jef Williams is vice president of Ascendian Healthcare Consulting of Sacramento, CA.
