Well that's a bad look as the Senators contemplate filling in the House gaps in the VA Bill
HITlaw 8/24/09
Non-Disclosure Agreements
I am weighing in on the recent flurry of activity on HIStalk regarding non-disclosure clauses in software agreements that preclude a customer from discussing or revealing problems with a vendor’s software.
Any worthwhile attorney reviewing agreements for a provider client should flag such an inclusion and require its deletion. Something like that should scream for attention to the savvy IT person, be it the CIO, the consultant, or the attorney.
Executives — when negotiating a contract, really think through the obligations. Where a clause requires education of your entire staff (such as telling them that they cannot disclose a serious software problem), just imagine giving that talk to your chief medical officer. If you find yourself not being able to defend or justify the offending term, you know what to do — get rid of it.
I cannot think of a more self-serving “requirement” in the paperwork that establishes the vendor-client relationship (some would say partnership). Imagine a high profile hospital negotiating with any vendor. The vendor is salivating, not just for the potential sale, but for the huge publicity it hopes to gain at some point by announcing that the high-profile hospital is running its software.
Certainly that vendor does not offer to keep secret the fact that the hospital runs its software in exchange for the hospital keeping errors or defects quiet. I personally find this offensive. I am not speaking for Meditech, but in speaking for myself, in the 20 years I spent negotiating tens of thousands of agreements for Meditech, I never once included such language in any agreement with any customer.
Imagine an ER physician who comes across a dangerous software malfunction. That physician may moonlight across town at another ER. Suppose that hospital has the same software vendor. Assuming the physician knows about the disclosure restriction (which is unlikely), you have placed the physician in a horrible situation. Should he or she abide by a software contract's egregious terms and risk the health and safety of patients? Or do what is right (and, I would say, required under the Hippocratic oath) and let the staff at the second hospital know about the software malfunction? In the more likely scenario, if the physician has no idea the restriction exists and divulges the existence of the problem, then the hospital is in breach of its agreement with the vendor.
Also consider the CIO, who you hopefully want collaborating with other CIOs on all things HIT related. You’re putting pressure on them as they sit at a table with other CIOs with the same software system, knowing this problem exists, but not being able (contractually) to divulge the information.
For a little perspective, let's remember that the errors or malfunctions we are most concerned about are the ones directly involving patient care. A misaligned billing form does not rise to the same level of concern as a bad dose amount. However, I am sure the non-disclosure terms do not differentiate: they do not permit disclosure of severe problems while restricting only minor ones. That makes no sense, which tends to reinforce the assumption that the vendors using such restrictions wish to keep critical issues from the public because they fear the negative exposure that may result.
I say boo hoo. The vendor selected the market and designed the software. The vendor takes the profits. The vendor should stand behind its products, bad or good. The profit/loss reports do not differentiate. Neither should disclosures about software performance.
Just as a vendor should be proud of a good endorsement by any customer, so should the vendor permit free disclosure of serious problems. Not in a headline-grabbing, gossipy manner, but in a manner befitting this industry for the care of patients and avoidance of harm to those patients.
Providers should dust off their agreements and check to see if any such language is included. If so, call the vendor and demand an amendment deleting the provision. Better yet, vendors should be able to identify customers with such terms and do the right thing — provide the amendment without being asked.
William O’Toole is the founder of O’Toole Law Group of Duxbury, MA.
Epic has had these clauses in their agreements for years, and several smaller vendors as well. Epic dwells on the pitch that "this is a partnership", "we are partners", etc. But ask your partner to remove the clause and guess what… you may not be good enough to be an Epic user! They'll walk from the deal.
The other clause that really gets me is the one that says no one from outside the hospital can look at Epic screens or review the system without their permission – which is rarely, if ever, given!
To my, and many others', amazement, hospitals keep signing them!
Why? Because if the hospital across town has signed it, it can't be all bad.
Mr. O’Toole,
Thank you for the legal interpretation.
I agree and see things from an additional perspective: that of hospital governance responsibilities under the Joint Commission's safety standards, as well as hospitals' fiduciary responsibility to protect those who set foot in their facilities, whether as patients, workers, or contractors, from physical or legal jeopardy.
Such were the comments in my JAMA letter of 7/22/09, as reinforced by Koppel and Kreda's response in the same issue. See http://jama.ama-assn.org/cgi/content/extract/302/4/382 (subscription required; JAMA does not permit posting content publicly).
I have a more comprehensive post on this issue at this link: http://hcrenewal.blogspot.com/2009/03/health-it-hold-harmless-and-defects-gag.html
Scot Silverstein, MD
Drexel University
College of Information Science and Technology.
This is exactly the point that Koppel and Kreda made in their JAMA article. Pleased to see an attorney weigh in on this.
Agree. 110%
This is well stated. The report provides ample reason for there to be careful scientific evaluation of the safety and efficacy of these systems.
Since it is a foregone conclusion that the dangers have been obscured from the usual venues of publication, it would appear to be an appropriate time for the FDA to weigh in.
As things stand at present, hospital administrators and their paid “champions” blame any and all adverse events on user error. In other words, the learned intermediary is punished. Review the case of the pharmacist in Cleveland (published at HIStalk) who is in jail because of a mistake when the computers were broken on a Sunday morning.
To assist in this vetting process, all interested users who believe the systems have been associated with adverse events affecting their patients should consider it their responsibility to report such events (anonymously, if desired) to the FDA at FDA.gov
Our patients’ safety depends on this information being reported.
“As things stand at present, hospital administrators and their paid “champions” blame any and all adverse events on user error. In other words, the learned intermediary is punished. Review the case of the pharmacist in Cleveland (published at HIStalk) who is in jail because of a mistake when the computers were broken on a Sunday morning.”
This is completely absurd. This wasn't a case of computer error being blamed on a user. The computer had nothing to do with this error. The pharmacist screwed up.
Sounds like we need legislation to make clauses that preclude a customer from discussing or revealing problems with a vendor's software legally unenforceable. It's a patient safety issue.
Vendors will not remove it without a big incentive. It sounds like Epic isn't incentivized enough by a customer's only power: not buying their product.
What’s left to solve this problem?
To Programmer,
Do you mean to say that the computer outage had zero impact on delays of therapy, hurriedness, and cognitive disruption that when combined are more than a fertile field for catastrophic error?
From an ethical standpoint, if a vendor insists on such contract language, then it's up to the provider to walk away from the vendor, even if it means that you don't get to jump on the $100+ million bandwagon. The only way I can see vendors truly taking responsibility and fixing potentially dangerous software bugs is to allow providers upfront and full disclosure of such issues. Only then would a vendor feel a bit more embarrassment and actually agree to fix issues quickly rather than make promises about an upcoming release with no date set.
Please, please, will a vendor representative or a provider who actually allowed the language to stay in the contract defend such terms? I would like to understand why, even if it's simply CYA.
“Do you mean to say that the computer outage had zero impact on delays of therapy, hurriedness, and cognitive disruption that when combined are more than a fertile field for catastrophic error?”
No, I’m saying that the computer outage didn’t put the wrong dose of sodium chloride in the patient’s chemotherapy bag, nor did it direct the pharmacist to add the wrong dose. If the hospital didn’t have a backup plan for when the computer went down, that also is not the fault of the computer. The computer also was not responsible for the fact that the technician who mixed the solution was busy planning her wedding.
Modern Healthcare has printed the “100 Most Powerful People in Healthcare”. Kathleen Sebelius is #2, David Blumenthal is #6, Mark Leavitt is #58, and H. Stephen Lieber has fallen to #92, but he still made the list.
https://home.modernhealthcare.com/clickshare/authenticateUserSubscription.do?CSProduct=modernhealthcare&CSAuthReq=1:273364458336595:AID|IDAID=20090824/REG/908219983|ID=:63C4330F8A8EADCE42587F23485CD05B&AID=20090824/REG/908219983&title=100%20Most%20Powerful%20at%20a%20glance&ID=&CSTargetURL=http%3A%2F%2Fwww.modernhealthcare.com%2Fapps%2Fpbcs.dll%2Flogin%3FAssignSessionID%3D273364458336595%26AID%3D20090824%2FREG%2F908219983
Would any hard data be available as to just how many of the suppliers of hospital software systems, say order and result communication systems in particular, actually include contract language within their non-disclosure provisions that prohibits the reporting (to other than the supplier) of defects which may impact care? Non-disclosure provisions also typically have exceptions for disclosure required by regulation or law, true?
Vendors also want to GREATLY undervalue their "Limits of Liability" responsibility, limiting it to the acquisition cost of the software, when we all know that a patient safety error could result in many times that amount in a lawsuit. Don't get me started…
Hah – as I recall Larry, Howard and Neil were referred to as the “sales prevention team” as they would rarely agree to modify MEDITECH’s contracts.
If you didn’t like MEDITECH, fine. Go pay 2 to 10 times as much to someone else.
Programmer says:
“No, I’m saying that the computer outage didn’t put the wrong dose of sodium chloride in the patient’s chemotherapy bag, nor did it direct the pharmacist to add the wrong dose. If the hospital didn’t have a backup plan for when the computer went down, that also is not the fault of the computer.”
Computers are not at issue here … disruptions of clinicians and other medical professionals — who work in poorly bounded, complex, unpredictable sociotechnical environments — by faulty or unreliable *information systems* are the issue.
Perhaps we need to stop thinking about computers, and start thinking about information systems. IMO, CIOs are really chief computing officers (unless they have expertise in true blue information science). The term "Chief Information Officer" has led to no end of confusion amongst technological laypeople about the nature and relationships of data, information, and knowledge to the tools that facilitate their capture, analysis, and diffusion (e.g., paper systems, networks, and computers).
A root cause analysis of the failure is in order, but one thing is certain: any HIT outage distracts clinicians, even when resorting to paper backups. Clinicians and other HC professionals, including pharmacists, are already massively overburdened. They need LESS distraction, not more. HIT needs to be considered mission critical, and have a very high uptime.
Yes, the hospital should have had a backup plan for IT outages, the championing of such fail-safes being under the aegis of (certainly) the CIO, other senior governance, and senior medical officers.
Mismanagement knows no professional bounds.
It is appalling that patients are subjected to this degree of deception. I am in agreement with HITman. Take heed of Silverstein's comments. I have submitted a sentinel event to the FDA confidential link. Its user friendliness exceeds that of the HIT apps I use at the hospital.
“A root cause analysis of the failure is in order, but one thing is certain: any HIT outage distracts clinicians, even when resorting to paper backups. Clinicians and other HC professionals, including pharmacists, are already massively overburdened. They need LESS distraction, not more. HIT needs to be considered mission critical, and have a very high uptime.”
The technician who mixed the solution was distracted by her upcoming wedding. From what I have read, that was more of a factor than the computer system. Does this mean we forbid hospital personnel from getting married? The problem here was not the information system; the problem is that the staff were distracted for various reasons, and they responded very poorly.
Ultimately, the people who provide the care are responsible. If the information system you’re using is a safety hazard, it’s up to you to stop using it until you can depend on it. If it isn’t reliable, stop using it, or be prepared for when it isn’t available. If you find yourself with a vendor that is not legally obligated to fix an unreliable, or dangerous, system, you should fire the person who selected the vendor and signed the contract.
Programmer,
You display the typical ill-informed arrogance of IT personnel. You provide half-baked opinions in the absence of knowledge of a large corpus of research at the intersection of IT and social science.
Your “ultimately, the people who provide the care are responsible” statement seems characteristic of an inability, as I wrote years ago, to be team players – i.e., part of the clinical team. IT personnel seem to want to “revolutionize healthcare” but seem unwilling to include their efforts as part of a team effort. In a team, all take responsibility not just for the gains, but for any of the downsides that result from faults. These attitudes fuel the “hold harmless” and “defects nondisclosure” demands by vendors.
There is also research on the IT culture itself, for example:
“Defensive climate in the computer science classroom”: ( http://portal.acm.org/citation.cfm?id=563354&coll=portal&dl=ACM )
As part of an NSF-funded IT Workforce grant, the authors conducted ethnographic research to provide deep understanding of the learning environment of computer science classrooms. Categories emerging from data analysis included 1) impersonal environment and guarded behavior; and 2) the creation and maintenance of informal hierarchy resulting in competitive behaviors. These communication patterns lead to a defensive climate, characterized by competitiveness rather than cooperation, judgments about others, superiority, and neutrality rather than empathy.
Learn about the above issues, or shut up.
I also find the attitudes of "physician expected helplessness" in IT quarters and beyond interesting. See http://hcrenewal.blogspot.com/2009/02/physician-expected-helplessness.html .
Programmer,
Are you saying that the error would have occurred had the computers not gone down?
Employees are on the net all of the time. The wedding planning caper is a good excuse for the Case admin, who bought the equipment, to blame the users. Blaming the users is the admin fallback and has a symbiotic relationship with the do not disclose gig.
"Your 'ultimately, the people who provide the care are responsible' statement seems characteristic of an inability, as I wrote years ago, to be team players"
I think the company I work for is an excellent team player. We constantly consult with our customers on the design of our software. But that does not change the fact that we are just providing you with a tool. If that tool is a threat to patients, you shouldn’t use it.
"Learn about the above issues, or shut up."
I have no idea where this anger is coming from. I wasn’t on the jury that convicted that pharmacist. Maybe you should go scream at them.
"Are you saying that the error would have occurred had the computers not gone down?"
That appears to be what the jury decided. Do you have any evidence that it would not have occurred if the computers were running?
"As part of an NSF-funded IT Workforce grant, the authors conducted ethnographic research to provide deep understanding of the learning environment of computer science classrooms. Categories emerging from data analysis included 1) impersonal environment and guarded behavior; and 2) the creation and maintenance of informal hierarchy resulting in competitive behaviors. These communication patterns lead to a defensive climate, characterized by competitiveness rather than cooperation, judgments about others, superiority, and neutrality rather than empathy."
By the way, I had to chuckle when I read this. I went to one of the premier pre-med universities in the US. We used to call the more competitive students "throats". I don't ever recall it being used to describe a computer nerd. It was used all the time to describe pre-meds. It's short for "cut-throat".
Programmer,
Which vendor supplied the system that failed? Do you work there? The jury, that never experienced the horrors of computer failure in a critical health setting, was uninformed about HIT hazards (they are not disclosed, as O'Toole writes) and probably, the pharm's lawyer did not understand the dangers either.
Do you believe that the admin would have deinstalled the defective computer system had the pharmacist told them it was prone to breakdowns?
Programmer,
One generally needs to offer substantive information to support their positions. I cannot identify where you have refuted the arguments of a number of posters here in any credible manner. All you seem to offer are claims of an IT privilege in the form of a shield of immunity when bad healthcare IT contributes to adverse clinical events.
Your claims are not only without merit, some are also patently outrageous, e.g., "we are just providing you with a tool. If that tool is a threat to patients, you shouldn't use it" when healthcare organizations and now the Federal Government itself are strongly pushing for mandatory HIT use, imposing actual financial penalties on nonadopters.
As I also have written in the past, if IT personnel are unwilling to share the risks of involvement in biomedicine, if they cannot take the heat, they best leave the kitchen.
You have already demonstrated you cannot take heat, hiding behind the shadow of anonymity.
"Which vendor supplied the system that failed? Do you work there?"
I have no idea. I didn’t provide the example, all I did was do a Google search to find out what HITman was talking about.
"The jury, that never experienced the horrors of computer failure in a critical health setting, was uninformed about HIT hazards (they are not disclosed, as O'Toole writes) and probably, the pharm's lawyer did not understand the dangers either."
The horror of computer failure is that you go back to using paper. That is something you should be prepared to do if you work in a hospital. I suspect the lawyer and the jury all knew that the hospital should have had a backup plan.
Or is everyone on the planet dumber than you doctors?
"Do you believe that the admin would have deinstalled the defective computer system had the pharmacist told them it was prone to breakdowns?"
I don't know what the admin would have done. All I know is that the computer system was down that morning. Computers break sometimes. That does not mean the system was unreliable. If the computer was prone to breakdown and a threat to patient safety, and the pharmacist told them as much, then the admin should have shut it down. I'm not a lawyer, but I'll bet the pharmacist would NOT have been convicted if he had previously complained about an unreliable computer system.
"Your claims are not only without merit, some are also patently outrageous, e.g., 'we are just providing you with a tool. If that tool is a threat to patients, you shouldn't use it' when healthcare organizations and now the Federal Government itself are strongly pushing for mandatory HIT use, imposing actual financial penalties on nonadopters."
Speaking of claims that are completely without merit, I’m pretty sure the government is not forcing you to use software that is a threat to your patients. If you use software that you feel is a threat, and you don’t complain to the hospital admin, you are a fool.
"As I also have written in the past, if IT personnel are unwilling to share the risks of involvement in biomedicine, if they cannot take the heat, they best leave the kitchen."
I must have missed the part where hospital software companies were made immune to lawsuits.
"One generally needs to offer substantive information to support their positions. I cannot identify where you have refuted the arguments of a number of posters here in any credible manner. All you seem to offer are claims of an IT privilege in the form of a shield of immunity when bad healthcare IT contributes to adverse clinical events."
If you can offer a credible example of bad healthcare IT, I will probably have trouble refuting it. So far you haven’t provided any such examples.
"the pharm's lawyer did not understand the dangers either"
Hiring an incompetent lawyer is also not the fault of the computer system.
"I must have missed the part where hospital software companies were made immune to lawsuits"
I suggest that the Programmer writer of that statement read the Koppel and Kreda commentary and become learned in "hold harmless" agreements.
The US is wasting billions on systems of HIT that are unreliable and user unfriendly. The extensive network of HIT lobbyists have sold the policy wonks and lawmakers a pig in a poke.
And now, I will echo earlier comments. The learned intermediary will always be screwed by the hospital and its champions of IT. To prevent career-ending mistakes that have bad IT as their root cause, pre-empt this risk by reporting all HIT-related incidents to the FDA. Start now.
"I suggest that the Programmer writer of that statement read the Koppel and Kreda commentary and become learned in 'hold harmless' agreements."
Does a “hold harmless” agreement prevent criminal prosecution? That seems very unlikely to me.
It was the best of software…
"that never experienced the horrors of computer failure in a critical health setting"
It was the worst of software…
"The US is wasting billions on systems of HIT that are unreliable and user unfriendly."
How does anyone survive a visit to the hospital if you can’t trust the software when it’s working and it’s a horror when it isn’t working?
Having nothing more to add to that which has been written, I applaud Mr. HIStalk for allowing candid and direct comments on this challenging topic of vendor hold harmless and defects nondisclosure agreements.
While there may be disagreements about this issue, I think the moral high ground of holding patient safety as priority #1, and holding accountable those who contribute to medical error through preventable negligence (whether medical or IT) will prevail.
Case in point, I just got back from NY where a friend had a thyroidectomy at a major NY hospital for possible thyroid cancer.
The pre-op instructions, neatly printed by computer, were entitled “Patient instructions for parathyroidectomy.” She’d already bought calcium supplements based on the instructions given – which happened to be for the wrong surgery (for the nonmedical, the parathyroids and thyroid are different glands with different functions. Thyroid regulates body metabolism; parathyroids regulate serum calcium, abnormalities of which can be quite serious).
Fortunately, the surgeon was on the ball regarding this particular error.
However, this scenario could affect anyone reading this message who might not have a medical advocate or on-the-ball physician.
“If you can offer a credible example of bad healthcare IT, I will probably have trouble refuting it. So far you haven’t provided any such examples.”
try googling “healthcare IT failure”
"The pre-op instructions, neatly printed by computer, were entitled 'Patient instructions for parathyroidectomy.' She'd already bought calcium supplements based on the instructions given – which happened to be for the wrong surgery (for the nonmedical, the parathyroids and thyroid are different glands with different functions. Thyroid regulates body metabolism; parathyroids regulate serum calcium, abnormalities of which can be quite serious)."
And you know for certain the problem was the software and not a user who filed the wrong instructions on the patient file or associated the instructions with the wrong surgery?
"try googling 'healthcare IT failure'"
I’m a healthcare IT programmer. I’ve been fixing healthcare IT problems for the last 20 years. I don’t need to Google them.
Programmer said:
“The technician who mixed the solution was distracted by her upcoming wedding. From what I have read that was more of a factor than the computer system. Does this mean we forbid hospital personel from getting married? The problem here was not the information system, the problem is that the staff were distracted for various reasons, and they responded very poorly.”
This betrays a serious lack of understanding of the clinical environment, which is that there are always distractions. Always. And software needs to be designed to accommodate that.
Unlike HIT, PACS is regulated by the FDA. And the FDA is very clear on the matter of use errors. (Note: Not “user” errors, “use” errors.) Use errors indicate a problem with the software. The kinds of errors that users are likely to make are quite predictable. For an introduction, see James Reason’s book Human Error, as well as more recent books on the topic.
Consequently, software can and should be designed in such a way as to minimize the probability of such errors, to minimize or be able to ameliorate their effects when they occur, and to, if possible, increase their detectability. And that design must take into account the actual operating environment, not the ideal.
Software needs to enable accurate work even when the user is tired, or trying to manage three things at once, or is interrupted, or is operating in a noisy environment. It needs to provide smooth and effortless interaction so that clinical users can devote their cognitive resources to the clinical situation at hand rather than to the use of the computer.
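To make that design posture concrete, here is a minimal, hypothetical sketch (in Python, with a made-up drug name and reference range, not drawn from any vendor's actual system) of what minimizing and detecting use errors can look like at a single point of data entry: validate the value against a reference range, require explicit confirmation for anything out of range, and keep an audit trail so the event remains detectable afterward.

```python
# A minimal, hypothetical sketch of error-resistant data entry: range
# checking, explicit confirmation, and an audit trail. The drug name and
# reference range below are illustrative assumptions, not taken from any
# real vendor's pharmacy or CPOE system.

from datetime import datetime

# Assumed reference table: drug -> acceptable concentration range (percent).
REFERENCE_RANGES = {
    "sodium chloride": (0.45, 0.9),
}

audit_log = []  # keeps every check reviewable after the fact


def concentration_in_range(drug, concentration_pct):
    """Return True if the entered concentration falls inside the
    reference range; log each check so problems remain detectable."""
    low, high = REFERENCE_RANGES[drug]
    in_range = low <= concentration_pct <= high
    audit_log.append({
        "time": datetime.now().isoformat(),
        "drug": drug,
        "value": concentration_pct,
        "in_range": in_range,
    })
    return in_range


if __name__ == "__main__":
    # An obviously out-of-range entry is flagged rather than silently accepted;
    # the workflow would then require an explicit pharmacist override.
    if not concentration_in_range("sodium chloride", 23.4):
        print("Entered concentration is outside the reference range; "
              "explicit confirmation required before compounding.")
```

The specific check matters less than the posture it illustrates: the software assumes the user will sometimes be tired or interrupted, and it is built to catch the predictable slips rather than depend on an undistracted operator.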
Also, as much as it is advisable to have some sort of backup system in place when the IT system goes down, the fact is that any backup system is the equivalent of limping along on emergency power. It simply is not practical to maintain two complete systems in parallel when only one of them will be used 99.999% of the time. If an IT system adds value instead of merely replicating the paper system, it’s also not possible to maintain fully parallel systems. Parallel systems use is reserved for the go-live period during which any obvious kinks in the system are supposed to be worked out.
I think now that HIT has crossed over from the administrative & financial side to the clinical side, regulation is likely to be necessary. It’s difficult to argue that CPOE is not a medical device but that PACS is.
"This betrays a serious lack of understanding of the clinical environment, which is that there are always distractions. Always. And software needs to be designed to accommodate that."
This betrays a serious lack of understanding of the situation we were discussing. The computer system had nothing to do with the error. The only connection between the computer system and the error is that the system was down earlier in the day.
For those of you who feel the pharmacist should not have been convicted, that’s fine. I wasn’t on the jury. I had nothing to do with her conviction. Neither did the computer system. You need to go yell at the jury. I’m just concerned about the people trying to shift the blame from the pharmacist to the computer system.
The PACS guy hit the nail squarely on the head. CPOE is a device. As such, it carries higher risk to the patient if it fails, compared to a PACS.
The pharmacist was a victim of his employer and the computer products that were purchased and went down. If the devices were working correctly that Sunday morning, the catastrophic death would not have happened (unless such mistakes occur when the computer is functioning). Such mistakes do occur with CPOE and may be interesting to PACS Designer and other readers.
Agree that if your patients are affected adversely, let the FDA know.
"The pharmacist was a victim of his employer and the computer products that were purchased and went down. If the devices were working correctly that Sunday morning, the catastrophic death would not have happened (unless such mistakes occur when the computer is functioning)."
Repeating the same unsupported claims over and over does not make them any more true. If the device is working correctly and the user screws up, people are still going to die. Also, I'm pretty sure the computer was working when the user screwed up. The computer was down prior to the screwup, not during.
I did a quick Google search on this story.
"Two months ago, an exclusive NewsChannel5 investigation revealed how pharmacists can make repeated mistakes that do not have to be reported to the state board of pharmacy."
"After his mistake, Cropp was hired by another pharmacy and continued to make mistakes, Regan reported."
You might want to find a different spokesman for your next “It’s the Computer’s Fault” convention.
"I'm a healthcare IT programmer. I've been fixing healthcare IT problems for the last 20 years. I don't need to Google them."
Wow.