I wrote weekly editorials for a boutique industry newsletter for several years, eager for both audience and income. I learned a lot about coming up with ideas for the weekly grind, trying to be simultaneously opinionated and entertaining in a few hundred words, and not sleeping much because I was working all the time. They’re fun to read as a look back at what was important then (and often still important now).
I wrote this piece in August 2007.
Software: No, You May NOT Have It Your Way
By Mr. HIStalk
Healthcare IT is a lot more like McDonald’s than Burger King. When it comes to software, you may NOT have it your way.
Here’s a classic example. Well-intentioned clinical software sends alerts to doctors or pharmacists that a certain drug is eliminated by the kidney. Therefore, it nags incessantly, making sure you’re amply warned to reconsider the dose you’re ordering. Sometimes the warning is too stupid to even realize that the patient’s kidneys are normal – it just fires indiscriminately any time that drug is ordered.
That’s not terrible for inexperienced residents, but it’s darned annoying for specialists who spend their entire working lives dealing with patients and drugs like this. "I’m a nephrologist and this system thinks it knows more than me? Hah."
So, one might innocently ask, why must the computer treat all users as infuriatingly equal? Would it not make sense to allow those warnings to be selectively turned off for qualified users? Or, maybe clinical systems should work like some PC applications, where there’s a "basic" mode with limited available options and an "advanced" mode that opens up all kinds of cool but dangerous capabilities once you’ve proven your competence, kind of like a learner’s permit.
Much of the software in use today was built on the mainframe paradigm, i.e., users are stupid and programmers need to spank them if they dare make a mistake. Programmers usually have a deeply held contempt for non-geeky users. If the technology existed, they’d send 110 volts through the keyboards of imperfect users in some kind of Skinner-inspired operant conditioning experiment. I’ve been a programmer and we always dreamed about this ("So he enters a medical record number where it says visit number, and — ZAP! Read the screen next time, loser.")
PC and Web software has, in the meantime, progressed into this century. Amazon knows who you are, makes recommendations, lets you easily find your previous orders, and offers you deals on stuff it knows you’ll like. iGoogle lets you build your own home page and use your choice of hundreds of widgets that can do everything from showing your inbox to displaying the latest headlines from The Onion. When you want to do stuff in Windows, you can choose a friendly, step-by-step wizard instead of a forbidding screen full of cryptic warnings.
It’s no wonder that doctors and nurses are disappointed by the multi-million dollar systems we make them use. They’ve used the cool PC stuff. Using healthcare software is like waiting in the driver’s license line: everyone is treated contemptuously equal.
Here’s how it should work. The kidney expert gets a basic warning about a certain drug’s dosing in renal failure. A button should be right there that says, "Don’t show me these kinds of messages again." Just like Word’s spell checker, in other words. Doctors are big boys and girls who can choose for themselves what’s useful and what’s not. Just because the computer has the capability to issue warnings doesn’t mean it should. You trust users to deliver care, so trust them to ignore unhelpful information.
Technically, this is not hard. Neither is personalization that allows users to customize menus, create their own subset of commonly used items, or create an inbox of the kinds of new information they’d like to see about their patients.
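To show just how technically simple this is, here’s a minimal sketch of per-user alert suppression in the spirit of the "Don’t show me these kinds of messages again" button described above. All names here (`AlertPreferences`, `should_fire`, the alert categories) are hypothetical illustrations, not taken from any real clinical system:

```python
class AlertPreferences:
    """Tracks which alert categories a given user has chosen to suppress."""

    def __init__(self):
        # Alert categories this user has dismissed with "don't show again"
        self._suppressed = set()

    def suppress(self, category):
        """Called when the user clicks 'Don't show me these again.'"""
        self._suppressed.add(category)

    def should_fire(self, category):
        """An alert fires only if the user hasn't opted out of it."""
        return category not in self._suppressed


prefs = AlertPreferences()
assert prefs.should_fire("renal-dosing")      # new users see every warning

prefs.suppress("renal-dosing")                # the nephrologist opts out
assert not prefs.should_fire("renal-dosing")  # ...and stops being nagged
assert prefs.should_fire("drug-interaction")  # unrelated warnings still fire
```

A real system would persist these preferences per user and perhaps let a safety committee mark a few categories as non-suppressible, but the core logic really is a set lookup.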
It’s no wonder that clinical users just can’t warm up to a computer system as a trusted ally when it often behaves like an electronic idiot savant, happily firing off unwanted and ignored information at those who resent it.