I’m continually surprised by the inability of EHR vendors to add fairly straightforward safety features to their systems. A reader sent in this screenshot that shows a patient’s pulse documented as 12,224 beats per minute. They also shared screenshots of other parts of the EHR with similar issues. The blood pressure field isn’t divided into separate systolic and diastolic entries, but instead requires the user to type a “/” between the values, leading to potential errors.
Since I’ve been on the vendor side of the house, I understand that it’s not as simple as it seems to make these kinds of corrections since they may require changes to the database. However, it’s not as difficult to make changes to the screens where data is entered. In this situation, they could limit the entered data to three characters. Frankly, if you need to enter a fourth digit, your patient has probably just died of rampant tachycardia.
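For the non-developers in the audience, this kind of front-end check is genuinely small. A minimal sketch of what it might look like, assuming hypothetical function names and plausibility ranges (the exact clinical bounds would be up to the vendor and its clinical advisors):

```python
def validate_pulse(raw: str) -> int:
    """Accept a pulse entry only if it is a 1-3 digit number in a plausible range."""
    if not raw.isdigit() or len(raw) > 3:
        raise ValueError(f"Pulse must be a 1-3 digit number, got {raw!r}")
    pulse = int(raw)
    # Illustrative bounds; a real system would set these clinically.
    if not 20 <= pulse <= 300:
        raise ValueError(f"Pulse of {pulse} bpm is outside the plausible range")
    return pulse

def validate_blood_pressure(raw: str) -> tuple[int, int]:
    """Parse a 'systolic/diastolic' entry into two bounded integers."""
    parts = [p.strip() for p in raw.split("/")]
    if len(parts) != 2 or not all(p.isdigit() for p in parts):
        raise ValueError(f"Enter blood pressure as systolic/diastolic, got {raw!r}")
    systolic, diastolic = (int(p) for p in parts)
    if not (50 <= systolic <= 300 and 20 <= diastolic <= 200 and diastolic < systolic):
        raise ValueError(f"Implausible blood pressure reading: {raw}")
    return systolic, diastolic
```

A check like this lives entirely in the data-entry layer, so it rejects the 12,224 bpm entry (and a mistyped blood pressure) before anything ever touches the database schema.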
When reporting these types of issues to vendors, we are often told that it’s not on the development roadmap, that it would be too technically difficult, or that it would require more development hours than are available. When we’re talking about data entry errors and patient safety, however, how much time is too much time to spend on something like this? Not to mention that if you’re trying to exchange data and want to be truly interoperable, this kind of bad data is going to be an issue for practices trying to consume data from flawed sources.
With all the certification requirements, why haven’t we mandated management of basic patient safety issues like this? As much as we’re told that paper kills, I doubt there were too many instances where a technician would have documented a pulse of 12,000 in a paper chart.
You know people are desperate when felony colonoscope theft becomes an issue. Two men and a woman burglarized a Philadelphia-area hospital and made off with tools from the colonoscopy suite. Police fear the devices may be sold on the black market. If that’s the case, I hope they go through a thorough cleaning cycle first.
I had a conference call today with a potential employee who dialed into the meeting from a shared work space with little privacy. There were people walking back and forth in the background during the entire call, and in a couple of instances, the interviewee even turned around to see what was going on behind him. If this is the best environment he could come up with for a job interview, I wonder what his daily work environment might look like. He mentioned that he likes to work away from home because “it’s less boring,” but failed to elaborate. That’s a trip directly to the round file for this candidate.
Other occupants of the round file include people who try to conduct their entire lives from their phones, leading to emails saying they can’t open attachments or have trouble accessing various resources from a mobile device. Mobile is a great extender, but if you’re going to do a serious job in IT, you have to understand when it’s appropriate to use a more traditional laptop, tablet PC, or desktop.
In response to my recent piece about Ovia and other fertility apps, a reader shared this Washington Post follow-up that discusses ways women protect their privacy on these apps. One woman drew the line at providing the name and date of birth of her baby – she was willing to share her own personal information, but not that of her newborn.
Over 100 women responded to a request from the Post. Respondents “often said they felt trapped by an unfair choice: They cared about privacy, but they also found the digital trackers too valuable to give up.” Women used pseudonyms, logged only a minimum of information, and modified some data to preserve anonymity. Others noted that the apps weren’t that helpful. One commented that they “led to micromanaging my body and habits, which led to stress.” Another noted deliberate gaming of systems used by employers who are trying to get data on staffers: “If my employer was offering money for pregnancy tracking, I would probably do the same thing I already do with the fitness tracking and just input false content.”
The reality is that personal health information is everywhere, whether people provide it willingly with the understanding that they can’t control it once it’s out of their hands, or share it only for specific purposes. There is a great deal of discussion about the role of patient-generated health data in clinical care. Many clinicians are uncertain about its role in driving outcomes and contributing to clinical quality. There are also concerns about how to handle the data. Clinicians find the idea of receiving hundreds if not thousands of data points into their EHRs particularly daunting. Some of these physicians were concerned in the paper days about patients bringing in blood pressure or blood sugar logs, so it’s not surprising that they are uncertain about the data in the electronic world.
There are also concerns by both patients and providers about data security, but it’s hard to quantify the pros and cons. An article in the Journal of the American Medical Informatics Association notes that patients are open to various methods for data collection, whether it is through a medical history, patient questionnaires and surveys, or biometric and activity data. Researchers interviewed health system leaders, EHR vendor leaders, and leaders of third parties providing patient-generated health data tools to health systems. They also interviewed patients with chronic conditions, with half of those patients having experience with generating data. The number of survey participants was small, but the authors conclude that patient-generated health data really isn’t being pursued at broad scale, largely due to concerns about its value.
What do your providers think about patient-generated health data? Are you using it? Does it add more confusion? Leave a comment or email me.
Email Dr. Jayne.