Curbside Consult with Dr. Jayne 10/7/24
I had several friends attending the Becker’s Health IT + Digital Health + Revenue Cycle conference last week in Chicago. In sending me some impressions and notes, they all mentioned conversations around the topic of whether health systems should be able to monetize patient data.
Key areas where organizations might use de-identified patient data include generative AI, genomics and precision medicine, and pharmaceutical drug discovery. When I’ve discussed this with fellow physicians in the past, the question of ethics was usually at the top of the list. In recent months, however, the focus seems to have shifted to whether organizations can truly protect patient data, and whether there is a risk of it being re-identified by someone intent on doing harm.
Putting on my patient hat, the first issue I have with using patient data beyond direct patient care is that of consent. Organizations may claim that they received patient consent, but did they really? Most large health systems give patients multi-page documents to read when they arrive. Those documents cover a multitude of topics, from data sharing to billing assignment to consent for treatment. I’d be hard pressed to find 10 patients in the lobby at my local academic medical center who know definitively whether they have signed a consent for dissemination of their de-identified data.
At my last mammogram visit, the facility didn’t even offer me copies of the Notice of Privacy Practices or Consent for Treatment documents to review. The registration clerk simply pointed to a signature pad and said, “Now you’re going to sign indicating you received this and agree to it” and became irritated when I asked for copies of the documents prior to signing.
In my book, consent that is obtained in that manner is not valid consent of any kind. Not to mention that in many cases, the patient has no real choice but to consent. With insurance companies building narrow networks, patients may not have a choice about where they receive treatment, and they end up agreeing to whatever is put in front of them because they need care. One never hears the word “coercion” uttered at the check-in desk, but that’s essentially what is happening.
Another major issue for patients is the general lack of understanding of what HIPAA covers and doesn’t cover. Most people don’t realize that the clinical information collected by consumer devices isn’t protected at all, and that companies can largely do whatever they want with it. There have been recent concerns, following shakeups at genetic testing company 23andMe, about what will happen to customer data in the event of an acquisition or changes in leadership. With changes in state abortion laws, there are increasing worries about period tracker apps, fertility tracking apps, and other ways of capturing reproductive data. Between those two catalysts, I’m hoping that patients become more aware of the fact that their information is just out there. We all know that no one reads the terms and conditions when they sign up to use a new app.
Changing to my clinical informatics hat, I absolutely agree with the concerns around organizations’ inability to protect patient data. Recent cybersecurity events have shown that they struggle to protect fully identifiable data used for direct patient care, so what makes us think they’re applying an equivalent level of rigor to de-identified data? There are plenty of articles out there that describe how easy it is to re-identify patient data, going back as far as 1997 from what I could find with some quick searching. There are plenty of data-rich sources that are publicly available, such as voter registration lists.
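The mechanics of such a re-identification (or "linkage") attack are simpler than most people expect. The classic demonstration, dating back to Latanya Sweeney's work in the late 1990s, is that "de-identified" records often retain quasi-identifiers such as ZIP code, date of birth, and sex, which can be joined against a public file like a voter registration list. The sketch below uses entirely fabricated names and records for illustration:

```python
# Illustrative linkage attack on made-up data. A "de-identified" health
# record keeps quasi-identifiers (zip, dob, sex) that can be joined
# against a public dataset such as a voter registration list.

deidentified_health_records = [
    {"zip": "63101", "dob": "1975-03-14", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "63105", "dob": "1988-11-02", "sex": "M", "diagnosis": "asthma"},
]

public_voter_list = [
    {"name": "Jane Roe", "zip": "63101", "dob": "1975-03-14", "sex": "F"},
    {"name": "John Doe", "zip": "63105", "dob": "1988-11-02", "sex": "M"},
    {"name": "Ann Poe",  "zip": "63109", "dob": "1990-06-21", "sex": "F"},
]

def reidentify(health_records, voter_list):
    """Join the two datasets on the quasi-identifiers (zip, dob, sex)."""
    voters_by_key = {}
    for v in voter_list:
        key = (v["zip"], v["dob"], v["sex"])
        voters_by_key.setdefault(key, []).append(v["name"])

    matches = []
    for r in health_records:
        key = (r["zip"], r["dob"], r["sex"])
        names = voters_by_key.get(key, [])
        if len(names) == 1:  # a unique match re-identifies the record
            matches.append((names[0], r["diagnosis"]))
    return matches

for name, diagnosis in reidentify(deidentified_health_records, public_voter_list):
    print(f"{name}: {diagnosis}")
```

The point isn't the code itself but the fragility it illustrates: when the combination of ZIP code, birth date, and sex is unique in both datasets, a name attaches to a diagnosis with a few lines of scripting and no special access.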
Several colleagues posted to a local physician forum after they received data breach notifications stemming from the Change Healthcare hack. The words used by these physicians, who were impacted as patients, caught my attention. They were “crushed” and “stunned” that their information could have been impacted. It was an eye-opener for them, I guess.
We have all worked for, and been patients at, the same healthcare system over the last couple of decades. I know that my data has been impacted at least a half dozen times, including when a research coordinator had an unencrypted laptop in the trunk of her car and it disappeared. It made me think that our organization probably does a bad job of making physicians aware of these incidents, when in reality, we are going to be a point of contact for concerned patients whether we like it or not. I’m trying to give them the benefit of the doubt, but at this point in the game, we all need to assume that none of our healthcare data is private or truly protected.
Speaking of privacy and confidentiality, fall is when many organizations require completion of annual compliance training updates. Although I’ve been through HIPAA and other compliance training in several dozen organizations over the last decade, I have yet to see one that addresses the fact that ease of access with phones and tablets has led to physicians accessing patient information in all kinds of places and with minimal privacy protections. I was sitting in a restaurant booth with a colleague a couple of weeks ago. She was waiting for some lab results on a patient and kept pulling out her phone to check and see if they were back. She generally has a hard time disconnecting from the office due to her specialty, and in a matter of minutes, I saw her entire patient schedule, several other patients’ labs, and some imaging go by.
I made sure I paid detailed attention to my salad while this was going on, but was flabbergasted that she thought it was OK to do this. Maybe she felt safe because I am a fellow physician, but given her overall track record of how she uses her phone, I would guess that she does this in other environments.
I mentioned it to her and she shrugged it off, saying she was sure “no one is looking at my phone” and justifying her behavior by being in a high-stakes surgical subspecialty and needing to check in on patients. But it’s a snapshot of the cavalier attitude that many in healthcare have around protecting patient information. I’m sure she once watched a training video that said “screens should be pointed away from prying eyes,” but mentioning the specific environments where clinicians access patient data on their phones might be more impactful.
It will be interesting to see how patient privacy, consent, and monetization of patient information play out over the coming years. In the meantime, think twice before you hit the EHR during your kids’ soccer game.
What do you think about the monetization of patient data? Does your organization have a stance? Leave a comment or email me.
Email Dr. Jayne.