
Curbside Consult with Dr. Jayne 8/2/21

August 2, 2021 Dr. Jayne 5 Comments

Like most of us, I haven’t attended an in-person conference in a long time. Often, the sessions aren’t terribly memorable, and once I get home, I rarely consult my notes.

One of the exceptions was a presentation I attended at the American Medical Informatics Association Annual Symposium some years ago on patient portal use among children and adolescents. I remember the speakers talking about how their institution did the difficult work of defining which data elements could be shared, which should be shared, and how best to set up age and proxy restrictions for the best outcome.

Fast forward, and now we’re dealing with not only the limitations of patient privacy and EHR capabilities, but also the impact of interoperability and information blocking rules. JAMA Pediatrics had a good viewpoint article about this last week. Working with patients who are minors can be challenging, especially as they move through the adolescent years and become candidates for certain healthcare services that can be kept confidential to some degree, such as pregnancy, sexual health, mental health, and related care. It’s always been a fine line that we’ve had to walk, because although we can restrict that information in the medical records, parents and guardians may still receive the bills and insurance correspondence.

For those who might not be in the data-sharing trenches, the article provides a nice overview of what HIPAA and HITECH have required for making records available. It also summarizes the 21st Century Cures Act and information blocking rules. There is a subset of situations where information blocking may be allowable, including exceptions for technical infeasibility, preventing harm, and privacy. Those caring for minors might need to use one of these exceptions to protect patient confidentiality, especially since states have differing requirements for protecting restricted categories of information such as mental and sexual health services and contraception.

Clinicians have to understand those state rules and what parents might be able to see, and they also need to know what features their EHRs provide to help with this daunting task. Some EHRs I’ve worked with allow users to mark specific data elements as “sensitive” and block their release; others require the user to create separate encounter notes where an entire visit’s documentation is blocked from release. Less-savvy users might not understand these nuances, leading to negative consequences for patients, not to mention increased liability for themselves and their institutions.
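For readers who haven’t seen how that kind of tagging can look under the hood, here’s a minimal, hypothetical sketch assuming an EHR that represents sensitivity with FHIR-style security labels. The specific codes, and whether a given portal actually honors them when deciding what a proxy account can see, vary by vendor and aren’t described in the article.

```python
# Hypothetical sketch: tag a single data element as sensitive using FHIR-style
# security labels. How (or whether) a given EHR or portal honors these labels
# downstream varies by vendor; this reflects no particular product.

def tag_as_sensitive(resource: dict) -> dict:
    """Add confidentiality and sensitivity security labels to a FHIR resource dict."""
    labels = [
        {   # Overall confidentiality: restricted
            "system": "http://terminology.hl7.org/CodeSystem/v3-Confidentiality",
            "code": "R",
            "display": "restricted",
        },
        {   # Illustrative sensitivity category code from the v3 ActCode system
            "system": "http://terminology.hl7.org/CodeSystem/v3-ActCode",
            "code": "SEX",
        },
    ]
    resource.setdefault("meta", {}).setdefault("security", []).extend(labels)
    return resource

# Example: a lab result that should not be released to a proxy portal account.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"text": "Chlamydia trachomatis NAAT"},
}
tag_as_sensitive(observation)
```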

The article also notes that the flow of data must protect information provided by caregivers who might need to keep certain history elements from the patient, such as adoption status, genetic diseases, or other pieces of family history that a patient might not be mature enough to absorb. Another tricky area noted by the authors is the maternal data contained in a newborn’s EHR chart. This information often includes sensitive testing (HIV, hepatitis, sexually transmitted infections) as well as information on maternal drug and alcohol use, intimate partner violence screening, and more. Disclosing the mother’s protected health information to other caregivers can be a problem if not handled carefully.

The article mentions the benefits of information sharing, which jogged my memory about some of those aspects from the AMIA presentation. When I was in a traditional family medicine practice, we often spent the majority of the 17-year-old well visit discussing “Healthcare Adulting 101” so patients could understand their health information and how to best access it as they headed to college or otherwise into adulthood. With the rise of patient portals, adolescent patients may be able to schedule their own visits, request refills, and more. Education is needed so they understand the difference between urgent messages and non-urgent needs, as well as the best ways to navigate our often-chaotic healthcare system.

For adolescents with complex medical histories who have the ability to participate in self-management programs, having access to their information can be valuable and can help them get the best outcomes. Patients can partner with their parents for co-management, but organizations must be careful that common policies (such as reducing parental access to the chart during the teenage years) do not inadvertently hamper successful family dynamics. It’s quite a tightrope that care teams walk at times, and I thought the article was a good reminder for the rest of us. Unfortunately, since it appeared in a pediatric-specific journal, I’m not sure how much external visibility it will get.

The piece paired nicely with another article that I ran across, this one about using artificial intelligence systems to sort through electronic health records. The study looked at the amount of time that clinicians spent reviewing clinical data during patient visits and whether an AI system could help organize patient information prior to review. The study was small, with only 12 gastroenterologists participating. Each participant received two clinical records, one in the standard format and one that had been optimized via AI. They were then required to search the records to try to answer more than 20 clinical questions. The AI-optimized records allowed physicians to answer the clinical questions faster with equivalent accuracy. Nearly all the physicians said they preferred the optimized records to the standard format.

Even though the study was small and really needs to be redone with a larger number of physicians across multiple specialties and with multiple samples per physician, it got me thinking. What if you could use AI optimization to tackle the pediatric data-sharing problem? What if AI could be used to augment clinician efforts to seek out and appropriately tag or restrict sensitive information? Could AI-enabled tools run in the background while physicians are documenting and alert them to state laws about the information they’re adding to the chart, and do so right at the point of documentation? What if our systems could actually allow us to work smarter and could help make it easier to do the right thing the majority of the time? I think that’s the goal that most of us have in clinical informatics, although it’s often difficult to deliver those advantages to our users.
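As a purely hypothetical illustration of that last thought (neither article describes such a tool, and the rules and names below are made up), the simplest version wouldn’t even need machine learning. A rules table keyed to state-specific sensitive categories could nudge the documenting clinician in real time:

```python
# Hypothetical sketch only: flag potentially sensitive phrases while a note is
# being drafted, keyed to a made-up table of state-specific confidentiality
# rules. A real tool would need clinical NLP, terminology mapping, and legal
# review; nothing here reflects actual state law.
from dataclasses import dataclass

@dataclass
class SensitivityRule:
    category: str      # e.g., "contraception", "mental health"
    keywords: tuple    # crude trigger terms; real systems would use NLP and code sets
    guidance: str      # what to remind the clinician about at the point of documentation

# Entirely illustrative; actual state requirements differ and change over time.
RULES_BY_STATE = {
    "XX": [
        SensitivityRule(
            category="contraception",
            keywords=("contraception", "IUD", "oral contraceptive"),
            guidance="May be confidential for minors in this state; consider marking the element as sensitive before signing.",
        ),
        SensitivityRule(
            category="mental health",
            keywords=("PHQ-9", "depression screening", "suicidal ideation"),
            guidance="Check proxy portal release settings for adolescent mental health documentation.",
        ),
    ],
}

def check_note(draft_text: str, state: str) -> list[str]:
    """Return point-of-documentation reminders triggered by the draft note."""
    lowered = draft_text.lower()
    alerts = []
    for rule in RULES_BY_STATE.get(state, []):
        if any(keyword.lower() in lowered for keyword in rule.keywords):
            alerts.append(f"[{rule.category}] {rule.guidance}")
    return alerts

for alert in check_note("Discussed IUD placement and reviewed PHQ-9 results.", "XX"):
    print(alert)
```

Even a toy like that makes the real challenge obvious: the string matching is trivial, but keeping a state-by-state rules table accurate as laws change is anything but.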

For those of you in the pediatric informatics trenches, how well are the tools available to you doing? Are they making it easier to manage information sharing or more difficult? Leave a comment or email me.

Email Dr. Jayne.




There are 5 comments on this article:

  1. I am working on a project surrounding Adolescent and Young Adult care transitions this summer!

    One major barrier for my project specifically is the organization’s interpretation of a minor’s ability to consent to the Terms and Conditions of the patient portal as an individual. This bars patients under the age of 18 from creating and managing their own patient portal accounts, so there is no ability to teach patients how to manage their own healthcare via a digital platform. This interpretation is compounded by the patient portal’s limited ability to hide or show information dynamically based on the clinical area, such as labs related to sexual or reproductive health or notes from child and family abuse visits.

    Re: discussing “Healthcare Adulting 101” at age 17, my research has found that introducing the concept as early as age 12 leads to the best results, with discussions happening over time until the patient leaves the practice.

    • Regarding a minor’s ability to consent, consider the reading grade level of the patient portal Terms and Conditions. There are many readability formulas available that can help you figure that out; I strongly recommend “ReadabilityStudio” by Oleander Software.

      You may find out that adolescents are expected to read and understand documents written at a college or even graduate school reading level (depending on the legalese). Plus, you can get a word count, and by estimating an average reading speed of about 250 words per minute, you can calculate how long it should take to read those materials (a rough sketch of that arithmetic appears after the comments). Research on Terms and Conditions shows that the vast majority of readers sign off on them without reading them, leading to what’s been called “uninformed consent” or “blind consent.”

    • I remember having a discussion with a Public Health-type person many years ago. The topic was youth, STDs, sexual health, and how the rights of the parents intersected with the rights of the youth. My concerns were information-related, not service delivery.

      My assumption going in was, Age of Majority was everything. Well, was I given a jolt! It turned out that the topic was complex and effectively, the youth was granted various adult-type rights and protections in stages.

      Yet I also remember, I was not introduced to any specific policy or plan, enumerating exactly how that happened. Which left me scratching my head a little, to be honest. It sounded more like, a clinical judgement call was being made. Perhaps they were gauging how mentally and emotionally mature the youth was?

      • My recollection from my EHR- and patient-portal building days is that restrictions vary by state, with some of the states I was looking at specifically withholding information on minors’ sexual health from parents once they were above a certain age. The consideration, as I understand it, is less about “maturity” and more about protecting minors from danger in the home. Minors who are being abused in the home, or who are sexually active, LGBT, trans, or otherwise in a situation related to their sexual health that could expose them to abuse, must be allowed to receive medical care while being protected from that abuse.

  2. Ah yes, that to-date ‘elusive’ tech promise. In theory AI – the latest hope for digital health – should add value at the point of care decision making. We’ll likely get there, eventually. As far as I can tell, baseline AI performance has been little more than a Netflix-like engine, i.e., based on your history, ‘you may enjoy this title’ (more often than not at least, in the ballpark). Yet, in my experience, most AI-assisted tech seems like an ingestion of keywords that produces a wide range of considerations, many of which are irrelevant to me. Discerning the privacy vs. interoperability considerations and automating them via AI seems like a steep climb. Thanks for sharing both articles!
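Regarding the readability arithmetic in the first comment thread above, here’s a minimal sketch, assuming a plain-text copy of the portal’s Terms and Conditions (the file name is hypothetical). It uses the standard Flesch-Kincaid grade-level formula with a crude vowel-group syllable count, plus the commenter’s assumption of roughly 250 words per minute; a dedicated tool such as the one recommended above will be far more accurate.

```python
# Rough sketch of the arithmetic described in the readability comment above:
# estimate a Flesch-Kincaid grade level and reading time for a Terms and
# Conditions document. The syllable counter is a crude vowel-group heuristic.
import re

def count_syllables(word: str) -> int:
    # Approximate syllables by counting groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid Grade Level formula.
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

def reading_time_minutes(text: str, words_per_minute: int = 250) -> float:
    # The commenter's assumption: an average reading speed of about 250 wpm.
    return len(re.findall(r"[A-Za-z']+", text)) / words_per_minute

terms = open("portal_terms_and_conditions.txt").read()  # hypothetical file name
print(f"Estimated grade level: {flesch_kincaid_grade(terms):.1f}")
print(f"Estimated reading time: {reading_time_minutes(terms):.0f} minutes")
```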
