
Curbside Consult with Dr. Jayne 11/18/24

November 18, 2024 Dr. Jayne

The practices in which I’ve spent the majority of my clinical time over the past few years don’t use AI-assisted or ambient transcription technologies. One uses human scribes, while the other leaves physicians to their own devices for finding ways to become more efficient with their documentation.

In the urgent care setting, my scribes have always been cross-trained. They started out as patient care technicians or medical assistants, and if they had excellent performance and a desire to learn, they could request to enter the in-house scribe training program. During that multi-month period, they received additional training in medical terminology, clinical documentation, regulations and requirements, and understanding the physician thought process for history-taking, creating a differential diagnosis, and ultimately creating and documenting a care plan.

Many of our human scribes had the goal of attending medical school or PA school, so they had a strong drive to learn as much as possible while doing their job. As they learned our habits for seeing patients and describing our findings, they would sometimes prompt us about something we might have forgotten to mention or might not have performed during the exam. Because of the level of cross-training, they could also assist us with minor procedures during the visit rather than just standing there waiting for us to describe findings.

Towards the end of the visit, when the physician typically summarizes the findings for the patient and describes the plan of care, the scribes would review and clean up the notes so that they were ready for our signature as soon as the patient disposition was complete. I would often be able to sign my notes in real time, and even if I had to wait until the end of the day, it might take me less than a minute to review each note because of the diligence they used in capturing the visit.

Human scribes are also helpful when conducting sensitive visits, which often happen in the urgent care environment as we discuss a patient’s sexual history or perform sensitive portions of the physical exam. In those situations, our scribes served as both chaperones and assistants, providing support to patients when needed and assisting with specimen collection – uncapping and capping jars and tubes, ensuring accurate labeling, etc. I’ve had scribes help patients take their shoes and socks off and assist them in getting on the exam table and returning to a chair. When contrasting a visit that uses a human scribe with one where the physician has to perform their own documentation, there’s a substantial difference in the time that it takes to complete the visit, and not just from a documentation standpoint.

Colleagues who have transitioned from human scribes to either virtual scribes or AI-assisted technologies in similar practice environments tell me that they miss the physical assistance of the scribe. No one else is in the room who can step out and grab supplies or equipment when that would be more efficient than the physician leaving to get what they need. There are also flow issues when chaperones are needed or when assistance is needed during a procedure, which can make the day bumpier.

Some colleagues with whom I recently discussed this mentioned that their organizations didn’t consider these workflow changes when moving to non-human documentation assistance strategies. One said that he felt everyone was so focused on how much cheaper it would be to not pay a person that they forgot to factor in the time physicians would now spend doing things that they didn’t have to do in the past.

It’s a classic parallel to what we experienced back in the early days of EHR implementation, when there were constant encounters with unintended consequences. One example: in a paper-based workflow where no one reconciled medications, implementing an EHR that requires medication reconciliation is going to increase visit duration, whether it’s done by an assistant or the physician. They should have been doing medication reconciliation in the first place because it’s a patient safety issue, but the EHR took the blame for forcing them to do something they didn’t think was important. Now we have different unintended consequences when we layer on more sophisticated technologies such as AI-assisted documentation.

One colleague described the problem of excessive summarization, where his organization’s AI documentation solution took a lengthy physician-patient discussion that included the detailed risks and benefits of treatment (or of declining it) and condensed it into two sentences. When that happens, one has to consider the downstream ramifications. Will a physician even see that it’s been condensed in that way, or are they just signing notes without reviewing them to keep their inboxes clear? That situation happens more often than many would think. If a physician catches the issue, will they spend the time editing the note, or will they just move on because they’re pressed for time? And if they do take the time to edit the visit note, will they capture all the nuances of the discussion exactly as it occurred with that particular patient?

Another colleague, who is also a clinical informaticist, mentioned that having AI documentation solutions doesn’t fix underlying physician behavior challenges. The physician who never finished his notes at the end of the day and instead left them for Saturday mornings still leaves them for Saturday mornings, which means that he’s reviewing documentation that’s up to five days old for visits that are no longer fresh in his mind. That creates issues with the technology platform, since recordings have to be kept until the notes are signed, and it skews the chart closure metrics that were important for measuring the success of the project.

The team that implemented the solution could have anticipated this had they looked at baseline chart closure rates, but they were in such a hurry to get the solution rolled out that now they’re having to go back and examine that data retrospectively. They also missed the opportunity to coach those physicians during the implementation phase about the patient safety value of closing notes in a timely manner.
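For organizations that want to establish that baseline, the underlying measurement is simple: compare each note’s signature timestamp to its visit timestamp and look at the distribution per physician. Here is a minimal sketch of the idea in Python, with the caveat that the file and column names (physician, visit_time, signed_time) are hypothetical and will vary by EHR reporting extract:

    # Baseline chart-closure lag: hours from visit to note signature.
    # Column names are hypothetical; adapt to your EHR's reporting extract.
    import pandas as pd

    notes = pd.read_csv("signed_notes.csv", parse_dates=["visit_time", "signed_time"])

    # Hours between the visit and the note being signed
    notes["closure_hours"] = (
        notes["signed_time"] - notes["visit_time"]
    ).dt.total_seconds() / 3600

    # Per-physician baseline: median lag and share of notes closed within 24 hours
    baseline = notes.groupby("physician")["closure_hours"].agg(
        median_hours="median",
        pct_within_24h=lambda h: (h <= 24).mean() * 100,
    )
    print(baseline.sort_values("median_hours", ascending=False))

A Saturday-morning signer like the one described above would show up immediately as an outlier in that report, before the implementation ever starts.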

Others have noted issues with using AI solutions to examine documentation after the fact, such as tools that only use data from structured fields. That approach works well in a specialty that does a lot of structured documentation, but not in one where the subtleties of the patient’s story are largely captured via free text.

I recently attended a lecture that discussed the hazards of using AI tools in the pediatric population, since so much of the language used to capture a child’s status varies based on the age of the patient. For example, saying a patient is “increasingly fussy” has a meaning that goes beyond the words themselves and carries a different weight when treating an infant versus an older child or a teenager.

The pediatricians also mentioned the difficulty of obtaining consent for the use of AI tools during visits, especially when only one parent is present or when the child is brought to the office by a caregiver such as a nanny or sitter. Although those individuals may have the capacity to consent to treatment, they may not have the specific authority to consent to the use of AI tools. There is also the issue of the child’s consent to being recorded. Although the laws generally allow parents to consent on behalf of their children, obtaining the permission of an adolescent patient is an ethical issue as well, and one that physicians may not have the time to address appropriately due to packed schedules.

The dialogue around the use of AI solutions has certainly changed over the last year, and we’ve gone beyond talking about how cool it is to addressing the questions that expanding use has raised. It’s great to see people asking thoughtful questions and even better to see vendors incorporating ethical discussions into their implementation processes. We’ll have to see what this landscape looks like in another year or two. I suspect that we will have found many other areas that need to be addressed.

How is your organization balancing the addition of AI solutions with the need for human assistants and the need to respect patient decisions? Leave a comment or email me.

Email Dr. Jayne.




Currently there are 5 comments on this article:

  1. I want to highlight this passage: “It’s a classic parallel to what we experienced back in the early days of EHR implementation, when there were constant encounters with unintended consequences. One example: in a paper-based workflow where no one reconciled medications, implementing an EHR that requires medication reconciliation is going to increase visit duration, whether it’s done by an assistant or the physician. They should have been doing medication reconciliation in the first place because it’s a patient safety issue, but the EHR took the blame for forcing them to do something they didn’t think was important. Now we have different unintended consequences when we layer on more sophisticated technologies such as AI-assisted documentation.”

    This is exactly my concern. There are SO many unintended consequences here. We haven’t even begun to tease all of them out. You mentioned quite a few, but there will be more. Governance and safety mean thoroughly examining and checking for these unintended consequences, and that takes time and money on its own.

    • I had an old physician colleague whose favorite hobby was bitching about EHRs. One day he told a story about a patient visit where the only documentation needed was a prescription written on a piece of paper, and DONE! See how easy? I asked, “Did the patient get better?” His answer? “I don’t know.”

    • I’m terribly conflicted by this quote. On the one hand, medication reconciliation is a patient safety issue, and on the other hand physicians don’t think it’s important. OK, which is it then?

      And most likely, it’s a bit of both. Med rec is a potential safety issue, but serious negative safety events from missed reconciliation are not very common. You can’t get clinicians to speak against it because of the former, yet physicians don’t want to spend the time required to complete the requirement.

      Unintended consequences are a matter of working out conflicting requirements. As the quote says, the EHR isn’t responsible for the conflicting goals, but it makes the issue unavoidable. Programmers and analysts made med rec mandatory because they were told to do so! This workflow item came from the medical community itself.

      What you then have to do is decide on a course of action:

      – Should med rec be abandoned as not worth the effort?
      – Or is it indispensable and something that simply must be done as prescribed?
      – Or is there some other way of achieving med rec?

      The entire issue smells of, “This is the way the textbook says to do things” versus “This is the way it’s usually done in the real world.” But in the past tense, because the prevalence of EMRs has reconciled those two realities.

  2. You’re right about it feeling like Groundhog Day going from EHRs to AI medical scribes. You highlighted one comparison. How they’re going to market is another. How they’ll consolidate. The battle for market share. Etc., etc.

    My favorite unintended consequence of going to the EHR and losing the paper chart was how many workflows were cued off of the paper chart on the door. It was amazing to see and realize that now there wasn’t that physical visual indicator. Of course, the solutions around it weren’t too bad to figure out, but they did take figuring out.

  3. I appreciate the new and improved medical terminology for my daughter, who is “increasingly fussy” (and has been since she was born!)

    But of course, organizations need to figure out the culture and workflow changes these new tools create. Hopefully, we can create more areas where a $20-an-hour human gets replaced by a $20-a-week technology.
