Most of the scribes I work with are either applying to or have been admitted to medical school and are trying to save up money as well as learn about clinical practice. It’s fun to work with them because they’re eager to learn. I enjoy teaching but haven’t been on faculty anywhere in years. Most of them know I have another job besides seeing patients, and are often interested in what I’m working on from a CMIO standpoint. We talk a lot about EHRs and what’s different about using our niche system versus what they will encounter when they go to medical school.
A recent article in the Journal of the American Medical Informatics Association looked at the existing literature to assess the current state of EHR training for medical students and residents. The authors looked specifically at “educational interventions designed to equip medical students or residents with knowledge or skills related to various uses of electronic health records” and to “compare the aims of these initiatives with the prescribed EHR-specific competencies for undergraduate and postgraduate medical education.” There wasn’t a tremendous amount of literature for them to sift through in their analysis – only 11 studies. Of those, seven covered medical students, three included residents, and one included both groups. All of the interventions they identified covered data entry, but none involved manipulating the resulting data at a panel or population level.
They concluded that the documented interventions don’t really prepare students to show mastery in the competencies required to be effective physicians. In thinking through this, I’m not sure how many current physicians have EHR skillsets beyond just data entry. Most of the organizations I work with expressly prohibit their physicians from doing anything remotely involving data analysis or population health work. All of those functions are managed at the group level or health system level rather than by physicians. Although physicians may receive various clinical scorecards, they’re not really accessing or addressing the data on their own. This certainly would be different for independent physicians, although many of my peers in those environments don’t have the knowledge or understanding of how to get at that data, either.
In digging deeper into the study and its methods, I was surprised by how much the different training interventions varied. Some were as brief as a one-hour self-directed module that reviewed screenshots of different areas of the EHR; others were as long as a multi-week simulated EHR curriculum. Most of the included workflows centered on data entry or information retrieval, such as pulling up lab results; other activities included working with medication lists, orders, and billing functions such as E&M coding. The interventions also assessed competency in different ways. Some included a pre-test followed by a post-test after the intervention, but only three studies included a control group. Nine of the studies assessed changes in skills, while only two looked at changes in attitudes. Other assessment methods included quizzes, surveys, self-reported questionnaires, chart review, and structured practice with standardized patients.
In studies that used quizzes and surveys, the students and residents performed well, were satisfied with the approach, and were able to demonstrate competency. Other studies didn’t show a difference between the intervention group and a control group. One study was able to show that learners receiving the intervention performed better on standardized patient examinations in which they were judged on their ability to complete a structured patient visit. Although standardized patients are an important part of learning (particularly as students and residents learn to perform sensitive examinations), they always made me nervous, since they were fully aware of what I was supposed to be doing and what kind of findings were supposed to be present, and I was being compared not only against my own classmates but against the dozens of students who had examined them in previous years.
I was curious as to the specific competencies the authors had in mind when they identified gaps in the training interventions. They expected students, prior to beginning their clinical clerkships (usually in the third year of medical school at the latest), to “be able to describe the components, benefits, and limitations of EHRs; the principles of managing and using aggregated electronic health information, including tenets of electronic documentation as well as differences between unstructured and structured data entry; and articulate standards for recording, communicating, sharing, and classifying electronic health information in the context of a medical team.” They also note students should “be able to identify how systems may generate inaccurate data, discuss how data entry affects direct patient care and healthcare policy, gather relevant data from EHRs, and assess the reliability and quality of these data.”
Again, I’m not sure many practicing physicians would be able to enumerate all of these elements. They may also not have a “working knowledge of health informatics through chart audits and research projects.” On the flip side, maybe if the physicians I work with had received better education around the role of EHRs, they’d be more interested in the idea of clinical informatics as well as what they can do with the vast amount of data they’ve been keying into the EHR over the years. The authors did note that “a significant number of trainees have had exposure to the EHR before their medical training as scribes and that inclusion of these individuals in the studies may have affected the results.”
I’d be curious to hear from those of you who are at academic institutions on whether your training programs are incorporating these competencies into the curriculum. My medical school recently began a complete overhaul of its educational curriculum, so you can bet I’ll be asking about it the next time I run into the new associate dean who has been tasked with that effort. We heard a bit about it at my medical school reunion in the spring, but the main points of her address were around providing clinical exposure to students far earlier than we experienced during our training. The only EHRs available when I was a student were one that used a green-screen terminal to access lab results at the flagship hospital and one that used a light pen for navigation at the community hospital. The academic center was just beginning to build its own clinical data viewer, whose contents were hit or miss, as I entered my fourth year. Now, after a decade of best-of-breed construction, they’re all using Epic.
Do you think your current practicing physicians can demonstrate mastery of the skills the authors evaluated? Leave a comment or email me.
Email Dr. Jayne.