Curbside Consult with Dr. Jayne 11/25/24
At several conferences I’ve attended lately, there has been discussion among clinical informaticists about how increasing use of technology might be affecting our ability to process information and retain items in memory.
In speaking with medical students, it’s clear that they are learning in ways that are dramatically different from the options that we had when I was in school. At that time, the primary method of teaching was lecture based, with or without slides or visuals. Accompanying paper textbooks had chapters that roughly aligned with the material that was being presented in the lectures, but sometimes presenters would go deep into their own personal research areas, which left students scratching their heads trying to figure out what was important, not only for testing purposes in a highly competitive environment, but also for the not-so-distant future when we would actually be expected to care for patients.
If you didn’t want to go to lectures or wanted supplemental materials for the fast-paced sessions, each medical school class ran its own transcription service. Designated people agreed to attend each lecture and record audio cassettes of the content, then placed them in the mail slots of other students who had agreed to listen to and create transcripts of the lectures. Other students printed those transcripts and took them to the local copy shop, returning with paper copies that they dutifully stuffed into those mail slots for the rest of us to gather. For those of us who attended class, this was a great backup for the times that content was going over our heads or for when we inevitably zoned out due to information overload.
The only time we ever had lectures that were formally recorded by the university was for those classes that were presented during certain religious holidays. In those situations, videos were made, but they were only available to the students who observed those holidays. I remember wondering what it would be like if they just recorded all the lectures and made them available to everyone so that those who learned differently could use that modality, but the university said it would be cost prohibitive to do so. Thinking back, these were the days when we thought Lotus Notes was the be-all and end-all of software suites, so it’s hard to know what the true cost would have been when looking through the lens of today.
Fast forward to my 20-year medical school reunion, where a student tour guide told us that the university was recording all lectures and making them immediately available. At least in her class, she said very few students attended lectures, with most learning from videos that they watched at 1.5x or greater speed. It sounded like the focus of learning had changed, too. Since they weren’t “wasting time in class” they could spend more time studying for the medical licensing exams, which were viewed as being more important for the ability to match into a competitive residency training program.
I’ve learned that in recent years they have added AI-assisted transcription to the recordings. I wonder if students even take notes anymore or just highlight and annotate those transcripts. I haven’t seen any of those materials myself, so I don’t know how well the transcription does with medical words and complicated scientific concepts.
When I was a student, we still carried pagers. I remember that when the Motorola text-based pagers came out, we thought we had really arrived. Cell phones were still a rarity. Now, every medical student holds the entirety of human knowledge in their hands on a near-continuous basis. It’s easy to look things up and we’ve become dependent on always having that ability, at least until it comes crashing down during a hack or other loss of service.
Students still memorize things, especially if they know they will be on a test. Some information becomes ingrained because of common use, such as the ability to quickly recall certain clinical formulas or calculations. Depending on how those resources are presented in an EHR or online resource, it’s likely faster to do them yourself, although accuracy is always a risk (but then again, it can be a risk in the EHR as well).
There are studies that look directly at how the internet may be changing our ability to think — attention spans, memory processes, and understanding social interactions both online and in person. I’ve done a lot of work during my career on understanding learning styles and trying to maximize how patients receive information, and much of that applies to understanding how clinicians receive information. The major differences are overall educational level and health literacy. I’ve spent more than 20 years working with teams to create training materials for EHRs and HIEs as well as patient-facing educational materials that address procedure preparation and chronic conditions.
Requests for specific lengths of training segments have decreased over time. When I first began working in educating clinicians, classes were way too long. We thought that we were progressive when we reduced them to 90-minute blocks, knowing that anything presented after that mark was unlikely to be absorbed. From there, we worked to shorten courses to 60-minute blocks. When technology evolved enough to be able to do recordings that we could park on our learning management system, our goal was to have 10- to 15-minute segments that went together to form a larger body of material. Since the advent of social media, the push has been to get those down to 3-5 minute blocks.
Now I’m starting to see requests from physicians for TikTok-style videos for continuing medical education, and I struggle to see how that might work. Healthcare concepts are often complex and I don’t know how you can even explain them in 30 seconds or less, let alone do so in a way that allows the learner to achieve mastery.
I also worry that the shift towards that style of learning will penalize those of us who learn best through the written word, even if it’s via digital media. I’ve always been a reader and use a variety of paper and digital sources. I find that if I’m in “hey, let’s learn something” mode, I do best with a traditional paper book. If I’m reading for leisure, either paper or electronic is fine. If I’m traveling, I’m not going to read it unless it’s on my Kindle since I’m a fast reader and tend to devour novels (I love a good mystery) and there’s not enough luggage space to accommodate paper for a long trip. I also love audiobooks and am trying to embrace those for learning as well as for entertainment. As someone who learns through written language, I’m grateful that my organization has digital transcription enabled for recorded meetings, because often I’ll turn off the audio and just read the transcript along with viewing the slides.
I’m curious how other informatics and educational experts have perceived this shift, and what other perspectives might be out there. Hopefully readers will weigh in. I’m happy to share comments, whether attributed or anonymous.
In the meantime, I’m making my reading list for 2025. What’s the best book you’ve read recently, and why? Leave a comment or email me.
Email Dr. Jayne.