I ran across an article about the impact on multi-tasking and memory. We’ve known for a while that the idea of multi-tasking is a myth. What really happens when we try to do multiple things at once is rapid switching of attention, which sometimes doesn’t work very well.
In my experience, trying to tackle two tasks simultaneously only works when one of them is significantly less critical and the majority of attention is paid to the more critical task. This is how we can get away with browsing Facebook while on conference calls, or reading the newspaper while eating breakfast.
When people try to do equally critical tasks at the same time, that’s when things start falling apart. I’ve had a couple of instances where people tell me they’re on two conference calls at the same time, and based on their level of participation on my call, it’s clear they’re not paying adequate attention to either.
The article specifically looks at the impact of multi-tasking on memory. Research has shown that when people don’t fully attend to an event, they’re less likely to form a strong memory of it. One of the people interviewed in the article, Anthony Wagner, is a neuroscience researcher. He intentionally avoids having a smart phone and has found that without it, he’s not lured into surfing the Internet or being constantly connected. As a result, he’s more focused on the activities around him. According to research coming out of the Stanford University Memory Lab, this means he’s more likely to remember the activities he’s watching.
There’s something to be said about just saying no to technology, although most people would be reluctant to give up their smart phones. Unfortunately, it then becomes a matter of discipline, where you have to consciously leave your phone in your pocket or bag rather than give in to the need for constant connection. That seems to be getting harder and harder for many people. I’ve had several uncomfortable conversations recently with employees who cannot pull their noses out of their phones long enough to pay attention to even a brief conversation. Fortunately, these people are not my personal employees because they wouldn’t last long.
Still, I’ve been increasingly asked to help teach people how to work in the new world of technology. People sometimes assume that because younger employees have grown up with technology, they somehow know the best practices. I’ve found this assumption challenged as workers struggle with prioritization of work, distraction, and follow-through. Some of them are not aware of seemingly straightforward work habits, such as how to assess and prioritize an overflowing inbox when time is limited, or how to carve time out of the day to look at that inbox when you’re assigned to train end users or support a go-live.
The research shows that abilities such as attention and recall can be trained. It’s human nature for our minds to wander, but some of us definitely go walkabout more than others. One study mentioned in the article looked at brain function in heavy multi-taskers vs. that in light multi-taskers. The heavier multi-tasking group did worse on certain tests, and brain activity showed they were having to work harder to focus on the task at hand. It’s not clear whether this is a chicken or egg phenomenon – whether this was caused by multi-tasking or whether people with more fluid attention were more likely to multi-task.
Other research has looked at whether using technology causes our cognitive skills to atrophy. One study mentioned looked at those who used Google Maps for navigation vs. those who used landmarks. Those who used landmarks built better mental maps than those relying on digital assistance. Another looked at people taking pictures of museum pieces vs. those who simply looked at them. Those with cameras had worse recall of the details. Anyone who has been to a school program, assembly, concert, or recital in the last decade has to wonder about the people who are experiencing the entirety of their children’s lives through the screen of an iPhone. Are they really seeing what is going on, or are they more focused on getting the perfect video? Regardless, I long to attend events without people holding phones and tablets in the air, blocking everyone’s view.
The article also mentions a 2011 paper titled “Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips.” It showed that people are more prone to think of how to find information than to be able to remember it. As someone who deals with tremendous volumes of complex information, the ability to look things up instantaneously is a great asset. On the other hand, if it’s making us somehow less able to retain and recall information, it might not be so great.
One researcher talks about being selective regarding the use of technology. For tasks that are going to be done multiple times, it’s better to learn the information. For one-and-done type work, it might be OK to leverage technology. A non-tech example: those of us from the days of the dinosaurs had to memorize our multiplication tables and regurgitate them on 60-second “timed tests” rather than calculating the numbers each time. No one wants to have to use a calculator to figure out 7×6.
You can easily identify people who haven’t figured out how to successfully leverage technology. They’re the ones who repeatedly ask you questions that fall into the “let me Google that for you” category. They’ve been habituated to needing external resources to figure out even small things. Frankly, I’d be glad for some of these folks to use technology as their primary resource rather than waste their employers’ consulting dollars asking me for basic information because it’s easier to ask someone else than to leverage their company’s intranet, personnel manuals, and policies and procedures.
These are the kinds of basics I’m having to work on at some of my client sites. I recently taught a class on the successful integration of instant messenger into the clinical office to improve patient care rather than detract from it. People don’t inherently know when they should use IM, when they should use email, or when they should simply talk to one another. They need to understand the right use of each modality and then solidify it with documented processes for patient care. Unless you address it head-on, it will continue to cause chaos. I never thought I’d be teaching these kinds of skills, let alone teaching them to physician peers. It’s part of the evolution of technology and healthcare, though, and if a practice is savvy enough to ask for help, I’m certainly glad to provide it.
What’s the most egregious example of multitasking you’ve ever seen? Email me.
Email Dr. Jayne.