Now that 2020 is here, CMS has opened the data submission period for the 2019 MIPS program. Eligible clinicians can submit their Quality Payment Program data until March 31, 2020. If you haven’t done this before, there are a variety of systems you have to register with to create your profile and submit your data, so be sure to visit the QPP website if you’re having difficulty. I’m exempt from individual participation and our group is opting out again this year, so that’s a relief. You can check the status of your favorite providers by using the CMS Quality Payment Program Participation Status Lookup Tool.
The new year also brings with it the California Consumer Privacy Act, with its provisions effectively extended to the rest of us, since big businesses aren’t going to maintain separate California-facing websites and different ones for everyone else. It gives consumers expanded knowledge of what personal data is being collected, how it is being used, and the right to say no to its sale. It’s good for people to be more aware of how their data is being used, especially since so many people willingly give up their data without even thinking about it.
Even seemingly innocuous sharing using fitness sites can provide a wealth of information about people’s habits and movements. I’ve seen plenty of people overshare information about their children on social media, not thinking of how it might affect them when they’re older, but hadn’t thought about consumer-based genetic testing for children. A recent New York Times opinion piece addresses this, posing questions about parents sharing their children’s DNA profiles online. Apparently sending your kids’ swabs to 23andMe and sharing the results online is a thing.
I got a much-needed laugh during a clinical shift the other day. Apparently someone stuck a magnet to the inside door frame of one of our exam rooms. It wasn’t from the beach or something inspirational, but rather an ad for one of our competitors. Bold move and well played, but we did transfer it to the round file.
Less funny were the patients who came in with adverse effects of marijuana, given the recent legality of recreational purchases in Illinois. Not only did the patients get hit with nearly 25% tax, but also a hefty urgent care co-pay. As I’ve already put in several patient plans this year, lay off the weed, folks.
Amidst everything else going on in the world right now, this week the White House proposed guidelines regarding the regulation of artificial intelligence in healthcare, transportation, and other private sector industries. The general principles of “fairness, non-discrimination, openness, transparency, safety, and security” were mentioned, but in a general way. A memo from the acting director of the Office of Management and Budget warned about the perfect being the enemy of the good, stating that “Agencies must avoid a precautionary approach that holds AI systems to such an impossibly high standard that society cannot enjoy their benefits.” It remains to be seen how the principles will be specifically implemented or how much focus this will receive given other regulatory priorities.
Pet peeve of the week: use of the word “solutioning.” I’ve heard it three times this week in three venues, which makes me wonder if something is triggering increased use. Offending sentences included: “Let me work with the team to see what we can solution for you” along with “We’ll be doing some solutioning on this problem Friday and will keep you posted.” Sounds wordy and awkward to me, but I’d be interested to hear from others who think it’s a great word to use in this way.
Around the physician lounge this week: There was a study in the journal Pediatrics about the problem of “low-value care,” especially in the pediatric population. Researchers were specifically looking at whether children with public insurance (Medicaid) were more likely to receive unnecessary medical services than those with private insurance. They looked at data for over 8 million children across 12 states and found that one in nine publicly-insured patients vs. one in 11 privately-insured patients received so-called “low-value” services, meaning that they were either unneeded or unlikely to improve the patient’s situation. Either way, close to 10% of pediatric patients are receiving wasteful care.
The authors looked at a group of 20 low-value tests and treatments, many of which I see requested in practice: antibiotics for colds, unneeded x-rays, unneeded medications, etc. It’s difficult to explain to parents (and to adults when they are the patients) that sometimes doing less is more, and those explanations take precious time that providers often don’t have, so the cycle perpetuates itself. Clinical decision support rules and other technology can help us identify low-value care, but they don’t do much to help explain why we’re saying no. Perhaps some brilliant developer could create a virtual reality game that tours through “all the bad things that can happen when providers give in to unrealistic patient requests” that might make an impact. It should include a scary section where the player goes bankrupt due to wasteful spending.
Another potential game element could be the downward spiral that occurs when unneeded tests lead to a medical wild goose chase. This was mentioned in the Washington Post and I see it all the time when we order a panel of blood tests (because they all come on a convenient CLIA-waived cartridge testing system) rather than the single element we’re looking for. Something comes up out of the normal range, which doesn’t mean that it’s even abnormal, and more visits and consultations and tests are needed to work through it because everyone is worried about missing something or getting sued. The Post piece mentions unneeded testing done prior to cataract surgeries, which can lead to cascades of extra services.
I think this is one area where artificial intelligence might really be able to help – to assist us in learning what these not-normal but not necessarily concerning results truly mean across large populations, vs. us always having to go down the rabbit hole trying to figure out their significance.
The article has some gripping stories, such as the patient who had their kidney removed for what turned out to be a piece of fat, and whose remaining kidney then failed. It also mentions the frustration felt by providers in these journeys. Physicians are also subject to cognitive bias (such as memories of when they previously “caught” something unusual) fed by anecdotal stories as well as personal experiences. These are exactly the elements that clinical decision support is designed to combat, but too often the physicians I spoke with were suspicious of the data behind such systems or whether use of that data would be defensible if they missed something significant and were sued.
The discussion also veered into the direct-to-consumer realm and some of the self-directed testing that is out there. Patients can now order large panels of tests, including genetic tests, without any kind of counseling or advice first. These can lead to significant anxiety on top of the costs. There’s certainly variability in the services offered and the degree of physician involvement with some of these efforts. However, as long as there’s a buck to be made and patients are willing to pay for it, I don’t see them going away any time soon.
Do you think that healthcare IT can truly have an impact on the delivery of low-value services? Leave a comment or email me.
Email Dr. Jayne.