Time Capsule: "Best", "Most Wired", and Other Hospital Surveys: Good for Selling Stuff and Not Much Else
I wrote weekly editorials for a boutique industry newsletter for several years, anxious for both audience and income. I learned a lot about coming up with ideas for the weekly grind, trying to be simultaneously opinionated and entertaining in a few hundred words, and not sleeping much because I was working all the time. They’re fun to read as a look back at what was important then (and often still important now).
I wrote this piece in July 2007.
"Best", "Most Wired", and Other Hospital Surveys: Good for Selling Stuff and Not Much Else
By Mr. HIStalk
US News and World Report released its Best Hospitals 2007 list last week. Indecisive or brain-dead folks who can’t choose a movie, restaurant, or college now have yet another life decision they can outsource to faceless reporters who will tell them what to do for just the cost of a magazine (anyone who’s worked around reporters would question whether that’s a good idea).
Few readers will care how "Best" was determined. Answer: a few hundred doctors were surveyed, Medicare mortality was reviewed, and two-year-old AHA survey results were picked through to see who had cool technology and who was really busy. Some kind of unstated weighting was applied and, voila! the Best Hospitals were extruded out the other end.
Best Hospitals isn’t as blatantly biased toward a specific industry sector as the annual Most Wired Hospitals ("Buy more of our products so you can get on our list"). They’re just trying to sell magazines, not multi-million dollar IT gadgets. But what does "best" mean when it comes to hospitals? Is there such a thing?
It doesn’t mean much to the average hospital patient, as far as I can tell from those three survey sources. Doctors don’t know anything about most hospitals first hand, so that’s just a popularity contest. Medicare mortality may be relevant (or not) if you’re a senior citizen, but not so much if you’re an expectant mother or trauma victim. Using old AHA survey results to create a brand new conclusion seems iffy.
Not surprisingly, the physician "reputation points" question ensures that only big academic medical centers make the list. It’s like the Best Colleges edition: the only drama is whether Harvard, Stanford, or Princeton will bag the #1 spot in a given year. In fact, the Best Colleges issue itself influences reputation, so maybe the only thing that will ever change is the order of finish.
Rankings aside, you don’t know if your kid will get a better education or a faster meningitis cure just because you picked the Best instead of the local places that only non-magazine reading rubes would patronize. Nobody knows. It’s not predictive. But one thing’s sure: it helps sell magazines.
The bottom line is that we don’t really know, for a given individual or condition, which hospital is best. We don’t even know if it matters which one you go to. Maybe it’s your doctor, your faith, your preventive care, or your genes that have the most effect on whether you walk out happy or not. Whose building you sleep in may not play as major a role as we hospital types would like to think (except avoiding those that are prone to killing patients with hospital-acquired infections or medical mistakes). Big hospitals have their share (maybe disproportionately so) of medical errors and poor outcomes.
The best hospitals (or, more precisely, doctors who practice in them) do just one thing obviously better: diagnosis. After that, I’m not convinced there’s much difference. In fact, in the typical giant medical center run by a liberal academic parent, you’re apt to find hordes of geeky diagnosticians wearing bow ties and vast armies of lower-ranking types who are likely to miss your meds and take their time responding to your call button (I like to think it’s because they’re not scared of an employer that deals with incompetent professors by offering them lifetime employment through tenure).
Juxtaposed with this story was hardly shocking news: according to a study, electronic medical records don’t improve patient care. Well, actually, that’s what the headlines said. What the study found was that EMRs didn’t improve compliance with standard practices that ought to improve care. The biggest shock to me was that somebody apparently thought they should. You can buy the golf clubs Tiger Woods uses, but that doesn’t mean you’ll play better golf. When it comes to EMRs, the hopes of the naive apparently needed dashing.
When it comes to what’s most important – whether you, specifically, will walk out of a hospital alive and well – maybe some reporter’s Best Hospitals or Best EMR or Most Wired lists don’t really make much difference. It’s not that easy. That’s a tough message for a data-driven, standardization-obsessed, sometimes sheep-following industry to hear, but I think it’s true.