Making sense of hospital rankings
My friend Chris loves cycling, both for the exercise and the stress relief it provides. Even during this brutal winter, he rode his bike nearly every day and posted photos of himself to prove it. Wearing a ski mask under his ultra-cool lighted helmet, he would sometimes pose in front of the Washington Monument, the Capitol, or some other famous DC landmark. Last year was a rough one for him, though. Like so many others, he was struck while riding in the bike lane by a car door flung open by an inattentive driver. His bike was demolished, and Chris suffered extensive road rash, bruises, and a badly broken arm.
Replacing his bike was easy. Chris solicited advice from fellow riders on social media, perused reviews on cycling websites, and consulted local clubs. He was able to outline his criteria, including budget, and aggregate the data he had collected to identify a few finalists. Evaluations focused on consistent issues, enabling apples-to-apples comparisons. Then, using a similar approach, he identified several excellent bike shops. After visiting each, he found the right shop to sell him the right bike and was back out on the road—taking more selfies.
Fixing his broken arm wasn’t as easy. He had a list of orthopedic surgeons near him, but no idea how to evaluate them. Information from social media sites gave him a sense of what the service experience at those practices was like, but offered little guidance on whether the doctors achieved good outcomes. Rankings in area magazines gave little indication of how they arrived at their recommendations. At this stage, many consumers give up and simply follow the advice of someone they trust (their primary care doctor, a friend, their mother), irrespective of what that advice is based upon. Chris wound up calling me, and I recommended someone I had known for many years, had worked with in the operating room, and had gone to myself when injured. While my advice was based on significant insider experience, it still was not based upon outcomes data. Thankfully he was pleased, had a very good outcome, and was back out riding on his beautiful new bike.
Last week Chris’s world was turned upside down when he was diagnosed with prostate cancer. With so many treatment options to choose from, guidance was once again essential. While he might have been happy to rely on friends’ recommendations for orthopedic surgeons, finding the best cancer treatment was far more complex. Fortunately, national data is readily available to make the process simpler, or so he thought.
Several organizations rank facilities nationally. Recently, Health Affairs1 compared these rankings, and the results were disconcerting. Rankings varied widely from group to group, with many “high performing” hospitals on one list landing among the worst performers on another. The study found that of the 844 hospitals rated “high performing” by one rating system, only 10 percent were rated “high performing” by any of the others, and no hospital was ranked “high performing” on all four national rating systems. Adding to the confusion, the rating systems even defined “safety” differently.
The struggle for many consumers is to navigate what each group means by “best” or “worst.” Part of the reason is differing methodology; another is differing focus (health outcomes versus patient safety versus patient satisfaction).
Like so many others facing a serious health concern, Chris has some important decisions ahead of him. Despite this diagnosis, he is actually lucky. He has wonderfully supportive family and friends, and he has people he can turn to for expert advice when he needs it to help make sense of confusing and conflicting information. I’m confident that he will overcome this challenge and next year his biggest concern will again be car doors.
The question for the industry is how to help the rest of the population make informed decisions when confronting critical health decisions. Ranking and rating programs will need to address consumers’ needs more clearly, directly, and transparently. The differences among them must be obvious, as should the reason why one might consult one provider versus another. People are savvy enough to appreciate that a provider who is great in one area may not be as good in another. Therefore, blanket assessments of “best” or “top” offer little utility.
The Centers for Medicare & Medicaid Services (CMS) is working to make quality data more widely available through sites like Physician Compare,2 and industry rankings strive to give consumers tools to choose high-quality care. However, greater consistency is needed to allow for apples-to-apples comparisons. And, beyond information on service and patient experience, information on outcomes can help consumers select a setting to receive their care. With a thoughtful, patient-centered approach to reporting, the true value of these studies can be realized.