Why do doctors ignore hospital rankings?

Some experts think physicians rely too much on personal connections and reputations when admitting patients

Last year, the Centers for Medicare & Medicaid Services (CMS) unveiled a new and controversial rating system for hospitals called the Overall Hospital Quality Star Rating. The agency used data from 64 of the approximately 100 quality measures already reported on its Hospital Compare website to assign 3,662 hospitals ratings of one to five stars, with five being the best. The ratings will be updated quarterly.

While the ratings are intended primarily for patients, patient advocates say physicians should also consult them when deciding where to admit patients.

However, research and anecdotal evidence indicate that most physicians don’t, which raises the question: with this rating system and others like it now available, why don’t more primary care physicians use them when deciding where to admit patients?

Physicians are too reliant on personal contacts, their own hospital network and a hospital’s reputation among physicians when making these decisions, says consumer advocate Steven Findlay, author of a recent Health Affairs paper on hospital ratings.

“There is a lot of information about outcomes and quality of care that is, at this point, probably more reliable than physicians’ own personal knowledge of specialists and hospitals,” he says. 

For example, CMS gathers data for its quality measures from its hospital inpatient and outpatient quality reporting programs. The 64 measures used in the star rating system are assigned to seven different categories, and a hospital’s overall score is the weighted average of its category scores.

The categories of mortality, safety, readmission and patient experience are each assigned a 22% weight. Effectiveness of care, timeliness of care and efficient use of medical imaging are each weighted 4%. CMS then uses an algorithm to translate that summary score into a star rating.
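
To make the weighting concrete, the sketch below (in Python, which the article itself does not use) computes the summary score as a weighted average of hypothetical category scores. The variable names, the example scores and the renormalization for missing categories are illustrative assumptions; the step that translates the summary score into a star rating, which the article describes only as “an algorithm,” is not reproduced here.

```python
# A minimal sketch of the weighted-average step described above.
# The category scores below are made up, and the handling of missing
# categories is a simplifying assumption, not necessarily CMS's exact rule.

CATEGORY_WEIGHTS = {
    "mortality": 0.22,
    "safety": 0.22,
    "readmission": 0.22,
    "patient_experience": 0.22,
    "effectiveness_of_care": 0.04,
    "timeliness_of_care": 0.04,
    "efficient_use_of_imaging": 0.04,
}

def summary_score(category_scores):
    """Weighted average of a hospital's category scores.

    If a hospital lacks data for a category, this sketch simply
    renormalizes the weights over the categories that are present.
    """
    present = {c: w for c, w in CATEGORY_WEIGHTS.items() if c in category_scores}
    total_weight = sum(present.values())
    return sum(category_scores[c] * w for c, w in present.items()) / total_weight

# Hypothetical standardized category scores for one hospital
example = {
    "mortality": 0.5,
    "safety": 0.1,
    "readmission": -0.2,
    "patient_experience": 0.3,
    "effectiveness_of_care": 0.0,
    "timeliness_of_care": -0.1,
    "efficient_use_of_imaging": 0.2,
}
print(round(summary_score(example), 3))  # single summary score, later mapped to stars
```

Because the four outcome-oriented categories together carry 88% of the weight, a weak showing on mortality or readmissions moves a hospital’s summary score far more than, say, imaging efficiency does.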

But for some physicians, often-conflicting hospital ratings may not seem useful. In a 2015 study published in Health Affairs, no hospital was rated a high performer by all four of U.S. News & World Report, Consumer Reports, HealthGrades and the nonprofit Leapfrog Group, and Consumer Reports and U.S. News did not agree on any of their top hospitals.

Three of these organizations draw on CMS quality data but each uses its own statistical methodology, and U.S. News adds an opinion survey of physicians. Leapfrog uses both publicly reported data and a survey of hospital executives for its quality data.

CMS’s new system may only add to the confusion. Only 102 hospitals received the top rating of five stars, and few of the top-rated hospitals are considered the nation’s best by the private rating organizations, according to a review by Kaiser Health News. Nearly 40% of all rated hospitals received three stars.

What may be the only study of how physicians use hospital ratings was published in 2012 in the Journal of Hospital Medicine. Researchers invited nearly 200 primary care physicians affiliated with three Massachusetts hospitals to participate in an anonymous online survey. Nearly half responded and of those, only 61% were aware of hospital quality reporting, 16% were familiar with Hospital Compare, and “no physicians reported ever using quality information to make a referral decision or discussing it with patients,” according to the article.

Rankings don’t sway physicians 

It appears that not much has changed since 2012. Some primary care physicians tell Medical Economics that they simply find hospital ratings irrelevant. Keri Erland, MD, an internist in solo practice in Boise, Idaho, says she is so busy keeping up with the day-to-day demands of running her practice that she doesn’t have time to look up hospital quality ratings.

Besides, she adds, a patient’s insurance plan may dictate the hospital to which she admits the patient. And when that is not the case, Erland says, she’s practiced long enough in Boise to know its two main hospitals’ strengths and weaknesses. 

Internist James Rommer, MD, a solo practitioner in Livingston, New Jersey, also says he rarely pays attention to hospital quality ratings, and never considers them when sending patients for specialty care. Instead, he relies on a hospital’s reputation in the medical community and his own experience. “And I’ve never regretted it. The care that my patients get at the hospitals I send them to is very good,” he says. 

Internist John Erickson, MD, who practices in Portland, Maine, says the ratings are an imperfect attempt to assign a number to something that is difficult to quantify. In particular, a large, tertiary-care teaching hospital should not be compared to a smaller community hospital, he says. 

“A smaller hospital may get higher marks because it’s able to provide more personal care and the acuity may be less. The most acute and complex cases are sent to the larger hospitals,” Erickson says. Portland’s medical center received two stars on CMS’s Hospital Compare, while a much smaller hospital in the city received four stars.

Erickson says he does not consult Hospital Compare when admitting patients. The only reason he might send someone to the city’s smaller hospital is if the patient insists on privacy, because the smaller hospital has all private rooms, he says, unlike the larger medical center, where he is on contract to help run the residency program. 

Internist Jason Goldman, MD, of Coral Springs, Florida, says hospital ratings “aren’t worth the paper they’re written on. The data is not well collected, I don’t trust the metrics, and I don’t believe that they give a true reflection of the quality of a hospital.” 

For example, if too many patients do not have their blood pressure under control, or too few eligible new mothers are breastfeeding, a hospital’s score could suffer, he notes. Yet these factors often are beyond the hospital’s control, says Goldman, who chaired his local hospital’s quality committee for seven years through 2015.


Ratings rebuffed

These criticisms mirror those from many hospitals and their trade associations. In a July 2016 statement, the Federation of American Hospitals said the methodology behind the CMS star ratings “has important defects,” such as not adequately accounting for the significant differences between small and large hospitals, teaching and non-teaching, and those serving affluent or middle-class populations and those providing care in poorer areas. 


That same month, the American Hospital Association (AHA) issued a statement calling the CMS star ratings “confusing” and “not ready for prime time.” The ratings are “one of many tools for physicians and their patients,” said Nancy Foster, AHA vice president of quality and patient safety, in an email.

Hospitals have legitimate complaints and quality report cards can be improved, but the perfect should not be the enemy of the good, counters Findlay. 

CMS officials declined to discuss Hospital Compare and how physicians might use it. But in a July blog post, Kate Goodrich, MD, MHS, director of CMS’s Center for Clinical Standards and Quality, noted that the star ratings are adjusted to account for the “illness-burden” of the populations that hospitals serve.

But Goodrich didn’t rule out further adjustment for socioeconomic status. CMS is studying the impact of socioeconomic status on the quality measures used to create the star ratings. 

As for patients taking the initiative, Goldman says he’s never had a patient ask about hospital quality ratings. “Fortunately, at least so far, they trust their doctor to guide them in their healthcare,” he says. For the most part, neither has Rommer. Very rarely will a patient bring up the U.S. News & World Report rankings, he says. 

This is consistent with a 2015 Henry J. Kaiser Family Foundation public opinion poll. Just 17% of those polled said they saw hospital quality comparisons in 2014, and only 4% said they used that information to make a choice of hospital.

Erickson says patients have asked about hospital quality ratings, but only after a news story, such as when the local public radio station reported the medical center’s two-star rating on Hospital Compare.

“I tell patients that it does measure stuff but not necessarily the most important things,” Erickson says. He tells them that he thinks the medical center’s quality is not well represented by the star system and that the rating doesn’t dissuade him from suggesting that the medical center is where they ought to be admitted. 

“I haven’t had any pushback,” he adds.  
