Infection ratings: Did your hospital earn a white dot?
This blog post was updated on March 8, 2012.
Last week, a friend of mine was showing her Cal State Long Beach journalism students how to check physicians’ credentials online when a student’s question stopped her short.
Where is the website, the student asked, that grades the hospitals where those doctors work?
The simple answer: The web abounds in hospital ratings these days, most of them based on consumer comments rather than hard data.
Readers of Yelp, for instance, can rate not only the local Chinese takeout place but also the hospital up the street. Long Beach Memorial Medical Center, generally respected among its peers, earned only two and a half gold stars out of five on Yelp. It received plenty of comments like “AWFUL” and “NEVER,” a complaint about a broken parking pay machine, and a few kind words for its cafeteria’s Chinese chicken salad.
These comments have value, of course, but they don’t reflect the views of the regulatory experts charged with assessing health care quality.
Fortunately, Cal State Long Beach students don’t have to rely solely on Yelp to judge their local hospitals.
The national push for health care transparency has produced several high-quality sites that use credible data from government regulatory agencies.
I told my teacher friend about a few well-known sites, including Medicare’s Hospital Compare, but also about a new report on hospital infections on the website of the California Department of Public Health.
Borrowing from Yelp and Consumer Reports, it’s one of the most readable reports that I’ve seen from an agency rightfully criticized in the past for technical, impenetrable infection reporting.
This year, state infection experts rethought how to inform consumers about the incidence of central line-associated bloodstream infections at hospitals across the state in the year ending in March 2011.
For seriously ill patients, these lines are critical for delivering food and medication. But they can turn into conduits for bacteria, spawning infections with a 25-percent mortality rate.
Such infections are relatively easy to prevent, however, and some consumer groups use line infection rates to measure the rigor of a hospital’s infection control program.
The new report grades each hospital with colored dots:
White dots show that a hospital’s rate is significantly better than the state norm.
Black dots show that the rate is significantly worse than the norm.
Gray dots (circles with specks inside) indicate that the rate is statistically no different from those of other hospitals.
The system allows consumers to check out a hospital’s performance at a glance.
Cedars-Sinai Medical Center in Los Angeles excelled in the report, winning two white dots and 14 gray “average” dots. Cedars has targeted central line infections in a facility-wide campaign called “Zero is the Greatest Number,” as I wrote about last year in the Riverside Press-Enterprise.
Long Beach Memorial received one black dot and seven “average” dots in the eight categories in which it was rated. The black dot marked its rate in oncology care, where patients are among the most prone to infection.
The hospital could do better. Still, that performance beats many hospitals in Southern California, and the rating rests on far more statistically precise footing than those two-and-a-half stars and “AWFUL” comments on Yelp.
UPDATE: A reader asked why hospitals don’t have dots in every category. That’s largely because the state collects its infection data according to the types of patient care offered by each hospital — whether in a trauma unit, neonatal unit, burn unit, and so on. If a hospital doesn’t have a burn unit — and most hospitals don’t — it won’t have a dot in the burn category.
By the way, readers can see details about infections in each category by clicking on the links at the top of each page.