State infection accounting faces daunting challenges
When California health officials recently announced a 10 percent annual drop in hospital central line infections, they did not mention that they had found flaws in the facilities’ reporting of an illness that can kill up to one in four infected patients.
In fact, the state’s own fact-checking of records from one-quarter of hospitals statewide had uncovered a series of errors, including an overall 38 percent undercount of central line associated bloodstream infections. In response, state officials asked hospitals to correct the data.
Such infections strike the sickest hospital patients, those who must be fed or medicated with tubes that doctors insert in large veins near their hearts. An estimated 41,000 U.S. hospital patients each year suffer central line infections, often caused by dirty hands or equipment, the federal Centers for Disease Control and Prevention reported last year.
The reporting mistakes uncovered in California highlight a major question looming in the current national rush to improve the quality of patient care: How can states accurately count the untold number of infections that occur in thousands of U.S. hospitals, amid a tangle of differing definitions, counting techniques and plain human error?
Never before has the accuracy of such numbers been so important. Consumers can now choose among their local hospitals using brand-new rating systems posted on the internet that rely on the same database used to record infections in California.
The stakes are huge for hospitals, too. Under the 2010 health care reform law, a new federal program kicks off this October that ties Medicare payments to the quality of hospitals’ patient care. The goal over 10 years: an estimated $50 billion savings in Medicare costs.
Starting in 2015, hospitals with high central line infection rates could lose a portion of their Medicare funding under the new program, called value-based purchasing.
Judging hospitals fairly will hinge on the accuracy of the same data that California and its hospitals are grappling with today.
California is relatively new to reporting infections, trailing by years a number of other states that discovered similar errors and learned to fix them. The state’s program is a work in progress, and its quality will improve with each annual report, say regulators and hospital officials alike.
“It’s an ongoing learning process. We’ve always said that this is going to take time,” said Jan Emerson-Shea, spokeswoman for the California Hospital Assn.
The massive 2011 report that the state Department of Public Health released Aug. 9 contained data on five types of hospital-acquired infections, or HAIs, for virtually all of the nearly 400 general acute care hospitals statewide.
The report was the third produced by California, which, under a 2008 law, became the 28th state to require such reports.
To hone its work, California sought $600,000 in funds through the federal Centers for Disease Control and Prevention to validate 2011 reporting from 100 hospitals that volunteered for the project. The state sent one or two infection experts to each hospital in summer 2011 to review records and confer with the staff.
“The goal is not to go into a facility and say, ‘Gotcha,’” said Dr. Arjun Srinivasan, CDC associate director for health care-associated infection prevention programs. “What we’ve seen is if you do this in a collaborative way, if you use it as a teaching tool, the reporting gets better.”
State officials wrote in an email note last week, “This project was done to assess and assist hospitals with surveillance and reporting. Improving the quality of HAI data will also reduce HAIs.”
Hospitals welcomed the validation project, and more volunteered than the state had slots for, Emerson-Shea said.
“The fact is, we have hospitals volunteering to work with the state, opening themselves up, being willing to hear they’re not doing it right and getting guidance on how to do it better,” she said.
News of the validation study was first reported by California Watch in an Aug. 10 article.
One researcher who has worked extensively with state data said officials should have disclosed the study.
“If they had the data at the time they released the report, they should have had a section on the validation, say, in the technical report,” said Dr. David Zingmond, an associate professor at the UCLA David Geffen School of Medicine who has done research on hospital quality and the epidemiology of health care.
A state spokesman said that the validation work was indeed made public, citing a link on a department web page under a section labeled “Information for Infection Prevention Programs.”
The link leads to a slide show that state officials presented to hospital officials and others in 17 cities from May to July of this year. The 38 percent undercount of line infections can be calculated from raw numbers shown on slide 23.
“The 38 percent is a disappointing number, because they could have done better,” Zingmond said after reviewing the slides.
And the reported 10 percent decrease?
“I would say that there was a decrease,” Zingmond said, “but quantifying it is harder due to the undercount.”
He urged that the state repeat the validation work.
“You have to keep going back to see how well they’re doing their reporting,” he said.
Experts say that medical workers can largely prevent line infections by following checklists to assure cleanliness and by removing a line promptly when a patient no longer needs it.
In fact, health officials measured a 58 percent decrease between 2001 and 2009 among central-line patients in intensive-care units, the CDC reported, estimating that such decreases saved up to 27,000 lives and $1.8 billion in medical costs.
The state’s 2011 validation study also turned up errors in the reporting of cases of Clostridium difficile (C. diff), vancomycin-resistant enterococci (VRE) and methicillin-resistant Staphylococcus aureus (MRSA), although the central line undercount was the largest.
The study was financed with part of a $2.6 million, 27-month federal grant to the state, funded under the 2008 American Recovery and Reinvestment Act.
The CDC grants did not require states to make their validation results public. Several states have, however, and some, including New York and Connecticut, published their results in peer-reviewed journals. New York and Colorado health officials disclosed the results in their annual reports.
Utah, by contrast, is just getting started. The Salt Lake Tribune reported last Thursday that hospitals there have disclosed 16 infection cases since a reporting mandate began last summer, and the hospitals’ names are kept confidential.