Freeman Spogli Institute for International Studies Center for International Security and Cooperation Stanford University


CISAC News



February 1, 2006 - CHP/PCOR News

Quality Indicators' broad use fulfills promise of translational research

By Sara L. Selis

While university research typically culminates in published papers and conference presentations, the Quality Indicators project at CHP/PCOR has produced something more: a set of practical tools used by health systems, insurers, public health agencies and employer coalitions to screen for potential quality problems and to flag possible overuse or underuse of recommended care.

Since their release in 2002, the indicators -- developed by the Stanford-UCSF Evidence-based Practice Center under a contract with the federal Agency for Healthcare Research and Quality -- have evolved considerably in terms of their dissemination, technical sophistication, ease of use and user support.

Back in 2002, relatively few people beyond hospitals' quality improvement departments were aware of or using the indicators. Today, reflecting the widespread interest in healthcare quality improvement, the QIs are used by hundreds of hospitals, medical groups, health plans, state health departments and employer coalitions -- even by automaker General Motors. Five of the indicators have been endorsed by the National Quality Forum, a prominent nonprofit group that is crafting a national strategy for healthcare quality improvement. Some of the indicators are being incorporated into quality measures developed by the Centers for Medicare and Medicaid Services, the National Committee for Quality Assurance, and others.

"It's exciting to know that these tools we developed are out there, being used by hospitals every day," said CHP/PCOR executive director Kathryn McDonald, who leads the Stanford-UCSF Evidence-based Practice Center and the Quality Indicators project.

The QIs consist of three sets of measurements: the Prevention Quality Indicators -- aimed at detecting hospitalizations that might have been prevented with proper outpatient care; the Inpatient Quality Indicators -- which highlight potential problems with the quality of inpatient care; and the Patient Safety Indicators -- aimed at identifying potentially preventable complications and medical errors in inpatient care.

While the indicators were originally developed for internal quality improvement purposes, their use has now expanded to include public reporting of healthcare data. State health departments in eight states, for example, are using the indicators to produce quality "report cards" that give consumers comparative information on hospitals' case volumes, complication rates and mortality rates for scores of medical conditions and procedures, including heart bypass surgery, hip fracture, stroke, pneumonia, gastrointestinal hemorrhage, postoperative sepsis, and admissions for diabetes complications.

User support for the indicators has improved considerably in recent years, thanks to a support team led in part by CHP/PCOR researchers. The team answers users' questions about how to run the QIs (a task that requires statistical software) and how to interpret the results. The team also collects users' feedback and suggestions, and based on that input has refined the indicators, improved the supporting documentation, and introduced new features.
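At their core, indicators like these reduce to simple rates computed over hospital discharge records: a numerator of discharges flagged with the event of interest, divided by a denominator of discharges in the eligible population. The sketch below illustrates that arithmetic only; the field names, codes and data are invented for illustration, and the actual AHRQ QI software applies far more detailed inclusion/exclusion rules and risk adjustment.

```python
# Hypothetical sketch of the arithmetic behind a single quality indicator.
# Real indicator definitions involve detailed code lists and risk adjustment.

def observed_rate(discharges, flags_event, in_population):
    """Numerator: eligible discharges flagged with the event of interest.
    Denominator: all discharges in the eligible population."""
    denom = [d for d in discharges if in_population(d)]
    numer = [d for d in denom if flags_event(d)]
    return len(numer) / len(denom) if denom else 0.0

# Toy administrative records: diagnosis codes as drawn from billing data.
discharges = [
    {"age": 67, "codes": {"414.01", "998.59"}},  # complication code present
    {"age": 54, "codes": {"414.01"}},
    {"age": 72, "codes": {"410.11"}},
    {"age": 49, "codes": {"428.0", "998.59"}},   # complication code present
]

rate = observed_rate(
    discharges,
    flags_event=lambda d: "998.59" in d["codes"],  # illustrative event code
    in_population=lambda d: d["age"] >= 18,        # adult discharges only
)
print(f"observed rate: {rate:.2f}")  # 2 flagged of 4 eligible -> 0.50
```

Because the inputs are routinely collected billing records rather than chart abstractions, a measurement like this can be run across an entire hospital system with little added effort, which is part of what makes the indicators practical as a screening tool.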

Over the past two years, for example, the QI team has been developing a Pediatric Quality Indicators module, in response to requests from clinicians and policymakers who wanted data tailored to pediatric populations. The first set of the Pediatric QIs, based on the existing indicators, will be released by spring 2006. A second set, composed of new indicators, will be released in 2007.

"This project is novel because we have a feedback loop to the research team," McDonald said. "We get to hear directly from the hospitals and other users about how they use the indicators and how they might be improved."

That kind of feedback was conveyed to the QI team in person on Sept. 25 and 26, when the inaugural AHRQ Quality Indicators Users Meeting was held in Rockville, Md. The meeting -- organized in part by McDonald and by Quality Indicators project manager Sheryl Davies at CHP/PCOR -- was attended by more than 100 representatives of healthcare systems, hospital associations, state health departments, healthcare payers and purchasers -- a stronger-than-expected turnout.

"The conference was a fantastic experience," Davies said. "It was a great opportunity for users to ask us questions, give us ideas, and help us identify the next steps in our research." She said the team members were pleased with the interactive discussions that took place, and the positive feedback they received on the indicators.

At the meeting, Carol Munsch -- regional director of clinical data for the nonprofit Covenant Healthcare System in Milwaukee, Wis. -- discussed how the five-hospital system routinely uses the Quality Indicators to identify potential quality problems, compare itself against competitors, and monitor its progress on quality improvement efforts.

A few months ago, for example, Covenant implemented Rapid Response Teams -- groups of nurse specialists and ICU and emergency medicine experts trained to intervene early and aggressively when a patient begins showing signs of decline. It has been documented that hospitals often fail to respond quickly enough to such declines; the patient deteriorates further until suddenly "crashing" with a life-threatening episode such as respiratory failure or renal failure.

Covenant Healthcare wanted to track how often its Rapid Response Teams were being used, and how this affected patients' outcomes. "Typically that would be very labor-intensive to track -- we would have to manually review the patient charts," Munsch explained, "but running the QIs was a relatively fast and easy way to measure our progress."

When Covenant compared the number of patient "crashes" before and after implementing Rapid Response Teams, "we saw a tremendous improvement in patient outcomes," Munsch said. "We saw that because of this intervention, we were able to save lives." She called the Quality Indicators "valuable tools that help us look at care delivery and quality in our hospitals."

Munsch said the QIs have also helped Covenant identify coding problems at its hospitals. When the system recently ran the Patient Safety Indicators for obstetric complications, the results showed that one of its hospitals had an unusually high rate of such complications. After a team of clinicians and quality improvement personnel investigated, they found that the hospital did not actually have a higher complication rate; rather, a couple of medical record coders were being overly aggressive -- recording minor complications as something more serious, for example. The coders were given training on proper OB coding and the issue was resolved.
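The screening step Munsch describes amounts to comparing each hospital's indicator rate against a system-wide benchmark and flagging outliers for manual chart review. A minimal sketch, with invented hospital names, rates and threshold (a real analysis would use risk-adjusted rates and a statistically grounded cutoff):

```python
# Hypothetical sketch of outlier screening across a hospital system.
# Data and the 1.5x threshold are invented for illustration only.
from statistics import mean

hospital_rates = {
    "Hospital A": 0.021,
    "Hospital B": 0.019,
    "Hospital C": 0.023,
    "Hospital D": 0.048,  # unusually high -- quality problem OR coding artifact
    "Hospital E": 0.020,
}

system_rate = mean(hospital_rates.values())

# Flag any hospital whose rate exceeds 1.5x the system-wide rate;
# flagged sites then get a manual review of the underlying records,
# which is how Covenant traced its outlier back to overly aggressive coding.
outliers = [h for h, r in hospital_rates.items() if r > 1.5 * system_rate]
print(outliers)  # ['Hospital D']
```

The key point, as the Covenant example shows, is that an outlier rate is only a signal: the follow-up review of actual records determines whether it reflects care delivery or the coding of the administrative data itself.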

At the September users meeting, several sessions focused on how to use the QIs for public reporting of healthcare quality data. This is a challenging area, say project investigators, because the QIs were not originally designed for this purpose. They note that because the indicators are based on hospitals' administrative data (primarily coding and billing records) rather than clinical data, there are limitations to how the results can be interpreted. This fact has led the researchers to revise and expand the documentation explaining how the QIs can be used and interpreted. "Adapting the indicators for public reporting has been a challenge," Davies said. "It's been a learning experience about how to translate research into practice."

McDonald said the existence of a "feedback loop" for the QIs has been invaluable, helping researchers understand how their work is being used and how they can make it more relevant to healthcare organizations. "As we place more emphasis on translational research, I think there will be more projects like this," she said.

In addition to McDonald and Davies, the Quality Indicators research team includes Corinna Haberland and Amy Ku at CHP/PCOR, along with Jeffrey Geppert at the Battelle Memorial Institute and Patrick Romano at the University of California-Davis.




Topics: Diabetes | Health and Medicine | Health care institutions | Organizations