To practice medicine is often to be inundated with metrics. Institutions may spend immense resources collecting data on readmission rates, time to treatment, and adherence to therapy. We know that appropriate measurement is key to improving care, rankings, and reimbursement. However, for many of these metrics, the connection between short-term performance and long-term patient outcomes remains unproven. This may prompt the overwhelmed and pragmatic provider to ask, “Have we even picked the right metric?” In this week’s NEJM, Bucholz et al. share results that may help answer this question for patients with acute myocardial infarction (AMI).
The investigators utilized data from the Cooperative Cardiovascular Project (CCP) to assess Medicare hospitalizations for AMI between February 1994 and July 1995. For each hospital involved, they calculated a risk-standardized 30-day mortality rate, and then divided all hospitals into quintiles based on their 30-day mortality performance. Hospitals with the highest mortality rates were categorized as “lowest-performing” and those with the lowest mortality rates were categorized as “highest-performing.” Then, using Medicare enrollment data from 1994 to 2012, the authors identified dates of death for patients included in the initial CCP database, allowing them to estimate life expectancy for each of the hospital quintiles.
Analysis of these data indicated that AMI patients admitted to hospitals in the lowest-performing 30-day mortality quintile had the lowest 17-year life expectancy, whereas AMI patients admitted to hospitals in the highest-performing quintile had the highest estimated life expectancy. After adjusting for patient-specific factors and hospital-specific clinical factors, AMI patients treated at high-performing hospitals lived on average 0.74 to 1.14 years longer than patients treated at low-performing hospitals. This difference in life expectancy remained statistically significant even after accounting for a hospital’s case mix.
What impact should these findings have on our AMI practice? The results suggest that early, better care that improves short-term mortality may also confer a persistent benefit to patients. They also reinforce the value of 30-day mortality as a measure of the quality and impact of the AMI care being delivered at our hospitals.
Unfortunately, Bucholz et al. cannot tell us which specific interventions or processes merit investment to improve 30-day AMI mortality. In fact, as the authors note, prior research has yet to consistently demonstrate a tight link between specific AMI process measures and patient outcomes, and in their own analysis, adjustment for treatment variables such as use of aspirin and beta-blockers did not fully account for the observed differences in outcome.
Moving forward, these results are an excellent starting point for the next phase of quality improvement in cardiac care. We should continue to critically assess the utility of current metrics, building on those that create value (such as 30-day mortality) and eliminating those that only add to paperwork without providing benefit to patients. As the authors suggest, future AMI improvement efforts could also focus on areas we are less adept at measuring: hospital culture, organizational structure, and collaboration. Regulators, administrators, and front-line providers should begin to think critically about how to leverage this information to improve patient care.
What are you and/or your institution doing to improve care for acute myocardial infarction? When you assess the quality of care you are providing, what metrics are most/least important to you?
Joshua Allen-Dicker, MD, MPH
Josh is an Instructor in Medicine and Hospitalist at Beth Israel Deaconess Medical Center in Boston, MA. He completed his residency in internal medicine at Beth Israel Deaconess Medical Center, medical degree at NYU School of Medicine, and MPH at Harvard School of Public Health.