Although there is lots of talk about making measures of health system performance available to the public, the reality often falls short of the aspirations.
Not only are these measures often difficult for public users to access and understand; evidence also suggests that they have little impact.
In April of this year, the Canadian Institute for Health Information (CIHI) released an extensive set of indicators of hospital performance known as the Canadian Hospital Reporting Project, a national effort to provide performance measures from 600 hospitals across Canada. CIHI notes that the website is intended to give “hospital decision-makers, policy-makers and Canadians” access to these results.
Jeremy Veillard, Vice President of Research and Analysis for CIHI, says that this website was “motivated by CIHI’s mandate to develop pan-Canadian, standardized measures around health system performance” and that “as a publicly funded government agency, we follow the principle that our work should be publicly available.”
The measures available on the website include 21 clinical and 9 financial indicators covering clinical effectiveness, patient safety, appropriateness of care, accessibility and financial performance. However, the indicators are displayed as acronyms and abbreviations such as “VBAC”, “coronary angio following AMI” and “hip fract surg within 48hr (all)”, which are unlikely to be understood by the general public.
Nevertheless, many media outlets hailed the release of this tool as a “landmark” event providing the public with unprecedented access to information, allowing them “to judge their clinic’s performance by a couple of clicks.”
While publicly releasing performance information is a growing trend as organizations try to increase transparency and accountability, there is little solid evidence that this has an impact on public behavior or ongoing quality improvement. In fact, a 2011 review of the scientific evidence about the impact of public reporting noted that the available evidence is sparse and of low quality – and none of it demonstrates that public reporting changes the behavior of the public.
Very little public use of publicly reported performance information
A recent Ontario study suggested that the frequency of Clostridium difficile (C. difficile), a bacterial infection that can spread in a health care setting, dropped by 27% after the Ontario Ministry of Health and Long-Term Care decided to report hospital infection rates on its website. However, the authors caution that we don’t actually know how or why public reporting reduced the infection rates. They suggested that having this information publicly reported may have motivated hospital decision makers and administrators to take the problem of C. difficile more seriously.
However, the authors suggested that patients are unlikely to have used this information to choose which hospitals to seek care at. They wrote that the evidence suggests that “the impact of public reports on consumers is related to the accessibility and ease of understanding the messages of the report.” Related to this observation, the authors note that “the reports [of C. difficile rates] are quite deeply buried on a Ministry of Health and Long-Term Care website.”
This does not surprise Stephen Duckett, former CEO of Alberta Health Services and Professor at La Trobe University in Melbourne, Australia, who notes that “developing websites for consumers and media are a flash in the pan.” In his role as CEO of Alberta Health Services, Duckett did not release hospital-specific performance information to the Fraser Institute, which was preparing a publication ‘ranking’ hospital performance in the province. Duckett argues that sometimes publicly reported health care information “ends up being about managing reputational risk for the hospital” and that reporting can make organizations lose sight of important quality improvement goals.
He suggests that while hospitals should access regularly collected administrative data to understand their performance relative to other similar organizations, public “naming, shaming and blaming” is counterproductive because the public is not paying attention, and this doesn’t encourage hospitals to learn and improve. An example of this “naming, shaming and blaming” occurred with the release of the CIHI tool. A number of media outlets used the data to list “best and worst” hospitals in certain regions and assess whether their local hospitals were “measuring up.”
It is too early to know whether releasing this information on the CIHI website will affect patient outcomes and hospital performance across Canada, and some are not as skeptical as Duckett. For example, Gwyn Bevan, a Professor at the London School of Economics, has argued that “naming and shaming” in England through a public hospital performance reporting system played a large role in reducing waiting times in the early 2000s.
Measuring What Matters
Sholom Glouberman, President of the Patients Association of Canada, says that the CIHI website is a “first step and a good beginning” in establishing transparency and accountability but that “the tool is not a patient tool, and it’s not meant to be.” Glouberman says that the website is “focused on what information hospitals would like to have reported, but no one has asked patients what they would like to have reported.” He suggests that tools like these would be used by the general public if they measured what matters to patients.
He feels that members of the public are interested in the experience of care and that “people don’t want to compare hospitals on efficiency – as a patient I don’t care which hospital is more efficient, I care that my experience will be good.”
The CIHI website currently does not provide any measures of patient experiences or satisfaction, although it has indicated that it plans to do so. CIHI did not engage with members of the public during pilot testing of the current tool, although Veillard says that they plan to do so in the future.
Alan Forster, a general internist and Scientific Director of Performance Measurement at the Ottawa Hospital, argues that while the key users of the CIHI website are hospital and health system administrators across Canada, “the public is looking at performance measurement information more.” He notes that the CIHI website received a great deal of media coverage, and that the internet creates more opportunities to access health care performance information.
Veillard notes that in the 48-hour period immediately following the launch of the project, the website received more than 70,000 hits, driven by coverage in the print and televised media – a level of interest CIHI had not anticipated.
However, accessing this information is not as simple as a few clicks. When healthydebate.ca tried to access the tool, the main page displayed the message that “due to overwhelming interest, you may experience difficulty accessing the Hospital Reports component of our newly launched CHRP tool. We are currently working to resolve this. Thank you for your patience.”
Megan Ogilvie, a health journalist with the Toronto Star who covered the launch of the CIHI website, says “we heard from our readers that they tried to use the system – and it crashed – and the public was having a hard time trying to find what they were looking for.”
Veillard concedes that “we think that the tool is too complex as it stands and needs to have different levels of entry for different users, such as the public.” He says that “there are more people interested in health care information – it may not change how they consume health care services – but they want to know more, and have a right to do so.”