Can “bottom up” measurement improve the quality of Canadian health care?
At Jack Kitts’s first performance review as CEO of The Ottawa Hospital in 2003, he was able to report that the budget was balanced and that he was “feeling good” about the hospital’s finances. He also had a plan in place to improve morale at the hospital. When the Chair of his board asked him whether the hospital was now providing quality care, Kitts replied “of course.”
But then his Chair asked him a question he couldn’t answer: “How do you know?”
Kitts realized that he couldn’t answer, because while he felt that The Ottawa Hospital was staffed by excellent, dedicated doctors and nurses, the hospital wasn’t systematically measuring its quality of care.
And The Ottawa Hospital was not alone – ten years ago, few Canadian hospitals measured quality of care, and there was limited measurement of health system performance (including primary care, home care or long term care).
A decade later, progress has been made. At the provincial level, health systems report publicly on wait times and some quality measures. Hospital quality is also measured and reported by the Canadian Institute for Health Information (CIHI). Yet there are still large gaps in what is measured in our health care system, and much of what is measured is only useful to top-level system managers, not to the front-line clinicians whose day-to-day work is so important to the overall quality of the system. This leads experts to question whether measurement is being used effectively to improve the quality of Canadian health care.
Little standardization in measurement across health systems
In Ontario, the Ministry of Health and Long Term Care now publicly reports wait times for emergency departments, MRI/CT scans and some surgeries. However, this wait time data is incomplete, as it does not capture the time patients wait to see a specialist.
Information on health system performance is monitored by Health Quality Ontario (HQO). HQO’s annual Quality Monitor reports on a range of measurements for hospitals, primary care, home care and long term care.
HQO reports dozens of metrics, including measures of wait times, adverse events and patient satisfaction. While some of the measures are reported every year – such as the proportion of home care patients with pain that is not well controlled – other measures vary from year to year – such as the rate of deep vein thrombosis after surgery, which was reported in 2010 but not in 2012.
While HQO’s Quality Monitor provides a snapshot of health system performance, the most recent report acknowledges it has “major gaps.” According to the report, “in some cases, the data [on quality] exist but are inaccurate or difficult to access, while in other cases, there are no data at all.”
Alberta Health Services (AHS), the authority responsible for administering Alberta’s health care system, publicly reports 55 performance metrics. These include such different measures as life expectancy, childhood immunization rates, workforce absenteeism, wait times, adherence to budgeting and patient satisfaction.
AHS reports these performance metrics quarterly, and has been reporting on the same measures since 2010.
In addition to AHS’s reporting on the health care system, Alberta’s ministry of health (Alberta Health) also publicly reports on health care utilization and population health.
Transparency alone not enough to drive quality improvement
There is no doubt that reporting health system performance measurements on the web can make a health care system more transparent (assuming the measurements are accurate). However, there is limited evidence to date that public reporting – at least in its current form – is contributing to meaningful improvement.
Kitts believes strongly in the power of transparency. “Unless you can compare yourself to others and benchmark against best practice, quality improvement is very slow going,” he says. But he acknowledges that transparency alone is not enough to drive quality improvement.
Transparency may be ineffective at driving quality improvement if the information being publicly reported isn’t accurate. “With something like CIHI’s report on hospital quality, doctors and nurses are very concerned that the data is old and that the comparisons aren’t ‘apples to apples’ – because everyone is reporting the data differently,” Kitts says. This certainly appears to be true of some quality indicators, such as hand washing, where there are large discrepancies between the rates of hand washing reported by some hospitals versus the rates observed by researchers.
Kitts’s concern is that questions of accuracy can be used as an excuse to not focus on quality improvement. “We have to take this away,” he says. “We have to get health professionals to take quality data seriously.”
This means more effort must be made to ensure that quality data is reported consistently across hospitals and other health care facilities.
Measuring what matters
Cy Frank, CEO of Alberta Innovates Health Solutions and Chief Medical Officer of the Alberta Bone and Joint Institute, believes part of the gap between measurement and quality improvement is due to relying too much on “administrative data” rather than doing the hard work of measuring quality directly. “You need good data to make good decisions,” says Frank. “If you use data that was generated for other purposes, to track billing for example, you’re not getting good data about quality.”
Stafford Dean, Vice President of Data Integration, Measurement and Reporting for AHS, agrees. “We’ve been really successful at making the system a lot more transparent – and that’s great. Now we need to focus on making sure that we’re measuring the right things to really drive quality improvement.”
Frank believes a key part of good quality measurement is not to rely on a single metric or focus on one part of a continuum of care. “Focusing on one thing can have perverse effects,” he says. “If you measure only one part of a continuum of care, the system will find ways of pushing patients out of that part of the continuum. You have to have continuum approaches, multiple data sources, multiple metrics and timely analysis.”
Tom Briggs, Vice President of Health System Priorities for AHS, has a similar perspective. “What you report publicly tends to determine what the system focuses on improving, and we want to focus on the real game-changers.”
For Briggs, many of these “game-changers” lie beyond the “big-dot” measures of health system performance.
Measurement from the “bottom up”
Briggs thinks one of the keys to using measurement to drive quality improvement is to provide clinical staff with data that is relevant to them. “There’s not much a front-line practitioner can do to move a ‘big-dot’ measure of health system performance,” he says.
Instead, he believes clinicians need a finer-grained level of data that helps them identify how they can improve their practices. “If clinicians throughout the system are using their own data to improve on the quality in their own practices, that’s what’s going to move the big measures of overall system performance.”
“Most of the measures we have in Canada right now are top down,” says Dean, “but to improve quality on the front lines, we also need measurement from the bottom up – we need a whole layer of clinically-relevant measurement underneath the big health system performance measures.”
Alberta has already had some experience using data in this way. For the last eight years, the Alberta Bone and Joint Institute has collected information on quality of care, including patient-reported outcomes, and provided it directly to both providers and administrators.
“The key,” says Frank, “is packaging. We analyze and package the data in a way that is useful to clinicians and helps them improve their care.” He stresses that this information isn’t used for reward or punishment, but to help identify opportunities to improve outcomes.
Dean hopes to use the work of the institute as a model for the rest of Alberta. He believes clinicians are eager for this kind of information, saying “I’ve seen a shift in the attitude of doctors over the last ten years – they want to know their performance, they want to know things like whether their patients are winding up back in the emergency department after they’ve seen them.”
Dean doesn’t see “bottom-up” measurement as a replacement for public reporting of high-level system performance. Rather, he thinks that measurement at all levels of the health care system (not just the top) will allow measurement to drive quality improvement.
Frank, Dean and Briggs all acknowledge that it has yet to be proven that a “bottom-up” approach to measurement can work on a provincial scale. However, they’re hopeful that Alberta is on track to use measurement to drive performance at all levels of the health care system.