Family doctors are being offered personalized reports that identify gaps in the quality of care they provide, from cancer screening to diabetes management. But can data really drive better performance?
Alberta’s Chinook Primary Care Network serves more than 170,000 residents, bringing together health-care groups that include 140 physicians, as well as nurse practitioners and dietitians. But it also includes some more surprising positions: a director of evaluation, three senior quality analysts, and an information management lead.
What kind of group needs five full-time data analysts? One that believes in the importance of regular feedback and metrics. The primary care network (PCN) has been measuring performance indicators for a decade, and it offers customized reports on demand to its physician leadership board as well as individual doctors.
For example, knowing that diabetes increases the risk of cardiovascular disease, they often look at how many diabetes patients are on a statin. They further break that information down by age, sex, BMI and tobacco use. And they can track how often diabetes patients are seeing their care providers – including who hasn’t been seen within the past year, and might need to be called in for an appointment. (Click here to see a sample report.)
“One doctor who had just taken over another doctor’s panel of patients asked me, ‘Proportionally, how many of my diabetic patients are on a statin?’ I said about 40%, and he was really alarmed,” says Charles Cook, director of evaluation at the PCN. “This allows us to provide that bird’s eye view.”
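To illustrate the kind of indicator behind that "bird's eye view," here is a minimal sketch of computing the proportion of a panel's diabetes patients on a statin. The field names and patient records are invented for illustration; the network's actual data model is not described in this article.

```python
# Hypothetical sketch of a panel-level quality indicator:
# the proportion of diabetes patients with an active statin prescription.
# Field names and records are invented, not the PCN's real schema.

patients = [
    {"id": 1, "diabetes": True,  "on_statin": True,  "age": 67},
    {"id": 2, "diabetes": True,  "on_statin": False, "age": 54},
    {"id": 3, "diabetes": False, "on_statin": False, "age": 41},
    {"id": 4, "diabetes": True,  "on_statin": False, "age": 72},
    {"id": 5, "diabetes": True,  "on_statin": True,  "age": 59},
]

# Denominator: patients with diabetes; numerator: those also on a statin.
diabetic = [p for p in patients if p["diabetes"]]
on_statin = [p for p in diabetic if p["on_statin"]]

rate = len(on_statin) / len(diabetic)
print(f"{rate:.0%} of diabetes patients are on a statin")
```

A real report would further stratify the same denominator by age, sex, BMI and tobacco use, as the network describes.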
This kind of data-driven analysis is now available province-wide to family doctors in both Ontario and Alberta. Known as audit and feedback, it offers doctors – and teams of doctors – the chance to have their practice data analyzed and returned to them as a report. (Such reports have also been used in Canadian hospitals, but this article focuses on their use in primary care teams.)
Audit and feedback compares a doctor’s performance on key indicators – such as lab test use or cancer screening rates – against that of other practitioners in the region. For now, those reports are most often presented quarterly or annually, but the field is moving toward faster feedback. Some health care workers in the Chinook network can already access current reports on a private website, which will evolve into a continually updating hub – like a Google Analytics page for clinical care. “Our plan over the next couple years is to get it as close to real time as we can,” says Cook.
The idea of audits doesn’t resonate with everyone, however. A 2011 mixed-methods study looked at how seven Ontario family health teams responded to having their performance measured. It found that, on the whole, physicians were supportive of the idea, but some also had concerns. “It can be threatening to someone who has done stuff the same way for 25 years, to be told that people can measure this now and they can tell you whether you are effective or not, and their records are completely accessible for analysis,” one physician told the researchers.
“Some doctors who haven’t used this before are anxious about having people looking at their data – or are even anxious about looking at their own data,” echoes Lara Cooke, associate dean of Continuing Medical Education and Professional Development at the University of Calgary and co-leader of the Alberta Physician Learning Program. “It’s a culture shift.”
The effectiveness of audits also varies greatly with the quality of implementation – with poorly executed audit and feedback having no impact at all on quality. And of course, it’s not possible to measure everything that’s important. “There are limitations,” says Noah Ivers, a family physician and scientist focusing on quality in primary care. “But there is also incredible potential.”
Alberta and Ontario’s data-driven primary care improvements
Audit and feedback systems are seen as a way to ensure more accountability and to shape continuing professional development. Worldwide, audit and feedback has been tied to public accountability and physicians’ pay, but in Canada, many programs are framed as part of quality improvement and professional development, and are therefore voluntary and not publicly available.
Alberta offers multiple audit and feedback programs, including the Physician Learning Program (PLP). The PLP is “basically thought of as a service to the members of the [Alberta Medical Association],” says Cooke. It offers tailored reports – a key component of successful audit and feedback – in addition to general reports. “Doctors or groups of doctors might come to us with questions about some specific clinical thing – how are we doing with management of condition X – and we sit down with them and figure out how to tailor the report to them,” she says.
“None of us is terribly good at knowing what our actual performance is… But the data tells the tale, and brings out the gaps.”
Last year, they created 300 reports across the province, which combine charts and data with structured feedback, including reviewing evidence-based guidelines and identifying barriers to success. In primary care, she says, they’ve often focused on Choosing Wisely recommendations, such as improving adherence to cervical cancer screening recommendations. They follow up with another audit on the same data six months or a year later, so doctors can see their progress. And they’re working with other groups on a “dashboard” model that would update key figures in real time.
Ontario also has a handful of organizations working on primary care audit and feedback, including Health Quality Ontario’s Primary Care Practice Reports. Since the program’s launch in 2014, hundreds of doctors, Family Health Teams and Community Health Centres have volunteered to have their practices analyzed.
An example of a chart from a Primary Care Practice Report. Click here to see a full sample report.
It compares the practice with similar types of practices in the province, reporting on, for example, how many patients are up to date on cancer screening and diabetes management testing, or examining rates of emergency department visits.
“For health care utilization data, [like ED visits and readmissions] we present risk-adjusted data to account for differences in the populations different practices are serving,” explains Anna Greenberg, vice president of health system performance at HQO. “On the other hand, we present raw, unadjusted data for cancer screening or diabetes management indicators, so that practices can understand the true rates of uptake.”
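Risk adjustment can take several forms; HQO's exact methodology isn't described here, but a common generic approach is indirect standardization: compare a practice's observed visit count against the count expected if population-wide, age-specific rates applied to its panel. The sketch below uses entirely invented numbers.

```python
# Generic indirect-standardization sketch (invented numbers; not HQO's
# actual methodology). A ratio above 1 means the practice's patients
# visited the ED more often than its age mix alone would predict.

# Assumed province-wide ED-visit rates per patient per year, by age band.
provincial_rates = {"18-44": 0.20, "45-64": 0.30, "65+": 0.55}

# One hypothetical practice: panel size by age band, plus observed visits.
panel = {"18-44": 300, "45-64": 500, "65+": 200}
observed_visits = 310

# Expected visits if this panel experienced the provincial rates.
expected_visits = sum(provincial_rates[band] * n for band, n in panel.items())
ratio = observed_visits / expected_visits

print(f"expected {expected_visits:.0f} visits, observed {observed_visits}")
print(f"standardized ratio: {ratio:.2f}")
```

For indicators like screening uptake, by contrast, the raw proportion itself is the quantity of interest, which is consistent with HQO's choice to leave those unadjusted.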
HQO also offers tips for improvement and encourages doctors to set goals for themselves. The team is currently working with Ivers and the Ontario SPOR Support Unit to study the impact of these reports, but that information is not yet available.
Physicians can get another report in a year’s time, though HQO is working towards providing faster results. “There’s a lot more data available at the practice level than there was when this started,” Greenberg says.
Nonetheless, creating these reports isn’t easy, as accessing and analyzing the data can be labour intensive. There can be long turnaround times to get the information – typically a year in Ontario – and it can be time consuming to anonymize it. The data for the PCP reports comes from institutions like OHIP, the Ontario Cancer Registry and the Ontario Diabetes Database, while Alberta’s PLP pulls from sources like Alberta Health and Alberta Health Services.
It’s difficult to get information on prescriptions for people under 65, and on other team members like nurse practitioners. And some of the data simply isn’t appropriate. “It’s not collected with this kind of work in mind,” explains Cooke. “Its purpose is billing, and data quality can be an issue. If [doctors] don’t know all the billing codes, and have some go-to ones…garbage in, garbage out.”
The evidence behind its effectiveness
So is it worth the effort? Audit and feedback has been studied in both primary and specialist care, and it is generally effective, “but there’s a huge variation in that effectiveness,” says Ivers, author of the Cochrane Review that investigated its impact. The review looked at 140 studies and found that audit and feedback “generally leads to small but potentially important improvements” in performance, with a median 4% improvement in the outcomes the feedback was trying to address. But one quarter of the interventions had larger effects (up to 16% absolute improvement), and one quarter had no effect.
The review found audit and feedback was more effective when it was given both verbally and in writing by a supervisor or colleague, when it included an action plan and targets, and when it was offered more than once. It also worked best on health care workers who had been doing worse than average on the outcomes being measured.
Well aware of the importance of execution, the team behind the Primary Care Practice Reports is working with the Canadian Institutes of Health Research (CIHR) to modify the reports and test their effectiveness. A recent ICES report by Richard Glazier points to key evidence-backed ways that Ontario’s reports can improve, including adding both explicit targets and tools that help physicians create an action plan.
“We kind of see this work as a bit of a laboratory and a learning system; we work with researchers to continually look at how we optimize the report,” says Greenberg.
Ivers also cautions against “pretending that all the things that are important are readily measurable. We need to try to measure what matters, and right now we’re frequently measuring what’s easy to measure,” he says. For example, what patients feel is important – including wait time to see their doctors, and whether they had enough time with them at their appointment – is often not available in the data that doctors collect. (Some programs, like the Association of Family Health Teams of Ontario’s Data to Decisions feedback, do incorporate patient experience data, however.)
But despite its limitations, audit and feedback is superior to other professional development measures, says Ivers. “We have evidence that humans tend to seek continuing learning in things they know a lot about and are already doing good at. It kind of defeats the whole purpose,” he says. “[Using data] to drive continuing professional development means it’s about what our patients and communities need, not what we’re interested in.”