For the first time in 100 years, Canada is significantly changing how it trains doctors. Following the lead of countries like the U.S., Australia and the Netherlands, Canada will be moving away from a time-based system and towards competency-based education.
This will mostly affect residents – those who have just graduated from medical school and are working in clinics and hospitals, under supervision, for two to five years before writing national exams and becoming fully qualified to practice on their own.
The new competency-based system focuses on having residents master key skills rather than on how much time they’ve spent working in each area. The change comes with hopes that residents will be better trained on the whole, and that those who are falling behind will be identified sooner.
“In our current system there is a failure to fail – in some cases poorly performing residents get through into practice, or more commonly we get people who end up in remediation in their final years who would do much better being helped earlier on,” says Kevin Imrie, president of the Royal College of Physicians and Surgeons of Canada.
It also allows the colleges to offer more “quality assurance” to the public, says Ivy Oandasan, director of education at the College of Family Physicians of Canada. “The programs have more responsibility to ensure they are monitoring learners, and that means the college can certify that the residents actually have those competencies.”
But the shift has prompted concerns, from the practical – that it will add to the already heavy workloads of supervisors and residents – to the philosophical – that medicine can’t be reduced to a list of skills.
“Medical education paradigms are exactly like health care treatment: They have effects, and they have side effects,” says Brian Hodges, executive vice-president of education at University Health Network and co-author of The Question of Competence: Reconsidering Medical Education in the Twenty-First Century. “Competency based education has lots of promise, and a substantial number of challenges too.”
Competency-based medical education comes to Canada
Over the past eight years, pilot projects testing competency-based medical education have appeared at some schools, including the University of Toronto’s orthopaedic surgery program and Ottawa’s anesthesiology program. And it will soon be compulsory nationwide. The College of Family Physicians of Canada has already begun, having switched to training all family medicine residents in a competency-based system in 2015. The Royal College of Physicians and Surgeons of Canada plans to begin rolling out its competency-based program in 2017.
Family medicine residents must complete certain objectives across their two years of residency. Those include both treating various populations and developing specific skills. “In the old model you could be in the emergency room for eight weeks, and you may never get somebody who broke a leg. In this model if you don’t get that experience, we help you acquire it – perhaps by going to a fracture clinic,” explains Oandasan.
Daily feedback in the form of electronic “field notes” is a key part of the system. “At the end of a clinic, I’ll say [to the resident], what interaction did you learn the most from, and I’ll ask the learner to write the field note and I then add my comments,” explains Dianne Delva, chair of the University of Ottawa’s family medicine department. Those daily notes are then brought together to measure competencies and look for gaps in knowledge and experience. “I think it’s helping us to do a better job in evaluation and in helping those learners who are struggling,” she says.
The Royal College’s Competence by Design program takes a similar approach. It’s a hybrid, combining the structure of the current system, including clinical rotations for a set period of time, with competency testing.
The competencies are broken down into two parts. The first is milestones, which are discrete skills aligned to the CanMEDS framework. Those then contribute to larger Entrustable Professional Activities (EPAs), which are more comprehensive. A milestone, for example, might be recognizing problems in cancer patients, and another might be knowing the appropriate treatments for those issues. The EPA that encompasses them would be providing care in general for cancer patients.
This should move the system away from focusing on time spent and toward mastery of skills. “In competency based training the emphasis is not so much on time – time is really seen as a resource to enable people to achieve the competencies that they need to achieve,” says Mark Levine, staff anaesthesiologist at the Hospital for Sick Children and associate professor of anaesthesiology at the University of Toronto.
The benefits
Competency-based training is more individualized to the learners, more transparent to stakeholders, learners and teachers, and more objective, according to an article in The Lancet that argued in favour of its implementation worldwide.
It also offers a way to identify underperforming residents. “The actual failure rate of residency is almost zero, and remediation and hold back is very unusual,” says Hodges. With a competency-based program, “the assessment system will pivot, and you should see people not automatically progressing.”
Jason Frank, director of specialty education, strategy and standards at the Royal College of Physicians and Surgeons of Canada, says he expects only around 1% of residents will take longer than five years to complete the program, about the same number as do in the current system. But he hopes it will catch underperformers sooner. “It’s always a tragedy when somebody comes to the exam and fails,” says Frank. (About 5% of students do.) “In the new system, that would be much more unlikely, because there’s just so much more observation.”
A few talented residents should be able to finish faster as well. “On a year by year basis it is theoretically possible for an exceptional candidate to graduate early, but we think that’s going to be very unusual,” says Frank. Instead, those who excel might have a modified schedule the next year or be allotted more time for research or electives.
Both the family medicine and Royal College systems ask residents to take a lot of responsibility for their own learning, which is also positive, says Oandasan. “A competency based curriculum puts a lot of pressure on the learner. You have to be able to say, I don’t really know this well, what can I do better?” she says. “If we do this right, we will have a generation of physicians able to reflect and ask themselves, what can I do better in my practice?”
Potential pitfalls
Some argue that focusing on competencies leaves out the important intangibles of medicine. After all, there are some things you simply can’t test for. “The big challenge is to avoid reductionism, to avoid turning this into a list of skills that you can do,” says Hodges. “Dealing with complexity, displaying judgement – these things cannot be turned into a checklist.”
The addition of EPAs is one proposed answer to that, as they show the ability to complete a larger task rather than individual skills. “The use of EPAs – done well – can emphasize integrated, higher level abilities,” says Hodges.
Another concern is that the model will encourage residents to achieve a minimum competency and then move on to the next step, a phenomenon known as “migration to the minimum.” “That’s a myth,” counters Frank. “The evidence says this new way of structuring progressive steps pulls everybody up, and everybody ends up achieving at a higher level.”
But the most common concerns are not around the theory of competency-based design, but around the logistics of implementing such a big change. “When we talk about curriculum, we see residents as learners. And they’re funded by the ministries of health, who see them as service providers. So there’s a tension there,” says Oandasan.
Imrie says the Royal College is also well aware of that concern, but they don’t believe the new program will affect residents’ ability to provide care. “There may be a small amount of that at the early phases, but the payback is that residents will be much more aware and engaged, especially later on [in their training],” he says.
At the same time, there’s a widespread worry about the programs causing an increased administrative burden on supervisors. “I’m hearing from faculty that there’s an increased time requirement for documentation and feedback,” says Delva.
“What we can’t do is add competency based assessment to what we currently do, this can’t be an extra layer on top of everything else,” says Imrie. “We’re planning to get rid of a number of the assessments we have currently, and to make assessments mobile enabled as much as possible. We think that it will be a modest increase of time.”
Usability concerns were one reason the Royal College’s competency launch was pushed back from 2016 to 2017; the college wanted to ensure the mobile platform for assessments was ready.
And residents are also worried about workload, says Tom McLaughlin, president of the Resident Doctors of Canada. There are a lot of potential positives from the increased feedback and opportunities competency-based education provides, he says. “But if you have exactly the same service model where residents and staff are busy, it makes it difficult to do all of these idealistic things… if you ask residents to be simultaneously doing service provision while doing the educational component, it may make an already difficult time even more difficult.”
The evidence
The University of Toronto’s orthopaedic project has been running for almost eight years, so it offers a glimpse into the effects of competency-based design. A 2013 study looked at the pilot program’s effects on 14 residents who had taken the curriculum since it began in July 2009, and found they had superior technical skills and that “without exception, the residents report a high degree of satisfaction with this new model of training.”
It’s important to note that the program, as studied, was a much more modular one than the hybrid model being put forward by the Royal College. (In fact, the “pendulum has swung back towards conventional training” and the orthopaedic program is also much more like a hybrid program, says Peter Ferguson, chair of orthopaedic surgery.)
Research out of the U.S. has suggested that emergency medicine milestones show both validity and reliability and that they’re a better way of judging residents’ progress than more commonly used scales.
A recent article by the International Competency-Based Medical Education Collaborators admits competency-based education can be seen as “a resource-intensive system of education and training that is as yet unproven as a means of producing better doctors.” But it goes on to argue that “sound advances in education theory serve as the building blocks of CBME: the importance of clearly defined outcomes, learners taking an active role in their education and assessment within an authentic clinical setting, and formative and focused feedback from multiple assessors using multiple methods.”
Canada’s experiment with competency-based medical training will be closely tracked. “We have an extensive program evaluation plan developed, with a mixed methods approach that includes a longitudinal survey,” says Oandasan. The Royal College will begin field testing elements on July 1, and a cohort study will follow residents to look at exam results and practice patterns.
“There’s evidence of a number of important outcomes that are all positive,” says Frank. “For the rest, we’re holding our breath.”
The comments section is closed.
This is not feasible in 2017. Nor in 2019. Faculty across the country are already burdened, and dishonest in their evaluation of trainees. This will fix neither of those two fundamental problems.
I’ve wondered if this move serves to benefit the hospital associations and provincial governments by causing a lengthening of residencies rather than a neutral or contracting effect.
Getting an experienced physician for longer at 10% of the cost is a great business move.
I’m concerned by the claim that “talented residents will finish faster and those who experience difficulties will be identified sooner.” There is a highly competitive culture in medicine, throughout the whole process of medical training but also during residency, where residents are constantly compared to each other. This might put even higher pressure on them: the pressure to finish as fast as possible to prove they’re fast learners (and perhaps to start making big money as soon as possible). I have to say, I’m no expert; I only know the field as a doctor’s partner. This might be an interesting way of improving medical training, but in my view it’s important to consider residents’ mental health and their state of mind through the process. Also, there might be significant differences between men and women that should be considered (the orthopaedic project at U of T, was it mostly with male residents?)
The highly touted U of T orthopaedics experiment was conducted on a small group of very outstanding resident applicants. They were even kept separate from the “2nd tier” U of T orthopaedics residents who were streamed through the regular program. Of course they succeeded; this was an outstanding example of selection bias. There is no way to know, as you point out, whether other residents (M and F) will have the same success. The experts making these wholesale changes designed experiments to have pre-determined outcomes and are now happily touting the methodologically flawed pilot projects as if they prove something. A better experiment of this style of residency would have been to randomize residents accepted into an entire specialty across the country into either a competency-based or a traditional curriculum and see what happens. The evidence provided as justification for the RCPSC changes is very weak.
There is no high quality evidence to support this national change across all specialties, so we have to really hope this small group of like-minded ‘experts’ know what they are doing and have a collective wisdom to predict the future. Otherwise we have simply added an immense administrative burden with no positive impact and quite possibly a negative effect on training. Not only is this an experiment in residency training, but an experiment in the value of evidence.
It is the way of thinking and the approach to problem solving that matter most for young learning doctors. It is impossible to be “completely” “educated” or “experienced” in the never-ending spectrum of illnesses, diseases, injuries and other health problems. Junior doctors should be trained in the skills and principles needed for future exposure to previously unencountered or even yet-to-be-developed medical treatments and strategies, rather than learning today’s individual diseases and treatments. It is the attitude and approach, not the wide spectrum of healthcare problems, that need to be imparted to these learners. Having said that, it is very difficult to develop a way to assess the “competency” of attitude. Time is still a factor, and it varies widely across individuals and specialized fields; no “standard” length of training can be rigidly imposed. There is still a need for a “completion of training” assessment by regulating bodies like the Royal College or the Family Practice Colleges.