Competency-based medical education (CBME) is emerging as the predominant paradigm in the undergraduate training of future physicians, but it still faces stiff challenges in assessment. One big reason: the demands of residency selection.
An AMA webinar recently explored the conflict between competency-based assessment and residency-selection practices. The expert panelists also highlighted ways that competency-based medical education and traditional medical education can be joined into a single, continuous path that all learners travel.
“What clearly stands in our way is a system that grossly encourages achievement orientation,” said Holly Caretta-Weyer, MD, MHPE, associate residency program director and director of evaluation and assessment in the emergency medicine department at Stanford University School of Medicine.
She and her Stanford colleagues received an AMA Reimagining Residency initiative grant to help develop a unified system of assessment and predictive learning analytics using entrustable professional activities across emergency medicine residency programs.
Medical students tend to be so focused on meeting the requirements of residency programs that they are often unwilling to admit to the things they cannot yet do well.
“They're so pinned in by this idea that the grades and the test scores are the things that define them,” Dr. Caretta-Weyer said. And it’s no wonder, as that is often what residency programs have told them to focus on.
“We convince ourselves we're making these apples-to-apples comparisons in order to make a rank-order list,” Dr. Caretta-Weyer said. “We need to be able to say: You're No. 1, you're No. 2, you're No. 3, you're No. 4. So we say—head to head, who's the best?”
Meanwhile, residency programs have to deal with an avalanche of applications. The ratio of applicants to positions is now about 10-to-1.
“This leads us to dual-purpose our assessments,” she said, using them both to measure achievement against metrics and to foster growth in a competency-based medical education system. Those two purposes can seem irreconcilable, but Dr. Caretta-Weyer argued “that meaningful comparison and CBME can coexist.”
For starters, residency programs need to consider all their stakeholders and priorities, Dr. Caretta-Weyer said, citing the 2018 Ottawa consensus statement, which outlines recommendations on resident selection and recruitment.
“There are three buckets to think about,” she said. “There’s individual achievement in individual priorities; there’s the overall competence and the ability to do the work that you're hiring them to do; and then there's the responsibility to meet the outcomes of a diverse society. And I would argue that you have to have all three of these things in a selection system going forward for it to be meaningful.”
But assessment goes both ways, and one of the keys to success is having a complete understanding of your program’s needs, purpose and mission. That includes, Dr. Caretta-Weyer said, assessing the following areas.
The job requirements. “What is it that they need to be able to do? What can you train them to do? And what do you need to select for?”
The outcomes you are trying to achieve. “Be honest with yourself as a program, as a specialty. Think about what patients need. How does that drive your mission?”
Issues within the system. “There are things that we can't quite put our finger on yet, but we know we need to think about. How do you select for teams that meet these outcomes? Because a lot of times it's not going to be individuals who meet the outcomes. It's going to be a team that you select for.”
The needs of patients and society. “Think about your patient population. Think about what your specialty does. … How do you get that data, or how do you get at those outcomes? Involve patients in the process.”
How to defend your decisions. You might first consider learners, “thinking also about their priorities and the things they care about,” but you also need to defend your decisions to patients and society. “How do you tell them we're meeting the outcomes you care about?”
Also presenting at the webinar were:
- Eric Holmboe, MD, chief research, milestones development and evaluation officer at the Accreditation Council for Graduate Medical Education, who discussed challenges in assessing competencies.
- Brian George, MD, of the Society for Improving Medical Professional Learning, which has developed an app to predict competency in various surgical areas.