Technology is playing an increasingly significant role in medical student and program assessment. Leaders from three member schools of the AMA Accelerating Change in Medical Education Consortium are using medical student portfolio programs and digital dashboards to improve institutional awareness of how students learn and of potential roadblocks to that learning.
Context for coaching
In revamping its medical school curriculum in recent years, Oregon Health & Science University (OHSU) set out to create graduates better prepared for residency. OHSU’s Research & Evaluation Data for Educational Improvement (REDEI) assessment portfolio tool is a mechanism through which students can maximize their performance with the aid of an academic coach.
Upon entering OHSU, medical students are assigned an academic coach who will remain with them throughout their years of study. REDEI helps coaches and students get the most out of their interactions. It functions as a comprehensive electronic portfolio that houses all assessment data, and the platform recently added formative narrative comments from pre-clinical instructors to a student’s portfolio.
The portfolio coach meets with a student periodically throughout their four-year journey, with visits taking place more frequently during the pre-clinical curriculum. A coach’s role is not to assess the student. Instead, they help the student accurately self-assess, set goals and stay accountable, Tracy Bumsted, MD, MPH, associate dean for undergraduate medical education at OHSU, said during a recent AMA Innovations in Medical Education webinar.
“We really wanted to create a system where academic advising could occur with complete data and knowledge the coach could use to best assist the student in goal setting and performance,” Dr. Bumsted said.
Creating a shared language
Like OHSU’s portfolio program, Vanderbilt University School of Medicine’s VSTAR portfolio relies on coaches to aid in a student’s growth during their medical school career. To do this effectively, the school needed to create a new way to discuss and chart student performance.
That required “thinking about the authentic workplaces the students will be in and how we can share the language across those sites in order to gain evidence of their development of competency,” said Kimberly Lomis, MD, then associate dean for undergraduate medical education at Vanderbilt. She is now the AMA’s vice president of undergraduate medical education innovations.
That new language helped shape an assessment system that guides student learning through flexible pathways with explicit, standardized expectations. To convey a student’s ability, Vanderbilt relies on two popular frameworks for assessment: the Accreditation Council for Graduate Medical Education’s core competencies and the Association of American Medical Colleges’ Core Entrustable Professional Activities for Entering Residency.
Once a standardized language for assessment was created, Vanderbilt was able to populate its digital portfolios with data from a student’s entire body of coursework and clinical work. That data directs learners to knowledge and information resources that can help them address weaknesses.
Taking the macro view
New York University School of Medicine (NYU) is using a massive repository of data to assess its curriculum as a whole.
NYU, one of the programs at the forefront of the big data movement in medical education, has gathered data from a variety of electronic learning, assessment, and clinical systems and stored it in a data warehouse.
In 2017, with the aim to get better and more data from students on the quality of the curriculum, NYU created COMET, an education evaluation dashboard that is tightly integrated with the data warehouse. The platform is accessible from mobile and desktop devices.
The overarching goal is to use the data gathered from students and faculty members to support innovation, enhance interpretation, provide theory-based recommendations for curriculum development, and design and implement faculty development.