OK, part of the title was to get your attention, but let’s look at what the pandemic has forced, or taught, us to do. Out of necessity, educators are downplaying the need for standardized tests in admissions and are looking for more holistic ways to evaluate a student’s progress, using formative self-tests and open-book assessments. In content-rich subjects like medicine, some of these trends, which educators have been encouraging us to consider for years, might seem drastic or radical. However, since assessment drives student behavior, shouldn’t we consider how to avoid merely “teaching to the (content) test”? And, given the need to prepare our graduates for professional life, should we not emphasize training problem-solvers and future life-long learners?
Let’s be honest about the efficacy of content-centered formative examinations. As veterinary faculty, we have all seen students cram content from our lectures just to make it past a formative examination, only to deny ever having encountered the topic a couple of weeks later. As head of a department responsible for pre-clinical topics, I was often asked by clinicians whether we ever taught students certain topics. The answer was always yes. But the question made me wonder how we could make our instruction “stick.”
If we believe that our curricula should build long-term understanding and train students to solve problems, wouldn’t a more complex evaluation be more appropriate? And in the era of internet search engines, why not train students to weed through content and judge its quality? Certainly, there is a baseline of “walk-around” knowledge a veterinarian requires, but that knowledge is vast and constantly changing; medical information, we are told, doubles every 70 days. So why not emphasize disciplinary problem-solving as a core skill? If you accept that logic, why not evaluate a student’s ability to reason and to craft a rational argument in defense of their solution? And wouldn’t we want to evaluate progress toward a standard of performance? Particularly given concerns about possible cheating and the enormous cost of proctoring exams at a distance, shouldn’t we at least explore student assessments that avoid these problems?
Case-based instruction is one instructional strategy that brings relevance front and center and helps a student develop the memory “hooks” necessary for long-term retention of process and content. But for a faculty member, reading many dozens of student analyses is a daunting task. Evidence has accrued that rubric-guided, anonymous peer review of a student’s analysis can lead to a final written product of equivalent quality and, in the process, generate thousands of data points on which to evaluate performance.
So this faculty newsletter edition is both a position paper and an invitation to contribute to research into a novel social learning and peer review platform supplemented by rich learning analytics and, soon, medical/veterinary ontologies. The goal is to keep instructors interested in case-based learning projects from being dissuaded by the perceived workload of student evaluation. CGScholar (https://cgscholar.com) is such a “Social Knowledge” platform, described in a short video (~12 minutes) by its founder, Dr. Bill Cope of the College of Education at the University of Illinois. If your time is limited, home in on the segment from 5:42 to 8:17, where he compares the expansiveness of learning analytics in CGScholar to “traditional” tests. The development of CGScholar has been supported by the NSF, the Gates Foundation, and the Institute of Education Sciences, among others. Having been used by veterinary and medical school classes at the University of Illinois, it is now expanding its focus to the enhancement of multimedia medical writing and clinical reasoning, all supported by peer review, analytics, and artificial intelligence. You might be pleasantly surprised at what your students can develop when you ask; we were, and the students surprised themselves as well.1
If interested in learning more and/or collaborating on advancing these tools for use in veterinary education, please contact me at duncanf@vetmedacademy.org.
Duncan C. Ferguson, VMD, PhD, DACVIM, DACVCP
Reference:
1. McMichael MA, Ferguson DC, Allender MC, Cope W, Kalantzis M, Haniya S, Searsmith D, Montebello M. Use of a multimodal, peer-to-peer learning management system for introduction of critical clinical thinking to first-year veterinary students. J Vet Med Educ. 2020 (in press).