Student and Professional Well-Being: What Might Artificial Intelligence Do for It?

In past blog posts, I’ve addressed the ideals of active learning and of encouraging critical clinical thinking in the burgeoning veterinary medical professional. I’ve also talked about information overload, but I have never discussed the toll that this reality may be taking on our students. Simply put, medical information is now doubling roughly every 4 months, compared to every 7 years just 40 years ago!

I’ve also implied that digital tools and “just-in-time” micro-content can be a game changer for veterinary education. However, we all know that change in academia is not so simple: many traditions, including funding streams, promotion and tenure, and the picture of what good teaching and assessment “look like” (i.e., what we ourselves have experienced!), conspire to keep didactic change at a glacial pace.

Reframing AI as a Well-Being Issue

So, this has led me to look out for compelling arguments for reframing our collective approach to veterinary education.  Recently, considerable attention has been paid to student and professional mental health, and, in the case of veterinary medicine, the high suicide rate within the profession, reported to be twice that of physicians or dentists.1 Certainly, there are many external factors leading to stress amongst practitioners,2 but what is the contribution of the educational environment for students?

We know that veterinary students struggle too. A Student American Veterinary Medical Association (SAVMA) report indicated that two-thirds of veterinary students have experienced depression and 1 in 20 have seriously contemplated suicide.3,4 Veterinary schools have increased counseling resources and instituted wellness programs. The coordinator of one such program at Oregon State University has said, “We place incredibly hard demands on these students…Between classes, difficult clinical cases, exams, rotations, and peer-to-peer interactions, there is no secret how these things can add up.” This problem has also been reported in other medical professions, and while wellness programs have been developed in many medical schools as well, one report noted that “these programs implicitly focus on shortcomings of students and do not adequately address the root of the problem: the demanding learning environment.”5 In our existing information-based curricula, students must contend with information overload and the fear of never knowing enough.

Potential of AI in Veterinary Education

This brings us to the potential of applying artificial intelligence to veterinary medical education. How would this address the information overload issue? The information isn’t going away, but the tools we provide to students, and what we ask them to do with those tools, can change. First, it is important to define what artificial intelligence in education represents, and what it does not. The reader is highly encouraged to review the recent discussion of this subject by Cope et al. (2020).6 These authors encourage us not to think reflexively that AI threatens our academic existence: “In education, we need to recognize the less (not to get carried away, to think that machines might replace teachers), and leverage the more (designing a new kind of education that fully exploit the affordances of artificial intelligence).” (emphasis mine).

Affordances of AI for Medical Education

What are those affordances, with particular focus on medical education? According to Cope et al., the four areas in which machines easily surpass human capabilities are:

1. Namability: This relates to the ability to use consistent, defined, discipline-specific terminology and ontologies, like SNOMED and its subset cousin SNOVET, or the Centers for Disease Control and Prevention’s International Classification of Diseases (currently ICD-10-CM). Adding a tool that reminds students of these definitions as they write and discuss a clinical case becomes a learning experience in itself.

2. Calculability: Computers are much better than humans at large complex calculations and can apply algorithms for transforming large amounts of data into meaning.

3. Measurability: Once the human experience of a peer or expert assessing the quality of a student’s written analysis is measured and coded, the computer can connect a quantitative value to that assessment and “learn” to provide automatic mentorship or even assessment.

4. Representability: Interpreting complex datasets, a computer can create a graphical representation that conveys the nature of the analysis output. AI employs “machine learning,” that is, the use of statistics to predict outcomes based upon patterns in data. When even a subset of the predictions is corrected by human “trainers,” the system can “learn” and generalize from that classification to facilitate future automated mentoring of a student’s work and, eventually, its assessment. In a previous blog post, we discussed the multifaceted “aster plot” of performance within the social learning community of CGScholar.
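To make the “namability” affordance above concrete, here is a minimal sketch. The mini-ontology and its definitions are invented for illustration; a real tool would draw on SNOMED, SNOVET, or ICD-10-CM. The idea is simply that a writing aid can surface definitions for the clinical terms a student actually uses:

```python
# Hypothetical sketch of the "namability" affordance: flag ontology terms
# in a student's draft and surface their definitions as reminders.
# This mini-ontology is invented for illustration only.
ONTOLOGY = {
    "icterus": "yellow discoloration of tissues due to bilirubin accumulation",
    "polyuria": "production of abnormally large volumes of dilute urine",
    "azotemia": "elevated blood levels of nitrogenous wastes such as urea",
}

def definition_reminders(draft: str) -> dict:
    """Return definitions for each ontology term the draft mentions."""
    words = {w.strip(".,;:").lower() for w in draft.split()}
    return {term: defn for term, defn in ONTOLOGY.items() if term in words}

reminders = definition_reminders(
    "The dog presents with polyuria and mild azotemia; no icterus noted."
)
```

Even this trivial lookup hints at how consistent naming turns every draft into a vocabulary-review opportunity.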

Further summarizing Cope et al.’s argument, AI’s power for education is derived from the ontologies of naming. Computers can then sort and process massive amounts of information at speeds no human can match. Natural language processing is another form of artificial intelligence in which statistical methods and “training” open the possibility of evaluating a student’s written work. AI offers its greatest transformative value in the area of student assessment. Beyond evaluating a learning outcome as a retrospective assessment of a student’s memory, AI allows tracking of the huge numbers of interactions that are part and parcel of learning, including a student’s engagement with content, interaction with peers and instructors, and work products such as case analyses. A student’s individual progress can be documented, supported, and even redirected at any point along the path of learning. Summative “closed book” exams represent a snapshot of content recall and cause much student stress; the holistic potential of AI could make them obsolete.6 Detailed characterization of a student’s progress can also be tracked over time, allowing intervention unassociated with a stressful high-stakes summative examination, and before it is “too late.”
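To illustrate how human “trainers” can anchor automated mentorship, here is a deliberately toy sketch, with invented texts, scores, and a single invented feature (it is not CGScholar’s actual machinery): a handful of expert-scored case analyses fit a simple statistical model, which then offers a provisional score for new, unscored work.

```python
import re

# Toy illustration of "measurability": human experts score a few case
# analyses; a simple statistical model is fit to those scores; the model
# then suggests a formative score for new work. All data are invented.
ONTOLOGY_TERMS = {"hyperthyroidism", "tachycardia", "thyroxine",
                  "differential", "prognosis"}

def coverage(text: str) -> float:
    """Fraction of ontology terms the analysis mentions (a stand-in feature)."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return len(words & ONTOLOGY_TERMS) / len(ONTOLOGY_TERMS)

# Human-coded training pairs: (analysis text, expert score on a 0-1 scale)
training = [
    ("tachycardia consistent with hyperthyroidism; thyroxine elevated", 0.9),
    ("differential includes hyperthyroidism; prognosis good with treatment", 0.8),
    ("the cat seems sick and should get medicine", 0.2),
]

# "Learning": least-squares fit of score = a * coverage + b
xs = [coverage(t) for t, _ in training]
ys = [s for _, s in training]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx

def predicted_score(text: str) -> float:
    """Provisional, formative score for a new, unscored analysis."""
    return a * coverage(text) + b

new_analysis = "tachycardia and elevated thyroxine suggest hyperthyroidism"
```

A real system would use far richer features and models, but the workflow is the same: human judgments anchor the statistics, and the statistics generalize to unscored work.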

Any evidence of AI in medical school curricula?

Although I suspect that medical schools are generally no further along than most veterinary schools in their use of AI, a recent article7 described the potential of AI to relieve medical student stress, noting that it might also train future physicians to interact with AI-driven information systems for diagnosis, treatment, and health insurance. Citing the “increasing incongruence between the organizing and retention capacities of the human mind and medicine’s growing complexity,” the authors advocated that medical school curricula emphasize four features:
“1) Knowledge capture, not knowledge retention; 2) Collaboration with and management of AI applications; 3) A better understanding of probabilities and how to apply them meaningfully in clinical decision making with patients and families; and 4) The cultivation of empathy and compassion.”7

We’ve had forms of AI in veterinary education for some time now.

I recall when electronic drug formularies became available to students and practitioners over 30 years ago. As a clinical pharmacologist, I had always considered memorizing dosages a risky waste of time, and these tools finally allowed me to tell students and practitioners to find a good detailed formulary, not to memorize dosages, and to hone their understanding of drug classes instead. Today, that formulary can be on our phones, with detailed information about mechanisms of action, indications, adverse effects, and, of course, dosages. This allows each case to be an opportunity for review, or even for the original learning of some aspect of pharmacology. By now, most of us settle arguments by searching Google or Wikipedia. For our students, part of the challenge is framing the search with the right keywords or questions AND being able to judge the relevance and quality of the information returned by their query. THAT is what today’s veterinary instructors should model through example and train students to do for themselves as a career-long skill.

Evidence of AI in Veterinary Education

We should be encouraged to see more advanced examples of AI now popping up in veterinary education. Jamie Perkins, DVM, EdD, now of the University of Arizona and then of Lincoln Memorial University College of Veterinary Medicine, won first place in the Hyperdrive contest at the 2018 DevLearn Conference in Las Vegas for enabling the Amazon Alexa device to teach communication and clinical reasoning to veterinary students. In that project, she pre-programmed Alexa with case information to allow practice at taking a history.8 More recently, she led a successful USDA grant application entitled “Delivering a Comprehensive Food Safety Database to Support Early Career Veterinarians in Rural, Large Animal Practice as an Amazon Alexa Skill.”9 I can’t think of more direct attempts to face the issue of information overload head on. So what are “Amazon Alexa skills”? At their core, they are exercises in avoiding garbage in, garbage out, i.e., in asking the right questions.

So, in AI-facilitated medical education, learning can be focused on the seeking of evidence and the creation of schema and models around accepted ontologies, followed by the application of critical clinical thinking.6 Studies of disciplinary ontology already exist in medical and veterinary education using the multimedia, peer-reviewed writing platform CGScholar (https://cgscholar.com), led by Cope and colleagues, including this author.10,11 A more detailed analysis of student case analyses using text topic modeling algorithms seeks to offer machine feedback on a student’s work, effectively attempting to judge the quality of critical clinical thinking.12 Another study, using the same data, applied set-like operators to analyze complex thinking.13
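As a flavor of what “set-like operators” over text can mean, here is a hedged sketch; the concept-extraction rule and example texts are invented, and the cited SOFSAT framework is far more sophisticated. Each analysis becomes a set of clinical concepts, and plain set algebra then compares a student’s reasoning with an expert model answer:

```python
import re

# Toy sketch of "set-like operators" over text. The representation here
# (keep words longer than six letters) is an invented stand-in for real
# ontology-based concept extraction.
def concepts(text: str) -> set:
    return {w for w in re.findall(r"[a-z]+", text.lower()) if len(w) > 6}

expert = concepts(
    "Azotemia with isosthenuria suggests chronic kidney disease; "
    "recommend staging by creatinine and proteinuria."
)
student = concepts("Azotemia suggests kidney disease; recommend checking creatinine.")

covered = student & expert   # concepts the student matched
missed = expert - student    # feedback: what the model answer adds
extra = student - expert     # concepts beyond the model answer
```

The `missed` set is where formative feedback lives: it tells a student what the expert analysis considered that theirs did not, without waiting for a high-stakes exam to reveal the gap.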

Conclusion

Despite its potential to change how we assess student work and progress, and perhaps to ironically extend our “humanity” as more powerful mentors, the application of AI in veterinary education will still require radical curricular refinement to define appropriate disciplinary ontologies, and to learn how to represent them. In fact, we may be seeing some of this occur with the Competency-Based Veterinary Education (CBVE) guidelines being developed under the auspices of the American Association of Veterinary Medical Colleges (AAVMC).14 Appropriately, this leaves plenty of work for us academics, including the adjustment of not only the expectations of our students, but also of ourselves.

Duncan C. Ferguson, VMD, PhD, DACVIM, DACVCP

References

1. Stoewen DL (2015) Suicide in veterinary medicine: Let’s talk about it. Can Vet J 56(1): 89-92.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4266064/

2. Chan M (2019) Veterinarians face unique issues that make suicide one of the profession's big worries. Time, September 12, 2019.

https://time.com/5670965/veterinarian-suicide-help/

3. Hafen M, Reisbig AMJ, White MB, Rush BR (2008). The first-year veterinary student and mental health: The role of common stressors.  J Vet Med Ed 35 (1): 102-109.

https://jvme.utpjournals.press/doi/full/10.3138/jvme.35.1.102

4. Pettway A (2018) Veterinary schools work to remedy the profession’s mental health crisis. INSIGHT Into Diversity, April 16, 2018.

https://www.insightintodiversity.com/veterinary-schools-work-to-remedy-the-professions-mental-health-crisis/

5. Baker K, Sen S (2016) Healing medicine’s future: Prioritizing physician trainee mental health. AMA J Ethics 18(6):604-613.

https://journalofethics.ama-assn.org/article/healing-medicines-future-prioritizing-physician-trainee-mental-health/2016-06

6. Cope W, Kalantzis M, Searsmith D (2020): Artificial intelligence for education: knowledge and its assessment in AI-enabled learning ecologies. Educational Philosophy and Theory (published online February 18, 2020). DOI: 10.1080/00131857.2020.1728732

7. Wartman SA, Combs CD (2019) Reimagining medical education in the age of AI. AMA J Ethics 21(2): E146-152.

https://journalofethics.ama-assn.org/article/reimagining-medical-education-age-ai/2019-02

8. Carrozza A (2018): Amazon’s Alexa: Bridging a gap in veterinary education. American Veterinarian, December 11, 2018.

https://www.americanveterinarian.com/news/amazons-alexa-bridging-a-gap-in-veterinary-education

9. USDA Research, Education and Economics Information System: https://portal.nifa.usda.gov/web/crisprojectpages/1020726-delivering-a-comprehensive-food-safety-database-to-support-early-career-veterinarians-in-rural-large-animal-practice-as-an-amazon-alexa-skill.html

10. Haniya S, Montebello M, Cope B, Tapping R (2018) Promoting critical clinical thinking through e-Learning. In Proceedings of the 10th International Conference on Education and New Learning Technologies (EduLearn18), Palma de Mallorca, Spain.

11. McMichael MA, Ferguson DC, Allender, Cope W, Kalantzis M, Searsmith D, Haniya S (2020) Use of a novel learning management system for teaching critical thinking to first year veterinary students. J Vet Med Ed, in press.

12. Kuzi S, Cope W, Ferguson DC, Geigle C, Zhai C (2019) Automatic assessment of complex assignments using topic models. In Learning at Scale 2019, Chicago, IL.

13. Karmaker Santu SK, Geigle C, Ferguson DC, Cope W, Kalantzis M, Searsmith D, Zhai C (2018) SOFSAT: Towards a set-like operator based framework for semantic analysis of text. ACM SIGKDD Explorations Newsletter.

14. Reports of the Competency-Based Veterinary Education Working Group, on the website of the American Association of Veterinary Medical Colleges:

https://www.aavmc.org/additional-pages/competencybasedveterinaryeducation.aspx