
    Medical schools train faculty on AI best practices

    Faculty are training one another to understand artificial intelligence and use it to build curricula, develop quizzes, create presentations, and more.

    When it comes to learning to use artificial intelligence, Tracey Taylor, PhD, admits, “I was not an early adopter.”

    When the Oakland University William Beaumont (OUWB) School of Medicine launched AI training for staff early this year, “I was one of the curmudgeons kicking and screaming that this just seemed like something else to learn,” says Taylor, an associate professor of microbiology at the Michigan school. “I had no idea what he [the trainer] was showing us. I ate my lunch and left.”

    Yet Taylor felt “intrigued enough to give it [AI] a try” and learned to use it to draft test questions and accurate answers. “I realized this could make my work more efficient,” she says. “It wasn’t hard, and it wasn’t scary.”

    Today, Taylor uses AI for more classroom tasks and participates in efforts to teach other faculty at the school of medicine how to use the emerging technology.

    That experience illustrates the accelerating efforts at medical schools to train faculty and administrators on how to use AI for everything from routine tasks to classroom instruction to student assessment. The format of the trainings ranges from informal lunches and self-paced video lessons to webinars and hands-on workshops.

    “The groundswell [to create AI coaching] came from conversations with faculty feeling like they were not prepared to use AI tools responsibly,” says Kirsten Brown, PhD, MA, associate director of teaching and learning resources and career development at the George Washington University (GWU) School of Medicine and Health Sciences. “We had this gap” in AI skills between students, many of whom routinely use AI, and their instructors, says Brown, who organizes AI trainings for staff.

    There’s also a gap among faculty members, which can make it challenging to decide what to teach in a group session. As Taylor puts it, staff experience runs the gamut from “‘I know how to spell ChatGPT’ to some who could teach these sessions.”

    It’s the latter group — faculty with AI experience — who typically organize or conduct the lessons, sometimes recruiting staff from other areas of the university (such as librarians and computer instructors) to drill into particular AI functions. The training sessions cover AI topics the organizers think faculty should know, along with the specific skills faculty say they want to learn.

    Douglas McKell, DHSc, MS, moderator of the Artificial Intelligence Community of Growth, convened by the International Association of Medical Science Educators, stresses that faculty training on AI cannot be piecemeal. “A comprehensive plan to get faculty up to speed is essential, not just a nice thing to do,” says McKell, assistant professor of teaching excellence at the College of Population Health at Thomas Jefferson University in Philadelphia.

    What faculty want to know

    At many schools, the initial efforts set out to give staff a broad understanding of how AI works and its weaknesses. Those sessions cover the workings of large language models (LLMs) — the foundational technology that enables many AI tools to understand, analyze, and generate human language; the transparency (or lack thereof) about where an AI tool gets its information and how it generates answers; how AI tools produce errors; the risk of uploading data and private information into AI tools; and the bias that infiltrates AI tools because of the materials from which they learn.

    “We have to teach people principles and critical thinking about AI,” says Thomas Thesen, PhD, director of the digital health and AI curriculum at the Geisel School of Medicine at Dartmouth in New Hampshire. He says this fundamental understanding —“how to apply it [AI], how to check it, how to see where it’s appropriate” for various uses — will serve users well as AI capabilities evolve to handle specific tasks.

    To avoid overwhelming new users, however, it’s important to provide training on specific AI functions that faculty can master quickly.

    “When they use this technology, they need to immediately see the improvement in their everyday educator activities,” says Youngjin Cho, PhD, MS, associate professor of immunology at Geisinger Commonwealth School of Medicine in Pennsylvania, where she represents pre-clerkship educational uses of generative AI in the school’s AI governance efforts.

    Among the most basic hands-on lessons have been those about the best way to ask an AI tool to do something, such as create a quiz or develop a recovery plan for a surgery patient. This practice is known as prompt engineering. Generally, the more precise the prompt, the better. Prompt engineering involves crafting questions or commands that specifically define what the user wants to know or create, then writing follow-up prompts to fine-tune the request.

    Ronald Rodriguez, MD, PhD, professor of urologic science at UT Health San Antonio, shares an example from coaching faculty there: Asking an AI tool to draft postoperative physical therapy instructions for a spinal stenosis patient produces a generic plan of high-intensity movements. Adding information about a patient’s compromised respiratory system and orthostatic symptoms produces a less intense plan that reduces the risk of harm from exercises that might be too strenuous for that patient.
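    As a rough illustration of that follow-up-prompt workflow — not the specific tools or models these faculty use — the sketch below assumes the OpenAI Python SDK, an API key in the OPENAI_API_KEY environment variable, and a placeholder model name; the clinical details are illustrative only.

```python
# Minimal sketch of iterative prompt refinement (prompt engineering).
# Assumptions: the OpenAI Python SDK is installed (pip install openai),
# OPENAI_API_KEY is set, and "gpt-4o-mini" stands in for whatever model
# an institution actually licenses.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    """Send a single prompt and return the model's text response."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# First pass: a broad prompt tends to yield a generic, high-intensity plan.
generic_plan = ask(
    "Draft postoperative physical therapy instructions for a patient "
    "recovering from spinal stenosis surgery."
)

# Follow-up prompt: adding patient-specific constraints narrows the output,
# which is the fine-tuning step described above.
tailored_plan = ask(
    "Revise the plan below for a patient with a compromised respiratory "
    "system and orthostatic symptoms; favor low-intensity exercises and "
    "flag anything a physical therapist should review.\n\n" + generic_plan
)

print(tailored_plan)
```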

    Developing such case scenarios is one of the common requests that faculty make of AI. Among others: building curricula; developing questions and answers for tests, especially the United States Medical Licensing Examination; creating interactive online virtual patients; making presentation materials; drafting informed consent agreements for clinical trials; and summarizing collections of published medical literature.

    The wide array of needs illustrates the importance of tailoring the instruction to different groups of staff. “Radiology wants something very specific. Cardiology has something else in mind,” Brown says. “We cannot take a one-size-fits-all approach.”

    From informal to formal lessons

    As for how to carry out this coaching, the formats range from informal sharing among colleagues to structured workshops. In some settings, faculty members show colleagues how they use one AI tool to carry out certain tasks. Other sessions are led by self-taught AI super-users like Rodriguez, who writes code to customize AI tools for specific uses. Some examples from schools of medicine:

    • OUWB — Medical school faculty started a monthly brown bag lunch to trade their experiences with AI. “Someone booked a room, and we kind of rotated around. ‘Show us how you use AI,’” Taylor recalls. That grew into more structured sessions focused on particular AI functions, she says.
    • Rowan-Virtua School of Osteopathic Medicine in Stratford, New Jersey — Steve Garwood, EdD, helps to organize faculty discussion groups on specific AI topics, including sessions within departments of the school. “Sometimes we get five [attendees], sometimes we get 50,” he says.
    • Geisinger Commonwealth — Faculty in the department of medical education piloted a community of practice gathering, “similar to a brown bag lunch,” Cho says. Among the topics: using AI to revise an abstract for a session to be presented at a conference.
    • The Mayo Clinic College of Medicine and Science in Minnesota — Faculty can use self-paced online lessons or scheduled hybrid group sessions, says Heather Billings, PhD, MA, director of faculty development and codirector of the Academy of Educational Excellence. One example of the former is the “Take 5” video series: approximately five-minute videos that show how to use a particular AI tool for specific tasks, building on the experience of a faculty member.
    • GWU — The school developed an AI Technology Workshop Series that includes the use of AI in assessing residency applications, pursuing medical education research, and developing slides, clinical cases, and multiple-choice questions.
    • Dartmouth — Thesen codirects a teaching academy for faculty that includes classes on generating research ideas and helping professors create AI agents to perform tasks based on their objectives.
    • UT Health San Antonio — Rodriguez creates lessons for specific groups of faculty and administrators based on their requests. For example: He ran a session for about 20 staffers in the dean’s office to cover their questions about AI tasks, such as helping committees assess promotion and tenure applications.

    The most useful sessions, Cho says, are “those hands-on settings where everyone has their computers, and are sharing their outputs and having discussions” about what they generated with an AI tool.

    Several faculty members note that they routinely refer their colleagues to reliable instructional resources outside their institutions, including the AAMC’s Artificial Intelligence and Academic Medicine page, which includes webinars, virtual communities for learning, and guidelines for AI use.

    The future

    A major challenge for educators is the speed with which the technology is advancing. What someone learns about an AI tool’s capabilities today might be surpassed by a new version within months.

    “It moves so fast that if you don’t use it for a week, you feel that you’re behind,” Cho says.

    Do faculty need to keep up with every advancement? Maybe for the tools they routinely use, AI coaches say. More important, they add, is that medical schools continue to build foundational understanding of AI that applies across the tools.

    “We are not taking a tool-centric approach,” says Elissa Hall, EdD, codirector of the Mayo Clinic Harper Family Foundation Artificial Intelligence Education in Medicine Program. “We focus on human skills. Durable skills of problem-solving, critical thinking, and adaptability, inclusive of accuracy and bias.”

    At the same time, it’s the growing, hands-on use of AI tools by faculty and administrators that demonstrates the impact of AI coaching. At Dartmouth, Thesen recalls an anesthesiologist who, after attending academy training sessions, used AI to create an app that walks students through clinical-case examples to determine the best approaches for various patient conditions and procedures. At Rowan-Virtua, faculty tell Garwood how they use AI to build course materials, such as study guides for students. He shares this anecdote:

    “One of our anatomists, who’s been at the school for about 20 years, who is very well-loved and not highly technical, came up to me and said, ‘I love [Microsoft] Copilot. I’ve been doing so many things with it to work on my lectures and make them better.’”

    That educator, James White, PhD, assistant professor of medical education and scholarships at Rowan-Virtua, explained via email to AAMCNews: “I had never used AI before, but now I use it almost every day to add or simplify content in my lectures.”

    Recalling White’s comment to him about Copilot, Garwood smiles and says, “That’s a win.”