This fall, students at the George Washington University School of Medicine and Health Sciences (GW SMHS) can take a new elective called Artificial Intelligence Applications in Healthcare. At Harvard Medical School, students can earn an AI in Medicine PhD. At the University of Virginia (UVA) School of Medicine, students practice using AI to diagnose and develop treatment plans for fictitious patients. At UT Health San Antonio Long School of Medicine, students can take a year off to pursue a dual degree in medicine and AI.
Those are some of the ways that medical schools are teaching students how to use AI as future physicians and researchers. That’s a significant advance from just two years ago, when the sudden release of AI tools had faculty grappling with how the technology might help or hamper education.
“Half of doctors didn’t know what a chatbot was” back then, says Jonathan H. Chen, MD, PhD, director of medical education in artificial intelligence at the Stanford University School of Medicine.
Chen is guiding efforts to teach AI to students this fall, and notes that his new job title reflects the rush to keep up with expanding technology: “This position did not exist three months ago.”
The need is clear. Day by day, AI is becoming a routine tool to help physicians diagnose and treat patients through, for example, enhanced imaging of X-rays, blood-sample analysis, predictive analytics of wound complications, and analysis of patient genetics.
“We need to teach our students how to use AI responsibly,” says Ioannis Koutroulis, MD, PhD, associate dean of admissions at GW SMHS, in Washington, D.C.
Other medical school leaders agree: According to the 2023-2024 Curriculum SCOPE Survey of MD- and DO-granting schools in the United States and Canada, 77% of respondents said they covered AI in their curricula.
“We must prepare students for technology that’s probably going to disrupt their clinical practice,” says Andrew Parsons, MD, associate professor of medicine and associate dean for clinical competency at the UVA School of Medicine. “It’s going to improve our clinical reasoning if we know how to use it effectively.”
Learning AI
AI education in medical schools starts with the assumption that most of the students have used AI to varying degrees but without formal training. Studies have found that as many as 86% of undergraduates have used AI for schoolwork.
“The newest generation of medical students is conversant in [generative artificial intelligence] because they have used [AI] tools routinely as a part of their college education,” two medical school administrators wrote in a recent article in Academic Medicine. “Student expectations … have outpaced faculty development and strategic implementation.”
The various education initiatives start during the first days of school and progress through each year, ranging from onetime lectures to hands-on practice. They cover such areas as how generative AI learns and produces answers; strategies for crafting AI prompts to get the most useful response; ethics — for instance, not giving an AI tool identifying patient information; dangers such as inaccurate answers; and the importance of the human element, i.e., using AI as one factor in decision-making rather than relying on it as an authority.
“We’re not trying to replace the basic functions of a physician” with AI, Koutroulis says. “This can help you be more efficient, and the quality of care gets better.”
For example:
- At Harvard Medical School, in Boston, incoming students on the Health Sciences and Technology track get a one-month introductory course on AI in health care.
- Starting this fall, the Long School of Medicine will offer first- and second-year students two electives that introduce them to generative AI and to the programming language behind it.
- The University of Miami Miller School of Medicine recently posted a module for students on MedEdPORTAL, Exploring Applications of Artificial Intelligence Tools in Clinical Care and Health Professions Education.
- This month the Icahn School of Medicine at Mount Sinai, in New York City, announced that it will provide all medical and graduate students with access to OpenAI’s ChatGPT Edu (a subscription-based version of ChatGPT that focuses on educational needs) and train them in how to use it.
- Starting this fall, GW SMHS will offer third- and fourth-year students a two-week elective, Artificial Intelligence Applications in Healthcare. The course includes lectures and case studies about using AI for research and patient care.
The GW SMHS elective illustrates how AI tools are particularly well-suited to help with clinical course work. Koutroulis explains how students in the course will use AI to diagnose and treat patients in fictional cases:
“They can ask for a differential diagnosis. They can ask what kind of laboratory or imaging studies they should order. They can ask for potential treatments or recommendations.”
The students produce a final project using AI to help them work through a clinical case, with assessments of AI’s accuracy and how it contributed to their recommended course of action, Koutroulis says.
But AI education isn’t always delivered through AI-specific courses; it is gradually being woven into existing classes as well. “We are spending a lot of time redesigning some of the old courses and designing new courses as well,” says Arjun Manrai, PhD, assistant professor of biomedical informatics at Harvard Medical School.
For example, a Harvard Medical School course, Computationally Enabled Medicine, uses AI and other tools to analyze biomedical data, such as genomics and epidemiology. At the Long School of Medicine, an ethics-focused course called Medicine, Behavior, and Society includes a 90-minute presentation about generative AI, says Stephanie Gutierrez, MEd, the school’s manager of dual-degree programs. At the UVA School of Medicine, students in a foundational clinical course learn how to use AI along with other resources to help diagnose patient illnesses and develop treatment plans, Parsons says.
“We ingrain AI into our standard clinical-reasoning exercises,” he says.
The clinical exercises stress the skill of crafting AI queries about a patient’s case, including context that offers a holistic picture of the person without providing identifying information. “The way you ask the questions is very important,” Koutroulis says.
Learners as innovators
The next step is to turn some medical students into AI innovators, not just users.
Christopher Mao studied electrical engineering as an undergraduate student but says that along the way “I discovered my love for medicine,” prompting him to pursue a degree at the Long School of Medicine. While working in the Feldman Research Lab, where students used AI to improve cardiac imaging tools, he learned about the chance to simultaneously earn a master of science degree in AI from the University of Texas at San Antonio (UTSA).
“Everything just happened to align” in terms of his career aspirations, Mao says.
The students attend UTSA for one year after their third year of medical school, then return for the final year of their medical studies to graduate with the dual degree. Mao is finishing his year at UTSA, where he explored ways to improve AI tools for cardiac imaging — specifically, he worked with AI to generate histology-like images from intravascular optical coherence tomography (IVOCT), which uses light to create high-resolution images of blood vessel walls. The device is inserted through a catheter into the heart.
With standard IVOCT, “the image is difficult to interpret because it contains more information than the human eye can recognize,” Mao says. “It’s like looking at an ultrasound.”
He and other students used AI tools to integrate images of tissue samples with the optical images to help doctors better assess conditions of the heart.
In Harvard Medical School’s PhD track, students work with leading AI researchers to build tools that use generative language models, diverse data types, and computer vision to, as the program website says, “improve clinical decision-making and biomedical research.”
Students need to have elite skills in math, science, and medicine. “We’re selecting computationally enabled students who are deeply interested in medicine and health care,” Manrai says.
Learner demand is high
A foundational struggle for medical schools is keeping up with the evolution of AI and the demand for AI education. At GW SMHS, the new two-week AI elective is offered twice per year, accommodating a total of 40 students.
“We’ve received a lot of emails from students who want to participate,” Koutroulis says. “I think we’re going to have to cap participation and defer some students to future sessions.”
At the Long School of Medicine, the dual-degree program started small but is expected to grow as more students come to medical school with AI experience and interest in applying AI to medicine. Currently, one to three students are graduating from the program each year, but those were students already in the medical school when the initiative began, Gutierrez says. Now, she says, prospective students often ask about the dual degree, and “we talk about this on interview day as a way to recruit students who are interested in AI.”
As more and more incoming students arrive with thoughts of pursuing the dual degree, Gutierrez estimates that each year 10 to 15 students will graduate through that program.
In addition, Long School of Medicine students have formed a student interest group in AI, which gives them formal time with faculty to talk about the use of AI in medicine, even though they might not pursue the dual degree, Gutierrez says.
A major challenge for schools is that AI technology and use are advancing so rapidly that lessons created today might be old news next month.
“You must not plan for where we are” with AI tools, says Chen of Stanford. “You must plan for where we are going. We — as an institution, as a profession — are going to be left behind if we don’t plan for where it’s going.
“Because it’s coming at us really fast.”