A renowned geriatrician and New York Times best-selling author says we must change policies, improve training, expand research — and above all, combat ageism — if we hope to improve care for the 62 million people in the United States who are 65 or older.
Editor’s note: The opinions expressed by the author do not necessarily reflect the opinions of the AAMC or its members.
When I turned 30, I was shocked to find myself single and childless. I felt like a failure. Never mind that I was a resident at a prestigious medical center working 100-hour weeks. Never mind that I had friends and family and, for the first time, my own apartment, that I was becoming a decent doctor and solid recreational runner, that I had just enough free time and income to sometimes go out to dinner, a movie, or a literary event.
At 40, I had a successful, satisfying career caring for older people, but I started joking that I needed a sign above my head while I was jogging, as a warning or explanation of my declining pace: Middle-Aged Woman Running. I started secretly racing people: men a bit older than I was, women a bit younger. I tried to pass them and then stay ahead. This gave me a shameful amount of pleasure and probably reflects poorly on my character.
At 50, I felt 35 — internally, that is. That feeling was ridiculous, since I actually felt nothing like I had felt at 35. Mostly I felt better. More secure and confident, less anxious, and more content in myself and my life, if not in my work, because of the growing complexities and perversions of the health care system, with its focus on procedures and profits. Externally, certain changes in my body had reached the stage of undeniability: the prominent tendons and veins of my hands, the facial wrinkles, the beginnings of a disappointing menopausal shape-shift. Medical students who once would have easily transitioned to calling me “Louise” now mostly stuck to “Dr. Aronson.” After all, I could be their mother.
I realized that I — the aging expert, professor of geriatrics at the University of California, San Francisco, and author of dozens of articles about ageism and old age — had harbored the delusion that I would remain essentially 35 until one day, with a cartoon-style poof, I would be 70. I think that’s why being 50 so surprised me. It seemed that in some persuasive and wholly irrational part of my brain, I had believed that aging was only something that happened to other people — parents, teachers, patients.
To consider old as other is a near-universal phenomenon, even if it takes a variety of forms, from condescending compassion or romanticizing to disdain, discounting, and dehumanizing. This is visible in all sectors of society and health care — including in academic medicine, where education on the care of older adults too often ranges from negligible to very limited and is distorted by prejudice. Indeed, the health care system’s entrenched ageism — with its gendered and racialized intersectionality — makes medical training and practice part of the problem at a time when media commentary on the aging of the human population routinely includes terms such as “bomb,” “burden,” and “tsunami.”
Conversations about aging, both generally and in medicine, are full of hypocrisies. In employment, there are calls for older people to “get out of the way,” alongside laments about the burdens posed to families and Social Security by dependent elders. Employers often view older workers as costly and less productive, yet evidence shows that age-diverse teams are more productive. In health care, we invest hugely in keeping people alive and then complain about the significant medical needs of all the old people we have kept alive. In medical research, we exclude older people from trials by age, then apply trial results to those excluded older patients and blame old age for their adverse outcomes.
This counterproductive blend of ambivalence and ageism plays out in medicine in all the ways the World Health Organization (WHO) has called out: personally, interpersonally, and structurally. As the 2021 WHO global report states, “Ageism is everywhere,” influencing “how we think (stereotypes), feel (prejudice), and act (discrimination) towards others or ourselves as we age.” Daily, I hear politicians, clinicians, and health-systems leaders in their 60s, 70s, and 80s refer to old people or the elderly as if, by virtue of their roles and adequately functioning bodies and brains, they don’t belong in those categories themselves.
Medical ageism manifests in many ways and in many arenas. At the federal level, it’s gratifying to see that the Department of Health and Human Services recently announced that Medicare will reward hospitals that provide data supporting “age-friendly medical care” and that it has invested $200 million in training primary care providers to better serve older Americans. Yet much work remains before older Americans can receive life-stage-informed care.
In 1986, the National Institutes of Health established a policy that encouraged inclusion of women and minorities in research, with the goal of ensuring that “findings can be generalizable to the entire population.” It added children in the 1990s, but the policy wasn’t extended to older adults until 2019, despite the abundant data on harms to older patients from drugs, treatments, and procedures studied only in younger and healthier people. A medical degree isn’t required to see the illogic of a system that excludes a population based on their differences, applies the results to them despite those differences, and then blames age, not faulty science, for that population’s poor outcomes.
The Centers for Disease Control and Prevention suffers from similar shortcomings. Its annual vaccine schedule includes more than a dozen subcategories for children and three for adults, but people ages 65 and older are lumped into a single group. Many people in the later part of that 40-plus-year span are excluded from vaccine trials — including the COVID-19 trials that saved so many older people’s lives ... by luck rather than study inclusion and scientific evidence. The need for vaccination is determined partly by immunological, biological, and social factors — all of which differ markedly among older people. At older ages, health and functional status may be better determinants of vaccine requirements than age alone, and toward the end of life, vaccination may not improve health or be consistent with the person’s medical priorities. No less important, scientists are working on strategies to bolster the aging immune system prior to vaccination to increase the body’s response, yet another factor to consider when making vaccination decisions. A truly age-inclusive health system would look at each life stage with the same scientific rigor and creativity.
The numbers tell an important story. Currently, 18% of the U.S. population is 65 or older, and this group accounts for about 40% of hospital stays and a quarter to a third of outpatient visits. Usually, medical education and health systems are responsive to demographics and epidemiology, but that’s not so when it comes to the needs of older patients.
According to the Curriculum SCOPE Survey, 96% of medical schools reported covering geriatrics in 2022-2023. While this is progress, some schools spend just a few hours on geriatrics, whereas all schools devote months-long rotations to the care of children and of adults under 65. Most clinicians don’t treat children or pregnant women or deliver anesthesia, yet those rotations are required, while training in the care of older adults, something a majority of clinicians do routinely, is not. At my top-10 institution, for example, medical students get less than two weeks of geriatrics training, and geriatrics has been waived for our medicine residents for over a decade.
The United States has an age-based approach to medical training and care in the form of pediatric and adult specialties and children’s and adult hospitals. Yet knowledge and skill in the care of older people — the age group most likely to use health care — are largely considered optional throughout medical education. This is a blatant inequity in an era when the AAMC, the New York Times, and countless others have called attention to the unmet medical needs of our aging population. Health systems also devalue older lives, albeit in different ways. For example, if a few people or programs within the system adopt four evidence-based elements of quality care for older people, the whole system gets the “Age-Friendly” designation. Finally, our country’s primary sites of old-age-specific care are nursing homes, places that are neither homelike nor truly medical, and places we all hope to avoid.
The field of geriatrics itself is not without fault. In response to the rapidly growing aging population, the surge in information about medical care for older adults, and the geriatrician shortage, geriatrics has narrowed its focus to the oldest and frailest adults, far from its original goal of caring for all older people — just as pediatricians care for all children, from newborns to adolescents. In the meantime, the healthy-aging and longevity industries are booming, with huge responses from the public, scientists, and funders who aren’t getting what they want or need from our health care system. Because trainees are exposed to the care of the oldest and frailest patients but not to healthy aging and longevity, they don’t see the full range of opportunities and pleasures of caring for older adults. Because the focus is so narrow and the training is so brief, when trainees do care for older patients, they feel less inspired and competent. It’s a normal human response to steer clear of something that makes one feel inadequate, particularly after a decade or more of training.
It is important to note that ageism is not directed only at older people. We claim to be a society that values children, but more children live in poverty than people in any other age group, and in medicine, pediatricians are paid significantly less than their counterparts who care for adults. Not coincidentally, both pediatrics and geriatrics are women-predominant specialties, caring for populations that are looked after at home primarily by women family members.
The good news is that it doesn’t have to be this way. Medical schools and teaching hospitals can and should lead the way toward designing medical education that prepares all clinicians to provide high-quality care to older adults, and that values and rewards care of children as much as care of adults. An inclusive step in the right direction would be promoting curricular requirements that acknowledge the entire lifespan when covering all organ systems, diseases, and health-system challenges. One way to get there is for medical schools, teaching hospitals, and organizations from the AAMC to the Liaison Committee on Medical Education and the Accreditation Council for Graduate Medical Education to ask themselves, with every advocacy position, requirement, and public-information release, whether they have considered childhood, adulthood, and elderhood — not so much equally as equitably, across all forms of human diversity, so patients of all ages and backgrounds get what they need and deserve.
The number of older people in the United States will continue to grow for the foreseeable future. Most of us will enter that group sooner or later. Training clinicians to provide high-quality, evidence-based care to older adults and combating ageism in medicine are not add-ons. They are necessities.