
Amy Edmondson: Psychological safety is critically important in medicine

Gabrielle Redford, Managing Editor
November 13, 2019

Creating an interpersonal climate in which all employees feel empowered to speak up will lead to fewer errors and better performing teams, the Harvard professor and bestselling author told attendees at Learn Serve Lead 2019: The AAMC Annual Meeting.


Critically acclaimed author and Harvard professor Amy Edmondson, PhD, speaks during the closing plenary session on November 12 at Learn Serve Lead 2019: The AAMC Annual Meeting in Phoenix, Ariz.
Richard Greenhouse

When Amy Edmondson, then a PhD candidate at Harvard Business School, set out to study the relationship between error making and teamwork in hospitals, she expected to find that higher performing teams made fewer mistakes.

What she found instead, she told attendees at Learn Serve Lead 2019: The AAMC Annual Meeting, was the opposite: Better teams — as measured by a team diagnostic survey — reported higher error rates, not lower ones. “That forced me to think: Maybe better teams don’t make more mistakes. Maybe they’re more willing and able to talk about them,” said Edmondson, author of the critically acclaimed bestseller The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth.  

Further investigation proved her right: The highest performing teams did, indeed, cultivate an interpersonal climate in which everyone, from the lowest ranking employee to the highest, felt empowered to speak up. She called this climate “psychological safety,” and she has since conducted dozens of studies that show that it is a critical component of successful teams.

Creating psychologically safe workplaces is simple, but not easy, Edmondson told meeting attendees. It requires leaders to do three things: frame the work by ensuring that everyone is “on the same page about the risks that lie ahead,” invite engagement by asking good questions, and respond productively by not just appreciating employees’ input but by acting on it.

She cited numerous instances in which the failure to speak up had grave consequences — from the NASA engineer who knew about the defective rocket boosters but was unable to alert higher-ups ahead of the Challenger disaster, to the countless employees at Wells Fargo who were under such pressure to sell the bank’s financial services products that they lied rather than stand up to their supervisors.

“Can we have a moment of empathy for that engineer or that branch employee who believes it to be more palatable and more possible to cheat regulators and ... lie to customers than to tell the boss it cannot be done,” Edmondson said. “We need to have empathy for the pickle that we put people in.”

“Speaking up doesn’t make you look bad or incompetent or intrusive or negative,” she added. “Speaking up is heroic. It's the right thing to do ... it saves lives.”

In an exclusive interview with AAMCNews several weeks ago, Edmondson further explored why psychological safety is particularly important in medicine, how to foster teamwork in a hierarchical environment such as academic medicine, and the need to be ever vigilant in the quest to reduce medical errors.

Does creating an environment in which people feel free to report errors — without the threat of retaliation or judgment — lead to fewer errors?

Ultimately yes, but not all by itself. There need to be other things in place. Nobody wants to make errors, and nobody believes that errors are acceptable. Catching, correcting, and reducing errors are team activities, and if your teams don’t have the interpersonal climate they need to do that, then it won’t happen.

Academic medicine is hierarchical, with students reporting to residents who report to attending physicians and so on. How do you create an environment in which even the junior members of a team feel empowered to speak up when they spot an error — or admit they might have made a mistake?

Hierarchy is a risk factor, but hierarchy is not deterministic. You can have teams with the same hierarchical makeup with very different levels of psychological safety. It’s more about how those with higher status handle their higher status.

So how can those higher up in the hierarchy create this psychologically safe environment?

They should first and often acknowledge their own fallibility, not meaning that they are inept or deficient, but that they are human. If you are acknowledging your fallibility, you are saying, “I might miss something here. I need to hear from you.” You have to proactively invite input: “What did you see last night? You were rounding on patients. What’s going on?” It’s both an acknowledgment of fallibility and a direct invitation to voice that creates psychological safety. You have to remind people that health care by its very nature is a complex, error-prone system. It’s a given that things will go wrong. But a thoughtful leader will proactively invite input with good questions that signal he or she is genuinely interested in what people have to say. Leaders of psychologically safe teams are generally seen as accessible and approachable.

Then how do you hold people accountable for poor performance?

I will say that doing so through fear will backfire. What you’ll get is the illusion of accountability and not accountability itself. So the simplest way to hold people accountable is by conveying, motivating, and inspiring high standards. And also going out of your way to create psychological safety, so that when things get missed or bad things happen, people feel free to speak up. The most important insight is that this is not a trade-off. There is no trade-off between high standards and psychological safety. In fact, in medicine, they go hand in hand.

How do you create good teams, particularly in academic medicine, where potentially dozens of health care providers who might not have worked together before need to come together to care for a patient?

The first thing is to be explicit about the shared goal of providing great care to the patient, and then recognize where each member of the team is coming from and what they bring. It’s not a given. You may think, well, I understand medicine, you understand medicine, we must be coming at this the same way. But that’s not true. You need to pause and say, “What are we trying to do? What are we up against? What do we each bring?”  

I think most good clinicians are spontaneously curious and humble about what’s going on with their patients. We know that good clinicians have excellent diagnostic and question-asking skills. But they don’t always bring those same skills to their interpersonal and interprofessional relationships at work. So I’m really curious about my patients and I’m asking really good questions about them and then I turn around and yell at my colleagues or the nurses. 

You can’t team effectively without psychological safety. You’re not going to be asking the right questions, you’re not going to be willing to admit gaps in your knowledge, you’re not going to be willing to ask for help when you’re in over your head.

It comes back to purpose and the reason we’re all here, and that’s the patient and the privilege of being able to have a profound impact on someone’s life. Clinicians have to continually remind themselves of what’s at stake and why it matters — why it matters what they do.

What other industries might medicine look to in learning about how to create psychologically safe working environments where errors are rare and not the norm?

Medicine is a high-risk industry but it’s not one where the clinicians are themselves at high risk of harm. In industries such as nuclear power where the risk is immense and ever present, people have managed to create cultures and leadership models that have been summarized as high-reliability organizations. And they are characterized by immense vigilance and an ongoing, explicit discussion about vulnerabilities — safety vulnerabilities but also human vulnerabilities.

If we’re not talking about the potential for error and harm, we’re not doing our leadership jobs. That vigilance has to be nurtured. It can’t be something we take for granted. In high-risk industries such as nuclear power and aviation, they have managed to create cultures of tremendous vigilance and moment-to-moment learning that are based on interpersonal and interprofessional respect.

What that tells me is that when the stakes are high enough, it can be done. And where are the stakes higher than in health care?

Not all bad outcomes in medicine are preventable, though. Can you talk more about that? 

I think it’s easier if you distinguish between process errors, where there’s a complexity and things just didn’t line up the way they should have, and adverse events that you couldn’t have foreseen. [Consider] a well-understood procedure that was done incorrectly or a well-understood drug that was administered wrongly, as opposed to an allergic reaction that wasn’t known in advance. Our failure to distinguish between these types of errors makes it hard for people to talk about them. We’ve decided that they’re all shameful, but they’re not all shameful because we couldn’t have foreseen all of them.

It’s what we tell ourselves: Nothing bad is ever supposed to happen. But because we work in an error-prone system, bad things will happen. The only thing we can control is how we react to them and how much we learn from them.

Psychologically safe workplaces are not the norm, so nobody should feel bad about not having one yet. But they really matter because they are the early warning system.

This interview has been edited for clarity and length.

An earlier version of this AAMCNews article was published October 25, 2019, with the title "Why psychologically safe workplaces lead to fewer medical errors."
