Perspectives on Safety—Approaching Safety Culture in New Ways
This month's interview features Mary Dixon-Woods, MPhil, RAND Professor of Health Services Research at Cambridge University, Deputy Editor-in-Chief of BMJ Quality and Safety, and one of the world's leading experts on the sociology of health care. We spoke with her about new ways to approach safety culture.
In Conversation With… Mary Dixon-Woods, MPhil
Editor's note: Dr. Dixon-Woods is RAND Professor of Health Services Research at Cambridge University and Deputy Editor-in-Chief of BMJ Quality and Safety. Her research focuses on patient safety and health care improvement, health care ethics, and methodological innovation in studying health care. We spoke with her about approaching safety culture in new ways.
Dr. Robert M. Wachter: What got you interested in safety culture?
Dr. Mary Dixon-Woods: I'm a social scientist by training. So my bread and butter is looking at aspects of how people behave, the norms that guide them, the values that seem to inform their practice, how that shows up in organizational routines, how it shows up in how people behave toward each other. I'm also interested in questions of how or whether it's possible to manipulate behavior. What got me interested is that I'm very committed to doing research that helps patients ultimately. It's very important to me that the kind of research that I do is actionable, has practical meaning, and is important to people. Culture is one thing I see as being critical to patient safety.
RW: During my training years, I learned precisely nothing about culture. Folks like you have helped teach us about it. As you came into health care, what were the surprising things you saw about the way physician culture manifested itself?
MDW: I would probably challenge you and say you learned an awful lot about culture as a medical student. Some of the best work done in medical sociology has focused on the culture of students. There's a famous book, published in 1961, called Boys in White: Student Culture in Medical School. This classic text explains how medical students are socialized and acculturated into particular ways of behaving. So they learn very quickly what attracts rewards, but they also learn, even more importantly, what gets sanctioned and what kinds of punishments will apply. These may never be identified as punishments, but nonetheless, that's how they are experienced. You figure out what will get you yelled at, what kinds of things will prevent you from getting a good recommendation for your next job, and how to identify the role models in your environment. So I would say you learn a huge amount about culture through this informal curriculum. The achievement of people like Charles Bosk was to begin codifying some of that, to give us a language and a way of talking about it, so that we know more about it and can turn it into something valuable and studiable.
RW: The culture obviously is all around us, and we are enculturated as we enter a profession. But it was never explicit. This gets very meta very quickly, but nobody ever pointed out to me, or to anyone in that era, that there was such a thing as culture: that it's happening, that it's studiable and maybe even modifiable, and that it's an important determinant of key outcomes. How important is pointing out culture to people who are creating and living a culture, in terms of ultimately doing something about it?
MDW: That's a great question. It's become an exasperation to me that culture is now used too extravagantly to explain almost anything. It's used as a kind of shorthand for saying, "We don't know what's going on, but it's all about the culture," as if somehow this were a great insight. I almost never use the term culture when I'm talking about behaviors, norms, and values in a situation. People often say that what we, as social scientists, do is hold up a mirror. Typically, when we're doing one of our studies, my wonderful ethnographers will go into a setting. They may hang out there for a week or more, and they will simply come back and describe what they've seen. We can use theoretical terms to turn what we've seen into an account that is sociologically satisfying. Then, not infrequently, we feed it back to the place where we did the observations. This often comes as a revelatory moment for those participants. It's not that they didn't know it. It's a bit like you saying you didn't realize you were being socialized, or that nobody ever talked about it. But somehow telling somebody this is how you do this thing, and this is how other people do this thing, can often act as an intervention in its own right. Culture is a term I don't find particularly useful. It has this kind of delinquent character to it, promiscuous even. It's too readily available.
RW: Too facile.
MDW: Yeah. So it explains everything and nothing. It's not a term I particularly like.
RW: When you describe to people what they did and what the ethnographers noticed, how does that change behavior? Do you find that people are much more receptive to making hard changes after hearing it?
MDW: Another great question. We've not always followed this through to the end; giving people feedback as an intervention is something I'd like to study in its own right. What people can do is take that feedback and use it to disturb what they're doing in a very positive way. They can use it to say things that otherwise are forms of what we call forbidden knowledge. You find forbidden knowledge in every organization. It's the kind of thing people know, but it's very difficult to give voice to because it's dangerous to know; if you articulate it, you end up making all kinds of mischief for yourself or for other people. Being able to channel that through a neutral broker, which is usually what we are, and then make it visible and call it out is a valuable intervention.
RW: Can you give an example of a piece of forbidden knowledge?
MDW: A very common one would be somebody telling us that somebody is not acting in a particularly helpful way, yelling at people or whatever. A more subtle one could be that there is no forum for raising concerns that will actually convert anything you say into action. For example, junior doctors are supposed to be given quiet time to complete their notes, but they never get to do it. Whenever they try to say, "We really do need time to do this; if you interrupt us in the middle, we don't get the notes done, and then we haven't created a good record for the handover," they end up being seen as whiners. That's a form of forbidden knowledge.
RW: Over the last 15 to 20 years, our way of thinking about mistakes and harm has changed, with much greater focus on system flaws rather than personal flaws. Hearing you discuss the loose use of the word "culture," it strikes me that there's almost a 2×2 table there: the importance, and maybe overuse, of the concept of culture against the importance, and maybe also overuse, of the concept of systems?
MDW: About 7 or 8 years ago, I spent a lot of time looking at the problem of bad apples and became quite intrigued with it. There is a literature on bad apples from completely outside health care, which shows that if you plant somebody in a group and get them to behave in particular ways, like a spoiled child, you reduce group productivity by about 60%. Their effect is so pernicious and pervasive. Health care systems are not immune to having bad apples. They're a relatively rare problem, but they're not an unknown one. You could say that it's up to systems to detect and deal with these people; in my experience, they don't do that particularly well. Systems need to be able to cope with the fact that there will be some people, even if a very small number, who are uncooperative, lazy, or don't make enough of an effort to get along with others and contribute appropriately.
To a social scientist, the idea that you would say systems don't include people seems extraordinary—it's a nonsense thing to say. Systems include people, and we have to understand how people work in systems. I completely agree—systems and culture both end up being dustbin terms that don't take us very far. Recently, we've been thinking about how you would look at accountability in systems. There is a real danger that you take a construct like accountability, and you return to a really unhelpful idea that it's all about people having to demonstrate that they are contributing in the right way. We're interested in getting to a quite sophisticated view of what accountability means. Sociological literature describes how easy it is to exploit people in organizations by appealing to professionalism—even though nothing has been done to optimize the operational systems, equipment, supplies, or anything else that people need to deliver on their responsibilities.
RW: Putting all that together, it sounds like there's a lot of Kabuki dancing going on, in which there might be a bad apple, but nobody wants to call him or her out because that's forbidden knowledge. There is a way of demonstrating that you're acting within an accountability framework, but some of it might be for show. It seems like there's a level of authenticity that good organizations somehow get to, one in which you strip away this artifice so that people are being human and are being open about what's hard and what's easy. Yet very few organizations have reached that Nirvana, as far as I can tell.
MDW: Authenticity is a terrific word, and it's much better than those very normatively infused words like transparency and accountability. There is a terrific literature in the sociology of accountancy, which talks about how the accounts organizations give of themselves may have very little relationship to the realities of what's going on. That literature, in fact, predicted the Enron disaster a couple of years before it happened. When we apply the sociology of accountancy to something like health care quality and safety, it's very easy to see how you could produce these displays of compliance. They have very little to do with authentically producing an environment in which safe care can flourish.
RW: You talked about a number of trends that received a lot of attention: the focus on culture, measurement, transparency, and professionalism. At some point, do we reach a more mature phase, where we've gotten through all these trends and we're now getting down to what's real? Do you see signs that we're moving in the right direction?
MDW: In some ways yes, and in some ways no. In the science, we're seeing some very welcome shifts. I'm Deputy Editor-in-Chief of BMJ Quality and Safety, so I see a lot of the research coming out, and it's clear we've moved a long way over the last 10 years in terms of understanding patient safety. What I'm seeing, for example, is the notion of error increasingly receding and an increasing focus on risk controls, risk reduction, and protection from harm. That seems positive to me, because the idea that error and harm are directly linked was probably one of the mistakes made early on in the patient safety movement. Errors happen all the time; probably a million drug errors will be made today, and very few of them will result in harm. So the idea that error is the appropriate target has been left behind. Instead, what we're seeing is a focus on the right ways of protecting people and the risk controls we can put in place, along with the many other ways in which we need to think about protection that have nothing to do with error, such as better prognostication and better diagnosis.
The diagnosis agenda is interesting because it may have very little to do with error and much more to do with optimizing how we characterize the information we have about patients. That's a very welcome shift. Some of the stuff about Safety II I don't fully agree with, but that move is generally positive in that there's much to learn from places doing really great things, or from examples of things going right. That seems very obvious to me: we don't learn how to avoid obesity just by looking at fat people. We need to look at thin people too, and at what they're doing. So the trends in the science are generally very positive, and I'm very pleased with the way that field is maturing. On the policy end, I'm not seeing the same level of maturity. Particularly in the US, I have been really quite chagrined by the continuing emphasis on things like safety being used as a marketing element. The whole dispute about rankings of hospitals reflects a continuing preoccupation with displays of compliance, and the real danger is that you elicit organizational behaviors that are antithetical to safety rather than encouraging it.
RW: Part of the way you became known, at least to American audiences, was through your work on the Keystone Project in Michigan, not only studying the implementation of what was a seminal success but also helping us understand more deeply how it worked, then looking at how it translated to other settings. That's an enormous body of work. Can you summarize it?
MDW: The first thing to say about the Michigan program is that it had a very good technical intervention. It's too easy to forget how important it is that you actually know how to solve the problem. Too often with patient safety or other quality improvement interventions, I see a rush to implementation when the evidence for the technical intervention is not as good as it should be. If we're going to implement something, we have to know the thing actually works. We've seen some of the dangers of what happens when we don't: for example, the routine use of beta-blockers in surgery, where the technical intervention just wasn't the right thing.
The second is that you need an optimized implementation strategy. No two interventions are going to be exactly the same in how you implement them. What Peter Pronovost and Chris Goeschel pulled off in Michigan was that they figured out how to persuade people that there really was a problem. Central line infections were seen as the price of doing business in ICUs. They were able to demonstrate it really was a problem and that it was tractable. That won't be the same for every single safety problem that you come across.
One of the things that has been forgotten about Keystone is that they got the implementation strategy right by trying and failing several times at Hopkins. So they had both the technical intervention and most of the implementation strategy right before they went to scale. That's another important lesson: try it out in a few pilot sites first. The replication and spread of what they achieved in Michigan was hampered because, unfortunately, it mutated into a very simplistic story about checklists. I know that Peter Pronovost was exasperated by that story. We were able to explicate the mechanisms that led to the change in Michigan and articulate things that were invisible to the people involved. How you replicate at scale seems to depend on really understanding what happened when it worked that time.
Too often, the attempt to replicate at scale produces an etiolated, pale imitation that doesn't reproduce the mechanisms; it simply reproduces the outer, superficial appearance. When the attempt was made to take the program to England without understanding what was there, it was not as successful. It did work in the sense that it achieved the same infection rate, but it could have achieved so much more had there been a better understanding of what was needed to make it work. So to summarize: it's important to have a good technical intervention, have a good implementation strategy, and put the two together. Once you've gotten it to work once, your responsibility is to figure out how you get it to work more than once. That means a really sophisticated understanding of both the components of your program and its mechanisms.
RW: As I listen to you, I'm wondering how you can prevent all of the bad things from happening. This becomes a play within a play. You were part of the project, and you were well aware that this complex intervention depended on a lot of nuanced things that might get watered down. What will this look like when someone who isn't as charismatic as Peter is managing it, and someone who isn't as sophisticated as you are is trying to manage the sociological aspects of it? There was some prospective thought about the problems of scaling, and still the problems emerged. How do you prevent those problems, which seem to me almost natural and maybe even inevitable?
MDW: That is, in fact, where my current research program is going. In psychology they talk about manualization: you have an intervention that seems to work, and you manualize it so that it can be taught another time. That probably isn't quite right for what we're trying to do with patient safety; it's much too mechanical a metaphor. But something a tiny bit like that is what we need to do. We're currently looking at successful examples of this positive deviance idea, sites where we can tell from hard outcome data that they're getting something right, in the same way that Peter Pronovost had hard data from Michigan. When we find a great performer, we go and spend time there. Very often we discover that the things producing the good outcomes are not fully articulated or even recognized by the people in the setting; it takes us coming along to see what they're doing. Our concern now is to see whether we can convert that into a package that can be reproduced at the next site. That is the key question. I'm sick of things that work once.
RW: How does the digitization of the health care work environment change the kinds of issues we've been talking about, and the ability to improve care, safety, and quality? Is it just that you've digitized the world but the issues are the same? Or are there fundamental changes that occur once you're operating in a digital environment?
MDW: Well, I think your book has shown the latter. It is completely fundamental, and it is a profound restructuring of the work. It comes back to your earlier point: is there really a distinction between the people and the systems? The effect of that system change is to reimagine even what the work is. It's restructuring the routines and the relationships: between people and machines, between people and people, between teams and teams. It's restructuring the nature of professional roles and professional identities. I love Abraham Verghese's notion of the real patient keeping the bed warm while the doctors look after the iPatient. It's transforming what it means to be a patient. I think the anguish in the US has to do with these remakings of professional identities. This is one of the most profound changes we have seen in health care. We often say that culture isn't that malleable, and that if it were, it would be very unstable; there are evolutionary reasons why it's difficult to change cultures, because they're there to keep coherence. But what we have seen in health care, particularly in the US, has been a massive, large-scale intervention in which very little was framed as cultural change rather than simply organizational change.
RW: I guess the question going forward is the same scaling question. Now that we know it, how do we change it?
MDW: Actually, just as we're talking, it's occurring to me that your intervention in the English health system [producing recommendations for digitizing the National Health Service; RW chaired the committee and MDW was a member] is tremendously helpful. It's not manualization, but what you did by convening that group and sharing the learning from the US, and even from your book, was to articulate and synthesize things. It was essentially a social science exercise that you engaged in, and some of it was forbidden knowledge. So there is real value in articulating that, codifying it, and saying, "Here are the lessons you can learn from the pain we went through." Already we're beginning to see evidence of this, and it will mean we don't have quite so much pain when we do it. But human nature being what it is, I suspect we're still in for quite a bit of pain.
RW: Take out your crystal ball. Given the trajectory you've seen in all of this work over the last 10 years, as you project 5 or 10 years forward, what do you think we're likely to see?
MDW: I'm hoping what we're going to see is more sophistication about the deployment of quality improvement (QI). I think one of the paradoxes of the patient safety movement has been the promotion of local QI as a solution to patient safety. It's clear to me that, unless that's really well coordinated, it's really not the answer to some kinds of important problems, though it is for others. As part of that, we're going to see much more emphasis on large-scale efforts, much more emphasis on how we bring people together, find collective solutions, and much more emphasis on using labs and pilot testing so we get things to work. That will happen when we can say we've optimized the technical intervention, we've optimized the implementation strategy, and we're looking to the best mechanisms and modes for replication and scaling.
RW: Every time I read what you've written or talk to you, I feel like part of the intervention will need to be having more people like you help organizations or collaboratives. Yet it seems quite unusual for organizations to find such people and hire them for this work. I don't think there's a general appreciation of the value an outside expert like you brings to these interventions.
MDW: Well, that's a very good observation. You started by saying, "I was never told about culture as a medical student." There is this sense that culture sits in the same category as communication skills and other things that don't have that hard-core feel. People are inclined to think, "Those are the kinds of things we're doing anyway. They don't need to be taught. This is just fluffy stuff."
The reality is that that's usually the really hard stuff. Anyone can learn science if they try hard enough. Doing the cultural work is much tougher and takes a very specific kind of skillset. Linked to that then is this sense that social scientists are frivolous add-ons. You know, nice if you have a bit of spare cash, but the role that they might play isn't valued. I'm not sure social scientists have always done this particularly well, either. They typically have their own agenda. They publish in journals that clinicians and managers don't read. They may write in ways that are completely inaccessible. And their commitments may not be to improvement. So as a kind of personal crusade, I'm keen on trying to make social science matter. My big question is: how do we make it practical and useful so it helps patients?