Sunday, February 7, 2010

The antidote to medical errors


The antidote to medical errors.
Price M. Monitor. January 2010;41:50.

This feature article explains how cognitive errors contribute to medical mistakes and describes ways to lessen their occurrence.


open here to access the original AHRQ document:
The antidote to medical errors

The antidote to medical errors
Psychologists' insights into humans' penchant for mistakes are making medical procedures safer and more efficient.

By Michael Price

Monitor Staff

January 2010, Vol 41, No. 1


In a perfect world, physicians would never get tired and never get stuck on the wrong solution. They wouldn't have to hand off their patients to the physician working the next shift. Nurses would communicate instructions to each other with perfect clarity. Not so in the real world. Most medical errors are small, procedural slip-ups. But occasionally, they blossom into full-blown tragedies.

In fact, the correct diagnosis is either missed or delayed in 5 percent to 14 percent of urgent hospital admissions, with autopsies suggesting diagnostic error rates between 10 percent and 20 percent, according to research by Ian Scott, MD, director of internal medicine and clinical epidemiology at Princess Alexandra Hospital in Brisbane, Australia.

What accounts for these errors in physicians' reasoning? It's not incompetence or inadequate knowledge, Scott says. It's the fact that physicians have a tendency to get stuck in particular modes of thinking.

Recognizing this, psychologists are members of the teams working to mitigate such medical mistakes by designing health-care systems and practices that encourage clear communication and quality cooperation. Their efforts are focused on encouraging physicians and other medical workers to think differently about how they diagnose and manage diseases.

“There's a huge opportunity to study these settings so that it supports a patient-centered approach and continuity of care,” says cognitive psychologist David Woods, PhD, professor of ergonomics at Ohio State University in Columbus.

Checks and balances

Medical procedures in most hospitals combine factors that Woods describes as resilient — such as workers' attention to detail and makeshift rules intended to prevent errors — and brittle — including human oversight and systems that break down under unforeseen circumstances. He began looking at systemic strengths and weaknesses in 1979 when he was asked to help redesign control rooms to avoid future errors at Three Mile Island Nuclear Generating Station. Since then, he's helped critique and redesign workflow plans for hospitals, the aviation industry and NASA Mission Control.

Woods describes a chemotherapy regimen he and his team analyzed at a hospital as a real-life example of how a system can be both resilient and brittle (Cognition, Technology & Work, Vol. 9, No. 3). The medical staff at the hospital started a patient on chemotherapy over a weekend. As usual, the medicine was routed through a pharmacist and nurse, but it turned out the chemotherapy plan had an error and the patient received the wrong medicine. The ability to detect and correct such mistakes determines whether a system is resilient or brittle, Woods says.

So, what went wrong? It turned out the physician made a prescription error. Still, a resilient system has procedures in place to catch a physician's mistake. And the hospital even had an ad hoc rule designed to do just that: don't start chemotherapy on weekends, because the most knowledgeable physician and pharmacist didn't work then. But that rule looks pretty flimsy if your loved one has cancer and you want them to start treatment immediately, Woods says. More robust measures are necessary.

That's where psychologists come in. They can help medical personnel recognize and correct these errors by encouraging them to check each other's work and raise concerns as soon as possible. One of the big things psychologists look for is cross-checks, Woods says. “We get high reliability, not by perfecting each component, but by putting imperfect components into a system that overcomes the weaknesses of each component.”

In the chemotherapy example, Woods identifies what he calls the “are you sure?” problem. The weekend nurse and pharmacist did notice that the physician's prescription didn't appear to fit the patient's condition. They asked him, “Are you sure this is the right medicine?” He said yes, so they continued with the plan.

But very generic questions like "Are you sure?" are often ineffective at catching errors, Woods explains. When people are fixated on a particular idea, it takes specific lines of questioning to get them to consider possible alternatives, he says.

What works better? Woods suggests a statement like, “I didn't know about combining X and Y.”

“When you're into the flow, you get caught in the error,” he says. “You need a fresh perspective to uncover mistakes.”

Ian Scott has explored why these cognitive breakdowns occur in the diagnostic process. In an online analysis he did in June for the British Medical Journal, Scott wrote that in many instances, physicians use rules of thumb and shortcuts to pare down the potential diagnoses and start a treatment course. In most cases, those rules are very accurate. But they can also get physicians stuck in a mental rut, ignoring contrary evidence, discounting the patient's own history or just failing to think about other scenarios, Scott says. For example, even in cases where patients have a history of cancer, physicians might be more likely to attribute back pain to osteoarthritis or other common causes than to a metastatic spinal disease, Scott notes.

To head off such errors, he recommends fostering a climate where seeking second opinions and advice is encouraged, not ridiculed. He also recommends that hospitals have clinical audits and mortality and morbidity reviews through which physicians' diagnoses and treatment plans are openly discussed. Beyond that, Scott suggests physicians develop their knowledge of “basic error theory and skills in meta-cognition — that is, thinking about their thinking,” so they learn to recognize situations in which they're prone to making mistakes.

Details, details

Shahar Arzy, MD, PhD, is looking at how a single misleading detail can throw off a physician's diagnosis. In a study in the October Journal of Evaluation in Clinical Practice (Vol. 15, No. 5), Arzy, of Hadassah Hebrew University Hospital in Jerusalem, Israel, looked at ways physicians might learn to think about their diagnoses differently to reduce errors.

In his experiment, Arzy assigned 51 physicians to one of three groups. One group received a series of clinical cases embedded with prominent but misleading details, such as the case of a young girl complaining of pain in her lower ribs since a skiing fall three months earlier (in fact, she has non-Hodgkin's lymphoma). Another group was given the same series of cases but was told to beware of misleading details. The third group received the same cases, but with the prominent detail changed to something trivial and unrelated.

Arzy asked them to make a diagnosis and note which detail they considered most influential in their decisions. Then he asked them to disregard that detail and make another diagnosis.

In both groups working with the misleading details, physicians made a wrong initial diagnosis of trauma 90 percent of the time. After Arzy asked them to omit the most influential detail, their error rate dropped to 30 percent.

Echoing Scott, he says that physicians who use rules of thumb usually do make the correct decisions, but they're also susceptible to misleading details that can throw off their diagnoses. But by training their minds to be on the lookout for these details — and omitting them to see if another diagnosis would also make sense — physicians can reduce their diagnostic errors, Arzy says.

From weak link to strength

All the personal attention to detail in the world can get lost in the shuffle of a “handoff,” though, when medical workers hand over responsibility for a patient to another medical worker. With the mix of different individuals and specialties — nurse, physician, pharmacist, etc. — high-quality communication is essential to make sure all pertinent information gets across, Woods says. Some physicians know all too well the risks involved with a handoff and would rather work long hours to prevent one. But those longer hours take a toll of their own on cognitive skills, Woods says.

The better solution, he says, is to design handoff systems that incorporate lots of cross-checks. After studying shift changes at the NASA Johnson Space Center in Houston and two Canadian nuclear power plants, Woods published a list of cross-checks in 2004 that he believes make for good safety controls (International Journal for Quality in Health Care, Vol. 16, No. 2). Face-to-face interactions are better than written notes, he says, but once the interaction is over, there's less of a written record to rely on. One solution, Woods suggests, might be to audio-record notes on the patient's status, then also hold a face-to-face meeting with the incoming worker to hash out specific concerns. By building in these multiple cross-checks, the handoff becomes not a weak link in the system but a chance to add resiliency, he says.

“Yes, misassessments can get introduced in a handoff, but it's also an opportunity to correct things,” Woods says.

Another opportunity to correct problems before they begin is to focus on examples in which medical workers prevented an error. Woods learned from his experience working with Mission Control that getting lots of smart people together to work on a problem is great for bringing out diverse opinions, asking a range of different questions and eventually finding the right answer.

This works in health care, too, Woods says. For example, surgeons and assistants might be prepping a patient for surgery, and the surgeon expects it to be a short procedure with relatively little blood loss. An anesthesiologist recommends that the nurses insert extra fluid and blood lines anyway. Why? It's a case of being prepared, Woods says.

“The anesthesiologist might understand some of the other problems the patient has, might understand the contingencies, the way things could go wrong,” he says.

Once you're in the middle of surgery, it becomes far more difficult to get access to the patient's veins, should it become necessary. “The expert anesthesiologist is anticipating a potential bottleneck, even though it's maybe a relatively low probability.”

These kinds of routines don't always come naturally to health-care workers, Woods says. But psychologists who are trained to see flaws in the system, to see where human error creeps into the best-laid plans, can help build in these kinds of cross-checks, cobbling together a well-functioning system out of imperfect parts.

“Psychologists are really important in this,” Woods says. “We're a critical team member in building ultra-high safety approaches to health care.”
