Welcome to the March issue of The Surgeons’ Lounge. In this issue, John Paige, MD, associate professor of clinical surgery, LSU Health New Orleans School of Medicine, New Orleans, responds to a timely and interesting operative scenario in which human error is highly possible, but trainees and other team members are strongly discouraged from voicing concerns or questioning the decisions of the attending (senior) surgeon, even when an error is evident and may harm the patient.
Check out what the experts say about closure of fascia for port sites 10 mm or greater in size in Expert Express!
The April issue will feature presentations from the 25th Anniversary Jagelman/35th Anniversary Turnbull International Colorectal Disease Symposium 2014, hosted by Steven D. Wexner, MD, and Feza Remzi, MD, which took place Feb. 11-16, 2014, in Fort Lauderdale, Fla.
Readers’ feedback is our greatest asset. Tell us how we’re doing! What do you want to see more of? Less of? What is the best part of Surgeons’ Lounge? What can we do even better? We look forward to your feedback!
Samuel Szomstein, MD, FACS
Editor, The Surgeons’ Lounge
Dr. Szomstein is associate director, Bariatric Institute, Section of Minimally Invasive Surgery, Department of General and Vascular Surgery, Cleveland Clinic Florida, Weston.
Jessica Zagory, MD, PGY-3
LSU Health New Orleans Department of Surgery, LSU Health New Orleans Health Sciences Center, New Orleans
A patient with a left inguinal hernia was scheduled for open repair as the third operating room (OR) case of the day. The first two cases also were open inguinal hernia repairs and ran longer than expected, cutting into clinic time. To make up time, the patient was moved to another room and the OR staff prepped the patient so our team could come in as soon as the prior case was completed. The patient was brought into the OR prepped and draped. Our team came in and was about to begin the procedure when an OR nurse requested a time-out (even though we were itching to get started!). Needless to say, a fight ensued between my attending and the OR nurse because of this. She persisted, and we obeyed the time-out. Thank goodness! It turns out that the case previously scheduled in that room had been a right inguinal hernia, and the OR staff had prepared the patient’s right side. Because there were so many inguinal hernia patients lined up that day, we did not notice until the consent form was reviewed. As it happens, the medical student had noticed the side mix-up but was too afraid to speak up because the attending surgeon had made it clear in the past that medical students (and the rest of the team, for that matter) were to be seen and not heard in his room! It brought up a lot of questions for me:
Thanks for such great questions. They raise several points related to systems-based practice that often can get overlooked, but, I would argue, are as important as knowing how to manage a specific disease or surgical condition. What you have described is known as a “near miss” in human factors parlance: that is, a catastrophic event that is narrowly averted.
Human factors is a field of study focusing on how humans interact with their environment. Human factors engineering attempts to design systems for safe and effective human use by trying to optimize that interaction through the study of human behaviors, abilities and limitations (Am J Med Qual 2006;21:57-67). A key axiom on which such work is based is the recognition that human error is inevitable. Thus, an error-free system cannot be created, and, as a result, defenses-in-depth are required to help identify and mitigate problems or errors when they arise. The more numerous and stronger the defenses created to prevent adverse events, the less likely the holes within each layer will line up, leading to a catastrophic outcome. This Swiss cheese model of error, promoted by James Reason (Figure), is a useful way to understand how seemingly inexplicable events, such as wrong-site, wrong-patient, or wrong-procedure surgery, can happen within an organization (BMJ 2000;320:768-770).
In your example, many of the so-called holes in the layers of defense had lined up to prime the system for a catastrophic event: OR delays pushing back the scheduled start time of the case, time pressure on the surgeon due to being late to clinic, similar cases scheduled in multiple rooms, changing rooms and teams to speed things up, fear of speaking up on the part of the medical student. Fortunately, one layer did not have an aligned hole: the time-out requirement and the circulating nurse who insisted on it, resulting in a “near miss” as opposed to a sentinel event (i.e., wrong-side surgery).
Unfortunately, in health care, providers often react to such close calls with a “boy, was that lucky” response and do nothing more. From a human factors perspective, such events are actually “gifts” to the organization because they reveal weaknesses in the system before something bad has happened, providing the opportunity to evaluate the system and implement changes to prevent similar future occurrences. Such responsiveness requires what is known as a “generative” organizational culture with specific values and characteristics that prepare the organization to recognize and learn from such events. They include an orientation toward always improving performance, good cooperation within and between departments, well-trained messengers to point out issues, shared risk taking, a willingness to implement novel ideas and a belief that failures in the system reflect an underlying problem that should prompt inquiry into solutions (Qual Saf Health Care 2004;13:22-27).
Generative organizational cultures are the core of so-called high reliability organizations (HROs). HROs demonstrate high levels of safety; consistent performance; and reliable outcomes in unforgiving, even hostile, work environments (Surg Clin North Am 2012;92:1-14). Industry examples include aviation, the military and nuclear power. They promote mindfulness in lieu of mindlessness among their members and are constantly seeking reliability rather than assuming they have achieved it (Managing the Unexpected. Jossey-Bass; 2001; Best Pract Res Clin Anesthesiol 2011;25:133-144).
Thus, an HRO would have a near-miss reporting system in place to identify and correct those “latent conditions,” defects in layers of defense that are present but not recognizable until they manifest themselves as holes in the Swiss cheese. Therefore, your take-home lesson should be to use this close call to try to improve processes and make defenses stronger.
What are some strategies to help improve systems to “trap and mitigate” hazards? They can include decreasing complexity within a given system, optimizing processes, automating in an intelligent (as opposed to universal) manner, and thinking about and avoiding the inevitable unintended consequences of changes (BMJ 2000;320:771-773). Designing constraints within processes (e.g., hard stops) is especially useful. The most effective of these is the physical constraint, which is designed to make it impossible for a person to commit the error. Diesel fuel nozzles are a great example of this concept. They are purposely made too large to insert into the filler necks of unleaded gasoline tanks, preventing the catastrophic effect of putting diesel fuel into an engine designed to run on gasoline.
Another type of constraint is the procedural constraint. Although not foolproof like a physical constraint, procedural constraints are required processes designed to prevent a problem. In our profession, the much-maligned Joint Commission–mandated time-out is one such procedural constraint. It requires a review before commencing a procedure to ensure that the correct person is being operated on, the correct side is prepared, and the correct procedure is being performed. In practice, additional components have been added to help ensure good evidence-based practice, such as checks regarding antibiotic and thromboembolism prophylaxis. In this case, this procedural constraint succeeded in doing what it was designed to do: act as a defense against operating on the wrong side.
Another strong, although not perfect, constraint is the cultural constraint. By making the “way we do things” align with the values of HROs, a generative culture can be established that exerts pressure to conform on individuals who might act otherwise. Typically, such cultures are ones in which safety is the primary priority and all necessary resources are allocated to this goal (Qual Saf Health Care 2003;12:112-118). A good example outside of medicine is the culture on an aircraft carrier. Here, speaking up regarding perceived safety issues is encouraged and rewarded if it prevents a catastrophic event such as a jet crash. Thus, improving teamwork is also an important strategy in promoting HRO function.
Ensuring that health care teams such as the OR team function as highly reliable units is one of the best ways to ensure any member feels comfortable speaking up when he or she identifies a problem. In fact, establishing highly reliable teams (HRTs) is a prerequisite to creating an HRO (Surg Clin North Am 2012;92:1-14). Research into the science of teamwork has identified key characteristics that HRTs possess to ensure resilience in dynamic situations, and many frameworks have been established describing them. Eduardo Salas’ Big Five Model of teamwork is particularly useful because it served as the basis for the Agency for Healthcare Research and Quality’s TeamSTEPPS™ program, which is used by many health care institutions (Small Gr Res 2005;36:555-599; Agency for Healthcare Research and Quality, Rockville, MD. http://teamstepps.ahrq.gov/abouttoolsmaterials.htm. Accessed May 8, 2013).
This model identifies five core competencies essential to highly reliable teamwork: team leadership, mutual performance monitoring, backup behavior, adaptability and team orientation.
These “big five” competencies are facilitated by three key coordinating mechanisms: shared mental model, closed loop communication and mutual trust. The coordinating mechanisms help ensure that the core team-based competencies are fostered in any given situation. The “big five” are particularly useful in high-risk, dynamic environments like the OR to ensure rapid, effective responses to hazards in the environment. Teams espousing such knowledge, skills and attitudes (KSAs) maintain flattened hierarchical structures in which each member feels comfortable speaking up and leadership of the team in any given situation is based on expertise.
Good teamwork, therefore, like expert operating, can be learned. The KSAs of HRTs should be trained and fostered just as much as technical skills and medical knowledge in the surgeon, especially because poor teamwork is rife in the clinical environment. In the OR, the “silo mentality” prevails, stifling collaboration and communication and fostering a form of tribalism among the professions (Qual Saf Health Care 2010;19:103-106). This situation negatively affects patient care and safety (Am J Surg 2009;197:678-685; Surgery 2006;139:159-173).
Strategies that have been successful in fostering HRTs include the use of checklists, briefings, team training programs such as TeamSTEPPS™, and high-fidelity simulation-based training (HF-SBT) of interprofessional teams and discipline-specific crews. HF-SBT is the strategy adopted at LSU Health New Orleans for teaching HRT functions to students and residents. The advantages of HF-SBT include the ability to create a safe learning environment in which low-frequency, high-risk events can be experienced and teams can learn from mistakes without harm to the patient (Qual Saf Health Care 2004;13:51-56). In this manner, we are trying to promote cultural change by giving you and your colleagues the team-based KSAs to be able to adapt effectively, communicate openly and anticipate one another’s needs in order to provide the best possible care to your surgical patients.