A New Approach to Anesthesiology and Health Care System Safety: High Reliability Organizing (HRO)
Michael R. Hicks, MD, MBA
President and Chairman, Pinnacle Anesthesia Consultants, PA, Dallas, TX.
Clinicians in the specialty of anesthesiology have much of which to be proud. Advances in our specialty have made the anesthetic experience both safer and more convenient even as we have expanded the care of our patients, both in terms of who can safely receive anesthesia and where it is delivered. In fact, the rest of medicine and health care generally views us as pioneers in the patient safety movement.
Anesthesiology blazed the patient safety trail using a variety of approaches: prudent adoption of improved technology, advances in pharmacology, advanced monitoring techniques, adoption of clinical practice guidelines and standards, and application of basic systems theory and team-based behavioral principles and models, to name just a few. Today, patients with even significant underlying health issues have higher expectations of successful surgical experiences than was thought possible a generation ago. These advances, embraced by us and applied persistently, have become part of our culture of vigilance.
Reducing Error in the Health Care System
Unfortunately, our vigilance has been slow to spread far from our operating rooms to the rest of the health care system. Health care remains beset with errors, dangerous places and practices, and risks that our patients would not voluntarily tolerate in other aspects of their lives (Kohn et al., 1999; Richardson et al., 2001; Landrigan et al., 2010). We, as patients, are no different. Why then, as caregivers, do we not only tolerate a health care system that carries such risks but also work within it every day as integral participants without doing as much as possible to make it better?
In my career I have heard many reasons for our reluctance to change the system. Fear of malpractice or personal litigation, perceived loss of clinical autonomy, lack of authority, fear of retribution, the variability of individual patients, fear of being replaced by production-driven surgeons and administrators, lack of financial incentive, or even simple apathy are just some of the reasons with which I am familiar. I have even heard it suggested that it is not a lack of intention or motivation on our part, but that health care is fundamentally different from other process-driven enterprises and is incapable of achieving the performance standards demanded and produced elsewhere. However, increasing numbers of stakeholders, most notably patients, payers, regulators and the government, are demanding greater attention and accountability for the care that we collectively deliver. How then should we proceed?
Beyond Lean and Six Sigma to a New Approach: HRO
We have attempted to incorporate many tools from other industries with varying degrees of success. Lean, Six Sigma and other approaches are all in use and have achieved almost mystical status in the minds of some in the industry. To be fair, when applied correctly these tools can produce significant performance improvements. In many cases, however, they ultimately fall short and produce only short-term gains. Sometimes, because of misunderstanding and misapplication, they produce results worse than the baseline problems they were intended to improve. A primary reason for this is that these approaches are merely tools for addressing isolated processes. They do not change an organization's underlying culture, so whatever gains are made erode over time, as behavior is subservient to culture and routine in nearly every circumstance.
Fortunately, there is much we can learn from outside of health care. There are organizations and industries that continue to demonstrate lasting success in minimizing errors, reducing opportunities for failure and generating value. From them we can learn the necessary tools and more importantly a solid theoretical construct that results in a great deal of practical success (at least in other industries) from which we can draw our own applications.
One such theory, known as High Reliability Organizing ("HRO"), was developed by observing the behaviors, process design, and ultimately the cultures of industries that for various reasons have very low fault tolerances or, alternatively, very high expectations for reliability (Weick & Sutcliffe, 2001; Vogus, Sutcliffe, & Weick, 2010). Note that the acronym "HRO" can refer both to the theory (High Reliability Organizing) and to organizations that have embraced the theory (High Reliability Organizations); the reader should use context to discern which is meant.
The classic examples of HRO industries are commercial aviation, the nuclear power industry and aircraft carrier operations. These industries have embraced the HRO culture because failure for them typically means catastrophic outcomes. Increasingly, however, organizations within health care such as The Joint Commission and the Agency for Healthcare Research and Quality have proposed that health care organizations be added to the list of HRO disciples, not as current examples of HRO in practice but as entities that could benefit from the aspirational goals of HRO thinking (The Joint Commission, 2012; Hines, 2008).
Five Key Principles of Mindfulness
It should be emphasized that the key principles of HRO can be explained fairly easily, but their successful application both drives and depends on an underlying "mindfulness" of an organization's operation. Mindfulness is, for most people, the most difficult of these concepts to grasp. In simple terms, it refers to an ongoing, overall sense of operational and situational awareness that involves yet transcends the particular task one is performing. In other words, and borrowing from the American Society of Anesthesiologists' motto, mindfulness in my view is another manifestation of continuous vigilance about the totality of the process. In our world it is akin to being the patient's advocate when patients cannot do that for themselves.
High reliability organizations adhere to five key principles that serve as underpinnings for the general culture of mindfulness (Weick, Sutcliffe, & Obstfeld, 2008). Interestingly, and suggesting that HRO theory should resonate with anesthesia clinicians, the precepts of HRO are completely consistent with fundamental aspects of our daily practices, as outlined below. My intent here is to explain in simple terms the core of HRO and to suggest that we as anesthesia clinicians are uniquely suited to embrace and promote the concept as we move the specialty forward. While I believe that HRO concepts are actually the bedrock upon which the safe practice of anesthesiology rests, my frequent use of clinical examples from our specialty is largely for illustration, as HRO is applicable to any process where minimal defects are required and the consequences of error are great. In my opinion, our daily clinical application of these principles uniquely qualifies us to offer expertise in other areas of the health care system. That being said, let's explore the basic concepts of high reliability organizing and organizations.
First, HROs are preoccupied with failure (or variance). HROs constantly look for signals, even very weak ones, that failure is occurring. Weak signals of failure, like a slight decrease in oxygen saturation, an increase in heart rate or airway pressure, or even a drug vial in the wrong slot of the drug storage unit, can be suggestive of a much greater systemic failure. A hallmark of HRO theory is not just hyper-attention to these weak signals, but also that strong responses are mounted even when the signals are weak.
For example, how many of us, when confronted with a drug ampule in the wrong place, merely move it to the correct slot and go on with our day, happy to have caught the mistake? Or, particularly as we try to expand our practices outside of the OR and the ICU, simply order a missing lab or imaging study that we need before proceeding with a surgical case? When we see such a patient preoperatively lacking an adequate workup, have we really solved the underlying problem or just solved it for that particular moment?
The answer to both of these questions is only a partial yes: for that particular patient we have, but for the system we have not. Maybe the next patient won't be so lucky, and reducing the possibility of error, or catastrophe, for the next patient, the next opportunity, is what sets high reliability organizations apart. In an HRO culture we still take care of that individual patient, but we elevate our observations up and out to the organization as a whole so that processes can be appropriately modified. Only by doing this is the risk of failure, system failure, lowered the next time. On a practical level, how will the pharmacy staff or the primary care practitioner ever learn of our downstream issues if we don't report them?
The concept of failure at the system level is an important one. HROs approach failure prevention at the system level by being mindful at the level of the individual and of the environment in which they operate, and by preparing mitigating interventions for when problems do occur. In an HRO culture, negative events such as errors or near misses are seen as opportunities to learn and improve. Reporting of actual events or near misses is encouraged, rewarded and treated in an open and non-punitive fashion. To do otherwise results in less reporting, less data and information, and less learning. The organization that engages in the latter is the poorer for it, as are the consumers of its services.
The second fundamental concept underlying the culture of HROs is a reluctance to simplify. This is counterintuitive for most people, because being organized is generally characterized as the ability to sort a large number of items, events, tasks or, in clinical terms, even physiologic symptoms and drugs into a much smaller number of categories. This categorization allows for more coordinated and thought-out responses. But as Einstein reportedly said, "Everything should be as simple as possible, but not simpler." Unfortunately, early warning signs of potential failure can be hidden by too much simplification. Meaningful details can be lost when we strip away too much of the alleged noise from the signal and categorize too broadly.
Even the names and labels that we attach to events and things can hide weak signals that if observed without the assigned categorization would portend possible negative outcomes. HROs recognize that they operate in complex environments and that failure, even repetitive failures, can occur in new and novel ways. Oversimplification can mask the signals of impending failure or lead one down the wrong path in terms of dealing with seemingly known issues.
A third characteristic of HROs is their attention to continuous situational awareness. This "sensitivity to operations," as it is called, means that HROs pay attention to the work as it is actually being done, not as policy or procedure expects it to occur. In our world, for example, a lack of sensitivity to operations is why we continue to have an epidemic of wrong-site, wrong-drug, or wrong-procedure events even though nearly all of us have adopted "time-out" procedures as supposed standards.
Sensitivity to operations also means that HROs focus not only on specific issues of the moment, but also broader ones that the system relies on to reduce the potential errors. For example, is the crash cart where it is supposed to be and fully stocked? Does the anesthesia machine check out before a moderate sedation case? Has the anesthesiologist or CRNA assigned to a case had adequate rest or training for the case at hand? This type of attention may seem trite to some, but to workers in an HRO environment it is just part of the organization’s continuous scanning of the situation. Once again, this should be second nature for us in anesthesiology.
Interestingly, and of particular importance to our field, one of the threats HROs face regarding sensitivity to operations is the tendency for routine tasks to become so routine that they become casual, even mindless. Consider for a moment the danger of this in caring for the surgical patient. How many times have you heard a colleague (anesthesiologist, surgeon, nurse) state that an upcoming procedure is going to be simple or that a more thorough workup isn't necessary? The fact that we frequently "get away" with accepting these circumstances does not diminish the real underlying threat to the patient when we aren't so lucky. Our experiences in situations like these, and how we handle them on a system level, are translatable to other aspects of health care and present us opportunities to apply our experience to the theory.
A fourth key HRO concept relates to an organization’s ability to deal with errors and issues as they arise. This ability is driven by what is characterized as a commitment to resilience and is predicated on the knowledge that the system, no matter how well designed, will still have failures. Handling errors, whether expected or not, requires training, forethought and an anticipation that errors can and will occur.
Much like our room preparation before administering an anesthetic, an HRO continually prepares for what ostensibly has already been prevented or can't be anticipated by stressing teamwork training and mitigation strategies and by minimizing variations from the norm as a situation unfolds. In other words, although events are not unfolding exactly as planned, much like an unanticipated difficult intubation, the system (and we, in the intubation example) has the necessary capacity and redundancy to handle the current crisis, return as close to normal as possible and learn from the endeavor as well.
Finally, HROs have a marked deference to expertise wherever it resides in the organization. Expertise here refers to the person with the most relevant knowledge of the situation that is unfolding. It does not mean the most experienced or the most senior person, as neither experience nor seniority necessarily carries with it knowledge of the given situation at that moment in time. Expertise can mean simply that an individual recognizes that something is amiss and that the only action available is to bring the production process to a halt until other mitigating strategies can be implemented.
This, among all of the HRO strategies, would in my opinion be the most meaningful and, in theory, the easiest to implement in health care. Unfortunately, health care remains a rigid, hierarchical industry that encourages silence, silos, and passing issues on to others to discover and solve, rather than an environment based on transparency, open communication, and non-punitive responses to error.
A Natural Fit for Anesthesiology
Taken together, the precepts of high reliability organizing should be applicable to all of health care. In my opinion, anesthesiology practice in its purest form is based on HRO principles whether we know the HRO theory or not. Because of this, and because we have excelled in its application, even unknowingly, I think that there is a role for us to be both disciples and advocates for high reliability organizing being embraced throughout the health care system.
While the motivations should be intrinsic, the reality is that very little changes in clinical practice or within the health care system without the application of external forces and the passage of time. This is unequivocally true for the changes in culture that HRO requires. The scientific literature contains no shortage of references to high reliability organizing in the health care system and the clinical practice of anesthesiology (Sutcliffe, 2011; Chassin & Loeb, 2011; Dixon & Shofer, 2006; Wilson, Burke, Priest, & Salas, 2005; Gaba, 2000). Despite the applicability of HRO concepts and culture to anesthesiology, our uniquely applied expertise in this area remains largely unrecognized, even within our own ranks. This may be ripe for change, however, as both internal and external forces are now working to bring HRO to health care.
For example, large purchasers of health care such as the government and employers don't understand why the common-sense business practices they take for granted aren't being applied to the care they purchase and receive. Similarly, the health care industry continues to consolidate and to seek opportunities to create greater value, and one of the most compelling ways to do this is by embracing an HRO culture. High reliability organizing offers the opportunity to increase value, decrease harm and its potential associated costs, and improve system performance.
Finally, influential entities such as The Joint Commission, the Agency for Healthcare Research and Quality and the Institute for Healthcare Improvement have embraced the concept of HRO and are advocating for its adoption and implementation (The Joint Commission, 2012; Dixon & Shofer, 2006; Resar, 2006). The interest and direction of these organizations will likely drive further implementation of high reliability organizing in health care. As anesthesiology clinicians, however, we should embrace the opportunity that our experience in this area gives us and be at the forefront as the high reliability movement proceeds.
Chassin, M. R., & Loeb, J. M. (2011). The ongoing quality improvement journey: Next stop, high reliability. Health Affairs, 30(4), 559-568.
Dixon, N. M., & Shofer, M. (2006). Struggling to invent high-reliability organizations in health care settings: Insights from the field. Health Services Research, 41(4 Pt 2), 1618-1632.
Gaba, D. M. (2000). Anaesthesiology as a model for patient safety in health care. BMJ, 320(7237), 785-788.
Hines, S. (2008). Becoming a high reliability organization: operational advice for hospital leaders. Agency for Healthcare Research and Quality.
Kohn, L. T., Corrigan, J. M., & Donaldson, M. S. (1999). To Err Is Human: Building a Safer Health System. Committee on Quality of Health Care in America. Washington, DC: Institute of Medicine.
Landrigan, C. P., Parry, G. J., Bones, C. B., Hackbarth, A. D., Goldmann, D. A., & Sharek, P. J. (2010). Temporal trends in rates of patient harm resulting from medical care. New England Journal of Medicine, 363(22), 2124-2134.
Resar, R. K. (2006). Making noncatastrophic health care processes reliable: Learning to walk before running in creating high-reliability organizations. Health Services Research, 41(4 Pt 2), 1677-1689.
Richardson, W. C., Berwick, D., Bisgard, J., Bristow, L. R., Buck, C. R., & Cassel, C. K. (2001). Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: Institute of Medicine.
Sutcliffe, K. M. (2011). High reliability organizations (HROs). Best Practice & Research Clinical Anaesthesiology, 25(2), 133-144.
The Joint Commission. (2012). Improving Patient and Worker Safety: Opportunities for Synergy, Collaboration and Innovation.
Vogus, T. J., Sutcliffe, K. M., & Weick, K. E. (2010). Doing no harm: enabling, enacting, and elaborating a culture of safety in health care. The Academy of Management Perspectives, 24(4), 60-77.
Weick, K. E., & Sutcliffe, K. M. (2001). Managing the Unexpected. San Francisco: Jossey-Bass.
Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. (2008). Organizing for high reliability: Processes of collective mindfulness. Crisis Management, 3, 81-123.
Wilson, K. A., Burke, C. S., Priest, H. A., & Salas, E. (2005). Promoting health care safety through training high reliability teams. Quality and Safety in Health Care, 14(4), 303-309.
Michael R. Hicks, MD, MBA is a physician executive based in Dallas, TX. He is President and Chairman of Pinnacle Anesthesia Consultants, PA. In addition, Dr. Hicks is a consultant for a national hospital and ambulatory surgery center company. He can be reached at firstname.lastname@example.org.