Healthcare error proliferation model
== Introduction ==
Healthcare systems are ''complex'' in that they are diverse in both structure (e.g. nursing units, pharmacies, emergency departments, operating rooms) and professional mix (e.g. nurses, physicians, pharmacists, administrators, therapists), and are made up of multiple interconnected elements with ''adaptive'' tendencies in that they have the capacity to change and to learn from experience. The term ''complex adaptive systems'' (CAS) was coined at the interdisciplinary [[Santa Fe Institute]] (SFI) by [[John Henry Holland|John H. Holland]] and [[Murray Gell-Mann]]. Subsequently, scholars such as Ruth Anderson, Reuben McDaniel, and Paul Cilliers extended CAS theory and research to the social sciences, including education and healthcare.
== Model Overview ==
The Healthcare Error Proliferation Model (HEPM) adapts the Swiss Cheese Model (Reason, 1990) to the complexity of healthcare delivery systems and integrated organizations. The Swiss Cheese Model likens the complex adaptive system to multiple hole-infested slices of Swiss cheese positioned side by side (Reason, 1990, 2000). The cheese slices are dubbed defensive layers because they act as system locations outfitted with features capable of intercepting and deflecting hazards. The layers represent discrete locations or organizational levels at which errors may arise and progress through the system. The four layers are: 1) organizational leadership, 2) risky supervision, 3) situations for unsafe practices, and 4) unsafe performance.
The HEPM portrays hospitals as having multiple operational defensive layers outfitted with the essential elements needed to maintain key defensive barricades (Cook & O’Connor, 2005; Reason, 2000). By examining the attributes of these defensive layers, the prospective locations of failure, the etiology of accidents might be revealed (Leape et al., 1995). Experts have discussed the importance of examining these layers within the context of the complex adaptive healthcare system (Kohn et al., 2000; Wiegmann & Shappell, 2003) and of considering the psychological safety of clinicians. Hence, this model expands Reason’s seminal work.
The model incorporates the complex adaptive healthcare system as a key characteristic. Complex adaptive systems characteristically demonstrate self-organization as diverse agents interact spontaneously in nonlinear relationships <ref>{{cite journal | author = Anderson, R. A., Issel, M. L., & McDaniel, R. R. | year = 2003 | title = Nursing homes as complex adaptive systems: Relationship between management practice and resident outcomes | journal = Nursing Research | volume = 52 | issue = 1 | pages = 12–21}}</ref> <ref>{{cite book | author = Cilliers, P. | year = 1998 | title = Complexity and postmodernism: Understanding complex systems | publisher = New York: Routledge | ISBN = 978-0415152860}}</ref> where professionals act as information processors (Cilliers, 1998; McDaniel & Driebe, 2001) and co-evolve with the environment (Casti, 1997). Healthcare professionals function in the system as diverse actors within the complex environment, utilizing different methods to process information (Coleman, 1999) and to solve systemic problems within and across organizational layers (McDaniel & Driebe, 2001).
=== Definitions ===
A complex adaptive healthcare system (CAHS) is a care delivery enterprise with diverse clinical and administrative agents who act spontaneously, interact in nonlinear networks in which agents and patients are information processors, and actively co-evolve with their environment with the purpose of producing safe and reliable patient-centered outcomes.<ref>{{cite journal | author = Palmieri, P. A., DeLucia, P. R., Ott, T. E., Peterson, L. T., & Green, A. | year = 2008 | title = The anatomy and physiology of error in adverse healthcare events | journal = Advances in Health Care Management | volume = 7 | pages = 33–68 | doi = 10.1016/S1474-8231(08)07003-1 | accessdate = 2008-08-29}}</ref>
== Citations ==
{{reflist|2}}
== References ==
;Articles
* Anderson, R. A., Issel, M. L., & McDaniel, R. R. (2003). Nursing homes as complex adaptive systems: Relationship between management practice and resident outcomes. Nursing Research, 52(1): 12-21.
* Berta, W. B. & Baker, R. (2004). Factors that impact the transfer and retention of best practices for reducing error in hospitals. Health Care Management Review, 29(2): 90-97.
* Chiles, J. R. (2002). Inviting disaster: Lessons from the edge of technology. New York: HarperCollins Publishers.
* Coleman, H. J. (1999). What enables self-organizing behavior in business. Emergence, 1(1): 33-48.
* Cook, R. I., Render, M., & Woods, D. D. (2000). Gaps in the continuity of care and progress on patient safety. British Medical Journal, 320(7237): 791-794.
* Leape, L. L., Bates, D. W., Cullen, D. J., Cooper, J., Demonaco, H. J., Gallivan, T., Hallisey, R., Ives, J., Laird, N., Laffel, G., Nemeskal, R., Peterson, L. A., Porter, K., Servi, D., Shea, B. F., Small, S. D., Sweitzer, B. J., Thompson, B. T., & van der Vliet, M. (1995). Systems analysis of adverse drug events. ADE prevention study group. Journal of the American Medical Association, 274(1): 35-43.
* Leape, L. L. & Berwick, D. M. (2005). Five years after ‘To err is human’: What have we learned? Journal of the American Medical Association, 293(19): 2384-2390.
* Leduc, P. A., Rash, C. E., & Manning, M. S. (2005). Human factors in UAV accidents. Special Operations Technology, online edition, Vol. 3.
* Leonard, M. L., Frankel, A., & Simmonds, T. (2004). Achieving safe and reliable healthcare: Strategies and solutions. Chicago: Health Administration Press.
* Rasmussen, J. (1990). The role of error in organizing behavior. Ergonomics, 33: 1185-1199.
* Rasmussen, J. (1999). The concept of human error: Is it useful for the design of safe systems in health care? In C. Vincent & B. deMoll (Eds.), Risk and safety in medicine: 31-47. London: Elsevier.
* Reason, J. T. & Mycielska, K. (1982). Absent-minded? The psychology of mental lapses and everyday errors. Englewood Cliffs, NJ: Prentice-Hall Inc.
* Reason, J. T. (1990). Human error. New York: Cambridge University Press.
* Reason, J. T. (1997). Managing the risks of organizational accidents. Aldershot, England: Ashgate.
* Reason, J. T. (2000). Human error: Models and management. British Medical Journal, 320: 768-770.
* Reason, J. T., Carthey, J., & de Leval, M. R. (2001). Diagnosing vulnerable system syndrome: An essential prerequisite to effective risk management. Quality in Health Care, 10(S2): 21-25.
* Reason, J. T. & Hobbs, A. (2003). Managing maintenance error: A practical guide. Aldershot, England: Ashgate.
* Roberts, K. (1990). Some characteristics of one type of high reliability organization. Organization Science, 1(2): 160-176.
* Roberts, K. H. (2002). High reliability systems. Report on the Institute of Medicine Committee on Data Standards for Patient Safety on September 23, 2003.
;Books
* Cilliers, P. (1998). Complexity and postmodernism: Understanding complex systems. New York: Routledge. (ISBN: 978-0415152860)
== Other Literature ==
;Complexity Theory
* Holland, J. H. (1992). Adaptation in natural and artificial systems. Cambridge, MA: MIT Press. (ISBN: 978-0262581110)
* Holland, J. H. (1995). Hidden order: How adaptation builds complexity. Reading, MA: Helix Books. (ISBN: 978-0201442304)
* Holland, J. H. (1998). Emergence: From chaos to order. Reading, MA: Addison-Wesley. (ISBN: 978-0738201429)
* Waldrop, M. M. (1992). Complexity: The emerging science at the edge of order and chaos. New York: Simon & Schuster. (ISBN: 978-0671767891)
{{uncategorized|date=October 2008}}