
Interior Health
Learning how to learn from Patient Safety Events
A journey in adoption of the New View of Human Error
Wrae Hill
CSRT Meeting, St. John's, NL
May 15-16, 2010
Vision: To set new standards of excellence in the delivery of health services in the Province of British Columbia.
Human Factors & Systems Safety
A holistic approach to understanding and solving problems that sees relationships and
interactions, departing from the old-view mechanics of reductionism: decomposing wholes
into individual components.
Understand how systems maintain themselves in a changing environment by balancing
available resources against pressures, goals, constraints and opportunities… the key
concepts are emergence and resilience: how we maintain safety amid dynamic complexity.
Challenge the common language of incident management and describe the problems of
old-view Newtonian assumptions of symmetry between cause and effect. Use a new lens to
understand human error as a symptom of complex systems.
Sidney Dekker – Lund University, 2010
PART ONE
Check your paradigm
A brief history of human factors
Selected topics:
1. The New View of Human Error
2. Hindsight Bias
3. On counting errors
4. Cause is something you construct
5. Making "recommendations"
The Field Guide to Understanding Human Error
Sidney Dekker 2006
PART TWO
Applied Learning from Patient Safety Events
Selected topics:
1. Healthcare Human Factors
2. Reporting culture (our experience with PSLS)
3. Patient Safety Investigation (PSI)
4. Just Culture – incident management, disclosure
5. Closing the loop & communicating actioned/audited improvements
6. Examples of learning from adverse events (CILS)
Influence of 17th Century Scientism
ORGANIC WORLD VIEW 1000-1500
RENAISSANCE 1500-1600
Leonardo da Vinci – archetype of the Renaissance man;
the first complexity / systems theorist
SCIENTIFIC REVOLUTION 1600-1700
René Descartes' 'Cogito, ergo sum' ('I think, therefore I exist') created the conceptual
framework for seventeenth-century science.
Sir Isaac Newton's laws created a comforting symmetry between cause & effect and the
basis for the scientific method. (Determinism)
ENLIGHTENMENT 1800-1900
There is no world and man beside it, only the world and man within it. (Nietzsche)
Perspectivism
NEW PHYSICS - CONFLICT & COMPLEXITY 1900 – 1950s
World Wars, the New Physics, the Cold War
Science, philosophy, geopolitics, sociology and psychology merge (Artistry)
HUMAN FACTORS 1950-2010
The birth of Human Factors has military aviation roots: Fitts & Jones, 1947
Alphonse Chapanis (1917-2002), father of Human Factors / Ergonomics
Fitts, P.M., & Jones, R.E. (1947a). Analysis of factors contributing to 460 "pilot error"
experiences in operating aircraft controls (Report No. TSEAA-694-12). Dayton, OH: Aero
Medical Laboratory, Air Materiel Command, U.S. Air Force
B-17 Flying Fortress Bomber, 1944
3% Reach
5% Unintentional activation
6% Unintentional reversal of controls
18% Forgetting to do something
18% Adjustments
50% Substitution (confusing one control for another)
[Image: similar-looking GEAR and FLAPS levers]
First use of human factors to adapt the technology
to the needs of the operator (the pilot's perspective)
Fitts, P.M., & Jones, R.E. (1947a). Analysis of factors contributing to 460 “pilot error”
experiences in operating aircraft controls (Report No. TSEAA-694-12). Dayton, OH: Aero
Medical Laboratory, Air Materiel Command, U.S. Air Force
Wheels-up C-17 crash caused by pilot error
By Bruce Rolfsen – Staff writer
Posted: Tuesday May 12, 2009 11:42:17 EDT
Pilots of a C-17 Globemaster failed to lower the transport's landing gear, forcing them to make a
crash landing at Bagram Airfield in Afghanistan, an Air Mobility Command investigation
concluded. None of the six onboard were injured; the repair bill for the $200 million aircraft,
however, totaled $19 million.
Anaesthesia Critical Incident Studies
www.qualityhealthcare.com
• In 1978 Cooper and colleagues published their landmark paper on the application of the
critical incident technique, adapted from uses in aviation and other fields, to examine the
causes—and later prevention strategies—for adverse anesthesia outcomes.
• Following on 20 years of rudimentary anesthesia mortality studies, this was a brilliant
approach that gave anesthesia clinicians new insights on which we could act.
• As the report told us: ". . . factors associated with anesthetists and/or that may have
predisposed anesthetists to err have, with a few exceptions, not been previously analyzed.
Furthermore, no study has focused on the process of error—its causes, the circumstances
that surround it, or its association with specific procedures, devices, etc—regardless of final
outcome."
Human Factors in Anaesthesia
Ohmeda Tec 4, 5: with the center vaporizer removed (if three are mounted side by side), one can activate both outer
vaporizers simultaneously (in machines manufactured after 1995, this fault is corrected). The vaporizer outlet has a check valve.
1. Phipps, D. Human factors in anaesthetic practice: insights from a task analysis. Br J Anaesth 2008;100:333-343
2. Wachter, S.B. The Evaluation of a Pulmonary Display to Detect Adverse Respiratory Events Using High Resolution
Human Simulator. J Am Med Inform Assoc 2006;13:635-642
3. Cooper, J.B. Preventable anesthesia mishaps: a study of human factors. Qual Saf Health Care 2002;11:277-282
4. Allnutt, M.F. Human factors in accidents. Br J Anaesth 1987
5. Cooper, J.B. Preventable anesthesia mishaps: a study of human factors. Anesthesiology 1978 Dec;49(6):399-406
(Classic paper)
1) The New View of Human Error
Cause or Symptom?

OLD VIEW ('Bad Apple' theory)
What goes wrong:
• Human error is the cause of trouble
• To explain failure you seek failures
• You must discover people's failures in decisions, judgments, assessments
How to make it right:
• Complex systems are basically safe
• Unreliable, erratic humans undermine rules, procedures, regulations & defenses
• To make systems safer: tighten procedures, more automation, better supervision
• Find and control the 'bad apple'

NEW VIEW ('Why did that make sense at the time?')
What goes wrong:
• Human error is a symptom of deeper problems
• Do not try to find where they 'screwed up'
• Understand how their decisions, judgments, assessments made sense to them at the time
How to make it right:
• Complex systems are not inherently safe
• Complex systems are dynamic tradeoffs between multiple irreconcilable goals (safety/efficiency)
• People create safety through practice at all levels
• Human error is not the conclusion of an investigation…it is the beginning

The Field Guide to Understanding Human Error, S. Dekker, 2006
2) Hindsight Bias
Retrospective – we are able to select the sequence of events we think occurred based on
our knowledge of the outcome.
Counterfactual – lays out in detail what the operators could have or should have done.
Biases your investigation toward items you now know were crucial, and causes you to
artificially narrow your assessment to only a piece of "what was going on at the time."
Judgmental – judges people's actions/inactions from your context & perspective, not theirs.
Proximal – focuses on those people who were closest in time and space to the event; may
ignore unit, system, and societal factors.

To understand failure you must first understand your reaction to failure.
The more you react to failure, the less you will understand.
The Field Guide to Understanding Human Error, S. Dekker, 2006
Hindsight Bias
One of the most reproducible of all psychological phenomena (Fischhoff, B. 1977)
"The future is implausible & the past is incredible" – David Woods
Reinforces the Newtonian/Cartesian symmetry between cause and effect.
WYLFIWYF – What You Look For Is What You Find
OLD VIEW INVESTIGATIONS – 'Down & In'
• Tend to be rushed (first story)
• Societal pressure to seek the cause
• Hindsight bias confuses our reality with that of the participants
• Single out ill-performing individuals
• Find evidence of erratic, wrong behavior
• Bring to light bad decisions, inaccurate assessments, deviations from written procedures/guidelines
• Investigators have limited human factors knowledge
RECOMMENDATIONS
• Numerous, many in-actionable
• Focus is on retraining
• Procedural over-specification (less safe)
• Increased automation
• Remove the bad apples
PERSONAL RESPONSIBILITY
• Individual responsibility may not be well calibrated to individual authority

NEW VIEW INVESTIGATIONS – 'Up & Out'
• Second stories take much longer (Friendly Fire – Snook 2002)
• Much harder; resist simple "answers"
• Focus on local rationality – the view from inside… people do what is reasonable given their point of view, focus of attention, knowledge & objectives
• Context & history considered at every step
• Questions: who gets to decide?
• Objectivity is an arrogant illusion
• Follow the goal conflicts…
• Investigators have detailed human factors knowledge
RECOMMENDATIONS
• May be quite high level (Moshansky – Air Ontario)
• Make the right things easier to do
• Fewer, simpler, briefer, more flexible guidelines
• Focus on learning and share generously
• TEAM & CONTEXT FOCUSED

The Field Guide to Understanding Human Error, Sidney Dekker, Lund University, Sweden, 2006
3) On counting "errors"
• You can't count errors.
• Error classification systems look unsuccessfully for simple answers to the sources of
trouble and sustain the myth of a stubborn 70% human error.
• This also makes artificial distinctions between human and mechanical failure.
Sidney Dekker 2006
4a) Cause is something you construct
Organizational learning?
CDN: A 75-year-old person dies of a medication error. A non-diabetic patient received
insulin that was not prescribed. The patient died; the nurse was not fired; no smoking gun
was found. The family was involved in disclosure and investigation. A detailed RCA found
4 potential causes.
4b) Cause is something you construct
Organizational learning?
• USA: A 16-year-old mother-to-be dies of a medication error.
• An infusion intended for the epidural route was given intravenously.
• The patient died, the baby lived, the nurse was fired. A detailed RCA found 4 proximate
"causes".
• Smetzer J, Baker C, Byrne FD, Cohen MR. Shaping systems for better behavioral
choices: lessons learned from a fatal medication error. Jt Comm J Qual Patient Saf.
2010 Apr;36(4):152-63.
• Dekker SW. We have Newton on a retainer: reductionism when we need systems
thinking. Jt Comm J Qual Patient Saf. 2010 Apr;36(4).
5) Actionable Recommendations
• Recommendations must be realistic and locally actionable: Specific, Measurable,
Appropriate, Realistic, Timely.
• Make sure you assign accountability to a specific local person (with their permission)
who has both responsibility and the required authority, the resources, and a specific
timeline.
• Are these easy to audit & verify?
• Always consider applied human factors and the hierarchy of effectiveness (a small
illustrative sketch follows this list):
1. Forcing functions (MOST EFFECTIVE)
2. Automation / computerization
3. Simplification / standardization
4. Reminders, checklists, double checks
5. Rules and policies
6. Education
7. Information (LEAST EFFECTIVE)
From the Cdn Root Cause Analysis Framework
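To make the hierarchy concrete, here is a minimal sketch (illustrative only, not an Interior Health or RCA-framework tool; the names and threshold are hypothetical) that ranks a draft action plan by its strongest intervention and flags plans that lean only on weak interventions such as education or information:

    # Hypothetical sketch: rank RCA recommendations against the
    # hierarchy of effectiveness (1 = strongest, 7 = weakest).
    HIERARCHY = {
        "forcing function": 1,
        "automation/computerization": 2,
        "simplification/standardization": 3,
        "reminder/checklist/double check": 4,
        "rule/policy": 5,
        "education": 6,
        "information": 7,
    }

    def strongest_intervention(plan: list[str]) -> int:
        """Return the best (lowest) hierarchy rank in a draft plan."""
        return min(HIERARCHY[kind] for kind in plan)

    # Example: a plan built only on education and a new policy.
    plan = ["education", "rule/policy"]
    if strongest_intervention(plan) > 4:
        print("Weak plan: consider a forcing function or automation.")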
PART TWO
Applied Learning from Patient Safety Events
Selected topics:
1. Healthcare Human Factors
2. Reporting culture (our experience with PSLS)
3. Patient Safety Investigation (PSI)
4. Just Culture – incident management, disclosure
5. Closing the loop & communicating actioned/audited improvements
6. Examples of learning from adverse events (CILS)
Healthcare Human Factors – WHO
1. Organizational Safety Culture
2. Managers' Leadership
3. Communication
4. Team (structures and processes)
5. Team Leadership (supervisors)
6. Situation Awareness
7. Decision Making
8. Stress
9. Fatigue
10. Work Environment
Rhona Flin, Jeanette Winter, Cakil Sarac, Michelle Raduma
Industrial Psychology Research Centre, University of Aberdeen, Scotland
Report for the Methods and Measures Working Group of WHO Patient Safety.
Geneva: World Health Organization, April 2009
Improving the quality of quality-of-care reviews
[Diagram: improved systems of incident management, built from incident management
policy/process, training & audit, incident management follow-up, reporting, and building
capacity in patient safety investigation & follow-up]
Patient Safety Investigation (PSI)
Improves the quality of quality-of-care reviews and merges medical QI, risk management
& quality improvement.
An intensive three-day learning lab, with 12-18 months of continuous mentorship, learning
and support.
1) Utilizes a new lens for understanding human error as an organizational problem of
complexity
2) Ensures timely and thorough [best practice] investigations using appropriate tools and
sensitivity
3) Results in audited follow-up with actions and shared improvements utilizing IH tools
and policy
Incident → Reporting → 1° Disclosure → Severity Decision (Unexpected severe harm or
death? Use the critical incident checklist) → Investigation → Follow-up Disclosure →
Recommendations → Implement changes → Audit changes → Share learning
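As a minimal sketch only (hypothetical; this is not the PSLS software), the flow above can be read as an ordered pipeline with a severity gate that adds the critical incident checklist:

    from dataclasses import dataclass

    # Hypothetical sketch of the incident-management flow above.
    STEPS = [
        "reporting", "primary disclosure", "severity decision",
        "investigation", "follow-up disclosure", "recommendations",
        "implement changes", "audit changes", "share learning",
    ]

    @dataclass
    class Incident:
        description: str
        severe_harm_or_death: bool  # unexpected severe harm or death?

    def route(incident: Incident) -> list[str]:
        """Return the ordered steps; severe events add the checklist."""
        steps = list(STEPS)
        if incident.severe_harm_or_death:
            steps.insert(steps.index("severity decision") + 1,
                         "critical incident checklist")
        return steps

    print(route(Incident("wrong-route infusion", severe_harm_or_death=True)))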
Supportive & Just Culture
Support for Health Care Staff
Interior Health is committed to promoting a just and trusting culture of safety in which its
health care providers can readily report incidents in order to learn and work to improve the
safety of patient care. While professional standards are known and understood, when there
has been a failure in the provision of care, Interior Health commits to:
1) Appropriate care and support for patients, families and healthcare providers
2) Evaluating all systemic factors that may have contributed to the failure
3) Following established, fair processes for evaluating actions and behaviors
Critical Incident Learning Summaries (CILS)
• Learning from adverse events & sharing de-identified case summaries.
• The demonstrable outcome of a thoroughly reviewed critical incident.
• This is what our staff, on patient safety culture surveys, want to know.
Closing the loop on system improvements
• Human factors approach to a recurrent clinical technical problem. MAV solution created
by clinical RRTs. Published in CJRT. The CSRT / CSA Technical Committee / Health
Canada are interested. (8 months' work)
• Automation "surprise" in Meditech made it possible to magnify patient ID risks.
Thankfully, an easy fix. (4 months' work)
• Management of anticoagulants with over-the-counter cough medications. (1 month's work)
• Common narcotic adverse event. Corrected at source, and well communicated.
(2 weeks' work)
Easier to resuscitate a patient who is alive (Brindley)
Have you retooled your code team into a MET, RRT, or CCOT?
Survival following CPR in hospital has not changed in 40 years.
Witnessed arrests: 48.3% resuscitated, 22.4% survived to discharge, and 18.9% made it home.
Unwitnessed arrests: 21.2% resuscitated, but only 1 patient (1.0%) survived to hospital
discharge and was able to return home.
Brindley, P. CMAJ 2002
Implementing Rapid Response Teams (collaboratives). Hill, W. CJRT 2005
"Insanity: doing the same thing over and over again and expecting different results."
– Albert Einstein
SBAR
A simple tool for clear, concise communication in urgent circumstances, with appropriate
assertion…
Situation – the punch line, 5-10 seconds
Background – the concise context, objective data
Assessment – what is your assessment of the situation?
Recommendation – what do we need to do now?
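As an illustration only (a hypothetical record format, not an IH form), an SBAR handoff can be captured as a structured record so none of the four elements is dropped:

    from dataclasses import dataclass

    # Hypothetical sketch: an SBAR handoff as a structured record.
    @dataclass
    class SBAR:
        situation: str       # the punch line, 5-10 seconds
        background: str      # concise context, objective data
        assessment: str      # your assessment of the situation
        recommendation: str  # what we need to do now

        def report(self) -> str:
            return (f"S: {self.situation}\nB: {self.background}\n"
                    f"A: {self.assessment}\nR: {self.recommendation}")

    page = SBAR(
        situation="Post-op patient with rising oxygen requirements",
        background="Day 1 after hip replacement; SpO2 88% on 5 L/min",
        assessment="Possible pulmonary embolism",
        recommendation="Please assess at the bedside now",
    )
    print(page.report())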
Nine Steps to Move Forward on Safety
D. Woods / R.I. Cook
1. Pursue Second Stories underneath the surface to discover multiple
contributors.
2. Escape the hindsight bias.
3. Understand work as performed at the sharp end of the system.
4. Search for systemic vulnerabilities.
5. Study how practice creates safety.
6. Search for underlying patterns.
7. Examine how change will produce new vulnerabilities and paths to failure.
8. Use new technology to support and enhance human expertise.
9. Tame complexity through new forms of feedback.
Leonardo da Vinci Center for Complexity and Systems Thinking
Lund University, Sweden – www.lusa.lu.se
Wrae Hill, BSc, RRT
Corporate Director, Quality Improvement & Patient Safety
Interior Health, Kelowna, BC V1Y 4N7
P: 250-870-5893
E: wrae.hill@interiorhealth.ca