
Lessons from TMI: Lessons from big industrial accidents and their relevance for the Pharma industry

ARETE-ZOE, LLC
Registered address: 1334 E Chandler Blvd 5A-19, Phoenix, AZ 85048, USA

Solutions to complex problems in the high-stakes, high-consequence environment of global pharmaceuticals, including clinical research, healthcare informatics, and public health. We blend established Pharma-sector methodologies, innovation, and adaptations/transfers from other sectors to identify and resolve consequential practices that pose risk and often result in avoidable patient casualties.

Three Mile Island, PA, USA (1979)

[Reactor schematic: reactor, cadmium rods, steam generator, cooling circuit, turbines, steam pipes]

03:58 AM, March 28, 1979
A routine repair of a clogged filter left a trace of water inside the air circuit.
Alarms went off, causing confusion in the control room.
Computers interpreted “water in air system” as “dangerous invader” and shut down the pumps in the cooling circuit.

Pumps in cooling circuit shut down
Why: Misinterpreted information from the air circuit
Consequence: Cooling system disabled, reactor heating up

Operators faced a situation they were not trained for and that was not covered by their procedures.
Pressure built up in the primary cooling system within the reactor.
The computer ordered the cadmium rods to plunge into the reactor, stopping the chain reaction.
Pressure and heat within the reactor continued to build.

The pilot-operated relief valve (PORV) opened to vent the pressure and failed to reclose.
The indicator incorrectly suggested that the valve had reclosed.
Design: the indicator correctly showed that the control order had been “sent”, not that it had been “obeyed”.

Surrogate endpoint
In clinical trials, a surrogate endpoint (or marker) is a measure of the effect of a specific treatment that may correlate with a real clinical endpoint but does not necessarily have a guaranteed relationship (for example, a lab value may improve while the true clinical outcome does not).

[Diagram: Intervention → Disease, assessed via surrogate endpoint v. true clinical endpoint]

Valve open, reactor coolant leaking out without anyone’s knowledge
Stuck valve undetected
Why: Misinterpreted information from indicators
Consequence: Cooling system disabled, reactor heating up
Communication failure: a single phone line in the control room; key people were unable to get through.


Overheated reactor
Incorrect readings from instruments: volume of coolant measured indirectly
Decision: to turn off reactor pumps
Incorrect conclusions; loss of trust in the instruments
The system did not behave as expected

OBSERVE → ORIENT → DECIDE
Training
Decisions based on incorrect, misleading, or no information

7:15 AM: stuck valve finally discovered; pumps finally restarted; reactor still overheating
Misleading temperature readings
Why: Instruments not designed for temperatures this high
Radiation in the control room
Mounting pressure

8:33 AM – General emergency
Misleading and deceptive information provided to the public by the company
Minimal information provided to state administration and regulators
Partial evacuation within a 5-mile radius

Freedom of speech, anyone?
U.S. v. Caronia
Amarin v. FDA
Not an issue during TMI; a real concern now in off-label promotion (must be “truthful”)

Reactor cracked
Sample of contaminated coolant
Basement full of contaminated water; radioactive gas was eventually released into the atmosphere
Oh, by the way, it can blow up because of accumulated hydrogen.
There was a fierce dispute within the NRC over whether this could happen or not.
This risk did not materialize. The partial reactor meltdown did not result in any additional release of radiation.

A complex combination of minor equipment failures and major inappropriate human actions.

Risk = Probability × Consequence
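As a worked illustration of this formula, here is a minimal sketch; both scenarios and all numbers are hypothetical, not values from the TMI record:

```python
# Minimal sketch of Risk = Probability x Consequence.
# Both scenarios and all numbers are hypothetical.

def risk(probability: float, consequence: float) -> float:
    """Expected loss: probability of the event times the size of its consequence."""
    return probability * consequence

# A rare, catastrophic event can carry more risk than a frequent, minor one.
frequent_minor = risk(probability=0.10, consequence=1_000)            # 100.0
rare_catastrophic = risk(probability=0.0001, consequence=10_000_000)  # 1000.0
print(frequent_minor, rare_catastrophic)
```

Underestimating a small probability of a huge consequence therefore understates the risk itself, which is exactly the flaw in the “that can’t happen here” mindset below.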

1973 oil crisis
Cheap power needed
Political topic
Gov’t subsidies
Probability assessment flawed: “That can’t happen here” mindset

Root cause: Human factors
• Combination of lesser events
• Misjudged probability
• Misinformation
• Confusion
• Inadequate training
• Inappropriate human response
Ordinary mistakes in a high-stakes environment

[Chart: Probability v. Consequence matrix – known events marked X, unknowns marked ?]

The need for change
RECOMMENDATIONS
Fundamental changes in:
• Organizational procedures and practices
• The attitude of regulators
• Operator training, updates
• Emergency response and planning
The organization failed to learn from previous failures.

The need for change
RECOMMENDATIONS
• More attention to human factors
• Combination of lesser events (slower to develop, more likely)
• Training, fitness for duty
• Organizational structure
• Communication
(Previously: focus on equipment safety and large-break accidents)

Compliance v. Safety culture
“It is the responsibility of the NRC to issue regulations to assure the safety of nuclear power plants. However, regulations alone cannot assure safety. Once regulations become as voluminous and complex as those now in place, they can serve as a negative factor in nuclear safety. The complexity requires immense efforts to assure compliance.”

Requirement v. Consequence
The satisfaction of regulatory requirements is equated with safety.
• Focus on compliance with regulations instead of intrinsic system safety
• Inspection manual voluminous and complex – unclear to many inspectors
• Enforcement actions limited/unused
• Reliance on the industry’s own data
• No systematic evaluation of patterns
• Unclear roles & responsibilities

The role of regulators
The NRC has erred on the side of the industry’s convenience rather than its primary mission of assuring safety.

The role of regulators
HUMAN FACTORS: Fiduciary responsibilities of public servants
Worst problem: Loss of public trust
• Misinformation
• Deception
• Misunderstanding
• Fear & Confusion

Outcome
Transformation of the industry
Major regulatory reform

Chernobyl, Ukraine, USSR (1986)
Orders were received to carry out tests to find out how much energy could be saved during a routine maintenance shutdown.
Numerous safety mechanisms had to be turned off to make this test possible:
• Power levels lowered to perform tests
• Emergency core cooling system shut off
• Operator failed to program the computer to prevent power from dropping below the minimum safe level
• Automatic scram devices and alarms switched off
• Control rods withdrawn too far to reinsert quickly
(Slide annotations: = Bad idea, = Very bad idea)

Chernobyl nuclear plant, unit 4
April 26, 1986 (1:23 AM – 5 AM)

Systemic factors
Long record of sometimes fatal accidents
ACCIDENT WAITING TO HAPPEN
• National 5-year production goals oblivious to reality
• Training often suspect and shoddy
• Lax observance of rules and regulations
Causes of disaster: irresponsibility, negligence, indiscipline, flawed performance metrics

HUMAN FACTORS

Outcome

Sweeping changes in Soviet society

Disintegration of the Empire due to loss of credibility

Martin Winterkorn, CEO of Volkswagen AG, acknowledged that 11 million vehicles were equipped with diesel engines with defeat devices to cheat pollution tests.

…And spreading

Root cause
Cause entirely internal
Flawed performance metrics; VW very sensitive to its own image
Internal pressures to improve metrics caused someone to manipulate the system – in a manner that amounted to conspiracy.

Lessons learned?
Behavior of organizations follows the same principles regardless of industry.

Common attributes
• Formally regulated industries
• High-stakes, high-consequence environment
• Information flow within organization
• Communication with stakeholders
Public trust essential
TMI, VW, Chernobyl

An accident caused by systemic factors impacts the whole industry.

Common root cause?

• Requirement v. Consequence

• Individual and collective accountability

• Poor leadership

• Flawed performance metrics

• Failure to learn from previous errors

• Communication with stakeholders / public

• Regulatory response

• Delivery/enforcement of regulation

TMI, VW, Chernobyl

HUMAN FACTORS

Regulators and elected officials

• Subject to the same human frailties

• Oblivious to ambiguity

• Requirement v. Consequence

Public trust essential

TMI, VW, Chernobyl

HUMAN FACTORS

[Diagram: Human factors within the ethical environment – experience, training, education; capabilities, demographics, frailties, values; organizational culture, reporting structure, leadership]

What is risk?
Risk = probability of detrimental consequence
• Probability: vulnerability in process × threat (capability, intent/ability)
• Detrimental consequence: accidental or malicious
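A minimal sketch of this decomposition in code; the class layout and the multiplicative scoring are illustrative assumptions made for this example, not a standard prescribed by the slide:

```python
# Illustrative sketch of the risk decomposition above; field names and the
# multiplicative combination are assumptions made for this example.
from dataclasses import dataclass

@dataclass
class Threat:
    capability: float         # 0..1: how capable the threat is
    intent_or_ability: float  # 0..1: malicious intent or accidental likelihood

@dataclass
class Risk:
    vulnerability: float  # 0..1: weakness in the process the threat can exploit
    threat: Threat
    consequence: float    # severity of the detrimental outcome

    def score(self) -> float:
        # Probability ~ vulnerability x threat; Risk = probability x consequence
        p = self.vulnerability * self.threat.capability * self.threat.intent_or_ability
        return p * self.consequence

print(Risk(0.3, Threat(0.8, 0.5), consequence=1_000).score())  # 120.0
```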

Qualifying consequence
Safety signal: it takes a significant number of casualties with an attributed causal relationship to produce a signal.
Qualifying consequence = patient injury + attribution (a statistically significant cause attributed to a drug; see the sketch below)
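To show how such attribution is quantified in practice, here is a minimal sketch of a proportional reporting ratio (PRR), one common disproportionality measure applied to spontaneous-report databases such as FAERS; all counts below are invented:

```python
# Minimal sketch of a proportional reporting ratio (PRR), a common
# disproportionality screen for spontaneous-report data (e.g., FAERS).
# All counts below are invented for illustration.

def prr(a: int, b: int, c: int, d: int) -> float:
    """
    a: reports with the drug of interest AND the event of interest
    b: reports with the drug of interest and other events
    c: reports with other drugs AND the event of interest
    d: reports with other drugs and other events
    """
    rate_drug = a / (a + b)    # event rate among reports for this drug
    rate_other = c / (c + d)   # event rate among reports for all other drugs
    return rate_drug / rate_other

# A PRR well above 1, with enough cases, flags a potential signal for review.
print(round(prr(a=30, b=970, c=200, d=98_800), 1))  # 14.9
```

A screen like this only flags disproportionality; causal attribution still requires clinical assessment of the individual cases.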

Dispensing error / incorrect substitution

Non-compliance with treatment

Self-medication (OTC, Rx, illicit)

Atypical manifestation of disease

Misdiagnosis

Prescribing error

Wrong dose (predictable, unpredictable)

Individual variability in response

Misleading information on drug

Drug interactions (known, unknown)

Off-label use (appropriate, inappropriate)

Counterfeit medications

Limitation of science

Honest mistake

Omission

Commission

Deception

False Claim

PATIENT INJURY

Adverse outcome: consequences for the patient, clinician, pharmacist, regulator, drug manufacturer, healthcare facility, insurer, and elected officials – at both individual and population level.

COMMON CONSEQUENCE

The only way to change the behavior of organizations is…

…to create

Detection of vulnerabilities
[Risk decomposition diagram repeated: probability (vulnerability in process × threat: capability, intent/ability) × detrimental consequence (accidental or malicious)]

Quality risk management
Record of past events (EV, FAERS)
FTA, FMEA, FMECA
HAZOP, HACCP, PHA
Systems modeling
Identify vulnerabilities
Impose safety constraints
Enforce these constraints
• By Design
• By Operations
Risk assessment (ICH Q9, ICH E2E)
Define accountability for control of vulnerabilities and for acting upon them (R&R)
Enable decision-makers (an FMEA-style example follows below)
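To make one of these tools concrete, here is a minimal sketch of an FMEA-style risk priority number (RPN = severity × occurrence × detection); the failure modes and the 1–10 scores are invented for illustration:

```python
# Minimal FMEA-style sketch: Risk Priority Number = Severity x Occurrence x Detection.
# The failure modes and 1-10 scores below are invented for illustration.
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (certain to detect) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("Mislabeled vial reaches dispensing", 9, 2, 7),
    FailureMode("Temperature excursion in storage", 6, 4, 3),
]
# Rank failure modes for mitigation, highest RPN first.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(m.rpn, m.description)
```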

System-Theoretic Accident Model and Processes (STAMP)
Imposing constraints on a system while ensuring the enforceability of these constraints by design and operations.

[Control-structure diagram: a human supervisor (controller, holding models of the process and of the automation) and an automated controller (holding models of the process and of its interfaces) act on the controlled process through controls and actuators; sensors and displays feed back measurable and controlled variables; process inputs, disturbances, and process outputs act on the controlled process]
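A minimal sketch of such a control loop enforcing a safety constraint; the process model, the constraint, and all numbers are invented, and this illustrates the structure of the diagram rather than STAMP's formal notation:

```python
# Minimal sketch of a control loop enforcing a safety constraint, in the
# spirit of the STAMP control-structure diagram above. The process model,
# constraint, and numbers are invented for illustration.

LIMIT = 100.0      # safety constraint: process temperature must stay below this
HEAT_INPUT = 5.0   # heat the actuator adds per step when commanded on
LOSSES = 0.5       # heat the process loses per step (disturbance)

def sense(temperature: float) -> float:
    """Sensor: report the measurable variable (here, a perfect measurement)."""
    return temperature

def control(measured: float) -> float:
    """Automated controller: using its model of the process, command zero
    heating whenever the next step could violate the safety constraint."""
    return 0.0 if measured + HEAT_INPUT >= LIMIT else HEAT_INPUT

temperature = 20.0  # controlled process state
for _ in range(100):
    temperature += control(sense(temperature)) - LOSSES  # actuate the process
    assert temperature < LIMIT  # the constraint holds at every step

print(round(temperature, 1))
```

The point of the sketch is the slide's claim: the constraint is enforced by the design of the controller and by its operation, not by regulation after the fact.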

System models
Simplified models of a complex environment
Tools to enable decision-makers:
• Reduce ambiguity and uncertainty
• Accountability for acting upon vulnerabilities
• Limit liability

HUMAN FACTORS

Tools do not substitute for good leadership

PUBLIC TRUST

Training

Correct input – accurate and timely orientation


Thank you