
Page 1: Human Factors

Human Factors

Risk Management Services Department

Page 2: Human Factors

Are You Perfect?

• Have you ever pushed the wrong button on a soda machine, left your car headlights on, or unintentionally deleted a file on your computer?

• Wonder how often these (and more serious) errors occur?

Page 3: Human Factors

Laws of Nature

We accept and design for the laws of nature.

Example: If a bridge falls down, we don’t list “gravity” as the root cause.

Example: If someone is asphyxiated, we don’t list “people need oxygen” as the root cause of the injury.

Page 4: Human Factors

Human Error

• Law of nature:

HUMANS MAKE MISTAKES!

• DON’T BLAME IT…PLAN FOR IT!

Page 5: Human Factors

How Often Do Humans Make Mistakes?

• Trained, not under stress, not fatigued or overloaded, and given enough time:

Error occurs about 1 in every 100 times the operation is done

Page 6: Human Factors

How Often Do Humans Make Mistakes?

• Not trained, or under stress, or overloaded, or given too little time:

Error occurs roughly half the time to every time the operation is done

Page 7: Human Factors

How Often Do Humans Make Mistakes?

• Trained, not under stress, not fatigued or overloaded, given enough time, AND with built-in feedback:

Error occurs about 1 in every 1,000 times the operation is done
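To put these rule-of-thumb rates in perspective, here is a quick back-of-the-envelope estimate. Only the three error rates come from the slides above; the 200-operations-per-day workload is an assumed example.

```python
# Rough expected-error estimate using the rule-of-thumb rates from the slides.
# The 200 operations/day workload is an assumed example, not from the slides.
OPERATIONS_PER_DAY = 200
DAYS_PER_YEAR = 365

rates = {
    "not trained / stressed / rushed": 0.5,     # ~1 in 2 (or worse)
    "trained, unstressed, enough time": 0.01,   # ~1 in 100
    "trained, plus built-in feedback": 0.001,   # ~1 in 1,000
}

for condition, p_error in rates.items():
    expected = p_error * OPERATIONS_PER_DAY * DAYS_PER_YEAR
    print(f"{condition}: roughly {expected:,.0f} errors per year")
```

Even at the best rate, errors still happen regularly over a year of operation, which is the point of planning for them rather than blaming them.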

Page 8: Human Factors

What is Feedback?

• Buzzer when you leave your lights on

• Bell if the keys are in the ignition when the car door is opened

• A control system asking you to confirm that the charge amount you entered is correct and showing that the proper pumps and valves are open/closed

If you can see that you are doing the right thing, then you can be sure that you did it.
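A software analogue of this kind of feedback is to echo the entered value and the equipment state back to the operator before acting. The sketch below is a hypothetical illustration; the function name, charge amount, and valve names are invented for the example and do not refer to any real control system.

```python
def confirm_charge(entered_amount_kg, valve_states):
    """Echo the entered charge and the valve line-up back to the operator,
    then require an explicit confirmation before proceeding (hypothetical sketch)."""
    print(f"You entered a charge of {entered_amount_kg} kg.")
    for valve, is_open in valve_states.items():
        print(f"  {valve}: {'OPEN' if is_open else 'CLOSED'}")
    answer = input("Is this correct? Type YES to continue: ")
    return answer.strip().upper() == "YES"

# Example: the operator sees what the system is about to do before it does it.
if confirm_charge(250, {"feed valve V-101": True, "drain valve V-102": False}):
    print("Charge started.")
else:
    print("Charge cancelled.")
```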

Page 9: Human Factors

Can a Human Check a Human?

• Principle: If a person knows that someone else checked, they are not likely to reliably recheck

Human checking is not generally a reliable safeguard against errors made by other humans

(Exception: airline industry, although it is not 100% reliable…)
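A toy calculation shows why. If a second checker were truly independent and just as careful as the first, very few errors would slip past both; in practice the checks are correlated, so the benefit is far smaller. The "fraction caught" figure below is an illustrative assumption, not measured data, but the conclusion holds for any realistic value.

```python
p_error = 0.01          # rule-of-thumb single-person error rate (1 in 100)

# If a second, fully independent checker were just as careful:
p_missed_if_independent = p_error * p_error        # 1 in 10,000

# Illustrative assumption: knowing someone else already checked,
# the second person catches only a fraction of the remaining errors.
p_checker_catches = 0.2                            # assumed value, not measured
p_missed_in_practice = p_error * (1 - p_checker_catches)   # 1 in 125

print(p_missed_if_independent, p_missed_in_practice)
```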

Page 10: Human Factors

Helios Plane Crash, Aug. 2005

Three checks by two pilots missed the switch left in the wrong position

Ineffective response to loss of cabin pressure and incapacitation of crew

Page 11: Human Factors

Is Technology the Panacea?

Principle: If a safety system is installed to protect against human error, the human will depend on it. Then the safety system becomes the only layer of protection.

Principle: All mechanical things break. Safety systems need to be tested to ensure that they are working properly.

Page 12: Human Factors

Real-Life Example

• An operator loading a tank overflowed the tank

• Management put a high level shutoff on the pump

• The operator relied on the switch and did not watch the tank level closely

• One day, the switch failed and the tank overflowed
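A small calculation, using assumed probabilities, illustrates what this costs. When the operator stops watching, the shutoff switch is no longer a backup layer; it is the only layer.

```python
# Illustrative (assumed) probabilities of failure per filling operation.
p_operator_misses_level = 0.01   # attentive operator, rule-of-thumb 1-in-100 rate
p_switch_fails          = 0.01   # assumed failure probability of the shutoff switch

# Operator watches the level AND the switch backs them up (independent layers):
p_overflow_two_layers = p_operator_misses_level * p_switch_fails   # ~1 in 10,000

# Operator relies entirely on the switch, so the switch is the only layer:
p_overflow_switch_only = p_switch_fails                            # ~1 in 100

print(p_overflow_two_layers, p_overflow_switch_only)
```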

Page 13: Human Factors

BP Texas City

• Operators did not fully understand Raffinate Splitter Tower operation

• Startup procedures not fully followed

• Material fed to column but did not exit; critical valve not opened during startup

• Level exceeded safe limits; level device failed; not recognized

• Level instrumentation in blow-down tank failed, but not repaired

• Blow-down tank overflowed, material reached an ignition source, and a vapor cloud explosion resulted

Page 14: Human Factors

BP Texas City Explosion / Fire, March 23, 2005

Page 15: Human Factors

Caveat

Any system human beings devise to prevent failure can be overcome by human beings with sufficient determination and authority.

If there is a will, there is a way!

Page 16: Human Factors

Guiding Principles for Preventing Human Error

• Humans and systems designed by them are vulnerable to error

• Existing facilities contain many traps that can cause human error

• Designers can provide systems to facilitate error/deviation detection and to enable recovery before the error/deviation becomes serious

Page 17: Human Factors

Design Considerations

• Ergonomics – Can the operator reach what he needs to and work safely?

• Operability – Is the work flow designed to minimize taking shortcuts?

• Procedures – Are they clear and easy to follow, and do they explain the consequences of deviations?

• Maintenance – Is there access and capability to maintain equipment?

• Simplify – a simpler design leaves less chance of error

Page 18: Human Factors

Design Considerations

• Be consistent – orient valves the same way, use computer diagrams that look like the equipment layout

• Human limitations – consider color-blind operators and operators of different heights

• Safety systems – make sure they can’t be bypassed

• Alarm management – Don’t shower the operator with more alarms than they can process at once (see the sketch below)!
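One common way to avoid alarm floods is to let critical alarms through immediately while rate-limiting the rest. The class below is a minimal, hypothetical sketch of that idea; it is not drawn from any particular alarm-management standard or product, and the one-minute budget is an assumed parameter.

```python
from collections import deque
import time

class AlarmFilter:
    """Pass critical alarms through immediately; rate-limit low-priority ones."""

    def __init__(self, max_low_priority_per_minute=10):
        self.max_low = max_low_priority_per_minute
        self.recent_low = deque()          # timestamps of recent low-priority alarms

    def present(self, alarm_text, critical=False, now=None):
        now = time.time() if now is None else now
        if critical:
            return f"CRITICAL: {alarm_text}"   # never suppressed
        # Forget low-priority alarms older than 60 seconds, then check the budget.
        while self.recent_low and now - self.recent_low[0] > 60:
            self.recent_low.popleft()
        if len(self.recent_low) >= self.max_low:
            return None                        # suppressed (logged, not shown)
        self.recent_low.append(now)
        return alarm_text
```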

Page 19: Human Factors

Chernobyl Nuclear Reactor Runaway, April 26, 1986

Page 20: Human Factors

Chernobyl, Soviet Union 1986

• Nuclear meltdown resulted in 56 direct deaths, relocation of 336,000 people, and a plume of radioactive fallout

• Significant design flaws in reactor

• Safety systems switched off

• Operator errors/training

• Alarm showers confused the operators (also at Three Mile Island)

Page 21: Human Factors

Cultural Stereotypes

• GREEN is on, RED is off…but not in Japan!

• H is hot water, C is cold…except in non-English-speaking countries (chaud in French and caliente in Spanish both mean hot, so C can mark the hot tap)

• Light switch is up for on…except in the UK!

Page 22: Human Factors

Human Factors Philosophy

1. Make the right way THE ONLY WAY

2. Make the right way THE EASIEST WAY

3. Give the operators feedback that it was done the wrong way

4. Provide safeguards for when it is done the wrong way
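In software terms, the first two points mean designing interfaces so that the unsafe sequence is hard or impossible to express, and the third means giving explicit feedback when it is attempted anyway. The sketch below is a hypothetical illustration of that philosophy applied to a tank transfer; the class, valve names, and checks are invented for the example.

```python
class TankTransfer:
    """The transfer cannot start until the valve line-up has been verified,
    and the object says so explicitly when asked to do the wrong thing."""

    def __init__(self):
        self._lineup_verified = False

    def verify_lineup(self, valve_positions, required_open):
        """valve_positions: valve name -> actually open?  required_open: names of
        valves that must be open; every other valve must be closed."""
        ok = all(is_open == (name in required_open)
                 for name, is_open in valve_positions.items())
        self._lineup_verified = ok
        return ok

    def start(self):
        if not self._lineup_verified:
            # Feedback that it was attempted the wrong way (point 3 above).
            raise RuntimeError("Valve line-up not verified: transfer refused.")
        print("Transfer started.")

# Example usage with assumed valve readings:
transfer = TankTransfer()
positions = {"V-101": True, "V-102": False}
if transfer.verify_lineup(positions, required_open={"V-101"}):
    transfer.start()
else:
    print("Wrong line-up detected before anything was pumped.")
```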

Page 23: Human Factors

Remember Other Operations…

Don’t forget about maintenance, startup, and shutdown. These are the riskiest times in a process. There must be EHS reviews, management of change, permitting procedures, training, and communication systems to avoid human error.

Page 24: Human Factors

Piper Alpha, 1988

Page 25: Human Factors

Piper Alpha, 1988

Page 26: Human Factors

Piper Alpha, North Sea, UK

• Operators switched on a pump that was undergoing maintenance – poor lockout/tagout and communications

• Significant leak/fire ensued

• Piper Alpha was destroyed

• 167 fatalities, loss of millions in revenue per day

Page 27: Human Factors

Safety Culture

A safety culture that promotes and reinforces safety as a fundamental value is inherently safer than one that does not

- Do we have to follow the standards?

- Do we really have to shut down?

- Do we have to install this safety system?

If these questions are asked, it is an indication of a poor safety culture!

Page 28: Human Factors

Summary

• Human error is a fact of nature – plan on it

• Design the process to minimize “traps”

• Provide training and clear guidance

• Provide feedback on whether the operator’s action was right or wrong

• Don’t expect humans to check humans

• Provide safety systems

• Remember to consider startup, shutdown, and maintenance

• Support an interdependent safety culture