Human Factors Focus Part 3
Everybody wants safety
Everybody is a stakeholder and wants a safe operation. The owners, the safety board, the pilots, the cabin crew and the passengers – everyone likes safety. To create safety in an airline, you need a lot of things. These are just some examples:
- Well-respected Standard Operating Procedures
- Selected and motivated personnel
- Training
- Audits and Supervision
- Safety Policies
- Trust
- Just Culture
- Non-punitive reporting systems
- Etc. etc.
Avoid hindsight
When something goes wrong, it’s always easy to look at the incident/accident in hindsight. By that I mean that it’s easy to find reasons, assign blame and jump to quick and seemingly smart conclusions once you have some of the facts. You look at “the chain of events” that sometimes ends with a hole in the ground and say “If only they had done this instead of that”, “They should have set switch 1 to position A instead of B” and so on. Sure, that’s easy to say in a nice and warm office, months or years later, when all the facts and mistakes have been thoroughly scrutinized over and over again. On top of that, you know for certain that the outcome was bad, since you’ve already seen “the hole in the ground”.
Modern systems & Automation
In a modern aircraft, there can be a huge gap between a small human error and its sometimes catastrophic result. In earlier days, there was often a strong correlation between the size of an error and its unwanted outcome, and we tended to view an incident as “a chain of events”: this error led to that result, which led to the next error, and so on. Today, many systems are interconnected in multiple ways, yet an entire network of systems can be influenced by a single button. These complex systems are sometimes very hard for pilots to grasp. While pilots in earlier days were confident that their education and training had given them very detailed knowledge of the various systems, today’s pilots are well aware that this is impossible, and maybe not even relevant anymore. The pilot is more of a system operator today than ever.

Automation has brought a lot of safety to aviation and has helped pilots avoid and exclude certain errors. It has never been as safe to fly as it is today. However, automation has also brought new sources of error, new questions and new challenges, which need to be addressed in a professional way. New automated systems are usually introduced in the name of safety, and more safety is always good. One of these new systems might, for example, increase altitude accuracy and in turn make dense traffic areas safer. So the engineers construct a system in the interest of increased safety. After a while, traffic increases even further, and some authority suggests: “Thanks to these sophisticated new systems we can lower the separation by 50%, from 2000 feet to 1000 feet!” The new system, introduced in the name of increased safety, has suddenly resulted in a 50% decrease in separation.

More automation might relieve the pilots, so they won’t be as tired when it’s time to be sharp and perform a demanding approach and landing. Good, huh!? A safety measure! Unfortunately, it doesn’t take long before someone concludes: “Now, with all this automation, the pilots don’t get as tired as they did before…let’s increase their maximum duty time so they can work longer days!” Do you see what I mean? The introduced safety margin is eaten up in order to increase productivity and reduce costs.
WHY instead of WHO
It’s dangerous to put labels like “Pilot Error” in an investigation. There’s no such thing as “Pilot Error”! You can call it “Human Error”, since pilots actually are humans, but even that label doesn’t explain anything. These expressions and labels won’t help us understand the cause. Labels like these push investigators toward finding someone to blame, toward finding a Bad Apple in hindsight. The logic goes: if you find the bad apples, remove them instantly, fire them, so they don’t risk influencing all the others. But if we fire the pilots who make errors or mistakes, will we then have a safer operation? NO, because we haven’t found the cause of the accident! It’s still there, in the system, waiting to happen again. By signaling to the other professionals in the company “We take action, we remove the unreliable humans who make errors”, the result could actually be the reverse: a less safe operation, since the willingness to report errors will decrease. Who wants to be prosecuted and fired? That’s why it is vital to kill this method once and for all!

We need to look into WHY instead of WHO! We need to focus on how the situation appeared to the people involved, at that very moment, in order to find countermeasures and ways to eliminate the problem in the future. The pilots were most probably skilled professionals, often with long experience at the sharp end. People normally try to do their best in these situations. Nobody wants to die, so let’s find out WHY they took these decisions in the takeoff situation in Philadelphia. Let’s look at WHY they didn’t calculate the takeoff thrust, WHY they didn’t report that they had just undergone a stress test and received medication, WHY they put the wrong runway into the computer, WHY they didn’t abort the takeoff when the warning bell sounded, and so on. If you do this, you’ll find countermeasures that increase safety in the future!
It’s so easy to put the label “Pilot Error” or “Human Error” on an incident/accident in order to make it more understandable. The stakeholders, i.e. the passengers and the public, will accept it if we use labels like this. And no airline or aircraft manufacturer is interested in supporting causes that make THEM responsible. Think of all the lawsuits and all the money that would cost!
Bottom lines & Pre-planned Strategies
I fly the A320 series, the same aircraft type that was involved in the accident in Philadelphia (see the other article on this site) and Air Asia 8501. It’s a brilliant aircraft, and I really love flying it. It’s “safe” by design and has some features and aids that no other commercial aircraft has. It’s also safe thanks to the training and the skilled professionals operating these machines, often with competing goals and sometimes difficult dilemmas. But it’s also demanding, especially in the pre-flight phase. It needs a lot of data and programming before you are able to take off. As the flying pilot, you are very busy during our rather short turnarounds, needing lots of data about the route, the weight, thrust settings, alternate(s), winds etc. Once all that is programmed into our computers, we are ready to go! But it’s a long way getting there, and a small error can have vast consequences. Like in Philadelphia.

One way to handle this pressure is to set bottom lines. By being proactive, you can make a decision early and relieve the pressure. If you are struggling to make a takeoff slot, ask yourself: “Is it even realistic?” If not, ask for an extension or a new slot as early as possible. Long before you end up in a stressed pre-takeoff situation, evaluate whether you can develop strategies and decide in advance what actions you would take. It’s easier to do this at home, long before you even arrive at work. By developing strategies like this, in a calm and focused way, you are better prepared when these situations occur.
New view of Human Error – Summary
- The way forward, the “new view of Human Error”, should be systemic.
- Human error is a symptom of trouble deeper inside the system.
- Safety is not inherent in systems. The systems themselves contain contradictions between multiple goals that people must pursue simultaneously. People have to create safety.
- Human error is systematically connected to features of people’s tools, tasks and operating environment. Progress on safety comes from understanding and influencing these connections.
Disclaimer: These are my own thoughts and do not automatically represent my company’s policies. Ref. Professor Sidney Dekker, who has earned my greatest respect in these matters.
More about the “New view of Human Error” later in this series at sascaptain.com