The Field Guide to Understanding Human Error Summary

MD Barkat Ullah
4 min read · Feb 20, 2021


The Field Guide to Understanding ‘Human Error’ by Sidney Dekker

Is safety making sure that a few things don't go wrong, or making sure that as many things as possible go right? There is a balance to be struck between the two. The old view of safety sees people as a problem to control, whereas the new view sees people as a resource to harness.

The book explains the hindsight bias: finding out about an outcome increases the estimate we make of its likelihood. In other words, as a retrospective reviewer who knows the outcome of an event, you exaggerate your own ability to have predicted and prevented that outcome.

The outcome bias. Once you know the outcome, it changes your evaluation of decisions that led up to it. If the outcome is bad, then you are not only more willing to judge the decisions, but also more likely to judge them more harshly.

Divide an operational system into a sharp end and a blunt end: At the sharp end (for example the train cab, the cockpit, the surgical operating table), people are in direct contact with the safety-critical process.

At the blunt end is the organization, or set of organizations, that both supports and constrains activities at the sharp end (for example, the airline or hospital; equipment vendors and regulators). Consider starting an investigation at the blunt end rather than the sharp end.

Try to understand that the operators' understanding of the situation was not static or complete, as yours perhaps is in the review situation. To them, the situation was incomplete, unfolding and uncertain.

Complacency is another name given to 'human error': the failure to recognize the gravity of a situation, or to follow procedures or standards of good practice. In the battle against complacency, it is essential to help operators retain situation awareness; otherwise they keep missing warning signals.

“Non-compliance with procedures is the single largest cause of ‘human error’ and failure.” This book clearly points out that labelling things this way isn’t really helpful. It is commonly perceived that there is a need to establish the root cause; however, there is often no single root cause, and in fact many factors interplay.

There is a concept known as plan continuation, in which early and strong cues suggest that sticking with the original plan is a good, and safe, idea, while only later, weaker cues suggest that abandoning the plan would be better. Appreciate that the event itself may take only moments, whereas afterwards a large amount of time can be spent studying the adverse outcome, when time is no longer a critical factor.

Dynamic fault management is typical of event-driven domains: while a situation is unfolding, people have to commit cognitive resources to solving problems while also maintaining process integrity. In other words, everything else doesn't stop.

It is worth acknowledging that complacency may arise when an automated process is perceived as highly reliable: operators may not merely trust it, but trust it too much, so that they fail to monitor its variables often enough.

There are different models for evaluating errors:

Hazard Triangle

Swiss cheese

Chain of events

Barrier model

All have different advantages, but you need to think about which factors are present when deciding which model to use and what to display in it. In addition, there may not be a clear timeline.
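As a toy illustration of the barrier (Swiss cheese) family of models: if each defensive layer has some chance of having a "hole", an accident occurs only when the holes line up across every layer. The sketch below is not from the book; it assumes, purely for illustration, that layers fail independently, an assumption that drift into failure specifically undermines in real organizations. The probabilities are hypothetical.

```python
def accident_probability(hole_probs):
    """Chance a hazard passes through every defensive layer,
    assuming each layer's 'hole' occurs independently."""
    p = 1.0
    for hole in hole_probs:
        p *= hole
    return p

# Hypothetical per-layer failure probabilities: design review,
# automation interlock, operator monitoring.
layers = [0.1, 0.05, 0.2]
print(accident_probability(layers))
```

Even with weak individual layers, the combined probability is small, which is precisely why long runs of success can mask how much each layer has quietly eroded.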

Often trade-offs occur when one aspect of safety conflicts with another part of the business process. These trade-offs are negotiated and resolved in the form of thousands of smaller and larger daily decisions. Over time they are no longer made deliberately by the organization, but by individual operators or crews.

What is accepted as risky or normal will then shift over time: as a result of the pressures and expectations the organization puts on people, and as a result of continued success, even under those pressures. This is known as drift into failure. Drift happens insidiously.

Murphy’s law is wrong. What can go wrong usually goes right, and then we draw the wrong conclusion: that it will go right again and again. With each success we borrow a little more from our safety margins.

So can ‘human error’ go away? The answer isn’t as simple as the question. A ‘human error’ problem, after all, is an organizational problem. It is at least as complex as the organization that helped create it. To create safety, you don’t need to rid your system of ‘human errors’. Instead, you need to realize how people at all levels of the organization contribute to the creation of safety and risk through goal trade-offs that are legitimate and desirable in their setting.

Every organization has room to improve its safety. The most important thing is that the organization is willing to explore this space, to find leverage points to learn and improve.
