Each of us knows a person who thinks he is always right and never makes mistakes: “Once I thought I was wrong, but that was a mistake.” Yet according to surveys, this is actually one of the most vulnerable types of person.
If you are unsure and perhaps a bit junior, you read the manuals, ask for help, and double-check before executing. In addition, your manager and co-workers are watching every move you make, carefully ensuring that you don’t get the chance to screw up.
But as Stephen Wright states in his lecture: trusted, senior, experienced staff are the ones making the worst errors. Nobody is watching over them, and they are trusted to get on with it and do it right every time – because they have done so thus far. For them, reading instructions is a sign of weakness, following processes is for amateurs, and shortcuts exist specifically for them to take. As a famous Finnish F1 driver summarized it nicely: “Leave me alone, I know what I’m doing” (then again, maybe he did know?).
Who’s to blame?
Stephen Wright also brings up another very common but dangerous phenomenon: the blame cycle at work. This is related to another disruptive way of working, called “leading by fear”. In practice, it means that when an error occurs, the most important thing is to find the guilty parties – not the root causes or a way to avoid the error in the future.
This creates an atmosphere of fear in which all errors and deficiencies are deliberately hidden from management. Many companies have managed to keep up this theater for a surprisingly long time, but the end result is inevitable: the problem is eventually discovered – hopefully before it has caused a cascade of new errors.
Saved by the error models?
As humans, we all make mistakes, both in our working lives and in our personal lives. Models of human error can help us determine why errors have occurred in the past, where future vulnerabilities may lie, and what action we could take to make our way of working safer. The real trendsetters here have been the airlines, but as history has sadly shown, no model can completely eliminate the impact of human error. As long as a human being is needed in the system, that person may have a bad day, and we remain vulnerable.
DI / MSc (Eng.) / Doctoral Candidate
Managing Director, Beckhoff Automation Oy