
Acknowledging human factors in adopting new technologies

Updated: May 11, 2021



The second lecture of the Exploring Industry Experience course by Professor Stephen Wright focused on how various human factors, rather than technical ones, can be critical in causing mistakes in the workplace. During the lecture, I started thinking about how these human factors should be taken into account when organizations adopt new solutions, as that is the focus of my doctoral studies.


The lecture focused mainly on the aviation industry, where the stakes are of course extremely high. The incentives for avoiding mistakes are similarly high, and many government regulations also force these organizations to do their best to ensure passenger safety. Yet accidents still happen.


Even though many other industries are not as life-critical as aviation, lessons can certainly be learnt from it and transferred to other contexts. For example, the role of leadership in setting the tone on safety issues is perhaps one of the most important factors. If top management does not take an issue seriously, why would anyone else give it much thought?


Several innate human biases can create problems for organizations. For example, people are prone to cutting corners to save time and money. Over time, this can lead to a series of small and large mistakes. These factors have serious implications for organizations when they start using new technologies and systems, as the chance of errors is especially pronounced when people are dealing with something that is new to them.


Special attention needs to be paid to how the employees will actually use the new system, not how it would work in some ideal scenario. In the lecture, Professor Wright highlighted how simple external factors, such as the weather, can scuttle very basic safety procedures.


For example, organizations often instruct employees to carry a manual with them to support their tasks, but a rainstorm can make using the manual impossible. Employees will then either postpone the task or, more likely, disregard the manual and try to complete the task from memory, which often leads to errors. These errors can then build up over time without anyone noticing them. Eventually, this can lead to big problems or even catastrophes.


People also, by and large, prefer the status quo. It is therefore not sensible to deride them for this fundamental behavior; instead, it is better to try to understand how it will affect technology adoption. For example, employees have built up skills and routines with the old systems that might be incompatible with the new system and cause mistakes.


Testing how these behaviors manifest in the context of the new system can then help spot the biases and fundamental errors that people are likely to make with it in the long term.


However, these kinds of testing situations likely cannot be implemented realistically in every context, such as the nuclear industry. Perhaps virtual reality could offer an intriguing solution for allowing people to test new systems (especially in dangerous and costly contexts) while also allowing them to fail without serious consequences in the real world?



- Henri Jalo, Doctoral Researcher

