Cognitive biases in risk management

Imagine: on your way to the office you see an accident happen right in front of you. A truck leaves a construction site and hits a pedestrian. Later, in the office, you attend a risk assessment meeting. The meeting focuses on the operational risks of loading a vessel at a terminal. One of the risks you bring up is that traffic management on site might not be regulated sufficiently, potentially leading to collisions between vehicles and people. This phenomenon – judging a situation by the examples that come to mind most easily – is called availability bias.

People often like to think that they are rational and logical, and that they approach situations as such. In reality, we are all subject to various cognitive biases that influence our thinking and reasoning; availability bias is just one of many. These biases influence risk management as well, which means the identification and evaluation of risks may not be as accurate as we would like. Is that a bad thing? Not necessarily – only if you are unaware that you are subject to those biases.

This article explains three cognitive biases and then gives tips on how to work with them during risk management. The following cognitive biases are explained:

  • Availability bias;
  • Optimism bias;
  • Confirmation bias.

Availability bias

Availability bias influences our thinking and reasoning by ‘pre-selecting’ the thoughts that come to mind – the thoughts that are readily available on a specific topic. The bias makes you think that these ‘pre-selected’ thoughts are more important or representative than they actually are: if you can recall them so easily, they must be important. Right? Think of the example of the truck hitting the pedestrian, which immediately popped up during the risk assessment meeting. Was that really a relevant risk in this situation, or were others more important? As a result of availability bias, people judge situations based on the most recent information or news they have received.

Optimism bias

The optimism bias colours your thoughts as well. In fact, it is considered one of the most prevalent and consistent biases of all. It causes us to underestimate the chance that something bad will happen to us – to believe we are less likely to experience something negative than reality would show. Conversely, it makes us overestimate the chances of positive events – ever bought a lottery ticket?

Interestingly, the underestimation of negative events tends to be stronger than the overestimation of positive ones. This matters: underestimating the chance of a negative event results in preventive or mitigating measures not being implemented, or not sufficiently. Overestimating your luck, on the other hand, might lead people to think they do not need their safety harness if they only have to work at height ‘for just a minute’.

Whereas you might have rated traffic management on site as a high risk (availability bias), another person in the meeting might downgrade that same risk as not relevant and not particularly serious (optimism bias).

Confirmation bias

Believing what you want to believe – that is confirmation bias. Put more precisely, people tend to search for, recall and interpret information in a way that confirms their existing beliefs. They are selective in what information they remember, how they interpret it and what information they accept, while information that contradicts their beliefs is (deliberately or not) overlooked or rejected.

In the discussion during the risk assessment, you want to raise the risk of vehicles colliding with other vehicles or with individuals. To support this stance, you bring up an incident investigation about a truck colliding with another vehicle (confirmation bias). At the same time, you may ignore the accident statistics for this specific site – which show that no such accidents have occurred so far – and the detailed traffic management plan, because you consider them inaccurate or incomplete.

How to work with these biases

So, people are not purely rational and logical; we are all subject to various cognitive biases that influence our reasoning. What can we do to ensure that these biases do not influence our risk assessments too much?

 

  1. Start by being aware that your reasoning is almost always influenced one way or another. Simply knowing that your reasoning – and that of others – is not completely objective helps you interpret the information in the risk assessment.
  2. Make risk assessment a group responsibility. Every participant is subject to their own biases, and discussing the various threats and opportunities as a group sheds a different light on each topic.
  3. Risk evaluation (for example using a risk matrix) can help prioritize the various risks. However, ensure that not only the highest-rated risks but also the lower-rated ones have sufficient prevention and mitigation measures. If biases have skewed the ratings, you can at least be sure that sufficient measures are in place should those events occur.

 

Have you ever noticed such biases in your own work?

For more information on risk management, do not hesitate to contact us: info@quattorp.com, www.quattorp.com

Final note:

All examples given are purely illustrative. The purpose of this article is not to downgrade the risk of trucks colliding with other vehicles, nor to claim that traffic management will always be effective. Again, think for yourself and make a well-considered, thorough analysis to ensure that sufficient preventive or mitigating measures are in place.