I have observed several activities where the activity’s manager, as a risk management practitioner, has rightly been made responsible for leading the identification and control of the uncertainties that could impact on their activity’s objectives. However, it has sometimes been painful to watch a senior manager point the finger menacingly when the team failed to spot each and every uncertainty.
I often find that this is where hindsight bias collides with risk management efforts, and results in a misunderstanding of what a risk management practitioner can achieve.
To identify, assess and control the risks to an activity, the manager invites other stakeholders’ viewpoints to build a consensus on the likely risk profile and exposure. This leverages the wisdom of crowds.
However, the collective intelligence of an assembled throng will have its limits. As the then United States Secretary of Defense, Donald Rumsfeld, famously observed, there are only so many eventualities that can be predicted:
“Reports that say that something hasn't happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don't know we don't know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones.”
The difficulty in assessing a risk manager’s performance fairly is determining what the unknown unknowns were. Despite a team’s hard work to reach a collective consensus about which risks exist and need to be controlled to optimise the objectives of the activity it is supervising, I have heard senior managers tell the team that it should have spotted an issue that caught everyone by surprise (i.e. it should have been identified and controlled as a risk before it materialised) because, even though there was little or no objective basis for forecasting it, the event was supposedly predictable.
Yes, occasionally events occur unexpectedly, and after-the-event analysis shows that they could have been better anticipated. However, in my own experience, this is rarer than a stakeholder seeking to allocate blame (rather than extract learning – a whole essay for another day!) by claiming that they would have predicted the uncertainty because they “knew it all along”.
In one of my favourite Steven Spielberg movies, “Jurassic Park”, the mathematician Ian Malcolm (Jeff Goldblum in fine form) famously cites chaos theory’s assertion that small changes in complex systems can have big, unpredictable effects.
We may be able to plan for different eventualities (which often leads to another troubling consequence: managers succumbing to the illusion that they are in control), but we cannot thereby determine the future. It is hard to predict very far ahead with any success. At every level of the management hierarchy, we have to expect things to fail, and we have to be ready to change.