Valuable lessons engineers can learn about risk from Covid-19

The engineering profession needs to do more to explain to the public how its work is related to science and the part played by ‘unknown unknowns’.

Risk is in the news more than ever at the moment. Our collective behaviour depends on understanding it, but public debate reveals confusion that I believe needs to be cleared up, and which points to an important lesson for engineers.

Risk is at the heart of decision and action because, just as truth is about what we know, risk is about how we act. Truth and risk are concepts which, at once, are both simple and complex. Common sense tells us that truth is correspondence to the facts and risk is the chance of loss. But the more we scratch beneath the surface, the more we find.

For example, facts are true statements – the simple definition is circular. And risk is chance – but what is the chance of an event never experienced before – an unknown unknown like the spread of Covid-19?

There are five issues where media reporting of the current pandemic illustrates confusion around this important question: discussing risk as only a risk of infection; failing to mention consequences (infection followed by death, or not); ignoring context, such as a person’s state of health; failing to distinguish between risk across a population and risk to a specific individual; and, finally, failing to mention ‘unknown unknowns’ and unintended consequences.

The root of risk is fear of the unknown; an understandable human condition. Our perceptions of risk vary greatly in complicated ways. Two of the main deciding factors are the degree to which the risk is familiar and the extent to which we feel in control. Throughout history, people have craved certainty. The growth and success of engineering and technology have, to people outside the profession, reinforced the myth that scientific knowledge is an uncovering of the ‘truth’. What has not advanced so quickly is our understanding of ourselves – how we organise to achieve the things we want.

The ways in which engineers make ‘things’ that work are widely misunderstood by those outside the profession and have served to reinforce the myth. Many people see us as simply ‘appliers of known science’. We engineers have failed to communicate adequately that the relationship between science and engineering is much more nuanced. Theory and practice are interdependent, but the fit between them is natural yet incomplete, and risk lies squarely in that gap.

Formally, risk is the chance that a particular set of conditions (events) will happen in the future leading to further consequences in a stated context. But routinely the four elements – chance, conditions, consequences and context – become blurred or lost. We know that when conditions spell danger or harm then they are a hazard – a trailing wire that someone may trip over, for example. Covid-19 is a hazard of infection.
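The four elements can be pictured, purely as an illustrative sketch of my own framing (not the author’s formalism), as a simple record, using the trailing-wire hazard from the text. The chance figure here is an invented placeholder.

```python
# A sketch of a formal risk statement: chance, conditions, consequences,
# context. The example values are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Risk:
    chance: float        # probability that the conditions occur
    conditions: str      # the hazardous (or beneficial) events
    consequences: str    # what follows if the conditions occur
    context: str         # the stated setting in which this applies

trip_risk = Risk(
    chance=0.001,  # invented figure, not from the article
    conditions="trailing wire across a walkway",
    consequences="someone trips and is injured",
    context="a busy open-plan office",
)
```

Keeping the four elements as separate fields makes it harder for them to become blurred or lost, which is exactly the failure the article describes in media reporting.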

A complex hazard is the difficult-to-spot incubating conditions of an ‘accident waiting to happen’, such as the events leading up to the Grenfell Tower fire. When the conditions spell benefit then they present an opportunity. A chance, such as 1 in 10,000, is the ratio of the number of times an event may occur to the number of possible future scenarios. So, for every 10,000 equally likely possibilities, 1 might occur. This is straightforward when the sample space is finite – such as rolling a die or spinning a roulette wheel. Unfortunately, problems that involve people’s behaviour are complex because people have the propensity to do the unexpected.
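That ratio view of chance over a finite sample space can be sketched in a few lines (an illustration assuming equally likely outcomes, as with a fair die):

```python
# Chance as a ratio: favourable outcomes over the size of a finite
# sample space, assuming all outcomes are equally likely.
from fractions import Fraction

def chance(favourable: int, sample_space: int) -> Fraction:
    """Number of ways the event can occur / number of possible outcomes."""
    return Fraction(favourable, sample_space)

# Rolling a six on a fair die: 1 favourable outcome out of 6.
print(chance(1, 6))        # 1/6
# The article's "1 in 10,000" example.
print(chance(1, 10_000))   # 1/10000
```

The point of the surrounding paragraph is that this tidy arithmetic only works when the sample space is known and finite; human behaviour gives us neither.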

Statistics is based on measuring random variables that demonstrate patterns of behaviour over large populations. Powerful techniques for examining those patterns break down for specific cases. We can calculate the probability of catching Covid-19 within a classified population (people over 50 years old, say) but we cannot be confident that the figure applies to any one individual, such as you or me, simply because that person may have unique health issues.
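The gap between a population figure and an individual can be shown with invented numbers (these rates and weights are assumptions for the sketch, not real Covid-19 data): a weighted average describes the group as a whole, while each subgroup, and each person within it, may sit far from it.

```python
# Illustrative only: invented infection rates for two hypothetical
# subgroups of an over-50 population. The population-level figure is a
# weighted average that need not describe any one individual.
rates = {"healthy over-50s": 0.02, "over-50s with an existing condition": 0.20}
weights = {"healthy over-50s": 0.9, "over-50s with an existing condition": 0.1}

population_rate = sum(rates[g] * weights[g] for g in rates)
print(round(population_rate, 3))  # 0.038 for the group as a whole
```

Here the group figure of 3.8 per cent matches neither subgroup: the healthy majority face far less risk, the minority far more, and any given individual may differ again.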

In recent years we have been forced to recognise the possibility of unknown unknowns and utter surprises. Since we cannot foresee them, they are usually not included in risk assessments. We have no choice but to rely on human judgement. Mathematicians have addressed these difficulties by interpreting probabilities as ‘Bayesian degrees of belief’. These kinds of modelling tools help, but they inevitably depend on our values and belief systems. So, the message is loud and clear – the role of science is to inform decisions, not to make them.
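A probability read as a Bayesian degree of belief is one that is revised as evidence arrives. A minimal sketch, with numbers that are illustrative assumptions rather than real Covid-19 data:

```python
# Bayes' theorem: posterior = likelihood * prior / probability of evidence.
# All figures below are hypothetical, chosen only to show the updating step.

def bayes_update(prior: float, likelihood: float, evidence_prob: float) -> float:
    """P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence_prob

prior = 0.01                      # initial degree of belief in infection: 1%
p_pos_given_infected = 0.90       # assumed test sensitivity
false_positive_rate = 0.05        # assumed false-positive rate
p_pos = p_pos_given_infected * prior + false_positive_rate * (1 - prior)

posterior = bayes_update(prior, p_pos_given_infected, p_pos)
print(round(posterior, 3))  # ~0.154: a positive test strengthens the belief
```

Note what the paragraph stresses: the prior is a judgement. Different starting beliefs give different posteriors from the same evidence, which is why such tools inform decisions rather than make them.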

Engineers do not ‘apply science’ any more than doctors ‘apply biology’. All practitioners use science to inform their decision-making but not to replace it. That is one of the lessons of Covid-19 which engineers must communicate to those outside our profession much more effectively.

David Blockley is emeritus professor of engineering at the University of Bristol and author of ‘Building Bridges’ (World Scientific, £50, ISBN 9781786347626), a book about bridging the gaps between what engineers know, what they do and why things go wrong.