Valuing risks to life and limb

Donald Miller (ScottishPower), Lisa Gahan (ICS Consulting), Joanna Goulding (Highways Agency), and Martin Grey (NDA) look at the highly emotive topic of evaluating risk.

In the wake of the Japanese tsunami, one news article was particularly striking for engineers of a certain age: a retired engineer in Japan offered to work at the Fukushima plant. His very logical argument was that the risk of him contracting cancer as a result of radiation exposure was extremely low because he would probably be dead before any tumour could reach a life-threatening stage. He could, therefore, do work that would be considered too high risk for younger workers. Does this mean the value of his life was lower than that of the average employee? Is it reasonable to extend his logic to all risks, or is this a special case?

One of the toughest tasks for many of us is comparing investment options that have the remote potential to save a life with those that have the high potential to prevent any number of injuries. These decisions raise a number of questions: would you spend less on reducing risks for a worker than for a member of the public? Is a child’s life worth more than a pensioner’s? And how do you make that case without appearing cold and calculating, or emotional and irrational? We turned to two organisations to help us find the answers.

Highways Agency

The Highways Agency is responsible for the Strategic Road Network in England, which comprises around 4,340 miles of all-purpose trunk roads carrying some 86 billion vehicle miles of traffic. Joanna Goulding, from the Road Safety and Risk team, discusses how they manage risk.

In 2009, there were 11,322 road-user accidents on our network, which resulted in 232 deaths, 567 serious injuries and 16,045 minor injuries. For the Agency, a life is a life and, when considering safety measures, we draw no distinction between the life of a road worker and that of a road user. This means that when we get to cost-benefit considerations, we use the same value for preventing a fatality, and we refer to the Department for Transport’s WebTAG advice for the correct figure.

The Agency’s responsibilities for road-user safety are set through the Government’s road safety policy for all roads in the UK, which includes the Highways Act and the Road Traffic Act along with supporting legislation. However, the Agency’s legal responsibilities extend well beyond road users: we also have a duty to road and construction workers, traffic officers, national vehicle recovery operatives and other third parties. The Agency’s legal responsibilities to these populations are defined under the Health and Safety at Work Act (HSWA).

In implementing its Safety Management System, the Agency identifies the main factors driving safety risk, along with the proportion of that risk which can be controlled or influenced directly through design, engineering or operation. This is where the distinction between legal duties and responsibilities becomes key, and we must ask: what, if anything, must, could or should the Agency do to address the causes of accidents?

The Agency interprets its duty and responsibility to road users as having to do what is “reasonably required” to manage, operate, repair and maintain the trunk road network. For other populations, activities or indeed situations falling under the HSWA, risk is managed using the So Far As Is Reasonably Practicable (SFAIRP) principle and, in practice, this is achieved by applying the As Low As Is Reasonably Practicable (ALARP) principle.

“Reasonably required” is similar to SFAIRP or ALARP in that it allows us to examine risk capacity and tolerance, and to weigh benefits against costs. However, unlike SFAIRP and ALARP, it does not require risk-mitigation measures to be implemented right up to the point where their cost becomes grossly disproportionate to the benefit – “reasonably required” allows for, and takes account of, the Agency’s available budgets and other duties when considering safety measures. Economic appraisal and cost-benefit analysis (CBA) aid the decision-making process by giving monetary values to the costs and benefits, allowing safety risk-reduction measures to be compared on a consistent basis.
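To show how CBA puts competing safety schemes on a consistent monetary footing, here is a minimal sketch. Every figure in it is invented for illustration – the value of preventing a fatality (VPF), the injury value, and both schemes are hypothetical, not Agency or WebTAG numbers:

```python
# Hypothetical cost-benefit comparison of two road-safety schemes.
# All monetary values and casualty estimates below are illustrative.

VPF = 1_800_000  # £ per statistical fatality prevented (invented figure)
VPI = 200_000    # £ per serious injury prevented (invented figure)

def benefit_cost_ratio(cost, fatalities_prevented, injuries_prevented):
    """Monetised benefit of a scheme divided by its cost (BCR > 1 pays back)."""
    benefit = fatalities_prevented * VPF + injuries_prevented * VPI
    return benefit / cost

# Scheme A: barrier upgrade; Scheme B: junction redesign (both invented).
bcr_a = benefit_cost_ratio(cost=5_000_000, fatalities_prevented=2.0,
                           injuries_prevented=10.0)
bcr_b = benefit_cost_ratio(cost=1_200_000, fatalities_prevented=0.5,
                           injuries_prevented=4.0)

print(f"Scheme A BCR: {bcr_a:.2f}")  # 1.12
print(f"Scheme B BCR: {bcr_b:.2f}")  # 1.42 -> better value per pound spent
```

The point of the sketch is that once casualties carry a monetary value, the cheaper scheme can be shown to deliver more safety benefit per pound, which is exactly the comparison “reasonably required” asks for under a fixed budget.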

So when the Agency considers road users, it is from the perspective that they themselves have a responsibility for their own safety. The Agency expects that they will take the road as they find it, driving in accordance with prevailing conditions and in accordance with the law.

In a nutshell, the fact that an accident occurred on a road that the Agency is responsible for does not mean that we should, or indeed could, have prevented it. Analysis of accident data tells us that, in the vast majority of cases, human factors are responsible for causing accidents and not the road or highway infrastructure.

Organisationally, we can and should design our roads to be safe for lawful use and take appropriate measures to reduce the possibilities of accidents. The only way to identify what is reasonably required and appropriate is to apply some form of CBA and this must therefore be informed by applying a monetary value to the cost of preventing a fatality.

Nuclear Decommissioning Authority

The Nuclear Decommissioning Authority (NDA) is responsible for the decommissioning and clean-up of the UK’s 20 civil public sector nuclear sites. Engineering and Project Validation Manager, Martin Grey, outlines its risk assessment and mitigation processes.

Our remit is to deliver accelerated decommissioning and clean-up programmes safely, securely and cost effectively, in ways that protect the environment for this and future generations. This involves hundreds of projects across operational as well as decommissioned facilities – of varying age and condition – dealing with different waste streams. Across these programmes, we use competition to drive improvements in contractor performance and deliver best value to taxpayers.

One of our key roles is to provide and maintain a consistent approach to investment prioritisation across our sites. Each operator uses a risk matrix which maps risk factors against likelihood and impact scores. We haven’t set tolerability levels for any of the risks; however, the NDA has developed a tool called the Value Framework to provide a consistent basis for framing decisions without being prescriptive.

Contractors rank potential projects by the benefits of investment in terms of their impact on accelerating the decommissioning and clean-up programmes presented within the value framework. This prioritisation process allows each contractor to assess the following important factors:

- the rate at which the “hazard potential” from the radioactive material can be reduced over time

- the ongoing safety of a building or facility and the uncertainty about the materials being stored, and how these will change over time

- the impact a facility or amount of stored waste would have, over time, on the environment if the material was left untreated

- the costs involved, including any change in the ongoing cost of running and maintaining a facility.
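One way to turn factors like these into a ranking is a simple weighted-scoring model. The sketch below is purely illustrative – the factor weights, project names and scores are all invented, and this is not the NDA’s actual Value Framework:

```python
# Illustrative weighted-scoring prioritisation over the four factors listed
# above. Weights, projects and scores are invented for the example.

WEIGHTS = {
    "hazard_reduction": 0.4,      # rate of reducing hazard potential
    "facility_safety": 0.3,       # ongoing safety / material uncertainty
    "environmental_impact": 0.2,  # impact if material were left untreated
    "cost_change": 0.1,           # change in running/maintenance costs
}

def priority_score(scores):
    """Weighted sum of 0-10 factor scores; higher means invest sooner."""
    return sum(WEIGHTS[factor] * s for factor, s in scores.items())

projects = {
    "Legacy pond retrieval": {"hazard_reduction": 9, "facility_safety": 8,
                              "environmental_impact": 7, "cost_change": 4},
    "Store refurbishment":   {"hazard_reduction": 3, "facility_safety": 6,
                              "environmental_impact": 4, "cost_change": 8},
}

ranked = sorted(projects, key=lambda p: priority_score(projects[p]),
                reverse=True)
for name in ranked:
    print(f"{name}: {priority_score(projects[name]):.1f}")
```

Note that nothing in such a scheme is a monetary value of life: it only orders investments relative to one another, which matches the article’s point that the NDA measures relative performance between sites rather than setting an acceptable spend per unit of risk.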

We do not have explicit policies that state the acceptable level of health and safety risk for occupational or radiological hazards – that is a matter for the safety and environmental regulators and site licensed contractors (Controlling Mind). Within the decommissioning programme, there is an implicit declining risk profile which the investments are designed to accelerate, but there is currently no acceptable spend defined for any level of risk or rate of reduction.

It could be argued that the NDA’s environment is much simpler than a commercial business: with no competitors and no market pressure to set or adjust tolerable risk, there is only a relative performance measure between sites. There is also just one overall objective, namely decommissioning.

Should site licence companies identify the potential for injuries or worse, their remit to manage them comes under the HSWA. In practice, the industry approach to safety cases pushes “fatality events” into the incredible zone, requiring multiple levels of failure. For example, on our sites, the risk of a road traffic accident (RTA) is such that speed limits of 20mph are enforced. Even so, the calculated risk of an RTA resulting in near-fatal injuries is higher than that posed by the Fukushima event, which resulted in no fatalities but raised concerns over the potential health effects.

Public and state perception can drive a two-speed approach to health and safety risks. We know that a news story reporting that two staff died in a car crash at a nuclear site would have a different reception to one reporting that two staff were contaminated in a nuclear release. Therefore, we consider it unnecessary to set a spend target to avoid a fatality. Instead, we prefer to consider the mitigation of risk on a case-by-case basis, against a consistent regulatory and prioritisation framework.

From a purely economic point of view, we should value all health and safety expenditure in much the same way as any other aspect of expenditure: that is, in terms of the costs and benefits it delivers to ensure value for money is maximised. This, in part, means attributing monetary values to injuries and fatalities. Over the years, numerous techniques have been developed and applied in a variety of settings and some of these are summarised in Figure 1.

Figure 1

It is not surprising that, with such a large variety of techniques, there is a wide range of valuations. Capital valuation techniques are usually the ones that result in big numbers – tens of millions of pounds are not uncommon. In general, the UK government has adopted the “cost of injury” approach, with some measure of human cost added in. This is toward the lower end of the spectrum with estimates between £1.5m and £2m.

The cost-of-injury and quality-adjusted life year (QALY) techniques are commonly used to work out the optimal allocation of resources, including health and safety related expenditure – and this works well for most industries. But, as the case studies show, the tolerable levels of risk for industries are not all the same. Why is this the case?
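A cost-per-QALY calculation of the kind alluded to here can be sketched as follows. The interventions and their costs are invented, and the £30,000-per-QALY cutoff is simply an example threshold, of the order NICE has historically cited for health spending:

```python
# Illustrative cost-effectiveness screen: fund an intervention if its cost
# per QALY gained falls under an example threshold. All data are invented.

THRESHOLD = 30_000  # £ per QALY gained (example cutoff, not a policy figure)

interventions = {
    "Measure X": {"cost": 600_000, "qalys_gained": 40},
    "Measure Y": {"cost": 900_000, "qalys_gained": 20},
}

for name, d in interventions.items():
    cost_per_qaly = d["cost"] / d["qalys_gained"]
    verdict = "fund" if cost_per_qaly <= THRESHOLD else "do not fund"
    print(f"{name}: £{cost_per_qaly:,.0f}/QALY -> {verdict}")
```

The arithmetic is trivial; the contested part, as the rest of the article argues, is whether everyone involved will accept the threshold – and that is where rationality and consistency break down.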

The use of economics assumes that decision makers consider all costs and benefits in a consistent, formalised assessment, and then use this to make reasoned investment decisions about the allocation of resources. This also assumes everyone – government, organisations and the public – is rational and bound by the same set of values. This is where the logic falls down because rationality and consistency cannot be guaranteed.

We all know the big risks we face – smoking, obesity and cancer to name a few. They kill so many people that it should be front-page news every day. But front pages are dominated by the new risks we’d never thought of before, such as shark attacks, swine flu, terrorism and mad cow disease. We are not very tolerant of new risks, so we over-react – and over-allocate resources to manage them.

But it’s not just novelty that drives our tolerance to risk. The list of “tolerability factors”[1] also includes:

- catastrophic potential – the more that can be potentially impacted, the more we fear it

- personal control – we all feel in control when we are driving because we are operating the vehicle, even though we can’t control other road users

- impact on children – especially our own!

- voluntariness – did we choose to expose ourselves to risk?

- ability to identify with the victim – one is a tragedy, a million is a statistic

- trust in the institutions involved

- history – how easily we can recall other incidences

- whether it’s man-made or natural.

A united front

Economics is important in our decision making – we need to spend our money wisely. But given the nature of our industries, the confidence of the public is paramount and we need to recognise that in our decisions.

This means that the tolerable level of risk is not the same across industries. The nuclear industry ticks so many of the tolerability factors that it is clear its tolerable level of risk is extremely low. The highways industry ticks far fewer, so it is likely to be much more comfortable in applying economics in a rational manner.

[1] Adapted from Dan Gardner, Risk: The Science and Politics of Fear
