The quality of communication – whether internal or external – makes or breaks a company. For risk managers in particular, it is an essential tool, key to the recognition and understanding of risks and the facts that underlie them. Communicating numbers is particularly tricky, so we speak to David Spiegelhalter, Winton Professor of the Public Understanding of Risk at the University of Cambridge, to identify the challenges of conveying messages about risk responsibly, and to examine the issues of misinformation, trust, and uncertainty in today’s world.
What are the challenges of communicating risk in a way that informs rather than manipulates?
Most people when they communicate numbers, in particular, tend to be trying to persuade somebody of something. Often, they’re trying to reassure people or to frighten them. So many risk news stories are trying to make us anxious about what we might eat, or what we might be exposed to, or whether there are criminals in the neighbourhood. So it is a real challenge to not manipulate people’s emotions with the message.
Of course, we want people to be interested, so we are trying to manipulate them to the extent that they want to read or take notice of your message. But we don’t want to unduly reassure or frighten them. This means being aware of the cognitive processes, the heuristics people use, and how the framing of a number makes a big difference.
I think I can make any number look big or small, depending on how I tell the story. For risk, this is particularly true. People in the US are told that open-heart surgery has a 2% mortality rate, and that a doubling of the risk means the mortality rate increases to 4%. In the UK, we say that there is a 98% survival rate, which sounds better. And even somebody with double the normal risk still has a 96% survival rate, which sounds fine.
So we know that we can change people’s impressions by the way in which numbers are presented. The challenge is to do it in a balanced way, one that neither reassures nor frightens. We could say, for example, that out of 100 operations like this, 98 people would survive and 2 would not. We could show a graphic representing both statistics, making sure that equal emotional weight is given to both the good and the bad news.
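The arithmetic behind these framings can be made concrete. Below is a minimal sketch – using the illustrative figures quoted above, not real clinical data – showing how one outcome statistic can be expressed as a mortality rate, a survival rate, or a natural frequency that gives equal weight to both outcomes; the function name and structure are assumptions for illustration:

```python
def framings(mortality_rate: float, denominator: int = 100) -> dict:
    """Express one outcome statistic in three equivalent framings.

    mortality_rate is a probability in [0, 1], e.g. 0.02 for the
    2% open-heart-surgery figure quoted in the interview.
    """
    survival_rate = 1.0 - mortality_rate
    deaths = round(mortality_rate * denominator)
    return {
        "mortality framing": f"{mortality_rate:.0%} mortality rate",
        "survival framing": f"{survival_rate:.0%} survival rate",
        # A natural frequency states both the good and the bad news.
        "natural frequency": (
            f"out of {denominator} operations like this, "
            f"{denominator - deaths} people would survive and {deaths} would not"
        ),
    }

print(framings(0.02)["natural frequency"])
# → out of 100 operations like this, 98 people would survive and 2 would not
print(framings(0.04)["survival framing"])
# → 96% survival rate
```

All three lines describe exactly the same underlying risk; only the emotional weighting differs.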
In recent years, the term “alternative facts” was coined, which is especially dangerous for an industry that relies on facts. How can facts prevail? Is the answer in communication?
Misinformation is a big problem in today’s society, but it has always been a problem. People have always manipulated what they have told you, and in particular, they have manipulated numbers to make them look big or small. It’s a bigger problem now because, with social media, there is direct access between the providers of this information and the audience. The information is not necessarily up to even the questionable standards of newspapers or TV news. And the messages can be propagated much faster around the population.
Many bodies are trying to counter misinformation, and it is difficult. But you have to think about both the supply side and the demand side. You can look at the communication – what is said and how it is said – and try to clean that up. But you also have to train people and encourage them to question what they’re being told, to look for and cross-check the information they are getting, and to go to reputable sources.
In terms of supplying trustworthy information, well, it’s everyone’s responsibility. That’s why the idea of trust is incredibly important. I, personally, have been hugely influenced by Baroness Onora O’Neill, a Kantian philosopher from Cambridge, who has written and said wonderful things about trust.
The first thing she says is that any agency providing information should not want to be trusted. That’s the wrong objective. You should not want to be trusted. Instead, what you should want to do is to demonstrate trustworthiness, because that is within your control.
There are all sorts of ways to define trustworthiness. One of the things Baroness O’Neill emphasises is ‘intelligent transparency’, because it is a very important aspect of trustworthiness that the information is transparent. She defines intelligent transparency as meaning that the information you give should be accessible (people should be able to get at it easily), comprehensible (people should be able to understand it), usable (it should fit people’s needs), and – this final one is great – assessable. In principle, if people want, they should be able to check the workings and see where the claims came from.
This list of qualities – accessible, comprehensible, usable, and assessable – is incredibly important. I try to use it in all my work, and I think it’s something that everyone can take on board.
Are companies taking this on board already?
I’m sure that companies have their own principles – honesty, openness, and transparency, perhaps. But Baroness O’Neill is an extraordinarily insightful person who boiled it down to these principles, which are increasingly being promoted in government. People have been very influenced by this – for example, the Office for National Statistics. The new code of practice for national statistics has three pillars, and the very first one is trustworthiness. So that is considered the most important aspect of statistics. Baroness O’Neill is also very influential in the open data world, trying to make information available to people in a way that informs rather than manipulates.
In the uncertain world we live in, how can risk managers make sense of uncertainty and communicate their understanding of it effectively?
I think this is a really central problem. In order to be trustworthy, we need to be upfront about uncertainty.
And uncertainty in itself is an extremely complex issue. There are lots of sources of uncertainty. It may be because you don’t understand what’s going on; it may be a lack of information; it may just be the fact that it’s about the future, and very often in risk management, it really is about the future. And the future is always uncertain.
Of course, it doesn’t have to be all about the future; you might be uncertain about what’s going on at the moment. We can be uncertain about facts, numbers, and science, which is my particular area of research, and that is a big enough source of uncertainty even before we get to making predictions.
The first thing we need to do is to acknowledge uncertainty, and admit that we don’t know. This is probably the biggest step of all – to be honest with ourselves about our ignorance. In other words, to avoid delusions and confirmation bias – the tendency to make up our minds too fast and then seek information that validates them.
In our research, we’ve been looking at all sorts of ways in which people can break down uncertainty about facts, numbers, science, and the future, and communicate it in a transparent way. There are some general rules that have been found in psychological research. For example, using words to communicate uncertainty – although it may be necessary – can be extremely misleading, because in different contexts people interpret the same words in very different ways.
So various organisations are now trying to pin words to a rough magnitude. For example, when the Intergovernmental Panel on Climate Change says “likely”, it means that the probability is at least 66%. Other people are taking this kind of scale on board, so when an event is described as “likely” or “extremely likely”, it has more credibility because there is a consensus on what these words mean.
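This kind of calibrated language can be sketched as a simple lookup. The thresholds below follow the IPCC's published likelihood scale; treat the function itself and its handling of boundary values as an illustrative assumption rather than any official implementation:

```python
# IPCC calibrated likelihood terms: each term covers probabilities
# of at least the stated threshold (e.g. "likely" means >= 66%).
IPCC_SCALE = [
    (0.99, "virtually certain"),
    (0.90, "very likely"),
    (0.66, "likely"),
    (0.33, "about as likely as not"),
    (0.10, "unlikely"),
    (0.01, "very unlikely"),
    (0.00, "exceptionally unlikely"),
]

def likelihood_term(p: float) -> str:
    """Map a probability to the IPCC's calibrated wording."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must be a probability in [0, 1]")
    # Scan from the most certain term downwards and return the
    # first threshold the probability meets.
    for threshold, term in IPCC_SCALE:
        if p >= threshold:
            return term
    return "exceptionally unlikely"

print(likelihood_term(0.7))   # → likely
print(likelihood_term(0.95))  # → very likely
```

The point of such a table is exactly the consensus the interview describes: once everyone agrees that “likely” means at least 66%, the word carries a number with it.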
But we can’t always put numbers on our uncertainty. Sometimes it just comes from realising that our evidence isn’t very good, and that we have an incomplete understanding of what is going on. This is another important area being developed, in which people have been creating scales that represent the degree of our understanding of a topic and the quality of evidence underlying any claim that we make. Many scales have been developed in lots of different areas, and people have been trying to bring these together into a more coherent framework.
So for climate change, there is a measure of confidence in the science behind various claims. In health, there is a four-star rating method that grades the quality of evidence underlying your claim. Similar scales have been developed in education, crime policy, and so on.
I think this is a terribly important area, because it means that although we can’t always put numbers on things, in some situations our understanding and evidence are better than in others. It is a little bit like a TripAdvisor for our evidence.
What is the risk of being 100% certain?
The lack of humility is the most dangerous thing of all – to think you know something when in fact you don’t. Probably every man-made disaster that has happened to humanity can be put down to people not having the humility to realise that they didn’t know as much as they thought they did.
It’s massively important to have that understanding about what we don’t know. That’s true about facts and science, but it’s also true about the future. Endless experiments and predictions have shown that people who do have that humility and who can interrogate their own knowledge are the ones who can make the most reliable judgements.
If uncertainty is admitted, will it mean that people won’t trust us?
This is a real anxiety that many people have. If I say ‘I don’t know’, people might say ‘what kind of expertise is that? I’ll talk to someone who does know’ (or someone who claims they know). We are actively exploring this in our research: can we admit uncertainty without losing trust and credibility? This remains a vital issue in our age of misinformation and overclaiming.
We’ve been carrying out various experiments – with online panels, individuals, and interviews with those who use statistics – to find out. The evidence so far suggests that if you are confident and assertive about your uncertainty, and able to present numbers or a scale, there doesn’t seem to be any loss of trust in the source of the information. People rely on the number less, which they should, but trust in the source remains.
I think I would trust somebody more if they admitted that they didn’t know. For example, when driving in Europe, my Google assistant was extremely untrustworthy when it directed me, with absolute confidence, to drive down a flight of steps in Portugal. But on another occasion, it admitted that it did not know what to advise. I felt slightly abandoned, but it was better than being directed in completely the wrong direction. I think this is a great example of trust being unaffected when the advisor admitted it didn’t know.
Our experiments so far suggest that what we call a ‘muscular uncertainty’ (or unapologetic uncertainty) does not lead to a decrease in trust in the source. So to answer the original question, no, admitting uncertainty does not necessarily lead to distrust, but communicating it in an assertive manner is important.
David Spiegelhalter is Chair of the Winton Centre for Risk and Evidence Communication, and works to improve the way in which risk and statistical evidence are taught and discussed in society. He gives many presentations to schools and others, advises organisations on risk communication, and is a regular commentator on risk issues. He presented the BBC Four documentaries ‘Tails You Win: The Science of Chance’ and the award-winning ‘Climate Change by Numbers’. He was elected FRS in 2005, awarded an OBE in 2006, and was knighted in 2014 for services to medical statistics. In 2011 he came 7th in an episode of Winter Wipeout.