Honesty, competence and a willingness to give us all the facts are essential for establishing who to trust, says statistician David Spiegelhalter
You’d be forgiven for not knowing what to believe during this pandemic. Some scientists who say their claims are based on evidence tell us that lockdown is too severe; others say that we relax at our peril. Some argue that masks are of little use, others that they save lives. So, who can you trust?
During a crisis like this one, trust clearly matters. It changes what people are willing to do: whether that be wearing a face covering or getting a vaccine when one becomes available. Since the first lockdown, people have continued to trust scientists, despite their disagreements or changes in official scientific advice.
But knowing who or what to believe is difficult. The philosopher Onora O’Neill tells us that rather than focusing on trust, we should focus on trustworthiness. She argues that trustworthiness is demonstrated through honesty, reliability and competence, and by presenting evidence in ways that make it accessible, intelligible, useful and easily assessed (meaning you can check the workings for yourself, if you so wish).
These principles form the basis for a useful guide for those trying to communicate evidence of all kinds during a pandemic, and for those of us trying to assess what to believe, whether from politicians, scientists or media pundits.
When evaluating whether something is trustworthy, it’s first worth asking whether you feel like someone’s trying to sell you something, or begging for your vote. When the government announces new rules, the evidence for them should be properly explained, and not treated as a means of persuading people how to think or act. The aim should be to inform rather than merely persuade (except perhaps in emergencies where fast action is imperative).
During a crisis like this one, scientists can feel as if they’re in a communications arms race. Those trying to defend the public good against potentially dangerous misinformation can end up denying uncertainty. We saw this early on during the pandemic, when the official line on face coverings was that they were ineffective – rather than admitting there wasn’t enough evidence to evaluate their effectiveness.
Too often, the message is shaped by communications professionals working to ensure the greatest number of people “get the message” rather than thinking about how to present the evidence so the greatest number of people can understand it, trust it and then decide for themselves.
Yet uncertainty is the engine of science, and a sign of knowledgeable humility. John Krebs, the former chair of the Food Standards Agency who dealt with numerous crises such as BSE and foot-and-mouth disease, came up with a useful checklist for science communication in such crises: say what you know, then say what you don’t know; then, having acknowledged the uncertainty, say what research is being done, what people can do in the meantime, and, vitally, that advice will change as more is learned.
When judging whether a source is trustworthy, look to the evidence. If it only shows one side, ask yourself what’s missing and why, as a trustworthy source should present relevant evidence in the round.
That’s not the same as claiming every argument has two equal sides; inviting climate crisis deniers on to panel discussions is not the same thing as achieving “balance”. But if someone is only telling you about the potential benefits of a measure, or citing arguments that support their position, it’s worth asking whether there are any potential harms or evidence that points the other way.
Often, there are difficult tradeoffs involved with decisions – whether about lockdown or vaccines. In an ideal world, these would be clearly set out so we could make up our own minds. Of course, sometimes there’s just too much evidence to do it all justice – but a balanced summary of the pros and cons should be possible.
Those who want to be judged as trustworthy communicators, whether government, media outlets or scientific groups, should carefully consider how they present this evidence if they truly want to help us make up our minds about an issue.
For example, is the presentation of evidence pushing you to feel reassured, or anxious? A poster on the London Underground once proudly declared that “99% of young Londoners do not commit serious youth violence”. It’s a reassuring number – but put the other way around, as “1% of young Londoners commit serious youth violence”, it would have a completely different effect.
Equally, when politicians and scientists refer to numbers, such as daily numbers of Covid deaths, it should be clear whether these are based on reports of death, which are higher on Tuesdays, or the actual day of death, which means recent figures will inevitably be revised upwards. Ideally, evidence would have some sort of star rating.
For example, Sage reported “high confidence” that wearing face coverings outdoors has negligible impact on transmission, but “low confidence” in their estimate that closure of close-contact personal services such as hairdressing and beauty therapy could reduce R by up to 0.05. Sometimes decisions have to be made in the face of little evidence, but it should be clear what evidence there is, and how strong.
The ultimate test of evidence is being able to check it yourself. If people don’t tell you how to drill down into the evidence to find out more, or where to find the data, why not?
Part of what makes humans such an extraordinary species is our ability to learn from the experience and knowledge of others – and to pass on that social and cultural learning across time and space. We don’t all have to experience losing someone to Covid-19 to recognise the virus as a serious threat, and nor do we each have to invent ways to protect ourselves against that threat. We can trust the experience, knowledge and hard work of others.
But in the world of competing sources we inhabit, we need to develop new ways to evaluate who really does have our best interests at heart, and whether they genuinely possess the knowledge and experience they claim.
It’s far better for communicators to be trustworthy from the start – to be honest about the complexities and the uncertainties, open about the tradeoffs and reasoning behind policies – than it is to simplify for the sake of an easy message. After all, there’s no easy path to the truth. But it helps to be able to spot when someone is at least attempting to find it.
• David Spiegelhalter is chair of the Winton Centre for Risk and Evidence Communication at Cambridge University. Alex Freeman is the executive director of the Winton Centre. Michael Blastland sits on the management board of the Winton Centre. Theresa Marteau is director of the Behaviour and Health Research Unit at Cambridge University. Sander L van der Linden is an associate professor of social psychology at the University of Cambridge.