Misleading news claims: an interview with Professor Chris Chambers

17th Jul 2019

"Claims of causality in health news: a randomised trial"

Led by Professor Chris Chambers and Professor Petroc Sumner at the School of Psychology, Cardiff University, this is an exciting and much-needed trial on how misleading news claims can be detrimental to public health.

In collaboration with nine UK press offices, the controlled trial used press releases as ‘participants’, with the aim of improving the alignment between causal claims and evidence without losing news interest.

Here at the British Neuroscience Association (BNA), we caught up with Professor Chris Chambers to find out more about the trial and its outcomes, as well as ask him for some critical advice for researchers.


Exaggerated science news is associated with exaggerated press releases

More accurate university press releases = more accurate news

Having cautious and precise press releases does NOT make them boring or unreportable 

Tips for scientists

Take control of your press release

Avoid even subtle exaggeration

If the study was in animals then say so

Don’t offer advice that isn't in the original article

Don't allow causal relationships to be drawn from correlations

Don’t be afraid to include caveats and limitations 

Include a “What this study does NOT show” section

Don't automatically blame journalists for exaggeration; find out where it comes from! 

Q. Can you give us a brief outline of the trial?

In our previous research, published in the BMJ in 2014, we found that most exaggeration in a subset of health-related science news was already present in the press releases issued by the UK universities that conducted the research. We then found similar results for press releases issued by journals, and our results were replicated by a study in the Netherlands.

What this told us is that press releases are a likely source of error and bias in news reporting, but all of the work up to this point was retrospective and observational, and so we couldn’t say whether exaggerated press releases actually cause exaggerated news.

To test this, we decided to run a randomised controlled trial.

We experimentally manipulated press releases prior to them being issued and then measured the effect of those changes on the news. This is quite challenging to do, both logistically and politically, so we worked closely with press offices around the UK to make it possible.

The results are a little complex, but our main finding was that more accurate press releases are associated with more accurate news, with no signs that accuracy reduces news uptake.

This is good news for scientists and press offices because it suggests that we can make press releases more cautious and precise without making the research boring and unreportable by journalists.


Q. What was the main driver for the decision to run this trial?

Going right back to 2011, our original motivation for pursuing this line of work was some extraordinary misreporting of our own research, and an ensuing (and very public) debate in 2012 about what, if anything, scientists can do to improve the quality of science journalism.

That debate took many twists and turns, but one of the key lessons I learned is that scientists have an opportunity — and a responsibility — to get their own houses in order when it comes to press releases, because press releases are usually under the control of the scientist.

If we exaggerate our own public relations materials, even unconsciously, I believe we forfeit our right to lay the blame for exaggerated news reporting at the feet of journalists.

The trial is the culmination of this line of research in which we sought to find out whether interventions that seek to improve the quality of press releases would lead to better quality news. 


Q. Were the outcomes what you were expecting?

They were broadly in line with what I thought would happen based on our earlier work. If we improve quality standards in press releases, I believe we will see a substantial and broad improvement in science news reporting.


Q. Can you give any advice and tips for BNA members and researchers on how they should go about engaging with their University Press Office?

My most important recommendation to researchers would be to keep a very careful eye on the language you use in your press releases to avoid even subtle exaggeration.

If the study was in animals then say so – don’t allow your press release to go out if it implies that your animal study was run on humans.

Also, don’t offer advice to readers that wasn’t in the peer-reviewed article.

And don’t allow exaggeration of inference. If your study was observational or correlational (e.g. “red wine consumption is associated with elevated cancer risk”), don’t allow a press release to be issued that states or implies a causal relationship (e.g. “red wine increases risk of cancer”). It’s very easy for exaggeration to creep in automatically, often unintentionally, when we simplify language.

In the same vein, don’t be afraid to mention caveats and study limitations in your press releases. Across several studies we have found no evidence that caveats dampen news interest, but we do find that caveats – when included in press releases – are likely to be included in news stories.

Finally, if your research is prone to misinterpretation or misrepresentation, I would suggest including a “What this study does NOT show” section in your press release to head off the most probable bad takes. We don’t have any evidence yet that such information is effective, but based on what we do know, I believe it should help. 


Q. Do you have any advice on how people might take this piece of research and apply it to real-life scenarios?

As a reader, when you pick up a piece of science news that you suspect is hyped or misleading, don’t automatically assume that the hype was introduced by the journalist. A lot of the time, the exaggeration will have been introduced by the researchers and the journalist simply repeated it.

This kind of zombie reporting is bad and we shouldn’t be letting journalists off the hook even if the scientists are exaggerating. But at the same time, it’s important not to assume that the media is solely to blame for bad science reporting.

If you want to find out where hype originated from, track down the press release (many are published) and compare it with the news story. You’ll find it illuminating.

A huge thank you to Chris for his time and, especially, his vital advice for researchers above. Please do read more about the trial here.

Plus, the BNA is launching our own campaign, ‘Credibility in Neuroscience’, one of our most important programmes to date, to drive open, transparent and reproducible research. If you’d like to help shape the future of neuroscience, take a few minutes to take part in our Credibility Survey. We’d love to hear what you know, or don’t know, about reproducible, credible and open neuroscience.

