We understand that a media audience is attracted by hyperbole and drama, but this is not what science is about.
Press releases about scientific findings need to be treated with caution and skepticism, because a crucial part of the scientific process is that findings must be reviewed and replicated. Who remembers the claims of nuclear fusion at room temperature, so-called “cold fusion”, back in 1989? It was widely acclaimed in the media when first announced, but disappeared later that same year when the claims couldn’t be verified by other researchers repeating the work. The media had rushed in without due skepticism and without waiting for verification.
“Science” covers a huge range of fields and overlaps with medicine, information technology, occupational health and safety, etc. Not even formally qualified science journalists will have the knowledge to sensibly cover them all.
We offer these suggestions in the hope that they will lead to better science journalism:
General
- Journalists are supposed to be skeptical about what they are told and to seek the truth. There’s no reason not to apply this to science.
- Science is very competitive and the number of papers published each year has exploded. This sometimes leads to exaggeration in order to be noticed.
Press releases
- Think of them as advertising for a scientific paper. They rarely tell the whole story.
- Don’t be reluctant to ask questions or request a copy of the paper. The people who sent you the release will be pleased that you are showing interest.
- Regardless of what the press release says, the rest of the scientific field needs time to consider the paper and pass judgement on it.
Implications and assumptions
- The expression “Scientists say” implies all scientists, but this is almost never the case. Make clear how many scientists you are talking about: give a number, or say that it was a single team.
- Even “Scientists at XYZ say” implies that every scientist at XYZ agrees.
- There are few certainties in science. Every idea is provisional and might be replaced with an idea that better explains what’s been observed.
- Don’t assume or imply that the findings of one paper are the absolute truth.
- Even in medicine it’s not unusual for some people to react differently to a drug than most other people, so don’t imply that any new discovery promises a cure for everyone.
- It can take time for negative side-effects to become clear — think thalidomide.
Correlation alone proves nothing
- Claims based on correlations are common in press releases and are potentially misleading. The issue is which (if any) is the cause and which is the consequence.
- If A and B correlate then, without further information, it’s not obvious whether A drives B, B drives A, or something else drives them both. For example, the fact that sunrise correlates with a rooster crowing a few minutes earlier doesn’t mean the rooster causes the sun to appear. (The sketch after this list shows a hidden common cause at work.)
- Claims of causation need to be supported by a plausible explanation of how the cause produces the effect.
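For readers comfortable with a little code, here is a minimal sketch (Python, with invented numbers, not taken from any study) of how a hidden common cause can make two unrelated quantities look tightly linked:

```python
# Minimal sketch with invented numbers: A and B correlate only because a
# third, hidden factor drives them both. Neither causes the other.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hidden cause, e.g. daily temperature.
hidden_cause = rng.normal(20, 5, size=1000)

# A and B each depend on the hidden cause plus their own noise.
a = 2.0 * hidden_cause + rng.normal(0, 2, size=1000)   # e.g. ice-cream sales
b = 0.5 * hidden_cause + rng.normal(0, 1, size=1000)   # e.g. sunburn cases

# Prints a strong correlation (around 0.9), even though changing A
# would do nothing to B.
print("correlation between A and B:", np.corrcoef(a, b)[0, 1])
```

Selling less ice cream would obviously not prevent sunburn; the correlation reflects the weather, not a causal link between the two.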
Verification
- Is the data available, so others can repeat the analysis?
- Is the analysis itself described clearly enough that others can follow the steps and repeat it?
- If these conditions are not in place, then how can anyone independently verify the claim? It’s like having a single witness who might be unreliable or mistaken.
Claims need evidence
- Claims need evidence, otherwise they are just opinions.
- Don’t assume that evidence exists to support popular ideas — don’t be afraid to try to check for it.
- Work should be judged on its evidence and argument, not by the character or possible motivations of the author (financial, ideological, reputational and so on). In science, even an axe-murderer might come up with the right ideas.
Peer review means little
- Peer review was intended as a check that the science is solid and to prevent too much repetition in the literature. However, it has also become a tool for supporting some work and rejecting other work for nefarious reasons: perhaps a paper conflicts with the reviewer’s own views and undermines their reputation or pay packet, or perhaps the reviewer wants to take credit for the idea and is racing to publish their own paper.
- Most scientific journals request that the authors of papers suggest possible reviewers — what author is likely to suggest someone who will reject it?
- In 2017 an account surfaced of a paper’s author creating a fake email address and fake biography for the supposed owner, then suggesting that the person at that address would be an appropriate reviewer. Yes, he reviewed his own paper.
- The website http://www.retractionwatch.com has examples of papers that passed peer review and were published, but were later found to be incorrect and withdrawn.
Consensus means nothing
- Opposing a popular notion doesn’t necessarily mean the person is wrong.
- The majority of scientists have believed nonsense in the past: that heavier-than-air powered flight carrying humans was impossible, that stomach ulcers were caused by stress, that phlogiston explained combustion, that eugenics was sound science. They will probably be wrong again at times in the future.
- Even if you’re pretty sure an idea is wrong, it’s worth checking the evidence for and against it. It’s likely that the researcher knows more than you do about the subject.
- Science is not a democracy. The hypothesis that best accounts for what is observed is what matters. Thousands of people supporting a wrong idea don’t defeat an individual with the correct idea.
- Breakthroughs have often come from rejecting popular ideas and trying something different.
Models are often misused
- Models are based on assumptions, sometimes implicit assumptions that are hard to notice.
- Models deliberately simplify reality, but sometimes the simplification causes significant errors. If the understanding of a situation is incomplete, then so are the models built on it. (The sketch after this list shows how a single assumption can dominate a model’s output.)
- The output of models is never evidence per se but more like “Based on our current knowledge this is what would happen.”
- Most models are computer software, so calling them “computer models” says almost nothing. Use their proper names: climate models, epidemic models, and so on.
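As a toy illustration only (these are invented numbers and deliberately crude formulas, not a real climate or epidemic model), two projections built from the same starting data can diverge wildly because of a single assumption:

```python
# Toy sketch, invented numbers: two projections of "cases" from the same
# starting point, differing only in one assumption about whether growth
# slows as more of the population is affected.

def unlimited_growth(start, rate, days):
    """Assumes growth never slows; this simplification fails over time."""
    cases = start
    for _ in range(days):
        cases *= 1 + rate
    return round(cases)

def limited_growth(start, rate, days, population):
    """Assumes growth slows as the susceptible pool shrinks."""
    cases = start
    for _ in range(days):
        cases += rate * cases * (1 - cases / population)
    return round(cases)

print(unlimited_growth(100, 0.2, 60))                    # several million
print(limited_growth(100, 0.2, 60, population=50_000))   # just under 50,000
```

Neither number is evidence about the real world; each simply spells out the consequences of its own assumptions.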
Percentages or numbers?
- Be very cautious about using percentages, especially when they relate to change.
- A medical treatment that improves a person’s response to a disease by 25% is almost meaningless if only 1 in 10 million people suffer from the disease (see the sketch after this list).
- A 100% increase in the annual number of complaints is meaningless if it means an increase from 1 to 2.
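The point is easier to see with actual numbers. A minimal sketch (invented figures) showing that the same kind of percentage statement can describe very different absolute changes:

```python
# Invented figures: the same kind of percentage statement can describe
# very different absolute changes.

def describe_change(before, after):
    absolute = after - before
    relative = 100.0 * absolute / before
    return f"{before} -> {after}: +{absolute} ({relative:.0f}% increase)"

# A 100% increase that is trivial in absolute terms.
print(describe_change(1, 2))             # 1 -> 2: +1 (100% increase)

# A much smaller percentage that represents a far bigger change.
print(describe_change(10000, 12500))     # 10000 -> 12500: +2500 (25% increase)
```

Wherever possible, report the underlying counts alongside (or instead of) the percentage.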
You don’t have to be an expert
- It doesn’t require an expert to say “Your claims don’t explain what the data shows” or “Your predictions didn’t come true.”
- You don’t need an alternative theory in order to reject an idea that is wrong. For example, trying to fly by strapping on a pair of wings didn’t need a replacement theory before anyone could say it was a flawed idea.
- Criticisms need to be supported by evidence, which will sometimes be obvious and sometimes not.
Tips on reading a scientific paper
- Read the Abstract, because it’s a summary.
- Read the Introduction to see what the paper is about and (probably) why the work was done.
- Read the Conclusion to see what was discovered and what it might mean.
- Papers usually have a designated author for correspondence, so feel free to contact them.
Further reading
What Every Journalist Should Know About Science
What Every Journalist Should Know About Science and Science Journalism