Last year, we had the manufactured controversy about Lynton Crosby and plain packaging. This was a classic attempt to imply fire by creating steam and it has persisted despite there never being a shred of evidence to support it and despite any wrongdoing being explicitly denied by those involved.
Earlier this month we saw the BMJ hatchet job against anyone who has ever dared say anything against minimum pricing. This seemed to be directly inspired by the fake Crosby furore and was aimed at the same outcome—pressuring the government into a U-turn by implying improper access by business interests.
At the weekend, we saw a preemptive strike on the sugar front. The new and inexplicably influential Action on Sugar (formerly Consensus Action on Salt and Health - let's call them Action on Sugar and Salt (ASS) for now) is desperate for the WHO to halve its recommended limit on daily sugar intake. There is virtually no evidence to support this idea. What little evidence exists is of "very low quality", as the NHS notes. Reducing the limit would be a purely political move. It would allow campaigners like ASS to claim that things are twice as bad as they thought.
In addition, the sugar nuts want the UK government to lower its own guidelines on sugar consumption. There is little chance that this will happen. As regular readers know, the garbage spewed out by ASS people like Aseem Malhotra may play well on the news, but it has very little relationship with science, and it is scientists—specifically the Scientific Advisory Committee on Nutrition—who will ultimately make the call. The sugar nuts know this and are preparing their own industry conspiracy theory for when the time comes.
The Sunday Times—which has a habit of raising cranks to the status of experts when it comes to sugar—led the way by reporting that some members of the Scientific Advisory Committee on Nutrition had committed the original sin of being consultants for the food industry.
This was followed by a truly pathetic Channel 4 Dispatches documentary which was like the worst George Monbiot article made flesh. Titled 'Are You Addicted to Sugar?' (despite the only expert interviewed politely confirming that sugar is not addictive), the programme floundered in a sea of innuendo and follow-the-money smears before making the inevitable, spurious comparison with tobacco. I challenge anyone to watch it and not conclude that the scientist from the Scientific Advisory Committee on Nutrition is an honest and thoughtful man while the presenter is not. You can view it here.
A further attempt to close the circle is the growing campaign to prevent industry-funded research appearing in scientific journals. As with so many public health crusades, this began with tobacco and has quickly moved on to other fields. It's only been three months since the British Medical Journal announced that it would not be publishing any more research funded by 'big tobacco', but there are already demands that this ban be extended to the pharmaceutical industry. There are, inevitably, calls for the same to happen to the alcohol industry as well, and Monday's Dispatches also made insinuations against research funded by the sugar industry.
I haven't read books such as Ben Goldacre's Bad Pharma (and probably never will) so I won't comment on the alleged wrongdoings of the pharmaceutical industry, although I am sure they are many and varied. Ditto the tobacco industry, although I am unaware of any examples of scientific fraud funded by them in recent memory (the BMJ ban was said to be the result of unspecified historic sins).
As for the booze and sugar industries, their crimes appear to amount to nothing more than funding studies that tend to disagree with studies funded by their explicit enemies. The Dispatches documentary felt it was enough to say that studies funded by - ahem - 'Big Sugar' were more likely to show that sugar didn't cause [fill in the blank]. Similarly, a staggering piece of political propaganda published in PLoS last year claimed that the alcohol industry was inherently untrustworthy because it tended to disagree with 'public health' campaigners.
In neither case was any thought given to the possibility that there might be biases on the other side, let alone that the industry-funded researchers might be right and the state-funded studies wrong.
If industry-funded studies are in error, surely the best response is to expose the flaws and debunk the evidence. This is almost never done by those who blithely condemn them. Although I can readily believe that biases, not least funding biases, can lead to bad science, it must be shown to be bad science before the source of funding becomes interesting or relevant. In Dispatches, as in the BMJ hatchet job, the competing merits of the evidence were given no thought whatsoever—only the money.
By barring those who have vested interests from engaging in the scientific process, the medical journals implicitly accept that their much-vaunted peer-review process does not, in fact, prevent scientific fraud or spot quackery. It is an admission that the whole system is essentially based on trust. This was made clear in a BMJ podcast recorded at the time of the ban on tobacco-funded research, featuring Fiona Godlee, the BMJ's editor-in-chief:
"I think there's been a dawning realisation of the extent of the bias and research misconduct relating to some of the studies funded by the tobacco industry. And although people in the past might have relied on peer review and also on transparency statements around the fact that the research was funded - leaving the reader, really, to draw their own conclusions - what we do know is that biases and research misconduct are often very hard to detect, and also that the funding can influence the outcome of studies in ways that are invisible to the peer reviewer or the reader."
Godlee went on to discuss the problems of "Pharma-sponsored studies or any self-interested group", as if there were no "self-interested groups" lurking in the hallowed halls of the public health movement. The simple truth is that biases and misconduct that cannot be identified in the case of tobacco-funded research will not be picked up in other forms of research.
There are, to put it simply, biases everywhere. Some are ideological and some are financial, but it has always seemed to me that ideological biases are more pervasive and more dangerous than financial biases, not least because they do not have to be declared. Last night, for example, Mike Rayner tweeted this:
Conflict of interest? On the sugar payroll http://t.co/vNWhkUq9Pe via @FoodNavigator
— Mike Rayner (@MikeRayner) January 21, 2014
You may recall that Rayner believes that he is doing the Lord's work in campaigning for a fizzy drinks tax (about which he also conducts influential research) and so I tweeted back to him...
.@MikeRayner When you declare your conflict of interests, do you mention that you heard the voice of God telling you to bring in a soda tax?
— Christopher Snowdon (@cjsnowdon) January 21, 2014
My (unanswered) response was light-hearted, but it raises a serious point. If Rayner believes that "God is calling me to work towards the introduction of soft-drink taxes in this country", what are the chances of him changing his mind in the face of the evidence? What are the chances of him conducting research that is likely to challenge his belief?
Isn't divine intervention a greater bias than that of someone who receives a consultancy fee from a seller of lemonade? The recipient of a grant from Big Soda can always look elsewhere for work. What is the true believer to do?
A religious belief such as this might seem an extreme example, but other deeply held political, economic or moral ideologies can be just as strong.
I am not, of course, suggesting that researchers be required to list their beliefs as conflicts of interest, nor am I suggesting that Mike Rayner's published research has been corrupted by Big Christianity. The point is that there are biases on all sides of the fence and science exists for us to get beyond the biases to the truth. If you think that research is flawed, prove it. If you think that misconduct has taken place, prosecute it. But don't bar thousands of people from presenting evidence just because you don't like what you think they might say.
Just seen this blog post, which makes a similar point.
An ad hominem attack is typically utilized to compensate for weak evidence behind a proposition someone is trying to advance. It is an obnoxious distraction. The real need is for integrity in research, acknowledging all sources of bias, and addressing inevitable biases through scientific rigor, not ad hominem fallacies.
It’s hard to imagine a worse idea than discouraging public health and nutrition experts from offering their best advice to companies who make the food we eat. But ad hominem attacks on those who do so might have just that effect.