by Tim Noakes
The point, of course, is that not one of the 3 200 participants was ever put on the so-called Noakes diet. Furthermore, the review did not prove that the Noakes diet is dangerous because, in the first place, it did not study the Noakes diet. And, in the second place, there was no difference in any measured outcome between the different dietary interventions. The articles in the press could just as easily have stated that the popular LFHC diet promoted by ADSA and the HSFSA to lose weight and to prevent heart disease is no better than the ‘Noakes diet’ and therefore ‘no more than a dangerous fad’.
When I read the Naudé review and the accompanying press releases from the embedded journalist, my faith in South African nutritional science was shaken. Was this science with an agenda? The agenda being to discredit Tim Noakes and ‘his’ diet so that no one would give any credence to anything he said or any inconvenient questions he raised?
As time went by, I began to wonder if perhaps the Naudé review might also be bogus. My suspicions of foul play only really began to surface in July 2016, when I reread Chris Bateman’s review of the UCT Centenary Debate, published in the SAMJ in February 2013. Now one sentence in particular attracted my attention: ‘According to Steyn and Levitt, the Cochrane Collaboration at the Medical Research Council is due to release a formal review of all existing data on the subject by the end of February 2013.’
If my opponents had been so keen to silence me, and if they believed they had definitive proof that I was wrong in the form of the Naudé review already in February 2013, why was that paper only published in July 2014? Why had they given me those 17 months to further advance my argument, and to publish The Real Meal Revolution? What could possibly have delayed the publication of such a politically charged paper?
Could it be that the paper had originally produced a result that was the opposite of what the authors (and their backers) had desired? If that had indeed happened, then the authors faced an appalling choice: either they publish the truth, which would essentially terminate the attacks on me and the Banting diet, or they modify the paper in ways that would show Banting to be either no better or significantly worse than the ‘balanced’ LFHC diet. The latter could then be spun to the South African public by compliant, embedded journalists as evidence that Banting is a ‘dangerous fad’.
If the Naudé meta-analysis had actually shown that even a poor imitation of a real LCHF diet produced superior weight loss, the campaign to discredit me and Banting would have imploded. Would the HPCSA, which used the paper as the final piece of evidence vindicating its decision to prosecute me, still have been able to justify its actions? And what would have been the response if the media had announced that the ‘balanced’ LFHC diet advocated by ADSA and the HSFSA, and which forms the basis for essentially all government dietary interventions across the globe, is nothing better than a ‘dangerous fad’?
This finding would have been utterly unacceptable to all involved in the study. Too many people, especially dietitians, were already aware that the study was proceeding for it to be shelved. And they were all expecting the Naudé review to finally discredit the Banting diet, and me, once and for all. The only way to save the day would have been to modify the analysis in ways that hopefully would not be detected by the expert referees appointed to review the paper when it was eventually submitted for publication. The authors would require just enough modifications to make the LFHC diet appear to be at least as effective as the ‘low-carbohydrate’ diet, particularly in terms of weight loss. With the assistance of a compliant media and organisations such as the HPCSA, ADSA and the HSFSA, Noakes and his diet would be appropriately discredited and the manipulations would lie undetected. The complicit scientists would all stay loyal to the omertà.
For the next two days, I wondered how I could test my hypothesis that the original analysis of the data in the Naudé review had shown that the ‘low-carbohydrate’ diet outperformed the ‘balanced’ LFHC diet. I realised that I needed to invite an expert in meta-analysis to perform a forensic analysis of the Naudé review. But who in South Africa could be trusted to do such a study without informing on me? Then it occurred to me that one of the world’s leading health statisticians, Dr Zoë Harcombe, was coming to Cape Town in October to serve as one of my three expert witnesses in the HPCSA hearing. She had been one of the most impressive speakers at the February 2015 low-carb summit in Cape Town and had recently completed her PhD thesis, which was based on a series of meta-analyses of all the available evidence in the 1970s and 1980s that might have justified the adoption of the LFHC diet for the prevention of heart disease.13 (As she would testify at the hearing, there was no such evidence. See Chapter 13.)
I emailed Harcombe to ask for her help, and within a few hours she had confirmed her enthusiasm for the task. At her request, I sent her PDF copies of the 14 studies that Naudé and her colleagues had used in their meta-analysis. And then I waited. Within a few days, Harcombe emailed to say that there were ‘problems’ with the paper; within a week she had sent me a draft of a potential article that she suggested we write for submission to the SAMJ.
For the rest of July and August we worked on the paper, submitting it for the first time in early September. After significant corrections at the suggestion of two expert referees, our manuscript was accepted for publication on 29 September 2016. ‘The universities of Stellenbosch/Cape Town low-carbohydrate diet review: Mistake or mischief?’ was published in the December 2016 issue of the SAMJ.14
We identified the following 15 material errors in the Naudé review:
Material errors 1–6
Naudé and her colleagues selected 14 studies15 for inclusion in their meta-analysis; however, four of these should have been excluded: (1) and (2) reported the same data, and so only one should have been included; (3) and (4) failed Naudé and her colleagues’ own inclusion criteria, as the fat content of the ‘balanced’ diet was less than their stipulated 25 to 35 per cent; and (5) did not report results in a manner that could be used in a meta-analysis of average weight loss. This paper, along with the other De Luis study (6), reported participants’ body weights at the end of the trial, whereas the rest reported average (mean) weight loss. This meant that, on a list of average weight losses ranging between 2.65 and 10.2 kilograms, Naudé and her colleagues tried to insert final body weights (of 88 to 91 kilograms) as the data for the two De Luis studies. There was no reference to starting weight – only the participants’ final body weights at the end of the trial were given. Harcombe and I called this ‘absurd’ – not a word to be used lightly in an academic paper, but we could think of no other. Anyone reading the Naudé review should easily have detected these fifth and sixth material errors.
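The incompatibility is easy to see with numbers. A minimal sketch – using invented values in the spirit of the ranges quoted above, not the actual trial data – of why a final body weight cannot sit on the same scale as a mean weight loss:

```python
# Mean weight LOSSES (kg), as reported by most of the included trials --
# invented values spanning the 2.65-10.2 kg range quoted in the text
mean_losses = [2.65, 4.1, 6.8, 10.2]

# Final body WEIGHTS (kg) at trial end, the only figures the De Luis
# papers reported -- a different quantity on a different scale
end_weights = [88.0, 91.0]

# Pooling the two as if they measured the same outcome treats a 91 kg
# final body weight as a 91 kg "loss"
mixed = mean_losses + end_weights
print(round(max(mixed) / max(mean_losses), 1))  # 8.9 -- roughly 9x the largest genuine loss
```

With no starting weights reported, there is no way to convert the De Luis end weights into losses, which is why the values could not legitimately enter the pooled analysis at all.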
Material errors 7–12
The strength of the meta-analysis method is that it assigns different weightings to different studies on the basis, among other criteria, of the number of subjects studied, the duration of the trial and, of course, in this study, the magnitude of the weight lost during the trial. Thus studies with more subjects, which continue for longer and whose subjects lose more weight, attract more ‘weighting’ in the meta-analysis than do shorter trials with fewer subjects who lose less weight.
For four of the studies (7–10), Naudé and her colleagues reported the number of subjects who completed the trial at a time point later than that at which the weight-loss data they included had been recorded. Because fewer subjects completed these trials than had their weights measured at the interim stages, in the meta-analysis these trials appeared to include fewer participants than was actually the case, and hence received a lower weighting. In all four studies, weight loss was greater in the ‘low-carbohydrate’ diet groups. This method of analysis therefore disadvantaged the overall pooled effect for the ‘low-carbohydrate’ diets.
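The weighting mechanics described above can be sketched numerically. Below is a minimal illustration – with invented numbers, not the actual trial data – of how under-reporting a study’s sample size shrinks its weight in a simple fixed-effect, inverse-variance pooling, so that a trial favouring the low-carbohydrate arm counts for less:

```python
def study_weight(sd, n):
    """Inverse-variance weight for one study's mean-difference estimate.
    Variance of a mean difference ~ sd^2 * (1/n1 + 1/n2); equal arms assumed."""
    var = (sd ** 2) * (2.0 / n)
    return 1.0 / var

def pooled_difference(studies):
    """Fixed-effect pooled mean difference (kg) across studies.
    Each study is (mean_diff_kg, sd_kg, n_per_arm); negative = more loss on low-carb."""
    weights = [study_weight(sd, n) for _, sd, n in studies]
    return sum(w * d for w, (d, _, _) in zip(weights, studies)) / sum(weights)

# Invented example: two trials, both favouring the low-carbohydrate arm
correct = [(-3.0, 4.0, 50), (-1.0, 4.0, 50)]

# Same trials, but the more strongly pro-low-carb study's n replaced by its
# (smaller) completer count from a later time point, as alleged of the review
miscounted = [(-3.0, 4.0, 20), (-1.0, 4.0, 50)]

print(round(pooled_difference(correct), 2))     # -2.0
print(round(pooled_difference(miscounted), 2))  # -1.57 -- shrunk toward zero
```

Nothing about the underlying trial results changes; merely entering a smaller participant count for a favourable study pulls the pooled advantage of the low-carbohydrate diet toward zero.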
In addition, in the case of the Wycherley study (10), Naudé and her colleagues claimed that they had used weight-loss data for 52 weeks of the trial, but they had not. Instead, they had used data from 12 weeks of the trial. Using the 52-week data would have favoured the ‘low-carbohydrate’ intervention.
The Keogh study (3) also reported greater weight loss in the ‘low-carbohydrate’ group. However, while the number of completers was counted at the end of the trial, the weight loss was taken from an earlier stage of the trial. In the subsequent meta-analysis, this understated the superiority of the ‘low-carbohydrate’ diet.
Material errors 13–15
The Naudé review reported weight loss in the control group in the Farnsworth study (2) as 7.95 kilograms, whereas the actual value given in the paper is 7.9 kilograms. This change favoured the ‘balanced’ LFHC diet.
Naudé and her colleagues reported that the weight losses in the two diet interventions in the Krauss study (11) were the same, specifically minus 2.65 kilograms. We were unable to locate this data in the original manuscript. Instead, in Table 2 of their paper, Krauss et al. reported weight losses of 5.4 and 5.3 kilograms for the ‘low-carbohydrate’ and ‘balanced’ LFHC diets respectively. Again, this change favoured the LFHC diet.
The slightly greater weight loss in the ‘low-carbohydrate’ group in the Sacks study (4) was wrongly assigned to the ‘balanced’ LFHC group in the Naudé review, and the lesser weight loss in that group was allotted to the ‘low-carbohydrate’ group, a modification that once again favoured the LFHC diet.
Despite the facts – that the Naudé meta-analysis did not review genuinely low-carbohydrate diets of the type that we promote; that the inclusion of isoenergetic studies negated the key benefit of the LCHF diet in reducing calorie consumption by reducing hunger; and that few of the studies used were actually designed to evaluate weight loss as their primary objective – Harcombe and I decided to repeat the meta-analysis using the 10 studies that fulfilled Naudé and her colleagues’ inclusion criteria, and after correcting the 15 material errors.
Despite everything being apparently stacked against the ‘low-carbohydrate’ diet, our finding was absolutely clear: ‘In conclusion, when meta-analysis was performed on the 10 studies that qualified for the inclusion in the study of Naudé et al. using their own criteria, the data confirmed that the lower-CHO [carbohydrate] diet produced significantly greater weight loss than did the balanced diet.’16
Perhaps unsurprisingly, our paper was met with silence. No embedded scientists published articles stating that our re-analysis of the Naudé review had established that the ‘balanced’ LFHC diet is inferior to a low-carbohydrate diet for weight loss. The only journalist to suggest that our re-analysis showed that the low-fat ‘balanced’ diet is a dangerous fad that South Africans should avoid at all costs was the co-author of this book, Marika Sboros. She was also the only one to address the core issue: that the Naudé review might possibly contain evidence of scientific fraud.
Even more disturbingly, at the time of writing in September 2017, neither the universities involved (Stellenbosch, Cape Town and Liverpool) nor the SAMRC have taken any definitive action to determine whether or not the paper is fraudulent. The editors of PLoS One, the journal in which the original meta-analysis was published, have taken more than nine months to act on our request that they decide whether the paper, which appears to be fraudulent, should be withdrawn.
On Tuesday 20 December 2016, I was waiting in line at a local coffee shop when the person in front of me pointed to a newspaper and said, ‘I see you are in the news again.’ He was referring to an article in the Cape Times titled ‘Noakes disputes diet study’.17
It was a report of my and Harcombe’s SAMJ analysis of the Naudé review. For the most part, the journalist quoted directly from our paper. But the final two paragraphs attracted my attention:
In a statement, authors of the original study – Celeste Naude, Anel Schoonees, Taryn Young and Jimmy Volmink of Stellenbosch, and Marjanne Senekal of UCT and Paul Garner of the Liverpool School of Tropical Medicine – said that Harcombe and Noakes’ paper was submitted as evidence in the recent HPCSA hearing into the professional conduct of Noakes.
‘The numerous criticisms of the review in the Harcombe paper were addressed in the hearing during the cross-examination of Dr Harcombe. Dr Harcombe conceded more than seven times that the “errors” she had pointed out in her paper were in fact not material to the findings of the review. Some “errors” were in fact found to be sound statistical methods,’ they said.
For those of us who were actually present at the HPCSA hearing, this was certainly an unusual interpretation. Rechecking the transcripts of Harcombe’s testimony told us that their statement had no basis in fact.
This was the first response from Naudé and her colleagues to our analysis that I had seen. If this was the best they could do, then they clearly had no answers to counter the obvious implication of our detective work.
Harcombe and I spent a few hours the next day preparing a response, which was published in the Cape Times on Friday 23 December.18 ‘The defence [of Naudé et al.] is as poor as the original article,’ we began. ‘The authors have made no attempt to address the 14 [sic] material errors we identified, all but part of one in their favour.’ We pointed out that their use of a duplicate study was alone sufficient to warrant retraction of the paper by PLoS One, that the authors had made no attempt to defend their conclusions, and that the tens of errors were ‘sloppy and unworthy of the esteemed organisations they represent’.
We next pointed out that their claims that our criticisms were addressed during Harcombe’s cross-examination, and that Harcombe conceded that the errors were not material, were simply not true:
Dr Harcombe conceded nothing. She was the one who presented faithfully to the panel which errors were material, which were not, and the consequences of each.
The hearing transcripts confirm that while giving testimony Dr Harcombe used the term ‘material error’ 16 times. While under cross-examination, Dr Harcombe twice dismissed questions by volunteering ‘I did not report this/that as a material error’. The authors’ errors thus remain unaddressed and unexplained. As a consequence, our question also remains unanswered: was this mistake or mischief?
A few weeks later, a News24 article quoted Stellenbosch University as saying: ‘The researchers rigorously applied the international gold standard of research synthesis, namely the Cochrane review process, which lends the greatest level of credibility to their results.’19
The same article quoted Harcombe’s response: ‘Cochrane is a methodology. It needs to be used accurately and honestly and in good faith to achieve the results it can produce. Cochrane methodology should enhance the reputation of a paper. This paper has managed to impair the reputation of Cochrane.’ (Privately, Harcombe was of the opinion that Stellenbosch was really saying: ‘We used Cochrane and we are very experienced, so we cannot have erred.’ This is not, of course, a credible answer for a university that wishes to be taken seriously.)
The article ended with the following promise: ‘Stellenbosch University said a formal response to the points of contention Noakes and Harcombe had raised would be submitted to the South African Medical Journal.’
Two months later, Naudé et al.’s response appeared in the SAMJ in the form of a letter, imposingly titled ‘Reliable systematic review of low-carbohydrate diets shows similar weight-loss effects compared with balanced diets and no cardiovascular risk benefits: Response to methodological criticisms’.20 Unfortunately, once again, the authors failed miserably to address our concerns. They admitted to just one of the 15 material errors – the use of duplicate publications – and ignored the rest. Perhaps they believe that if they do not address them, they will somehow magically disappear. ‘We welcome scrutiny and comments,’ they concluded. ‘Having considered these carefully, we stand by our analysis and results.’
The letter was signed by all six authors, indicating that they are each equally responsible for any errors in the original paper. It is noteworthy that their conflict-of-interest statement indicated that five of the authors ‘work for a charity committed to
using scientifically defensible methods to prepare and update systematic reviews of the effects of health interventions’. None was bold enough to name the charity. It is, in fact, the Liverpool School of Tropical Medicine’s Effective Health Care Research Consortium. One wonders why they didn’t just say so.
Our response was written within a week and appeared in the May issue of the SAMJ.21 We began by stating that it is common cause that the Naudé meta-analysis played a decisive role in my prosecution by the HPCSA, and that in her testimony under oath on 24 November 2015, Claire Julsing Strydom had said: ‘So before any media statements could be made we had to get that information and all these associations were waiting on that … We were all waiting for the evidence to be published.’ We continued:
Another prosecution witness, Prof. H Vorster, referred to the Naude et al. meta-analysis five times and quoted from it verbatim once. A third prosecution witness, Prof. A Dhansay, referenced the meta-analysis twice, using the term ‘Cochrane’ to ensure that it was afforded the appropriate esteem. Without the ‘correct conclusion’ from this meta-analysis, it is possible that the HPCSA trial against Noakes might never have happened. Therefore, the importance of the Naude et al. meta-analysis extends far beyond any role purely as a neutral scientific publication.
We then wrote that if we had realised that disproportionate consideration would be given to this ostensibly innocuous publication in the HPCSA hearing, we would have examined it sooner. Having now submitted it to careful analysis, we found it to be ‘replete with errors’.
Next we traced the tardy response of the authors to our detailed re-analysis, before launching into more ferocious criticisms:
The authors cheaply suggest that we show a ‘lack of understanding’ of their protocol. We understand the Naude et al. protocol only too well. Indeed, we appear to understand it rather better than do its authors: