A recent study claimed to find that people who follow a low-carbohydrate diet but eat a chocolate bar each day lose weight faster than those who skip the chocolate. However, the real aim of the study was to test how readily people would believe the claim and how easily it could be pushed into the news and social media, despite the many significant flaws built into the fake study design.
‘A miracle study received significant worldwide media coverage after finding that people following a low-carbohydrate diet lost weight 10% faster if they ate one chocolate bar every day. The problem? The real aim of the study was to see how easy it would be to get bad science into the news [1].’
So how successfully did this deeply flawed “study” make it into the media? It turns out it made headlines in some well-known news outlets.
‘It appears as though the project was a resounding success. The findings of the study were reported by newspapers such as Bild in Germany – Europe’s largest daily newspaper – The Daily Mail and The Daily Express in the UK, websites such as the Times of India and the Huffington Post and television shows in the US and Australia. The study was submitted to 20 journals and was ultimately published by the International Archives of Medicine. Backed with a very helpful news release, the team sent word out about the study and swiftly news outlets began to cover the research [1].’
Despite the attention the study received, its flaws were serious enough to make the results meaningless. For one, there were only 16 participants, and the study ran for just 3 weeks. With so many measurements recorded on so few participants, it was almost inevitable that the study would turn up something that could be labelled statistically significant.
‘Unfortunately, there were a number of key flaws with the study that were significant enough to render the results largely meaningless. Notably, only 16 volunteers participated in the study, which ran for just 3 weeks. The researchers also assessed 18 different measurements, planning to base their story around whichever provided them with a statistically significant result [1].’
With such a small number of participants and such a wide variety of measurements being taken, the researchers had a 60% chance of finding something that could be described as “significant” and turned into the cornerstone of their story for the media [1].’
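To get a sense of where a figure like 60% could come from, here is a minimal sketch. It assumes, purely for illustration, that each of the 18 measurements is tested independently at the conventional 0.05 significance level; the study itself does not report these details, so the numbers below are assumptions rather than its actual analysis.

```python
# Illustrative sketch (not the study's actual analysis): treat each of the
# 18 tracked measurements as an independent statistical test at an assumed
# 0.05 significance threshold, and ask how likely it is that at least one
# comes out "significant" purely by chance.
n_measurements = 18   # outcomes the researchers tracked
alpha = 0.05          # assumed per-test significance threshold

p_at_least_one_false_positive = 1 - (1 - alpha) ** n_measurements
print(f"Chance of at least one spurious 'significant' result: "
      f"{p_at_least_one_false_positive:.0%}")
# Prints roughly 60%, in line with the figure quoted above.
```

Under those simplifying assumptions, the more outcomes you measure, the more likely it is that sheer chance hands you a headline-ready result.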
The paper has since been removed from the International Archives of Medicine website. The take-home message of the sham study is summed up brilliantly:
‘”John Bohannon claims he ‘fooled millions into thinking chocolate helps weight loss.’ But he may have directly fooled only a few – not millions. And those few – whom I will politely call ‘journalists’ – did the rest of the fooling for him. And they do it all the time, gobbling up crumbs from a steady diet of weak, hyped studies.”
There are countless studies such as this that are produced every day and reported on by the media. Journalists should be careful to check studies and news releases to make sure that what they are reporting is accurate and informed.
Unfortunately, as this study proves, sometimes the lure of an eye-catching headline and provocative findings can be enough to distract from shortcomings that can be obscured by numbers, symbols and highly specific language.
This story should serve as a cautionary tale, both to readers and reporters, reminding us all to be extra discerning when it comes to evaluating the worth of studies that appear in the news [1].’