April Fools! When Scientists Get it Wrong


The peer review process is the foundation of scientific advancement. Before a new finding can be published, it must be reviewed by a panel of experts in the field, who assess its scientific merit and point out any critical flaws in the methods, results, or interpretation. It’s what keeps science a quest for truth.

Occasionally, though, a paper that doesn’t meet the standards of scientific integrity slips through the cracks. Whether the problem is accidental data inconsistencies or outright data manipulation, it then falls to the broader scientific community to validate the findings of its peers. But that takes time, and the media storm surrounding a groundbreaking publication may be too vast to reel in by the time the findings are deemed unsound.

In honor of April Fools’ Day, we bring you five stories of published papers whose findings turned out to be untrue or that slipped through the review process by deception.

1. Global Cooling, You Say?

Senator Ted Cruz. By Gage Skidmore via Wikimedia Commons

As always, climate change is primed to be a divisive issue in the 2016 presidential election. Taking the perennial conservative stance, GOP candidate Senator Ted Cruz has already roused debate over global warming, calling those who acknowledge it “the equivalent of flat-Earthers” in an interview with the Texas Tribune.

It turns out, though, that Cruz’s understanding of climate change comes not from the extensive peer-reviewed scientific literature, but from a 1975 article in Newsweek. Written by Peter Gwynne, the article outlined data suggesting that the Earth was cooling, not warming. But in a fascinating 2014 article in Slate, Gwynne admitted the flaws of his own reporting: the global cooling hypothesis, while considered a real possibility at the time, was far more speculative than the article led readers to believe, and advances in climate science have since shown it to be incorrect.

Gwynne’s article remains one of Newsweek’s most cited pieces, despite its lack of scientific rigor and its subsequent discrediting.

2. Fake It ‘til You Make It

When submitting a research paper, researchers are often provided the opportunity to recommend peers whom the publisher may ask to review the article, or to list reviewers who should be avoided due to their personal or professional biases. This allows journal editors to request reviews from experts in the field who may best understand and critique the work while protecting the researcher from unfair critiques by competitors.

It appears a new industry is emerging to exploit this system. The latest scheme came to light earlier this year, when publisher BioMed Central was forced to retract 43 articles published in its journals. The pulled articles, in rapidly evolving fields ranging from oncology to cardiology, were written predominantly by Chinese research groups accused of engaging in a systematic scheme of fraudulent review practices. As detailed in a blog post by BioMed Central’s Senior Editor for Research Integrity, Elizabeth Moylan, the articles had been approved for publication on the recommendation of fictitious reviewers fabricated by third-party agencies. That’s right: authors invented nonexistent reviewers to accept their own publications, and the journals’ editors failed to notice.

According to Retraction Watch, 170 articles have been retracted over faked peer review in the last few years. As the pressure on researchers to publish increases, so too will the pressure to fabricate reviews and mislead the scientific community.

3. A Nerdy Heist

The impact of a retracted study can be far greater when years pass between publication and retraction. Nineteen years passed before Dr. Ten Feizi was able to retract her 1994 Nature paper, “Oligosaccharide ligands for NKR-P1 protein activate NK cells and cytotoxicity.” By that point the article had already been cited by an impressive 255 other peer-reviewed works; by comparison, fewer than 3% of the articles published in Nature in 2002 and 2003 received more than 100 citations.

Feizi first offered to retract her article in 1996, when she was unable to replicate data produced by former lab member Karel Bezouska. Dr. Bezouska refused to sign that retraction, and so the article remained in circulation with only a note of correction. Bezouska, now an independent researcher at Charles University in Prague, has continued his allegedly fraudulent work in astonishing fashion. While under investigation by the university’s ethics committee, he was caught multiple times on security video breaking into the lab to manipulate samples in a refrigerator.

Bezouska has since been removed from his post, and several of his papers have been retracted, but the widespread adoption of his reported findings by other scientists will surely have a lasting impact.

4. The Maize of Truth

Photo: Flickr

The anti-genetically modified organism (GMO) movement was dealt a blow in 2013 when a widely cited study claiming to demonstrate carcinogenic properties of GMO maize was retracted from Food and Chemical Toxicology. The study reported that rats fed herbicide-resistant maize for two years developed more tumors and had shorter life spans, and it had been used as fuel in protests against agricultural GMO giant Monsanto, which developed the maize.

Critics of the study argued that the sample size was too small to rule out the possibility that differences between the experimental and control groups were due to chance, particularly because the strain of rat used in the study is prone to tumor growth later in life. These arguments were strong enough that publisher Elsevier chose to retract the study, striking its findings from the scientific literature. While there was no evidence of impropriety or manipulation of the data, the statistical questions raised after publication meant that no definitive conclusion could be drawn from the study.

Nonetheless, anti-GMO sentiment remains strong in the US and Europe, and more thorough studies will be required to assess the health effects of GMO crops.

5. The Great Vaccine Debate

Photo: CDC

The most infamous example of influential but discredited science was published in 1998 by Dr. Andrew Wakefield, and it has had a lasting impact on public perceptions of medicine long after its 2010 retraction. That study, published in the respected medical journal The Lancet, suggested that the measles, mumps, and rubella (MMR) vaccine could cause autism in children. Needless to say, the media onslaught that followed caused vaccination rates to plummet while researchers scrambled to replicate or disprove Wakefield’s findings.

Wakefield’s article remained an anchor for anti-vaccination arguments until its retraction in 2010, after a British journalist uncovered serious ethical issues in the design of the experiments. Not only had Wakefield received research funding from parents seeking to sue vaccine makers for harming their children, but he also stood to gain financially from a transition to a new version of the measles vaccine. His study also consisted of a mere 12 patients, who were subjected to excessively invasive tests such as spinal taps. Lastly, Wakefield and the health media fell victim to a classic statistical fallacy: a correlation between vaccination and the onset of symptoms does not imply causation.

Wakefield’s study continues to be held up by those who suspect that the pharmaceutical industry used its weight to quiet claims that its vaccines are harmful. Despite numerous other studies demonstrating the relative safety of the MMR vaccine, parents still choose not to vaccinate their children, contributing to a growing incidence of measles in the US.

Check back next month when PhDISH contributor Michael Schreiber digs deeper into the controversy surrounding vaccine safety.

 

As much as we’d like to believe that science uncovers truth, it’s important to remember that the truth does not come easily. The peer review process will remain the cornerstone of the scientific method, providing the checks and balances that protect research integrity. The media, however, operates outside the realm of peer review, making it more important than ever that scientists engage directly with the general public and head off misconceptions such as those discussed here.