Fantastic. He shows the mistakes — how one of the most important magazines could publish a paper as fraudulent as this.
Photograph by Frans Lanting, National Geographic
Published October 3, 2013
A cancer drug discovered in a humble lichen, and ready for testing in patients, might sound too good to be true. That's because it is. But more than a hundred lower-tier scientific journals accepted a fake, error-ridden cancer study for publication in a spoof organized by Science magazine.
The fake study points to a "Wild West" of pay-to-publish outlets feeding off lower tiers of the scientific enterprise by publishing studies without any appreciable scrutiny, say research ethics experts. (See "Who's Afraid of Peer Review?")
Some 8,250 "open-access" scientific journals worldwide are now listed in a directory supported by publishers. Unlike traditional science journals that charge for subscriptions or fees from those wishing to read their contents, open-access journals make research studies free to the public. In return, study authors pay up-front publishing costs if the paper is accepted for publication.
"From humble and idealistic beginnings a decade ago, open-access scientific journals have mushroomed into a global industry, driven by author publication fees," says journalist John Bohannon, writing in the Science magazine report of his survey-style spoof of review practices at such journals.
The cover of Science magazine.
Image courtesy Science/AAAS
"The goal was to create a credible but mundane scientific paper, one with such grave errors that a competent peer reviewer should easily identify it as flawed and unpublishable," Bohannon says. Of 255 open-access journals that said they would review his study, 157 accepted the fake study for publication. "Acceptance was the norm, not the exception," he writes.
Science Spoofs Not New
Spoof studies intended to spotlight problems with individual journals and their review practices have made news before. New York University physicist Alan Sokal spoofed the cultural studies journal Social Text in 1996 with a crackpot physics treatise. And last month, Serbian academics spoofed a Romanian journal with a similarly ludicrous data-processing paper.
But the Bohannon study, which claimed to have discovered a cancer-fighting, lichen-derived drug ready for immediate testing on patients, represents a first systematic test of review practices, or their absence, across many journals at once, says research ethics expert Nicholas Steneck of the University of Michigan in Ann Arbor.
"The public wanted open access to scientific literature, and now they are getting it," Steneck says. "They now need to get over the idea that they can get all that information for free without someone doing the real hard work of reviewing papers."
Cancer Study Faked
The spoof study should have swiftly failed acceptance by "peer" reviewers at the science journals. Peer-reviewed science journals are supposed to publish papers only after a panel of two or three anonymous experts judges their acceptability for publication.
To test these reviewers, Bohannon submitted versions of his study to 304 open-access journals over the course of the year. The name of the cancer, lichen, and drug in each version was essentially picked out of a hat, along with an equally random, made-up name and institution for an author situated in an African capital.
Only 255 journals responded.
The journals tested were ones with relevant medical or biological titles for a cancer study, such as the European Journal of Chemistry or the International Journal of Cancer and Tumor (the latter edited by a "Grace Groovy," according to correspondence with Bohannon). Most appear to be headquartered in India and the United States.
The spoof study had at least three problems:
The study drug killed cancer cells with increasing doses, even though its data didn't show any such effect.
The drug killed cancer cells exposed to medical radiation with increasing effect, even though the study showed the cells weren't exposed to radiation.
The study author concluded the paper by promising to start treating people with the drug immediately, without further safety testing.
"If the scientific errors aren't motivation enough to reject the paper, its apparent advocacy of bypassing clinical trials certainly should be," Bohannon writes.
Peer Review Missing
But in many cases, it appears the study wasn't peer-reviewed at all by the journals that responded to the spoof submission. Many of the reviews were just requests to format the study for publication. And of 106 journals that performed any review, 70 percent accepted the study.
"If a bogus paper is able to get through peer review, think about how many legitimate, but deeply flawed, papers must also get through," says Michael Eisen of the University of California, Berkeley, a founder of the Public Library of Science family of open-access journals.
One of those journals, PLOS One, was the only journal of the 255 that received the spoof to flag its ethical problems and "meticulously" review the bogus study before rejecting it.
Although Eisen applauds the reviewers at PLOS One, he says of the spoof, "in all honesty, I think it is kind of a general indictment of peer review."
Instead he thinks scientists should move to a process of massive peer review after a study is released, a movement now pursued among physicists who widely upload draft versions of their papers to an online archive prior to journal review and publication.
The University of Michigan's Steneck, however, sees the spoof as exposing an onslaught of shoddy journals and bad studies cluttering the scientific literature. "I don't think it is an indictment of peer review as an idea, but rather shows how hard it is to get right," he says.
Cute, Clever Hoax
"These aren't really science journals pointed out by this very cute and clever hoax; they are more check-cashing operations," says Stanford University study-design expert John Ioannidis. "I don't think that open access is the problem either. I think you would see the same problem with the lower tier of traditional peer-review journals as well."
In fact, the hoax may show that scientists are caught in a double bind when it comes to publishing studies, Ioannidis says. His own research shows that the most prestigious science journals, ones important in hiring decisions for researchers, demand outsize effects from the studies they publish. That may lead to study authors subconsciously biasing their statistics to make a startling discovery worthy of a big-name journal.
Science, which has defended traditional publishing models against open-access efforts, is not without its own blemishes. The journal published a 2004 paper claiming the first cloning of human stem cells that turned out to be faked. And in 2010, it published a study of "arsenic life" microbes that researchers initially believed were using poisonous arsenic in their metabolism.
The paper generated a firestorm of criticism, and was largely refuted in 2012. Later investigation revealed that the paper's original peer reviewers had loved the 2010 paper and largely missed its flaws.
At the same time, a study that is honestly conducted, but doesn't offer headline-making revelations, may end up buried in journals with bad reputations due to the same kinds of shoddy peer-review practices pointed out by the hoax. Then the researchers don't receive grants or jobs offered to others.
The post-publication review advocated by Eisen is no panacea, Ioannidis argues, because researchers aren't rewarded for such reviews and therefore won't do them.
"I don't think there is any one solution," Ioannidis says. "The scientific community has to come up with solutions that reward good studies. It's an ongoing fight."
Follow Dan Vergano on Twitter.
Look, every time I see a $40.00 fee for a journal article I must decide whether the article is worth reading or whether the abstract will suffice. Many times the knowledge gained is not worth the funds spent. Many times the abstract is a teaser, and the journal article is a waste of reading time and funds.
Many in the public arena cannot afford a $40.00 fee every time they want to gain knowledge. Many of us did not attend high-end academic institutions. Many of us do not receive a free subscription paid for by an employer. Nonetheless, we need the knowledge to help us explore and answer the questions posed by our everyday, mundane lives.
The majority of the publishers who have the power to impose a fee do so, which limits access and thus limits knowledge. In some cases you have to become a member of an association, and membership may be denied because you do not have the pedigree. Again, knowledge is denied.
Not too sure what this article is trying to address:
Is it the phony drug journal article, or is it the fact that the document was made available for the public to read freely without a fee being charged?
Is it the fact that the article was not peer reviewed?
Is it the fact that the Ivy League journals make a huge profit from their advertisers and then have the ability and power to charge for the article to be downloaded by the masses?
While there may be some circumstances that justify a $40 fee, the moral and ethical aspects of refusing to release the document for the public to read without an enormous fee are the real story. It is time for all journals to consider and offer open access. Times are changing. Knowledge denied to the public is knowledge lost.
Having published in peer-reviewed journals for about 30 years, and still active in the process, I have always appreciated both the process and the feedback. Certainly, there have been some frustrations. The blame can be shared at many levels; however, as Cassius said to Brutus, "The fault, dear Brutus, is not in our stars, but in ourselves, that we are underlings" (Julius Caesar). Perhaps individual honesty, integrity, transparency, and a few other ethical and moral principles need not only to be aggressively taught in our schools at all levels, but also to be practiced by society and the individual. These principles are not an infringement upon individual rights or freedom of speech; their neglect is the result of a society that is becoming morally bankrupt, greedy, and self-centered. The mirror is always a good place to start when looking for the dirt. Yet part of being human is the frailties that come with the experience: I too must be careful of the "mote" in my own eye. Peer review and special interest are an age-old problem, with the digital age increasing its complexity. My opinion of the article and some of the reviews: perhaps frightening, yet not new to the academic community. The good thing is that we recognize it and are attempting to make corrections. To me, that is the redeeming face of humankind.
The results of this study are not surprising. We are bombarded by emails from online journals that promise quick publication. I found it very useful that the widespread suspicion of a lack of scientific rigor has now been demonstrated with empirical data.
1. Exploiting this "soft underbelly," one can easily build a scientific curriculum; if we consider that research evaluation may take the number of published papers into account as a parameter (e.g., this happens in Italy), there is a high risk of distortion.
2. I believe that possible remedies are: (i) free researchers from the obsession with quantity (of papers...); (ii) give more value to journals that combine transparency in selection procedures with quality control and full access to the data (https://sites.google.com/site/openingsciencetosociety/).
yeah cuz what we do now is so good for us. only 50% of our population is currently expected to get cancer during their lifetime. science is mocking real prevention (nutrition, toxins, stress etc.) and alternative cures because all research is paid by whoever makes money out of us being sick. it is disgusting.
Hubris from self-anointed academics and scientists will come back to bite them, once it's discovered that lichen and moss are indeed a cure--it all comes down to karma, doesn't it?
Maybe, maybe not. To trust the mainstream cancer establishment is foolish beyond words. I have not really read the article; I just know not to trust the mainstream cancer establishment. They are a money-making racket that loves to treat cancer, but they aren't so keen on healing it. And as for preventing cancer, forget about it.
WHERE THE FAULT LIES
To show that the bogus-standards effect is specific to Open Access (OA) journals would of course require submitting also to subscription journals (perhaps equated for age and impact factor) to see what happens.
But it is likely that the outcome would still be a higher proportion of acceptances by the OA journals. The reason is simple: Fee-based OA publishing (fee-based "Gold OA") is premature, as are plans by universities and research funders to pay its costs:
Funds are short and 80% of journals (including virtually all the top, "must-have" journals) are still subscription-based, thereby tying up the potential funds to pay for fee-based Gold OA. The asking price for Gold OA is still arbitrary and high. And there is very, very legitimate concern that paying to publish may inflate acceptance rates and lower quality standards (as the Science sting shows).
What is needed now is for universities and funders to mandate OA self-archiving (of authors' final peer-reviewed drafts, immediately upon acceptance for publication) in their institutional OA repositories, free for all online ("Green OA").
That will provide immediate OA. And if and when universal Green OA should go on to make subscriptions unsustainable (because users are satisfied with just the Green OA versions), that will in turn induce journals to cut costs (print edition, online edition), offload access-provision and archiving onto the global network of Green OA repositories, downsize to just providing the service of peer review alone, and convert to the Gold OA cost-recovery model. Meanwhile, the subscription cancellations will have released the funds to pay these residual service costs.
The natural way to charge for the service of peer review then will be on a "no-fault basis," with the author's institution or funder paying for each round of refereeing, regardless of outcome (acceptance, revision/re-refereeing, or rejection). This will minimize cost while protecting against inflated acceptance rates and decline in quality standards.
That post-Green, no-fault Gold will be Fair Gold. Today's pre-Green (fee-based) Gold is Fool's Gold.
None of this applies to no-fee Gold.
Obviously, as Peter Suber and others have correctly pointed out, none of this applies to the many Gold OA journals that are not fee-based (i.e., do not charge the author for publication, but continue to rely instead on subscriptions, subsidies, or voluntarism). Hence it is not fair to tar all Gold OA with that brush. Nor is it fair to assume -- without testing it -- that non-OA journals would have come out unscathed, if they had been included in the sting.
But the basic outcome is probably still solid: Fee-based Gold OA has provided an irresistible opportunity to create junk journals and dupe authors into feeding their publish-or-perish needs via pay-to-publish under the guise of fulfilling the growing clamour for OA:
Publishing in a reputable, established journal and self-archiving the refereed draft would have accomplished the very same purpose, while continuing to meet the peer-review quality standards for which the journal has a track record -- and without paying an extra penny.
But the most important message is that OA is not identical with Gold OA (fee-based or not), and hence conclusions about peer-review standards of fee-based Gold OA journals are not conclusions about the peer-review standards of OA -- which, with Green OA, are identical to those of non-OA.
For some peer-review stings of non-OA journals, see below:
Peters, D. P., & Ceci, S. J. (1982). Peer-review practices of psychological journals: The fate of published articles, submitted again. Behavioral and Brain Sciences, 5(2), 187-195.
Harnad, S. R. (Ed.) (1982). Peer commentary on peer review: A case study in scientific quality control (Vol. 5, No. 2). Cambridge University Press.
Harnad, S. (1998/2000/2004). The invisible hand of peer review. Nature [online] (5 Nov. 1998); Exploit Interactive 5 (2000); and in Shatz, B. (Ed.) (2004), Peer Review: A Critical Inquiry. Rowman & Littlefield, pp. 235-242.
Harnad, S. (2010). No-Fault Peer Review Charges: The Price of Selectivity Need Not Be Access Denied or Delayed. D-Lib Magazine, 16(7/8).
Vergano's "study" would have been significantly strengthened by adding a control group. This whole article is founded on a giant untested assumption that "traditional" journals would have rejected the spoof paper. Building this "study" on a hunch instead of rigorously and empirically testing that assumption shows that this article wasn't worthy of publication either. I mean, seriously - you submitted your spoof article to over 300 open-access journals but could not be bothered to simultaneously submit it to 100 "real" journals to see whether your conjecture is actually true or not?
Did none of the NatGeo editors point out this massive, obvious flaw in Vergano's article which, in my opinion, makes it unpublishable? NatGeo's choice to publish this piece is as big an indictment of NatGeo as it is an indictment of the subset of open access journals which are clearly shams.
Why is the focus on being published, and not on the research itself?
The use of citations for tenure, pressure from the corporations that fund the research, and the acceptance of skewed results as part of the process: all of these need oversight and more rigorous standards.
There somehow needs to be a veering away from individual achievement and more of a focus on the goal itself.
...tlachichinolli(N)=consumed, destroyed by fire,=tlachichinoaxiuitl(N)=plant that grows on rocks; soak it in water, then drink it to relieve burning of mouth and stomach; also put on sores and itch, e.g., mange/sarna(sp),=t/lachichin(N)=lichen(E).
I agree that the failure to submit this to high-quality peer-reviewed journals, on the assumption that it would have been rejected, is a very serious flaw. On the other hand, good peer-reviewed journals have two features that make this assumption a reasonable one. First, they tend to have high rejection rates. I served as editor of a peer-reviewed journal in psychology; our rejection rate was 85-90%, and only one of over 3,000 papers we received during my term was accepted without the need for revisions. Second, good journals attract very good reviewers, and the flaws noted here would almost certainly be spotted by a high-quality journal.
There is an ethical issue that probably makes it difficult to submit this paper to genuine peer-reviewed journals. The peer review process is generally a voluntary one. Reviewers put in substantial time and effort and generally get nothing in return. Submitting a paper in bad faith (i.e., one that is designed to have flaws that should preclude publication) abuses a system that depends fundamentally on the professionalism and hard work of literally hundreds of persons per journal (during my term, I maintained an active list of over 500 people who did peer reviews for our journal, none of whom received any concrete reward for doing so).
The proliferation of journals that are, as the author notes, largely check-cashing operations will, in the end, put even more pressure on researchers to publish in peer-reviewed journals. At one time, any publication was recognized as having some value, but in the current environment, it is hard to pay serious attention to papers that are not published in journals with a solid peer-review system. This probably means that some good papers published in lower-tier journals do not get the attention or respect they deserve, but the profusion of vanity presses and journals that will publish anything for a fee has the undesired effect of making open-access publication less and less useful to scientists.
This is an important debate, but there are incorrect statements here. It is not true, for example, that we can say that "open-access journals make research studies free to the public. In return, study authors pay up-front publishing costs if the paper is accepted for publication." Of the journals listed in DOAJ, well under half take payment. The others are supported by scientific societies, universities etc. I discussed this and the mistakes of the Science article at The Guardian today, cf. What Science — and the Gonzo Scientist — got wrong: open access will make research better http://bit.ly/1f5JAzi
A seemingly biased publication from the AAAS, which has very much to lose if we ever find a way of freely sharing scientific knowledge that is often publicly funded. It's ironic that this publisher gets away with this biased publication, as in no conflicts of interest declared: general and administrative expenses... net assets, end of year, $115,175,000...
They do a good job of highlighting the flaws and abuses in some open-access publishing, but I found his approach flawed and biased. Besides his limited mention of the other, even more important side of the argument towards the end, a main letdown was the missed opportunity to include a parallel effort to test the waters with the established journals. A disclosure of their financial positions would likely show yet another arm of the medical industry that puts a very lucrative business model with excessive profits before the purer need to advance science.
An advanced wiki model of open access may be what is needed, in my view: a format in which those with relevant knowledge could use a crowd-sourced approach to reviewing science, as opposed to a few highly paid academics hand-picked from a journal's pool of associates.
It's also frightening to imagine how underqualified some of these reviewers may be for a piece of work in which they simply don't have the relevant experience to decide what's valuable and what's not.
To better understand the gravity of this and why it's so important, the TED Talk "Ben Goldacre: Battling bad science" is worth a look if you haven't seen it.
They should have also submitted to the respected journals. After all, the great vaccination scare was published by a respected journal.
This study has a flaw: acceptance by a journal is only the first half of publication. They failed to wait and see if anyone cited their spoof study; that would have skewered the open-access community.
All they have shown is that their money is accepted and buyer beware. They have not shown that these sham journals have an impact. They could have done the same study with medical doctors - we visited 255 doctors with fake ailments and only one turned us away without taking our money first...
This demonstrates how incompetent the editors of fancy journals are. Science magazine accepted this study on the reliability of open-access journals without a control experiment. That is, they should have sent the article to the same number of non-open-access journals instead of arrogantly assuming that those journals would get it 100% right, and only then compared the results with those from the open-access journals.
There is also the issue of false negatives: conventional journals are more likely than open-access journals to reject ground-breaking work.
Maybe something like Stack Exchange for science journals... It would have to reach a certain up-vote count to be published?
@Daniela Staiculescu This comment has nothing to do with the article at hand. The research wasn't even research. It was a spoof, a hoax, targeted at exposing lax standards in scientific journals. Cancer, and its various treatments and preventions, isn't even being discussed here.
@Joe david This has nothing to do with the article. The study wasn't even a study. The person has no idea if there is a cure within lichen. There was no experiment done. He just created a fake, deeply-flawed study, and submitted it to various journals to see if they would accept it. He was looking at lax standards in scientific journals, not cancer.
@David Wiley Mr. Wiley, you may want to read the article. The spoof study was written by Science magazine's John Bohannon. This news story also notes the issue of not testing traditional journals.
@Leah Chamblee I think it is interesting that Steneck casts aspersions on open access to government-funded research as the problem. If anything, open access has exposed an issue with studies that has frustrated many scientists/behaviorists, and perhaps it will actually be addressed in a constructive and concrete way.
@Andrew Piechocki Yes, if you had true OA for all scientists. I'd trust a review any day that has had 500 scientists passionate about the subject going over it, as opposed to a review by three reviewers, two of them disinterested in the subject and the other correcting grammar mistakes! It seems logic like this does not sink in with everyone.
@Andrew Piechocki we have that, it is called citation, nice catch :)
@Melissa Hickson "He just created a fake, deeply-flawed study, and submitted it to various journals to see if they would accept it. He was looking at lax standards in scientific journals." If the fake paper is accepted by journals, then the result is that those journals lack standards, and he could publish his results as gold-standard work in Science.