From The Atlantic's Lies, Damned Lies, and Medical Science:

Still, Ioannidis anticipated that the community might shrug off his findings: sure, a lot of dubious research makes it into journals, but we researchers and physicians know to ignore it and focus on the good stuff, so what’s the big deal? The other paper headed off that claim. He zoomed in on 49 of the most highly regarded research findings in medicine over the previous 13 years, as judged by the science community’s two standard measures: the papers had appeared in the journals most widely cited in research articles, and the 49 articles themselves were the most widely cited articles in these journals. These were articles that helped lead to the widespread popularity of treatments such as the use of hormone-replacement therapy for menopausal women, vitamin E to reduce the risk of heart disease, coronary stents to ward off heart attacks, and daily low-dose aspirin to control blood pressure and prevent heart attacks and strokes. Ioannidis was putting his contentions to the test not against run-of-the-mill research, or even merely well-accepted research, but against the absolute tip of the research pyramid. Of the 49 articles, 45 claimed to have uncovered effective interventions. Thirty-four of these claims had been retested, and 14 of these, or 41 percent, had been convincingly shown to be wrong or significantly exaggerated. If between a third and a half of the most acclaimed research in medicine was proving untrustworthy, the scope and impact of the problem were undeniable.

[...]

But even for medicine’s most influential studies, the evidence sometimes remains surprisingly narrow. Of those 45 super-cited studies that Ioannidis focused on, 11 had never been retested. Perhaps worse, Ioannidis found that even when a research error is outed, it typically persists for years or even decades. He looked at three prominent health studies from the 1980s and 1990s that were each later soundly refuted, and discovered that researchers continued to cite the original results as correct more often than as flawed—in one case for at least 12 years after the results were discredited.

Here's a suggested solution to the problem of refuted research still being cited. Have some respected agency maintain an archive of studies that have failed to replicate or that have otherwise been found lacking. Once such an archive existed, medical journals could adopt a policy of checking all citations in a proposed article against the archive, rejecting submissions that tried to cite refuted research as valid. This might also alleviate the problem of people not doing replications because replications don't get many cites. Once such an archive was established, getting your results there might become quite prestigious.
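As a rough sketch of the mechanics, here is what the journal-side check might look like, assuming the archive could be queried by some standard identifier such as a DOI. Everything here is hypothetical: no such archive or lookup interface exists, and the DOIs are made up.

```python
# Hypothetical sketch of the journal-side citation check described above.
# Assumes the archive can be queried by DOI; nothing here is a real API.

REFUTED_ARCHIVE = {
    "10.1000/example.hrt.1998",    # made-up DOI for illustration
    "10.1000/example.vite.1996",
}

def check_submission(cited_dois):
    """Return the cited works the archive marks as refuted.

    A journal could reject (or flag for revision) any submission
    for which this list is non-empty and the citations treat the
    refuted results as valid support.
    """
    return [doi for doi in cited_dois if doi in REFUTED_ARCHIVE]

flagged = check_submission([
    "10.1000/example.hrt.1998",
    "10.1000/example.newstudy.2011",
])
if flagged:
    print("Submission cites refuted work:", flagged)
```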

The one major problem that I can see with this proposal is that it's not always obvious when a study should be considered refuted. But even erring on the side of only including firmly refuted studies should be much better than nothing at all.

Such a fix seems obvious and simple to me, and while maintaining the archive and keeping it up to date would be expensive, it should be easily affordable for an organization such as a major university or the NIH. Similar archives could also be used for fields other than medicine. Is there some reason that I'm missing for why this isn't done?

11 comments

An archive of refuted research is an excellent idea. It could have ratings which reflect degrees of refutation, just as PolitiFact has degrees of accuracy and inaccuracy. (Page down for an explanation of their Truth-O-Meter.)
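For illustration, one way such graded ratings might be encoded. The scale and its level names below are invented, loosely by analogy with PolitiFact's graded verdicts rather than taken from any existing archive:

```python
from enum import IntEnum

# Illustrative only: a graded refutation scale, by analogy with
# PolitiFact's Truth-O-Meter. The levels and names are invented.
class RefutationStatus(IntEnum):
    UNCHALLENGED = 0    # no serious replication attempts yet
    DISPUTED = 1        # mixed replication results
    PARTLY_REFUTED = 2  # effect real but much smaller than claimed
    REFUTED = 3         # replications consistently find no effect
    RETRACTED = 4       # formally withdrawn by journal or authors

# IntEnum makes graded comparisons straightforward:
print(RefutationStatus.RETRACTED > RefutationStatus.DISPUTED)  # True
```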

I feel this would be very necessary for the general idea of an archive of refuted research to be effective. I say this primarily because even in the science and engineering domain (with which I am more familiar, as opposed to medicine and the soft sciences) there are many degrees of possible wrongness. When a field is just opening up, it is inevitable and even okay for people to publish things that are a bit wrong. We know that the Cosmological Constant isn't "right" but we don't call Einstein's original papers "refuted."

Someone may say, "That's fine, we will only put research that is meaningfully refutable in this archive," but then you invite endless argument about whether string theory is refutable, etc., not to mention providing a platform for researchers with vendettas and ulterior agendas to invalidate the research of rivals through sheer nitpicking and willful misrepresentation.

Someone may say, "That's fine, we will only put research that is meaningfully refutable in this archive," but then you invite endless argument about whether string theory is refutable,

That's why I mostly just concentrated on medicine, where you can simply ask whether replication studies on "thing X helps people" manage to reproduce the original result or not. Yes, there's room for interpretation and disagreement even here, but hopefully less than for "is string theory refutable".
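As a rough illustration, here is one possible way to operationalize "the replication failed to reproduce the original result." The decision rule below (the replication's confidence interval excludes the original estimate and is consistent with no effect) is just one convention among several, not a standard, and the numbers are made up:

```python
def replication_failed(orig_effect, rep_effect, rep_ci):
    """Crude replication check: one possible convention, not a standard.

    Treat the replication as a failure to reproduce the original
    finding if its confidence interval excludes the original point
    estimate and is consistent with no effect at all.
    """
    lo, hi = rep_ci
    excludes_original = not (lo <= orig_effect <= hi)
    consistent_with_null = lo <= 0.0 <= hi
    return excludes_original and consistent_with_null

# e.g. the original claimed a risk reduction of 0.30; the replication
# estimated 0.02 with a 95% CI of (-0.05, 0.09)
print(replication_failed(0.30, 0.02, (-0.05, 0.09)))  # True
```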

Once such an archive existed, medical journals could adopt a policy of checking all citations in a proposed article against the archive, rejecting submissions that tried to cite refuted research as valid.

The problem with this is that power corrupts. Medical science already suffers from groupthink problems; this would likely make them worse.

Nancy mentioned PolitiFact here; I'm pretty sure I've seen controversy over some of their pronouncements. In any case, I'm sure they'd be much worse if they had actual gate-keeping power.

This is quite true, but it doesn't seem obvious that it'd be worse than incorrect results being cited for decades after being refuted.


While I agree in principle that this could be useful, I still wonder: researchers who spend months or years on some subject are apparently unable to double-check the shoulders they stand on. Would an archive really make a difference?

Now, if the respected journals would automatically reject any paper citing a refuted paper as support, that might bring about the needed cultural change...

Such a thing exists for law.

What do you mean? As I understand it, LexisNexis is just a database of legal articles. PubMed is a similar database of medical articles. PubMed lacks the full text of the articles and even the bibliographies, but it seems to me adequate for this purpose. The difference seems to me to be a matter of culture, of making a point of doing the search. Also, the hierarchical nature of law makes it easy to tell if one opinion trumps another.

Math Reviews is a database of journal articles that includes complete bibliographies for the recent articles. I suppose that must require a deal with the journals, but I wonder why PubMed didn't make the same deal. Google Scholar is an automatically generated citation database.
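For what it's worth, PubMed's public E-utilities interface (a real NCBI service) can already be searched for formally retracted papers, which could seed an archive like the one proposed. A minimal sketch follows; the exact publication-type field tag is an assumption on my part, so check NCBI's documentation before relying on it:

```python
import json
import urllib.parse
import urllib.request

# Query PubMed via NCBI E-utilities for retracted papers on a topic.
# The '"retracted publication"[Publication Type]' filter assumes
# PubMed's publication-type indexing; verify the field syntax.
params = urllib.parse.urlencode({
    "db": "pubmed",
    "term": '"retracted publication"[Publication Type] AND aspirin',
    "retmode": "json",
    "retmax": "5",
})
url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?" + params
with urllib.request.urlopen(url) as resp:
    result = json.load(resp)
print(result["esearchresult"]["idlist"])  # PMIDs of matching papers
```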

This could be a good project; any takers? We need to collect the information and put it all together on one site; we don't have to release our own studies.

In some fields, most studies are refuted within two years of publication. I wonder if it might be better to include any and all studies we can find and then rate them on a reliability scale like this one: http://www.cebm.net/index.aspx?o=1025 / http://chronopause.com/index.php/2011/08/12/interventive-gerontology-101-01-the-basics/ .

That scale might not work for all fields, but we can probably think of a scale that does work. And then say that any rating below X should never be cited.
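A sketch of that threshold rule, assuming each study carries a numeric level on a scale roughly like CEBM's (1 = strongest evidence, 5 = weakest). The cutoff and the sample studies below are invented:

```python
# Illustrative threshold rule over an evidence scale roughly like
# CEBM's levels of evidence (1 = systematic reviews of RCTs,
# 5 = expert opinion). The studies and the cutoff are made up.
studies = {
    "hrt-menopause-1998": 4,
    "aspirin-prevention-1989": 1,
    "vitamin-e-heart-1996": 3,
}

CITATION_CUTOFF = 3  # levels weaker than this should never be cited

citable = {name for name, level in studies.items()
           if level <= CITATION_CUTOFF}
print(sorted(citable))
```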

I do not however have a solution for how to become a respected agency.

I do not however have a solution for how to become a respected agency.

If journals start to refuse to publish research because it relies on 'poor' citations, that should have the effect of making this proposed archive-of-study-quality respected.

So maybe a more specific question: how could we get journals to use this archive as part of their review process?

The obvious method is to start by getting people who are credentialed in the relevant fields to join in. If you have relevant PhDs working on this, people will likely pay a lot more attention. Maybe talk to people at the local university and see if you can get them interested.