You’ve heard us tout the benefits of systematic reviews over and over again here at Evidence-based Living. The truth is, they are the best way to evaluate the evidence available on any topic because they use rigorous methods to synthesize the findings of dozens of individual studies.
They’re also essential for scientists conducting their own research because one of the main premises of scientific study is that new discoveries build on previous conclusions.
So we were disappointed to see an article in the New York Times last week reporting that new research studies cite remarkably few of the preceding studies on the same topic.
The article discussed a study published in the Annals of Internal Medicine that reviewed 227 systematic reviews of medical topics. In total, the reviews included 1,523 trials published from 1963 to 2004. For each clinical trial, the investigators asked how many of the other trials, published before it on the same topic and included with it in the meta-analysis, were cited. They found that fewer than 25 percent of preceding trials were cited.
The results shocked study co-author Dr. Steven N. Goodman of the Johns Hopkins University School of Medicine.
“No matter how many randomized clinical trials have been done on a particular topic, about half the clinical trials cite none or only one of them,” he told the New York Times. “As cynical as I am about such things, I didn’t realize the situation was this bad.”
The study concluded that this failure to cite prior work could lead to all sorts of problems, from wasted resources to incorrect conclusions.
Here at Evidence-based Living, we’d like to see citations for systematic reviews and previous trials in most scientific articles.
I have cited previous work in all of my research. No matter what topic you pursue, someone else has already done work on it, so you must rely on others’ findings to conduct your own research. If a student came to me claiming to have completed a piece of research entirely on their own, without building on anyone else’s work, I would fail them.