Scientific Fact-Checking is a Click Away: The Amazing Cochrane Collaboration

There’s a famous scene in the film Annie Hall, where Woody Allen is standing in line in a movie theater. Behind him, a pretentious professor is loudly proclaiming his opinions about the famous media thinker Marshall McLuhan. Allen’s character reaches the boiling point and from behind a film poster produces Marshall McLuhan himself, who proclaims to the pompous intellectual: “You know nothing of my work! How you got to teach a course in anything is totally amazing!” Woody tells the camera: “Boy, if life were only like this!”

We all wish that we had an impeccable source of information like that at our fingertips, especially when it comes to research on human health and well-being. Imagine if you were in a debate – at work or with family and friends – about an issue pertaining to health. What if you could pull up a website and say: “I have the definitive scientific opinion right here!”

Actually, you can. It’s called the Cochrane Collaboration. I urge you to make the first of what I am sure will be many visits today. It is the true mother lode for objective scientific evidence on hundreds of issues relevant to mental and physical health and human development. You really can know what science has to say about many issues.

In the Cochrane Collaboration, teams of scientific experts from around the world synthesize the research evidence and issue reports offering guidance on what both professionals and the general public should do. It’s a non-profit, entirely independent organization, which lets it provide up-to-date, unbiased information about the effects of health care practices and interventions.

The site is organized so you can, free of charge, get the abstract of any Cochrane review, clearly written in layperson’s language. These can be used to help answer your clients’ questions and in any situation where it helps to show the scientific consensus on an issue. They even have downloadable podcasts of the reviews.

The number and scope of the reviews are mind-boggling, and the Cochrane reviews take a very broad view of health (so you are sure to find ones relevant to your work).

The media are taking notice of the Cochrane Collaboration, in part because these objective reviews can help figure out what our health care system should be paying for — a nice report appeared in Sharon Begley’s Newsweek blog.

So hey – why are you still here and not looking at the reviews? The easiest place to start is on the review page, where you can search for topics or just browse through the reviews.

Stop the presses! Let’s learn from journalists about communicating research

All of us who are professionally or personally interested in translating research evidence to the public struggle with how best to do it. Fortunately, we don’t need to “reinvent the wheel” – there are great web resources to help us!

Bruce Lewenstein, Professor of Science Communication at Cornell, has put together a terrific list of sites that provide nuts-and-bolts ideas and information for how to communicate with the public about science topics. These are geared in part toward journalists who make a living communicating about research, or toward scientists who are trying to communicate with the public. However, they provide information that is highly useful to extension educators and others who inform citizens about research. Prof. Lewenstein also maintains a web page with basic resources for better communication about scientific research.

Take a look — I picked up several new ideas with just a few clicks!

http://communicatingscience.aaas.org/ (produced by American Association for the Advancement of Science, includes webinars, tipsheets, etc.)

www.wfsj.org/course/en/index.html (online science journalism course, developed by the World Federation of Science Journalists; primary audience is science journalists in developing countries)

http://www.scidev.net/en/science-communication/ (SciDev.net’s “Communicating Science” section, focused on science journalism for the developing world, but relevant for anyone communicating science)

http://www.pantaneto.co.uk/issue28/thomas.htm Tips for great (science) media interviews (from Patricia Thomas, Knight Chair in Health & Medical Journalism, Grady College of Journalism & Mass Communication, University of Georgia)

http://www.scienceliteracyproject.org/ Science Literacy Project (a workshop for science reporters working in public radio; some resources online, especially the “tip sheets”)

The buzz on antidepressants: A lesson from the media

Here’s a question for all of us trying to apply scientific research and disseminate evidence-based programs: How does one move the complex findings of a particular research study into useful knowledge on which non-scientists can make decisions? And to what extent are the media friend or foe in such efforts?

We’ve been treated to a great example in a widely-publicized study that came out last week. Becky Jungbauer at Scienceblogging [1. http://www.scientificblogging.com/truth_universally_acknowledged/depression_placebos_and_paxil] summarized the news coverage of this finding on the effectiveness of antidepressant medication. With millions of Americans taking antidepressants, the question of whether they work or not is by no means just an academic one.

The media frenzy over this study could leave a casual reader with a simple (and wrong) impression: that antidepressants don’t work. As Jungbauer notes, many people reading headlines or half-listening to the news might assume that “antidepressants in general don’t really work unless you’re standing on a ledge.” Headlines such as these appeared: “Study: Antidepressant lift may be all in your head,” and “Placebos: Pretty Good for Depression.”

I’m not going to comment on all the details of the study, which is summarized well in the New York Times [2. http://www.nytimes.com/2010/01/06/health/views/06depress.html] and on Jungbauer’s blog. Rather, it’s a great example for those of us translating research because it shows just what a hard job this can be. How can a lay person take findings like these reported in the media and use them?

Let’s begin with the article itself, “Antidepressant Drug Effects and Depression Severity: A Patient-Level Meta-Analysis,” published in the prestigious Journal of the American Medical Association. The first clue to potential problems with translating the findings lies in that subtitle: What the heck is a meta-analysis?

Most people probably assumed that the authors themselves did a study and reported the results. That isn’t the case – they used statistical methods to merge the findings of six previous studies done by various research groups. The results of their analysis are entirely dependent on the quality of the studies reviewed.
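
To make the idea concrete, here is a minimal sketch in Python of one textbook approach: a fixed-effect meta-analysis with inverse-variance weighting. The numbers are invented purely for illustration (they are not data from the JAMA article, whose patient-level method is more involved); the point is simply how several studies’ results get merged into a single pooled estimate.

```python
# A toy fixed-effect meta-analysis using inverse-variance weighting.
# All numbers are invented for illustration; they are NOT the data
# from the JAMA antidepressant article.

import math

# (effect_size, standard_error) for six hypothetical studies
studies = [
    (0.30, 0.12),
    (0.15, 0.10),
    (0.45, 0.20),
    (0.10, 0.08),
    (0.25, 0.15),
    (0.35, 0.18),
]

# Each study is weighted by the inverse of its variance, so more
# precise studies (smaller standard errors) count for more.
weights = [1.0 / se**2 for _, se in studies]

# The pooled effect is the weighted average of the study effects.
pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)

# The standard error of the pooled estimate shrinks as studies accumulate.
pooled_se = math.sqrt(1.0 / sum(weights))

print(f"Pooled effect: {pooled:.2f} (SE = {pooled_se:.2f})")
```

Running this prints a single pooled effect with a smaller standard error than any individual study has on its own, which is the whole appeal of the method: precision borrowed from many modest studies.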

What if a reader decided to go to JAMA itself to read the article? I can imagine a depressed acquaintance of mine (now worried whether she should be taking antidepressants) trying to make sense of statements like: “Mean intake severity did not differ as a function of treatment condition (F(1,711) = 0.05, P = .82), but the 6 studies did show different mean intake severity levels, reflecting differences in inclusion criteria (F(5,711) = 79.56, P < .001).” How many concerned readers can explain what “a randomized, double-blind, placebo-controlled trial” is?

It would take some time and effort to see that the study doesn’t say the antidepressants don’t work – they did, just not more than a placebo (a sugar pill).

And it would take even more careful reading to note that the drug used in three of the six studies is no longer widely used, and that people in the placebo groups typically received some form of counseling and interaction (even if not formal therapy), which might have had an antidepressant effect. And few people have easy access to JAMA to begin with, even if they wanted to read the article.

The moral of this story is how hard we need to work to make sure people understand the science behind any generalization. Is there some way that we could ensure that “ordinary” citizens can obtain a minimal level of scientific literacy, so they can discern what they should be influenced by and what they should ignore? Might words like “meta-analysis” and “randomized controlled trial” need to become generally understood terms?

I am an ignoramus about car repair. You wouldn’t find me looking under the hood of my car unless I suspected a small animal was trapped inside. But I have learned enough of the language so I can communicate with my mechanic and evaluate in an elementary way what he plans to do to my car. That’s what we need with science.
