Why health journalism often gets it wrong

Contradictory tips and strategies about how to improve our health fill the airwaves, magazines and newspapers year-round. Drink coffee; don’t drink coffee. Eat whole grains; avoid carbohydrates of any kind. Vitamin supplements are good for you; wait, no they’re not. All of these news stories claim they are based on “evidence.” So what’s the deal?

The Columbia Journalism Review published a fascinating story earlier this year on how health journalists – even reputable writers working at major publications – often report unreliable information because they overlook the pitfalls of scientific research.

The author explains the situation like this:  “…science reporters tend to confuse the findings of published science research with the closest thing we have to the truth. But as is widely acknowledged among scientists themselves, and especially within medical science, the findings of published studies are beset by a number of problems that tend to make them untrustworthy, or at least render them exaggerated or oversimplified.”

The article lists three common problems that affect the accuracy of scientific research. The trouble, the author says, is that journalists tend to mention these problems only as a side note, rather than recognizing that they can skew a study’s results.

1. Mismeasurement. In many circumstances, it would be dangerous and unethical to conduct a full randomized-controlled trial on human subjects.  So researchers have to set up models – either in animals or through indirect measurement in humans – that are often irrelevant or inaccurate.

2. Confounders. There are thousands of factors that affect humans who do participate in studies, so it’s nearly impossible to tease them apart and isolate the impact of a single element.

3. Publication bias. Research journals, like all publications, strive to include important and interesting research findings to attract readers. But, the author writes: “Typically, something is exciting specifically because it’s unexpected, and it’s unexpected typically because it’s less likely to occur. Thus, exciting findings are often unlikely findings, and unlikely findings are often unlikely for the simple reason that they’re wrong.”

The article really hits home for us at Evidence-based Living. It’s one of the reasons we are proponents of the systematic review, which uses rigorous methods to bring together and evaluate the dozens, hundreds, or even thousands of studies on a topic.

“In our work disseminating research information to the public, we always encourage consumers to beware of changing behavior or making health decisions based on a single study,” says Dr. Karl Pillemer, Associate Dean for Extension and Outreach in Cornell’s College of Human Ecology. “Instead, it’s important to look at accumulated evidence over time, and in consultation with your own health care professionals. Nothing upsets a scientist more than a news report that over-reaches his or her findings; the best journalists know this and frame things cautiously.”

The take-home message: Don’t take news about your personal health at face value. Try to find a systematic review – the Cochrane Collaboration is a great resource – that explains the conclusions of the entire body of evidence on a particular subject.
