Does diet soda cause strokes? Nope!

I am guessing that many families reading the paper at breakfast today had this happen: somebody turned to someone else and said, “See, I told you drinking diet soda was bad for you!”

And that is because of a study reported widely in the media regarding the relationship between consumption of diet soda and stroke. Strokes are very bad things, often devastating the person to whom they occur, so a finding about anything that might increase our risk for stroke is worthy of notice.

At Evidence-Based Living, one of the most fun things we do is to track back from the media coverage to the actual research findings. In so doing, we hope to help people figure out the nature of the evidence and decide whether they should immediately change their behavior. This was an unusually big story, and so we ask: Believe it or not?

First, let me say that media coverage was a little more measured than usual. Some news outlets did use headlines like Fox News’s “Diet Soda Drinkers at Increased Risk for Stroke,” which make the finding sound firm (and probably led to some of the heated breakfast-table conversations). But many other outlets included the all-important “may” in the headline, and the articles themselves included qualifications about the study.

So let’s take a look at this finding, using some of the key questions EBL recommends you always employ when you are trying to figure out whether a scientific finding should change the way you live. 

1. What kind of a study was this? Was it a good one?

This is what scientists call an observational study. It was not a randomized, controlled experiment in which some people were asked to drink diet soda and others were not. It draws on a longitudinal study called the Northern Manhattan Study (NOMAS). And yes, it is a very good study of its kind. It looks at stroke risk factors across white, black, and Hispanic populations living in the same community (northern Manhattan). The sample is large and representative, and participants are followed up annually to determine whether they suffered a stroke (verified by doctors on the research team). Many articles from the study have appeared in top refereed scientific journals (some of which are available for free on the study’s website).

2. Where did the information in the media come from? 

Here, in EBL’s opinion, is the first problem. The results were presented at a scientific conference this week (a meeting of the American Stroke Association). This is not the same as being published in a refereed scientific journal. In addition, we cannot follow an EBL cardinal rule: Go to the original article. The only information available on the study comes from a press release issued by the association and subsequent interviews with the study’s lead author and other experts. So we need to wait until the results are published before we even think of changing our behavior in response to them.

3. Are the results definitive?

No, no, and again no. There are some good reasons not to drink diet soda (including possible increased risk of diabetes and osteoporosis), but these findings do not “prove” that diet soda leads to strokes.  

Some reasons why this is a very tentative and preliminary finding include the following: 

  • All the data are self-report, so we are dependent on people remembering their diet soda consumption. 

  • It’s the first study to show this association. EBL readers know that we need multiple studies before we even begin to think about recommending behavior change.

  • It’s not all diet soda drinking: It looks like only people drinking diet soda every day show the association with stroke, suggesting that lower consumption may not increase risk. 

  • The study is not representative of the U.S. population. First of all, you had to be over 39 years old in 1990 to get into the study, and the average age of the sample now is in the late 60s, so the results can’t be generalized to younger people. Further, the sample for this study was 63% women, 21% white, 24% black, and 53% Hispanic. In the U.S. as a whole, 51% of the population are women, 77% are white, 23% black, and 16% Hispanic. So it’s a very different group from what a random sample of Americans would get you.

  • We don’t know the reason for the association. The lead author, Hannah Gardener, is open about this: “It’s reasonable to have doubts, because we don’t have a clear mechanism. This needs to be viewed as a preliminary study.” By “clear mechanism,” she means that even if this relationship exists between diet soda and stroke, we don’t know why.

There’s more we could say, but our main point is this: It doesn’t take very long for you to “deconstruct” the actual evidence behind a news story. With a basic understanding of how studies are done and access to the Web, you can often find out as much as you need to know. In this case, the media have reported the first highly tentative findings of an association between two things. Now other scientists need to test it again and again to see whether it holds up, and to find out why the association exists.

I go for sparkling water instead of diet soda because of the other problems with diet beverages mentioned earlier. But regarding stroke risk, the data just aren’t there yet.


Weird science reporting: My Saturdays with USA Weekend

On Saturday mornings, my wife and I take turns getting the paper and the morning coffee, and we relax with it for a half hour before starting the weekend routine. My spouse has become used to my reaction when I turn to the magazine that comes with our paper: USA Weekend. Or more accurately, she has become used to covering her ears. When I put my Evidence-Based Living hat on, I believe that USA Weekend’s science reporting would be a strong contender in any “worst of the year” contest.

But then I realized: This may be a “teachable moment” for me and others! On the positive side, it’s nice that relatively heavy coverage is given in USA Weekend to scientific findings. Their health and lifestyle articles are filled with “a recent study shows…” And they make many recommendations regarding nutrition, much of it supposedly based on science. There is even a celebrity panel called “The Doctors” that purports to answer your health questions.

Ah, but the road to you-know-where is paved with good intentions. And what you get from USA Weekend is almost the opposite of good evidence-based advice: It’s a mish-mash of simplistic inferences from individual studies mixed in with folk wisdom and anecdote – and it’s nearly impossible for the lay reader to tell the difference. As such, it’s a great example of exactly the kind of “science journalism” you should avoid taking too seriously. Let me give you a few examples of where the scientific advice provided in USA Weekend should have the label “Let the Reader Beware.”

1. No access to the original research. I am willing to be corrected on this, but nowhere on the USA Weekend site could I find any citations to the original studies. Evidence-Based Living always recommends you go back to the original scientific articles before believing the media, but so little information is given in a typical USA Weekend story that I couldn’t even determine what research was being referred to. If you can’t find the article, how do you know if the finding is real or not?

2. Reliance on a single study (or two). Regular readers of Evidence-Based Living know one cardinal rule: Never believe a single study (or a couple of studies). Very often, articles in USA Weekend state: “Swedish scientists have found…,” “New research shows…,” “Two studies found…,” or “According to research presented at the American Chemical Society.”

What do we really need? All together now, EBL-ers: Systematic reviews of all available research leading to evidence-based practice recommendations. We need to see a finding replicated over and over, using rigorous scientific methods. We want those findings peer-reviewed by other scientists. And we want to know that they work outside of a controlled study. A couple of studies never prove a point, so we should not base our health-related behavior on the findings of a single study (and that’s what almost all scientists tell you at the end of their articles).

Just to give one example, USA Weekend reports that snoring is related to metabolic syndrome. In the closest article to this assertion I could find, the scientists qualify the finding extensively, including that the study is limited by the measures it used, by a small subsample, and by the cross-sectional (one-time) nature of part of the study. Where’s that information, USA Weekend?

3. Quick and confusing generalizations. The Doctors in USA Weekend make the somewhat astonishing recommendation: “Stop counting the calories (if you’re a woman over 65),” and they go on to suggest that it may be better for you to stay at your current weight, because “Older women who lose weight can double their risk of hip fracture.” Now try as I might, I couldn’t find the exact reference, although there is research suggesting that weight loss can affect bone density negatively. But this says nothing about the total picture. Should a morbidly obese, diabetic person avoid losing weight because of a potential increase in hip-fracture risk? Probably not, because the other obesity-related health problems can trump the increase in hip-fracture risk.

Here’s a study idea for you: I wonder how many women read that comforting advice and dropped their diet, even if they are very overweight and at no particular risk of hip fracture. That’s why simple generalizations about studies usually do more harm than good.

What’s the lesson here? These snippets of information won’t necessarily do you any good unless you know where they come from, how the study was done, and how it applies to you. Does it fit with other scientific research? We’re told in this week’s issue that we should “sprinkle on the cumin” because “In a scientific study from India, cumin was found to be just as effective as an anti-diabetes drug in controlling diabetes in lab rats.” Does that apply to you? Who knows?

So go to the source whenever you can, and take your Saturday paper’s science reporting with a grain of salt!

Do gun control laws prevent violence?

Gun control laws are in the media spotlight once again in the wake of the Arizona shooting that killed six people and injured 13, including U.S. Rep. Gabrielle Giffords.  Already, the Arizona Legislature has introduced two new bills that would loosen gun controls on college campuses. But what do we really know about gun control laws?  Is there evidence that they reduce violence?

As unsatisfying as it sounds, the answer is that we just don’t know.  One of the only systematic reviews available on this topic was published by the Community Guide, a resource at the U.S. Centers for Disease Control for evidence-based recommendations on improving public health.  It reviewed more than 40 studies on gun control laws ranging from bans to restrictions to waiting periods.  (You can read a summary of the report here.)

The conclusion:  “The evidence available from identified studies was insufficient to determine the effectiveness of any of the firearms laws reviewed singly or in combination.” 

Essentially, the review concludes that there is a lack of high-quality studies that evaluate specific gun control laws.  One challenge is that information about guns and who owns them is limited to protect the privacy of firearms owners.

So what do we know about firearms in the U.S.?

We know that firearms are present in about one-third of U.S. households, and that there are handguns in about half of those homes.

We also have a National Violent Death Reporting System, which collects information from death certificates, medical examiner reports and police reports in 19 states. According to the reporting system, 66 percent of all murders and 51 percent of suicides are committed with guns.  But that doesn’t tell us much – like whether the murders and suicides would occur by other means or, given stricter gun control laws, whether the perpetrators would find a way to obtain guns illegally.

The bottom line is that researchers and government officials need to step up, conduct more research, and find proven ways to prevent gun violence from taking the lives of innocent citizens.

The beginning of the end: The demise of cooperative extension in Canada

Cooperative Extension in the United States is a flagship program for connecting public “land-grant” universities to the general public. The goal of the Cooperative Extension System is to move knowledge created by researchers to groups who need it. A major audience has historically been agriculture, but other program areas deal with nutrition, child development, families, the environment and a variety of other issues.

I’ve worked as a faculty member in the Cooperative Extension program for 20 years, and I deeply admire the system. Like everyone with Extension responsibilities, I’ve been watching the changes that are going on nationally and at the state level. So I took notice of a very important cautionary note from our neighbors in Canada.

Writing in the Journal of Extension, Lee-Anne Milburn, Susan Mulley, and Carol Kline document the demise of agricultural extension in the province of Ontario. Their article, “The End of the Beginning and the Beginning of the End: The Decline of Public Agricultural Extension in Ontario,” shows how by the year 2000, “Extension in Ontario was moribund.”

How did this happen? According to Milburn and colleagues, some reasons are:

  • The decline in the number of people involved in farming; fewer than 2% of Canada’s population is now involved in agriculture.
  • The decline in the agricultural sector in turn reduced political support for extension. Population changes “make agriculture less politically relevant and therefore create difficulties in accessing necessary funding for agricultural research and Extension.”
  • A key point: Extension was unable to document economic benefits; without clear “return on investment,” the government was unwilling to fund it.
  • Farmers now have access to many other information sources, making the Extension agent more of a “peer information consultant,” helping the farmer to access information rather than being seen as the source of expertise.
  • Universities focus increasingly on scholarship; in the words of the authors this relegates “Extension to the academic hinterland of ‘service and outreach.’”

It’s clear that these issues confront Cooperative Extension in the United States. Fortunately, the authors have some suggestions for what people involved in Extension should do:

  • Respond to the needs of rural non-farm residents. They point out that there are all kinds of issues in rural life Extension could respond to, like wetland and woodlot management, sustainable economic development, and conservation and stewardship.
  • Recognize that Extension programs have a life cycle and redirect resources away from failing or outdated programs.
  • Make creative use of new information technologies.
  • And a very interesting point: They suggest that reducing Extension field staff can be a mistake, and replacing one-on-one contact with consumers “is a recipe for decline.” They recommend in-person training and discussions rather than fact sheets and web-based information alone.

All food for thought as we enter a new era in Cooperative Extension!


New evidence on calcium and Vitamin D

Television news programs, newspapers and the Internet are all full of recommendations of how to lead a healthier life. They recommend specific foods, vitamins and all sorts of dietary supplements. But it’s important to look toward research-based facts to understand what your body really needs.

It turns out the Institute of Medicine (IOM) is recommending that you up the dose of two nutrients in particular – calcium and Vitamin D.

Cornell nutritionist Patsy Brannon recently served on an IOM panel that issued new recommendations for calcium and vitamin D consumption.  The report triples the recommended vitamin D intake for most healthy people from 200 to 600 international units (IUs) per day. It also caps the suggested vitamin D intake at 4,000 IUs per day, citing links between elevated vitamin D blood levels and adverse effects, including kidney and tissue damage.

The panel making the recommendation was composed of 14 physicians and nutritionists from the United States and Canada, who reviewed more than 1,000 studies and reports and consulted many scientists and stakeholders.

The updated recommendations will influence food policy on many levels, including U.S. Department of Agriculture standards for school meals, nutrition information on food packages and the content of rations eaten by soldiers in the field.

Even with the sharp increase in daily intake levels, the panel found that few people in the United States or Canada lack adequate vitamin D, in part because sunlight provides enough of the nutrient to overcome dietary deficiencies.

“Contrary to the highly publicized epidemic of vitamin D deficiency in America and Canada, the average American and Canadian is meeting his or her needs for vitamin D,” Brannon told the Cornell Chronicle for a story.

The findings also counter recent studies suggesting that insufficient vitamin D levels may be linked to a host of chronic conditions, including cancer, diabetes, autoimmune disorders, and heart and cardiovascular disease.

“The evidence available is inconsistent, with some studies demonstrating this association while others show no association, and still others show evidence of adverse effects with high blood levels of vitamin D,” Brannon said. “Thus, it is not possible to conclude whether there is an association of low vitamin D with chronic disease or not.”

For a complete listing of recommended intakes by age group and gender, click here.

****

As an aside, the Cochrane Collaboration has conducted several systematic reviews of Vitamin D supplements for specific medical conditions.

Science in the courtroom: A Cornell professor uncovers the facts behind child testimony

I received a postcard in the mail last week notifying me I was called for jury duty.  The prospect seemed like an inconvenience. (Where would I find care for my two-year-old son while serving?) But it was also exciting!

I’ve always been interested in the law, and the idea of serving on a jury conjured up a feeling of civic responsibility that felt good.  It was a job I wanted to take seriously, and I immediately began wondering if there was any research I should consider before embarking on this important task.

Unfortunately, there were no trials in my town this week, so I didn’t even have to report to the court. But the notice did bring to mind the work of Cornell Professor Stephen Ceci, an expert in developmental psychology who has conducted ground-breaking research on the testimony of children.

Ceci’s work bridges the gap between research and real life in a very tangible way: findings from his studies have influenced the way thousands of law enforcement officers, social workers, lawyers, and judges deal with the testimony of children. This is research that makes a real difference in the lives of people who often find themselves in difficult situations.

 (An interesting side note: Ceci refuses to be an expert witness for either prosecutors or defenders – a decision that has lent him credibility among judges throughout North America, who often cite his work in their decisions.)

A main topic of Ceci’s work is how children respond when they are questioned about sexual abuse. The conventional wisdom says that children delay reporting abuse for years and will initially deny any abuse occurred when asked directly. But after repeated questioning, they gradually begin to tell little bits and pieces about how they were abused. Next, they recant altogether. Only later, when they are in what is perceived to be a psychologically safe situation, do they give a full and elaborate disclosure.

In analyses of dozens of published studies, Ceci and his colleagues separated the methodologically sound studies of children’s disclosure from poorly conducted ones. They found that in high-quality studies, children did report abuse in full detail when explicitly asked. They also found that when a child is questioned repeatedly, he is likely to relent and say what he thinks the interviewer wants to hear in order to get out of an uncomfortable situation.

“It’s important for judges to know what science shows, because this set of invalid beliefs animates the whole investigatory process,” Ceci explained. “It motivates investigators and interviewers to pursue reluctant children, who may be reluctant because nothing actually happened.”

In the case U.S. v. Desmond Rouse, the United States Court of Appeals for the Eighth Circuit (one level below the U.S. Supreme Court) established new law on vetting child testimony based almost exclusively on the work of Ceci and his colleagues.

For anyone who works with children involved in the court system, Ceci’s work provides a whole new way to think about their testimony.

Evidence-Based Elections: If the House changes over, is it the President’s fault?

In all of the hubbub about the upcoming elections, Evidence-Based Living had to ask: Is there any research evidence that might help us interpret what’s going on? (And, of course, we always scratch our heads about why there isn’t more discussion of research evidence on something so important.)

One of the few enlightening discussions I’ve seen comes in an article by Jonathan Chait. Chait notes the endless debate over “Did Obama Lose the 2010 Elections?” that is roiling media discussions this week.

Folks on the left say Obama’s responsible because he: 1) didn’t stick more closely to progressive principles, and 2) didn’t more aggressively tout the Democrats’ accomplishments. People on the right argue that Obama’s responsible because he 1) is out of step with what the country wants, and 2) has moved too far to the left.

But the blaming in either direction hinges on one question: What if the predicted election results are simply, well, normal? That is, what if the ruling party losing seats in the mid-term election is a predictable, scientific phenomenon, rather than someone’s (Obama’s, the Democrats’, the media’s, etc.) fault? Of course, if this were the case, major news organizations would have nothing to discuss and pundits would be out of a job. Still, it’s worth considering.

This points us to an analysis by Douglas Hibbs, professor of political economy, in a just-published report from the Center for Public Sector Research. Hibbs, like a good scientist, makes clear that his model isn’t designed to specifically predict the elections, but rather to explain midterm House election outcomes in terms of systematic predetermined and exogenous factors.

Based on prior research, Hibbs tells us there are three fundamental factors that predict midterm elections:

1) the number of House seats won by the party in power in the previous election

2) the margin of votes by which the party in power’s candidates won in the prior presidential election

3) the average growth rate of per capita real disposable personal income during the congressional term (a measure of economic prosperity).

From the available data plugged into this model, Hibbs predicts the Democrats will lose about 45 seats. In other words, based on the model alone, we would expect the Democrats to lose control of the House even if the President made no difference at all. And most predictions show the Democrats losing about this many seats (or 5-10 more, depending on which electoral prediction web sites you look at).
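To make the structure of a “fundamentals” model like this concrete, here is a minimal sketch of how three inputs of this kind could be combined into a single seat-change prediction. The functional form and every number below are hypothetical placeholders for illustration; they are not Hibbs’s actual model or estimates.

```python
# A hypothetical sketch of a "fundamentals" midterm model, NOT Hibbs's actual
# model or coefficients. It only illustrates how the three predictors described
# above might combine into one number (negative = seats lost by the president's party).

def predicted_seat_change(seats_held, presidential_vote_margin, income_growth):
    """
    seats_held: House seats the president's party won in the previous election
    presidential_vote_margin: that party's margin (percentage points) in the
        prior presidential election
    income_growth: average per-capita real disposable income growth (%) over the term
    """
    intercept, b_seats, b_margin, b_income = 60.0, -0.4, 0.5, 4.0  # made-up values
    return (intercept
            + b_seats * seats_held
            + b_margin * presidential_vote_margin
            + b_income * income_growth)

# Illustrative inputs roughly in the 2010 ballpark: a large majority being defended,
# a solid prior presidential win, and weak income growth.
print(round(predicted_seat_change(257, 7.3, 0.5)))  # about -37 with these made-up numbers
```

The point is not the particular output but the logic: the bigger the majority a party is defending and the weaker the economy, the larger the expected midterm loss, no matter what the president says or does.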

Hibbs provides the necessary caveats about his work not being definitive. But it is certainly strong enough to make us ask: Where’s the science behind a lot of the political debate and punditry? The evidence-based perspective encourages us to be careful in attributing cause and effect where none may exist.

Evidence-Based Living Never Takes a Vacation: Resistance to Science

While I was hanging out with my large and boisterous family on the Massachusetts shore this week, the conversation turned to people’s resistance to scientific information. Now this is not actually all that surprising, because my extended family includes an unusual number of individuals who either are or were practicing scientists. Indeed, the gathering over the week involved several psychologists, a research dietician, a sociologist, two young budding researchers (one studying mood disorders, the other conducting research in a business school), a physician, and a historian.

Discussions emerged about issues of barefoot running (see previous post) and athletes’ use of steroids (this trumped our usual Yankees versus Red Sox debate for a while). Niece Julianna then posed the following question: Why are people so resistant to scientific evidence on some issues? Indeed, why does their resistance often approach the first-grade tactic of putting fingers in the ears and singing “I can’t hear you”? Several family members noted that when they have suggested, in the course of an argument, that the scientific evidence be consulted, they get responses like: “I don’t care, I just know this is right.”

Of course, scientists haven’t left a topic like that alone. There is a body of research about why individuals reject even what the scientific community views as fundamental facts. An interesting article by Yale psychologists Paul Bloom and Deena Skolnick Weisberg provides a useful review. They begin by noting the prevalence of erroneous beliefs, including the curious finding from a Gallup poll that one-fifth of Americans believe that the Sun revolves around the Earth.

Bloom and Weisberg suggest that a primary reason “people resist certain scientific findings, then, is that many of these findings are unnatural and unintuitive.” Further, science involves asserted information (so we believe that Abraham Lincoln was a U. S. president, even though we can’t validate that information personally). There are few scientific findings we can validate directly – e.g., whether vaccines cause autism, whether natural selection operates, or whether repressed memories exist.

In sum, the data Bloom and Weisberg review suggest that people resist science when:

  • Scientific claims clash with intuitive expectations
  • Scientific claims are contested within society
  • A non-scientific alternative explanation exists that is based in common sense and is championed by people who are believed to be trustworthy and reliable.

A recent study published in the Journal of Applied Social Psychology provides an additional explanation. In a series of experiments, Geoffrey Munro found that when presented with scientific information that contradicts their beliefs, people invoke the “impotence of science” hypothesis; that is, they argue that the topic is one that science can’t effectively study.

When people have very strong beliefs about a topic, research has shown that scientific evidence that is inconsistent with the beliefs has little impact in changing them. But even more problematic, Munro’s research suggests that this inconsistency between beliefs and scientific conclusions actually reduces people’s overall faith in science.  

All this provides interesting challenges for proponents of evidence-based living. We need not only to get scientific information out to the public, but we also need a much better understanding of how beliefs create resistance to information that might improve people’s lives.

The science behind barefoot running

Humans have been running long distances for millions of years, well before the advent of the modern running shoe. In fact, it’s only in the past three decades that athletic companies have developed cushioned, supportive shoes for runners.

Recently, a movement of runners has gone back to its roots – forgoing shoes to run barefoot or with minimal footwear. Why the heck would they do that? Thanks to sports historian Michael Civille for posing this question, and we’ll take a look at the evidence here.

There is some evidence that barefoot running reduces the amount of force on the foot and knee joints. Daniel Lieberman, a professor of human evolutionary biology at Harvard University, studies the biomechanics of barefoot running and how early humans survived by evolving the ability to travel long distances to hunt.

His work – which has been published twice in the journal Nature – has shown that experienced, barefoot runners tend to land in the front or middle of their feet, compared to runners with shoes, who tend to land on their heels. These forefoot and midfoot strikes do not generate the sudden, large impacts that occur with heel strikes. Therefore, barefoot people can run more easily on hard surfaces without discomfort from landing.

Lieberman, who runs barefoot once a week himself, is the first to admit there is no evidence on whether running barefoot causes fewer or more injuries than running with shoes.  (There is also no evidence that running shoes reduce injuries either.)

How about speed?

There is some evidence that barefoot running uses about five percent less energy because runners use the natural springs in their feet and calf muscles to store and release energy.  

But runners with forefoot or midfoot strikes don’t seem to be any faster than runners with heel strikes, according to a Japanese study.  In it, researchers took photographs of elite runners’ foot-strike positions midway through a half-marathon. Seventy-five percent of the runners were landing on their heels, 24 percent landed near the arch of the foot, and only four runners landed on their forefoot. And they weren’t the four fastest.

The take-home message?  The jury is still out on barefoot running. One thing is clear:  If you want to try barefoot running, start slowly. One thing all of the experts agree on is that the body does take some time to adjust.

Texting while driving: Clearly dangerous

Multitasking has become a way of life in this digital age, when most people can check their e-mail and calendars and make phone calls from a mobile device they keep in their pockets or purses. While communication-on-the-go certainly can make us more efficient, it can have dire consequences as well.

Some 200,000 car accidents each year are caused by texting while driving, according to a report from the National Safety Council, a nonprofit group recognized by congressional charter as a leader on safety.

The scientific literature backs up the report.  A 2009 study of long-haul truckers by the Virginia Tech Transportation Institute found drivers were more than 23 times more likely to experience a safety-critical event when texting. The study also found that texting drivers typically take their eyes off the road for an average of 4.6 out of every 6 seconds, during which time they travel the distance of a football field without looking at the road.
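To see roughly where the football-field comparison comes from, here is a quick back-of-the-envelope check. The 55 mph highway speed is our own assumption for illustration, not a figure taken from the study.

```python
# Rough check of the football-field comparison, assuming a 55 mph highway speed
# (the speed is our assumption for illustration, not a number from the study).
mph_to_ft_per_s = 5280 / 3600         # 1 mph is about 1.47 ft/s
speed = 55 * mph_to_ft_per_s          # ~80.7 ft/s
eyes_off_road = 4.6                   # seconds spent looking away while texting
print(round(speed * eyes_off_road))   # ~371 ft, longer than a 360-ft football field
```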

Another study by psychologists at the University of Utah found that texting while driving is riskier than talking on a cell phone or with another passenger. In the study, people texting in a driving simulator had more crashes, responded more slowly to brake lights on cars in front of them, and showed more impairment in forward and lateral control than did drivers who talked on a cell phone or drove without texting.

The Utah study found that drivers who talked on the phone attempted to divide their attention between the conversation and driving, adjusting the priority of each activity based on what was happening on the road.  But texting required drivers to switch their attention from one task to the other, causing a substantial reduction in reaction times compared to those talking on the phone.

State governments are responding to the evidence. Text messaging is banned for all drivers in 30 states and the District of Columbia. In addition, novice drivers are banned from texting in 8 states.  And President Barack Obama issued an order banning all federal employees from texting while driving a government vehicle or while using a government-issued cell phone.

The take-home message: Save your texts for non-driving times.

– Sheri Hall

Chocolate and depression: The study vs. the media

I’m always on the lookout for good studies that are misinterpreted by the media (see here and here for examples). Why is this important? Because those of us whose profession it is to translate research findings to the public tend to get smacked upside the head by media misrepresentations. The public gets so used to duelling research findings that they become skeptical about things we are really certain about (e.g., climate change).

If you read your newspaper or watched TV in the last week or so, you may have seen media reports on the relationship between chocolate and depression. Now I love chocolate, and I’m not ashamed to admit it. I spent a year living next to Switzerland, and I can name every brand produced in that country (and I had the extra pounds to show it).

So I got concerned when I read the headlines like this:

Chocolate May Cause Depression

Chocolate Leads to Depression?

Depressed? You Must Like Chocolate

It was a matter of minutes for us to find the original article in the Archives of Internal Medicine. (The abstract is free; unless you have access to a library, you have to pay for the article.)  It’s clearly written, sound research. And it absolutely does not say that chocolate leads to depression (whew!). Indeed, the authors acknowledge that the study can’t tell us that at all.

The research used a cross-sectional survey of 931 subjects from San Diego, California, in which researchers asked people about both their chocolate consumption and their depressive symptoms. By “cross-sectional,” we mean a survey that takes place at a single time point. This is distinguished from a longitudinal survey, in which the same people are measured at two or more time points. Why is that important here?

Here’s why. What epidemiologists call “exposure” – that is, whatever might cause the problem (in this case, chocolate) – is measured at the same point in time as the outcome (in this case, depression). For that reason, we can’t be sure whether the chocolate preceded the depression, or the depression preceded the chocolate. They both are assessed at the same time. So we can never be sure about cause and effect from this kind of study.

Now, a longitudinal study is different. The advantage of a longitudinal study is that you can detect changes over time. In this case, you could establish depression and chocolate consumption levels at Time 1, and keep measuring them as they continued over time. For example, if some people who weren’t depressed at Time 1 started eating chocolate and became depressed at a later point, we have stronger evidence of cause and effect.
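To make that reasoning concrete, here is a toy simulation with entirely made-up numbers (not the study’s data). It shows why a one-time snapshot can’t separate “chocolate leads to depression” from “depression leads to chocolate”: both stories produce the same kind of positive association in a cross-sectional survey.

```python
# A toy simulation (purely illustrative, not the study's data) showing why a
# cross-sectional snapshot can't tell two causal stories apart: both yield a
# positive chocolate-depression correlation at a single time point.
import random

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(0)
# Story A: chocolate consumption drives depression scores.
choc_a = [random.gauss(5, 2) for _ in range(1000)]
dep_a = [0.5 * c + random.gauss(0, 1) for c in choc_a]
# Story B: depression drives chocolate consumption.
dep_b = [random.gauss(2.5, 1.4) for _ in range(1000)]
choc_b = [1.0 * d + random.gauss(0, 1.5) for d in dep_b]

print(round(correlation(choc_a, dep_a), 2))  # positive association
print(round(correlation(choc_b, dep_b), 2))  # also positive - the snapshot can't say which story is true
```

Only repeated measurements of the same people over time, or a randomized trial, can begin to pull those two stories apart.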

As good scientists, the authors acknowledge this fact. They note that depression could stimulate cravings for chocolate as a mood-enhancer, that chocolate consumption could contribute to depression, or that a third factor (unknown at this point) could lead to both depression and chocolate consumption (my own pet theory: Valentine’s Day!).

In the interest of full disclosure, some of the media did get it right, like WebMD’s succinct “More Chocolate Means More Depression, or Vice Versa.” But because some media sources jump to the most “newsworthy” (some might say sensationalist) presentation, there’s no substitute for going back to the actual source.

Finally, let me say that there is only one way to really establish cause and effect: a randomized, controlled trial. One group gets chocolate, one doesn’t, and we look over time to see who gets more depressed.

Sign me up for the chocolate group!

Exercise and health: Another media mix-up

Is there anything more aggravating than when the media take a sound research study and distort the findings just to attract attention? (Okay, this season’s American Idol and the “five-dollar footlong” jingle are probably more aggravating, but still . . .) And it’s even worse when the public may take the incorrect message and change their behavior as a result. I’m thinking we should sponsor a contest for the worst reporting (stay tuned).

So take this article from the London Daily Mail. The headline: “Fitness flop? It’s all down to the genes, say researchers.” The first line of the article carries on the same theme: “Spent hours sweating it out in the gym but don’t feel any fitter? Blame your parents.” The article was then picked up by other sources and reported as fact (for example, by Fox News). Much of the reporting seems to suggest that some people shouldn’t exercise, as the cartoon accompanying the Daily Mail article suggests.

We at Evidence-Based Living, of course, had to track down the original article and take a look (here’s the reference). Now, a lot of the article is close to unintelligible to the lay person (here’s one for you: “Target areas for the SNP selection was defined as the coding region of the gene 269 plus 20kb upstream of the 5’ end and 10 kb downstream of the 3’ end of the gene.”). However, the major finding is pretty straightforward.

One important indicator of fitness is oxygen uptake, and exercise such as running and biking can increase your ability to take in oxygen.  This is commonly referred to as “aerobic fitness.” In the study, all of the subjects (around 600) did a cycling exercise program. On average, their aerobic capacity improved by around 15 percent, but for approximately 20 percent of those studied, intense exercise barely improved oxygen uptake at all (5 percent or less). The failure to improve was related to specific genes. The study will have practical value, because doctors may be able to tailor special programs to people who don’t respond to exercise.

All in all, a nice study. However, when you saw the extensive media coverage, your take-home could easily be: Why exercise? In fact, there is still every reason to hit the gym or track, or get on the bike several times a week. First and foremost, let’s turn it around and note that 80% of people DID improve their aerobic capacity. What the misleading headlines and coverage don’t tell you is that for most of us, exercise works, and works well.

And even if you are in the minority, the study only looked at a couple of outcomes. However, exercise has multiple other benefits, from weight loss, to improving mood, to increasing flexibility, to reducing the risk of osteoporosis. The excellent  evidence-based medicine blog Bandolier summarizes all the benefits of exercise concisely.

So for those of you working to promote healthy behaviors like exercise, make sure people know that it’s still definitely good for them. And for everyone involved in disseminating research to the public: Let’s remember to keep a skeptical eye on media one-liners about scientific findings, especially as they relate to human health. It’s almost always more complicated, and there’s no excuse not to go to the source of the information rather than relying only on the press.
