ASU President urges universities to take action

Nearly all major research universities have systems in place to translate and communicate their findings into information that can benefit society.  But are U.S. universities doing enough to address the problems of contemporary life?

According to Michael Crow, President of Arizona State University, the answer is no.

Crow is on a mission to transform Arizona State University into the model for what he calls “a New American University” – an institution organized to pursue research that benefits the public good. And he is urging other universities to follow suit.

Crow believes major research institutions should take responsibility for “the economic, social, and cultural vitality and health and well-being of the community” and encourage collaboration across disciplines and with other academic institutions.

He argues that a scientific focus on narrower and more fundamental secrets of nature has impaired researchers’ ability to “think at scale and across time.” 

For Crow, this means restructuring universities so they’re more capable of responding to modern challenges. At Arizona State, he has created more than a dozen new transdisciplinary schools, including the School of Human Evolution and Social Change, the School of Earth and Space Exploration, the School of Sustainability and the School of Life Sciences. The idea is to bring together scientists from a wide range of disciplines, engineers, policymakers and industry leaders to develop solutions to pressing real-world problems.

At the same time, ASU has eliminated traditional departments including biology, sociology, anthropology and geology.

They’re drastic measures, for certain. But they are changes Crow insists are necessary if universities are going to do their part in solving major world problems, such as climate change.  Intrigued?  You can read Crow’s thoughts about reorganizing academic institutions to improve our world’s sustainability in the June/July 2010 issue of BioScience. And let us know your thoughts by commenting on this post!

Video feature: Science education outreach

Researchers and scientists across the country are making new discoveries every day, but they must continually find the best ways to share that knowledge with the public.  The Cornell Center for Materials Research sets an outstanding example of how to accomplish this.

The center’s mission is to advance, explore and exploit the science and engineering of advanced materials. It is part of a national network of 29 materials research centers funded by the National Science Foundation.

Nev Singhota is the director of the center’s Educational Programs Office, which reaches out to thousands of students, hundreds of parents and teachers, and many undergraduates from across the country. Many Cornell faculty, post-doctoral fellows, graduate and undergraduate students contribute to the center’s outreach efforts by visiting schools, hosting family and teacher workshops and coordinating an “Ask the Scientist” column in the local newspaper.

Singhota describes her role as facilitating interactions between Cornell scientists and all sorts of people in the community. “We create this web,” she said. “We’re like the spider who is trying to connect everyone together.”

Interested in hearing more about the Cornell Center for Materials Research’s outreach efforts?  Check out this conversation with Singhota:

New Evidence: TV time leads to attention problems

There is another piece of evidence to support a long-standing belief among child development experts: Too much TV time is associated with attention problems in youth. The newest piece of proof comes from a study conducted by researchers at Iowa State University and published this month in the journal Pediatrics.

The new research found that children who exceeded the two hours per day of screen time recommended by the American Academy of Pediatrics – either in TV-watching or video games – were 1.5 to 2 times more likely to have attention problems in school.
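The “1.5 to 2 times more likely” figure is a relative risk – the rate of attention problems among heavy screen users divided by the rate among everyone else. Here is a minimal sketch of the arithmetic, using invented counts rather than the study’s actual data:

```python
# Hypothetical 2x2 table (these counts are made up for illustration,
# not taken from the Pediatrics study):
# groups = above/below the 2-hour daily screen-time guideline,
# entries = attention problems reported / not reported.
high_screen = {"problems": 30, "no_problems": 70}
low_screen = {"problems": 15, "no_problems": 85}

# Risk = proportion with attention problems within each group.
risk_high = high_screen["problems"] / (high_screen["problems"] + high_screen["no_problems"])
risk_low = low_screen["problems"] / (low_screen["problems"] + low_screen["no_problems"])

# Relative risk = how many times more common the problem is in the exposed group.
relative_risk = risk_high / risk_low
print(f"Relative risk: {relative_risk:.1f}")  # 0.30 / 0.15 = 2.0
```

A relative risk of 2.0 would mean attention problems were twice as common in the high-screen-time group – which, on its own, says nothing about cause and effect.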

The study followed third-, fourth- and fifth-grade students as well as college-aged students for more than one year. Over that time, participants’ average time using television and video games was 4.26 hours per day, well below the national average of 7.5 hours per day reported in other studies.

Study author Douglas Gentile, an associate professor of psychology at Iowa State, explained the phenomenon for a report in Science Daily.

“Brain science demonstrates that the brain becomes what the brain does,” he said. “If we train the brain to require constant stimulation and constant flickering lights, changes in sound and camera angle, or immediate feedback, such as video games can provide, then when the child lands in the classroom where the teacher doesn’t have a million-dollar-per-episode budget, it may be hard to get children to sustain their attention.”

This phenomenon again raises a question for professionals who coordinate youth intervention programs:  What can be done to capture the attention of youth who are so captivated by electronic media?  The most likely answer is to meet them somewhere in their world.

– Sheri Hall

The evidence on exercise

The sun is shining, the grass is green and the birds are chirping. It’s finally summertime in the northern hemisphere – the time of year when most people find it easier to fit in some aerobic exercise.  You can venture out to a hiking trail, hit the local pool for some lap swimming or dust off that old bicycle. While you may think of these activities as summertime leisure, the evidence shows they are all extremely beneficial to your health – even more powerful than the latest medicines for treating certain conditions.

Heart disease

Medical study after medical study has found that getting your body moving is good for your heart. Moderate-intensity activities, like walking at a brisk pace or swimming, yield the most beneficial effects. It only takes 30-45 minutes five days a week.

While exercise is beneficial to everyone, sedentary people who become moderately active show the greatest reductions in deaths from cardiovascular disease. Those who start exercising regularly after a heart attack also show improved rates of survival.

Cognitive impairment

Alright, most of us already knew that exercise was good for our hearts.  But did you also know it’s good for your brain? 

A 2008 review of therapies to slow or reverse cognitive decline concluded that aerobic activity enhances cognitive function in older adults.  In fact, in one study, researchers at the University of Illinois found that a 5 to 7 percent improvement in cardiovascular fitness corresponded with up to 15 percent improvement on mental tests. 

Another study of elderly people diagnosed with mild cognitive impairment found those who enrolled in a six-month exercise program improved their ability to concentrate and carry out complex tasks, while participants who didn’t exercise declined in their performance on those same tasks.  Scientists think the improvements have something to do with exercise increasing the flow of blood and oxygen to the brain, improving growth factors that help create new nerve cells and increasing chemicals in the brain that help with cognition.

Depression

Multiple studies have also found that exercise can help prevent and treat depression and anxiety disorders. The prevention piece is more difficult to prove, but studies show a strong correlation: People who exercise are much less likely to suffer from depression.  One study that followed participants for a 15-year period found that those with high fitness levels were less likely to become depressed.

The evidence clearly shows that exercise is as effective as antidepressant medications for treating depression. Duke University researcher James A. Blumenthal and his colleagues studied 156 older adults diagnosed with major depression, assigning them to receive the antidepressant Zoloft, 30 minutes of exercise three times a week, or both. They found that exercise was as effective as the prescription medicine, and follow-up studies showed patients who exercised were less likely to relapse into depression.

That’s some clear proof that exercise is good for your body on many levels. So don’t delay – get out there while the sun is shining!

– Sheri Hall

Evidence-based soccer: Scientific predictions for the World Cup?

I have for many years played soccer – badly. I began in high school some 40 years ago, when our only available opponents were private schools that beat us by scores that sound like American football results (e.g., 14 – 0). I continued as the slowest (but most enthusiastic) player on a variety of adult teams, hanging up my soccer boots only a couple of years ago when the injuries seemed no longer worth it (although nostalgia hits me every fall).

But those who can’t do, can watch. And every soccer fan’s heart begins to beat mightily for the World Cup, the every-four-year phenomenon that captures the entire world’s attention (except for, unfortunately, much of the United States). As we did earlier with basketball, we at Evidence-Based Living couldn’t help asking the question: Is there evidence-based soccer? 

Leave it to a Cornell colleague to come up with just that. Christopher Anderson (former soccer player and self-proclaimed “soccer dad”) teaches in the Government Department. In his spare time, he has created soccerquantified.com, which approaches the “beautiful game” with sophisticated statistical analysis. If you want to see how techniques like logistic regression can be interestingly (and understandably) applied to something in the real world, reading Anderson’s blog is worth it for that reason alone.
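For readers curious what logistic regression actually looks like, here is a minimal, self-contained sketch in Python. The match data and the “rating difference” predictor are invented purely for illustration – this is not Anderson’s model or his dataset:

```python
import math

# Toy data: (rating_difference, won) for a handful of hypothetical matches.
matches = [(-1.5, 0), (-0.8, 0), (-0.2, 0), (0.1, 1),
           (0.7, 1), (1.4, 1), (0.3, 0), (-0.4, 1)]

# Fit intercept b0 and slope b1 by simple gradient ascent on the
# log-likelihood of the logistic model P(win) = 1 / (1 + exp(-(b0 + b1*x))).
b0, b1 = 0.0, 0.0
learning_rate = 0.1
for _ in range(5000):
    grad0 = grad1 = 0.0
    for x, y in matches:
        p = 1 / (1 + math.exp(-(b0 + b1 * x)))
        grad0 += y - p          # gradient of log-likelihood w.r.t. b0
        grad1 += (y - p) * x    # gradient w.r.t. b1
    b0 += learning_rate * grad0
    b1 += learning_rate * grad1

def win_probability(rating_diff):
    """Predicted chance of winning given a rating edge over the opponent."""
    return 1 / (1 + math.exp(-(b0 + b1 * rating_diff)))

print(f"P(win | +1.0 rating edge) = {win_probability(1.0):.2f}")
```

The fitted slope comes out positive, so a bigger rating edge maps to a higher predicted win probability – the same basic machinery, scaled up with many more predictors, underlies the kind of analysis on Anderson’s blog.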

Anderson has gathered data about past competitions, looking at such issues as each national team’s record, characteristics of the country, and home-continent advantage to predict a likely winner. He doesn’t claim certainty by any means, but the data point toward powerhouse Brazil as having the best chance. Unfortunately, based on the data, it doesn’t look like the U.S. advances out of the first round.

Like any good scientist, Anderson is clear about limitations. The overall probability of any one particular team winning is low, given that 32 countries are competing, and many factors are hard to control for (for example, the high altitude of some of the South African venues may affect play for some teams).

Anderson isn’t the only academic obsessed with soccer. Take a look at the Journal of Economic Psychology, which had a special issue on “The Economics and Psychology of Football.” Articles include analyses of referee behavior (the home-team advantage is in part due to pressure on referees from the home crowd) and of how distance from your home field affects whether you win.

So, let the games begin! And let’s see if a scientific approach works to predict the real-world outcome.

Chocolate and depression: The study vs. the media

I’m always on the lookout for good studies that are misinterpreted by the media (see here and here for examples). Why is this important? Because those of us whose profession it is to translate research findings to the public tend to get smacked upside the head by media misrepresentations. The public gets so used to dueling research findings that they become skeptical about things we are really certain about (e.g., climate change).

If you read your newspaper or watched TV in the last week or so, you may have seen media reports on the relationship between chocolate and depression. Now I love chocolate, and I’m not ashamed to admit it. I spent a year living next to Switzerland, and I can name every brand produced in that country (and I had the extra pounds to show it).

So I got concerned when I read headlines like these:

Chocolate May Cause Depression

Chocolate Leads to Depression?

Depressed? You Must Like Chocolate

It was a matter of minutes for us to find the original article in the Archives of Internal Medicine. (The abstract is free; unless you have access to a library, you have to pay for the article.)  It’s clearly written, sound research. And it absolutely does not say that chocolate leads to depression (whew!). Indeed, the authors acknowledge that the study can’t tell us that at all.

The research used a cross-sectional survey of 931 subjects from San Diego, California, asking people about both their chocolate consumption and their depressive symptoms. “Cross-sectional” means the survey takes place at a single point in time, as distinguished from a longitudinal survey, in which the same people are measured at two or more time points. Why is that important here?

Here’s why. What epidemiologists call “exposure” – that is, whatever might cause the problem (in this case, chocolate) – is measured at the same point in time as the outcome (in this case, depression). For that reason, we can’t be sure whether the chocolate preceded the depression, or the depression preceded the chocolate. They both are assessed at the same time. So we can never be sure about cause and effect from this kind of study.

Now, a longitudinal study is different. The advantage of a longitudinal study is that you can detect changes over time. In this case, you could establish depression and chocolate consumption levels at Time 1, and keep measuring them as they continued over time. For example, if some people who weren’t depressed at Time 1 started eating chocolate and became depressed at a later point, we have stronger evidence of cause and effect.
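A quick way to see the limitation of the cross-sectional design is to simulate both causal stories and notice that a single-time-point survey produces essentially the same correlation either way. This toy Python sketch uses invented effect sizes, purely for illustration:

```python
import random

random.seed(42)

def simulate(depression_causes_chocolate):
    """Generate 1000 cross-sectional (chocolate, depression) pairs under one
    causal story. The 0.5 effect size and the noise are invented."""
    data = []
    for _ in range(1000):
        if depression_causes_chocolate:
            dep = random.gauss(0, 1)
            choc = 0.5 * dep + random.gauss(0, 1)   # depression drives chocolate
        else:
            choc = random.gauss(0, 1)
            dep = 0.5 * choc + random.gauss(0, 1)   # chocolate drives depression
        data.append((choc, dep))
    return data

def correlation(pairs):
    """Plain Pearson correlation, computed by hand."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    vx = sum((x - mx) ** 2 for x, _ in pairs) / n
    vy = sum((y - my) ** 2 for _, y in pairs) / n
    return cov / (vx * vy) ** 0.5

r1 = correlation(simulate(depression_causes_chocolate=True))
r2 = correlation(simulate(depression_causes_chocolate=False))
print(f"r when depression -> chocolate: {r1:.2f}")
print(f"r when chocolate -> depression: {r2:.2f}")
# Both runs show a similar positive correlation: a snapshot survey
# cannot tell the two causal stories apart.
```

Opposite causal directions, indistinguishable snapshots – which is exactly why the study’s authors refused to claim that chocolate causes depression.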

As good scientists, the authors acknowledge this fact. They note that depression could stimulate cravings for chocolate as a mood-enhancer, that chocolate consumption could contribute to depression, or that a third factor (unknown at this point) could lead to both depression and chocolate consumption (my own pet theory: Valentine’s Day!).

In the interest of full disclosure, some of the media did get it right, like WebMD’s succinct “More Chocolate Means More Depression, or Vice Versa.” But because some media sources jump to the most “newsworthy” (some might say sensationalist) presentation, there’s no substitute for going back to the actual source.

Finally, let me say that there is only one way to really establish cause and effect: a randomized, controlled trial. One group gets chocolate, one doesn’t, and we look over time to see who gets more depressed.
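In sketch form, such a trial boils down to random assignment followed by a comparison of group outcomes. All the numbers below are hypothetical:

```python
import random

random.seed(1)

# Hypothetical trial: randomize 200 volunteers to a chocolate group or a
# control group, then compare depression scores later.
participants = list(range(200))
random.shuffle(participants)            # randomization balances hidden factors
chocolate_group = participants[:100]
control_group = participants[100:]

TRUE_EFFECT = 0.0                       # assume chocolate has no real effect

def measured_score(gets_chocolate):
    baseline = random.gauss(50, 10)     # individual variation in mood
    return baseline + (TRUE_EFFECT if gets_chocolate else 0.0)

choc_scores = [measured_score(True) for _ in chocolate_group]
ctrl_scores = [measured_score(False) for _ in control_group]

diff = sum(choc_scores) / len(choc_scores) - sum(ctrl_scores) / len(ctrl_scores)
print(f"Mean depression difference (chocolate - control): {diff:.2f}")
# With no true effect, the difference hovers near zero; because assignment
# was random, a large difference could be attributed to the chocolate itself.
```

The key move is the `random.shuffle`: because chance, not the participants’ moods or cravings, decides who gets chocolate, any sizable difference between the groups can be pinned on the treatment.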

Sign me up for the chocolate group!

Food Revolution or Evidence-Based Solutions?

I tuned in to Jamie Oliver’s Food Revolution the other night. I’m not a lover of reality shows, but, in this case, my curiosity got the best of me. For those of you who haven’t watched TV in the past few months, Food Revolution is a show that documents the antics of celebrity chef Jamie Oliver as he rides into the “fattest city in the US” and turns the population (especially the school kids) into healthy eaters. All this in slick, sensationalistic, sixty-minute segments!

As we all know, childhood obesity is taking a terrible toll on our kids. There’s no doubt that a crisis of this magnitude requires us to enact policy changes and programs aimed at addressing the problem. But do programs like Oliver’s Food Revolution really work? How do educators, concerned citizens, and policy makers know which programs will give us the best return on our investment?

John Cawley, a professor in the College of Human Ecology’s Policy Analysis and Management department, has recently published a study that addresses this question. Cawley, an economist, examined recent studies of several programs to reduce obesity, and found that CATCH (Coordinated Approach to Child Health), a multistate program that teaches elementary schoolchildren how to eat well and exercise regularly, is the most cost-effective. On the other hand, the study found that many other popular programs are not as effective and were much more costly than CATCH. Cawley’s study can be found here.

Cawley, who has served on the Institute of Medicine’s committee to prevent childhood obesity, says, “It’s a bit of a Wild West, anything-goes environment when it comes to creating anti-obesity programs and policies. With limited resources, it would be counterproductive to rush into programs that are not cost-effective and won’t provide the greatest return on investment.”

So, what does any of this have to do with Oliver’s Food Revolution? It suggests that policy makers need to look beyond the glitz when they consider which programs to invest in. It’s important to investigate which programs are “evidence-based” and which are merely entertainment. Food Revolution has not been rigorously evaluated. A preliminary study conducted by the West Virginia University Health Research Center to investigate the program suggests that the program had few positive impacts and a negative impact on meal participation and milk consumption.

In the end, as with most persistent societal challenges, the obesity epidemic is a complex problem best addressed by concerned citizens and policy makers who are committed to finding the best evidence-based solutions. And, unfortunately, it’ll probably take us longer than the sixty-minute segments of a reality TV show to fix the problem.

Exercise and health: Another media mix-up

Is there anything more aggravating than when the media take a sound research study and distort the findings just to attract attention? (Okay, this season’s American Idol and the “five-dollar footlong” jingle are probably more aggravating, but still . . .) And it’s even worse when the public may take the incorrect message and change their behavior as a result. I’m thinking we should sponsor a contest for the worst reporting (stay tuned).

So take this article from the London Daily Mail. The headline: “Fitness flop? It’s all down to the genes, say researchers.” The first line of the article carries on the same theme: “Spent hours sweating it out in the gym but don’t feel any fitter? Blame your parents.” The article was then picked up by other sources and reported as fact (for example, by Fox News). Much of the reporting seems to suggest that some people shouldn’t exercise, as the cartoon accompanying the Daily Mail article suggests.

We at Evidence-Based Living, of course, had to track down the original article and take a look (here’s the reference). Now, a lot of the article is close to unintelligible to the lay person (here’s one for you: “Target areas for the SNP selection was defined as the coding region of the gene 269 plus 20kb upstream of the 5’ end and 10 kb downstream of the 3’ end of the gene.”). However, the major finding is pretty straightforward.

One important indicator of fitness is oxygen uptake, and exercise such as running and biking can increase your ability to take in oxygen.  This is commonly referred to as “aerobic fitness.” In the study, all of the subjects (around 600) did a cycling exercise program.  On average, aerobic capacity improved around 15 percent, but for approximately 20 percent of those studied, the improvement was minimal (5 percent or less). The failure to improve was related to specific genes. The study may have practical value, because doctors may be able to tailor special programs to people who don’t respond to exercise.

All in all, a nice study. However, after seeing the extensive media coverage, your take-home could easily be: Why exercise? In fact, there is still every reason to hit the gym or track, or get on the bike several times a week. First and foremost, let’s turn it around and note that 80 percent of people DID improve aerobic capacity. What the misleading headlines and coverage don’t tell you is that for most of us, exercise works, and works well.

And even if you are in the minority, remember that the study only looked at a couple of outcomes. Exercise has multiple other benefits, from weight loss, to improving mood, to increasing flexibility, to reducing the risk of osteoporosis. The excellent evidence-based medicine blog Bandolier summarizes all the benefits of exercise concisely.

So for those of you working to promote healthy behaviors like exercise, make sure your audience knows that it’s still definitely good for them. And for everyone involved in disseminating research to the public: Let’s remember to keep a skeptical eye on media one-liners about scientific findings, especially as they relate to human health. The reality is almost always more complicated, and there’s no excuse not to go to the source of the information rather than relying only on the press.

Stop the presses! Let’s learn from journalists about communicating research

All of us who are professionally or personally interested in translating research evidence to the public struggle with how best to do it. Fortunately, we don’t need to “reinvent the wheel” – there are great web resources to help us!

Bruce Lewenstein, Professor of Science Communication at Cornell, has put together a terrific list of sites that provide nuts-and-bolts ideas and information for how to communicate with the public about science topics. These are geared in part toward journalists who make a living communicating about research, or toward scientists who are trying to communicate with the public. However, they provide information that is highly useful to extension educators and others who inform citizens about research. Prof. Lewenstein also maintains a web page with basic resources for better communication about scientific research.

Take a look — I found I got several new ideas with just a few clicks!

http://communicatingscience.aaas.org/ (produced by American Association for the Advancement of Science, includes webinars, tipsheets, etc.)

www.wfsj.org/course/en/index.html (online science journalism course, developed by World Federation of Science Journalists; primary audience is science journalists in developing countries)

http://www.scidev.net/en/science-communication/ (SciDev.net’s “Communicating Science” section, focused on science journalism for the developing world, but relevant for anyone communicating science)

http://www.pantaneto.co.uk/issue28/thomas.htm Tips for great (science) media interviews (from Patricia Thomas, Knight Chair in Health & Medical Journalism, Grady College of Journalism & Mass Communication, University of Georgia)

http://www.scienceliteracyproject.org/ Science Literacy Project (a workshop for science reporters working in public radio; some resources online, especially the “tip sheets”)

Reaching youth: Will on-line be the only way?

An almost unbelievable finding from a new Kaiser Family Foundation study: Other than time in school, the average American kid spends almost all of his or her waking time using a smart phone, computer, television, or some other electronic device. My jaw dropped to learn that kids are on these devices 7.5 hours a day (up from 6.5 hours only five years ago). And as Tamar Lewin points out in her New York Times article on the study, because young people so often do two electronic things at once (e.g., texting while watching TV), it’s actually closer to 11 hours a day.

A good question for intervention programs that work with young people is: How do we respond to an almost exclusively on-line world? It seems like we should be adapting all of the programming we do with young people to reach them where they spend most of their time. We are seeing a seismic shift in how kids spend their time, and where they get information. The challenge for youth development and risk prevention programs is to develop an on-line presence that gets kids information “where they live.” In particular, we may need to become familiar with social networking and how to use it to reach young people.

The buzz on antidepressants: A lesson from the media

Here’s a question for all of us trying to apply scientific research and disseminate evidence-based programs:  How does one move the complex findings of a particular research study into useful knowledge on which non-scientists can make decisions? And to what extent are the media friend or foe in such efforts?

We’ve been treated to a great example in a widely publicized study that came out last week. Becky Jungbauer at Scienceblogging [1. http://www.scientificblogging.com/truth_universally_acknowledged/depression_placebos_and_paxil] summarized the news coverage of this finding on the effectiveness of antidepressant medication. With millions of Americans taking antidepressants, the question of whether they work is by no means just an academic one.

The media frenzy over this study could leave a casual reader with a simple (and wrong) impression: That antidepressants don’t work. As Jungbauer notes, many people reading headlines or half-listening to the news might assume that  “antidepressants in general don’t really work unless you’re standing on a ledge.” Headlines such as these appeared: “Study: Antidepressant lift may be all in your head,” and “Placebos: Pretty Good for Depression.”

I’m not going to comment on all the details of the study, which is summarized well in the New York Times [2. http://www.nytimes.com/2010/01/06/health/views/06depress.html] and on Jungbauer’s blog. Rather, it’s a great example for those of us translating research, because it shows just what a hard job this can be. How can a lay person take findings like these, reported in the media, and use them?

Let’s begin with the article itself, “Antidepressant Drug Effects and Depression Severity: A Patient-Level Meta-Analysis,” published in the prestigious Journal of the American Medical Association. The first clue to potential problems with translating the findings lies in that subtitle: What the heck is a meta-analysis?

Most people probably assumed that the authors themselves did a study and reported the results. That isn’t the case – they used statistical methods to merge the findings of six previous studies done by various research groups. The results of their analysis are entirely dependent on the quality of the studies reviewed.
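At its core, the statistical merging that a meta-analysis performs is an inverse-variance weighted average: more precise studies count for more. Here is a toy sketch with invented effect sizes, not the JAMA paper’s data:

```python
# Each hypothetical study reports an effect size and its standard error.
# These three rows are made up for illustration.
studies = [
    (0.30, 0.10),   # a large, precise study
    (0.10, 0.20),   # a small, noisy study
    (0.25, 0.15),   # something in between
]

# Fixed-effect pooling: weight each study by 1 / variance.
weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * effect for (effect, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"Pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")
```

Notice that the pooled standard error is smaller than any single study’s – combining studies buys precision. But the pooled answer is only as good as the studies fed into it, which is exactly the caveat raised above.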

What if a reader decided to go to JAMA itself, to read the article? I can imagine a depressed acquaintance of mine (now worried whether she should be taking antidepressants) trying to make sense of statements like: “Mean intake severity did not differ as a function of treatment condition (F1,711=0.05, P=.82), but the 6 studies did show different mean intake severity levels, reflecting differences in inclusion criteria (F5,711=79.56, P<.001).” How many concerned readers can explain what “a randomized, double-blind, placebo-controlled trial” is?

It would take some time and effort to see that the study doesn’t say the antidepressants don’t work – they did. They just didn’t work better than a placebo (a sugar pill).

And it would take even more careful reading to note that the drug used in three of the six studies is no longer widely used, and that people in the placebo groups typically received some form of counseling and interaction (even if not formal therapy), which might have had an antidepressant effect. And few people have easy access to JAMA to begin with, even if they wanted to read the article.

The moral of this story is how hard we need to work to make sure people understand the science behind any generalization. Is there some way we could ensure that “ordinary” citizens obtain a minimal level of scientific literacy, so they can discern what they should be influenced by and what they should ignore? Might words like “meta-analysis” and “randomized controlled trial” need to become generally understood terms?

I am an ignoramus about car repair. You wouldn’t find me looking under the hood of my car unless I suspected a small animal was trapped inside. But I have learned enough of the language so I can communicate with my mechanic and evaluate in an elementary way what he plans to do to my car. That’s what we need with science.
