Youth development that works: Positive findings on the 4-H program

On EBL, we’ve talked a lot about research evidence on problems affecting adolescents, from alcohol use, to video games, to social networking, to sex (I can hear teens who read this starting to hum the tune to “These Are a Few of My Favorite Things”…).

One might ask: Okay, what about the positive side? We’re glad to say that there’s very encouraging news about a program that really works.

One of the most popular and extensive youth programs in the United States is 4-H, with more than 6.5 million members across the country. The program offers activities for kids ages 5 to 19, who are organized into 90,000 4-H clubs. Throughout its history, 4-H has promoted leadership skills, good citizenship, and life skills development. In recent years, it has branched into health promotion programs (like obesity prevention and fostering physical activity) and science, engineering, and technology (“SET”) programming.

All this sounds great, right? But as EBL readers know, we look for the evidence. And now we have it, thanks to the ground-breaking work of Prof. Richard Lerner of Tufts University, one of the country’s leading experts on youth development. Lerner and colleagues expected that the mentoring from adult leaders and the structured learning that goes on in clubs might lead to a number of desirable outcomes for children. This led them to conduct a longitudinal, controlled evaluation of the impact of being a 4-H member. Beginning in 2002, they have surveyed over 6,400 teens across the U.S.

The researchers have issued a major new report, looking at the findings on youth outcomes over nearly a decade. To really dig into the results, you should read the very accessible report. Among the many findings, according to the report summary, 4-H participants:

  • Have higher educational achievement and motivation for future education
  • Are more civically active and make more civic contributions to their communities
  • Are less likely to have sexual intercourse by Grade 10
  • Are 56% more likely to spend more hours exercising or being physically active
  • Have significantly lower drug, alcohol, and cigarette use than their peers
  • Report better grades, higher levels of academic competence, and an elevated level of engagement at school
  • Are nearly two times more likely to plan to go to college
  • Are more likely to pursue future courses or a career in science, engineering, or computer technology

All in all, very impressive findings. So let’s join in the 4-H pledge (can you find the four “H”s?):

I pledge my head to clearer thinking,
my heart to greater loyalty,
my hands to larger service
and my health to better living,
for my club, my community, my country, and my world.

A pretty good approach to living, and one that seems to work for millions of children and teens!

Randomized, controlled designs: The “gold standard” for knowing what works

You’re having trouble sleeping one night, so you finally give up and turn on the TV. It’s 2 AM, so instead of actual programs, much of what you get are infomercials. As you flip through these slick “infotainment” shows, you hear enthusiastic claims about the effectiveness of diet pills, exercise equipment, and a multitude of other products.

You will soon see that almost every commercial uses case studies and testimony of individuals for whom the product has supposedly worked. “I lost 50 pounds,” exults a woman who looks like a swimsuit model. “I got ripped abs in 30 days,” crows a man who, well, also looks like a swimsuit model.

The problem is that this kind of case study and individual testimony is essentially worthless for deciding whether a product or program works, mainly because it’s very hard to disprove case study evidence. Look at the infomercials – they seem to have worked for some people, but what about all the people who failed? And how do we know that the people who lost weight, for example, wouldn’t have done so without buying the product?

So case studies and testimonials aren’t worth much because they don’t give us the kind of comparative information needed to rule out alternative explanations.

To the rescue come experiments using randomized, controlled designs (RCDs). Such experiments are rightly called the “gold standard” for knowing whether a treatment will work. In an RCD, we create a test so that one explanation necessarily disconfirms the other explanation. Think of it like a football game. Both teams can’t win, and one eventually beats the other. It’s the same with science: our knowledge can only progress if one explanation can knock out another explanation.

The main weapon in our search for truth is the control group design. Using control groups, we test a product or program (called the “treatment”) against a group that doesn’t get whatever the treatment is.

Case studies simply don’t have the comparative information needed to prove that a particular treatment is better than another one, or better than just doing nothing. And that’s important because of the “placebo effect.” It turns out that people tend to report that a treatment has helped them, whether or not any actual therapy is delivered. In medicine, placebo effects are very strong, and in some cases (like drugs for depression) the placebos have occasionally been found to work more effectively than the drugs.

So what is a randomized, controlled design? There are four components of RCDs:

1. There is a treatment to be studied (like a program, a drug, or a medical procedure).

2. There is a control condition. Sometimes, this is a group that doesn’t get any treatment at all. Often it is a group that gets some other treatment, but of a different kind or in a smaller amount.

3. Now here’s the key point: The participants must be randomly assigned to treatment or control groups. It is critical that nobody – not the researchers, not the people in the experiment – can participate in the decision about which group people fall into. Some kind of randomization procedure is used to put people into groups – flipping a coin, using a computer, or some other method (see the brief sketch after this list). This is the only way we can make sure that the people who get the intervention will be similar to those who do not.

4. There must be carefully defined outcome measures, and they must be measured before and after the treatment occurs.
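
To make these four components concrete, here is a minimal, purely illustrative Python sketch. The numbers, the size of the “program effect,” and the function name are all hypothetical; it simply shows random assignment plus before-and-after measurement in a treatment group and a control group.

```python
import random
import statistics

# Purely illustrative sketch of the four RCD components described above.
# The outcome scores and "program effect" are made up for the example.

def simulate_rcd(n_participants=200, seed=42):
    rng = random.Random(seed)
    participants = []

    for _ in range(n_participants):
        baseline = rng.gauss(50, 10)                 # 4. outcome measured BEFORE treatment
        group = rng.choice(["treatment", "control"])  # 3. random assignment (a "coin flip")
        effect = 5 if group == "treatment" else 0     # 1. & 2. treatment vs. control condition
        followup = baseline + effect + rng.gauss(0, 5)  # 4. outcome measured AFTER treatment
        participants.append((group, baseline, followup))

    # Compare average change in each group, not just the people who "got better."
    def mean_change(group_name):
        changes = [after - before for g, before, after in participants if g == group_name]
        return statistics.mean(changes)

    return mean_change("treatment"), mean_change("control")

if __name__ == "__main__":
    treated, control = simulate_rcd()
    print(f"Average change, treatment group: {treated:.1f}")
    print(f"Average change, control group:   {control:.1f}")
```

The point of the sketch is the comparison in the last step: without the control group’s average change, a gain in the treatment group tells us very little.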

Lots of the bogus claims you see on TV and elsewhere look only at people who used the product. Without the control group, however, we can’t know if the participants would have gotten better with no treatment at all, or with some other treatment.

Catherine Greeno, in an excellent article on this topic, sums up why we need to do RCDs if we want to understand if something really does or doesn’t work. She puts it this way:

  • We study a treatment compared to a control group because people may get better on their own.
  • We randomly assign to avoid the problem of giving worse off people the new treatment because we think they need it more.
  • We measure before and after the treatment so that we have measured change with certainty, instead of relying on impressions or memories.

So when you are wondering whether a therapy, treatment, exercise program, product, or anything else is likely to work, keep those three little words in mind: Randomized, Controlled Design!

Fairs and exhibitions: We love them in Cooperative Extension, but is there evidence?

I have always loved the county fair. So much so – yes, I’m going to admit it – that on the day after my wedding many years ago we took the entire gang to the Lorain County Fair in Ohio. Cooperative Extension staff and participants also enjoy county and state fairs – and devote considerable time throughout the year to planning and preparing for them.

So my interest was piqued by the title of a recent article in the Journal of Extension: “Fairs and other Exhibitions: Have We Really Thought this Through?” Author Donald Nicholson points out that the investment of effort and time by staff makes fairs the largest single program in Cooperative Extension. As he puts it: “There is no doubt that ‘The Fair’ is deeply woven into the very DNA of Extension.” But that, he argues, keeps us from evaluating what we get from all the investment.

Nicholson asks: What do we actually know about whether participation in fairs promotes youth development? The answer, surprisingly, is that there is almost no scientific evidence of any kind on this topic. He notes that the public and commodity groups heavily support the fair, and that it is extremely popular among extension staff and program participants. But he poses a set of thought-provoking questions for state and county extension programs to consider:

  • What is the 5-year or 10-year goal regarding Extension’s role, goals, and mission for fairs in your local or state Extension program?
  • What is the research agenda and intention regarding fairs in your state?
  • Are the procedures used and the time invested by Extension truly guided by research-based information?
  • Could the same educational content be more effectively delivered in other ways with a similar or lesser investment of time and resources?

What is lacking is any solid research evidence regarding the benefits of fair activities to youth participants. Nicholson was able to identify only two pilot studies that addressed this issue. Given that fairs are probably the single biggest investment of extension, he argues for the development of a knowledge base on what youth development outcomes are achieved by fairs.

Very thought-provoking – as are the comments that follow the article. Some commenters agree that the effectiveness of fairs in extension deserves a closer examination, whereas others argue for the economic and public-relations benefits of the fair, regardless of scientifically assessed outcomes.

Leave us a comment with your thoughts on this topic!

New York continues PROSPER Partnership to prevent substance abuse

We heard some exciting news at EBL this week!  New York families will soon have more access to evidence-based programs that prevent substance abuse among middle school students and their families.

You might remember that we wrote about PROSPER Partnerships – which stands for PROmoting School-community-university Partnerships to Enhance Resilience – as an ideal model for implementing substance abuse prevention programming based on real evidence. The program links Cooperative Extension, public schools, and local communities to choose proven programs that serve the needs of individual communities.

Last month, New York was chosen as one of five states that will continue the process of forming a PROSPER Partnership, with Cornell serving as the university partner.

The goal is for New York to become a full PROSPER State Partnership by August of this year.

Kim Kopko, Extension Associate in the Department of Policy Analysis and Management and New York State Liaison for PROSPER, is excited to continue with the program.

“This is indeed a very positive development and an exciting opportunity to utilize the Cornell Cooperative Extension System to bring evidence-based programming to families and communities in New York,” she said.

As you might expect, PROSPER uses plenty of evidence to determine if a state is ready to enter a full partnership. PROSPER staff collected and analyzed data from a state survey, in-depth interviews with Cooperative Extension staff and partnering agencies, and information garnered from various activities in New York.

PROSPER also has plenty of evidence that its system yields results. PROSPER programs typically recruit 17 percent of eligible families in their communities, compared to less than six percent for other community programs.

Students who participate in the program are better at problem solving, more likely to refuse offers of alcohol and other drugs, less likely to believe that substance use has positive effects, and more likely to delay initiation of substance use. And each $1 invested in the program yields about $9.60 in savings.

All of that is great news for New York families, who will soon have even greater access to evidence-based programming.

How to convince volunteers to care for trees

The evidence shows that trees are an important part of our landscape – whether here in forested Ithaca, or in densely populated urban areas.

Studies have found that trees help improve focus, promote a sense of community, and deter crime. So it’s no surprise that major cities across the nation are launching initiatives to plant trees. New York City is undertaking one such project.  Called the MillionTreesNYC initiative, it aims to plant one million trees across all five city boroughs by 2017.

But urban forestry projects typically encounter a problem, explained Gretchen Ferenz, a senior extension associate at Cornell Cooperative Extension in New York City.

“Capital project funds will support planting and immediate care of trees for a couple of years, but costs for longer term care to ensure a young tree’s growth often are not included in municipal budgets,” she told the Cornell Chronicle. “As a result, many urban trees do not survive into maturity.”

Ferenz’s office has joined forces with Cornell’s Department of Natural Resources to create the Urban Forestry Community Engagement Model, a program that provides workshops about the importance of trees to community members in two New York City neighborhoods. The goal is to enlist residents and organizations to become stewards of their community’s trees and, ultimately, to develop resources to help groups around the country do the same.

As part of the program, they’re collecting evidence to learn how to get more community members involved in caring for trees in their neighborhoods. They recently published a study that examines motivations and recruitment strategies for urban forestry volunteers.

Through a survey and focus groups, as well as a review of existing literature on the topic, the team found that volunteers who plant and care for trees in their communities are motivated by a wide range of factors. And most have a limited knowledge of the benefits of urban forests.

This type of work is an important first step in helping cities learn how to engage community members to help care for trees in their neighborhoods – and ultimately in making our world a bit greener.

(You can learn more about the Urban Forestry Community Engagement Model by clicking here.)

How do I know if a program works? A “CAREful” approach

I was recently giving a talk on intervention research and I was asked: “How do I tell whether the evidence for a particular program is good or not?” I often talk with practitioners in various fields who are struggling with exactly what “evidence-based” means. They will read “evidence” about a program that relies only on whether participants liked it, or they will see an article in the media that recommends a treatment based on a single study. What should you look for when you are deciding: Is the evidence on this program good or not?

I came across a very helpful way of thinking about this issue in the work of educational psychologist Joel R. Levin. He developed the acronym “CAREful research,” which sums up what needs to be done when drawing conclusions from intervention research.

In Levin’s “CAREful” scheme, he identifies four basic components of sound intervention studies.

Comparison – choosing the right comparison group for the test of the intervention. Usually, there needs to be a group that does not receive the program being studied, so one can see whether the program works relative to that group. A program description should explain how the comparison was done and why it is appropriate.

Again and again – The intervention program needs to be replicated across multiple studies; one positive finding isn’t enough.

Relationship – There has to be a relationship between the intervention and the outcome. That is, the intervention has to affect the outcome variables. That may seem simple, but it’s important; the program has to have a positive effect on important outcomes, or why should you use it?

Eliminate – The other possible explanations for an effect have to be eliminated, usually through random assignment to experimental and control groups and sound statistical analysis.

Levin and colleagues sum up the CAREful scheme:

“If an appropriate Comparison reveals Again and again evidence of a direct Relationship between an intervention and a specified outcome, while Eliminating all other competing explanations for the outcome, then the research yields scientifically convincing evidence of the intervention’s effectiveness.”
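
Purely as an illustration (this is not part of Levin’s work), here is one way to picture the four criteria as a simple checklist in a short Python sketch. The structure and field names are hypothetical, and real appraisal of evidence involves judgment rather than yes/no boxes.

```python
# Illustrative checklist based on Levin's CAREful criteria (hypothetical structure).
from dataclasses import dataclass

@dataclass
class StudyEvidence:
    has_appropriate_comparison: bool   # C: a suitable comparison/control group
    replicated_again_and_again: bool   # A: positive findings across multiple studies
    shows_relationship: bool           # R: the intervention affects the outcomes
    eliminates_alternatives: bool      # E: rival explanations ruled out (e.g., random assignment)

    def is_careful(self) -> bool:
        # Scientifically convincing only if all four criteria are met.
        return all([
            self.has_appropriate_comparison,
            self.replicated_again_and_again,
            self.shows_relationship,
            self.eliminates_alternatives,
        ])

# Example: one positive study with no control group fails the checklist.
single_study = StudyEvidence(False, False, True, False)
print(single_study.is_careful())  # False
```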

To see a good example of an evidence-based approach to intervention that reflects this kind of CAREful research, take a look at the PROSPER program, which takes a similar approach to youth development programs.

So when you are looking at intervention programs, “Be CAREful”: Applying these four criteria for good research can help you decide what works and what doesn’t.

Financial education: Behavior change is possible

One-third of U.S. adults report that they have no savings. More than a quarter of them admit to not paying their bills on time. And more than half of American households don’t have a budget.

Given these figures, it’s not surprising that more than 40 percent of U.S. adults would give themselves a grade of C, D, or F for their personal finance knowledge. These figures come from the 2009 Consumer Financial Literacy Survey by Harris Interactive, which surveyed more than 1,000 U.S. adults last year.

Given this dim view of personal finance in our nation, it’s clear that many households would benefit from programs that provide financial education.  But do these programs actually help families improve their financial situations?

A new study reveals the answer is yes. Two researchers from the University of Wisconsin-Madison pulled together evaluations from 41 financial education and counseling programs in a systematic review. Their article is published in the Fall 2010 issue of the Journal of Consumer Affairs. They used a research process called a qualitative systematic literature review to summarize evaluations that measured financial education and counseling’s impacts on financial knowledge and behavior.

The majority of studies cited in their review conclude financial education and counseling are beneficial and hold the promise of improving financial knowledge and facilitating behavior change. But the study also notes that many of these evaluations share methodological weaknesses including selection bias and measurement issues.  Many of the programs also do not utilize an explicit theory or framework for behavior change, which would lend precision to both program development and the measurement of program impacts, the authors wrote.

They encourage researchers and educators who run these programs to pay more attention to theory-based evaluations, and suggest that investing in randomized field experiments may be fruitful.

Here at Cornell Cooperative Extension, we offer classes to help families develop a household spending plan, save energy and reduce their energy bills, and use credit wisely through a program called EmPower New York. The free workshops are offered in 46 counties and sponsored by the New York State Energy Research and Development Authority (NYSERDA).

EmPower is doing its part to collect viable data on the program’s effectiveness. This year, they’re conducting phone surveys with the Survey Research Institute at Cornell to determine the extent of behavior change for those who’ve participated in the workshops. They’re expecting results sometime in June.

PROSPER helps prevent substance abuse

Growing numbers of youth are experimenting with alcohol and drugs at younger ages. Nearly a quarter of teens report having had five or more alcoholic drinks in one day, according to data from the U.S. Centers for Disease Control and Prevention. More than one-third have used marijuana.

There are hundreds of programs available across the country to help dissuade teens from going down the path of substance abuse. But what works?

PROSPER Partnerships – which stands for PROmoting School-community-university Partnerships to Enhance Resilience – is a model that links Cooperative Extension, public schools, local communities and university researchers to introduce evidence-based programs that prevent substance abuse among middle school students and their families.

There are already PROSPER networks in Iowa, Pennsylvania and Alabama.  New York is lucky enough to be one of seven additional states in the process of forming a PROSPER Partnership with Cornell serving as the university partner.

That’s an exciting prospect for communities in New York. It means that families will have access to a menu of programs proven to work.

“There are many family and youth programs that are research-based, but that is not the same as having strong evidence behind them that the programs actually work,” explained Kim Kopko, an extension associate in Cornell’s Department of Policy Analysis and Management who is leading the PROSPER team. “The programs on the PROSPER menu are evidence-based. They are carefully implemented and tested on the ground level. They’re time intensive and expensive, but they work.”

There are five elements that make the program successful:

  • A state-level partnership based in the land grant university system that is connected to the National PROSPER Network.
  • Strategic community teams led by a local extension educator, a key school district employee (typically a guidance counselor), and a variety of representatives from the community.
  • Every community team oversees the implementation of one family and one school program that they choose.
  • Community teams must move through a multi-phased developmental process focusing on long-term sustainability.
  • State partners provide on-going evaluation to ensure the program remains successful.

PROSPER has plenty of evidence that its system yields results. PROSPER programs typically recruit 17 percent of eligible families in their communities, compared to less than six percent for other community programs.

Students who participate in the program are better at problem solving, more likely to refuse offers of alcohol and other drugs, less likely to believe that substance use has positive effects, and more likely to delay initiation of substance use. And each $1 invested in the program yields about $9.60 in savings.

That’s good evidence-based practice at work, and a model that even more states should try to adopt.

Survey says…parent education in NY is working!

Cooperative Extension in New York offers parents and caregivers a variety of programs designed to promote positive parenting and healthy child development. But are these programs making a difference?

A new analysis by Cornell faculty members suggests they are. Researchers surveyed more than 400 people who participated in parent education classes in nine New York counties. The classes each included at least six hours of instruction.

Before and after the courses, participants were asked ten questions about parenting attitudes, behaviors, and knowledge designed to capture some of what was taught in the class.

Participants showed significant improvements on eight of the ten questions, including:

  • making rules that take their child’s needs into consideration.
  • decreases in how often they yell at their child.
  • decreases in the number of hours their children spend watching television.
  • increased patience with their children.
  • increased time spent reading with their child.
  • increased use of explanations for rules they make.
  • increased feelings of support.
  • increased confidence in having the skills necessary to be a good caregiver.

The results suggest that these parent education courses are having a positive impact on their participants.  You can learn more about the programs at http://www.parenting.cit.cornell.edu.

EFNEP: An evidence-based approach to nutrition education

Across the state of New York, there are thousands of families who worry about their next meal, or whether there will be enough money for a weekly grocery-shopping trip.

For the past 41 years, the Expanded Food and Nutrition Education Program has been using an evidence-based approach to help those families improve their nutrition and use their resources more wisely. Cornell faculty in the Division of Nutritional Sciences provide leadership for the program, and work with Cooperative Extension educators across the state to conduct research and translate those findings into programming that helps families with children.

The program involves hands-on, experiential learning taught to individuals or small groups by a member of their own community.  Local educators undergo extensive training designed by Cornell Nutrition faculty members.

“Many of our community educators haven’t had formal training before, because the focus in recruiting is to hire local community members who can relate to the course participants,” said Joan Doyle Paddock, senior extension associate in Nutritional Sciences. “We’ve developed and rolled out a 19-session training program that provides them with the latest information on nutrition and the most effective educational methods, both based on scientific research.”

An evaluation conducted a few years ago found that participants taught individually, rather than in group settings, reported greater improvements in nutrition behaviors. Based on this, the 19-session training was developed to improve group facilitation skills, and the result has been improved outcomes for all types of participants.

“We’re not interested in just teaching people information; we’re looking for them to make behavioral changes in their lives,” she said. “Over the years, we’ve collected a lot of evidence that shows this program really does make a difference.”

Among the most important findings is this: EFNEP is yielding results for New York families. A 2008 study found that every $1 spent on the program reaps $10 in health benefits. Jamie Dollahite, associate professor of nutritional sciences, conducted the study, which looked at the costs and benefits of the program for 5,730 low-income adults who “graduated” from New York’s program.

“Cost-effectiveness was estimated to be as great as for many current health interventions, such as lifestyle changes to prevent diabetes,” Dollahite explained.

Other findings include:

  • The program is most successful when the community educators believe in the value of the program and feel they are making a difference in the lives of EFNEP families.
  • Nutrition education increases food security for low-income families, and there is a dose-response relationship between the number of lessons received and increases in food security.
  • Nutrition educators are motivated by perceptions of EFNEP’s value to families, relationships with their coworkers, and having a voice in decisions at work.

What is translational research?

Today, we’re talking with Elaine Wethington, associate professor in the Departments of Human Development and Sociology at Cornell. Wethington is a medical sociologist and an expert in the areas of stress and social support systems. She’s also one of the nation’s leading experts in translational research methods.

Cornell’s College of Human Ecology is pursuing a translational research model to better link social and behavioral science research to extension and outreach, creating a more seamless link between science and service. But the question arises: What is “translational research?”

Evidence-Based Living sat down with Wethington to talk about the growing field of translational research.

To start off, what exactly is translational research?

Many definitions have been given for translational research, but the definition I like best is that it is a systematic effort to convert basic research knowledge into practical applications to enhance human health and well being. 

Translational research was designed for the medical world.  It emerged in response to concern over the long time lag between scientific discoveries and changes in treatments, practices, and health policies that incorporate the new discoveries.

What is applied research, and how does it differ?

Translational research is broader than the traditional term “applied research.”  Applied research is any research that may possibly be useful for enhancing health or well-being. It does not necessarily have to have any effort connected with it to take the research to a practical level. 

For example, an applied research study might analyze longitudinal data that tracks participants’ health and social relationships.  The researchers would report their findings in an academic journal.

But in translational research, the same study would include some “action steps.”  The researchers would partner with a community and ask for ideas about how their findings might apply there.  Together, they would come up with an intervention plan that would also include scientific evaluation of its effectiveness. 

Why are social science researchers slower to adopt these models compared to the medical community?

I think the answer to this question is that researchers have followed where the money has been allocated. The opportunities for social and behavioral scientists have not been established as rapidly.

More recently, three major government institutions have been funding projects that emphasize public health outreach using translational research – the Centers for Disease Control, the National Institutes of Health and the National Institute on Aging.  All three have been establishing translational research centers across the country, primarily focused on underserved communities and health disparities.

Thus, social scientists are only now being encouraged to take part. More recently, economic stimulus funds dispersed by the National Institutes of Health have funded a number of translational research projects headed by social scientists, including three at Cornell. I predict that soon there will be social scientists engaged in translational research across the country, not just at funded centers.

What are the benefits of moving toward translational research?

For researchers, there is benefit to being affiliated with a center that provides seed funding for projects, methodological assistance, advice on developing proposals and experience in getting community input into research projects.

For universities, translational research centers provide a tactical advantage for attracting more funding.  Translational research centers also provide a way for universities to meet public service goals in their strategic plans.

For communities, translational research provides opportunities to make a difference in their own communities.  As part of one of the Cornell centers, we engaged public service agency directors in events where they could contribute to our research agenda.  With a stake in the research, communities feel that they are making a valued and important contribution.  We heard over and over from the community members that this was a real source of pride and accomplishment for them.

How can extension programs participate?

One way local extension programs can participate in translational research is to take part in community stakeholder groups that meet with researchers who are designing intervention and prevention research programs.  Typically, a wide variety of stakeholders need to be engaged.  County Cooperative Extension offices have many collaborative relationships in their counties and can work with researchers to make contacts.

Typically, local extension professionals do not have time to engage in research themselves.  Yet they have valuable experience that can be shared.  This makes Cooperative Extension an ideal contributor for implementing programs.

Video feature: Science education outreach

Researchers and scientists across the country are making new discoveries every day, but they continually must find the best ways to share that knowledge with the public. The Cornell Center for Materials Research sets an outstanding example of how to accomplish this.

The center’s mission is to advance, explore and exploit the science and engineering of advanced materials. It is part of a national network of materials research centers that encompasses 29 centers funded by the National Science Foundation.

Nev Singhota is the director of the center’s Educational Programs Office, which reaches out to thousands of students, hundreds of parents and teachers, and many undergraduates from across the country. Many Cornell faculty, post-doctoral fellows, graduate and undergraduate students contribute to the center’s outreach efforts by visiting schools, hosting family and teacher workshops and coordinating an “Ask the Scientist” column in the local newspaper.

Singhota describes her role as facilitating interactions between Cornell scientists and all sorts of people in the community. “We create this web,” she said. “We’re like the spider who is trying to connect everyone together.”

Interested in hearing more about Cornell Center for Materials Research’s outreach efforts?  Check out this conversation with Singhota:
