Learning to Rise through Research and Evaluation

Mary Rogers, MEd
March 2018

The year was 1995. It was early September, and I was just back as camp director at Sherwood Forest. Our summer programs then were 12-day sessions — two for boys and two for girls referred by schools and organizations in St. Louis. Most of the kids were from underserved communities and low-income families. Despite the long relationships between Sherwood Forest and our referral partners, it was always a scramble in the spring to fill these four sessions. Once our campers arrived at camp, our program’s design looked a lot like the program I had been part of as a child back in the 1960s — the same group-oriented structure, progression in leadership training for teens, and staffing patterns.

But that September also found me back from graduate school. I had taken a year’s leave of absence to study theories of development for youth at risk and earn my master’s degree at the Harvard Graduate School of Education. In those first days back from my studies, I knew two things: I had far more questions about child and adolescent development than I had answers, and I had just spent a year in a research institution.

Research. The word makes many people nervous for lots of reasons. It is too hard to do, too hard to understand, too hard to explain. Research is what other people do; I am a practitioner. Or so I thought. But that year of studying in a research institution gave me an inkling about the value of research, and so, with more trepidation than confidence, when a research project called the National Inclusive Camp Practices Project (Brannon, Fullerton, Arick, Robb, & Bender, 2003) knocked on Sherwood Forest’s door, I encouraged everyone to agree to participate, and we did.

The American Camp Association (ACA) had big research plans. In 2002 and 2003, ACA launched a large-scale national study of camper outcomes (American Camping Association, Inc., 2005). When Sherwood Forest was randomly selected as one of the ACA-accredited camps to participate, we again agreed to be part of the study. Opportunities for two additional research projects that included evaluation work followed. Sherwood Forest said yes to both. We learned a lot along the way, and some of what we learned challenged our preconceived understanding of what made Sherwood Forest strong.

Learning from Data and Learning to Listen

The second ACA research project we participated in was a study of how camps measured up in a broader youth development context (Inspirations; American Camping Association, Inc., 2006). ACA partnered with Youth Development Strategies, Inc. (YDSI) in this study. YDSI had developed a community framework for understanding youth development (Gambone, Klem, & Connell, 2002). Participating camps surveyed their campers to learn how campers thought their camp experiences provided critical developmental supports and opportunities, such as supportive relationships, challenging and engaging activities, meaningful involvement, and safety. Before asking campers to complete their surveys, I remember thinking to myself that we would probably do pretty well on safety. After 9/11, our staff had worked hard to create an opening day procedure designed to help campers feel safe in their new camp home. So imagine my surprise when we discovered that whatever efforts our staff had put into place did not translate into a greater sense of safety for our campers. Our kids’ sense of safety at camp was not good enough.

To help us better understand what campers might have been thinking when they completed their surveys, we decided to ask campers to join focus groups. Our campers gave us lots of food for thought. One of the most important things they told us was, “You should just tell us about it.” In other words, if we wanted campers to feel safer at camp as an outcome of our first day procedures, we should explain to them why we had created these particular procedures. We now tell kids why for lots of things we do at camp, and we encourage them to ask us why if they don’t understand something.

The third phase of ACA’s research focused on program improvement (Innovations; American Camping Association, Inc., 2006). Early in this phase, we worked to give campers more individual choices. The summer we rolled out these expanded options, activities independent of cabin groups appeared to be going well. At the end of the session, when we asked campers if they had enjoyed choosing their own activities, we were not surprised to hear that they really liked having options. What did surprise us was that they didn’t want more individual activities. So the next year we offered only a small expansion of opportunities for campers to choose their own activities. And when we asked campers at the end of that summer if they wanted even more choices, the answer was a resounding, “Yes!”

Key Drivers of Transformation

Research and evaluation became the key drivers in the transformation of our programs from then to now. How did these transformations come about? These are a few of the principles that we have been using to guide us along the way:

  • Help everyone understand that these two things can be true at the same time: Our programs are good and we can make them better.
  • Commit to the process of learning from data.
  • Create a culture of asking questions, including asking for help.
  • Get campers, parents, staff, and board members to buy in to the process.
  • Ask campers to complete their surveys honestly, because their honesty will help their camp, and camps for all kids, learn.
  • Learn to look for the weak points in the data.
  • When you don’t understand the data, ask your campers to share their perspectives.
  • And finally, popsicles on a hot summer day are a great reward for completing surveys!

Measurements Tailored to Our Work

ACA’s development of the Youth Outcomes Battery (YOB) over the last few years, funded in part by ACA’s Not-for-Profit Council, has provided Sherwood Forest and many other camps and youth development organizations with a set of outcome measures tailored to our work in the camp setting (American Camp Association, 2011).

Beginning with the field tests of some of these YOB outcome measurement tools, campers at Sherwood Forest have been helping us learn from their honest feedback. The data from these surveys tell us how well campers perceive that their camp experiences help them achieve desired outcomes.

Sherwood Forest’s programs are organized by grade level. For each grade level during the summer of 2018, we have chosen to focus on one developmentally appropriate YOB outcome that aligns with the grade level program focus. In addition, for most grades, an additional program focus will align around an academic subject. For instance, our third-graders’ program YOB focus is Young Camper Learning, and the academic focus is on positive reading attitudes. We use the Elementary Reading Attitude Survey to measure changes in campers’ attitudes about reading for fun and reading in school (McKenna & Kear, 1990).

At the fourth grade level we are interested in how campers are feeling as members of the camp community, and we are introducing nature-based STEM activities. The YOB outcome is Camp Connectedness and a STEM evaluation tool is being developed. Our fifth-graders continue in an academic focus on STEM, and the YOB outcome is Independence.

Campers enter the first of four years of leadership training at sixth grade. The first two years of the program offer a wilderness intensive and/or an arts intensive experience. These sixth- and seventh-graders also participate in St. Louis-based Wyman Center’s Teen Outreach Program, which is the school-related program focus. The YOB for sixth-graders is Interest in Exploration, and for seventh-graders it is Perceived Competence.

The second two years of our leadership training program focus on civic engagement, being ready for high school, and thinking about what post-secondary opportunities will be the best fit for campers’ goals and aspirations. Both eighth- and ninth-graders plan and undertake “big trips” away from camp. These trips include community service projects; visits to places of historic, cultural, or civic importance; college campus visits; and sometimes visits to other camps. The eighth-grade big trip is in Missouri or Illinois, and the ninth-grade big trip is to Washington, DC. In these two summers campers have only one grade level YOB outcome: Responsibility for eighth-graders and Problem-Solving Confidence for ninth-graders.

We will also measure one specific YOB outcome every summer, from third grade through ninth grade: Affinity for Nature, the overarching theme in our programs. Our kids live, play, and learn outside, and we are interested in understanding how these cumulative experiences impact their attitudes toward the natural environment.

Since we began measuring outcomes using ACA’s YOB, we have learned from our experiences that some things worked and some things didn’t.

In previous years, we have used three YOB measures for each grade level from fourth through ninth grades:

  • Fourth and fifth grades — Affinity for Nature, Independence, and Responsibility.
  • Sixth and seventh grades — Teamwork, Independence, and Interest in Exploration
  • Eighth and ninth grades — Responsibility, Perceived Competence, and Problem-Solving Confidence

Reorganizing Our Evaluation Strategy

For several years, measuring these outcomes at these grade levels yielded interesting results. However, over the last two years we noticed that our campers’ scores had plateaued. As a result, this year we reorganized our evaluation strategy after completing a comprehensive strategic plan for all our programs. For each grade level we selected a different YOB measure that specifically addresses a large goal of that program and paired it with Affinity for Nature, which, again, is the theme we want to connect to every level of programming. This will be the first year with this new set of data, and we are excited to see what we might learn.

We have found that the detailed version of the YOB, its retrospective pre- and post-test format, is the tool that helps our funders in particular understand the outcomes of our programs. They want to see a baseline and a change, and this tool does a good job describing both.
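To illustrate how a retrospective pre- and post-test instrument yields both a baseline and a change, here is a minimal sketch. The camper names, item counts, and scores are hypothetical, not actual YOB items; the point is only the arithmetic of turning paired responses into a change score:

```python
# Hypothetical retrospective pre/post responses for one YOB-style scale.
# At the end of the session, each camper answers every item twice:
# how they were "before camp" (retrospective pre) and how they are "now" (post).

campers = {
    "camper_01": {"pre": [2, 3, 2], "post": [4, 4, 3]},
    "camper_02": {"pre": [3, 3, 4], "post": [3, 4, 4]},
    "camper_03": {"pre": [1, 2, 2], "post": [3, 3, 4]},
}

def scale_mean(scores):
    """Average the item responses into a single scale score."""
    return sum(scores) / len(scores)

for name, answers in campers.items():
    pre = scale_mean(answers["pre"])    # the baseline funders want to see
    post = scale_mean(answers["post"])  # the end-of-session score
    print(f"{name}: baseline {pre:.2f}, now {post:.2f}, change {post - pre:+.2f}")
```

Because both answers come from the same survey sitting, campers rate "before" and "now" from the same frame of reference, which is part of why this format communicates change so clearly to funders.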

But the detailed version of the YOB can be difficult to explain to younger kids, so it wasn’t the best fit for our third- through fifth-graders. This year, the third- and fourth-graders will be using the basic format of their YOB outcome. The third grade YOB outcome tool is the Young Camper Learning survey, and it only exists in the basic format. We will use the basic format of the Camp Connectedness YOB outcome survey with our fourth-graders. We hope that by using these simpler versions of the YOB at these grade levels our youngest campers will more fully understand the survey we are asking them to complete.

A Culture of Inquiry and Learning

To create a culture of inquiry and learning through conducting evaluations at camp, we need to ensure our staff members understand that these evaluations are important. We talk about evaluation at staff training and discuss all the ways data can make us better. We also want to ensure staff buy-in because staff create the atmosphere that our campers live in, and if they are positive about our evaluation process, we think our kids will be too.

To actually crunch the numbers, we use a few different methods.

For the surveys in the YOB we use the Excel sheet created by ACA for this data. It is simple and straightforward. We enter the data by hand into Excel, and it calculates the statistics. In our camp setting, completing evaluations online is incredibly difficult. We have very unreliable Internet access and few available tablets and computer workspaces. It is much simpler and more efficient for our campers to use good, old-fashioned paper and pencil. While this requires more back-end work for our staff, who then enter the data themselves, it makes for a more positive experience for our kids, and that is very important.

If camps are interested, they can easily compare their data to that of camps across the country. (See ACA’s website for explicit directions on how to use normative data for comparison: ACAcamps.org/resource-library/research/youth-outcomes-battery-norms.)

For our other evaluation tools, we measure data using statistical analysis tools in Excel. This can be time-consuming and does require some prior knowledge of both evaluation data and using Excel to analyze it. It can be helpful to recruit individuals from local universities to assist with this process. We have had a lot of success working with graduate students from St. Louis-area universities.
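For readers who want a sense of what that analysis involves, here is a minimal sketch of one common approach, a paired-samples comparison of retrospective-pre and post scores, written in Python. The scores are made up for illustration, and this is a generic technique, not ACA's official scoring procedure or the contents of its Excel sheet:

```python
import statistics

# Hypothetical paired scale scores for six campers on one outcome measure:
# each camper has a retrospective-pre score and a post score (e.g., on a 1-5 scale).
pre = [2.3, 3.1, 2.0, 3.4, 2.8, 2.5]
post = [3.0, 3.4, 2.9, 3.6, 3.3, 3.1]

# Per-camper change scores (post minus retrospective pre).
diffs = [after - before for before, after in zip(pre, post)]

mean_change = statistics.mean(diffs)
sd_change = statistics.stdev(diffs)  # sample standard deviation of the changes
n = len(diffs)

# Paired-samples t statistic: mean change divided by its standard error.
t_stat = mean_change / (sd_change / n ** 0.5)

print(f"mean change: {mean_change:.2f}")
print(f"t = {t_stat:.2f} with {n - 1} degrees of freedom")
```

A graduate student partner would typically take a statistic like this, look up (or compute) its p-value, and help interpret whether the observed change is larger than chance alone would explain, which is exactly the kind of help worth recruiting from local universities.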

Success

Here’s maybe the most important thing to understand from reading about our camp’s experiences in research and evaluation: by being willing to hold our camp and its practices up to the critical analysis of research and evaluation projects, we have been able to change and grow. Because of all we learned, we have transformed the programs at Sherwood Forest that help our campers learn critical skills.

Today, there are 48 spaces in our Quest summer camp program for young people at each grade level from the end of third grade through the end of ninth grade. There are wait lists for every grade level. Our Leadership Training Program has been completely revised. Third-graders participate in an award-winning reading program called the “Book Club.” Every young person who graduates from our Leadership Training Program at the end of ninth grade also goes on to graduate from high school — 100 percent. We are recipients of significant grants and gifts from foundations and donors to support and sustain our programs, as the tuition charged doesn’t come close to covering the actual costs to participate.

Today, our year-round programs are anchored in a month-long resident camp experience. Campers, referred by partner schools and organizations, are still from largely underserved communities and low-income families in the St. Louis region. The resident camp experience remains the beating heart of our mission, and at the center of our very long-term relationships with our campers and their families.

All along the way, we have greatly benefited from our relationship and our partnership with ACA, with its professional staff, our camp professional colleagues, academics, researchers, and students. We couldn’t have done any of this without this remarkable camp community.

Photo courtesy of Asbury Hills Camp and Retreat Center in Cleveland, South Carolina

References

American Camp Association. (2011). Camp youth outcomes battery: Measuring developmental outcomes in youth programs. Martinsville, IN: ACA, Inc.

American Camping Association, Inc. (2005). Directions: Youth development outcomes of the camp experience. Martinsville, IN: ACA, Inc.

American Camping Association, Inc. (2006). Innovations: Improving youth in summer programs. Martinsville, IN: ACA, Inc.

American Camping Association, Inc. (2006). Inspirations: Developmental supports and opportunities of youth’s experiences at camp. Martinsville, IN: ACA, Inc.

Brannon, S., Fullerton, A., Arick, J., Robb, G., & Bender, M. (2003). Including youth with disabilities in outdoor programs: Best practices, outcomes, and resources. Champaign, IL: Sagamore Publishing.

Gambone, M., Klem, A., & Connell, J. (2002). Finding out what matters for youth: Testing key links in a community action framework for youth development. Philadelphia: Youth Development Strategies, Inc., and Institute for Research and Reform in Education.

McKenna, M., & Kear, D. (1990). Measuring attitude toward reading: A new tool for teachers. The Reading Teacher, 43(9).

Mary Rogers, MEd, is the executive director at Sherwood Forest Camp, a year-round youth development organization serving youth from economically disadvantaged families in St. Louis, Missouri. Mary is a longtime member of the American Camp Association, where she has served in various volunteer roles at local, regional, and national levels. Mary is the current chairperson for CARE at ACA. She holds a master’s in education from Harvard University.