According to the National Oceanic and Atmospheric Administration (NOAA), while weather refers to short-term changes in the atmosphere, climate describes what the weather is like over a long period of time in a specific area (National Centers for Environmental Information, 2020). We can use these descriptions of climate and weather as a metaphor for trends in evaluation.
If we think of the concept of camp evaluation as climate, we can describe camp evaluation as a way of being, integrated into our daily lives. Increasingly, camp professionals choose to formally or systematically evaluate their camp programs to better understand the potential benefits camps provide to participants. But attention to evaluation has slowly changed over many years, somewhat like climate change. It’s hard to observe, but a gradual shift is happening. Camps are increasingly using evaluation findings to improve their organizational learning climates. And the entirety of the camp profession is being enhanced by the research and evaluation work conducted in camps. For camp professionals, evaluation is now part of the air we breathe.
On the other hand, evaluation trends are like weather — trends come and go just as clouds become rain and then turn back to sun. Weather trends change more frequently but can still be predictable. Weather is influenced by the changing climate and is the mix of events that happen on smaller, more local scales. Different regions or camps see different weather or trends. As NOAA says, “Weather tells you what to wear each day. Climate tells you what types of clothes to have in your closet.” Translating this to the world of camp: weather tells you what to focus on in your evaluation. Climate tells you to do that evaluation in the first place.
“Evaluation is a systematic process to determine merit, worth, value, or significance” (American Evaluation Association, 2014).
A process evaluation determines whether the program is being carried out as planned. It is intended to answer the basic question, “Who is being served and what has actually happened in this program?” (Sabatelli, Anderson, & LaMotte, 2005).
Outcome evaluations focus on the immediate effects that the program has on the group of individuals attending the program. The purpose of an outcome evaluation is to learn about short-term changes in participants’ knowledge, attitudes, beliefs, or actual behavior (Sabatelli, Anderson, & LaMotte, 2005).
The following five evaluation trends are observable and happen at a faster rate, like changing weather. Here’s your camp weather forecast!
Trend #1: Culturally Responsive Evaluation
Evaluation has always been guided by ethical standards to:
- Represent participants’ voices appropriately
- Use methods that are helpful and not harmful
- Consider the overall context of the camp and people in the evaluation
- Be fair
With the increasing focus on issues negatively affecting members of Black and Brown communities, LGBTQ+ people, people with disabilities, and people from other marginalized groups, evaluation is also trending toward greater cultural responsiveness. In culturally responsive evaluation, evaluators recognize issues of power, privilege, and intersectionality (University of Illinois Center for Culturally Responsive Evaluation and Assessment, 2020). Culturally responsive evaluation is purposefully inclusive of cultural norms, practices, and expectations in programs. Further, such evaluations ensure cultural relevance and generate meaningful findings to benefit stakeholders.
The Equitable Evaluation Initiative (2020) recommends three principles:
- Evaluation and evaluative work should be in service of equity.
- Evaluative work can and should answer critical questions.
- Evaluative work should be designed and implemented commensurate with the values underlying equity work.
For more information, visit equitableeval.org/ee-framework.
InformalScience.org has a wonderful collection of resources and insights devoted to culturally responsive evaluation: informalscience.org/evaluation/developing-evaluation-plan/resources-culturally-responsive-evaluation.
Trend #2: More Surveys, More of the Time
Since the COVID-19 pandemic hit, it’s been much harder for camps to use focus groups, interviews, or observations to evaluate their programs. With so much of camp life now online, surveys are an easy way to collect data because they reach people who are already at their screens. Camps also now have more accurate email addresses and contact information for campers and their families, so it’s easier than ever to make sure surveys get to the right inboxes. Surveys are a great way to collect a lot of information from a large number of people in a short amount of time. You might not get much depth in survey responses, but if you phrase questions well, you can learn a lot.
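For camps comfortable with a little scripting, even a few lines of code can summarize survey results quickly. The sketch below tallies made-up responses to a single agreement-scale question; in practice you would read the responses from a CSV export of your survey platform. The question and answers here are hypothetical examples, not from any real camp survey.

```python
# Minimal sketch: tally responses to one survey question.
# The responses below are made-up placeholders; in practice, load them
# from a CSV exported by your survey platform.
from collections import Counter

responses = [
    "Agree", "Strongly agree", "Agree", "Neutral",
    "Strongly agree", "Agree", "Disagree",
]

counts = Counter(responses)
total = len(responses)

# Print each answer with its count and share of all responses.
for answer, n in counts.most_common():
    print(f"{answer:<15} {n:>3}  ({n / total:.0%})")
```

The same pattern scales to hundreds of responses: the tallying logic does not change, only the list of responses does.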
SurveyMonkey, SurveyGizmo, and JotForm are easy-to-use online survey platforms that offer both free and paid versions (Google Forms is entirely free) and are commonly used to send out surveys. Check out Sheila Robinson and Kimberly Firth Leonard’s book Designing Quality Survey Questions at sheilabrobinson.com/designing-quality-survey-questions-the-book/.
Trend #3: Using Online Metrics to Evaluate
Given that more camp programming is happening online, camp evaluators are getting more experienced at mining data embedded in YouTube Studio, Facebook and Instagram Insights, and TikTok Analytics. Learning to read, understand, and navigate through these analytics makes evaluating participation in online camp programs easier than ever. But a lot of camp evaluators find themselves asking, “So what?” What does this number mean? Is it “good”? How does it compare to last month or last year?
An online metric is only meaningful if you can compare it to something else: a goal you set, last month’s numbers, or a similar program. Think about what goals or strategies you have for your social media efforts and start tracking your metrics over time to evaluate your reach and engagement.
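One common way to make those comparisons concrete is an engagement rate: engagements (likes, comments, shares) as a percentage of the people reached. The sketch below computes that rate for two months and reports the change; all of the numbers are made-up placeholders to be replaced with figures from your own platform analytics.

```python
# Minimal sketch: compare this month's engagement rate to last month's.
# All numbers are hypothetical placeholders; swap in your own metrics
# from Facebook/Instagram Insights, YouTube Studio, or TikTok Analytics.

def engagement_rate(likes, comments, shares, reach):
    """Engagements as a percentage of the people reached."""
    return (likes + comments + shares) / reach * 100

last_month = engagement_rate(likes=120, comments=30, shares=10, reach=4000)
this_month = engagement_rate(likes=180, comments=45, shares=25, reach=5000)

print(f"Last month: {last_month:.1f}%  This month: {this_month:.1f}%")
print(f"Change: {this_month - last_month:+.1f} percentage points")
```

Whether you compute this in a script or a spreadsheet, the point is the same: track the rate consistently so each month’s number has a baseline to stand against.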
Hootsuite (hootsuite.com) has several terrific step-by-step directions for analyzing social media metrics across platforms.
Trend #4: The Importance of Data Visualization
Camps have always been proponents of visual learning and communication methods. Sharing evaluation results by using charts, graphs, word clouds, or other means is better than writing a boring paragraph. Although many camps welcome the color and excitement of a well-made chart or graph, fewer camps have the in-house expertise to effectively tell the story of their data. Visualizing data such as numbers served, outcomes for campers or families, or demographics of participants can help tell a camp’s story, including the potential impacts that camp has on the people it serves. Doing so in a fun, clear, and powerful way can only enhance camp messages.
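Even without dedicated charting software, simple counts can be turned into a readable picture. The sketch below draws a quick text-based bar chart from hypothetical session enrollment numbers; for polished, shareable graphics you would move to a spreadsheet or a charting tool, but the underlying idea of scaling bars to your data is the same.

```python
# Minimal sketch: turn a small set of camp numbers into a text bar chart.
# The session names and counts are hypothetical; replace them with your
# own data (campers served, outcomes, demographics, etc.).

campers_by_session = {
    "Session A": 84,
    "Session B": 112,
    "Session C": 67,
}

def bar_chart(data, width=40):
    """Return horizontal bars scaled so the largest value fills `width`."""
    largest = max(data.values())
    lines = []
    for label, count in data.items():
        bar = "#" * round(count / largest * width)
        lines.append(f"{label:<10} {bar} {count}")
    return "\n".join(lines)

print(bar_chart(campers_by_session))
```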
Check out Ann K. Emery’s Depict Data Studio (depictdatastudio.com), Stephanie Evergreen, PhD (evergreendata.com), and Storytelling with Data (storytellingwithdata.com) to read data visualization blogs, watch tutorials, and sign up for online workshops and courses.
Trend #5: Using External Evaluators and Consultants
While many camps are building their own capacities to conduct evaluations, others increasingly hire external evaluators and consultants to evaluate their camp programs. Outside evaluators can provide a level of objectivity that internal evaluations lack and can help camps see strengths and areas for improvement they might not otherwise perceive. External evaluators bring a broader perspective but can be perceived as threatening; internal evaluators have intimate knowledge of the context the program operates within but can be perceived as less objective. Advantages of hiring an external evaluator or researcher include specialized evaluation expertise, objectivity, and credibility; disadvantages can include cost, time, and a potential lack of camp-specific knowledge.
Ask colleagues in your area for recommendations, such as educators, social workers, and nonprofit managers. Contact local colleges and universities.
Use the American Evaluation Association’s evaluator directory, which is searchable by state and keyword: eval.org/.
Check out information from EvaluATE on what to consider when finding an external evaluator: evaluate.org/tag/finding-an-evaluator/.
Staying Up to Date on Camp Weather Fronts
That was your weather report for evaluation in the camp profession. Want to stay up to date on other emerging trends in camp evaluation? The ACA Research 360 blog shares biweekly updates and tips for camp evaluation. In the past few months, several posts have addressed evaluation in the age of COVID-19, such as:
- Measuring Camper Impact (Virtually)
- Camp Research in the Age of COVID-19
- A three-part series on evaluating virtual programs
Stay tuned for future camp weather forecasts!
Photo courtesy of Camp Lake Stephens in Oxford, Mississippi
Author’s Note: Special thanks to ACA Director of Research Laurie Browne, PhD.
Ann Gillard, PhD, is the research director at SeriousFun Children’s Network and The Hole in the Wall Gang Camp. As a camp professional with over 25 years of experience as a counselor, camp director, volunteer, and researcher, Ann is committed to providing equitable and inclusive camp experiences. Her research areas include camps for youth with serious illnesses, camp program evaluation, and social justice. Ann serves on ACA’s Committee for the Advancement of Research and Evaluation (CARE) and is the Camp Research Forum coordinator.
American Evaluation Association. (2014, January 10). What is evaluation? AEA. Retrieved from eval.org/p/bl/et/blogid=2
Equitable Evaluation Initiative. (2020). The equitable evaluation framework. EEI. Retrieved from equitableeval.org/ee-framework
National Centers for Environmental Information. (2020). What’s the difference between weather and climate? National Oceanic and Atmospheric Administration. Retrieved from ncei.noaa.gov/news/weather-vs-climate
Sabatelli, R. M., Anderson, S. A., & LaMotte, V. A. (2005, September). Assessing outcomes in child and youth programs: A practical handbook. Retrieved from wiafterschoolnetwork.org/wp-content/uploads/2018/02/AssessingOutcomesChildYouthPrograms.pdf
University of Illinois Center for Culturally Responsive Evaluation and Assessment. (2020). CREA. Retrieved from crea.education.illinois.edu/