WINGS for Kids shows the value of evaluation
Founded in 1996 in Charleston, S.C., WINGS for Kids (WINGS) helps 5- to 13-year-olds gain the essential skills for a joyful, successful life. Their program promotes responsible behavior, good decision-making, positive relationships, self-confidence, and problem-solving through a comprehensive social and emotional learning (SEL) curriculum in after-school programs.
Though still headquartered in Charleston, WINGS has opened additional locations in Charlotte, N.C., and Atlanta. They serve mostly inner-city children from low-income families. The program is a high-energy mix of large-group activities and small-group SEL instruction. Children are taught by “WINGS leaders,” college students who work with a “nest” of ten to twelve children at school sites throughout the cities.
It took a long time, and lots of research, to arrive at this service model. WINGS is unusual and exemplary because it has used a variety of evaluative learning projects to clarify its work. Staff have worked closely with outside researchers and evaluators to identify issues that needed to be addressed, and make tough, even painful, decisions regarding site management, program design and delivery, staff education, and enrollment and retention practices. Their story is an example to others, and should help funders understand the real costs—and benefits—of working at the social service sector’s highest level.
In January 1997, WINGS began running a residential summer program for kids in Charleston using an SEL approach. In 2000, they launched the current program at three elementary schools in the city. Over the next six years, WINGS looked for ways to improve its program quality. They experimented with different approaches to managing multiple sites, including training staff at school sites to run the WINGS program autonomously. When this approach proved unsuccessful, the organization settled on a replication model in which it manages all sites directly.
By 2007, WINGS leadership recognized that it would need to codify the program in order to manage program quality and grow through replication. It undertook a rigorous Theory of Change Workshop in which the organization formalized its target population and specified its program design. This process helped them determine their program’s core elements, the minimal requirements for participants, and basic staff competencies. Management also identified the short-term, intermediate, and long-term participant outcomes by which they would determine program success. Finally, WINGS implemented Efforts to Outcomes, a data system that they have used ever since to manage performance.
By 2010, WINGS had enough performance data to begin asking systematic questions about its program implementation and delivery. They commissioned an external implementation assessment that found that “WINGS for Kids is by almost all measures a comparatively strong, performance-driven organization with a robust program model designed with extraordinary attention to detail.” The study also noted significant areas for improvement: discrepancies in the skills and knowledge of the staff who worked directly with children, and notable inconsistencies in management quality across sites. The assessment team recommended clarifying expectations for participants’ parents and strengthening staff supervision. WINGS took these findings to heart and created a plan to address them over time.
After making the needed changes, WINGS convened an Evaluation Advisory Committee in 2010 to help its leadership understand plans by a University of Virginia study team to conduct a rigorous evaluation of the program’s impact. To ensure that the program was ready for the impact study, WINGS again turned to external evaluators in 2012 (including one from Child Trends) to assess its progress on the earlier recommendations. The assessment found that one site was underperforming badly and emphasized the need for WINGS to take immediate action to address this.
As is often the case in impact evaluations, early findings were disappointing. It was clear that children in the study did not always attend frequently enough to produce the intended outcomes. Questions remained: Why didn’t children participate more regularly? What was happening in the nests? WINGS requested an additional implementation evaluation by Child Trends to find out.
Data collection for both the impact evaluation and the implementation evaluation continued. Staff, evaluators, and advisors united around the shared goal of conducting both an impact and a learning evaluation. Later this year, the study team (with participants from the University of Virginia, the College of Charleston, Portland State University, and Child Trends) will complete the evaluation and share the results with WINGS.
WINGS has attained stability and longevity by systematically building evidence-based learning into its management approach. Their efforts have been featured in national media, including Youth Today and the Stanford Social Innovation Review, and offer a valuable example for other nonprofit social service organizations looking to make similar improvements through research.