Program implementers should engage early and often with evaluators to improve interventions

Strong, early partnerships between program implementers and evaluators can lead to innovative solutions to the challenges that program implementers face. While the primary role of evaluators is to tell implementers whether (and how) their program is effective, engaging a program evaluator before and during implementation can also improve the program itself, and therefore the evaluation, in unexpected ways.

For five years, Child Trends has served as the lead evaluator for three teen pregnancy prevention (TPP) programs, working closely with our program partners to find solutions to implementation and evaluation challenges. In several cases, our early and flexible engagement contributed to stronger programs by supporting successful enrollment efforts and maintaining high recruitment and retention rates.

Reducing parent opt-outs in a school-based program

Re:MIX is a school-based TPP program implemented in Austin, Texas, with predominantly Latinx students in grades 8 to 10. Early in our evaluation of Re:MIX, we observed that a high proportion of parents opted their students out of participating in this sex education study. We worked with our implementation partner to create a flyer highlighting key information about the Re:MIX program and study, as well as a frequently asked questions document. These documents were sent home with students alongside the opt-out form, and the additional information helped proactively address parents’ questions and concerns. Child Trends also reached out to parents directly in Spanish. As a result of these efforts, the rate of parents who opted their students out of the study dropped from 16 percent in the first semester to 5 percent in the second.

Identifying and removing ineligible participants in an online evaluation

Pulse is a web-based mobile health app designed for Black and Latinx women ages 18 to 20 to help them choose effective contraception that aligns with their needs, seek reproductive health services, and, ultimately, prevent unplanned pregnancies. One key challenge our partners faced while recruiting participants online was ineligible participants joining the study. Because participants were recruited online and enrollment involved no personal interaction with study staff, ineligible participants were sometimes able to enroll, either by completing the eligibility screener multiple times until they got in (scammers) or by creating more than one account (duplicates). To ensure that only people who met our recruitment criteria remained in the study, we developed detailed procedures to identify and remove scammers and duplicate accounts from the sample. To streamline this process for future evaluations, our team developed machine learning techniques to automate it.
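For illustration only, a minimal rule-based version of this kind of screening might look like the following Python sketch. It is not the actual Pulse procedure or the machine learning approach described above; the record fields (email, ip, screened_at), the thresholds, and the sample data are all assumptions made for the example.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical enrollment records; field names and values are illustrative
# assumptions, not the actual Pulse study data.
records = [
    {"id": 1, "email": "a@example.com",  "ip": "10.0.0.1",
     "screened_at": datetime(2020, 1, 1, 12, 0)},
    {"id": 2, "email": "A@example.com ", "ip": "10.0.0.1",
     "screened_at": datetime(2020, 1, 1, 12, 4)},
    {"id": 3, "email": "b@example.com",  "ip": "10.0.0.2",
     "screened_at": datetime(2020, 1, 2, 9, 30)},
]

def flag_ineligible(records, max_attempts=2):
    """Return the IDs of records flagged as duplicates or likely scammers."""
    flagged = set()

    # Duplicates: multiple records sharing the same normalized email address.
    by_email = defaultdict(list)
    for r in records:
        by_email[r["email"].strip().lower()].append(r)
    for group in by_email.values():
        if len(group) > 1:
            group.sort(key=lambda r: r["screened_at"])
            flagged.update(r["id"] for r in group[1:])  # keep earliest record

    # Likely scammers: an IP address completing the screener more than
    # max_attempts times suggests one person retrying until deemed eligible.
    by_ip = defaultdict(list)
    for r in records:
        by_ip[r["ip"]].append(r)
    for group in by_ip.values():
        if len(group) > max_attempts:
            group.sort(key=lambda r: r["screened_at"])
            flagged.update(r["id"] for r in group[1:])  # keep earliest record

    return flagged

print(sorted(flag_ineligible(records)))  # prints [2]
```

In practice, a study team would likely combine several such signals (contact information, device or network identifiers, response timing and patterns) and review flagged cases by hand before removing them from the sample.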

Improving enrollment and attendance in a community-based program

Manhood 2.0 is a TPP program for young men ages 16 to 22 implemented in Washington, DC. The eight-session program encourages young men to discuss gender norms, masculinity, and fatherhood as a gateway to broader discussion of contraceptive use, violence, and teen pregnancy prevention. One key challenge from our pilot of Manhood 2.0 was high attrition between enrollment and attendance at the first session. Initially, our implementation partner asked that we enroll participants, conduct the baseline survey, and randomize participants before the start of the program, and then ask participants to return for the first session at a later date. However, it was difficult to get participants to attend an after-school, community-based program with which they had yet to engage, so we added an hour of curriculum as a “welcome session.” Eligible participants would complete the baseline survey, be randomized immediately, and participate in an introductory activity with their treatment or control group cohort to build interest in the program and familiarity with the facilitators. This and other approaches—such as providing food, allowing participants to bring friends, and offering multiple randomization days and multiple implementation locations—increased the percentage of participants attending at least one program session from 21 percent during the pilot to 89 percent in the main study. In addition, 61 percent attended at least three quarters of the eight program sessions, a strong attendance rate for a community-based program for young men.

Child Trends’ flexible partnerships helped us respond to challenges and adapt and improve TPP program implementation, thereby strengthening the evaluations themselves. Other organizations implementing similar programs should consider early evaluator engagement and strong evaluator/program partnerships to maximize evaluation success.