In program evaluation, using the most rigorous methods possible is essential for producing credible research findings. But beyond rigor, relevance matters too: the more an evaluation addresses the specific research or implementation questions that interest practitioners and policymakers, the more likely its findings are to actually be used.
A rigorous evaluation of a student-aid initiative called Aid Like a Paycheck, conducted as a randomized controlled trial, recently took three steps beyond typical program evaluation to ensure that the study produces information relevant to end users. These strategies will be of interest not only to other program evaluators but also to foundations and other funders who want to support rigorous and relevant evaluations. The strategies are:
- Implementing a pilot phase, one that ran longer than most (about two and a half years);
- Forming an advisory group of stakeholders to provide input into the design of both the intervention and the research study; and
- Reaching out to other stakeholders about the preliminary intervention and research designs to gather additional input.
To learn more, we’re joined by the evaluation’s lead researcher, Evan Weissman. He is a Senior Associate at the nonprofit research firm MDRC, where he has over 15 years of experience directing projects, providing technical assistance, conducting qualitative research, and disseminating findings across a wide range of education and social policy settings.