How school districts can use rigorous program evaluation to test new education reforms: An interview with Matthew Lenard, Director, Data Strategy and Analytics, Wake County Public Schools – Episode #123

When schools or school districts implement district-wide reform initiatives, how can they accurately determine if those reform efforts are having the positive effects that school leaders had hoped? How, in other words, can they move beyond anecdotes or simple trend data and rigorously evaluate their district-wide reform initiatives?

The Wake County Public School System (WCPSS) — North Carolina’s largest school district — faced exactly those questions when it implemented a district-wide reform initiative. The initiative is called Multi-Tiered System of Supports, or MTSS, and is designed to increase academic achievement and reduce behavioral problems, although the specifics of MTSS are not the focus of our interview.

WCPSS was able to implement a rigorous evaluation of the initiative using a phased-in design, with 88 schools being randomly assigned to one of two groups: One group of 44 schools implemented MTSS first, while the other group of 44 schools will implement it two years later. That allowed district leaders to compare the outcomes for children in each set of schools to determine the impact of the MTSS initiative.
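The phased-in (wait-list) design described above can be sketched in a few lines of code. This is a minimal illustration, not the district's actual procedure; the school identifiers and random seed are hypothetical.

```python
import random

# Hypothetical identifiers for the 88 participating schools.
schools = [f"school_{i:02d}" for i in range(1, 89)]

rng = random.Random(2015)  # fixed seed so the assignment is reproducible
rng.shuffle(schools)

early_cohort = sorted(schools[:44])  # implements MTSS immediately
late_cohort = sorted(schools[44:])   # implements MTSS two years later

assert len(early_cohort) == len(late_cohort) == 44
```

Because assignment to the early and late cohorts is random, the late cohort serves as a comparison group during the two-year gap, which is what lets evaluators attribute outcome differences to MTSS rather than to pre-existing differences between schools.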

To learn more, we’re joined by Matthew Lenard. He has served as Director of Data Strategy and Analytics for WCPSS since 2012 and is the co-lead researcher on the MTSS evaluation.

Determining if your program is having a positive impact (i.e., impact evaluation 101): An interview with David Evans, Senior Economist, The World Bank – Episode #122

Is my program or initiative having a positive impact?

It’s a question about which organizational leaders may want hard evidence, either to take stock and help improve program results, or to satisfy their authorizers or funders who may be asking for rigorous evidence of impact. Either way, how can you determine the impact of your program? And which strategies may sound useful but are unlikely to produce accurate answers?

To examine these questions and get a “101” on impact evaluation, we’re joined by David Evans (@tukopamoja). He is a Senior Economist at the World Bank and the co-author, with Bruce Wydick, of a recent post on this topic on the Bank’s Development Impact blog.

The interview covers:

  • The concept of impact
  • Ways that organizations could try to estimate impact that generally won’t be accurate
  • Three strategies to more accurately estimate program impact:
    • Using a lottery, aka a randomized experiment
    • Using an eligibility cutoff, aka regression discontinuity design
    • Using before and after data for both participants and nonparticipants, aka a difference-in-differences approach
  • Factors to guide the choice of one impact evaluation strategy over another
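The third strategy above, difference-in-differences, can be illustrated with a short calculation. The outcome means below are hypothetical numbers chosen only to show the arithmetic, not results from any study discussed in the interview.

```python
# Mean outcome for participants and nonparticipants, before and after
# the program (hypothetical values for illustration).
participants = {"before": 50.0, "after": 58.0}
nonparticipants = {"before": 48.0, "after": 52.0}

change_participants = participants["after"] - participants["before"]           # 8.0
change_nonparticipants = nonparticipants["after"] - nonparticipants["before"]  # 4.0

# The difference-in-differences estimate nets out the shared trend
# captured by the nonparticipant group.
did_estimate = change_participants - change_nonparticipants
print(did_estimate)  # 4.0
```

The key assumption is that, absent the program, participants' outcomes would have followed the same trend as nonparticipants'; the nonparticipant change is then subtracted off as the counterfactual trend.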

Using intensive, individualized math tutoring to boost academic outcomes of disadvantaged youth: An interview with Jonathan Guryan, Professor, Northwestern University – Episode #121

While improving schooling outcomes of disadvantaged youth is a top policy priority in the United States, few interventions have produced convincing evidence that they can improve those outcomes, especially for adolescents — the age at which socially costly outcomes, such as dropping out of high school, occur. As a result, it may be conventional wisdom that by adolescence it is too late and too costly to improve the academic outcomes of children in poverty. A recent study and Hamilton Project policy proposal suggest that this conventional wisdom is wrong. The study uses a rigorous evaluation design — a randomized controlled trial — to examine the effects of intensive, individualized (two students to one tutor) math tutoring among 9th and 10th grade boys in twelve Chicago public schools.

To learn more, we are joined by one of the study’s nine authors, Jonathan Guryan. He is a professor of human development and social policy at Northwestern University and a fellow at Northwestern’s Institute for Policy Research.

Lessons from the nation’s first Social Impact Bond, aimed at reducing recidivism among adolescent offenders at Rikers Island: An interview with Gordon Berlin, President, MDRC – Episode #120

A Social Impact Bond (SIB) uses private funds — from philanthropy or other investors — to pay for social, educational, or health programs. Importantly, the government repays investors, plus a return, only if pre-specified results are achieved. A new report by Gordon Berlin, the president of the nonprofit social policy research firm MDRC (@MDRC_News), reflects on the experience of SIB (also called pay for success) projects to date, including the nation’s first SIB, at the Rikers Island jail in New York City, for which MDRC was the intermediary. As the report notes, while SIBs are the social sector’s hottest “impact investing” strategy, they have generated a range of reactions, from excitement to angst.

In our interview, Gordon Berlin reflects on the Rikers Island SIB as well as broader lessons from SIB projects to date, including:

  • What types of SIB projects are likely to work best
  • The role of evaluation in SIB projects
  • Why philanthropy could and should play a central role in SIBs

How the Institute of Education Sciences at the U.S. Dept. of Education is helping the education field to learn and do what works: An interview with Russ Whitehurst, Senior Fellow, The Brookings Institution – Episode #119

Over the last 15 years, the field of education has become considerably more evidence-focused, including a growing number of high-quality studies about how to help students succeed in school. An important catalyst has been the Institute of Education Sciences (IES). It is the independent, non-partisan statistics, research, and evaluation arm of the U.S. Department of Education. Created in 2002 during the George W. Bush Administration, it has continued to flourish under the Obama Administration and today has a budget of about $670 million and a staff of 180.

To learn more, including lessons for other public agencies, we’re joined by Russ Whitehurst. He was the first director of IES and served in that role from 2002 to 2008. Today he is a Senior Fellow at the Brookings Institution, including serving as editor of the Evidence Speaks series.

Web extra: Russ Whitehurst describes the origins of IES, including some of the key people involved in its creation and launch. [click here]

Milwaukee’s three-pronged strategy to reduce teen pregnancy: An interview with Bevan Baker, Commissioner of Health, City of Milwaukee, and Nicole Angresano, United Way of Greater Milwaukee – Episode #118

Milwaukee’s Teen Pregnancy Prevention Initiative, launched in 2008, is a citywide effort led by the United Way of Greater Milwaukee and Waukesha County. It has been recognized as a model of community collaboration, including by the White House Council for Community Solutions.

The three main prongs of the strategy are:

  • An aggressive advertising campaign targeted to teens
  • The use of evidence-based sex education
  • The involvement of community partners to support the strategy

The initiative set a goal in 2008 to reduce Milwaukee’s teen births by 46% over 10 years. It exceeded that goal, three years early, with a 50% decline in teen births by 2012. During the same period, national teen birthrates also declined sharply, although Milwaukee’s decline slightly outpaced the national average. More broadly, the initiative is a leading example of an “all-hands-on-deck” community partnership focused on a key community challenge.

To learn more, we’re joined by two people who have been at the center of the initiative. Bevan Baker has been the Commissioner of Health in Milwaukee since 2004. And Nicole Angresano is the Vice President of Community Impact at the United Way of Greater Milwaukee and Waukesha County.

Three strategies to promote relevance in program evaluations so that findings are useful to policymakers and practitioners: An interview with Evan Weissman, Senior Associate, MDRC – Episode #117

In program evaluation, using the most rigorous methods possible is essential for producing credible research findings. But beyond the goal of rigor, relevance is important too. In particular, the more that evaluations are able to address specific research or implementation questions that are of interest to practitioners and policymakers, the more likely that the findings will actually get used.

A rigorous evaluation (using a randomized controlled trial) of a student-aid initiative, called Aid Like a Paycheck, recently took three additional steps, beyond typical program evaluation, to ensure that the study produces information that is relevant to end users. The strategies will be of interest to other program evaluators, but also to foundations and other funders who want to support rigorous and relevant program evaluations. The strategies are:

  • Implementing a pilot phase — in fact, one that ran longer than most (about 2 1/2 years);
  • Forming an advisory group of stakeholders to provide input into the design of both the intervention and the research study; and
  • Doing outreach to other stakeholders about both the preliminary intervention design and research design to get additional input.

To learn more, we’re joined by the evaluation’s lead researcher, Evan Weissman. He is a Senior Associate at the nonprofit research firm MDRC and has over 15 years of experience at MDRC directing projects, providing technical assistance, conducting qualitative research, and disseminating findings in a wide range of education and social policy settings.

Improving student outcomes by giving parents detailed information about their child’s academic progress: An interview with Peter Bergman, Professor, Teachers College, Columbia University – Episode #116

Can regular, detailed information sent to parents about their students’ progress lead to improved student achievement? That question was put to the test in a field experiment in the Los Angeles school system in which parents were given information by text, phone, or email about their children’s missing assignments. The results for high school students show surprisingly large effects and suggest that this type of relatively low-cost intervention may have effects on student achievement similar to those of much more costly and intensive interventions.

To learn more, we’re joined by the study’s author, Peter Bergman (@peterbergman_). He is a professor of economics and education at Columbia University’s Teachers College. His research uses randomized controlled trials to find low-cost, scalable interventions that improve education outcomes.

Las Vegas’s data-driven effort to improve traffic safety at its most dangerous intersections: An interview with Betsy Fretwell, City Manager, City of Las Vegas – Episode #115

Today, results-focused cities are using data to improve city services, boost the quality of life, and literally save lives. The City of Las Vegas has gained a reputation for its data-focused approach to addressing important city challenges. A good example is its effort to reduce traffic accidents, first by focusing on reducing left turn crashes and later by focusing on the 50 most dangerous intersections. The results have been dramatic.

To learn more, we are joined by Betsy Fretwell (@BetsyFretwell), the City Manager of Las Vegas. She has been in that role since 2009, overseeing a city workforce of nearly 3,000 and a budget of $1.2 billion per year. She has won several awards for her work, including a National Public Service Award.

Insights from the City of New Orleans’ analytics unit, NOLAlytics, about using data to improve city services: An interview with Oliver Wise, Director, Office of Performance and Accountability, City of New Orleans – Episode #114

The City of New Orleans under Mayor Mitch Landrieu has gained a reputation as being one of the most innovative and data-driven city governments. An important element in those efforts is the Office of Performance and Accountability, launched in 2011. The mission of the office is to use data to set goals, track performance, and drive results across city government. In 2015, it launched an analytics unit called NOLAlytics that undertakes data-driven projects to improve city services.

To learn more, we are joined by Oliver Wise (@ojwise). He is the founding director of the Office of Performance and Accountability.