Using school-based health centers to address the health needs of low-income youth: An interview with Olga Acosta Price, Professor, The George Washington University – Episode #126

How can communities better address young people’s physical and emotional health needs? A growing trend is the use of school-based health centers. The goal is to provide convenient, accessible, and comprehensive health care services to students from pre-K through high school by having a health provider — or sometimes an interdisciplinary health provider team — co-located in the school setting.

To learn more about the trends in school-based health centers and the evidence of their impact, we’re joined by a leading expert on the topic, Olga Acosta Price. She is an Associate Professor at the Milken Institute School of Public Health at The George Washington University and director of its Center for Health and Health Care in Schools. Her paper on school-centered approaches to improving community health will be published by the Brookings Institution later in June 2016.

Web extra: Olga Price discusses how the recent clarification of the free care rule issued by the Federal government helps facilitate broader use of school-based health centers. [click here]

How one Federal agency, the Corporation for National and Community Service, strengthened the role of evidence in a key grant program, AmeriCorps: An interview with Diana Epstein and Carla Ganiel, CNCS – Episode #125

The Corporation for National and Community Service (CNCS) is probably best known for overseeing the AmeriCorps program. The program provides grants to nonprofits and local governments to address community needs in education, public safety, health, and the environment. The money supports AmeriCorps members and their activities, whether that’s tutoring in an elementary school or building affordable housing in response to a natural disaster. The funding includes about $230 million in competitive grants to about 350 grantees. In 2014, AmeriCorps began prioritizing evidence in the scoring criteria by which it awards those competitive grants.

To learn more, including advice for other Federal agencies, we are joined by Diana Epstein, a manager in the Office of Research and Evaluation at CNCS, and Carla Ganiel, a Senior Program Specialist with the AmeriCorps program.

Twelve “better practices” that can help public leaders tackle key organizational challenges and boost results: An interview with Bob Behn, Professor, Harvard Kennedy School – Episode #124

Bob Behn of the Harvard Kennedy School (HKS) is one of the leading thinkers on the subjects of public management and leadership. He has argued that public agencies are unlikely to produce better results simply by creating rules, requirements, or performance systems. A more effective approach, he notes, is to help managers learn better leadership practices.

In particular, he recommends twelve practices or leadership skills that can help organizations strengthen their performance. Our discussion draws on his original paper on the topic, published by the IBM Center for the Business of Government, which discussed eleven of those practices.

To give us an overview, we’re joined by Bob Behn, speaking with us (probably with a baseball tie on) from Boston. A professor at the Harvard Kennedy School, he is the faculty chair of the executive program called Driving Government Performance. He also publishes monthly insights through his Public Leadership Report, which is available free online.

Web extra: Bob Behn describes the connection between these twelve practices and the PerformanceStat approach to public leadership, which was the focus of his most recent book. [click here]

How school districts can use rigorous program evaluation to test new education reforms: An interview with Matthew Lenard, Director, Data Strategy and Analytics, Wake County Public Schools – Episode #123

When schools or school districts implement district-wide reform initiatives, how can they accurately determine whether those reform efforts are having the positive effects that school leaders had hoped for? How, in other words, can they move beyond anecdotes or simple trend data and rigorously evaluate their district-wide reform initiatives?

The Wake County Public School System (WCPSS) — North Carolina’s largest school district — faced exactly those questions when it implemented a district-wide reform initiative. The initiative is called Multi-Tiered System of Supports, or MTSS, and is designed to increase academic achievement and reduce behavioral problems, although the specifics of MTSS are not the focus of our interview.

WCPSS was able to implement a rigorous evaluation of the initiative using a phased-in design, with 88 schools randomly assigned to one of two groups: one group of 44 schools implemented MTSS first, while the other group of 44 schools will implement it two years later. That design allows district leaders to compare outcomes for children in the two sets of schools to determine the impact of the MTSS initiative.
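
To make the mechanics of that phased-in design concrete, here is a minimal Python sketch of the random assignment step. The school identifiers and the seed are hypothetical, not WCPSS data, and the actual study’s procedure may have differed (for example, by stratifying schools before randomizing).

```python
import random

random.seed(2016)  # arbitrary seed, for reproducibility

# Hypothetical identifiers standing in for the 88 WCPSS schools.
schools = [f"school_{i:02d}" for i in range(1, 89)]

# Randomly choose 44 schools to implement MTSS immediately;
# the remaining 44 implement it two years later.
immediate = set(random.sample(schools, k=44))
delayed = [s for s in schools if s not in immediate]

# During the two-year gap, the delayed schools act as the control
# group, so a simple difference in mean outcomes between the two
# groups estimates the initiative's impact.
```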

To learn more, we’re joined by Matthew Lenard. He has served as Director of Data Strategy and Analytics for WCPSS since 2012 and is the co-lead researcher on the MTSS evaluation.

Determining if your program is having a positive impact (i.e., impact evaluation 101): An interview with David Evans, Senior Economist, The World Bank – Episode #122

Is my program or initiative having a positive impact?

It’s a question about which organizational leaders may want hard evidence, either to take stock and help improve program results, or to satisfy their authorizers or funders who may be asking for rigorous evidence of impact. Either way, how can you determine the impact of your program? And which strategies may sound useful but are unlikely to produce accurate answers?

To examine these questions and get a “101” on impact evaluation, we’re joined by David Evans (@tukopamoja). He is a Senior Economist at the World Bank and the co-author, with Bruce Wydick, of a recent post on the Bank’s Development Impact blog on this topic.

The interview covers:

  • The concept of impact
  • Ways that organizations could try to estimate impact that generally won’t be accurate
  • Three strategies to more accurately estimate program impact:
    • Using a lottery, aka a randomized experiment
    • Using an eligibility cutoff, aka regression discontinuity design
    • Using before-and-after data for both participants and nonparticipants, aka a difference-in-differences approach (see the sketch after this list)
  • Factors to guide the choice of one impact evaluation strategy over another
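
To make the last of those three strategies concrete, here is a minimal difference-in-differences calculation in Python. The outcome numbers are invented for illustration; they do not come from the episode or from any World Bank study.

```python
# Invented outcome levels (say, average incomes) measured before and
# after the program, for participants and for a comparison group.
participants_before, participants_after = 50.0, 65.0
comparison_before, comparison_after = 48.0, 55.0

# Each group's change over time.
change_participants = participants_after - participants_before  # 15.0
change_comparison = comparison_after - comparison_before        # 7.0

# Differencing the two changes nets out trends that affect both
# groups, leaving an estimate of the program's impact.
impact_estimate = change_participants - change_comparison       # 8.0
print(f"Estimated impact: {impact_estimate:+.1f}")
```

The key assumption behind this approach is that the two groups would have followed parallel trends in the absence of the program.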

Using intensive, individualized math tutoring to boost academic outcomes of disadvantaged youth: An interview with Jonathan Guryan, Professor, Northwestern University – Episode #121

While improving the schooling outcomes of disadvantaged youth is a top policy priority in the United States, few interventions have produced convincing evidence that they can improve those outcomes, especially for adolescents — the age at which socially costly outcomes, such as high school dropout, occur. As a result, conventional wisdom may hold that by adolescence it is too late and too costly to improve the academic outcomes of children in poverty. A recent study and Hamilton Project policy proposal suggest that this conventional wisdom is wrong. The study used a rigorous evaluation design — a randomized controlled trial — to examine the effects of intensive, individualized (two students to one tutor) math tutoring among 9th and 10th grade boys in twelve Chicago public schools.

To learn more, we are joined by one of the study’s nine authors, Jonathan Guryan. He is a professor of human development and social policy at Northwestern University and a fellow at Northwestern’s Institute for Policy Research.

Lessons from the nation’s first Social Impact Bond, aimed at reducing recidivism among adolescent offenders at Rikers Island: An interview with Gordon Berlin, President, MDRC – Episode #120

A Social Impact Bond (SIB) uses private funds — from philanthropy or other investors — to pay for social, educational, or health programs. Importantly, the government repays investors, plus a return, only if pre-specified results are achieved. A new report by Gordon Berlin, the president of the nonprofit social policy research firm MDRC (@MDRC_News), reflects on the experience of SIB (also called pay for success) projects to date, including the nation’s first SIB, at the Rikers Island jail in New York City, for which MDRC was the intermediary. As the report notes, while SIBs are the social sector’s hottest “impact investing” strategy, they have generated a range of reactions, from excitement to angst.
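
As a rough illustration of the repayment mechanics described above, here is a small Python sketch. The return rate and the all-or-nothing payout rule are simplifying assumptions; real SIB contracts, including the Rikers Island deal, use their own negotiated terms and often tiered payouts.

```python
def sib_repayment(principal: float, target_met: bool,
                  return_rate: float = 0.05) -> float:
    """What the government owes investors under a simplified SIB:
    principal plus a return if the pre-specified outcome target is
    met, nothing otherwise. The 5% rate and all-or-nothing structure
    are illustrative assumptions, not terms from any actual deal."""
    return principal * (1.0 + return_rate) if target_met else 0.0

# If outcome targets are hit, a hypothetical $7,200,000 investment
# is repaid with a return; if not, investors absorb the loss.
print(sib_repayment(7_200_000, target_met=True))   # ~7,560,000
print(sib_repayment(7_200_000, target_met=False))  # 0.0
```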

In our interview, Gordon Berlin reflects on the Rikers Island SIB as well as broader lessons from SIB projects to date, including:

  • What types of SIB projects are likely to work best
  • The role of evaluation in SIB projects
  • Why philanthropy could and should play a central role in SIBs

How the Institute of Education Sciences at the U.S. Dept. of Education is helping the education field to learn and do what works: An interview with Russ Whitehurst, Senior Fellow, The Brookings Institution – Episode #119

Over the last 15 years, the field of education has become considerably more evidence focused, including a growing number of high-quality studies about how to help students succeed in school. An important catalyst has been the Institute of Education Sciences (IES). It is the independent, non-partisan statistics, research, and evaluation arm of the U.S. Department of Education. Created in 2002 during the George W. Bush Administration, it has continued to flourish under the Obama Administration and today has a budget of about $670 million and a staff of 180.

To learn more, including lessons for other public agencies, we’re joined by Russ Whitehurst. He was the first director of IES and served in that role from 2002 to 2008. Today he is a Senior Fellow at the Brookings Institution, including serving as editor of the Evidence Speaks series.

Web extra: Russ Whitehurst describes the origins of IES, including some of the key people involved in its creation and launch. [click here]

Milwaukee’s three-pronged strategy to reduce teen pregnancy: An interview with Bevan Baker, Commissioner of Health, City of Milwaukee, and Nicole Angresano, United Way of Greater Milwaukee – Episode #118

Milwaukee’s Teen Pregnancy Prevention Initiative, launched in 2008, is a citywide effort led by the United Way of Greater Milwaukee and Waukesha County. It has been recognized as a model of community collaboration, including by the White House Council for Community Solutions.

The three main prongs of the strategy are:

  • An aggressive advertising campaign targeted to teens
  • The use of evidence-based sex education
  • The involvement of community partners to support the strategy

The initiative set a goal in 2008 to reduce Milwaukee’s teen birth rate by 46% by 2015. It exceeded that goal three years early, with a 50% decline in teen births by 2012. During the same period, national teen birth rates also declined sharply, although Milwaukee’s decline slightly outpaced the national average. More broadly, the initiative is a leading example of an “all-hands-on-deck” community partnership focused on a key community challenge.

To learn more, we’re joined by two people who have been at the center of the initiative. Bevan Baker has been the Commissioner of Health in Milwaukee since 2004. And Nicole Angresano is the Vice President of Community Impact at the United Way of Greater Milwaukee and Waukesha County.

Three strategies to promote relevance in program evaluations so that findings are useful to policymakers and practitioners: An interview with Evan Weissman, Senior Associate, MDRC – Episode #117

In program evaluation, using the most rigorous methods possible is essential for producing credible research findings. But beyond the goal of rigor, relevance is important too. In particular, the more that evaluations are able to address specific research or implementation questions that are of interest to practitioners and policymakers, the more likely that the findings will actually get used.

A rigorous evaluation (using a randomized controlled trial) of a student-aid initiative called Aid Like a Paycheck recently took three additional steps, beyond typical program evaluation practice, to ensure that the study produces information that is relevant to end users. The strategies will be of interest not only to other program evaluators but also to foundations and other funders who want to support rigorous and relevant program evaluations. The strategies are:

  • Implementing a pilot phase — in fact, one that ran longer than most (about 2 1/2 years);
  • Forming an advisory group of stakeholders to provide input into the design of both the intervention and the research study; and
  • Doing outreach to other stakeholders about both the preliminary intervention design and research design to get additional input.

To learn more, we’re joined by the evaluation’s lead researcher, Evan Weissman. He is a Senior Associate at the nonprofit research firm MDRC and has over 15 years of experience at MDRC directing projects, providing technical assistance, conducting qualitative research, and disseminating findings in a wide range of education and social policy settings.