Creating a results-focused city government: An interview with Michael Nutter, former Mayor of Philadelphia – Episode #138

What is the value of evidence and data for elected city leaders, and how can those leaders create a results-focused culture within city government? We get insights from Michael Nutter, who served for eight years as the Mayor of Philadelphia, from 2008 to January 2016. Under his leadership, Philadelphia became known as a leader in the use of data and evidence.

In particular, the Nutter Administration established strategic goals with measurable targets; launched PhillyStat, Philadelphia’s performance management system; established Philadelphia’s open data policy in 2012 and launched an open data portal in 2015; and launched Philly 311, the city’s online customer service system.

Today Michael Nutter is a CNN political commentator, a professor at Columbia University, a fellow at the University of Chicago’s Institute of Politics and a senior fellow with the What Works Cities initiative, among other roles.

Making rigorous program evaluation easier with RCT-YES software: An interview with Peter Schochet, Fellow, Mathematica Policy Research – Episode #137

Public leaders — whether they’re helping run a state agency, a school system, a hospital, a set of Head Start centers or any other organization — are likely to implement changes over time, whether it’s adjusting programs or adding new services. Maybe it’s a new curriculum for students in a school district or a new intake procedure for patients in a hospital. Whatever the change, how can those leaders determine if the change is actually effective?

Our focus today is new software, called RCT-YES, designed to help public leaders (and the researchers who work with them) answer that question. It was funded by the Institute of Education Sciences, the statistics, research, and evaluation arm of the U.S. Department of Education, and developed in partnership with Mathematica Policy Research. The software, available free to download online, is based on new statistical methods for analyzing data from randomized controlled trials (RCTs).
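As background, the core analysis of a randomized controlled trial is an estimate of the difference in average outcomes between the treatment and control groups. The sketch below is a hypothetical illustration of that idea using simulated data — it is not part of RCT-YES and does not reflect its internals or interface.

```python
import random
import statistics

# Simulated data: a made-up intervention with a true effect of about
# 2 points on a 0-100 outcome scale. All values here are invented.
random.seed(0)
control = [random.gauss(50, 10) for _ in range(500)]
treatment = [random.gauss(52, 10) for _ in range(500)]

def impact_estimate(treat, ctrl):
    """Difference-in-means impact estimate with a simple standard error."""
    diff = statistics.mean(treat) - statistics.mean(ctrl)
    se = (statistics.variance(treat) / len(treat)
          + statistics.variance(ctrl) / len(ctrl)) ** 0.5
    return diff, se

diff, se = impact_estimate(treatment, control)
print(f"estimated impact: {diff:.2f} (standard error {se:.2f})")
```

Because assignment to the two groups is random, the difference in means is an unbiased estimate of the program’s effect; software like RCT-YES automates this kind of analysis (and more sophisticated variants) for practitioners.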

To learn more, we are joined by Peter Schochet. He is a nationally known methodological expert in program evaluation and a Senior Fellow at Mathematica. He led the team that developed RCT-YES.

Web extra: For those with deeper expertise in evaluation, Peter Schochet gives an overview of how the RCT-YES software is designed to conduct a wide range of analyses using RCT or QED data and how the software uses new statistical methods for analyzing those data. [click here]

Lessons in applying behavioral insights to human services from the Behavioral Interventions to Advance Self-Sufficiency (BIAS) project: An interview with Lashawn Richburg-Hayes and Nadine Dechausay, MDRC – Episode #136

In 2010, the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services launched a project to explore how programs could advance their goals, and address specific challenges, by applying insights from behavioral sciences, including behavioral economics. It is called the Behavioral Interventions to Advance Self-Sufficiency (BIAS) project. Now, six years later, it has results from 15 randomized experiments conducted across seven states on the topics of employment, child support and childcare.

To get an overview and hear implementation lessons for human services agencies that might want to use these types of interventions — or “nudges,” as they are often called — we are joined by two researchers from the social policy research firm MDRC, which was a partner on the BIAS project. Lashawn Richburg-Hayes is a Director and Nadine Dechausay is a Research Associate at MDRC.

More information: For more information on the 15 projects, including their goals, strategies, results and costs, see MDRC’s PowerPoint presentation presented at the BIAS Capstone Convening in April 2016 [click here].

How states and localities are improving the quality of education, health, and human services through integrated data systems: An interview with Dennis Culhane, Professor, University of Pennsylvania – Episode #135

Programs and agencies in government often exist in silos, where the efforts of one aren’t necessarily connected with others and their data are not shared between them. That slows the process within government of learning what works, coordinating efforts, spurring social innovation, and improving continuously.

A growing number of states and localities, however, are developing Integrated Data Systems by linking their program data, also called administrative data, across multiple agencies to monitor and track how services are being used and to what effect. These systems can also be used to test social policy innovations through quick, low-cost randomized controlled trials and quasi-experiments, as well as to support continuous quality improvement efforts and benefit-cost analysis.
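At its core, linking administrative data means joining records from different agencies on a shared identifier. The sketch below illustrates the idea with two invented agency tables joined on a hypothetical de-identified client ID — the field names and values are made up for illustration and do not describe any particular jurisdiction’s system.

```python
import pandas as pd

# Invented administrative records from two hypothetical agencies,
# keyed on a shared, de-identified client ID.
housing = pd.DataFrame({
    "client_id": [101, 102, 103],
    "shelter_nights": [12, 0, 45],
})
health = pd.DataFrame({
    "client_id": [101, 103, 104],
    "er_visits": [3, 7, 1],
})

# An outer join keeps clients seen by either agency; the indicator
# column records which system(s) each person appears in.
linked = housing.merge(health, on="client_id", how="outer", indicator=True)
print(linked)
```

A cross-agency view like this is what lets analysts ask questions no single agency’s data can answer — for example, whether clients with heavy shelter use also have high emergency-room utilization.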

To learn more, we’re joined by Dennis Culhane. He is a professor in the School of Social Policy and Practice at the University of Pennsylvania and the co-principal investigator of the nonprofit Actionable Intelligence for Social Policy (AISP), which has built a network of jurisdictions using Integrated Data Systems.

Credit: The graphic above is from AISP’s overview of Integrated Data Systems.

Why broadening access to Federal administrative data is critical for improving government services and increasing taxpayer value: An interview with Maria Cancian, Professor, University of Wisconsin-Madison – Episode #134

Federal programs produce a lot of data — known as administrative data — and those data can be very useful for program administrators and researchers to answer important questions about policy and practice. That is especially true when data from multiple programs or datasets are linked, producing a broader view of program performance that spans organizational silos.

In short, access to administrative data is critical to making Federal programs and policies more effective and efficient. Today, however, access to data can be so restricted that conducting research and analysis can be very difficult.

Our guest today has a vision for how that could be different and why greater access to data is important. Maria Cancian is a Professor of Public Affairs and Social Work at the University of Wisconsin–Madison and the former Director of the Institute for Research on Poverty. From 2015 to 2016, she served as the Deputy Assistant Secretary for Policy at the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services.

Additional resource: The White House Office of Management and Budget recently created a set of background briefs on using administrative data, prepared for the Commission on Evidence Based Policymaking [click here].

Test, learn and adapt – How public agencies can use researcher-practitioner partnerships to test low-cost, light-touch interventions: An interview with Adam Sacarny, Professor, Columbia University – Episode #133

How can public agencies use rapid, low-cost experiments to test (and learn from) low-cost, light-touch interventions such as communications and outreach strategies? Also, how can agencies partner with academic researchers to run those experiments, and what characteristics of those researcher-practitioner partnerships help make them successful?

To get insights into all those topics, we are joined by Adam Sacarny (@asacarny). An economist by training, he is a professor of health policy and management at the Mailman School of Public Health at Columbia University. He has been working with both the U.S. Department of Health and Human Services (HHS) and the State of Colorado on communications-related experiments. Both HHS and Colorado are taking a “test, learn and adapt” approach by testing out certain interventions using relatively quick, low-cost randomized controlled trials, learning from the results, and refining the strategies to be tested again.

Web extra: Adam Sacarny discusses the communications-related experiment he conducted with the State of Colorado around helping Colorado citizens choose a health plan on the ACA marketplace that best fits their needs. [click here]

How Utah became a leader in evidence-based policymaking: An interview with Kristen Cox, Director, Governor’s Office of Planning and Budget, and Jonathan Ball, Director, Utah Fiscal Analysts Office – Episode #132

Utah is one of the top states in the U.S. in terms of evidence-based policymaking and budgeting. In particular, with efforts by the Utah State Legislature and the administration of Governor Gary Herbert, Utah has created a variety of agency-specific and cross-agency tools to incorporate evidence into policy and funding decisions. That includes:

  • A requirement from the Governor’s budget office that agencies seeking new funding provide evidence of program efficiency and effectiveness and, for new programs, describe their program evaluation strategy.
  • The Herbert administration’s use of a performance management framework for agencies.
  • A requirement by the Legislature that proposals for new or significantly expanded programs require a performance note that describes how the program will measure its success (with followup by legislative auditors to track results).
  • A statewide registry of evidence-based prevention interventions that guides the Utah Division of Substance Abuse and Mental Health in contracting decisions.
  • The use of a comprehensive cost-benefit model in juvenile justice to help lawmakers identify evidence-based policies that provide the best return on taxpayers’ investment.

To learn more, we are joined by Kristen Cox, the Director of the Office of Planning and Budget for Governor Gary Herbert, and Jonathan Ball, the Director of the Utah Fiscal Analysts Office for the Legislature.

How the State of Mississippi uses evidence-based budgeting to increase return on investment and improve program outcomes: An interview with Toby Barker, Mississippi State Representative – Episode #131

Over the past several years, the State of Mississippi has taken important steps to use evidence in order to get better results from state spending and, in turn, achieve better outcomes for the people of Mississippi. That includes defining tiers of evidence to focus funding on what works; creating comprehensive program inventories that categorize the level of evidence relating to each program’s effectiveness; and reinvigorating the state’s use of evidence-based budgeting (also known as performance-based budgeting), including using a set of questions to guide funding decisions called the “Seven Elements of Quality Program Design.”

To learn more, we’re joined by someone who has been closely involved in these efforts, Toby Barker (@toby_barker). First elected to the legislature in 2007 at the age of 25, today he is the Chairman of the Performance-Based Budgeting Committee, which launched in 2016. He also sits on several other committees, including Appropriations. He is a Republican state representative whose district covers Central Hattiesburg.

More information: Mississippi is part of the Pew-MacArthur Results First Initiative, which Toby Barker references in the interview. For more information, see the Gov Innovator podcast interview with Gary VanLandingham of Results First. [click here]. Also see the initiative’s Guide to Evidence-Based Budget Development.

How the UK’s Education Endowment Foundation (EEF) is building rigorous evidence about how to close education achievement gaps: An interview with Sir Kevan Collins, Chief Executive, EEF – Episode #130

The Education Endowment Foundation (EEF) is dedicated to breaking the link between family income and educational achievement. To do that, it has a unique strategy: increasing the supply of high-quality evidence about what works in order to enable better decisions by teachers and school leaders. Launched in 2011 with a founding grant of £125 million from the UK Department for Education, today it operates as an independent grant-making nonprofit. With investment and fundraising income, it intends to award about £220 million over 15 years.

Remarkably, today about one in four schools in the UK (7,600 schools, involving more than 750,000 students) is taking part in some type of EEF-funded randomized controlled trial to learn what works in education policy and practice — or to learn how best to convey evidence-based approaches to teachers and encourage their use. To date, EEF has funded 130 projects; awarded £75 million in funds; partnered with 26 independent evaluation teams; published 60 reports; and launched the Teaching and Learning Toolkit and Early Years Toolkit.

To learn more, we are joined by EEF’s founding Executive Director, Sir Kevan Collins. He has worked in the public sector for over 30 years, including serving as Chief Executive of the London Borough of Tower Hamlets and, before that, as Director of Children’s Services for that borough.

Transforming Federal grant programs from compliance driven to results focused: An interview with Robert Gordon, former Acting Deputy Director, White House Office of Management and Budget – Episode #129

If you think about what the Federal government does, grant making may not be the first thing you think of. Even so, billions of dollars flow from the Federal level to states, localities and nonprofits in the form of grants. How can the Federal government encourage more evidence-based policy and innovation through the grant making process?

We get insights from Robert Gordon who held top leadership roles at the White House Office of Management and Budget and the U.S. Department of Education — and was one of the architects of the Obama Administration’s evidence agenda. He’s also the co-author with Ron Haskins of a bipartisan agenda for strengthening the use of data and evidence, published in the book Moneyball for Government. He is currently a Senior Vice President at the College Board.

In the interview, he discusses three of the grant-related strategies presented in the “Moneyball” chapter. They are for Federal agencies to:

  • conduct grant-program “look backs” to replace mandates for processes with incentives for outcomes;
  • transform existing formula and competitive grants to use more evidence; and
  • create new flexibility to test new approaches to fighting poverty.

Web extra: Robert Gordon discusses how evidence-based policy can be an area of agreement between leaders from different political parties around the goal of spending smart. [click here]