Using opportunistic experiments to learn what works: An interview with Peter Schochet, Senior Fellow, Mathematica Policy Research – Episode #63

How can the public sector, including school districts, make the most of opportunities to learn what works — in other words, to fill knowledge gaps about effective policies and practices? In this interview, we discuss an important tool for doing that: opportunistic experiments. These experiments, i.e., randomized controlled trials (RCTs), are “opportunistic” because they focus on policy or program changes that are already being planned (not changes done specifically for a study) and typically use administrative data that are already being collected. As a result, they can be less disruptive and less costly than traditional experiments. In short, they’re a way for public leaders to do more experimentation and learning.
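To make the idea concrete, here is a minimal, hypothetical Python sketch (the function and site names are invented for this example) of the one extra step an opportunistic experiment adds to a planned rollout: randomly assigning which sites receive the change first.

```python
import random

def assign_rollout(sites, seed=2024):
    """Randomly split sites into an early-adopter (treatment) group and a
    later-adopter (control) group for a phased program rollout."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    shuffled = list(sites)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

# Hypothetical example: four schools in a district-wide rollout
groups = assign_rollout(["School A", "School B", "School C", "School D"])
```

Because the change was already planned, the randomization of who receives it first is the main added step; outcomes for the two groups can then be compared using administrative data the district already collects.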

To get an overview of opportunistic experiments, we’re joined by Peter Schochet of Mathematica Policy Research, who is a nationally known expert on rigorous program evaluations. Earlier this year, Mathematica authored a how-to guide to using opportunistic experiments in education, as well as a guide specifically for school district leaders and principals, both funded and published by the U.S. Department of Education.


Strengths and misperceptions of Social Impact Bonds: An interview with Jeffrey Liebman, Professor, Harvard Kennedy School – Episode #62

Social Impact Bonds (SIBs) — also known as Pay for Success — are relatively new in the U.S., but interest has grown quickly at the local, state and federal levels in just a few years. Two states (NY and MA), as well as New York City, are already implementing SIBs and about a dozen other states and cities are designing them or are considering doing so.

To learn more about Social Impact Bonds, we’re joined by one of the nation’s leading experts, Jeffrey Liebman. He is a Professor at the Harvard Kennedy School, Director of the Taubman Center for State and Local Government, and Director of the Social Impact Bond Technical Assistance Lab (SIB Lab) at the Kennedy School. He previously served on the leadership team of the White House Office of Management and Budget in the Obama Administration.

In our interview, we discuss:

  • the key features of SIBs
  • why interest has grown so quickly among public leaders
  • common misperceptions about the approach
  • the role of the SIB Lab


Designing well-crafted mission statements: An interview with Sharon Oster, Professor, Yale School of Management – Episode #61

A well-crafted mission statement is an important building block for any results-focused public sector organization, whether it’s a department, agency, office or even team. Sharon Oster joins us to talk about crafting and using mission statements. She is a Professor of Management and Entrepreneurship at the Yale School of Management, where she previously served as the Dean. She’s also the author of the book Strategic Management of Nonprofit Organizations.

In the interview, Professor Oster discusses three functions of well-crafted mission statements: 1) setting boundaries; 2) motivating internal and external stakeholders; and 3) evaluating organizational performance. Staff within organizations that are crafting a new mission statement — or who are assessing their current one — can use these three functions to help them create a compelling and useful mission statement and then to put it into practice.


Becoming an evidence-focused grant-making organization: An interview with Kelly Fitzsimmons, Vice President, Edna McConnell Clark Foundation – Episode #60

How can grant-making agencies in the public sector strengthen their capacity to build and use rigorous evidence, and help their grantees do so too? We gain insights from a successful evidence-focused grant-making organization outside of government, the Edna McConnell Clark Foundation.

A hallmark of the Foundation’s approach is a focus on evidence. It chooses and structures its investments largely on the basis of empirical evidence that a grantee or potential grantee’s programs help economically disadvantaged young people. And a major objective of the Foundation’s investments is to help grantees build their own evidence base. In doing this work, it uses a framework to assess an organization’s evidence of effectiveness on a continuum from high apparent effectiveness to demonstrated effectiveness to proven effectiveness.

To learn more, we’re joined by Kelly Fitzsimmons who is the Foundation’s Vice President and Chief Program and Strategy Officer.

Web extras: Kelly Fitzsimmons discusses:

  • The Foundation’s pilot program PropelNext, designed to help grantees in early stages of evidence building to systematically collect and analyze data [click here]
  • The growing emphasis among nonprofits around rigorous evidence and evaluation [click here]
  • Her advice to organizations just starting their journeys to become more evidence-focused [click here]

Additional resources: To learn more about tiered-evidence grant programs, in which larger grant dollars go to approaches backed by stronger evidence (a.k.a. innovation funds), see the video tutorial on the blog. Also, to see another example of a framework to assess evidence levels, see the regulations for the Investing in Innovation (i3) program at the Department of Education, page 18683, which discuss criteria for development, validation and scale-up grants.

How program managers can use low-cost experiments to improve results: A video overview – Episode #59

How can public leaders and program managers use low-cost experiments — also known as low-cost randomized controlled trials (RCTs) — to improve program results in government? In this blog post, rather than conducting an interview as I usually do, I provide a video overview of the topic, with examples. An audio version is also available, above.

As I explain in the video, low-cost experiments use existing high-quality data that are already being collected, which can bring the cost of rigorous evaluation way down. As a result, low-cost experiments can be a valuable complement to more traditional evaluation approaches and open up more opportunities for program managers to experiment and learn what works.
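As a hypothetical sketch of what the analysis step can look like (the function and data here are invented for illustration), the headline estimate from a low-cost experiment can be as simple as a difference in mean outcomes between the randomized groups, computed from data already being collected:

```python
def difference_in_means(treatment_outcomes, control_outcomes):
    """Estimate the average treatment effect as the difference in mean
    outcomes between the randomized treatment and control groups."""
    mean_t = sum(treatment_outcomes) / len(treatment_outcomes)
    mean_c = sum(control_outcomes) / len(control_outcomes)
    return mean_t - mean_c

# Illustrative administrative data (e.g., assessment scores) for each group
effect = difference_in_means([80, 85, 78], [70, 75, 74])  # → 8.0
```

In practice an evaluator would add standard errors and possibly covariate adjustment, but it is the random assignment that makes even this simple comparison a credible estimate of the program’s effect.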

Additional resource: An interview on the blog with Jon Baron of the Coalition for Evidence-Based Policy on “Rigorous program evaluation on a budget,” highlighting more examples of low-cost experiments, is located here.

Implementing a department-wide innovation strategy: An interview with Bryan Sivak, Chief Technology Officer, U.S. Department of Health & Human Services – Episode #58

How can public agencies at the federal, state or local levels spur innovation to tackle tough problems and find ways to better achieve their missions? To gain insights, we’re joined by Bryan Sivak (@BryanSivak), the Chief Technology Officer at the U.S. Department of Health and Human Services (HHS). Under his leadership, HHS launched its IDEA Lab in 2013, which has already catalyzed more than 100 innovation projects. Prior to his current role, he was the Chief Innovation Officer to Maryland Governor Martin O’Malley, Chief Technology Officer for the District of Columbia and a technology entrepreneur in the private sector.

As Bryan explains, the three main strategies of the HHS IDEA Lab are:

  • Supporting innovators from within the department, e.g., the HHS Innovates initiative that identifies and celebrates internal innovation by employees
  • Bringing new ideas and concepts into the department, e.g., the HHS entrepreneurs and HHS innovators-in-residence initiatives that bring in innovators from outside the department to help tackle important challenges
  • Mobilizing communities of practice to work on discrete challenges or ongoing, cross-cutting initiatives that require creative thinking and new solutions, e.g., the HHS Health Data Initiative

Bryan also provides broader advice for public leaders who want to strengthen a culture of innovation in their agencies. The clip of this portion of the interview is available here.

Six ways government can use incentive prizes to spur innovation: An interview with Jesse Goldhammer, Principal, Deloitte Consulting – Episode #57

Incentive prizes — also known as prize competitions or challenges — are increasingly being used to spur innovation and address key challenges by public agencies at the federal, state and local levels. A recent report published by Deloitte Consulting, The Craft of Incentive Prize Design: Lessons from the Public Sector, provides insights and advice, including discussing six main outcomes, or goals, that different incentive prizes are designed to address: 1) attract new ideas; 2) build prototypes and launch pilots; 3) stimulate markets; 4) raise awareness; 5) mobilize action; and 6) inspire transformation.

Joining us to discuss the report is one of its co-authors, Jesse Goldhammer. He is a Principal with Deloitte Consulting.

Web extras: Jesse Goldhammer discusses how incentive prizes are used to mobilize action [click here] and inspire transformation [click here]. He also provides advice about crafting incentive prizes [click here].

Additional resource: In a related Gov Innovator interview, Jenn Gustetic of NASA discusses that agency’s use of prizes and challenges [click here].

Boosting the life chances of young men of color: An interview with Dan Bloom, Director of Health and Barriers to Employment Policy Area, MDRC – Episode #56

Despite progress in many areas, young men of color still face many obstacles to success in education, employment and other areas. Today, there is growing momentum through government and other efforts to improve outcomes for young men of color, including New York City’s Young Men’s Initiative and the Obama Administration’s My Brother’s Keeper initiative. A recent report by the social policy research firm MDRC titled Boosting the Life Chances of Young Men of Color reviews what we know about interventions for young men of color that rigorous research has shown to be effective.

To discuss the report’s findings, we’re joined by Dan Bloom who, with Christopher Wimer, authored the report. Dan is the Director of the Health and Barriers to Employment Policy Area at MDRC.

Web extra: Dan Bloom discusses a promising area for future rigorous evaluation: strategies for exposing disadvantaged high school students to the labor market [click here]

Using logic models, a key building block of results-focused programs: An interview with Tom Chapel, Chief Evaluation Officer, Centers for Disease Control and Prevention – Episode #55

Just like mapping out a journey before embarking on a trip, logic models provide a type of map for programs about where they want to go and how they plan to get there. To learn more about logic models and how they can be useful to programs and public managers, we’re joined by an expert on the topic, Tom Chapel. He’s the Chief Evaluation Officer at the Centers for Disease Control and Prevention (CDC).

The interview provides an overview of:

  • Why a clear program description is important
  • What logic models are and how they’re used
  • How the CDC is using logic models to clarify grantee proposals
  • Advice to program leaders and public managers about using logic models

Web extra: Tom Chapel discusses some additional key terms often used in logic models. [click here]

Using predictive analytics and rapid-cycle evaluation to improve program design and results: An interview with Scott Cody, Vice President, Mathematica Policy Research – Episode #54

What are predictive analytics and rapid-cycle evaluation and how can public agencies and programs use them to improve program delivery and outcomes? To explore these questions, we’re joined by Scott Cody. He’s a Vice President of Mathematica Policy Research and the co-author, with Andrew Asher, of a recent paper “Smarter, Better, Faster: The Potential for Predictive Analytics and Rapid-Cycle Evaluation to Improve Program Development and Outcomes,” published by the Hamilton Project at the Brookings Institution.

Web extra: Scott Cody provides two suggested steps for public agencies that want to strengthen their ability to use predictive analytics and rapid-cycle evaluation. [click here]