Impact Evaluation Archive

“Oops! Did I just ruin this impact evaluation?” Top 5 mistakes and how the new Impact Evaluation Toolkit can help. | News, views, methods, and insights from the world of impact evaluation

Does Business Training Work? | News, views, methods, and insights from the world of impact evaluation

What do we really know about how to build business capacity? A nice new paper by David McKenzie and Chris Woodruff takes a look at the evidence on business training programs – one of the more common tools used to build up small and medium enterprises. They do some work to make the papers reasonably comparable, which helps us add up the totality of the lessons. What’s more, as David and Chris go through the evidence, they come up with a lot of interesting (and some not-so-obvious) lessons for actually doing impact evaluation of business training programs – I’ll save those lessons for next week and today talk about why we don’t know that much.

To take stock of the available evidence on business training, David and Chris search Econlit and Google Scholar and then go out and ask folks for studies. They limit their discussion to papers that have tried to deal with selection on observables and unobservables and that focus on business practices (thus not taking on the substantial technical/vocational training literature). This gives them 14 studies to focus on (13 of which are randomized). These are almost all some sort of classroom training (sometimes combined as part of a microfinance program), but they also briefly discuss three other experiments which focus on providing individual consulting services.

Chris and David point out that for the average business a 25% increase in profit might cover 75% of the cost of the program over a year. But none of the studies was powered for a 25% increase in revenues and only two were powered for a 25% increase in profits (at 80% power). So we shouldn’t be surprised by the relatively weak overall results on these programs. And keep in mind that this is a fairly new area – of their 14 studies, only 5 have been published so far, and the oldest one dates from 2010 – so we are still learning about doing evaluations in this area.
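
To get a feel for why power is such a binding constraint, the sketch below runs a standard two-sample power calculation for a 25% increase in profits. The baseline profit and its standard deviation are hypothetical placeholders rather than figures from the paper; the point is simply that noisy small-firm profits turn a 25% increase into a small standardized effect, which calls for several hundred firms per treatment arm.

```python
# Minimal power-calculation sketch with illustrative numbers (not from the paper).
# Question: how many firms per arm are needed to detect a 25% increase in profits
# with 80% power at the 5% significance level?
from statsmodels.stats.power import TTestIndPower

baseline_profit = 100.0  # hypothetical mean profit (arbitrary currency units)
sd_profit = 150.0        # hypothetical standard deviation; small-firm profits are noisy
effect_size = 0.25 * baseline_profit / sd_profit  # standardized (Cohen's d) effect of about 0.17

n_per_arm = TTestIndPower().solve_power(effect_size=effect_size,
                                        alpha=0.05,
                                        power=0.80,
                                        alternative='two-sided')
print(f"Firms needed per arm: {n_per_arm:.0f}")  # roughly 565 with these placeholder values
```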

Does Business Training Work? | News, views, methods, and insights from the world of impact evaluation.

Impact Evaluation Toolkit | The World Bank

Measuring the Impact of Results-Based Financing on Maternal and Child Health

This World Bank toolkit offers a step-by-step guide on how to evaluate the impact of interventions, especially those related to maternal and child health and those involving results-based financing (RBF). According to its developer, the World Bank Human Development Network, the guide can also be easily adapted for impact evaluation (IE) in other fields.
The toolkit includes:
• best practices for each stage of the IE cycle, such as how to choose evaluation questions, build a team, design an evaluation, and collect and analyse data;
• more than 50 technical tools accompanying each module, including terms of reference for IE team members and survey firms, a list of Maternal and Child Health (MCH) indicators of interest, research protocols, household and facility questionnaires, data-entry programmes, enumerator training manuals and curricula, field-work supervision materials, and data analysis tools.
These standardised tools can facilitate cross-country comparisons of the results of RBF projects.

By Christel Vermeersch, Elisa Rothenbühler, and Jennifer Sturdy, June 2012. Download from
http://wbgfiles.worldbank.org/documents/hdn/he/PortfolioIEToolkit061512.pdf (PDF, 69.760 kB)

Paper v Plastic: The survey revolution is in progress

Markus Goldstein discusses the pros and cons of Computer Assisted Personal Interviewing (CAPI) vs. paper questionnaires in this World Bank blog post.

Paper v Plastic Part I: The survey revolution is in progress | News, views, methods, and insights from the world of impact evaluation.

MEbox: Paper or Plastic? Part II: Approaching the survey revolution with caution.

World Bank | Handbook on Impact Evaluation | Quantitative Methods and Practices

Handbook on Impact Evaluation: Quantitative Methods and Practices, by Shahidur R. Khandker, Gayatri B. Koolwal, Hussain A. Samad, The World Bank, Washington DC, 2010, online at http://tinyurl.com/cm4q99v

Identifying the precise effects of a policy is a complex and challenging task. This issue is particularly salient in an uncertain economic climate, where governments are under great pressure to promote programs that can recharge growth and reduce poverty. At the World Bank, our work is centered on aid effectiveness and how to improve the targeting and efficacy of programs that we support. As we are well aware, however, times of crisis as well as a multitude of other factors can inhibit a clear understanding of how interventions work—and how effective programs can be in the long run.

Handbook on Impact Evaluation: Quantitative Methods and Practices makes a valuable contribution in this area by providing, for policy and research audiences, a comprehensive overview of steps in designing and evaluating programs amid uncertain and potentially confounding conditions. It draws from a rapidly expanding and broad-based literature on program evaluation—from monitoring and evaluation approaches to experimental and nonexperimental econometric methods for designing and conducting impact evaluations.
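
To make “nonexperimental econometric methods” a little more concrete, here is a minimal difference-in-differences sketch on simulated data; the variable names, sample size, and true effect are hypothetical illustrations and are not taken from the handbook.

```python
# Illustrative difference-in-differences sketch on a simulated two-period panel.
# Everything here (names, sizes, the true effect of 2.0) is made up for demonstration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000  # observations per period

df = pd.DataFrame({
    "treated": np.tile(np.repeat([0, 1], n // 2), 2),  # program vs. comparison group
    "post": np.repeat([0, 1], n),                      # before vs. after the program
})
# Outcome with group and time effects plus a true treatment effect of 2.0
df["y"] = (1.0 * df["treated"] + 0.5 * df["post"]
           + 2.0 * df["treated"] * df["post"]
           + rng.normal(size=2 * n))

# The coefficient on treated:post is the difference-in-differences estimate
model = smf.ols("y ~ treated * post", data=df).fit(cov_type="HC1")
print(model.params["treated:post"])  # should be close to 2.0
```

With real data one would also need to defend the parallel-trends assumption and adjust standard errors appropriately; the code above only sketches the mechanics.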

Strategic Impact Evaluation Fund (SIEF), World Bank

The Strategic Impact Evaluation Fund is a new trust fund established within the World Bank to carry out and support research evaluating the impact of programs to alleviate poverty. The knowledge generated will provide evidence for designing more effective policies and programs.

The SIEF will finance impact evaluations in the areas of: 1) Early Childhood Nutrition, Health and Development, 2) Basic Education Service Delivery, 3) Health Systems and Service Delivery and 4) Water Supply, Sanitation, and Hygiene for Sustainable Human Development.

HD Office of the Chief Economist – Strategic Impact Evaluation Fund (SIEF).

Jeff Sachs, the Millennium Villages Project, and Misconceptions about Impact Evaluation

Jeff Sachs, the Millennium Villages Project, and Misconceptions about Impact Evaluation | News, views, methods, and insights from the world of impact evaluation

http://blogs.worldbank.org/impactevaluations/jeff-sachs-the-millennium-villages-project-and-misconceptions-about-impact-evaluation

News that another $72 million has been committed for a second stage of the Millennium Villages Project (MVP) has led to another round of critical discussion about what can be learned from this entire endeavor. The Guardian’s Poverty Matters blog and Lawrence Haddad at the Development Horizons blog offer some critiques. In response to the latter, Jeffrey Sachs and Prabhjot Singh offer a rather stunning reply which seems worth discussing from the point of view of what is possible with impact evaluations. The World Bank’s David McKenzie dissects some of their statements.

Improving donor support for governance: the case for more rigorous impact evaluation

Rigorous governance evaluation poses three key challenges:

• Governance outcomes are difficult to quantify.

• Rigorous impact evaluation can capture no more than narrow aspects of complex, ‘system-wide’ governance support and short-term impacts.

• Incentives are lacking, since evaluation results may constrain political choices.

Past evaluations provide valuable recommendations for tackling these challenges. To quantify governance outcomes, use context-specific information or conduct behavioural games. To deal with complex ‘system-wide’ interventions, identify components of the programme that are viable for rigorous analysis. To improve incentives, persuade politicians to implement rigorous evaluations by studying the impact of their priority interventions and by setting appropriate expectations.

Rigorous impact evaluation in governance is difficult but feasible. As donors face increasing pressure to demonstrate results, rigorous impact evaluations should work with, not against, achieving improved governance support and development outcomes.

From: Garcia, Maria Melody (2011)
Improving donor support for governance: the case for more rigorous impact evaluation
Bonn: Deutsches Institut für Entwicklungspolitik / German Development Institute
(Briefing Paper 11/2011)

via Improving donor support for governance: the case for more rigorous impact evaluation.

Systematic Review Database | 3ie Impact Evaluation

Systematic Review Database.

The International Initiative for Impact Evaluation (3ie) has just launched the first online database of systematic reviews in international development. Funded by the UK Department for International Development, this database contains systematic reviews focusing on development interventions in low and middle income countries.

Systematic reviews examine all the existing evidence on a particular intervention or programme. This searchable database provides summaries of the findings and methodologies of existing systematic reviews and protocols, as well as links to the full reports. It currently has over 100 reviews covering wide-ranging sectors such as agriculture, education, nutrition, and health. More reviews will be added as they become available.