Methods Archive


EuropeAid – Evaluation Guidelines

The EuropeAid Evaluation Guidelines are still available. The site offers methodological bases, evaluation tools, documents, examples and a glossary. EuropeAid – Evaluation – Guidelines.


World Bank’s Blog ‘Development Impact’

World Bank’s Blog ‘Development Impact’ has news, views, methods, and insights from the world of impact evaluation.


Jeff Sachs, the Millennium Villages Project, and Misconceptions about Impact Evaluation


News that another $72 million has been committed for a second stage of the Millennium Villages Project (MVP) has led to another round of critical discussion about what can be learned from this entire endeavor. The Guardian’s Poverty Matters blog and Lawrence Haddad at the Development Horizons blog offer some critiques. In response to the latter, Jeffrey Sachs and Prabhjot Singh offer a rather stunning reply, which seems worth discussing from the point of view of what is possible with impact evaluations. The World Bank’s David McKenzie dissects some of their statements.


Social Research Methods

This website is for people involved in applied social research and evaluation. You’ll find lots of resources and links to other locations on the Web that deal in applied social research methods.

via Social Research Methods.


Theory-Based Impact Evaluation: Principles and Practice, by Howard White

A theory-based approach to impact evaluation maps out the causal chain from inputs to outcomes and impact, and tests the underlying assumptions. Despite wide agreement that this approach will address the ‘why’ question, it has not often been used effectively. Most studies speculate about the reasons for impact, or for differences in impact, rather than using solid empirical analysis.

This paper outlines six principles for successful theory-based impact evaluation: (1) map out the causal chain (programme theory); (2) understand context; (3) anticipate heterogeneity; (4) rigorously evaluate impact using a credible counterfactual; (5) use rigorous factual analysis; and (6) use mixed methods.
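Principle (4), evaluating impact against a credible counterfactual, can be illustrated with a toy calculation. The numbers below are entirely synthetic, and difference-in-differences is just one common way of constructing a counterfactual, not a method prescribed by White's paper:

```python
# Synthetic mean outcomes (e.g. household income) before and after a
# programme, for a treated group and an untreated comparison group.
treated_before, treated_after = 100.0, 130.0
comparison_before, comparison_after = 100.0, 110.0

# Naive before/after change for the treated group alone; this mixes
# the programme's effect with the background trend.
naive_change = treated_after - treated_before

# The comparison group's change estimates what would have happened to
# the treated group without the programme (the counterfactual trend).
counterfactual_change = comparison_after - comparison_before

# Difference-in-differences: impact net of the counterfactual trend.
impact = naive_change - counterfactual_change
print(impact)  # 20.0, versus a naive estimate of 30.0
```

The contrast between the naive change (30.0) and the counterfactual-adjusted estimate (20.0) is exactly why the paper insists on a credible counterfactual rather than simple before/after comparison.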

Source: GSDRC.


Website: Free Resources for Program Evaluation and Social Research Methods

This page lists free resources for program evaluation and social research methods. The focus is on how to do evaluation research and the methods used: surveys, focus groups, sampling, interviews, and other methods. Most of these links are to resources that can be read over the web. A few, like the GAO books, can also be ordered in print free of charge (if you live in the US) as well as read over the web.


Micro-methods in evaluating governance interventions

Although billions of dollars have been invested in improving governance in developing countries over the past decade, few of the programmes that have received funding have been subjected to rigorous impact evaluation. The paper aims to answer three key questions. First, what features of governance interventions make rigorous impact evaluation difficult? Second, what aspects of governance have been evaluated using rigorous quantitative methods? And third, what evaluation lessons can we learn from previous experience, and what practical implications do they have?

via Micro-methods in evaluating governance interventions.


Learning how to learn: eight lessons for impact evaluations that make a difference – Overseas Development Institute (ODI)


This Background Note outlines key lessons on impact evaluations, utilisation-focused evaluations and evidence-based policy. While methodological pluralism is seen as the key to effective impact evaluation in development, the emphasis here is not on methods per se. Instead, the focus is on the range of factors and issues that need to be considered for impact evaluations to be used in policy and practice, regardless of the method employed. This Note synthesises research by ODI, ALNAP, 3ie and others into eight key lessons for all of those with an interest in impact evaluation and aid effectiveness.


IDB calls for proposals about impact assessment methodologies

Mira – Banco Interamericano de Desarrollo.


The Inter-American Development Bank (IDB) is calling for proposals under “Measuring Institutional Impact on the Region of the Americas (MIRA).” The Bank will select eight proposals to apply impact assessment methodologies to institutional strengthening programs for government agencies in Latin America and the Caribbean. The deadline for submissions is June 13, 2011. Each selected proposal will receive up to $50,000 in funding.

The competition aims to help achieve the following:

* Demonstrate that institutional strengthening initiatives achieve their principal objectives and therefore contribute to economic development.
* Reduce the knowledge gap on the impact of institutional strengthening.
* Improve the design of future public policies.
* Document the need to incorporate evaluation components in the design of such interventions.

The formulation of an effective methodology for assessing the impact of programs to strengthen institutional capacity is critical to ensuring that these activities achieve their principal objectives: improving public policies, increasing competitiveness, and promoting economic development and social equity.

The MIRA competition is open to public institutions, nonprofit or for-profit organizations, academic institutions, research centers, or bilateral development agencies. The proposals should be designed for interventions implemented in the public sector, whether executed by the governmental entity itself or through a different organization.

A committee of experts will evaluate the proposals according to the clarity of their objectives, their capacity to provide empirical documentation on the impact of capacity strengthening initiatives, their ability to demonstrate causal relationships based on the results, and their incorporation of elements of cost-efficiency.

The eight winners will have the opportunity to work with IDB specialists in developing their methodologies, which will help provide them with recognition as leaders in issues of institutional capacity building and impact assessment. The winners will receive up to an additional $25,000 to replicate their methodologies in Latin America and the Caribbean.

The competition is open until June 13, 2011. For information on how to submit proposals, please visit the IDB website.


Making evaluations matter: A practical guide for evaluators – Wageningen UR

Publications – Wageningen UR – Wageningen UR Centre for Development Innovation.

Recently published by the Centre for Development Innovation, Wageningen University & Research centre, Wageningen (The Netherlands), the guide is primarily for evaluators working in the international development sector. It is also useful for commissioners of evaluations, evaluation managers and M&E officers.

The guide explains how to make evaluations more useful. It helps readers better understand conceptual issues and appreciate how evaluations can contribute to changing mindsets and empowering stakeholders. On a practical level, the guide presents core guiding principles and pointers for designing and facilitating evaluations that matter. Furthermore, it shows how to get primary intended users and other key stakeholders to contribute effectively to the evaluation process.