Evaluation in the COVID-19 pandemic mode, Part 1

by Karsten Weitzenegger, updated 7 May 2020

Evaluation can inform global policy decisions during the coronavirus pandemic. However, M&E systems themselves are affected by the current crisis. Here is a quick review of recent literature that may guide evaluators and managers in global development evaluation. Comments are appreciated, as we are preparing for a remote evaluation. Part 2 will cover lessons learnt from this experience.

  1. Policy needs evidence to get out of crisis
  2. Evaluation needs to transform itself
  3. What evaluators should do now
  4. What evaluators should avoid now
  5. Reposition your evaluation design
  6. Data collection will be different
  7. Emerging practices and tools
  8. Literature

Policy needs evidence to get out of crisis

The COVID-19 pandemic is rapidly transforming our world. Individuals, communities and organisations are facing enormous challenges and uncertainty. Limited resources have been further stretched by the climate crisis and unprecedented natural disasters. These global challenges put the Sustainable Development Goals at risk and threaten the well-being of people and the planet.

The consequences of the global crisis are severe for us all, but poor people will be hit hardest, especially in fragile countries where health systems, government structures, food supplies and social safety nets are weak. The subsequent economic effects are likely to be profoundly damaging, particularly for already vulnerable people, and could jeopardise political and economic stability. The World Bank estimates that the pandemic could push about 49 million people into extreme poverty in 2020. (SÁNCHEZ-PÁRAMO)

International and multilateral cooperation is more important now than ever. Keeping up with this fast-moving pandemic will require agility and the capacity to change course to meet new challenges, guided by a steady focus on the evidence. Every government needs robust monitoring and evaluation (M&E) systems now more than ever to design effective policies. (Alison EVANS)

Choices being made right now will shape our society for years, if not decades to come. While such a pandemic has never happened before in our lifetime, we know from our evaluations that crises can indeed become opportunities for both economic growth and environmental sustainability. Jeneen GARCIA gives examples for this.

As we move towards the next phase of the COVID-19 crisis in many countries, governments have a unique chance for a green and inclusive recovery that they must seize – a recovery that not only provides income and jobs, but also has broader well-being goals at its core, integrates strong climate and biodiversity action, and builds resilience. (GURRÍA, OECD)

Evidence from the responses to past crises offers important lessons on what worked and why. It’s important to think about what the containment measures might do. In the case of Ebola, containment was fairly effective, and from the perspective of this intervention the big differential impacts came from the facts that schools were closed, markets were closed and health centres shifted their focus to Ebola prevention and care. This meant that there were a lot of out-of-school, non-working teenagers without access to reproductive health services. Evaluations of the Ebola response in West Africa found that too often assistance was ended abruptly and in an uncoordinated manner, undermining short-term gains and the broader health system. Of course, the current crisis is global, not local, in scope, so there are a lot more dimensions to think about. (GOLDSTEIN & KONDYLIS)

The crisis opens our minds to unexpected events, both positive and negative, and there are lessons from past crises. Monitoring and evaluation can bring evidence on the impacts and consequences of the coronavirus pandemic on lives and societies, and can point to solutions to boost healthcare systems, secure our businesses, maintain jobs and education, and stabilise financial markets and economies. (Jeneen GARCIA)

The OECD EvalNet leaders (BASTØE/BRUSSE/FAUST) recommend that all upcoming humanitarian and development assistance responses be accompanied by high-quality results monitoring, evaluation and research. Ignoring past evidence and failing to invest in generating credible new evidence about what is and is not working, in which contexts and for whom, may well cost lives.

The OECD Development Assistance Committee (DAC) upholds its standards and accountability mechanisms and the commitment to sharing evidence, best practice, data and resources on what works to counter the virus. The DAC members will learn lessons from the crisis and will use the experience to inform policy choices during the recovery to fortify efforts to achieve the 2030 Agenda for Sustainable Development.

The OECD/DAC Network on Development Evaluation (EvalNet) and the Independent Evaluation Office of UNDP have jointly prepared a guidance note capturing good practices for evaluations during COVID-19.

Knowing what works and what doesn’t can help us understand where precious (and rapidly scarce) resources should go. Evaluations do just that. By looking closely at pathways and using tools such as counterfactual pathways and implementation research, we can understand how pathways of change are affected during a crisis. This can then help mitigate risks. Evaluations can help us understand how much, relatively, interventions and programs make a difference. When combined with cost data, they support cost-effectiveness estimates so that resources go to the most effective and cost-effective interventions possible. (Jo PURI)

Evaluation needs to transform itself

Hopefully this crisis can teach us that a global perspective on international aid and its evaluation is becoming a methodological imperative.

The OECD EvalNet leaders (BASTØE/BRUSSE/FAUST) worry that the pressure for urgent action can lead to critical missteps. Evidence from past interventions during global or regional crises has shown that not all well-intended actions are effective. They suggest taking into account the relevant and applicable lessons from past experience and evaluations to increase the effectiveness and relevance of current efforts.

To support this process, development partners and national and sub-national governments should begin investing now in sound data, monitoring and evaluation systems. External players such as international NGOs and international organisations should transform themselves from delivery agents into enablers, monitors and advocates, the DAC EvalNet leaders suggest.

Patricia ROGERS from BetterEvaluation details their institutional response:

  • We are working remotely to ensure the safety of our staff and community.
  • We are finalizing user experience improvements to our website to make it easier to find relevant information.
  • We are creating and curating additional content to address the current context.
  • We are continuing existing partnerships and capacity strengthening projects, adapting as needed.
  • We are exploring ways we can contribute to specific efforts to tackle these challenges locally and globally.

We have to do much more to display the full value of evaluation for a new era, states Zenda OFIR and dedicates a post series to it. COVID-19 is going to force us to position evaluation in new spaces, in new ways, as something more powerful and useful, and in ways that reflect the spirit and needs of this era. We have to rethink evaluation’s value proposition and how we can give more power to our elevator speeches. We have to move faster towards a systems- and complexity-informed view of a world in need of transformation towards sustainable development.

OLAZABAL/BAMBERGER/YORK stress the importance of real-time, reliable data and call for us to rewire how we predict and assess impact. One step toward this paradigm shift is to bring together the capabilities of people who research and measure the impacts of policies and programs on people and the planet.

The COVID-19 pandemic has created many immediate and urgent needs for more and better data and evidence. However, most evidence-generating activities in development have long timelines. How can we shorten timelines for evaluation and synthesis to respond to immediate needs for evidence? 3ie will soon offer webinars on this question.

To move forward with existing evaluation plans, Martena REED recommends developing evaluation contingency plans, including low-interference, creative evaluation methods, and building in time for human connection. As many impact indicators are built on the assumption of peace and macroeconomic stability, many results frameworks and logframes now need to be reformulated.

What evaluators should do now

Jyotsna PURI reminds us that evaluations are done with a set of values in mind: equity, access, sustainability and innovation. These values become even more important during times of crisis. Applied research and evaluations can help assess how we can get to these values in the best way possible.

Alice MACFARLAN gives management advice around the seven clusters of tasks in the BetterEvaluation Rainbow Framework.

A listening stance from staff and management of the development agencies, willing to incorporate lessons and good results frameworks along the way, is essential.

Additionally, a committed evaluator is important, ready to contribute and engage in times of emergency with timely inputs, while protecting his or her independence and impartiality. As evaluators we have three roles: advising on how best to measure and monitor, providing high-quality evidence, and making evidence and lessons much more accessible to policymakers.

Michael Quinn PATTON suggests that all evaluators must now become developmental evaluators, capable of adapting to complex dynamic systems, preparing for the unknown, for uncertainty, turbulence, lack of control, nonlinearity, and for the emergence of the unexpected. This is the current context around the world in general and this is the world in which evaluation will exist for the foreseeable future.

Connect with those who have commissioned your evaluations, those stakeholders with whom you’re working to implement your evaluations, and those to whom you expect to be reporting and start making adjustments and contingency plans. Don’t wait for them to contact you. Evaluation is the last thing on the minds of people who aren’t evaluators. (PATTON)

The UNDP IEO has published an infographic on the current opportunities for evaluators:


Source, click to enlarge: http://web.undp.org/evaluation/documents/infographics/Evaluation-during_crisis-COVID19.pdf

TAYLOR-DORMOND & TENEV identify four areas in which designers, policymakers, and evaluators can engage during the current emergency work of international financial and development institutions (World Bank, Asian Development Bank):

  1. Blueprint or framework of interventions: Bring lessons from past crisis episodes to inform policy responses.
  2. Real-time assessment: Evaluators need to work in real time to gauge the likely effectiveness of responses while they are being implemented. Evaluators must be capable of assessing short- and long-term results of interventions.
  3. Results of interventions: An assessment of results should be conducted as soon as possible when the crisis has subsided. The focus of the early evaluations should be placed on the intended short-term results and the creation of conditions for the resumption of a long-term development path.
  4. Prevention and preparedness: Forward-thinking and prospective analysis may need to be part of the crisis response itself.

While helping, evaluators of course need to watch for misleading analogies in a world in which nothing repeats itself in exactly the same way.

What evaluators should avoid now

Adjust methods according to the crisis situation, but do not drop your principles.

“Do no harm” in the current context means not propagating the spread of the virus. If you are at the stage in your impact evaluation where an intervention is happening, let the professionals decide what to do with the intervention. If you are running it yourself, it is probably best to stop. In terms of a survey: if you are about to go to the field, it is probably best to delay; if you have a team in the field, it is probably best to stop. Make sure your team will be OK. (GOLDSTEIN & KONDYLIS) Care should be taken not to place any consultant or stakeholder (national or international) in harm’s way, and proposed evaluation methodologies should limit stakeholders’ exposure to the pandemic. Approaches to ensure this should be thoroughly detailed in evaluation inception reports. (UNDP)

“Leave no one behind” means that vulnerabilities also exist in the capacities to gather and act on emerging evidence. Technology is distributed unequally. Refugees in camps may have a high level of telephone ownership, while rural women and elderly populations might have little connectivity. The OECD DAC reminds us that the response must take account of the role of women and girls, children, youth and vulnerable groups, including people with disabilities and the elderly, and aim to reduce inequalities and protect human rights and freedoms.

Think about different sub-populations that may be affected differently (e.g. in the case of Ebola, health workers died disproportionately). (GOLDSTEIN & KONDYLIS) As an evaluator, you can use your talents to help keep those with the highest needs in mind and in sight, LYSY advises.

Human rights-based, equity-focused and gender-responsive evaluations are more important than ever, to inform interventions focused on leaving no one behind. The UNFPA Evaluation Office’s recent guiding principles for adapting evaluations during the pandemic elaborate on this.

The Regional Risk Communication and Community Engagement (RCCE) Working Group’s new guide details how to include marginalized and vulnerable people in risk communication and community engagement.

Reposition your evaluation design

Now your evaluation isn’t going to be the same. The key thing will be to keep up with what’s going on and adapt (as much as possible) the research to the changes. If implementation is continuing (e.g., because this is a fundamental part of a safety net or vital infrastructure) then be prepared for the implementation modalities to shift. Some implementation is just going to be flat out delayed. (GOLDSTEIN & KONDYLIS)

From a methodological perspective, the World Bank IEG advisors RAIMONDO/VAESSEN/BRANCO identify four main challenges for evaluators.

  1. Current restrictions on empirical data collection at the institutional level. Due to travel restrictions, shifting institutional priorities, and limited institutional access, some key stakeholders are not available for interviews. As a result, evaluators may resort to convenience sampling and be prone to selection bias.
  2. Constraints due to the inability to conduct on-site data collection. Evaluators will struggle to develop a rich and contextualized perspective of the evaluand. Remote interviewing (by phone, teleconferencing) constitutes only a partial solution to this challenge. It will only partly alleviate the access problem and is prone to bias (especially when interviews cover complex or sensitive topics).
  3. The “central government bias” in data collection is a constant challenge. Depending on the nature of the evaluation, many interviews are likely to involve stakeholders in national government directly involved in the planning, financing and implementation of interventions. Interviews with stakeholders especially in rural areas may be more difficult to plan in the current circumstances.
  4. Finally, a fourth challenge concerns unlocking the potential of desk-based review and analysis. Evaluators are increasingly reviewers and synthesizers of existing knowledge and data. This also goes for the analysis of existing data, both conventional as well as “big” data. An example of the latter concerns the field of text analytics in combination with (un)supervised machine learning techniques, which offers new ways of conducting evaluative content analysis of existing textual information from documents and the Internet.
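The text-analytics idea in the fourth point can be made concrete with a deliberately simple, stdlib-only sketch of evaluative content coding: tallying how often each evaluation theme appears in a document. The codebook, keywords and sample text below are hypothetical illustrations, not taken from the cited sources; a real pipeline would use proper NLP and machine-learning tooling.

```python
import re
from collections import Counter

# Hypothetical theme codebook (illustrative only, not from the cited sources)
CODEBOOK = {
    "health": {"health", "clinic", "vaccine"},
    "education": {"school", "teacher", "pupil"},
    "livelihoods": {"income", "market", "job"},
}

def code_document(text: str) -> Counter:
    """Count codebook keyword hits per theme (with naive plural stripping)."""
    tokens = [t.rstrip("s") for t in re.findall(r"[a-z]+", text.lower())]
    return Counter({theme: sum(t in keywords for t in tokens)
                    for theme, keywords in CODEBOOK.items()})

doc = "Schools closed and teachers lost income while clinics shifted to prevention."
print(code_document(doc))  # education-related terms score highest here
```

Running the same coder over a corpus of project documents would give a crude evaluative profile per document; supervised or unsupervised learning, as the authors suggest, would replace the hand-built codebook with learned categories.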

Monitoring data is more important now. If implementation is going to shut down, it’s important to find out how much has been done. The provider may or may not have this data (depending on how they were planning to monitor things). You may need to find a way to do some remote monitoring/checking in to get a clear sense of what has happened so far so that when things resume, you’ll have a better understanding of what people’s experience of the intervention actually was. This is hard data to get retrospectively.

RAIMONDO/VAESSEN/BRANCO suggest a framework organized around four questions to address the ethical, conceptual, and methodological challenges that are affecting programmatic evaluation work during the COVID-19 pandemic. From an ethical point of view, evaluation work plans will inevitably need adjustments. Needless to say, if evaluation teams need to go out and collect data, necessary precautions should be taken to protect staff and respondents. Because many public resources are being diverted to the crisis, ongoing interventions may not be implemented as designed, as part of the resources may be diverted to addressing crisis-related needs. This has implications for how evaluators should look at such interventions. They created a decision tree (shown below) that synthesizes key questions we can ask ourselves as we think through various options.

Making Choices about Evaluation Design in times of COVID-19: A Decision Tree

Decision Tree
Source, click to enlarge: http://ieg.worldbank.org/sites/default/files/Data/Blog-images/Covid_Eval_DecisionTree.pdf

Zenda OFIR (13.04.2020) summarizes insights provided by Michael Quinn PATTON, the Rockefeller Foundation (i.a. by Michael BAMBERGER), the World Bank IEG, the UNDP IEO, and comments from webinars held by UNEG, OECD EvalNet and Blue Marble Evaluation as follows.

Action List
Source, click to enlarge: https://zendaofir.com/wp-content/uploads/2020/04/Action-List-1200px.png

Data collection will be different

The often-touted “gold standard” RCT methodology is not the only evaluation method capable of determining whether something works, especially for emerging social responses to crises. When the situation at hand is complex and emerging, there might not be a proper comparison group, but that does not mean we cannot determine how certain actions lead to desired consequences. (Chris LYSY)

Many systems-informed methodologies and methods remain relevant: outcome mapping, contribution analysis, ripple tracing, social network analysis, Most Significant Change, and outcome harvesting are only some.

In terms of data collection, one obvious move would be to switch to a slimmed down phone survey. As field operations are halted, phone surveys may be crucial in documenting intermediary outcomes along the causal chain.

As COVID-19 is a global challenge, interview partners understand why meetings are not possible and help trace additional stakeholders for calls. When shifting from face-to-face to phone interviews, it is important to be aware of the trade-offs. “Desk-based” case studies, including virtual (or phone) interviews to collect data at the institutional level (for example among different groups of operations colleagues, ministries, sub-national government entities, or development partners), can be feasible. However, we lose the possibility to meet certain stakeholders; interview data may be of lower quality (it is harder to build rapport with the interviewee, explore sensitive topics, or “read the air”); and we lose options such as unobtrusive observation of projects and institutions, inductive analysis on site, and snowball sampling on site. (RAIMONDO/VAESSEN/BRANCO)

CartONG has published generic recommendations applicable to data collection and management in the context of the Covid-19 crisis:

  • Limit data collection to essential and critical data for project implementation and context monitoring; and postpone non-imperative data collections to later.
  • Identify the level of risk for teams and communities and stop all “risky” data collections (or equip personnel with the necessary protective equipment) such as the collection of biometric samples or data collections resulting in the gathering of too many people.
  • Make maximum use of secondary data. The current crisis is generating a large amount of data: consider using publicly available data (such as data.humdata.org), at least for context monitoring. Limiting primary data collection can also be an opportunity to explore old data that you may not have had time to analyze completely yet.
  • Share your data as much as possible: it is crucial – even more than usual – to limit unnecessary data collections if the data already exists and can be found. Share your shareable data as widely as possible (with your partners, clusters, open data platforms), even if you doubt its quality – in the context of a crisis, data is often of imperfect quality due to the difficult data collection conditions.
  • Integrate the data protection component into your new tools (consent, security, etc.). If you anticipate having to refer cases to another actor, think about planning sooner rather than later a “data sharing agreement” so that these are in line with your organization’s procedures.

JONES/KHINCHA/KONDYLIS/UWAMARYA provide further detail on data protection: password-protect the devices used for data collection and delete any personal information of respondents.
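As a minimal sketch of that data-protection step, respondent identifiers can be replaced with keyed hashes and direct identifiers dropped before records leave the collection device. The field names and salt below are hypothetical illustrations, not from the cited guidance:

```python
import hashlib
import hmac

# Hypothetical secret salt; in practice, stored separately from the data
SALT = b"keep-this-secret-and-offline"

# Hypothetical direct identifiers to strip before sharing
PII_FIELDS = {"name", "phone"}

def pseudonymize(record: dict) -> dict:
    """Replace the respondent ID with a keyed hash and drop direct identifiers."""
    token = hmac.new(SALT, record["respondent_id"].encode(),
                     hashlib.sha256).hexdigest()[:12]
    kept = {k: v for k, v in record.items()
            if k not in PII_FIELDS | {"respondent_id"}}
    return {"respondent_token": token, **kept}

raw = {"respondent_id": "R-0042", "name": "A. Example", "phone": "0000-000",
       "district": "North", "outcome": 3}
safe = pseudonymize(raw)
print(safe)  # token replaces the raw ID; name and phone are gone
```

Because the hash is keyed and deterministic, the same respondent can be re-linked across survey rounds by whoever holds the salt, while the shared dataset itself carries no direct identifiers.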

Claudia MARTINEZ recommends securing informed consent from interview partners: send information material about the evaluation beforehand, mention who will call and from which number, and ask for the most suitable time to call. Women might be more open to chatting remotely, even with a male interviewer, than in physical meetings, she reports from Syrian camps.

One way around this is to rely more heavily on the expertise of (local) consultants with the right substantive and contextual expertise in their respective countries. The use of in-country expertise (which is already a key aspect of the “business as usual” scenario) can become an even more essential building block of our evaluations. However, local consultants will need to follow health and safety guidelines and abide by ethical principles for reaching out to key informants. (RAIMONDO/VAESSEN/BRANCO)

Emerging practices and tools

We realize that this pandemic is a rapidly changing situation. Here are some spaces where practitioners exchange experience.

3ie Virtual Evidence Weeks: Evidence in the time of COVID-19: https://evidenceweeks.3ieimpact.org

BetterEvaluation https://www.betterevaluation.org/

Blue Marble Evaluation, https://bluemarbleeval.org/

COVID-19 #SmartDevelopmentHack https://toolkit-digitalisierung.de/en/

EvalForward discussion, https://www.evalforward.org/discussions/evaluation-covid19

EvalPartners https://www.evalpartners.org

EVALSDGs https://evalsdgs.org/news/

EVALUATION FOR DEVELOPMENT, Blog by Zenda Ofir, https://zendaofir.com

Facebook Group “Navigating Together: Learning, Evaluation, and COVID-19” https://www.facebook.com/groups/536151440381908/

Free-range evaluation, Blog by Tom Archibald, https://tgarchibald.wordpress.com/

IEG Lesson Library: Evaluative Resources and Evidence to inform the COVID-19 Response, World Bank Group, http://ieg.worldbankgroup.org/topic/covid-19-coronavirus-response

J-PAL blog post on resources, https://www.povertyactionlab.org/blog/3-20-20/best-practices-conducting-phone-surveys

Monitoring & Evaluation & Learning, Blog by Karsten Weitzenegger https://deveval.wordpress.com/

UNDP IEO Evaluation http://web.undp.org/evaluation/

#EvalCrisis is the EC DEVCO initiative to support ‘Evaluation in times of Crisis’. If your plans are affected by the pandemic and you are looking for tested evaluation tools, expert advice and a collection of references from the global evaluation community, visit https://europa.eu/capacity4dev/devco-ess.

Add yours in the comments…

About the author

Karsten Weitzenegger (evaluation.weitzenegger.de) has conducted more than 30 evaluations of projects and programmes in the context of international cooperation. He is currently preparing a remote evaluation for GIZ on green markets in Brazil. For this post he has used Web search, hosted news services, and network contacts from the @deveval Twitter account.

Literature

All URLs were accessed on 29.04.2020 unless mentioned otherwise.

60 Decibels (15.03.2020) Remote Survey Toolkit. Prepared in Response to COVID-19, March 2020, https://60decibels.com/user/pages/03.Work/_remote_survey_toolkit/60_Decibels_Remote_Survey_Toolkit_March_2020.pdf
Archibald, Tom (01.04.2020) Resources for #eval in a time of crisis, https://tgarchibald.wordpress.com/2020/04/01/resources-for-eval-in-a-time-of-crisis/
Bastøe, Per Øyvind & Wendy Asbeek Brusse & Jörg Faust (30.04.2020) COVID-19 and Development Co-operation: we know a lot about what works, let’s use the evidence, OECD Network on Development Evaluation, https://oecd-development-matters.org/2020/04/30/covid-19-and-development-co-operation-we-know-a-lot-about-what-works-lets-use-the-evidence/ [Accessed 02.05.2020]
CartONG (07.04.2020) Covid-19 crisis: how to adapt your data collection for monitoring and accountability, Version 1 – 7 April 2020, https://blog.cartong.org/wordpress/wp-content/uploads/2020/04/IM-covid-19-impact-on-monitoring-and-accountability_CartONG.pdf
Duflo, Annie (18.04.2020) COVID-19: How IPA is Responding, Adapting—and Preparing, Innovations for Poverty Action, https://www.poverty-action.org/blog/covid-19-how-ipa-responding-adapting-and-preparing
Evans, Alison (29.04.2020) A global effort is needed to ensure all countries are ready to combat COVID-19 (coronavirus) with evidence, World Bank IEG, https://ieg.worldbankgroup.org/blog/global-effort-needed-ensure-all-countries-are-ready-combat-covid-19-coronavirus-evidence
Garcia Jeneen R. (30.04.2020) When Life Gives You Lemons…or a Pandemic – Finding Opportunity in Disaster, Earth-Eval blog, https://www.eartheval.org/blog/opportunity-disaster-2020-covid19 [Accessed 04.05.2020]
GIZ (20.02.2020) Transforming our work: Getting ready for transformational projects. Guidance, https://www.giz.de/fachexpertise/downloads/Transfomation%20Guidance_GIZ_02%202020.pdf
Goldstein, Markus & Florence Kondylis (26.03.2020) Impact evaluations in the time of Covid-19, Part 1, World Bank, https://blogs.worldbank.org/impactevaluations/impact-evaluations-time-covid-19-part-1
Gurría, Angel (22.04.2020) An inclusive, green recovery is possible: The time to act is now, OECD Secretary General Blog, http://www.oecd.org/coronavirus/en/
Dhaliwal, Iqbal (10.04.2020) J-PAL’s response to COVID-19, Abdul Latif Jameel Poverty Action Lab, https://www.povertyactionlab.org/covid19
Jones, Maria & Roshni Khincha et al. (20.04.2020) Practical Tips for Implementing Remote Surveys in the Time of the Great Lockdown, World Bank Blog, https://blogs.worldbank.org/impactevaluations/practical-tips-implementing-remote-surveys-time-great-lockdown
Kopper, Sarah & Anja Sautmann (20.03.2020) Best practices for conducting phone surveys, https://www.povertyactionlab.org/blog/3-20-20/best-practices-conducting-phone-surveys
Lappo, Alena (25.04.2020) How do we adapt our evaluation approach to the impact of the Covid-19 pandemic? EvalForward discussion, https://www.evalforward.org/discussions/evaluation-covid19
Lupton, Deborah & Stacy Penna (31.03.2020) NVIVO Webinar: COVID-19 & Virtual Fieldwork, Handout: https://www.qsrinternational.com/getattachment/nvivo-qualitative-data-analysis-software/Resources/On-demand-Webinars/COVID-19-and-Virtual-Fieldwork/NVivo-Webinar-Covid-19-and-Virtual-Fieldwork-final.pdf.aspx?lang=en-US, Recording: https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/resources/on-demand-webinars/covid-19-and-virtual-fieldwork#data-fancybox
Lysy, Chris (19.03.2020) The Evaluation Mindset: Evaluation in a Crisis, https://freshspectrum.com/the-evaluation-mindset-evaluation-in-a-crisis/
Macfarlan, Alice (21.04.2020) Adapting evaluation in the time of COVID-19 – Part 1: MANAGE, BetterEvaluation, https://www.betterevaluation.org/en/blog/adapting-evaluation-time-covid-19-part-1-manage
OECD/DAC & IEO/UNDP (16.04.2020) Joint Guidance Note for Evaluation Units, http://web.undp.org/evaluation/guideline/documents/covid19/IEOOECD_DAC_Joint-Guidance_COVID19.pdf
OECD/DAC (09.04.2020) COVID-19 GLOBAL PANDEMIC, Joint Statement, 9 April 2020 by the Development Assistance Committee (DAC) of the Organisation for Economic Co-operation and Development (OECD), https://www.oecd.org/dac/development-assistance-committee/DAC-Joint-Statement-COVID-19.pdf
Ofir, Zenda (09.04.2020) Transforming Evaluations and COVID-19, Part 2. Repositioning evaluation, https://zendaofir.com/evaluation-covid-19-transformation-part-2/
Ofir, Zenda (13.04.2020) Transforming Evaluations and COVID-19, Part 3. Advice for this time, https://zendaofir.com/covid-19-and-transforming-evaluations-part-3/
Ofir, Zenda (20.04.2020) Transforming Evaluations and COVID-19, Part 4. Accelerating change in practice, https://zendaofir.com/transforming-evaluations-and-covid-19-part-4-accelerating-change-in-practice/
Ofir, Zenda (29.03.2020) Evaluation in times of COVID19 outbreak. Transforming Evaluations and COVID-19, Part 1. A GPS for complex terrain. https://zendaofir.com/evaluation-covid-19-the-future-part-1/
Olazabal, Veronica & Michael Bamberger & Peter York (04.05.2020) Rewiring How We Measure Impact in a Post-COVID-19 World, https://www.rockefellerfoundation.org/blog/rewiring-how-we-measure-impact-in-a-post-covid-19-world/ [Accessed 06.05.2020]
Patton, Michael Quinn (23.03.2020) Evaluation Implications of the Coronavirus Global Health Pandemic Emergency, https://bluemarbleeval.org/latest/evaluation-implications-coronavirus-global-health-pandemic-emergency
Puri, Jyotsna (05.05.2020) Evaluations and research during crisis? Comments from a self-confessed evidence evangelist. https://ieu.greenclimate.fund/news/new-evaluations-and-research-during-crisis-comments-from-a-self-confessed-evidence-evangelist-?inheritRedirect=true&redirect=%2Fnew-from-ieu%2Flatest-updates [Accessed 06.05.2020]
Raimondo, Estelle & Jos Vaessen & Mariana Branco (22.04.2020) Adapting evaluation designs in times of COVID-19 (coronavirus): four questions to guide decisions, IEG World Bank Group, https://ieg.worldbankgroup.org/blog/adapting-evaluation-designs-times-covid-19-coronavirus-four-questions-guide-decisions
Reed, Martena (25.04.2020) Reflex or Reflection: Three Lessons for Evaluators Amid COVID-19, https://aea365.org/blog/reflex-or-reflection-three-lessons-for-evaluators-amid-covid-19-by-martena-reed/
Regional Risk Communication and Community Engagement (RCCE) Working Group (30.03.2020) Guide: COVID-19: How to include marginalized and vulnerable people in risk communication and community engagement, https://resourcecentre.savethechildren.net/node/17189/pdf/covid-19_-_how_to_include_marginalized_and_vulnerable_people_in_risk_communication_and_community_engagement.pdf [Accessed 01.05.2020]
Rogers, Patricia (03.04.2020) BetterEvaluation COVID-19 Statement, https://www.betterevaluation.org/en/blog/betterevaluation-covid-19-statement
Salmons, Janet & Silvana di Gregorio (07.04.2020) NVIVO Free webinar: When the “field” is Online: Qualitative Data Collection, Handout: https://www.qsrinternational.com/getattachment/nvivo-qualitative-data-analysis-software/Resources/On-demand-Webinars/When-the-field-is-Online-Qualitative-Data-Collect/NVivo-Webinar-online-Fieldwork-final-PPT.pdf.aspx
Salmons, Janet (07.04.2020) Doing Qualitative Research Online, https://www.qsrinternational.com/getattachment/nvivo-qualitative-data-analysis-software/Resources/On-demand-Webinars/When-the-field-is-Online-Qualitative-Data-Collect/Tips-Choosing-Online-Data-Collection_Janet-Salmons.pdf.aspx, Recording: https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/resources/on-demand-webinars/when-the-field-is-online-qualitative-data-collect
Sánchez-Páramo, Carolina (23.04.2020) COVID-19 will hit the poor hardest. Here’s what we can do about it, https://blogs.worldbank.org/voices/covid-19-will-hit-poor-hardest-heres-what-we-can-do-about-it
Smith, Emma (13.03.2020) A quick primer on running online events and meetings, Better Evaluation, https://www.betterevaluation.org/en/blog/quick-primer-running-online-events-and-meetings
Taylor-Dormond, Marvin & Stoyan Tenev (28.04.2020) How to boost accountability and learning in aid for COVID-19, Asian Development Bank Blog, https://www.adb.org/news/op-ed/how-boost-accountability-and-learning-aid-covid-19-marvin-taylor-dormond-and-stoyan-tenev
UNDP (31.03.2020) Evaluation Planning and implementation during Covid-19, http://web.undp.org/evaluation/guideline/documents/evaluation_guide_COVID19.pdf [Accessed 01.05.2020]
UNDP Independent Evaluation Office (2020) Evaluation during a crisis: COVID-19, Independent Evaluation Office, http://web.undp.org/evaluation/documents/infographics/Evaluation-during_crisis-COVID19.pdf, Video with Oscar Garcia: https://twitter.com/i/status/1252490419495219204
UNFPA Evaluation Office (23.04.2020) Adapting evaluations to the COVID-19 pandemic, https://www.unfpa.org/sites/default/files/admin-resource/FINAL_Covid19_and_Eval.pdf
Vaessen, Jos & Estelle Raimondo (08.04.2020) Conducting evaluations in times of COVID-19, World Bank, http://ieg.worldbankgroup.org/blog/conducting-evaluations-times-covid-19-coronavirus