Evaluating humanitarian action

The ALNAP Guide on Evaluating Humanitarian Action (EHA Guide for short) supports evaluation specialists and non-specialists in every stage of an evaluation, from initial decision to dissemination.

A pilot version of this Guide was first released in June 2013, following a three-year drafting process led by ALNAP, co-authors John Cosgrave and Margie Buchanan-Smith, and supported by an inter-agency advisory group.

More than 10,000 downloads later, ALNAP has gathered feedback from over 40 organisations that took part in the pilot process and tested the Guide's content on the ground. We are now incorporating these comments and suggestions and will publish the final EHA Guide in summer 2016.

If you have any EHA-related resources that you would like us to consider,
please get in touch.

The EHA Guide Pilot is only part of our EHA story. Check out the timeline below, which outlines other landmarks in this process.

Why a Guide for Evaluating Humanitarian Action?

Evaluation of humanitarian action has evolved considerably, and much technical and organisation-specific guidance now exists. Yet this ALNAP Guide paints the whole picture of evaluation in the sector, consolidating current knowledge on initiating, managing and completing an evaluation of humanitarian action. As such, it offers a common reference point for humanitarian evaluators.

Although evaluation of humanitarian action faces many of the same challenges as other sectors (e.g. international development), a number of these are accentuated by the volatile contexts in which humanitarians operate and the nature of the work undertaken. For instance, insecurity may limit access to programmes and affected populations, and the work is often particularly time-sensitive.

Check out what some evaluators had to say about what has worked for them and what hasn’t when evaluating humanitarian action.


You can connect with other evaluators in the ALNAP Network and ask questions about any EHA issues by joining the
Humanitarian Evaluation Community of Practice.

Latest discussions from the Humanitarian Evaluation Community of Practice

20 October: 7 requirements for successful Third Party Monitoring (blog post from SAVE research project)

Dear colleagues, the SAVE research project has recently published a short blog post on Third Party Monitoring that we would like to share with you. You may read it online here.

19 October: How do we raise the profile of evidence and evaluation?

Thank you Catriona for this interesting post! Congratulations to you and the team on publishing Tearfund’s first Impact and Learning report and ‘inspiring change’. Well done! The Evaluation, Learning and Accountability (ELA) team at Action Against Hunger recognises how m...

19 October: How do we raise the profile of evidence and evaluation?

Reflections from Tearfund’s Impact and Effectiveness Unit: Earlier this year, Tearfund published their first Impact and Learning report, using evidence from evaluations to draw out impact themes and...

You can join the conversation or ask a new question by visiting the Humanitarian Evaluation Community of Practice.

