

Improving the quality of evaluation of humanitarian action, and of related research and learning activities, is ALNAP's first strategic focus area. This entails developing guidance on humanitarian evaluation, providing a repository of knowledge for the humanitarian system through the HELP library, and offering a platform for evaluators to be in touch with their peers across the Network.


Evaluating Humanitarian Action

The Evaluating Humanitarian Action (EHA) Guide supports specialists and non-specialists in every stage of an evaluation, from initial decision to dissemination.

Using evaluations

ALNAP has developed a simple framework that agencies can use to commission, conduct and use evaluations more effectively.

Evaluating protection

Humanitarian evaluators have little guidance available on protection. ALNAP is conducting research to identify the challenges in this area and to develop guidance.

Real-time evaluations

Real-time evaluations (RTEs) are among the most demanding types of evaluation because of their time constraints. ALNAP's Guide on RTEs is intended to help with commissioning, overseeing and conducting them.


Meta-evaluations are increasingly being used to identify trends and assess quality in particular sectors. ALNAP has developed a Quality Pro Forma to aid this process.


Based on the ALNAP Pilot Guide on 'Evaluating Humanitarian Action', this e-learning course is the first to offer an overview of evaluation practice in humanitarian contexts.











You can connect with other evaluators in the ALNAP Network and ask questions about any EHA issues by joining the
Humanitarian Evaluation Community of Practice.

Latest discussions from the Humanitarian Evaluation Community of Practice

18 January: failure reports and similar practices

Dear All, My organisation is investigating how humanitarian agencies review interventions that fail and learn from them. I know Engineers Without Borders publishes a failure report almost annually (

16 January: Improving how we monitor (and assess) humanitarian performance: do we need common indicators?

Dear Alice and everyone, Firstly I’m sorry for being late to the party, but recently, here at Evidence Aid, we’ve been discussing the COMET initiative, which is a set of core outcomes (although not specifically for the humanitarian sector) and...

16 January: Improving how we monitor (and assess) humanitarian performance: do we need common indicators?

Dear Carlisle, Mike, Uwe and John, Thanks so much for your reflections and for highlighting these resources—several of them are new to me, and I will read with interest. Just a quick point: I think it is interesting that Sphere came up (and John I remember your pa...

You can join the conversation or ask a new question by visiting the Humanitarian Evaluation Community of Practice.

