
Evaluation

Improving the quality of evaluation of humanitarian action, and of related research and learning activities, is ALNAP's first strategic focus area. This entails developing guidance on humanitarian evaluation, providing a repository of knowledge for the humanitarian system through the HELP library, and offering a platform for evaluators to stay in touch with their peers across the Network.

 

Evaluating Humanitarian Action

The Evaluating Humanitarian Action (EHA) Guide supports specialists and non-specialists in every stage of an evaluation, from initial decision to dissemination.

Using evaluations

ALNAP has developed a simple framework that agencies can use to commission, conduct and utilise evaluations more effectively.

Evaluating protection

Little guidance is currently available to humanitarian evaluators on protection. ALNAP is conducting research to identify the challenges in this area and to develop guidance.

Real-time evaluations

Real-time evaluations (RTEs) are among the most demanding types of evaluation because of their time constraints. ALNAP's Guide on RTEs is intended to help with the commissioning, oversight and conduct of these evaluations.

Meta-evaluations

Meta-evaluations are increasingly used to identify trends and assess quality in particular sectors. ALNAP has developed a Quality Pro Forma to aid this process.

E-learning

Based on the ALNAP Pilot Guide on 'Evaluating Humanitarian Action', this e-learning course is the first to offer an overview of evaluation practice in humanitarian contexts.


You can connect with other evaluators in the ALNAP Network and ask questions about any EHA issues by joining the Humanitarian Evaluation Community of Practice.

Latest discussions from the Humanitarian Evaluation Community of Practice

1 February: Looking into presence? ALNAP webinar - Feb 8, 14:00 GMT

Good morning CoP! I was wondering: Have you ever had to map the presence of humanitarian actors, particularly at the level of a whole response? How is this evaluated? What are some recent evaluations where this was considered? Next week, we are hosting a webinar t...

26 January: Failure report and similar practices

Hi Wartini, It’s an excellent question, and one that we have seen many organisations grapple with when they try to encourage innovation and innovative activities across country teams. Even projects that are specifically funded as ‘innovations’ with a higher tolerance for risk ...

23 January: Re: Improving how we monitor (and assess) humanitarian performance: do we need common indicators?

Nice to see this thread. In my work, as part of a network of 190 RCRC national societies with different frameworks and such, I have found it important to make a distinction between trying to arrive at standardized objectives, versus indicators. I think the latter makes more sen...

You can join the conversation or ask a new question by visiting the Humanitarian Evaluation Community of Practice or e-mailing humanitarian-evaluation@partnerplatform.org
