
4 lessons learned from synthesising humanitarian evidence

By Roxanne Krystalli on 25 April 2017.


Many humanitarians find it hard to gather evidence of what we know and how we know it. For example, what is the impact of different mental health and psychosocial support interventions on populations affected by humanitarian crises? What do we know about the impact of protection interventions on unaccompanied and separated minors during humanitarian emergencies?
 

The Humanitarian Evidence Programme has synthesised the available research to answer questions like these. As a partnership between Oxfam and the Feinstein International Center at Tufts University, we have just published eight systematic reviews of the evidence in key areas of the humanitarian field, with the ultimate goal of improving humanitarian policy and practice.

In addition to the specific insights that emerged in each evidence synthesis, the programme revealed four overarching themes about the humanitarian evidence base and the process of synthesising existing research:
 

Show me the evidence. As opposed to prospective field research, which often requires crisis-affected populations to narrate their experiences of harm repeatedly for multiple audiences, systematic reviews present an opportunity to learn from the existing evidence base. As one systematic review author said in an interview, emergency responders and decision-makers in the humanitarian field have neither the time nor necessarily the interest or ability to sift through an unsorted evidence base. A key advantage of systematic reviews is, therefore, that they follow a rigorous process of synthesising available research, appraising the evidence base, and identifying research gaps.

Think about the politics of evidence. The systematic review process requires authors to make judgments about which bodies of evidence to search and which pieces of research to include in their synthesis, based on pre-determined criteria. Though the process of searching and appraising evidence is rigorous, systematic reviews at times hide their own subjectivity and partiality, and the fact that they still require systematic reviewers to make judgments on what counts as evidence and rigour. As one systematic review author told us, "people put different meanings behind what is 'evidence-based.'" Based on pre-established, peer-reviewed criteria for which studies were eligible for inclusion in the evidence synthesis, Humanitarian Evidence Programme reviews included between 0.1% and 0.7% of the studies that the initial search process identified. This has led programme advisors to comment on a missed opportunity to learn from research that does not meet the strict criteria but may still be of value to the field. As one author said, "through being this strict, humanitarian systematic reviews end up being at once rigorous and anemic."

Definitions matter. The authors of Humanitarian Evidence Programme systematic reviews noted the challenge of defining the term 'humanitarian', particularly in terms of delimiting humanitarian/development and emergency/non-emergency settings. Definitions matter not only because they determine which studies get included in an evidence synthesis, but also because they affect the comparability of interventions in different contexts. Comparability was also a challenge in terms of how different agencies defined the interventions of interest and the indicators they used to measure them. Measurements differed across programmes and over time, even within the same country or the same agency, making comparisons difficult. Addressing this issue would require coordination among donors and implementing agencies to develop a common system of indicators, measurements, and thresholds, and to ensure its consistent implementation.
 

More thorough reporting of evaluation methods can improve humanitarian evidence. A key lesson that emerged from the Humanitarian Evidence Programme is that much of the evidence included in the systematic reviews arose from programmatic evaluations, as opposed to academic studies or peer-reviewed journal articles. As one programme advisor said in an interview, "this is a quick win – if we commission, manage, and sign off on these [evaluations] a little more strictly, we can make big improvements."

Many of these improvements relate to how evaluations and other programmatic reports document their methods of data collection.

Humanitarian evidence syntheses not only present an opportunity to reflect on what we know and how we know it. They also require us to reckon with the limitations of existing research, and with the barriers to evidence sharing and use that affect decision-making.

We will soon publish a journal article on these findings and look forward to continuing the conversation on ways in which researchers and practitioners can contribute to improving the humanitarian evidence base.

 

Roxanne Krystalli is the Humanitarian Evidence Program Manager at Feinstein International Center. She can be found on Twitter at @rkrystalli. The views and opinions expressed herein are those of the author and do not necessarily represent those of Oxfam, Feinstein International Center or the UK government.


2 comments

Mansoor Ali (International Medical Corps) 26 April 2017, 10:03

Thanks for writing, and some really important findings there. One area that probably needs more attention is changing organisational culture and mind-sets in favour of the findings of the above research. There are known barriers to the uptake of findings. The findings are excellent, but how do we encourage their uptake?

Paul James Crook (CrookConsults) 26 April 2017, 11:53

Why are we delineating? If a result is a result, it is a result – yes? Build the evidence and place it against the standards we have set, or are setting, for the delivery of results. Now that we have (at least) a decade of quality evidence, can we see whether we are delivering the same humanitarian/immediate-needs support to the same people, or are we seeing a churn of people through the caseload numbers?

Let us also use the themes Roxanne has picked out and apply them to emergent issues of supporting people, e.g. refugees and IDPs, and to the problem that a lack of developmental gains means staying a refugee remains an option (sadly true in a number of cases – is there evidence to support this, or research looking at ways forward?).

