Improving quality of humanitarian
  programmes through the use of a
  scoring system: the Humanitarian
  Indicator Tool


Dr Vivien Margaret Walden and Nigel Timmins
Acknowledgements

• We wish to acknowledge the consultants – Andy Featherstone,
  Peta Sandison, Marlise Turnbull and Sarah House – who
  helped refine the tools, and the 10 country offices who have
  been through the process and have hopefully come out of it
  unscathed.
• Photographs used in this presentation are by Caroline Gluck,
  Oxfam




Why a new tool?

• In 1995, the ODI evaluation of the response to the Rwanda
  crisis raised concerns around the quality of service delivery
• As a result of findings, the Sphere project was started
• In 1997 the People in Aid Code of Best Practice was
  published
• In 2000 the Humanitarian Ombudsman became HAP
• In 2001 Griekspoor and Sondorp wrote a paper highlighting
  the various quality assurance measures and posed the
  question “Do all of the developments in the field of
  accountability and performance actually improve overall
  performance?”


Why a new tool?

• Oxfam’s hypothesis – by improving the quality of a
  humanitarian programme, the likelihood of positive impact is
  increased
• The tool does not try to prove the link
• Oxfam has introduced the Global Performance Framework as
  part of reporting to DFID
• The framework has global indicators, of which the
  Humanitarian Indicator is one
• Whereas the other indicators measure impact, the
  humanitarian indicator looks at the quality of the response




Oxfam’s Global Performance
Framework

Overall goal: improved quality of life for poor women and men

• Humanitarian Assistance
  Output: total number of people provided with appropriate humanitarian assistance, disaggregated by sex
  Outcome: degree to which humanitarian responses meet recognised quality standards for humanitarian programming (e.g. Sphere guidelines)
• Adaptation and Risk Reduction
  Output: # of people supported to understand current and likely future hazards, reduce risk, and/or adapt to climatic changes and uncertainty, disaggregated by sex
  Outcome: % of supported households demonstrating greater ability to minimise risk from shocks and adapt to emerging trends & uncertainty
• Livelihood Enhancement Support
  Output: # of women and men directly supported to increase income via enhancing production and/or market access
  Outcome: % of supported households demonstrating greater income, as measured by daily consumption and expenditure per capita
• Women’s Empowerment
  Output: # of people reached to enable women to gain increased control over factors affecting their own priorities and interests
  Outcome: % of supported women demonstrating greater involvement in household decision-making and influencing affairs at the community level
• Citizen Mobilisation
  Output: # of a) citizens, CBO members and CSO staff supported to engage with state institutions/other relevant actors; and b) duty bearers benefiting from capacity support
  Outcome: degree to which selected interventions have contributed to affecting outcome change, as generated from findings of rigorous qualitative evaluations
• Campaigning and Advocacy
  Output: # of campaign actions directly undertaken or supported, e.g. contacts made with policy targets, online and offline actions taken, media coverage, publications, etc.
  Outcome: degree to which selected interventions have contributed to affecting outcome change, as generated from findings of rigorous qualitative evaluations
• Further outcome indicators: degree to which selected interventions meet recognised standards for accountable programming; extent to which selected projects deliver good value for money
The indicator

• Output indicator – the total number of people
  provided with appropriate humanitarian
  assistance, disaggregated by sex
•   Data collected annually through online system


• Outcome indicator – the degree to which
  humanitarian responses meet recognised quality
  standards for humanitarian programming
•   Data collected through HIT




The tool – for rapid onset
emergencies

Global Humanitarian Indicator: degree to which humanitarian responses meet recognised quality
standards for humanitarian programming

RAPID ONSET EMERGENCY – EARTHQUAKE, SUDDEN FLOODS, TSUNAMI, CYCLONES, TYPHOONS, HURRICANES, SUDDEN
CONFLICT WITH DISPLACEMENT, AWD OUTBREAKS

Benchmarks 1–3 (scored: Met = 6, Almost met = 4, Partially met = 2, Not met = 0)
1. Timeliness – rapid appraisal/assessment enough to make decisions within 24 hours
   and initial implementation within three days
2. Coverage uses 25% of affected population as a planned figure (response should
   reflect the scale of the disaster) with clear justification for the final count
3. Technical aspects of programme measured against Sphere standards

Benchmarks 4–8 (scored: Met = 3, Almost met = 2, Partially met = 1, Not met = 0)
4. MEAL strategy and plan in place and being implemented using appropriate
   indicators
5. Feedback/complaints system for affected population in place and functioning, and
   documented evidence of information sharing, consultation and participation leading
   to a programme relevant to context and needs
6. Partner relationships defined, capacity assessed and partners fully engaged in all
   stages of programme cycle
7. Programme is considered a safe programme: action taken to avoid harm and
   programme considered conflict sensitive
8. Programme (including advocacy) addresses gender equity and specific concerns
The benchmarks
• Timeliness - rapid appraisal/assessment enough to make
  decisions within 24 hours and initial implementation within
  three days
• Coverage uses 25% of affected population as a planned
  figure (response should reflect the scale of the disaster) with
  clear justification for the final count
• Technical aspects of programme measured against Sphere
  standards
• MEAL strategy and plan in place and being implemented
  using appropriate indicators
• Feedback/complaints system for affected population in place
  and functioning and documented evidence of information
  sharing, consultation and participation leading to a
  programme relevant to context and needs

The benchmarks
• Partner relationships defined, capacity assessed and partners
  fully engaged in all stages of programme cycle
• Programme is considered a safe programme: action taken to
  avoid harm and programme considered conflict sensitive
• Programme (including advocacy) addresses gender equity
  and specific concerns and needs of women, girls, men and
  boys and vulnerable groups
• Evidence that preparedness measures were in place and
  effectively actioned
• Programme has an advocacy/campaigns strategy and has
  incorporated advocacy into programme plans based on
  evidence from the field



The benchmarks

• Country programme has an integrated approach including
  reducing and managing risk through existing longer-term
  development programmes and building resilience for the
  future
• Evidence of appropriate staff capacity to ensure quality
  programming




The methodology

• Done by an external consultant – although preferably one who
  has a knowledge of Oxfam
• Done as a desk study using documentation and some
  telephone/Skype interviews
• Follows a pre-determined scoring system and list of
  documents
• Has to be commented on and accepted by the country
• The country writes a management response
• All reports and a summary for each are published on the
  Oxfam website – www.oxfam.org



The scoring

Quality standard 3: technical aspects of programme measured against Sphere standards

Evidence needed: proposals; MEAL strategy and plans; PH and EFSL strategies; technical
adviser visits; training agendas and presentations; LogFrames and monitoring frameworks;
donor reports; RTE and other evaluation reports; learning event or review reports

• Met (score 6): Sphere standards proposed and put in place with adjusted standards for
  context; training in standards carried out for staff and partners; indicators use
  standards and monitoring against standards takes place regularly; standards evaluated
• Almost met (score 4): NA
• Partially met (score 2): Sphere standards proposed and adjusted to context; standards
  mentioned in proposals and LogFrames but not monitored against; some evidence of
  training but not widespread (staff but not partners, or only in one area)
• Not met (score 0): standards only mentioned in proposals but not replicated in plans,
  or no mention of Sphere in any document
Instructions for use

Benchmark 3: technical aspects of programme measured against Sphere standards

Evidence: proposals; MEAL strategy and plans; PH and EFSL strategies; technical adviser
visits; training agendas and presentations; LogFrames and monitoring frameworks; donor
reports; RTE and other evaluation reports; learning event or review reports

Quality check:
• Check proposals and strategies to see if standards are mentioned not just as a
  possibility but are considered in the context of the response – this might mean that
  Sphere has been adapted to suit the context
• The indicators on the LogFrame for technical areas should reflect Sphere standards
• The MEAL strategy should have Sphere as indicators and for data collection methods
• Check adviser reports for mention of standards and how these were implemented
• Check the RTE report for mention of Sphere standards
• Check WASH and EFSL strategies and adviser reports to see if any training was carried
  out for staff and partners
• Check review and evaluation reports for mention of standards
Final score

•   In the first year the maximum possible score was 30
•   Scores were adjusted for non-applicable benchmarks
•   First-year scores:
•   Somalia – 17/28
•   Kenya – 24/30
•   Ethiopia – 9/28
•   Pakistan – 19/30
•   Colombia – the test case for the HIT – 18/26
•   Second year – only South Sudan is complete – 21/30
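The adjustment for non-applicable benchmarks can be sketched as follows. This is an illustrative sketch only, not Oxfam's exact formula: the benchmark numbers and weights are assumptions drawn from the rapid-onset scoring sheet (6-point scoring for benchmarks 1–3, 3-point scoring for later benchmarks), and a benchmark rated `None` is simply dropped from both the numerator and the denominator, which is how a country can score out of 28 or 26 rather than 30.

```python
# Hypothetical sketch of HIT-style score aggregation. Benchmark
# weights and ratings below are illustrative assumptions, not
# real country data.

def hit_score(ratings, max_scores):
    """ratings: benchmark -> awarded score, or None if not applicable.
    max_scores: benchmark -> maximum possible score for that benchmark.
    Returns (total, adjusted_maximum)."""
    total = 0
    adjusted_max = 0
    for benchmark, rating in ratings.items():
        if rating is None:
            # Non-applicable benchmark: excluded from the
            # denominator as well as the numerator.
            continue
        total += rating
        adjusted_max += max_scores[benchmark]
    return total, adjusted_max

# Illustrative example: benchmark 6 judged not applicable.
max_scores = {1: 6, 2: 6, 3: 6, 4: 3, 5: 3, 6: 3, 7: 3}
ratings = {1: 4, 2: 6, 3: 2, 4: 3, 5: 2, 6: None, 7: 1}
total, out_of = hit_score(ratings, max_scores)
print(f"{total}/{out_of}")  # -> 18/27
```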




The findings – Somalia

• Benchmark 5 on accountability
• Score Met – 2/2
• “Oxfam’s partners appeared to differ in the level of beneficiary
  participation in design and delivery. Some documented highly
  participatory process, with qualitative and quantitative data.
  As well as gathering information, rapid assessments were
  done to establish VRCs to improve participation (criteria,
  entitlements, payment points, registration, complaints and
  feedback). Mobile phone hotlines were set up where possible,
  with feedback protocol to guide staff on how to register and
  follow-up complaints”



Pakistan

• Benchmark 7 – protection and gender
• Partially met – 1/2
• Some protection concerns were identified relating to security
  for staff and women and girls using WASH facilities. Some
  actions were taken responding to dignity and protection
  including involving women in different activities. Post-
  distribution monitoring investigated some security concerns
  related to cashing cheques and distribution points, including
  analysis of responses from women. No protection problems
  observed were communicated to agencies or authorities
  responsible for, or specialising in, protection



Ethiopia

• Benchmark 11 – advocacy
• Not met – 0/2
• Advocacy activities were clearly part of the intended response
  and there is a regional advocacy action plan with Ethiopia
  objectives and a media, advocacy and campaign strategy
  which includes a number of plans for Ethiopia. However, no
  Ethiopia country advocacy strategy was provided for the
  evaluation. The sitreps do mention Oxfam’s participation in
  influential meetings, but are not tied to an explicit strategy.
  There is no record of the impact of Oxfam’s advocacy
  activities



Lessons learnt

• There are limitations to doing a desk study – it relies heavily
  on documentation, so a low score may reflect an absence of
  documentation rather than an actual absence of good
  programming
• Several country teams objected to being judged solely on
  “little pieces of paper”
• There is no opportunity to get the views of the affected
  population unless this is already documented
• Telephone/Skype interviews can be biased – it is sometimes
  difficult to triangulate
• The process needs the goodwill and buy-in from the country
  team


Advantages

• The process is fairly inexpensive – under £6000
• The country does not have to host a consultant
• The methodology can be used to track progress in
  subsequent responses in one country
• The scores are comparable across programmes (although
  context should be considered)




She deserves the best we can give –
           thank you!





More Related Content

DOC
The Seven Steps To Building A Successful Prevention Program
PPTX
Obesity Steering committee mtg 4 final
DOC
Advancing standards of care call for applications
PDF
Crisis Now State Region Self-Assessment Programmatic Guide
PPT
Improved Risk information to support sound policy/decision making processes –...
PPTX
BRACC Impact Evaluation Design_Dan Gilligan_BRACC resilience learning event_1...
PDF
Ccb as presentation
The Seven Steps To Building A Successful Prevention Program
Obesity Steering committee mtg 4 final
Advancing standards of care call for applications
Crisis Now State Region Self-Assessment Programmatic Guide
Improved Risk information to support sound policy/decision making processes –...
BRACC Impact Evaluation Design_Dan Gilligan_BRACC resilience learning event_1...
Ccb as presentation

What's hot (13)

PDF
CCXG Forum, March 2022, Sindy Singh
DOCX
Annual Results and Impact Evaluation Workshop for RBF - Day Two - Assessment ...
PPT
Evaluating The California Endowment Clinic Consortia Policy and Advocacy Prog...
PPTX
Amie Heap
PDF
10 core global-fundstrategy_framework_en1
PPT
CHW Program Assessment and Improvement Matrix
PPTX
PSNP 4 Grievance redressing mechanism (GRM)
PDF
3 investment-framework-summary-unaids-issues-brief
PDF
AIDSTAR-One District Comprehensive Approach for HIV Prevention and Continuum ...
PPT
Hotspot assessment for local AIDS control efforts--PLACE maps in the Dominica...
PPT
RIMS+ surveys: A tool for project design and evaluation
PDF
Annual Results and Impact Evaluation Workshop for RBF - Day Two - Improving M...
PDF
Crisis information management framework for regional disaster resiliency (Joe...
CCXG Forum, March 2022, Sindy Singh
Annual Results and Impact Evaluation Workshop for RBF - Day Two - Assessment ...
Evaluating The California Endowment Clinic Consortia Policy and Advocacy Prog...
Amie Heap
10 core global-fundstrategy_framework_en1
CHW Program Assessment and Improvement Matrix
PSNP 4 Grievance redressing mechanism (GRM)
3 investment-framework-summary-unaids-issues-brief
AIDSTAR-One District Comprehensive Approach for HIV Prevention and Continuum ...
Hotspot assessment for local AIDS control efforts--PLACE maps in the Dominica...
RIMS+ surveys: A tool for project design and evaluation
Annual Results and Impact Evaluation Workshop for RBF - Day Two - Improving M...
Crisis information management framework for regional disaster resiliency (Joe...
Ad

Similar to Humanitarian Indicator Tool (Dr Vivien Walden, Oxfam) (20)

PDF
Using a Fidelity Index to Increase Program Attribution
PPTX
Framework user guide presentation cpw dec132015
PPTX
North Africa/West Asia - Intermediate Development Outcomes
PPTX
04 Output 1 Instruments & Tools (3).pptx
PPTX
04 Output 1 Instruments & Tools (3).pptx
PPT
Program Evaluations to avoid aids/HIV for children
PDF
Benefits of M&E.PDF
PPTX
monitoring and evaluation approaches for key population programs: perils, pi...
PPTX
GBV Guidelines Regional Workshop Overview for Webinar
PDF
Improvement of the ngo’s performance
PDF
Community-based Indicators for HIV Programs
PPT
Planning Multisite Evaluations
PDF
How we can use impact evaluation to assure effective use of resources for dev...
PDF
Disaster risk management in food security and agriculture
PPT
ce_wg_presentation_-_nutrition.ppt
PDF
Programme planning and evaluation in extension work
PPTX
Presentation_DRRM Mainstreaming in the Planning Cycle.pptx
PPTX
Development of a Progressive Management Pathway to assist National and Intern...
PPTX
Community HIV Program Indicators
PDF
Using case-based methods to assess scalability and sustainability: Lessons fr...
Using a Fidelity Index to Increase Program Attribution
Framework user guide presentation cpw dec132015
North Africa/West Asia - Intermediate Development Outcomes
04 Output 1 Instruments & Tools (3).pptx
04 Output 1 Instruments & Tools (3).pptx
Program Evaluations to avoid aids/HIV for children
Benefits of M&E.PDF
monitoring and evaluation approaches for key population programs: perils, pi...
GBV Guidelines Regional Workshop Overview for Webinar
Improvement of the ngo’s performance
Community-based Indicators for HIV Programs
Planning Multisite Evaluations
How we can use impact evaluation to assure effective use of resources for dev...
Disaster risk management in food security and agriculture
ce_wg_presentation_-_nutrition.ppt
Programme planning and evaluation in extension work
Presentation_DRRM Mainstreaming in the Planning Cycle.pptx
Development of a Progressive Management Pathway to assist National and Intern...
Community HIV Program Indicators
Using case-based methods to assess scalability and sustainability: Lessons fr...
Ad

More from ALNAP (20)

PPTX
Gf john's presentation
PPTX
From best practice to best fit: changing to a more flexible approach to human...
PPTX
ALNAP PPT FOR MONTREUX XIII | 'From best practice to best fit'
PPTX
ALNAP PPT FOR OFDA | 50 years: From best practice to best fit
PPTX
Strengthening humanitarian leadership teams: Rethinking leadership?
PPTX
'Learning from disaster' study launch presentation
PDF
A networked response? 2013 presentation
PPTX
Disaster risk management in nepal
PPTX
Comisión Nacional de Prevención de Riesgos y Atención de Emergencias
PPTX
Government Forum for Government Response - an overview
PPTX
Monitoring and evaluation: The Caribbean Disaster Emergency Management Agency...
PPTX
Jamaican government experience and learning on disaster response
PPTX
Disaster Management Initiatives in India
PPTX
Disaster Response dialogue
PPT
International assistance for major disasters in Indonesia
PPTX
Simulation a tool to strengthen capabilities in India
PDF
Humanitarian leadership: who's in charge here?
PPTX
Cracks in the machine: is the humanitarian system fit for purpose? (Peter Wal...
PPTX
Data, evidence and access to information
PPTX
Whast goes up must come down: challenges of getting evidence back to the ground
Gf john's presentation
From best practice to best fit: changing to a more flexible approach to human...
ALNAP PPT FOR MONTREUX XIII | 'From best practice to best fit'
ALNAP PPT FOR OFDA | 50 years: From best practice to best fit
Strengthening humanitarian leadership teams: Rethinking leadership?
'Learning from disaster' study launch presentation
A networked response? 2013 presentation
Disaster risk management in nepal
Comisión Nacional de Prevención de Riesgos y Atención de Emergencias
Government Forum for Government Response - an overview
Monitoring and evaluation: The Caribbean Disaster Emergency Management Agency...
Jamaican government experience and learning on disaster response
Disaster Management Initiatives in India
Disaster Response dialogue
International assistance for major disasters in Indonesia
Simulation a tool to strengthen capabilities in India
Humanitarian leadership: who's in charge here?
Cracks in the machine: is the humanitarian system fit for purpose? (Peter Wal...
Data, evidence and access to information
Whast goes up must come down: challenges of getting evidence back to the ground

Humanitarian Indicator Tool (Dr Vivien Walden, Oxfam)

  • 1. Improving quality of humanitarian programmes through the use of a scoring system: the Humanitarian Indicator Tool Dr Vivien Margaret Walden and Nigel Timmins
  • 2. Acknowledgements • We wish to acknowledge the consultants, Andy Featherstone, Peta Sandison, Marlise Turnbull and Sarah House who helped refine the tools and the 10 countries offices who have been through the process and have hopefully come out of it unscathed. • Photographs used in this presentation are by Caroline Gluck, Oxfam Page 2
  • 3. Why a new tool? • In 1995, the ODI evaluation of the response to the Rwanda crisis raised concerns around the quality of service delivery • As a result of findings, the Sphere project was started • In 1997 the People in Aid Code of Best Practice was published • In 2000 the Humanitarian Ombudsman became HAP • In 2001 Griekspoor and Sondorp wrote a paper highlighting the various quality assurance measures and posed the question “Do all of the developments in the field of accountability and performance actually improve overall performance?” Page 3
  • 4. Why a new tool? • Oxfam’s hypothesis – by improving the quality of a humanitarian programme, the likelihood of positive impact is increased • The tool does not try to prove the link • Oxfam has introduced the Global Performance Framework as part of reporting to DFID • The framework has global indictors of which the Humanitarian Indicator is one • Whereas the other indicators measure impact, the humanitarian indictor looks at quality of the response Page 4
  • 5. Oxfam’s Global Performance Framework Global Output Indicators Global Outcome Indicators Humanitarian Assistance: Degree to which humanitarian responses meet Total number of people provided with appropriate recognised quality standards for humanitarian humanitarian assistance, disaggregated by sex programming (e.g. Sphere guidelines) Adaptation and Risk Reduction: % of supported households demonstrating greater # of people supported to understand current and likely ability to minimise risk from shocks and adapt to future hazards, reduce risk, and/or adapt to climatic changes and uncertainty, disaggregated by sex emerging trends & uncertainty Livelihood Enhancement Support: % of supported households demonstrating greater Improved quality # of women and men directly supported to increase income, as measured by daily consumption and of life for poor income via enhancing production and/or market access expenditure per capita women and men Women’s Empowerment: % of supported women demonstrating greater # of people reached to enable women to gain increased involvement in household decision-making and control over factors affecting their own priorities and interests influencing affairs at the community level Citizen Mobilisation: Degree to which selected interventions have # of a) citizens, CBO members and CSO staff supported contributed to affecting outcome change, as to engage with state institutions/other relevant actors; generated from findings of rigorous qualitative and b) duty bearers benefiting from capacity support evaluations Campaigning and Advocacy: Degree to which selected interventions have #of campaign actions directly undertaken or supported, contributed to affecting outcome change, as e.g. contacts made with policy targets, online and generated from findings of rigorous qualitative offline actions taken, media coverage, publications, etc. 
evaluations Degree to which selected interventions meet recognised standards for accountable programming Extent to which selected project delivery good value for money Page 5
  • 7. The indicator • Output indicator – the total number of people provided with appropriate humanitarian assistance, disaggregated by sex • Data collected annually through online system • Outcome indicator – the degree to which humanitarian responses meet recognised quality standards for humanitarian programming • Data collected through HIT Page 7
  • 8. The tool – for rapid onset emergencies Global Humanitarian Indicator: Degree to which humanitarian responses meet recognised quality standards for humanitarian programming RAPID ONSET EMERGNECY – EARTHQUAKE, SUDDEN FLOODS, TSUNAMI, CYCLONES, TYPHOONS, HURRICANES, SUDDEN CONFLICT WITH DISPLACEMENT, AWD OUTBREAKS Number Quality standard Met Almost Partially Not met (score6) met (4) met (score (score 2) 0) 1 Timeliness - rapid appraisal/assessment enough to make decisions within 24 hours and initial implementation within three days 2 Coverage uses 25% of affected population as an planned figure (response should reflect the scale of the disaster) with clear justification for final count 3 Technical aspects of programme measured against Sphere standards Number Quality standard Met Almost Partially Not met (score3) met (score met (score 2) (score 1) 0) 4 MEAL strategy and plan in place and being implemented using appropriate indicators 5 Feedback/complaints system for affected population in place and functioning and documented evidence of information sharing, consultation and participation leading to a programme relevant to context and needs 6 Partner relationships defined, capacity assessed and partners fully engaged in all stages of programme cycle 7 Programme is considered a safe programme: action taken to avoid harm and programme considered conflict sensitive 8 Programme (including advocacy) addresses gender equity and specific concerns Page 8
  • 9. The benchmarks • Timeliness - rapid appraisal/assessment enough to make decisions within 24 hours and initial implementation within three days • Coverage uses 25% of affected population as an planned figure (response should reflect the scale of the disaster) with clear justification for final count • Technical aspects of programme measured against Sphere standards • MEAL strategy and plan in place and being implemented using appropriate indicators • Feedback/complaints system for affected population in place and functioning and documented evidence of information sharing, consultation and participation leading to a programme relevant to context and needs Page 9
  • 10. The benchmarks • Partner relationships defined, capacity assessed and partners fully engaged in all stages of programme cycle • Programme is considered a safe programme: action taken to avoid harm and programme considered conflict sensitive • Programme (including advocacy) addresses gender equity and specific concerns and needs of women, girls, men and boys and vulnerable groups • Evidence that preparedness measures were in place and effectively actioned • Programme has an advocacy/campaigns strategy and has incorporated advocacy into programme plans based on evidence from the field Page 10
  • 11. The benchmarks • Country programme has an integrated approach including reducing and managing risk though existing longer-term development programmes and building resilience for the future • Evidence of appropriate staff capacity to ensure quality programming Page 11
  • 12. The methodology • Done by an external consultant – although preferably one who has a knowledge of Oxfam • Done as a desk study using documentation and some telephone/Skype interviews • Follows a pre-determined scoring system and list of documents • Has to be commented on and accepted by the country • The country writes a management response • All reports and a summary for each are published on the Oxfam website – www. Oxfam.org Page 12
  • 13. The scoring Quality standard Evidence needed Met (score 6) Almost met (4) Partially met (score 2) Not met (score 0) 3 Proposals Sphere standards NA Sphere standards Standards only MEAL strategy and proposed and put in proposed and adjusted mentioned in proposals plans place with adjusted to context but not replicated in PH and EFSL strategies standards for context Standards mentioned in plans Technical adviser visits Training in standards proposals and Or Training agendas and carried out for staff and LogFrames but not No mention of Sphere presentations partners monitored against in any document LogFrames and Indicators use Some evidence of monitoring frameworks standards and training but not donor reports monitoring against widespread (staff but RTE and other standards takes place not partners or only in evaluation reports regularly one area) learning event or Standards evaluated review reports Page 13
Instructions for use

Benchmark 3: technical aspects of the programme measured against Sphere standards

Evidence: proposals; MEAL strategy and plans; PH and EFSL strategies; technical adviser visits; training agendas and presentations; LogFrames and monitoring frameworks; donor reports; RTE and other evaluation reports; learning event or review reports

Quality check:
• Check proposals and strategies to see if standards are mentioned not just as a possibility but are considered in the context of the response – this might mean that Sphere has been adapted to suit the context
• The indicators on the LogFrame for technical areas should reflect Sphere standards
• The MEAL strategy should have Sphere as indicators and for data collection methods
• Check adviser reports for mention of standards and how these were implemented
• Check the RTE report for mention of Sphere standards
• Check WASH and EFSL strategies and adviser reports to see if any training was carried out for staff and partners
• Check review and evaluation reports for mention of standards

                                                 Page 14
Final score

• First year total score could be 30
• Adjusted for non-applicable benchmarks
• First year scoring:
  • Somalia – 17/28
  • Kenya – 24/30
  • Ethiopia – 9/28
  • Pakistan – 19/30
  • Colombia (the test case for the HIT) – 18/26
• Second year – only South Sudan is complete – 21/30

                                                 Page 15
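The adjustment for non-applicable benchmarks is simple arithmetic. A minimal sketch in Python, assuming each benchmark is scored out of 2 (as in the country examples that follow) and that NA benchmarks are dropped from both the score and the maximum, so the maximum varies by country (28, 26, etc.). The per-benchmark scores below are illustrative only, not the published country data:

```python
def hit_score(benchmark_scores):
    """Aggregate Humanitarian Indicator Tool benchmark scores.

    benchmark_scores: per-benchmark scores out of 2, with None for
    non-applicable benchmarks, which are excluded from both the total
    and the maximum (assumption based on the "adjusted for
    non-applicable benchmarks" note on the Final score slide).
    """
    applicable = [s for s in benchmark_scores if s is not None]
    return sum(applicable), 2 * len(applicable)

# Illustrative scores for 15 benchmarks, one of them NA -- not real data.
example = [2, 1, 2, 1, 2, 0, 1, 2, 1, 1, 2, 0, 1, 1, None]
score, maximum = hit_score(example)
print(f"{score}/{maximum}")  # prints "17/28"
```

With all 15 benchmarks applicable the maximum is 30, matching the Kenya and Pakistan denominators; one NA benchmark gives the 28 seen for Somalia and Ethiopia.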
The findings – Somalia

• Benchmark 5 on accountability
• Score: Met – 2/2
• “Oxfam’s partners appeared to differ in the level of beneficiary participation in design and delivery. Some documented highly participatory process, with qualitative and quantitative data. As well as gathering information, rapid assessments were done to establish VRCs to improve participation (criteria, entitlements, payment points, registration, complaints and feedback). Mobile phone hotlines were set up where possible, with feedback protocol to guide staff on how to register and follow-up complaints”

                                                 Page 16
Pakistan

• Benchmark 7 – protection and gender
• Score: Partially met – 1/2
• Some protection concerns were identified relating to security for staff and for women and girls using WASH facilities. Some actions were taken responding to dignity and protection, including involving women in different activities. Post-distribution monitoring investigated some security concerns related to cashing cheques and distribution points, including analysis of responses from women. No protection problems observed were communicated to agencies or authorities responsible for, or specialising in, protection

                                                 Page 17
Ethiopia

• Benchmark 11 – advocacy
• Score: Not met – 0/2
• Advocacy activities were clearly part of the intended response, and there is a regional advocacy action plan with Ethiopia objectives and a media, advocacy and campaign strategy which includes a number of plans for Ethiopia. However, no Ethiopia country advocacy strategy was provided for the evaluation. The sitreps do mention Oxfam’s participation in influential meetings, but these are not tied to an explicit strategy. There is no record of the impact of Oxfam’s advocacy activities

                                                 Page 18
Lessons learnt

• There are limitations to doing a desk study – it relies heavily on documentation, and scores do not distinguish between the absence of documentation and the actual absence of good programming
• Several country teams objected to being judged solely on “little pieces of paper”
• There is no opportunity to get the views of the affected population unless these are already documented
• Telephone/Skype interviews can be biased – it is sometimes difficult to triangulate
• The process needs goodwill and buy-in from the country team

                                                 Page 19
Advantages

• The process is fairly inexpensive – under £6,000
• The country does not have to host a consultant
• The methodology can be used to track progress in subsequent responses in one country
• The scores are comparable across programmes (although context should be considered)

                                                 Page 20
She deserves the best we can give – thank you!

                                                 Page 21

Editor's Notes

• #4: They then went on to discuss the fact that there were more agencies, more funding and more media attention. They even went as far as to say that 100,000 deaths could be attributed to poor performance by relief agencies
  • #11: Elderly, disabled, HIV positive, single women, female-headed households are examples