Introduction
Monitoring, Evaluation and Impact Assessment
Dr. Agyeya Tripathi
Compiled for: Development Management Institute
Content
• Basic definitions
• Development Context
• Scope and Importance of M&E
• How do international development organizations look at M&E?
• Some Examples
Source: DFID Research Uptake Guidance, April 2016
“DFID takes monitoring and evaluation of research uptake seriously and expects to see projects devoting resources to monitoring and evaluation from the outset of the programme rather than waiting until the end and reporting on what has been done”
Monitoring
What is monitoring?
• Monitoring is the built-in mechanism to check that things
are going to plan and enable adjustments to be made in a
methodical way (Oxfam, 1995).
• Monitoring is a systematic and continuous assessment of
progress of a piece of work over time (Save the Children,
1995).
What is M&E?

Monitoring
What: Ongoing gathering (and analysis) of information/data, usually against targets and milestones.
Why: To document results, processes and experiences and track progress as a basis for steering decisions, and to identify issues early so that corrective action can be taken.

Evaluation
What: Assessing data and information to establish a judgement on the success of a project (formative or summative).
Why: To assess whether a project has achieved its intended goals/impact. Evaluation is not only for accountability; it also generates learning to feed into future decisions.

Monitoring is a continuous process, whereas evaluations are carried out at specific points in time during the course of the project (mostly at the end of the project or of a project phase).
Project Cycle

Design
• Business case / proposal
• Evidence base for project
• Development of Logframe / Theory of Change

Mobilization
• Finalisation of partnerships and agreements (MoUs)
• Finalisation of M&E frameworks

Delivery
• Implementation of project
• On-going monitoring and review
• Adjust and adapt

Closure
• Evaluate performance
• Learn and share lessons
• Adapt and redesign (extension vs closure)
What Is the Purpose of Monitoring?
We monitor to:
• Assess quality and quantity of work done in
relation to each objective
• Rectify, improve, adapt, and derive lessons
Taken from IFRC Project/Programme M&E Guide
http://www.ifrc.org/Global/Publications/monitoring/IFRC-ME-Guide-8-2011.pdf
What Do We Monitor?
If you have a project design, you decide on the activities, expected
outputs, and results.
Activities
• Things a project or program does
Outputs
• Products or consequences of project activities
• Tangible deliverables, i.e. goods, services, desired behavioural change, or other consequences of a project
Results
• Things that happen because of what the project or program does
• Effects of outputs
Use of Information
Collected Through Monitoring
Plan/act on issues or concerns
• Inform the development of strategies and tactics
• Inform design of specific activities
Mobilize/manage
• Move resources (people, materials, money, information, time)
• Identify and adjust poorly performing components, or put pressure on the agency concerned
Communicate, report, and replan
• Share information
• Report on project performance to the stakeholders and donors
Guiding Principles of Monitoring
• Focus on minimal but key information
• Include all forms of communication
• Use verifiable evidence
• Enhance the quality of actions through
learning and accountability.
Monitoring Format: Basic
Plan vs. Actual
How do we check? What should be asked? What information should be collected?
We verify through data that we have achieved the planned level of performance.
We ask the following questions:
1. In what units of measurement do we collect the data?
2. Who has the data (who is your source of data)?
3. How will we gather the data?
4. How frequently?
5. Who will gather the data?
6. How do we interpret the data?
7. What will it cost?
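The answers to these questions can be captured as a simple data-collection plan. The sketch below is not part of the original deck; it is a minimal Python illustration, and every field name and value is invented, showing one possible way to record the answers for a single indicator.

```python
from dataclasses import dataclass

@dataclass
class DataCollectionPlan:
    """One row of a plan-vs-actual monitoring sheet (hypothetical structure)."""
    indicator: str          # what we measure
    unit: str               # Q1: unit of measurement
    source: str             # Q2: who holds the data
    method: str             # Q3: how the data will be gathered
    frequency: str          # Q4: how often
    responsible: str        # Q5: who gathers it
    interpretation: str     # Q6: how the data will be read against the plan
    cost_estimate: float    # Q7: expected cost of collection

# Example entry (all values are invented for illustration)
plan = DataCollectionPlan(
    indicator="Households with access to safe drinking water",
    unit="number of households",
    source="Village water committee registers",
    method="Quarterly register review and spot checks",
    frequency="Quarterly",
    responsible="Field monitoring officer",
    interpretation="Compare the actual count with the quarterly milestone",
    cost_estimate=250.0,
)
```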
Evaluation
What is Evaluation?
• Evaluation looks at the impact of the project and
the appropriateness of the action. Monitoring and
evaluation collect information to improve projects
after they have started.
• Evaluation can occur during implementation, at
the end, or even a few years after the project is
completed, and draws conclusions about whether
the right job is/was done well.
Types of Evaluation
According to evaluator
• Self-evaluation (participatory)
• Independent evaluation: Internal; External
According to timing
• Interim/mid-term
• Terminal
• Ex-post (impact)
Core Evaluation Concerns
Information Sources for Evaluation
• Project documents and subsequent revisions
• Progress reviews and self-evaluation reports
• Reports of previous independent evaluations
• Major project outputs
• Minutes of management committees and other relevant committees
• Annual reports of partner organizations
• Socio-economic profiles and other development indicators
• Organizational charts, by-laws
• Relevant national policy documents
• Lessons from similar projects in the country concerned or in other countries
• Interviews with relevant stakeholders
• Survey results
Relationship: Monitoring & Evaluation

Monitoring (Results)
• What happened?
• Accepts design as given
• Focus: efficiency, execution, compliance with procedures, achievement of outputs
• Feedback: replanning

Evaluation (Impact)
• Why did it happen or not happen?
• Focus: causality, unplanned change, net impact, causal relationship between outputs and objectives
• Challenges design
• Replanning
Monitoring (Results): Are we doing things the right way?
Evaluation (Impact): Are we doing the right things?
Tools for Evaluation
• Participatory Rapid Appraisal (PRA) and other
related tools: e.g., community profiling, mapping,
interviews, sampling
• Quantitative tools: e.g., financial analysis,
statistics
• Tools from anthropological traditions: e.g.,
participant observation
Selecting The Right Tool
• Convergence with project / program philosophy
• Perception of stakeholders on the method
• Involvement of end-users in various evaluation activities
• Matching with capacity of stakeholders
• Adaptability to stakeholders’ daily activities
• Capacity to provide timely information
• Reliability of results generated
• Consistency with complexity and cost of evaluation level
• Sensitivity to gender considerations
Exercise
Convert the following into a project design:
One can lead a horse to water, but one cannot make it drink.
Translated Into Design
• A thirsty horse is the problem.
• The water, the rope, and the man are inputs.
• Leading the horse to the water is an activity.
• The horse drinking from the water is an output.
• Addressing the thirst of the horse by letting it drink water is an
objective.
• To improve the health of horses is the purpose.
• A herd of happy horses is the overall goal.
The drinking behaviour and the fountain are the outputs.
Access to such a fountain, and the benefits derived from that access (i.e., improved health of the horse), are the results.
Illustration for Input, Output, Outcome, and Impact
Why Do We Need Standardization?
M&E Planning
Eight blind men are debating what an elephant looks like.
• Everybody is actually telling the truth.
• That is the point of participation.
• When people participate, you get different perspectives of reality.
• If we put together all the findings, we will come up with a relatively accurate image of the elephant.
What Are Indicators?
Indicators are information needed to help determine progress. An indicator provides, where possible, a clearly defined unit of measurement and a target detailing the quantity, quality and timing of expected results.
Making Indicators Useful
A performance indicator clarifies what we
intend to measure.
It does not tell us what level of achievement
signals success.
That is why we need baselines and targets.
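As a toy illustration of why the baseline and target matter, not something from the deck itself, the short Python sketch below (the function and all numbers are invented) expresses how much of the planned change an indicator's current value represents.

```python
def progress_towards_target(baseline: float, target: float, actual: float) -> float:
    """Share of the planned change achieved so far (0.0 = still at baseline, 1.0 = target met)."""
    if target == baseline:
        raise ValueError("Target must differ from the baseline for progress to be measurable.")
    return (actual - baseline) / (target - baseline)

# Illustrative only: a literacy-rate indicator with baseline 40% and target 60%.
print(progress_towards_target(baseline=40.0, target=60.0, actual=52.0))  # 0.6 -> 60% of the planned change
```

Without the baseline (40%) and the target (60%), the raw value of 52% alone would not tell us whether the project is succeeding.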
Criteria for Selecting Indicators
Use the SMART criteria: indicators should be Specific, Measurable, Achievable, Relevant, and Time-bound.
Indicators: Points To Remember
• Indicators should complement one another for cross-validation and help to flag problems with one another.
• Indicators should, as much as possible, be disaggregated by gender, age, or whichever category is desired.
• The number of indicators should be small; as a rule of thumb, a maximum of six per objective.
• Indicators may be relevant to different stakeholders, reflecting their different needs and interests.
Quantitative Indicators vs. Qualitative Indicators
Indicators: Typologies

Risk / Enabling
• Definition: Measures the influence of external factors on the project
• When to use: Usually during project design

Input
• Definition: Measures resources devoted to the project
• When to use: At the start of the project, at which point baseline data are collected

Process
• Definition: Measures delivery activities; monitors achievements over time
• When to use: While the project is ongoing

Output
• Definition: Measures immediate results
• When to use: Near the end of donor involvement

Outcome / Impact
• Definition: Measures long-term effects of the project
• When to use: After donor involvement, usually 3–5 years after the project ended
M&E Tools
• Theory of Change (ToC)
• LogFrame
LogFrame
ToC vs. Logframe
Terminology

Inputs: The resources, both financial and human, required to undertake your project.

Activities: Actions taken or work performed which should lead to outputs, e.g. the collection of data, the running of workshops, the organisation of meetings, the development of models.

Outputs: The immediate results of a grantee’s activities – the processes, products, goods and services delivered through funded activities, e.g. publications, manuals, datasets, models, workshops, stakeholder meetings.

Outcomes: The short-term and medium-term effects of an intervention’s outputs on policy or practice. Outcomes are observable behavioural, institutional or societal changes.

Impact: The long-term sustained effects of a development intervention, direct or indirect. This is usually the goal of the programme/project.
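The results-chain terminology above can be mirrored in a small data structure. The Python sketch below is illustrative only and is not part of the original deck; the class name, fields and example values are all hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ResultsChain:
    """Hypothetical, simplified representation of the logframe terminology above."""
    inputs: List[str]       # resources (financial, human)
    activities: List[str]   # work performed that should lead to outputs
    outputs: List[str]      # immediate, deliverable results of the activities
    outcomes: List[str]     # short/medium-term changes in behaviour, policy or practice
    impact: str             # long-term sustained effect: the programme goal

chain = ResultsChain(
    inputs=["Grant funding", "Research team"],
    activities=["Collect data", "Run stakeholder workshops"],
    outputs=["Published dataset", "Policy brief"],
    outcomes=["Findings discussed by policy institutions"],
    impact="Research evidence routinely informs national policy",
)
```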
Monitoring With the Log Frame
(applied at the output, outcome and impact levels)

Indicator: Objective, easily measurable signs of change or progress. Indicators can be quantitative or qualitative, but do not by themselves show how much will be achieved.

Baseline: Data collected against the indicators before any work has taken place.

Milestone: The expected level of progress against an indicator at intermediate points during the project.

Target: The amount or final effect expected at the end of the project cycle.
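To make the baseline / milestone / target distinction concrete, here is a minimal sketch (Python, not taken from the deck; all names and figures are invented) of how an indicator could be tracked against its milestones.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class LogframeIndicator:
    """Hypothetical record for monitoring one indicator within a logframe."""
    name: str
    baseline: float                 # value before any work has taken place
    milestones: Dict[str, float]    # expected interim levels, keyed by reporting period
    target: float                   # final expected level at the end of the project cycle

    def on_track(self, period: str, actual: float) -> bool:
        """True if the actual value has reached the milestone set for that period."""
        return actual >= self.milestones[period]

# Invented numbers for illustration only.
indicator = LogframeIndicator(
    name="Schools with functioning water points",
    baseline=10,
    milestones={"Year 1": 25, "Year 2": 40},
    target=60,
)
print(indicator.on_track("Year 1", actual=28))  # True: the Year 1 milestone has been met
```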
Develop an M&E Plan
1. Review the Theory of Change
2. Identify monitoring questions
3. Identify indicators
4. Develop the M&E plan
Developing an M&E Plan: Example (Output 1: Journal articles)

Evaluation question: Effectiveness
• Monitoring question: How many journal articles were produced?
• Indicator: Number of journal articles

Evaluation question: Quality and usefulness
• Indicators: Number of downloads; citation rates

Monitoring question: Has the research fed into policy?
• Indicator: Discussions of findings within relevant southern policy institutions, facilitated by research experts
M&E Plan
• Source: where will the data come from?
• Baseline: what was the data before you started?
• When will the data be collected, and by whom?
Thank You
Email: tripathi.agyeya@gmail.com