IPDET
Module 3:
Building a Results-Based
Monitoring and Evaluation
System
Introduction
• Importance of Results-Based M&E
• What Is Results-Based M&E?
• Traditional vs. Results-Based M&E
• Ten Steps to Building a Results-Based
M&E System
Importance
• Developing countries face growing pressure to improve the performance of their public sectors
• Results-based M&E supports such reform by tracking the results of government or organizational actions over time
• It is a management tool
The Power of Measuring
Results
• If you do not measure results, you cannot tell success from failure
• If you cannot see success, you cannot reward it
• If you cannot reward success, you are probably rewarding failure
• If you cannot see success, you cannot learn from it
• If you cannot recognize failure, you cannot correct it
• If you can demonstrate results, you can win public support
Results-Based M&E:
• Provides crucial information about public
sector performance
• Provides a view over time on the status of a
project, program, or policy
• Promotes credibility and public confidence by
reporting on the results of programs
• Helps formulate and justify budget requests
• Identifies potentially promising programs
or practices by studying pilots
(continued on next slide)
Results-Based M&E: (cont.)
• Focuses attention on achieving outcomes
important to the organization and
its stakeholders
• Provides timely, frequent information to staff
• Helps establish key goals and outcomes
• Permits managers to identify and take action
to correct weaknesses
• Supports a development agenda that is
shifting towards greater accountability
for aid lending
Results-Based
Monitoring
• Results-based monitoring (what we
call “monitoring”) is a continuous
process of collecting and analyzing
information on key indicators, and
comparing actual results to expected
results
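A minimal sketch of that comparison (hypothetical Python, not part of the IPDET materials; all indicator values are invented):

```python
# Core monitoring comparison: actual vs. expected values of a key indicator.
expected = {"2003": 50.0, "2004": 55.0, "2005": 60.0}  # planned levels (%)
actual = {"2003": 51.0, "2004": 53.0, "2005": 58.5}    # observed levels (%)

for year, plan in expected.items():
    observed = actual[year]
    status = "on track" if observed >= plan else "below plan"
    print(f"{year}: expected {plan}%, actual {observed}% -> {status}")
```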
Results-Based
Evaluation
• Results-based evaluation is an
assessment of a planned, ongoing, or
completed intervention to determine
its relevance, efficiency, effectiveness,
impact, and/or sustainability
Difference between
Results-Based Monitoring and
Results-Based Evaluation
• Monitoring: tracks movement of indicators
towards the achievement of specific,
predetermined targets
• Evaluation: takes a broader view,
considering progress toward stated goals,
the logic of the initiative, and its
consequences
• Both are needed to better manage policies,
programs, and projects
Comparison
Traditional:
• inputs
• activities
• outputs
Results-based:
• combines traditional M&E with assessment of outcomes and impacts
• allows the organization to modify and make adjustments to its theory of change and/or implementation processes
Complementary Roles of
Monitoring and
Evaluation
Monitoring
• Clarifies program objectives
• Links activities and their resources to objectives
• Translates objectives into performance indicators and sets targets and baselines
• Routinely collects data on these indicators and compares actual results with targets
• Reports progress to managers and alerts them to problems
Evaluation
• Analyzes why intended results were or were not achieved
• Assesses specific causal contributions of activities to results
• Examines the implementation process
• Explores unintended results
• Provides lessons, highlights significant accomplishments or program potential, and offers recommendations for improvement
Brief Introduction to
Theory of Change
• A theory of change is a representation of how a project, program, or policy initiative is expected to lead to its intended outcomes and impacts. It also identifies the underlying assumptions being made about how the change will occur.
Components of
Theory of Change
• Inputs – financial, human, and
material resources
• Activities – tasks undertaken
• Outputs – products and services
• Outcomes – behavioral changes
• Impacts – long-term, widespread improvement in society
Key Types of Monitoring
[Diagram: implementation monitoring (means and strategies) covers inputs, activities, and outputs; results monitoring covers outcomes and impacts]
Performance Indicators
• A variable that tracks the changes in
the development intervention or shows
results relative to what was planned
• The cumulative evidence of a cluster of
indicators is used to see if an initiative
is making progress
Step 1: Conducting a Readiness Assessment
[Diagram: Ten Steps to Building a Results-Based M&E System: 1. conducting a readiness assessment (highlighted); 2. agreeing on outcomes to monitor and evaluate; 3. selecting key indicators to monitor outcomes; 4. gathering baseline data on indicators; 5. planning for improvement: selecting realistic targets; 6. monitoring for results; 7. using evaluation information; 8. reporting findings; 9. using findings; 10. sustaining the M&E system within the organization]
What Is a Readiness
Assessment?
• A systematic approach to determine
the capacity and willingness of a
government or organization to
construct a results-based M&E system
– The approach focuses on: presence or
absence of champions, incentives, roles
and responsibilities, organizational
capacity, and barriers to getting started
Incentives
• Sort out the answers to these questions:
– What is driving the need for building an
M&E system?
– Who are the champions for building and
using an M&E system?
– What is motivating those who champion
building an M&E system?
– Who will benefit from the system?
– Who will not benefit?
Roles and
Responsibilities
• Ask:
– What are the roles of central and line ministries in
assessing performance?
– What is the role of the legislature?
– What is the role of the supreme audit agency?
– Do ministries and agencies share information
with one another?
– Who in the country produces data?
– Where at different levels in the government are
data used?
Organizational Capacity
• Assess current government capacity
with respect to:
– technical skills
– managerial skills
– existing data systems and their quality
– technology available
– fiscal resources available
– institutional experience
Barriers to M&E
• Do any of the following present
barriers to building an M&E system?
– lack of fiscal resources
– lack of political will
– lack of a champion for the system
– lack of an outcome-linked strategy or experience
• How do we confront these barriers?
Key Questions for Predicting
Success in Building an M&E System
• Does a clear mandate exist for M&E at the national level?
• Are Poverty Reduction Strategy Papers, laws, and
regulations in place?
• Is strong leadership and support present at the most senior levels of the government?
• How reliable is information that may be used for policy
and management decision making?
• How involved is civil society as a partner with
government in building and tracking performance
information?
• Are there pockets of innovation that can serve as
beginning practices or pilot programs?
Step 2: Agreeing on Outcomes to Monitor and Evaluate
[Ten-step diagram, with Step 2, agreeing on outcomes to monitor and evaluate, highlighted]
Why an Emphasis on
Outcomes?
• Makes explicit the intended objectives
of government action
• Outcomes are what produce benefits
• Clearly setting outcomes is key to designing and building a results-based M&E system
• Important! Budget to outputs, manage to
outcomes!
• (“Know where you are going before you get moving”)
Issues to Consider for
Generating Outcomes
• Are there stated national/sectoral goals?
• Have political promises been made that specify
improved performance in a particular area?
• Do citizen polling data or citizen score cards
indicate specific concerns?
• Is donor lending linked to specific goals?
• Is authorizing legislation present?
• Has the government made a serious commitment to achieving the MDGs (Millennium Development Goals)?
Developing Outcomes for One Policy Area: Education
Outcomes:
1. Improved coverage of preschool programs
2. Improved primary school learning outcomes
(Indicators, baselines, and targets are added in later steps)
Outcomes:
• Outcomes are usually not directly
measured — only reported on
• Outcomes must be translated to a set
of key indicators
• When choosing outcomes, “Do not go
it alone!” – agreement is crucial
Step 3: Selecting Key Indicators to Monitor Outcomes
[Ten-step diagram, with Step 3, selecting key indicators to monitor outcomes, highlighted]
Results Indicator
• A specific variable that, when tracked systematically over time, indicates progress (or lack thereof) toward an outcome or impact
– for new M&E systems, all indicators should be
numerical
– qualitative indicators can come later with mature
M&E systems
• Indicators ask: How will we know success when
we see it?
Indicator Development
• “CREAM”
– Clear
– Relevant
– Economic
– Adequate
– Monitorable
Matrix for Building/Using Indicators
One row per indicator (rows 1–4 in the slide are left blank), with the following columns:
• Indicator
• Data source
• Data collection method
• Who will collect data
• Frequency of data collection
• Cost to collect data
• Difficulty to collect
• Who will analyze and report data
• Who will use data
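As an illustration, one row of this matrix can be captured as a simple record. The sketch below is hypothetical Python (field names and example values are invented, not taken from the IPDET handbook):

```python
from dataclasses import dataclass

@dataclass
class IndicatorPlan:
    """One row of the indicator-planning matrix (illustrative field names)."""
    indicator: str
    data_source: str
    collection_method: str
    collector: str   # who will collect the data
    frequency: str   # frequency of data collection
    cost: str        # cost to collect the data
    difficulty: str  # difficulty of collection
    analyst: str     # who will analyze and report the data
    users: str       # who will use the data

row = IndicatorPlan(
    indicator="Percent of eligible urban children enrolled in preschool",
    data_source="Ministry of Education administrative records",
    collection_method="Review of official records",
    collector="District education offices",
    frequency="Annual",
    cost="Low",
    difficulty="Low",
    analyst="Ministry M&E unit",
    users="Ministry managers, legislature, civil society",
)
```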
Developing a Set of Outcome Indicators for One Policy Area: Education
Outcome 1: Improved coverage of preschool programs
– Indicator 1: Percent of eligible urban children enrolled in preschool
– Indicator 2: Percent of eligible rural children enrolled in preschool
Outcome 2: Improved primary school learning outcomes
– Indicator 1: Percent of Grade 6 students scoring 70 percent or better on standardized math and science tests
– Indicator 2: Percent of Grade 6 students scoring higher on standardized math and science tests in comparison to baseline data
(Baselines and targets are added in Steps 4 and 5)
Developing Indicators
• Develop your own indicators to meet
your needs
• Developing good indicators usually
takes more than one try
• State all indicators neutrally – not
“increase in…” or “decrease in…”
• Pilot, Pilot, and Pilot!
Step 4: Gathering Baseline Data on Indicators
[Ten-step diagram, with Step 4, gathering baseline data on indicators, highlighted]
Baseline Data and
Sources
• Baseline data:
– measurements that answer the question: where are we today?
• Primary source:
– gathered specifically for the project
• Secondary source:
– collected for another purpose
– can save money but be careful to ensure that it
is truly the information you need
Possible Sources
• Written records (paper and electronic)
• Individuals involved with the
intervention
• The general public
• Trained observers
• Mechanical measurements
• Geographic information systems
[Diagram: data collection methods arranged along a continuum from informal/less structured to formal/more structured: conversations with concerned individuals, community interviews, field visits, reviews of official records (MIS and admin data), participant observations, key informant interviews, focus group interviews, direct observations, surveys (one-time and panel), censuses, and field experiments]
Continuing Example, Developing Baseline Data for One Policy Area: Education
Outcome 1: Improved coverage of preschool programs
– Indicator 1: Percent of eligible urban children enrolled in preschool
Baseline: 75% in urban areas in 1999
– Indicator 2: Percent of eligible rural children enrolled in preschool
Baseline: 40% in rural areas in 2000
Outcome 2: Improved primary school learning outcomes
– Indicator 1: Percent of Grade 6 students scoring 70% or better on standardized math and science tests
Baseline: in 2002, 47% of students scored 70% or better in math and 50% or better in science
– Indicator 2: Percent of Grade 6 students scoring higher on standardized math and science tests in comparison to baseline data
Baseline: in 2002, the mean score for Grade 6 students was 68% in math and 53% in science
(Targets are set in Step 5)
Step 5: Planning for Improvement: Selecting Realistic Targets
[Ten-step diagram, with Step 5, planning for improvement: selecting realistic targets, highlighted]
Targets:
• The quantifiable levels of the indicators
that a country or organization wants to
achieve at a given point in time
• Example:
–Agricultural exports will increase in the next
three years by 20% over the baseline
Identifying Expected or Desired Level of Improvement Requires Targets
Baseline indicator level + desired level of improvement = target performance
• The desired level of improvement assumes a finite and expected level of inputs, activities, and outputs
• Target performance is the desired level of performance to be reached within a specific time
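Worked through in code, the formula is one line of arithmetic. The sketch below uses the agricultural-exports example from the previous slide (the baseline figure is invented for illustration):

```python
# Target performance = baseline indicator level + desired level of improvement.
# Example: agricultural exports to increase by 20% over the baseline in three years.
baseline_exports = 100.0    # baseline export level, illustrative units
desired_improvement = 0.20  # 20% improvement over the baseline

target_performance = baseline_exports * (1 + desired_improvement)
print(f"Target performance: {target_performance:.1f}")  # 120.0
```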
Caution:
• It takes time to observe the effects of
improvements, therefore:
– Be realistic when setting targets
– Avoid promising too much and thus
programming yourself to fail
Continuing Example, Setting Performance Targets for One Policy Area: Education
Outcome 1: Improved coverage of preschool programs
– Indicator 1: Percent of eligible urban children enrolled in preschool
Baseline: 75% in urban areas in 1999
Target: 85% in urban areas by 2006
– Indicator 2: Percent of eligible rural children enrolled in preschool
Baseline: 40% in rural areas in 2000
Target: 60% in rural areas by 2006
Outcome 2: Improved primary school learning outcomes
– Indicator 1: Percent of Grade 6 students scoring 70% or better on standardized math and science tests
Baseline: in 2002, 47% of students scored 70% or better in math and 50% or better in science
Target: by 2006, 80% of students will score 70% or better in math and 67% will score 70% or better in science
– Indicator 2: Percent of Grade 6 students scoring higher on standardized math and science tests in comparison to baseline data
Baseline: in 2002, the mean score for Grade 6 students was 68% in math and 53% in science
Target: in 2006, the mean test score will be 78% in math and 65% in science
Step 6: Monitoring for Results
[Ten-step diagram, with Step 6, monitoring for results, highlighted]
Key Types of Monitoring
[Diagram, repeated from the module introduction: implementation monitoring (means and strategies) covers inputs, activities, and outputs; results monitoring covers outcomes and impacts]
Implementation Monitoring Links to Results Monitoring
[Diagram: an outcome is supported by several targets (Target 1, Target 2, Target 3), each pursued through means and strategies (multi-year and annual work plans); results monitoring tracks the outcome and its targets, while implementation monitoring tracks the means and strategies]
[Diagram, continued: an impact is supported by multiple outcomes; each outcome is reached through targets (Target 1, Target 2), and each target through a means & strategy carried out with several partners (Partner 1, Partner 2, Partner 3)]
Successful Monitoring
Systems
• To be successful, every monitoring
system needs the following:
– ownership
– management
– maintenance
– credibility
Step 7: Using Evaluation Information
[Ten-step diagram, with Step 7, using evaluation information, highlighted]
Evaluation Means Info on:
Strategy: whether we are doing the right things
– Rationale/justification
– Clear theory of change
Operations: whether we are doing things right
– Effectiveness in achieving expected outcomes
– Efficiency in optimizing resources
– Client satisfaction
Learning: whether there are better ways of doing it
– Alternatives
– Best practices
– Lessons learned
Evaluation — When to
Use?
• Any time there is an unexpected result or performance outlier that requires further investigation (see the sketch after this list)
• When resource or budget allocations are being made
across projects, programs, or policies
• When a decision is being made whether or not to
expand a pilot
• When there is a long period with no improvement,
and the reasons for this are not clear
• When similar programs or policies are reporting
divergent outcomes
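For the first trigger, a monitoring system can flag candidate outliers mechanically. This is a hypothetical Python sketch (program names, values, and the 25% deviation threshold are all invented):

```python
def needs_evaluation(actual: float, target: float, tolerance: float = 0.25) -> bool:
    """True if a result deviates from its target by more than `tolerance` (a fraction)."""
    return abs(actual - target) / abs(target) > tolerance

results = {"Program A": 82.0, "Program B": 41.0, "Program C": 77.0}
target = 80.0
for program, value in results.items():
    if needs_evaluation(value, target):
        print(f"{program}: {value} vs. target {target} is an outlier; consider an evaluation")
```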
Step 8: Reporting Findings
[Ten-step diagram, with Step 8, reporting findings, highlighted]
Reporting Findings
• Provides information on status of projects,
programs, and policies
• Yields clues to problems
• Creates opportunities to consider changes
• Provides important information over time
on trends and directions
• Helps confirm or challenge theory of
change
When Analyzing and
Presenting Data:
• Compare indicator data with the baseline and targets, and provide this information in an easy-to-understand visual display (a minimal comparison sketch follows this list)
• Compare current information with past data and
look for patterns and trends
• Be careful about drawing sweeping conclusions
based on small amounts of information. The
more data points you have, the more certain you
can be that trends are real
(continued on next slide)
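Here is the minimal comparison sketch referred to above (hypothetical Python, not from the IPDET materials), which computes how far an indicator has moved from its baseline toward its target:

```python
def progress_toward_target(baseline: float, target: float, current: float) -> float:
    """Fraction of the baseline-to-target gap achieved so far."""
    gap = target - baseline
    if gap == 0:
        raise ValueError("Target equals baseline; progress is undefined.")
    return (current - baseline) / gap

# Education example from the slides: urban preschool enrollment,
# 75% baseline (1999) and 85% target (2006); the current value is invented.
print(f"{progress_toward_target(75.0, 85.0, 79.0):.0%} of the way to the target")  # 40%
```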
When Analyzing and
Presenting Data: (cont.)
• Protect the messenger: people who deliver
bad news should not be punished.
Uncomfortable findings can indicate new
trends or notify managers of problems
early on, allowing them the time needed to solve these problems
Step 9: Using Findings
[Ten-step diagram, with Step 9, using findings, highlighted]
Strategies for Sharing
Information
• Empower the media
• Enact “freedom of information” legislation
• Institute e-government
• Add information on internal and external Internet
sites
• Publish annual budget reports
• Engage civil society and citizen groups
• Strengthen legislative oversight
• Strengthen the office of the auditor general
• Share and compare results findings with
development partners
Ten Uses of Results
Findings
• Responds to elected officials’ and the public’s
demands for accountability
• Helps formulate and justify budget requests
• Helps in making operational resource
allocation decisions
• Triggers in-depth examinations of what
performance problems exist and what
corrections are needed
• Helps motivate personnel to continue making
program improvements
(continued on next slide)
Ten Uses of Results
Findings (cont.)
• Monitors the project or program performance
against outcome targets
• Provides data for special, in-depth program
evaluations
• Helps track service delivery against precise outcome targets
• Supports strategic and other long-term
planning efforts
• Communicates with the public to
build public trust
Step 10: Sustaining the M&E System within the Organization
[Ten-step diagram, with Step 10, sustaining the M&E system within the organization, highlighted]
Components Critical to Sustaining the M&E System
• Demand
• Clear roles and responsibilities
• Trustworthy and credible information
• Accountability
• Capacity
• Incentives
Concluding Comments
• The demand for capacity building never ends!
The only way an organization can coast is
downhill
• Keep your champions on your side and help
them!
• Establish the understanding with the Ministry of
Finance and the Parliament that an M&E system
needs sustained resources
• Look for every opportunity to link results
information to budget and resource allocation
decisions
(continued on next slide)
Concluding Comments
(cont.)
• Begin with pilot efforts to demonstrate effective
results-based monitoring and evaluation
• Begin with an enclave strategy (e.g., islands of
innovation) as opposed to a whole-of-
government approach.
• Monitor both implementation progress and
results achievements
• Complement performance monitoring with
evaluations to ensure better understanding of
public sector results
A Final Note….
“We are what we repeatedly do.
Excellence, then, is not an act, but a habit.”
-- Aristotle
Questions?