Monitoring and Evaluation of
Health Services
Dr. Rasha Salama
PhD Public Health
Faculty of Medicine
Suez Canal University-Egypt
Presentation Outline
Monitoring and Evaluation (M&E)
• Monitoring progress and evaluating results are key
functions to improve the performance of those
responsible for implementing health services.
• M&E shows whether a service/program is accomplishing its goals. It identifies program weaknesses and strengths, areas of the program that need revision, and areas of the program that meet or exceed expectations.
• To do this, analysis of any or all of a program's domains is required.
Where does M&E fit?
Monitoring versus Evaluation
Monitoring: a planned, systematic process of observation that closely follows a course of activities and compares what is happening with what is expected to happen.
Evaluation: a process that assesses an achievement against preset criteria. It has a variety of purposes and follows distinct methodologies (process, outcome, performance, etc.).
Evaluation versus Monitoring
Evaluation:
• A systematic process to determine the extent to which service needs and results have been or are being achieved, and to analyse the reasons for any discrepancy.
• Attempts to measure a service's relevance, efficiency and effectiveness. It measures whether, and to what extent, the programme's inputs and services are improving the quality of people's lives.
Monitoring:
• The periodic collection and review of information on programme implementation, coverage and use, for comparison with implementation plans.
• Open to modifying original plans during implementation.
• Identifies shortcomings before it is too late.
• Provides elements of analysis as to why progress fell short of expectations.
Comparison between Monitoring and Evaluation
Evaluation
Evaluation can focus on:
• Projects
normally consist of a set of activities undertaken to
achieve specific objectives within a given budget and
time period.
• Programs
are organized sets of projects or services concerned
with a particular sector or geographic region
• Services
are based on a permanent structure and have the goal of becoming national in coverage (e.g. health services), whereas programmes are usually limited in time or area.
• Processes
are organizational operations of a continuous and
supporting nature (e.g. personnel procedures,
administrative support for projects, distribution
systems, information systems, management
operations).
• Conditions
are particular characteristics or states of being of
persons or things (e.g. disease, nutritional status,
literacy, income level).
Evaluation may focus on different aspects of a service or program:
• Inputs
are resources provided for an activity, and include cash,
supplies, personnel, equipment and training.
• Processes
transform inputs into outputs.
• Outputs
are the specific products or services that an activity is expected to deliver as a result of receiving the inputs.
• A service is effective if
it “works”, i.e. it delivers outputs in accordance with its
objectives.
• A service is efficient or cost-effective if
effectiveness is achieved at the lowest practical cost.
• Outcomes
refer to people's responses to a programme and how they are doing things differently as a result of it. They are short-term effects related to objectives.
• Impacts
are the effects of the service on the people and their
surroundings. These may be economic, social,
organizational, health, environmental, or other
intended or unintended results of the programme.
Impacts are long-term effects.
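To make these evaluation foci concrete, here is a minimal sketch, assuming a hypothetical immunization programme, that groups illustrative items under inputs, processes, outputs, outcomes and impacts; none of the entries come from the slides.

```python
# Minimal sketch: organizing a hypothetical immunization programme's
# evaluation domains. All entries are illustrative examples only.

program_domains = {
    "inputs": ["vaccines and cold-chain equipment", "nurses", "training budget"],
    "processes": ["outreach sessions", "clinic-based vaccination", "record keeping"],
    "outputs": ["children vaccinated", "outreach sessions held"],
    "outcomes": ["increased coverage among under-fives", "caregivers seeking services earlier"],
    "impacts": ["reduced incidence of measles", "lower under-five mortality"],
}

def describe(domains: dict) -> None:
    """Print each evaluation domain and its illustrative items."""
    for domain, items in domains.items():
        print(f"{domain.capitalize()}:")
        for item in items:
            print(f"  - {item}")

if __name__ == "__main__":
    describe(program_domains)
```

Laying the domains out this way makes it easier to check, for each one, which inputs feed which processes and which outputs are expected to produce which outcomes and impacts.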
So what do you think?
• When is evaluation desirable?
When Is Evaluation Desirable?
• Program evaluation is often used when programs have
been functioning for some time. This is called
Retrospective Evaluation.
• However, evaluation should also be conducted when a
new program within a service is being introduced. These
are called Prospective Evaluations.
• A prospective evaluation identifies ways to increase
the impact of a program on clients; it examines and
describes a program’s attributes; and, it identifies how to
improve delivery mechanisms to be more effective.
Prospective versus Retrospective Evaluation
• Prospective Evaluation determines
what ought to happen (and why)
• Retrospective Evaluation determines
what actually happened (and why)
Evaluation Matrix
The broadest and most common classification of
evaluation identifies two kinds of evaluation:
• Formative evaluation.
Evaluation of components and activities of a
program other than their outcomes. (Structure
and Process Evaluation)
• Summative evaluation.
Evaluation of the degree to which a program has
achieved its desired outcomes, and the degree to
which any other outcomes (positive or negative)
have resulted from the program.
Evaluation Matrix
Components of Comprehensive Evaluation
Evaluation Designs
• Ongoing service/program evaluation
• End of program evaluation
• Impact evaluation
• Spot check evaluation
• Desk evaluation
Who conducts evaluation?
• Internal evaluation (self evaluation), in which
people within a program sponsor, conduct and
control the evaluation.
• External evaluation, in which someone from
beyond the program acts as the evaluator and
controls the evaluation.
Tradeoffs between External and Internal Evaluation
Source: Adapted from UNICEF Guide for Monitoring and Evaluation, 1991.
Guidelines for Evaluation (FIVE phases)
Phase A: Planning the Evaluation
• Determine the purpose of the
evaluation.
• Decide on type of evaluation.
• Decide on who conducts
evaluation (evaluation team)
• Review existing information in
programme documents
including monitoring
information.
• List the relevant information
sources
• Describe the programme. *
• Assess your own strengths and
limitations.
* Provide background information on the history and current status of the programme being evaluated, including:
• How it works (its objectives, strategies and management process)
• Policy environment
• Economic and financial feasibility
• Institutional capacity
• Socio-cultural aspects
• Participation and ownership
• Environment
• Technology
Phase B: Selecting Appropriate Evaluation Methods
• Identify evaluation goals and objectives.
(SMART)
• Formulate evaluation questions and sub-questions
• Decide on the appropriate evaluation design
• Identify measurement standards
• Identify measurement indicators
• Develop an evaluation schedule
• Develop a budget for the evaluation.
Sample evaluation questions: What might
stakeholders want to know?
Program clients:
• Does this program provide us
with high quality service?
• Are some clients provided with
better services than other
clients? If so, why?
Program Staff:
• Does this program provide our
clients with high quality service?
• Should staff make any changes
in how they perform their work,
as individuals and as a team, to
improve program processes and
outcomes?
Program managers:
• Does this program provide our
clients with high quality service?
• Are there ways managers can
improve or change their
activities, to improve program
processes and outcomes?
Funding bodies:
• Does this program provide its
clients with high quality service?
• Is the program cost-effective?
• Should we make changes in how
we fund this program or in the
level of funding to the program?
Indicators... What are they?
An indicator is a standardized, objective measure that allows:
• A comparison among health facilities
• A comparison among countries
• A comparison between different time periods
• A measure of the progress toward achieving
program goals
Characteristics of Indicators
• Clarity: easily understandable by everybody
• Usefulness: represents all the important dimensions of performance
• Measurable
▫ Quantitative: rates, proportions, percentage, common
denominator (e.g., population)
▫ Qualitative: “yes” or “no”
• Reliability: can be collected consistently by
different data collectors
• Validity: measure what we mean to measure
Which Indicators?
The following questions can help determine
measurable indicators:
▫ How will I know if an objective has been
accomplished?
▫ What would be considered effective?
▫ What would be a success?
▫ What change is expected?
Formative Assessment: Evaluation Areas, Questions and Examples of Specific Measurable Indicators
• Staff Supply. Evaluation question: Is staff supply sufficient? Indicator: staff-to-client ratios.
• Service Utilization. Evaluation question: What are the program's usage levels? Indicator: percentage of utilization.
• Accessibility of Services. Evaluation question: How do members of the target population perceive service availability? Indicators: percentage of the target population who are aware of the program in their area; percentage of the "aware" target population who know how to access the service.
• Client Satisfaction. Evaluation question: How satisfied are clients? Indicator: percentage of clients who report being satisfied with the service received.
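As a rough illustration of how the formative indicators above could be computed from routine programme data, the following sketch uses invented figures; the variable names and numbers are assumptions for the example only.

```python
# Minimal sketch: computing illustrative formative indicators.
# All numbers are hypothetical; replace them with real programme data.

clients_served = 1200          # clients seen in the reporting period
staff_count = 15               # service providers available
expected_visits = 2000         # planned/targeted service contacts
actual_visits = 1450           # service contacts actually delivered
surveyed = 400                 # clients interviewed in a satisfaction survey
satisfied = 332                # of those, reported being satisfied

utilization_pct = 100 * actual_visits / expected_visits     # % of planned contacts used
satisfaction_pct = 100 * satisfied / surveyed               # % of surveyed clients satisfied

print(f"Staff-to-client ratio: 1 staff per {clients_served / staff_count:.0f} clients")
print(f"Service utilization: {utilization_pct:.1f}%")
print(f"Client satisfaction: {satisfaction_pct:.1f}%")
```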
Summative Assessment: Evaluation Areas, Questions and Examples of Specific Measurable Indicators
• Changes in Behaviour. Evaluation question: Have risk factors for cardiac disease changed? Indicator: compare the proportion of respondents who reported increased physical activity.
• Morbidity/Mortality. Evaluation questions: Has lung cancer mortality decreased by 10%? Has there been a reduction in the rate of low birth weight babies? Indicators: age-standardized lung cancer mortality rates for males and females; compare annual rates of low birth weight babies over a five-year period.
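The age-standardized rates mentioned above are commonly obtained by direct standardization: each age-specific rate is weighted by the share of a standard population in that age group. The sketch below uses invented age groups, rates and weights purely to show the arithmetic.

```python
# Minimal sketch: direct age standardization of a lung cancer mortality rate.
# ASR = sum over age groups of (age-specific rate * standard population weight).
# All figures are hypothetical.

age_specific_rates = {          # deaths per 100,000 in the study population
    "0-39": 0.5,
    "40-59": 25.0,
    "60-79": 180.0,
    "80+": 310.0,
}

standard_weights = {            # proportion of the standard population in each group
    "0-39": 0.55,
    "40-59": 0.25,
    "60-79": 0.15,
    "80+": 0.05,
}

asr = sum(age_specific_rates[g] * standard_weights[g] for g in age_specific_rates)
print(f"Age-standardized mortality rate: {asr:.1f} per 100,000")

# Comparing two years standardized to the same weights shows whether a
# target such as a 10% reduction in mortality has been reached
# independently of changes in the population's age structure.
```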
So what will we do? Use an Importance-Feasibility Matrix
Face reality! Assess your strengths and weaknesses
Eventually......
Phase C: Collecting and Analysing Information
• Develop data collection instruments.
• Pre-test data collection instruments.
• Undertake data collection activities.
• Analyse data.
• Interpret the data
Development of a program logic model
A program logic model provides a framework for an evaluation. It is
a flow chart that shows the program’s components, the
relationships between components and the sequencing of events.
Use of IF-THEN Logic Model Statements
To support logic model development, a set of “IF-THEN” statements
helps determine if the rationale linking program inputs, outputs and
objectives/outcomes is plausible, filling in links in the chain of reasoning
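One simple way to draft such IF-THEN statements is to write the logic model as an ordered chain of links and generate a statement from each consecutive pair; the smoking-cessation chain below is a hypothetical example, not content from the presentation.

```python
# Minimal sketch: generating IF-THEN statements from a hypothetical logic model chain.
# Each generated statement should be plausible on its own; a weak link signals
# a gap in the chain of reasoning.

logic_chain = [
    "trained counsellors and educational materials are available (inputs)",
    "group counselling sessions are delivered in clinics (activities/outputs)",
    "participants attempt to quit smoking (short-term outcome)",
    "smoking prevalence in the target group falls (outcome)",
    "tobacco-related morbidity declines (impact)",
]

for earlier, later in zip(logic_chain, logic_chain[1:]):
    print(f"IF {earlier}, THEN {later}.")
```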
CAT SOLO mnemonic
Next, the CAT Elements (Components, Activities and Target Groups) of a
logic model can be examined
Gathering of Qualitative and Quantitative
Information: Instruments
Qualitative tools:
There are five frequently used data collection processes in qualitative
evaluation (more than one method can be used):
1. Unobtrusive seeing, involving an observer who is not seen by
those who are observed;
2. Participant observation, involving an observer who takes part in the activity and is seen by the activity's participants;
3. Interviewing, involving a more active role for the evaluator
because she /he poses questions to the respondent, usually on a
one-on-one basis
4. Group-based data collection processes such as focus groups;
and
5. Content analysis, which involves reviewing documents and
transcripts to identify patterns within the material
Quantitative tools:
• “Quantitative, or numeric information, is obtained
from various databases and can be expressed using
statistics.”
• Surveys/questionnaires;
• Registries
• Activity logs;
• Administrative records;
• Patient/client charts;
• Registration forms;
• Case studies;
• Attendance sheets.
Pretesting or piloting......
Other monitoring and evaluation methods:
• Biophysical measurements
• Cost-benefit analysis
• Sketch mapping
• GIS mapping
• Transects
• Seasonal calendars
• Most significant change
method
• Impact flow diagram (cause-effect diagram)
• Institutional linkage diagram
(Venn/Chapati diagram)
• Problem and objectives tree
• Systems (inputs-outputs)
diagram
• Monitoring and evaluation wheel (spider web)
Spider Web Method:
This method is a visual index developed to identify the kind of
indicators/criteria that can be used to monitor change over the
program period. This would present a ‘before’ and ‘after’
program/project situation. It is commonly used in participatory
evaluation.
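A spider web (radar) chart of this kind can be drawn with standard plotting tools. The sketch below uses matplotlib to plot invented "before" and "after" scores for a handful of illustrative indicators; the indicator names and the 0-5 scale are assumptions for the example.

```python
# Minimal sketch: a 'before vs after' spider web (radar) chart for programme indicators.
# Indicator names and scores are hypothetical.

import numpy as np
import matplotlib.pyplot as plt

indicators = ["Coverage", "Staff training", "Client satisfaction",
              "Record keeping", "Community participation"]
before = [2, 3, 2, 1, 2]   # scores at baseline (0-5 scale)
after = [4, 4, 3, 3, 4]    # scores at evaluation

angles = np.linspace(0, 2 * np.pi, len(indicators), endpoint=False).tolist()
angles += angles[:1]                  # repeat the first angle to close the polygon
before_closed = before + before[:1]
after_closed = after + after[:1]

ax = plt.subplot(polar=True)
ax.plot(angles, before_closed, label="Before")
ax.plot(angles, after_closed, label="After")
ax.set_xticks(angles[:-1])
ax.set_xticklabels(indicators)
ax.legend()
plt.show()
```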
Phase D: Reporting Findings
• Write the evaluation report.
• Decide on the method of sharing the evaluation
results and on communication strategies.
• Share the draft report with stakeholders, revise as needed, and follow up.
• Disseminate evaluation report.
Example of a suggested outline for an evaluation report
Phase E: Implementing Evaluation Recommendations
• Develop a new/revised implementation plan in
partnership with stakeholders.
• Monitor the implementation of evaluation
recommendations and report regularly on the
implementation progress.
• Plan the next evaluation.
References
• WHO/UNFPA. "Programme Manager's Planning Monitoring & Evaluation Toolkit", Division for Oversight Services, August 2004.
• Ontario Ministry of Health and Long-Term Care, Public Health Branch. In: The Health Communication Unit at the Centre for Health Promotion, "Introduction to Evaluating Health Promotion Programs", November 23-24, 2007.
• Donaldson SI, Gooler LE, Scriven M. (2002). Strategies for managing evaluation anxiety: Toward a psychology of program evaluation. American Journal of Evaluation, 23(3), 261-272.
• CIDA. "CIDA Evaluation Guide", Performance Review Branch, 2000.
• OECD. "Improving Evaluation Practices: Best Practice Guidelines for Evaluation and Background Paper", 1999.
• UNDP. "Results-Oriented Monitoring and Evaluation: A Handbook for Programme Managers", Office of Evaluation and Strategic Planning, New York, 1997.
• UNICEF. "A UNICEF Guide for Monitoring and Evaluation: Making a Difference?", Evaluation Office, New York, 1991.
References (cont.)
• UNICEF. "Evaluation Reports Standards", 2004.
• USAID. "Performance Monitoring and Evaluation – TIPS # 3: Preparing an Evaluation Scope of Work", 1996, and "TIPS # 11: The Role of Evaluation in USAID", 1997, Centre for Development Information and Evaluation. Available at http://www.dec.org/usaid_eval/#004
• U.S. Centers for Disease Control and Prevention (CDC). "Framework for Program Evaluation in Public Health", 1999. Available in English at http://www.cdc.gov/eval/over.htm
• U.S. Department of Health and Human Services, Administration on Children, Youth, and Families (ACYF). "The Program Manager's Guide to Evaluation", 1997.
Thank You
  • 51. References (cont.) • UNICEF. “Evaluation Reports Standards”, 2004. • USAID. “Performance Monitoring and Evaluation – TIPS # 3: Preparing an Evaluation Scope of • Work”, 1996 and “TIPS # 11: The Role of Evaluation in USAID”, 1997, Centre for Development Information and Evaluation. Available at http://www.dec.org/usaid_eval/#004 • U.S. Centres for Disease Control and Prevention (CDC). “Framework for Program Evaluation in Public Health”, 1999. Available in English at http://www.cdc.gov/eval/over.htm • U.S. Department of Health and Human Services. Administration on Children, Youth, and Families (ACYF), “The Program Manager’s Guide to Evaluation”, 1997.