Monitoring and Evaluation of Health Services Dr. Rasha Salama PhD Public Health Faculty of Medicine Suez Canal University-Egypt
Presentation Outline
Monitoring and Evaluation (M&E)
Monitoring progress and evaluating results are key functions for improving the performance of those responsible for implementing health services. M&E shows whether a service/program is accomplishing its goals: it identifies program strengths and weaknesses, areas of the program that need revision, and areas that meet or exceed expectations. To do this, analysis of any or all of a program's domains is required.
Where does M&E fit?
Monitoring versus Evaluation
Monitoring: a planned, systematic process of observation that closely follows a course of activities and compares what is happening with what was expected to happen.
Evaluation: a process that assesses achievement against preset criteria. It has a variety of purposes and follows distinct methodologies (process, outcome, performance, etc.).
Evaluation: a systematic process to determine the extent to which service needs and results have been or are being achieved, and to analyse the reasons for any discrepancy. It attempts to measure a service's relevance, efficiency and effectiveness, and whether, and to what extent, the programme's inputs and services are improving the quality of people's lives. It provides elements of analysis as to why progress fell short of expectations.
Monitoring: the periodic collection and review of information on programme implementation, coverage and use, for comparison with implementation plans. It is open to modifying original plans during implementation, and it identifies shortcomings before it is too late.
Comparison between Monitoring and Evaluation
Evaluation
Evaluation can focus on:
Projects: sets of activities undertaken to achieve specific objectives within a given budget and time period.
Programs: organized sets of projects or services concerned with a particular sector or geographic region.
Services: activities based on a permanent structure, with the goal of becoming national in coverage (e.g. health services), whereas programmes are usually limited in time or area.
Processes: organizational operations of a continuous and supporting nature (e.g. personnel procedures, administrative support for projects, distribution systems, information systems, management operations).
Conditions: particular characteristics or states of being of persons or things (e.g. disease, nutritional status, literacy, income level).
Evaluation may focus on different aspects of a service or program:
Inputs: resources provided for an activity, including cash, supplies, personnel, equipment and training.
Processes: what transforms inputs into outputs.
Outputs: the specific products or services that an activity is expected to deliver as a result of receiving the inputs. A service is effective if it "works", i.e. it delivers outputs in accordance with its objectives; it is efficient (cost-effective) if effectiveness is achieved at the lowest practical cost.
Outcomes: people's responses to a programme and how they are doing things differently as a result of it; these are short-term effects related to objectives.
Impacts: the effects of the service on people and their surroundings. These may be economic, social, organizational, health, environmental, or other intended or unintended results of the programme; impacts are long-term effects.
 
So what do you think? When is evaluation desirable?
When Is Evaluation Desirable? Program evaluation is often used when programs have been functioning for some time; this is called retrospective evaluation. However, evaluation should also be conducted when a new program within a service is being introduced; these are called prospective evaluations. A prospective evaluation identifies ways to increase the impact of a program on clients, examines and describes a program's attributes, and identifies how to make delivery mechanisms more effective.
Prospective versus Retrospective Evaluation
Prospective evaluation determines what ought to happen (and why).
Retrospective evaluation determines what actually happened (and why).
Evaluation Matrix
The broadest and most common classification of evaluation identifies two kinds:
Formative evaluation: evaluation of the components and activities of a program other than its outcomes (structure and process evaluation).
Summative evaluation: evaluation of the degree to which a program has achieved its desired outcomes, and the degree to which any other outcomes (positive or negative) have resulted from the program.
Evaluation Matrix
Components of Comprehensive Evaluation
Evaluation Designs
Ongoing service/program evaluation
End-of-program evaluation
Impact evaluation
Spot-check evaluation
Desk evaluation
Who conducts evaluation?
Who conducts evaluation? Internal evaluation  (self evaluation), in which people within a program sponsor, conduct and control the evaluation.  External evaluation,  in which someone from beyond the program acts as the evaluator and controls the evaluation.
Tradeoffs between External and Internal Evaluation
Tradeoffs between External and Internal Evaluation Source: Adapted from UNICEF Guide for Monitoring and Evaluation, 1991 .
Guidelines for Evaluation (FIVE phases)
Phase A: Planning the Evaluation
Determine the purpose of the evaluation.
Decide on the type of evaluation.
Decide on who conducts the evaluation (the evaluation team).
Review existing information in programme documents, including monitoring information.
List the relevant information sources.
Describe the programme: provide background information on the history and current status of the programme being evaluated, including how it works (its objectives, strategies and management process), the policy environment, economic and financial feasibility, institutional capacity, socio-cultural aspects, participation and ownership, environment, and technology.
Assess your own strengths and limitations.
Phase B: Selecting Appropriate Evaluation Methods
Identify evaluation goals and objectives (SMART).
Formulate evaluation questions and sub-questions.
Decide on the appropriate evaluation design.
Identify measurement standards.
Identify measurement indicators.
Develop an evaluation schedule.
Develop a budget for the evaluation.
Sample evaluation questions: what might stakeholders want to know?
Program clients: Does this program provide us with high-quality service? Are some clients provided with better services than others? If so, why?
Program staff: Does this program provide our clients with high-quality service? Should staff make any changes in how they perform their work, as individuals and as a team, to improve program processes and outcomes?
Program managers: Does this program provide our clients with high-quality service? Are there ways managers can improve or change their activities to improve program processes and outcomes?
Funding bodies: Does this program provide its clients with high-quality service? Is the program cost-effective? Should we make changes in how we fund this program, or in the level of funding?
 
Indicators: what are they?
An indicator is a standardized, objective measure that allows:
comparison among health facilities;
comparison among countries;
comparison between different time periods;
measurement of progress toward achieving program goals.
Characteristics of Indicators
Clarity: easily understandable by everybody.
Usefulness: represents all the important dimensions of performance.
Measurability: quantitative (rates, proportions, percentages over a common denominator, e.g. population) or qualitative ("yes"/"no").
Reliability: can be collected consistently by different data collectors.
Validity: measures what we mean to measure.
Which Indicators? The following questions can help determine measurable indicators: How will I know if an objective has been accomplished? What would be considered effective? What would be a success? What change is expected?
Evaluation Area (Formative Assessment), Evaluation Question, and Examples of Specific Measurable Indicators:
Staff supply. Is staff supply sufficient? Indicator: staff-to-client ratios.
Service utilization. What are the program's usage levels? Indicator: percentage of utilization.
Accessibility of services. How do members of the target population perceive service availability? Indicators: percentage of the target population who are aware of the program in their area; percentage of the "aware" target population who know how to access the service.
Client satisfaction. How satisfied are clients? Indicator: percentage of clients who report being satisfied with the service received.
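Each of these formative indicators reduces to a ratio or percentage once a numerator and denominator are defined. A minimal sketch, assuming hypothetical monitoring counts (all names and figures below are illustrative, not from the presentation):

```python
# Minimal sketch (hypothetical data): turning routine service counts into
# the formative indicators listed above. Figures are illustrative only.

def percentage(numerator: int, denominator: int) -> float:
    """Return numerator/denominator as a percentage, guarding against zero."""
    return 100.0 * numerator / denominator if denominator else 0.0

# Hypothetical monitoring counts for one reporting period.
staff_count = 12
registered_clients = 480
visits_made = 350
visits_planned = 500
surveyed = 200
aware_of_program = 140
know_how_to_access = 90
satisfied = 160

print(f"Staff-to-client ratio: 1:{registered_clients // staff_count}")
print(f"Utilization: {percentage(visits_made, visits_planned):.1f}% of planned visits")
print(f"Awareness: {percentage(aware_of_program, surveyed):.1f}% of target population")
print(f"Access knowledge: {percentage(know_how_to_access, aware_of_program):.1f}% of the 'aware' group")
print(f"Client satisfaction: {percentage(satisfied, surveyed):.1f}%")
```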
Evaluation Area (Summative Assessment), Evaluation Question, and Examples of Specific Measurable Indicators:
Changes in behaviour. Have risk factors for cardiac disease changed? Indicator: compare the proportion of respondents who report increased physical activity.
Morbidity/mortality. Has lung cancer mortality decreased by 10%? Has there been a reduction in the rate of low-birth-weight babies? Indicators: age-standardized lung cancer mortality rates for males and females; compare annual rates of low-birth-weight babies over a five-year period.
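Age standardization removes differences in age structure before mortality rates are compared across areas or time periods. A minimal sketch of direct standardization, with made-up deaths, person-years and standard-population weights (none of these figures come from the presentation):

```python
# Minimal sketch of direct age standardization (hypothetical figures), the
# calculation behind an "age-standardized lung cancer mortality rate".

# (age group, deaths, person-years) observed in the study population.
observed = [("40-59", 30, 50_000), ("60-79", 120, 40_000), ("80+", 90, 10_000)]

# Standard population sizes for the same age groups (illustrative weights).
standard_pop = {"40-59": 60_000, "60-79": 30_000, "80+": 10_000}

weighted_sum = 0.0
for age_group, deaths, person_years in observed:
    age_specific_rate = deaths / person_years          # deaths per person-year
    weighted_sum += age_specific_rate * standard_pop[age_group]

asr = 100_000 * weighted_sum / sum(standard_pop.values())
print(f"Age-standardized mortality rate: {asr:.1f} per 100,000")
```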
 
So what will we do? Use an importance/feasibility matrix.
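The idea of the matrix is to score each candidate evaluation question on how important the answer would be and how feasible it is to obtain, then concentrate on the high-importance, high-feasibility quadrant. A minimal sketch with hypothetical questions; the 1-5 scales and cut-offs are assumptions, not from the presentation:

```python
# Minimal sketch (hypothetical scores): an importance/feasibility matrix
# for prioritizing evaluation questions.

candidates = {
    "Are clients satisfied with the service?": (5, 4),   # (importance, feasibility)
    "Has lung cancer mortality fallen by 10%?": (5, 2),
    "Is staff supply sufficient?": (3, 5),
    "Do staff enjoy the annual retreat?": (1, 5),
}

for question, (importance, feasibility) in sorted(
    candidates.items(), key=lambda kv: kv[1], reverse=True
):
    quadrant = (
        "do now" if importance >= 3 and feasibility >= 3
        else "plan/build capacity" if importance >= 3
        else "drop"
    )
    print(f"[{quadrant:>19}] I={importance} F={feasibility}  {question}")
```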
Face reality! Assess your strengths and weaknesses.
Eventually...
Phase C: Collecting and Analysing Information
Develop data collection instruments.
Pre-test data collection instruments.
Undertake data collection activities.
Analyse the data.
Interpret the data.
Developing a program logic model
A program logic model provides a framework for an evaluation. It is a flow chart that shows the program's components, the relationships between components, and the sequencing of events.
Use of IF-THEN Logic Model Statements
To support logic model development, a set of "IF-THEN" statements helps determine whether the rationale linking program inputs, outputs and objectives/outcomes is plausible, filling in the links in the chain of reasoning.
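As an illustration, a logic model's chain of reasoning can be written out as explicit IF-THEN links and challenged one link at a time. The smoking-cessation program below is a hypothetical example, not one from the presentation:

```python
# Minimal sketch (hypothetical program): expressing a logic model as a chain
# of IF-THEN links so the rationale can be read and questioned link by link.

logic_chain = [
    ("trained counsellors and quit-line funding are in place",
     "the quit-line can offer counselling sessions"),          # input -> activity
    ("the quit-line offers counselling sessions",
     "smokers in the target group complete counselling"),      # activity -> output
    ("smokers complete counselling",
     "more of them attempt to quit"),                          # output -> outcome
    ("more smokers attempt to quit",
     "smoking prevalence and related disease decline"),        # outcome -> impact
]

for condition, consequence in logic_chain:
    print(f"IF {condition},\n  THEN {consequence}.")
```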
CAT SOLO mnemonic
Next, the CAT elements of a logic model (Components, Activities and Target groups) can be examined, followed by the SOLO elements (Short-term Outcomes and LOng-term Outcomes).
Gathering Qualitative and Quantitative Information: Instruments
Qualitative tools: there are five frequently used data collection processes in qualitative evaluation (more than one method can be used):
1. Unobtrusive seeing, involving an observer who is not seen by those who are observed;
2. Participant observation, involving an observer who does not take part in an activity but is seen by the activity's participants;
3. Interviewing, involving a more active role for the evaluator, who poses questions to the respondent, usually one-on-one;
4. Group-based data collection processes, such as focus groups; and
5. Content analysis, which involves reviewing documents and transcripts to identify patterns within the material.
Quantitative tools: quantitative (numeric) information is obtained from various databases and can be expressed using statistics. Common sources include:
Surveys/questionnaires
Registries
Activity logs
Administrative records
Patient/client charts
Registration forms
Case studies
Attendance sheets
Pretesting or piloting...
Other monitoring and evaluation methods:
Biophysical measurements
Cost-benefit analysis
Sketch mapping
GIS mapping
Transects
Seasonal calendars
Most significant change method
Impact flow diagram (cause-effect diagram)
Institutional linkage diagram (Venn/chapati diagram)
Problem and objectives tree
Systems (inputs-outputs) diagram
Monitoring and evaluation wheel (spider web)
Spider Web Method: a visual index developed to identify the kinds of indicators/criteria that can be used to monitor change over the program period, presenting a 'before' and 'after' program/project picture. It is commonly used in participatory evaluation.
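A minimal sketch of such a wheel as a radar chart, assuming hypothetical criteria and 0-5 scores agreed with participants (matplotlib is used here only as one convenient way to draw it; nothing below comes from the presentation):

```python
# Minimal sketch (hypothetical scores): an M&E wheel ("spider web")
# comparing before- and after-program scores on participant-chosen criteria.
import math
import matplotlib.pyplot as plt

criteria = ["Awareness", "Access", "Utilization", "Satisfaction", "Staffing"]
before = [2, 1, 2, 3, 2]   # scores on a 0-5 scale agreed with participants
after = [4, 3, 4, 4, 3]

# One spoke per criterion; repeat the first point to close each polygon.
angles = [2 * math.pi * i / len(criteria) for i in range(len(criteria))]
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for label, scores in [("Before", before), ("After", after)]:
    ax.plot(angles, scores + scores[:1], label=label)
    ax.fill(angles, scores + scores[:1], alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(criteria)
ax.set_ylim(0, 5)
ax.legend()
plt.show()
```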
Phase D: Reporting Findings
Write the evaluation report.
Decide on the method of sharing the evaluation results and on communication strategies.
Share the draft report with stakeholders, revise as needed, and follow up.
Disseminate the evaluation report.
Example of suggested outline for an evaluation report
Phase E: Implementing Evaluation Recommendations
Develop a new/revised implementation plan in partnership with stakeholders.
Monitor the implementation of evaluation recommendations and report regularly on progress.
Plan the next evaluation.
 
References
WHO/UNFPA. Programme Manager's Planning, Monitoring & Evaluation Toolkit. Division for Oversight Services, August 2004.
Ontario Ministry of Health and Long-Term Care, Public Health Branch. In: The Health Communication Unit at the Centre for Health Promotion. Introduction to Evaluating Health Promotion Programs. November 23-24, 2007.
Donaldson SI, Gooler LE, Scriven M (2002). Strategies for managing evaluation anxiety: toward a psychology of program evaluation. American Journal of Evaluation 23(3), 261-272.
CIDA. "CIDA Evaluation Guide", Performance Review Branch, 2000.
OECD. "Improving Evaluation Practices: Best Practice Guidelines for Evaluation and Background Paper", 1999.
UNDP. "Results-Oriented Monitoring and Evaluation: A Handbook for Programme Managers", Office of Evaluation and Strategic Planning, New York, 1997.
UNICEF. "A UNICEF Guide for Monitoring and Evaluation: Making a Difference?", Evaluation Office, New York, 1991.
References (cont.)
UNICEF. "Evaluation Reports Standards", 2004.
USAID. "Performance Monitoring and Evaluation, TIPS #3: Preparing an Evaluation Scope of Work", 1996, and "TIPS #11: The Role of Evaluation in USAID", 1997, Center for Development Information and Evaluation. Available at http://www.dec.org/usaid_eval/#004
U.S. Centers for Disease Control and Prevention (CDC). "Framework for Program Evaluation in Public Health", 1999. Available at http://www.cdc.gov/eval/over.htm
U.S. Department of Health and Human Services, Administration on Children, Youth, and Families (ACYF). "The Program Manager's Guide to Evaluation", 1997.
Thank You
