Programme Evaluation
Conducting the Programme Evaluation
Puja Shrivastav
JRF (UGC)
If the Goal of Evaluation is…
… to improve a program
Then no evaluation is good unless
findings are used to make a difference
Programme Evaluation
 Any evaluation that examines and assesses
the implementation and effectiveness of
specific instructional activities, in order to
make adjustments or changes in those
activities, is often labeled "process or
programme evaluation".
 The focus of process evaluation
includes a description and assessment
of the curriculum, teaching methods
used, staff experience and performance,
in-service training, and adequacy of
equipment and facilities.
When to Conduct Evaluation?
 The stage of program development
influences the reason for program
evaluation.
o The design stage.
o The start-up stage.
o While the programme is in progress.
o After the programme wraps up.
o Long after the programme finishes.
Steps of conducting
evaluation
1. Planning for evaluation - identify the
problem, review programme goals.
2. Identify stakeholders and their needs -
identify and contact evaluation
stakeholders.
3. Determine the evaluation purpose -
revisit the purpose/objectives of the evaluation.
4. Decide who will evaluate - decide whether
the evaluation will be in-house or contracted
out.
5. Report results.
6. Justify conclusions.
1. Planning for Evaluation
Identify the problem and review the goals.
The mission and objectives of the
instructional program should be clearly focused.
Include information about the program's
purpose, expected effects, available
resources, stage of development, and
instructional context.
These descriptions set the frame of reference for all
subsequent planning decisions in the
evaluation.
Planning for Evaluation
o Determine data-collection methods,
o Create the data-collection instrument,
o Test the data-collection instrument,
o Evaluate the collected data,
o Summarize and analyze the data, and
o Prepare reports for stakeholders.
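The data-collection steps above can be sketched in code. A minimal illustration (the survey items, ratings, and respondents are hypothetical) of a simple Likert-style instrument and a summary prepared for stakeholders:

```python
from statistics import mean

# Hypothetical instrument: Likert items rated 1 (strongly disagree) to 5 (strongly agree).
INSTRUMENT = [
    "The teaching methods matched the curriculum goals.",
    "Equipment and facilities were adequate.",
]

def summarize(responses):
    """Average each item's ratings across respondents, rounded for reporting."""
    return {item: round(mean(r[i] for r in responses), 2)
            for i, item in enumerate(INSTRUMENT)}

# Three hypothetical respondents; one rating per instrument item.
responses = [(4, 3), (5, 4), (4, 2)]
print(summarize(responses))
```

Testing the instrument on a small pilot sample like this, before full data collection, matches the "test data-collection instrument" step above.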
Planning for Evaluation
Gather data
 Data gathering focuses on collecting
information that conveys a holistic
picture of the instructional program.
 Data gathering includes considerations
about which indicators, data sources, and
methods to use; the quality and quantity
of the information; human-subject
protections; and the context in which the
data gathering occurs.
Create an evaluation plan
 The evaluation plan outlines how to
implement the evaluation, including:
i. Identification of the sponsor and
resources available for implementing
the plan,
ii. What information is to be gathered,
iii. The research method(s) to be used,
iv. A description of the roles and
responsibilities of sponsors and
evaluators, and
v. A timeline for accomplishing tasks.
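As an illustration, items i-v above could be captured in a small structured plan. The field names and sample values below are hypothetical, not part of any prescribed format:

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    sponsor: str                 # i. sponsor of the evaluation
    resources: list              # i. resources available for implementing the plan
    information_needed: list     # ii. what information is to be gathered
    methods: list                # iii. research method(s) to be used
    roles: dict                  # iv. responsibilities of sponsors and evaluators
    timeline: dict               # v. tasks mapped to target dates

# Hypothetical example values.
plan = EvaluationPlan(
    sponsor="School board",
    resources=["two evaluators", "survey software"],
    information_needed=["student outcomes", "teacher feedback"],
    methods=["survey", "classroom observation"],
    roles={"evaluator": "collect and analyze data",
           "sponsor": "review interim reports"},
    timeline={"design instrument": "Week 1", "final report": "Week 12"},
)
```

Writing the plan down in one structure like this makes it easy for sponsors and evaluators to check that every element (i-v) has been decided before data collection begins.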
2. Identify stakeholders and their
needs
 Stakeholders are the individuals and
organizations involved in program
operations, those served or affected by
the program, and the intended users of
the assessment or evaluation.
 Stakeholder needs generally reflect the
central questions they have about
the instructional activity, innovation, or
program.
 Determining stakeholder needs helps to
focus the evaluation process so that the
results are of the greatest utility.
Three principal groups of
stakeholders
 Persons Involved in Program Operations
◦ Staff and Partners
 Persons affected or served by the
program
◦ Clients, their families and social networks,
providers and community groups
 Intended users of the evaluation findings
◦ Policy makers, managers, administrators,
advocates, funders, and others
3. Determining the evaluation
purpose
The general purposes for instructional
evaluations are:
a. Gain Insight -
o Assess needs and wants of community
members
o Identify barriers to use of the program
o Learn how to best describe and measure
program activities
b. Change Practice - to improve the quality,
effectiveness, or efficiency of instructional
activities.
Determining the evaluation purpose
◦ Refine plans for introducing a new practice
◦ Determine the extent to which plans were
implemented
◦ Improve educational materials
◦ Enhance cultural competence
◦ Verify that participants' rights are protected
◦ Set priorities for staff training
◦ Make mid-course adjustments
◦ Clarify communication
◦ Determine if client satisfaction can be improved
◦ Compare costs to benefits
◦ Find out which participants benefit most from the
program
◦ Mobilize community support for the program
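One of the purposes listed above, comparing costs to benefits, can be illustrated with simple arithmetic. The cost and benefit figures below are hypothetical:

```python
# Hypothetical programme costs and monetized benefits.
costs = {"staff time": 12000, "materials": 3000}
benefits = {"reduced remediation costs": 18000}

# A benefit-cost ratio above 1.0 suggests benefits outweigh costs.
bcr = sum(benefits.values()) / sum(costs.values())
print(round(bcr, 2))  # 1.2
```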
Determining the evaluation
purpose
c. Measure effects of the program - to examine
the relationship between instructional
activities and observed consequences
◦ Assess skills development by program
participants
◦ Compare changes in behavior over time
◦ Decide where to allocate new resources
◦ Document the level of success in
accomplishing objectives
◦ Demonstrate that accountability requirements
are fulfilled
◦ Use information from multiple evaluations to
predict the likely effects of similar programs
Determining the evaluation purpose
d. Effect on participants -
◦ Empower program participants (for example,
being part of an evaluation can increase
community members' sense of control over the
program);
◦ Supplement the program (for example, using a
follow-up questionnaire can reinforce the main
messages of the program);
◦ Promote staff development (for example, by
teaching staff how to collect, analyze, and
interpret evidence); or
◦ Contribute to organizational growth (for
example, the evaluation may clarify how the
program relates to the organization's mission).
Determining the evaluation purpose
◦ Reinforce messages of the program
◦ Stimulate dialogue and raise awareness
about community issues
◦ Broaden consensus among partners
about program goals
◦ Teach evaluation skills to staff and other
stakeholders
◦ Gather success stories
◦ Support organizational change and
improvement
Identify intended uses
 Intended uses are the specific ways
evaluation results will be applied.
 They are the underlying goals of the
evaluation, and are linked to the
central questions of the study that
identify the specific aspects of the
instructional program to be examined.
 The purpose, uses, and central
questions of an evaluation are all
closely related.
4. Decide who will evaluate
 Decide whether the evaluation will be
in-house or contracted out.
In-house: the principal, teachers,
students, or parents.
Contracted out: agencies can be hired to
help, or retired professionals from the
same field can also evaluate the
program.
5. Reporting Results
Analyze data
 Data analysis involves identifying
patterns in the data, either by isolating
important findings (analysis) or by
combining sources of information to
reach a larger understanding (synthesis),
and
 Making decisions about how to
organize, classify, interrelate, compare,
and display information.
 These decisions are guided by the
questions being asked, the types of data
available, and by input from
stakeholders.
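The analysis/synthesis distinction above can be shown with a small sketch. The pre/post test scores are hypothetical: analysis isolates an important finding within the data, while synthesis combines values to reach a larger understanding:

```python
from statistics import mean

# Hypothetical pre/post test scores by student, from two data sources.
pre = {"A": 55, "B": 60, "C": 48}
post = {"A": 70, "B": 66, "C": 59}

# Analysis: isolate an important finding (the student who gained most).
top_gainer = max(pre, key=lambda s: post[s] - pre[s])

# Synthesis: combine the sources into a larger understanding (average gain).
avg_gain = mean(post[s] - pre[s] for s in pre)

print(top_gainer, round(avg_gain, 1))  # A 10.7
```

How the results are then organized, classified, and displayed would be guided, as the slide says, by the questions being asked and by stakeholder input.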
Report results
 Factors to consider when reporting results, or
dissemination, include tailoring report content
for a specific audience, explaining the focus
of the study and its limitations, and listing
both the strengths and weaknesses of the
study.
 It may also include the reporting of active
follow-up and interim findings.
 Reporting interim findings is sometimes
useful to instructors or staff in making
immediate instructional adjustments.
Report results
 Describe the accomplishments of the
program, identifying those instructional
elements that were the most effective;
 Describe instructional elements that
were ineffective and problematic as
well as areas that need modifications
in the future; and
 Describe the outcomes or the impact
of the instructional unit on students.
Report results
 Complete documentation will make
the report useful for making decisions
about improving curriculum and
instructional strategies.
 In other words, the evaluation report
is a tool supporting decision making,
program improvement, accountability,
and quality control in curriculum.
 This will help in reframing the
curriculum, if needed.
Make conclusions and
recommendations
 Conclusions are linked to the evidence
gathered and judged against agreed-
upon standards set by stakeholders.
 Recommendations are actions for
consideration that are based on
conclusions but go beyond
simple judgments about efficacy
or interpretation of the evidence
gathered.
Justify the Conclusions
 Conclusions become justified when
they are linked to the evidence
gathered and judged against agreed-
upon values set by the stakeholders.
 Stakeholders must agree that
conclusions are justified in order to
use the evaluation results with
confidence.
The principal elements involved in
justifying conclusions are: standards,
analysis and synthesis, interpretation,
judgments, and recommendations.
Justify the Conclusions
 Standards- Standards reflect the values
held by stakeholders about the program.
They provide the basis to make program
judgments.
 Analysis and synthesis- Analysis and
synthesis are methods to discover and
summarize an evaluation's findings.
 Interpretation- Interpretation is the effort to
figure out what the findings mean;
uncovering facts about a program's
performance alone is not enough.
 Judgements- Judgments are statements
about the merit, worth, or significance of the
program.
 Recommendations- Recommendations are
actions for consideration that follow from
the conclusions.
Standards for Effective Evaluation

Standards:
 Utility
 Feasibility
 Propriety
 Accuracy

Steps:
 Engage stakeholders
 Describe the program
 Focus the evaluation design
 Gather credible evidence
 Justify conclusions
 Ensure use and share lessons learned
The Four Standards
 Utility: Who needs the information
and what information do they need?
 Feasibility: How much money, time,
and effort can we put into this?
 Propriety: What steps need to be
taken for the evaluation to be ethical?
 Accuracy: What design will lead to
accurate information?
Standard: Utility
Ensures that the information needs of
intended users are met.
 Who needs the evaluation findings?
 What do the users of the evaluation
need?
 Will the evaluation provide relevant
(useful) information in a timely
manner?
Standard: Feasibility
Ensures that evaluation is realistic,
prudent, diplomatic, and frugal.
 Are the planned evaluation activities
realistic given the time, resources, and
expertise at hand?
Standard: Propriety
Ensures the evaluation is conducted
legally, ethically, and with due regard
for the welfare of those involved and
those affected.
 Does the evaluation protect the rights of
individuals and protect the welfare of
those involved?
 Does it engage those most directly
affected by the program and by changes
in the program, such as participants or
the surrounding community?
Standard: Accuracy
 Ensures that the evaluation reveals
and conveys technically accurate
information.
Will the evaluation produce findings
that are valid and reliable, given the
needs of those who will use the
results?
Utilizing the Evaluation Result
 The evaluator records the actions, the features
and experiences of students, teachers and
administrators. People who read the report will
be able to visualise what the place looks like and
the processes taking place. Thus the reader
will understand the areas requiring
improvement.
 The evaluator interprets and explains the meaning
of the events reported by putting them in context. For
example, why academically weak students were
motivated to ask questions; why reading
comprehension skills improved; why enthusiasm
for doing science experiments increased and so
forth.
Utilization of Evaluation
Result
 Use of available resources -
organization of staff for learning,
administrative and physical
conditions.
 Decision areas of the teacher -
identifying the objectives, selecting the
teaching-learning process.
 Communication - carried out properly with
the stakeholders.
Utilization of Evaluation
Result
 The results show whether the
information needs of intended users
are met. If not, further
recommendations made by the
evaluator can be used.
 The feedback obtained could be used
to revise and improve instruction, or to
decide whether or not to adopt the
programme before full implementation.
 Development of the overall programme
or curriculum.
Editor's Notes

Standards overview slide: Good evaluations are conducted under the guidance of the standards for program evaluation provided by the Joint Committee on Standards for Educational Evaluation. There are four standards for program evaluation: Utility, Feasibility, Propriety, and Accuracy. These standards guide the decision-making process at each step of the framework to ensure that the evaluation stays focused and balanced. The following slides cover each standard in depth.