TEAM 4
MODULE 7: CHAPTER 17
G&M (Gilley & Maycunich)
What is evaluation to you?
Our text lists five program issues for evaluation:
● Quality
● Suitability
● Effectiveness
● Efficiency
● Importance
How do these components relate to your place of work? Can you
provide examples?
EVALUATION PRE-DISCUSSION
● Evaluation: estimating value
● Two steps:
1. Compare results with objectives
2. Appraise or judge the value of the differences assessed (a small sketch of these two steps follows below)
What is the difference between “measurement” and
“evaluation”?
EVALUATION
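To make the two steps concrete, here is a minimal sketch (our illustration, not from the text) that compares hypothetical results against objectives and then attaches a simple judgment to each difference:

```python
# Minimal sketch of the two evaluation steps: (1) compare results with
# objectives, (2) appraise the value of the differences.
# All measures and target values below are hypothetical.

objectives = {"completion_rate": 0.90, "post_test_score": 80, "learner_satisfaction": 4.0}
results    = {"completion_rate": 0.82, "post_test_score": 86, "learner_satisfaction": 4.3}

# Step 1: compare results with objectives
differences = {measure: results[measure] - objectives[measure] for measure in objectives}

# Step 2: appraise the differences (a deliberately simple judgment rule;
# a real evaluation would also weigh stakeholder priorities and context)
for measure, gap in differences.items():
    judgment = "met or exceeded" if gap >= 0 else "fell short of"
    print(f"{measure}: {judgment} the objective (difference = {gap:+.2f})")
```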
A systematic process, reliant upon multiple skills:
● Collecting information
● Interpreting data
● Drawing conclusions
● Communicating outcomes
The authors list two major purposes of evaluations.
What are they?
EVALUATION
ADDIE MODEL
1. Identify All Clients and Stakeholders and Clarify Their Needs
2. Identify the Performance Improvement Initiative to Be Evaluated
3. Identify and Clarify the Purposes for the Evaluation
4. Determine the Critical Research Questions That the Evaluation Must
Address
5. Develop an Evaluation Design
6. Analyze Resources and Constraints
7. Determine the Best Data Collection Methods
8. Plan Reporting and Communications Actions
Can one person explain a step of the process from their own
experience at work?
THE PROCESS
KIRKPATRICK’S FOUR LEVELS OF
EVALUATION MODEL
Most of us said that our organizations utilized the
Kirkpatrick model of evaluation, but in a limited
capacity (reaction and learning).
What do you think would have been an effective
method of incorporating Levels 3-4 (behavior and
impact)? (One possible sketch follows below.)
KIRKPATRICK
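As a discussion aid, here is a hypothetical sketch (ours, not Kirkpatrick's or the text's) of how an evaluation plan might pair each of the four levels with example measures, including the Level 3-4 data most of our organizations currently skip:

```python
# Hypothetical pairing of each Kirkpatrick level with example measures.
# The Level 3 and 4 entries illustrate one possible way to extend an
# evaluation beyond reaction and learning; they are assumptions, not
# recommendations from the text.

kirkpatrick_plan = {
    1: ("Reaction", ["post-session satisfaction survey"]),
    2: ("Learning", ["pre/post knowledge test", "skills demonstration"]),
    3: ("Behavior", ["on-the-job observation at 90 days", "supervisor checklist"]),
    4: ("Results/Impact", ["quality or safety metrics", "productivity trend vs. baseline"]),
}

for level, (focus, measures) in kirkpatrick_plan.items():
    print(f"Level {level} ({focus}): " + "; ".join(measures))
```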
● Helps training professionals to understand evaluation
in a systematic way
● A straightforward system for discussing training
outcomes
● Recognizes that single outcome measures cannot
adequately reflect the complexity of organizational
training programs (Bates, 2004)
Are there any other advantages of Kirkpatrick’s model
that you would like to add?
KIRKPATRICK ADVANTAGES
● Model is incomplete: it offers an oversimplified view of
training effectiveness, and other factors also influence
training outcomes, for example:
o learning culture of the organization
o organizational or work unit goals and values
o interpersonal support
o climate for learning transfer
o adequacy of material resources
(Bates, 2004)
KIRKPATRICK LIMITATIONS
● Assumption of Causal Linkages: the model assumes
that the criteria represent a causal relationship
between the levels of evaluation
o research has failed to confirm causal linkages
o “if training is going to be effective, it is
important that trainees react favorably” and
“without learning, no change in behavior will
occur” (Kirkpatrick, 1994)
(Bates, 2004)
KIRKPATRICK LIMITATIONS
● Incremental Importance of Information: the model assumes
that each level of evaluation provides data that is more
informative than the last
o perception that establishing Level 4 results provides
the most useful information
o “in practice, however, the weak conceptual
linkages inherent in the model and resulting data it
generates do not provide an adequate basis for this
assumption” (Bates, 2004)
KIRKPATRICK LIMITATIONS
“Evaluation is the systematic process of delineating,
obtaining, reporting, and applying descriptive and
judgmental information about some object’s merit,
worth, probity [moral correctness], feasibility, safety,
significance, or equity” (Stufflebeam & Shinkfield,
2007)
EVALUATION
Context-Input-Process-Product
STUFFLEBEAM’S CIPP MODEL (1983)
What needs to be done? Context
How should it be done? Input
Is it being done? Process
Did it succeed? Product
STUFFLEBEAM’S CIPP MODEL
Uses for the CIPP model:
● Conduct a needs analysis
● Evaluate alternatives for addressing needs
● Monitor the design and implementation of interventions
● Examine the outcomes of an intervention and its
impact on the organization
STUFFLEBEAM’S CIPP MODEL
● Addresses decision-makers’ concerns about justifying
the investment in interventions and initiatives
● Provides a framework for comparing alternatives
for future investments
COST BENEFIT MODEL (KEARSLEY, 1986)
● Provides the expected benefit or return on
investment
● Expressed as a percentage or in actual dollars
● Identify the dollar value of the intervention’s benefits,
then either divide by the cost (yielding a percentage or
ratio) or subtract the cost (yielding a net dollar figure);
a worked sketch follows below
RETURN ON INVESTMENT
● Helps to decide how to best allocate resources
● Disadvantage: most interventions or initiatives
provide benefits which are hard to quantify
RETURN ON INVESTMENT
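To make the arithmetic concrete, here is a small worked sketch with hypothetical figures; it shows both the cost-benefit style comparison (net dollars, benefit-cost ratio) and ROI expressed as a percentage:

```python
# Hypothetical figures for a single intervention; replace with real estimates.
benefits = 150_000.0   # estimated dollar value of the intervention's benefits
costs    = 100_000.0   # total cost of the intervention

net_benefit = benefits - costs                   # benefits minus costs, in dollars
bc_ratio    = benefits / costs                   # benefit-cost ratio ($ returned per $ spent)
roi_percent = (benefits - costs) / costs * 100   # return on investment, as a percentage

print(f"Net benefit:        ${net_benefit:,.0f}")   # $50,000
print(f"Benefit-cost ratio: {bc_ratio:.2f}")        # 1.50
print(f"ROI:                {roi_percent:.0f}%")    # 50%
```

Note that the hard-to-quantify benefits flagged above limit how much weight either number should carry.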
1. What do you think is the difference between the
Cost Benefit Model and the ROI model?
2. Provide examples of when these models should be
used.
QUESTIONS:
Seven-step model (a sample plan is sketched below)
1. Determine purpose, objectives, participants (who
wants this information)
2. Assess information needs
3. Consider proper protocol
4. Describe the population to be studied, select subjects
5. Identify other variables
6. Formulate a study design
7. Formulate a management plan
FORMATIVE EVALUATIONS
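Purely as an illustration, the seven steps can be captured as a simple study-plan outline; every entry below is hypothetical and is only meant to show what a completed plan might record:

```python
# Hypothetical formative-evaluation study plan following the seven steps above.
study_plan = {
    "1. Purpose, objectives, participants": "Improve the pilot course; sponsor: program director",
    "2. Information needs": "Comprehension gaps, pacing problems, usability issues",
    "3. Protocol": "Internal quality-improvement review; consent obtained for interviews",
    "4. Population and subjects": "First-cohort learners; purposive sample of 12",
    "5. Other variables": "Prior experience, delivery format, facilitator",
    "6. Study design": "Mixed methods: embedded quizzes plus think-aloud observations",
    "7. Management plan": "Timeline, responsibilities, and reporting schedule",
}

for step, detail in study_plan.items():
    print(f"{step}: {detail}")
```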
● Most useful for evaluating instruction
● May be used for performance improvement
and change interventions
FORMATIVE EVALUATIONS
“...evaluative inquiry can not only be a means of
accumulating information for decision making and
action...but that it also be equally concerned with
questioning and debating the value of what we do
in organizations” (Preskill and Torres, 1999)
EVALUATIVE INQUIRY
Evaluative inquiry is a way of fostering individual
learning and team learning within an organization,
about issues that are critical to its purpose and what
it values (Parsons, 2009)
EVALUATIVE INQUIRY
● Collaboration
● Organizational learning and change
● Links learning and performance
● Diverse perspectives
EVALUATIVE INQUIRY
● A study is needed to evaluate and redesign an
online master’s degree program consisting of 12
courses in informatics
● Educators are concerned about the quality of
online education courses
● Meaningful assessment is essential for improving
quality of such programs
CASE STUDY FOR EVALUATION
Considering the evaluation models that we have
discussed:
1. Which model(s) would you consider appropriate
for this case? Why?
2. Design an evaluation program to include Steps 1-5
as described by Rothwell and Kazanas (Gilley &
Maycunich, pp. 430-432)
CASE STUDY FOR EVALUATION
Bates, R. (2004). A critical analysis of evaluation practice: The Kirkpatrick model and the
principle of beneficence. Evaluation and Program Planning, 27(3), 341-347.
Parsons, B. (2009). Evaluative inquiry for complex times. OD Practitioner, 41(1).
Preskill, H., & Torres, R. T. (1999). Building capacity for organizational learning through
evaluative inquiry. Evaluation, 5(1), 42-60.
REFERENCES
