McGraw-Hill/Irwin © 2005 The McGraw-Hill Companies, Inc. All rights reserved.
Chapter 6
Training Evaluation
Introduction (1 of 2)
Training effectiveness refers to the benefits that the company and the trainees receive from training.
Training outcomes or criteria refer to the measures that the trainer and the company use to evaluate training programs.
Introduction (2 of 2)
Training evaluation refers to the process of collecting the outcomes needed to determine whether training is effective.
Evaluation design refers to from whom, what, when, and how the information needed to determine the effectiveness of the training program will be collected.
Reasons for Evaluating Training (1 of 2)
Companies are investing millions of dollars in training programs to help gain a competitive advantage.
Training investment is increasing because learning creates knowledge, which differentiates between those companies and employees that are successful and those that are not.
Reasons for Evaluating Training (2 of 2)
Because companies have made large dollar investments in training and education and view training as a strategy for success, they expect the outcomes or benefits of training to be measurable.
Training evaluation provides the data needed to demonstrate that training does provide benefits to the company.
Formative Evaluation
Formative evaluation – evaluation conducted to improve the training process
Helps to ensure that:
the training program is well organized and runs smoothly
trainees learn and are satisfied with the program
Provides information about how to make the program better
Summative Evaluation
Summative evaluation – evaluation conducted to determine the extent to which trainees have changed as a result of participating in the training program
May also measure the return on investment (ROI) that the company receives from the training program
Why Should A Training Program Be Evaluated? (1 of 2)
To identify the program’s strengths and weaknesses
To assess whether the content, organization, and administration of the program contribute to learning and the use of training content on the job
To identify which trainees benefited most or least from the program
Why Should A Training Program Be Evaluated? (2 of 2)
To gather data to assist in marketing training programs
To determine the financial benefits and costs of the programs
To compare the costs and benefits of training versus non-training investments
To compare the costs and benefits of different training programs in order to choose the best program
The Evaluation Process
Conduct a Needs Analysis
Develop Measurable Learning Outcomes and Analyze Transfer of Training
Develop Outcome Measures
Choose an Evaluation Strategy
Plan and Execute the Evaluation
Training Outcomes: Kirkpatrick’s Four-Level Framework of Evaluation Criteria
Level 1 – Reactions – Trainee satisfaction
Level 2 – Learning – Acquisition of knowledge, skills, attitudes, behavior
Level 3 – Behavior – Improvement of behavior on the job
Level 4 – Results – Business results achieved by trainees
Outcomes Used in Evaluating Training Programs (1 of 4)
1. Cognitive Outcomes
2. Skill-Based Outcomes
3. Affective Outcomes
4. Return on Investment
5. Results
Outcomes Used in Evaluating Training Programs (2 of 4)
Cognitive Outcomes
Determine the degree to which trainees are familiar with the principles, facts, techniques, procedures, or processes emphasized in the training program
Measure what knowledge trainees learned in the program
Skill-Based Outcomes
Assess the level of technical or motor skills
Include acquisition or learning of skills and use of skills on the job
Outcomes Used in Evaluating Training Programs (3 of 4)
Affective Outcomes
Include attitudes and motivation
Trainees’ perceptions of the program, including the facilities, trainers, and content
Results
Determine the training program’s payoff for the company
Outcomes Used in Evaluating Training Programs (4 of 4)
Return on Investment (ROI)
Comparing the training’s monetary benefits with the cost of the training
direct costs
indirect costs
benefits
How do you know if your outcomes are good?
Good training outcomes need to be:
Relevant
Reliable
Discriminative
Practical
Good Outcomes: Relevance
Criterion relevance – the extent to which training outcomes are related to the learned capabilities emphasized in the training program
Criterion contamination – the extent to which training outcomes measure inappropriate capabilities or are affected by extraneous conditions
Criterion deficiency – the failure to measure training outcomes that were emphasized in the training objectives
Criterion deficiency, relevance, and contamination (figure):
The figure shows two overlapping sets of outcomes: outcomes measured in the evaluation, and outcomes identified by the needs assessment and included in the training objectives.
Contamination – outcomes measured in the evaluation that are not related to the training objectives
Relevance – the overlap: measured outcomes that are related to the training objectives
Deficiency – outcomes related to the training objectives that are not measured in the evaluation
Good Outcomes (continued)
Reliability – the degree to which outcomes can be measured consistently over time
Discrimination – the degree to which trainees’ performance on the outcome actually reflects true differences in performance
Practicality – the ease with which the outcome measures can be collected
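As a concrete illustration of the reliability idea, one common approach (not prescribed by the slides) is to correlate scores from the same measure administered to the same trainees at two points in time. The Python sketch below uses made-up scores; the data and the test-retest method are assumptions for illustration only.

    # Hypothetical scores for six trainees on the same knowledge test, two weeks apart
    time1 = [72, 85, 60, 90, 78, 66]
    time2 = [70, 88, 63, 87, 80, 64]

    def pearson_r(x, y):
        """Pearson correlation, a common index of test-retest consistency."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    # A value near 1.0 suggests the outcome is measured consistently over time
    print(f"Test-retest correlation: {pearson_r(time1, time2):.2f}")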
Training Evaluation Practices
Percentage of courses using each outcome (chart): Reaction 79%, Cognitive 38%, Behavior 15%, Results 9%
Training Program Objectives and Their Implications for Evaluation
Reactions: Did trainees like the program? Did the environment help learning? Was the material meaningful?
Cognitive: Pencil-and-paper tests
Skill-Based (learning): Performance on a work sample; performance on work equipment
Skill-Based (transfer): Ratings by peers or managers based on observation of behavior
Affective: Trainees’ motivation or job attitudes
Results: Did the company benefit through sales, quality, productivity, and reduced accidents and complaints?
Evaluation Designs: Threats to Validity
Threats to validity refer to factors that lead one to question either:
The believability of the study results (internal validity), or
The extent to which the evaluation results are generalizable to other groups of trainees and situations (external validity)
Threats to Validity
Threats to internal validity: company, persons, outcome measures
Threats to external validity: reaction to pretest, reaction to evaluation, interaction of selection and training, interaction of methods
Methods to Control for Threats to Validity
Pre- and Posttests
Use of Comparison Groups
Random Assignment
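To make the third method concrete, here is a minimal sketch of random assignment in Python; the employee names and the even two-way split are hypothetical, used only to show the mechanics.

    import random

    # Hypothetical pool of employees eligible for the program
    employees = ["Ana", "Ben", "Chen", "Dee", "Eli", "Fay", "Gia", "Hal"]

    random.shuffle(employees)          # randomize the order
    half = len(employees) // 2
    trainees = employees[:half]        # randomly assigned to training
    comparison = employees[half:]      # randomly assigned to the comparison group

    print("Trainees:  ", trainees)
    print("Comparison:", comparison)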
Types of Evaluation Designs
Posttest-Only
Pretest/Posttest
Posttest-Only with Comparison Group
Pretest/Posttest with Comparison Group
Time Series
Time Series with Comparison Group and Reversal
Solomon Four-Group
Comparison of Evaluation Designs (1 of 2)
Design | Groups | Pre-training measure | Post-training measure | Cost | Time | Strength
Posttest-Only | Trainees | No | Yes | Low | Low | Low
Pretest/Posttest | Trainees | Yes | Yes | Low | Low | Medium
Posttest-Only with Comparison Group | Trainees and comparison | No | Yes | Medium | Medium | Medium
Pretest/Posttest with Comparison Group | Trainees and comparison | Yes | Yes | Medium | Medium | High
Comparison of Evaluation Designs (2 of 2)
Design | Groups | Pre-training measure | Post-training measure | Cost | Time | Strength
Time Series | Trainees | Yes | Yes, several | Medium | Medium | Medium
Time Series with Comparison Group and Reversal | Trainees and comparison | Yes | Yes, several | High | Medium | High
Solomon Four-Group | Trainees A, Trainees B, Comparison A, Comparison B | Yes / No / Yes / No | Yes / Yes / Yes / Yes | High | High | High
Example of a Pretest/Posttest Comparison Group Design
Group | Pre-training | Training | Post-training Time 1 | Post-training Time 2
Lecture | Yes | Yes | Yes | Yes
Self-Paced | Yes | Yes | Yes | Yes
Behavior Modeling | Yes | Yes | Yes | Yes
No Training (Comparison) | Yes | No | Yes | Yes
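One common way to analyze data from a design like this (not the only way, and not specified in the slides) is to compare the average gain score of a trained group against that of the untrained comparison group. The Python sketch below uses invented test scores; group names and numbers are assumptions for illustration.

    # Hypothetical pretest and posttest knowledge-test scores (0-100)
    lecture_pre, lecture_post = [62, 58, 70, 65], [80, 76, 85, 82]
    comparison_pre, comparison_post = [60, 63, 68, 66], [64, 65, 70, 69]

    def mean_gain(pre, post):
        """Average improvement from pretest to posttest."""
        return sum(b - a for a, b in zip(pre, post)) / len(pre)

    lecture_gain = mean_gain(lecture_pre, lecture_post)            # gain for the trained group
    comparison_gain = mean_gain(comparison_pre, comparison_post)   # gain without training

    # Improvement attributable to training = trained gain minus comparison gain
    print(f"Estimated training effect: {lecture_gain - comparison_gain:.1f} points")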
Example of a Solomon Four-Group Design
Group | Pretest | Training | Posttest
Group 1 | Yes | IL-based | Yes
Group 2 | Yes | Traditional | Yes
Group 3 | No | IL-based | Yes
Group 4 | No | Traditional | Yes
Factors That Influence the Type of Evaluation Design
Factor – How the factor influences the type of evaluation design
Change potential – Can the program be modified?
Importance – Does ineffective training affect customer service, product development, or relationships between employees?
Scale – How many trainees are involved?
Purpose of training – Is training conducted for learning, results, or both?
Organization culture – Is demonstrating results part of company norms and expectations?
Expertise – Can a complex study be analyzed?
Cost – Is evaluation too expensive?
Time frame – When do we need the information?
Conditions for choosing a rigorous evaluation design (1 of 2)
1. The evaluation results can be used to change the program
2. The training program is ongoing and has the potential to affect many employees (and customers)
3. The training program involves multiple classes and a large number of trainees
4. Cost justification for training is based on numerical indicators
Conditions for choosing a rigorous evaluation design (2 of 2)
5. You or others have the expertise to design the evaluation study and analyze the data it collects
6. The cost of training creates a need to show that it works
7. There is sufficient time for conducting an evaluation
8. There is interest in measuring change from pre-training levels or in comparing two or more different programs
Importance of Training Cost Information
To understand total expenditures for training, including direct and indirect costs
To compare the costs of alternative training programs
To evaluate the proportion of money spent on training development, administration, and evaluation, as well as to compare monies spent on training for different groups of employees
To control costs
To calculate return on investment (ROI), follow these steps (1 of 2)
1. Identify outcome(s) (e.g., quality, accidents)
2. Place a value on the outcome(s)
3. Determine the change in performance after eliminating other potential influences on training results
4. Obtain an annual amount of benefits (operational results) from training by comparing results after training to results before training (in dollars)
To calculate return on investment (ROI), follow these steps (2 of 2)
5. Determine training costs (direct costs + indirect costs + development costs + overhead costs + compensation for trainees)
6. Calculate the total savings by subtracting the training costs from the benefits (operational results)
7. Calculate the ROI by dividing the benefits (operational results) by the costs
The ROI gives you an estimate of the dollar return expected from each dollar invested in training.
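To make steps 5 through 7 concrete, here is a minimal Python sketch; every cost and benefit figure is hypothetical, chosen only to show the arithmetic, not taken from the text.

    # Step 4 (given): annual benefits from training, in dollars
    annual_benefits = 220_000

    # Step 5: total training costs
    direct_costs = 40_000
    indirect_costs = 10_000
    development_costs = 25_000
    overhead_costs = 5_000
    trainee_compensation = 20_000
    total_costs = (direct_costs + indirect_costs + development_costs
                   + overhead_costs + trainee_compensation)

    # Step 6: total savings = benefits minus costs
    total_savings = annual_benefits - total_costs

    # Step 7: ROI = benefits divided by costs (dollars returned per dollar invested)
    roi = annual_benefits / total_costs

    print(f"Total costs:   ${total_costs:,}")
    print(f"Total savings: ${total_savings:,}")
    print(f"ROI: {roi:.1f}:1")

With these illustrative numbers, total costs are $100,000, total savings are $120,000, and the program returns about $2.20 for every dollar invested (an ROI of 2.2:1).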
Determining Costs for a Cost-Benefit Analysis
Direct Costs
Indirect Costs
Development Costs
Overhead Costs
Compensation for Trainees
Example of Return on Investment
Industry | Training program | ROI
Bottling company | Workshops on managers’ roles | 15:1
Large commercial bank | Sales training | 21:1
Electric & gas utility | Behavior modification | 5:1
Oil company | Customer service | 4.8:1
Health maintenance organization | Team training | 13.7:1