Evaluating HRD Programs
Effectiveness The degree to which a training (or other HRD) program achieves its intended purpose Measures are relative to some starting point Measures how well the desired goal is achieved
Evaluation
HRD Evaluation Textbook definition: “The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.”
In Other Words… Are we training: the right people, the right “stuff”, the right way, with the right materials, at the right time?
Evaluation Needs Descriptive and judgmental information needed Objective and subjective data Information gathered according to a plan and in a desired format Gathered to provide decision-making information
Purposes of Evaluation Determine whether the program is meeting the intended objectives Identify strengths and weaknesses Determine cost-benefit ratio Identify who benefited most or least Determine future participants Provide information for improving HRD programs
Purposes of Evaluation – 2 Reinforce major points to be made Gather marketing information Determine if training program is appropriate Establish management database
Evaluation Bottom Line Is HRD a revenue contributor or a revenue user? Is HRD credible to line and upper-level managers? Are benefits of HRD readily evident to all?
How Often are HRD Evaluations Conducted? Not often enough!!! Frequently, only end-of-course participant reactions are collected Transfer to the workplace is evaluated less frequently
Why HRD Evaluations are Rare Reluctance to have HRD programs evaluated Evaluation needs expertise and resources Factors other than HRD cause performance improvements – e.g., the economy, equipment, policies, etc.
Need for HRD Evaluation Shows the value of HRD Provides metrics for HRD efficiency Demonstrates value-added approach for HRD Demonstrates accountability for HRD activities Everyone else has it… why not HRD?
Make or Buy Evaluation “I bought it, therefore it is good.” “Since it’s good, I don’t need to post-test.” Who says it’s: Appropriate? Effective? Timely? Transferable to the workplace?
Evolution of Evaluation Efforts Anecdotal approach – talk to other users Try before buy – borrow and use samples Analytical approach – match research data to training needs Holistic approach – look at overall HRD process, as well as individual training
Models and Frameworks of Evaluation Table 7-1 lists six frameworks for evaluation The most popular is that of D. Kirkpatrick: Reaction Learning Job Behavior Results
Kirkpatrick’s Four Levels Reaction Focuses on trainees’ reactions Learning Did they learn what they were supposed to? Job Behavior Was it used on the job? Results Did it improve the organization’s effectiveness?
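Because the four levels form an ordered hierarchy, they are easy to model in code. Below is a minimal illustrative sketch (Python; the class and function names are my own, not from the text) that records which levels a program was evaluated at and reports how far up the hierarchy the evaluation reached.

```python
from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    """Kirkpatrick's four evaluation levels, lowest to highest."""
    REACTION = 1      # Did trainees like the training?
    LEARNING = 2      # Did they learn what was intended?
    JOB_BEHAVIOR = 3  # Was the training used on the job?
    RESULTS = 4       # Did organizational effectiveness improve?

def deepest_level_evaluated(levels_measured):
    """Return the highest Kirkpatrick level at which a program was evaluated."""
    return max(levels_measured, default=None)

# Example: a program evaluated only with reaction sheets and a learning test
print(deepest_level_evaluated([KirkpatrickLevel.REACTION,
                               KirkpatrickLevel.LEARNING]).name)  # LEARNING
```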
Issues Concerning Kirkpatrick’s Framework Most organizations don’t evaluate at all four levels Focuses only on post-training Doesn’t treat inter-stage improvements WHAT ARE YOUR THOUGHTS?
A Suggested Framework – 1 Reaction Did trainees like the training? Did the training seem useful? Learning How much did they learn? Behavior What behavior change occurred?
Suggested Framework – 2 Results What were the tangible outcomes? What was the return on investment (ROI)? What was the contribution to the organization?
Data Collection for HRD Evaluation Possible methods: Interviews Questionnaires Direct observation Written tests Simulation/Performance tests Archival performance information
Interviews Advantages: Flexible Opportunity for clarification Depth possible Personal contact Limitations: High reactive effects High cost Face-to-face threat potential Labor intensive Trained interviewers needed
Questionnaires Advantages: Low cost to administer Honesty increased Anonymity possible Respondent sets the pace Variety of options Limitations: Possible inaccurate data Response conditions not controlled Respondents set varying paces Uncontrolled return rate
Direct Observation Advantages: Nonthreatening Excellent way to measure behavior change Limitations: Possibly disruptive Reactive effects are possible May be unreliable Need trained observers
Written Tests Advantages: Low purchase cost Readily scored Quickly processed Easily administered Wide sampling possible Limitations: May be threatening Possibly no relation to job performance Measures only cognitive learning Relies on norms Concern for racial/ethnic bias
Simulation/Performance Tests Advantages: Reliable Objective Close relation to job performance Includes cognitive, psychomotor, and affective domains Limitations: Time-consuming Simulations often difficult to create High costs to develop and use
Archival Performance Data Advantages: Reliable Objective Job-based Easy to review Minimal reactive effects Limitations: Criteria for keeping/discarding records Information system discrepancies Indirect Not always usable Records prepared for other purposes
Choosing Data Collection Methods Reliability Consistency of results, and freedom from collection method bias and error Validity Does the device measure what we want to measure? Practicality Does it make sense in terms of the resources used to get the data?
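Reliability in particular lends itself to a quick numeric check. The sketch below is my own illustration, not from the text, and assumes Python 3.10+ for statistics.correlation: it estimates test-retest reliability by correlating scores from two administrations of the same instrument, where values near 1.0 suggest consistent measurement.

```python
from statistics import correlation  # requires Python 3.10+

# Hypothetical scores from the same six trainees, tested twice
first_administration  = [72, 85, 90, 65, 78, 88]
second_administration = [70, 87, 91, 66, 75, 90]

# Pearson r between the two administrations approximates test-retest reliability
r = correlation(first_administration, second_administration)
print(f"Test-retest reliability estimate: r = {r:.2f}")
```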
Type of Data Used/Needed Individual performance Systemwide performance Economic
Individual Performance Data Individual knowledge Individual behaviors Examples: Test scores Performance quantity, quality, and timeliness Attendance records Attitudes
Systemwide Performance Data Productivity Scrap/rework rates Customer satisfaction levels On-time performance levels Quality rates and improvement rates
Economic Data Profits Product liability claims Avoidance of penalties Market share Competitive position Return on investment (ROI) Financial utility calculations
Use of Self-Report Data Most common method Pre-training and post-training data Problems: Mono-method bias Desire to be consistent between tests Socially desirable responses Response shift bias: training changes trainees’ internal standards, so pre- and post-training self-ratings are not directly comparable
Research Design Specifies in advance: the expected results of the study the methods of data collection to be used how the data will be analyzed
Research Design Issues Pretest and Posttest Shows trainee what training has accomplished Helps eliminate pretest knowledge bias Control Group Compares performance of group with training against the performance of a similar group without training
Recommended Research Design Pretest and posttest with control group Whenever possible: Randomly assign individuals to the test group and the control group to minimize bias Use “time-series” approach to data collection to verify performance improvement is due to training
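To make the recommended design concrete, here is a minimal sketch (Python; all names and scores are hypothetical) of random assignment followed by a difference-in-differences estimate, which nets out pretest knowledge and external factors that affect both groups equally.

```python
import random

def assign_groups(employees, seed=42):
    """Randomly split employees into training and control groups."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    shuffled = employees[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

def training_effect(pre_trained, post_trained, pre_control, post_control):
    """Difference-in-differences: trained group's gain minus control group's gain."""
    mean = lambda xs: sum(xs) / len(xs)
    gain_trained = mean(post_trained) - mean(pre_trained)
    gain_control = mean(post_control) - mean(pre_control)
    return gain_trained - gain_control

trained, control = assign_groups(["Ana", "Ben", "Caro", "Dev", "Eli", "Fay"])
effect = training_effect([60, 62, 58], [80, 84, 78],   # trained pre/post
                         [61, 59, 63], [66, 64, 68])   # control pre/post
print(f"Estimated training effect: {effect:.1f} points")  # ~15.7
```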
Ethical Issues Concerning Evaluation Research Confidentiality Informed consent Withholding training from control groups Use of deception Pressure to produce positive results
Assessing the Impact of HRD Money is the language of business. You MUST talk dollars, not HRD jargon. No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control group data.”
HRD Program Assessment HRD programs and training are investments Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers You must prove your worth to the organization Or you’ll have to find another organization…
Evaluation of Training Costs Cost-benefit analysis Compares cost of training to benefits gained such as attitudes, reduction in accidents, reduction in employee sick-days, etc. Cost-effectiveness analysis Focuses on increases in quality, reduction in scrap/rework, productivity, etc.
Return on Investment Return on investment = Results/Costs
Calculating Training Return On Investment

| Operational Results Area | How Measured | Results Before Training | Results After Training | Differences (+ or –) | Expressed in $ |
|---|---|---|---|---|---|
| Quality of panels | % rejected | 2% rejected (1,440 panels per day) | 1.5% rejected (1,080 panels per day) | 0.5% (360 panels) | $720 per day; $172,800 per year |
| Housekeeping | Visual inspection using 20-item checklist | 10 defects (average) | 2 defects (average) | 8 defects | Not measurable in $ |
| Preventable accidents | Number of accidents | 24 per year | 16 per year | 8 per year | |
| | Direct cost of accidents | $144,000 per year | $96,000 per year | $48,000 | $48,000 per year |

Total savings: $220,800
ROI = Operational Results ÷ Training Costs = $220,800 ÷ $32,564 = 6.8
SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
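The arithmetic behind the table reduces to a few lines. This sketch simply reproduces the slide’s figures (only the variable names are mine); note that only the dollar-measurable benefits enter the ROI.

```python
# Annual dollar benefits from the Robinson & Robinson example above
annual_savings = {
    "panel quality": 172_800,         # 360 fewer rejected panels/day at $720/day
    "preventable accidents": 48_000,  # direct accident costs avoided
}
total_benefits = sum(annual_savings.values())  # $220,800
training_costs = 32_564

roi = total_benefits / training_costs
print(f"ROI = ${total_benefits:,} / ${training_costs:,} = {roi:.1f}")  # 6.8
```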
Types of Training Costs Direct costs Indirect costs Development costs Overhead costs Compensation for participants
Direct Costs Instructor Base pay Fringe benefits Travel and per diem Materials Classroom and audiovisual equipment Travel Food and refreshments
Indirect Costs Training management Clerical/Administrative Postal/shipping, telephone, computers, etc. Pre- and post-learning materials Other overhead costs
Development Costs Fee to purchase program Costs to tailor program to organization Instructor training costs
Overhead Costs General organization support Top management participation Utilities, facilities General and administrative costs, such as HRM
Compensation for Participants Participants’ salary and benefits for time away from job Travel, lodging, and per-diem costs
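Summing the five categories gives the training-cost denominator used in ROI. The sketch below is hypothetical: every figure is invented, chosen only so the total matches the $32,564 training cost in the Robinson & Robinson example above.

```python
# Hypothetical training cost breakdown (all figures invented)
costs = {
    "direct":        12_000,  # instructor pay/benefits, materials, equipment, food
    "indirect":       4_500,  # admin support, shipping, pre/post-learning materials
    "development":    8_000,  # purchase fee, tailoring, instructor training
    "overhead":       3_000,  # facilities, utilities, G&A support
    "compensation":   5_064,  # participants' salary/benefits while away from the job
}
total_training_cost = sum(costs.values())
print(f"Total training cost: ${total_training_cost:,}")  # $32,564
```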
Measuring Benefits Change in quality per unit measured in dollars Reduction in scrap/rework measured in dollar cost of labor and materials Reduction in preventable accidents measured in dollars ROI = Benefits/Training costs
Utility Analysis Uses a statistical approach to support claims of training effectiveness:
U = (N)(T)(d_t)(SD_y) – C
where: N = number of trainees; T = length of time benefits are expected to last; d_t = true performance difference resulting from training; SD_y = dollar value of untrained job performance (in standard deviation units); C = cost of training
Critical Information for Utility Analysis
d_t = difference in units produced between trained and untrained groups, divided by the standard deviation in units produced by the trained group
SD_y = standard deviation in dollars, or overall productivity of the organization
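The utility formula translates directly into a function. The inputs below are hypothetical; none of the figures come from the text.

```python
def utility(n, t, d_t, sd_y, c):
    """Dollar utility of training: U = (N)(T)(d_t)(SD_y) - C."""
    return n * t * d_t * sd_y - c

# Hypothetical: 50 trainees, benefits lasting 2 years, a 0.4 SD true
# performance gain, SD_y of $10,000, and a $60,000 program cost
u = utility(n=50, t=2, d_t=0.4, sd_y=10_000, c=60_000)
print(f"U = ${u:,.0f}")  # $340,000
```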
Ways to Improve HRD Assessment Walk the walk, talk the talk: MONEY Involve HRD in strategic planning Involve management in HRD planning and estimation efforts Gain mutual ownership Use credible and conservative estimates Share credit for successes and blame for failures
HRD Evaluation Steps Analyze needs. Determine explicit evaluation strategy. Insist on specific and measurable training objectives. Obtain participant reactions. Develop criterion measures/instruments to measure results. Plan and execute evaluation strategy.
Summary Training results must be measured against costs Training must contribute to the “bottom line” HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster
