1
Evaluating HRD Programs
Professor Jayashree Sadri and Dr. Sorab Sadri

2
Effectiveness
• The degree to which a training (or other HRD program) achieves its intended purpose
• Measures are relative to some starting point
• Measures how well the desired goal is achieved
3
Evaluation

4
HRD Evaluation
Textbook definition:
"The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities."
5
In Other Words…
Are we training:
• the right people
• the right "stuff"
• the right way
• with the right materials
• at the right time?

6
Evaluation Needs
• Descriptive and judgmental information needed
– Objective and subjective data
• Information gathered according to a plan and in a desired format
• Gathered to provide decision-making information
7
Purposes of Evaluation
• Determine whether the program is meeting the intended objectives
• Identify strengths and weaknesses
• Determine cost-benefit ratio
• Identify who benefited most or least
• Determine future participants
• Provide information for improving HRD programs

8
Purposes of Evaluation – 2
• Reinforce major points to be made
• Gather marketing information
• Determine if the training program is appropriate
• Establish management database
9
Evaluation Bottom Line
• Is HRD a revenue contributor or a revenue user?
• Is HRD credible to line and upper-level managers?
• Are benefits of HRD readily evident to all?

10
How Often are HRD Evaluations Conducted?
• Not often enough!!!
• Frequently, only end-of-course participant reactions are collected
• Transfer to the workplace is evaluated less frequently
11
Why HRD Evaluations are Rare
• Reluctance to have HRD programs evaluated
• Evaluation needs expertise and resources
• Factors other than HRD cause performance improvements – e.g.,
– Economy
– Equipment
– Policies, etc.

12
Need for HRD Evaluation
• Shows the value of HRD
• Provides metrics for HRD efficiency
• Demonstrates a value-added approach for HRD
• Demonstrates accountability for HRD activities
• Everyone else has it… why not HRD?
13
Make or Buy Evaluation
• "I bought it, therefore it is good."
• "Since it's good, I don't need to post-test."
• Who says it's:
– Appropriate?
– Effective?
– Timely?
– Transferable to the workplace?

14
Evolution of Evaluation Efforts
1. Anecdotal approach – talk to other users
2. Try before buy – borrow and use samples
3. Analytical approach – match research data to training needs
4. Holistic approach – look at the overall HRD process, as well as individual training
15
Models and Frameworks of Evaluation
• Table 7-1 lists six frameworks for evaluation
• The most popular is that of D. Kirkpatrick:
– Reaction
– Learning
– Job Behavior
– Results

16
Kirkpatrick's Four Levels
• Reaction
– Focus on trainees' reactions
• Learning
– Did they learn what they were supposed to?
• Job Behavior
– Was it used on the job?
• Results
– Did it improve the organization's effectiveness?
17
Issues Concerning Kirkpatrick's Framework
• Most organizations don't evaluate at all four levels
• Focuses only on post-training
• Doesn't treat inter-stage improvements
• WHAT ARE YOUR THOUGHTS?

18
Other Frameworks/Models
• CIPP: Context, Input, Process, Product (Galvin, 1983)
• Brinkerhoff (1987):
– Goal setting
– Program design
– Program implementation
– Immediate outcomes
– Usage outcomes
– Impacts and worth
19
Other Frameworks/Models – 2
• Kraiger, Ford, & Salas (1993):
– Cognitive outcomes
– Skill-based outcomes
– Affective outcomes
• Holton (1996): Five Categories:
– Secondary Influences
– Motivation Elements
– Environmental Elements
– Outcomes
– Ability/Enabling Elements

20
Other Frameworks/Models – 3
• Phillips (1996):
– Reaction and Planned Action
– Learning
– Applied Learning on the Job
– Business Results
– ROI
21
A Suggested Framework – 1
• Reaction
– Did trainees like the training?
– Did the training seem useful?
• Learning
– How much did they learn?
• Behavior
– What behavior change occurred?

22
Suggested Framework – 2
• Results
– What were the tangible outcomes?
– What was the return on investment (ROI)?
– What was the contribution to the organization?
23
Data Collection for HRD Evaluation
Possible methods:
• Interviews
• Questionnaires
• Direct observation
• Written tests
• Simulation/Performance tests
• Archival performance information

24
Interviews
Advantages:
• Flexible
• Opportunity for clarification
• Depth possible
• Personal contact
Limitations:
• High reactive effects
• High cost
• Face-to-face threat potential
• Labor intensive
• Trained observers needed
25
Questionnaires
Advantages:
• Low cost to administer
• Honesty increased
• Anonymity possible
• Respondent sets the pace
• Variety of options
Limitations:
• Possible inaccurate data
• Response conditions not controlled
• Respondents set varying paces
• Uncontrolled return rate
26
Direct Observation
Advantages:
• Nonthreatening
• Excellent way to measure behavior change
Limitations:
• Possibly disruptive
• Reactive effects are possible
• May be unreliable
• Need trained observers
27
Written Tests
Advantages:
• Low purchase cost
• Readily scored
• Quickly processed
• Easily administered
• Wide sampling possible
Limitations:
• May be threatening
• Possibly no relation to job performance
• Measures only cognitive learning
• Relies on norms
• Concern for racial/ethnic bias
28
Simulation/Performance Tests
Advantages:
• Reliable
• Objective
• Close relation to job performance
• Includes cognitive, psychomotor, and affective domains
Limitations:
• Time consuming
• Simulations often difficult to create
• High costs to develop and use
29
Archival Performance Data
Advantages:
• Reliable
• Objective
• Job-based
• Easy to review
• Minimal reactive effects
Limitations:
• Criteria for keeping/discarding records
• Information system discrepancies
• Indirect
• Not always usable
• Records prepared for other purposes
30
Choosing Data Collection Methods
• Reliability
– Consistency of results, and freedom from collection method bias and error
• Validity
– Does the device measure what we want to measure?
• Practicality
– Does it make sense in terms of the resources used to get the data?

31
Type of Data Used/Needed
• Individual performance
• Systemwide performance
• Economic
32
Individual Performance Data
• Individual knowledge
• Individual behaviors
• Examples:
– Test scores
– Performance quantity, quality, and timeliness
– Attendance records
– Attitudes

33
Systemwide Performance Data
• Productivity
• Scrap/rework rates
• Customer satisfaction levels
• On-time performance levels
• Quality rates and improvement rates
34
Economic Data
• Profits
• Product liability claims
• Avoidance of penalties
• Market share
• Competitive position
• Return on investment (ROI)
• Financial utility calculations

35
Use of Self-Report Data
• Most common method
• Pre-training and post-training data
• Problems:
– Mono-method bias
  • Desire to be consistent between tests
– Socially desirable responses
– Response shift bias
  • Trainees adjust expectations to training
36
Research Design
Specifies in advance:
• the expected results of the study
• the methods of data collection to be used
• how the data will be analyzed

37
Research Design Issues
• Pretest and Posttest
– Shows the trainee what the training has accomplished
– Helps eliminate pretest knowledge bias
• Control Group
– Compares the performance of a group that received training against the performance of a similar group without training
38
Recommended Research Design
• Pretest and posttest with control group
• Whenever possible:
– Randomly assign individuals to the test group and the control group to minimize bias
– Use a "time-series" approach to data collection to verify that the performance improvement is due to training (a minimal analysis sketch follows)
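The slides do not include a worked example of this design, so here is a minimal sketch under stated assumptions: the test scores below are invented, and the analysis is a simple difference of gains (trained gain minus control gain) rather than a full statistical test.

```python
# Minimal sketch of a pretest/posttest design with a control group.
# All scores are invented illustrative data, not figures from the slides.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical knowledge-test scores (0-100) before and after the program.
trained_pre,  trained_post = [52, 48, 61, 55, 50], [78, 74, 83, 80, 71]
control_pre,  control_post = [53, 50, 58, 49, 57], [56, 52, 60, 51, 58]

# Gain within each group, then the difference between gains.
# A positive difference suggests improvement beyond what a comparable
# untrained group achieved over the same period.
trained_gain = mean(trained_post) - mean(trained_pre)
control_gain = mean(control_post) - mean(control_pre)

print(f"Trained gain:              {trained_gain:.1f} points")
print(f"Control gain:              {control_gain:.1f} points")
print(f"Estimated training effect: {trained_gain - control_gain:.1f} points")
```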
39
Ethical Issues Concerning Evaluation Research
• Confidentiality
• Informed consent
• Withholding training from control groups
• Use of deception
• Pressure to produce positive results
40
Assessing the Impact of HRD
• Money is the language of business.
• You MUST talk dollars, not HRD jargon.
• No one (except maybe you) cares about "the effectiveness of training interventions as measured by an analysis of formal pretest, posttest control group data."
41
HRD Program Assessment
• HRD programs and training are investments
• Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers
• You must prove your worth to the organization –
– Or you'll have to find another organization…

42
Two Basic Methods for Assessing Financial Impact
• Evaluation of training costs
• Utility analysis
43
Evaluation of Training Costs
• Cost-benefit analysis
– Compares the cost of training to benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
• Cost-effectiveness analysis
– Focuses on increases in quality, reduction in scrap/rework, productivity, etc.

44
Return on Investment
• Return on investment = Results/Costs
45
Calculating Training Return on Investment

Operational Results Area | How Measured | Before Training | After Training | Difference (+ or –) | Expressed in $
Quality of panels | % rejected | 2% rejected (1,440 panels per day) | 1.5% rejected (1,080 panels per day) | 0.5% (360 panels per day) | $720 per day; $172,800 per year
Housekeeping | Visual inspection using a 20-item checklist | 10 defects (average) | 2 defects (average) | 8 defects | Not measurable in $
Preventable accidents | Number of accidents; direct cost of each accident | 24 per year; $144,000 per year | 16 per year; $96,000 per year | 8 per year; $48,000 | $48,000 per year

Total savings: $220,800.00

ROI = Return / Investment = Operational Results / Training Costs = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
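A minimal sketch of the arithmetic in the table above, reusing its dollar figures; the variable names are my own, not from the source.

```python
# ROI from the Robinson & Robinson example above:
# operational results (annual savings) divided by training costs.

panel_savings_per_year    = 172_800   # fewer rejected panels: $720/day over a year
accident_savings_per_year = 48_000    # eight fewer preventable accidents per year

total_benefits = panel_savings_per_year + accident_savings_per_year   # $220,800
training_costs = 32_564                                               # total program cost

roi = total_benefits / training_costs
print(f"Total savings:  ${total_benefits:,}")
print(f"Training costs: ${training_costs:,}")
print(f"ROI: {roi:.1f}")   # about 6.8 dollars returned per dollar spent
```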
46
Types of Training Costs
• Direct costs
• Indirect costs
• Development costs
• Overhead costs
• Compensation for participants

47
Direct Costs
• Instructor
– Base pay
– Fringe benefits
– Travel and per diem
• Materials
• Classroom and audiovisual equipment
• Travel
• Food and refreshments
48
Indirect Costs
• Training management
• Clerical/Administrative
• Postal/shipping, telephone, computers, etc.
• Pre- and post-learning materials
• Other overhead costs

49
Development Costs
• Fee to purchase program
• Costs to tailor program to organization
• Instructor training costs

50
Overhead Costs
• General organization support
• Top management participation
• Utilities, facilities
• General and administrative costs, such as HRM
51
Compensation for Participants
• Participants' salary and benefits for time away from the job
• Travel, lodging, and per-diem costs
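To tie the five cost categories above back to the ROI example, here is a hedged sketch of a cost roll-up; the split across categories is entirely hypothetical, chosen only so the total matches the $32,564 denominator used earlier, and is not data from the slides.

```python
# Rolling the five training-cost categories up into one total.
# The category names follow the slides; every amount is a hypothetical
# placeholder, chosen only so the total matches the $32,564 ROI example.

training_costs = {
    "direct":       9_600,   # instructor pay/benefits, materials, equipment, travel, food
    "indirect":     3_400,   # training management, clerical support, shipping, phones
    "development":  7_500,   # program purchase fee, tailoring, instructor training
    "overhead":     2_800,   # facilities, utilities, general administrative support
    "compensation": 9_264,   # participants' salary and benefits while away from the job
}

total_cost = sum(training_costs.values())
print(f"Total training cost: ${total_cost:,}")   # ROI denominator: $32,564
```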
52
Measuring Benefits
• Change in quality per unit measured in dollars
• Reduction in scrap/rework measured in dollar cost of labor and materials
• Reduction in preventable accidents measured in dollars
• ROI = Benefits/Training costs
53
Utility Analysis
• Uses a statistical approach to support claims of training effectiveness:
– N = Number of trainees
– T = Length of time benefits are expected to last
– dt = True performance difference resulting from training
– SDy = Dollar value of untrained job performance (in standard deviation units)
– C = Cost of training
• ΔU = (N)(T)(dt)(SDy) – C (a worked sketch follows)
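Here is a minimal sketch of the utility calculation ΔU = (N)(T)(dt)(SDy) – C; the function name and all input values are illustrative assumptions, not figures from the slides.

```python
# Utility analysis: estimated net dollar value of a training program.
# Delta-U = N * T * d_t * SD_y - C. All inputs below are illustrative.

def training_utility(n_trainees, years, d_t, sd_y, cost):
    """Dollar gain attributable to training, net of its cost."""
    return n_trainees * years * d_t * sd_y - cost

delta_u = training_utility(
    n_trainees=50,    # N: number of people trained
    years=2.0,        # T: years the benefit is expected to last
    d_t=0.4,          # true performance difference, in SD units
    sd_y=10_000.0,    # dollar value of one SD of job performance
    cost=75_000.0,    # C: total cost of training the 50 people
)
print(f"Estimated utility: ${delta_u:,.0f}")   # 50 * 2 * 0.4 * 10,000 - 75,000 = $325,000
```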
54
Critical Information for Utility Analysis
• dt = difference in units produced between trained and untrained workers, divided by the standard deviation in units produced by the trained group
• SDy = standard deviation in dollars, or overall productivity of the organization
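Continuing the sketch, dt could be estimated from raw output data along the lines of the definition above; the daily unit counts here are invented for illustration.

```python
# Estimating d_t from output data, following the definition above:
# (trained mean - untrained mean) / standard deviation of trained output.
# The daily unit counts are invented for illustration.
from statistics import mean, stdev

trained_output   = [55, 62, 48, 66, 59, 52]   # units per day, trained group
untrained_output = [50, 58, 44, 61, 55, 47]   # units per day, comparable untrained group

d_t = (mean(trained_output) - mean(untrained_output)) / stdev(trained_output)
print(f"d_t = {d_t:.2f} standard deviation units")   # about 0.68 with these numbers
```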
55
Ways to Improve HRD Assessment
• Walk the walk, talk the talk: MONEY
• Involve HRD in strategic planning
• Involve management in HRD planning and estimation efforts
– Gain mutual ownership
• Use credible and conservative estimates
• Share credit for successes and blame for failures
56
HRD Evaluation Steps
1. Analyze needs.
2. Determine explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute evaluation strategy.

57
Summary
• Training results must be measured against costs
• Training must contribute to the "bottom line"
• HRD must justify itself repeatedly as a revenue enhancer