The Value of Evaluation
John J. Heldrich Center for Workforce Development
April 8, 2011
Presentation to the New Jersey County Colleges’
Institutional Advancement Affinity Group
William Mabe, PhD, Director of Research and Evaluation
About the Heldrich Center
 The Heldrich Center is a research and policy institute
at Rutgers dedicated to raising the effectiveness of
the American workplace by strengthening workforce
education and training programs
 Founded in 1997, the center employs 18 full-time
professional staff and faculty representing an array of
disciplines, from economics and human services to
business and public policy
Heldrich Center Research and Evaluation
 Have evaluated over 30 education and workforce
programs
 Selected evaluation projects include:
– A profile of performance outcomes of New Jersey’s community
colleges
– Evaluation of Kessler Foundation-funded grantees
– Evaluation of U.S. Department of Education-financed Parent
Information and Resource Centers
Benefits of Evaluation
• Learn what works and what does not work so
resources can be directed to the most effective programs
• Identify barriers to success and program weaknesses
• Provide evidence for program sustainability
• Document best practices for replication
What Evaluation is NOT
Evaluation Is Not
• Auditing
• A gotcha
• Needs assessment
• Just about the numbers
• Customer satisfaction
Evaluation Is
• A systematic and objective process by which a
researcher assesses the quality, effectiveness, or
value of a program
 Systematic: follows established rules of scientific inquiry
 Objective: any neutral observer using the same methods
would arrive at the same conclusions about the program
 Quality: assessed by studying how the program operates
 Effectiveness: whether the program achieves its intended
goals
 Value: places the program’s effectiveness in the context of
its costs
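A standard way to make the value criterion concrete, not spelled out on the slide itself, is the cost-effectiveness ratio, which prices each unit of program effect:

    \text{cost-effectiveness ratio} = \frac{\text{total program cost}}{\text{units of effect achieved (e.g., additional completers)}}

Two programs that are equally effective can then be ranked by what each unit of impact costs.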
Types of Evaluations
• Process Evaluations:
 Identify barriers to successful implementation
 Identify which program components are effective and which are
ineffective at helping the program achieve its goals
 Recommend strategies for modifying the program to strengthen it
 Tend to use more qualitative data
• Outcome Evaluations:
 How well is the program achieving its goals? Is it effective?
 Tend to use more quantitative data
When Programs Should Be Evaluated
• Before the program begins. In the early stages of
developing any program, staff should plan the
program to accommodate evaluation
• Data collection should be ongoing
• Program should be evaluated at regular intervals
Life-Cycle Model of Program Evaluation
• Like products, programs have life cycles:
• conceptualization
• piloting
• widespread implementation
• maturity
• (possibly) phase-out
• The type of evaluation to conduct depends on where a
program is in its life cycle
How to Approach Evaluation
• At what stage is the program in its life cycle?
• What evidence would be needed to convince a
neutral observer that the program is effective?
That it’s being implemented well?
• If this program were successful, what would I
expect to observe?
 Look for the observable implications of program success
 Observable implications can be measured, either qualitatively or
quantitatively
• Focus on both process and outcomes
How to Evaluate
• Identify stakeholders
• Types of data: qualitative and quantitative
• Methodologies of data collection
• Individual interviews
• Focus groups
• Surveys
• Observation of program activities
• Collection of programmatic and administrative data
• Some methodologies, such as surveys, can collect both
qualitative and quantitative data
• When to collect data
• Design the program so that needed data can be collected from day
one
Data Heldrich Center Has Used for
Program Evaluations
• New Jersey Unemployment Insurance Wage Record Data
• Quarterly census of earnings of nearly all workers in NJ
• Heldrich Center has access to UI wage records through a data-sharing
agreement with the NJ Department of Labor and Workforce Development
• Permitted to use for approved research purposes
• Employment Service OSOS Data
• Records of people who received training through the public workforce
system
• Heldrich has a data-sharing agreement with NJLWD
• Valuable for creating comparison groups
• New Jersey Student Unit Record Data System
• Centralized database of student enrollment and graduation records
that CHE collects from you and from all public and some independent
institutions in the state
• Heldrich has a data-sharing agreement with NJCHE to use these data
for approved research purposes
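As an illustration of how quarterly wage-record data of this kind are typically prepared for an evaluation, here is a minimal Python sketch; the file names and column names are hypothetical placeholders, not the actual NJLWD schemas:

    # Sketch: aggregate quarterly UI wage records and attach them to a program
    # roster. File and column names are illustrative placeholders.
    import pandas as pd

    # Typical UI extracts have one row per person-employer-quarter
    wages = pd.read_csv("ui_wage_records.csv")    # ssn_hash, year, quarter, wages
    roster = pd.read_csv("program_roster.csv")    # ssn_hash, enroll_year, enroll_quarter

    # Sum earnings across employers within each person-quarter
    quarterly = wages.groupby(
        ["ssn_hash", "year", "quarter"], as_index=False
    )["wages"].sum()

    # Left-join so every participant keeps a row even with no recorded wages
    merged = roster.merge(quarterly, on="ssn_hash", how="left")
    merged["wages"] = merged["wages"].fillna(0)   # no record = no covered earnings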
Case Study #1: Essex County College Math
Initiative
Program Overview
Summer I Pilot Semester:
• Linked developmental pre-algebra course and scientific reasoning course
• STEM-specific tutors
• Financial support

Fall 2009 and Spring 2010 Semesters:
• College-level math course with a supported recitation
• STEM-specific tutors
• Financial support
• Access to a dedicated computer lab
• Feedback-oriented homework structure
Key Differences: STEM Algebra Course
versus College’s Regular Algebra Course
• Revised curriculum:
– Focus on concepts
– New textbook
– Coverage of more difficult topics
• Mandatory recitation
• Teaching Assistant
• Financial support for students
• Mandatory homework
• Different grading structure
Evaluation Design
• Process and outcome evaluations
• Use qualitative and quantitative data collection methods
• Interdisciplinary evaluation team
• Submitted interim reports to provide actionable information
in a timely manner
• Focused on understanding the culture and politics of the
school as part of the evaluation process
• Considered the views of all stakeholders
Qualitative Data Collection
Method, time frame, and number of participants:
• Course observations: three times per course per semester (eighteen observations total)
• Instructor interviews: pre- and post-semester (twelve interviews with five different instructors)
• Student focus groups: post-semester (five groups of 5-8 students)
• Tutor focus groups: post-semester (four groups of 5-10 tutors)
• Meetings, phone calls, and e-mails with administrative staff: regularly throughout the semester, approximately two to three times per month (two staff members)
Quantitative Data Collection
• Administrative data
– Program-level data is collected from the program administrator
– Includes program participant information
• Preliminary college-level data is collected from the
college data administrator
– Includes grade information for ALL students enrolled in College
Algebra
– Retention information and follow-up grade information are
forthcoming
Notable Findings from Process Evaluation
• Increased social integration helped students establish a
sense of community.
• The STEM Program promoted small-group interactivity and
encouraged students to prioritize self-study.
• Students sought opportunities to interact with peers and
more advanced students as a learning strategy.
• Students began to see themselves and their peers as
resources and active participants in a dynamic learning
process.
• Poor faculty collaboration and planning led to an ineffective
implementation of the interdisciplinary component of the
learning community.
Outcomes Comparison
• 76 percent of students in the STEM Algebra course passed
in Spring 2010
• Compared with 54 percent of all non-STEM majors
• Compared with 56 percent of STEM majors
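The slide reports raw pass rates only. One natural next step is to ask whether a gap of this size could plausibly arise by chance, for example with a two-proportion z-test. The sketch below uses statsmodels; the enrollment counts are invented purely for illustration, since the deck does not report section sizes:

    # Sketch: compare the STEM-section pass rate with the non-STEM-major rate.
    # The counts below are hypothetical; the deck reports percentages only.
    from statsmodels.stats.proportion import proportions_ztest

    passed = [19, 270]     # passers: STEM section, other sections (hypothetical)
    enrolled = [25, 500]   # enrollments yielding 0.76 and 0.54 pass rates

    z_stat, p_value = proportions_ztest(count=passed, nobs=enrolled)
    print(f"z = {z_stat:.2f}, p = {p_value:.3f}")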
Study Limitations
• The results may indicate the redesigned course’s
effectiveness, but they may also be a product of:
– Students self-selected into the STEM program
– Financial incentives may have impacted student behavior in
the STEM class
– The grading structure for the STEM section consisted of more
elements than the grading structure of the other MTH-100
courses
– The curriculum in the STEM section was more difficult than the
curriculum of the other MTH-100 courses
Case Study #2: Newark Pre-Apprenticeship
Program
Program Overview
• Program for low-income women and minorities in
Newark that prepares them for a union apprenticeship
in the construction industry
• Mostly basic skills preparation but also addresses
other barriers to employment, such as suspended
driver’s licenses
• Participants’ earnings are low; demographically, they are
predominantly (89 percent) black, with some Latinos
• Research Question: Does this program increase
participants’ earnings?
How to Answer the Research Question?
 What we really want to know: what would have
happened to participants had they not participated
• Just look at what participants earn after the program?
– No way to know if that’s a “lot”
• See if they make more after the program than before?
– Many entered the program because they were unemployed
• See if they’re doing better than similar people who did
not participate in the program?
– Promising but potentially problematic: applicants more
motivated than non-applicants
– Need a way to compare to similarly motivated individuals
– Need a way to identify similar people
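In the potential-outcomes notation standard in the evaluation literature (not used on the slide itself), the quantity being sought is the average effect of treatment on the treated:

    \tau_{ATT} = E[\, Y_1 - Y_0 \mid D = 1 \,]

where Y_1 is a participant's earnings with the program, Y_0 is what the same person would have earned without it, and D = 1 marks participation. Y_0 is never observed for participants, so a credible comparison group has to stand in for it; that is what the matching strategy on the next slide supplies.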
Creating a Valid Comparison Group
• Taking motivation into account:
– Comparison pool: individuals who graduated from training
programs through the One Stop in Newark at the same time
as program graduates
• Thousands of people completed training in Newark at
this time; which ones are most similar?
– Similar age
– Same race
– Same sex
– Similar employment and earnings history
• Use statistical software to match program graduates
to most similar completers of training programs in
Newark
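A minimal sketch of this matching step in Python, assuming scikit-learn's nearest-neighbor search; the DataFrame and column names are placeholders for the age, earnings-history, race, and sex variables listed above, not the study's actual code:

    # Sketch: match each program graduate to the most similar One-Stop training
    # completer. Exact match on race and sex; nearest neighbor (with
    # replacement) on the remaining, standardized covariates.
    import pandas as pd
    from sklearn.neighbors import NearestNeighbors
    from sklearn.preprocessing import StandardScaler

    NUMERIC_COVARIATES = ["age", "prior_earnings", "quarters_employed"]

    def match_within_cell(grads: pd.DataFrame, pool: pd.DataFrame) -> pd.DataFrame:
        """Return the pool row most similar to each graduate."""
        scaler = StandardScaler().fit(pool[NUMERIC_COVARIATES])
        nn = NearestNeighbors(n_neighbors=1).fit(
            scaler.transform(pool[NUMERIC_COVARIATES])
        )
        _, idx = nn.kneighbors(scaler.transform(grads[NUMERIC_COVARIATES]))
        return pool.iloc[idx.ravel()].reset_index(drop=True)

    def build_comparison_group(graduates: pd.DataFrame, pool: pd.DataFrame) -> pd.DataFrame:
        """Exact-match on race and sex, then nearest-neighbor within each cell."""
        matches = []
        for (race, sex), grads in graduates.groupby(["race", "sex"]):
            cell = pool[(pool["race"] == race) & (pool["sex"] == sex)]
            if not cell.empty:
                matches.append(match_within_cell(grads, cell))
        return pd.concat(matches, ignore_index=True)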
Study Design
• Data
– New Jersey Unemployment Insurance (UI) Wage Record data
– America’s One-Stop Operating System (OSOS)
• Methodology
– Matched program participants by social security number with
their records in the NJ UI Wage Record and OSOS data
– Probabilistic matching to generate a comparison group
– Parametric statistical models to estimate the program’s effect
• Sample
– Treatment: 129 individuals who completed the program and
who were 22 years old or older at enrollment
– Comparison: Matched individuals from OSOS data
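The slide does not name the specific model. A common parametric choice, sketched here with statsmodels and placeholder file and column names, is to regress post-program earnings on a treatment indicator while controlling for pre-program earnings and demographics:

    # Sketch: parametric estimate of the program's effect on earnings.
    # File and column names are illustrative placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per person: graduates (treated=1) stacked with matched
    # comparison cases (treated=0)
    df = pd.read_csv("matched_analysis_file.csv")

    model = smf.ols(
        "post_earnings ~ treated + pre_earnings + age + C(race) + C(sex)",
        data=df,
    ).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

    print(model.params["treated"])          # estimated earnings effect
    print(model.conf_int().loc["treated"])  # 95% confidence interval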
Work Histories of Program Graduates
• Program graduates tend to have:
– Limited employment histories
– Low average earnings
– Limited academic skills (as measured by the Test of Adult Basic Education, or TABE)
• Twelve percent of graduates had formerly
been incarcerated
Program Graduates Out-Earn Comparison Group Members
• Program graduates earn more than similar
individuals who receive One-Stop training.
• Program graduates experience greater wage
growth after training than this comparison group.
• Subgroup analysis: Youth participants in the program do
no better than the comparison group.
Earnings Progression (Post-Matching)
Study Limitations
• Study looked at graduates from 2004/2005 and
earnings through mid-2006, during the real estate
boom
• Outcomes are observed for graduates rather than all
participants
• The selective nature of the program means some
differences cannot be ruled out, though pre-training
earnings help control for ability and motivation
Questions and Discussion
Heldrich Center Contact Information
William Mabe, PhD, John J. Heldrich Center for
Workforce Development, Rutgers University
– billmabe@rutgers.edu or 732-932-4100, ext. 6210
Heldrich Center Website: www.heldrich.rutgers.edu

Editor's Notes

  • #5: So what is evaluation?
  • #27: Note that the only data we have for both treatment and comparison groups is on graduates from training.
  • #29: We matched the 230 or so individuals who participated in the program in 2004 and 2005 with their records in the New Jersey Unemployment Insurance Wage Database, to which almost all employers in the state report their employees’ wages. In none of the 12 quarters before enrollment was the average wage greater than $3,000. In the last four quarters before enrollment, the average wage was below $2,500 a quarter. In the 12 quarters before enrollment, less than 50 percent of program participants were employed in any given quarter. The point is that the program does not engage in creaming.
  • #30: The first bullet compares the absolute post-graduation wages of the Newark Essex graduates with the post-training wages of the One-Stop trainees. The second shows that the Newark Essex graduates experienced greater wage growth from before training to after training than the One-Stop trainees. These results are extremely positive, but we then wondered whether they were being driven by the fifty percent of adults who managed to obtain apprenticeships. Maybe the graduates who didn’t get apprenticeships were doing very poorly. So we re-ran the comparison, excluding the individuals who had earned apprenticeships. And what did we find …
  • #32: The first bullet compares the absolute post-graduation wages of the Newark Essex graduates with the post-training wages of the One-Stop trainees. The second shows that the Newark Essex graduates experience greater wage growth from before training to after training than the One-Stop trainees. These results are extremely positive, but we then wondered whether they were being driven by the fifty percent of adults who managed to obtain apprenticeships. Maybe the graduates who didn’t get apprenticeships were doing very poorly. So what we did was we re-ran the comparison and excluded the individuals who had earned apprenticeships. And what did we find …