You Can Do Program Evaluation
November 19, 2010
Stephen B. Johnson, PhD
Diane M. Talley, MA
James A. Penny, PhD
Castle Worldwide
NOCA 1100
5.3 The certificate provider shall conduct periodic
program evaluations to assess program quality and
effectiveness and implement future improvements.
ANSI
6.2.10.1 The evaluation shall measure the quality,
effectiveness, and value of the certificate program
against stated program performance objectives.
6.2.10.4 The evaluation shall include mechanisms to
monitor and identify regularly the need for changes to
the program’s purpose, scope, or learning outcomes ….
Program evaluation standard
Definition of evaluation: systematic investigation of the
worth of something to gain direction for future
improvement (Joint Committee on Standards for Educational
Evaluation, Program Evaluation Standards, p. 3).
Why are we here?
Plato and Aristotle in The School of Athens fresco, by Raphael.
ICE 2010 Conference Atlanta Georgia
Simplified program evaluation model
1. Identify a problem.
2. Generate and implement alternatives to address the problem.
3. Evaluate those alternatives.
4. Adopt those most likely to reduce the problem.
Good program evaluation is a tool
1. Improve your program.
2. Illuminate/explain how and why a program works.
3. Challenge assumptions.
4. Avoid the doldrums.
5. For certificate programs, focus attention on the learner.
6. Takes program design from:
•trial and error
•design based on what is known/assumed
•heuristic design (planned trial and error)
7. to a planned program design that might surprise you.
The Wright model
The Wright flyer
Wright wind tunnel
General Model of Program Evaluation
Questions about evaluation
• Why evaluate?
• What is the purpose?
• Who will do it?
• Where do the questions come from?
• What questions should be asked?
• How important (and how numerous) are the questions?
• What methods will be used?
• How will the results be used/implemented?
• What are the time and budget?
• What is the fallback if something breaks?
Evaluating education/training programs
Appraise the worth of an educational undertaking,
against a desired standard, for the purpose of
decision making, typically around changing
elements of the program (Popham, Scriven, et al.).
Focus on learner behavior/outcomes.
MOST concerned with post-instruction outcomes
directly related to the learner.
Novice errors
• Relying on a single evaluation, which is inevitably flawed
• Asking too many questions: expensive, inconclusive,
and hard to implement solutions for
• Choosing easy but trivial issues, where the change
is not worth the cost
• Treating the identified problem as final when it may
actually be a symptom; the real problem may be deeper
Going deep
Hubble deep field image
Our advice
1. Start with 1 or 2 practical problems.
2. Be wary of confusing the symptom with the problem.
3. Plan for the long term (with flexibility).
4. Get deeper over time; doing so will most likely lead
to innovative solutions and program outcomes, including
different clients, products, or services.
5. Remember the Wright model.
Questions?
Stephen B. Johnson sjohnson@castleworldwide.com
Diane M. Talley dtalley@castleworldwide.com
James A. Penny jpenny@castleworldwide.com
919.572.6880
www.castleworldwide.com


Editor's Notes

  • #3: Our goals here today are to: make program evaluation something tangible and understandable that you can think of in terms of your own program, rather than an abstract concept; and present program evaluation not as a burden or another hurdle to learn, but as something to give your program wings.
  • #4: Life is never this simple; the model makes too many assumptions, such as that you already know the problem.
  • #5: Becalmed, a study on the east coast of Australia [sailing ship, possibly a grain ship, taken while Hurley was en route to Antarctica with the Australasian Antarctic Expedition in 1911] [picture] : [Australia] / [Frank Hurley]. National Library of Australia online catalogue, http://catalogue.nla.gov.au/Record/90519
  • #6: 1901 Wright Glider
  • #7: 1903 Wright Aeroplane (Kitty Hawk NC)
  • #8: Model of Wright Wind tunnel – Kitty Hawk NC
  • #9: General Model of Evaluating Training Programs, proposed by Stephen Brian Johnson (2004). Levels refer to Donald Kirkpatrick (1979). Level I, Reaction: at the end of the training program, participants are asked to assess the value of the training they received. Level II, Learning: assesses participant learning through testing, skill practices, role-plays, simulations, group evaluations, and other assessment tools. Level III, Behavior: after completing the training program, participants should demonstrate a change or improvement in on-the-job performance. Level IV, Business Results: focuses on the actual business results achieved by program participants as they successfully apply the program education. Metrics can include output, quality, costs, time, and customer satisfaction; measurement at this level will often compare the program’s monetary benefits with the program’s costs.
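  The four Kirkpatrick levels described in this note can be sketched as a small data structure for organizing an evaluation plan. This is a minimal illustration, not part of the original presentation; the class names and example questions are hypothetical.

```python
from dataclasses import dataclass
from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    """Kirkpatrick's (1979) four levels of training evaluation."""
    REACTION = 1   # participants' assessment of the training's value
    LEARNING = 2   # measured gains in knowledge and skills
    BEHAVIOR = 3   # change in on-the-job performance
    RESULTS = 4    # business outcomes (output, quality, cost, time)

@dataclass
class EvaluationQuestion:
    text: str
    level: KirkpatrickLevel
    method: str  # e.g., survey, test, observation, business metrics

def plan_summary(questions):
    """Count questions per level, exposing gaps at the deeper levels."""
    counts = {lvl: 0 for lvl in KirkpatrickLevel}
    for q in questions:
        counts[q.level] += 1
    return counts

# Hypothetical plan for a certificate program:
plan = [
    EvaluationQuestion("Did participants find the course useful?",
                       KirkpatrickLevel.REACTION, "survey"),
    EvaluationQuestion("Did scores improve pre- to post-test?",
                       KirkpatrickLevel.LEARNING, "test"),
    EvaluationQuestion("Do certificants perform differently on the job?",
                       KirkpatrickLevel.BEHAVIOR, "observation"),
]
summary = plan_summary(plan)
```

  A summary like this makes it easy to see that the plan above asks nothing at Level IV, echoing the slide's point that evaluations tend to stop short of post-instruction outcomes.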
  • #11: A special case. Aim to appraise the worth of an educational undertaking, against a desired standard, for the purpose of decision making, typically around changing elements of the program. Reporting the results is NOT evaluation. Evaluation of educational programs should focus on learner behavior. You can certainly evaluate the impact of process variables (e.g., learner background, instructional time) and what happened during the instruction (e.g., engagement), but we are MOST concerned with post-instruction outcomes directly related to the learner. As a result, you need a theory of action that defines what post-instruction learner changes will have occurred. These could be behavioral, attitudinal, or other affective changes. But, let's face it, for training purposes we are looking to change affect (e.g., attitudes) and/or knowledge and skills so that there is an impact on the learner's behavior.
  • #13: Hubble deep field location
  • #14: Seeing more than you expect