Chapter Five: Management-Oriented Evaluation Approaches
Presented by: Iva Angelova & Larry Weas
ETR 531 Program Evaluation, Northern Illinois University, Education, Technology & Research
Introduction
Developers of the Management-Oriented Evaluation Approach and Their Contributions
How the Management-Oriented Evaluation Approach Has Been Used
Strengths and Limitations of the Management-Oriented Evaluation Approach
Other References
Questions for Discussion
The CIPP Evaluation Model (Stufflebeam, 1971)
Context Evaluation: Planning Decisions
Input Evaluation: Structuring Decisions
Process Evaluation: Implementing Decisions
Product Evaluation: Recycling Decisions
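The pairing above is a simple lookup from evaluation type to the decision type it serves. As an illustrative sketch only (the names `CIPP_DECISIONS` and `decision_served` are our own, not part of any published model), it can be encoded as:

```python
# Illustrative sketch: the CIPP model's pairing of each evaluation type
# with the kind of decision it serves (Stufflebeam, 1971).
CIPP_DECISIONS = {
    "Context": "Planning decisions",
    "Input": "Structuring decisions",
    "Process": "Implementing decisions",
    "Product": "Recycling decisions",
}

def decision_served(evaluation_type: str) -> str:
    """Return the decision type a given CIPP evaluation informs."""
    return CIPP_DECISIONS[evaluation_type]
```

The mapping makes the model's core claim explicit: each of the four evaluations exists to inform a distinct class of management decision.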
Logical Structure for Designing an Evaluation (Stufflebeam, 1973a)
Step 1: Focusing the Evaluation
Step 2: Collection of Information
Step 3: Organization of Information
Step 4: Analysis of Information
Step 5: Reporting of Information
Step 6: Administration of the Evaluation
Step 1: Focusing the Evaluation (Further Detail)
Identify the major level(s) of decision making to be served, for example local, state, or national.
For each level of decision making, project the decision situations to be served and describe each one in terms of its locus, focus, criticality, timing, and composition of alternatives.
Define criteria for each decision situation by specifying variables for measurement and standards for use in judging alternatives.
Define the policies within which the evaluator must operate.
Step 2: Collection of Information (Further Detail)
Specify the source of the information to be collected.
Specify the instruments and methods for collecting the needed information.
Specify the sampling procedure to be employed.
Specify the conditions and schedule for information collection.
Step 3: Organization of Information (Further Detail)
Provide a format for the information that is to be collected.
Designate a means for coding, organizing, storing, and retrieving the information.
Step 4: Analysis of Information (Further Detail)
Select the analytical procedures to be employed.
Designate a means of performing the analysis.
Step 5: Reporting of Information (Further Detail)
Define the audiences for the evaluation reports.
Specify means for providing information to the audiences.
Specify the format for evaluation reports and/or reporting sessions.
Schedule the reporting of information.
Step 6: Administration of the Evaluation (Further Detail)
Summarize the evaluation schedule.
Define staff and resource requirements and plans for meeting these requirements.
Specify means for meeting policy requirements.
Evaluate the potential of the evaluation design for providing information that is valid, reliable, credible, timely, and pervasive (i.e., will reach all relevant stakeholders).
Specify and schedule means for periodic updating of the evaluation design.
Provide a budget for the total evaluation program.
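The six steps above form an ordered design checklist. A minimal sketch of that structure, with task lists abbreviated from the slides (the variable name and the shorthand task phrasings are our own):

```python
# Illustrative sketch: Stufflebeam's (1973a) six-step logical structure
# for designing an evaluation, encoded as an ordered checklist.
EVALUATION_DESIGN_STEPS = [
    ("Focusing the Evaluation",
     ["Identify decision levels", "Project decision situations",
      "Define criteria", "Define policies"]),
    ("Collection of Information",
     ["Specify sources", "Specify instruments and methods",
      "Specify sampling procedure", "Specify conditions and schedule"]),
    ("Organization of Information",
     ["Provide a format", "Designate a means of coding and organizing"]),
    ("Analysis of Information",
     ["Select analytical procedures", "Designate a means of analysis"]),
    ("Reporting of Information",
     ["Define audiences", "Specify means and format", "Schedule reports"]),
    ("Administration of the Evaluation",
     ["Summarize schedule", "Define staff and resources",
      "Meet policy requirements", "Assess design quality",
      "Schedule design updates", "Provide a budget"]),
]

# Print a one-line summary of each step in order.
for number, (step, tasks) in enumerate(EVALUATION_DESIGN_STEPS, start=1):
    print(f"Step {number}: {step} ({len(tasks)} tasks)")
```

Encoding the steps this way underlines that the structure is sequential: later steps (analysis, reporting, administration) presuppose decisions made in the earlier ones.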
Four Types of Evaluation (Stufflebeam & Shinkfield, 1985)
Context Evaluation
Input Evaluation
Process Evaluation
Product Evaluation
The UCLA Evaluation Model (Alkin, 1969), compared to CIPP
Systems Assessment (C): provides information about the state of the system.
Program Planning (I): assists in the selection of particular programs likely to be effective in meeting specific educational needs.
Program Implementation: provides information about whether a program was introduced to the appropriate group in the manner intended.
Program Improvement (P): provides information about how a program is functioning, whether interim objectives are being achieved, and whether unanticipated outcomes are appearing.
Program Certification (P): provides information about the value of the program and its potential for use elsewhere.
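The comparison above can be read as a rough alignment of Alkin's five phases to Stufflebeam's four evaluation types. The sketch below encodes that alignment; note that the slide gives no CIPP letter for Program Implementation, so aligning it with Process is our assumption, and the variable name is our own:

```python
# Illustrative sketch: Alkin's (1969) UCLA phases aligned with the CIPP
# evaluation type each roughly corresponds to, per the slide's comparison.
UCLA_TO_CIPP = {
    "Systems Assessment": "Context",
    "Program Planning": "Input",
    "Program Implementation": "Process",  # assumption: not labeled on the slide
    "Program Improvement": "Process",
    "Program Certification": "Product",
}
```

The mapping shows why the two models are often treated as near-equivalents: UCLA splits CIPP's process evaluation across two phases but otherwise covers the same ground.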
The UCLA Evaluation Model: Four Assumptions (Alkin, 1991)
Evaluation is a process of gathering information.
The information collected in an evaluation will be used mainly to make decisions about alternative courses of action.
Evaluation information should be presented to decision makers in a form they can use effectively, designed to help rather than confuse or mislead them.
Different kinds of decisions require different kinds of evaluation procedures.
Growth & Development of the Early Models
The CIPP and UCLA models appear to be linear and sequential.
Evaluators may undertake "retrospective" evaluations.
Process evaluation can be done without specific decisions having been identified.
Cycling through another type of evaluation is in the nature of management-oriented evaluation.
Guides Produced Using the CIPP Model
Context evaluation: Stufflebeam (1977) advanced the procedure for conducting a context evaluation with his guidelines for designing a needs assessment for an educational program or activity.
Input evaluation: Reinhard (1972) developed a guide for use in input evaluation called the advocate team technique. It is used when acceptable alternatives for designing a new program are not available or obvious.
Process evaluation: Cronbach (1963) proposed procedures that provide useful suggestions for conducting process evaluation.
Product evaluation: Techniques discussed in Chapter 6 provide information useful in conducting product evaluation.
How the Management-Oriented Evaluation Approach Has Been Used
Context
  Decision making (formative orientation): guidance for choice of objectives and assignment of priorities.
  Accountability (summative orientation): record of objectives and bases for their choice, along with a record of needs, opportunities, and problems.
Input
  Decision making: guidance for choice of program strategy; input for specification of procedural design.
  Accountability: record of the chosen strategy and design and the reasons for their choice over other alternatives.
Process
  Decision making: guidance for implementation.
  Accountability: record of the actual process.
Product
  Decision making: guidance for termination, continuation, modification, or installation.
  Accountability: record of attainments and recycling decisions.
Strengths & Limitations: Strengths
Proved appealing to evaluators.
Gives focus to the evaluation.
Stresses the importance of the utility of information.
Instrumental in showing evaluators and program managers that they need not wait until an activity or program has run its course before evaluating it.
Preferred choice in the eyes of most managers and boards.
The CIPP model is a useful and simple heuristic tool that helps the evaluator generate potentially important questions to be addressed in an evaluation.
Using the CIPP model, the user can identify a number of questions about an undertaking.
Supports evaluation of every component of a program as it operates, grows, or changes.
Stresses the timely use of feedback.
Strengths & Limitations: Limitations
The evaluator may occasionally be unable to respond to questions or issues that may be significant.
If answerable only to top management, the evaluator can become a "hired gun," and the evaluation can become unfair and undemocratic.
The evaluation is subject to the direct influence of the policy-shaping community.
If followed in its entirety, the approach can result in costly and complex evaluations.
It assumes that important decisions can be clearly identified in advance; thus, frequent adjustments may be needed in the original evaluation plan if this approach is to work well.
Major Concepts, Theories & Summary
The management-oriented evaluation approach informs decision makers about the inputs, processes, and outputs of the program under evaluation.
Stufflebeam's CIPP evaluation model incorporates four separate evaluations into one framework to better serve managers and decision makers.
In the CIPP model:
Context Evaluation helps define objectives.
Input Evaluation helps determine program strategies and procedural designs for achieving those objectives.
Process Evaluation is used to determine how well a program is being implemented.
Product Evaluation is used to provide information on what program results were obtained, how well needs were reduced, and what should be done once the program has ended.
Alkin's UCLA model is similar to the CIPP model in that it provides decision makers with information on the context, inputs, implementations, processes, and products of the program under evaluation.
References
Alkin, M. C. (1991). Evaluation theory development: II. In M. W. McLaughlin & D. C. Phillips (Eds.), Evaluation and education: At quarter century (Ninetieth Yearbook of the National Society for the Study of Education, Part II). Chicago: University of Chicago Press.
Stufflebeam, D. L. (2000). The CIPP model for evaluation. In D. L. Stufflebeam, G. F. Madaus, & T. Kellaghan (Eds.), Evaluation models: Viewpoints on educational and human services evaluation (2nd ed., pp. 274-317). Boston: Kluwer Academic.
Stufflebeam, D. L., & Shinkfield, A. J. (1985). Systematic evaluation. Boston: Kluwer-Nijhoff.
Questions for Leading a Discussion (Larry's questions)
Based on the two models (CIPP & UCLA) we have discussed, what are some decisions you may face in your own organization?
Could one of these two models be used in evaluating your organization's decisions? Why?
What are some limitations you may encounter when evaluating your organization?
