Meta-Evaluation Theory: Definitions, Processes, & Systems. David Passmore, Workforce Education & Development
What is meta-evaluation? Meta — from the Greek μετά: “after”, “beyond”, “with”, “adjacent”, “self”. A prefix used to indicate a concept that is an abstraction from another concept.
Educational Evaluation and Decision Making (1971; out of print), from the Phi Delta Kappa (PDK) Committee on Evaluation. Conceived in reaction to the difficulty and practical inconsequence of applying experimental and quasi-experimental designs in educational and work settings that rarely afford opportunities for attaining the gold standard: random assignment of subjects to experimental conditions.
PDK Committee Definition: Evaluation is a process for delineating, obtaining, and providing useful information for judging decision alternatives. A dissection of this definition, term by term, follows.
A process: an activity subsuming many methods and involving a number of steps or operations.
Decision alternatives: two or more different actions that might be taken in response to some situation requiring altered action. Such situations include: (a) unmet needs exist; (b) some barrier to the fulfillment of needs exists; or (c) opportunities exist that ought to be exploited.
Information: quantitative or qualitative data about entities (tangible or intangible) and their relationships, in terms of some purpose. Information is derived from many sources and methods; it may come from scientific data, precedent, or experience. Information is more than a collection of facts and data; rather, facts and data must be organized for intelligibility and must reduce the uncertainty that surrounds decision making. Information often is limited and imperfect in evaluation situations, in comparison with experimental situations in which controls are possible and are formalized by rigorous design.
Delineating: identifying the evaluative information required through an inventory of the decision alternatives to be weighed and the criteria to be applied in weighing them. Knowing at least two elements is essential: (a) the decision alternatives and (b) the values or criteria to be applied. These are obtained only from the clients for evaluations, who must make or ratify decisions. Therefore, evaluation has an “interface” role as well as a “technical” role, because its worth is defined by meeting client needs.
Obtaining: collecting, organizing, and analyzing information through such formal means as observation, review of artifacts, measurement, data processing, and statistical analysis. Ways of obtaining information are guided by scientific criteria and disciplinary preferences.
Providing: fitting information together into systems and subsystems that best serve the purposes of the evaluation, and reporting the information to the decision maker. Providing involves interaction between the evaluator and the various audiences for information, and multiple audiences are common. The direct audience is the decision maker, but people and organizations who must ratify decisions are also important, as are others who hold strong stakeholder positions. The information delivery preferences of these audiences vary in specificity, modality, and timing.
Useful: satisfying criteria for evaluation and matched to the judgment criteria to be employed in choosing among decision alternatives. Criteria to be imposed on evaluations include scientific criteria (internal validity, external validity, reliability, objectivity); practical criteria (relevance, importance, scope, credibility, timeliness, and pervasiveness); and prudential criteria (efficiency). Useful information must meet the client’s identified information needs.
Judging: the act of choosing among decision alternatives. Judging is central because evaluation is meant to serve decision making. However, although judging is central to evaluation, the act of judging is not central to the evaluator’s role.
The evaluation process serves decision making.
Types of decisions are linked to types of evaluation
Planning decisionsContext evaluation—This type of evaluation determines goals and objectives; defines the relevant environment, its actual and desired conditions, its unmet needs and unused opportunities, and why needs and opportunities are not being met and used; examines contingencies that pressure and promote improvements; and assesses congruities between actual and intended performance.Types of decisions are linked to types of evaluation
Structuring decisionsInput evaluation—Essentially, this type of evaluation helps state objectives operationally and whether their accomplishment is feasible. Types of decisions are linked to types of evaluation
Implementing decisionsProcess evaluation—This type of evaluation is used to: identify and monitor potential sources of failure in an activity; to service preprogrammed decisions that are to be made during the implementation of an activity; and to record events that have occurred so that “lessons learned” can be delineated at the end of an activity. Process evaluation assesses the extent to which procedures operate as intended.Types of decisions are linked to types of evaluation
Recycling decisionsProduct evaluation—This type of evaluation measures criteria associated with the objectives for an activity, compares these measurements with predetermined absolute or relative standards, and makes rational interpretations of these outcomes using recorded context, input, and process information. Product evaluation investigates the extent to which objectives have been, or are being, attained.Types of decisions are linked to types of evaluation
C = Context, I = Input, P = Process, P = Product: the PDK model is often called the “CIPP” model.
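The decision-to-evaluation pairings above can be encoded as a simple lookup table. This is an illustrative sketch only; the names come from the PDK/CIPP model, but the code and the `evaluation_type` helper are assumptions for demonstration, not part of the model.

```python
# Illustrative encoding of the CIPP model: each type of decision is served
# by a corresponding type of evaluation.
CIPP = {
    "planning": "context",      # goals, needs, unused opportunities
    "structuring": "input",     # operational objectives, feasibility
    "implementing": "process",  # monitoring, preprogrammed decisions
    "recycling": "product",     # outcomes vs. predetermined standards
}

def evaluation_type(decision: str) -> str:
    """Return the CIPP evaluation type that serves a given decision type."""
    return CIPP[decision.lower()]
```

For example, `evaluation_type("Planning")` returns `"context"`.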
General work breakdown for evaluation:
Delineating information: who needs what, when?
Obtaining information: how?
Providing information: to whom, and in what form?
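The three-step work breakdown can be sketched as a minimal pipeline. All function and field names here (`EvaluationPlan`, `delineate`, `obtain`, `provide`) are hypothetical illustrations of the deck’s three steps, not terminology from the PDK text.

```python
# A hypothetical sketch of the work breakdown: delineate what information is
# needed, obtain it, and provide it to the identified audiences.
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    decision_alternatives: list  # actions the client is choosing among
    criteria: list               # values applied in weighing the alternatives
    audiences: list              # decision makers, ratifiers, stakeholders
    findings: dict = field(default_factory=dict)

def delineate(alternatives, criteria, audiences):
    """Who needs what, when? Inventory alternatives and judging criteria."""
    return EvaluationPlan(list(alternatives), list(criteria), list(audiences))

def obtain(plan, collect):
    """How? Collect, organize, and analyze information for each criterion."""
    plan.findings = {c: collect(c) for c in plan.criteria}
    return plan

def provide(plan):
    """To whom, and in what form? Report findings to each audience."""
    return {audience: plan.findings for audience in plan.audiences}
```

A trivial run: `provide(obtain(delineate(["expand", "cancel"], ["cost"], ["director"]), lambda c: "evidence on " + c))` yields a report keyed by audience.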
Evaluation is, itself, evaluated. Evaluating the evaluation is more expansive than evaluating the quality of research, and it applies three families of criteria: scientific criteria, practical criteria, and prudential criteria.
Scientific criteria primarily assess “representational goodness”; that is, they assess how well the evaluation depicts a situation. These criteria, in their detailed and technical forms, are quite familiar to most people with research training. They comprise internal validity, external validity, reliability, and objectivity.
Internal validity: the extent to which evaluation findings can be attributed to the activity evaluated rather than to some other factor or artifact.
External validity: the extent to which evaluation findings are generalizable over people, places, and time.
Reliability: simply said, the extent to which evidence is measured accurately.
Objectivity: intersubjectivity, in the sense of agreement on conclusions between independent observers of the same evidence; the “Do you see what I see?” phenomenon.
Practical criteria comprise relevance, importance, scope, credibility, timeliness, and pervasiveness.
Relevance: evaluation information is obtained to serve decision making. If the information obtained does not match these decisions, then the information, no matter how well obtained scientifically, is useless.
Importance: information needs to be more than nominally relevant to a decision. Not all information is important; information with the highest importance (relevance graded by quality) must be obtained and provided within budget constraints. The evaluator must determine what the client believes is important. In some cases, the evaluator might suggest what information ought to be considered important, because the evaluator often has considerable experience with obtaining and using some types of information.
Scope: a “completeness” criterion. That is, is all of the information needed to make a decision obtained and provided?
Credibility: clients often are not in a position to judge whether information obtained and provided by an evaluator meets scientific criteria. Therefore, the trust the client invests in the evaluator is an important dimension of the quality of an evaluation project’s outcomes. Just as the scientific criterion of internal validity pertains to a particular evaluation situation, credibility is never generalizable and always refers to a particular situation.
Pervasiveness: information from evaluation projects is provided to identified audiences, and this information is meant to be used. The criterion of pervasiveness is met if all of the people and organizations who should know about and use evaluative information do, in fact, know about and use it.
The prudential criterion (efficiency): proper application of the practical criteria of relevance, importance, and scope should remedy many inefficiencies in evaluation. However, the conduct of an evaluation project must also be weighed against alternative evaluation designs that could have achieved the same outcome with different time, financial, and personnel resources.
