Inputs, Process Indicators, & Intermediate
Outcomes: Tools for Answering Evaluation
           Research Questions
                      Session 3
    Steve Kimball, Tony Milanowski, & Chris Thorn
            Value-Added Research Center
          University of Wisconsin – Madison

   Bush Foundation Teacher Effectiveness Initiative
                January 12 & 13, 2010
                  Minneapolis, MN
Session Goals
• Consider useful alternatives to
  experimental or quasi-experimental
  designs
  – For mid-course corrections
  – To provide some evidence of program effects

• Review & help you develop measures of
  processes & intermediate outcomes
Typical Situation
• Evaluation starts after program
  implementation has begun
  – No pre-implementation data on some key
    variables, such as instructional practice
• No randomization of treatment
  – Educator choice or assignment based on
    perceived need
  – Hard to find equivalent control group
• Limited resources
Basic Design Elements
• Mixed methods process tracing
  – How did the purported causal chain play out?
• Non-equivalent but policy-relevant
  comparison groups
• Implementation variation
  – If within-project variation was present, was
    that variation related to outcomes?
Logic for Pattern-Match Design
• Program was well implemented
• Expected intermediate outcomes occurred
• Outcome measures changed in intended
  direction


• Conclusion: theory of action appears to be
  operating & changes in outcomes likely due to
  program
Basic Design
[Flow diagram: Implementation (fidelity, variation) → Intermediate Outcomes (Educator Reactions → Behavior Change) → Outcome Change, with Context shaping the links]
Potential Non-equivalent Comparison
               Groups
• Non-implementers
• District average
• Benchmark schools
• State average for demographically similar
  schools/districts
Comparing Trends I
[Line chart: outcome measure over years -6 to +6 before/after project start, comparing Project Schools with the Comparison Group]
Comparing Trends II
[Line chart: teacher turnover rate (roughly 2-14%) over years -5 to +5 before/after program start, comparing Program Schools with the turnover rate of other schools]
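Trend comparisons like the two charts above can be assembled from ordinary longitudinal data. The Python sketch below is not part of the original deck; it shows one minimal way to line schools up on years relative to program start and summarize pre/post change, and the column names (school_id, group, year_rel, outcome) are illustrative assumptions.

```python
# Minimal sketch (assumed column names) for comparing outcome trends of
# project vs. comparison schools around the program start year.
import pandas as pd

# Illustrative data: one row per school per year; year_rel is years
# before/after program start; group is "project" or "comparison".
data = pd.DataFrame({
    "school_id": [1, 1, 1, 1, 2, 2, 2, 2],
    "group":     ["project"] * 4 + ["comparison"] * 4,
    "year_rel":  [-2, -1, 0, 1, -2, -1, 0, 1],
    "outcome":   [22, 24, 27, 31, 23, 24, 25, 26],
})

# Mean outcome by group and relative year -- the series the charts plot.
trend = data.groupby(["group", "year_rel"])["outcome"].mean().unstack("group")
print(trend)

# Simple pattern check: did project schools gain more after the start
# (year_rel >= 0) than comparison schools, relative to their own baselines?
pre = data[data["year_rel"] < 0].groupby("group")["outcome"].mean()
post = data[data["year_rel"] >= 0].groupby("group")["outcome"].mean()
gain = post - pre
print("Pre/post gain by group:")
print(gain)
print("Difference-in-differences:", gain["project"] - gain["comparison"])
```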
Measuring Inputs, Processes, &
 Intermediate Outcomes
Logic Model
Measuring Implementation
• Key processes from logic model
• Supplement by reviewing common
  implementation problems:
  – Educators do not understand the program
  – Lack of stakeholder support
  – Conflicting/competing initiatives
  – Supporting systems not in place
Implementation Measures: Inputs
• Spending patterns & costs
  – Were funds spent as planned?
  – Was funding provided sufficient to support planned
    activities?
• Staffing: adequacy & continuity; champion
• Stakeholder attitudes
• Context features that would work with/against
  program
• Data system adequacy (data quality plan)
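The spending questions above ("were funds spent as planned?") reduce to a planned-versus-actual comparison by budget category. A small sketch with made-up categories and figures, not taken from the deck:

```python
# Sketch: planned vs. actual spending by category (illustrative figures).
import pandas as pd

budget = pd.DataFrame({
    "category": ["mentoring", "communication", "data systems"],
    "planned":  [120_000, 30_000, 50_000],
    "actual":   [95_000, 31_000, 20_000],
})
budget["pct_of_plan"] = 100 * budget["actual"] / budget["planned"]
print(budget)  # categories well under 100% may signal under-implementation
```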
Implementation Measures – Activities &
                Outputs
• Quantitative
   –   # of communication sessions/newsletters/web site hits
   –   % of schools in which leaders communicated program to staff
   –   % of target educators receiving communications
   –   % of schools with program coordinators in place & trained
   –   % of teachers with valid teacher-student link
   –   Change in attributes of incoming teacher candidates
   –   Planned vs. actual $ spent on in-school supports
   –   Actual vs. planned warranty costs (differentiation)
• Data sources: grant application, budgets, self-assessments,
  annual reports, administrative data
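Most of the quantitative output measures above are counts or percentages that can be computed directly from administrative records. A hedged sketch; the tables and column names are assumptions for illustration:

```python
# Sketch: two of the quantitative output indicators, from assumed records.
import pandas as pd

schools = pd.DataFrame({
    "school_id": [101, 102, 103, 104],
    "coordinator_in_place": [True, True, False, True],
    "coordinator_trained":  [True, False, False, True],
})
teachers = pd.DataFrame({
    "teacher_id": [1, 2, 3, 4, 5],
    "has_valid_student_link": [True, True, False, True, True],
})

pct_schools_ready = (schools["coordinator_in_place"]
                     & schools["coordinator_trained"]).mean() * 100
pct_teachers_linked = teachers["has_valid_student_link"].mean() * 100

print(f"% schools with coordinator in place & trained: {pct_schools_ready:.0f}%")
print(f"% teachers with a valid teacher-student link: {pct_teachers_linked:.0f}%")
```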
Implementation Measures – Activities &
              Outcomes
• Qualitative Examples
   – Educators have access to clear & complete description of
     program via multiple channels
   – Performance measures actually used to evaluate
     teachers and programs were the same as in the design
   – Process in place & used to correct performance
     measurement or mentoring problems
   – Alignment of program with potentially
     competing/conflicting initiatives
• Data sources: interviews, document review (grant
  applications, self-assessments, annual reports,
  meeting agendas & minutes)
Implementation Rubrics, Indices &
               Scorecards
• Combine quantitative output measures & qualitative
  assessments to summarize quality of implementation &
  implementation fidelity on key dimensions
• Provide a way to translate judgments into numbers for use
  in statistical analyses (e.g., is variation across schools
  related to intermediate or long range outcomes?)
• Examples
   – TAP Implementation Standards
   – Berends, Bodilly, & Kirby, 2002, Chapter 4 (RAND study of New
     American Schools)
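One way to act on the "translate judgments into numbers" point above is to average rubric ratings into a school-level fidelity index and check whether it tracks an intermediate outcome. The dimensions, scores, and outcome below are illustrative assumptions, not drawn from TAP or the RAND study:

```python
# Sketch: rubric ratings -> implementation fidelity index -> relation to an
# intermediate outcome. All names and values are illustrative.
import pandas as pd

ratings = pd.DataFrame({
    "school_id":        [101, 102, 103, 104, 105],
    "communication":    [3, 2, 4, 1, 3],   # rubric scores, e.g. 1-4
    "coordinator_role": [4, 2, 3, 2, 4],
    "data_systems":     [3, 3, 4, 1, 2],
    "educator_understanding": [0.7, 0.5, 0.8, 0.3, 0.6],  # survey-based
})

dimensions = ["communication", "coordinator_role", "data_systems"]
ratings["fidelity_index"] = ratings[dimensions].mean(axis=1)  # unweighted mean

# Is cross-school variation in fidelity related to the intermediate outcome?
r = ratings["fidelity_index"].corr(ratings["educator_understanding"])
print(ratings[["school_id", "fidelity_index"]])
print(f"Correlation of fidelity index with educator understanding: {r:.2f}")
```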
Measuring Intermediate Outcomes:
              Educator Reactions
• Priority Areas
  – Program understanding
  – Warranty costs
  – Performance-reward contingency
  – Effort-performance link
  – Fairness
Educator Reactions Surveys

•   Motivation Model Elements
•   Motivational Responses
•   See “Short Form Teacher Reaction Survey”
•   Other examples:
• http://www.performanceincentives.org/data/files/news/BooksNe
• http://www.performanceincentives.org/data/files/directory/Conf
Educator Reactions Interview

• Find out what higher education faculty,
  schools & teachers are doing in response to
  the program
• Uncover unexpected consequences
• Facilitate survey development
• Some examples of what we have learned
  via interviews
Is motivation taking place?
Interview Questions
  – How has the program redesign affected your teaching?
  – How has the program affected how your school is run?
  – Has the introduction of a warranty changed how you focus
    your efforts?
  – Have you done anything differently in order to improve
    your chances of receiving the incentive?
  – Has the warranty motivated you to work harder or change
    the focus of your efforts?
Measuring Intermediate Outcomes:
             Behavior Change
•   Surveys
•   Focus groups
•   Interviews
•   Observations
•   Unobtrusive Measures
Measuring Intermediate Outcomes:
             Behavior Change
Surveys of instructional practice are attractive, but hard to do
  right.
• Need a theory of instruction
• Content specificity
• Response Problems
   – Social desirability/demand effects
   – Memory
   – Can most educators recognize depth of practice change?
• Can you build a confirmatory survey based on interviews or
  focus groups?
Potential Interview Questions
• Principals/other school leaders
   – Have teachers been doing anything differently in the
     classroom since the program began?
• Teachers
   – Have your school administrators been doing anything
     differently since the program began?
• Central office administrators
   – Have the school administrators you work with been doing
     anything differently since the program began?
Unobtrusive Measures

•   Change in PD demand (volume, content)
•   Curriculum material purchases
•   Scheduling changes/time allocations
•   Staffing changes
•   Staff meeting agendas
•   School improvement plans
Your Turn

• Work on developing process & intermediate
  outcome measures that fit your program


• Share innovative ways you have used to
  measure inputs, processes, & intermediate
  outcomes
