Chi Yan Lam, MEd
CES 2013
Insights on Using Developmental Evaluation for Innovating:
A Case Study on the Co-Creation of an Innovative Program
@chiyanlam
June 11, 2013
Assessment and Evaluation Group, Queen’s University
Slides available at www.chiyanlam.com
Acknowledgements
“The significant problems we have cannot be solved at the same level of thinking with which we created them.”
http://yareah.com/wp-content/uploads/2012/04/einstein.jpg
Impersonal
Hard to Focus
Passive
✓ Learn from peers
✓ Reflect on prior experiences
✓ Meaning-making
✓ Active construction of knowledge
Dilemma
• Barriers to Teaching and Learning: $, time, space.
• PRACTICUM = out of sight, out of touch.
• Instructors became interested in integrating Web technologies into Teacher Education to open up new possibilities.
• The thinking was that assessment learning requires learners to actively engage with peers and challenge their own experiences and conceptions of assessment.
So what happened...?
• 22 teacher candidates participated in a hybrid, blended-learning pilot. They tweeted about their own experiences of trying to put into practice contemporary notions of assessment.
• Guided by the script: “Think Tweet Share”
• Developmental evaluation guided this exploration among the instructors, evaluator, and teacher candidates as a collective in this participatory learning experience.
• DE became integrated; the program became agile and responsive by design.
Research Purpose
To learn about the capacity of developmental evaluation to support innovation development
(from nothing to something)
Research Questions
1. To what extent does the Assessment Pilot Initiative qualify as a developmental evaluation?
2. What contribution does developmental evaluation make to enable and promote program development?
3. To what extent does developmental evaluation address the needs of the developers in ways that inform program development?
4. What insights, if any, can be drawn from this development about the roles and the responsibilities of the developmental evaluator?
Literature Review
Developmental Evaluation in 1994
• collaborative, long-term partnership
• purpose: program development
• observation: clients who eschew clear, specific, measurable goals
Developmental Evaluation in 2011
• takes on a responsive, collaborative, adaptive orientation to evaluation
• complexity concepts
• systems thinking
• social innovation
Developmental Evaluation
DE supports innovation development to guide adaptation to emergent and dynamic realities in complex environments.
DE brings to innovation and adaptation the processes of:
• asking evaluative questions
• applying evaluation logic
• gathering and reporting evaluation data
to inform and support project, program, product, and/or organizational development in real time. Thus, feedback is rapid.
The evaluator works collaboratively with social innovators to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change, and development.
Primary functions of the evaluator:
• elucidate the innovation and adaptation processes
• track their implications and results
• facilitate ongoing, real-time, data-based decision-making in the developmental process
(Patton, 2011)
Improvement vs. Development
Developmental Evaluation is reality testing.
1. Social Innovation
• SI aspires to change and transform social realities (Westley, Zimmerman, & Patton, 2006)
• SI is about generating “novel solutions to social problems” that are “more effective, efficient, sustainable, or just than existing solutions and for which the value created accrues primarily to society as a whole rather than private individuals” (Phills, Deiglmeier, & Miller, 2008)
2. Complexity Thinking
• Situational analysis
• Complexity concepts
Sensitizing frameworks that attune the evaluator to certain things.
Simple
• predictable
• replicable
• known
• causal if-then models

Complicated
• predictable
• replicable
• known
• many variables/parts working in tandem in sequence
• requires expertise/training
• causal if-then models

Complex
• unpredictable
• difficult to replicate
• unknown
• many interacting variables/parts
• systems thinking?
• complex dynamics?

Chaos
(Westley, Zimmerman, & Patton, 2008)
http://s3.frank.itlab.us/photo-essays/small/apr_05_2474_plane_moon.jpg
Complexity Concepts
• understanding dynamical behaviour of systems
• description of behaviour over time
• metaphors for describing change
• how things change
• NOT predictive, not explanatory
• (existence of some underlying principles; rules-driven behaviour)
Complexity Concepts
• Nonlinearity (butterfly flaps its wings, black swan): cause and effect
• Emergence: new behaviour emerges from interaction... can’t really predetermine indicators
• Adaptation: systems respond and adapt to each other and to environments
• Uncertainty: processes and outcomes are unpredictable, uncontrollable, and unknowable in advance
• Dynamical: interactions within, between, and among subsystems change in unpredictable ways
• Co-evolution: change in response to adaptation (growing old together)
3. Systems Thinking
• Pays attention to the influences and relationships between systems in reference to the whole
• a system is a dynamic, complex, structured functional unit
• there are flows and exchanges between sub-systems
• systems are situated within a particular context
Complex Adaptive Dynamic Systems
Practicing DE
• Adaptive to context, agile in methods, responsive to needs
• evaluative thinking - critical thinking
• bricoleur
• “purpose-and-relationship-driven not [research] method driven” (Patton, 2011, p. 288)
Five Purposes and Uses of DE
1. Ongoing development in adapting a program, strategy, policy, etc.
2. Adapting effective principles to a local context
3. Developing a rapid response
4. Preformative development of a potentially broad-impact, scalable innovation
5. Major systems change and cross-scale developmental evaluation
(Patton, 2011, p. 194)
Method & Methodology
• Questions drive method (Greene, 2007; Teddlie & Tashakkori, 2009)
• Qualitative case study
• understanding the intricacies of the phenomenon and the context
• A case is a “specific, unique, bounded system” (Stake, 2005, p. 436)
• Understanding the system’s activity, and its function and interactions
• Qualitative research to describe, understand, and infer meaning
Data Sources
Three pillars of data:
1. Program development records
2. Development artifacts
3. Interviews with clients on the significance of various DE episodes
Data Analysis
1. Reconstructing evidentiary base
2. Identifying developmental episodes
3. Coding for developmental moments
4. Time-series analysis
How the innovation came to be...
Key Developmental Episodes
• Ep 1: Evolving understanding in using social media for professional learning
• Ep 2: Explicating values through Appreciative Inquiry for program development
• Ep 3: Enhancing collaboration through structured communication
• Ep 4: Program development through the use of evaluative data
“Again, you can’t connect the dots looking forward; you can only connect them looking backwards.” - Steve Jobs
(Wicked) Uncertainty
• uncertain about how to proceed
• uncertain about what direction to proceed in (given many choices)
• uncertain about the effects and outcomes: how teacher candidates would respond to the intervention
• the more questions we answered, the more questions we raised
• Typical of DE:
• Clear, Measurable, and Specific Outcomes
• Use of planning frameworks
• Traditional evaluation cycles wouldn’t work
How the innovation came to be...
• Reframing what constituted “data”
• not intentional, but an adaptive response to emergent needs:
• informational needs concerning development; collected, analyzed, interpreted
• relevant theories, concepts, ideas; introduced to catalyze thinking. Led to learning and un-learning.
Major Findings
RQ1: To what extent does API qualify as a developmental evaluation?
1. Preformative development of a potentially broad-impact, scalable innovation
2. Patton: Did something get developed? (Improvement vs. development vs. innovation)
✔✔✗
RQ2: What contribution does DE make to enable and promote program development?
1. Lent a data-informed process to innovation
2. Implication: responsiveness
• the program-in-action became adaptive to the emergent needs of users
3. Consequence: resolving uncertainty
RQ3: To what extent does DE address the needs of developers in ways that inform program development?
1. Definition - defining the “problem”
2. Delineation - narrowing down the problem space
3. Collaboration - collaboration processes; drawing on collective strength and contributions
4. Prototyping - integration and synthesis of ideas to ready a program for implementation
5. Illumination - iterative learning and adaptive development
6. Evaluation - formal evaluation processes to reality-test
Implications for Evaluation
• One of the first documented case studies of developmental evaluation
• Contributes to understanding, analyzing, and reporting development as a process
• Delineates the kinds of roles and responsibilities that promote development
• The notion of design emerges from this study
Implications for Theory
• Program as co-created
• Attending to the “theory” of the program
• DE as a way to drive the innovating process
• Six foci of development
• Designing programs?
Design and Design Thinking
Design + Design Thinking
“Design is the systematic exploration into the complexity of options (in program values, assumptions, output, impact, and technologies) and decision-making processes that results in purposeful decisions about the features and components of a program-in-development that is informed by the best conception of the complexity surrounding a social need.
Design is dependent on the existence and validity of highly situated and contextualized knowledge about the realities of stakeholders at a site of innovation. The design process fits potential technologies, ideas, and concepts to reconfigure the social realities. This results in the emergence of a program that is adaptive and responsive to the needs of program users.”
(Lam, 2011, pp. 137-138)
Implications for Evaluation Practice
1. Manager
2. Facilitator of learning
3. Evaluator
4. Innovation thinker
Limitations
• Contextually bound, so not generalizable
• but it does add knowledge to the field
• The data of the study are only as good as the data collected from the evaluation
• better if I had captured the program-in-action
• Analysis of the outcomes of API could help strengthen the case study
• but not necessary to achieving the research foci
• Cross-case analysis would be a better method for generating understanding
Thank You!
Let’s Connect.
@chiyanlam
chi.lam@QueensU.ca
www.chiyanlam.com