Guerrilla 
Evaluation 
Closing the Feedback Loop 
Julie Dirksen – Usable Learning 
(c) Usable Learning 2014
Houston, we have a problem… 
Learning and development has a problem, and 
elearning in particular REALLY has a 
problem.
10,000 Hour Rule 
Deliberate Practice 
requires frequent, 
often expert, 
feedback.
My favorite instance...
It’s not just the time…
How many of you regularly get to 
see people use your learning? 
So how do you know if it works?
Typical Evaluation Measures 
Kirkpatrick's Levels 
1. Reaction (participants' opinions) 
2. Learning (pre/post test) 
3. Behavior (measurable behavior change) 
4. Results/ROI (return on investment)
Issues with Typical Measures 
• Levels 1 & 2 are not meaningful 
• Levels 3 & 4 are difficult and costly 
o Require access to the full target audience 
o Measuring behaviors requires extensive and costly 
observation 
o Difficult to implement without pre-existing 
organizational performance metrics in place 
o Difficult to attribute due to confounding variables
The Evaluation Venn Diagram 
• Enough budget or resources to measure 
• Enough control over the environment 
• Good methods to evaluate 
All too often, 
they don’t 
overlap at all.
ROI Calculation 
https://twitter.com/JaneBozarth/status/337733805253206016
We are measuring what we can 
control 
• Seat Time 
• # of learning objects 
• # of people trained 
• Completion Status 
• Pre/post scores 
Why don’t we 
just weigh 
them? 
The Inestimable Gloria Gery
Streetlamp Effect 
So, there’s this story…
It’s kind of like this…
So, what can we do about this? 
• Visible 
• Desirable 
• Feasible 
What is in this 
intersection?
Guerrilla Evaluation 
• A quicker and less expensive method to ensure 
a feedback loop that can be used to assess and 
improve the training intervention 
• Not intended to be a full measure of efficacy 
• Qualitative measures of: 
o Retention of information 
o Attitude 
o Anecdotal or Observable behavior change for a small sample 
size
Based on Nielsen's Guerrilla HCI 
In 1994, Jakob Nielsen wrote a highly influential article called Guerrilla HCI: 
Using Discount Usability Engineering to Penetrate the Intimidation Barrier 
The article addressed the reasons software development teams rarely did 
usability research to improve the design of software interfaces. 
Studies showed that 
qualitative feedback quickly 
became repetitive after 5-6 
users, and that working 
with a small sample could 
provide meaningful design 
feedback. 
http://www.nngroup.com/articles/guerrilla-hci/
Right vs. Better 
https://twitter.com/karlfast/status/223825451079057408
It’s like Traditional PM vs Agile 
[Image: a traditional project plan vs. a kanban board with To Do / Doing / Done columns]
“By putting the most serious planning at the beginning, 
with subsequent work derived from the plan, the waterfall 
method amounts to a pledge by all parties not to learn 
anything while doing the actual work.” 
- Clay Shirky
Keep the cycles short 
Why feedback 
is like weather 
prediction
Formative - User Testing 
Standard Usability Testing 
The first part of the evaluation process is 
standard usability testing that involves 
watching end users interact with the 
software, followed by a short interview. 
Typical evaluation measures such as a 
pre/post test could be incorporated here.
Summative - Follow-up Interview 
• Can be used in conjunction with other 
evaluation measures 
• 30-45 minute follow-up interviews that occur 
4-6 weeks after the training intervention 
• Small sample group (~6 users per audience) 
• Structured interview questions
Structured Interview Format 
Structured interview questions relating to: 
● Learner impressions/feedback 
● Most memorable elements 
● Small number of retention questions related to 
key learning objectives 
● Anecdotal usage of the material (How have 
they applied the ideas from the training?)
Brinkerhoff Success Case 
“Performance results can’t be achieved by training alone; 
therefore training should not be the object of evaluation” 
• Part 1: Survey to determine who was successful and 
who was not 
• Part 2: In-depth interviews with a selection of successful 
and unsuccessful users 
Find Out Quickly What’s Working and What’s Not
Cohort Analysis 
• Follow smaller groups through level 3 
analysis 
[Chart: Cohort 1 and Cohort 2 tracked on a 0-100 metric across Week 1 through Week 4; a rough plotting sketch follows]
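Purely as an illustration (not from the original deck): a minimal Python sketch of the kind of cohort chart above, using made-up weekly numbers for two hypothetical cohorts to show how a Level 3 behavior metric might be followed week over week.

```python
# Illustrative sketch only: the weekly numbers are invented to mirror the slide's chart.
import matplotlib.pyplot as plt

weeks = ["Week 1", "Week 2", "Week 3", "Week 4"]
# Hypothetical Level 3 metric, e.g. % of observed tasks performed correctly, per cohort.
cohort_1 = [40, 55, 70, 85]
cohort_2 = [35, 45, 50, 60]

plt.plot(weeks, cohort_1, marker="o", label="Cohort 1")
plt.plot(weeks, cohort_2, marker="o", label="Cohort 2")
plt.ylim(0, 100)
plt.ylabel("Observed behavior metric (%)")
plt.title("Cohort comparison over four weeks")
plt.legend()
plt.show()
```

Following a small trained cohort alongside an untrained (or later-trained) cohort is one way to get a rough Level 3 signal without observing the full audience.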
Signaling 
Ask the magic 
question: 
If you woke up 
tomorrow and it was 
all perfect, how would 
you know?
Look for data 
• xAPI 
• Google Analytics
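As a rough sketch of what "look for data" can mean with xAPI (an illustrative example added here, not part of the deck): pull "completed" statements for one course from a Learning Record Store via the standard xAPI statements API. The LRS endpoint, credentials, and activity id below are placeholders.

```python
# Minimal sketch: query an LRS for "completed" statements via the xAPI REST API.
# LRS_ENDPOINT, AUTH, and the activity id are hypothetical placeholders.
import requests

LRS_ENDPOINT = "https://lrs.example.com/xapi"
AUTH = ("lrs_user", "lrs_password")

params = {
    "verb": "http://adlnet.gov/expapi/verbs/completed",        # standard ADL verb IRI
    "activity": "https://example.com/courses/guerrilla-eval",  # hypothetical course id
    "since": "2014-01-01T00:00:00Z",
    "limit": 100,
}

resp = requests.get(
    f"{LRS_ENDPOINT}/statements",
    params=params,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
resp.raise_for_status()

# Each statement carries actor, verb, and object; tally completions per learner.
completions = {}
for stmt in resp.json()["statements"]:
    who = stmt["actor"].get("mbox") or stmt["actor"].get("name", "unknown")
    completions[who] = completions.get(who, 0) + 1

print(completions)
```

Google Analytics can play a similar role for web-delivered material, for example checking whether a job aid or reference page actually gets visited in the weeks after training.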
What do you think? 
• With 2-3 people around you, make 
a list of quick and dirty evaluation 
options. 
• As soon as you think of one, come 
up with another one as quickly as 
possible.
Questions? 
• Thanks for coming 
• Contact: 
o Julie Dirksen 
o julie@usablelearning.com 
o http://usablelearning.com 
o Twitter: usablelearning
