User testing and evaluation: why, how and when to do it
Appleby Magna Centre, 11 May 2009
Martin Bazley, Martin Bazley & Associates
www.martinbazley.com
Intro: Martin Bazley
Consultancy / websites / training / user testing – ICT4Learning.com (10+ yrs)
Chair of E-Learning Group for Museums
Previously:
- E-Learning Officer, MLA South East (3 yrs)
- Science Museum, London, Internet Projects (7 yrs)
- Taught science in secondary schools (8 yrs)
Why evaluate websites?
Why do evaluation and user testing? Isn't it really expensive and time-consuming?
- Save money – avoid substantial, hurried redevelopment later in the project
- Audience feedback improves the resource in various ways – new activity ideas, etc.
- Demonstrate the involvement of key stakeholders throughout the project
Making websites effective: 3 key success factors
- Understanding the audience
- Learning experiences and learning outcomes – right for the audience and clearly stated
- Evaluation – especially in the classroom or at home (observe in the 'natural habitat' wherever possible…)
Who for, what for…
- Who for? (audience) Need to be clear from the start, e.g. 'for teachers of Yr 5/6 in the local area with whiteboards'
- What 'real-world' outcomes? (learning outcomes) What will they learn or do as a result? e.g. plan a visit to the museum, learn that Romans wore funny clothes, discover that they enjoy using a digital camera…
- How will they use it? (learning experiences) What do they actually do with the site? e.g. work online or need to print it? In pairs or alone? With or without teacher help?
- Where, when and why will they use it? Context is important
Website evaluation and testing
Need to think ahead a bit:
- What are you trying to find out?
- How do you intend to test it? Why?
- What will you do as a result?
The Why? should drive this process
Test early
Testing one user early in the project… is better than testing 50 near the end
When to evaluate or test, and why
- Before funding approval – project planning
- Post-funding – project development
- Post-project – summative evaluation
Testing is an iterative process
Testing isn't something you do once:
make something => test it => refine it => test it again
Before funding – project planning
*Evaluation of other websites (research)
- Who for? What for? How will they use it? etc.
- Awareness raising: issues, opportunities
- Contributes to market research – possible elements, graphic feel, etc.
*Concept testing (focus group)
- Check the idea makes sense with the audience
- Reshape the project based on user feedback
Post-funding – project development
*Concept testing (focus group) – refine project outcomes based on feedback from intended users
*Refine website structure (focus group) – does it work for users?
*Evaluate initial look and feel (one-to-one tasks) – graphics, navigation, etc.
Post-funding – project development 2
*Full evaluation of a draft working version
- Usability AND content: do the activities work, how engaging is it, what else could be offered, etc.
- Observation of actual use of the website by intended users, using it for the intended purpose, in the intended context – classroom, workplace, library, home, etc.
Video clip: Moving Here – key ideas, not lesson plans
Post-funding – project development 3
*Acceptance testing of the 'finished' website
- Last-minute check; minor corrections only
- Often offered by web developers
*Summative evaluation
- Report for funders, etc.
- Learn lessons at project level for next time
Two usability testing techniques
- 'Get it' testing – do they understand the purpose, how it works, etc.?
- Key task testing – ask the user to do something, then watch how well they do (a logging sketch follows below)
Ideally, do a bit of each, in that order
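Key task testing needs little more than a task list, a timer and a place to record what happened. The sketch below is a hypothetical observer-side helper, not part of the original deck: the example tasks, the CSV columns and the file name are all invented for illustration.

```python
# Minimal key-task testing logger - a hypothetical sketch.
# The tasks, CSV columns and file name are illustrative assumptions.
import csv
import time

# Tasks the intended users will attempt (invented examples)
TASKS = [
    "Find the museum's opening hours",
    "Locate an activity suitable for a Year 5/6 class",
]

def run_session(participant_id, outfile="key_task_results.csv"):
    """Time each task and record completion plus observer notes to CSV."""
    rows = []
    for task in TASKS:
        input(f"\nTASK: {task}\nPress Enter when the participant starts...")
        start = time.perf_counter()
        done = input("Completed? (y/n): ").strip().lower() == "y"
        elapsed = round(time.perf_counter() - start, 1)
        notes = input("Observer notes (what they did, where they hesitated): ")
        rows.append({"participant": participant_id, "task": task,
                     "completed": done, "seconds": elapsed, "notes": notes})
    with open(outfile, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        if f.tell() == 0:  # new file: write the header row first
            writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    run_session(input("Participant ID: "))
```

The point is the observation, not the tooling: where users hesitate usually tells you more than the timings themselves.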
 
User testing – who should do it?
The worst person to conduct (or interpret) user testing of your own site is… you!
- Beware of hearing what you want to hear…
- It is useful to have an external viewpoint
- The first 5 minutes in a genuine setting tells you 80% of what's wrong with the site, etc.
User testing – more info
User testing can be done cheaply – tips on how to do it are available (MLA SE guide): www.ICT4Learning.com/onlineguide
Strengths and weaknesses of different data gathering techniques
Data gathering techniques
- User testing – early in development and again near the end
- Online questionnaires – emailed to people or linked from the website
- Focus groups – best near the beginning of a project, or at the redevelopment stage
- Visitor surveys – link online and real visits
- Web stats – useful for long-term trends/events, etc.
Need to distinguish between:
- Diagnostics – making a project or service better
- Reporting – to funders, or for advocacy
Online questionnaires
(+) Once set up, they gather numerical and qualitative data with no further effort – given time, they can build up large datasets
(+) The datasets can be easily exported and manipulated, can be sampled at various times, and structured queries can yield useful results (see the sketch below)
(–) Respondents are self-selected, and this will skew results – best to compare with similar data from other sources, such as visitor surveys
(–) The number and nature of responses may depend on how the online questionnaire is displayed and promoted on the website
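To make the "export and query" point concrete, here is a minimal sketch using pandas. It is illustrative only: the file name and the submitted/audience/rating/comments columns are invented, and the date range is an arbitrary example.

```python
# Querying an exported questionnaire dataset - a hypothetical sketch.
# File name and columns (submitted, audience, rating, comments) are invented.
import pandas as pd

df = pd.read_csv("questionnaire_export.csv", parse_dates=["submitted"])

# Sample a particular period, e.g. responses gathered during one school term
term = df[(df["submitted"] >= "2009-01-05") & (df["submitted"] <= "2009-04-03")]

# A structured query: rating counts and averages broken down by audience type
print(term.groupby("audience")["rating"].agg(["count", "mean"]))

# Qualitative side: free-text comments mentioning a specific activity
mask = term["comments"].str.contains("digital camera", case=False, na=False)
print(term.loc[mask, ["audience", "comments"]])
```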
Focus groups
(+) Can explore specific issues in more depth, yielding rich feedback
(+) Possible to control participant composition to ensure it is representative
(–) Comparatively time-consuming (expensive) to organise and analyse
(–) Yield qualitative data only – small numbers mean numerical comparisons are unreliable
Visitor surveys
(+) Possible to control participant composition to ensure it is representative
(–) Comparatively time-consuming (expensive) to organise and analyse
(–) Responses can be affected by various factors, including the interviewer, the weather on the day, the day of the week, etc., reducing the validity of numerical comparisons between museums
Web stats
(+) Easy to gather data – you can decide what to do with it later
(+) The data generated are person-independent – it is the interpretation, rather than the data themselves, that is subjective. This means others can review the same data and verify or amend the initial conclusions
Web stats 2
(–) Different systems generate different data for the same web activity – for example, the number of unique visits measured via Google Analytics is generally lower than that derived from server log files (see the sketch after these slides)
(–) Metrics are complicated and require specialist knowledge to appreciate them fully
Web stats 3
(–) As the amount of off-website web activity increases (e.g. Web 2.0-style interactions), the validity of website stats decreases, especially for reporting purposes, but also for diagnostics
(–) Agreeing a common format for the presentation of data and analysis requires collaborative working to be meaningful
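Part of the reason log-derived figures run higher than Google Analytics is that logs record every request, including clients without JavaScript and many bots, whereas GA only counts browsers that execute its tracking script. The sketch below shows one common way of deriving a "unique visitors per day" figure from a combined-format server log; the file name is invented, and counting distinct (IP, user agent) pairs is just one imperfect proxy among several.

```python
# Deriving "unique visitors" from a server log - a hypothetical sketch.
# Assumes Apache/nginx "combined" log format; access.log is an invented name.
import re
from collections import defaultdict

LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<day>[^:]+):[^\]]+\] '
    r'"(?P<request>[^"]*)" \S+ \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def unique_visitors_per_day(path):
    """Count distinct (IP, user agent) pairs per day in a combined-format log."""
    seen = defaultdict(set)
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_LINE.match(line)
            if m:
                seen[m.group("day")].add((m.group("ip"), m.group("agent")))
    return {day: len(pairs) for day, pairs in seen.items()}

if __name__ == "__main__":
    for day, count in unique_visitors_per_day("access.log").items():
        print(day, count)
```

Even so, treat any single figure with caution – as the slide says, different systems will disagree about the same activity.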
Who for, what for…
- Who for? (audience) Need to be clear from the start, e.g. 'for teachers of Yr 5/6 in the local area with whiteboards'
- What 'real-world' outcomes? (learning outcomes) What will they learn or do as a result? e.g. plan a visit to the museum, learn that Romans wore funny clothes, discover that they enjoy using a digital camera…
- How will they use it? (learning experiences) What do they actually do with the site? e.g. work online or need to print it? In pairs or alone? With or without teacher help?
How can you ensure you get these right?
- Build questions into the planning process
- Evaluate/test regularly
- Get informal feedback whenever possible – and act on it
Who is it for? What are the real-world outcomes? How will they use it? Also: when, where and why?
More information
Martin Bazley
0780 3580 737
www.martinbazley.com
