USABILITY ANALYSIS
OF VIRTUAL LABS
ICALT 2018
Jessica Emory
Horace Mann School, USA
Venkatesh Choppella
IIIT Hyderabad
Mrityunjay Kumar
VLEAD, IIIT Hyderabad
About Me (Mrityunjay Kumar)
Agenda
• Context
• Approach
• Analysis of results
• Future work
CONTEXT
Background
Virtual Labs project
(MHRD initiative)
10 disciplines, 70 labs, 800 experiments, 500 workshops
Used more than 2 million times
Key resources for undergraduate engineering students in Tier-2/Tier-3 city colleges.
Example screenshots of existing labs
Example screenshots of existing labs
Motivation
Usability is a key prerequisite for
online content
The usability of these virtual labs was
suspect
Research Questions
Are these labs usable?
Do these labs aid learning?
Part of a broader research question – how does one
create 'useful' virtual labs at scale?
What is a Virtual Lab?
Hands-on activities online that
augment classroom lessons
Use of interactivity and
demonstration to promote
learning by doing
What is Usability?
Technical Usability
• Usability of user-interface elements
(e.g., efficiency of use)
Pedagogical Usability
• Usability of content that aids learning
(e.g., allowing self-pacing)
[5] J. Nielsen and R. Molich, “Heuristic evaluation of user interfaces,” Proc. ACM CHI, 1990.
[9] P. Nokelainen, “An empirical assessment of pedagogical usability criteria for digital learning material with elementary school students,” Educational Technology & Society, 2006.
APPROACH
Usability Heuristics that we used
T1: Visibility of system status
T2: Match between system and real-world
T3: Consistency
T4: Error prevention
T5: Recognition rather than recall
T6: Flexibility and efficiency of use
T7: Aesthetic and minimalist design
T8: Error diagnosis and recovery by users
T9: Help and Documentation
Nielsen’s (technical) usability heuristics
P1: Learner Control
P2: Learner Activity
P3: Collaborative Learning
P4: Goal Orientation
P5: Applicability
P6: Added Value
P7: Valuation of previous knowledge
P8: Flexibility
P9: Feedback
Nokelainen’s (pedagogical) usability heuristics
3 labs (~27 experiments) for analysis
Computer Programming
Mechanics of Machines
Cryptography
Approach
Converted the heuristics into a checklist
for virtual labs
Graded the labs on these items to award
an aggregated score for each heuristic.
Scoring
1. Break each heuristic into relevant checklist items
2. Assign a score (0-3) to each checklist item based on adherence
3. Apply statistical functions to the scores for each heuristic
4. Define 2 values: Score (mean) and Variation (standard deviation)
5. Categorize each value into Low and High for easier analysis
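As an illustration, a minimal sketch of this scoring pipeline in Python. The item scores and the 1.0 variation cutoff below are assumptions made for the example, not the study's exact data or threshold:

    from statistics import mean, stdev

    # Hypothetical per-item scores (0-3), one list per heuristic,
    # collected by grading each checklist item across the three labs.
    item_scores = {
        "T2: Match between system and real-world": [2, 1, 3, 0, 2, 1],
        "T4: Error prevention": [0, 0, 0, 0, 0, 0],
    }

    VARIATION_CUTOFF = 1.0  # assumed threshold separating Low from High variation

    for heuristic, scores in item_scores.items():
        score = mean(scores)       # "Score" = mean of the item scores
        variation = stdev(scores)  # "Variation" = standard deviation
        label = "High" if variation > VARIATION_CUTOFF else "Low"
        print(f"{heuristic}: score={score:.1f}, variation={label}")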
Technical Usability
Checklist
T1: Visibility of system status
T2: Match between system and real-world
T3: Consistency
T4: Error prevention
T5: Recognition rather than recall
T6: Flexibility and efficiency of use
T7: Aesthetic and minimalist design
T8: Error diagnosis and recovery by users
T9: Help and Documentation
When there is an error on a page, it is clearly shown (T1)
Terms in use are similar to what is used in the classroom (T2)
A glossary is provided and linked when appropriate (T2)
Experience is similar to other online experiences (T2)
Experiments are consistent across the lab (T3)
Links that take the user out of the course are clearly labeled (T4)
Learner can do experiments without reading instructions (T5)
Hints and messages are available at error-prone locations (T5)
Experiment is optimized for low-bandwidth usage (T6)
Call to action on each page is clearly visible and prominent (T7)
Each page element provides unique value and has a purpose (T7)
Every page brings critical content into focus and moves less critical content off-screen (T7)
All errors have clear messages and provide enough detail for learners to resolve them on their own (T8)
Quiz shows the correct answer when the learner is wrong (T8)
Context-sensitive help is available on all screens (T9)
Pedagogical Usability
Checklist
P1: Learner Control
P2: Learner Activity
P3: Collaborative Learning
P4: Goal Orientation
P5: Applicability
P6: Added Value
P7: Valuation of previous
knowledge
P8: Flexibility
P9: Feedback
Each experiment covers a manageable chunk of information (P1)
Each simulation is broken down into small chunks (P1)
Learner can set their own hypothesis (P2)
Learner can select their own input data (P2)
Multiple users can work together on a lab (P3)
Lab objective is clearly laid out, including a measure of achievement (P4)
Experiment allows students to set their own goals in pursuit of the overall objectives (P4)
Uses real-life examples in experiments (P5)
Builds on something the learner already knows before introducing new material (P5)
Creative use of multimedia to aid learning (P6)
Allows self-paced learning (P6)
Uses teaching techniques that are only possible on a computer (P6)
Provides some personalization based on what the student already knows (P7)
Provides multiple paths based on what the student already knows (P7)
Free navigation to any part of the experiment (P8)
Multiple assignments available to choose from (P8)
Multiple types of multimedia content so that students can choose what they like most (P9)
Frequent quizzes within the simulation (P9)
Immediate feedback (P9)
Score tracking and gamification (P9)
Peer feedback is available (P9)
ANALYSIS OF RESULTS
Technical Usability Score and Variation
Heuristic | Score | Variation
T1: Visibility of system status | 1.7 | High
T2: Match between system and real-world | 1.5 | High
T3: Consistency | 2.0 | Low
T4: Error prevention | 0.0 | Low
T5: Recognition rather than recall | 1.5 | High
T6: Flexibility and efficiency of use | 0.3 | Low
T7: Aesthetic and minimalist design | 1.0 | High
T8: Error diagnosis and recovery by users | 1.3 | High
T9: Help and Documentation | 0.0 | Low
Pedagogical Usability Score and Variation
Heuristic | Score | Variation
P1: Learner Control | 2.5 | Low
P2: Learner Activity | 1.8 | High
P3: Collaborative Learning | 0.0 | Low
P4: Goal Orientation | 1.0 | High
P5: Applicability | 1.2 | Low
P6: Added Value | 2.0 | High
P7: Valuation of previous knowledge | 0.0 | Low
P8: Flexibility | 1.0 | High
P9: Feedback | 0.5 | High
Score and variation analysis of heuristics
A 2×2 grid with Score on one axis and Variation on the other:
• Low score, high variation (very bad!)
• High score, high variation
• Low score, low variation
• High score, low variation (very good!)
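The quadrant assignment is mechanical once Score and Variation are known. A small sketch of the mapping, with an assumed score cutoff of 1.5 (not stated on the slides, though it is consistent with the tables above):

    def quadrant(score: float, variation: str, score_cutoff: float = 1.5) -> str:
        """Map a heuristic's (score, variation) pair onto the 2x2 grid.
        score is on the 0-3 scale; variation is the "Low"/"High" category."""
        score_label = "High" if score >= score_cutoff else "Low"
        return f"{score_label} score, {variation.lower()} variation"

    print(quadrant(0.0, "Low"))   # T4: Error prevention -> "Low score, low variation"
    print(quadrant(2.0, "Low"))   # T3: Consistency      -> "High score, low variation" (very good!)
    print(quadrant(1.0, "High"))  # T7: Aesthetic design -> "Low score, high variation" (very bad!)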
Score and variation analysis of heuristics
Low score, high variation
• Aesthetic and minimalist design
• Error diagnosis and recovery by users
• Goal orientation
• Flexibility
• Feedback
High score, high variation
• Visibility of system status
• Match between system and real-world
• Recognition rather than recall
• Learner Activity
• Added Value
Low score, low variation
• Error prevention
• Flexibility and efficiency of use
• Help and Documentation
• Collaborative Learning
• Applicability
• Valuation of previous knowledge
High score, low variation
• Consistency
• Learner Control
Hypotheses
Poor score means poor design guidelines
High variation means poor implementation guidelines
Observations and Conclusions
• More than 80% of the heuristics fared poorly (16/18) → there is a lack of overall focus on usability
• More than 60% of the heuristics have low scores (11/18) → design guidelines are missing or incomplete
• More than 50% of the heuristics have high variation (10/18) → implementation guidelines are missing or incomplete
FUTURE WORK
Evaluate other labs and run
usability tests with students
to test the checklist
Make the checklists robust
Incorporate usability in design
Explore meta-models of a
virtual lab to achieve
usability by design
THANK YOU
Jessica Emory
Horace Mann School, USA
Venkatesh Choppella
IIIT Hyderabad
Mrityunjay Kumar
VLEAD, IIIT Hyderabad
Usability Analysis of Virtual Labs
