Design Process:
Evaluating Interactive
Systems
Preeti Mishra
Course Instructor
What is Evaluation?
 Evaluation is defined as “to examine and judge
carefully.”
 In order to examine and judge, we need criteria
against which to base our assessment.
Why Evaluate
 In HCI we evaluate interfaces and systems to:
 Determine how usable they are for different
user groups
 Identify good and bad features to inform future design
 Compare design choices to assist us in making
decisions
 Observe the effects of specific interfaces on users
Criteria for Evaluation
 Expert Analysis
 User-Based Evaluation
 Model-Based Evaluation
Expert Analysis
Heuristic Evaluation
Cognitive Walkthrough
Review-based Evaluation.
Expert Analysis
 Expert analysis: designer or HCI expert assesses a
design based on known/standard cognitive principles or
empirical results.
 Expert analysis methods can be used at any stage in the life
cycle.
 Expert analysis methods are relatively cheap.
 Expert analysis methods, however, do not assess the actual
use of the system.
 Examples of expert analysis methods: Heuristic
Evaluation (HE), Cognitive Walkthrough (CW), Review-
based Evaluation.
Heuristic Based
Evaluation Technique
Heuristic (“to discover”) pertains to the process of gaining knowledge
or some desired result by intelligent guesswork rather than
by following some pre-established formula.
Introduction
 Heuristic Evaluation (HE) was proposed by Nielsen and
Molich. (Nielsen’s ten heuristics were discussed
earlier.)
 In HE, experts scrutinize the interface and its elements
against established usability heuristics [another previous
tutorial].
 The experts should have some background knowledge or
experience in HCI design and usability evaluation.
The Process..
 Three to five experts are considered sufficient to detect most of
the usability problems.
 The enlisted experts are provided with proper roles (and
sometimes scenarios to use) to support them when interacting
with the system/prototype under evaluation.
 They then evaluate the system/prototype individually, to
ensure an independent and unbiased evaluation by each expert.
 They assess the user interface as a whole and also the
individual user interface elements. The assessment is
performed with reference to a set of established usability
principles.
 When all the experts have completed the assessment, they
come together to compare and aggregate their
findings.
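The aggregation step above can be sketched in code. A minimal, hypothetical example; the evaluator names, interface elements, heuristic labels, and the 0-4 severity scale are illustrative assumptions, not part of the slides:

```python
# Hypothetical sketch: merging individual heuristic-evaluation findings.
from collections import defaultdict

findings = [
    # (evaluator, interface element, heuristic violated, severity 0-4)
    ("E1", "search box", "visibility of system status", 3),
    ("E2", "search box", "visibility of system status", 2),
    ("E2", "delete dialog", "error prevention", 4),
    ("E3", "delete dialog", "error prevention", 3),
]

# Group the independent findings: one problem per (element, heuristic).
merged = defaultdict(list)
for evaluator, element, heuristic, severity in findings:
    merged[(element, heuristic)].append(severity)

# Aggregate: mean severity, and how many experts found each problem.
for (element, heuristic), severities in sorted(merged.items()):
    print(f"{element}: {heuristic} | "
          f"found by {len(severities)} expert(s), "
          f"mean severity {sum(severities) / len(severities):.1f}")
```

Note that the grouping is what makes the final meeting cheap: duplicated findings collapse into one problem, and the number of independent discoverers hints at how severe the problem is.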
Cognitive
Walkthrough
Relating to the mental processes of perception, memory,
judgment, and reasoning, as contrasted with emotional
and volitional processes.
Introduction
 Cognitive Walkthrough (CW) was proposed by Polson et al.
 CW evaluates how well the design supports the user in
learning the task to be performed [primarily through
exploration, i.e., hands-on use].
 CW is usually performed by an expert in cognitive psychology.
The Process..
 The expert ‘walks through’ the design [i.e. steps through
each step of some known/representative task] to identify
potential problems.
 Four requirements for performing the CW:
1. specification or prototype of the system
2. description of the task the user is to perform
3. complete, written list of actions constituting the task
4. description of the user (including the level of experience and
knowledge)
The Process..
 With the foregoing information, the evaluator steps through
each of the actions, trying to answer the following four
questions:
1. Is the effect of the action the same as the user's goal at that
point? [The action's effect should be what the user intends.]
2. Will users see that the action is available when they want it?
[Visibility of the action at that time.]
3. Once users have found the correct action, will they recognize
that it is the one they need? [Clear, effective representation of
the action.]
4. After the action is taken, will users understand the feedback they
get? [Effective confirmation that the action has been taken.]
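The four questions can be applied systematically to each action. A minimal sketch; the task, actions, and answers are invented examples, and only the four questions come from the slides:

```python
# Illustrative sketch: recording cognitive-walkthrough answers per action.
QUESTIONS = [
    "Is the effect of the action the same as the user's goal?",
    "Will users see that the action is available?",
    "Will users recognize it as the one they need?",
    "Will users understand the feedback after the action?",
]

def walkthrough(actions):
    """Collect 'no' answers; each one becomes an entry for a
    usability problem report."""
    problems = []
    for action, answers in actions:
        for question, (ok, note) in zip(QUESTIONS, answers):
            if not ok:
                problems.append((action, question, note))
    return problems

# Example: two actions from a hypothetical "set an alarm" task.
actions = [
    ("press Mode button", [(True, ""), (False, "unlabelled button"),
                           (True, ""), (True, "")]),
    ("press Set button",  [(True, ""), (True, ""),
                           (True, ""), (False, "no confirmation beep")]),
]
for action, question, note in walkthrough(actions):
    print(f"PROBLEM at '{action}': {question} -> {note}")
```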
The Process..
 Forms are used to guide the analysis, e.g.:
 cover form [for the four requirements above, date, time,
evaluators of the CW],
 answer form [for answering the four questions above],
 usability problem report [for describing any negative
answers/problems, severity of the problem e.g. frequency of
occurrence and seriousness of the problem, date, time,
evaluators].
Review Based
Evaluation
Introduction
 Experimental results and empirical evidence from
the literature [e.g., from psychology, HCI, etc] can
be used to support or refute parts of design.
The Process..
 It is expensive to repeat experiments continually and
therefore a review of relevant literature can save resources
(e.g., effort, time, finances, etc).
 However, care should be taken to ensure the results are
transferable to the new design
 [e.g., note the design under consideration, the user audience,
the assumptions made, etc.].
Model Based
Evaluation
How to Proceed!
 Cognitive models can be used to filter design
options
 e.g., the GOMS (Goals, Operators, Methods, and
Selection rules) model can be used to predict user
performance with a user interface; the Keystroke-Level
Model (KLM) can be used to predict performance for
low-level tasks
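A Keystroke-Level Model prediction can be sketched in a few lines. The operator times below are the commonly cited averages from Card, Moran, and Newell (K in particular varies with typing skill), and the example operator sequence is invented:

```python
# Keystroke-Level Model sketch: sum fixed operator times to predict
# expert execution time for a low-level task.
KLM_TIMES = {
    "K": 0.2,   # keystroke or button press (skilled typist average)
    "P": 1.1,   # point with mouse to a target on screen
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation for an action
}

def klm_predict(operators):
    """Predicted execution time (seconds) for a sequence of operators."""
    return sum(KLM_TIMES[op] for op in operators)

# e.g., move hand to mouse, think, point at a field, then type "42":
sequence = ["H", "M", "P", "K", "K", "K"]
print(f"predicted time: {klm_predict(sequence):.2f} s")
```

This is the appeal of model-based evaluation: design alternatives can be compared on predicted times without recruiting any users.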
User Based
Evaluation
 User-based evaluation is evaluation through user
participation, i.e., evaluation that involves the people for
whom the system is intended: the users.
The Process
 User-based evaluation techniques include:
 experimental methods,
 observational methods,
 query techniques (e.g., questionnaires and
interviews),
 physiological monitoring methods (e.g., eye tracking,
measuring skin conductance, measuring heart rate).
 User-based methods can be conducted in the
laboratory and/or in the field
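Query-technique data such as questionnaire ratings can be summarized very simply. A hypothetical sketch; the items, respondents, and 1-5 Likert scale are invented and this is not a standardized instrument:

```python
# Illustrative sketch: per-item means for a small Likert questionnaire
# (1 = strongly disagree ... 5 = strongly agree).
items = [
    "I found the system easy to learn.",
    "I could complete my task without help.",
    "The error messages were clear.",
]

# One row of 1-5 ratings per respondent, in item order.
responses = [
    [4, 5, 3],
    [5, 4, 2],
    [3, 4, 4],
]

per_item_mean = [
    sum(r[i] for r in responses) / len(responses)
    for i in range(len(items))
]
for item, mean in zip(items, per_item_mean):
    print(f"{mean:.2f}  {item}")
```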
Using the Laboratory
 Advantages:
 Specialist equipment available.
 Uninterrupted environment.
 Disadvantages:
 Lack of context.
 Difficult to observe several users cooperating.
 Appropriate:
 If system usage location is dangerous, remote or
impractical.
 For very constrained single-user tasks.
Field or Working Environment
 Advantages:
 Natural environment.
 Context retained (though observation may alter it).
 Longitudinal studies possible.
 Disadvantages:
 Field challenges e.g., distractions, interruptions,
movements, danger, noise.
 Appropriate:
 Where context is crucial
Query Technique
Asking questions
Observational
Methods
Introduction
 Observational methods
 think aloud,
 cooperative evaluation,
 protocol analysis,
 post-task walkthroughs
 Physiological Monitoring
Think aloud
 User is observed performing task.
 User is asked to describe what s/he is doing and why,
what s/he thinks is happening, etc.
 Advantages:
 Simplicity - requires little expertise.
 Can provide useful insight.
 Can show how system is actually used.
 Disadvantages:
 Subjective [really depends on the user].
 Selective [out of many things, the user may choose
what to describe].
 Act of describing may alter task performance.
Cooperative Evaluation
 Variation on think aloud.
 User collaborates in evaluation.
 Both user and evaluator can ask each other
questions throughout.
 Additional advantages:
 Less constrained and easier to use.
 User is encouraged to criticize system.
 Clarification possible.
Protocol Analysis
 Paper and pencil: cheap, limited to writing speed.
 Audio: good for think aloud, difficult to record sufficient
information to identify exact actions in later analysis, difficult to
match with other protocols ('synchronization').
 Video: accurate and realistic, needs special equipment, obtrusive.
 Computer logging: automatic and unobtrusive, large amounts of
data difficult to analyze.
 User notebooks: coarse and subjective, useful insights, good for
longitudinal studies.
 Mixed use in practice.
 Audio/video transcription difficult and requires skill.
 Some automatic support tools available e.g., EVA
(Experimental Video Annotator), Observer Pro (from
Noldus), Workplace project (Xerox PARC), etc.
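Computer logging, listed above, can be as simple as timestamped event records collected automatically while the user works. A minimal sketch; the class and event names are invented:

```python
# Minimal computer-logging sketch: automatic, unobtrusive capture of
# timestamped interaction events for later analysis.
import time

class InteractionLogger:
    def __init__(self):
        self.events = []  # (timestamp, event, detail)

    def log(self, event, detail=""):
        self.events.append((time.time(), event, detail))

    def durations(self):
        """Seconds elapsed between consecutive logged events."""
        times = [t for t, _, _ in self.events]
        return [b - a for a, b in zip(times, times[1:])]

logger = InteractionLogger()
logger.log("open_dialog")
logger.log("click", "OK button")
logger.log("close_dialog")
print(len(logger.events), "events logged")
```

The trade-off noted in the slide shows up immediately: logging like this is cheap to collect but produces large volumes of low-level data that still need interpretation.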
Post-task Walkthrough
 The transcript is played back to the participant for comment, i.e.,
the user reflects on actions after the event.
 Used to fill in intention, i.e., reasons for actions performed
and alternatives considered.
 It is also necessary where think aloud is not possible.
 Advantages:
 Analyst has time to focus on relevant incidents.
 Avoids excessive interruption of task.
 Disadvantages:
 Lack of freshness.
 May be post-hoc interpretation of events.
Physiological monitoring
methods
 [e.g., eye tracking, measuring skin conductance, measuring
heart rate].
 Eye-tracking
 Head- or desk-mounted equipment tracks the position of the eye.
 Eye movement reflects the amount of cognitive processing a display
requires.
 Measurements include: fixations, scan paths, etc. For instance:
 number of fixations.
 duration of fixation.
 scan paths: moving straight to a target with a short fixation at
the target is optimal.
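The fixation metrics above can be computed directly from pre-detected fixations. An illustrative sketch; the coordinates (pixels) and durations (milliseconds) are made up:

```python
# Illustrative computation of fixation count, mean fixation duration,
# and scan-path length from a list of detected fixations.
import math

fixations = [  # (x, y, duration_ms)
    (100, 200, 250),
    (105, 198, 400),
    (400, 220, 180),
]

count = len(fixations)
mean_duration = sum(d for _, _, d in fixations) / count

# Scan-path length: total distance travelled between fixations; a short
# path straight to the target suggests an efficient layout.
path_length = sum(
    math.dist((x1, y1), (x2, y2))
    for (x1, y1, _), (x2, y2, _) in zip(fixations, fixations[1:])
)
print(count, round(mean_duration, 1), round(path_length, 1))
```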
Psychological Measurements
 Emotional response linked to physical changes.
 These may help determine a user’s reaction to a user
interface.
 Measurements include: heart, sweat, muscle, brain. For
instance:
 heart activity: e.g. blood pressure, volume and pulse.
 activity of sweat glands: Galvanic Skin Response (GSR).
 electrical activity in muscle: electromyogram (EMG).
 electrical activity in brain: electroencephalogram (EEG).
 There is some difficulty in interpreting these physiological
responses; more research is needed.
Factors that can influence
the choice
 when in process: design vs. implementation
 style of evaluation: laboratory vs. field
 how objective: subjective vs. objective
 type of measures: qualitative vs. quantitative
 level of information: high level vs. low level
 level of interference: obtrusive vs. unobtrusive
 resources available: time, subjects, equipment, expertise