Comparative Evaluation of Two Interface Tools in Performing Visual Analytics Tasks
Dong Hyun Jeong*, Tera Marie Green†, William Ribarsky*, Remco Chang*
*Charlotte Visualization Center, UNC Charlotte
†School of Interactive Arts and Technology, Simon Fraser University
Motivation
- Human interaction & flow of cognition: important for problem solving.
- "Visualization design should avoid, as much as possible, menus or other actions that take the user outside of the frame of the task" (Green et al., 2008).
- Previous literature:
  - In the visualization community: pull-down menus may require the human to sort through and think about menu items [Green et al.].
  - In the HCI community: direct comparisons between menus and direct-manipulation icons show no clear difference [Lim et al.].
Objective
- To show the effectiveness of two interface tools, Floating-Menu and Interactive-Icon, in a highly interactive visual analytics tool.
- A comparative evaluation: quantitative and qualitative measures to find the differences between the two.
A Visual Analytics Tool
- A genomic visualization (called GVis).
- Uses a publicly available biological database (GenBank) provided by NCBI (the National Center for Biotechnology Information).
- System overview: Floating-Menu and Interactive-Icon.
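The slides do not show GVis's retrieval code; the sketch below only illustrates the kind of query such a tool could issue against NCBI's public E-utilities, using Biopython's Entrez module. The helper name, email address, and search term are placeholders, not taken from GVis.

```python
from Bio import Entrez

# NCBI asks for a contact address on E-utilities requests (placeholder).
Entrez.email = "you@example.com"

def count_matches(query: str, db: str = "pubmed") -> int:
    """Return the number of records matching a query; retmax=0 fetches only the count."""
    handle = Entrez.esearch(db=db, term=query, retmax=0)
    record = Entrez.read(handle)
    handle.close()
    return int(record["Count"])

print(count_matches("Escherichia coli"))  # example organism query
```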
Information Representation
Information shown for each option selection (see the sketch below):
- Number of published articles
- Publication year of each article
- Titles of the publishing journals
- Number of matched results for the search query
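A minimal sketch of a record holding the four items listed above; the type and field names are illustrative, not taken from the GVis source.

```python
from dataclasses import dataclass, field

@dataclass
class SelectionInfo:
    """Information attached to one option selection (field names are illustrative)."""
    article_count: int                                           # number of published articles
    publication_years: list[int] = field(default_factory=list)  # year of each publication
    journal_titles: list[str] = field(default_factory=list)     # titles of the publishing journals
    matched_results: int = 0                                     # matches for the search query
```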
Comparative Study
- Within-subject study.
- Participants: 31 college students (12 male, 19 female); most were unfamiliar with both visualization and biology.
- Tasks: find additional information (publications) related to specific organisms using the two interface tools.
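The slides do not say how interface order was assigned across participants; alternating the order is one common counterbalancing scheme for a two-condition within-subject design, sketched here as an assumption rather than the study's actual protocol.

```python
# Alternate which interface each participant uses first (assumed scheme).
INTERFACES = ("Floating-Menu", "Interactive-Icon")

def interface_order(participant_id: int) -> tuple[str, str]:
    """Even-numbered participants start with Floating-Menu, odd with Interactive-Icon."""
    if participant_id % 2 == 0:
        return INTERFACES
    return INTERFACES[::-1]

for pid in range(4):
    print(pid, interface_order(pid))
```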
Evaluation Results (Accuracy)
- Accuracy: about 54.8% (17.0±7.9) of participants answered correctly with Floating-Menu, and 46.1% (14.3±5.0) with Interactive-Icon.
- Repeated-measures ANOVA (sketched below): the accuracy difference across the two interfaces is not statistically significant (p = 0.24).
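The slide names the analysis but not the code; this sketch shows the shape of a repeated-measures ANOVA on per-participant accuracy with one score per interface, using statsmodels. The scores are placeholders, not the study data.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# One accuracy score per participant per interface (placeholder values).
data = pd.DataFrame({
    "subject":   [1, 1, 2, 2, 3, 3],
    "interface": ["Floating-Menu", "Interactive-Icon"] * 3,
    "accuracy":  [0.55, 0.45, 0.60, 0.50, 0.48, 0.44],
})

result = AnovaRM(data, depvar="accuracy", subject="subject",
                 within=["interface"]).fit()
print(result)  # the study reports p = 0.24 (not significant)
```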
Evaluation Results (Speed)
- Speed: no statistically significant difference between the two interfaces.
- Pearson's correlation coefficient (sketched below): a trend between time spent and task difficulty (r = .47, p < .0001).
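A minimal sketch of the reported correlation computation with scipy; the values are placeholders illustrating the calculation, not the study data (the slide reports r = .47, p < .0001).

```python
from scipy.stats import pearsonr

time_spent = [32.0, 45.5, 61.2, 28.4, 70.1, 55.3]  # seconds per task (placeholder)
difficulty = [1, 2, 3, 1, 3, 2]                    # task-difficulty rating (placeholder)

r, p = pearsonr(time_spent, difficulty)
print(f"r = {r:.2f}, p = {p:.4f}")
```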
Evaluation Results (Post-Task Questionnaire)
- Easiness: about 60% (18.6±0.5) [Floating-Menu] and about 43% (13.3±4.0) [Interactive-Icon] of participants reported all three tasks to be "easy" or "very easy".
- Helpfulness: about 74% (23±3.6) [Floating-Menu] and about 65% (20±3.6) [Interactive-Icon] of participants reported the tool to be "helpful" or "very helpful" in solving the tasks.
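For clarity, the percentages above are response counts over the 31 participants; a short sketch of the arithmetic:

```python
N_PARTICIPANTS = 31

def pct(count: float) -> float:
    """Percentage of the 31 participants giving a favorable rating."""
    return 100.0 * count / N_PARTICIPANTS

print(round(pct(18.6), 1))  # 60.0 -> Floating-Menu easiness
print(round(pct(13.3), 1))  # 42.9 -> Interactive-Icon easiness ("about 43%")
```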
Evaluation Results (Post-Application & Study Questionnaire)
- Learnability: about 67% and 51% of participants rated Floating-Menu and Interactive-Icon, respectively, as easy to use ("very easy" or "easy").
- Preference: no significant difference overall, but there was a gender difference in the "like" and "comfortable" ratings.
Discussion
- The two interface tools, Floating-Menu and Interactive-Icon, perform similarly both quantitatively and qualitatively.
- Limitations of the comparative evaluation method: quantitative measures (time & accuracy) and qualitative measures (users' feedback) are not well suited to evaluating a highly interactive visual analytics tool.
- Importance of preserving the human's flow of cognition: Interactive-Icon might support the flow of cognition better than Floating-Menu.
- With this comparative evaluation, the difference between the two interface tools cannot be generalized.
Q&A
Charlotte Visualization Center
UNC Charlotte
dhjeong@uncc.edu
