Benefits of Concurrent Cognitive and Usability Testing

      Jennifer Childs, Jennifer Romano,
Erica Olmsted-Hawala, and Elizabeth Murphy
             U.S. Census Bureau

             Paper presented at the
  Questionnaire Evaluation Standards Conference
                 Bergen, Norway
                 May 17-20, 2009
Overview
• Background
  – Cognitive and Usability Labs
• Case Study – U.S. 2010 Census
  – Example 1 – Separate Cognitive and Usability
    Testing
  – Example 2 – Joint Cognitive and Usability
    Testing
• Conclusions

Description of Pretesting Methods
• Cognitive Testing
  – Focus on respondent’s understanding of questions
  – Some focus on navigation
  – Often used to determine final question wording
• Usability Testing
  – Focus on user’s ability to complete the survey
      • User = Interviewer
      • User = Respondent
  – Focus on users’ interaction with questionnaire and on effects of visual design
  – Used to improve visual design and navigational controls
U.S. Census Bureau Lab Structure
• Cognitive Lab
  – Psychologists, sociologists, anthropologists, demographers, etc.
  – Based on CASM and Tourangeau’s 4 stages
  – Measures comprehension, accuracy, and ability and willingness to respond
• Usability Lab
  – Psychologists, human factors and usability specialists
  – Based on principles of user-centered design (UCD)
  – Measures accuracy, efficiency (time to complete tasks), and satisfaction
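Not part of the original presentation: a minimal sketch of how the three usability measures named above (accuracy, efficiency, satisfaction) could be computed from session logs. The field names and values are hypothetical, purely illustrative, and not taken from the Census Bureau labs.

```python
# Hypothetical usability-session records (illustrative values only).
sessions = [
    {"completed": True,  "seconds": 210, "satisfaction": 6},
    {"completed": True,  "seconds": 340, "satisfaction": 4},
    {"completed": False, "seconds": 600, "satisfaction": 2},
]

n = len(sessions)
# Accuracy: proportion of sessions in which the task was completed.
accuracy = sum(s["completed"] for s in sessions) / n
# Efficiency: mean time to complete tasks, in seconds.
efficiency = sum(s["seconds"] for s in sessions) / n
# Satisfaction: mean rating on a hypothetical 1-7 questionnaire scale.
satisfaction = sum(s["satisfaction"] for s in sessions) / n

print(f"accuracy={accuracy:.0%}, mean time={efficiency:.0f}s, "
      f"mean satisfaction={satisfaction:.1f}/7")
```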
Census Bureau Lab Techniques
• Cognitive Lab
  – Concurrent think aloud
  – Concurrent and/or retrospective probing
  – Retrospective debriefing
  – Emergent and expansive probing
  – Vignettes (hypothetical situations)
  – In-depth ethnographic interviews
• Usability Lab
  – Random assignment (as appropriate)
  – Scenarios (hypothetical situations)
  – Concurrent think aloud
  – Concurrent and/or retrospective probing
  – Eye tracking
  – Satisfaction questionnaire
  – Retrospective debriefing
Case Studies: United States 2010 Census
U.S. 2010 Census

• Approximately 10 questions
  – Names, relationships, ages, sex, race and
    Hispanic origin for each household member
• Mailed forms to most households
• Non-response follow-up by personal visit




U.S. Census Nonresponse Followup
(NRFU)
• Developed and tested CAPI instrument
  – Usability and Cognitive testing
    independently conducted
  – Example 1
• Turned to PAPI data collection
  – Usability and Cognitive testing conducted
    concurrently
  – Example 2

Example 1:
CAPI Instrument Testing



• Usability testing and cognitive testing
  conducted separately
CAPI Usability Testing Methods

• Early version of software
• 2 rounds
• Round 1
  – 6 users (4 English and 2 Spanish speakers)
• Round 2
  – 5 users (4 English and 1 Spanish speaker)
CAPI Cognitive Testing Methods
• 4 Rounds (2 English and 2 Spanish)
• English Round 1
  – 14 interviews
  – Paper script
• English Round 2
  – 16 interviews
  – Computer
• Spanish Rounds 1 and 2
  – 30 interviews total
  – Paper script
CAPI Usability Findings (1)

• Data Entry
  – Cursor movement between entry boxes was problematic
• Navigation issues
  – Next and Back buttons navigated between “questions,” not “screens”
CAPI Usability Findings (2)

• Question Text
  – Repeating questions for each household member
  – Unclear whether or not to read examples associated
    with question text
  – Unclear whether verification of sex was acceptable
  – Identified instances of problematic question wording
• Interview Tasks
  – Administering flashcard was difficult for interviewers
  – Difficult fills – e.g., Is this (house/apartment/mobile
    home)
CAPI Cognitive Findings

• Navigation Issues
   – “Back” and “Next” buttons
• Question Text
   –   Repetitive to read each question for each person
   –   Reference period unclear
   –   Long, complex questions
   –   Problematic question wording
• Interviewer Tasks
   – Difficulty administering flashcard
   – Difficult fill, e.g., (house/apartment/mobile home)
Separate Usability and Cognitive
Testing Conclusions
• Separate
  – Unique results
  – Common results
  – Areas of expertise and knowledge of relevant
    research
• Together
  – Fully understand how to best resolve problems
Example 2: Joint Cognitive and
Usability PAPI Testing


•   Paper and Pencil Instrument
•   40 cognitive interviews
•   20 usability sessions
•   Concurrent testing led to development of
    joint recommendations


Methods
• Cognitive Testing
  – Retrospective probing
  – Comprehension, accuracy, ability to answer questions given personal situation
• Usability Testing
  – Shortened interviewer training
  – Test scenarios
  – Accuracy, satisfaction, and ease of use

Joint Findings and
Recommendations




Information Sheet




Information Sheet Findings

• Cognitive Findings
  – Respondents understood and were able to use the Information Sheet for all relevant questions
• Usability Findings
  – Interviewers were successful in administering the Information Sheet and associated questions
Hispanic Origin Question




Hispanic Origin Findings
• Cognitive Findings
  – Considerable difficulty for Hispanic respondents
      • Responding “Latino” or “Spanish” without a country of origin
      • Describing race and origin
      • Unable to respond unassisted
  – With probing by the interviewer, respondents were able to provide a codeable answer
• Usability Findings

  Scenario       Dominican Republic   Colombian   Puerto Rican   Cambodian   Mexican
  Success Rate   95%                  95%         100%           95%         100%
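Not part of the original presentation: a minimal sketch of how a per-scenario success rate like the one above could be tabulated. The outcome lists are hypothetical, chosen only so the arithmetic matches two of the reported figures (95% and 100% are consistent with 19/20 and 20/20 of the 20 usability sessions).

```python
# Hypothetical per-scenario outcomes (True = interviewer handled the
# scenario successfully); 20 sessions per scenario, matching the deck.
outcomes = {
    "Dominican Republic": [True] * 19 + [False],  # 19 of 20 successful
    "Puerto Rican":       [True] * 20,            # 20 of 20 successful
}

# Success rate: fraction of sessions with a successful outcome.
success_rate = {
    scenario: sum(results) / len(results)
    for scenario, results in outcomes.items()
}

for scenario, rate in success_rate.items():
    print(f"{scenario}: {rate:.0%}")
```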
Hispanic Origin Discussion and
Recommendations

• Respondents had some difficulty, but…
• Interviewers in usability testing were
  able to successfully navigate the
  question.
• Recommendations focused on training



Conclusions from Joint Cognitive and
Usability Studies
•    Understanding of how:
    – Respondents react to questions
    – Interviewers react to questionnaire
    – Interviewers react to respondent situations
•    Recommendations:
    – Improve the form – question wording, visual design,
      and/or navigational instructions
    – Improve interviewer training

General Recommendations
• Conduct cognitive and usability testing concurrently with early versions of the questionnaire, with time to change:
  – Question wording
  – Visual design (i.e., format and layout of the questionnaire)
  – Navigational strategies
  – Interviewer training
• Conduct iterative testing
Future Research

•   Cognitive and Usability testing of
    American Community Survey Internet
    data collection instrument
     –   Early in development cycle
     –   Iterative testing




Thank you!

Questions or Comments?

             E-mail:
jennifer.hunter.childs@census.gov





Editor's Notes

  • #4: Under Usability Testing, it could be argued that the focus is on both the “look and feel” of the form, i.e., the visual design and the ways in which the various controls behave.
  • #5: Note that I added some text under Usability Lab for you to consider. We do not measure how “intuitive” a user interface is because there is no defined unit of intuitiveness. We measure accuracy, efficiency (time to complete tasks), and satisfaction with the user interface. Accuracy and efficiency together represent the user’s success in completing the survey. So we measure success and satisfaction, and we diagnose what it was about the design that led to something less than success and satisfaction. We find that a problem with Tourangeau’s model is that people don’t always read the question. They often look first at the response options, and they infer the question. People’s goal with an online survey is to finish it as quickly as possible and get back to their lives. Usability recommendations are grounded in what we know about human capabilities and limitations in attention, perception, memory, learning, expectations, problem solving, and decision making, i.e., everything we know about human cognition.
  • #20: If this is where the lists came from, it should be presented earlier. I was thinking the lists were on flash cards.