



Evaluating the Usability of Commercial
Software Applications
Jen Hocko




© 2008 The MathWorks, Inc.




About Me

• Manager of the Business Applications (“BizApps”) Usability Group at The MathWorks
• Prior lives: Web Development, Technical Publications
• B.S. in Computer Science & Technical Writing, M.S. in Human Factors
• Enjoy replacing chaos with something more orderly
• Avid West Coast Swing dancer





About You

• How many of you have been asked to weigh in on the usability of a commercial software application?
• What challenges did you face?








What This Presentation is About

• Here’s the situation…
• How did I decide what to try?
• What it turned into: overview of the methodology
• How you can do it too: details about each step
  - Discussion of challenges, lessons learned
  - Q&A, further thoughts from the audience
• Closing discussion, Q&A








Here’s the situation…

• Select the best Expense Reporting System to replace our old, home-grown solution

• Project already underway

• Project team requested Usability help because they:
  - Hadn’t done this type of project before
  - Thought the end-user point of view was critical
  - Wanted to use tools / templates to work effectively
  - Needed guidance about where to go next







How did I decide what to try?

• Questions I asked myself:
  - What was out there?
  - What did other Usability people think? Has this already been done? (Posted to many discussion groups!)
  - What might work given our company culture?
  - What might work given the current state of the project?






How did I decide what to try?
• What I found / suggestions from others:
  - Jerrod Larson’s UX Magazine article on market research firms (Gartner, Forrester evaluations)
  - SUS and SUMI questionnaires
  - Nielsen’s (and other) heuristics
  - Various checklists
  - CIF report (ISO/IEC 25062:2006)








What it turned into: methodology overview




We recommend that project teams perform both the first and second level qualifications.








What it turned into: methodology overview




We suggest that project teams perform at least one of the third level qualifications.






First Level Qualification




• Primary goal: get a sense for how versed the vendor is in usability and user-centered design








Step 1: Questions for vendors




• What questions about Usability would you ask vendors?








Step 1: Questions for vendors








Step 1: Evaluating vendor responses
• “Does the system support people with disabilities by following web accessibility guidelines?”

“The system does not support this functionality.” (Company A)








Step 1: Evaluating vendor responses
• “Does the system support people with disabilities by following web accessibility guidelines?”

“Company B’s user interface is designed in accordance to the principals of the Inductive User Interface approach. This approach is similar to key applications used on a day-to-day basis by both lay and performance end users. Microsoft, noted as the most significant contributor to end user experience and design, adopts this approach in a number of its applications such as MS Money, Hotmail, and MSN.com. Entry screens are dynamic in nature meaning that dependent on the type selected different fields will become visible and dependent on their configuration will be optional or mandatory. Users fill in values in a logical sequence, using a series of pull down lists, buttons and check boxes, without requiring screen refreshes which make other products cumbersome.”








Step 1: Evaluating vendor responses
• “Does the system support people with disabilities by following web accessibility guidelines?”

“Company C believes the application to comply with Section 508 requirements based on the ability to navigate the application using keyboard access and to adjust text size in the browser using standard browser functions.”








Step 1: Evaluating vendor responses
• “Does the system support people with disabilities by following web accessibility guidelines?”

“Company C believes the application to comply with Section 508 requirements based on the ability to navigate the application using keyboard access and to adjust text size in the browser using standard browser functions.”

“The system does not support this functionality.” (Company A)

“Company B’s user interface is designed in accordance to the principals of the Inductive User Interface approach. This approach is similar to key applications used on a day-to-day basis by both lay and performance end users. Microsoft, noted as the most significant contributor to end user experience and design, adopts this approach in a number of its applications such as MS Money, Hotmail, and MSN.com. Entry screens are dynamic in nature meaning that dependent on the type selected different fields will become visible and dependent on their configuration will be optional or mandatory. Users fill in values in a logical sequence, using a series of pull down lists, buttons and check boxes, without requiring screen refreshes which make other products cumbersome.”





Let’s talk

• Q&A
• Your thoughts?








Second Level Qualification




• Primary goals:
  - Define what is required of the application you are looking to buy
  - Set up some structure and evaluation criteria for vendor demos







Step 2: Use cases, requirements, & capabilities

• What is a use case? (a short hypothetical example follows)
  - Description of the user
  - Description of the user’s goal
  - The user’s current workflow
  - Pain points associated with the current workflow








Step 2: Use cases, requirements, & capabilities


• How do you get the content for your use cases?

  An iterative cycle: identify user roles → brainstorm high-level use cases → brainstorm pain points or issues → affinitize pains or issues → write use cases. Throughout, keep looking for missing user roles, use cases, or pains / issues.




Step 2: Use cases, requirements, & capabilities


• How do you get the content for your use cases?

  The same cycle (identify user roles → brainstorm high-level use cases → brainstorm pain points or issues → affinitize pains or issues → write use cases), with the pain points and issues gathered through:
  - CARD (task analysis): the big picture and the current workflow
  - Interviews
  - Observations
  Throughout, keep looking for missing user roles, use cases, or pains / issues.




Step 2: Use cases, requirements, & capabilities








Step 2: Use cases, requirements, & capabilities








Step 2: Use cases, requirements, & capabilities

• Challenge #1: Shouldn’t we be documenting the ideal workflow?
  - Documenting the current workflow makes vendors think about our problems and how to solve them
  - It avoids limiting us to one “ideal” solution – different ones may work well
  - It is easier to start from what is known
  - A shared understanding of today is invaluable








Step 2: Use cases, requirements, & capabilities

• Challenge #2: Why do we have to start with a use case?
  - Requirements should be traceable back to an actual user
  - Helps reduce scope creep and “bells and whistles”
    - If it can’t be tied to a use case, it’s probably not needed right now!








Step 2: Use cases, requirements, & capabilities

• Challenge #3: What is the appropriate level of detail for a use case?

• Some things to consider:
  - Combine multiple user roles into one use case (note where any variations in steps or pains occur)
  - Vendors care about what we want the application to do
  - Project teams and users care about:
    - Getting the best application possible (having their needs met)
    - Evaluating the applications







Step 2: Use cases, requirements, & capabilities

• Challenge #4: What should we be giving to vendors?

• Some options:
  - The N most important use cases (exactly as we wrote them)
  - A prioritized list of requirements pulled from the use cases
    - Spreadsheets are fine if cutoffs are defined – must-haves, should-haves, nice-to-haves, etc.
  - The requirements, organized into high-level “capabilities”
    - Requirements affinitized back into manageable categories
    - Sometimes aligns better with demos (allows for easier scoring)
    - Still need to decide how many to address per demo





Step 2: Use cases, requirements, & capabilities

• Challenge #5: How do you write a good requirement? (a hypothetical example follows)

• Well-written requirements:
  1. Explain what the application should do, not how
  2. Keep the readers in mind
  3. Are specific, actionable statements
  4. Make good use of language (spelling, grammar, etc.)
  5. Are uniquely numbered
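
For instance, a hypothetical requirement that follows these rules (the number and wording are invented):

    ER-042: The system shall let a submitter attach a digital receipt image to an individual expense line item.

It states what the application should do rather than how, it is specific and actionable, and it carries a unique number.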








Step 2: Use cases, requirements, & capabilities

• Example: related requirements organized by Capability
  (screenshot: a Capability heading with its related Requirements listed beneath it; a hypothetical stand-in follows)
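
As a hypothetical stand-in for the screenshot (the capability name and requirements are invented):

    Capability: Receipt handling
      ER-042: The system shall let a submitter attach a digital receipt image to an individual expense line item.
      ER-043: The system shall warn a submitter when an expense line item above a configurable amount lacks an attached receipt.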




Step 2: Use cases, requirements, & capabilities
• Example: related requirements organized by Capability
  (screenshot: Capability with its Requirements; click on the Capability to view the spreadsheet)








Let’s talk

• Q&A
• Other thoughts?








Second Level Qualification




• Primary goals:
  - Compare applications based on how well vendors are able to demonstrate that they meet your requirements
  - Get the project team talking about application strengths and weaknesses






Step 3: Vendor demos & scorecard review

• An individual scorecard (example)








Step 3: Vendor demos & scorecard review

• The individual comments template (example)

• Do:
  - Train people!
  - Have a parking lot





Step 3: Vendor demos & scorecard review

• A consolidated demo scorecard (example)








Step 3: Vendor demos & scorecard review

• Count the A’s and multiply by 3
• Count the B’s and multiply by 2
• Count the C’s and multiply by 1
• Add these together to get a “Positives” score

• Count the N’s and multiply by 3 to get a “Negatives” score

• “Positives” score – “Negatives” score = Final score (a short code sketch follows)
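
A minimal Python sketch of this scoring rule (the grade letters and weights come from the slide; the function name and sample data are invented):

    def demo_score(grades):
        # A/B/C are positive ratings weighted 3/2/1; N is a negative rating weighted 3
        positives = 3 * grades.count("A") + 2 * grades.count("B") + 1 * grades.count("C")
        negatives = 3 * grades.count("N")
        return positives - negatives

    # Hypothetical ratings for one vendor across ten capabilities:
    print(demo_score(["A", "A", "B", "C", "N", "B", "A", "C", "N", "B"]))  # 17 - 6 = 11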






Let’s talk

• Q&A
• Other thoughts?








Third Level Qualification




• Primary goal: look critically at the application to identify potential usability problems (performed by a Usability Specialist)




Step 4a: Usability audit








Step 4a: Usability audit








Step 4a: Usability audit
• Pros:
  - Provides helpful reminders of usability principles that could cause problems for users if not followed
  - Can result in useful discussions about configurability (e.g., control of style sheets)

• Cons:
  - Some checklist items don’t apply or are difficult to measure – need to be consistent across audits of different applications
  - An isolated activity for a single Usability expert goes against our collaborative culture and offers no inter-rater reliability

• Do better next time:
  - Look at the checklist items in the context of use cases




Let’s talk

• Q&A
• Other thoughts?








Third Level Qualification




• Primary goal: look critically at the application to identify potential usability problems (performed by end-users)




Step 4c: End-user evaluation

• 3 measures of system usability:
  - Effectiveness – can users complete tasks?
  - Efficiency – how easily can users complete tasks?
  - Satisfaction – how do users feel about completing the tasks?

• 2/3 User Acceptance Test (UAT), 1/3 survey: effectiveness and efficiency come from the UAT; satisfaction comes from the survey

• In UAT, Usability plays a supporting role by ensuring:
  - Tasks are adapted from the use cases
  - All user groups are represented as participants
  - Evaluation documents are designed to capture effectiveness and efficiency




Step 4c: End-user evaluation








Step 4c: End-user evaluation

Score Type | Effectiveness                               | Efficiency
Individual | (Number of Y’s / Total Y’s possible) * 100  | (Number of Agrees / Total Agrees possible) * 100
Overall    | Average of all individual scores            | Average of all individual scores

We do this for each application being evaluated.

If there were tasks that evaluators were unable to complete, or that took unreasonable time and effort, follow up with them to identify and document the reasons why. (A short code sketch of the scoring follows.)
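
A minimal Python sketch of these formulas (the function names and data shapes are my own invention):

    def individual_score(answers, positive):
        # Percentage of positive answers (Y's for effectiveness,
        # Agrees for efficiency) out of the total possible.
        return 100.0 * sum(1 for a in answers if a == positive) / len(answers)

    def overall_score(individual_scores):
        # Average of all individual evaluators' scores.
        return sum(individual_scores) / len(individual_scores)

    # Hypothetical: three evaluators, five tasks each (effectiveness)
    per_user = [["Y", "Y", "N", "Y", "Y"], ["Y", "N", "N", "Y", "Y"], ["Y", "Y", "Y", "Y", "Y"]]
    print(overall_score([individual_score(a, "Y") for a in per_user]))  # (80 + 60 + 100) / 3 = 80.0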





Step 4c: End-user evaluation

• Administered in SurveyMonkey
• Initial section for collecting demographic information
• Additional question for overall feeling of the system (scored separately; a possible scoring sketch follows)
• Users fill out one survey per system
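
The slide does not spell out the satisfaction formula; a plausible minimal sketch, assuming a 5-point agreement scale averaged per user and system (the scale mapping and names are assumptions):

    SCALE = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
             "Agree": 4, "Strongly Agree": 5}

    def satisfaction_score(responses):
        # Mean rating across all survey items for one user and one system.
        return sum(SCALE[r] for r in responses) / len(responses)

    print(satisfaction_score(["Agree", "Strongly Agree", "Neutral"]))  # 4.0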





Step 4c: End-user evaluation








Let’s talk

• Q&A
• Other thoughts?








Final usability recommendation: system
comparison matrix








Additional takeaways

• Over-communicate: keep the team informed of and aligned with the process – don’t assume they know it

• Customize as necessary: this isn’t a “one size fits all” methodology, only a starting point – get team input

• Fit into the bigger picture: connect with any larger, centralized software evaluation process your company has

• It does make a difference: project teams think more critically about end-users as part of the procurement decision







Final Questions





