Usability Test Report of GoIbibo.com
By Jeffrey Jacob
for the Evaluating Designs With Users course
offered by the University of Michigan on Coursera
Table of Contents
List of Figures
List of Tables
Executive Summary
Introduction
Methods
Usability Metrics Measured
Findings and Recommendations
- Task Completion Rate
- Task Ratings
- Time on Task
- Errors
- Overall Satisfaction
- Summary of Data
- Key Findings and Recommendations
Limitations
Conclusion
References
Appendix 1: User Test Scripts
- Pre-test Checklist
- Post-test Checklist
- User Test Script
Appendix 2: Consent Form
Appendix 3: Questionnaires
- Pre-test Questionnaire
- Post-task Questionnaire
- Post-test Questionnaire
Appendix 4: Task Scenarios
Appendix 5: Filled Out Logging Forms
- Participant 1
- Participant 2
- Participant 3
- Participant 4
Appendix 6: Questionnaire Responses
Appendix 7: Complete List of Usability Issues
List of Figures
Figure 1: User profile drop-down on hover.
Figure 2: Collapsible sections of the payment details page that look like they are disabled.
Figure 3: The search results for a 4-passenger journey, with the fare of a single ticket listed on the right.
Figure 4: The traveller details section of the checkout page.
Figure 5: The fare alerts modal as advertised on the website.
Figure 6: The search form filled in round-trip mode (above); the form reset automatically on switching to multi-city (below).
Figure 7: The Amenities filter options shown in the hotel search results.
Figure 8: The footer section of the GoIbibo.com homepage.
List of Tables
Table 1: Salient demographic details of the participants
Table 2: Task Completion Rate
Table 3: Mean Task Ratings
Table 4: Time on Task (in seconds)
Table 5: Errors
Table 6: SUS questionnaire responses
Table 7: A general guideline used to interpret SUS scores
Table 8: SUS scores for each participant
Table 9: Summary of Completion, Errors, Time on Task, Mean Task Rating
Table 10: List of task scenarios provided to the participants
Table 11: Collated questionnaire responses
Table 12: List of usability issues found in GoIbibo.com
Executive Summary
I conducted a remote usability test from July 11 to 13, 2020. The purpose of the test was to
assess the usability of the web interface design, information flow and information architecture.
A total of four participants took part in the test. Each individual test lasted approximately two
hours. The test scenarios consisted of six tasks adapted from the tasks listed in the support
materials of this course.
In general, all participants found the GoIbibo.com website to be clear, straightforward and easy
to use. Two of the four participants use travel booking websites at least once every three months
to book flights. The test identified several problems, including:
- Sign In/Sign Up links not prominent enough on the navigation bar.
- Collapsible sections in the payment details page look like they are disabled.
- Only the ticket rate for one passenger is shown on the search results, even when
booking for multiple passengers.
- The user is able to list the same passenger name and details for all passengers
travelling.
- The Fare Alerts feature does not appear anywhere on the site.
- Filled-in forms don’t get carried over when the user switches to another flight journey
option.
- Obscure and hard-to-find copy is used to describe the Amenities filter options.
- Very low contrast between foreground and background colors in the footer section.
This document contains the participant feedback, satisfaction ratings, task completion rates,
ease or difficulty of completion ratings, time on task, errors, SUS scores and recommendations
for improvements. A copy of the scenarios, scripts and questionnaires used are included in the
Appendix section.
Introduction
Put briefly, the goal of the test was to answer the question:
“Can frequent travellers use GoIbibo.com to plan their trips?”
GoIbibo.com is one of India’s largest flights and hotels aggregator platforms, attracting over 2
million visits per month. GoIbibo.com provides booking services for hotel reservations, flight
tickets, bus tickets, outstation/inter-city cab & taxi, IRCTC train ticket bookings, etc.
To assess the ease of use and efficiency in making flight and hotel bookings through
GoIbibo.com, I conducted a remote moderated usability test using Google Meet, a video
conferencing tool, and Google Forms, a survey administration service. The rest of this document
describes the methods used in designing and conducting the tests, and the metrics and
heuristics used to analyze the findings from these tests and turn them into actionable results.
Methods
I reached out to a few colleagues and friends who fit within the requirements for the test. I
selected the participants for this test using the following recruiting criteria:
- Participants must have bought a plane ticket online in the past year.
- Participants must not have used the site before.
Table 1 lists the basic demographic details of the participants. All four participants were young
adults aged 22-23: two working professionals, one research scholar and one undergraduate
student. Of the four participants, one was female and three were male.
Table 1: Salient demographic details of the participants
| Participant | Age | Gender | Occupation | Frequency of travel (every 12 months) | Specific preferences while booking |
|---|---|---|---|---|---|
| 1 | 23 | Male | Student | 2 | Usually travels with his friends, so the availability of seats is a major concern along with listings of alternative options within an affordable budget. |
| 2 | 22 | Female | Business Intelligence Analyst | 4 | Price, travel insurance, multiple payment options (specifically net banking & UPI). |
| 3 | 22 | Male | VLSI Intern | 3 | Price; prefers booking from a particular airline. |
| 4 | 23 | Male | Software Developer | 6 | Flight timings and flight duration; usually visits flight aggregator sites to get an idea of his available options. |
I sent emails to attendees informing them of the test logistics and requesting their availability
and participation. Participants responded with an appropriate date and time. Each individual
session was conducted over Google Meet and lasted approximately one hour. During the
session, I briefed the participant on what the test session would entail, asked some
background questions and had them fill out a consent form (listed in Appendix 3 and Appendix 2
respectively). The participants read the task scenarios and tried to find the information on the
website (see Appendix 4 for the task descriptions).
See Appendix 1 for the complete checklists and user scripts used for the tests.
After each task, I asked the participant to rate the interface on a 5-point Likert Scale with
measures ranging from Strongly Disagree to Strongly Agree. Post-task scenario subjective
measures included (see Appendix 3):
1. How easy it was to find the information from the home page.
2. Ability to keep track of their location throughout the website.
3. Accurateness of predicting which section of the website contained the information.
After the last task was completed, I asked the participant to rate the perceived usability of the
website overall using a SUS questionnaire (see Appendix 3).
All questionnaire responses filled by the participants are listed in Appendix 6.
Recordings of the test sessions were made so they could be played back for review and analysis.
Critical events, incidents and timestamps were documented on logging sheets for quick review later.
Scanned copies of these logging sheets can be found in Appendix 5.
Some key findings from the tests are documented in this report along with the necessary
recommendations. Standard usability metrics and Nielsen’s 10 heuristic principles were used in
arriving at these results. A complete list of these usability issues is shown in Appendix 7.
Usability Metrics Measured
I used several metrics to quantify the usability of the website during the evaluation. These
metrics act as standards to measure, compare and communicate the effectiveness, efficiency
and satisfaction that the website provides to the participants in their contexts. The metrics used
in this exercise are listed below (an illustrative data-logging sketch follows the list):
- Task Completion Rate & Number of Errors to measure the effectiveness of the website,
- Time on Task to measure its efficiency, and
- Task- and test-level satisfaction scores using questionnaires: the After Scenario
Questionnaire (ASQ) to derive task ratings and the System Usability Scale (SUS) to
understand overall satisfaction.
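The report itself contains no analysis code; purely as an illustration of how these per-task measurements could be logged for later aggregation, a minimal Python sketch is shown below. The `TaskRecord` structure and its field names are hypothetical and not part of the original study.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TaskRecord:
    """One participant's measurements for one task scenario (hypothetical structure)."""
    task_id: int
    completed: bool                 # task completion (success / failure)
    errors: int                     # number of critical errors observed
    time_on_task_s: int             # stopwatch time in seconds
    asq_ratings: List[int] = field(default_factory=list)  # three post-task Likert ratings (1-5)

# Example record: Participant 1, Task 1, using values from Tables 2, 4 and 5 and Appendix 6.
p1_task1 = TaskRecord(task_id=1, completed=True, errors=0,
                      time_on_task_s=182, asq_ratings=[5, 4, 3])
```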
Findings and Recommendations
Task Completion Rate
All participants successfully completed Tasks 1 through 5. None of the participants were able to
complete Task 6, which required them to set up a fare alert for a particular flight.
Table 2: Task Completion Rate
| Participant | Task 1 | Task 2 | Task 3 | Task 4 | Task 5 | Task 6 |
|---|---|---|---|---|---|---|
| 1 | ✔ | ✔ | ✔ | ✔ | ✔ | ❌ |
| 2 | ✔ | ✔ | ✔ | ✔ | ✔ | ❌ |
| 3 | ✔ | ✔ | ✔ | ✔ | ✔ | ❌ |
| 4 | ✔ | ✔ | ✔ | ✔ | ✔ | ❌ |
| Success | 4 | 4 | 4 | 4 | 4 | 0 |
| Completion Rate | 100% | 100% | 100% | 100% | 100% | 0% |
Task Ratings
After the completion of each task, participants rated the ease or difficulty of completing the task
for three factors:
- It was easy to find my way to this information from the home page.
- I was able to keep track of where I was on the website as I was searching for this
information.
- I was able to accurately predict which section of the website contained this information.
The 5-point rating scale ranged from 1 (strongly disagree) to 5 (strongly agree), with a rating of
4 corresponding to "agree". An overall rating above 4.0 is considered to satisfy all three conditions.
Table 3 lists the task ratings for each of the six tasks by each of the factors listed above and
their means. A short sketch showing how one row is derived appears after the table.
Table 3: Mean Task Ratings
| Task | Ease in Finding Information | Keeping Track of Location in Site | Predicting Information Section | Mean Task Rating |
|---|---|---|---|---|
| 1 | 5.00 (100%) | 4.25 (85%) | 4.25 (85%) | 4.50 |
| 2 | 4.75 (95%) | 4.75 (95%) | 4.00 (80%) | 4.50 |
| 3 | 4.00 (80%) | 4.75 (95%) | 3.25 (65%) | 4.00 |
| 4 | 4.75 (95%) | 4.75 (95%) | 4.25 (85%) | 4.58 |
| 5 | 4.00 (80%) | 4.50 (90%) | 4.25 (85%) | 4.25 |
| 6 | 1.00 (20%) | 3.00 (60%) | 1.25 (25%) | 1.75 |
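To make the arithmetic behind Table 3 explicit, the short sketch below (illustrative only, not code from the study) rebuilds the Task 1 row from the raw post-task ratings collected in Appendix 6. The percentages in parentheses are each mean expressed as a share of the maximum rating of 5, and the Mean Task Rating is the average of the three factor means.

```python
# Post-task ratings for Task 1 (one value per participant, from Appendix 6).
task1 = {
    "Ease in Finding Information":       [5, 5, 5, 5],
    "Keeping Track of Location in Site": [4, 4, 5, 4],
    "Predicting Information Section":    [3, 4, 5, 5],
}

factor_means = {name: sum(r) / len(r) for name, r in task1.items()}
percent_of_max = {name: m / 5 * 100 for name, m in factor_means.items()}
mean_task_rating = sum(factor_means.values()) / len(factor_means)

print(factor_means)      # 5.00, 4.25, 4.25 -> the three factor columns for Task 1
print(percent_of_max)    # 100%, 85%, 85%  -> the percentages in parentheses
print(mean_task_rating)  # 4.5             -> the Mean Task Rating column
```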
Ease in Finding Information
All participants agreed it was easy to find the necessary information for the cheapest round trip
flight ticket (task 1; rating = 5.00), the cheapest flight available for multiple passengers (task 2;
rating = 4.75) and the number of flight options available at short notice (task 4; rating = 4.75).
Booking multi-city flights (task 3; rating = 4.00) and finding hotels using relevant filters (task 5;
rating = 4.00) received slightly lower but still positive ratings. Creating a fare alert received the
lowest possible ease rating (task 6; rating = 1.00).
Keeping Track of Location in Site
Participants found it easy to keep track of their location in the site while finding the
cheapest flight available for multiple passengers (task 2; rating = 4.75), the cheapest flights for a
multi-city trip (task 3; rating = 4.75), the number of flight options available at short notice (task 4;
rating = 4.75), a top-rated hotel for a stay (task 5; rating = 4.50) and the cheapest round trip
flight ticket (task 1; rating = 4.25). However, participants found it noticeably harder to keep track
of their location while looking for the fare alert section (task 6; rating = 3.00).
Predicting Information Section
Participants generally found it easy to predict where to find the cheapest round trip flight
ticket for a given source and destination (task 1; rating = 4.25), the number of flight options
available at short notice (task 4; rating = 4.25), a top-rated hotel for a stay (task 5; rating = 4.25)
and cheap flights for multiple passengers (task 2; rating = 4.00). However, predicting where to
go to book a multi-city trip was harder (task 3; rating = 3.25), and participants could not reliably
predict where to find the fare alert section (task 6; rating = 1.25).
Time on Task
I used a timer to record the time on task for each participant. Some tasks were inherently more
difficult to complete than others, and this is reflected in the average time on task.
Table 4 displays the time taken by each participant for each task and the average time on tasks.
Table 4: Time on Task (in seconds)
| Participant | Task 1 | Task 2 | Task 3 | Task 4 | Task 5 | Task 6 |
|---|---|---|---|---|---|---|
| 1 | 182 | 223 | 524 | 272 | 840 | 792 |
| 2 | 146 | 184 | 379 | 209 | 531 | 558 |
| 3 | 192 | 318 | 340 | 326 | 749 | 450 |
| 4 | 397 | 295 | 422 | 362 | 534 | 856 |
| Avg. Time on Task | 229 | 255 | 416 | 293 | 663 | 664 |
Task 1 required participants to find the cheapest flight available for a round trip and took the
shortest time to complete (mean = 229 seconds). Completion times ranged from 146 seconds
(under 3 minutes) to 397 seconds (more than 6 minutes), with most times under 240 seconds
(4 minutes).
Task 6 required participants to create a fare alert for a flight and took the longest time to
complete (mean = 664 seconds). Completion times ranged from 450 seconds (7.5 minutes) to
856 seconds (more than 14 minutes).
Errors
I captured the number of critical errors participants made while trying to complete the task
scenarios in Table 5 below.
Table 5: Errors
| Participant | Task 1 | Task 2 | Task 3 | Task 4 | Task 5 | Task 6 |
|---|---|---|---|---|---|---|
| 1 | - | - | 1 | - | - | 1 |
| 2 | - | - | - | 1 | 1 | 1 |
| 3 | - | - | 2 | 2 | 1 | 1 |
| 4 | - | - | - | - | - | 1 |
| Errors | 0 | 0 | 3 | 3 | 2 | 4 |
Among the six tasks, Task 6 produced the highest number of errors, and none of the participants
were able to complete it. The prompt was to create a fare alert for a flight on a given route.
Despite being advertised as a main USP of the service, users were unable to find their way to
this feature; no clear signals pointing to it were found anywhere on the website.
Task 1, which required participants to find the cheapest flight available for a round trip, and Task
2, where the user had to find the cheapest flight available for multiple passengers, were
completed by all participants without any errors.
Overall Satisfaction
After the completion of all tasks, the participants took the System Usability Scale (SUS)
questionnaire, which measures the perceived usability of the website. The questionnaire
contains 10 statements, each rated on a 1-5 scale according to how strongly the participant
agrees with it: 1 means strongly disagree and 5 means strongly agree. The statements along
with the results are listed below in Table 6.
Table 6: SUS questionnaire responses
| Question | Strongly disagree (1) | Disagree (2) | Neutral (3) | Agree (4) | Strongly agree (5) | Mean rating | Percent agree |
|---|---|---|---|---|---|---|---|
| I think that I would like to use this system frequently. | - | - | - | 2 | 2 | 4.50 | 100% |
| I found the system unnecessarily complex. | 2 | 2 | - | - | - | 1.50 | 0% |
| I thought the system was easy to use. | - | - | - | 3 | 1 | 4.25 | 100% |
| I think that I would need the support of a technical person to be able to use this system. | 2 | 2 | - | - | - | 1.50 | 0% |
| I found the various functions in this system were well integrated. | - | 1 | 1 | 1 | 1 | 3.50 | 50% |
| I thought there was too much inconsistency in this system. | 2 | 1 | 1 | - | - | 1.75 | 0% |
| I would imagine that most people would learn to use this system very quickly. | - | - | - | 2 | 2 | 4.50 | 100% |
| I found the system very cumbersome to use. | 1 | 2 | - | 1 | - | 2.25 | 25% |
| I felt confident using the system. | - | - | - | 2 | 2 | 4.50 | 100% |
| I needed to learn a lot of things before I could get going with this system. | 2 | 2 | - | - | - | 1.50 | 0% |
The SUS scores were then calculated as follows (a short sketch of this calculation is given after
the list):
- For every odd-numbered question, subtract 1 from the score (X - 1).
- For every even-numbered question, subtract the score from 5 (5 - X).
- The adjusted scores were then summed and the sum multiplied by 2.5 to give a total SUS
score between 0 and 100, graded as shown below.
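The study did not use any scoring script, but as a minimal illustrative sketch the function below applies exactly the three steps above; fed Participant 1's responses from Appendix 6, it reproduces the score of 70 reported in Table 8.

```python
def sus_score(responses):
    """SUS score (0-100) from ten 1-5 Likert responses, given in question order."""
    assert len(responses) == 10
    adjusted = []
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:                # odd-numbered (positively worded) items
            adjusted.append(r - 1)
        else:                         # even-numbered (negatively worded) items
            adjusted.append(5 - r)
    return sum(adjusted) * 2.5

# Participant 1's SUS responses (Appendix 6):
print(sus_score([4, 2, 4, 2, 3, 3, 4, 2, 4, 2]))  # 70.0
```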
Based on published research, a SUS score above 68 is considered above average and anything
below 68 is below average. Table 7 gives a primer on how to interpret a SUS score on a grading
scale.
Table 7: A general guideline used to interpret SUS scores
| SUS Score | Grade | Adjective Rating |
|---|---|---|
| > 80.3 | A | Excellent |
| 68 - 80.3 | B | Good |
| 68 | C | Okay |
| 51 - 68 | D | Poor |
| < 51 | F | Awful |
This scale was used to make sense of the data collected from the user tests conducted, as
shown in Table 8.
Table 8: SUS scores for each participant
| Participant | Total SUS Score | Adjective Rating |
|---|---|---|
| 1 | 70 | Good |
| 2 | 92.5 | Excellent |
| 3 | 92.5 | Excellent |
| 4 | 72.5 | Good |
| Average | 81.9 | Excellent |
With individual scores ranging from 70 to 92.5 and an average SUS score of 81.9, the GoIbibo
website met the participants' basic usability expectations for the most part.
Summary of Data
Table 9 displays a summary of the test data. Task 6 stands out with its low completion rate and
satisfaction rating and its high error count and time on task.
Table 9: Summary of Completion, Errors, Time on Task, Mean Task Rating
| Task | Task Completion | Errors | Time on Task (s) | Mean Task Rating |
|---|---|---|---|---|
| 1 | 100% | 0 | 229 | 4.50 |
| 2 | 100% | 0 | 255 | 4.50 |
| 3 | 100% | 3 | 416 | 4.00 |
| 4 | 100% | 3 | 293 | 4.58 |
| 5 | 100% | 2 | 663 | 4.25 |
| 6 | 0% | 4 | 664 | 1.75 |
Except for one task, the participants were able to accomplish everything asked of them without
much hassle, and the statistics detailed above support that notion. Participants found GoIbibo.com
to be well-organized, comprehensive, clean and uncluttered, very useful, and easy to use.
However, they agreed that the website needs some fine-tuning in its user interface moving
forward. Some major usability issues and violations were observed during testing, and these
critical incidents are explained in detail in the next section.
Key Findings and Recommendations
This section provides recommended changes and justifications driven by the participant
success rate, behaviours, and comments. Each recommendation includes a severity rating. The
following recommendations will improve the overall ease of use and address the areas where
participants experienced problems or found the interface/information architecture unclear.
Finding 1
Heuristic Violated
#2 Match between system and the real world
Severity Rating
4, usability catastrophe
Description
The Sign In/Sign Up links are not prominent enough on the navigation bar. The user profile
icon is also shown even when the user is not logged in. This may confuse users, who may
click the profile icon instead of the Sign In/Sign Up links.
Figure 1: User profile drop-down on hover.
Recommendation
Remove the profile icon and dropdown and make the Sign In/Sign Up links more prominent
on the page.
Finding 2
Heuristic Violated
#1 Visibility of system status
#2 Match between system and the real world
Severity Rating
4, usability catastrophe
Description
On the Payment Details page, some collapsible sections of the payment options are collapsed
by default and look greyed out. This may give the user the impression that these elements are
disabled and not clickable.
Figure 2: Collapsible sections of the payment details here look like they are disabled.
Recommendation
Redesign the sections to be clear and look clickable. Adopt a high-contrast color scheme for
the new section.
Finding 3
Heuristic Violated
#3 User control and freedom
#7 Flexibility and efficiency of use
Severity Rating
4, usability catastrophe
Description
When booking for more than one passenger, only the fare for a single ticket is shown in the
search results, rather than the total price for the number of passengers given. This violates the
mental model of the user, who would want to know the total price they would have to pay
before clicking further.
Figure 3: The search results for a 4-passenger journey. Note how the fare of one ticket is
listed on the right.
Recommendation
List out the total price for the flight based on the search results. This would save the user from
the extra cognitive load of calculating the fare for multiple passengers.
Finding 4
Heuristic Violated
#2 Match between system and the real world
#5 Error prevention
Severity Rating
4, usability catastrophe
Description
For bookings involving more than one passenger, the user is able to list the same passenger
name and details for all passengers. The system doesn’t recognize this and even saves these
details to be auto-filled in later bookings.
Figure 4: The traveller details section of the checkout page.
Recommendation
The system should check the entered details and verify that each passenger's details are
unique and valid.
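As an illustration of the recommended check (a sketch only, not GoIbibo's actual data model or code), duplicate traveller entries could be detected before the booking proceeds; the field names used here are hypothetical.

```python
from collections import Counter

def duplicate_travellers(travellers):
    """Return normalized (first name, last name) pairs that appear more than once."""
    names = [(t["first_name"].strip().lower(), t["last_name"].strip().lower())
             for t in travellers]
    return [name for name, count in Counter(names).items() if count > 1]

booking = [
    {"first_name": "Asha", "last_name": "Rao"},
    {"first_name": "Asha", "last_name": "Rao"},   # same details entered twice
    {"first_name": "Ravi", "last_name": "Menon"},
]
if duplicate_travellers(booking):
    print("Warn the user: two travellers have identical details.")
```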
Finding 5
Heuristic Violated
#7 Flexibility and efficiency of use
#3 User control and freedom
Severity Rating
4, usability catastrophe
Description
The Fare Alerts feature (as advertised here) does not appear anywhere else on the site. This
is also an ethical concern of false advertising, as there is no way a user can set up fare
alerts for a particular flight on the current iteration of the site.
Figure 5: The fare alerts modal as advertised on the website.
Recommendation
Support this feature by allowing the user to log in with an email address and by providing a fare
tracker that sends updates for a particular search by email. The current website uses a phone
number or an automatically generated dummy email address as the username.
Finding 6
Heuristic Violated
#2 Match between system and the real world
#3 User control and freedom
Severity Rating
3, major usability problem
Description
Filled-in forms don’t get carried over when the user switches to another flight journey option.
Figure 6: The search form filled in round-trip mode (above); the form reset automatically on
switching to multi-city (below).
Recommendation
The system should be able to transfer the common form responses (source, destination,
departure date, number of passengers) across all journey modes.
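A sketch of the recommended behaviour is shown below; the field and function names are hypothetical and this is not GoIbibo's actual front-end code, just one way the shared fields could survive a mode switch.

```python
# Fields assumed to be common to one-way, round-trip and multi-city searches.
SHARED_FIELDS = ("source", "destination", "departure_date", "passengers")

def switch_journey_mode(current_form: dict, new_mode: str) -> dict:
    """Build the form state for the new journey mode, keeping the shared field values."""
    new_form = {"mode": new_mode}
    for name in SHARED_FIELDS:
        value = current_form.get(name)
        if value not in (None, ""):
            new_form[name] = value
    return new_form

round_trip = {"mode": "round-trip", "source": "DEL", "destination": "BOM",
              "departure_date": "2020-09-16", "return_date": "2020-09-19", "passengers": 1}
print(switch_journey_mode(round_trip, "multi-city"))
# {'mode': 'multi-city', 'source': 'DEL', 'destination': 'BOM',
#  'departure_date': '2020-09-16', 'passengers': 1}
```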
Finding 7
Heuristic Violated
#2 Match between system and the real world
#6 Recognition rather than recall
Severity Rating
3, major usability problem
Description
When searching for hotels, one common filter that the user would use is to check for the
availability of certain amenities. While an amenities filter with more than 20 listed facilities is
provided on the site, it uses relatively obscure names for some services, which in turn reduces
their findability and frustrates the user (for example, the filter lists 'Internet' or 'Free Internet'
instead of the more commonly used 'Wifi').
Figure 7: The Amenities filter options shown in the hotel search results.
Recommendation
Use clear, concise and common terms for the facilities (For example, ‘Wifi’ instead of ‘Free
Internet’).
Finding 8
Heuristic Violated
#8 Aesthetic and minimalist design
Severity Rating
2, minor usability problem
Description
In the footer section, there is very low contrast between the foreground and background colors.
Adequate contrast is necessary for all users, especially users with low vision. These are quick
links to flights to some of the more popular destinations and should therefore be more
prominent on the page.
Figure 8: The footer section of the GoIbibo.com homepage.
Recommendation
Use accessibility-compliant color combinations. Increase the contrast ratio between the
foreground (text) color and the background color to at least 4.5:1. Large text (larger than
18 point, or 14 point bold) does not require as much contrast as smaller text (at least 3:1).
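For reference, the contrast ratio referred to above can be computed from the relative luminance of the two colors as defined in WCAG 2.x. The sketch below follows that formula; the example colors are illustrative and were not sampled from the actual footer.

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as 0-255 (R, G, B), per WCAG 2.x."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, ranging from 1:1 to 21:1."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Mid-grey text on a light-grey background: roughly 2.4:1, failing the 4.5:1 minimum.
print(round(contrast_ratio((150, 150, 150), (230, 230, 230)), 2))
```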
Other findings are listed in Appendix 7.
Limitations
A major limitation was the way in which these tests had to be conducted. Owing to the
circumstances caused by the coronavirus pandemic, I chose to conduct the tests remotely.
Participants were recruited over email. All arrangements for the user test were made with the
Google suite of services: Calendar to schedule meetings, Meet to host the interviews and record
the sessions, Forms for the questionnaires and surveys, and Drive to store and play back the
video recordings.
All the recruited participants were from a narrow age bracket of 22-23. Because of time
constraints, I reached out to my immediate social circles for recruitment, and the test was
devised considering the goals and actions of that particular group of users.
Ideally, user tests are conducted with 2-3 moderators. However, considering the constraints of
the assignment, I acted as the sole moderator for the tests. This meant handling the dual duties
of logging and moderating the interviews.
Conclusion
Most of the participants found GoIbibo.com to be well-organized, comprehensive, clean and
uncluttered, very useful, and easy to use. Having a centralized site that accommodates all kinds
of travel options in India is important to many, if not all, of the participants. Implementing the
above-listed recommendations and continuing to conduct tests with users in the future will help
keep the website user-centred.
References
- Nielsen, J. (1994). Heuristic Evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability
Inspection Methods. New York, NY: John Wiley & Sons.
- https://www.nngroup.com/articles/usability-metrics/
- https://usabilitygeek.com/usability-metrics-a-guide-to-quantify-system-usability/
Appendix 1: User Test Scripts
Pre-test Checklist
1. Log in with the phone number and password.
2. Clear any saved login details from the browser.
3. Double-check success criteria.
4. Upload and cross-check the Google Form link; the form contains the following:
- Consent form
- Task instructions
- Post-test questionnaire
5. Print test script and logging sheet.
6. Set up a Google Meet link and forward it to the participant ten minutes before meeting
time.
7. Once the participant joins, inform them about the background details of this test and
request consent to record and document the proceedings.
8. Send the survey link to the participant and ask them to fill the consent form to proceed
further.
9. Start the screen recording.
10. Request the participant to share their screen.
11. Ask the participant to clear their cookies from goibibo.com on their browser.
12. Provide the participants with the login details (phone number and password).
Post-test Checklist
1. Stop recording; ensure audio and video of the meeting gets saved to your Google Drive.
2. Verify whether the participant filled out all parts of the survey form.
3. File logging sheet.
User Test Script
1. Onboarding once the participant joins the meeting
“Hi __! Thanks for coming in today!
The goal for today’s session is to test the website GoIbibo.com. I’m here to learn from
you, so I’ll ask a lot of questions, but I’m not testing you. There are no right or wrong
answers.
I’ll start this session by asking some background questions. Then I’ll ask you to do some
tasks. As you work on the tasks, please think aloud.
This means that you should try to give a running commentary on what you're doing as
you work through the tasks. Tell me what you're trying to do and how you think you can
do it. If you get confused or don't understand something, please tell me. If you see things
you like, tell me that too. Candid feedback is the most helpful here.
If you do get stuck, I’m going to try not to answer your questions or tell you what to do.
I’m just trying to see what you would do if you were using it on your own. But don’t worry,
I’ll help you if you get completely stuck.
Do you have any questions before we begin?”
2. Present the consent form, summarize it, and obtain permission.
3. Present the pre-test questionnaire
4. Task Instructions
Present the tasks one at a time. Read each task aloud and allow the participant to view
the task through the form.
After the completion of each task, ask the participant to fill out a quick post-task
questionnaire before moving to the next one.
5. Once all tasks are completed, present the post-test questionnaire and debrief.
Review parts of the test where the user struggled.
6. Conclusion
“This has been incredibly helpful. Today, you mentioned… [Try to briefly summarize
some key parts of the discussion or issues.] Your input is really valuable for me and will
help me ideate the next steps for these ideas. I really appreciate your taking the time to
come in, and answering all of my questions. Thank you so much!”
Appendix 2: Consent Form
I agree to participate in the study of a flight booking website being conducted as part of the
Coursera course: Evaluating Designs with Users. I consent to the recording of this test. This
recording will be used for research and product improvements only. I understand that
participation in this usability study is voluntary and I agree to immediately raise any concerns or
areas of discomfort during the session with the study administrator. Please sign below to
indicate that you have read and you understand the information on this form and that any
questions you might have about the session have been answered.
Thank you! We appreciate your participation.
Date:
Name:
Appendix 3: Questionnaires
Pre-test Questionnaire
1. Have you used GoIbibo.com before?
2. Tell me about the last trip you planned.
- What do you usually use to plan your trip?
- What is your primary purpose for travelling?
- What is your primary concern?
- What is your budget?
3. What information is the most important when you are planning your trip?
4. How often do you travel?
Post-task Questionnaire
1. It was easy to find my way to the information needed to complete this task from the
homepage.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
2. As I was searching for this information, I was able to keep track of where I was on the
website.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
3. I was able to accurately predict which section of the website contained this information.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
Post-test Questionnaire
Qualitative Questionnaire
1. What did you like the most about the website?
2. What did you like the least about the website?
3. What changes would you make to better the site’s user experience?
4. How would you describe this website to another person?
5. Under what circumstances would you visit this website in the future? Why?
SUS Questionnaire
1. I think that I would like to use this system frequently.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
2. I found the system unnecessarily complex.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
3. I thought the system was easy to use.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
4. I think that I would need the support of a technical person to be able to use this system.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
5. I found the various functions in this system were well integrated.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
6. I thought there was too much inconsistency in this system.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
7. I would imagine that most people would learn to use this system very quickly.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
8. I found the system very cumbersome to use.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
9. I felt confident using the system.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
10. I needed to learn a lot of things before I could get going with this system.
1 2 3 4 5
Strongly disagree ⭕ ⭕ ⭕ ⭕ ⭕ Strongly agree
Appendix 4: Task Scenarios
Table 10: List of task scenarios provided to the participants
| Task | Brief |
|---|---|
| 1 | Your manager asks you to help her plan a few trips for the company. She has heard of a website called "GoIbibo.com" that can help and encourages you to use it. Plan a round trip from Delhi to Mumbai for under INR 7,000 (or the next cheapest price) from September 16, 2020, to September 19, 2020. Note: unless otherwise specified, any arrival/departure time is okay. |
| 2 | 4 people from the Chennai office want to attend a conference in Bangalore from September 8, 2020, to September 10, 2020. What is the cheapest total price of the trip? |
| 3 | Your manager wants to join the Chennai team in Bangalore (your office is in Delhi), but then she wants to go to Dubai for a week and then return to Delhi. She plans to fly business class for the entire trip. What is the cheapest price for her trip? |
| 4 | The Pune office manager has a meeting in Bangalore on October 16, 2020, at noon. She wants to leave on October 15 after 9 am, and has to arrive back in Pune anytime before 9 pm on the next day. How many flight options do you have? |
| 5 | Help your manager book a place to stay from October 16-18. Find the top-rated hotel that has wifi for under INR 7000/night in Bangalore. |
| 6 | You want to surprise your family with a visit over Christmas but money is tight. Set up a fare alert for a trip from Delhi to Kochi from December 22, 2020 to December 26, 2020. |
Appendix 5: Filled Out Logging Forms
Participant 1
Participant 2
Participant 3
Participant 4
(The filled-out logging sheets for each participant appear as scanned images in the original report.)
Appendix 6: Questionnaire Responses
Table 11: Collated questionnaire responses
| | Participant 1 | Participant 2 | Participant 3 | Participant 4 |
|---|---|---|---|---|
| Completed Date & Time | July 11, 2020, 4:54 PM | July 12, 2020, 4:10 PM | July 12, 2020, 8:45 PM | July 13, 2020, 12:30 AM |
| Demographic Details | | | | |
| Age | 23 | 22 | 22 | 23 |
| Gender | Male | Female | Male | Male |
| Occupation | Student | Business Intelligence Analyst | VLSI Intern | Software Developer |
| Pre-test Questionnaire | | | | |
| Have you used GoIbibo.com before? | No | Yes | Yes | No |
| Tell me about the last trip you planned. | Chennai - Bangalore (and back) - Train, Nov 2019. | Chennai - Pune (one-way), July 2020. | Vizag - Chennai (and back), Dec 2019. | Bangalore - Delhi (one-way), Feb 2020. |
| What do you usually use to plan your trip? | Ixigo. | Make My Trip. | Make My Trip. | Make My Trip, airline carrier website. |
| What was your primary purpose for travelling? | Internship/project purposes, trips with friends. | Lives away from family due to work; visits them on vacations. | Pilgrimage trip with family. | To attend a family wedding. |
| What was your budget? | Rs. 2000 - 4000 per ticket. | Rs. 1000 - 2000 per ticket. | Rs. 4000 - 5000 per ticket. | Depends on the situation; open to paying more if flight timings are convenient. |
| What information is the most important when you are planning your trip? | Route timings, web check-in, prices. | Immediate updates on changes in flight status (Email/SMS), easy & quick payment options, prices. | Prices, availability of window seats. Loyal to one airline (Indigo) because of the leg room it offers; only searches within that airline when booking. | Flight timings, baggage limit; uses flight aggregator sites to get an overview of available flight options and then books off the official airline's website to save costs. |
| How often do you travel (say, every 12 months)? | 2 | 4 | 3 | 6 |
| Task 1 | | | | |
| Ease in Finding Information | 5 | 5 | 5 | 5 |
| Keeping Track of Location in Site | 4 | 4 | 5 | 4 |
| Predicting Information Section | 3 | 4 | 5 | 5 |
| Task 2 | | | | |
| Ease in Finding Information | 5 | 5 | 5 | 4 |
| Keeping Track of Location in Site | 4 | 5 | 5 | 5 |
| Predicting Information Section | 3 | 4 | 5 | 4 |
| Task 3 | | | | |
| Ease in Finding Information | 4 | 5 | 2 | 5 |
| Keeping Track of Location in Site | 4 | 5 | 5 | 5 |
| Predicting Information Section | 2 | 5 | 2 | 4 |
| Task 4 | | | | |
| Ease in Finding Information | 5 | 5 | 5 | 4 |
| Keeping Track of Location in Site | 4 | 5 | 5 | 5 |
| Predicting Information Section | 3 | 5 | 5 | 4 |
| Task 5 | | | | |
| Ease in Finding Information | 4 | 5 | 3 | 4 |
| Keeping Track of Location in Site | 3 | 5 | 5 | 5 |
| Predicting Information Section | 3 | 5 | 5 | 4 |
| Task 6 | | | | |
| Ease in Finding Information | 1 | 1 | 1 | 1 |
| Keeping Track of Location in Site | 1 | 5 | 1 | 5 |
| Predicting Information Section | 1 | 2 | 1 | 1 |
| Post-test Questionnaire | | | | |
| I think that I would like to use this system frequently. | 4 | 5 | 5 | 4 |
| I found the system unnecessarily complex. | 2 | 1 | 1 | 2 |
| I thought the system was easy to use. | 4 | 5 | 4 | 4 |
| I think that I would need the support of a technical person to be able to use this system. | 2 | 1 | 1 | 2 |
| I found the various functions in this system were well integrated. | 3 | 5 | 4 | 2 |
| I thought there was too much inconsistency in this system. | 3 | 1 | 1 | 2 |
| I would imagine that most people would learn to use this system very quickly. | 4 | 5 | 4 | 5 |
| I found the system very cumbersome to use. | 2 | 4 | 1 | 2 |
| I felt confident using the system. | 4 | 5 | 5 | 4 |
| I needed to learn a lot of things before I could get going with this system. | 2 | 1 | 1 | 2 |
Appendix 7: Complete List of Usability Issues
Heuristic evaluation is a cheap, fast, and easy-to-use usability engineering method, popularized
by Jakob Nielsen in 1994, for finding usability problems in user interface designs. The heuristics
cover topics such as feedback, visibility, user control, user efficiency, help, error handling, error
prevention, and use of metaphors that match the real world. Each finding is given a severity
rating from 1 (cosmetic problem) to 4 (usability catastrophe) and assigned a recommendation.
Table 12: List of usability issues found in GoIbibo.com
| Finding | Heuristics Violated | Severity Rating |
|---|---|---|
| The Sign In/Sign Up links are not prominent enough on the navigation bar. The user profile icon is also shown even when the user is not logged in. | #2 Match between system and the real world | 4 |
| On the Payment Details page, some collapsible sections of the payment options are collapsed by default and look greyed out. This may give the user the impression that these elements are disabled and not clickable. | #1 Visibility of system status; #2 Match between system and the real world | 4 |
| When booking for more than one passenger, only the ticket rate for one passenger is shown in the search results, rather than the total price for the number of passengers given. This violates the mental model of the user, who would want to know the total price they would have to pay before clicking further. | #3 User control and freedom; #7 Flexibility and efficiency of use | 4 |
| For bookings involving more than one passenger, the user is able to list the same passenger name and details for all passengers. The system doesn't recognize this and even saves these details to be auto-filled in later bookings. | #2 Match between system and the real world; #5 Error prevention | 4 |
| The Fare Alerts feature (as advertised) does not appear anywhere else on the site. This is also an ethical concern of false advertising, as there is no way a user can set up fare alerts for a particular flight on the current iteration of the site. | #7 Flexibility and efficiency of use; #3 User control and freedom | 4 |
| Filled-in forms don't get carried over when the user switches to another flight journey option. | #2 Match between system and the real world; #3 User control and freedom | 3 |
| Clicking on the 'Student Fare' checkbox does not result in any visible and immediate feedback. Also, this option is only provided on the homepage and does not appear anywhere else in the site. | #1 Visibility of system status; #2 Match between system and the real world | 3 |
| When searching for hotels, one common filter that the user would use is to check for the availability of certain amenities. While an amenities filter with more than 20 listed facilities is provided on the site, it uses relatively obscure names for some services, which in turn reduces their findability and frustrates the user (for example, the filter lists 'Internet' or 'Free Internet' instead of the more commonly used 'Wifi'). | #2 Match between system and the real world; #6 Recognition rather than recall | 3 |
| The sub-sites under GoIbibo.com are all visually inconsistent with each other in terms of layout, styling and user interface. | #4 Consistency and standards; #8 Aesthetic and minimalist design | 2 |
| In the footer section, there is very low contrast between foreground and background colors. Adequate contrast is necessary for all users, especially users with low vision. These are quick links to flights to some of the more popular destinations and should therefore be more prominent on the page. | #8 Aesthetic and minimalist design | 2 |
| Elements are referred to by different names throughout the website; consistent naming conventions aren't followed and reflected in the copy. One major blunder is the search bar on the homepage having "From" and "Destination" fields. | #4 Consistency and standards; #6 Recognition rather than recall | 2 |
| Some error messages and other pop-up messages get very little screen real estate and are placed such that they don't stand out at all. Choose a specific tone, colors and format for these messages so that they stand out better. | #4 Consistency and standards; #9 Help users recognize, diagnose, and recover from errors | 2 |
| On the search box, the prices listed in the date pickers are illegible. Consider scaling up the element to make the text appear more clearly. Also, shorten five- or six-figure prices to 'K' notation for better readability in a small space. | #8 Aesthetic and minimalist design | 1 |
| The fonts, icons and user interface of conventional sections like navigation, search, footer, etc. are inconsistent across the site. | #4 Consistency and standards | 1 |

More Related Content

PPTX
Diabetes Mellitus
PPTX
Hypertension
PPTX
Republic Act No. 11313 Safe Spaces Act (Bawal Bastos Law).pptx
PPTX
Power Point Presentation on Artificial Intelligence
PDF
Caça palavras - Bullying
PPTX
PDF
Atividade ortográfica - Caçada aos erros
Diabetes Mellitus
Hypertension
Republic Act No. 11313 Safe Spaces Act (Bawal Bastos Law).pptx
Power Point Presentation on Artificial Intelligence
Caça palavras - Bullying
Atividade ortográfica - Caçada aos erros

Similar to User test report for a flight booking travel website jeffrey jacob (20)

PDF
Evaluating the Usability of GrantFinder
PDF
Usability Testing Search Engines
PDF
A METHOD FOR WEBSITE USABILITY EVALUATION: A COMPARATIVE ANALYSIS
DOCX
White Paper
PDF
Panoptic Streaming Usability Report.pdf
PDF
Measuring the facility of use of a website designed with a methodology based ...
PDF
JURNAL AKUNTANSI DAN EKONOMI BISNIS INDONESIA
PDF
Online Examination _Advocacy Document
DOCX
Mockup Usability Test for Golden Rule Rentals Website
DOCX
Online Polling System Proposal
DOCX
eFolioMinnesota Text-Based Usability Test Findings and Analysis Report
PPTX
Mobile Healthcare App
PDF
Interview Process Instructional Plan
PPT
Usability Primer - for Alberta Municipal Webmasters Working Group
DOCX
Documentation seminar
PDF
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
PDF
Usability Report
PDF
A QUANTITATIVE APPROACH IN HEURISTIC EVALUATION OF E-COMMERCE WEBSITES
PPTX
Survey Instrument Validity - OER Project
PDF
Model design to develop online web based questionnaire
Evaluating the Usability of GrantFinder
Usability Testing Search Engines
A METHOD FOR WEBSITE USABILITY EVALUATION: A COMPARATIVE ANALYSIS
White Paper
Panoptic Streaming Usability Report.pdf
Measuring the facility of use of a website designed with a methodology based ...
JURNAL AKUNTANSI DAN EKONOMI BISNIS INDONESIA
Online Examination _Advocacy Document
Mockup Usability Test for Golden Rule Rentals Website
Online Polling System Proposal
eFolioMinnesota Text-Based Usability Test Findings and Analysis Report
Mobile Healthcare App
Interview Process Instructional Plan
Usability Primer - for Alberta Municipal Webmasters Working Group
Documentation seminar
IJCER (www.ijceronline.com) International Journal of computational Engineerin...
Usability Report
A QUANTITATIVE APPROACH IN HEURISTIC EVALUATION OF E-COMMERCE WEBSITES
Survey Instrument Validity - OER Project
Model design to develop online web based questionnaire
Ad

Recently uploaded (20)

PPTX
building Planning Overview for step wise design.pptx
PPTX
6- Architecture design complete (1).pptx
PDF
YOW2022-BNE-MinimalViableArchitecture.pdf
PDF
SEVA- Fashion designing-Presentation.pdf
PDF
Trusted Executive Protection Services in Ontario — Discreet & Professional.pdf
PDF
Phone away, tabs closed: No multitasking
PPTX
Entrepreneur intro, origin, process, method
PDF
GREEN BUILDING MATERIALS FOR SUISTAINABLE ARCHITECTURE AND BUILDING STUDY
PDF
Quality Control Management for RMG, Level- 4, Certificate
DOCX
The story of the first moon landing.docx
PDF
Design Thinking - Module 1 - Introduction To Design Thinking - Dr. Rohan Dasg...
PPTX
AD Bungalow Case studies Sem 2.pptxvwewev
PPTX
BSCS lesson 3.pptxnbbjbb mnbkjbkbbkbbkjb
PDF
Urban Design Final Project-Site Analysis
PPTX
Complete Guide to Microsoft PowerPoint 2019 – Features, Tools, and Tips"
PPTX
areprosthodontics and orthodonticsa text.pptx
PPTX
rapid fire quiz in your house is your india.pptx
PDF
Urban Design Final Project-Context
PDF
BRANDBOOK-Presidential Award Scheme-Kenya-2023
PPTX
DOC-20250430-WA0014._20250714_235747_0000.pptx
building Planning Overview for step wise design.pptx
6- Architecture design complete (1).pptx
YOW2022-BNE-MinimalViableArchitecture.pdf
SEVA- Fashion designing-Presentation.pdf
Trusted Executive Protection Services in Ontario — Discreet & Professional.pdf
Phone away, tabs closed: No multitasking
Entrepreneur intro, origin, process, method
GREEN BUILDING MATERIALS FOR SUISTAINABLE ARCHITECTURE AND BUILDING STUDY
Quality Control Management for RMG, Level- 4, Certificate
The story of the first moon landing.docx
Design Thinking - Module 1 - Introduction To Design Thinking - Dr. Rohan Dasg...
AD Bungalow Case studies Sem 2.pptxvwewev
BSCS lesson 3.pptxnbbjbb mnbkjbkbbkbbkjb
Urban Design Final Project-Site Analysis
Complete Guide to Microsoft PowerPoint 2019 – Features, Tools, and Tips"
areprosthodontics and orthodonticsa text.pptx
rapid fire quiz in your house is your india.pptx
Urban Design Final Project-Context
BRANDBOOK-Presidential Award Scheme-Kenya-2023
DOC-20250430-WA0014._20250714_235747_0000.pptx
Ad

User test report for a flight booking travel website jeffrey jacob

  • 1. Usability Test Report of GoIbibo.com By Jeffrey Jacob for the Evaluating Designs With Users course offered by the University of Michigan on Coursera
  • 2. Table of Contents List of Figures 4 List of Tables 5 Executive Summary 6 Introduction 7 Methods 8 Usability Metrics Measured 10 Findings and Recommendations - Task Completion Rate - Task Ratings - Time on Task - Errors - Overall Satisfaction - Summary of Data - Key Findings and Recommendations 11 11 12 14 15 16 19 20 Limitations 28 Conclusion 29 References 30 Appendix 1: User Test Scripts - Pre-test Checklist - Post-test Checklist - User Test Script 31 31 32 33 Appendix 2: Consent Form 35 Appendix 3: Questionnaires - Pre-test Questionnaire 36 36 2
  • 3. - Post-task Questionnaire - Post-test Questionnaire 37 38 Appendix 4: Task Scenarios 40 Appendix 5: Filled Out Logging Forms - Participant 1 - Participant 2 - Participant 3 - Participant 4 41 41 44 48 53 Appendix 6: Questionnaire Responses 57 Appendix 7: Complete List of Usability Issues 62 3
  • 4. List of Figures Figure 1: User profile drop-down on hover. 20 Figure 2: Collapsible sections of the payment details here look like they are disabled 21 Figure 3: The search results for a 4-passenger journey. Note how the fare of one ticket is listed on the right 22 Figure 4: The traveller details section of the checkout page 23 Figure 5: The fare alerts modal as advertised on the website 24 Figure 6: The search form (above) filled; in round-trip mode; (below) form reset automatically on switching to multi city 25 Figure 7: The search form (above) filled; in round-trip mode; (below) form reset automatically on switching to multi city 26 Figure 8: The footer section of the GoIbibo.com homepage 27 4
  • 5. List of Tables Salient demographic details of the participants 9 Task Completion Rate 11 Mean Task Ratings 12 Time on Task (in seconds) 14 Errors 15 SUS questionnaire responses 16 A general guideline used to interpret SUS scores 18 SUS scores for each participant 18 Summary of Completion, Errors, Time on Task, Mean Satisfaction 19 List of task scenarios provided to the participants 40 Collated questionnaire responses 57 List of usability issues found in GoIbibo.com 62 5
  • 6. Executive Summary I conducted a remote usability test from July 11th - 13th , 2020. The purpose of the test was to assess the usability of the web interface design, information flow and information architecture. A total of four participants participated in the test. Each individual test lasted approximately two hours. The test scenarios consisted of six tasks adapted from the tasks listed in the support materials of this course. In general, all participants found the GoIbibo.com website to be clear, straightforward and easy to use. Two of the four use travel booking websites at least once every 3 months to book flights. The test identified only a few minor problems including: - Sign In/Sign Up links not prominent enough on the navigation bar. - Collapsible sections in the payment details page look like they are disabled. - Only the ticket rate for one passenger is shown on the search results, even when booking for multiple passengers. - The user is able to list the same passenger name and details for all passengers travelling. - The Fare Alerts feature does not appear anywhere on the site. - Filled-in forms don’t get carried over when the user switches to another flight journey option. - Obscure and hard-to-find copy is used to describe the Amenities filter options. - Very low contrast between foreground and background colors in the footer section. This document contains the participant feedback, satisfaction ratings, task completion rates, ease or difficulty of completion ratings, time on task, errors, SUS scores and recommendations for improvements. A copy of the scenarios, scripts and questionnaires used are included in the Appendix section. 6
  • 7. Introduction To put it shortly, the goal of the test is defined as answering the question “Can frequent travellers use GoIbibo.com to plan their trips?” GoIbibo.com is one of India’s largest flights and hotels aggregator platforms, attracting over 2 million visits per month. GoIbibo.com provides booking services for hotel reservations, flight tickets, bus tickets, outstation/inter-city cab & taxi, IRCTC train ticket bookings, etc. To assess the ease of use and efficiency in making flight and hotel bookings through GoIbibo.com, I conducted a remote moderated usability test using Google Meet, a video conferencing tool and Google Forms, a survey administration service. The rest of this document describes the various methods used in designing and conducting the tests and the metrics and heuristics used in analyzing the findings from these tests into actionable results. 7
  • 8. Methods I reached out to a few colleagues and friends who fit within the requirements for the test. I selected the participants for this test using the following recruiting criteria: - Participants must have bought a plane ticket online in the past year. - Participants must not have used the site before. Table 1 lists the basic demographic details of the participants. All four participants were young adults of age 22/23 - two working professionals, one research scholar and one undergraduate student. Of the four participants, one was female and three were male. Table 1: Salient demographic details of the participants Participant Age Gender Occupation Frequency of travel (every 12 months) Specific preferences while booking 1 23 Male Student 2 Usually travels with his friends so the availability of seats is a major concern along with listings of alternative options within an affordable budget. 2 22 Female Business Intelligence Analyst 4 Price, travel insurance, multiple payment options (specifically net banking & UPI). 3 22 Male VLSI Intern 3 Price; prefers booking from a particular airline. 4 23 Male Software Developer 6 Flight timings and flight duration; usually visits flight aggregator sites to get an idea of his available options. I sent emails to attendees informing them of the test logistics and requesting their availability and participation. Participants responded with an appropriate date and time. Each individual 8
  • 9. session was conducted over Google Meet and lasted approximately one hour. During the session, I briefed on what the test session would entail and asked the participant some background questions and to fill out a consent form (listed at Appendix 3 and Appendix 2 respectively). The participants read the task scenarios and tried to find the information on the website (see Appendix 4 for the task descriptions). See Appendix 1 for the complete checklists and user scripts used for the tests. After each task, I asked the participant to rate the interface on a 5-point Likert Scale with measures ranging from Strongly Disagree to Strongly Agree. Post-task scenario subjective measures included (see Appendix 3): 1. How easy it was to find the information from the home page. 2. Ability to keep track of their location throughout the website. 3. Accurateness of predicting which section of the website contained the information. After the last task was completed, I asked the participant to rate the perceived usability of the website overall using a SUS questionnaire (see Appendix 3). All questionnaire responses filled by the participants are listed in Appendix 6. Recordings of the test sessions were taken to playback for review and analysis later. Critical events, incidents and timestamps were documented on logging sheets for quick review later. Scanned copies of these logging sheets can be found in Appendix 5. Some key findings from the tests are documented in this report along with the necessary recommendations. Standard usability metrics and Nielsen’s 10 heuristic principles were used in arriving at these results. A complete list of these usability issues is shown in Appendix 7. 9
  • 10. Usability Metrics Measured We used some metrics to quantify the usability of the website during the evaluation. These metrics act as standards to measure, compare and communicate the effectiveness, efficiency and satisfaction that the website provides to the participants in their contexts. The metrics used in this exercise are listed below: - Task Completion Rate & Number of Errors to measure the effectiveness of the website, - Time on Task to measure its effectiveness and, - Task- and test-level satisfaction scores using questionnaires: After Scenario Questionnaire (ASQ) to derive task ratings and System Usability Scale (SUS) to understand overall satisfaction. 10
  • 11. Findings and Recommendations Task Completion Rate All participants successfully completed Task 1 to 5. None of the participants were able to complete Task 6 which required them to set up a fare alert for a particular flight. Table 2: Task Completion Rate Participant Task 1 Task 2 Task 3 Task 4 Task 5 Task 6 1 ✔ ✔ ✔ ✔ ✔ ❌ 2 ✔ ✔ ✔ ✔ ✔ ❌ 3 ✔ ✔ ✔ ✔ ✔ ❌ 4 ✔ ✔ ✔ ✔ ✔ ❌ Success 4 4 4 4 4 0 Completion Rate 100% 100% 100% 100% 100% 0% 11
  • 12. Task Ratings After the completion of each task, participants rated the ease or difficulty of completing the task for three factors: - It was easy to find my way to this information from the home page. - I was able to keep track of where I was on the website as I was searching for this information. - I was able to accurately predict which section of the website contained this information. The 5-point rating scale ranged from 1 (strongly disagree) to 5 (strongly agree) with mean agree rating being 4. An overall rating of >4.0 is considered to satisfy all three conditions. Table 3 lists the task ratings for each of the six tasks by each of the factors listed above and their means. Table 3: Mean Task Ratings Task Ease in Finding Information Keeping Track of Location in Site Predicting Information Section Mean Task Rating 1 5.00 (100%) 4.25 (85%) 4.25 (85%) 4.50 2 4.75 (95%) 4.75 (95%) 4.00 (80%) 4.50 3 4.00 (80%) 4.75 (95%) 3.25 (65%) 3.00 4 4.75 (95%) 4.75 (95%) 4.25 (85%) 4.58 5 4.00 (80%) 4.50 (90%) 4.25 (85%) 4.25 6 1.00 (20%) 3.00 (60%) 1.25 (25%) 1.75 12
  • 13. Ease in Finding Information All participants agreed it was easy to find necessary information for the cheapest round trip flight ticket (task 1; rating = 5.00), the cheapest flight available for multiple passengers (task 2; rating = 4.75) and the number of flight options available at short notice (task 4; rating = 4.75). 80% of users found it easy to book multi-city flights (task 3; rating = 4.00) and hotels using relevant filters (task 5; rating = 4.00). Only 20% of users found it easy to create a fare alert (task 6; rating = 1.00). Keeping Track of Location in Site All the participants found it easy to keep track of their location in the site while finding the cheapest flight available for multiple passengers (task 2; rating = 4.75), cheapest flights for a multi-city trip (task 3; rating = 4.75), the number of flight options available at short notice (task 4; rating = 4.75) and finding a top-rated hotel for a stay (task 5; rating = 4.50). In addition, 85% found it easy to keep track of their location while finding the cheapest round trip flight ticket (task 1; rating = 4.25). However, only 60% of participants found it easy to keep track of their location while finding fare alert sections (task 6; rating = 3.00). Predicting Information Section All the participants agreed it was easy to predict where to find the cheapest round trip flight ticket for a given source and destination (task 1; rating = 4.25), the number of flight options available at short notice (task 4; rating = 4.25) and to find a top-rated hotel for a stay (task 5; rating = 4.25). 80% of participants agreed it was easy to predict where to find cheap flights for multiple passengers (task 2; rating = 4.00). However, only 65% agreed that it was easy to predict where to go to book for a multi-city trip (task 3; rating = 3.25) and only 25% agreed they could predict where to find the fare alert section (task 6; rating = 1.25). 13
Time on Task

I used a timer to record the time on task for each participant. Some tasks were inherently more difficult to complete than others, and this is reflected in the average time on task. Table 4 displays the time taken by each participant for each task and the average time on task.

Table 4: Time on Task (in seconds)

Participant          Task 1   Task 2   Task 3   Task 4   Task 5   Task 6
1                    182      223      524      272      840      792
2                    146      184      379      209      531      558
3                    192      318      340      326      749      450
4                    397      295      422      362      534      856
Avg. Time on Task    229      255      416      293      663      664

Task 1 required participants to find the cheapest flight available for a round trip and took the shortest time to complete (mean = 229 seconds). Completion times ranged from 146 seconds (under 2.5 minutes) to 397 seconds (more than 6 minutes), with most times under 240 seconds (less than 4 minutes). Task 6 required participants to create a fare alert for a flight and took the longest (mean = 664 seconds). No participant completed this task; the time spent attempting it ranged from 450 seconds (7.5 minutes) to 856 seconds (more than 14 minutes).
Errors

Table 5 below captures the number of critical errors participants made while trying to complete the task scenarios.

Table 5: Errors

Participant   Task 1   Task 2   Task 3   Task 4   Task 5   Task 6
1             -        -        1        -        -        1
2             -        -        -        1        1        1
3             -        -        2        2        1        1
4             -        -        -        -        -        1
Errors        0        0        3        3        2        4

Among the six tasks, Task 6 produced the highest number of errors, and none of the participants were able to complete it. The prompt was to create a fare alert for a flight on a given route. Despite being advertised as a main USP of the service, users were unable to find their way to this feature; no clear signals pointing to it were found anywhere on the website. Task 1, which required participants to find the cheapest flight available for a round trip, and Task 2, where the user had to find the cheapest flight available for multiple passengers, were completed by all participants without any errors.
Overall Satisfaction

After completing all tasks, the participants took the System Usability Scale (SUS) questionnaire, which measures the perceived usability of the website. The questionnaire contains 10 statements, each rated on a 1-5 scale according to how strongly the participant agrees with it: 1 means strongly disagree and 5 means strongly agree. The statements and the results are listed below in Table 6.

Table 6: SUS questionnaire responses
(response counts: SD = strongly disagree (1), D = disagree (2), N = neutral (3), A = agree (4), SA = strongly agree (5))

1. I think that I would like to use this system frequently. A: 2, SA: 2. Mean 4.50, 100% agree.
2. I found the system unnecessarily complex. SD: 2, D: 2. Mean 1.50, 0% agree.
3. I thought the system was easy to use. A: 3, SA: 1. Mean 4.25, 100% agree.
4. I think that I would need the support of a technical person to be able to use this system. SD: 2, D: 2. Mean 1.50, 0% agree.
5. I found the various functions in this system were well integrated. D: 1, N: 1, A: 1, SA: 1. Mean 3.50, 50% agree.
6. I thought there was too much inconsistency in this system. SD: 2, D: 1, N: 1. Mean 1.75, 0% agree.
7. I would imagine that most people would learn to use this system very quickly. A: 2, SA: 2. Mean 4.50, 100% agree.
8. I found the system very cumbersome to use. SD: 1, D: 2, A: 1. Mean 2.25, 25% agree.
9. I felt confident using the system. A: 2, SA: 2. Mean 4.50, 100% agree.
10. I needed to learn a lot of things before I could get going with this system. SD: 2, D: 2. Mean 1.50, 0% agree.

The SUS scores were then calculated as follows:

- For every odd-numbered question, subtract 1 from the score (X - 1).
- For every even-numbered question, subtract the score from 5 (5 - X).
- Sum the adjusted scores and multiply the total by 2.5 to obtain a SUS score between 0 and 100, which is then graded as shown below.

Based on research, a SUS score above 68 is considered above average and anything below 68 is below average. Table 7 gives a primer on how to interpret a SUS score on a grading scale; a short worked example of the calculation follows Table 8.
Table 7: A general guideline used to interpret SUS scores

SUS Score    Grade   Adjective Rating
> 80.3       A       Excellent
68 - 80.3    B       Good
68           C       Okay
51 - 68      D       Poor
< 51         F       Awful

This scale was used to make sense of the data collected from the user tests, as shown in Table 8.

Table 8: SUS scores for each participant

Participant   Total SUS Score   Adjective Rating
1             70                Good
2             92.5              Excellent
3             92.5              Excellent
4             72.5              Good
Average       81.9              Excellent

With individual scores ranging from 70 to 92.5 and an average SUS score of 81.9, the GoIbibo website met, and for most participants exceeded, basic usability expectations.
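As a concrete illustration of the scoring procedure described above, the sketch below computes a SUS score from one participant's ten raw responses and maps it onto the grading scale in Table 7. The responses shown are Participant 1's answers from Appendix 6; the function names are illustrative, not part of any published SUS tooling.

```python
# Minimal SUS scoring sketch; responses are on a 1-5 scale, in SUS question order.
def sus_score(responses):
    assert len(responses) == 10
    adjusted = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0, 2, ... = odd-numbered questions (Q1, Q3, ...)
        for i, r in enumerate(responses)
    ]
    return sum(adjusted) * 2.5              # scales the total to 0-100

def sus_grade(score):
    # Band boundaries follow Table 7; exact treatment of boundary values varies in the literature.
    if score > 80.3:
        return "A (Excellent)"
    if score > 68:
        return "B (Good)"
    if score == 68:
        return "C (Okay)"
    if score > 51:
        return "D (Poor)"
    return "F (Awful)"

p1 = [4, 2, 4, 2, 3, 3, 4, 2, 4, 2]   # Participant 1's responses (Appendix 6)
print(sus_score(p1), sus_grade(sus_score(p1)))   # 70.0 B (Good)
```

Running the same function over the other participants' responses in Appendix 6 reproduces the 92.5, 92.5 and 72.5 scores shown in Table 8.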
Summary of Data

Table 9 displays a summary of the test data. Low completion rates and satisfaction ratings and high error counts and times on task (most notably for Task 6) stand out in the table.

Table 9: Summary of Completion, Errors, Time on Task, Mean Task Rating

Task   Task Completion   Errors   Time on Task (s)   Mean Task Rating
1      100%              0        229                4.50
2      100%              0        255                4.50
3      100%              3        416                4.00
4      100%              3        293                4.58
5      100%              2        663                4.25
6      0%                4        664                1.75

Except for one major task, the participants were able to accomplish the tasks without any hassle, and the statistics detailed above support that notion. Participants found GoIbibo.com to be well-organized, comprehensive, clean and uncluttered, very useful, and easy to use. However, they agreed that the website needs some fine-tuning in its user interface moving forward. Some major usability issues and violations were observed during testing; these isolated critical incidents are explained in detail in the next section.
Key Findings and Recommendations

This section provides recommended changes and justifications driven by the participant success rates, behaviours, and comments. Each recommendation includes a severity rating. The following recommendations will improve the overall ease of use and address the areas where participants experienced problems or found the interface or information architecture unclear.

Finding 1

Heuristics Violated: #2 Match between system and the real world
Severity Rating: 4, usability catastrophe

Description: The Sign In/Sign Up links are not prominent enough on the navigation bar. A user profile icon is also shown even when the user is not logged in, which may confuse the user, who may click it expecting account options.

Figure 1: User profile drop-down on hover.

Recommendation: Remove the profile icon and drop-down, and make the Sign In/Sign Up links more prominent on the page.
Finding 2

Heuristics Violated: #1 Visibility of system status; #2 Match between system and the real world
Severity Rating: 4, usability catastrophe

Description: On the Payment Details page, some collapsible sections are closed by default and look greyed out. This may give the user the impression that these elements are disabled and not clickable.

Figure 2: Collapsible sections of the payment details here look like they are disabled.

Recommendation: Redesign the sections so that they clearly look clickable, and adopt a high-contrast color scheme for them.
Finding 3

Heuristics Violated: #3 User control and freedom; #7 Flexibility and efficiency of use
Severity Rating: 4, usability catastrophe

Description: When booking for more than one passenger, only the ticket rate for one passenger is shown in the search results, rather than the total price for the number of passengers given. This violates the mental model of the user, who would want to know the total price they would have to pay before clicking further.

Figure 3: The search results for a 4-passenger journey. Note how the fare of one ticket is listed on the right.

Recommendation: List the total price for all passengers directly in the search results. This would save the user the extra cognitive load of calculating the fare for multiple passengers.
Finding 4

Heuristics Violated: #2 Match between system and the real world; #5 Error prevention
Severity Rating: 4, usability catastrophe

Description: For bookings involving more than one passenger, the user is able to list the same passenger name and details for all passengers. The system doesn't recognize this and even saves these details to be auto-filled in later bookings.

Figure 4: The traveller details section of the checkout page.

Recommendation: The system should validate the traveller details and check that each passenger's details are unique before allowing the booking to proceed; an illustrative uniqueness check is sketched below.
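One way to act on this recommendation is a simple uniqueness check on the traveller list before the booking is submitted. The sketch below is illustrative only: the field names and the duplicate rule (same first name, last name and age) are assumptions for demonstration, not GoIbibo's actual validation logic.

```python
# Illustrative duplicate-traveller check; field names and duplicate rule are assumed.
def find_duplicate_travellers(travellers):
    """Return indices of travellers whose details repeat an earlier entry."""
    seen = set()
    duplicates = []
    for i, t in enumerate(travellers):
        key = (t["first_name"].strip().lower(), t["last_name"].strip().lower(), t["age"])
        if key in seen:
            duplicates.append(i)
        else:
            seen.add(key)
    return duplicates

booking = [
    {"first_name": "Asha", "last_name": "Rao", "age": 29},
    {"first_name": "Asha", "last_name": "Rao", "age": 29},  # same details entered twice
]
if find_duplicate_travellers(booking):
    print("Please enter unique details for each traveller.")
```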
Finding 5

Heuristics Violated: #7 Flexibility and efficiency of use; #3 User control and freedom
Severity Rating: 4, usability catastrophe

Description: The Fare Alerts feature (as advertised in the modal shown in Figure 5) does not appear anywhere else on the site. This is also an ethical concern of false advertising, as there is no way a user can set up fare alerts for a particular flight on the current iteration of the site.

Figure 5: The fare alerts modal as advertised on the website.

Recommendation: Implement the feature as advertised: allow the user to register with an email address and provide a fare tracker that sends updates for a saved search by email. (The current website uses a phone number, or an automatically generated dummy email address, as the username.)
Finding 6

Heuristics Violated: #2 Match between system and the real world; #3 User control and freedom
Severity Rating: 3, major usability problem

Description: Values already entered in the search form are lost when the user switches to another flight journey option (for example, from round trip to multi-city).

Figure 6: The search form (above) filled, in round-trip mode; (below) the form reset automatically on switching to multi-city.

Recommendation: The system should carry the common form responses (source, destination, departure date, number of passengers) across all journey modes; a minimal sketch of this carry-over follows below.
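As a rough illustration of the recommendation, the sketch below keeps the fields shared by all journey modes when the mode changes and resets only the mode-specific ones. The field names and mode names are assumptions for demonstration, not GoIbibo's actual form model.

```python
# Illustrative only: carry shared search fields across journey-mode switches.
SHARED_FIELDS = {"source", "destination", "departure_date", "passengers", "travel_class"}

def switch_journey_mode(form_state, new_mode):
    """Return a fresh form for `new_mode`, preserving the fields common to all modes."""
    carried = {k: v for k, v in form_state.items() if k in SHARED_FIELDS}
    return {"mode": new_mode, **carried}

round_trip = {
    "mode": "round_trip",
    "source": "DEL",
    "destination": "BOM",
    "departure_date": "2020-09-16",
    "return_date": "2020-09-19",   # round-trip-specific; dropped on switch
    "passengers": 1,
}
print(switch_journey_mode(round_trip, "multi_city"))
# {'mode': 'multi_city', 'source': 'DEL', 'destination': 'BOM',
#  'departure_date': '2020-09-16', 'passengers': 1}
```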
Finding 7

Heuristics Violated: #2 Match between system and the real world; #6 Recognition rather than recall
Severity Rating: 3, major usability problem

Description: When searching for hotels, one common filter is checking for the availability of certain amenities. While an amenities filter with more than 20 listed facilities is provided on the site, it uses relatively obscure names for some services, which reduces their findability and frustrates the user (for example, the filter lists 'Internet' or 'Free Internet' instead of the more commonly used 'Wifi').

Figure 7: The Amenities filter options in the hotel search results.

Recommendation: Use clear, concise and common terms for the facilities (for example, 'Wifi' instead of 'Free Internet').
Finding 8

Heuristics Violated: #8 Aesthetic and minimalist design
Severity Rating: 2, minor usability problem

Description: In the footer section, there is very low contrast between the foreground and background colors. Adequate contrast is necessary for all users, especially users with low vision. These are quick links to flights to some of the more popular destinations and should therefore be more prominent on the page.

Figure 8: The footer section of the GoIbibo.com homepage.

Recommendation: Use accessibility-compliant color combinations. Increase the contrast ratio between the foreground (text) color and the background color to at least 4.5:1; large text (at least 18 point, or 14 point bold) does not require as much contrast as smaller text and can use a ratio of at least 3:1. A small sketch for checking a color pair against these thresholds follows below.

Other findings are listed in Appendix 7.
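The sketch below checks a foreground/background color pair against these thresholds using the WCAG 2.x relative-luminance and contrast-ratio formulas. The example hex values are placeholders, not the actual colors used in the GoIbibo footer.

```python
# WCAG 2.x contrast-ratio check; example colors are placeholders.
def _channel(c8):
    # Linearize an 8-bit sRGB channel (constants from the WCAG 2.x definition).
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color):
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _channel(r) + 0.7152 * _channel(g) + 0.0722 * _channel(b)

def contrast_ratio(fg, bg):
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

ratio = contrast_ratio("#9b9b9b", "#2d2d2d")   # hypothetical footer text on a dark background
print(round(ratio, 2), "OK for body text" if ratio >= 4.5 else "fails the 4.5:1 threshold")
```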
Limitations

A major limitation was the way these tests were conducted. Owing to the circumstances caused by the coronavirus pandemic, I chose to conduct the tests remotely. Participants were recruited over email, and all arrangements for the user tests were made with the Google suite of services: Calendar to schedule meetings, Meet to host and record the interviews, Forms for the questionnaires and surveys, and Drive to store and play back the video recordings.

All the recruited participants were from a narrow age bracket of 22-23. Because of time constraints, I reached out to my immediate social circles for recruitment, and the test was devised considering the goals and actions of that particular group of users.

Ideally, user tests are conducted by a team of two or three people so that moderating and logging can be handled separately. However, considering the constraints of the assignment, I acted as the sole moderator for the tests, handling both logging and moderating the interviews.
Conclusion

Most of the participants found GoIbibo.com to be well-organized, comprehensive, clean and uncluttered, very useful, and easy to use. A centralized site that accommodates all kinds of travel options in India is important to many, if not all, of the participants. Implementing the recommendations listed above and continuing to conduct tests with users in the future will help keep the website user-centred.
References

- Nielsen, J. (1994). Heuristic Evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons.
- https://www.nngroup.com/articles/usability-metrics/
- https://usabilitygeek.com/usability-metrics-a-guide-to-quantify-system-usability/
Appendix 1: User Test Scripts

Pre-test Checklist

1. Log in with the phone number and password.
2. Remove any saved login data.
3. Double-check the success criteria.
4. Upload and cross-check the Google Form link; the form contains the following:
   - Consent form
   - Task instructions
   - Post-test questionnaire
5. Print the test script and logging sheet.
6. Set up a Google Meet link and forward it to the participant ten minutes before the meeting time.
7. Once the participant joins, inform them about the background of this test and request consent to record and document the proceedings.
8. Send the survey link to the participant and ask them to fill out the consent form before proceeding.
9. Start the screen recording.
10. Request the participant to share their screen.
11. Ask the participant to clear their goibibo.com cookies in their browser.
12. Provide the participant with the login details (phone number and password).
Post-test Checklist

1. Stop the recording; ensure the audio and video of the meeting get saved to your Google Drive.
2. Verify that the participant filled out all parts of the survey form.
3. File the logging sheet.
User Test Script

1. Onboarding once the participant joins the meeting:

"Hi __! Thanks for coming in today! The goal for today's session is to test the website GoIbibo.com. I'm here to learn from you, so I'll ask a lot of questions, but I'm not testing you. There are no right or wrong answers.

I'll start this session by asking some background questions. Then I'll ask you to do some tasks. As you work on the tasks, please think aloud. This means that you should try to give a running commentary on what you're doing as you work through the tasks. Tell me what you're trying to do and how you think you can do it. If you get confused or don't understand something, please tell me. If you see things you like, tell me that too. Candid feedback is the most helpful here.

If you do get stuck, I'm going to try not to answer your questions or tell you what to do. I'm just trying to see what you would do if you were using it on your own. But don't worry, I'll help you if you get completely stuck. Do you have any questions before we begin?"

2. Present the consent form, summarize it, and obtain permission.

3. Present the pre-test questionnaire.

4. Task instructions: present the tasks one at a time. Read each task aloud and allow the participant to view the task through the form. After the completion of each task, ask the participant to fill out a quick post-task questionnaire before moving to the next one.

5. Once all tasks are completed, present the post-test questionnaire and debrief. Review parts of the test where the user struggled.

6. Conclusion:

"This has been incredibly helpful. Today, you mentioned… [Try to briefly summarize some key parts of the discussion or issues.] Your input is really valuable to me and will help shape the next steps for these ideas. I really appreciate you taking the time to come in and answer all of my questions. Thank you so much!"
Appendix 2: Consent Form

I agree to participate in the study of a flight booking website being conducted as part of the Coursera course Evaluating Designs with Users. I consent to the recording of this test. This recording will be used for research and product improvements only.

I understand that participation in this usability study is voluntary, and I agree to immediately raise any concerns or areas of discomfort during the session with the study administrator.

Please sign below to indicate that you have read and understood the information on this form and that any questions you might have about the session have been answered.

Thank you! We appreciate your participation.

Date:
Name:
Appendix 3: Questionnaires

Pre-test Questionnaire

1. Have you used GoIbibo.com before?
2. Tell me about the last trip you planned.
   - What do you usually use to plan your trip?
   - What is your primary purpose for travelling?
   - What is your primary concern?
   - What is your budget?
3. What information is the most important when you are planning your trip?
4. How often do you travel?
Post-task Questionnaire

Each statement was rated on a 5-point scale from 1 (strongly disagree) to 5 (strongly agree).

1. It was easy to find my way to the information needed to complete this task from the homepage.
2. As I was searching for this information, I was able to keep track of where I was on the website.
3. I was able to accurately predict which section of the website contained this information.
Post-test Questionnaire

Qualitative Questionnaire

1. What did you like the most about the website?
2. What did you like the least about the website?
3. What changes would you make to better the site's user experience?
4. How would you describe this website to another person?
5. Under what circumstances would you visit this website in the future? Why?

SUS Questionnaire

Each statement was rated on a 5-point scale from 1 (strongly disagree) to 5 (strongly agree).

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt confident using the system.
10. I needed to learn a lot of things before I could get going with this system.
Appendix 4: Task Scenarios

Table 10: List of task scenarios provided to the participants

Task 1: Your manager asks you to help her plan a few trips for the company. She has heard of a website called "GoIbibo.com" that can help and encourages you to use it. Plan a round trip from Delhi to Mumbai for under INR 7,000 (or the next cheapest price) from September 16, 2020, to September 19, 2020. Note: unless otherwise specified, any arrival/departure time is okay.

Task 2: 4 people from the Chennai office want to attend a conference in Bangalore from September 8, 2020, to September 10, 2020. What is the cheapest total price of the trip?

Task 3: Your manager wants to join the Chennai team in Bangalore (your office is in Delhi), but then she wants to go to Dubai for a week and then return to Delhi. She plans to fly business class for the entire trip. What is the cheapest price for her trip?

Task 4: The Pune office manager has a meeting in Bangalore on October 16, 2020, at noon. She wants to leave on October 15 after 9 am, and has to arrive back in Pune any time before 9 pm the next day. How many flight options do you have?

Task 5: Help your manager book a place to stay from October 16-18. Find the top-rated hotel that has wifi for under INR 7,000/night in Bangalore.

Task 6: You want to surprise your family with a visit over Christmas but money is tight. Set up a fare alert for a trip from Delhi to Kochi from December 22, 2020, to December 26, 2020.
Appendix 5: Filled Out Logging Forms

Filled-out logging sheets for Participants 1-4 (scanned images; not reproduced in this text version).
Appendix 6: Questionnaire Responses

Table 11: Collated questionnaire responses (answers listed in participant order: P1 / P2 / P3 / P4)

Completed Date & Time: July 11, 2020, 4:54 PM / July 12, 2020, 4:10 PM / July 12, 2020, 8:45 PM / July 13, 2020, 12:30 AM

Demographic Details
Age: 23 / 22 / 22 / 23
Gender: Male / Female / Male / Male
Occupation: Student / Business Intelligence Analyst / VLSI Intern / Software Developer

Pre-test Questionnaire
Have you used GoIbibo.com before? No / Yes / Yes / No
Tell me about the last trip you planned: Chennai - Bangalore (and back) by train, Nov 2019 / Chennai - Pune (one-way), July 2020 / Vizag - Chennai (and back), Dec 2019 / Bangalore - Delhi (one-way), Feb 2020
What do you usually use to plan your trip? Ixigo / Make My Trip / Make My Trip / Make My Trip and airline carrier websites
What was your primary purpose for travelling? Internship and project purposes, trips with friends / Lives away from family due to work and visits them on vacations / Pilgrimage trip with family / To attend a family wedding
What was your budget? Rs. 2,000-4,000 per ticket / Rs. 1,000-2,000 per ticket / Rs. 4,000-5,000 per ticket / Depends on the situation; open to paying more if flight timings are convenient
What information is the most important when you are planning your trip? Route timings, web check-in, prices / Immediate updates on changes in flight status (email/SMS), easy and quick payment options, prices / Prices and availability of window seats; loyal to one airline (Indigo) because of the leg room it offers and only searches within that airline when booking / Flight timings and baggage limit; uses flight aggregator sites to get an overview of available flight options and then books off the official airline's website to save costs
How often do you travel (times per 12 months)? 2 / 4 / 3 / 6

Post-task Ratings (1 = strongly disagree, 5 = strongly agree)
Task 1 - Ease in Finding Information: 5 / 5 / 5 / 5
Task 1 - Keeping Track of Location in Site: 4 / 4 / 5 / 4
Task 1 - Predicting Information Section: 3 / 4 / 5 / 5
Task 2 - Ease in Finding Information: 5 / 5 / 5 / 4
Task 2 - Keeping Track of Location in Site: 4 / 5 / 5 / 5
Task 2 - Predicting Information Section: 3 / 4 / 5 / 4
Task 3 - Ease in Finding Information: 4 / 5 / 2 / 5
Task 3 - Keeping Track of Location in Site: 4 / 5 / 5 / 5
Task 3 - Predicting Information Section: 2 / 5 / 2 / 4
Task 4 - Ease in Finding Information: 5 / 5 / 5 / 4
Task 4 - Keeping Track of Location in Site: 4 / 5 / 5 / 5
Task 4 - Predicting Information Section: 3 / 5 / 5 / 4
Task 5 - Ease in Finding Information: 4 / 5 / 3 / 4
Task 5 - Keeping Track of Location in Site: 3 / 5 / 5 / 5
Task 5 - Predicting Information Section: 3 / 5 / 5 / 4
Task 6 - Ease in Finding Information: 1 / 1 / 1 / 1
Task 6 - Keeping Track of Location in Site: 1 / 5 / 1 / 5
Task 6 - Predicting Information Section: 1 / 2 / 1 / 1

Post-test Questionnaire (SUS, 1 = strongly disagree, 5 = strongly agree)
I think that I would like to use this system frequently: 4 / 5 / 5 / 4
I found the system unnecessarily complex: 2 / 1 / 1 / 2
I thought the system was easy to use: 4 / 5 / 4 / 4
I think that I would need the support of a technical person to be able to use this system: 2 / 1 / 1 / 2
I found the various functions in this system were well integrated: 3 / 5 / 4 / 2
I thought there was too much inconsistency in this system: 3 / 1 / 1 / 2
I would imagine that most people would learn to use this system very quickly: 4 / 5 / 4 / 5
I found the system very cumbersome to use: 2 / 4 / 1 / 2
I felt confident using the system: 4 / 5 / 5 / 4
I needed to learn a lot of things before I could get going with this system: 2 / 1 / 1 / 2
Appendix 7: Complete List of Usability Issues

Heuristic evaluation is a cheap, fast, and easy-to-use usability engineering method designed by Jakob Nielsen in 1994 to find usability problems in user interface designs. The heuristics cover topics such as feedback, visibility, user control, user efficiency, help, error handling, error prevention, and use of metaphors that match the real world. Each finding is given a severity rating from 1 (cosmetic problem) to 4 (usability catastrophe) and assigned a recommendation.

Table 12: List of usability issues found in GoIbibo.com

1. The Sign In/Sign Up links are not prominent enough on the navigation bar, and the user profile icon is shown even when the user is not logged in. (Heuristics violated: #2 Match between system and the real world. Severity: 4)

2. On the Payment Details page, some collapsible sections are closed by default and look greyed out. This may give the user the impression that these elements are disabled and not clickable. (Heuristics violated: #1 Visibility of system status; #2 Match between system and the real world. Severity: 4)

3. When booking for more than one passenger, only the ticket rate for one passenger is shown in the search results, rather than the total price for the number of passengers given. This violates the mental model of the user, who would want to know the total price they would have to pay before clicking further. (Heuristics violated: #3 User control and freedom; #7 Flexibility and efficiency of use. Severity: 4)

4. For bookings involving more than one passenger, the user is able to list the same passenger name and details for all passengers. The system doesn't recognize this and even saves these details to be auto-filled in later bookings. (Heuristics violated: #2 Match between system and the real world; #5 Error prevention. Severity: 4)

5. The Fare Alerts feature (as advertised in the modal shown in Figure 5) does not appear anywhere else on the site. This is also an ethical concern of false advertising, as there is no way a user can set up fare alerts for a particular flight on the current iteration of the site. (Heuristics violated: #7 Flexibility and efficiency of use; #3 User control and freedom. Severity: 4)

6. Filled-in forms don't get carried over when the user switches to another flight journey option. (Heuristics violated: #2 Match between system and the real world; #3 User control and freedom. Severity: 3)

7. Clicking the 'Student Fare' checkbox does not result in any visible and immediate feedback. The option is also only offered on the homepage and does not appear anywhere else on the site. (Heuristics violated: #1 Visibility of system status; #2 Match between system and the real world. Severity: 3)

8. When searching for hotels, one common filter is checking for the availability of certain amenities. While an amenities filter with more than 20 listed facilities is provided on the site, it uses relatively obscure names for some services, which reduces their findability and frustrates the user (for example, the filter lists 'Internet' or 'Free Internet' instead of the more commonly used 'Wifi'). (Heuristics violated: #2 Match between system and the real world; #6 Recognition rather than recall. Severity: 3)

9. The sub-sites under GoIbibo.com are visually inconsistent with each other in terms of layout, styling and user interface. (Heuristics violated: #4 Consistency and standards; #8 Aesthetic and minimalist design. Severity: 2)

10. In the footer section, there is very low contrast between foreground and background colors. Adequate contrast is necessary for all users, especially users with low vision. These are quick links to flights to some of the more popular destinations and should therefore be more prominent on the page. (Heuristics violated: #8 Aesthetic and minimalist design. Severity: 2)

11. Elements are referred to by different names throughout the website; consistent naming conventions aren't followed in the copy. One notable example is the search bar on the homepage, which pairs a "From" field with a "Destination" field. (Heuristics violated: #4 Consistency and standards; #6 Recognition rather than recall. Severity: 2)

12. Some error messages and other pop-up messages get very little screen real estate and are placed unintuitively, so they don't stand out at all. Choose a specific tone, color scheme and format for these messages so that they stand out better. (Heuristics violated: #4 Consistency and standards; #9 Help users recognize, diagnose, and recover from errors. Severity: 2)

13. In the search box, the prices listed in the date pickers are illegible. Consider scaling up the element to make the text appear more clearly, and shorten five- or six-figure prices to 'K' notation for better readability in a small space. (Heuristics violated: #8 Aesthetic and minimalist design. Severity: 1)

14. The fonts, icons and user interface of some conventional sections such as navigation, search and the footer are inconsistent across the site. (Heuristics violated: #4 Consistency and standards. Severity: 1)