BSc (Hons) Business Technology
An Exploration of the Gamification of Business Intelligence Tools
and the Effect on User Engagement
Gary Brogan
B00272662
22nd April 2016
Supervisor: Dr Carolyn Begg
Declaration
This dissertation is submitted in partial fulfilment of the requirements for the degree of BSc
Business Technology (Honours) in the University of the West of Scotland.
I declare that this dissertation embodies the results of my own work and that it has been
composed by myself. Following normal academic conventions, I have made due
acknowledgement to the work of others.
Name: GARY BROGAN
Signature:
Date: 22/04/2016
Library Reference Sheet
Surname- Brogan
First Name- Gary Initials- GB
Borrower ID Number- B00272662
Course Code – BTCS – COMPSCI
Course Description – BSc Hons Business Technology
Project Supervisor-Dr Carolyn Begg
Dissertation Title-
An Exploration of the Gamification of Business Intelligence Tools and the Effect
on User Engagement
Session- 2015/2016
Acknowledgements
I would like to thank both Dr Carolyn Begg and PhD student Stephen Miller for their
continued support and advice throughout this project. A special thank you goes to my wife
Tracey Brogan, who has been completely understanding and supportive during my time at
University, and my two daughters, Abbey and Liara, who have also fully supported me
throughout this journey.
Contents
Abstract.......................................................................................................................................5
Chapter 1: Introduction:............................................................................................................... 6
1.1 Introduction to Key Themes.................................................................................................6
1.2 Aims and objectives of the project........................................................................................ 6
1.3 Research methodology and techniques used ........................................................................7
1.4 Project scope and limitations............................................................................................... 7
Chapter 2: Literature review:........................................................................................................8
2.1 Business Intelligence ...........................................................................................................9
2.1.1 Business Intelligence defined......................................................................................... 9
2.1.2 Current state of Business Intelligence........................................................................... 10
2.1.3 Summary.................................................................................................................... 12
2.2 User Engagement with BI Tools.......................................................................................... 12
2.2.1 What is employee engagement?.................................................................................. 13
2.2.2 What is User Engagement?.......................................................................................... 13
2.2.3 BI Adoption Rate......................................................................................................... 13
2.2.4 Summary.................................................................................................................... 15
2.3 Gamification ..................................................................................................................... 15
2.3.1 Game Elements........................................................................................................... 17
2.3.2 Scope of Gamification ................................................................................................. 18
2.3.3 Successful Gamification............................................................................................... 19
2.3.4 Gamification platforms for Business Intelligence tools................................................... 19
2.3.5 Summary.................................................................................................... 20
2.4 Literature review conclusion.............................................................................................. 20
2.4.1 User Engagement........................................................................................................ 21
2.4.2 Enterprise Gamification relationship with BI................................................................. 21
2.4.3 Motivational Theory linked to SSBI and Gamification .................................................... 21
2.4.4 Summary.................................................................................................................... 21
Chapter 3: Research methodology: ............................................................................................ 21
3.1 Selection criteria............................................................................................................... 22
3.2 Project GamBIT................................................................................................................. 22
3.3 Ethical debate surrounding gamification and study participation.......................................... 23
3.4 Quantitative Research ....................................................................................................... 24
3.4.1 Measuring User Engagement....................................................................................... 25
3.5 Qualitative Research.......................................................................................................... 25
3.6 Methodological stages....................................................................................................... 26
3.6.1 Steps involved in open coding...................................................................................... 27
Chapter 4: Experimental and Interview Process:......................................................................... 28
4.2 GamBIT Experiment........................................................................................................... 28
4.2.1 Appeal for Volunteers ................................................................................................. 28
4.2.2 Experiment................................................................................................................. 29
4.3 Interview process.............................................................................................................. 29
Chapter 5: Results and Analysis:................................................................................................. 30
5.1 Quantitative data.............................................................................................................. 30
5.1.1 Participant Results and Analysis................................................................................... 31
5.1.2 Survey Information Results and Analysis...................................................... 33
5.1.3 UES statistical Results and Analysis............................................................................... 39
5.1.4 User engagement highest ranking factors..................................................... 40
5.1.5 User engagement lowest ranking factors...................................................... 43
5.1.6 Summary of UES data.................................................................................................. 44
5.1.7 Time taken to complete tasks results and analysis......................................................... 44
5.1.8 Summary of time taken to complete tasks.................................................................... 45
5.2 Qualitative data................................................................................................................. 45
5.2.1 Participant A............................................................................................................... 46
5.2.2 Participant B............................................................................................................... 48
5.2.3 Participant C............................................................................................................... 50
5.2.4 Participant D............................................................................................................... 51
5.3 Summary of Qualitative Data Results.................................................................................. 53
Chapter 6: Conclusion: .............................................................................................................. 54
6.1 Review of research objectives............................................................................................ 55
6.2 Discussion of primary and secondary conclusions................................................................ 55
6.3 Limitations placed on project............................................................................................. 56
6.4 Future research work ........................................................................................................ 57
6.5 Summary .......................................................................................................................... 58
Chapter 7: Critical evaluation: ..................................................................................................... 58
7.1 Reflecting on the initial stages of the project....................................................................... 58
7.2 Approach to project........................................................................................................... 59
7.3 Honours year modules....................................................................................................... 59
7.4 Project aids....................................................................................................................... 59
7.5 Summary .......................................................................................................................... 60
References................................................................................................................................. 60
Appendix:................................................................................................................................... 63
Appendix A Descriptive statistics.............................................................................................. 63
Appendix B Appeal for volunteers............................................................................................ 66
Appendix C Semi-Structured Interviews.................................................................................... 67
Appendix D Interview Guide.................................................................................................... 68
Appendix E Project Specification Form..................................................................................... 68
Appendix F Initial GamBIT development involvement ............................................................... 69
Appendix G User Engagement Scale including Research Survey ................................................. 70
Appendix H GamBIT gamification elements.............................................................................. 70
Abstract
The principal objective of the study is to explore the issue of lack of user engagement with
BI tools. The point it aims to address is whether making BI tools more fun and engaging, by
applying gamification to them, affects user engagement with the tools. This project will
explore the gamification of BI tools and the effects on user engagement, to see whether there
is an increase in user engagement.
The literature revealed that only 24% of staff who are exposed to BI tools are considered
actively engaged with them. It is also widely acknowledged that BI has not fulfilled its true
potential, with traditional BI best practices being considered a bit of a failure. Could
“gamifying” a BI tool affect user engagement with the tool and address the issue of lack of
engagement? To test this theory a prototype gamified BI tool has been developed, namely
Project GamBIT.
To carry out the research on the study objectives a mixed methodology was used which
helped gather both qualitative and quantitative data. This was deemed the most
appropriate approach to a study which is exploratory in nature as each approach has the
potential to enhance and/or complement the other in knowledge gained on the same
research problem.
To gather the quantitative data, the User Engagement Scale (UES) was applied to the GamBIT
prototype and the resulting data were analysed. To gather the qualitative data, semi-structured
interviews were conducted with volunteers who had taken part in the GamBIT experiment, in
an attempt to glean more information over and above the quantitative data.
The study is unique, as little credible academic research has been carried out on the lack
of user engagement with BI tools. The results of this study demonstrate that gamifying a BI
tool does increase certain user engagement factors and can increase motivation to use BI
tools more often. Feedback from the interviews conducted highlights further areas where user
engagement was considered to have increased more significantly.
Chapter 1: Introduction:
The first chapter will set the scene for the research that has been undertaken and will
provide an introduction to the topics covered in the report. It will cover the aims and
objectives of the report, the inspiration that led to the research being carried out, and the
research problem (that there is a lack of engagement with BI tools), and it aims to provide a
brief background on why this is. It will also give a very brief overview of the research
methods used along with the scope and limitations of the report.
1.1 Introduction to Key Themes
Business Intelligence systems and all their components have been around for a number of
years now. Business Intelligence (BI) has, since the late 1980s, evolved into a multi-billion
dollar industry. Its main purpose is to produce timely, accurate, high-value, and actionable
information. As a technology, BI has been seen to be underused and, as such, has
significant untapped potential. One of the main factors that contributes to it being
underused is a lack of user engagement with BI front-end tools. With the adoption rates of BI
tools remaining flat at around 24% over the past few years, many BI initiatives have failed to
deliver the expected results, leading to a common belief that traditional BI best practices
were “a bit of a failure” (Howson, C. 2014). To tackle the issue of lack of user engagement,
applying the concept of gamification to BI tools may offer a solution.
Organisations are increasingly recognising that applying gamification platforms to a wide
variety of business processes may hold the key to increased user engagement. A widely
cited 2011 Gartner study predicted that by 2016 gamification would be widely applied to 50%
of organisations’ business processes. Although this now seems highly unlikely, the industry
continues to grow, with many gamification platform providers
such as Badgeville, Game Effective, and Bunchball leading the way in “gamifying” business
processes. These gamification practitioners champion the use of techniques such as
rewarding certain behaviours using points and badges, highlighting personal achievements
on leader boards and basically trying to make business processes more fun and rewarding.
Successful gamification practitioners also understand the relationship between psychology
and technology giving thought to what motivates someone to engage with a certain task,
process, or software tool. Understanding motivational theory and indeed why users engage
with certain tasks, processes, or software tools may provide some answers to the question
of “why individual IT adoption rates are much lower than many organisations originally
forecast?” (Wu, M. 2011).
Early indicators entertain the possibility that the recent trend of enterprise gamification,
which applies gamification to the workplace environment, may become an integral part of
any organisation’s future BI initiatives and a way to further operationalize BI. Could providing
enterprise gamification platforms for BI processes hold the key to tackling the issue of lack
of user engagement with BI tools?
1.2 Aims and objectives of the project
This project aims to address the issues surrounding the lack of user engagement with BI
front-end tools and asks the question: “Can the gamification of BI tools affect user
engagement?” The objectives are to explore the gamification of BI tools
and the effect, if any, on user engagement with the tools. Once complete this will achieve
the aims of the project.
This has been chosen as the focus of the project as BI and its modular components are
currently playing a major role in the Business Technology sector with the lack of user
engagement with BI tools being a global organisational issue. Combined with the recent
trend of gamification, and its potential to increase user engagement, these subject areas
form an interesting basis of exploration for any Business Technology student.
This project will also form part of an on-going experimental study named Project GamBIT, a
prototype gamified BI tool. Work carried out has aided GamBIT application development
and helped gather evidence on whether or not GamBIT increased user engagement with a BI tool.
1.3 Research methodology and techniques used
The report will contain details of how primary research will be conducted, providing details
of how user engagement with a BI tool will be measured and analysed. This will provide
both the quantitative and qualitative data needed to address the main points of the report,
whether the gamification of BI tools can affect user engagement. As the objective is to
explore the gamification of BI tools and the effects on user engagement, if any, analysis of
both the quantitative and qualitative data was conducted to aid in the exploration process.
Additionally, the report focuses on key academic papers from both the BI and gamification
sectors, with the emphasis on user engagement within each sector, and draws on findings
from industry experts such as Howson, Werbach, and Zichermann.
basis for the literature review and aims to address the key areas of the report.
As this report covers areas with few comprehensive academic works, it will draw on white
papers, articles and blogs, vendor-specific websites, webinars, studies, and gamification
platform providers where appropriate. As some non-academic literature may be somewhat
biased, criticism will be applied when deemed necessary and appropriate in an attempt to
mitigate as much bias as possible.
1.4 Project scope and limitations
This section will concentrate on the boundaries of the secondary and primary research.
The primary research will explore if the gamification of a front-end BI tool will have any
effect on user engagement with the tool. As this subject is unique in that no current
academic research has been done in this area, the project will include both qualitative and
quantitative research methods. Quantitative and qualitative data were collected on one
specific experimental front-end BI reporting tool. The tool was designed using the Eclipse
BIRT platform which is an open source platform for BI tools. It is also worth noting that the
majority of primary data was collected from students of the School of Engineering and
Computing who may already be considered somewhat “engaged” with front-end BI
reporting tools.
Further qualitative data was collected in an attempt to glean more information over and above
the quantitative data collected. It also attempts to gain further insight into users’ thoughts,
feelings and opinions on the future evolution of both the BI and gamification sectors and
identify any correlations between these sectors.
The secondary research of this report explores the concept of BI, its modular components,
the emergence of BI and the factors influencing the BI industry with a focus on the adoption
rate of BI tools. The concept of BI is examined in its broadest sense by reviewing the
published literature with particular reference to material based on user engagement with BI
tools which is relatively limited in scope and detail. The report will then centre on the recent
trend of gamification, what it is, and its scope. It will explore the possibility of whether, by
applying gamification to a front-end BI tool, this could have an effect on user engagement
with the tool. When researching user engagement, users mainly fall into two groups,
employees and customers. For the purposes of this study the focus is on the employee user
group.
The subject areas that will form the basis of the literature review are:
 Business Intelligence (BI)
 User engagement with BI tools
 Gamification.
Figure 1-0: A Venn diagram organising the key subject areas of this report visually so that
the relationships between them can be clearly seen.
Chapter 2: Literature review:
Keywords - Gamification, Employee Engagement, User Engagement, Business Intelligence,
Business Intelligence Tools, Game Elements, Game Mechanics, Intrinsic Motivation,
Enterprise Gamification.
This chapter contains the literature review carried out by the researcher and examines
relevant literature on the BI sector, focusing on the history of BI and the current state of the
industry. The literature review will then concentrate on user engagement with BI and
especially front-end BI tools. This part focuses specifically on the exploration of adoption
rates of BI tools and highlights any potential issues that could lead to “a lack of user
engagement with BI tools”. The focus will then turn to the new trend of gamification and its
potential correlation with BI and user engagement with front-end BI tools.
2.1 Business Intelligence
The term Business Intelligence, or BI, was coined by Howard Dresner of the Gartner Group,
in the late 1980s. BI is a huge and rapidly growing industry that emerged as a result of
organisations beginning to realise and understand that the data stored within their decision
support systems (DSS) had the potential to be of great value to them. Many of the early
adopters of BI were in transaction-intensive businesses, such as telecommunications and
financial services. As the industry matured the BI technical architecture began to include
Data Warehouses, Data Marts, Executive Information Systems (EIS), Online Analytical
Processing (OLAP) and by the mid-1990s BI, along with all its modular components, became
widely adopted (Miller, 2013). As a result, BI became so closely associated with Data
Warehouse technology it became identified as one and the same and is referred to using
the acronym BI/DW. By the mid-1990s two main leaders in the BI industry emerged, Bill
Inmon and Ralph Kimball. Inmon’s philosophy is based on an enterprise approach to data
warehouse design using a top-down design method (Inmon 2005) while Kimball’s offering
consists of a dimensional design method which is considered a bottom-up design approach
(Kimball 2002). Even now a debate still rages on which of these approaches is more
effective. Research points towards both Inmon’s and Kimball’s approaches having advantages
and disadvantages, with many organisations having successfully implemented either
approach. Organisations considering implementing a BI infrastructure would have
to give careful consideration to both approaches and closely align their chosen approach
with the overall high-level business strategy of the organisation. Until recently BI had
adopted a mainly centralised model around organisations’ IT departments. This meant that
getting information to the right users could take considerable time and the build-up of
requests for reports, analytics and insights from within the organisation could become
“bottlenecked”. The general consensus was that business users viewed BI tools as complex
and left the use of these tools to the “power users” within IT departments. This naturally
evolved into a big disconnect between the IT power users and business users and led to
many problems for what is now referred to as “Traditional BI”. Research suggests that
Traditional BI best practices were considered slow, painful, and expensive, and were therefore
seen as a bit of a failure (Howson, C. 2014).
2.1.1 Business Intelligence defined
As BI has evolved, so too has its definition and, as such, it can be defined in various ways.
(Howson, C. 2014) defines BI as a “set of technologies and processes that allow people at all
levels of the organisation to access and analyse data”. Gartner (2013), the world's leading
information technology research and advisory company, describes BI as an umbrella term
that includes the applications, infrastructure and tools, and best practices that enables
access to and analysis of information to improve and optimize decisions and performance.
Eckerson, W. (2010), Director of BI Leadership Research, appreciated the need for BI tools to
provide production reporting, end-user query and reporting, OLAP, dashboard/screen tools,
data mining tools, and planning and modelling tools. Research suggests that currently there
are no combinations of hardware and software, any processes, protocols, or architectures
that can truly define BI. What (Wu, L, Barash, G, & Bartolini, C. et al 2007) have made clear
however, is that up until recently BI’s objectives were to:
 Offer an organisation a “single version of the truth”.
 Provide a simplified system implementation, deployment and administration.
 Deliver strategic, tactical and operational knowledge and actionable insight.
2.1.2 Current state of Business Intelligence
The recent unstructured data explosion and the trend towards “Big data” (Davenport, T.H.,
Barth, P. & Bean, R. 2012) has seen BI evolve yet again and as such BI has become
synonymous with Big Data and Big Data analytics. As the volume, velocity and variety of
data (the three V’s) has exponentially increased so too has the demand for cost-effective,
innovative forms of information processing for enhanced insight and decision making (Lohr,
S. 2012). Vast volumes of data are now being captured and stored, but research shows it has
been impossible for traditional BI to analyse and manage this data due to the unstructured
nature of it (figure 2-0). Wixom, B. (2014) highlights how BI responded to the challenges
posed by Big Data by adopting advanced technologies such as:
 Hadoop architectures
 Data visualization and discovery tools
 Predictive analytics
 Rare combinations of user skills (e.g., data scientists)
Figure 2-0 - Graphic: Imex Research
Businesses are now demanding faster time to insight (DiSanto, D. 2012) to stay competitive
in today’s fast-paced, evolving global markets, and BI has to at least try to keep up with the
pace of these demands. Traditional BI tools could take days or weeks to produce reports and
analysis; this is no longer enough. This saw a demand for real-time Business Intelligence
(RTBI). Azvine, B., Cui, Z., & Nauck, D. (2005) agreed that it is
“becoming essential nowadays that not only is the analysis done on real-time data, but also
actions in response to analysis results can be performed in real time and instantaneously
change parameters of business processes”.
As RTBI has evolved, so too has the more recent BI trend of self-service BI. Front-end business
users, who are considered the main information consumers, want to see, analyse and act
upon their data more quickly without having to heavily rely on IT departments making their
data available to them. The shift away from a centralised BI model to a more balanced
centralised/de-centralised BI model (Wu, L., Barash, G., and Bartolini,C, 2007) has seen the
emergence of, and increased organisational involvement with, self-service BI (SSBI). Gartner
(2013) defines SSBI “as end users designing and deploying their own reports and analyses
within an approved and supported architecture and tools portfolio.” Imhoff, C. & White, C.
(2011) define SSBI as the facilities within the BI environment that enable BI users to become
more self-reliant and less dependent on the IT organization. These facilities focus on four
main objectives:
1. Easier access to source data for reporting and analysis,
2. Easier and improved support for data analysis features,
3. Faster deployment options such as appliances and cloud computing, and
4. Simpler, customizable, and collaborative end-user interfaces.
Figure 2-1 - Graphic: BI Research and Intelligent Solutions, Inc.
To help organisations achieve these four main objectives it would be worth exploring the
concept of intrinsic motivation later on in this report, which Paharia, R. (2013) argues is
directly linked to SSBI users feeling empowered, and how it fits in with individual adoption
rates with SSBI processes.
Research points towards SSBI lending itself to “multiple versions of the same truth” whereas
traditional BI offered organisations a “single version of the truth”. SSBI has been facilitated
by the increased use of BI front end tools, mainly Visual Data Discovery (VDD) tools
(Howson, C. 2014). Eckerson,W (2010) defines VDD tools as “self-service, in-memory
analysis tools that enable business users to access and analyse data visually at the speed of
thought with minimal or no IT assistance and then share the results of their discoveries with
colleagues, usually in the form of an interactive dashboard”. SSBI has now become
synonymous with VDD tools and has become a top investment and innovation priority for
businesses over the past few years.
The annual Gartner Business Intelligence and Analytics Summit (2014) looked at the current
trends within the BI industry and highlighted that:
 Self-service analytics is “white hot” and growing while demand for traditional
dashboard BI is in remission.
 BI on Big Data (i.e. Hadoop-based and outside of the data warehouse) is a dynamic
new class of problem that requires a new class of solution.
 Today's buyers are increasingly coming from the business side of the house and not
from corporate IT, which has seen the move away from a centralised BI model to
more decentralized BI model.
2.1.3 Summary
 Traditional BI best practices are considered a bit of a failure.
 Business users viewed BI tools as complex and left the use of these tools to the
“power users” within IT departments, leading to a big ‘disconnect’ between business
and IT staff.
 The de-centralisation of BI has seen the emergence of self-service BI. This new trend
has been facilitated by the increased use of BI front end tools, mainly Visual Data
Discovery (VDD) tools.
2.2 User Engagement with BI Tools
This section of the literature review concentrates on user engagement with BI and looks at
the links between user engagement with BI, or lack of it, and the wider global issue of
employee engagement in the workplace.
Technology is important in any BI initiative but so too is the need for BI users to be “engaged”
with the BI environment. Having an engaged workforce has proven to help foster an
analytical culture within organisations. Paharia,R. (2013) suggests that engaged workers
“can drive meaningful increases in productivity, profitability, and product quality, as well as
less absenteeism, turnover, and shrinkage”. This is no mean feat to achieve. It is the
combination of people and technology, turning data into actionable information that can
be used to enhance the organisation’s decision-making (Miller, A.S. 2013), that lies at the
heart of BI. By getting the right information to the right people at the right time BI can
become an integral part of improving decision making, providing valuable business
insights, optimising organisational performance and measuring success. However,
employee adoption of and engagement with BI is critical to any BI initiative’s success or
failure.
2.2.1 What is employee engagement?
Employee engagement does not have one simple or accepted definition. The Chartered
Institute of Personnel and Development take a three dimensional approach to defining
employee engagement:
• Intellectual engagement – thinking hard about the job and how to do it better
• Affective engagement – feeling positively about doing a good job
• Social engagement – actively taking opportunities to discuss work-related improvements
with others at work
2.2.2 What is User Engagement?
Research has shown that user engagement has several definitions. This highly cited
definition by O'Brien, H.L., & Toms, E.G. (2008) states “Engagement is a user’s response to
an interaction that gains, maintains, and encourages their attention, particularly when they
are intrinsically motivated” while Attfield, S, Kazai, G., Lalmas, M., & Piwowarski, B. (2011)
explain that “User engagement is a quality of the user experience that emphasizes the
positive aspects of interaction – in particular the fact of being captivated by the technology”.
Research points towards user engagement being the determining factor in any successful BI
initiative. Organisations that have more users engaging with BI, with the emphasis on BI
tools, will more than likely see a better Return on Investment (ROI) in their BI ventures than
those whose workforce is lacking in engagement (Howson, C. 2014).
2.2.3 BI Adoption Rate
A recent survey suggests that BI adoption as a percentage of employees remains flat at 22%,
but companies who have successfully deployed mobile BI (Dresner, H. 2012) show the
highest adoption at 42% of employees (figure 2-2).
Figure 2-2 - Graphic: BI Scorecard
The lack of BI adoption from the employee perspective can be aligned closely with the wider
global problem of “lack of employee engagement” in the workplace. According to Deloitte’s
2015 Global Human Capital Trends survey (figure 2-3), employee and cultural engagement is
the number one challenge companies face around the world.
Figure 2-3 - Graphic: Deloitte University Press
Gallup conducted a study in 2013 into the state of the global workplace. The findings, drawn
from the 142 countries in which employee engagement was measured, show that 13% of employees
are engaged in their jobs, while 63% are not engaged and 24% are actively disengaged. In
the U.S., Dale Carnegie and MSW conducted a study of over 1,500 employees that measured
employee engagement. It revealed that 29% of the workforce is engaged, 45% are not
engaged, and 26% are actively disengaged (Dale Carnegie Training 2012).
As more organisations employ BI and analytics to improve and optimize decisions and
performance, research points towards the question many organisations have asked: “what is
going to make the difference between a successful BI initiative and one that will flat-line?”
The need to stay one step ahead in an increasingly competitive global marketplace is
proving harder. Business leaders are looking to technology as the main driver of remaining
competitive in today’s markets, but having the right information technology infrastructure in
place is not enough to give organisations the edge.
What the research leans towards is having an engaged, motivated and collaborative
workforce. This is especially true in the BI environment, where adoption rates of BI tools have
flat-lined over the past decade. Some have suggested that those who are exposed to the
front-end tools, and how they engage with them, may make the difference between the success or
failure of any BI initiative. It would seem that organisations looking to take BI adoption
rates, and indeed user engagement with BI tools, to the next level would have to have a
clear strategy that makes user engagement a priority.
Getting the right information to the right person at the right time does not guarantee BI
success; if users are not engaging with BI tools, an organisation’s BI deployment could be
doomed to failure. However, to address this problem an important question should be
asked: “is user engagement with BI at the required level to make BI a success?” If this question
cannot be clearly answered, an organisation’s BI efforts could fail to deliver the results that
were initially predicted.
Senapati, L., (2013) argues that to gain competitive advantage through active user
engagement, organizations must leverage gamification mechanics to influence user
behaviour and drive results.
The summary below gives an indication of why there is a lack of engagement with BI tools.
2.2.4 Summary
 The adoption rate of BI tools has flat-lined at 22% over the past decade.
 The issues surrounding user engagement with BI tools can be directly linked to the
wider global issue of lack of employee engagement in the workplace.
 Organisations who have employees actively engaged with BI tools see a greater return
on investment with their BI initiatives.
 To take user engagement with front-end BI tools to the next level, organisations will
need a clear strategy that makes user engagement a priority.
2.3 Gamification
This section of the report will focus on the subject area of gamification. It will
explore its history, how it is defined and its correlation with BI and in particular exploring
the possibilities that the gamification of BI tools could have an effect on user engagement.
Gamification is a relatively new concept that is constantly evolving and has been gaining
popularity over the past few years with many vendors now offering gamification platforms
and solutions. The development of new frameworks, technologies and design patterns has
made gamification scalable and effective (Werbach,K, Hunter,D, 2012). This has led to it
being applied and utilised throughout organisations to gain business benefits across a wide
range of processes, tasks and tools.
The term “gamification” has been credited to the British-born computer programmer and
inventor Nick Pelling, who coined the phrase in 2002, but it was not until 2010 that articles
and journals based on gamification started to appear. The rise in popularity of gamification
has resulted in it receiving considerable attention over the past few years. Google Trends
shows that search volume for gamification increased significantly from 2010 and spiked in
February 2014 (Figure 2-4); since then it has stayed at a steadier search volume (as of December
2015). Gartner’s top 10 strategic technology trends for 2014 showed gamification as a rising
trend for a number of years (Figure 2-5), but the hype surrounding it has since died down and it
should reach its plateau of productivity in the next 2 to 5 years. Like all trends it has its
champions and its critics, and although gamification has quickly evolved into a multi-million
dollar industry it is still considered to be in its infancy and therefore not fully matured.
Figure 2-4 Google Trend search results for the keywords gamification & business
gamification
Figure 2-5 Gamification in the Gartner 2014 Hype Cycle
There are many schools of thought on the definition of “what” gamification is. Duggan, K. &
Shoup, K. (2013) use this explanation of Gamification to highlight both the human
behavioural and technology elements used in gamification.
“Think of gamification as the intersection of psychology and technology… understanding
what motivates someone to ‘engage’ with certain elements of a website, app, or what have
you… It’s about humanising the technology and applying psychology and behavioural
concepts to increase the likelihood that the technology will be used and used properly”.
Werbach,K. Hunter,D.(2012) define gamification as “the use of game mechanics and design
in a non-game context to engage users or solve problems”. It is important that the research
does not confuse gamification with “playing games” or “serious games” (Nicholson, S. 2012)
which also applies game elements and design to non-game concepts. Gamification is not
people playing or creating full blown games, whether it be for employees or customers, but
using game elements such as dynamics, mechanics, and components to make an existing
experience, like a task, business process, or software tool more fun, engaging, collaborative,
and rewarding. Gamification uses these motivational factors, based on needs and desires, to
get organisational tasks completed. Organisational tasks with game-like engagement and
actions can make people excited about work and boost productivity (Wu, M. 2011).
2.3.1 Game Elements
Game elements can be thought of as the “toolkit” needed to build and implement successful
gamification. Points, Badges, and Leaderboards (PBLs) are common components within the
game elements and are seen as surface-level features of gamification. PBLs are usually a
good place to start when introducing gamification platforms, but research suggests awarding
and rewarding are not enough.
Through the review of literature it would be safe to imply that if gamification initiatives are
to succeed, certain other aspects must be considered. The two key questions that emerged
were:
1. What are the motivational factors that drive engagement with a
product/service/process?
2. Why should gamification be taken seriously especially in a business environment?
To answer these questions we must first look at the three key elements of gamification,
namely dynamics, mechanics and components. Figure 2-6 shows how these elements relate
to each other and why they are considered the building blocks to successful gamification.
Figure 2-6 Graphic: Gamification Course 2014
The research will now look at the relationship between these three elements starting with
dynamics.
Kim, B., (2012) states that “the power of game dynamics stems from the fact that it requires
meeting relatively simple conditions in return for attainable rewards. Then gradually, the
tasks become complicated and more challenging for bigger rewards”. This could conceivably
be considered the meaning behind the game.
Game mechanics refers to a set of rules, design and tools, employed by game designers, to
generate and reward activity amongst users in a game that are intended to produce an
enjoyable gaming experience (Senapati, L., 2013). Game mechanics are the elements of the
game that make it fun, drive the action forward, and generate user engagement. Game
mechanics could reasonably be considered the momentum behind the game.
Werbach, K. (2014) describes game components as specific instantiations of mechanics and
dynamics, which can include PBLs, avatars, collectibles, and unlockables. These can be closely
linked to what is considered the motivation to continue with the game.
The objectives of any gamification platform or solution should be aligned directly with the
business objectives and as such an understanding of the primary stakeholders is essential in
creating an experience that engages users while accomplishing the business objectives
(Deterding, S. et al 2012).
To make the experience engaging research highlighted that three major factors must exist
and be correctly positioned. These are motivation, momentum and meaning. This is
achieved through a combination of carefully crafted game elements and design and a deep
understanding of what motivates the users of the gamified system. Research points to the
Volkswagen (2009) initiative named the “fun theory”. This initiative puts “fun” at the heart
of seemingly mundane tasks, such as using a set of stairs or disposing of litter, and turns them
into an engaging and somewhat rewarding experience. Gamification practitioners have learned
from this and as a result the fun theory is considered a driving factor for successful
gamification and should never be far from the thoughts of any gamification designer
(Werbach,K. Hunter,D. 2012).
Underlying the concept of gamification is motivation. Research suggests that people can be
driven to do something because of internal or external motivation (Nicholson, S. 2012).
Paharia,R. (2013) adds to this by stating “Knowing what truly motivates people, and what
doesn’t, enables us to create stronger engagement and true loyalty”.
2.3.2 Scope of Gamification
The extremely broad and expanding range of ways gamification has been successfully
utilized in recent years has led to its increase in scope. The frameworks, technologies and
design expertise are readily available to introduce gamification platforms or solutions into
organisations business processes. With the trajectory of gamification constantly changing
some industry experts have argued that each and every business process or problem has a
“gamified” solution (Zichermann, G. & Linder, J. 2013). Although this may seem an exaggerated
statement, it is worth future consideration and exploration because as yet no credible
academic research has been done on the subject. If what Zichermann, G. & Linder,
J. (2013) say is the case, then gamification has massive scope, but the legal, moral and ethical
implications of gamification put forward by Kumar,J.M. & Herger,M. (2013) could affect its
future scope. As gamification is still in its infancy and not fully matured, research suggests
gauging its scope may raise more questions than answers.
2.3.3 Successful Gamification
Gamification has proven to be successful in many diverse business fields and because it can
provide quantitative data, organisations can measure engagement with whatever process,
task or tool that has been gamified. With more and more organisations realizing
gamifications potential the type of data collected can lead to valuable insights for
organisations.
Zichermann, G. (2013) describes how in 2006 Nike introduced gamification to tackle the issue
of its business having fallen to its lowest market share in the influential running-shoe
category. By 2009 Nike had reversed the trend, due in no small part to Nike+, a gamification
platform that featured social networking and location-based technology and relied heavily
on games. Individuals who went for a run could now track the number of steps
they took, calories burned and routes they ran by attaching the Nike Fuelband round their
wrists. Zichermann,G (2013) goes on to explain that “once downloaded this data could be
compared to that of others and the experience of going for a run became much richer”. This
created a whole new level of social engagement with running challenges being issued, prizes
such as electronic badges being awarded, and videos of praise from celebrity athletes for
reaching certain goals. By 2012 Nike+ had over five million users. By leveraging a simple
concept, “beating your best time”, Nike created a gamification platform that encouraged
wellbeing and fitness and in turn saw its market share increase by 10% in a single year.
Stanley, R. (2014) looks at Engine Yard as an example of successful gamification. Engine Yard
is described as a platform for deploying, scaling, and monitoring applications. The company
implemented a Zendesk knowledge base, but didn’t see the levels of engagement they had
hoped for. To encourage participation, Engine Yard incorporated PBLs and other
gamification tactics to boost participation and reward users for making contributions to the
community. These actions successfully increased user-generated content for its customer
self-help portal, decreasing the number of support tickets and reducing the demand on
support staff.
These examples show the diverse range of business processes that have benefited from
gamification. The literature review will now focus on the relationships between BI and
gamification and look to uncover any evidence of front-end BI tools that have been
gamified.
2.3.4 Gamification platforms for Business Intelligence tools
There is considerable overlap between the aims of both gamification and BI. RedCritter, who
offer business solution software that enables enterprises to manage, showcase, and reward
employee achievements, utilize game elements as an integral part of their social enterprise
platform by incorporating Profiles, PBLs, Rewards and Skill tracking into their customers’
existing BI processes. RedCritter works with Tableau, a leading self-service BI visual data
discovery tool vendor, and Microsoft Excel to provide BI and analytics. RedCritter integrates
Tableau and Excel with their enterprise gamification platforms with RedCritter Product
Manager, Jenness, D, (2014), claiming that this type of enterprise gamification of BI leads to
“valuable insights about employee performance and engagement” and “enables self-service
data visualization and behavioural insights”. Swoyer, S. (2012) states in his article for the
TDWI that gamification has particular resonance with BI and analytics, where the search for,
and discovery of, insights already has a game-like feel to it. Gamification advocates want to
amplify this effect to intelligently apply game-like concepts and methods to BI and analytics.
The article continues with: "It's a question of game play: of how we can make [interacting
with] BI more engaging. For example, you want to get people into the flow where they're
asking questions continuously, where they're following [an analysis] from one question to
another. Where questions lead to insights, and vice versa.” Madan, S. (2013), lead analyst at
the information management company Ovum, identified that many BI systems resemble
gamified systems in that each “Seeks to engage business users and change organizational
behaviours to improve business performance and outcomes. Gamified functions also
typically generate a lot of data for analysis. The key is providing users with an immersive
data experience that drives them to improve on that information through exploration and
feedback.”
Madan, S. recognises that gamification and BI “are both highly complementary” and
gamification can be seen as a way to further operationalize BI by embedding it seamlessly
into everyday knowledge work, albeit in a competitively friendly and fun way.
Research points towards a correlation between gamification and SSBI, with a blog post on
Decision Hacker (2012) suggesting SSBI could reasonably be defined as an early attempt to
gamify the workplace, a statement also championed by Werbach, K. (2014). Its overall
goal is to engage the workforce and align organisational behaviours through
carefully designed elements. This statement may seem a little premature, as it is unclear
whether using game elements with business processes and applications can become a viable,
long-term concept that meets business objectives (Madan, S. 2013).
2.3.5 Summary
 There is considerable overlap between the aims of gamification and BI.
 Enterprise gamification platforms are now being integrated with BI tools such as
Tableau.
 Gamification has been proven to increase user engagement with business processes,
tasks and tools.
 Gamification must be closely aligned with business objectives to be successful in the
workplace.
 As yet there is no credible academic research suggesting gamification can increase
user engagement with individual BI tools.
2.4 Literature review conclusion
The following section contains the findings from the three subject areas discussed in the
literature review and how they are connected. It also gives justification for further research
into the main points the report aims to address.
2.4.1 User Engagement
User engagement with front-end BI tools has flat-lined at around 22%-26% for almost a
decade now. The review of literature entertains the idea that adding gamified layers to
front-end BI tools could have an effect on user engagement with the given tools. What this
research has attempted to reveal is that to take user engagement with front-end BI tools to
the next level, organisations will need a clear strategy that makes user engagement a
priority. Gamification platforms and solutions may be one way of addressing this priority, but
no credible academic evidence of this is currently available.
2.4.2 Enterprise Gamification relationship with BI
Many industry leaders agree that gamification may very well change the face of BI. With the
emergence of enterprise gamification platforms from providers such as Badgeville,
Bunchball, and Redcritter, more and more business processes have been successfully
gamified. Research shows little evidence of the gamification of individual BI tools. What is
more relevant is the increasing number of enterprise gamification platforms being provided
for BI vendors, with particular focus on VDD tool vendors. But as this is also a very recent
and still emerging field, it provides very little in the way of measurable results to support the
claims that these platforms will be successfully applied to BI and in particular BI front-end
tools.
2.4.3 Motivational Theory linked to SSBI and Gamification
The literature review revealed that SSBI and gamification share a common use of motivational
theory, with the focus on intrinsic motivation, in an attempt to increase loyalty,
engagement, and collaboration. The relationship and similarities between both these
subject areas highlight the importance of what motivates individuals to engage with certain
tasks, processes or (more importantly for the purposes of this report) BI tools.
2.4.4 Summary
The key theme of the literature review clearly shows that there is considerable overlap
between the aims of BI and gamification and that BI systems can indeed resemble
gamification systems. Gamification platforms can generate valuable insights into user
engagement and therefore would be a good starting point for exploring the idea of its
potential effects on user engagement with BI tools. The literature review shows early
indications that by gamifying BI tools, especially front-end tools, user engagement with the
tool may very well increase. With the key theme and findings from the literature review,
further research on the exploration of the gamification of BI tools and the effects on user
engagement can be justified.
Chapter 3: Research methodology:
The purpose of this chapter is to define the type of research that was carried out through an
identification and selection process and to explain the research approach, strategy and
associated methods chosen for the data collection and analysis. The challenges and ethical
issues that were encountered as well as the modifications that were made throughout the
research journey are also presented. A discussion on the ‘reliability and validity’ of the
research is provided and latterly, a conclusion is reached.
“Qualitative and quantitative research methods have grown out of, and still
represent, different paradigms. However, the fact that the approaches are
incommensurate does not mean that multiple methods cannot be combined in a
single study if it is done for complementary purposes”
Sale, J, Lohfeld, M, Brazil, K (2002)
3.1 Selection criteria
Quillan (2011) insists that it is good practice and wise to reiterate what the main objective
is, as it serves to reinforce what is being measured and how it fits with the research
questions. The main research objective is: “to address the question of whether the
gamification of BI tools can affect user engagement.”
Specific study objectives have been formulated, which are:
 To address the issues surrounding user engagement with BI tools.
 To explore the gamification of BI tools and the effect, if any, on user engagement
with the tools.
To carry out the research on the study objectives it has been decided to use a mixed
methodology which will help gather both qualitative and quantitative data. This was
deemed the most appropriate approach to a study which is exploratory in nature as each
approach has the potential to enhance and/or complement the other in knowledge gained
on the same research problem, while each remains true to its own identity (Salomon, 1991).
The mixed methodology approach adopted throughout is designed to carry out relevant and
valuable research. According to Carey (1993), quantitative and qualitative techniques are
merely tools; integrating them allows us to answer questions of substantial importance.
3.2 Project GamBIT
This section will introduce the experimental study named Project GamBIT which forms part
of the primary research for the report objective. To gain a better understanding of the
research methodology it is important to have a clear understanding of what the prototype’s
purpose is, how it was developed and how it will be used.
Project GamBIT is centred on the main themes covered in the literature review: BI, user engagement with BI tools, and gamification. Its objective is to address a worldwide issue, the “lack of user engagement and adoption by users of BI tools (employees) throughout the business world”. The study is unique in exploring whether the concept of “gamifying” a BI tool leads to an increase in user engagement with the tool. As yet this subject has lacked academic research, which has resulted in a limited existing body of knowledge.
Project GamBIT is a software prototype that has been designed and developed in an attempt “To apply the concept of gamification to a business intelligence tool and to evaluate what effect it has on user engagement levels” (Miller, S., 2013). I joined the study at the early stage of testing and evaluation of the prototype. My part in the study was to aid in Project GamBIT development and to gather evidence on whether or not GamBIT increased user engagement with a BI tool. To aid GamBIT application development this report will identify, describe and apply appropriate research methods to gather feedback on early versions of the GamBIT prototype, which includes:
 Use of the GamBIT prototype
 Feedback on the experience
 Ideas on improvements to the prototype
The GamBIT tool was developed using the Eclipse BIRT Java platform (http://www.eclipse.org/birt/). Eclipse BIRT is an open source technology platform used to create data visualisations and reports that can be embedded into rich client and web applications. This tool has advantages over other BI software tools and is particularly suitable for the dismantling, rebuilding and customising that this project requires. It has allowed the developer, PhD student Stephen Miller, to strip back and “gamify” the tool. This was achieved by dismantling the tool's framework and reassembling it with additional layers that incorporated gamification.
Access to the BI software and the Java developers’ platform was given to provide a better understanding of how the GamBIT tool had been developed and what stage the project had reached. To achieve these aims an understanding of the Java code used within the developers’ platform was deemed necessary. This included access to, and an understanding of, the Java files, folders, and source code used. The Java code was then edited to create a new configuration of the code and the GamBIT front end.
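To make the idea of a gamification layer more concrete, the fragment below is a minimal illustrative sketch, in plain Java, of how a listener could react to the completion of a BI task by awarding points and a badge. It is not taken from the GamBIT source code; all class, method and badge names are hypothetical and are included for illustration only.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative sketch only: a simple gamification layer that could sit on top
    // of a BI reporting task. Names are hypothetical, not taken from GamBIT.
    public class GamificationLayer {

        // One record per completed BI task (e.g. "T2 - building a data source").
        public static class TaskResult {
            final String taskId;
            final long secondsTaken;
            TaskResult(String taskId, long secondsTaken) {
                this.taskId = taskId;
                this.secondsTaken = secondsTaken;
            }
        }

        private int points = 0;
        private final List<String> badges = new ArrayList<>();

        // Called by the BI front end whenever the user finishes a task.
        public void onTaskCompleted(TaskResult result) {
            points += 10;                                       // fixed reward per task
            if (result.secondsTaken < 180) {
                badges.add("Quick climber: " + result.taskId);  // bonus badge for fast completion
            }
            System.out.println("Task " + result.taskId + " complete - "
                    + points + " points, " + badges.size() + " badge(s).");
        }

        public static void main(String[] args) {
            GamificationLayer layer = new GamificationLayer();
            layer.onTaskCompleted(new TaskResult("T2", 150));
            layer.onTaskCompleted(new TaskResult("T4", 500));
        }
    }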
Appendix H shows screenshots from the GamBIT tool. The screenshots highlight the gamification elements added to the Eclipse platform and the process undertaken by the volunteers who took part in the gamified experiment. They help the reader follow the experimental process described in the next chapter.
3.3 Ethical debate surrounding gamification and study participation
There are ethical issues surrounding gamification mainly the aspect that user of the
gamified system must be treated fairly and with respect. There must be a balance struck
between the desired actions or outcomes the gamification systemis looking to achieve and
the exploitation of the user. Bogost,I (2015) has described gamification as a form of
“exploitation-ware”. The results of a study into the ethical debate surrounding gamification
within an enterprise concluded that “Gamification could be seen as an unfair mechanism to
increase productivity with no real costs. In addition, it could increase pressure on employees
to achieve more or avoid being in the bottom of the list” (Shahri, A., Hosseini, M., Phalp, K,
Taylor, J. & Ali, R. 2014).
Some have argued that gamification can be used to confuse users and ignore what is “reality”. Gamified systems that have been designed without considering the ethical issues surrounding gamification can fundamentally undermine the business objectives that they set out to achieve. The counter-argument put forward by DeMonte (2014) of Badgeville states that:
"Gamification can never be successful exploitationware, because it only works when the
behaviours that are motivated are behaviours that the user wants to perform in the first
place. It's not some magic solution where you can manipulate users to perform behaviours
against their will.”
As gamification matures, the ethical and legal issues surrounding it will undoubtedly become clearer (Kumar & Herger, 2013). For the purposes of the research carried out in this report, the ethical debate surrounding gamification was carefully considered, as there are no clear best practices relating to the subject area.
The research involved groups of students who volunteered to take part in the experimental stage. There are ethical considerations to take into account, and as such all volunteers were given an information sheet that fully explained their involvement in the study and gave them the freedom to opt out of the study at any point. Great care and consideration was taken to put volunteers at ease and to make them fully aware of what was expected during the experimental stage of the GamBIT prototype and during the interview process. The intention was to protect the confidentiality of, and give anonymity to, volunteers.
3.4 Quantitative Research
This section discusses what quantitative research is, its goals, and how this approach was applied to the aims and objectives of the report. Quantitative research is the systematic empirical investigation of observable phenomena via statistical, mathematical or computational techniques (Given, 2008). Quantitative research methods have been chosen as a means of “collecting ‘facts’ of human behaviour, which when accumulated will provide verification and elaboration on a theory that will allow scientists to state causes and predict human behaviour” (Bogdan & Biklen, 1998, p. 38). The ontological position of the quantitative paradigm is that there is only one truth, an objective reality that exists independent of human perception (Sale, Lohfeld & Brazil, 2002). This type of research fits the aims of the report in as much as it is a research method that can help facilitate the process of measuring user engagement.
The approach applied to the quantitative research methods is as follows:
1. Apply the User Engagement Scale (UES) to the GamBIT prototype to measure user engagement.
2. Analyse the data collected from the UES.
3. Document the results and findings using tables, charts and/or graphs.
4. Interpret and summarise the results.
3.4.1 Measuring User Engagement
To develop an approach to measuring user engagement the question of “how can we
measure user engagement?” must be answered. O'Brien, H.L., & Toms, E.G. (2008) have
conducted several studies focusing on the assessment of engagement and believe the
following factors are considered to be most relevant in measuring user engagement:
 Perceived usability - user’s affective (e.g. frustration) & cognitive (e.g. effort)
responses
 Novelty - user’s level of interest in the task and the curiosity evoked
 Aesthetic appeal - user’s perceptions of the visual appeal of the user interface
 Focused attention - the concentration of mental activity, flow, absorption etc…
 Felt involvement - user’s feelings of being ‘drawn’ in, interested and having ‘fun’
 Endurability - user’s overall evaluation of the IS e.g. likely to return/recommend
Given that the factors listed are considered the most relevant to use with the GamBIT tool, the User Engagement Scale (O'Brien & Toms, 2008, 2013) was chosen to collect the quantitative data. The UES has been modified to fit the needs of the GamBIT tool.
Research suggests there is no “perfect” or “complete” way of measuring user engagement, and there are several different methods that could have been applied to Project GamBIT to produce the quantitative data needed for this study. Through research the UES was considered best, as it covers the most relevant factors in measuring user engagement. Others, such as the System Usability Scale (SUS), were considered, but the developer dismissed this “quick and dirty” scale as it was considered “one-dimensional” and its questionnaire is, by its nature, quite general. The User Engagement Scale (UES) (Appendix G) was applied to the GamBIT software prototype to measure user engagement, and the quantitative data collected was used to test whether or not GamBIT increased user engagement with a BI tool.
3.5 Qualitative Research
Qualitative research methods have been chosen as a way to produce findings not arrived at
by means of quantification i.e. the UES. Qualitative research is based on interpretivism
(Altheide and Johnson, 1994; Kuzel and Like, 1991; Secker et al., 1995) and constructivism
(Guba and Lincoln, 1994). Interpretivism naturally lends itself to qualitative methods. It is, in
its simplest form, an ideal means of exploring individuals’ interpretations of their
experiences when faced with certain situations or conditions (Woods & Trexler, 2001).
The qualitative research will attempt to understand an area about which little is known (in this case the main theme of the report, exploring the gamification of BI tools and its effects on user engagement) and to obtain intricate details about the feelings, thoughts, and emotions that are difficult to extract and/or learn about through quantitative research methods; in this case, the feelings, thoughts and emotions of the volunteers who took part in the GamBIT experiment. Strauss and Corbin's (1998) study of the basics of qualitative research points to three major components of qualitative research. The three points below highlight how these components relate to this project:
1. The data, which will come from semi-structured interviews.
2. The procedures used to interpret and organise the data, i.e. coding.
3. The analytical process: taking an analytical approach to interpreting the results and findings and including these in the report.
Qualitative data analysis consists of identifying, coding, and categorizing patterns or themes found in the data. Data analysis was an ongoing, inductive process in which data was sorted, sifted through, read and reread. With the methods proposed in this report, codes are assigned to certain themes and patterns that emerge. Categories are formed and restructured until the relationships seem appropriately represented, and the story and interpretation can be written (Strauss & Corbin, 1998).
The following section describes the methodological stages undertaken during the qualitative research, which can be loosely attributed to the grounded theory approach (Strauss & Corbin, 1998).
3.6 Methodological stages
This section contains a step by step process on the methodological stages used to conduct
the qualitative research. The methodological stages and how they are connected is shown in
figure 3.6 below.
Figure 3.6 Qualitative research methodological stages
The first part of the process was identifying the substantive area. The area of interest for this report is the exploration of the gamification of BI tools and its effects on user engagement.
The study is about the perspective of one (or more) of the groups of people within the substantive area who comprise the substantive population. In this study, the substantive population comprised university students from the School of Engineering and Computing at UWS, Paisley.
To collect data pertaining to the substantive area, conversing with individuals face-to-face
by means of a semi-structured interview was considered most appropriate.
The process of open coding was carried out as the data was collected. Open coding and data collection are integrated activities; the data collection stage and open coding stage therefore occur simultaneously and continue until the core category is recognised/selected. Eventually the core category and the main themes became apparent; the core category explains the behaviour in the substantive area, i.e. it explains how the main concern is resolved or processed. This project's main concern was the lack of user engagement with BI tools, and the core category was “whether the gamification of BI tools affects user engagement”.
3.6.1 Steps involved in open coding
The following section gives an overview of the steps involved during the process of open
coding.
1. The transcripts were read and first impressions noted. The transcripts were then read again with a microanalysis of each line carried out.
2. The relevant pieces were then labelled: words, sentences, quotes and phrases. This was based on what was deemed relevant to the study and included thoughts, concerns, opinions, experiences and actions.
This type of analytical process aims to address what is considered relevant to exploring the gamification of BI tools and the effects on user engagement. During this process the following possibilities were looked at:
 Repeating data.
 Surprises in the data.
 Relevance to objectives.
3. The next step focused on deciding which codes were most important and on creating categories by bringing codes together. Some codes were combined to create new codes, and at this point some of the codes deemed less relevant were dropped. Codes considered important were then grouped together, allowing for the creation of the categories.
4. The next step focused on labelling relevant categories and identifying how they are
connected. Comparative analysis was used as a means of labelling. The data
contained within the categories made up the content of the main results.
5. The results and analysis were written up.
Memos were written throughout the entire process. This helped in the interpretation of the results and analysis, with some memos written directly after the semi-structured interviews were conducted.
Chapter 4: Experimental and Interview Process:
During the initial development of the GamBIT prototype a number of tests were conducted to help evaluate the prototype. An approach was made to a number of students from the School of Engineering and Computing at the University of the West of Scotland (UWS), Paisley, who had shown an interest in the work being carried out in this report. Volunteers who agreed to test the prototype were directly observed in order to capture their interaction with the prototype and with the Eclipse platform. The main areas under observation were:
 Length of time to complete the tasks
 Navigation of the platform
 Reaction to the gamification elements
After the tests were conducted, feedback was given by the volunteers, which included:
 Incorporate rewards such as badges when a task is complete
 Simplification of the game based rules
 Reworking of the tutorial to highlight every step of the process involved in carrying
out the tasks.
Time taken for the volunteers to complete the tasks varied from 50 to 75 minutes. The
estimated time to be applied to the actual experiment was around 45 to 60 minutes. This
gave the developer time to re-evaluate the prototype and make the necessary changes prior
to the experiments being carried out.
4.2 GamBIT Experiment
To appeal for volunteers, students from the School of Engineering and Computing at the University of the West of Scotland (UWS), Paisley were approached to take part in the GamBIT experimental study. The following section covers how the appeals were made, the justification for selection, and the estimated duration of the experiment.
4.2.1 Appeal for Volunteers
The GamBIT developer approached the lecturer of a 1st year class studying the module ‘Introduction to Computer Programming’ and asked if he could appeal to students to volunteer for the experiment. These students were familiar with the Eclipse software platform, as they were learning Java programming through the use of this platform, and were therefore familiar with the layout of its Graphical User Interface (GUI). It is worth noting that many of the students had little experience using BI tools.
The second group was a 3rd year group of students who were currently studying a BI module and therefore were familiar with BI and had experience of using a BI tool. An approach was made by the researcher to the lecturer of the BI class to ask if an appeal to students from the class was possible. The lecturer agreed, and all students were subsequently emailed prior to the appeal to give notice of it (Appendix B). A five-minute overview of the project and the experimental study was given and then an appeal for
volunteers was made. Students were given the opportunity to ask any questions or state
any concerns. They were then advised of the time and location of the experiment and finally
thanked for their time.
The last group consisted of 4th year (Honours) students who were studying Business
Technology. These students were chosen as they would (hopefully) provide a more critical
viewpoint and assessment of the tool as they were in the last year of their studies and had a
broader experience of BI, BI applications and associated tools.
One-hour time slots were booked in the UWS labs for the GamBIT experiment to take place. The estimated completion time was forty minutes. Given scope for late arrivals and varying completion times, one hour was deemed sufficient for all volunteers to fully carry out the experiment.
Further experiments were undertaken by other volunteers who showed an interest in the project. These experiments were conducted over several days in the labs at UWS.
4.2.2 Experiment
Volunteers were randomly split into Group A (control: BI tool only) and Group B (experimental: ‘GamBIT’ tool). The random split was deemed necessary as it was a fundamental requirement of the test design under scrutiny. Both groups were issued envelopes on arrival containing a USB stick (with the Java code installed), a pen, a guide to launching the software, a guide to completing the exercise and a User Engagement Scale.
Group A were given USB sticks with a JAR file named NonGambit.install.data. Once installed, this file integrated new Java programming code that generated text files (.txt extensions) on the USB stick whenever a user clicked certain buttons during each of the 6 BI tasks.
Group B were given USB sticks that contained a JAR file named Gambit.Install. Once installed, this file integrated new Java programming code that applied the ‘GamBIT’ gamification techniques to all 6 BI tasks in the exercise tutorial. It also created text files for the collection of different types of qualitative and quantitative data and wrote this data to the new text files on the USB stick during the experiment.
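The exact format of the text files written by these JAR files is not reproduced in this report. Purely as an illustration of the general approach, the sketch below shows, in plain Java, how a timestamped record could be appended to a .txt file on the USB stick each time a participant clicks a button during a BI task; the file name and record format are assumptions made for this example only.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;
    import java.time.LocalDateTime;

    // Illustrative sketch only: append a timestamped record to a text file
    // each time a participant clicks a button during a BI task.
    public class ClickLogger {

        private final Path logFile;

        public ClickLogger(String fileName) {
            this.logFile = Paths.get(fileName);
        }

        public void logClick(String taskId, String buttonName) {
            String line = LocalDateTime.now() + "," + taskId + "," + buttonName
                    + System.lineSeparator();
            try {
                // Create the file if it does not exist, otherwise append to it.
                Files.write(logFile, line.getBytes(),
                        StandardOpenOption.CREATE, StandardOpenOption.APPEND);
            } catch (IOException e) {
                System.err.println("Could not write to log: " + e.getMessage());
            }
        }

        public static void main(String[] args) {
            ClickLogger logger = new ClickLogger("gambit_clicks.txt");
            logger.logClick("T2", "Build data source");
            logger.logClick("T2", "Finish");
        }
    }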
The volunteers were briefed on the support available during the experiment and advised
that help was available at any time from the three observers present (researcher, developer
and moderator).
On completion of the experiment every volunteer was thanked for their time and participation. All UES forms, USB sticks and pens were then collected, sealed in their given envelopes, and split into two piles, Group A and Group B. The data was then collected and analysed over the next few weeks (the results and analysis are covered in chapter 5).
4.3 Interview process
The semi-structured interviews were conducted with four participants who took part in the experiment. Each interview followed a similar theme based around three main objectives (Appendix D):
1. To understand what each participant felt about the application of gamification
techniques to a business intelligence tool and to determine what effect it had on their
level of user engagement.
2. To ascertain how each participant felt during the test, their reasons for feeling the way they did, and to glean further information from them over and above the survey data.
3. To gather qualitative evidence from each participant on a wide range of relevant issues concerning the lack of user engagement with BI tools and to capture their opinions, views, suggestions and constructive criticism in their own words.
Initial contact with each participant was made through a response to feedback given after
the experiment in which they expressed an interest in taking part in the interview process.
An email was sent stating the following points that were to be addressed prior to the semi-
structured interviews being conducted.
• Explained the purpose of the interview
• Addressed terms of confidentiality (the participant’s informed consent was voluntarily obtained by means of a disclaimer attached to the UES)
• Explained the format of the interview
• Indicated how long the interview might take
• Asked them if they had any questions
• Asked for consent to record the session
The email contained details of the proposed dates and times, approximate duration and location of each interview. Further correspondence took place until pre-determined times and dates were agreed with each participant. Given the busy schedules of all the participants, each interview was conducted at a different place within the University campus and on a separate day. It was necessary to follow up with the participants as quickly as possible after the experiment so that their thoughts and feelings remained as fresh in their memory as possible.
Chapter 5: Results and Analysis:
This chapter will document the findings gathered from the collection of quantitative and qualitative data. It will focus on the results and analysis from the experiment and then document the results and interpretation of the semi-structured interviews that were carried out with four participants.
5.1 Quantitative data
This section contains the results and analysis of the survey data, the UES data, and the data collected relating to the participants' time spent during the experiment.
5.1.1 Participant Results and Analysis
The experiment attracted a total of 68 participants (n = 68), who were randomly split into one of two groups (A/B). Table 5.1 shows that there was an almost even split. Slightly more participants (two more) used the BI tool only (the non-gamified version), reflecting the random nature of the group split. All statistical analyses have been conducted with this slight imbalance in mind.
Group | Group name | No. of participants (n) | %age
A | Control group using the BI tool only | 35 | 51.5%
B | Experimental group using the GamBIT tool | 33 | 48.5%
Table 5.1 Group A/B split
Table 5.2 shows the spread among the three groups of participants by UWS class/course. The largest group was the 1st year students, of which 46 took part. The 3rd and 4th year students consisted of 17 and 5 participants respectively. When the initial approach was made to the 3rd year students, the class consisted of around 40 students; however, fewer than half of those invited took part (n = 17), accounting for 25% of the cumulative total. The 4th year students consisted of 5 participants (n = 5, 7%).
Participants | Frequency | Percent | Valid Percent | Cumulative Percent
1st year - Intro to Programming | 46 | 67.6 | 67.6 | 67.6
3rd year - BI class | 17 | 25.0 | 25.0 | 92.6
4th year Hons - Comp. Science | 5 | 7.4 | 7.4 | 100.0
Total | 68 | 100.0 | 100.0 |
Table 5.2 University Course distribution
Figure 5.1 shows the spread among the groups of participants by UWS class/course in a bar
chart.
Figure 5.1 University Course distribution
Table 5.3 shows how the three groups of volunteers were divided and allocated to the two groups (Group A/B) during the experiment by their different university courses. This helps to demonstrate the randomisation of the participants. The table shows a very close split between the three groups and their respective student courses. From the optimum 50/50 split, the largest group (1st year students) shows a +/- 2% (52%/48%) difference, with the other groups following a similar pattern.
Figure 5.2 shows the same information in a Bar Chart.
Table 5.3 Group type A/B * University Course - Cross-tabulation
Figure 5.2 Group type A/B * University Course - Cross-tabulation Bar Chart
Summary
 The grouping of participants was completely random.
 There was an almost even group A/B split.
 1st year students made up the majority of participants (68%).
 3rd year students made up 25% of participants with a total of 17 taking part. The
number was lower than expected given the class size of 40+ students.
5.1.2 Survey Information Results and Analysis
The following section gives an overview of the survey information gathered from each
participant prior to completing the UES (Appendix G). The data is based on the responses to
four questions (Q).
1. What is your gender?
2. What is your age?
3. On average, how often have you used a business intelligence (BI) tool at work or
study before?
4. On average, how often have you played any kind of video/app/mobile game before?
Q1.
Table 5.4 shows the gender split, with only 8 females participating (12%) in the experimental study compared to a larger male participation of 60 (88%). To show the randomness of how males and females were allocated to their respective groups (control and experimental), Table 5.5 shows the cross-tabulated distribution.
Table 5.4 Gender Split
Table 5.5 Gender * Group type A/B Cross-tabulation
The random nature of the allocation to test groups means that no prior consideration was made to ensure a more even distribution of males and females within the groups. Figure 5.3 highlights the lack of female participants; this was an unfortunate circumstance that was outwith the scope and control of the researcher.
Figure 5.3 Gender distribution among the represented UWS courses
Q2.
The age distribution of participants is shown in table 5.6 and clearly shows that the 18-24 age range made up the large majority (68%). The remaining participants were more evenly distributed between the 25-29 and 30-39 age ranges.
Table 5.6 Age range distribution
Figure 5.4 shows the same data in a pie chart.
Figure 5.4 Age range distribution Pie Chart
Q3.
Figure 5.5 shows the cross-tabulation results that emerged when participants were asked how frequently they had used a business intelligence (BI) tool.
Figure 5.5 BI usage by University course
More detailed analysis can be seen in Table 5.7. The fact that 0% of 4th year students had never used BI tools was an expected result, given that they would be considered the most experienced in using BI tools. What was surprising is that 5% of the 3rd year students had never used a BI tool, given the course content for 3rd year BI students. A high percentage of 1st year students (25%) had never used a BI tool before. A more even distribution can be seen between the 1st year students' BI tool usage of two or three times a week and once or twice before.
Table 5.7 BI Usage by University Course
Q4.
When asked the final survey question about how frequently they had played video, application or mobile games before, table 5.7 shows the participants' answers.
Table 5.7 Games usage
Figure 5.6 shows the same information in a pie chart.
Figure 5.6 Frequency of games usage
Summary of survey data:
 68 people participated in the experiments
 They were split into the 2 groups almost equally: group A (51.5%), group B (48.5%)
 3 UWS classes were selected from the 1st year (68%), 3rd year (25%) and 4th year -
Honours (7%) within the School of Engineering and Computing
 The gender split was: male (88%) and female (12%)
 The majority age group was in the ‘18-24 years’ category (68%)
 44% of participants had ‘never’ used a BI tool before and a further 20% only ‘once
or twice’ before
 As expected, most of the participants play video/mobile games on a ‘daily’ or ‘two
or three per week’ basis (c.80%)
5.1.3 UES statistical Results and Analysis
All of the UES data from the 68 participants was inputted into a statistical package software
tool known as SPSS (version 23) by the GamBIT Developer. This allowed for a wide range of
statistical testing to be conducted on the survey data (Appendix A), an overview is provided
in the tables, charts and statements below.
The median (middle) score was found for each variable for all 68 cases. The mean of the
medians were calculated for all of the 6 sub-scales (factors to be measured in the UES) as
shown in tables 5.8 and 5.9.
Table 5.8 User Engagement (UE) factor scores for Group A: Control – BI tool only
Table 5.9 User Engagement (UE) factor scores for Group B: Experimental – GamBIT
The mean of all six factor mean scores for both groups can be seen in table 5.10.
Table 5.10 Mean of all 6 factor mean scores for Groups A/B
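The statistical work itself was carried out in SPSS. Purely to make the scoring procedure concrete, the sketch below shows, in plain Java, how the median of each UES item could be taken across all cases and the item medians then averaged within a sub-scale to give a factor score; the item grouping and response values are invented for illustration and are not the actual study data.

    import java.util.Arrays;

    // Illustrative sketch only: median of each UES item across all cases,
    // then the mean of those medians for one sub-scale (factor).
    public class UesFactorScore {

        // Median of one item's responses (e.g. Likert scores) across all cases.
        static double median(int[] responses) {
            int[] sorted = responses.clone();
            Arrays.sort(sorted);
            int n = sorted.length;
            return (n % 2 == 1) ? sorted[n / 2]
                                : (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0;
        }

        // Factor score = mean of the item medians belonging to that factor.
        static double factorScore(int[][] itemsForFactor) {
            double sum = 0;
            for (int[] item : itemsForFactor) {
                sum += median(item);
            }
            return sum / itemsForFactor.length;
        }

        public static void main(String[] args) {
            // Invented responses for a three-item factor, one row per item,
            // one column per case.
            int[][] novelty = {
                {3, 4, 2, 5, 4},   // NO1 responses across five cases
                {2, 3, 3, 4, 3},   // NO2
                {4, 4, 5, 3, 4}    // NO3
            };
            System.out.printf("Factor score: %.2f%n", factorScore(novelty));
        }
    }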
5.1.4 User engagement highest ranking factors
Based on the results and analysis of the mean scores, experimental group B (GamBIT) had
Perceived Usability (PU) and Novelty (NO) ranked 1 and 2 respectively.
Table 5.11 Ranking of lowest mean score by factor - Group B (GamBIT)
The score in brackets at the end of each statement is the %age of respondents who either strongly agreed (1) or agreed (2) with the statement. The PU and NO statements are:
Perceived Usability (PU):
 PU1 - I felt discouraged using the tool (70%) – 7th
 PU2 - I felt annoyed using the tool (72%) – 6th
 PU3 - Using the tool was mentally taxing (73%) – 5th
 PU4 - I found the tool confusing to use (76%) – 1st
 PU5 - I felt frustrated using the tool (76%) – 1st
 PU6 - I could not do some of the things I needed to on the tool (74%) – 4th
 PU7 - The tool experience was demanding (76%) – 1st
Novelty (NO):
 NO1 - The content of the tool incited my curiosity (52%) – 2nd
 NO2 - I would have continued to use the tool out of curiosity (45%) – 3rd
 NO3 - I felt interested in my BI tasks on the tool (61%) – 1st
Group A (control group) had Perceived Usability (PU) and Endurability (EN) ranked 1 and 2
respectively.
Table 5.12 Ranking of lowest mean score by UES Factor - Group A (control)
The PU and EN statements are:
Perceived Usability (PU):
 PU1 - I felt discouraged using the tool (76%) – 3rd
 PU2 - I felt annoyed using the tool (79%) – 2nd
 PU3 - Using the tool was mentally taxing (67%) – 6th
 PU4 - I found the tool confusing to use (76%) – 3rd
 PU5 - I felt frustrated using the tool (82%) – 1st
 PU6 - I could not do some of the things I needed to on the tool (64%) – 7th
 PU7 - The tool experience was demanding (76%) – 3rd
Endurability (EN):
 EN1 - The tool experience did not work out the way I had thought (64%) – 1st
 EN2 - I would recommend the tool to appropriate others (61%) – 2nd
 EN3 - Using the tool was worthwhile (61%) – 2nd
 EN4 - My tool experience was rewarding (52%) – 4th
The results suggest that the participants in both groups did not find either of the tools a
hindrance, demanding, or confusing in any significant way. They seemed to be able to
accomplish what they were asked to without any great difficulty.
The ‘Endurability’ (EN) aspect is associated with the users’ overall evaluation of the
experience, its worthiness and recommendation value for others to use the tool. For the
control group this factor was ranked 2nd highest. Interestingly, EN4 ranked lowest and
suggests their experience could have been more rewarding with EN1 indicating the overall
experience could have been better.
The Novelty factor, which is associated with the curiosity the tool evoked, interest levels, and surprise elements, ranked higher for GamBIT users. This suggests that GamBIT users were more interested in the BI tasks they were asked to complete. Interestingly, over half the experimental group (52%) stated that the content of the tool incited their curiosity (NO1). Given that gamification aims to make tasks more fun, engaging and intrinsically motivating, the results reflect the developer's attempt to add these elements to the gamified BI tool.
Results for the statement ‘I felt interested in my BI tasks’ scored 61% with the GamBIT group, compared to the control group who rated this statement at only 45%. The difference of 16 percentage points, a relative increase of roughly 35% (16/45), can be seen in table 5.13. This can be interpreted as a notable difference in the level of interest shown by the two groups.
Table 5.13 NO3 ranking score Group A/B
5.1.5 User engagement lowest ranking factors
The Focused Attention (FA) factor scored lowest for both groups. The FA factor is associated with the concentration of mental activity, including elements of flow, absorption and time dissociation in the tasks. The results highlight that participants appeared to be more concerned with their tasks than with the actual BI tools. This suggests that the gamification elements did not fully absorb the participants and that they were more focused on task completion.
Focused Attention (FA) statements | BI tool only (A) %age | Rank | GamBIT (B) %age | Rank
FA1 - When using the tool, I lost track of the world around me | 21% | 7th | 21% | 7th
FA2 - I blocked out things around me when using the tool | 30% | 5th | 30% | 5th
FA3 - My time on the tool just slipped away | 45% | 3rd | 45% | 3rd
FA4 - I was absorbed in my BI tasks | 54% | 2nd | 58% | 1st
FA5 - I was so involved in my BI tasks that I lost track of time | 58% | 1st | 45% | 3rd
FA6 - During this experience I let myself go | 33% | 4th | 33% | 4th
FA7 - I lost myself in the tool | 22% | 6th | 22% | 6th
Table 5.14 Comparison of FA scores by Group A (control) / B (experimental)
5.1.6 Summary of UES data
 The highest rated variables (statements on the UES survey) for the two groups were different, i.e.
Group A - did not find the BI tool frustrating (82% strongly agreed or agreed)
Group B - did not find the GamBIT tool confusing, frustrating or demanding (76%)
 Novelty ranked high with the GamBIT group and showed a marked difference in results from the control group.
 Perceived usability was ranked highest by both groups.
 The lowest ranked factor for both groups was focused attention. This suggests participants were not fully absorbed in the gamification elements, with task completion being of higher importance.
5.1.7 Time taken to complete tasks results and analysis
This section shows the results from the data collected relating to the time taken to complete
each task and includes the optional task 6 results. This section also questions whether the
gamification of a BI tool places additional time constraints on participants.
The results from Group A (control: BI tool only) revealed that Task 2 (T2, building a data source) was completed in the quickest time, at 2 minutes and 11 seconds. Two tasks took on average over 8 minutes, i.e. T4 (formatting the data) and T6 (creating a report title), with T4 taking the longest time to complete at 8 minutes and 36 seconds.
The results from Group B (GamBIT) revealed that Task 2 (T2, building a data source) was also the quickest, at 2 minutes and 30 seconds; this was the same task as in Group A, only a little slower (by 19 seconds). Task 6 (T6), creating a report title, was an optional task. Given that it was introduced at the end of the experiment, participants may by this point have been somewhat disengaged, so it is a good gauge of whether the participants were still
engaged in the tasks. T6 took the longest time to complete at 8 minutes and 06 seconds,
some 30 seconds quicker than the control group (A) which is a good result for the research.
The overall mean times to complete all six tasks are detailed below:
• Group A (BI only) - 32 minutes 31 seconds
• Group B (GamBIT) - 30 minutes 25 seconds
• Time difference - 2 minutes 6 seconds (in favour of GamBIT)
To answer the question of whether the gamification of a BI tool places additional time constraints on participants, the evidence of time differences shows that there are no significant time disadvantages or distractions. The results show that the opposite appears to be true, as the times to complete tasks were quicker, which is a positive result with regard to the research.
5.1.8 Summary of time taken to complete tasks
 The participants who used the GamBIT tool took less time to complete the six tasks.
 GamBIT group had more participants complete the additional task (n=16)
 Task 4 took longest to complete.
 Using the GamBIT BI tool led to tasks being completed more quickly compared to the non-gamified tool.
5.2 Qualitative data
This section will give an overview of each of the interviews conducted and report on the key findings under each of the main categories.
The interviews were based on the experiences of each participant when carrying out the GamBIT experiment. They looked to glean more information over and above the quantitative data collected through the application of the UES. To explore key issues further, questions were based on:
 Their experiences with BI tools in general.
 Their thoughts on gamification, in particular the gamification of BI and BI tools.
 Their experiences of the use of BI in the workplace with a focus on any issues,
obstacles and concerns.
 Their thoughts on user engagement with BI tools.
Full transcripts of all four interviews can be seen in Appendix C.
The following section will report key findings under each of these main categories.
 Game Elements
 GamBIT experiment
 Concept
 Enterprise Gamification
 Gamification of BI tools
 User engagement
A snippet of the coding process is provided to give a clearer understanding of how the
results of the coding were analysed and then interpreted.
Table 5.2.1 Sample of coding classifications. Taken from a Microsoft Excel file.
5.2.1 Participant A
Game elements
The participant felt the gamification elements added to the BI tool were an unwanted distraction, taking them away from completing the tasks, stating that “I never really paid attention” and “I never looked at the leaderboard, never read it to see what it said”. When discussing the most important features of BI tools, their response was that the “functionality of the tools is most important”. All of this suggests the gamification elements were not as important to them as actually completing the given tasks.
GamBIT
The participant stated that coming into the experiment “I wasn’t looking to enjoy it.” The Eclipse platform lacked the visual elements (aesthetics) needed to keep them engaged with the task, and they experienced issues with the platform layout: “I think it was not very user friendly everything was clumped together. I lost one of the elements when carrying out the task of sorting and it proved hard to find. I could not move the element back to where they should be”. This proved to be a major issue with the Eclipse platform. It suggests that the participant is a visual person who likes software platforms that have a familiar GUI and are easy to navigate. It would be safe to assume that the Eclipse platform was not as user friendly or aesthetically appealing as other BI platforms they had used. This contributed to a lack of engagement with the gamified BI tool.
Concept
The concept of the mountain climber “bagging a ben” was not something they were
particularly interested in. The following quote highlights this by stating “Maybe if it was
something different (concept), as bens and mountains I am not interested in. Maybe if it
was focuses along with something that interested me a bit more maybe I would have
focused but I just clicked through it”. This suggests that if the concept had been more tailored to them, the overall experience could have been more engaging.
Enterprise Gamification
The participant stated a personal view on how enterprise gamification could benefit an organisation: “it would really depend on staff’s attitude to the software or tools”. Asked if
this form of gamification could increase user engagement in a BI environment they stated “I
don’t think it is going to create engagement personally”.
Gamification of BI tools
When asked about their wider views on gamifying BI tools, the participant stated “I think BI
tools are used by professional who know how to use them and realise how critical the
information is. It would be good for learning (gamifying a BI tool)… like teaching people to
use the BI tool. So for learning purposes yes, but on the whole may slow people down”.
The participant explored the idea of gamification as a possible aid in learning to use BI tools,
quoting “As a lot of these new tools can be frustrating and maybe having a pop-up or
reward saying you have achieved may help out there. I see its place as a teaching aid for a
new tool. But using the tool for a long period of time may get more people annoyed”.
User engagement
On the subject of user engagement with gamified BI tools the response was “personally it is
not something I would engage with I don’t think, it’s not something that if added to a (BI)
tool, especially a tool I was not keen on using, would make me use it “. When describing
their feelings during the experiments “I don’t think I was overly engaged or lost track of
myself in it” and commenting on the concept “using mountains just didn’t engage me”. It is
clear that the participant actively “disengaged” with the gamified BI tool; therefore, the tool had no positive effect on user engagement.
The following points stood out when writing up the memos
 To engage users, visualisation through the use of colours was important.
 The gamification concept has to resonate with each individual user and provide a
variety of game-based activities that appeals to them.
 Gamified elements can be intrusive if not designed correctly
 The participant has experience using VDD tools and SQL reporting and querying
tools.
 The participant did not fully engage with the Gambit tool. It was seen as an
unwanted distraction.
 Functionality of the tools is more important than gamification layers.
The following links were established
 Gamification platforms and their ability to aid learning/ education.
 Links between good quality data and effective decision making.
5.2.2 Participant B
Game elements
When asked their opinion on the game elements, the participant responded “I did like them” and, commenting on the aesthetics, stated “I quite liked them”. The overall responses to the game elements were surprisingly enthusiastic, with the elements having a positive effect on engagement with the gamified BI tool.
The participant focused on the leaderboard element stating “If I was not going further up
that leaderboard I probably wouldn’t have kept going as long as I did” and elaborated that
“It did want to make you keep going and get further up that leaderboard. I wanted to get
further up that leaderboard”. They reflected on their motivation to continue, saying “I think
you know most people are competitive even if they don’t like to think they are. I don’t think
I am competitive but I think I probably am”. This hints that the game elements allowed the participant to “find something out about themselves” and highlights their extrinsic motivation, commenting “I was motivated because of the reward, which sounds greedy!”
The participant liked the novelty element with responses like “I wasn’t expecting that” and
“I was a bit surprise when the pop-ups appeared”. They felt involved in the tasks by stating
“when you feel you are getting near to that level you feel like doing that wee bit more then
you get badges and things and it makes you feel good.”
GamBIT Experiment
The participant expressed frustration at not being able to complete the experimental tasks and declared “I was quite annoyed when you said stop… I wanted to keep going”. When asked if the tasks were fun and engaging they pointed out “yes I was quite into it”. Their overall experience was a positive one and they stated that “it was good I liked it...I liked the fact that it broke it down into quite small segments. It broke it down into nice little bits”.
Concept
The concept of “bagging a ben” appealed to the participant “I like walking and stuff like and
it’s quite my thing” but did add “I know probably not a lot of women wouldn’t engage in
that as it is quite a macho thing the whole mountain climbing, you might need to have a
girly girl version (aimed at females) with shoes and handbags”.
This highlights that the concept must have “meaning” for the user and be tailored to their
needs and experiences.
Enterprise Gamification
When asked about the idea of implementing a gamification platform in the workplace, the
participant responded “In my workplace then no, not really. Sometimes you can’t make
things fun and engaging… things are what they are. But the more fun and engaging the
better? Yes.” This was a mixed response which can be construed as a mix of lack of
employee engagement and a desire to have a workplace that is more fun, engaging and
participative.
When asked who they thought would engage with an enterprise gamification platform, the participant stated “I think the younger people would probably be into it…Older employees
wouldn’t like it as it would think it would be reporting on how they are doing their job and
stuff like that. A lot of older people are... more worried about how you are judged in the
workplace”. They raised concerns about this type of implementation by saying
“I can’t see how you can use gamification in a work context. I think it was really good for
what we are doing (experiment) when learning through different steps the workplace would
not be happy about stuff like this… peoples jobs being so insecure it would worry people”.
Another concern indicated that enterprise gamification “could be negative. It makes you
start thinking I am not doing very well, I am not doing this”.
Gamification of BI tools
The overall feeling on the gamification of BI tools was positive: “I liked that (the gamified
tool) broke it down… I think as a studying tool it would be outstanding it would be really,
really good. I could really see it working as that”. But concerns were raised around the use
of BI tools as previous experiences with BI tools highlighted that “no training was provided”
and stated that BI tools should be “made simpler”.
User engagement
When asked “did you feel engaged with the BI tool?” the participant's response was “Yes, totally”. The feedback received during the experiment was a major positive for the participant, who commented that “everybody wants to be praised don’t they … makes you feel good when someone says something nice to you. If you are getting told something good then it is positive affirmation so you want to keep going”. This confirms their belief that feedback is a main driver in increasing user engagement.
The following points stood out when writing up the memos
 To engage users, competition, challenge and reward are important.
 The participant liked the concept of the gamified tool but did say that it may not be
to everyone’s liking especially females.
 Gamified elements engaged them and made them want to continue especially to get
the additional task done.
 They were both intrinsically and extrinsically motivated at various points during the
experiment.
 The participant has experience using various BI tools in their job role (2nd Line
Support)
 The participant stated on several occasions that gamification platforms could have a
place in a learning or educational environment.
 A gamification platform is not something they could see being rolled out in their
workplace mainly due to financial constraints.
The following links were established
 Gamification platforms and their ability to aid learning/ education especially when
studying.
 Social collaboration through a gamification platform.
5.2.3 Participant C
Game Elements
Certain game elements resonated with the participant, mainly the visual aesthetics; they commented that “You get to see the images. It was like a games console where you get to see achievements for everything you have completed”. Additional game elements, including “adding achievements and unlockables”, would have made the BI tool more engaging, along with possibly introducing “individual avatars”. The participant liked the game elements that gave them their own unique score.
Some negative points emerged, with the participant noting that the pop-ups were “straight up in front of you” and that “It was a bit imposing at times”. This felt a little intrusive to the participant, who stated that it is “an element that could have given more thought to”. However, the participant acknowledged that the game elements did “pull you in to the software.”
GamBIT
The participant was impressed with the GamBIT tool and declared that it “was encouraging I liked the feedback and progression”. They felt that it “made it seem quicker to complete (the tasks) and made you feel like you wanted to continue”. When asked about the additional tasks, the participant responded “I found them fairly easy to complete”.
Concept
When asked about the “bagging a ben” concept, the participant responded by saying “I
quite liked the idea where the more you do the higher up the mountain you go”. It would be
safe to infer that they were somewhat engaged with the concept.
Enterprise Gamification
The participant expressed little opinion on this category but did state that enterprise gamification “could show up people who need more training and help them all get to similar levels”. When asked if enterprise gamification was a good idea, the response was “I am unsure about it in a business context”.
Gamification of BI tools
The participant has experience using a number of different BI tools, i.e. Tableau, Power BI, Qlik Sense and SQL reporting tools, with varying experiences of each. What did emerge was that with “some of the tools and software there should be more information
online.” When describing their experience with Power BI the lack of online support “stopped
me from progressing when using the tool…there was not much information available for
that at all”. Commenting on the idea of implementing gamification to these tools the
participant said “I think it would be a good training aid for using BI tools”.
User engagement
When asked if it was likely that a gamified BI would increase user engagement, their
response was “I am not sure”.
The following points stood out when writing up the memos
 The gamified tool was more engaging than the non-gamified tool. (This participant
had participated in both experiments)
 The game elements pulled the participant in to the software and made them want to continue exploring it.
 The participant felt intrinsically motivated to continue using the gamified BI tool.
 The gamified experiment seemed easier and quicker to complete.
 Gamification would be best suited to the IT industry.
 The participant liked the concept of “bagging a ben”
The following links were established
 Would like to see the concept of gamification integrated into platforms like Yammer as a tool to aid collaboration with colleagues in the workplace.
5.2.4 Participant D
Game Elements
The participant highlighted novelty as a key game element that they engaged with, stating “I was really amused by that!” when the first pop-up appeared. The element of surprise was another key element, with the participant indicating that “you are excited because you don’t know what is going to happen”.
Other game elements such as achievement and challenge proved engaging with quotes like
“Yes is was a challenge. It was something to aim for” and “You are building here, you are
climbing, and you are getting somewhere.” This highlights how the participant engaged with
these particular elements.
GamBIT Experiment
The participant reflected on their participation and admitted that “Once I was into it I thought this is quite good! I wish I had more time.” When asked whether they would have continued had more time been available, they responded “Yes, regardless of how long it took me. I felt the application was making it easier for me.”
The following quotes are examples of the enthusiastic responses to the experiment given by the participant: “I liked the thinking behind it”, “It was well laid out” and “It was something I would look to learn to do.” These quotes give a clear indication that the participant was engaged in using the gamified BI tool and considered the gamification of BI tools, in a more general context, to be a good idea.
The participant showed an understanding of the work the GamBIT developer put into the BI
tool by saying “I appreciate the work that went into it as I have used Java and I know how
difficult that can be.”
When asked what they considered most fun and engaging, the participant replied “It was the whole package that did it for me. As a front end user I have no criticism of it”.
Concept
The participant thought the overall GamBIT concept was “quite appropriate for both genders to do”, highlighting that “it was appropriate for that. It wasn’t too childish, if you know what I mean, it wasn’t effeminate and it wasn’t too “blokey” (aimed at males)”. They liked the
“bagging a ben” concept and shared their feeling on the experiment by saying “You are
building here, you are climbing, and you are getting somewhere. That’s why I think it was
quite poignant that it was a climber as opposed to someone doing water skiing for
example”. They described their emotional attachment to the mountaineer by commenting “I got involved with it. I became emotionally attached to the man going up the hill. I was saying ‘that is me trying to get up that hill’”.
These quotes highlight how engaged the participant felt during the GamBIT experiment and
how it left a positive impression on the idea of gamifying BI tools.
Enterprise Gamification
The participant believed that making the gaming appropriate to the application and to the population you are working with is key to successful enterprise gamification. They went on to acknowledge that “Businesses would need to do two things. The
consultancy is the most important part they need to find out what the service is about,
product knowledge. Then you would need to find out who is going to be operating this?
What kind of education do they have in IT? How much training do they need? How involved
in change are they? Some staff just wouldn’t what the change. So consultancy to build an
application would be imperative.”
Gamification of BI tools
When asked if gamifying BI tools to increase user engagement was a good idea in principle, the response was “personally I think it would be... I was very impressed with it. I would love
to learn how to use it (gamification) and I would love to learn how to incorporate it into my
databases”. The participant went further to suggest that gamified tools “would be great in
education” by stating "this is a tool that can teach me. You are learning and not really
realising how much you are taking in. It is the ease of the learning. That’s the big thing”.
Given that the participant has used numerous BI tools in the past, the feedback given about gamified BI tools was surprisingly positive.
User engagement
When asked about user engagement with the tool the participant responded positively by
saying “I absolutely loved it (GamBIT tool). Was I engaged? Absolutely! It was encouraging
me I was very impressed with it”.
The following points stood out when writing up the memos
 The participant works for the NHS and has experience using BI tools, database
development, and procurement of IT systems within the NHS.
 The participant found the reward elements very encouraging, indicating that they were extrinsically motivated to continue.
 The participant was very impressed with the gamified tool and would like to
incorporate something similar into their workplace.
 The participant has previous experience using the Eclipse platform and had
previously found it difficult to use and engage with. This was not the case with the
gamified platform.
 The participant seemed to be more actively engaged with the gamified BI tool compared to the others interviewed.
The following links were established
 Gamification as a learning platform for using BI tools in the NHS.
 The participant has used Eclipse as part of a Java programming course and as part of the experiment, with the research pointing to an increase in their engagement with the gamified tool as opposed to the non-gamified BI tools they have used on this platform.
5.3 Summary of Qualitative Data Results
The main finding from the qualitative research highlights that each participant had various
levels of experience using BI tools. This gave a wider range of opinions on the subject areas.
 Three of the participants found the game elements engaging, novel and fun with one
participant finding them an unwanted distraction.
 Two of the participants considered themselves engaged and immersed in the
experimental tasks. One participant was somewhat engaged and one participant
completely disengaged.
 Each participant stated that the concept had to be relevant to them as individuals.
 Three out of four participants were considered intrinsically motivated to complete
the tasks.
 Two participants showed signs of being extrinsically motivated due to the reward
elements.
 Three participants did not find either of the tools demanding, confusing, annoying or
discouraging in any significant way.
 A surprise in the data was that each participant commented on how gamified BI
tools would be a useful training aid in learning to use BI platforms, with the concept
of gamification having a place in a learning or training environment.
 Gamification could be used as an “instructional designer”, complementing pre-
existing instructions and making them better. This was a key theme that resonated
from each participant’s personal experience with BI tools. A lack of online support,
technical assistance and instruction on how to optimise BI tools has led participants
to believe that these factors contribute to the lack of user engagement with BI tools.
 Participants generally agreed that the gamified BI tool broke the tasks down into
steps that could be easily followed and helped them in measuring progress
throughout the tasks.
Responses to the main research questions are detailed in Table 5.2.2.
Research question 1: Given your experience using a gamified BI tool, do you think the gamification of BI tools is a good idea?
Participant A: No. Participant B: Yes. Participant C: Yes. Participant D: Yes.
Research question 2: Would adding gamification layers to a BI tool make you want to engage more with the tool?
Participant A: No. Participant B: Yes. Participant C: Yes. Participant D: Yes.
Research question 3: Overall, would the gamification of BI tools increase user engagement?
Participant A: Maybe. Participant B: Yes. Participant C: Not sure. Participant D: Yes.
Table 5.2.2 Main research questions
Chapter 6: Conclusion:
The purpose of this chapter is to present the conclusions reached from the analysis of the
primary data, in relation to the participants and tools used in this project. It will also
identify any limitations associated with the data analysed, from both the experiment and
the subsequent interviews that took place, and with the conclusions reached. Finally, a
summary of the main points made in this chapter will be presented.
6.1 Review of research objectives
The principal objective of this project was to explore the issue of lack of user engagement
with BI tools. The point it aimed to address was whether making BI tools more fun and
engaging by applying gamification to them, or in other words ”gamifying” them, can lead to
increased user engagement.
The UES was applied to a prototype gamified BI tool (GamBIT) and a non-gamified BI tool to
produce quantitative data addressing the aims of the project: whether or not ‘gamifying’ a
BI tool has the potential to increase an individual’s motivation to use BI tools more often.
Qualitative data was collected by conducting semi-structured interviews with selected
participants from the experiment. This allowed the researcher to glean more information
over and above the quantitative data provided by the UES. The qualitative data attempts to
capture the feelings, thoughts and emotions of participants on a number of issues related
to this project.
6.2 Discussion of primary and secondary conclusions
This section will discuss the conclusions made from the primary research within the context
of the secondary research i.e. the literature review. Its aim is to highlight how the results of
the project address the main research question.
The literature review highlighted that engagement with BI tools has flatlined at around 24%
for over a decade. The BI industry has to address this issue or the current generation of BI
platforms and tools may not reach, or be used to, their full potential. Primary research
demonstrates that a lack of support for, and the complexity of, BI tools and platforms leads
users to become frustrated and de-motivated, resulting in users being actively disengaged.
The literature review revealed that business users viewed BI tools as complex and left the use
of these tools to the power users within IT departments, leading to a significant ‘disconnect’
between business and IT staff. The primary research supports the theory that providing BI
tools with little or no support, training or technical assistance will ultimately see users look
for alternative BI tools to engage with. Furthermore, users may become completely
disengaged with BI tools in general, compounding the problem of user engagement with BI
tools. This has been a major issue for the BI industry as a whole, with only 24% of those
using BI tools considered engaged. With the number of BI tools now available, and the
competition for market share, vendors who are more customer-centric will see engagement
with their products increase.
On a more positive note, when asked “would adding gamification layers to a BI tool make you
want to engage more with the tool?” and “given your experience using a gamified BI tool do
you think the gamification of BI tools is a good idea?”, 75% of the participants interviewed
responded positively. Given these results it seems reasonable to suggest that, on this
occasion, the gamification of BI tools would see engagement rise above the average 24% of
actively engaged users.
Primary research results show an increase in key areas of user engagement with the
gamified BI tool as opposed to the non-gamified BI tool. The study found that there was a
significant difference in the level of interest shown by the two groups when responding to
the statement ‘I felt interested in my BI tasks’. The GamBIT group scored 61%, compared to
the control group, who rated this statement at 45%, a difference of 16 percentage points,
which is an increase of 35%. This is evidence that the gamified BI tool did increase user
engagement in this key area and suggests a potential increase in users’ motivation to use
the tool again. These findings help answer the main research question of whether the
gamification of BI tools can affect user engagement.
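To make the arithmetic behind these two figures explicit: the absolute gap between the groups is 61% − 45% = 16 percentage points, while the relative increase of the GamBIT score over the control score is 16 / 45 ≈ 0.36, i.e. roughly 35%.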
The lowest ranked UES factor for both groups was focused attention. This suggests
participants were not fully absorbed in the gamification elements, with task completion
being of higher importance. This was certainly true for one of the interview participants, for
whom the functionality of a BI tool, i.e. the user interface, visual aesthetics and platform
layout, was considered more important than having a gamified BI tool that promotes
engagement and collaboration. The participant went further, saying that BI tools should not
necessarily be made fun and engaging as users who come into regular contact with BI tools
only use them to “get the job done”. This argument suggests that implementing a
gamification platform must not detract from the actual task or process being carried out,
and links back to the strategy of closely aligning the objectives of the gamification platform
with those of the business.
The research demonstrated that user engagement with the gamified BI tool made task
completion quicker compared to the non-gamified BI tool (a time difference of 1 minute 54
seconds in favour of GamBIT), indicating that participants were more engaged with the task
at hand than those who had used the non-gamified BI tool. Although this was not one of the
objectives outlined at the start of the project, it does highlight that active engagement with
a gamified BI tool can result in quicker task completion. The clear time difference was
surprising, as initial indicators suggested that the experimental group might have taken
longer to complete the tasks given the game elements that were introduced throughout the
experiment. It also answers the question of whether the gamification of a BI tool places
additional time constraints on participants.
6.3 Limitations placed on project
Given the limitations and restrictions placed on this project, it cannot be conclusively shown
that gamifying a BI tool under different conditions and/or in a different environment would
produce similar results. As Project GamBIT was a unique experimental study conducted
within the constraints of an academic environment involving students, it is hard to say
whether a similar experiment, conducted in a workplace environment, would have produced
similar results. Participants consisted of students from the School of Engineering and
Computing, who may be considered somewhat engaged with BI tools prior to the research
being carried out. Using volunteers from a different school, e.g. the School of Business and
Enterprise within UWS, may have given the project a wider demographic and produced
different results. Business and Enterprise students are assumed to have less experience with
BI tools than students from the School of Engineering and Computing, but could be exposed
to front-end BI tools in the future given their chosen career paths and would have been
worthwhile participants in this study.
The data was produced for only one individual BI reporting tool and does not include any
findings or results from alternative BI tools and platforms. This meant no comparisons could
be made with other BI tools. The lack of research in this area also contributed to the
inability to compare the findings with other studies.
The researcher was restricted to using the tool developed for the experiment and, although
a contribution was made during development, had no involvement in the programming or
conceptual design of the GamBIT tool. The researcher did contribute to issues of
functionality, aesthetics and game elements.
The qualitative data was collected from four volunteer participants, which placed limitations
on the data. To give a comprehensive overview of the experiment, ideally all 68 participants
would have needed to contribute to the qualitative research. This was unrealistic given the
project timescales and the accessibility of participants. In some cases the interviews were
carried out approximately 3-5 days after the experiment, so participants had time to reflect
on their experience and build on their own perceived version of events. One of the
interview participants raised this issue with the researcher. Consideration was given to this
and similar issues when analysing and interpreting the qualitative data.
6.4 Future research work
As yet, no credible academic research has been carried out in this area, so no comparisons
can be made with other studies. The literature review found many reasons why user
engagement with BI tools is low but did not find any clear solutions to this global problem.
This project hopefully gives some answers to the issue of lack of user engagement with BI
tools and can be a basis for future research in this area.
The results of this study suggest that gamifying BI tools would make a good training aid when
learning to use BI tools or engaging in BI tasks or activities. The idea of gamification acting as
an “instructional designer”, complementing pre-existing instructions, could encourage a
behaviour or attitude that would increase user engagement.
Game elements such as reward, novelty, challenge and competition particularly resonated
with the majority of participants who took part in the interview process. They agreed that
gamification would be a good training aid to monitor progress, motivate users, and
encourage collaboration. All of the interviewees agreed that gamification could have a place
in a learning or educational environment. This has been the subject of much recent
academic research and has been championed by gamification vendors as the “gamification
of learning”. The findings support this argument and no doubt further academic research
will be conducted in this area.
Enterprise gamification platforms that integrate with BI tools such as Tableau and Excel
have not provided any concrete statistical evidence to support their claims of increased user
engagement with BI tools. This is one of the issues this project tackles: to produce evidence
of whether gamifying a BI tool increases user engagement. As gamification is a recent trend
and enterprise gamification platforms are still in their infancy, conclusive evidence of
gamification’s potential to increase user engagement may only become available once the
hype surrounding gamification plateaus and the industry finds its feet.
6.5 Summary
 A lack of support for, and complexity of, BI tools leads users to become frustrated
and de-motivated, resulting in users becoming actively disengaged.
 If done correctly and directly aligned with the overall business objectives, the
gamification of BI tools could see engagement levels rise above the average 24% of
actively engaged users.
 Enterprise gamification is still very much in its infancy and, although it has its
champions, no conclusive evidence for the claims of increased user engagement with
BI processes has been found. Results from the qualitative data collected highlight
that the interviewees generally agreed that, in their place of work, introducing a
gamification solution would not be high on the list of priorities for the business.
 The experiment produced some surprises in the results, with the gamified BI tool
proving to reduce task completion times compared to the non-gamified tool.
 Research suggests that gamification can be a worthwhile contributor to learning and
developing BI skill sets.
Chapter 7: Critical evaluation:
The honours project has tested me on many levels, as I expected, but has also pushed me to
the limits of my academic ability. From the beginning, serious thought had to be given to the
approach to the overall project.
7.1 Reflecting on the initial stages of the project
The initial draft specification, which was prepared over the summer period, looked to explore
the use of BI within the charity sector, as this was an area of interest to me given my
working involvement in that sector. This changed after my supervisor approached me to
consider becoming part of an ongoing study by UWS PhD student Stephen Miller, to aid in
the development of a prototype gamified BI tool, namely Project GamBIT. After researching
the subject areas project GamBIT covered (BI, user engagement with BI tools and
gamification), I decided that this would not only be very challenging and rewarding but
would also allow me to explore an area that, up until then, I had never heard of, namely
gamification. This was the basis upon which I agreed to tackle this unique study. There were
obvious advantages to this, mainly having access to the knowledge and experience the PhD
student had in terms of undertaking such a project, along with the opportunity to explore
the subject area of gamification, which I considered an interesting and stimulating topic.
There were many disadvantages to following this route. I found out very quickly that the
scope of the GamBIT study restricted my creative thinking and my ability to explore different
areas and ideas within my project. There was always an awareness that I had to keep within
the scope of the GamBIT study and, even though there was always advice on how to proceed
with my project, there was at points a feeling of lack of control over my own project. This
was compounded in many ways, as I feel I could have developed alternative routes and
raised different points of investigation and discussion during the course of the project.
These constraints became more evident towards the end of the project, as there was some
debate over what results I could use from the quantitative data that was collected and what
alternative methods I would use to differentiate my project from that of the PhD student.
Reflecting on this, the initial honours project idea may have allowed a more creative
approach and a feeling that I was more in control of the overall project. The project
supervisor admitted that in future a different approach may be needed when an honours
student becomes involved in a PhD study. The main lesson I can take away from this is that,
when committing to future projects, clearly defined objectives, responsibilities and roles
should be set out from the beginning.
On a more positive note, the challenge of taking on such a huge project was daunting, but at
the same time exciting. Keeping a high level of motivation and setting targets helped to
keep the project on track and stopped me from procrastinating. I felt the management of
the project was done in a professional manner, with milestones and deadlines adhered to
throughout. I considered this one of my main strengths throughout the project and will
continue to use this style of project management as a blueprint for future projects.
7.2 Approach to project
My approach was to treat this as a full-time project and put in the necessary hours each
week in order to keep it on track and enhance my learning of the subject areas. A lot of
work was done at home and, given that I have two daughters, aged 1 and 8, to look after,
finding time to meet my targeted hours each week was challenging, but I understood that
certain sacrifices had to be made and at points sleep was minimal. There was rarely a point
when I felt I could not continue with the project, as the motivation to succeed, provided by
my daughters, kept me focused.
7.3 Honours year modules
The honours year modules helped to expand my knowledge of the main areas of the
project, especially the data warehouse environment module. This module was key to
understanding BI and exploring user engagement with both BI platforms and tools. Given
that the module was based around the work of BI industry expert Cindi Howson, it was a
constant source of reference during the early stages of the project.
7.4 Project aids
Numerous tools were used throughout the project, many of them familiar to me, such as the
Microsoft Office Suite and Gliffy (a web-based diagram editor). Mendeley (a web program
for managing research papers and discovering research data) was discovered during the
initial stages of the literature review and was invaluable in the organisation, retrieval and
storage of research documents. Mic Note (a combined audio recorder and notepad) was
another tool discovered during the course of the project and was used extensively during the
interviewing process. These tools enabled the project to run more efficiently and effectively,
allowing more time to be spent on the key areas of the project. If I undertake another
research project in the future, all of the above tools would prove useful.
7.5 Summary
Overall, the honours project has tested my academic capabilities to the limit and will be a
process I will no doubt reflect upon in the future. The project has helped me develop in
many different ways, examples being exposure to experimental processes and having the
confidence to conduct face-to-face interviews and to source participants for both the
interviews and the experiments, each of which I had never done before and which took me
outside my comfort zone. I will take the lessons learned from undertaking such a huge
project with me going forward, to hopefully achieve my personal goal of carving out a long
and successful career in the IT industry.
References
Attfield, S., Kazai, G., Lalmas, M., & Piwowarski, B. (2011). Towards a science of user
engagement (Position paper). In Paper presented at the WSDM workshop on user modelling
for web applications, Hong Kong, China.
Azvine, B., Cui, Z. & Nauck, D.D., (2005). Towards real-time business intelligence. BT
Technology Journal, 23, pp.214–225.
Bogdan, R. C., & Biklen, S. K. (1998). Qualitative research in education: An introduction to
theory and methods (3rd ed.). Needham Heights, MA: Allyn & Bacon
Bogost,(2011), Persuasive Games: Exploitationware,
http://www.gamasutra.com/view/feature/6366/persuasive_games_exploitationware.php
[Accessed October 2015]
Carey, J. W. (1993). Linking qualitative and quantitative methods: Integrating cultural factors
into public health. Qualitative Health Research 3: 298–318.
Dale Carnegie Training (2012) What Drives Employee Engagement and why it matters,
White Paper, Dale Carnegie & Associates, Inc.
Davenport, T.H., Barth, P. & Bean, R., (2012). How “Big Data” is Different. MIT Sloan
Management Review, 54(1), pp.22–24. Massachusetts Institute of Technology
Decision Hacker (2012) Gamification and Gamified Business Intelligence, blog post,
http://decisionhacker.com/2012/11/08/gamification-and-gamifiedbusiness-intelligence/
Deloitte, (2015). Global Human Capital Trends 2015. Leading in the new world of work. ,
p.112. Graphic: Deloitte University Press
DeMonte.A (2014), Badgeville on gamification and the psychology of motivation
https://badgeville.com/adena-demonte-badgeville-on-gamification-andthe-psychology-of-
motivation/
Deterding, S., (2012). Gamification. Interactions, 19(4), p.14. Available at:
http://doi.acm.org/10.1145/2212877.2212883nhttp://dl.acm.org/ft_gateway.cfm?id=2212
883&type=pdf.
DiSanto,D (2012), Time to insight,
http://www.digitalistmag.com/technologies/analytics/new-kpi-time-to-insight-017026
(Accessed : October 2015)
Dresner Advisory Services, (2012). Wisdom of Crowds Mobile Computing / Mobile Business
Intelligence Market Study 2012. , (November), pp.1–76.
Duggan, K., Shoup, K. (2013) Business Gamification for Dummies John Wiley & Sons, Inc.,
Hoboken, New Jersey
Eckerson, W. (2010), Performance Dashboards: Measuring, Monitoring, and Managing Your
Business,2010 Wiley
Gallup, (2013), “State of the Global Workplace: Employee Engagement Insights For Business
Leaders Worldwide,” Gallup HQ, Washington, 2013. [Accessed: November 2015]
Gartner, Inc. (2013) “Business Intelligence” http://www.gartner.com/it-glossary/business-
intelligence-bi/ NYSE: IT [Accessed: October, 2015]
Given, Lisa M. (2008). The Sage encyclopaedia of qualitative research methods. Los Angeles,
Calif.: Sage Publications.
Howson, C, (2014). Successful Business Intelligence: Unlock the Value of BI & Big Data.
McGraw-Hill Education.
Howson,C,(2014) “BI Scorecard: Successful BI survey,”
http://biscorecard.typepad.com/biscorecard/2014/04/bi-adoption-remains-flat.html,
[Accessed: 01/11/2015]
Inmon, W.H. (2005) Building the Data Warehouse, 4th ed. Wiley & Sons.
Jenness, D, (2014), RedCritter , https://www.redcritterconnecter.com/home [Accessed:
November 2015]
Kim, B. (2012). Harnessing the Power of Game Dynamics: Why, How to, and How Not to
Gamify the Library Experience. College & Research Libraries News, 73(8), pp.465–469.
Available at: http://search.proquest.com/docview
Kimball. R. & Ross.M, (2002) The Data Warehouse Toolkit: the complete guide to
dimensional modelling, Wiley Computer, Publishing, 2002.
Kumar,J.M. & Herger,M. (2013): Chapter 8: Legal and Ethical Considerations. In:
"Gamification at Work: Designing Engaging Business Software, https://www.interaction-
design.org/literature/author/janaki-mythily-kumar [Accessed October 2015]
Lohr, Steve, (2012) The Age of Big Data, New York Times –Sunday Review- news analysis,
http://www.nytimes.com/2012/02/12/sunday-review/big-datasimpact-in-the-world.html?
(Accessed: October 2015). A version of this news analysis appears in print on February 12,
2012, on page SR1 of the New York edition with the headline: The Age of Big Data.
Madan S. (2013), Lead Analyst, Information Management, Ovum
http://www.appstechnews.com/news/2013/feb/12/jury-still-out-on-value-of-
bigamification/ [Accessed: November 2015]
Miller, A.S, (2013), Transfer Event Report MPhil / PhD à PhD. What’s the BIG idea ? Business
Intelligence using Gamification - Evaluating the effects on user engagement, School of
Engineering & Computing, University of the West of Scotland, Scotland
Miller,S, McRobbie,G, (2013), Business Intelligence Tools – Should they be ‘gamified’?
Project ‘GamBIT’: Evaluating user engagement of a business intelligence tool, University of
the West of Scotland UWS, Paisley, Scotland
Nicholson, S. (2012). A User-Centred Theoretical Framework for Meaningful Gamification.
Paper Presented at Games &Learning & Society 8.0, Madison, WI.
O’Brien, H.L. & Toms, E.G. (2013). Examining the Generalizability of the User Engagement
Scale (UES) in Exploratory Search, Information Processing and Management 49, pp.1092–
1107,
O'Brien, H.L., & Toms, E.G. (2008). What is user engagement? A conceptual framework for
defining user engagement with technology. Journal of the American Society for Information
Science and Technology, 59(6), 938–955.
Paharia, R. (2013). Loyalty 3.0: How to revolutionize customer and employee engagement
with big data and gamification. McGraw Hill Professional.
Quillan, C. (2011) Business Research Methods Andover, South-West Cengage Learning
Sale, J, Lohfeld, M, Brazil, K (2002). Revisiting the quantitative-qualitative debate:
Implications for mixed-methods research
Senapati, L., (2013). Boosting User Engagement through Gamification. , p.5. Available at:
http://www.cognizant.com/InsightsWhitepapers/Boosting-UserEngagement-through-
Gamification.pdf
Shahri, A, Hosseini, M, Phalp, K., Taylor, J, & Ali, R. (2014), towards a code of ethics for
gamification at enterprise. In The Practice of Enterprise Modelling (pp. 235-245). Springer
Berlin Heidelberg.
Stanley,R, (2014). Top 25 Best Examples of Gamification in Business.
http://blogs.clicksoftware.com/index/top-25-best-examples-of-gamification-in- business/.
Strauss, A, Corbin,J (1998). Basics of qualitative research: Techniques and procedures for
developing grounded theory, 2nd edition, Sage Publications Ltd, London, UK.
Swain Scheps, 4 Jan (2008), Business Intelligence for Dummies Paperback. John Wiley &
Sons, Inc, Hoboken, New Jersey
Swoyer, S, (2012) Making BI analytics fun https://tdwi.org/articles/2012/04/17/making-bi-
analytics-fun.aspx
Tableau Software, "Tableau Business Intelligence", http://www.tableau.com/business-
intelligence [Accessed: November 2015].
Volkswagen (2009), Fun Theory, http://www.thefuntheory.com/ [Accessed October 2015]
Werbach, K. (2014) http://www.slideshare.net/mikederntl/gamification-of-learning-design-
environments-workshop, Gamification course. [Accessed October 2015]
Werbach,K, Hunter,D,(2012) , For the Win – How game thinking can revolutionize your
business ,Wharton Digital Press, The Wharton School University of Pennsylvania,
Philadelphia
Wixom, B; Ariyachandra, T; Douglas, D; Goul, M; Gupta, B; Iyer, L; Kulkarni, U; Mooney, J. G.;
Phillips-Wren, G; and Turetken, O. (2014) "The Current State of Business Intelligence in
Academia: The Arrival of Big Data," Communications of the Association for Information
Systems: Vol. 34, Article 1.
Wu, M, (2011), “Gamification from a Company of Pro Gamers”,@lithospherlithium.com.
Zichermann,G. Linder, J.(2013), The gamification revolution: how leaders leverage game
mechanic to crush the competition, p158-156, McGraw Hill Education , USA
Appendix:
This chapter is a collection of the appendices.
Appendix A Descriptive statistics
Provided by GamBIT developer – PhD student Stephen Miller
Section 2 Descriptive Statistics
Section 2.1 - Basic testing methods
There are a number of different basic statistical tests that can be applied to any research work
that utilises a Likert scale survey instrument, as in this research. The tests are primarily used
to describe the sample group or to summarise information about them. The most common
ones include: the mean, median, mode (measures of central tendency), minimum and maximum
scores, and the standard deviation (σ – a measure of how spread out the numbers are).
Section 2.2 – SPSS: Overview of the data input procedures
All of the survey data (68 responses) was input into the statistical software package SPSS
(version 23). This tool allows a wide range of statistical tests to be conducted on the survey
data, once it is in the required format, with just a few clicks. Two main files were created to
hold the data entries: (i) GamBIT.survey.data.sav (see Appendix A) and (ii)
GamBIT.USB.data.sav (see Appendix B). Further information about each file follows:
1. GamBIT.survey.data.sav – this file was created to analyse the 28 variables (where
each question on the User Engagement Scale (UES) survey is a variable). Within this
scale there were sub-scales known as factors or constructs i.e. a group of variables
that relate to a hidden or latent variable (something that was being measured). In the
case of the UES there are six sub-scales with the following latent variables:
 Focused Attention – FA (7 inter-related variables or questions)
 Perceived Usability – PU (7 inter-related variables or questions)
 Endurability – EN (4 inter-related variables or questions)
 Aesthetics – AE (5 inter-related variables or questions)
 Novelty – NO (3 inter-related variables or questions)
 Felt Involvement – FI (2 inter-related variables or questions)
Once all of the Likert scale scores were input into the file for each of the 68 participants, a new
‘super’ variable was created for each of the six latent variables above. Each new variable
took the median (middle score) of a participant’s responses across the inter-related
variables, creating one variable to measure rather than trying to measure all 28 variables
(questions) individually. This makes it much easier to statistically analyse and measure the
latent variables and to provide output information in a user-friendly format. The median
was recommended as the best measure to use by the associated research literature
because the measurement level was ordinal, i.e. a ranking order of 1 (strongly agree) to 5
(strongly disagree) based on the survey options.
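As an illustration only (the actual analysis was carried out in SPSS; the file name and item column names below are hypothetical), the same per-factor median ‘super’ variables could be derived with a few lines of Python/pandas:

import pandas as pd

# Hypothetical item columns: each UES question scored 1 (strongly agree) to 5 (strongly disagree).
factors = {
    "FA": [f"FA{i}" for i in range(1, 8)],  # Focused Attention (7 items)
    "PU": [f"PU{i}" for i in range(1, 8)],  # Perceived Usability (7 items)
    "EN": [f"EN{i}" for i in range(1, 5)],  # Endurability (4 items)
    "AE": [f"AE{i}" for i in range(1, 6)],  # Aesthetics (5 items)
    "NO": [f"NO{i}" for i in range(1, 4)],  # Novelty (3 items)
    "FI": [f"FI{i}" for i in range(1, 3)],  # Felt Involvement (2 items)
}

# One row per participant (68 rows in this study); file name is hypothetical.
survey = pd.read_csv("gambit_survey.csv")

# 'Super' variable per latent factor: the median of that participant's related item scores.
for factor, items in factors.items():
    survey[f"{factor}_median"] = survey[items].median(axis=1)

print(survey[[f"{f}_median" for f in factors]].head())

The median (rather than the mean) is used here for the same reason given above: the Likert responses are ordinal.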
2. GamBIT.USB.data.sav – this file was created to measure the ‘actual’ times recorded on
each of the 68 USBs (memory sticks) used by the participants to complete the BI tutorial
tasks. Time variables were created to analyse the time elements corresponding to each
of the 6 tasks as follows:
 Tutorial start (TS) – the time recorded at the start of the tutorial when the
participant clicks the relevant buttons to ‘create a new project’ or in the case of
the GamBIT experiment group, whenever they entered a user name.
 Task 1 (T1) – time recorded when the participant ‘creates a new report’.
 Task 2 (T2) - time recorded when the participant ‘builds a data source’.
 Task 3 (T3) - time recorded once the user ‘builds a new data set’.
 Task 4 (T4) - time recorded after the participant ‘formats the data’.
 Task 5 (T5) - time recorded once the user ‘styles the data’.
 Task 6 (T6) – this was created as an ‘optional’ task. Participants were advised in
the tutorial that they did not need to complete it. Task 6 was designed to see if
users were engaged enough to continue or, alternatively, to ‘opt out’. It required
users to ‘create a report title’ using HTML (Hypertext Markup Language) tags,
which is arguably the hardest and most time-consuming of the tasks.
‘Super’ variables were created after all of the times were input into the file, i.e. a time taken
to complete each task, for comparison purposes. This meant computing the variables by taking
the time recorded on the completion of each task and subtracting from it the time recorded
for the previous task, i.e. T1 - TS (time taken to complete task 1), T2 - T1 (task 2 time
taken), etc. A further variable, Time_taken_all6tasks, was created to provide data on the
times taken by those who completed all 6 tasks, by subtracting the tutorial start time from
the T6 completion time (T6 – TS). Two other variables were created:
 UserName – listing the user names the participants entered at the start (GamBIT only)
 Time_completed_T6 – identifying the users that did (or did not) complete task 6.
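A minimal sketch of this differencing step is shown below, under the assumption of a simple table of checkpoint times (the file name and column layout are hypothetical; the original derived variables were computed in SPSS):

import pandas as pd

# Hypothetical layout: one row per USB stick, with columns TS, T1..T6 holding the
# elapsed time in seconds at each checkpoint (blank where a task was not completed).
usb = pd.read_csv("gambit_usb_times.csv")

checkpoints = ["TS", "T1", "T2", "T3", "T4", "T5", "T6"]

# Time taken per task = time at this checkpoint minus time at the previous one
# (T1 - TS, T2 - T1, ..., T6 - T5).
for prev, curr in zip(checkpoints, checkpoints[1:]):
    usb[f"time_taken_{curr}"] = usb[curr] - usb[prev]

# Total time for those who completed all six tasks (T6 - TS),
# and a flag for whether the optional task 6 was completed.
usb["Time_taken_all6tasks"] = usb["T6"] - usb["TS"]
usb["Time_completed_T6"] = usb["T6"].notna()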
Section 2.3 – Further SPSS files
As a consequence of the type of analyses being carried out it was necessary to create another
two SPSS files as follows:
1. GamBIT.data.experimental.sav: this file was created from the main file i.e. the data
about ‘Group B: experimental – ‘GamBIT’’ was copied onto a new file to allow
independent testing of Group B without jeopardising or compromising Group A data.
2. GamBIT.data.factor.sav: this file was created from the main survey data file to allow
the researcher to carry out various statistical tests on the data, i.e. structural
equation modelling (SEM), factor analysis, multiple regression, Cronbach’s Alpha,
etc. A screenshot of this file is shown at Appendix C.
Section 2.4 – Test results and conclusions
A number of different tests were carried out on the data files listed previously. Let us start
with the survey data file first (GamBIT.survey.data.sav). The following tests have been
conducted on the data:
 Mean - a measure of central tendency that entails summing all values in a distribution
of values and dividing the sum by the number of cases (n = 68).
 Median - a measure of central tendency that entails arraying in ascending or
descending order all values of a distribution of values and then calculating the mid-
point of that distribution. Half of all cases (50%) will be on one side of the median and
half (50%) will be on the other side of the median.
 Standard deviation – a measure that summarises the amount of dispersion in a
sample and is based on the amount of variation around the arithmetic mean.
 Minimum - the minimum recorded score on a scale (in this case, the minimum of the
median scores, which can be a decimal value such as 1.50 or 2.50 rather than just 1
(strongly agree) or 2 (agree), in line with the survey gradings).
 Maximum - the maximum recorded score on a scale (in this instance, the maximum of
the median scores, which can be a decimal value such as 3.50 or 4.50 rather than 4
(disagree) or 5 (strongly disagree)).
 Inter-quartile range (IQR) - the difference between the highest and the lowest values in a
distribution of values when the highest 25% and the lowest 25% have been removed. It is
the difference between the first and the third quartile (Q), i.e. Q3 – Q1.
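For illustration, these summary measures, including the IQR, could be reproduced for any one of the ‘super’ variables as sketched below (the file and column names are hypothetical; the actual figures in this study were produced in SPSS):

import pandas as pd

survey = pd.read_csv("gambit_survey.csv")  # hypothetical file, one row per participant
scores = survey["FA_median"]               # any one of the factor 'super' variables

summary = {
    "mean": scores.mean(),
    "median": scores.median(),
    "std": scores.std(),
    "min": scores.min(),
    "max": scores.max(),
    # Inter-quartile range: third quartile minus first quartile (Q3 - Q1).
    "IQR": scores.quantile(0.75) - scores.quantile(0.25),
}
print(summary)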
Appendix B Appeal for volunteers
Honours Year Research Project
Appeal for volunteers to participate in experiment
Dear fellow students,
The research: I am a 4th year honours student within the School of Engineering & Computing
at UWS and, as such, I have been working on a research project that involves the exploration
of gamification (the use of game elements and design in a non-game context) and its effects
on user engagement; in this case, measuring user engagement with a business intelligence tool
(a tool that stores and analyses business data). The coding has finished and I now need to test
it on a number of people to obtain their views and feedback.
My appeal: As a university student you will know that we all need people to help us with our
studies at some point or another, so I am now appealing for as many volunteers as possible to
take part in my own experiment. It would therefore be very much appreciated if you are able
to help me at this crucial time in my studies. My supervisor Dr Carolyn Begg has advised me
that the current 3rd year BI students would potentially make great volunteers, as they are
engaged in BI tools and participation would give them an insight into how honours students
gather primary research for their projects. I am due to speak to the class on Monday 22nd
February to appeal to anyone who may be interested and answer any questions.
Your part: The testing is straightforward – you will be given an exercise tutorial that has a
plain English guide with screenshots showing you what you should do on one of the
computers in the Lab (Room E116). Follow the guide for as long as you can or want to and
complete a short survey at the end expressing how you felt about the exercise. Someone will
be in the Lab during the experiment to provide assistance should it be needed. Your
participation is entirely voluntary.
Time: If you complete the full exercise it should last about 50 minutes (including the time
taken to complete the survey).
What next: Please advise myself (email address below) or Dr. Carolyn Begg if you wish to
participate and we will tell you what happens next.
Thanks: Your participation is greatly appreciated.
Student: Gary Brogan (B00272662).
Email – B00272662@studentmail.uws.ac.uk
Appendix C Semi-Structured Interviews
The PDF files below contain the full transcripts of the four interviews conducted. Refer to
author if further information is needed.
Interview 1.pdf Interview2.pdf Interview3.pdf Interview4.pdf
Appendix D Interview Guide
This PDF file contains the guide on how the interviews were conducted.
Interview guide - .pdf
Appendix E Project Specification Form
COMPUTING HONOURS PROJECT SPECIFICATION FORM
Project Title: An Exploration of the Gamification of Business Intelligence tools and the Effect
on User Engagement
Student: Gary Brogan Banner ID: B00272662
Supervisor: Dr Carolyn Begg
Moderator: Dr Graeme McRobbie
Outline of Project:
The main purpose of Business Intelligence (BI) is to produce timely, accurate, high-value, and
actionable information. As a technology, BI has been seen to be under-used and, as such,
has significant untapped potential. One of the main factors contributing to it being under-
used is a lack of user engagement with BI front-end tools. This is the point that this project
addresses, asking the question of whether gamification of a BI tool can affect user
engagement.
Gamification is the use of game design and mechanics in a non-game context to
engage users or solve problems. By applying gamification to, or in other words ”gamify”, a BI
tool, the research described in this project seeks to gather evidence that gamification of a BI
tool can lead to increased user engagement.
This project will form part of an on-going study named GamBIT, a gamified BI tool.
The work carried out will aid GamBIT application development and gather evidence as to
whether or not GamBIT achieved increased user engagement with a BI tool.
A Passable Project will:
 Carry out a literature review relevant to this project.
 Investigate and evaluate user engagement with BI tools.
 Produce conclusions and analytical information based on the research and
evaluation findings.
A First Class Project will:
 Carry out a literature review which critically examines relevant and pertinent
published works and shows a thorough understanding of the subject area.
 Conduct highly detailed and exemplary research in carrying out the project, both
primary and secondary, making use of a wide variety of sources and methodologies.
 Contribute to the body of knowledge of whether gamification of BI tools has the
ability to increase user engagement with BI and highlight the implications for the
future enhancement of BI using gamification with particular focus on end-users and
front-end tools.
Marking Scheme:
Marks
Introduction 10%
Literature Review 25%
Primary Research 25%
Discussion 10%
Conclusions & Recommendations 20%
Critical Evaluation 10%
Appendix F Initial GamBIT development involvement
During the initial development of the GamBIT prototype a number of tests were conducted
to help evaluate the prototype. An approach was made to a number of students from the
School of Engineering and Computing at the University of the West of Scotland (UWS),
Paisley, who had shown an interest in the work being carried out in this report. Evaluation
was done through direct observation of volunteers who had agreed to test the prototype, in
an attempt to observe their interaction with the prototype and with the Eclipse platform.
The main areas under observation were:
 Length of time to complete the tasks
 Navigation of the platform
 Reaction to the gamification elements
After the tests were conducted, feedback was given by the volunteers, which included:
 Incorporate rewards such as badges when a task is complete
 Simplification of the game based rules
 Reworking of the tutorial to highlight every step of the process involved in carrying
out the tasks.
The time taken for the volunteers to complete the tasks varied from 50 to 75 minutes. The
estimated time to be applied to the actual experiment was around 45 to 60 minutes. This
gave the developer time to re-evaluate the prototype and make the necessary changes prior
to the experiments being carried out.
Appendix G User Engagement Scale including Research Survey
The link below gives access to the User Engagement Scale (UES), including the research
survey that was used as part of this project. Refer to the author if further information is
needed.
Questionnaire UES survey 10.2015.pdf
Appendix H GamBIT gamification elements
The following are screenshots of the game elements taken from the Eclipse BIRT platform.
They show the gamification layers that were added to the BI tool and details of the
report building process undertaken by the participants of the experiment.
Eclipse BIRT platform
On the commencement of the report build process this “Welcome” pop-up appears.
Badges awarded for completing a task
Levels involved in “Bagging a Ben”. Green highlighting the completed tasks.
Task completion pop-up
Project GamBIT rules.
Report preview
GamBIT Leaderboard
  • 9. 8 | P a g e feelings and opinions on the future evolution of both the BI and gamification sectors and identify any correlations between these sectors. The secondary research of this report explores the concept of BI, its modular components, the emergence of BI and the factors influencing the BI industry with a focus on the adoption rate of BI tools. The concept of BI is examined in its broadest sense by reviewing the published literature with particular reference to material based on user engagement with BI tools which is relatively limited in scope and detail. The report will then centre on the recent trend of gamification, what it is, and its scope. It will explore the possibility of whether, by applying gamification to a front-end BI tool, this could have an effect on user engagement with the tool. When researching user engagement, users mainly fall into two groups, employees and customers. For the purposes of this study the focus is on the employee user group. The subject areas that will form the basis of the literature review are:  Business Intelligence (BI)  User engagement with BI tools  Gamification. Figure 1-0 The Venn diagram organises the key subject areas of this report visually so the similarity between relationships can be clearly seen. Chapter 2: Literature review: Keywords - Gamification, Employee Engagement, User Engagement, Business Intelligence, Business Intelligence Tools, Game Elements, Game Mechanics, Intrinsic Motivation, Enterprise Gamification.
  • 10. 9 | P a g e This chapter contains the literature review carried out by the researcher and examines relevant literature on the BI sector, focusing on the history of BI and the current state of the industry. The literature review will then concentrate on user engagement with BI and especially front-end BI tools. This part focuses specifically on the exploration of adoption rates of BI tools and highlights any potential issues that could lead to “a lack of user engagement with BI tools”. The focus will then turn to the new trend of gamification and its potential correlation with BI and user engagement with front-end BI tools 2.1 Business Intelligence The term Business Intelligence, or BI, was coined by Howard Dresner of the Gartner Group, in the late 1980s. BI is a huge and rapidly growing industry that emerged as a result of organisations beginning to realise and understand that the data stored within their decision support systems (DSS) had the potential to be of great value to them. Many of the early adopters of BI were in transaction-intensive businesses, such as telecommunications and financial services. As the industry matured the BI technical architecture began to include Data Warehouses, Data Marts, Executive Information Systems (EIS), Online Analytical Processing (OLAP) and by the mid-1990s BI, along with all its modular components, became widely adopted (Miller, 2013). As a result, BI became so closely associated with Data Warehouse technology it became identified as one and the same and is referred to using the acronym BI/DW. By the mid-1990s two main leaders in the BI industry emerged, Bill Inmon and Ralph Kimball. Inmon’s philosophy is based on an enterprise approach to data warehouse design using a top-down design method (Inmon 2005) while Kimball’s offering consists of a dimensional design method which is considered a bottom-up design approach (Kimball 2002). Even now a debate still rages on which of these approaches is more effective. Research points towards both Inmon and Kimballs approaches having advantages and disadvantages with many organisations having successfully implementing either approach. Organisations who are considering implementing a BI infrastructure would have to give careful consideration to both these approaches and closely aligned either approach with the overall high level business strategy of the organisation. Until recently BI had adopted a mainly centralised model around organisations IT departments. This meant that getting information to the right users could take considerable time and the build-up of requests for reports, analytics and insights from within the organisation could become “bottlenecked”. The general consensus was that business users viewed BI tools as complex and left the use of these tools to the “power users” within IT departments. This naturally evolved into a big disconnect between the IT power users and business users and led to many problems for what is now referred to as “Traditional BI”. Research suggests that Traditional BI best practices were considered slow, painful, and expensive therefore seen as a bit of a failure (Howson,C 2014). 2.1.1 Business Intelligence defined As BI has evolved so too has its definition and as such can be defined in various ways. (Howson, C. 2014) defines BI as a “set of technologies and processes that allow people at all levels of the organisation to access and analyse data”. 
Gartner (2013), the world's leading information technology research and advisory company, describes BI as an umbrella term that includes the applications, infrastructure and tools, and best practices that enables
  • 11. 10 | P a g e access to and analysis of information to improve and optimize decisions and performance. Eckerson,W (2010), Director of BI Leadership Research , appreciated the need for BI tools to provide production reporting, end-user query and reporting, OLAP, dashboard/screen tools, data mining tools, and planning and modelling tools. Research suggests that currently there are no combinations of hardware and software, any processes, protocols, or architectures that can truly define BI. What (Wu, L, Barash, G, & Bartolini, C. et al 2007) have made clear however, is that up until recently BI’s objectives were to:  Offer an organisation a “single version of the truth”.  Provide a simplified systemimplementation, deployment and administration.  Deliver strategic, tactical and operational knowledge and actionable insight. 2.1.2 Current state of Business Intelligence The recent unstructured data explosion and the trend towards “Big data” (Davenport, T.H., Barth, P. & Bean, R. 2012) has seen BI evolve yet again and as such BI has become synonymous with Big Data and Big Data analytics. As the volume, velocity and variety of data (the three V’s) has exponentially increased so too has the demand for cost-effective, innovative forms of information processing for enhanced insight and decision making (Lohr, S. 2012) .Vast volumes of data are now being captured and stored, but research shows it has been impossible for traditional BI to analyse and manage this data due to the unstructured nature of it (figure 2-0). Wixom, B. (2014) highlights how BI responded to the challenges posed by Big Data by adopting advanced technologies such as:  Hadoop architectures  Data visualization and discovery tools  Predictive analytics  Rare combinations of user skills (e.g., data scientists) Figure 2-0 - Graphic: Imex Research Businesses are now demanding faster time to insight (DiSanto,D, 2012), to stay competitive in today’s fast paced, evolving global markets and BI has to at least try and keep up with the pace of these demands. Traditional BI tools could take days or weeks to produce reports and
  • 12. 11 | P a g e analysis, this is no longer enough. This seen a demand for real time Business Intelligence, (RTBI) Azvine, B, Cui, Z, & Nauck, D. (2005) agreed that it is “becoming essential nowadays that not only is the analysis done on real-time data, but also actions in response to analysis results can be performed in real time and instantaneously change parameters of business processes”. As RTBI evolved, so too has the more recent BI trend of self-service BI. Front-end business users, who are considered the main information consumers, want to see, analyse and act upon their data more quickly without having to heavily rely on IT departments making their data available to them. The shift away from a centralised BI model to a more balanced centralised/de-centralised BI model (Wu, L., Barash, G., and Bartolini,C, 2007) has seen the emergence of, and increased organisational involvement with, self-service BI (SSBI). Gartner (2013) defines SSBI “as end users designing and deploying their own reports and analyses within an approved and supported architecture and tools portfolio.” Imhoff, C. & White, C. (2011) define SSBI as the facilities within the BI environment that enable BI users to become more self-reliant and less dependent on the IT organization. These facilities focus on four main objectives: 1. Easier access to source data for reporting and analysis, 2. Easier and improved support for data analysis features, 3. Faster deployment options such as appliances and cloud computing, and 4. Simpler, customizable, and collaborative end-user interfaces. Figure 2-1 - Graphic: BI Research and Intelligent Solutions, Inc. To help organisations achieve these four main objectives it would be worth exploring the concept of intrinsic motivation later on in this report, which Paharia, R. (2013) argues is directly linked to SSBI users feeling empowered, and how it fits in with individual adoption rates with SSBI processes. Research points towards SSBI lending itself to “multiple versions of the same truth” whereas traditional BI offered organisations a “single version of the truth”. SSBI has been facilitated by the increased use of BI front end tools, mainly Visual Data Discovery (VDD) tools (Howson, C. 2014). Eckerson,W (2010) defines VDD tools as “self-service, in-memory analysis tools that enable business users to access and analyse data visually at the speed of thought with minimal or no IT assistance and then share the results of their discoveries with
  • 13. 12 | P a g e colleagues, usually in the form of an interactive dashboard”. SSBI has now become synonymous with VDD tools and has become a top investment and innovation priority for businesses over the past few years. The annual Gartner Business Intelligence and Analytics Summit (2014) looks at the current trends within the BI industry and highlighted that:  Self-service analytics is “white hot” and growing while demand for traditional dashboard BI is in remission.  BI on Big Data (i.e. Hadoop-based and outside of the data warehouse) is a dynamic new class of problem that requires a new class of solution.  Today's buyers are increasingly coming from the business side of the house and not from corporate IT, which has seen the move away from a centralised BI model to more decentralized BI model. 2.1.3 Summary  Traditional BI best practices considered a bit of a failure.  Business users viewed BI tools as complex and left the use of these tools to the “power users” within IT departments. Leading to a big ‘disconnect’ between business and IT staff.  The de-centralisation of BI has seen the emergence of self-service BI. This new trend has been facilitated by the increased use of BI front end tools, mainly Visual Data Discovery (VDD) tools. 2.2 User Engagement with BI Tools This section of the literature review concentrates on user engagement with BI and looks at the links between user engagement with BI, or lack of it, and the wider global issue of employee engagement in the workplace. Technology is important in any BI initiative but so too is need for BI users to be “engaged” with the BI environment. Having an engaged workforce has proven to help foster an analytical culture within organisations. Paharia,R. (2013) suggests that engaged workers “can drive meaningful increases in productivity, profitability, and product quality, as well as less absenteeism, turnover, and shrinkage”. This is no mean feat to achieve. It’s the combination of people and technology that turn data into actionable information that can be used to enhance the organisations decision-making (Miller, A.S. 2013), that lies at the heart of BI. By getting the right information to the right people at the right time BI can become an integral part when improving decision making, providing valuable business insights, optimising organisational performance and of measuring success. However, employee adoption of and engagement with BI is critical in any BI initiatives success or failure.
  • 14. 13 | P a g e 2.2.1 What is employee engagement? Employee engagement does not have one simple or accepted definition. The Chartered Institute of Personnel and Development take a three dimensional approach to defining employee engagement: • Intellectual engagement – thinking hard about the job and how to do it better • Affective engagement – feeling positively about doing a good job • Social engagement – actively taking opportunities to discuss work-related improvements with others at work 2.2.2 What is User Engagement? Research has shown that user engagement has several definitions. This highly cited definition by O'Brien, H.L., & Toms, E.G. (2008) states “Engagement is a user’s response to an interaction that gains maintains, and encourages their attention, particularly when they are intrinsically motivated” while Attfield, S, Kazai, G., Lalmas, M., & Piwowarski, B. (2011) explain that “User engagement is a quality of the user experience that emphasizes the positive aspects of interaction – in particular the fact of being captivated by the technology” Research points towards user engagement being the determining factor in any successful BI initiative. Organisations that have more users engaging with BI, with the emphasise on BI tools, will more than likely see a better Return on Investment (ROI) in their BI ventures than that of those whose workforce are lacking in engagement (Howson,C. 2014). 2.2.3 BI Adoption Rate Recent survey suggests that BI adoption as a percentage of employees remains flat at 22%, but companies who have successfully deployed mobile BI (Dresner.H, 2012), show the highest adoption at 42% of employees (figure 2-2) Figure 2-2 - Graphic: BI Scorecard The lack of BI adoption from the employee perspective can be aligned closely with the wider global problem of “lack of employee engagement” in the workplace. According to Deloitte’s 2015 Global Human Capital Trends survey (figure 2-3), employee and cultural engagement is the number one challenge companies’ face around the world.
  • 15. 14 | P a g e Figure 2-3 - Graphic: Deloitte University Press Gallup conducted a study in 2013 into the state of the global workplace. The findings show of the 142 countries that measured employee engagement that 13% of employees are engaged in their jobs, while 63% are not engaged and 24% are actively disengaged. While in the U.S Dale Carnegie and MSW did a study on over 1500 employees that measured employee engagement. It revealed that 29% of the workforce is engaged, 45% are not engaged, and 26% are actively disengaged (Dale Carnegie Training 2012). As more organisations employ BI and analytics to improve and optimize decisions and performance research points towards the question many organisations have asked “what is going to make the difference between a successful BI initiative and one that will flat line?” The need to stay one step ahead in an ever increasing and competitive global marketplace is proving harder. Business leaders are looking to technology as the main driver in remaining competitive in today’s markets. Having the right information technology infrastructure in place is not enough to give organisations the edge. What the research leans towards is having an engaged, motivated and collaborative workforce. This is especially true in the BI environment where adoption rates of BI tools has flat lined over the past decade. Some have suggested that those who are exposed to the front end tools and how they engage with them may make the difference in the success or failure in any BI initiative. It would seemthat Organisations looking to take BI adoption rates, and indeed user engagement with BI tools, to the next level would have to have a clear strategy that makes user engagement a priority. To get the right information to the right person at the right time does not guarantee BI success, if users are not engaging with BI tools an organisations BI deployment could be doomed to failure. However to address this problem an important questions should be asked “is user engagement with BI at a required level to make BI a success?” If this question cannot be clearly answered an organisations BI efforts could fail to deliver the results that were initially predicted.
  • 16. 15 | P a g e Senapati, L., (2013) argues that to gain competitive advantage through active user engagement, organizations must leverage gamification mechanics to influence user behaviour and drive results. The summary below gives an indication to why there is a lack of engagement with BI tools. 2.2.4 Summary  The adoption rate of BI tools has flat lined at 22% over the past decade.  The issues surrounding user engagement with BI tools can be directly linked to the wider global issue of lack of employee engagement in the workplace.  Organisations who have employees actively engaged with BI tools see a great return on investment with their BI initiatives.  To take user engagement with front-end BI tools to the next level, organisations will need a clear strategy that makes user engagement a priority. 2.3 Gamification This section of the Interim report will focus on the subject area of gamification. It will explore its history, how it is defined and its correlation with BI and in particular exploring the possibilities that the gamification of BI tools could have an effect on user engagement. Gamification is a relatively new concept that is constantly evolving and has been gaining popularity over the past few years with many vendors now offering gamification platforms and solutions. The development of new frameworks, technologies and design patterns has made gamification scalable and effective (Werbach,K, Hunter,D, 2012). This has led to it being applied and utilised throughout organisations to gain business benefits across a wide processes, tasks and tools. The term “gamification” has been accredited to the British-born computer programmer and inventor Nick Pelling who coined the phrase in 2002 but it was not until 2010 that articles and journals based on gamification started to appear. The rise in popularity of gamification has resulted in it experiencing considerable attention over the past few years. Google trends shows that search volume for gamification increased significantly since 2010 and spiked in February 2014.(Figure 2-4) Since then it has stayed at a steadier search volume. (December 2015). Gartners top 10 strategic technology trends 2014 showed gamification as a rising trend for a number of years. (Figure 2-5) but has seen the hype surrounding it die down and should reach its plateau of productivity in the next 2 to 5 years. Like all trends it has is champions and its critics and although gamification has quickly evolved into a multi-million dollar industry it is still considered to be in its infancy and therefore not fully matured.
  • 17. 16 | P a g e Figure 2-4 Google Trend search results for the keywords gamification & business gamification Figure 2-5 Gamification in the Gartner 2014 Hype Cycle There are many schools of thought on the definition of “what” gamification is. Duggan, K. & Shoup, K. (2013) use this explanation of Gamification to highlight both the human behavioural and technology elements used in gamification. “Think of gamification as the intersection of psychology and technology… understanding what motivates someone to ‘engage’ with certain elements of a website, app, or what have you… It’s about humanising the technology and applying psychology and behavioural concepts to increase the likelihood that the technology will be used and used properly”. Werbach,K. Hunter,D.(2012) define gamification as “the use of game mechanics and design in a non-game context to engage users or solve problems”. It is important that the research does not confuse gamification with “playing games” or “serious games” (Nicholson, S. 2012) which also applies game elements and design to non-game concepts. Gamification is not people playing or creating full blown games, whether it be for employees or customers, but using game elements such as dynamics, mechanics, and components to make an existing experience, like a task, business process, or software tool more fun, engaging, collaborative,
  • 18. 17 | P a g e and rewarding. Gamification uses these motivational factors based on needs and desires to get organizational task completed. Organisational tasks with game like engagement and actions can make people excited about work and boost productivity (Wu,M. 2011). 2.3.1 Game Elements Game elements can be though if as the “toolkit” needed to build and implement successful gamification. Points, Badges, and Leader boards (PBLs) are common components within the game elements and are a seen as surface level features of gamification. PBLs are usually a good place to start when introducing gamification platforms but research suggests awarding and rewarding are not enough. Through the review of literature it would be safe to imply that if gamification initiatives are to succeed, other certain aspects must be considered. The two key questions that emerged where 1. What are the motivational factors that drive engagement with a product/service/process? 2. Why should gamification be taken seriously especially in a business environment? To answer these questions first we must look at the three key elements of gamification namely dynamics, mechanics and components. Figure 2-6 shows how these elements relate to each other and why they are considered the building blocks to successful gamification. Figure 2-6 Graphic: Gamification Course 2014 The research will now look at the relationship between these three elements starting with dynamics. Kim, B., (2012) states that “the power of game dynamics stems from the fact that it requires meeting relatively simple conditions in return for attainable rewards. Then gradually, the tasks become complicated and more challenging for bigger rewards”. This could conceivably be considered the meaning behind the game.
  • 19. 18 | P a g e Game mechanics refers to a set of rules, design and tools, employed by game designers, to generate and reward activity amongst users in a game that are intended to produce an enjoyable gaming experience (Senapati, L., 2013). Game mechanics are the elements of the game that makes it fun, drives the action forward, and generates user engagement. Game mechanics could reasonably be considered the momentum behind the game. Werbach, K. (2014) describes game components as specific instantiations of mechanics and dynamics and can include PBLs, avatars, collectibles, unlockables. This can be closely linked to what is considered the motivation to continue with the game. The objectives of any gamification platform or solution should be aligned directly with the business objectives and as such an understanding of the primary stakeholders is essential in creating an experience that engages users while accomplishing the business objectives (Deterding, S. et al 2012). To make the experience engaging research highlighted that three major factors must exist and be correctly positioned. These are motivation, momentum and meaning. This is achieved through a combination of carefully crafted game elements and design and a deep understanding of what motivates the users of the gamified system. Research points to the Volkswagen (2009) initiative named the “fun theory”. This initiative puts “fun” at the heart of seemly mundane tasks such as using a set of stairs or disposing of litter and turning it into an engaging and somewhat rewarding experience. Gamification practitioners have learned from this and as a result the fun theory is considered a driving factor for successful gamification and should never be far from the thoughts of any gamification designer (Werbach,K. Hunter,D. 2012). Underlying the concept of gamification is motivation. Research suggests that people can be driven to do something because of internal or external motivation (Nicholson, S. 2012). Paharia,R. (2013) adds to this by stating “Knowing what truly motivates people, and what doesn’t, enables us to create stronger engagement and true loyalty”. 2.3.2 Scope of Gamification The extremely broad and expanding range of ways gamification has been successfully utilized in recent years has led to its increase in scope. The frameworks, technologies and design expertise are readily available to introduce gamification platforms or solutions into organisations business processes. With the trajectory of gamification constantly changing some Industry experts have argued that each and every business process or problem has a “gamified” solution (Zichermann,G. Linder, J. 2013). Although this may seeman exaggerated statement it would be worth future consideration and exploration because as of yet there is no credible academic research been done on the subject. If what Zichermann,G. Linder, J.(2013) say is the case, then gamification has massive scope but the legal, moral and ethical implications of gamification put forward by Kumar,J.M. & Herger,M. (2013) could affect its future scope. As gamification is still in its infancy and not fully matured, research suggests gauging its scope may raise more questions than answers.
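To make the component layer described in section 2.3.1 more concrete, the short Java sketch below models points, badges and a leader board, the surface-level PBL components. The class names, point values and badge threshold are illustrative assumptions and do not correspond to any of the platforms discussed in this report.

import java.util.*;

/** Illustrative sketch only: the PBL components - points, badges and a leader board. */
public class PblSketch {

    static class Player {
        final String name;
        int points;
        final Set<String> badges = new LinkedHashSet<>();
        Player(String name) { this.name = name; }
    }

    /** Mechanic: reward a completed task with points, and award a badge once a threshold is crossed. */
    static void completeTask(Player p, int pointsAwarded) {
        p.points += pointsAwarded;
        if (p.points >= 100) {
            p.badges.add("Century"); // the dynamic is the longer-term progression this encourages
        }
    }

    /** Component: a leader board ordered by points, making achievement visible to peers. */
    static List<Player> leaderBoard(Collection<Player> players) {
        List<Player> ordered = new ArrayList<>(players);
        ordered.sort(Comparator.comparingInt((Player p) -> p.points).reversed());
        return ordered;
    }

    public static void main(String[] args) {
        Player a = new Player("Alice");
        Player b = new Player("Bob");
        completeTask(a, 60);
        completeTask(a, 50); // Alice crosses 100 points and earns the "Century" badge
        completeTask(b, 40);
        for (Player p : leaderBoard(Arrays.asList(a, b))) {
            System.out.println(p.name + ": " + p.points + " points, badges " + p.badges);
        }
    }
}

In a gamified BI tool the "task" would be a reporting or analysis action; the mechanics generate the activity, and the dynamic is the longer-term engagement they are designed to sustain.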
  • 20. 19 | P a g e 2.3.3 Successful Gamification Gamification has proven to be successful in many diverse business fields and because it can provide quantitative data, organisations can measure engagement with whatever process, task or tool that has been gamified. With more and more organisations realizing gamifications potential the type of data collected can lead to valuable insights for organisations. Zichermann,G.(2013) describes how in 2006 Nike introduced gamification to tackle the issue of why business had fallen to its lowest market share ownership in the influential running shoe category. By 2009 Nike had reversed the trend due in no small part to the gamification platform that featured social networking and location based technology that relied heavily on games called Nike+. Individuals who went for a run could now track the number of steps they took, calories burned and routes they ran by attaching the Nike Fuelband round their wrists. Zichermann,G (2013) goes on to explain that “once downloaded this data could be compared to that of others and the experience of going for a run became much richer”. This created a whole new level of social engagement with running challenges being issued, prizes such as electronic badges being awarded, and videos of praise from celebrity athletes for reaching certain goals. By 2012 Nike+ had over five million users. By leveraging a simple concept “beating your best time” Nike created a gamification platform that encouraged wellbeing and fitness and in turn saw its market share increased by 10% in a single year. Stanley,R (2014) looks at Engine Yard as an example of successful gamification. Engine yard is described as a platform for deploying, scaling, and monitoring applications. The company implemented a Zendesk knowledge base, but didn’t see the levels of engagement they had hoped for. To encourage participation, Engine Yard incorporated PBLs and other gamification tactics to boost participation and reward users for making contributions to the community. These actions successfully increased user-generated content for its customer self-help portal, decreasing the number of support tickets and reducing the demand on support staff. These examples show the diverse range of business processes that have benefited from gamification. The literature review will now focus on the relationships between BI and gamification and look to uncover any evidence of front-end BI tools that have been gamified. 2.3.4 Gamification platforms for Business Intelligence tools There is considerable overlap between the aims of both gamification and BI. RedCritter, who offer business solution software that enables enterprises to manage, showcase, and reward employee achievements, utilize game elements as an integral part of their social enterprise platform by incorporating Profiles, PBLs, Rewards and Skill tracking into their customers’ existing BI processes. RedCritter works with Tableau, a leading self-service BI visual data discovery tool vendor, and Microsoft Excel to provide BI and analytics. RedCritter integrates Tableau and Excel with their enterprise gamification platforms with RedCritter Product Manager, Jenness, D, (2014), claiming that this type of enterprise gamification of BI leads to “valuable insights about employee performance and engagement” and “enables self-service data visualization and behavioural insights”. Swoyer, S. (2012) states in his article for the
  • 21. 20 | P a g e TDWI that gamification has particular resonance with BI and analytics, where the search for, and discovery of, insights already has a game-like feel to it. Gamification advocates want to amplify this effect to intelligently apply game-like concepts and methods to BI and analytics. The article continues with: "It's a question of game play: of how we can make [interacting with] BI more engaging. For example, you want to get people into the flow where they're asking questions continuously, where they're following [an analysis] from one question to another. Where questions lead to insights, and vice versa" lead analyst at the Information Management company, Ovum, Madan S. (2013), identified that many BI systems resemble gamified systems in that they: “Seeks to engage business users and change organizational behaviours to improve business performance and outcomes. Gamified functions also typically generate a lot of data for analysis. The key is providing users with an immersive data experience that drives them to improve on that information through exploration and feedback.” Madan. S, recognises that gamification and BI “are both are highly complementary” and gamification can be seen as a way to further operationalize BI by embedding it seamlessly into everyday knowledge work, albeit in a competitively friendly and fun way. Research points towards the correlation between gamification and SSBI with a blog post on Decision Hacker (2012) suggesting SSBI could reasonably be defined as an early attempt to gamify the workplace this statement is also championed by Werbach, K. (2014). Its overall goal is intended to engage the workforce and align organisational behaviours through carefully designed elements. This statement may seema little premature as it is unclear that using game elements with business processes and applications can become a viable, long- term concept that meets business objectives (Madan, S. 2013). 2.3.2 Summary  There is considerable overlap between the aims of gamification and BI.  Enterprise gamification platforms are now being integrated with BI tools such as Tableau.  Gamification has been proven to increase user engagement with business processes, tasks and tools.  Gamification must be closely aligned with business objectives to be successful in the workplace.  As yet there is no credible academic research suggesting gamification can increase user engagement with individual BI tools. 2.4 Literature review conclusion The following section contains the findings from the three subject areas discussed in the literature review and how they are connected. It also gives justification for further research into the main points the report aims to address.
  • 22. 21 | P a g e 2.4.1 User Engagement User engagement with front-end BI tools has flat lined at around 22%-26% for almost a decade now. The review of literature entertains the idea that adding gamified layers to front-end BI tools could have an effect on user engagement with the given tools. What this research has attempted to reveal is that to take user engagement with front-end BI tools to the next level, organisations will need a clear strategy that makes user engagement a priority. Gamification platforms and solutions maybe one way of addressing this priority but no credible academic evidence of this is currently available. 2.4.2 Enterprise Gamification relationship with BI Many industry leaders agree that gamification may very well change the face of BI. With the emergence of enterprise gamification platforms from providers such as Badgeville, Bunchball, and Redcritter, more and more business processes have been successfully gamified. Research shows little evidence of the gamification of individual BI tools. What is more relevant is the increasing number of enterprise gamification platforms being provided for BI vendors, with particular focus on VDD tool vendors. But as this is also a very recent, and still emerging field it provides very little in the way of measurable results to support the claims that these platforms will be successful applied to BI and in particular BI front end tools. 2.4.3 Motivational Theory linked to SSBI and Gamification The Literature review revealed SSBI and gamification share a common use of the motivation theory, with the focus on intrinsic motivation, in an attempt to increase loyalty, engagement, and collaboration. The relationship and similarities between both these subject areas highlight the importance of what motivates individuals to engage with certain tasks, processes or (more importantly for the purposes of this report) BI tools. 2.4.4 Summary The key theme of the literature review clearly shows that there is considerable overlap between the aims of BI and gamification and that BI systems can indeed resemble gamification systems. Gamification platforms can generate valuable insights into user engagement and therefore would be a good starting point for exploring the idea of its potential effects on user engagement with BI tools. The literature review shows early indications that by gamifying BI tools, especially front-end tools, user engagement with the tool may very well increase. With the key theme and findings from the literature review, further research on the exploration of the gamification of BI tools and the effects on user engagement can be justified. Chapter 3: Research methodology: The purpose of this chapter is to define the type of research that was carried out through an identification and selection process and to explain the research approach, strategy and associated methods chosen for the data collection and analysis. The challenges and ethical issues that were encountered as well as the modifications that were made throughout the research journey are also presented. A discussion on the ‘reliability and validity’ of the research is provided and latterly, a conclusion is reached.
  • 23. 22 | P a g e “Qualitative and quantitative research methods have grown out of, and still represent, different paradigms. However, the fact that the approaches are incommensurate does not mean that multiple methods cannot be combined in a single study if it is done for complementary purposes” Sale, J, Lohfeld, M, Brazil, K (2002) 3.1 Selection criteria Quillan (2011) insists that it is good practice and wise to reiterate what the main objective is, as it serves to reinforce what is being measured and how it fits with the research questions. The main research objective is: “to address, and asks the question, whether the Gamification of BI tools can affect user engagement.” Specific study objectives have been formulated, which are:  To address the issues surrounding user engagement with BI tools.  To explore the gamification of BI tools and the effect, if any, on user engagement with the tools. To carry out the research on the study objectives it has been decided to use a mixed methodology which will help gather both qualitative and quantitative data. This was deemed the most appropriate approach to a study which is exploratory in nature as each approach has the potential to enhance and/or complement the other in knowledge gained on the same research problem, while each remains true to its own identity (Salomon, 1991). The mixed methodology approach adopted throughout is designed to carry out relevant and valuable research. According to Carey (1993), quantitative and qualitative techniques are merely tools; integrating them allows us to answer questions of substantial importance. 3.2 Project GamBIT This section will introduce the experimental study named Project GamBIT which forms part of the primary research for the report objective. To gain a better understanding of the research methodology it is important to have a clear understanding of what the prototype purpose is, how it was developed and how it will be used. Project GamBIT is centred on the main themes covered in the literature review, BI, user engagement with BI tools and gamification. Its objective is to address a worldwide issue of “lack of user engagement and adoption by users of BI tools (employees) throughout the business world”. The study is unique in that the concept of “gamifying” a BI tool would see an increase in user engagement with the tool. As yet this subject has lacked academic research which has resulted in a limited existing body of knowledge. Project GamBIT is a software prototype that has been designed and developed in an attempt “To apply the concept of gamification to a business intelligence tool and to evaluate what effect it has on user engagement levels” (Miller,S, 2013), I joined the study at the early
stage of testing and evaluation of the prototype. My part in the study was to aid Project GamBIT development and to gather evidence on whether or not GamBIT increased user engagement with a BI tool. To aid GamBIT application development this report will identify, describe and apply appropriate research methods to gather feedback on early versions of the GamBIT prototype, which include:
 Use of the GamBIT prototype
 Feedback on the experience
 Ideas on improvements to the prototype
The GamBIT tool was developed using the Eclipse BIRT Java platform (http://www.eclipse.org/birt/). Eclipse BIRT is an open source technology platform used to create data visualizations and reports that can be embedded into rich client and web applications. This tool has many advantages over other BI software tools and is particularly suitable for developing, dismantling, rebuilding and customising, which this project requires. It has allowed the developer, PhD student Stephen Miller, to strip back and "gamify" the tool. This was achieved by dismantling the tool's framework and reassembling the tool with additional layers which incorporated gamification. Access to the BI software and the Java developers' platform was given to provide me with a better understanding of how the GamBIT tool had been developed and what stage the project had reached. To achieve these aims an understanding of the Java code used within the developers' platform was deemed necessary. This included access to, and an understanding of, the Java files, folders, and source code used. Java code was then edited, which created a new configuration of the code and of the GamBIT front end. Appendix H shows screen dumps from the GamBIT tool. The screen dumps highlight the gamification elements added to the Eclipse platform and the process undertaken by the volunteers who took part in the gamified experiment. These steps were deemed necessary to help illustrate how the tool was gamified and what the volunteers experienced during the experiment.
3.3 Ethical debate surrounding gamification and study participation
There are ethical issues surrounding gamification, chiefly that users of a gamified system must be treated fairly and with respect. A balance must be struck between the desired actions or outcomes the gamification system is looking to achieve and the exploitation of the user. Bogost, I. (2015) has described gamification as a form of "exploitation-ware". The results of a study into the ethical debate surrounding gamification within an enterprise concluded that "Gamification could be seen as an unfair mechanism to increase productivity with no real costs. In addition, it could increase pressure on employees to achieve more or avoid being in the bottom of the list" (Shahri, A., Hosseini, M., Phalp, K., Taylor, J. & Ali, R. 2014). Some have argued that gamification can be used to confuse users and ignore what is "reality". Gamified systems that have been designed without considering the ethical issues surrounding gamification can fundamentally undermine the business objectives that they
were set out to achieve. The counter argument put forward by DeMonte, A. (2014) of Badgeville states that: "Gamification can never be successful exploitationware, because it only works when the behaviours that are motivated are behaviours that the user wants to perform in the first place. It's not some magic solution where you can manipulate users to perform behaviours against their will." As gamification matures the ethical and legal issues surrounding it will undoubtedly become clearer (Kumar, J.M. & Herger, M. 2013). For the purposes of the research carried out in this report, the ethical debate surrounding gamification was carefully considered, as there are no clear best practices relating to the subject area. The research involved groups of students who volunteered to take part in the experimental stage. There are ethical considerations to take into account and, as such, all volunteers were given an information sheet that fully explained their involvement in the study, giving the volunteers the freedom to opt out of the study at any point. Great care and consideration was taken to put volunteers at ease and to make them fully aware of what was expected during the experimental stage of the GamBIT prototype and during the interview process. The intention was to protect the confidentiality of, and give anonymity to, volunteers.
3.4 Quantitative Research
This section discusses what quantitative research is, its goals, and how this approach was applied to the aims and objectives of the report. Quantitative research is the systematic empirical investigation of observable phenomena via statistical, mathematical or computational techniques (Given, M. 2008). Quantitative research methods have been chosen as a means of "collecting 'facts' of human behaviour, which when accumulated will provide verification and elaboration on a theory that will allow scientists to state causes and predict human behaviour" (Bogdan & Biklen, 1998, p. 38). The ontological position of the quantitative paradigm is that there is only one truth, an objective reality that exists independent of human perception (Sale, J., Lohfeld, M., & Brazil, K. 2002). This type of research fits with the aims of the report in as much as it is a research method that can help facilitate the process of measuring user engagement. The approach applied to the quantitative research methods is as follows (a brief illustrative sketch of step 2 follows the list):
1. Apply the User Engagement Scale (UES) to the GamBIT prototype to measure user engagement.
2. Analyse the data collected from the UES.
3. Document the results and findings using tables, charts and/or graphs.
4. Interpret and summarise the results.
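As a brief sketch of step 2, the Java fragment below shows one way the Likert-style item responses from a completed UES form could be reduced to a mean score per engagement factor (the factors themselves are described in section 3.4.1 below). The class name, item counts and scores are illustrative assumptions; this is not the analysis code actually used in the study.

import java.util.*;

/** Illustrative sketch only: mean UES subscale scores from 1-5 Likert item responses. */
public class UesScoring {

    // Mean of the item scores a volunteer gave for one engagement factor.
    static double subscaleMean(List<Integer> itemScores) {
        return itemScores.stream().mapToInt(Integer::intValue).average().orElse(Double.NaN);
    }

    public static void main(String[] args) {
        // Hypothetical responses for one volunteer, grouped by UES factor.
        Map<String, List<Integer>> responses = new LinkedHashMap<>();
        responses.put("Perceived usability", Arrays.asList(4, 3, 4, 5));
        responses.put("Novelty", Arrays.asList(5, 4, 4));
        responses.put("Aesthetic appeal", Arrays.asList(3, 4, 4));
        responses.put("Focused attention", Arrays.asList(4, 4, 3, 4));
        responses.put("Felt involvement", Arrays.asList(5, 4, 4));
        responses.put("Endurability", Arrays.asList(4, 5, 4));

        // Comparing these means between the control and experimental groups is the
        // kind of descriptive statistic reported in Appendix A.
        responses.forEach((factor, scores) ->
                System.out.printf("%-20s %.2f%n", factor, subscaleMean(scores)));
    }
}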
3.4.1 Measuring User Engagement
To develop an approach to measuring user engagement the question of "how can we measure user engagement?" must be answered. O'Brien, H.L., & Toms, E.G. (2008) have conducted several studies focusing on the assessment of engagement and consider the following factors to be the most relevant in measuring user engagement:
 Perceived usability - user's affective (e.g. frustration) & cognitive (e.g. effort) responses
 Novelty - user's level of interest in the task and the curiosity evoked
 Aesthetic appeal - user's perceptions of the visual appeal of the user interface
 Focused attention - the concentration of mental activity, flow, absorption etc.
 Felt involvement - user's feelings of being 'drawn' in, interested and having 'fun'
 Endurability - user's overall evaluation of the IS, e.g. likely to return/recommend
Given that the factors listed were considered the most relevant for the GamBIT tool, the User Engagement Scale (O'Brien, H.L. & Toms, E.G. 2013; 2008) was chosen to collect the quantitative data. The UES has been modified to fit the needs of the GamBIT tool. Research suggests there is no "perfect" or "complete" way of measuring user engagement; several different methods could have been applied to Project GamBIT to produce the quantitative data needed for this study. Through research the UES was considered best as it addresses the most relevant factors in measuring user engagement. Others, such as the System Usability Scale (SUS), were considered, but the developer dismissed this "quick and dirty" scale as "one-dimensional", and its questionnaire is, by its own nature, quite general. The User Engagement Scale (UES) (Appendix G) was applied to the GamBIT software prototype to measure user engagement, and the quantitative data collected was used to test whether or not GamBIT increased user engagement with a BI tool.
3.5 Qualitative Research
Qualitative research methods have been chosen as a way to produce findings not arrived at by means of quantification, i.e. the UES. Qualitative research is based on interpretivism (Altheide and Johnson, 1994; Kuzel and Like, 1991; Secker et al., 1995) and constructivism (Guba and Lincoln, 1994). Interpretivism naturally lends itself to qualitative methods. It is, in its simplest form, an ideal means of exploring individuals' interpretations of their experiences when faced with certain situations or conditions (Woods & Trexler, 2001). The qualitative research will attempt to understand an area about which little is known, in this case the main theme of the report, exploring the gamification of BI tools and its effects on user engagement, and to obtain intricate details about the feelings, thoughts, and emotions that are difficult to extract and/or learn about through quantitative research methods. In this case these are the feelings, thoughts and emotions of the volunteers who took part in the GamBIT experiment. Strauss, A. and Corbin, J.'s (1998) study of the basics of qualitative research
points to the three major components of qualitative research. The three points below highlight how these components relate to this project:
1. The data, which will come from semi-structured interviews.
2. The procedures used to interpret and organise the data (coding).
3. The analytical process: taking an analytical approach to interpreting the results and findings and including these in the report.
Qualitative data analysis consists of identifying, coding, and categorizing patterns or themes found in the data. Data analysis was an ongoing, inductive process in which data was sorted, sifted through, read and reread. With the methods proposed in this report, codes are assigned to certain themes and patterns that emerge. Categories are formed and restructured until the relationships seem appropriately represented, and the story and interpretation can be written (Strauss & Corbin, 1998). The following section describes the methodological stages undertaken during the qualitative research, which can be loosely attributed to the grounded theory approach (Strauss, A. and Corbin, J., 1998).
3.6 Methodological stages
This section contains a step-by-step account of the methodological stages used to conduct the qualitative research. The methodological stages, and how they are connected, are shown in figure 3.6 below.
Figure 3.6 Qualitative research methodological stages
The first part of the process was identifying the substantive area. The area of interest for this report is the exploration of the gamification of BI tools and the effects on user engagement.
The study is about the perspective of one (or more) of the groups of people in the substantive area who comprise the substantive population; in this study, university students from the School of Engineering and Computing at UWS, Paisley. To collect data pertaining to the substantive area, conversing with individuals face-to-face by means of a semi-structured interview was considered most appropriate. The process of open coding was carried out as the data was collected. Open coding and data collection are integrated activities, therefore the data collection stage and open coding stage occur simultaneously and continue until the core category is recognised/selected. Eventually the core category and the main themes became apparent; the core category explains the behaviour in the substantive area, i.e. it explains how the main concern is resolved or processed. This project's main concern was lack of user engagement with BI tools and the core category was "whether the gamification of BI tools affects user engagement".
3.6.1 Steps involved in open coding
The following section gives an overview of the steps involved during the process of open coding.
1. The transcripts were read and first impressions noted. The transcripts were then read again with a microanalysis of each line carried out.
2. The relevant pieces were then labelled - words, sentences, quotes, phrases. This was based on what was deemed relevant to the study and included thoughts, concerns, opinions, experiences, and actions. This type of analytical process aims to address what is considered relevant to exploring the gamification of BI tools and the effects on user engagement. During this process the following possibilities were looked at:
 Repeating data.
 Surprises in the data.
 Relevance to objectives.
3. The next step focused on deciding which codes were most important and on creating categories by bringing codes together. Some codes were combined to create new codes. At this point some of the codes deemed less relevant were dropped. Codes considered important were then grouped together, allowing for the creation of the categories (a brief illustrative sketch of this grouping step follows the list).
4. The next step focused on labelling relevant categories and identifying how they are connected. Comparative analysis was used as a means of labelling. The data contained within the categories made up the content of the main results.
5. The results and analysis were written up. Memos were written throughout the entire process. This helped in the interpretation of the results and analysis, with some memos written directly after the semi-structured interviews were conducted.
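As a rough illustration of step 3 above, the Java sketch below tallies how often coded segments fall under higher-level categories. The codes and category names are invented for illustration only; the actual codes and categories emerged from the transcripts themselves.

import java.util.*;

/** Illustrative sketch only: grouping invented open codes into invented categories and counting occurrences. */
public class OpenCodingTally {
    public static void main(String[] args) {
        // Hypothetical coded segments labelled during open coding of interview transcripts.
        List<String> codedSegments = Arrays.asList(
                "badges motivating", "leader board competition", "tool easier to use",
                "badges motivating", "wanted to beat score", "tool easier to use");

        // Hypothetical mapping from open codes to higher-level categories.
        Map<String, String> codeToCategory = new HashMap<>();
        codeToCategory.put("badges motivating", "Reward and feedback");
        codeToCategory.put("leader board competition", "Competition");
        codeToCategory.put("wanted to beat score", "Competition");
        codeToCategory.put("tool easier to use", "Perceived usability");

        // Count how many coded segments support each category.
        Map<String, Integer> categoryCounts = new TreeMap<>();
        for (String code : codedSegments) {
            categoryCounts.merge(codeToCategory.getOrDefault(code, "Uncategorised"), 1, Integer::sum);
        }
        categoryCounts.forEach((category, count) -> System.out.println(category + ": " + count));
    }
}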
Chapter 4: Experimental and Interview Process: During the initial development of the GamBIT prototype a number of tests were conducted to help evaluate the prototype. An approach was made to a number of students from the School of Engineering and Computing at the University of the West of Scotland (UWS), Paisley, who had shown an interest in the work being carried out in this report. Volunteers who agreed to test the prototype were directly observed, in an attempt to observe their interaction with the prototype and with the Eclipse platform. The main areas under observation were: • Length of time to complete the tasks • Navigation of the platform • Reaction to the gamification elements After the tests were conducted, feedback was given by the volunteers, which included: • Incorporating rewards such as badges when a task is complete • Simplification of the game-based rules • Reworking of the tutorial to highlight every step of the process involved in carrying out the tasks. The time taken for the volunteers to complete the tasks varied from 50 to 75 minutes. The estimated time to be applied to the actual experiment was around 45 to 60 minutes. This gave the developer time to re-evaluate the prototype and make the necessary changes prior to the experiments being carried out. 4.2 GamBIT Experiment In an attempt to appeal for volunteers, students from the School of Engineering and Computing at the University of the West of Scotland (UWS), Paisley, were approached to take part in the GamBIT experimental study. The following section includes how the appeals were made, justification for selection, and the estimated duration of the experiment. 4.2.1 Appeal for Volunteers The GamBIT developer approached the lecturer of a 1st year class studying the module 'Introduction to Computer Programming' and asked if he could appeal to students to volunteer for the experiment. These students were familiar with the Eclipse software platform, as they were learning Java programming through the use of this platform, and therefore were familiar with the layout of the Graphical User Interface (GUI). It is worth noting that many of these students had little experience using BI tools. The second group was a 3rd year group of students who were currently studying a BI module and therefore were familiar with BI and had experience of using a BI tool. An approach was made by the researcher to the lecturer of the BI class to ask if an appeal to students from the BI class was possible. The lecturer agreed, and subsequently all students were emailed prior to the appeal to give notice of it (Appendix B). A five-minute overview of the project and the experimental study was given, and then an appeal for
volunteers was made. Students were given the opportunity to ask any questions or state any concerns. They were then advised of the time and location of the experiment and finally thanked for their time. The last group consisted of 4th year (Honours) students who were studying Business Technology. These students were chosen as they would (hopefully) provide a more critical viewpoint and assessment of the tool, as they were in the last year of their studies and had a broader experience of BI, BI applications and associated tools. One-hour time slots were booked in the UWS labs for the GamBIT experiment to take place. The estimated completion time was forty minutes. Given scope for late arrivals and varying completion times by volunteers, one hour was deemed sufficient for all volunteers to fully carry out the experiment. Further experiments were undertaken by other volunteers who showed an interest in the project. These experiments were conducted over several days in the labs at UWS. 4.2.2 Experiment Volunteers were randomly split into Group A (control - BI tool only) and Group B (experimental - 'GamBIT' tool). The random split was deemed necessary as it was a fundamental requirement of the test design under scrutiny. Both groups were issued envelopes on arrival containing a USB stick (with the Java code installed), a pen, a guide to launching the software, a guide to completing the exercise and a User Engagement Scale (UES). Group A were given USB sticks with a JAR file named NonGambit.install.data. This file, once installed, integrated new Java programming code that wrote text files (.txt extensions) to the USB stick whenever a user clicked certain buttons during each of the 6 BI tasks. Group B were given USB sticks that contained a JAR file named Gambit.Install. This file, once installed, integrated new Java programming code that applied the 'GamBIT' gamification techniques to all of the 6 BI tasks in the exercise tutorial. It also created text files for the collection of a range of qualitative and quantitative data and wrote this data to the USB stick during the experiment. (An illustrative sketch of this logging approach is given below.) The volunteers were briefed on the support available during the experiment and advised that help was available at any time from the three observers present (researcher, developer and moderator). On completion of the experiment every volunteer was thanked for their time and participation. All UES forms, USB sticks and pens were then collected, sealed in their given envelopes, and split into two piles, Group A and Group B. The data was then collected and analysed over the next few weeks. (The results and analysis are covered in chapter 5.)
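The logging code itself is not reproduced in this report; the following is a minimal sketch, with hypothetical class, file and button names, of how a button click during a BI task could be appended with a timestamp to a text file on the USB stick. Task durations of the kind reported in section 5.1.7 can later be derived from such timestamps.

```java
import java.io.IOException;
import java.nio.file.*;
import java.time.Instant;
import java.util.List;

// Illustrative sketch only: appends one line per button click to a .txt
// log file on the USB stick so task timings can be reconstructed later.
public class ClickLogger {

    private final Path logFile;

    public ClickLogger(Path logFile) {
        this.logFile = logFile;
    }

    // Record which task and which button were clicked, with a timestamp.
    public void logClick(int taskNumber, String buttonId) {
        String line = Instant.now() + ",task" + taskNumber + "," + buttonId;
        try {
            Files.write(logFile, List.of(line),
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        } catch (IOException e) {
            // Logging failures must never interrupt the participant's session.
            System.err.println("Could not write click log: " + e.getMessage());
        }
    }

    public static void main(String[] args) {
        // Hypothetical path; in the experiment the file lived on the USB stick.
        ClickLogger logger = new ClickLogger(Path.of("E:/gambit_clicks.txt"));
        logger.logClick(2, "buildDataSourceButton");
    }
}
```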
4.3 Interview process The semi-structured interviews were conducted with 4 participants who took part in the experiment. Each interview followed a similar theme based around 3 main objectives (Appendix D). 1. To understand what each participant felt about the application of gamification techniques to a business intelligence tool and to determine what effect it had on their level of user engagement. 2. To ascertain how each participant felt during the test and their reasons for feeling the way they did, and to glean further information from them over and above the survey data. 3. To gather qualitative evidence from each participant on a wide range of relevant issues concerning lack of user engagement with BI tools, and to use their quotes as to their opinions, views, suggestions and constructive criticism. Initial contact with each participant was made through a response to feedback given after the experiment, in which they expressed an interest in taking part in the interview process. An email was sent stating the following points, which were to be addressed prior to the semi-structured interviews being conducted: • Explanation of the purpose of the interview • Terms of confidentiality (the participant's informed consent was obtained voluntarily by means of a disclaimer attached to the UES) • The format of the interview • How long the interview might take • Whether they had any questions • Consent to record the session The email contained details of the proposed dates and times, approximate duration and location of each interview. Further correspondence took place until pre-determined times and dates were eventually agreed with each participant. Given the busy schedules of all the participants, each interview was conducted at a different place within the University campus and on a separate day. It was necessary to follow up with the participants as quickly as possible after the experiment was conducted, to keep their thoughts and feelings as fresh in their memory as possible. Chapter 5: Results and Analysis: This chapter will document the findings gathered from the collection of quantitative and qualitative data. It will focus on the results and analysis from the experiment and then document the results and interpretation of the semi-structured interviews that were carried out with four participants. 5.1 Quantitative data
This section contains the results and analysis of the survey data, the UES data, and the data collected relating to participants' time spent during the experiment. 5.1.1 Participant Results and Analysis The experiment attracted a total of 68 participants (n = 68), who were randomly split into one of two groups (A/B). Table 5.1 shows that there was an almost even split; slightly more participants (by n = 2) used the BI tool only (the non-gamified version), reflecting the random nature of the group split. All statistical analyses have been conducted with this slight difference in group size.

Group | Group name | No. of participants (n) | %age
A | Control group using the BI tool only | 35 | 51.5%
B | Experimental group using the GamBIT tool | 33 | 48.5%
Table 5.1 Group A/B split

Table 5.2 shows the spread among the 3 groups of participants by UWS class/course. The largest group was the 1st year students, of which 46 took part. The 3rd and 4th year students consisted of 17 and 5 participants respectively. When the initial approach was made to the 3rd year students, the class consisted of around 40 students; however, fewer than half of those invited took part (n = 17), accounting for 25% of the cumulative total. The 4th year students consisted of 5 participants (n = 5, 7%).

Participants | Frequency | Percent | Valid Percent | Cumulative Percent
1st year - Intro to Programming | 46 | 67.6 | 67.6 | 67.6
3rd year - BI class | 17 | 25.0 | 25.0 | 92.6
4th year Hons - Comp. Science | 5 | 7.4 | 7.4 | 100.0
Total | 68 | 100.0 | 100.0 |
Table 5.2 University Course distribution
  • 33. 32 | P a g e Figure 5.1 shows the spread among the groups of participants by UWS class/course in a bar chart. Figure 5.1 University Course distribution Table 5.3 shows how the three groups of volunteers were divided and allocated to the two groups (Group A/B) during the experiment by their different university courses. This helps to demonstrate the randomisation of the participants. The table shows a very close division and split between the three groups and their respective student courses. From the optimum 50/50 split the largest group (1st year students) shows a +/- 2% (52%/48%) difference, with the other groups following a similar pattern. Figure 5.2 shows the same information in a Bar Chart. Table 5.3 Group type A/B * University Course - Cross-tabulation
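The cross-tabulations in this chapter were produced in SPSS; purely as an illustration of the idea, the sketch below (with hypothetical class and field names, and made-up records) shows how a course-by-group cross-tabulation of the kind in Table 5.3 could be computed programmatically to check the group split.

```java
import java.util.*;

// Illustrative sketch: build a course-by-group cross-tabulation of the
// kind shown in Table 5.3 (the actual tables were produced in SPSS).
public class CrossTab {

    record Participant(String course, String group) {} // group is "A" or "B"

    public static void main(String[] args) {
        // Made-up records; the real data came from the survey sheets.
        List<Participant> participants = List.of(
                new Participant("1st year", "A"), new Participant("1st year", "B"),
                new Participant("3rd year", "A"), new Participant("4th year", "B"));

        // Count participants per (course, group) cell.
        Map<String, Map<String, Integer>> table = new TreeMap<>();
        for (Participant p : participants) {
            table.computeIfAbsent(p.course(), k -> new TreeMap<>())
                 .merge(p.group(), 1, Integer::sum);
        }

        // Print each cell with the within-course percentage split.
        table.forEach((course, counts) -> {
            int total = counts.values().stream().mapToInt(Integer::intValue).sum();
            counts.forEach((group, n) -> System.out.printf(
                    "%s  Group %s: %d (%.0f%%)%n", course, group, n, 100.0 * n / total));
        });
    }
}
```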
  • 34. 33 | P a g e Figure 5.2 Group type A/B * University Course - Cross-tabulation Bar Chart Summary  The grouping of participants was completely random.  There was an almost even group A/B split.  1st year students made up the majority of participants (68%).  3rd year students made up 25% of participants with a total of 17 taking part. The number was lower than expected given the class size of 40+ students. 5.1.2 Survey Information Results and Analysis The following section gives an overview of the survey information gathered from each participant prior to completing the UES (Appendix G). The data is based on the responses to four questions (Q). 1. What is your gender? 2. What is your age? 3. On average, how often have you used a business intelligence (BI) tool at work or study before? 4. On average, how often have you played any kind of video/app/mobile game before? Q1.
Table 5.4 shows the gender split, with only 8 females participating (12%) in the experimental study compared to a larger male participation of 60 (88%). To show the randomness of how males and females were allocated to their respective groups (control and experimental), Table 5.5 shows the cross-tabulated distribution. Table 5.4 Gender Split Table 5.5 Gender * Group type A/B Cross-tabulation The random nature of the allocation to test groups means that no prior consideration was made to ensure a more even distribution of males and females within the groups. Figure 5.3 highlights the lack of female participants; this was an unfortunate circumstance that was outwith the scope and control of the researcher.
Figure 5.3 Gender distribution among the represented UWS courses Q2. The age distribution of participants is shown in table 5.6 and clearly shows that the 18-24 age range represented the largest proportion (68%). A more even distribution was seen between the 25-29 and 30-39 age ranges. Table 5.6 Age range distribution Figure 5.4 shows the same data in a pie chart.
Figure 5.4 Age range distribution Pie Chart Q3. Figure 5.5 shows the cross-tabulation results that emerged when participants were asked how frequently they had used a business intelligence (BI) tool.
Figure 5.5 BI usage by University course More detailed analysis can be seen in Table 5.7. The fact that 0% of 4th year students had never used BI tools was an expected result, given that they would be considered the most experienced in using BI tools. What was surprising is that 5% of the 3rd year students had never used a BI tool, given the course content for 3rd year BI students. A high percentage of 1st year students (25%) had never used a BI tool before. A more even distribution can be seen between the 1st year students' BI tool usage of 2 or 3 times a week and once or twice before.
  • 39. 38 | P a g e Table 5.7 BI Usage by University Course Q4. When asked the final survey question about how frequently they had used video, application or mobile games before, table 5.7 shows the participants’ answers. Table 5.7 Games usage Figure 5.6 shows the same information in a pie chart. Figure 5.6 Frequency of games usage
Summary of survey data: • 68 people participated in the experiments • They were split into the 2 groups almost equally: group A (51.5%), group B (48.5%) • 3 UWS classes were selected, from the 1st year (68%), 3rd year (25%) and 4th year - Honours (7%) within the School of Engineering and Computing • The gender split was: male (88%) and female (12%) • The majority age group was in the '18-24 years' category (68%) • 44% of participants had 'never' used a BI tool before and a further 20% only 'once or twice' before • As expected, most of the participants play video/mobile games on a 'daily' or 'two or three times per week' basis (c.80%) 5.1.3 UES statistical Results and Analysis All of the UES data from the 68 participants was input into the statistical software package SPSS (version 23) by the GamBIT Developer. This allowed a wide range of statistical testing to be conducted on the survey data (Appendix A); an overview is provided in the tables, charts and statements below. The median (middle) score was found for each variable across all 68 cases. The mean of the medians was then calculated for each of the 6 sub-scales (the factors measured by the UES), as shown in tables 5.8 and 5.9 (a brief illustrative sketch of this calculation follows Table 5.8). Table 5.8 User Engagement (UE) factor scores for Group A: Control – BI tool only
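The scoring itself was carried out in SPSS; the sketch below, using hypothetical item names and made-up example responses, simply illustrates the calculation described above: the median response for each UES item across participants, followed by the mean of those medians for one sub-scale.

```java
import java.util.*;

// Illustrative sketch of the UES scoring step described above:
// median per item across participants, then the mean of the item
// medians for one sub-scale. The example scores are made up.
public class UesScoring {

    // Median of a list of Likert scores (1-5).
    static double median(List<Integer> scores) {
        List<Integer> sorted = new ArrayList<>(scores);
        Collections.sort(sorted);
        int n = sorted.size();
        return n % 2 == 1 ? sorted.get(n / 2)
                          : (sorted.get(n / 2 - 1) + sorted.get(n / 2)) / 2.0;
    }

    public static void main(String[] args) {
        // Hypothetical responses to the three Novelty (NO) items.
        Map<String, List<Integer>> noveltyItems = Map.of(
                "NO1", List.of(4, 3, 5, 4, 2),
                "NO2", List.of(3, 3, 4, 2, 3),
                "NO3", List.of(5, 4, 4, 4, 3));

        double sumOfMedians = 0;
        for (var entry : new TreeMap<>(noveltyItems).entrySet()) {
            double m = median(entry.getValue());
            System.out.println(entry.getKey() + " median = " + m);
            sumOfMedians += m;
        }

        // Sub-scale (factor) score: the mean of the item medians.
        System.out.println("Novelty factor score = " + sumOfMedians / noveltyItems.size());
    }
}
```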
  • 41. 40 | P a g e Table 5.9 User Engagement (UE) factor scores for Group B: Experimental – GamBIT The mean of all six factor mean scores for both groups can be seen in table 5.10. Table 5.10 Mean of all 6 factor mean scores for Groups A/B 5.1.4 User engagement highest ranking factors Based on the results and analysis of the mean scores, experimental group B (GamBIT) had Perceived Usability (PU) and Novelty (NO) ranked 1 and 2 respectively.
  • 42. 41 | P a g e Table 5.11 Ranking of lowest mean score by factor - Group B (GamBIT) The score in brackets at the end of each statement is the %age of respondents who either: strongly agreed (1) or agreed (2) with the statement. The PU and NO statements are: Perceived Usability (PU):  PU1 - I felt discouraged using the tool (70%) – 7th  PU2 - I felt annoyed using the tool (72%) – 6th  PU3 - Using the tool was mentally taxing (73%) – 5th  PU4 - I found the tool confusing to use (76%) – 1st  PU5 - I felt frustrated using the tool (76%) – 1st  PU6 - I could not do some of the things I needed to on the tool (74%) – 4th  PU7 - The tool experience was demanding (76%) – 1st Novelty (NO):  NO1 - The content of the tool incited my curiosity (52%) – 2nd  NO2 - I would have continued to use the tool out of curiosity (45%) – 3rd  NO3 - I felt interested in my BI tasks on the tool (61%) – 1st Group A (control group) had Perceived Usability (PU) and Endurability (EN) ranked 1 and 2 respectively.
  • 43. 42 | P a g e Table 5.12 Ranking of lowest mean score by UES Factor - Group A (control) The PU and EN statements are: Perceived Usability (PU):  PU1 - I felt discouraged using the tool (76%) – 3rd  PU2 - I felt annoyed using the tool (79%) – 2nd  PU3 - Using the tool was mentally taxing (67%) – 6th  PU4 - I found the tool confusing to use (76%) – 3rd  PU5 - I felt frustrated using the tool (82%) – 1st  PU6 - I could not do some of the things I needed to on the tool (64%) – 7th  PU7 - The tool experience was demanding (76%) – 3rd Endurability (EN):  EN1 - The tool experience did not work out the way I had thought (64%) – 1st  EN2 - I would recommend the tool to appropriate others (61%) – 2nd  EN3 - Using the tool was worthwhile (61%) – 2nd  EN4 - My tool experience was rewarding (52%) – 4th The results suggest that the participants in both groups did not find either of the tools a hindrance, demanding, or confusing in any significant way. They seemed to be able to accomplish what they were asked to without any great difficulty. The ‘Endurability’ (EN) aspect is associated with the users’ overall evaluation of the experience, its worthiness and recommendation value for others to use the tool. For the control group this factor was ranked 2nd highest. Interestingly, EN4 ranked lowest and suggests their experience could have been more rewarding with EN1 indicating the overall experience could have been better.
The Novelty factor, which is associated with the curiosity the tool evoked, interest levels, and surprise elements, ranked higher for GamBIT users. This suggests that GamBIT users were more interested in the BI tasks they were asked to complete. Interestingly, over half the experimental group (52%) stated that the content of the tool incited their curiosity (NO1). Given that gamification aims to make tasks more fun, engaging and intrinsically motivating, the results reflect the developer's attempt to add these elements to the gamified BI tool. The statement 'I felt interested in my BI tasks' scored 61% with the GamBIT group, compared with only 45% for the control group. The difference of 16 percentage points, a relative increase of roughly 35% (16/45 ≈ 0.36), can be seen in table 5.13. This can be interpreted as a significant difference in the level of interest shown by the two groups. Table 5.13 NO3 ranking score Group A/B 5.1.5 User engagement lowest ranking factors The Focused Attention (FA) factor scored lowest for both groups. The FA factor is associated with the concentration of mental activity, including elements of flow, absorption and time dissociation in the tasks. The results highlight that participants appeared to be more concerned with their tasks than with the actual BI tools. This suggests that the gamification elements did not fully absorb the participants and that they seemed more focused on task completion.

Focused Attention (FA) statement | BI tool only (A) %age | Rank | GamBIT (B) %age | Rank
FA1 - When using the tool, I lost track of the world around me | 21% | 7th | 21% | 7th
FA2 - I blocked out things around me when using the tool | 30% | 5th | 30% | 5th
FA3 - My time on the tool just slipped away | 45% | 3rd | 45% | 3rd
FA4 - I was absorbed in my BI tasks | 54% | 2nd | 58% | 1st
FA5 - I was so involved in my BI tasks that I lost track of time | 58% | 1st | 45% | 3rd
FA6 - During this experience I let myself go | 33% | 4th | 33% | 4th
FA7 - I lost myself in the tool | 22% | 6th | 22% | 6th
Table 5.14 Comparison of FA scores by Group A (control) / B (experimental)

5.1.6 Summary of UES data • The highest rated variables (statements on the UES survey) differed between the two groups, i.e. Group A did not find the BI tool frustrating (82% strongly agreed or agreed), while Group B did not find the GamBIT tool confusing, frustrating or demanding (76%) • Novelty ranked high with the GamBIT group and showed a significant difference in results from the control group. • Perceived usability was ranked highest by both groups. • The lowest ranked factor for both groups was focused attention. This suggests participants were not fully absorbed in the gamification elements, with task completion being of higher importance. 5.1.7 Time taken to complete tasks results and analysis This section shows the results from the data collected relating to the time taken to complete each task and includes the optional task 6 results. This section also questions whether the gamification of a BI tool places additional time constraints on participants. The results from Group A (control - BI tool) revealed that Task 2 (T2 - building a data source) was completed in the quickest time, at 2 minutes and 11 seconds. Two tasks took on average over 8 minutes, i.e. T4 (formatting the data) and T6 (creating a report title), with T4 taking the longest time to complete at 8 minutes and 36 seconds. The results from Group B (GamBIT) revealed that Task 2 (T2 - building a data source) was also the quickest, at 2 minutes and 30 seconds, the same task as for Group A, only a little slower (by 19 seconds). Task 6 (T6), creating a report title, was an optional task. Given that it was introduced at the end of the experiment, participants by this point may have been somewhat disengaged. It is a good gauge to measure whether the participants were still
engaged in the tasks. For Group B, T6 took the longest time to complete at 8 minutes and 06 seconds, some 30 seconds quicker than the control group (A), which is a good result for the research. The overall mean times to complete all six tasks are detailed below: • Group A (BI only) - 32 minutes 31 seconds • Group B (GamBIT) - 30 minutes 25 seconds • Time difference - 1 minute 54 seconds (in favour of GamBIT) To answer the question of whether the gamification of a BI tool places additional time constraints on participants, the evidence of the time differences shows that there are no significant time disadvantages or distractions. The results show that the opposite appears to be true, as the times to complete tasks were quicker, which is a positive result in regard to the research. 5.1.8 Summary of time taken to complete tasks • The participants who used the GamBIT tool took less time to complete the six tasks. • The GamBIT group had more participants complete the additional task (n = 16). • Task 4 took the longest to complete. • Using the GamBIT BI tool led to tasks being completed more quickly compared to the non-gamified tool. 5.2 Qualitative data This section will give an overview of each of the interviews conducted and report on the key findings under each of the main categories. The interviews were based on the experiences of each participant when carrying out the GamBIT experiment. They looked to glean more information over and above the quantitative data collected by the application of the UES. To explore key issues further, questions were based on: • Their experiences with BI tools in general. • Their thoughts on gamification, in particular the gamification of BI and BI tools.
• Their experiences of the use of BI in the workplace, with a focus on any issues, obstacles and concerns. • Their thoughts on user engagement with BI tools. Full transcripts of all four interviews can be seen in Appendix C. The following section will report key findings under each of these main categories: • Game Elements • GamBIT experiment • Concept • Enterprise Gamification • Gamification of BI tools • User engagement A snippet of the coding process is provided to give a clearer understanding of how the results of the coding were analysed and then interpreted. Table 5.2.1 Sample of coding classifications, taken from a Microsoft Excel file. 5.2.1 Participant A Game elements The participant found the gamification elements added to the BI tool an unwanted distraction, taking them away from completing the tasks, stating that "I never really paid attention" and "I never looked at the leaderboard, never read it to see what it said". When discussing what the most important features of BI tools were, their response was "functionality of the tools is most important". All of which suggests the gamification elements were not as important to this participant as actually completing the given tasks. GamBIT The participant stated that coming into the experiment "I wasn't looking to enjoy it." The Eclipse platform lacked the visual elements (aesthetics) needed to keep them engaged with the task, and they experienced issues with the platform layout: "I think it was not very user friendly everything was clumped together. I lost one of the elements when carrying out the task of sorting and it proved hard to find. I could not move the element back to where they should be". This proved to be a major issue with the Eclipse platform. This suggests that the
participant is a visual person who likes software platforms that have a familiar GUI and are easy to navigate. It would be safe to assume that the Eclipse platform was not as user friendly or aesthetically appealing as other BI platforms they had used. This contributed to a lack of engagement with the gamified BI tool. Concept The concept of the mountain climber "bagging a ben" was not something they were particularly interested in. The following quote highlights this: "Maybe if it was something different (concept), as bens and mountains I am not interested in. Maybe if it was focuses along with something that interested me a bit more maybe I would have focused but I just clicked through it". This suggests that if the concept had been more tailored to them, the overall experience could have been more engaging. Enterprise Gamification The participant stated a personal view on how enterprise gamification could benefit an organisation: "it would really depend on staff's attitude to the software or tools". Asked if this form of gamification could increase user engagement in a BI environment, they stated "I don't think it is going to create engagement personally". Gamification of BI tools When asked about their wider views on gamifying BI tools, the participant stated "I think BI tools are used by professional who know how to use them and realise how critical the information is. It would be good for learning (gamifying a BI tool)… like teaching people to use the BI tool. So for learning purposes yes, but on the whole may slow people down". The participant explored the idea of gamification as a possible aid in learning to use BI tools: "As a lot of these new tools can be frustrating and maybe having a pop-up or reward saying you have achieved may help out there. I see its place as a teaching aid for a new tool. But using the tool for a long period of time may get more people annoyed". User engagement On the subject of user engagement with gamified BI tools, the response was "personally it is not something I would engage with I don't think, it's not something that if added to a (BI) tool, especially a tool I was not keen on using, would make me use it". When describing their feelings during the experiment they said "I don't think I was overly engaged or lost track of myself in it" and, commenting on the concept, "using mountains just didn't engage me". It is clear that the participant actively "disengaged" with the gamified BI tool; therefore the tool had no positive effect on their user engagement. The following points stood out when writing up the memos: • To engage users, visualisation through the use of colours was important. • The gamification concept has to resonate with each individual user and provide a variety of game-based activities that appeal to them.
• Gamified elements can be intrusive if not designed correctly. • The participant has experience using VDD tools and SQL reporting and querying tools. • The participant did not fully engage with the GamBIT tool. It was seen as an unwanted distraction. • Functionality of the tools is more important than gamification layers. The following links were established: • Gamification platforms and their ability to aid learning/education. • Links between good quality data and effective decision making. 5.2.2 Participant B Game elements When asked their opinion on the game elements, the participant responded "I did like them" and, commenting on the aesthetics, stated "I quite liked them". The overall responses to the game elements were surprisingly enthusiastic, with the elements having a positive effect on engagement with the gamified BI tool. The participant focused on the leaderboard element, stating "If I was not going further up that leaderboard I probably wouldn't have kept going as long as I did" and elaborated that "It did want to make you keep going and get further up that leaderboard. I wanted to get further up that leaderboard". They reflected on their motivation to continue, saying "I think you know most people are competitive even if they don't like to think they are. I don't think I am competitive but I think I probably am". This hints that the game elements allowed the participant to "find something out about themselves" and highlights their extrinsic motivation in commenting "I was motivated because of the reward, which sounds greedy!" The participant liked the novelty element, with responses like "I wasn't expecting that" and "I was a bit surprise when the pop-ups appeared". They felt involved in the tasks, stating "when you feel you are getting near to that level you feel like doing that wee bit more then you get badges and things and it makes you feel good." GamBIT Experiment The participant expressed frustration at not being able to complete the experimental tasks and declared "I was quite annoyed when you said stop… I wanted to keep going". When asked if the tasks were fun and engaging, they pointed out "yes I was quite into it". Their overall experience was a positive one, and they stated that "it was good I liked it...I liked the fact that it broke it down into quite small segments. It broke it down into nice little bits". Concept
  • 50. 49 | P a g e The concept of “bagging a ben” appealed to the participant “I like walking and stuff like and it’s quite my thing” but did add “I know probably not a lot of women wouldn’t engage in that as it is quite a macho thing the whole mountain climbing, you might need to have a girly girl version (aimed at females) with shoes and handbags”. This highlights that the concept must have “meaning” for the user and be tailored to their needs and experiences. Enterprise Gamification When asked about the idea of implementing a gamification platform in the workplace, the participant responded “In my workplace then no, not really. Sometimes you can’t make things fun and engaging… things are what they are. But the more fun and engaging the better? Yes.” This was a mixed response which can be construed as a mix of lack of employee engagement and a desire to have a workplace that is more fun, engaging and participative. The participants thoughts on who would engage with an enterprise gamification platform they stated “I think the younger people would probably be into it…Older employees wouldn’t like it as it would think it would be reporting on how they are doing their job and stuff like that. A lot of older people are... more worried about how you are judged in the workplace”. They raised concerns about this type of implementation by saying “I can’t see how you can use gamification in a work context. I think it was really good for what we are doing (experiment) when learning through different steps the workplace would not be happy about stuff like this… peoples jobs being so insecure it would worry people”. Another concern indicated that enterprise gamification “could be negative. It makes you start thinking I am not doing very well, I am not doing this”. Gamification of BI tools The overall feeling on the gamification of BI tools was positive “I liked that (the gamified tool) broke it down… I think as a studying tool it would be outstanding it would be really, really good. I could really see it working as that”. But concerns were raised around the use of BI tools as previous experiences with BI tools highlighted that “no training was provided” and stated that BI tools should be “made simpler”. User engagement When asked “did you feel engaged with the BI tool?” The Participant response was “Yes, totally”. The feedback received during the experiment was a major positive for the participant commenting that “everybody wants to be praised don’t they … makes you feel good when someone says something nice to you. If you are getting told something good then it is positive affirmation so you want to keep going” this confirms their belief that feedback is a main driver in increasing user engagement. The following points stood out when writing up the memos  To engage user’s, competition, challenge and reward are important.
  • 51. 50 | P a g e  The participant liked the concept of the gamified tool but did say that it may not be to everyone’s liking especially females.  Gamified elements engaged them and made them want to continue especially to get the additional task done.  They were both intrinsically and extrinsically motivated at various points during the experiment.  The participant has experience using various BI tools in their job role (2nd Line Support)  The participant stated on several occasions that gamification platforms could have a place in a learning or educational environment.  A gamification platform is not something they could see being rolled out in their workplace mainly due to financial constraints. The following links were established  Gamification platforms and their ability to aid learning/ education especially when studying.  Social collaboration through a gamification platform. 5.2.3 Participant C Game Elements Certain game elements resonated with the participant mainly the visual aesthetics, commenting that “You get to see the images. It was like a games console where you get to see achievements for everything you have completed”. Additional game elements including “adding achievements and unlockables” would have made the BI tool more engaging and possibly introducing “individual avatars”. The participant liked the game elements that gave you your own unique score Some negative elements emerged noting the pop-ups were ”straight up in front of you” and were “It was a bit imposing at times”. This felt a little intrusive to the participant stating that it is “an element that could have given more thought to”. But the participant acknowledged that the game elements did “pull you in to the software.” GamBIT The participant was impressed with the GamBIT tool and declared that it “was encouraging I liked the feedback and progression” they felt that it “made it seem quicker to complete (the tasks) and made you feel like you wanted to continue”. When asked about the additional tasks participant responded “I found them fairly easy to complete”. Concept When asked about the “bagging a ben” concept, the participant responded by saying “I quite liked the idea where the more you do the higher up the mountain you go”. It would be safe to imply that they were somewhat engaged with the concept. Enterprise Gamification
The participant expressed little opinion on this category but did state that enterprise gamification "could show up people who need more training and help them all get to similar levels". When asked if enterprise gamification was a good idea, the response was "I am unsure about it in a business context". Gamification of BI tools The participant has experience using a number of different BI tools, i.e. Tableau, Power BI, Qlik Sense and SQL reporting tools, with varying experiences of using them. What did emerge was that with "some of the tools and software there should be more information online." When describing their experience with Power BI, the lack of online support "stopped me from progressing when using the tool…there was not much information available for that at all". Commenting on the idea of applying gamification to these tools, the participant said "I think it would be a good training aid for using BI tools". User engagement When asked if it was likely that a gamified BI tool would increase user engagement, their response was "I am not sure". The following points stood out when writing up the memos: • The gamified tool was more engaging than the non-gamified tool. (This participant had taken part in both experiments.) • The game elements pulled the participant into the software and made them want to continue exploring it. • The participant felt intrinsically motivated to continue using the gamified BI tool. • The gamified experiment seemed easier and quicker to complete. • Gamification would be best suited to the IT industry. • The participant liked the concept of "bagging a ben". The following links were established: • They would like to see the concept of gamification integrated into platforms like Yammer as a tool to aid in collaboration with colleagues in the workplace. 5.2.4 Participant D Game Elements The participant highlighted novelty as a key game element that they engaged with, stating "I was really amused by that!" when the first pop-up appeared. The element of surprise was another key element, with the participant indicating that "you are excited because you don't know what is going to happen". Other game elements such as achievement and challenge proved engaging, with quotes like "Yes is was a challenge. It was something to aim for" and "You are building here, you are climbing, and you are getting somewhere." This highlights how the participant engaged with these particular elements.
GamBIT Experiment The participant reflected on their participation and admitted that "Once I was into it I thought this is quite good!" and that they wished they had more time. When asked whether they would have continued had more time been available, they responded "Yes, regardless of how long it took me. I felt the application was making it easier for me." The following quotes are examples of enthusiastic responses to the experiment given by the participant: "I liked the thinking behind it", "It was well laid out" and "It was something I would look to learn to do." These quotes give a clear indication that the participant was engaged in using the gamified BI tool and felt that the gamification of BI tools, in a more general context, is a good idea. The participant showed an understanding of the work the GamBIT developer put into the BI tool by saying "I appreciate the work that went into it as I have used Java and I know how difficult that can be." When asked what they considered most fun and engaging, the participant replied "It was the whole package that did it for me. As a front end user I have no criticism of it". Concept The participant thought the overall GamBIT concept was "quite appropriate for both genders to do", highlighting that "it was appropriate for that. It wasn't too childish, if you know what I mean, it wasn't effeminate and it wasn't too 'blokey'" (aimed at males). They liked the "bagging a ben" concept and shared their feelings on the experiment by saying "You are building here, you are climbing, and you are getting somewhere. That's why I think it was quite poignant that it was a climber as opposed to someone doing water skiing for example". They described their emotional attachment to the mountaineer by commenting "I got involved with it. I became emotionally attached to the man going up the hill. I was saying 'that is me trying to get up that hill'". These quotes highlight how engaged the participant felt during the GamBIT experiment and how it left them with a positive impression of the idea of gamifying BI tools. Enterprise Gamification The participant believes that making the game appropriate to the application and to the population that you are working with are key components of successful enterprise gamification. They went on to acknowledge that "Businesses would need to do two things. The consultancy is the most important part they need to find out what the service is about, product knowledge. Then you would need to find out who is going to be operating this? What kind of education do they have in IT? How much training do they need? How involved in change are they? Some staff just wouldn't what the change. So consultancy to build an application would be imperative." Gamification of BI tools
When asked if gamifying BI tools to increase user engagement was a good idea in principle, the response was "personally I think it would be... I was very impressed with it. I would love to learn how to use it (gamification) and I would love to learn how to incorporate it into my databases". The participant went further to suggest that gamified tools "would be great in education", stating "this is a tool that can teach me. You are learning and not really realising how much you are taking in. It is the ease of the learning. That's the big thing". Given that the participant has used numerous BI tools in the past, the feedback given about gamified BI tools was surprisingly positive. User engagement When asked about user engagement with the tool, the participant responded positively by saying "I absolutely loved it (GamBIT tool). Was I engaged? Absolutely! It was encouraging me I was very impressed with it". The following points stood out when writing up the memos: • The participant works for the NHS and has experience of using BI tools, database development, and the procurement of IT systems within the NHS. • The participant found the reward elements very encouraging, indicating that they were extrinsically motivated to continue. • The participant was very impressed with the gamified tool and would like to incorporate something similar into their workplace. • The participant has previous experience using the Eclipse platform and had previously found it difficult to use and engage with. This was not the case with the gamified platform. • The participant seemed to be more actively engaged with the gamified BI tool compared to the others interviewed. The following links were established: • Gamification as a learning platform for using BI tools in the NHS. • The participant has used Eclipse as part of a Java programming course and as part of the experiment, with the research pointing to an increase in their engagement with the gamified tool as opposed to non-gamified BI tools they have used on this platform. 5.3 Summary of Qualitative Data Results The main finding from the qualitative research is that each participant had a different level of experience using BI tools. This gave a wider range of opinions on the subject areas. • Three of the participants found the game elements engaging, novel and fun, with one participant finding them an unwanted distraction. • Two of the participants considered themselves engaged and immersed in the experimental tasks. One participant was somewhat engaged and one participant was completely disengaged. • Each participant stated that the concept had to be relevant to them as individuals.
• Three out of four participants were considered intrinsically motivated to complete the tasks. • Two participants showed signs of being extrinsically motivated due to the reward elements. • Three participants did not find either of the tools demanding, confusing, annoying or discouraging in any significant way. • Surprises in the data emerged, as each participant commented on how gamified BI tools would be a useful training aid in learning to use BI platforms, with the concept of gamification having a place in a learning or training environment. • Gamification could be used as an "instructional designer", complementing pre-existing instructions and making them better. This was a key theme that resonated from each participant's personal experiences with BI tools. A lack of online support, technical assistance and instruction on how to optimise BI tools has led participants to believe that these factors contribute to the lack of user engagement with BI tools. • Participants generally agreed that the gamified BI tool broke the tasks down into steps that could be easily followed and helped them in measuring progress throughout the tasks. Responses to the main research questions are detailed in table 5.2.2.

Research question | Participant A | Participant B | Participant C | Participant D
Given your experience using a gamified BI tool, do you think the gamification of BI tools is a good idea? | No | Yes | Yes | Yes
Would adding gamification layers to a BI tool make you want to engage more with the tool? | No | Yes | Yes | Yes
Overall, would the gamification of BI tools increase user engagement? | Maybe | Yes | Not sure | Yes
Table 5.2.2 Main research questions

Chapter 6: Conclusion: The purpose of this chapter is to present the conclusions reached from the analysis of the primary data, in relation to the participants and tools used in this project. It will look to identify any limitations associated with the data analysed, from both the experiment and the subsequent interviews that took
place, and with the conclusions reached. Finally, a summary of the main points made in this chapter will be presented. 6.1 Review of research objectives The principal objective of this project was to explore the issue of lack of user engagement with BI tools. The question it aimed to address was whether making BI tools more fun and engaging by applying gamification to a BI tool, or in other words 'gamifying' it, can lead to increased user engagement. The UES was applied to a prototype gamified BI tool (GamBIT) and a non-gamified BI tool in an attempt to produce quantitative data addressing the aim of the project: whether or not 'gamifying' a BI tool has the potential to increase an individual's motivation to use BI tools more often. Qualitative data was collected by conducting semi-structured interviews with selected participants from the experiment. This allowed the researcher to glean more information over and above the quantitative data provided by the UES. The qualitative data attempts to capture the feelings, thoughts, and emotions of participants on a number of issues related to this project. 6.2 Discussion of primary and secondary conclusions This section will discuss the conclusions drawn from the primary research within the context of the secondary research, i.e. the literature review. Its aim is to highlight how the results of the project address the main research question. The literature review highlighted that engagement with BI tools has flatlined at around 24% for over a decade. The BI industry has to address this issue or the current generation of BI platforms and tools may not reach, or be used to, their full potential. The primary research demonstrates that a lack of support for, and the complexity of, BI tools and platforms leads users to become frustrated and de-motivated, resulting in users being actively disengaged. The literature review revealed that business users viewed BI tools as complex and left the use of these tools to the power users within IT departments, leading to a big 'disconnect' between business and IT staff. The primary research supports the theory that providing BI tools with little or no support, training or technical assistance will ultimately see the user look for alternative BI tools to engage with. Furthermore, users may become completely disengaged with BI tools in general, compounding the problem of user engagement with BI tools. This has been a major issue for the BI industry as a whole, with only 24% of those using BI tools considered engaged. With the number of BI tools now available, and the competition for market share, vendors who are more customer centric will see engagement with their products increase. On a more positive note, when asked "would adding gamification layers to a BI tool make you want to engage more with the tool?" and "given your experience using a gamified BI tool do you think the gamification of BI tools is a good idea?", three of the four participants interviewed (75%) responded positively to these questions. Given these results it would be safe to
suggest that on this occasion the gamification of BI tools would see engagement rise above the average 24% of actively engaged users. The primary research results show an increase in key areas of user engagement with the gamified BI tool as opposed to the non-gamified BI tool. The study found that there was a significant difference in the level of interest shown by the two groups when responding to the statement 'I felt interested in my BI tasks'. The score of 61% for the GamBIT group, compared to the control group who rated this statement at only 45%, resulted in a difference of 16 percentage points, a relative increase of approximately 35%. This is evidence that the gamified BI tool did increase user engagement in this key area and suggests a potential increase in users' motivation to use the tool again. These findings help answer the main research question, whether the gamification of BI tools can affect user engagement. The lowest ranked UES factor for both groups was focused attention. This suggests participants were not fully absorbed in the gamification elements, with task completion being of higher importance. This was certainly true for one of the interview participants, with the functionality of a BI tool, i.e. the user interface, visual aesthetics and platform layout, considered more important than having a gamified BI tool that promotes engagement and collaboration. The participant went further to say that BI tools should not necessarily be made fun and engaging, as users who come into regular contact with BI tools only use them to "get the job done". This argument suggests that implementing a gamification platform must not detract from the actual task or process being carried out, and links back to the strategy of closely aligning the objectives of the gamification platform with those of the business. The research demonstrated that user engagement with the gamified BI tool made task completion quicker compared to the non-gamified BI tool (a time difference of 1 minute 54 seconds in favour of GamBIT), indicating that participants were more engaged with the task at hand than those who had used the non-gamified BI tool. Although this was not one of the objectives outlined at the start of the project, it does highlight that active engagement with a gamified BI tool can result in quicker task completion. The clear time difference was surprising, as initial indicators suggested that the experimental group might have taken longer to complete the tasks, given the game elements that were introduced throughout the experiment. It also answers the question of whether the gamification of a BI tool places additional time constraints on participants. 6.3 Limitations placed on project Given the limitations and restrictions placed on this project, it cannot be conclusively proved that gamifying a BI tool under different conditions and/or in a different environment would produce similar results. As Project GamBIT was a unique experimental study conducted within the constraints of an academic environment involving students, it is hard to say whether a similar experiment, conducted in a workplace environment, would have produced similar results. Participants consisted of students from the School of Engineering and Computing, who may be considered somewhat engaged with BI tools prior to the research being carried out. The possibility of using volunteers from a different school, i.e. the School of Business and
Enterprise within UWS, may have given the project a wider demographic and produced different results. Business and Enterprise students are assumed to have less experience with BI tools than students from the School of Engineering and Computing, but could be exposed to front-end BI tools in the future given their chosen career paths, and would have been worthwhile participants in this study. The data was produced for only one individual BI reporting tool and does not include any findings or results from alternative BI tools and BI platforms. This resulted in no comparisons being made with other BI tools. The lack of research in this area also meant that the findings could not be compared with those of other studies. The researcher was restricted to using the tool developed for the experiment and, although a contribution was made during development, had no involvement in the programming or conceptual design of the GamBIT tool. The researcher did contribute to issues of functionality, aesthetics and game elements. The qualitative data was collected from only four volunteer participants, which placed limitations on the data. To give a comprehensive overview of the experiment, ideally all 68 participants would have needed to contribute to the qualitative research. This was unrealistic given the project timescales and the accessibility of participants. In some cases the interviews were carried out approximately 3-5 days after the experiment. Participants had time to reflect on their experience and build on their perceived version of events. One of the interview participants did raise this issue with the researcher. Consideration was given to this and similar issues when analysing and interpreting the qualitative data. 6.4 Future research work As yet, no credible academic research has been done in this area, therefore no comparisons can be made with other studies. The literature review found many reasons why user engagement with BI tools is low but did not find any clear solutions to this global problem. This project hopefully gives some answers to the issue of lack of user engagement with BI tools and can be a basis for future research in this area. The results of this study suggest that gamifying BI tools would be a good training aid when learning to use BI tools or engaging in BI tasks or activities. The idea of gamification being an "instructional designer", complementing pre-existing instructions, could encourage a behaviour or attitude that would increase user engagement. Game elements such as reward, novelty, challenge and competition particularly resonated with the majority of participants who took part in the interview process. They agreed that gamification would be a good training aid to monitor progress, motivate users, and encourage collaboration. All of the interviewees agreed that gamification could have a place in a learning or educational environment. This has been the subject of much academic research recently and has been championed by gamification vendors as the "gamification of learning". The findings support this argument and no doubt further academic research will be conducted in this area.
Enterprise gamification platforms that integrate with BI tools such as Tableau and Excel have not provided any concrete statistical evidence to support their claims of increased user engagement with BI tools. This is one of the issues this project tackles: to produce evidence of whether gamifying a BI tool increases user engagement. As gamification is a recent trend, and enterprise gamification platforms are in their infancy and not fully matured, conclusive evidence of gamification's potential to increase user engagement may only become available once the hype surrounding gamification plateaus and the industry finds its feet. 6.5 Summary • A lack of support for, and the complexity of, BI tools leads users to become frustrated and de-motivated, resulting in users becoming actively disengaged. • If done correctly and directly aligned with the overall business objectives, the gamification of BI tools could see engagement levels rise above the average 24% of actively engaged users. • Enterprise gamification is still very much in its infancy and, although it has its champions, no conclusive evidence for claims of increased user engagement with BI processes has been found. Results from the qualitative data collected highlight that the interviewees generally agreed that, in their place of work, introducing a gamification solution would not be high on the list of priorities for the business. • The experiment produced some surprises in the results, with the gamified BI tool reducing task completion times compared to the non-gamified tool. • The research suggests that gamification can be a worthwhile contributor to learning and developing BI skill sets. Chapter 7: Critical evaluation: The honours project has tested me on many levels, as I expected, but has pushed me to the limits of my academic ability. From the beginning, serious thought had to be given to the approach to the overall project. 7.1 Reflecting on the initial stages of the project The initial draft specification, which was produced over the summer period, looked to explore the use of BI within the charity sector, as this was an area of interest to me given my working involvement in the sector. This changed when my supervisor approached me to consider being part of an ongoing study by UWS PhD student Stephen Miller, to aid in the development of a prototype gamified BI tool, namely Project GamBIT. After researching the subject areas project GamBIT was involved with (BI, user engagement with BI tools and gamification), I decided that this would not only be very challenging and rewarding but would also allow me to explore an area that up until then I had never heard of, namely gamification. This was the basis upon which I agreed to tackle this unique study. There were obvious advantages to this, mainly having access to the knowledge and experience the PhD student had in undertaking such a project, and also having the opportunity to explore the subject area of gamification, which I considered an interesting and stimulating topic.
There were, however, many disadvantages to following this route. I found out very quickly that the scope of the GamBIT study restricted my creative thinking and my ability to explore different areas and ideas within my project. There was always an awareness that I had to keep within the scope of the GamBIT study and, even though there was always advice on how to proceed with my project, there was at points a feeling of lack of control over my own project. This was compounded in many ways, as I feel I could have developed alternative routes and raised different points of investigation and discussion during the course of the project. These constraints became more evident towards the end of the project, as there was some debate on what results I could use from the quantitative data that was collected and what alternative methods I would use to differentiate my project from that of the PhD student. Reflecting on this, the initial honours project idea may have allowed a more creative approach and a feeling that I was more in control of the overall project. The project supervisor acknowledged that in future a different approach may be needed when an Honours student becomes involved in a PhD study. The main lesson I can take away from this is to ensure that clearly defined objectives, responsibilities and roles are set out from the beginning when committing to future projects.

On a more positive note, the challenge of taking on such a huge project was daunting but at the same time exciting. Keeping a high level of motivation and setting targets helped to keep the project on track and prevented me from procrastinating. I felt the management of the project was done in a professional manner, with milestones and deadlines adhered to throughout. I considered this one of my main strengths throughout the project and will continue to use this style of project management as a blueprint for future projects.

7.2 Approach to project

My approach was to treat this as a full-time project and put in the necessary hours each week in order to keep the project on track and enhance my learning of the subject areas. A lot of work was done at home and, given that I have two daughters, aged one and eight, to look after, finding time to meet my targeted hours each week was challenging. I understood that certain sacrifices had to be made and at points sleep was minimal. There was rarely a point when I felt I could not continue with the project, as the motivation to succeed, provided by my daughters, kept me focused.

7.3 Honours year modules

The honours year modules helped to expand my knowledge of the main areas of the project, especially the data warehouse environment module. This module was key to understanding BI and exploring user engagement with both BI platforms and tools. Given that the module was based around the work of BI industry expert Cindi Howson, it was a constant source of reference during the early stages of the project.

7.4 Project aids

Numerous tools were used throughout the project, many of them familiar to me, such as the Microsoft Office Suite and Gliffy (a web-based diagram editor). Mendeley (a web program for managing research papers and discovering research data) was discovered during the initial stages of the literature review and was invaluable in the organisation, retrieval and storage of research documents.
Mic Note (an audio recorder and notepad in one tool) was another tool discovered during the course of the project and was used extensively during the interviewing process. These tools enabled the project to run more efficiently and effectively, allowing more time to be spent on the key areas of the project. If I undertake another research project in the future, all of the above tools would prove useful.

7.5 Summary

Overall, the honours project has tested my academic capabilities to the limit and will be a process I will no doubt reflect upon in the future. The project has helped me develop in many different ways: examples include being exposed to experimental processes, having the confidence to conduct face-to-face interviews, and sourcing participants for both the interviews and the experiments, none of which I had done before and all of which took me outside my comfort zone. I will take the lessons learned from undertaking such a huge project with me going forward, to hopefully achieve my personal goal of carving out a long and successful career in the IT industry.

References

Attfield, S., Kazai, G., Lalmas, M. & Piwowarski, B. (2011). Towards a science of user engagement (position paper). Paper presented at the WSDM Workshop on User Modelling for Web Applications, Hong Kong, China.

Azvine, B., Cui, Z. & Nauck, D.D. (2005). Towards real-time business intelligence. BT Technology Journal, 23, pp. 214–225.

Bogdan, R.C. & Biklen, S.K. (1998). Qualitative Research in Education: An Introduction to Theory and Methods (3rd ed.). Needham Heights, MA: Allyn & Bacon.

Bogost, I. (2011). Persuasive Games: Exploitationware. http://www.gamasutra.com/view/feature/6366/persuasive_games_exploitationware.php [Accessed October 2015]

Carey, J.W. (1993). Linking qualitative and quantitative methods: integrating cultural factors into public health. Qualitative Health Research, 3, pp. 298–318.

Dale Carnegie Training (2012). What Drives Employee Engagement and Why It Matters. White paper, Dale Carnegie & Associates, Inc.

Davenport, T.H., Barth, P. & Bean, R. (2012). How "Big Data" is different. MIT Sloan Management Review, 54(1), pp. 22–24. Massachusetts Institute of Technology.

Decision Hacker (2012). Gamification and Gamified Business Intelligence. Blog post. http://decisionhacker.com/2012/11/08/gamification-and-gamifiedbusiness-intelligence/

Deloitte (2015). Global Human Capital Trends 2015: Leading in the New World of Work, p. 112. Graphic: Deloitte University Press.
DeMonte, A. (2014). Badgeville on gamification and the psychology of motivation. https://badgeville.com/adena-demonte-badgeville-on-gamification-andthe-psychology-of-motivation/

Deterding, S. (2012). Gamification. Interactions, 19(4), p. 14. Available at: http://doi.acm.org/10.1145/2212877.2212883; http://dl.acm.org/ft_gateway.cfm?id=2212883&type=pdf

DiSanto, D. (2012). Time to insight. http://www.digitalistmag.com/technologies/analytics/new-kpi-time-to-insight-017026 [Accessed October 2015]

Dresner Advisory Services (2012). Wisdom of Crowds Mobile Computing / Mobile Business Intelligence Market Study 2012, November, pp. 1–76.

Duggan, K. & Shoup, K. (2013). Business Gamification for Dummies. John Wiley & Sons, Inc., Hoboken, New Jersey.

Eckerson, W. (2010). Performance Dashboards: Measuring, Monitoring, and Managing Your Business. Wiley.

Gallup (2013). State of the Global Workplace: Employee Engagement Insights for Business Leaders Worldwide. Gallup HQ, Washington. [Accessed November 2015]

Gartner, Inc. (2013). Business Intelligence. http://www.gartner.com/it-glossary/business-intelligence-bi/ [Accessed October 2015]

Given, L.M. (2008). The Sage Encyclopaedia of Qualitative Research Methods. Los Angeles, Calif.: Sage Publications.

Howson, C. (2014). Successful Business Intelligence: Unlock the Value of BI & Big Data. McGraw-Hill Education.

Howson, C. (2014). BI Scorecard: Successful BI survey. http://biscorecard.typepad.com/biscorecard/2014/04/bi-adoption-remains-flat.html [Accessed 01/11/2015]

Inmon, W.H. (2005). Building the Data Warehouse (4th ed.). Wiley & Sons.

Jenness, D. (2014). RedCritter. https://www.redcritterconnecter.com/home [Accessed November 2015]

Kim, B. (2012). Harnessing the power of game dynamics: why, how to, and how not to gamify the library experience. College & Research Libraries News, 73(8), pp. 465–469. Available at: http://search.proquest.com/docview

Kimball, R. & Ross, M. (2002). The Data Warehouse Toolkit: The Complete Guide to Dimensional Modelling. Wiley Computer Publishing.

Kumar, J.M. & Herger, M. (2013). Chapter 8: Legal and Ethical Considerations. In: Gamification at Work: Designing Engaging Business Software. https://www.interaction-design.org/literature/author/janaki-mythily-kumar [Accessed October 2015]
Lohr, S. (2012). The Age of Big Data. New York Times, Sunday Review (news analysis). http://www.nytimes.com/2012/02/12/sunday-review/big-datasimpact-in-the-world.html [Accessed October 2015]. A version of this news analysis appears in print on February 12, 2012, on page SR1 of the New York edition with the headline: The Age of Big Data.

Madan, S. (2013). Lead Analyst, Information Management, Ovum. http://www.appstechnews.com/news/2013/feb/12/jury-still-out-on-value-of-bigamification/ [Accessed November 2015]

Miller, A.S. (2013). Transfer Event Report MPhil/PhD to PhD. What's the BIG idea? Business Intelligence using Gamification – Evaluating the Effects on User Engagement. School of Engineering & Computing, University of the West of Scotland, Scotland.

Miller, S. & McRobbie, G. (2013). Business Intelligence Tools – Should they be 'gamified'? Project 'GamBIT': Evaluating user engagement of a business intelligence tool. University of the West of Scotland (UWS), Paisley, Scotland.

Nicholson, S. (2012). A User-Centred Theoretical Framework for Meaningful Gamification. Paper presented at Games & Learning & Society 8.0, Madison, WI.

O'Brien, H.L. & Toms, E.G. (2013). Examining the generalizability of the User Engagement Scale (UES) in exploratory search. Information Processing and Management, 49, pp. 1092–1107.

O'Brien, H.L. & Toms, E.G. (2008). What is user engagement? A conceptual framework for defining user engagement with technology. Journal of the American Society for Information Science and Technology, 59(6), pp. 938–955.

Paharia, R. (2013). Loyalty 3.0: How to Revolutionize Customer and Employee Engagement with Big Data and Gamification. McGraw Hill Professional.

Quillan, C. (2011). Business Research Methods. Andover: South-Western Cengage Learning.

Sale, J., Lohfeld, M. & Brazil, K. (2002). Revisiting the quantitative–qualitative debate: implications for mixed-methods research.

Senapati, L. (2013). Boosting User Engagement through Gamification, p. 5. Available at: http://www.cognizant.com/InsightsWhitepapers/Boosting-UserEngagement-through-Gamification.pdf

Shahri, A., Hosseini, M., Phalp, K., Taylor, J. & Ali, R. (2014). Towards a code of ethics for gamification at enterprise. In The Practice of Enterprise Modelling (pp. 235–245). Springer Berlin Heidelberg.

Stanley, R. (2014). Top 25 Best Examples of Gamification in Business. http://blogs.clicksoftware.com/index/top-25-best-examples-of-gamification-in-business/

Strauss, A. & Corbin, J. (1998). Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory (2nd ed.). Sage Publications Ltd, London, UK.
Scheps, S. (2008). Business Intelligence for Dummies. John Wiley & Sons, Inc., Hoboken, New Jersey.

Swoyer, S. (2012). Making BI analytics fun. https://tdwi.org/articles/2012/04/17/making-bi-analytics-fun.aspx

Tableau Software. Tableau Business Intelligence. http://www.tableau.com/business-intelligence [Accessed November 2015]

Volkswagen (2009). Fun Theory. http://www.thefuntheory.com/ [Accessed October 2015]

Werbach, K. (2014). Gamification course. http://www.slideshare.net/mikederntl/gamification-of-learning-design-environments-workshop [Accessed October 2015]

Werbach, K. & Hunter, D. (2012). For the Win: How Game Thinking Can Revolutionize Your Business. Wharton Digital Press, The Wharton School, University of Pennsylvania, Philadelphia.

Wixom, B., Ariyachandra, T., Douglas, D., Goul, M., Gupta, B., Iyer, L., Kulkarni, U., Mooney, J.G., Phillips-Wren, G. & Turetken, O. (2014). The current state of business intelligence in academia: the arrival of big data. Communications of the Association for Information Systems, Vol. 34, Article 1.

Wu, M. (2011). Gamification from a company of pro gamers. Lithosphere (lithium.com).

Zichermann, G. & Linder, J. (2013). The Gamification Revolution: How Leaders Leverage Game Mechanics to Crush the Competition, pp. 156–158. McGraw Hill Education, USA.

Appendix:

This chapter is the collection of the appendices.

Appendix A Descriptive statistics

Provided by the GamBIT developer – PhD student Stephen Miller

Section 2 – Descriptive Statistics

Section 2.1 – Basic testing methods

There are a number of different basic statistical tests that can be applied to any research work that utilises a Likert scale survey instrument, as in this research. The tests are primarily used to describe the sample group or to summarise information about them. The most common ones include: the mean, median and mode (measures of central tendency), the minimum and maximum scores, and the standard deviation (σ – a measure of how spread out the numbers are).
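To make these measures concrete, the following is a minimal sketch in Python (pandas) of the basic tests listed above applied to a single Likert-scale question. The column name and response values are illustrative assumptions only; they are not taken from the GamBIT survey data.

```python
import pandas as pd

# Illustrative Likert responses (1 = strongly agree ... 5 = strongly disagree)
# for a single survey question; these values are invented, not the GamBIT data.
responses = pd.Series([1, 2, 2, 3, 4, 2, 5, 1, 3, 2], name="PU1")

print("mean:  ", responses.mean())           # arithmetic average
print("median:", responses.median())         # middle score of the ordered responses
print("mode:  ", responses.mode().tolist())  # most frequent score(s)
print("min:   ", responses.min())
print("max:   ", responses.max())
print("std:   ", responses.std())            # standard deviation (spread around the mean)
```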
Section 2.2 – SPSS: Overview of the data input procedures

All of the survey data (68 responses) were input into a statistical software package known as SPSS (version 23). This tool allows a wide range of statistical tests to be conducted on the survey data, once it is in the required format, by clicking just a few buttons. Two main files were created to hold the data entries: (i) GamBIT.survey.data.sav (see Appendix A) and (ii) GamBIT.USB.data.sav (see Appendix B). Further information about each file follows:

1. GamBIT.survey.data.sav – this file was created to analyse the 28 variables (where each question on the User Engagement Scale (UES) survey is a variable). Within this scale there were sub-scales, known as factors or constructs, i.e. a group of variables that relate to a hidden or latent variable (something that was being measured). In the case of the UES there are six sub-scales with the following latent variables:

• Focused Attention – FA (7 inter-related variables or questions)
• Perceived Usability – PU (7 inter-related variables or questions)
• Endurability – EN (4 inter-related variables or questions)
• Aesthetics – AE (5 inter-related variables or questions)
• Novelty – NO (3 inter-related variables or questions)
• Felt Involvement – FI (2 inter-related variables or questions)

Once all of the Likert scale scores were input into the file for each of the 68 participants, a new 'super' variable was created for each of the six latent variables above. These new variables took the median (middle score) of the inter-related variables for each of the 68 responses, creating one variable to measure rather than trying to measure all 28 variables (questions) individually (an illustrative sketch of this computation is given at the end of this section). This makes it much easier to statistically analyse and measure the latent variables and to provide output information in a user-friendly format. The median was recommended as the best measure to use by the associated research literature because the measurement level was ordinal, i.e. a ranking order of 1 (strongly agree) to 5 (strongly disagree) based on the survey options.

2. GamBIT.USB.data.sav – this file was created to measure the 'actual' times recorded on each of the 68 USBs (memory sticks) used by the participants to complete the BI tutorial tasks.
Time variables were created to analyse the time elements corresponding to each of the 6 tasks as follows:

• Tutorial start (TS) – the time recorded at the start of the tutorial when the participant clicks the relevant buttons to 'create a new project' or, in the case of the GamBIT experiment group, whenever they entered a user name.
• Task 1 (T1) – time recorded when the participant 'creates a new report'.
• Task 2 (T2) – time recorded when the participant 'builds a data source'.
• Task 3 (T3) – time recorded once the user 'builds a new data set'.
• Task 4 (T4) – time recorded after the participant 'formats the data'.
• Task 5 (T5) – time recorded once the user 'styles the data'.
• Task 6 (T6) – this was created as an 'optional' task. Participants were advised in the tutorial that they did not need to complete it. Task 6 was designed to see if users were engaged enough to continue or, alternatively, to 'opt out'. It required users to 'create a report title' using HTML (Hypertext Markup Language) tags, which is arguably the hardest and most time-consuming of the tasks.

'Super' variables were created after all of the times were input into the file, i.e. a time taken to complete each task, for comparison purposes. This meant computing the variables by taking the time recorded on the completion of each task and subtracting the recorded time for the previous task, i.e. T1 - TS (time taken to complete task 1), T2 - T1 (task 2 time taken), and so on (a sketch of this computation is given below). A further variable, Time_taken_all6tasks, was added to provide data on the times taken by those who completed all 6 tasks, by subtracting the tutorial start time from the task 6 completion time (T6 - TS). Two other variables were created:

• UserName – listing the user names the participants entered at the start (GamBIT group only).
• Time_completed_T6 – identifying the users that did (or did not) complete task 6.
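The sketch below illustrates, in Python with pandas rather than in SPSS, the two 'compute variable' steps described above: the per-participant median 'super' variables for the UES sub-scales, and the per-task durations derived from the recorded timestamps. The item column names (FA1–FA7, PU1–PU7, etc.), the timestamp columns (TS, T1–T6) and the "HH:MM:SS" time format are assumptions made for illustration; the real .sav files may be organised differently.

```python
import pandas as pd

# --- (1) UES 'super' variables: per-participant median of each sub-scale ---
# Assumed item column names; the real survey file may label them differently.
subscales = {
    "FA": [f"FA{i}" for i in range(1, 8)],  # Focused Attention (7 items)
    "PU": [f"PU{i}" for i in range(1, 8)],  # Perceived Usability (7 items)
    "EN": [f"EN{i}" for i in range(1, 5)],  # Endurability (4 items)
    "AE": [f"AE{i}" for i in range(1, 6)],  # Aesthetics (5 items)
    "NO": [f"NO{i}" for i in range(1, 4)],  # Novelty (3 items)
    "FI": [f"FI{i}" for i in range(1, 3)],  # Felt Involvement (2 items)
}

def add_subscale_medians(survey: pd.DataFrame) -> pd.DataFrame:
    """Add one median 'super' variable per sub-scale (row-wise median of its items)."""
    out = survey.copy()
    for name, items in subscales.items():
        out[f"{name}_median"] = out[items].median(axis=1)
    return out

# --- (2) Task durations from the recorded USB timestamps ---
# Assumed columns TS, T1..T6 holding recorded times as "HH:MM:SS" strings.
def add_task_durations(times: pd.DataFrame) -> pd.DataFrame:
    """Compute time taken per task (T1-TS, T2-T1, ...) and the total for all 6 tasks."""
    out = times.copy()
    stamps = ["TS", "T1", "T2", "T3", "T4", "T5", "T6"]
    as_time = {c: pd.to_timedelta(out[c]) for c in stamps}
    for prev, curr in zip(stamps, stamps[1:]):
        out[f"{curr}_taken"] = as_time[curr] - as_time[prev]  # e.g. T1_taken = T1 - TS
    out["Time_taken_all6tasks"] = as_time["T6"] - as_time["TS"]
    return out
```

Pandas is used here only because it keeps the example short and self-contained; the logic mirrors the SPSS compute-variable steps rather than reproducing the SPSS syntax itself.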
Section 2.3 – Further SPSS files

As a consequence of the type of analyses being carried out, it was necessary to create another two SPSS files as follows:

1. GamBIT.data.experimental.sav – this file was created from the main file, i.e. the data about 'Group B: experimental – GamBIT' was copied into a new file to allow independent testing of Group B without jeopardising or compromising the Group A data.

2. GamBIT.data.factor.sav – this file was created from the main survey data file to allow the researcher to carry out various statistical tests on the data alone, i.e. structural equation modelling (SEM), factorial analysis, multiple regression, Cronbach's Alpha, etc. A screenshot of this file is shown at Appendix C.

Section 2.4 – Test results and conclusions

A number of different tests were carried out on the data files listed previously, starting with the survey data file (GamBIT.survey.data.sav). The following tests have been conducted on the data:

• Mean – a measure of central tendency calculated by summing all values in a distribution and dividing the sum by the number of cases (n = 68).
• Median – a measure of central tendency calculated by arraying all values of a distribution in ascending or descending order and taking the mid-point of that distribution. Half of all cases (50%) will be on one side of the median and half (50%) on the other.
• Standard deviation – a measure that summarises the amount of dispersion in a sample, based on the amount of variation around the arithmetic mean.
• Minimum – the minimum recorded score on a scale (in this case the minimum of the median 'super' variable scores, which can be a decimal score such as 1.50 or 2.50 rather than just 1 (strongly agree) or 2 (agree), in line with the survey gradings).
• Maximum – the maximum recorded score on a scale (in this case the maximum of the median 'super' variable scores, which can be a decimal score such as 3.50 or 4.50 rather than 4 (disagree) or 5 (strongly disagree)).
• Inter-quartile range (IQR) – the difference between the highest and lowest values in a distribution once the highest 25% and the lowest 25% of values have been removed, i.e. the difference between the third and first quartiles (Q3 – Q1).
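As an illustration of these summary measures, the sketch below computes them (including the IQR as Q3 – Q1) for one hypothetical 'super' variable, split by experiment group. The group labels and scores are invented for the example and are not the recorded results of the GamBIT experiment.

```python
import pandas as pd

# Illustrative values for one 'super' variable (e.g. FA_median) split by group;
# the group labels and scores are assumptions, not the recorded results.
df = pd.DataFrame({
    "Group": ["A (control)"] * 5 + ["B (GamBIT)"] * 5,
    "FA_median": [2.0, 3.0, 2.5, 3.5, 2.0, 1.5, 2.0, 2.5, 1.5, 2.0],
})

summary = df.groupby("Group")["FA_median"].agg(
    mean="mean", median="median", std="std", minimum="min", maximum="max",
    iqr=lambda s: s.quantile(0.75) - s.quantile(0.25),  # IQR = Q3 - Q1
)
print(summary)
```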
Appendix B Appeal for volunteers

Honours Year Research Project
Appeal for volunteers to participate in an experiment

Dear fellow students,

The research: I am a 4th year honours student within the School of Engineering & Computing at UWS and, as such, I have been working on a research project that involves the exploration of gamification (the use of game elements and design in a non-game context) and its effects on user engagement – in this case, measuring user engagement with a business intelligence tool (a tool that stores and analyses business data). The coding has finished and I now need to test it on a number of people to obtain their views and feedback.

My appeal: As a university student you will know that we all need people to help us with our studies at some point or another, so I am now appealing for as many volunteers as possible to take part in my own experiment. It would therefore be very much appreciated if you are able to help me at this crucial time in my studies. My supervisor, Dr Carolyn Begg, has advised me that the current 3rd year BI students would potentially make great volunteers as they are engaged with BI tools, and participation would give them an insight into how honours students gather primary research for their projects. I am due to speak to the class on Monday 22nd February to appeal to anyone who may be interested and answer any questions.

Your part: The testing is straightforward – you will be given an exercise tutorial with a plain-English guide and screenshots showing you what to do on one of the computers in the Lab (Room E116). Follow the guide for as long as you can or want to and complete a short survey at the end expressing how you felt about the exercise. Someone will be in the Lab during the experiment to provide assistance should it be needed. Your participation is entirely voluntary.

Time: If you complete the full exercise it should last about 50 minutes (including the time taken to complete the survey).

What next: Please advise me (email address below) or Dr Carolyn Begg if you wish to participate and we will tell you what happens next.

Thanks: Your participation is greatly appreciated.

Student: Gary Brogan (B00272662). Email – B00272662@studentmail.uws.ac.uk

Appendix C Semi-Structured Interviews

The PDF files below contain the full transcripts of the four interviews conducted. Refer to the author if further information is needed.

Interview 1.pdf Interview2.pdf Interview3.pdf Interview4.pdf
Appendix D Interview Guide

This PDF file contains the guide on how the interviews were conducted.

Interview guide - .pdf

Appendix E Project Specification Form

COMPUTING HONOURS PROJECT SPECIFICATION FORM

Project Title: An Exploration of the Gamification of Business Intelligence Tools and the Effect on User Engagement
Student: Gary Brogan
Banner ID: B00272662
Supervisor: Dr Carolyn Begg
Moderator: Dr Graeme McRobbie

Outline of Project: The main purpose of Business Intelligence (BI) is to produce timely, accurate, high-value and actionable information. As a technology, BI has been seen to be under-used and, as such, has significant untapped potential. One of the main factors contributing to it being under-used is a lack of user engagement with BI front-end tools. This is the point that this project addresses, asking whether the gamification of a BI tool can affect user engagement. Gamification is the use of game design and mechanics in a non-game context to engage users or solve problems. By applying gamification to ("gamifying") a BI tool, the research described in this project seeks to gather evidence that gamification of a BI tool can lead to increased user engagement. This project will form part of an on-going study named GamBIT, a gamified BI tool. The work carried out will aid GamBIT application development and gather evidence as to whether or not GamBIT achieved increased user engagement with a BI tool.

A Passable Project will:
• Carry out a literature review relevant to this project.
• Investigate and evaluate user engagement with BI tools.
• Produce conclusions and analytical information based on the research and evaluation findings.
A First Class Project will:
• Carry out a literature review which critically examines relevant and pertinent published works and shows a thorough understanding of the subject area.
• Conduct highly detailed and exemplary research in carrying out the project, both primary and secondary, making use of a wide variety of sources and methodologies.
• Contribute to the body of knowledge on whether gamification of BI tools has the ability to increase user engagement with BI, and highlight the implications for the future enhancement of BI using gamification, with particular focus on end-users and front-end tools.

Marking Scheme (Marks):
Introduction 10%
Literature Review 25%
Primary Research 25%
Discussion 10%
Conclusions & Recommendations 20%
Critical Evaluation 10%

Appendix F Initial GamBIT development involvement

During the initial development of the GamBIT prototype a number of tests were conducted to help evaluate the prototype. An approach was made to a number of students from the School of Engineering and Computing at the University of the West of Scotland (UWS), Paisley, who had shown an interest in the work being carried out in this report. The evaluation took the form of direct observation of volunteers who had agreed to test the prototype, with the aim of observing their interaction with the prototype and with the Eclipse platform. The main areas under observation were:

• Length of time to complete the tasks
• Navigation of the platform
• Reaction to the gamification elements

After the tests were conducted, feedback given by the volunteers included:

• Incorporate rewards such as badges when a task is complete
• Simplification of the game-based rules
• Reworking of the tutorial to highlight every step of the process involved in carrying out the tasks

The time taken for the volunteers to complete the tasks varied from 50 to 75 minutes, while the estimated time for the actual experiment was around 45 to 60 minutes.
This gave the developer time to re-evaluate the prototype and make the necessary changes prior to the experiments being carried out.

Appendix G User Engagement Scale including Research Survey

The link below gives access to the User Engagement Scale (UES), including the research survey used as part of this project. Refer to the author if further information is needed.

Questionnaire UES survey 10.2015.pdf

Appendix H GamBIT gamification elements

The following are screen grabs of the game elements taken from the Eclipse BIRT platform. They show the gamification layers that were added to the BI tool and details of the report-building process undertaken by the participants of the experiment.

Eclipse BIRT platform
On commencement of the report build process this "Welcome" pop-up appears.

Badges awarded for completing a task
Levels involved in "Bagging a Ben". Green highlights the completed tasks.

Task completion pop-up
Project GamBIT rules.

Report preview
GamBIT Leaderboard