International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018
DOI:10.5121/ijsea.2018.9604
INVESTIGATING GAME DEVELOPERS’ GUILT
EMOTIONS USING SENTIMENT ANALYSIS
Lamiaa Mostafa and Marwa Abd Elghany
Department of Business Information Systems, College of Management and Technology,
Arab Academy for Science & Technology, Alexandria, Egypt
ABSTRACT
Game development is one of the most important emerging fields of the software engineering era. Game
addiction is a present-day disorder associated with playing computer and videogames. Shame involves a
negative evaluation of the self, while guilt is a negative evaluation of a transgressing behaviour; both are
associated with adaptive and concealing responses. Sentiment analysis has made huge progress towards
understanding web users' opinions. In this paper, the sentiments of game developers are examined to
measure their guilt emotions when working in this career. The sentiment analysis model is implemented
through the following steps: sentiment collection, sentiment pre-processing, and classification with machine
learning methods. The model classifies sentiments into guilt or no guilt and is trained on 1,000 sentiments
from the Reddit website. Results show that the Support Vector Machine (SVM) approach is more accurate
in comparison to Naïve Bayes (NB) and Decision Tree.
KEYWORDS
Ethics, Game Addiction, Guilt Emotions, Software Engineering, Sentiment Analysis Model & Value
Sensitive Design
1. INTRODUCTION
Ethics is the study of value concepts such as 'bad,' 'good,' 'ought,' 'right,' and 'wrong,' as applied to
actions linked to group standards and norms. Hence, it is concerned with many issues essential to
practical decision-making (Veatch, 1977). Computer software systems sit at the centre of
modern decision making, including data/information storage and manipulation, data availability,
and the choice and formulation of 'alternatives' (Schmoldt and Rauscher, 1994). The ubiquity of
software systems in every aspect of life requires that the created environment be critically
investigated as systems are developed and deployed. A range of artificial intelligence (AI)
approaches has been proposed to represent different codes of ethics in environmental decision
systems (Thomson, 1997).
Artificial Intelligence (AI) empowers modern information technologies and the advent of intelligent
machines, which have already deeply impacted the world of work in the 21st century. Computers,
algorithms and software streamline everyday tasks, and it is difficult to conceive how many
tasks could be accomplished without them. The name behind the idea of AI is John McCarthy,
who started his research in 1955 and suggested that every aspect of learning and any other domain
of intelligence can be defined precisely enough to be simulated by a machine. The
term 'artificial intelligence' therefore entails the investigation of intelligent problem-solving behaviour in
order to build intelligent computer systems. There are two kinds of artificial intelligence: weak artificial
intelligence, where intelligence is simulated by the computer for examining cognitive processes;
and strong artificial intelligence, where computers are able to perform intellectual, self-learning
processes by means of the right software/programming, optimising their behaviour based upon
experience. This involves automatic networking with other machines, leading to a dramatic scaling
effect (Wisskirchen et al., 2017).
The term "ethics" covers any intentional action affecting others' values and lives either positively
or negatively. Software engineering is mainly perceived as a technical discipline for the
development of software. The industry personnel who contribute to professional software
development are primarily called information systems analysts, and the key concern of the
software engineering activity is the technical appropriateness of the developed products. Reed (2000)
demonstrated that roughly more than one billion people around the world rely on software systems
in the conduct of their daily lives, which has led many in the computing field to consider the
nontechnical aspects and the ethical effect of their daily decisions and the values enclosed therein.
The relation between ethics and computers can be clearly seen when computers are affected by
humans' decisions, and then people's lives are impacted by those decisions. In this manner, human
values are connected to technical decisions made by computing professionals, whether individuals,
teams, or management, during the design, development, construction and maintenance of computing
artefacts that influence other people. For example, the release of an airbag design will impact
others' lives and should be directed by an ethical decision about human values.
This paper begins with an overview of the perception of ethics in computer software systems, its
associated activities and the related technical decisions made by professionals. It then
focuses on game addiction and the accompanying guilt emotions that developers can feel
about it, through the collection of Reddit responses and, subsequently, the analysis of these
responses using a sentiment analysis model to classify whether they express guilt or not.
2. BACKGROUND
2.1. SOFTWARE ENGINEERING ETHICS
The role of ethics in software system design has recently grown in significance. Mason (1986)
presents an early viewpoint on ethical issues in the information age, classifying them into
accessibility, accuracy, privacy, and property concerns. Later views added concerns
about knowledge use in organizations (Bella, 1992) and its impact on quality of life
(Forester and Morrison, 1994). The issue is not only about malicious behaviour, such as computer
crime, deliberate invasions of privacy, hacking, software theft, and viruses, but rather the subtler
impact that software development and deployment can have on people and their
cultural, corporate and other institutions. Previous approaches to system development
concentrated on technical aspects, while current approaches give emphasis to the
consideration of all aspects of the human environment in which the systems are developed
and used.
Software engineering ethics can be viewed from three approaches. First, it can be regarded as
the activity of software engineers who are making practical choices that affect other
people in substantial ways. Second, it can act as a description of a collection of
principles, guidelines, or ethical imperatives giving guidance or legislative direction. And third, it can
be treated as a discipline that investigates the relationship between the other two senses of
ethics. Accordingly, software engineering ethics is considered to be both a set of principles and an
activity (Gotterbarn et al., 1997).
The ethical activity embedded in technical decisions should depend on an understanding of
the effects of those decisions. Technical decisions should be consciously steered by
values. Professions designate these values in Codes of Ethics, Codes of Conduct and Codes of
Practice; these codes help in many ways, including guiding the ethical choices of
the practitioner and informing the public and practitioners about the profession's ethical
standard. A good software engineering decision entails an awareness of both the technical and the
ethical dimensions. Codes of ethics are statements about the interconnection between values and
technology, meant to ensure that the activities of practicing professionals care for human values rather
than do them harm. The discipline of software engineering ethics examines the relationship
between technology and ethical principles: how it influences society, citizens, and products
(Gotterbarn, 2002).
In the nineties, a number of organizations, such as the Association for Computing Machinery (ACM),
the Australian Computer Society, the British Computer Society (BCS), the IEEE Computer
Society (IEEE-CS), and the New Zealand Computer Society, reconsidered their ethics codes
to state explicitly an ethical obligation to the public (Gotterbarn a, 1996). In particular, the IEEE
Computer Society, joined by the ACM, started to professionalize software engineering and set
initial goals as follows:
1. Adoption of Standard Definitions
2. Definition of Required Body of Knowledge and Recommended Practices
3. Definition of Ethical Standards
4. Definition of Educational Curricula (Gotterbarn b, 1996).
These goals are coherent with those of other professions, because the key
element of any profession lies in its recognition of ethical and moral responsibilities to its
clients, to society, and to the profession itself. Many professions, medicine
for example, declare these responsibilities publicly in a code of ethics and
afterwards require training in professional ethics for entering the profession. The
responsibilities formulated in the Software Engineering Code of Ethics and Professional Conduct
(SECODE, 1999) are summarised in eight principles to which software engineers shall be committed:
1 PUBLIC - act consistently with the public interest.
2 CLIENT AND EMPLOYER - act in a manner that is in the best interests of their client or
employer.
3 PRODUCT - ensure that their products and related modifications meet the highest
professional standards possible.
4 JUDGMENT - maintain integrity and independence in their professional judgment.
5 MANAGEMENT - promote an ethical approach to the management of software development
and maintenance.
6 PROFESSION - advance the integrity and reputation of the profession consistent with the
public interest.
7 COLLEAGUES - be fair to and supportive of their colleagues.
8 SELF - participate in lifelong learning regarding the practice of their profession.
The Software Engineering Institute recognised software engineering as a form of engineering
that applies the principles of computer science and mathematics to achieve cost-effective
solutions to practical software problems in the service of mankind. Cost
effectiveness here denotes investing resources for good value, satisfying the required quality within
minimised time (Ford, 1990). Even though software engineering does not meet all of the above-
mentioned marks of a profession, they are still important for understanding how ethics is linked
to software engineering. Training and specialised skills are invested in the development of products
that have a direct impact on many lives through their use in society, in other words affecting the
public's well-being. Consequently, software engineering ethics can be identified as a professional
ethics (Pour, 2000). Like medical professionals, software engineers should be aware that their
knowledge is commonly inaccessible to people outside the profession and, like lawyers, they have
special duties and responsibilities to their clients, which can be inconsistent with duties and
responsibilities towards society at large. Software engineering ethics thus shares a common
base with other professional ethics in its concern for health, safety, and welfare. However, it is
uniquely bound to the technical details of computing (SE Code, 1999).
The context to which moral rules are applied differentiates the various professional ethics,
including legal, medical and engineering ethics. Because different contexts raise
different ethical concerns, even the order in which the rules are exercised changes for each application
domain. Within the software engineering process, moral rules are assigned different importance at
different phases of the software development life cycle. An understanding agreement (informed
consent) governs the requirements phase of life-critical software. The honesty and
transparency principle is another rule that governs testing. Yet tort law should not be confused with
professional ethics in the sense of individual responsibility. If a developed product does not
meet the ideal technical goals of the profession, there is no basis for legal action until
some damage has occurred. There are three main requirements for a tort law suit: (1) a duty
is owed; (2) the duty is breached; and (3) damages are caused by the breach. One could
therefore be unethical while the law is not violated (West, 1991). In the design
phase, other moral rules may be exercised first.
Consider, for example, the design of a journal file for a library check-out system used to ascertain the status
of books and the number of additional copies to be ordered, if required. Associating the
patron's name with the book checked out has a potential for the violation of several moral rules,
such as the violation of privacy and the deprivation of pleasure, since one does not feel free to read
what one wants, perhaps causing psychological pain. Moreover, the language
selected for a life-critical system can have moral implications: if the language is difficult
to debug, write, or modify, then people are put at risk and principles of good system
design are violated by that language selection. The ethical issue that can be drawn here is that software
developers would be morally culpable for taking a project beyond their skill of language proficiency;
if, despite their awareness of the major differences between languages, they chose to stick to the more
dangerous language for other purposes such as profit, then they have intentionally violated
professional ethics.
Technical responsibility encompasses moral commitments to standards of software production and to
the software development process, whereas societal responsibility implies commitment to using
these specialised skills in the service of society. Software development is considered a
social process, and the software engineer has a two-fold obligation to strive for excellence. One
obligation concerns the fulfilment of technical standards, and the other is the social
responsibility to those end-users who will work with the developed product. These approaches to
ethics provide positive guidance for software practitioners' behaviour (Gotterbarn, 1999).
Without an ethical approach, systems might be put into operational use regardless of persistent
errors and faulty trials, which is why it is not surprising that software developers hardly ever provide their
clients or purchasers with warranties of any substance (Forester and Morrison, 1994). On the
other hand, it is up to the end-user in most cases to provide critical feedback, and the availability
of software websites and e-mail contact addresses offers avenues for end-user responses to
unethical practices.
A number of books have been published on computer ethics (also referred to as machine morality).
A hypothetical scenario is delineated in the book "Moral Machines", where an
artificially high oil price is attributed to "unethical" robotic trading systems, causing an
automated program to switch energy output from oil over to more polluting coal
power plants in order to prevent a rise in electricity prices. At the same time, coal-fired power
plants cannot withstand running at full production for long and explode after some time, leading to a
massive power outage with all its consequences for life and health. The power outages in turn
trigger terror alarms at the nearest international airport, resulting in chaos at the airport,
arriving jets colliding, and so on. The outcome is loss of lives and huge economic costs, merely because
of the interaction between independently programmed systems that automatically make
decisions (Wallach et al., 2009).
The scenario illustrated above shows how significant it is to have control mechanisms when
multiple systems that make their own decisions interact: to set limitations and to
notify the operators about conditions deemed to call for human review. It therefore
becomes important that moral decision-making becomes part of artificial intelligence
systems. The ethical implications of the possible actions that systems can take must be evaluated,
and this can be done on several levels, including whether laws are broken or not. However, it is
not easy to construct machines that incorporate all the world's philosophical and religious
traditions. Furthermore, a very effective driver support system can be developed that reduces the
number of accidents and saves numerous lives; yet it is not desirable to be responsible for building
a system where there is a real risk of severe adverse events in case of failure (Tørresen, 2014).
The same book provides a detailed review of how artificial moral agents can be implemented. This
embraces the use of ethical expertise in program development. It suggests three approaches: ethical
reasoning formulated in logic and mathematics; machine learning methods relying on
examples of ethical and unethical behaviour; and simulation, in which different ethical
strategies can be observed. For example, in the case of a loan application to a bank, an AI-based system can be used
for credit evaluation based on a number of criteria. If the application is rejected, the reason behind
the refusal should be clarified. Skin colour or race could be among the parameters considered
rather than the applicant's financial situation, even though the bank cannot state this explicitly. A
system that is more open to inspection can, however, show that the residence address has been crucial in
such a case, yielding the same result as if the stated criterion had been used. This is what should be
avoided as far as possible, by simulating the behaviour of AI systems to detect
possible ethically undesirable actions. All software that will substitute for human evaluation and
social functions should satisfy criteria such as accountability, inspectability, robustness
against manipulation, and predictability. Also, all developers should have an inherent desire to develop
products delivering the best possible user experience and user safety. The capability for
manipulation must be restricted, and the system must have predictable behaviour. It is definitely
easier for a robot to move in a known and limited environment than in new and unfamiliar
surroundings (Wallach et al., 2009).
The Italian robot scientist Gianmarco Veruggio introduced the term roboethics in 2002. He
recognised a need for guidelines for the development of robots intended to help the progress of
human society and individuals and to prevent abuse against humanity. Accordingly, ethics for robot
designers, manufacturers and users are needed. Robots are mechanical systems that might
unintentionally cause harm to users. There is also a danger that collected information can be
accessed by unauthorized people and made available to others through the Internet, and robots can
be misused for criminal activities such as burglary. Although it is presumed that the developers
of robots and AI have good intentions, intelligent systems must be designed so that the
robots are friendly and kind, while difficult to abuse for potential malicious actions in the future.
In 2004 the first international symposium on roboethics was held in Sanremo, Italy. The research
programme ETHICBOTS was funded by the EU, with a multidisciplinary team of researchers
participating to identify and analyse techno-ethical challenges in the integration of human and
artificial entities. In 2005 the European Robotics Research Network (EURON) funded the
project EURON Roboethics Atelier, with the goal of developing the first roadmap for
roboethics for a systematic assessment of the ethical issues surrounding robot design. The focus
was on human ethics for designers, manufacturers and users of robots. The work ended with
some recommendations that should be satisfied in commercial robots, such as (Tørresen, 2014):
• Traceability: as in aircraft, robots should have a "black box" to record and document their own
behaviour;
• Identifiability: similar to cars, robots should have serial numbers and registration numbers;
• Privacy and policy: software and hardware should be used for encryption and password
protection of sensitive data that the robot stores.
Machines should be able to make ethical decisions using ethical frameworks. Ethical issues
are too interdisciplinary for programmers alone to discover them; researchers in ethics and
philosophy should also be involved in the formulation of ethically "conscious" machines that are
aimed at providing acceptable machine behaviour. Michael Anderson and Susan Leigh Anderson have
collected contributions from both philosophers and AI researchers in the book "Machine Ethics"
(2011). The book discusses why and how to include an ethical dimension in machines that will
act autonomously. Important medical information must be reported, but at the same time the
person must be able to maintain privacy. Video surveillance may be desirable for the user (by
relatives or others), but it should be clear to the user when and how it happens. An autonomous
robot must also be able to adapt to the user's "chemistry" to have a good dialogue.
Value Sensitive Design emphasizes values with ethical import. A list of human values that are
frequently implicated in systems design is presented in Table 1 (adapted from Friedman et al.,
2012) to act as a heuristic for the values that should be considered.
Table 1. Human Values (with Ethical Import); Source: (Friedman et al., 2012)

Human Welfare — Refers to people's physical, material, and psychological well-being. Research: Leveson [1991]; Friedman, Kahn, & Hagman [2003]; Neumann [1995]; Turiel [1983, 1998].

Ownership and Property — Refers to a right to possess an object (or information), use it, manage it, derive income from it, and bequeath it. Research: Friedman [1997]; Herskovits [1952]; Lipinski & Britz [2000].

Privacy — Refers to a claim, an entitlement, or a right of an individual to determine what information about himself or herself can be communicated to others. Research: Agre & Rotenberg [1998]; Bellotti [1998]; Boyle, Edwards, & Greenberg [2000]; Friedman [1997]; Fuchs [1999]; Jancke, Venolia, Grudin, Cadiz, & Gupta [2001]; Palen & Dourish [2003]; Nissenbaum [1998]; Phillips [1998]; Schoeman [1984]; Svensson, Hook, Laaksolahti, & Waern [2001].

Freedom From Bias — Refers to systematic unfairness perpetrated on individuals or groups, including pre-existing social bias, technical bias, and emergent social bias. Research: Friedman & Nissenbaum [1996]; cf. Nass & Gong [2000]; Reeves & Nass [1996].

Universal Usability — Refers to making all people successful users of information technology. Research: Aberg & Shahmehri [2001]; Shneiderman [1999, 2000]; Cooper & Rejmer [2001]; Jacko, Dixon, Rosa, Scott, & Pappas [1999]; Stephanidis [2001].

Trust — Refers to expectations that exist between people who can experience good will, extend good will toward others, feel vulnerable, and experience betrayal. Research: Baier [1986]; Camp [2000]; Dieberger, Hook, Svensson, & Lonnqvist [2001]; Egger [2000]; Fogg & Tseng [1999]; Friedman, Kahn, & Howe [2000]; Kahn & Turiel [1988]; Mayer, Davis, & Schoorman [1995]; Olson & Olson [2000]; Nissenbaum [2001]; Rocco [1998].

Autonomy — Refers to people's ability to decide, plan, and act in ways that they believe will help them to achieve their goals. Research: Friedman & Nissenbaum [1997]; Hill [1991]; Isaacs, Tang, & Morris [1996]; Suchman [1994]; Winograd [1994].

Informed Consent — Refers to garnering people's agreement, encompassing criteria of disclosure and comprehension (for 'informed') and voluntariness, competence, and agreement (for 'consent'). Research: Faden & Beauchamp [1986]; Friedman, Millett, & Felten [2000]; The Belmont Report [1978].

Accountability — Refers to the properties that ensure that the actions of a person, people, or institution may be traced uniquely to the person, people, or institution. Research: Friedman & Kahn [1992]; Friedman & Millett [1995]; Reeves & Nass [1996].

Courtesy — Refers to treating people with politeness and consideration. Research: Bennett & Delatree [1978]; Wynne & Ryan [1993].

Identity — Refers to people's understanding of who they are over time, embracing both continuity and discontinuity over time. Research: Bers, Gonzalez-Heydrich, & DeMaso [2001]; Rosenberg [1997]; Schiano & White [1998]; Turkle [1996].

Calmness — Refers to a peaceful and composed psychological state. Research: Friedman & Kahn [2003]; Weiser & Brown [1997].

Environmental Sustainability — Refers to sustaining ecosystems such that they meet the needs of the present without compromising future generations. Research: United Nations [1992]; World Commission on Environment and Development [1987]; Hart [1999]; Moldan, Billharz, & Matravers [1997]; Northwest Environment Watch [2002].
Having illustrated the ethical activities and technical responsibilities of software engineering,
the following sections discuss game addiction and the use of a guilt model to probe, via
sentiment analysis, the guilt emotions of software developers.
2.2. GAME ADDICTION
Game addiction is presently among the most discussed psychosocial aspects associated
with playing computer and videogames. Lately, the American Medical Association (2007)
strongly urged the American Psychiatric Association (APA) to include "video game
addiction" as a formal diagnostic disorder in the updated revision of the Diagnostic and
Statistical Manual of Mental Disorders (DSM-V, 2012). Adolescents are more exposed to game
addiction than adults (Griffiths & Wood, 2000), partly because they commonly play computer and
videogames more often than adults (e.g., Griffiths, Davies, & Chappell, 2004). Moreover, there is
substantial disagreement among researchers about the concept of 'game addiction'.
Game addiction is the most widespread term among researchers to designate
compulsive, excessive, obsessive, and mostly problematic use of videogames (e.g., Charlton
& Danforth, 2007; Chiu, Lee, & Huang, 2004; Chou & Ting, 2003; Fisher, 1994; Griffiths &
Davies, 2005; Grüsser, Thalemann, & Griffiths, 2007; Hauge & Gentile, 2003; Ko, Yen, Chen,
Chen, & Yen, 2005; Ng & Wiemer-Hastings, 2005; Soper & Miller, 1983; Wan & Chiou, 2006).
Other terms for the description of excessive or problematic gaming include "videogame
dependence" (Griffiths & Hunt, 1995, 1998), "problematic game playing" (Salguero & Moran,
2002; Seay & Kraut, 2007), and "pathological gaming" (Johansson & Gotestam, 2004; Keepers,
1990). Regardless of the terminology used, researchers agree that computer and videogame
overuse can cause a behavioural addiction (Griffiths, 2005). Addictive behaviour refers to
behaviour that is compulsive, excessive, psychologically or physically destructive, and
uncontrollable (Mendelson & Mello, 1986).
2.3. GUILT EMOTIONS
According to Lewis (1971), shame involves a negative evaluation of the self, whereas guilt involves a
negative evaluation of a specific behaviour or action. Tangney (1991) concluded that shame
and guilt are associated with different cognitions, public avoidance, and
aggressive responses. Different researchers have examined guilt and shame feelings in situations
defined by public–private distinctions (Wolf et al., 2010). Roos et al. (2014)
concluded that guilt and shame can influence an individual's social behaviours. Therefore, it
would be very beneficial to explore the guilt that developers may feel towards game
development and the damaging effect of addiction on personal lives.
3. SENTIMENT ANALYSIS
Marketing managers appreciate the importance of customer sentiments towards different services or
products and seek to understand the sentiment of internet users. Additionally, universities can take
advantage of sentiment analysis to recognise the opinions of students. A vocabulary-based approach
can be used in conjunction with sentiment analysis; this is exercised for semantic annotation and
word suggestion. Machine learning strategies can classify sentiments as either positive or
negative.
Sentiment analysis tools have been applied in non-technical domains, including social media and
marketing. Rajat and Gaonkar (2017) performed sentiment analysis on customer reviews in order
to develop a rating forecasting tool, and Lei and Xueming (2015) measured the reputation of a
company based upon sentiment analysis of online customers. Das et al. (2017) conducted
sentiment analysis on airline industries; they classified Twitter sentiment using the Naïve Bayes
(NB) machine learning algorithm and RapidMiner (RM). Furthermore, Support Vector Machine
(SVM) is one of the best-performing machine learning techniques; nevertheless, NB can perform
better in a few cases.
Collomb et al. (2014) divided sentiment analysis approaches into the following four types: machine
learning, which uses a learning algorithm trained on classified data; the lexicon-based approach,
which involves calculation using the semantic orientation of words or sentences in a document; the
rule-based approach, which classifies text according to the number of positive and negative words; and
Senti4SD, a classifier trained to support sentiment analysis in developers' communication
channels (Calefato et al., 2018). Senti4SD uses lexicon, keyword-based and semantic features.
In the following sections, the sentiment analysis model is presented and its three stages
are explained, ending with the experimental results concluded from the data
collection.
3.1. MODELLING SENTIMENT ANALYSIS
The sentiment analysis guilt model is composed of three stages: sentiment collection, processing and
classification, as illustrated in Figure 1 below. Sentiments are collected from reddit.com, an
online forum used by game developers; a Java tool is used to collect all sentiments while
removing HTML tags. The sentiment collector gathered 1,000 sentiments and classified these
sentiments into two groups: guilt and no guilt.
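The paper's collector itself was implemented in Java; the minimal Python sketch below shows the equivalent logic under stated assumptions: it reads Reddit's public JSON listing endpoints (the subreddit name here is illustrative, not the authors' exact source) and strips HTML tags with a simple regular expression, mirroring the "removing HTML tags" step described above.

```python
# Minimal sketch of the sentiment collector stage (the paper used a Java tool).
# Assumes Reddit's public JSON listing endpoints; the subreddit is illustrative.
import re
import requests

def strip_html(text: str) -> str:
    """Remove HTML tags and collapse whitespace, as the collector step does."""
    text = re.sub(r"<[^>]+>", " ", text)      # drop tags
    return re.sub(r"\s+", " ", text).strip()  # normalise whitespace

def collect_sentiments(subreddit: str = "gamedev", limit: int = 100) -> list:
    url = f"https://www.reddit.com/r/{subreddit}/new.json?limit={limit}"
    resp = requests.get(url, headers={"User-Agent": "guilt-study-sketch/0.1"})
    resp.raise_for_status()
    posts = resp.json()["data"]["children"]
    # Keep the free-text body of each post as one 'sentiment'
    return [strip_html(p["data"].get("selftext", ""))
            for p in posts if p["data"].get("selftext")]

if __name__ == "__main__":
    sentiments = collect_sentiments()
    print(f"collected {len(sentiments)} sentiments")
```

Collected texts would then be labelled manually as guilt or no guilt before training, as described above.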
After collecting the sentiments, a processing step is designed using KNIME, a data
analytics and integration platform that includes components for machine learning, data mining
and web mining (KNIME, 2018). Processing includes punctuation erasing, case conversion, stop
word removal and stemming. The third step is the machine learning stage, where the
sentiment analysis guilt model uses SVM, NB, and decision tree classifiers.
3.1.1 PRE-PROCESSING STEP
Pre-processing is the step of cleaning the data and removing unwanted words; it can
affect the accuracy of the sentiment analysis model (El-Masri et al., 2017). Many pre-processing
techniques can be applied to the data, such as removing punctuation, repeated letters
and numbers, converting text to lower or upper case, stop word removal and stemming. Stop word
removal is the process of omitting connecting words such as 'and', 'the', 'to' and so on, while stemming
is the process of returning a word to its basic form, which is called the stem. The Porter stemmer,
one of the best-known stemmers in text processing, is used in the sentiment model (Porter Stemmer,
2018).
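As a concrete illustration of these operations, the sketch below applies punctuation erasure, case conversion, stop-word removal and Porter stemming using NLTK. The paper performed these steps in KNIME, so this is an equivalent illustration rather than the authors' actual pipeline, and the example sentence is hypothetical.

```python
# Illustrative pre-processing pipeline: punctuation eraser, case converter,
# stop word removal and Porter stemming (the paper ran these steps in KNIME).
import string
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer

nltk.download("stopwords", quiet=True)  # one-time corpus download

STOP_WORDS = set(stopwords.words("english"))
STEMMER = PorterStemmer()

def preprocess(sentence: str) -> list:
    # 1. punctuation eraser
    sentence = sentence.translate(str.maketrans("", "", string.punctuation))
    # 2. case converter
    tokens = sentence.lower().split()
    # 3. stop word removal  4. stemming (Porter)
    return [STEMMER.stem(t) for t in tokens if t not in STOP_WORDS]

print(preprocess("I feel guilty about shipping addictive games."))
# -> ['feel', 'guilti', 'ship', 'addict', 'game']
```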
Figure 1. Sentiment Analysis Model
3.1.2. FEATURES SELECTION STEP
Features are the inputs of the classifiers; they allow a more detailed analysis of the raw data. N-
grams are among the most commonly used features (Araujo et al., 2014). Bag of words (BOW) is a
representation method for object classification. The BOW idea "is to quantize each extracted key point
into one of visual words, and then represent each image by a histogram of the visual words"
(Zhang et al., 2010). BOW is used as a companion method to classification techniques.
Term Frequency Inverse Document Frequency (TF-IDF) is used to determine which words in a
document are best used in a query. TF-IDF "calculates values for each word in a document through
an inverse proportion of the frequency of the word in a particular document to the percentage of
documents the word appears in" (Ramos, 2003); a high TF-IDF value indicates a strong
relationship with the document a word appears in. Both BOW and TF-IDF are used in the
sentiment analysis guilt model for feature selection.
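In the standard formulation of Ramos (2003), the TF-IDF weight of term t in document d is tf(t, d) × log(N / df(t)), where N is the number of documents and df(t) the number of documents containing t. The sketch below illustrates both feature representations with scikit-learn; the toy documents and default parameters are assumptions for illustration, not taken from the paper.

```python
# Illustrative BOW and TF-IDF feature extraction with scikit-learn.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = [
    "i feel guilty designing loot boxes",
    "proud of my game no guilt at all",
]

bow = CountVectorizer()            # bag of words: raw term counts
X_bow = bow.fit_transform(docs)    # sparse matrix, one row per document

tfidf = TfidfVectorizer()          # tf-idf: counts reweighted by rarity
X_tfidf = tfidf.fit_transform(docs)

print(bow.get_feature_names_out()) # learned vocabulary
print(X_tfidf.shape)               # (n_documents, n_vocabulary_terms)
```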
3.1.3 MACHINE LEARNING STEP
Different machine learning approaches are used in sentiment analysis, such as Naive Bayes (Go et
al., 2009; Pang and Lee, 2004) and Support Vector Machines (Duwairi and El-Orfali, 2014).
The choice of classifier depends on the domain: in education, for example, SVM
performed better than other classifiers (Altrabsheh et al., 2015). The classifiers used in the
sentiment analysis guilt model are SVM, Naive Bayes and decision tree.
For the validation of the classifiers, accuracy, precision and recall are used. Classification
accuracy focuses "on how well the classifier assigns objects to their correct classes" (Hand,
2012). Precision is the percentage of predicted positive instances that are correct, and recall is the
percentage of actual positive instances that are retrieved (Mostafa, 2009).
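A minimal sketch of this classification step, using scikit-learn and anticipating the 80/20 train/test split described in Section 4, is shown below. The texts, labels and hyperparameters are placeholders (the linear-kernel SVM is one reasonable choice, not necessarily the authors' configuration).

```python
# Illustrative training/validation of the three classifiers (SVM, Naive Bayes,
# decision tree) with accuracy, precision and recall; data is placeholder only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.naive_bayes import MultinomialNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Placeholder corpus: 500 guilt (label 1) and 500 no-guilt (label 0) sentiments.
texts = (["i feel guilty about this game"] * 500
         + ["no guilt just pride in my work"] * 500)
labels = [1] * 500 + [0] * 500

X = TfidfVectorizer().fit_transform(texts)
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2,
                                          random_state=42)

for name, clf in [("SVM", SVC(kernel="linear")),
                  ("Naive Bayes", MultinomialNB()),
                  ("Decision Tree", DecisionTreeClassifier())]:
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(name,
          "acc=%.2f" % accuracy_score(y_te, pred),
          "prec=%.2f" % precision_score(y_te, pred),
          "rec=%.2f" % recall_score(y_te, pred))
```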
4. EXPERIMENTAL RESULTS
Sentiment analysis depends on the training data; here, the data was collected from the Reddit.com
forum (Reddit, 2018), which game developers use to discuss problems and topics related
to game development. One thousand sentiments were collected: five hundred labeled as guilt emotion
and another five hundred labeled as no guilt emotion. Sentiments were collected using a Java
application that extracts the text from each webpage while removing the HTML tags. The classifiers were
then trained and tested on the thousand sentiments, with 80% used for training and 20% for
testing, following Lewis and Catlett (1994).
The sentiment analysis guilt model tests the classifiers on the 1,000 sentiments labelled as either
guilt or no guilt. Naive Bayes, SVM and decision tree were used for the classification of the
sentiments; the sentiment collector gathered the sentiments, and different
processing techniques were used to remove noise from the sentiment sentences. Table 2 shows
the results of the sentiment model. SVM had the highest accuracy level (77%), consistent with
Altrabsheh et al. (2015), even though the present domain is not an educational one.
Table 2. Sentiment Analysis Guilt Model Results
5. CONCLUSIONS AND LIMITATIONS
Digital addiction affects a person's social life, so it is important to understand the emotions
of the game developers who provide products that affect other people's lives. According to
previous studies, the guilt model had not been tested on game developers. The sentiment analysis guilt
model can be used to test guilt emotion in the sentiments of game developers and can help in
addressing the consequences of guilt emotions for game developers, given that guilt emotions can
develop into depression in some cases.
The sentiment analysis guilt model applied in this research paper has the following limitations.
The Reddit forum is the only source used to extract sentiments. Only three machine learning classifiers
were used (SVM, NB and decision tree); other classifiers, such as Nearest
Neighbour, could also be applied. In addition, the collected sentiment set is limited to 1,000 sentiments. Feature
selection relied on TF-IDF and BOW; the feature selection could be expanded
to include other semantic relations, from the WordNet tool for example. Last but not least, the
sentiment analysis guilt model can be applied to different categories of people in other working fields,
such as the educational field.
REFERENCES
[1] R.M.Veatch, (1977), Case Studies in Medical Ethics, Harvard University Press, Cambridge.
[2] D.L.Schmoldt, H.M. Rauscher, (1994), ‘Knowledge management and six supporting technologies’,
Comput. Electron.Agric., 10 (1), 11–30.
[3] A.J.Thomson, (1997), ‘Artificial Intelligence and environmental ethics’, Al Appl., 11 (1), 69–73.
[4] GerlindWisskirchen et al., (April 2017) ‘Artificial Intelligence and Robotics and Their Impact on the
Workplace’, IBA Global Employment Institute.
[5] K.Reed, (July/August 2000), “Software Engineering- A New Millennium?,” IEEE Software, pp.107,
in D.W. Gotterbarn (a), (1996), Software Engineering: The New Professionalism, The Responsible
Software Engineer, ed. Colin Meyer, Springer Verlag.
[6] R.O.Mason, (1986), ‘Four ethical issues of the information age’, MIS Q., vol. 10 (1), p. 5–12.
[7] D.A.Bella, (1992), ‘Ethics and the credibility of applied science’, in: G.H. Reeves, D.L. Bottom, M.H.
Brookes, M.H. (Technical coordinators), Ethical Questions for Resource Managers, USDA Forest
Service General Technical Report PNW-GTR-288, pp. 19–32.
[8] T.Forester, P. Morrison, (1994), Computer Ethics, MIT Press, Cambridge,.
[9] The joint ACM/IEEE-CS Software Engineering Code was published as: Don Gotterbarn, Keith
Miller, and Simon Rogerson, (November 1997), 'Software Engineering Code of Ethics',
Communications of the ACM, Vol. 40, Issue 11, pp. 110-118, DOI: 10.1145/265684.265699
[10] D.Gotterbarn, (2002), SOFTWARE ENGINEERING ETHICS, published by Software Engineering
Ethics Research Institute forEncyclopedia of Software Engineering. Available at:
https://www.uio.no/studier/emner/matnat/ifi/INF3700/v12/undervisningsmateriale/Software%20enge
neering%20ethics.pdf
[11] D.W. Gotterbarn (b),(1996), Establishing Standards of Professional Practice, chapter 3 in The
Responsible Software Engineer, ed. Colin Meyer, Springer Verlag.
[12] SE Code,(October 1999), Computer Society and ACM Approve Software Engineering Code of
Ethics, Computer, pages 84-88.
[13] G.Pour, M. Griss, and M. Lutz, (May 2000), “The Push to Make Software Engineering
Respectable”,Computer, page 35-43.
[14] L.B. West,(October 1991), "Professional Civil Engineering: Responsibility," Journal of Professional
Issues in Engineering Education and Practice, 117, 4.
[15] D.W. Gotterbarn,(November/December 1999), “How the new Software Engineering Code of Ethics
Affects You", IEEE Software, 58-64.
[16] Wallach, Wendell and Allen, Colin, (2009), Moral Machines: Teaching Robots Right from Wrong,
New York: Oxford University Press.
[17] Jim Tørresen, (2014), Future Perspectives on Artificial Intelligence (AI), University of Oslo.
[18] Anderson, Michael and Anderson, Susan Leigh, (2011), Machine Ethics, Cambridge University Press.
[19] BATYA FRIEDMAN, PETER H. KAHN, JR., AND ALAN BORNING, (2012), Value Sensitive
Design and Information Systems, University of Washington, In P. Zhang & D. Galletta (Eds.),
Human-Computer Interaction in Management Information Systems: Foundations, M.E. Sharpe, Inc:
NY.
[20] N.G. LEVESON, (1991), ‘Software safety in embedded computer systems’,Commun. ACM, Vol. 34,
Iss. 2, P. 34-46.
[21] B.FRIEDMAN, P. H. JR. KAHN, AND J. HAGMAN, (2003), ‘Hardware companions?: What online
AIBO discussion forums reveal about the human-robotic relationship’, Conference Proceedings of
CHI 2003, 273- 280, New York, NY: ACM Press.
[22] P.G. NEUMANN, (1995), Computer Related Risks, New York, NY: Association for Computing
Machinery Press.
[23] E.TURIEL, (1983), The Development of Social Knowledge, Cambridge, England: Cambridge
University Press.
[24] E.TURIEL, (1998), ‘Moral development’,In N. Eisenberg, Ed., Social, Emotional, and Personality
Development, (pp. 863-932), Vol. 3 of W. Damon, Ed., Handbook of Child Psychology, 5th edition,
New York, NY: Wiley.
[25] B.FRIEDMAN, (1997), ‘Social judgments and technological innovation: Adolescents, understanding
of property, privacy, and electronic information’, Computers in Human Behaviour, 13(3), 327-351.
[26] M.J. HERSKOVITS,(1952), Economic Anthropology: A Study of Comparative Economics, New
York, NY: Alfred A. Knopf.
[27] T.A. LIPINSKI, and J. J. BRITZ, (2000), ‘Rethinking the ownership of information in the 21st
century: Ethical implications’, Ethics and Information Technology, 2, 1, 49-71.
[28] P.E. AGRE, and M. ROTENBERG, (1998), Eds., ‘Technology and Privacy: The New Landscape’,
MIT Press, Cambridge, MA.
[29] V.BELLOTTI, (1998), Design for privacy in multimedia computing and communications
environments, In P. E. Agre and M. Rotenberg, Eds., Technology and Privacy: The New Landscape
(pp. 63-98), Cambridge, MA: The MIT Press.
[30] M.BOYLE, C. EDWARDS, and S. GREENBERG, (2000), ‘The effects of filtered video on
awareness and privacy’, In Proceedings of Conference on Computer Supported Cooperative Work
(CSCW 2000), 1-10, New York, NY: Association for Computing Machinery.
[31] L.FUCHS, (1999), ‘AREA: A cross-application notification service for groupware’, In Proceedings of
ECSCW 1999, Kluwer, Dordrechet Germany, 61-80.
[32] G.JANCKE, G. D. VENOLIA, J. GRUDIN, J. J. CADIZ, and A. GUPTA, (2001), ‘Linking public
spaces: Technical and social issues’, In Proceedings of CHI 2001, 530-537.
[33] L.PALEN, and P. DOURISH, (2003), ‘Privacy and trust: Unpacking, privacy, for a networked world’,
In Proceedings of CHI 2003, 129-136.
[34] H.NISSENBAUM, (1998), ‘Protecting privacy in an information age: The problem with privacy in
public’, Law and Philosophy, 17, pp. 559-596.
[35] D.J. PHILLIPS, (1998), Cryptography, secrets, and structuring of trust, In P. E. Agre and M.
Rotenberg, Eds., Technology and Privacy: The New Landscape (pp. 243-276), Cambridge, MA: The
MIT Press.
[36] F.D. SCHOEMAN, (1984),Philosophical Dimensions of Privacy: An Anthology, Cambridge,
England: Cambridge University Press.
[37] M.SVENSSON, K. HOOK, J. LAAKSOLAHTI, and A. WAERN, (2001), ‘Social navigation of food
recipes’, In Proceedings of the Conference of Human Factors in Computing Systems (CHI 2001),
341-348, New York, NY: Association for Computing Machinery.
[38] J. ABERG, and N. SHAHMEHRI, (2001), ‘An empirical study of human Web assistants:
Implications for user support in Web information systems’, In Proceedings of the Conference on
Human Factors in Computing Systems (CHI 2000), (pp. 404-411), New York, NY: Association for
Computing Machinery Press.
[39] B.SHNEIDERMAN, (1999), ‘Universal usability: Pushing human-computer interaction research to
empower every citizen’, ISR Technical Report 99-72, University of Maryland, Institute for Systems
Research, College Park, MD.
[40] B.SHNEIDERMAN, (2000), ‘Universal usability’, Commun.of the ACM, 43, 5, 84-91, 2000.
[41] M.COOPER, and P. REJMER, (2001), ‘Case study: Localization of an accessibility evaluation’, In
Extended Abstracts of the Conference on Human Factors in Computing Systems (CHI 2001), 141-
142, New York, NY: Association for Computing Machinery Press.
[42] J.A. JACKO, M. A. DIXON, R. H. JR. ROSA, I. U. SCOTT, and C. J. PAPPAS, (1999), ‘Visual
profiles: A critical component of universal access’, In Proceedings of the Conference on Human
Factors in Computing Systems (CHI 99), pp. 330-337, New York, NY: Association for Computing
Machinery Press.
[43] C.STEPHANIDIS, (2001),Ed. 2001 User Interfaces for All: Concepts, Methods, and Tools, Mahwah,
NJ: Lawrence Erlbaum Associates.
[44] A.BAIER, (1986), Trust and antitrust, Ethics, 231-260.
[45] L.J. CAMP, (2000), Trust & Risk in Internet Commerce, MIT Press, Cambridge, MA, L. J.
[46] A.DIEBERGER, K. HOOK, M. SVENSSON, and P. LONNQVIST, (2001), ‘Social navigation
research agenda’, InExtended Abstracts of the Conference on Human Factors in Computing Systems
(CHI 2001), pp. 107-108, NewYork, NY: Association of Computing Machinery Press.
[47] F.N. EGGER, (2000) ‘Trust me, I am an online vendor.: Towards a model of trust for e-commerce
systemdesign’, In Extended Abstracts of the Conference of Human Factors in Computing Systems
(CHI 2000), 101-102, New York, NY: Association for Computing Machinery.
[48] B.J. FOGG, and H. TSENG,(1999), ‘The elements of computer credibility’, In Proceedings of CHI
1999, ACM Press, 80-87.
[49] B.FRIEDMAN, P. H. JR. KAHN, and D. C. HOWE, (2000), ‘Trust online’, Commun.ACM, 43, 12,
34-40.
[50] P.H. JR. KAHN, and E. TURIEL, (1988), ‘Children’s conceptions of trust in the context of social
expectations’, Merrill-Palmer Quarterly, 34, 403-419.
[51] R.C. MAYER, J. H. DAVIS, AND F. D. SCHOORMAN,(1995), ‘An integrative model of
organizational trust’, TheAcademy of Management Review, vol. 20, issue 3, pp. 709-734.
[52] J.S. OLSON, and G. M. OLSON, (2000), ‘i2i trust in e-commerce’, Communications of the ACM,
43(12), 41-44.
[53] H.NISSENBAUM, (2001), ‘Securing trust online: Wisdom or oxymoron’, Boston University Law
Review, 81(3), 635-664.
[54] E.ROCCO, (1998), ‘Trust breaks down in electronic contexts but can be repaired by some initial face-
to-face contact’, In Proceedings of CHI 1998, ACM Press, 496-502.
[55] B.FRIEDMAN, and H. NISSENBAUM, (1997), ‘Software agents and user autonomy’, Proceedings
of the FirstInternational Conference on Autonomous Agents, 466-469, New York, NY: Association
for Computing Machinery Press.
[56] T.E. JR. HILL, (1991), Autonomy and self-respect, Cambridge: Cambridge University Press.
[57] E.A. ISAACS, J. C. TANG, and T. MORRIS, (1996), ‘Piazza: A desktop environment supporting
impromptu andplanned interactions’, In Proceedings of the Conference on Computer Supported
Cooperative Work (CSCW 96), pp. 315-324, New York, NY: Association for Computing Machinery
Press.
[58] L.SUCHMAN, (1994), ‘Do categories have politics? The language/action perspective reconsidered’,
CSCW Journal, vol. 2, no. 3, pp. 177-190.
[59] T.WINOGRAD, (1994), ‘Categories, disciplines, and social coordination’, CSCW Journal, vol. 2, no.
3, pp. 191-197.
[60] R.FADEN, and T. BEAUCHAMP, (1986), A History and Theory of Informed Consent, New York,
NY: OxfordUniversity Press.
[61] B.FRIEDMAN, L. MILLETT, and E. FELTEN, (2000), ‘Informed Consent Online: A Conceptual
Model and DesignPrinciples’, University of Washington Computer Science & Engineering Technical
Report 00-12-2.
[62] The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of
Research, The National Commission for the Protection of Human Subjects of Biomedical and
Behavioural Research, 1978.
[63] B.FRIEDMAN, and P. H. JR. KAHN, (1992), ‘Human agency and responsible computing:
Implications for computer system design’, Journal of Systems Software, 17, 7-14.
[64] B.FRIEDMAN, and L. MILLETT, (1995), ‘It’s the computer’s fault: Reasoning about computers as
moral agents’, In Conference Companion of the Conference on Human Factors in Computing Systems
(CHI 95), (pp. 226-227), New York, NY: Association for Computing Machinery Press.
[65] B.REEVES, and C. NASS, (1996), The Media Equation: How People Treat Computers, Television,
and New Media Like Real People and Places’, New York, NY and Stanford, CA: Cambridge
University Press and CSLI Publications.
[66] W.J. BENNETT, and E. J. DELATREE, (1978), ‘Moral education in the schools’, The Public
Interest, 50, pp. 81-98.
[67] E.A. WYNNE, and K. RYAN, (1993), Reclaiming our schools: A handbook on teaching character,
academics, and discipline, New York, Macmillan.
[68] M.U. BERS, J. GONZALEZ-HEYDRICH, and D. R. DEMASO, (2001), ‘Identity construction
environments: Supporting a virtual therapeutic community of pediatric patients undergoing dialysis’,
In Proceedings of the Conference of Human Factors in Computing Systems (CHI 2001), 380-387,
New York, NY: Association for Computing Machinery.
[69] S.ROSENBERG, (1997), Multiplicity of selves, In R. D. Ashmore and L. Jussim, Eds., Self and
Identity:Fundamental Issues, (pp. 23-45), New York, NY: Oxford University Press.
[70] D.J. SCHIANO, and S. WHITE, (1998), ‘The first noble truth of cyberspace: People are people (even
when they MOO)’, In Proceedings of the Conference of Human Factors in Computing Systems (CHI
98), 352-359, New York, NY: Association for Computing Machinery.
[71] S.TURKLE, (1996), ‘Life on the Screen: Identify in the Age of the Internet’, New York, NY: Simon
and Schuster.
[72] B.FRIEDMAN, and P. H. JR. KAHN, Human values, ethics, and design, In J. Jacko and A. Sears,
Eds., The Human-Computer Interaction Handbook, Lawrence Erlbaum Associates, Mahwah NJ,
2003.
[73] M.WEISER, and J. S. BROWN, (1997), The coming age of calm technology, In P. Denning and B.
Metcalfe, Eds., Beyond Calculation: The Next 50 Years of Computing, (pp. 75-85), New York, NY:
Springer-Verlag.
[74] UNITED NATIONS (2002), Report of the United Nations Conference on Environment and
Development, held in Rio de Janeiro, Brazil, 1992. Available
from:http://www.un.org/esa/sustdev/documents/agenda21/english/agenda21toc.htm
[75] WORLD COMMISSION ON ENVIRONMENT AND DEVELOPMENT (Gro Harlem Brundtland,
Chair), Our Common Future, Oxford University Press, Oxford, 1987.
[76] M.HART, (1999), Guide to Sustainable Community Indicators, Hart Environmental Data, PO Box
361, North Andover, MA 01845, second edition.
[77] B. MOLDAN, S. BILLHARZ, and R. MATRAVERS, (1997), Sustainability Indicators: A Report on
the Project on Indicators of Sustainable Development, Wiley, Chichester, England.
[78] NORTHWEST ENVIRONMENT WATCH, This Place on Earth 2002: Measuring What Matters,
Northwest Environment Watch, 1402 Third Avenue, Seattle, WA 98101.
[79] American Medical Association, AMA takes action on video games, Retrieved October 15, 2007, from
http://www.amaassn.org/ama/pub/category/17770.html
[80] M.Griffiths, & R. Wood, (2000), ‘Risk factors in adolescence: The case of gambling, videogame
playing, and the internet’, Journal of Gambling Studies, vol. 16, pp. 199–225.
[81] M.D. Griffiths, M. N. O. Davies, & D. Chappell, (2004), ‘Online computer gaming: A comparison of
adolescent and adult gamers’, Journal of Adolescence, vol. 27, pp. 87–96.
[82] J.P. Charlton, & I. D. W. Danforth, (2007), ‘Distinguishing addiction and high engagement in the
context of online game playing’, Computers in Human Behavior, vol. 23, pp. 1531–1548.
[83] S.Chiu, J. Lee, & D. Huang, (2004), ‘Video game addiction in children and teenagers in Taiwan’,
Cyber Psychology & Behaviour, vol. 7, pp. 571–581.
[84] T.Chou, & C. Ting, (2003), ‘The role of flow in cyber-game addiction’, Cyber Psychology &
Behaviour, vol. 6, pp. 663–675.
[85] S.E. Fisher, (1994), ‘Identifying video game addiction in children and adolescents’, Addictive
Behaviours, vol. 19, pp. 545–553.
[86] M.D. Griffiths, & M. N. O. Davies, (2005), Videogame addiction: Does it exist?, In J. Goldstein & J.
Raessens (Eds.), Handbook of computer gamestudies (pp. 359– 368). Boston: MIT Press.
[87] S.M. Grüsser, C. Thalemann, & M. Griffiths, (2007), ‘Excessive computer game playing: Evidence
for addiction and aggression?’,Cyber Psychology & Behaviour, vol. 10, pp. 290–292.
[88] M.R. Hauge, & D. A. Gentile, (April 2003), ‘Video game addiction among adolescents: Associations
with academic performance and aggression’, Paper presented at Society for Research in Child
Development Conference, Tampa, FL.
[89] C.Ko, J. Yen, C. Chen, S. Chen, & C. Yen, (2005), ‘Gender differences and related factors affecting
online gaming addiction among Taiwanese adolescents’, Journal of Nervous and Mental Disease,
vol.193, pp. 273–277.
[90] B.D. Ng, & P. Wiemer-Hastings, (2005), ‘Addiction to the internet and online gaming’, Cyber
Psychology & Behaviour, vol. 8, pp. 110–113.
[91] W.B. Soper, & M. J. Miller, (1983), ‘Junk-time junkies: An emerging addiction among students’, The
School Counselor, 31, 40–43.
[92] C.S. Wan, & W. B. Chiou, (2006), ‘Psychological motives and online games addiction: A test of flow
theory and humanistic needs theory for Taiwanese adolescents’, Cyber Psychology & Behaviour, vol.
9, pp. 317–324.
[93] R.A. T. Salguero, & R. M. B. Moran, (2002), ‘Measuring problem video game playing in
adolescents’, Addiction, 97, 1601–1606.
[94] A.F. Seay, & R. E. Kraut, (2007), ‘Project massive: Self-regulation and problematic use of online
gaming’, In CHI 2007: Proceedings of the ACM conference on human factors in computing systems,
(pp. 829–838). New York: ACM Press.
[95] A.Johansson, & K. G. Gotestam, (2004), ‘Problems with computer games without monetary reward:
Similarity to pathological gambling’, Psychological Reports, 95, 641–650.
[96] G.A. Keepers, ‘Pathological preoccupation with video games’, Journal of the American Academy of
Child & Adolescent Psychiatry, 29, 49–50, (1990).
[97] M.Griffiths, (2005), ‘A ‘‘components’’ model of addiction within a biopsychosocial framework’,
Journal of Substance Use, vol. 10, pp. 191–197.
[98] J.Mendelson, & N. Mello, (1986), The addictive personality, New York: Chelsea House.
[99] Knime.com [Internet].Knime; [cited 2018 Nov 11]. Available from: http://www.knime.com/
[100]M. El-Masri, N. Altrabsheh, H. Mansour, A. Ramsay, (November 2017), ‘A web-based tool for
Arabic sentiment analysis’, 3rd International Conference on Arabic Computational Linguistics
ACLing 2017, 5–6, Dubai, United Arab Emirates.
[101]PorterStemmer [Internet]. PorterStemmer; [cited 2018 Nov 11]. Available from:
https://tartarus.org/martin/PorterStemmer/
[102]J.Ramos, (2003),‘Using tf-idf to determine word relevance in document queries’, In First International
Conference on Machine Learning, New Brunswick:NJ, USA, Rutgers University.
[103] Y. Zhang, R. Jin, & Z.-H. Zhou, (December 2010), 'Understanding bag-of-words model: A statistical
framework', International Journal of Machine Learning and Cybernetics, Volume 1, Issue 1–4, pp. 43–
52.
[104]M.Araujo, P.Goncalves, M. Cha, F.Benevenuto, (2014),‘I feel: a system that compares and combines
sentiment analysis methods’, in: Proceedings of the 23rd International Conference on World Wide
Web, ACM. pp. 75–78
[105]N.Altrabsheh, M.Cocea, S.Fallahkhair, (2015), ‘Predicting students emotions using machine learning
techniques’, in: The 17th International Conference on Artificial Intelligence in Education.
[106]A.Go, R. Bhayani, L.Huang, (2009),Twitter sentiment classification using distant supervision,
CS224N Project Report, Stanford.
[107]R.Duwairi, M. El-Orfali, (2014),‘A study of the effects of preprocessing strategies on sentiment
analysis for arabic text’,Journal of Information Science, 40, 501–513.
[108]B Pang, L. Lee, (2004), ‘A sentimental education: Sentiment analysis using subjectivity
summarization based on minimum cuts’, Annual Meeting on Association for Computational
Linguistics, 42, 271–278.
[109]D.Hand, (2012), ‘Assessing the Performance of Classification Methods’,International Statistical
Review, 80, 3, 400–414.
[110]L.Mostafa,M. Farouk, and M. Fakhry, (2009),‘An automated approach for webpage classification’,
ICCTA09 Proceedings of 19th International conference on computer theory and applications,Vol. 5,
Issue 10,pp. 10-15.
[111]Reddit [Internet]. Reddit; [cited 2018 Nov 12]. Available from: https://www.reddit.com/
[112] D. Lewis and J. Catlett, (10–13 July 1994), 'Heterogeneous Uncertainty Sampling for Supervised Learning', Proceedings of the Eleventh International Conference on Machine Learning, Rutgers University, New Brunswick, NJ.
[113] H.B. Lewis, (1971), Shame and guilt in neurosis, New York, NY: International Universities Press.
[114] J.P. Tangney, (1991), 'Moral affect: The good, the bad, and the ugly', Journal of Personality and Social Psychology, 61, 598–607. doi:10.1037/0022-3514.61.4.598
[115] S.T. Wolf, T.R. Cohen, A.T. Panter, & C.A. Insko, (2010), 'Shame proneness and guilt proneness: Toward the further understanding of reactions to public and private transgressions', Self and Identity, 9, 337–362. doi:10.1080/15298860903106843
[116] S. Roos, E.V.E. Hodges, & C. Salmivalli, (2014), 'Do guilt- and shame-proneness differentially predict prosocial, aggressive, and withdrawn behaviors during early adolescence?', Developmental Psychology, 50, 941–946. doi:10.1037/a0033904
[117] F. Calefato, F. Lanubile, F. Maiorano, N. Novielli, (2018), 'Sentiment Polarity Detection for Software Development', Empirical Software Engineering, 23, 1352.
[118] B. Rajat, S. Gaonkar, (2017), 'Sentiment Analysis of Product Reviews using Hadoop', IJSRD – International Journal for Scientific Research & Development, Vol. 4, Issue 12.
[119] X. Lei, X. Qian, (2015), 'Rating Prediction via Exploring Service Reputation', IEEE 17th International Workshop on Multimedia Signal Processing.
[120] D. Das, S. Sharma, S. Natani, N. Khare, and B. Singh, (2017), 'Sentimental Analysis for Airline Twitter data', IOP Conf. Series: Materials Science and Engineering, 263, 042067.
[121] W. Mariel, S. Mariyah, and S. Pramana, (2018), 'Sentiment analysis: a comparison of deep learning neural network algorithm with SVM and naïve Bayes for Indonesian text', IOP Conf. Series: Journal of Physics: Conf. Series, 971, 012049.
[122] A. Collomb, C. Costea, D. Joyeux, O. Hasan, & L. Brunie, (2014), 'A study and comparison of sentiment analysis methods for reputation evaluation', Rapport de recherche RR-LIRIS-2014-002.
INVESTIGATING GAME DEVELOPERS’ GUILT EMOTIONS USING SENTIMENT ANALYSIS

  • 1. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 DOI:10.5121/ijsea.2018.9604 41 INVESTIGATING GAME DEVELOPERS’ GUILT EMOTIONS USING SENTIMENT ANALYSIS Lamiaa Mostafa and MarwaAbd Elghany Department of Business Information Systems, College of Management and Technology, Arab Academy for Science & Technology, Alexandria, Egypt ABSTRACT Game Development is one of the most important emerging fields in software engineering era. Game addiction is the nowadays disease which is combined with playing computer and videogames. Shame is a negative feeling about self evaluationas well as guilt that is considered as a negative evaluation of the transgressing behaviour, both are associated withadaptive and concealing responses. Sentiment analysis demonstrates a huge progression towards the understanding of web users’ opinions. In this paper, the sentiments of game developers are examined to measure their guilt’s emotions when working in this career. The sentiment analysis model is implementedthrough the following steps: sentiment collector, sentiment pre-processing, and then machine learning methods were used. The model classifies sentiments into guilt or no guilt and is trained with 1000 Reddit website sentiment. Results have shown that Support Vector Machine (SVM) approach is more accurate incomparison to Naïve Bayes (NV) and Decision Tree. KEYWORDS Ethics,Game Addiction, Guilt Emotions, Software Engineering, Sentiment Analysis Model& Value Sensitive Design 1. INTRODUCTION Ethics is the study of value concepts such as ‘bad,’ ‘good,’ ‘ought,’ ‘right,’ ‘wrong,’ put on into actions linked to group standards and norms. Hence, it is contended with many issues essential to practical decision-making (Veatch, 1977). Computer software systems stay at the centre of modern decision making, counting data/information storage and manipulation, data availability, and ‘alternatives’choice and formulation (Schmoldt and Rauscher, 1994). The ubiquity of software systems in every aspect requires that the created environment should be critically investigated as they are developed and deployed as well. A range of artificial intelligence (AI) approaches has been proposed to represent different codes of ethics in environmental decision systems (Thomson, 1997). Artificial Intelligence (AI) empowers modern information technologies and the advent of machines and have already intensely impacted the world of work in the 21st century. Computers, algorithms and software actually streamline everyday tasks, and it is difficult to conceive how tasks could be accomplished without their use. The name behind the idea of AI is John McCarthy, who started his research in 1955 and suggested that each aspect of learning and any other domain of intelligence can specifically be defined in details then simulated by a machine. Therefore, the term ‘artificial intelligence’ entails the investigation of intelligent problem-solving behaviour to generate intelligent computer systems.There are two kinds of artificial intelligence: weak artificial intelligence; where intelligence is stimulated by the computer for examining cognitive processes; strong artificial intelligence;where computers are able to perform intellectual, self-learning
  • 2. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 42 processes by the right software/programming for the optimisation of their behaviour based upon experience.This involves automatic networking with other machines leading to a dramatic scaling effect (Wiss kirchen et al. 2017). The term “ethics” tackles any intentional action affecting others’ values and lives either positively or negatively. Software engineering is mainly perceived as a technical discipline for the development of software. The industry personnel who contribute in professional software developmentare primarily called as information systems analysts and the key concern of the software engineering activity is focused on the technical appropriateness of the developed products. Reed (2000) demonstrated that roughly more than one billion people around the world rely on software systems for the conduction of their daily lives, which has made many in the computing field to consider the nontechnical aspects and the ethical effect of their daily decisions and the values enclosed therein. The relation between ethics and computers can be clearly viewed when computers are affected by humans’ decisions, and then people’s lives are impacted by those decisions. Accordingly, human values are connected to technical decisions in this manner, which are made by computing professionals including the individuals, teams, or the management during the design, development, construction and maintenance of computing artefacts influencing other people. For example, the airbag design release William pactothers’ lives and should be directed by an ethical decision about human values. This paper begins with an overview of the perception of ethics in computer software systems and its associated activities in addition to its related technical decisions made by professionals.Then focuses on the game addiction and the accompanied guilt emotions that the developers can feel about it through the collection of Reddit responses subsequently, analysing these responses using the sentiment analysis model to classify whether they are guilty or not. 2. BACKGROUND 2.1. SOFTWARE ENGINEERING ETHICS The role of ethics in software system design has recently fostered in significance.Mason (1986) demonstrates an early viewpoint on ethical issues in the information age,classifying them into accessibility, accuracy, privacy, and property concerns.Later on, more views added concerns about knowledge use in organizations (Bella,1992) and its affecting impact on life quality (Forester and Morrison, 1994). The issue is not about malicious be haviour,such as computer crime, deliberate invasions of privacy, hacking, software theft, and viruses, rather than the subtler of the impact that the software development and deployment can have on people and their cultural, corporate andother institutions. Previous approaches to system development were concentrating towards technical aspects, while current approaches give emphasis to the consideration of all aspects of the human environment in which the systems are being developed and used. Software engineering ethics can be overviewed from three approaches:first, it canbe regarded as the activity of software engineers who are making practical choices that have an effect on other people in substantial ways. 
Second, it can act as a description of a collection of principles,guidelines, or ethical imperatives giving guidance or legislative action.And third, it can be named as a discipline, which investigates the relationship between the other two senses of ethics.Accordingly, software engineering ethics is considered as mutually principles’ set and activity as well (Gotterbarn et al., 1997).
  • 3. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 43 The ethical activity embedded in technical decisions should be dependent on an understanding of the effect of those decisions. The technical decisions should be consciously steered by values.Professions designate the values in Codes of Ethics, Codes of Conduct and Codes of Practice, where these codes help in many ways embracing guidance for the ethical selections of the practitioner and enlightening the public and the practitioners about the profession’s ethical standard. A good software engineering decision entails an awareness of both the technical and the ethical dimensions.Codes of ethics are statements about the interconnectivity between values and technology to ensure that the activities of practicing professionals care for human values rather than do any harm to them.The discipline of software engineering ethics examines the relationship between technology and the ethical principles; how it influences society, citizens, and products (Gotterbarn, 2002). In the nineties, a number of organizations like the Association for Computing Machinery (ACM), the Australian Computer Society, the British Computer Society (BCS), the IEEE Computer Society (IEEE-CS), and the New Zealand Computer Society have reconsidered their ethics’codes to state explicitly an ethical obligation to the public (Gotterbarn a, 1996). Especially, the IEEE Computer Society joined by the ACM started to professionalize software engineering and set initial goals in the following: 1. Adoption of Standard Definitions 2. Definition of Required Body of Knowledge and Recommended Practices 3. Definition of Ethical Standards 4. Define Educational Curricula (Gotterbarn b, 1996). However, the above goals are coherent with the goals of any other professions because the key element in any profession stands in its recognition of ethical and moral responsibilities to its clients, to society, and to the profession itself. Nevertheless, many professions such as medicine for example declare these responsibilities publicly in a code of ethics then afterwardsnecessitatetraining in professional ethics for entering the profession. The responsibilities formulated in the Software Engineering Code of Ethics and Professional Conduct (SECODE 1999) were briefed in:Software engineers shall be committed to these Eight Principles: 1 PUBLIC - act steadily with the public interest. 2 CLIENT AND EMPLOYER - perform in a way, which is in the best interests of their client or employer. 3 PRODUCT -make sure that their products and related modifications fulfil the highest professional standards possible. 4 JUDGMENT - keep independence and integrity in their professional judgment. 5 MANAGEMENT - endorse an ethical approach to the management of software development and maintenance. 6 PROFESSION - align the reputation and the integrity of the profession in consistence with the public interest. 7 COLLEAGUES - being fair and supportive to their colleagues. 8 SELF - contribute in lifelong learning in regards to the practice of their profession.
  • 4. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 44 The Software Engineering Institute recognised software engineering as an engineering form applying the principles of computer science and mathematics for the achievement of cost effective solutions to software practical problems in the mankind service. The perception of cost effectiveness denotes the resources investment for good value satisfying the required quality and minimised time (Ford, 1990).Even though, software engineering does not meet all of the above mentioned inscriptions of a profession, they are still important to understand how ethics is linked to software engineering. Training and specialised skills are spent in the developments of products that have a direct impact on many lives for their usage in society, in other words affecting the public's well-being. Consequently, software engineering ethics can be identified as a professional ethics(Pour, 2000). Like medical professionals, software engineers should be aware that knowledge is commonly inaccessible to people outside the profession and have special duties and responsibilities to their clients like lawyers, which can be inconsistent with duties and responsibilities towards the society at large. Then software engineering ethics shares the common base with other professional ethics for health, safety, and welfare concern. How ever,it is uniquely bounded to the technical details of computing (SE Code, 1999). The context to which moral rules are applied differentiates between various professional ethics including legal, medical and engineering ethics. For the reason that different contexts raise different ethical concerns, even the order in which they are practised changes for each application domain. Within the software engineering process, moral rules are assigned different importance at different phases of the software development life cycle. An understanding agreement (informative consent) creates a rule during the requirements phase of life-critical software. The honesty and transparency principle is another rule during testing. Yet the tort law should be confused with professional ethics in the sense of individual responsibility. If a developed product does not encounter the ideological technical goals of the profession there is no need for legal action till some damage has happened. There exists three main requirements for a tort law suit: (1) the duty is being indebted; (2) the duty is penetrated; and (3) caused damages happened by the duty penetration, one could be unethical and still the law was not violated(West 1991).In the design phase, other moral rules may get exercised first. For example; in case of a journal file design for a library check-out system to ascertain the status of books and the number of additional copies to be ordered, if required. The association of the patron's name with the book checked out has a potential for the violation of several moral rules, such as the violation of privacy and the deprivation of pleasure since one does not feel free to read what the other one wants, and perhaps causing psychological pain. Moreover, the language selection for a life-critical system can have moral implications because if the language is difficult for debugging, writing, or modification, then people are put at risk and principles of good system design in that language selection are violated. 
The ethical issues that can be drawn here;software developers were morally culpable to take a project beyond their skill of language proficiency, despite their awareness of the major difference in languages, they chose to stick to the more dangerous language for other purposes such as profit, then they have intentionally violated professional ethics. Technical responsibility encompasses moral commitments to standards of software production, to the software development process whereas societal responsibility implicates commitment to the usage of these specialised skills in the society service.Software development is considered as a social process and the software engineer has a two-fold compulsion for excellence striving. One compulsion depends on the fulfilment of technical standards and the other one is about social responsibility to those end-users who will work with the developed product. These ethics’ approaches provide positive guidance for software practitioners’ behaviours (Gotterbarn, 1999). Without an ethical approach, systems might be put into operational use regardless of persistent
  • 5. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 45 errors and faulty trials that is why it is not surprising that software developers hardly afford their clients or purchasers with warranties of any substance (Forester and Morrison, 1994). On the other hand, it is up to the end-user in most cases to provide critical feedback and the availability of software websites and e-mail contact addresses offer avenues for end-user responses to unethical practices. A number of books have been printed on computer ethics (also denoted to as machine morality). A hypothetical theoretical scenario is delineated in the book of "Moral Machines" where an artificially high oil price is contributed to "unethical" robotic trading systems causing the automated program to regulate the energy output switches over from oil to more polluting coal power plants in order to prevent the rise in electricity prices. At the same time, coal-fired power plants cannot withstand flowing at full production long and blasts after some time leading to massive power outage with the concerns it has for life and health. Nevertheless, power outages produce terror alarms at the nearest international airport resultant in chaos both at the airport and arriving jets colliding etc. The outcome is loss in lives and huge economic costs, merely for the reason of the interaction between independently programmed systems that automatically make decisions (Wallach et al., 2009). The scenario illustrated above shows that it is significant to have control mechanisms when a multisystem, which are making their own decisions interact to set limitations, and give notification to the operators about the condition deemed to call for human review. Therefore, it becomes important that moral based decision-making becomes a part of the artificial intelligence systems. The ethical implications of the possible actions that systems can take must be evaluated and this can be done on several levels, including if private laws are broken or not. However, it is not easy to construct machines that are incorporating all the world's philosophical and religious traditions. Furthermore, a very effective driver support system can be developed to reduce the number of accidents and save numerous lives yet, it is not desirable to be responsible for building a system where there is a real risk for severe adverse events in case of failure (Tørresen, 2014). In the same book detailed review of how artificial moral agents can be applied is provided. This embraces the use of ethical expertise in program development. It suggests three approaches: logic and mathematical formulated ethical reasoning, machine learning methods depending on examples of ethical and unethical behaviour and simulation where it is viewed by different ethical strategies. For example; in case of a loan application to a bank, an AI-based system can be used for credit evaluation based on a number of criteria. If the application is rejected, the reason behind refusal should be clarified. Skin color or race can be considered one of the parameters included rather than the applicant’s financial situation, still the bank can not state it explicitly. A more open system for inspection can however, show that the residence address has been crucial in this case. It has given the same result that the selected criterion has been provided. 
This is essential to avoid as far as possible through the simulation of AI systems’ behaviour of AI systems for the detection of possible ethically undesirable actions.All software that will substitute human evaluation and social function should follow criteria such as accountability, inspect ability, manipulation robustness and predictability. Also, all developers should have an inherent desire to develop products delivering the best possible user experience and user safety. The capability for manipulation must be restricted, and the system must have a predictable behaviour. It is definitely easier for a robot to move in a known and limited environment rather than in new and unfamiliar surroundings (Wallach et al., 2009). The Italian robot scientist Gian Marco Viareggio introduced the term roboethicsin 2002. He realised a need for guidelines for the development of robots intended to help progressing of the human society and individuals and prevent abuse against humanity. Accordingly, ethics for robot designers, manufacturers and users are needed. Robots are mechanical systems that might
  • 6. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 46 Unintentionally cause harm to users. There exists a danger that the collected information can be accessed by unauthorized people and be made available to others through the Internet. Robots can be misused for criminal activities such as burglary. Although it is presumed that the developers for robots and AI have good intentions. The intelligent systems must be designed so that the robots are friendly and kind, while difficult to abuse for potential malicious actions in the future. In 2004 the first international symposium on robotics was held in Sanremo, Italy. The research program ETHICBOTS has been funded by EU where a multidisciplinary team of researchers participated to identify and analyse techno-ethical challenges in the integration of human and artificial entities. European Robotics Research Network (Euronet) in 2005 provided fund to the project Euronet Roboethics Atelier, with the goal of the development of the first roadmap for roboethics for a systematic assessment of the ethical issues surrounding robot design. The focus was on human ethics for designers, manufacturers and users of robots.The work ended up with some recommendations that should satisfied in commercial robots such as (Tørresen, 2014): • Traceability as in aircraft, robots should have a "black box" to record and document their own behaviour; • Identifiability, similar to cars, robots should have serial numbers and registration number. • Privacy and policy,software and hardware should be used for encryption and password protection of sensitive data that the robot should save. The machines should be able to make ethical decisions using ethical frameworks. Ethical Issues are too interdisciplinary that programmers alone should discover them. Researchers in ethics and philosophy should also be involved in the formulation of ethical "conscious" machines that are targeted at providing acceptable machine behaviour. Michael And Susan Leigh Anderson have collected contributions from both philosophers and AI Researchers in the book "Machine Ethics" (2011). The Book discusses why and how to include an ethical dimension in machines that will act autonomously. Medical Important information must be reported on, but at the same time, the person must be able to maintain privacy. Maybe video surveillance is desirable for the user (by relatives or others), but it should be clear to the user when and how it happens. An autonomous robot must also be able to adapt to the user's "chemistry" to have a good dialogue. Value Sensitive Design emphasizes values with ethical import. A list of frequently human values that might be implicated in systems design is presented in table1 (adapted from Friedman et al., 2012) to act as a heuristic for values proposition that should be considered. Table 1. 
Human Values (with Ethical Import); Source: (Friedman et al., 2012) Human Value Definition Research Human Welfare Refers to people’s physical, material, and psychological well- being Leveson [1991]; Friedman, Kahn, &Hagman [2003]; Neumann [1995]; Turiel [1983, 1998] Ownership and Property Refers to a right to possess an object (or information), use it, manage it, derive income from it, and bequeath it Friedman [1997]; Herskovits [1952]; Lipinski &Britz [2000] Privacy Refers to a claim, an entitlement, or a right of an individual to determine what information about himself or herself can be communicated to others Agre and Rotenberg [1998]; Bellotti [1998]; Boyle, Edwards, & Greenberg [2000]; Friedman [1997]; Fuchs [1999]; Jancke, Venolia, Grudin, Cadiz, and
  • 7. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 47 Gupta [2001]; Palen&Dourish [2003]; Nissenbaum [1998]; Phillips [1998]; Schoeman [1984]; Svensson, Hook, Laaksolahti, &Waern [2001] Freedom From Bias Refers to systematic unfairness perpetrated on individuals or groups, including pre-existing social bias, technical bias, and emergent social bias Friedman &Nissenbaum [1996]; cf. Nass& Gong [2000]; Reeves &Nass [1996] Universal Usability Refers to making all people successful users of information technology Aberg&Shahmehri [2001]; Shneiderman [1999, 2000]; Cooper &Rejmer [2001]; Jacko, Dixon, Rosa, Scott, & Pappas [1999]; Stephanidis [2001] Trust Refers to expectations that exist between people who can experience good will, extend good will toward others, feel vulnerable, and experience betrayal Baier [1986]; Camp [2000];Dieberger, Hook, Svensson, &Lonnqvist [2001]; Egger [2000]; Fogg& Tseng [1999]; Friedman, Kahn, & Howe [2000]; Kahn &Turiel [1988]; Mayer, Davis, &Schoorman [1995]; Olson & Olson [2000]; Nissenbaum [2001]; Rocco [1998] Autonomy Refers to people’s ability to decide, plan, and act in ways that they believe will help them to achieve their goals Friedman &Nissenbaum [1997]; Hill [1991]; Isaacs, Tang, & Morris [1996]; Suchman [1994]; Winograd [1994] Informed Consent Refers to garnering people’sagreement, encompassing criteria of disclosure and comprehension (for ‘informed’) and voluntariness, competence, and agreement (for ‘consent’) Faden& Beauchamp [1986]; Friedman, Millett, &Felten [2000]; The Belmont Report [1978] Accountability Refers to the properties that ensures that the actions of a person, people, or institution may be traced uniquely to the person, people, or institution Friedman & Kahn [1992]; Friedman & Millet [1995]; Reeves &Nass [1996] Courtesy Refers to treating people withpoliteness and consideration Bennett &Delatree [1978]; Wynne & Ryan [1993] Identity Refers to people’s understanding ofwho they are over time, embracing both continuity and discontinuity over time Bers, Gonzalo-Heydrich, &DeMaso [2001]; Rosenberg [1997]; Schiano& White [1998]; Turkle [1996] Calmness Refers to a peaceful and composed psychological state Friedman & Kahn [2003]; Weiser & Brown [1997] Environmental Sustainability Refers to sustaining ecosystems such that they meet the needs of the present without compromising future generations United Nations [1992]; World Commission on Environment and Development [1987]; Hart [1999]; Moldan, Billharz, &Matravers [1997]; Northwest Environment Watch [2002]
  • 8. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 48 After the illustration of the software engineering ethical activities and technical responsibilities, the following sections will discuss game addiction and the use of guilt model to question via sentiment analysis the guilt emotions of the software developers. 2.2. GAME ADDICTION Game addiction is considered presently the utmost deliberated psychosocial aspects accompanied with playing computer and videogames. Lately, the American Medical Association (2007) intensely stimulated the American Psychiatric Association (APA) to include the ‘‘video game addiction’’ as a formal diagnostic disorder in the updated revision of the Diagnostic and Statistical Manual of Mental Disorders (DSM-V, 2012). Adolescents are more exposed to game addiction than adults (Griffiths & Wood, 2000)because (a)they commonly play computer and videogames more often than adults(e.g., Griffiths, Davies, &Chappel, 2004). Moreover, there is substantial disagreement among researchers about the concept of ‘game addiction’. Game addiction is the widespread dominant term among researchers to designate compulsive,excessive, obsessive, and mostly problematic use of videogames (e.g., Charlton &Danforth, 2007; Chiu, Lee, & Huang, 2004;Chou & Ting, 2003; Fisher, 1994; Griffiths & Davies, 2005; Grüsser, Thale mann, & Griffiths, 2007; Hauge& Gentile, 2003; Ko, Yen, Chen, Chen, & Yen, 2005; Ng &Wiemer-Hastings, 2005; Soper& Miller, 1983; Wan &Chiou, 2006). Other terms for the description of excessive or problematic gaming involve ‘‘videogame dependence’’ (Griffiths & Hunt, 1995, 1998), ‘‘problematic game playing’’ (Salguero& Moran, 2002; Seay& Kraut, 2007), and ‘‘pathological gaming’’ (Johansson &Gotestam, 2004; Keepers, 1990). Regard lessof the used terminology, researchers agree that computer and videogame overuse can cause a behavioural addiction (Griffiths, 2005). Addictive behaviour refers to behaviour that is compulsive, excessive, psychologically or physically destructive, and uncontrollable (Mendel son& Mello, 1986). 2.3. GUILT EMOTIONS According to Lewis (1971), shame involves a negative evaluation of the self and guilt involves a negative evaluation of a specific behaviour or action. Tangney (1991) has concluded that shame and guilt are associated with different feelings such as: cognitions, public avoidance and aggressive responses. Different researchers examined guilt and shame feelings in situations, which defined as public–private distinctions (Wolf et al., 2010). Roos et al. (2014) have concluded that guilt and shame could influence an individual’s social behaviours. Therefore, it would be very beneficial to explore the feeling of guilty that developers can feel towards game development and its negative damaging effect of addiction harming personal lives. 3. SENTIMENT ANALYSIS Marketing managers comprehend the importance of customer sentiments in different services or products through understanding the sentiment of internet users. Additionally, universities can take advantage of sentiment analysis to recognise the opinion of students. Vocabulary based approach can be used with conjunction of sentiment analysis; this is exercised for semantic annotation and word suggestion. Machine learning strategies can classify sentiments into either positive or negative. Sentiment analysis tools have been used on non-technical domains including social media and marketing. 
Rajat and Gaonkar (2017) have performed Sentiment Analysis on customers in order
  • 9. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 49 to develop a rating forecasting tool, Lei and Xueming (2015) have measured the reputation of the company based upon sentiment analysis of online customer. Das et al. (2017) conducted sentiment analysis on airline industries; they classified sentiment of Twitter using Naïve Bayes NB machine learning algorithm, and Rapid Miner RM. Furthermore, Support Vector Machine SVM is one of the best performing machine learning technique nevertheless, NB can perform better in a few cases. Collomb et al. (2014) divided machine learning approach into the following four types. Machine learning, which uses a learning algorithm via training data classification .Lexicon-based approach involves calculation, using the semantic orientation of words or sentences in a document.Rule- based approach classifies words in relevance to the number of positive and negative words. And Senti4SD that is a classifier trained to support sentiment analysis in developers’ communication channels (Calefato, et al., 2018). Senti4SD uses lexicon, keyword-based and semantic features. In the proceeding sections, the modelling of sentiment analysis is exemplified and its three phased stages are explained to end up with the experimental results that has been concluded from the data collection. 3.1. MODELLING SENTIMENT ANALYSIS Sentiment analysis guilt models composed of three stages: sentiment collector, processing and classification as illustrated in Figure 1 below. Sentiments are collected from reddit.com. which is an online forum for game developers, then a java tool is used to collect all sentiments while removing html tags. Sentient collector has clustered 1000 sentiments and classified these sentiments into two groups: guilt and no guilt. After collecting the sentiment, a processing step is designed using Knime, which is a data analytical and integration platform that includes components for machine learning, data mining and web mining (Knime, 2018). Processing includes: punctuation eraser, case converter, stop word removal and stemming. And the third step is the machine learning technique, where sentiment analysis guilt model will use SVM, NV, and decision tree. 3.1.1 PRE-PROCESSING STEP Pre-processing is the step of cleaning data and removing unwanted words. Pre-processing can affect the accuracy of the sentiment analysis model (El-Masri, et al., 2017). Many pre-processing techniques can be used on data such as: punctuation removal, remove stop words, repeated letters and numbers and covert text to lower or upper case, stop word removal and stemming. Stop word removal is the process of omitting connecting words like: and, the , to and so on, while stemming is the process of returning a word to its basic form which is called stem. Porterstemmeris used in the sentiment model, which is one of the known stemmers in text processing (Porter Stemmer, 2018).
  • 10. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 50 Figure 1.Sentiment Analysis Model 3.1.2. FEATURES SELECTION STEP Features are inputs of the classifiers. They allow a more detailed analysis of the raw data. N- grams are one of the most commonly used features (Arauj, et al., 2014). Bag of words (BOW) is a representation method for object classification. BOW idea" is to quantize each extracted key point into one of visual words, and then represent each image by a histogram of the visual words" (Zhang et al., 2010). BOW is defined as a conjunction method with classification techniques. Term Frequency Inverse Document Frequency (TF-IDF) is used to define what words in a document is best used in a query. TF-IDF "calculates values for each word in a document through an inverse proportion of the frequency of the word in a particular document to the percentage of documents the word appears in" (Ramos, 2003). High TF-IDF numbers defines a strong relationship with the document they appear in. Both BOW and TFIDF are used in Sentiment analysis guilt model for selection of features. 3.1.3 MACHINE LEARNING STEP Different machine learning approaches are used in sentiment analysis such as Naive Bayes (Go, et al., 2009; Pang and Lee, 2004) and Support Vector Machines (Duwairi and El-Orfali, 2014). Using the classifiers depends on the domain it is used in for example in education SVM performed better than other classifiers (Altrabsheh et al., 2015). In the Sentiment analysis guilt model classifiers used are: SVM, Naive Bayes and decision tree. For the validation of classifiers, accuracy, precision and recall have been used. Classification accuracy criteria focus"on how well the classifier assigns objects to their correct classes" (Hand, 2012). The precision is the correct predictions percentage and the recall is the instances percent that were predicted as positive (Mostafa, 2009). 4. EXPERIMENTAL RESULTS Sentiment analysis depends on the training data, the data was collected from Reddit.com forum (Reddit, 2018). Game developers use Reddit forum for discussing any problem or topics related to game development. Thousands sentiments were collected;five hundredslabeled guilt emotion and another five hundredslabeled no guilt emotion. Sentiments were collected using a java application that gets the text from the webpage, while removing the html tags. Classifiers were then trained and tested in the thousands sentiments with the percent 80% for training and 20% for testing as referred to Lewis and Catlett(1994).
  • 11. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 51 Sentiment analysis guilt model tests the classifiers based on the 1000 sentiment and labelled these sentiments into either guilt or not guilt. Naive Bayes, SVM and decision tree were used for the classification of the sentiments. Sentiments collectors collect the sentiments and different processing techniques were used to remove the noise from sentiment sentences. Table 2 shows the results of the sentiment model. SVM was the highest accuracy level 77 however, this is contradicting with Altrabsheh, et al. (2015) in which it not an educational field. Table 2. Sentiment Analysis Guilt Model Results 5. CONCLUSIONS AND LIMITATIONS Digital addiction affects the social life of a person, it is so important to understand the emotions of the game developers who provide products that affect other person's life. In accordance to previous studies, the guilt model was not tested on game developers. Sentiment analysis guilt model can be used to test the guilt motion on the sentiments of game developers and can help in solving guilt consequences emotions for game developers for the reason that guilt emotions can develop to depression level in some cases. The applied sentiment analysis guilt model in this research paper has the following limitations. Reddit forum is the only one used to extract sentiments. Only three machine learning classifiers were used: SVM, NV and decision tree nonetheless, other classifiers can be used such as Nearest Neighbour. In addition, the data collection sentiment set is limited to 1000 sentiments. Feature selection method was dependent on TF-IDF and BOW, further features selection can be expanded to hold other semantic relations from Word Net tool as an example. Finally yet importantly, sentiment analysis guilt model can be applied on different people category in other working fields likethe educational field for instance. REFERENCES [1] R.M.Veatch, (1977), Case Studies in Medical Ethics, Harvard University Press, Cambridge. [2] D.L.Schmoldt, H.M. Rauscher, (1994), ‘Knowledge management and six supporting technologies’, Comput. Electron.Agric., 10 (1), 11–30. [3] A.J.Thomson, (1997), ‘Artificial Intelligence and environmental ethics’, Al Appl., 11 (1), 69–73. [4] GerlindWisskirchen et al., (April 2017) ‘Artificial Intelligence and Robotics and Their Impact on the Workplace’, IBA Global Employment Institute. [5] K.Reed, (July/August 2000), “Software Engineering- A New Millennium?,” IEEE Software, pp.107, in D.W. Gotterbarn (a), (1996), Software Engineering: The New Professionalism, The Responsible Software Engineer, ed. Colin Meyer, Springer Verlag. [6] R.O.Mason, (1986), ‘Four ethical issues of the information age’, MIS Q., vol. 10 (1), p. 5–12. [7] D.A.Bella, (1992), ‘Ethics and the credibility of applied science’, in: G.H. Reeves, D.L. Bottom, M.H. Brookes, M.H. (Technical coordinators), Ethical Questions for Resource Managers, USDA Forest Service General Technical Report PNW-GTR-288, pp. 19–32. [8] T.Forester, P. Morrison, (1994), Computer Ethics, MIT Press, Cambridge,. [9] The joint ACM/IEEE-CS Software Engineering Code was published as: Don Gotter barn, Keith Miller, and Simon Rogerson,(November 1997), ‘Software Engineering Code of Ethics’, Communication of the ACM , Vol. 40, Issue 11, pp. 110-118, DOI: 10.1145/265684.265699
  • 12. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 52 [10] D.Gotterbarn, (2002), SOFTWARE ENGINEERING ETHICS, published by Software Engineering Ethics Research Institute forEncyclopedia of Software Engineering. Available at: https://www.uio.no/studier/emner/matnat/ifi/INF3700/v12/undervisningsmateriale/Software%20enge neering%20ethics.pdf [11] D.W. Gotterbarn (b),(1996), Establishing Standards of Professional Practice, chapter 3 in The Responsible Software Engineer, ed. Colin Meyer, Springer Verlag. [12] SE Code,(October 1999), Computer Society and ACM Approve Software Engineering Code of Ethics, Computer, pages 84-88. [13] G.Pour, M. Griss, and M. Lutz, (May 2000), “The Push to Make Software Engineering Respectable”,Computer, page 35-43. [14] L.B. West,(October 1991), "Professional Civil Engineering: Responsibility," Journal of Professional Issues in Engineering Education and Practice, 117, 4. [15] D.W. Gotterbarn,(November/December 1999), “How the new Software Engineering Code of Ethics Affects You", IEEE Software, 58-64. [16] Wallach, Wendell og Allen, Colin, (2009), Moral Machines: Teaching Robots Right from Wrong, New York: Oxford University Press. [17] Jim Tørresen, (2014), Future Perspectives on Artificial Intelligence (AI), University of Oslo. [18] Anderson, Michael og Susan Leigh, (2011), Machine Ethics, Cambridge University Press. [19] BATYA FRIEDMAN, PETER H. KAHN, JR., AND ALAN BORNING, (2012), Value Sensitive Design and Information Systems, University of Washington, In P. Zhang & D. Galletta (Eds.), Human-Computer Interaction in Management Information Systems: Foundations, M.E. Sharpe, Inc: NY. [20] N.G. LEVESON, (1991), ‘Software safety in embedded computer systems’,Commun. ACM, Vol. 34, Iss. 2, P. 34-46. [21] B.FRIEDMAN, P. H. JR. KAHN, AND J. HAGMAN, (2003), ‘Hardware companions?: What online AIBO discussion forums reveal about the human-robotic relationship’, Conference Proceedings of CHI 2003, 273- 280, New York, NY: ACM Press. [22] P.G. NEUMANN, (1995), Computer Related Risks, New York, NY: Association for Computing Machinery Press. [23] E.TURIEL, (1983), The Development of Social Knowledge, Cambridge, England: Cambridge University Press. [24] E.TURIEL, (1998), ‘Moral development’,In N. Eisenberg, Ed., Social, Emotional, and Personality Development, (pp. 863-932), Vol. 3 of W. Damon, Ed., Handbook of Child Psychology, 5th edition, New York, NY: Wiley. [25] B.FRIEDMAN, (1997), ‘Social judgments and technological innovation: Adolescents, understanding of property, privacy, and electronic information’, Computers in Human Behaviour, 13(3), 327-351. [26] M.J. HERSKOVITS,(1952), Economic Anthropology: A Study of Comparative Economics, New York, NY: Alfred A. Knopf. [27] T.A. LIPINSKI, and J. J. BRITZ, (2000), ‘Rethinking the ownership of information in the 21st century: Ethical implications’, Ethics and Information Technology, 2, 1, 49-71. [28] P.E. AGRE, and M. ROTENBERG, (1998), Eds., ‘Technology and Privacy: The New Landscape’, MIT Press, Cambridge, MA. [29] V.BELLOTTI, (1998), Design for privacy in multimedia computing and communications environments, In P. E. Agre and M. Rotenberg, Eds., Technology and Privacy: The New Landscape (pp. 63-98), Cambridge, MA: The MIT Press. [30] M.BOYLE, C. EDWARDS, and S. 
GREENBERG, (2000), ‘The effects of filtered video on awareness and privacy’, In Proceedings of Conference on Computer Supported Cooperative Work (CSCW 2000), 1-10, New York, NY: Association for Computing Machinery. [31] L.FUCHS, (1999), ‘AREA: A cross-application notification service for groupware’, In Proceedings of ECSCW 1999, Kluwer, Dordrechet Germany, 61-80. [32] G.JANCKE, G. D. VENOLIA, J. GRUDIN, J. J. CADIZ, and A. GUPTA, (2001), ‘Linking public spaces: Technical and social issues’, In Proceedings of CHI 2001, 530-537. [33] L.PALEN, and P. DOURISH, (2003), ‘Privacy and trust: Unpacking, privacy, for a networked world’, In Proceedings of CHI 2003, 129-136. [34] H.NISSENBAUM, (1998), ‘Protecting privacy in an information age: The problem with privacy in public’, Law and Philosophy, 17, pp. 559-596.
  • 13. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 53 [35] D.J. PHILLIPS, (1998), Cryptography, secrets, and structuring of trust, In P. E. Agre and M. Rotenberg, Eds., Technology and Privacy: The New Landscape (pp. 243-276), Cambridge, MA: The MIT Press. [36] F.D. SCHOEMAN, (1984),Philosophical Dimensions of Privacy: An Anthology, Cambridge, England: Cambridge University Press. [37] M.SVENSSON, K. HOOK, J. LAAKSOLAHTI, and A. WAERN, (2001), ‘Social navigation of food recipes’, In Proceedings of the Conference of Human Factors in Computing Systems (CHI 2001), 341-348, New York, NY: Association for Computing Machinery. [38] J. ABERG, and N. SHAHMEHRI, (2001), ‘An empirical study of human Web assistants: Implications for user support in Web information systems’, In Proceedings of the Conference on Human Factors in Computing Systems (CHI 2000), (pp. 404-411), New York, NY: Association for Computing Machinery Press. [39] B.SHNEIDERMAN, (1999), ‘Universal usability: Pushing human-computer interaction research to empower every citizen’, ISR Technical Report 99-72, University of Maryland, Institute for Systems Research, College Park, MD. [40] B.SHNEIDERMAN, (2000), ‘Universal usability’, Commun.of the ACM, 43, 5, 84-91, 2000. [41] M.COOPER, and P. REJMER, (2001), ‘Case study: Localization of an accessibility evaluation’, In Extended Abstracts of the Conference on Human Factors in Computing Systems (CHI 2001), 141- 142, New York, NY: Association for Computing Machinery Press. [42] J.A. JACKO, M. A. DIXON, R. H. JR. ROSA, I. U. SCOTT, and C. J. PAPPAS, (1999), ‘Visual profiles: A critical component of universal access’, In Proceedings of the Conference on Human Factors in Computing Systems (CHI 99), pp. 330-337, New York, NY: Association for Computing Machinery Press. [43] C.STEPHANIDIS, (2001),Ed. 2001 User Interfaces for All: Concepts, Methods, and Tools, Mahwah, NJ: Lawrence Erlbaum Associates. [44] A.BAIER, (1986), Trust and antitrust, Ethics, 231-260. [45] L.J. CAMP, (2000), Trust & Risk in Internet Commerce, MIT Press, Cambridge, MA, L. J. [46] A.DIEBERGER, K. HOOK, M. SVENSSON, and P. LONNQVIST, (2001), ‘Social navigation research agenda’, InExtended Abstracts of the Conference on Human Factors in Computing Systems (CHI 2001), pp. 107-108, NewYork, NY: Association of Computing Machinery Press. [47] F.N. EGGER, (2000) ‘Trust me, I am an online vendor.: Towards a model of trust for e-commerce systemdesign’, In Extended Abstracts of the Conference of Human Factors in Computing Systems (CHI 2000), 101-102, New York, NY: Association for Computing Machinery. [48] B.J. FOGG, and H. TSENG,(1999), ‘The elements of computer credibility’, In Proceedings of CHI 1999, ACM Press, 80-87. [49] B.FRIEDMAN, P. H. JR. KAHN, and D. C. HOWE, (2000), ‘Trust online’, Commun.ACM, 43, 12, 34-40. [50] P.H. JR. KAHN, and E. TURIEL, (1988), ‘Children’s conceptions of trust in the context of social expectations’, Merrill-Palmer Quarterly, 34, 403-419. [51] R.C. MAYER, J. H. DAVIS, AND F. D. SCHOORMAN,(1995), ‘An integrative model of organizational trust’, TheAcademy of Management Review, vol. 20, issue 3, pp. 709-734. [52] J.S. OLSON, and G. M. OLSON, (2000), ‘i2i trust in e-commerce’, Communications of the ACM, 43(12), 41-44. [53] H.NISSENBAUM, (2001), ‘Securing trust online: Wisdom or oxymoron’, Boston University Law Review, 81(3), 635-664. 
[54] E.ROCCO, (1998), ‘Trust breaks down in electronic contexts but can be repaired by some initial face- to-face contact’, In Proceedings of CHI 1998, ACM Press, 496-502. [55] B.FRIEDMAN, and H. NISSENBAUM, (1997), ‘Software agents and user autonomy’, Proceedings of the FirstInternational Conference on Autonomous Agents, 466-469, New York, NY: Association for Computing Machinery Press. [56] T.E. JR. HILL, (1991), Autonomy and self-respect, Cambridge: Cambridge University Press. [57] E.A. ISAACS, J. C. TANG, and T. MORRIS, (1996), ‘Piazza: A desktop environment supporting impromptu andplanned interactions’, In Proceedings of the Conference on Computer Supported Cooperative Work (CSCW 96), pp. 315-324, New York, NY: Association for Computing Machinery Press.
  • 14. International Journal of Software Engineering & Applications (IJSEA), Vol.9, No.6, November 2018 54 [58] L.SUCHMAN, (1994), ‘Do categories have politics? The language/action perspective reconsidered’, CSCW Journal, vol. 2, no. 3, pp. 177-190. [59] T.WINOGRAD, (1994), ‘Categories, disciplines, and social coordination’, CSCW Journal, vol. 2, no. 3, pp. 191-197. [60] R.FADEN, and T. BEAUCHAMP, (1986), A History and Theory of Informed Consent, New York, NY: OxfordUniversity Press. [61] B.FRIEDMAN, L. MILLETT, and E. FELTEN, (2000), ‘Informed Consent Online: A Conceptual Model and DesignPrinciples’, University of Washington Computer Science & Engineering Technical Report 00-12-2. [62] The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research, The National Commission for the Protection of Human Subjects of Biomedical and Behavioural Research, 1978. [63] B.FRIEDMAN, and P. H. JR. KAHN, (1992), ‘Human agency and responsible computing: Implications for computer system design’, Journal of Systems Software, 17, 7-14. [64] B.FRIEDMAN, and L. MILLETT, (1995), ‘It’s the computer’s fault: Reasoning about computers as moral agents’, In Conference Companion of the Conference on Human Factors in Computing Systems (CHI 95), (pp. 226-227), New York, NY: Association for Computing Machinery Press. [65] B.REEVES, and C. NASS, (1996), The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places’, New York, NY and Stanford, CA: Cambridge University Press and CSLI Publications. [66] W.J. BENNETT, and E. J. DELATREE, (1978), ‘Moral education in the schools’, The Public Interest, 50, pp. 81-98. [67] E.A. WYNNE, and K. RYAN, (1993), Reclaiming our schools: A handbook on teaching character, academics, and discipline, New York, Macmillan. [68] M.U. BERS, J. GONZALEZ-HEYDRICH, and D. R. DEMASO, (2001), ‘Identity construction environments: Supporting a virtual therapeutic community of pediatric patients undergoing dialysis’, In Proceedings of the Conference of Human Factors in Computing Systems (CHI 2001), 380-387, New York, NY: Association for Computing Machinery. [69] S.ROSENBERG, (1997), Multiplicity of selves, In R. D. Ashmore and L. Jussim, Eds., Self and Identity:Fundamental Issues, (pp. 23-45), New York, NY: Oxford University Press. [70] D.J. SCHIANO, and S. WHITE, (1998), ‘The first noble truth of cyberspace: People are people (even when they MOO)’, In Proceedings of the Conference of Human Factors in Computing Systems (CHI 98), 352-359, New York, NY: Association for Computing Machinery. [71] S.TURKLE, (1996), ‘Life on the Screen: Identify in the Age of the Internet’, New York, NY: Simon and Schuster. [72] B.FRIEDMAN, and P. H. JR. KAHN, Human values, ethics, and design, In J. Jacko and A. Sears, Eds., The Human-Computer Interaction Handbook, Lawrence Erlbaum Associates, Mahwah NJ, 2003. [73] M.WEISER, and J. S. BROWN, (1997), The coming age of calm technology, In P. Denning and B. Metcalfe, Eds., Beyond Calculation: The Next 50 Years of Computing, (pp. 75-85), New York, NY: Springer-Verlag. [74] UNITED NATIONS (2002), Report of the United Nations Conference on Environment and Development, held in Rio de Janeiro, Brazil, 1992. Available from:http://www.un.org/esa/sustdev/documents/agenda21/english/agenda21toc.htm [75] WORLD COMMISSION ON ENVIRONMENT AND DEVELOPMENT (Gro Harlem Brundtland, Chair), Our Common Future, Oxford University Press, Oxford, 1987. 
[76] M. Hart, (1999), Guide to Sustainable Community Indicators, second edition, North Andover, MA: Hart Environmental Data.
[77] B. Moldan, S. Billharz, and R. Matravers, (1997), Sustainability Indicators: A Report on the Project on Indicators of Sustainable Development, Chichester, England: Wiley.
[78] Northwest Environment Watch, (2002), This Place on Earth 2002: Measuring What Matters, Seattle, WA: Northwest Environment Watch.
[79] American Medical Association, (2007), AMA takes action on video games, Retrieved October 15, 2007, from http://www.ama-assn.org/ama/pub/category/17770.html
[80] M. Griffiths, and R. Wood, (2000), ‘Risk factors in adolescence: The case of gambling, videogame playing, and the internet’, Journal of Gambling Studies, vol. 16, pp. 199–225.
[81] M. D. Griffiths, M. N. O. Davies, and D. Chappell, (2004), ‘Online computer gaming: A comparison of adolescent and adult gamers’, Journal of Adolescence, vol. 27, pp. 87–96.
[82] J. P. Charlton, and I. D. W. Danforth, (2007), ‘Distinguishing addiction and high engagement in the context of online game playing’, Computers in Human Behavior, vol. 23, pp. 1531–1548.
[83] S. Chiu, J. Lee, and D. Huang, (2004), ‘Video game addiction in children and teenagers in Taiwan’, Cyber Psychology & Behaviour, vol. 7, pp. 571–581.
[84] T. Chou, and C. Ting, (2003), ‘The role of flow in cyber-game addiction’, Cyber Psychology & Behaviour, vol. 6, pp. 663–675.
[85] S. E. Fisher, (1994), ‘Identifying video game addiction in children and adolescents’, Addictive Behaviours, vol. 19, pp. 545–553.
[86] M. D. Griffiths, and M. N. O. Davies, (2005), Videogame addiction: Does it exist?, In J. Goldstein and J. Raessens, Eds., Handbook of computer game studies, pp. 359–368, Boston: MIT Press.
[87] S. M. Grüsser, C. Thalemann, and M. Griffiths, (2007), ‘Excessive computer game playing: Evidence for addiction and aggression?’, Cyber Psychology & Behaviour, vol. 10, pp. 290–292.
[88] M. R. Hauge, and D. A. Gentile, (April 2003), ‘Video game addiction among adolescents: Associations with academic performance and aggression’, Paper presented at the Society for Research in Child Development Conference, Tampa, FL.
[89] C. Ko, J. Yen, C. Chen, S. Chen, and C. Yen, (2005), ‘Gender differences and related factors affecting online gaming addiction among Taiwanese adolescents’, Journal of Nervous and Mental Disease, vol. 193, pp. 273–277.
[90] B. D. Ng, and P. Wiemer-Hastings, (2005), ‘Addiction to the internet and online gaming’, Cyber Psychology & Behaviour, vol. 8, pp. 110–113.
[91] W. B. Soper, and M. J. Miller, (1983), ‘Junk-time junkies: An emerging addiction among students’, The School Counselor, vol. 31, pp. 40–43.
[92] C. S. Wan, and W. B. Chiou, (2006), ‘Psychological motives and online games addiction: A test of flow theory and humanistic needs theory for Taiwanese adolescents’, Cyber Psychology & Behaviour, vol. 9, pp. 317–324.
[93] R. A. T. Salguero, and R. M. B. Moran, (2002), ‘Measuring problem video game playing in adolescents’, Addiction, vol. 97, pp. 1601–1606.
[94] A. F. Seay, and R. E. Kraut, (2007), ‘Project massive: Self-regulation and problematic use of online gaming’, In CHI 2007: Proceedings of the ACM Conference on Human Factors in Computing Systems, pp. 829–838, New York: ACM Press.
[95] A. Johansson, and K. G. Gotestam, (2004), ‘Problems with computer games without monetary reward: Similarity to pathological gambling’, Psychological Reports, vol. 95, pp. 641–650.
[96] G. A. Keepers, (1990), ‘Pathological preoccupation with video games’, Journal of the American Academy of Child & Adolescent Psychiatry, vol. 29, pp. 49–50.
[97] M. Griffiths, (2005), ‘A “components” model of addiction within a biopsychosocial framework’, Journal of Substance Use, vol. 10, pp. 191–197.
[98] J. Mendelson, and N. Mello, (1986), The addictive personality, New York: Chelsea House.
[99] Knime.com [Internet]. Knime; [cited 2018 Nov 11]. Available from: http://www.knime.com/
[100] M. El-Masri, N. Altrabsheh, H. Mansour, and A. Ramsay, (November 2017), ‘A web-based tool for Arabic sentiment analysis’, 3rd International Conference on Arabic Computational Linguistics (ACLing 2017), 5–6 November 2017, Dubai, United Arab Emirates.
[101] PorterStemmer [Internet]. PorterStemmer; [cited 2018 Nov 11]. Available from: https://tartarus.org/martin/PorterStemmer/
[102] J. Ramos, (2003), ‘Using tf-idf to determine word relevance in document queries’, In First International Conference on Machine Learning, New Brunswick, NJ, USA: Rutgers University.
[103] Y. Zhang, R. Jin, and Z. H. Zhou, (December 2010), ‘Understanding bag-of-words model: A statistical framework’, International Journal of Machine Learning and Cybernetics, vol. 1, issue 1–4, pp. 43–52.
[104] M. Araujo, P. Goncalves, M. Cha, and F. Benevenuto, (2014), ‘iFeel: A system that compares and combines sentiment analysis methods’, In Proceedings of the 23rd International Conference on World Wide Web, ACM, pp. 75–78.
[105] N. Altrabsheh, M. Cocea, and S. Fallahkhair, (2015), ‘Predicting students’ emotions using machine learning techniques’, In The 17th International Conference on Artificial Intelligence in Education.
[106] A. Go, R. Bhayani, and L. Huang, (2009), Twitter sentiment classification using distant supervision, CS224N Project Report, Stanford.
[107] R. Duwairi, and M. El-Orfali, (2014), ‘A study of the effects of preprocessing strategies on sentiment analysis for Arabic text’, Journal of Information Science, vol. 40, pp. 501–513.
[108] B. Pang, and L. Lee, (2004), ‘A sentimental education: Sentiment analysis using subjectivity summarization based on minimum cuts’, Annual Meeting of the Association for Computational Linguistics, vol. 42, pp. 271–278.
[109] D. Hand, (2012), ‘Assessing the performance of classification methods’, International Statistical Review, vol. 80, no. 3, pp. 400–414.
[110] L. Mostafa, M. Farouk, and M. Fakhry, (2009), ‘An automated approach for webpage classification’, ICCTA09: Proceedings of the 19th International Conference on Computer Theory and Applications, vol. 5, issue 10, pp. 10–15.
[111] Reddit [Internet]. Reddit; [cited 2018 Nov 12]. Available from: https://www.reddit.com/
[112] D. Lewis, and J. Catlett, (10–13 July 1994), ‘Heterogeneous uncertainty sampling for supervised learning’, Proceedings of the Eleventh International Conference on Machine Learning, Rutgers University, New Brunswick, NJ.
[113] H. B. Lewis, (1971), Shame and guilt in neurosis, New York, NY: International Universities Press.
[114] J. P. Tangney, (1991), ‘Moral affect: The good, the bad, and the ugly’, Journal of Personality and Social Psychology, vol. 61, pp. 598–607. doi:10.1037/0022-3514.61.4.598
[115] S. T. Wolf, T. R. Cohen, A. T. Panter, and C. A. Insko, (2010), ‘Shame proneness and guilt proneness: Toward the further understanding of reactions to public and private transgressions’, Self and Identity, vol. 9, pp. 337–362. doi:10.1080/15298860903106843
[116] S. Roos, E. V. E. Hodges, and C. Salmivalli, (2014), ‘Do guilt- and shame-proneness differentially predict prosocial, aggressive, and withdrawn behaviors during early adolescence?’, Developmental Psychology, vol. 50, pp. 941–946. doi:10.1037/a0033904
[117] F. Calefato, F. Lanubile, F. Maiorano, and N. Novielli, (2018), ‘Sentiment polarity detection for software development’, Empirical Software Engineering, vol. 23, 1352.
[118] B. Rajat, and S. Gaonkar, (2017), ‘Sentiment analysis of product reviews using Hadoop’, IJSRD – International Journal for Scientific Research & Development, vol. 4, issue 12.
[119] X. Lei, and X. Qian, (2015), ‘Rating prediction via exploring service reputation’, IEEE 17th International Workshop on Multimedia Signal Processing.
[120] D. Das, S. Sharma, S. Natani, N. Khare, and B. Singh, (2017), ‘Sentimental analysis for airline Twitter data’, IOP Conf. Series: Materials Science and Engineering, 263, 042067.
[121] W. Mariel, S. Mariyah, and S. Pramana, (2018), ‘Sentiment analysis: a comparison of deep learning neural network algorithm with SVM and naïve Bayes for Indonesian text’, IOP Conf. Series: Journal of Physics: Conf. Series, 971, 012049.
[122] A. Collomb, C. Costea, D. Joyeux, O. Hasan, and L. Brunie, (2014), ‘A study and comparison of sentiment analysis methods for reputation evaluation’, Research Report RR-LIRIS-2014-002.