International Journal of Trend in Scientific Research and Development (IJTSRD)
Volume 6 Issue 5, July-August 2022 Available Online: www.ijtsrd.com e-ISSN: 2456 – 6470
A Study on Face Expression Observation Systems
Jyoti¹, Neeraj Chawaria², Ekta²
¹Student, MERI College of Engineering & Technology, Sampla, Rohtak, Haryana, India
²Assistant Professor, MERI College of Engineering & Technology, Sampla, Rohtak, Haryana, India
ABSTRACT
Human expressions can convey a great deal of information. We cannot learn every language in the world, but we can decipher the majority of human expressions. The state of a user's conduct in various settings and scenarios can be inferred from their facial expressions. Through various human-computer interface and programming approaches, facial expressions can be digitized. Face detection, feature extraction, and determination of the kind of expression are all parts of the facial expression perception process. Communication can be both verbal and non-verbal, and through their emotions people can communicate non-verbally. In this article, we give a broad summary of the various facial expression perception processes reported in the literature.
KEYWORDS: Face Expressions, Neural Nets, Knowledge, Statistics
How to cite this paper: Jyoti | Neeraj Chawaria | Ekta "A Study on Face Expression Observation Systems" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-6 | Issue-5, August 2022, pp.237-243, URL: www.ijtsrd.com/papers/ijtsrd50450.pdf

Copyright © 2022 by author(s) and International Journal of Trend in Scientific Research and Development Journal. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0) (http://creativecommons.org/licenses/by/4.0)
INTRODUCTION
Emotion recognition is the process of classifying human emotions, generally from facial expressions and spoken language. It is accomplished naturally through human perception and, increasingly, through computational techniques. According to research, roughly 90% of human communication is non-verbal, yet even as technology has advanced, conventional software remains poor at grasping a person's stresses and intentions. Nowadays emotion recognition, also known as affective computing, is becoming accessible to many developers. The process of emotion recognition is shown in Fig. 1.
Fig 1: Emotions Classification
RELATED WORK
Face recognition compares a face in a photo or video with a list of known faces in order to identify it. Faces must first be entered into the system to build the collection of distinctive facial traits [1-2]. After that, the system separates a new picture into its essential components and compares them with the data kept in the repository. Emotions are then classified on the basis of the analysed features [3-4]. A lot of work has also been done in the field of image enhancement, i.e. recovering an original image from a degraded image [5-6], counting people in an image [7], image smoothing [8], and image inpainting [9].
TECHNIQUES OF EMOTION PERCEPTION
Generally, the process of emotion recognition comprises the analysis of human sentiments in multiple modalities such as text, audio, or video [10]. Emotion categories are identified by combining information from facial expressions, body posture, gestures, and speech [11], and the technology supports the interpretation of emotive or emotional expressions as they appear. Present methods of emotion recognition can be categorized into three groups: knowledge-based, statistical, and hybrid techniques [12].
Knowledge Related Techniques
Knowledge-related methods are also known as lexicon-based methods. They employ domain knowledge together with the semantic and linguistic characteristics of language to identify specific emotion categories [13]. This approach generally draws on knowledge-based resources throughout the emotion classification process, such as WordNet, SenticNet [14], ConceptNet, and EmotiNet [15]. The main benefit of this approach is the convenience and low cost afforded by the wide availability of such knowledge resources; its drawback is an inability to handle concept nuances and complex linguistic rules. These methods can be divided into two classes: dictionary-related and corpus-related methods. Dictionary-related methods take seed opinion or emotion words from a dictionary and look up their synonyms and antonyms to build the primary list of opinions or sentiments [16]. Corpus-related methods begin with a seed list of opinion or sentiment words and expand the dataset by discovering words with context-specific characteristics in a large corpus [16]. Although corpus-related methods take context into account, their performance varies across domains, because a word in one domain may carry a different meaning in another.
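As a rough illustration of the dictionary-related idea described above, the following sketch expands a small seed lexicon with WordNet synonyms and then scores a sentence by counting lexicon hits. The seed words, the expand_seed helper, and the use of NLTK's WordNet interface are illustrative assumptions and do not reproduce any specific surveyed system.

```python
# Minimal sketch of a dictionary-based emotion lexicon
# (assumes the NLTK WordNet corpus has been downloaded).
from collections import Counter
from nltk.corpus import wordnet as wn

def expand_seed(seed_words):
    """Grow a seed list with WordNet synonyms; antonyms could seed an opposite category."""
    expanded = set(seed_words)
    for word in seed_words:
        for synset in wn.synsets(word):
            for lemma in synset.lemmas():
                expanded.add(lemma.name().lower())
    return expanded

# Hypothetical seed lists for two emotion categories.
LEXICON = {
    "joy": expand_seed(["happy", "joyful", "delighted"]),
    "anger": expand_seed(["angry", "furious", "irritated"]),
}

def classify(text):
    """Return the emotion category with the most lexicon hits, or 'neutral' if none match."""
    tokens = text.lower().split()
    scores = Counter({emotion: sum(t in words for t in tokens)
                      for emotion, words in LEXICON.items()})
    emotion, hits = scores.most_common(1)[0]
    return emotion if hits > 0 else "neutral"

print(classify("I am so happy and delighted today"))   # -> joy
```

Such a sketch already shows the limitation noted above: words whose meaning depends on context or on complex linguistic constructions are not handled.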
Statistical Techniques
Statistical techniques usually involve applying supervised machine learning (ML) algorithms to a large set of labelled data. The data are fed to the algorithms so that a model can learn to predict the appropriate emotion categories. This approach typically uses two sets of data: a training set and a testing set. The training set is used to learn the characteristics of the data, and the testing set is used to validate the performance of the ML algorithm [17]. ML algorithms generally offer more reasonable classification accuracy than other approaches, although they require a sufficiently large training set to obtain effective results. The Support Vector Machine (SVM) is one popular ML algorithm. Unsupervised ML techniques, such as deep learning, are also used for emotion recognition [18]. These methods involve various configurations of artificial neural networks (ANN) such as CNN, LSTM, and ELM. In the area of emotion recognition, deep learning methods are widely adopted because of their success in related applications. Fig. 2 [19] shows deep learning based emotion recognition.
Fig 2: Deep learning Based Emotion Perception
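To make the supervised statistical workflow concrete, the minimal sketch below trains an SVM on a labelled training set and validates it on a held-out testing set. The synthetic feature matrix, the 70/30 split, and the RBF kernel are illustrative assumptions; scikit-learn supplies the classifier and the split.

```python
# Sketch of the supervised statistical approach: train on labelled data, validate on a held-out set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Hypothetical feature matrix (e.g., flattened facial features) and emotion labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))          # 200 samples, 64 features each
y = rng.integers(0, 6, size=200)        # six emotion classes, labelled 0..5

# Training group: learn the qualities of the data. Testing group: validate performance.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=1.0)          # SVM, a popular supervised ML procedure
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```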
Hybrid Methods
Hybrid methods for emotion recognition are essentially a combination of knowledge-based and statistical approaches, exploiting complementary features from both. Some systems employ a collection of knowledge-driven linguistic elements together with statistical approaches; examples include sentic computing and iFeel [20]. The knowledge-based resources play a very significant role in the emotion classification process when hybrid methods are employed. Because hybrid methods benefit from the strengths of both knowledge-based and statistical techniques, they achieve better classification performance than either approach used alone. The main drawback of hybrid techniques is the computational complexity introduced during the classification process.
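One way to picture the hybrid idea is to append a knowledge-based lexicon score to a statistical feature vector before training a classifier. The sketch below is only an illustrative assumption of how such a fusion could look; the mini-lexicon, the toy features, and the choice of logistic regression are not taken from any surveyed system.

```python
# Hybrid sketch: concatenate a knowledge-based lexicon score with statistical features.
import numpy as np
from sklearn.linear_model import LogisticRegression

POSITIVE_WORDS = {"happy", "joy", "great"}      # hypothetical mini-lexicon
NEGATIVE_WORDS = {"sad", "angry", "terrible"}

def lexicon_score(text):
    """Knowledge-based component: +1 per positive word, -1 per negative word."""
    tokens = text.lower().split()
    return sum(t in POSITIVE_WORDS for t in tokens) - sum(t in NEGATIVE_WORDS for t in tokens)

def hybrid_features(text, statistical_features):
    """Fuse the lexicon score with statistical (e.g., learned) features."""
    return np.concatenate([[lexicon_score(text)], statistical_features])

# Toy training data: two utterances with hypothetical 4-dimensional statistical features.
texts = ["I am happy and feel great", "this is terrible and I am sad"]
stats = np.array([[0.9, 0.1, 0.8, 0.2], [0.1, 0.9, 0.2, 0.8]])
X = np.array([hybrid_features(t, s) for t, s in zip(texts, stats)])
y = np.array([1, 0])                             # 1 = positive emotion, 0 = negative

clf = LogisticRegression().fit(X, y)
print(clf.predict([hybrid_features("I feel great today", np.array([0.7, 0.2, 0.6, 0.3]))]))
```

The extra fusion and feature-preparation steps hint at the computational overhead mentioned above.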
Neural Nets for Emotion Recognition
Neural nets underpin deep learning and have become widely popular over the last several years. Owing to their remarkable ability to achieve state-of-the-art accuracy on numerous classification tasks, they offer a critical advantage that is massively helpful in emotion recognition: they perform feature engineering automatically.
With a neural net, a researcher may supply data in the form of text or speech. The data then passes through the various layers of the network, and every layer transforms its inputs into something more useful and predictive for the model. Fig. 3 shows a convolutional neural network used for feature extraction [21].
Fig 3: CNN Based Emotion Recognition
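To make Fig. 3 concrete, the sketch below defines a small convolutional network whose convolution and pooling layers extract features from a 48x48 grayscale face crop before a fully connected layer predicts one of seven emotions. The layer sizes, the input resolution, and the use of PyTorch are illustrative assumptions; they do not reproduce the architecture from [21].

```python
# Minimal CNN sketch for facial-expression classification (PyTorch assumed available).
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    def __init__(self, num_classes=7):
        super().__init__()
        # Convolution + pooling layers act as the automatic feature extractor.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48 -> 24
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
        )
        # A fully connected layer maps the extracted features to emotion scores.
        self.classifier = nn.Linear(32 * 12 * 12, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = EmotionCNN()
fake_batch = torch.randn(4, 1, 48, 48)        # four hypothetical 48x48 grayscale face crops
print(model(fake_batch).shape)                # torch.Size([4, 7]) -> one score per emotion
```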
ANALYSIS OF THE EMOTION ASSESSMENT SYSTEM
To examine how facial expressions affect the extraction of face features, the authors in [22] created the Point Distribution Model (PDM) technique. The (x, y) locations of the labelled or annotated points from the training data set are analysed by the PDM method. The proposed method uses 180 photographs from 15 volunteers, selecting 12 pictures from each person covering six emotions. The features collected from the facial photos are categorised and matched using the Action Parameters (AP) classifier. The authors in [23] classify CK facial expression images and live video with the SVM method to identify the universally recognised basic emotions supplied during training.
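The core of the PDM described above is a statistical model of the annotated (x, y) landmarks: shapes are stacked into vectors, a mean shape is computed, and principal components capture the main modes of expression-driven variation. The sketch below, with randomly generated landmarks standing in for a real training set, only illustrates that idea; it is not the classifier from [22].

```python
# Sketch of a Point Distribution Model: mean shape plus principal modes of landmark variation.
import numpy as np

rng = np.random.default_rng(1)
n_images, n_landmarks = 180, 20                       # hypothetical annotated training set
shapes = rng.normal(size=(n_images, n_landmarks * 2)) # each row: (x1, y1, ..., x20, y20)

mean_shape = shapes.mean(axis=0)                      # the average face shape
centered = shapes - mean_shape

# Principal components of the landmark variation give the main modes of deformation.
_, singular_values, components = np.linalg.svd(centered, full_matrices=False)
eigenvalues = singular_values**2 / (n_images - 1)

# Any training shape is approximated as mean_shape + components[:k].T @ b for a small vector b.
b = components[:5] @ centered[0]                      # project the first shape onto 5 modes
reconstructed = mean_shape + components[:5].T @ b
print("reconstruction error:", np.linalg.norm(reconstructed - shapes[0]))
```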
In order to obtain features from facial photos, the authors in [24] introduced three independent feature extraction techniques. The methods were tested using different facial expression recognition parameters, and an SVM was used to categorise the extracted features. The Cohn-Kanade (CK) database, which was constructed from 100 individuals aged 18 to 30, was used, and 310 photos were chosen from it for these studies. The authors in [25] studied three separate strategies for extracting features from facial expressions for identification of the six generally characterised primary emotions. These methods, discussed individually, are Singular Value Decomposition (SVD), FFT, and DCT. The extracted facial features are classified using an SVM classifier. The JAFFE database, containing 219 photos, is used to conduct the investigation. The DCT+SVM and FFT+SVM techniques attained recognition rates of 95 percent, while the SVD+SVM method attained 93 percent.
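As a sketch of the DCT+SVM pipeline evaluated in [25], the snippet below keeps the low-frequency block of a 2-D DCT of each face image as its feature vector and feeds those vectors to an SVM. The image size, the block size, and the synthetic data are assumptions made for illustration only.

```python
# Sketch of DCT-based feature extraction followed by SVM classification.
import numpy as np
from scipy.fft import dctn
from sklearn.svm import SVC

def dct_features(image, block=8):
    """Keep the top-left (low-frequency) block of the 2-D DCT as a compact feature vector."""
    coeffs = dctn(image, norm="ortho")
    return coeffs[:block, :block].ravel()

rng = np.random.default_rng(2)
images = rng.random(size=(60, 64, 64))        # hypothetical 64x64 grayscale face images
labels = rng.integers(0, 7, size=60)          # seven emotion classes

X = np.array([dct_features(img) for img in images])
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```

The FFT+SVM and SVD+SVM variants follow the same pattern with a different transform in place of the DCT.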
The authors in [26] conducted a comprehensive study of Local Binary Pattern (LBP) features for classifying facial expressions into the six generally characterized primary emotions. The extracted facial features are classified using an SVM classifier, and the investigation is conducted on the MMI and JAFFE databases (see Table 1).
Using the Discrete Wavelet Transform (DWT) method, the authors in [27] extracted characteristics from facial photographs. They used 460 images taken from a collection of videos for their research. After the features were extracted from the images, two different classifiers were employed: the K-Nearest Neighbour (KNN) technique and Linear Discriminant Analysis (LDA). LDA seeks to identify the best hyperplane separating the groups of sentiments (such as neutral, happy, surprise, and sad).
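A minimal sketch of the DWT feature-extraction step paired with a KNN classifier, roughly in the spirit of [27], is given below. The wavelet family, the decomposition level, and the synthetic data are assumptions made for illustration; PyWavelets and scikit-learn supply the transform and the classifier.

```python
# Sketch of DWT-based feature extraction with a KNN classifier.
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

def dwt_features(image, wavelet="db4", level=2):
    """Use the low-frequency approximation coefficients of a 2-level 2-D DWT as features."""
    coeffs = pywt.wavedec2(image, wavelet=wavelet, level=level)
    approximation = coeffs[0]          # coeffs[0] holds the coarsest approximation sub-band
    return approximation.ravel()

rng = np.random.default_rng(3)
images = rng.random(size=(80, 64, 64))    # hypothetical grayscale face crops
labels = rng.integers(0, 5, size=80)      # five sentiment groups

X = np.array([dwt_features(img) for img in images])
knn = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
print("predicted label for first image:", knn.predict(X[:1]))
```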
As feature extraction techniques, the authors in [28] proposed the Weighted PCA (WPCA) and PPCA approaches. The suggested algorithms are regarded as two of the most important techniques for dimensionality reduction and feature extraction from a person's facial region. The proposed techniques were applied separately to the CKACFEID image database, which contains 500 photographs from 100 different people; 400 photos from this database were chosen for the investigations. The proposed algorithms are assessed using a Support Vector Machine.
To extract and encode facial traits, the authors in [29] presented the Pyramid Histogram of Oriented Gradients (PHOG) and Local Phase Quantisation (LPQ) approaches. First, the PHOG technique was used to extract the features, and then the PHOG and LPQ techniques were combined. In these trials, 289 photos from the GEMEP-FERA dataset are used. Three configurations were evaluated: PHOG with an SVM classifier, combined PHOG and LPQ with an SVM classifier, and combined PHOG and LPQ with an LMNN classifier (see Table 1).
To obtain features from facial photos and increase the recognition rate, the authors in [30] integrated the DCT, WT, GF, and GD approaches. This research uses 213 photos from the JAFFE database representing seven different emotions: sadness, fear, anger, surprise, happiness, neutral, and disgust. Only 126 of the 213 photographs were chosen for this experiment.
The authors in [31] recommend using the principal component analysis (PCA) approach to extract characteristics from facial images in order to recognise the sentiments (disgust, anger, sadness, happiness) shown in a person's face picture. The authors collected the characteristics and used a Singular Value Decomposition (SVD) based classifier to categorise these features. For their experiments, they employed 50 training photos and 31 test pictures from the JAFFE collection and a real database.
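The sketch below shows the basic PCA step from [31]: flatten each face image, project it onto a small number of principal components, and hand the resulting low-dimensional vectors to a classifier. For simplicity the classifier here is an SVM rather than the SVD-based classifier of [31], and all data are synthetic placeholders.

```python
# Sketch of PCA-based dimensionality reduction before classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(4)
train_images = rng.random(size=(50, 48 * 48))   # 50 hypothetical flattened training faces
train_labels = rng.integers(0, 4, size=50)      # four emotions (disgust, anger, sadness, happiness)
test_images = rng.random(size=(31, 48 * 48))    # 31 hypothetical test faces

pca = PCA(n_components=20)                      # keep 20 principal components ("eigenfaces")
X_train = pca.fit_transform(train_images)       # learn the components on the training set only
X_test = pca.transform(test_images)

clf = SVC().fit(X_train, train_labels)
print("predicted emotions for the test set:", clf.predict(X_test)[:5])
```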
The authors in [32] used a different method, patch-based Gabor features, to extract characteristics from facial photos. The method is designed to preserve facial region statistics while extracting local attributes. The suggested technique is evaluated on the JAFFE dataset, which includes over 200 grayscale images covering seven expressions (six fundamental and one neutral); the Cohn-Kanade (CK) database is also used. An SVM was used to categorise the extracted features.
To determine emotions from a face image, the authors in [33] employed a variety of techniques. The Gray-Level Co-occurrence Matrix (GLCM) approach is used to extract traits from the human face; the extracted features are very effective and require little processing time. The key emotions recognised in this case are Joyful, Shock, Displeasure, Calm, and Sadness. An SVM with different kernels is trained on the extracted attributes. The researchers used an emotion recognition database in their research and obtained a classification rate of nearly 90%.
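The sketch below illustrates GLCM texture features of the kind used in [33]: a gray-level co-occurrence matrix is computed for a face region and a few texture statistics (contrast, energy, homogeneity, correlation) are collected into a small feature vector for an SVM. The image size, quantisation levels, and offsets are assumptions; scikit-image supplies graycomatrix and graycoprops.

```python
# Sketch of GLCM texture features feeding an SVM classifier.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def glcm_features(image):
    """Compute a co-occurrence matrix and summarise it with a few texture statistics."""
    glcm = graycomatrix(image, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "energy", "homogeneity", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(5)
faces = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)  # hypothetical 8-bit face crops
labels = rng.integers(0, 5, size=40)    # e.g., Joyful, Shock, Displeasure, Calm, Sadness

X = np.array([glcm_features(f) for f in faces])
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```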
Table 1 gives a concise summary of the preceding studies on emotion estimation from facial pictures. The table lists the researchers, their techniques, the databases, the total number of photos, the expressions, the numbers of training and testing images, and the accuracy obtained in each investigation.
Table 1: Analysis of Emotion Recognition from facial pictures

| Sr. No | References | Techniques | Databases | Images | Expressions | Trained Images | Test Images | Precision (%) |
|---|---|---|---|---|---|---|---|---|
| 1 | Chung-Lin & Yu-Ming [22] | PDM+AP | Real DB | 181 | 6 | 91 | 91 | 85 |
| 2 | Philipp & Rana [23] | SVM | (CK & CK+) and live video | 80 | 6 | 91 | – | 88 |
| 3 | Tommaso, Caifeng, Vincent & Ralph [24] | HOG+SVM | CK | 311 | 6 | 91 | – | 93 |
| 3 | | LBP+SVM | CK | 311 | 6 | 91 | – | 93 |
| 3 | | LTD+SVM | CK | 311 | 6 | 91 | – | 92 |
| 4 | Kharat & Dudul [25] | DCT+SVM | JAFFE | 220 | 7 | 91 | – | 95 |
| 4 | | FFT+SVM | JAFFE | 220 | 7 | 90 | – | 95 |
| 4 | | SVD+SVM | JAFFE | 220 | 7 | 78 | – | 93 |
| 5 | Caifeng, Shaogang and Peter [26] | LBP+SVM | MMI | 80 | 6 | 90 | – | 87 |
| 5 | | LBP+SVM | JAFFE | 214 | 7 | 90 | – | 80 |
| 6 | Murugappan, Nagarajan, & Yaacob [27] | DWT+KNN | Real Database | 461 | 5 | 91 | – | 84 |
| 6 | | DWT+KDA | Real Database | 461 | 5 | 91 | – | 76 |
| 7 | Zhiguo & Xuehong [28] | WPCA+SVM | CKACFEID | 501 | 6 | 100 | 400 | 89 |
| 7 | | PPCA+SVM | CKACFEID | 501 | 6 | 100 | 400 | 85 |
| 8 | Abhinav, Akshay, Yogesh & Tom [29] | PHOG+SVM | GEMEP | 290 | 5 | 155 | 134 | 67 |
| 8 | | PHOG&LPQ+SVM | GEMEP | 290 | 5 | 155 | 134 | 73 |
| 8 | | PHOG&LPQ+LMNN | GEMEP | 290 | 5 | 156 | 135 | 74 |
| 9 | Sandeep, Shubh, Yogesh & Neeta [30] | (DCT+WT+GF+GT)+ADB | JAFFE | 214 | 7 | 126 | 87 | 94 |
| 10 | Mandeep & Rajeev [31] | PCA+SVD | JAFFE | 15 | 6 | 7 | 7 | 99.99 |
| 10 | | PCA+SVD | Real Database | 83 | 6 | 50 | 31 | 99.99 |
| 11 | Ligang & Dian [32] | Patch-based Gabor+SVM | JAFFE | 214 | 7 | 143 | 70 | 92.93 |
| 11 | | Patch-based Gabor+SVM | CK | 328 | 6 | 247 | 80 | 94.48 |
| 12 | Punitha & Geetha [33] | GLCM+SVM | Facial expression DB | – | – | – | – | 90 |

(– = value not reported in the source table.)
Conclusion
In this article, we have reviewed the various methods used to build expression identification systems from facial photographs. Classifying such data is a challenging process, especially when some samples are almost identical; this makes emotion recognition from face photos difficult, because evaluating the data remains complex. We have discussed a selection of the literature related to the topic of emotion recognition from facial expressions.
REFERENCES
[1] Sanju, Kirti Bhatia, Rohini Sharma, An
analytical survey on face recognition systems,
International Journal of Industrial Electronics
and Electrical Engineering, Volume-6, Issue-3,
Mar. -2018, pp. 61-68.
[2] Sanju, K. Bhatia, Rohini Sharma, Pca and
Eigen Face Based Face Recognition Method,
Journal of Emerging Technologies and
Innovative Research, June 2018, Volume 5,
Issue 6, pp. 491-496.
[3] Ankit Jain, Kirti Bhatia, Rohini Sharma, Shalini
Bhadola, An Overview on Facial Expression
Perception Mechanisms, SSRG International
Journal of Computer Science and Engineering,
Volume 6 Issue 4 - April 2019, pp. 19-24.
[4] Ankit Jain, Kirti Bhatia, Rohini Sharma, Shalini
Bhadola, An emotion recognition framework
through local binary patterns, Journal of
Emerging Technologies and Innovative
Research, Vol -6, Issue-5, May 2019.
[5] Jyoti, Kirti Bhatia, Shalini Bhadola, Rohini
Sharma, An Analysis of Facsimile
Demosaicing Procedures, International journal
of Innovative Research in computer and
communication engineering, Vol-08, Issue-07,
July 2020.
[6] Deepak Dahiya, Kirti Bhatia, Rohini Sharma,
Shalini Bhadola, A Deep Overview on Image
Denoising Approaches, International Journal of
Innovative Research in Computer and
Communication Engineering, Volume 9, Issue
7, July 2021.
[7] Nakul Nalwa, Shalini Bhadola, Kirti Bhatia,
Rohini Sharma, A Detailed Study on People
Tracking Methodologies in Different Scenarios,
International Journal of Innovative Research in
Computer and Communication Engineering,
Volume 10, Issue 6, June 2022.
[8] Mahesh Kumar Attri, Kirti Bhatia, Shalini
Bhadola, Rohini Sharma, An Image Sharpening
and Smoothing Approaches Analysis,
International Journal of Innovative Research in
Computer and Communication Engineering,
Volume 10, Issue 6, June 2022, pp 5573-5580.
[9] Yogita, Shalini Bhadola, Kirti Bhatia, Rohini
Sharma, A Deep Overview of Image Inpainting
Approaches, International Journal of Innovative
Research in Science, Engineering and
Technology, Volume 11, Issue 6, June 2022,
pp. 8744-8749.
[10] Poria, S., Cambria, E., Bajpai, R. and Hussain,
A. (2017). A review of affective computing:
From unimodal analysis to multimodal fusion.
Information Fusion, 37, 98–125.
[11] Caridakis, G. et al. (2007). Multimodal emotion
recognition from expressive faces, body
gestures and speech. IFIP the International
Federation for Information Processing, 247,
375–388.
[12] Cambria, E. (2016). Affective Computing and
Sentiment Analysis. IEEE Intelligent Systems,
31(2), 102–107.
[13] Rani, Meesala, S., and Sumathy, S. (2017).
Perspectives of the performance metrics in
lexicon and hybrid based approaches: a review.
International Journal of Engineering &
Technology, 6(4), 108.
[14] Cambria, E., Poria, S., Bajpai, R. and Schuller,
B. (2016). SenticNet 4: A Semantic Resource
for Sentiment Analysis Based on Conceptual
Primitives, In Proc. of COLING, the 26th
International Conference on Computational
Linguistics: Technical Papers, 2016, pp. 2666–
2677.
[15] Balahur, A., Hermida, Jesú, M. and Montoyo,
A. Detecting implicit expressions of emotion in
text: A comparative analysis. Decision Support
Systems, 53(4), 742–753.
[16] Madhoushi, Z., Hamdan, A., Razak, Z. and
Suhaila. (2015). Sentiment analysis techniques
in recent works, In Proc. of IEEE Conference
Publication, 2015, pp. 288–291.
[17] Sharef et al. (2016). Overview and Future
Opportunities of Sentiment Analysis
Approaches for Big Data. Journal of Computer
Science, 12 (3), 153–168.
[18] Sun, S., Luo, C. and Chen, J. (2017). A review
of natural language processing techniques for
opinion mining systems. Information Fusion,
36, 10–25.
[19] Xiaofeng Lu, Deep Learning Based Emotion
Recognition and Visualization of Figural
Representation, Front. Psychol., 06 January
2022, Human-Media Interaction.
[20] Araújo, M. et al. (2014). iFeel: a system that
compares and combines sentiment analysis
method, In Proc. of WWW '14 Companion.
ACM, 2014, pp. 75–78.
[21] Shiliang Zhang et al., Multimodal Deep
Convolutional Neural Network for Audio-
Visual Emotion Recognition, Proceedings of
the 2016 ACM on International Conference on
Multimedia Retrieval, 2016.
[22] Huang C. -L. and Huang, Y. -M. (1997). Facial
expression recognition using model-based
feature extraction and action parameters
classification. Journal of Visual
Communication and Image Representation, 8
(3), 278-290.
[23] Michel, P. and Kaliouby, R. E. (2005). Facial
expression recognition using support vector
machines, In Proc. of 10th International
Conference on Human-Computer Interaction,
Crete, Greece, pp. 6-10.
[24] Gritti, T. et al. (2008). Local features based
facial expression recognition with face
registration errors, In Proc. of Automatic Face
& Gesture Recognition, 2008. FG'08. 8th IEEE
International Conference on, pp. 8-14.
[25] Kharat, G. and Dudul, S. (2008). Human
emotion recognition system using optimally
designed SVM with different facial feature
extraction techniques. WSEAS Transaction
Computing, 7 (6), 650-659.
[26] Shan, C., Gong, S. and McOwan, P. W. (2009).
Facial expression recognition based on local
binary patterns: A comprehensive study. Image
and Vision Computing, 27 (6), 803-816.
[27] Murugappan, M., Ramachandran, N. and
Sazali, Y. (2010). Classification of human
emotion from EEG using discrete wavelet
transform. Journal of Biomedical Science and
Engineering, 3(4), 390.
[28] Niu, Z. and Qiu, X. (2010). Facial expression
recognition based on weighted principal
component analysis and support vector
machines, In Proc. 2010 3rd International
Conference on Advanced Computer Theory and
Engineering (ICACTE), 2010, pp. 2-8.
[29] Dhall, A. et al. (2011). Emotion recognition
using PHOG and LPQ features, In Proc. of
Automatic Face & Gesture Recognition and
Workshops (FG 2011), IEEE International
Conference on. 2011, pp. 2-9.
[30] Gupta, S. K. et al. (2011). A hybrid method of
feature extraction for facial expression
recognition, In Proc. of Signal-Image
Technology and Internet-Based Systems
(SITIS), Seventh International Conference
2011, pp. 47-52.
[31] Kaur, M. and Vashisht, R. (2011). Comparative
study of facial expression recognition
techniques. International Journal of Computer
Applications, 13(1), 43-50.
[32] Zhang, L. and Tjondronegoro, D. (2011). Facial
expression recognition using facial movement
features. IEEE Transactions on Affective
Computing, 2(4), 219-229.
[33] Punitha, A. and Geetha, M. K. (2013). Texture
based emotion recognition from facial
expressions using Support Vector Machine.
International journal of computer applications,
80 (5), 112-119.