Emotion Recognition using Brainwave Datasets
Heba Elgazzar and Bethlehem Seid
School of Engineering and Computer Science
Morehead State University
Morehead, KY, USA
Outline
1. Introduction and Related Work
2. Proposed Method
3. Machine Learning Algorithms
4. Results
5. Conclusion
Introduction
• Traditionally, we use speech to communicate when we are feeling depressed or anxious.
• But verbal expressions are vague and often do not describe our emotions well.
• Therefore, it is essential to find other methods to understand the mental state of an individual.
• Our bodies use electrical signals to send and receive messages, so measuring these signals can reveal the mental state of an individual.
Introduction
• The dataset used for this research limits us to three emotional states: positive, negative, and neutral.
• However, the approach can easily be expanded to other emotions.
• Traditionally, EEG tests were expensive and required the assistance of medical professionals to interpret the meaning of the signals.
• With improvements in technology, low-cost commercial devices such as the Emotiv, NeuroSky, and MUSE headbands have simplified the process of collecting brainwave datasets and interpreting the results.
Related Work
• Other researchers have worked on similar topics using EEG headbands
• Detecting mental states: relaxed, concentrated, and neutral
• Detecting different positive emotions
• Detecting emotional states using an EEG headband
Proposed Method
• A MUSE EEG headband was used
• The headband has four electrodes placed at the TP9, AF7, AF8, and TP10 positions
• The electrodes capture the readings within a time window
Dataset
• We used an existing public EEG brainwave dataset that was described and released in prior work for classifying emotional states.
• The dataset contains 2548 attributes and 2133 instances.
• Two individuals, one male and one female, were shown six different scenes to evoke negative and positive emotions. Three minutes of each state were recorded.
• In addition, six minutes of a resting neutral state were recorded.
Dataset
To invoke negative emotions
• Death scene from Marley and Me, produced by Twentieth Century Studios
• Opening death scene from Up, produced by Walt Disney Pictures
• Funeral scene from My Girl, produced by Imagine Entertainment
To invoke positive emotions
• Opening musical number from La La Land, produced by Summit Entertainment
• Nature time-lapse from Slow Life by BioQuest Studios
• Funny dog clips by MashupZone
Tools Used
• Anaconda, a free and open-source platform used for scientific computing and package management.
• Pandas, a Python software library that helps with data manipulation and analysis.
• Matplotlib, a library for plotting.
• Supervised machine learning techniques were chosen for this research
• In supervised learning, a dataset and its labels are provided.
• The data is then split into training and testing sets
• The training set is used to develop an algorithm that can predict the label of an unknown item using the information previously provided.
• The testing set is used to measure how well the algorithm performs by comparing the predicted labels to the actual labels given in the dataset.
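The train/test workflow above can be sketched with scikit-learn. This is a minimal sketch: the stand-in data and the 80/20 split ratio are illustrative assumptions, not settings taken from the slides.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Stand-in data; in the actual study the features and labels
# come from the EEG brainwave dataset.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Hold out 20% of the labeled data for testing (ratio is illustrative).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

print(X_train.shape, X_test.shape)
```

A classifier is then fit on `X_train`/`y_train` and scored by comparing its predictions on `X_test` against `y_test`.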
Machine Learning Algorithms
The classification algorithms used in this research are:
• K-Nearest Neighbors (KNN)
• Decision Trees
• Random Forest
• Artificial Neural Networks (ANN)
K-nearest neighbors KNN
• KNN is a simple machine learning algorithm
• It is based on the idea that data elements classified into the same group have similar attributes.
• A data point is classified into a group by first finding the closest points, then assigning the class by a majority vote of the K neighbors.
• Closeness is calculated using the Euclidean distance:

dist(x, y) = √( Σᵢ₌₁ⁿ (xᵢ − yᵢ)² )
Pros
• Training is much faster than for other classification algorithms.
• Useful for nonlinear data.
• Can also be used for regression problems.
Cons
• Prediction is slow and costly.
• Requires large memory, since the entire training dataset is stored for prediction.
• Not suitable for high-dimensional data.
K-nearest neighbors KNN
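The majority-vote idea above can be sketched with scikit-learn's KNN classifier, which uses the Euclidean distance by default. The toy points and the choice k = 3 are hypothetical, not the study's configuration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Two well-separated toy clusters with classes 0 and 1.
X = np.array([[1, 1], [1, 2], [2, 1], [8, 8], [8, 9], [9, 8]])
y = np.array([0, 0, 0, 1, 1, 1])

# k = 3 neighbors vote on the label; distance is Euclidean by default.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X, y)

print(knn.predict([[2, 2], [9, 9]]))  # one query near each cluster
```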
Decision Trees
• A flowchart-like, upside-down tree structure.
• The root represents the entire dataset
• Nodes represent a single feature
• Branches represent the decision rules for partitioning
• Leaves represent the outcomes/classes.
• Partitioning is done by first calculating the entropy, which is a measure of impurity
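The entropy measure mentioned above, H = −Σ pᵢ log₂ pᵢ over the class proportions pᵢ, can be computed directly. A minimal sketch with hypothetical label values:

```python
import math
from collections import Counter

def entropy(labels):
    """Entropy H = -sum(p_i * log2(p_i)) over the class proportions p_i."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A pure node has entropy 0; a perfectly mixed binary node has entropy 1.
print(entropy(['pos', 'pos', 'pos']))
print(entropy(['pos', 'neg', 'pos', 'neg']))
```

The tree chooses the split that most reduces entropy from parent to children (the information gain).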
Decision Trees
Pros
• Decision trees are easy to interpret and visualize.
• They can easily capture non-linear patterns.
• Decision trees make no assumptions about the data distribution
Cons
• Sensitive to noisy data.
• A small variation in the data can result in a different decision tree
• Decision trees can be biased, for example toward the majority class in imbalanced datasets
Random Forest
• Similar to a decision tree
• Overcomes the drawbacks of a single decision tree
• Creates several decision trees
• Classifies the data by a majority vote across those decision trees
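The ensemble-of-trees idea can be sketched with scikit-learn. The stand-in data and the choice of 100 trees are illustrative assumptions, not the study's settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Stand-in data; the study applies this to the EEG feature set.
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# 100 trees, each fit on a bootstrap sample of the data;
# the forest's prediction is the majority vote of the trees.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X, y)

print(len(forest.estimators_), forest.score(X, y))
```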
Artificial Neural Networks
• Modeled after the human brain
• The brain processes information in a complex, nonlinear fashion using a web of neurons that communicate through electrical signals.
• If the weighted sum of a neuron's inputs reaches a certain threshold, it transmits a message to the other neurons.
• Here, a weight represents the strength of the connection between two neurons.
• The weights and biases are the parameters adjusted as the neural network learns.
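The weighted-sum-and-threshold behavior described above can be sketched as a single artificial neuron. The weights, bias, and inputs here are made-up values for illustration only.

```python
import numpy as np

def neuron(inputs, weights, bias, threshold=0.0):
    """Fire (return 1) when the weighted sum plus bias reaches the threshold."""
    total = np.dot(inputs, weights) + bias
    return 1 if total >= threshold else 0

# Weights encode connection strength; the bias shifts the firing point.
print(neuron([1.0, 0.5], [0.6, 0.4], bias=-0.5))  # 0.6 + 0.2 - 0.5 = 0.3, fires
print(neuron([0.1, 0.1], [0.6, 0.4], bias=-0.5))  # 0.06 + 0.04 - 0.5 < 0, silent
```

Training a network consists of adjusting these weights and biases so that the layered neurons produce the desired outputs.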
Feature Selection
Principal Component Analysis (PCA)
• Linear dimensionality reduction
• Preserves the essential components that capture most of the variation in the data
• Groups similar data points based on feature correlation
• Components are selected based on the explained variance or on a fixed number of components
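Selecting components by explained variance, as described above, can be sketched with scikit-learn's PCA. The synthetic low-rank data and the 95% variance threshold are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 100 samples of 5 correlated features (a stand-in for the 2548
# EEG attributes); the data has only 2 underlying directions.
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5))

# Keep the fewest components that explain at least 95% of the variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape, pca.explained_variance_ratio_.sum())
```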
Feature Selection
SelectKBest
• Measures the correlation between the different independent features and the outcome
• Univariate feature selection
• For each feature, a chi-square test against the target variable determines the statistical significance of the test of independence; features are ranked by that significance and the top K are selected to run the data on
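The SelectKBest procedure can be sketched with scikit-learn. It is shown here on the iris dataset because the chi-square score requires non-negative features; K = 2 is an illustrative choice.

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, chi2

X, y = load_iris(return_X_y=True)  # 4 non-negative features

# Score every feature against the target with a chi-square test,
# then keep only the K = 2 highest-scoring features.
selector = SelectKBest(score_func=chi2, k=2)
X_new = selector.fit_transform(X, y)

print(X_new.shape)
```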
ReliefF
• A filter-method approach to feature selection
• Scoring is based on each feature's value differences between an instance and its nearest neighbors
• The closest same-class instance is called a 'near-hit', and the closest different-class instance is called a 'near-miss'
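The near-hit/near-miss scoring can be sketched as a basic single-neighbor Relief pass (a simplified variant using squared differences, not the full ReliefF algorithm). The toy data, in which only the first feature separates the classes, is hypothetical.

```python
import numpy as np

def relief_scores(X, y):
    """Basic Relief: reward features that differ at the near-miss
    and agree at the near-hit."""
    n, d = X.shape
    scores = np.zeros(d)
    for i in range(n):
        dists = np.linalg.norm(X - X[i], axis=1)
        dists[i] = np.inf                      # exclude the instance itself
        same = np.where(y == y[i])[0]
        other = np.where(y != y[i])[0]
        hit = same[np.argmin(dists[same])]     # closest same-class instance
        miss = other[np.argmin(dists[other])]  # closest different-class instance
        scores += (X[i] - X[miss]) ** 2 - (X[i] - X[hit]) ** 2
    return scores

X = np.array([[0.0, 0.3], [0.2, 0.9], [0.1, 0.5],
              [5.0, 0.4], [5.2, 0.8], [5.1, 0.1]])
y = np.array([0, 0, 0, 1, 1, 1])

# Feature 0 separates the classes and scores far higher than feature 1.
print(relief_scores(X, y))
```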
Results Using PCA
Accuracy of the different machine learning classifiers:
• KNN: 89.1%
• Decision Tree: 92.51%
• Random Forest: 95.55%
• ANN: 95.66%
Results Using SelectKBest
Accuracy of the different machine learning classifiers:
• KNN: 75.78%
• Decision Tree: 85.9%
• Random Forest: 89%
• ANN: 81.25%
Conclusion
• In this research, we proposed a novel method for classifying the emotional state of individuals using few features.
• ANN showed promising results when using principal component analysis
• Random Forest performed better when using SelectKBest
• KNN showed the least accurate results
Thank You
Editor's Notes
• #24: Matplotlib code used to generate the two results bar charts:

import matplotlib.pyplot as plt
import numpy as np

def plot_accuracies(accuracies):
    index = np.arange(len(accuracies))
    plt.bar(index, accuracies, 0.4, alpha=0.8, color='red')
    plt.ylabel('Accuracy')
    plt.title('Accuracy of different classifiers')
    plt.xticks(index, ('KNN', 'Decision Tree', 'Random Forest', 'ANN'))
    plt.tight_layout()
    plt.show()

plot_accuracies((89.1, 92.51, 95.55, 95.66))  # Results using PCA
plot_accuracies((75.78, 85.9, 89, 81.25))     # Results using SelectKBest