Thesis Presentation Outline
• Title Slide
• Introduction
• Literature Review
• Related Work
• Concept of Facial Expression and Emotion
• Theory of FER
• Proposed Research
• Implementation
• Experimental Results
• Future Directions
MONITORING STUDENTS’ ENGAGEMENT IN CLASS USING FACIAL EMOTION RECOGNITION
Name: Muhammad Adnan Bashir
Advisor: Dr. Shaista Habib
SCHOOL OF SYSTEMS AND TECHNOLOGY
UNIVERSITY OF MANAGEMENT AND TECHNOLOGY
INTRODUCTION
Automatic identification of facial expressions has long been a vital area of research, because facial expressions are the most meaningful way humans exhibit emotion.
Automatic emotion recognition remains a contested problem. When students in a classroom are exposed to situations drawn from real life, they experience a wide spectrum of emotions at any given time.
We propose a mechanism in which students' facial expressions are used to automatically determine whether they are paying attention and how they are feeling, i.e., whether they are active or not active.
LITERATURE REVIEW
Divjak and Bischof
• evaluated three factors: eye tracking, head movement, and eye distance
• generated a warning: when inattention was detected, an alarm would sound as notification
Kamath
• created an automated gaze system to gauge how interested the students were
• using video from the classroom, devised a way to measure how engaged students are
• used a face-tracking system to detect whether students were looking at the screen
Bidwell
• used the histogram of oriented gradients (HOG) to analyze the input images
RELATED WORK
Turabzadeh
• investigated the Local Binary Pattern (LBP) algorithm for recognizing facial emotions in real time
• proposed a real-time system based on how a student's face shows emotion during a lesson
• based on the student's emotions, the system would automatically adjust the content to the student's level of concentration
• also showed that there were three different levels of focus (high, medium, and low)
Sharma
• watches how students move their eyes and heads to judge whether they are paying attention, and sounds an alarm if they aren't
Kithira
Title | Methods | Results
Face Recognition and FER using PCA | PCA | Better face recognition rate with low error rate
A Real-Time Facial Expression Classification System Using LBP | Haar classifier | High face detection rate; produces more efficient features
Automated Facial Expression Recognition System Methods | Constraint model with SVM | Constraint model with SVM
FER using Optical Flow without Complex Feature Extraction | Uniform grids, MLP | 95.1%
A Neural-AdaBoost Based Facial Expression Recognition System | Viola-Jones, Gabor feature extraction, AdaBoost, MNN-NP | 96.83% and 92.22%, respectively
Facial Expression Recognition | 2D appearance-based model, radial symmetry transform | 81% accuracy
Facial Action Unit Recognition Using Multi-Class Classification Methods | MLBP, FCBF, ECOC | Good results achieved by the multi-class classifier
Recognition of Facial Expressions and Measurement of Levels of Interest from Video | Grid points, HMMs | 90%
A Multi-Task Model for Simultaneous Face Identification and FER | MT-FIM | 92.57%
An SOM-Based Automatic FER System | Viola-Jones, Gabor wavelet filters, landmark point tracking | Recognition rate over 90%
MOTIVATION & CONTRIBUTION
Motivation
• In a competitive environment, it is critical to use technology and learning strategies to help students adapt.
• Many schools have implemented e-learning systems in recent decades to improve learning attention and efficiency.
• Contextual face perception provides relevant context for processing facial expressions.
Contribution
• A synopsis of the numerous facial expressions of emotion that researchers have categorized over time.
• A new approach to facial expression recognition combined with attention detection.
• To offer a framework for future research, each study's weaknesses have been identified and addressed.
• To give new researchers unified knowledge on a single platform, research on facial expression detection using image and video classification has been surveyed and a collection of the relevant literature prepared.
Concept of Facial Expression and Emotion
Explainable Facial Expressions and Emotions
Expressions: a form of nonverbal signaling using the movement of facial muscles.
Emotions: in neurobiological terms, complex action plans triggered by external or internal inputs.
Facial Expression Analysis Techniques
Facial electromyography (FEMG)
The Facial Action Coding System (FACS)
• FACS is based on facial anatomical features.
• Facial expressions are broken down into their component parts, referred to as "Action Units".
• Using facial action unit coding, one can acquire the knowledge required to differentiate between the following three categories of facial expressions.
Macro expressions: Most of the time they occur in everyday situations, last between 0.5 and 4 seconds, and can be seen with the naked eye.
Micro expressions: Short facial movements that last less than half a second. They occur when a person is trying to hide or suppress their current emotional state, whether consciously or not.
Subtle expressions: The first sign of a facial expression, when the emotion it shows is still mild.
Facial Emotions Analysis
1. Behavior patterns: such as "fight or flight", designed either to remove oneself immediately from a potentially harmful situation or to prepare for physical confrontation with an adversary.
2. Bodily symptoms: Emotions and physical and mental arousal are inextricably linked, and different levels of arousal are associated with different emotions. Most of these symptoms occur unconsciously and cannot be deliberately controlled.
3. Cognitive evaluations: Evaluations of the activities, stimuli, or objects under consideration.
4. Facial expressions: For example, showing one's teeth and frowning are both signs of displeasure.
Classification Of Facial Emotions
Humans can undoubtedly produce thousands of slightly different facial expressions, but there are six basic emotions: happiness, sadness, fear, anger, surprise, and disgust.

Emotion | Definition | Motion of facial parts
Happiness | Joy, pride, relief, hope, pleasure, and excitement. | Open eyes, mouth corners pulled up, open mouth, raised cheeks, and wrinkles around the eyes.
Sadness | The opposite of joy; its secondary emotions are suffering, pain, sympathy, and despair. | Outer brow lowered, inner brow corner lifted, mouth edges lowered, eyes closed, lip corners pushed down.
Fear | Danger is the source of fear, whether from the possibility of bodily or mental injury; its secondary emotions are horror, anxiety, panic, and dread. | Eyebrows raised and drawn together, upper eyelids raised, lips stretched horizontally.
Surprise | Arises when the unexpected occurs; its secondary emotions include astonishment and amazement. | Raised eyebrows, open eyes, open mouth, and a lowered jaw.
Disgust | A feeling of revulsion that can be triggered by any taste, smell, sound, or texture. | Lip corner depressor, nose wrinkle, lower lip depressor, and lowered eyebrows.
Anger | Among the most perilous emotions; potentially harmful, so humans try to avoid it. Its secondary emotions are irritation, annoyance, exasperation, hatred, and aversion. | Eyebrows pulled down, mouth closed, lips tightened, and upper and lower lids pulled up.
Theory of FER
In this context, "Facial Emotion Recognition" refers to software that can decipher emotional states from still images and video.
The three phases of FER analysis are as follows:
(a) identification of the face
(b) identification of the facial expression
(c) classification of the facial expression
Emotional states can be inferred by analyzing the positions of key facial landmarks (e.g., the tip of the nose, the eyebrows).
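The three phases above can be sketched as a pipeline of composable functions. The phase bodies below are hypothetical stand-ins for illustration only; a real system would plug in an actual face detector (e.g. a Haar cascade) and a trained expression classifier.

```python
# Sketch of the three-phase FER pipeline. The phase implementations are
# stand-ins; a real system would use a face detector and a trained model.

def detect_face(image):
    """Phase (a): locate the face; return a bounding box or None."""
    # stand-in: assume the face fills the whole image
    h, w = len(image), len(image[0])
    return (0, 0, w, h)

def extract_expression_features(image, box):
    """Phase (b): crop the face region and derive features (here, mean intensity)."""
    x, y, w, h = box
    region = [row[x:x + w] for row in image[y:y + h]]
    total = sum(sum(row) for row in region)
    return total / (w * h)

def classify_expression(features):
    """Phase (c): map features to an emotion label (stand-in threshold rule)."""
    return "happiness" if features > 128 else "neutral"

def fer_pipeline(image):
    box = detect_face(image)
    if box is None:
        return None
    return classify_expression(extract_expression_features(image, box))

bright = [[200] * 4 for _ in range(4)]
print(fer_pipeline(bright))  # a bright dummy "face" passes through all three phases
```

The point of the sketch is the control flow, not the stand-in rules: each phase consumes the previous phase's output, and a failed detection short-circuits the pipeline.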
METHODOLOGY
Facial Recognition Methods
1. Point-based Display
2. Stick-figure Models
3. 3D XFace Animations
Proposed Research
We propose a system that uses deep convolutional neural network techniques to analyze the emotions present in a classroom setting.
The real-time webcam footage from the classroom is subjected to key frame extraction, which enables analysis of the students' emotional expressions.
Using cameras to monitor classrooms is a non-intrusive method for digitizing student behavior.
The purpose of the proposed system is to determine the students' level of concentration in the classroom during a lecture.
This is done by continuously monitoring head rotation and eye movement. Once facial characteristics have been identified, the system determines whether the student is attentive or not.
The system can be used during a live lecture with a laptop computer and its integrated webcam. The first check is whether the student's eyes can be detected in the frame.
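The key frame extraction step mentioned above can be sketched as a simple frame-difference filter: keep a frame only when it differs enough from the last kept frame. The frame representation (flat pixel lists) and the threshold value are illustrative assumptions, not the thesis's tuned settings.

```python
# Minimal key-frame extraction sketch: a frame becomes a key frame when
# its mean absolute difference from the last key frame exceeds a threshold.

def mean_abs_diff(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def extract_key_frames(frames, threshold=10.0):
    """Return the indices of the selected key frames."""
    if not frames:
        return []
    keys = [0]                     # always keep the first frame
    for i in range(1, len(frames)):
        if mean_abs_diff(frames[i], frames[keys[-1]]) > threshold:
            keys.append(i)         # scene changed enough: new key frame
    return keys

frames = [[0, 0, 0, 0], [1, 1, 1, 1], [50, 50, 50, 50], [51, 52, 50, 49]]
print(extract_key_frames(frames))  # → [0, 2]
```

Near-duplicate frames (indices 1 and 3) are skipped, so the downstream emotion analysis only runs on frames that carry new information.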
Workflow diagram of the proposed system
Methodology
• The proposed system consists of two
models for detecting engagement and
emotions.
• For engagement detection, only the eye
region of a face is examined with the aid
of a Haar-Cascade classifier.
• Once the eye region has been segmented, the image is manually labeled as "engaged" or "not engaged".
• The CNN is then trained using these
manually annotated datasets, and the
engagement probability is calculated
using the CNN model.
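A minimal sketch of the engagement-detection flow described in the bullets above. The Haar-Cascade eye detector and the trained CNN are replaced here by injected callables (assumptions for illustration), so the control flow can be shown without OpenCV or model files; the comments note the real calls they stand in for.

```python
# Engagement-detection flow sketch. The detector and classifier are
# injected callables standing in for a Haar-Cascade and a trained CNN.

def engagement_from_frame(frame, detect_eyes, cnn_probability, threshold=0.5):
    """Return 'engaged', 'not engaged', or None when no eye region is found."""
    box = detect_eyes(frame)           # stand-in for cascade.detectMultiScale(gray)
    if box is None:
        return None
    x, y, w, h = box
    eye_region = [row[x:x + w] for row in frame[y:y + h]]  # segment the eye region
    p = cnn_probability(eye_region)    # stand-in for the CNN's engagement probability
    return "engaged" if p >= threshold else "not engaged"

# Stub detector/classifier for demonstration only.
stub_detect = lambda frame: (0, 0, 2, 2)
stub_cnn = lambda region: 0.9 if region[0][0] > 100 else 0.1

frame = [[150, 150], [150, 150]]
print(engagement_from_frame(frame, stub_detect, stub_cnn))  # → engaged
```

Thresholding the CNN's probability is what turns the continuous engagement score into the binary engaged / not-engaged label used in the rest of the system.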
Proposed Approach for Engagement Detection
In our model, we consider the student's engagement level during the learning process; however, we do not consider whether the student actually learned.
We divide the level of engagement into two groups: Active and Not Active.
State | Positions | Perceived expressions/emotions
Active | Neutral emotion (front); smile (front); frown (front); neutral emotion, eyes to the left (front); neutral emotion, eyes to the right (front); smile, eyes to the left (front); smile, eyes to the right (front); teeth smile (front); teeth smile, eyes to the left (front); teeth smile, eyes to the right (front) | Wide eyes; dilated pupils; slight frown; staring with half-lidded eyes; nodding; pursing one’s lips; making eye contact when listening or conversing
Not Active | Looking down (towards the ground); looking down (right side); looking down (left side); neutral emotion, looking forward (front); smile, looking forward (right side); smile, looking forward (left side); hand over the mouth, looking forward (front) | Blank stare; glazed eyes; minimal eye contact; yawning; closing or half-closing one’s eyes; propping one’s head in hands; unfocused gaze; rapid blinking; squinting
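The cue lists above can be stored as a simple lookup so that a perceived cue maps back to an engagement state. The structure below is an illustrative sketch of that affective model, not part of the implemented system; the cue strings are shortened forms of the table entries.

```python
# Illustrative encoding of the affective model's cue lists as a lookup table.

AFFECTIVE_MODEL = {
    "Active": {"wide eyes", "dilated pupils", "slight frown", "nodding",
               "staring with half-lidded eyes", "pursing one's lips",
               "making eye contact"},
    "Not Active": {"blank stare", "glazed eyes", "minimal eye contact",
                   "yawning", "closing eyes", "propping head in hands",
                   "unfocused gaze", "rapid blinking", "squinting"},
}

def state_for_cue(cue):
    """Return the engagement state that lists this cue, or None if unknown."""
    for state, cues in AFFECTIVE_MODEL.items():
        if cue in cues:
            return state
    return None

print(state_for_cue("yawning"))  # → Not Active
print(state_for_cue("nodding"))  # → Active
```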
IMPLEMENTATION
Three cases have been defined to detect matching.
● A match exists if the emotion is found in the proposed affective model and is compatible with the engagement level.
● No match exists if the emotion is not compatible with the engagement level.
● The result is a new emotion if the volunteer reports an emotion not found in the proposed affective model.
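The three cases can be expressed directly as a small decision function. The return labels below are illustrative names for the three outcomes, not the thesis's own terminology.

```python
# Decision function for the three matching cases defined above.

def match_case(emotion, engagement_level, affective_model):
    """affective_model maps each emotion to its set of compatible engagement levels."""
    if emotion not in affective_model:
        return "new emotion"        # case 3: emotion not in the proposed model
    if engagement_level in affective_model[emotion]:
        return "match"              # case 1: found and compatible with the level
    return "no match"               # case 2: found but incompatible with the level

# Hypothetical model fragment for demonstration.
model = {"smile": {"Active"}, "yawning": {"Not Active"}}
print(match_case("smile", "Active", model))      # → match
print(match_case("smile", "Not Active", model))  # → no match
print(match_case("awe", "Active", model))        # → new emotion
```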
• This application, based on the theory of compound emotions, seeks to classify all human emotions into two distinct groups, active and inactive, using deep convolutional neural networks.
• When processing videos in Python, we use the Haar cascade module in conjunction with OpenCV, an open-source computer vision library.
• To begin, we capture footage from our main computer's camera, or from any other video in which a person's face can be identified.
Application Work
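The OpenCV processing described above reduces to a per-frame detection loop. In the sketch below the detector is injected so the loop can run without a camera; the comments note the real OpenCV calls (cv2.VideoCapture, CascadeClassifier.detectMultiScale) it stands in for.

```python
# Per-frame face-detection loop, with the detector injected so the loop
# structure can be shown without a camera. With OpenCV the detector would be
#   cascade = cv2.CascadeClassifier(cv2.data.haarcascades
#                                   + "haarcascade_frontalface_default.xml")
# and each frame would come from cv2.VideoCapture(0).read().

def frames_with_faces(frames, detect):
    """Return indices of frames in which at least one face was detected."""
    hits = []
    for i, frame in enumerate(frames):
        boxes = detect(frame)       # stand-in for cascade.detectMultiScale(gray)
        if boxes:
            hits.append(i)
    return hits

# Stub: a "face" is any frame whose first pixel is nonzero.
stub_detect = lambda frame: [(0, 0, 1, 1)] if frame[0] else []

print(frames_with_faces([[1], [0], [1]], stub_detect))  # → [0, 2]
```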
RESULTS
Artificial Intelligence And Face Detection
Future Directions
● All the engagement levels and feelings included in this model were chosen based on similar prior work. Active and inactive participation are the two available levels of engagement.
● We offered a few recommendations aimed at enhancing the affective model. The disengagement level in particular needs more investigation; one of our upcoming projects will therefore analyze the physical and psychological activities of students while they experience various emotions, in order to define a comprehensive behavior for each level of engagement.
THANKS
Questions ?