Nikppt
 Gestures are an important aspect of human interaction,
both interpersonally and in the context of man-machine
interfaces.
 A gesture is a form of non-verbal communication in
which visible bodily actions communicate particular
messages, either in place of speech or in parallel with
words.
 It includes movement of the hands, face, or other parts
of the body.
What Are Gestures?
 The primary goal of gesture recognition is to create a system
which understands human gestures and uses them to control
various devices.
 With the development of ubiquitous computing, current user
interaction approaches with keyboard, mouse and pen are not
sufficient.
 With gesture recognition, computers can become familiarized
with the way humans communicate using gestures.
Data glove:
It uses thin fiber-optic cables running down the back of each hand,
each with a small crack in it; when a finger bends, light leaks out
through the crack, and the measured light loss indicates how far the
joint is flexed.
Cyber Glove:
It uses strain gauges placed between the fingers to measure
abduction as well as more accurate bend sensing.
Videoplace:
It uses real time image processing of live video of the user. It has a
very good feature recognition technique which can easily distinguish
between hands and fingers.
Media Room:
It was the first interface to support combined speech and gesture
recognition technology. Within the Media Room the user could use
speech, gesture, eye movements or a combination of all three.
Since The Beginning
Virtual keyboards:
Virtual keyboards use a lens to project an image of a keyboard onto
a desk or other flat surface. An infrared light beam that the
device directs above the projected keyboard detects the user's
fingers.
Navigaze:
Users can work with applications by moving the cursor with head
movements and clicking with eye blinks.
The Present
CePal:
 Developed by the Dhirubhai Ambani Institute, Gandhinagar.
 CePal is a gadget worn like a watch: an infrared, gesture-based
remote control that helps people with motor impairments and other
limb-related disorders complete routine tasks such as operating the
TV, air conditioner, lights, and fans.
 It helps people with cerebral palsy to be self-reliant.
The Future
ADITI:
 ADITI was developed by IIT-Madras.
 It helps people with debilitating conditions such as cerebral
palsy and musculoskeletal disorders.
 It is an indigenous USB device.
 ADITI is a screen-based device running software that provides
users with a list of visual options, such as words, pictures, or
letters, to express themselves.
 Using ADITI, patients can communicate through simple gestures,
such as a nod of the head or a movement of the feet, to generate a
mouse click.
 Visual interpretation of hand gestures is mainly
divided into three parts:
Gesture modeling
Gesture analysis
Gesture recognition
Gesture Recognition System
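To make the three-part division concrete, here is a minimal, purely illustrative sketch of how the stages fit together. All names (GestureModel, detect_features, estimate_parameters, recognize) are hypothetical, and the toy "frame" and one-number feature stand in for real image measurements.

```python
from dataclasses import dataclass

@dataclass
class GestureModel:      # gesture modeling: one stored template per gesture
    name: str
    template: float      # toy one-dimensional parameter

def detect_features(frame):
    # gesture analysis, step 1: extract features from the image
    # (here: just the mean pixel intensity of a toy "frame")
    return sum(frame) / len(frame)

def estimate_parameters(features):
    # gesture analysis, step 2: map features onto model parameters
    return features      # identity mapping in this toy example

def recognize(parameter, models):
    # gesture recognition: pick the stored model closest to the estimate
    return min(models, key=lambda m: abs(m.template - parameter)).name

models = [GestureModel("open_hand", 200.0), GestureModel("fist", 80.0)]
print(recognize(estimate_parameters(detect_features([90, 85, 75])), models))  # fist
```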
 The scope of a gestural interface for Human-Computer Interaction
is directly related to the proper modeling of hand gestures.
Temporal Modeling of gestures:
Three phases make up a gesture:
 Preparation - Sets the hand in motion from the resting position.
 Nucleus - Some definite form and enhanced dynamic qualities.
 Retraction - The hand returns to the resting position or repositions
for a new gesture phase.
Gesture Modeling
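As an illustration of the temporal model, the toy function below segments a sequence of per-frame hand speeds into the three phases. The speed thresholds are invented for the example and would need tuning against real tracking data.

```python
from enum import Enum

class Phase(Enum):
    PREPARATION = "preparation"
    NUCLEUS = "nucleus"
    RETRACTION = "retraction"

def segment_phases(speeds, moving=0.2, expressive=0.6):
    # speeds: hand speed per video frame; None marks a resting hand
    phases, nucleus_seen = [], False
    for s in speeds:
        if s >= expressive:       # fast, expressive motion: the nucleus
            nucleus_seen = True
            phases.append(Phase.NUCLEUS)
        elif s >= moving:         # moderate motion before/after the nucleus
            phases.append(Phase.RETRACTION if nucleus_seen else Phase.PREPARATION)
        else:
            phases.append(None)
    return phases

print(segment_phases([0.0, 0.3, 0.7, 0.8, 0.4, 0.1]))
# [None, PREPARATION, NUCLEUS, NUCLEUS, RETRACTION, None]
```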
Spatial Modeling of gestures:
 Appearance-based models directly link the appearance of the
hand and arm movements in visual images to specific gestures.
3D Hand Arm Models:
 These models are the premier choice for hand gesture modeling, in
which volumetric analysis is performed.
 Structures like cylinders and superquadrics, which are
combinations of simple shapes such as spheres, circles, and
hyper-rectangles, are used to shape body parts like forearms or
upper arms.
Appearance-based Models:
 A large number of models belong to this group.
 These models take the outline of an object performing the
gesture.
 The hand gesture is captured as video input, and the resulting 2D
image is compared with predefined models.
 If the two images are similar, the gesture is recognized and the
corresponding action is executed.
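A minimal sketch of this comparison step using OpenCV (version 4.x assumed): the largest contour in a thresholded frame is matched against predefined template contours with cv2.matchShapes, where a lower score means greater similarity. The templates dictionary is assumed to be built beforehand from reference images.

```python
import cv2

def largest_contour(gray):
    # Otsu threshold, then keep the biggest contour as the hand outline
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)

def match_gesture(frame_gray, templates):
    # templates: {gesture_name: reference contour}; lower score = more similar
    contour = largest_contour(frame_gray)
    scores = {name: cv2.matchShapes(contour, tmpl, cv2.CONTOURS_MATCH_I1, 0.0)
              for name, tmpl in templates.items()}
    return min(scores, key=scores.get)
```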
 It is the second phase of the gesture recognition system. It
consists of two steps:
 Feature Detection.
 Parameter Estimation.
Feature Detection:
 Feature detection is the process in which the person performing
the gestural instructions is identified and the features needed to
understand the instructions are extracted from the rest of the image.
 Cues are used in feature detection and are of two types.
Gesture Analysis
 Color cues
 These exploit the characteristic color footprint of human skin.
 Motion cues
 The relevant motion in the visual image is the movement of the
gesturer's arm, which is located and identified. (Both cues are
sketched below.)
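A hedged OpenCV sketch of both cues. The HSV skin range below is a rough, illustrative guess that varies with camera and lighting, and the frame-difference threshold is likewise arbitrary.

```python
import cv2
import numpy as np

def skin_mask(frame_bgr):
    # Color cue: threshold in HSV space around typical skin tones
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)     # illustrative bounds
    upper = np.array([25, 180, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove speckle

def motion_mask(prev_gray, curr_gray, thresh=25):
    # Motion cue: frame differencing highlights the moving arm
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return mask
```

Fusing the two, for example with cv2.bitwise_and(skin, motion), keeps only skin-colored regions that are also moving, which is one version of the cue fusion described next.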
 Overcoming the Problems of Cues:
 One way is the fusion of color, motion, and other visual cues, or
the fusion of visual cues with non-visual cues like speech or gaze.
 The second solution is to make use of prediction techniques.
 The simplest method is to detect the hand's outline, which can be
extracted using an edge detection algorithm (sketched after this
list).
 The Zernike Moments method divides the hand into two subregions:
the palm and the fingers.
 The Multi-Scale Color Features method performs feature extraction
directly in the color space.
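A sketch of the simplest of these methods, outline extraction with an edge detector, again assuming OpenCV 4.x; the Canny thresholds are illustrative.

```python
import cv2

def hand_outline(frame_gray):
    # Edge detection, then keep the largest contour as the hand's outline
    edges = cv2.Canny(frame_gray, 50, 150)   # illustrative thresholds
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None
```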
Parameter Estimation:
 This type of computation depends on the model parameters and
the features detected.
 It is the phase in which the data analyzed from visual images of
gestures is recognized as a specific gesture.
Tasks associated with recognition are:
 Optimal partitioning of the parameter space: Related to the
choice of gestural models and their parameters.
 Implementation of the recognition procedure, where the key concern
is computational efficiency.
Gesture Recognition
 Classification of static hand gestures was described by Thierry Messer.
 The classification represents the task of assigning a feature vector
or a set of features to some predefined classes in order to
recognize the hand gesture.
 The classification mainly helps in finding the best-matching
reference features for the features extracted.
 k-Nearest Neighbors: This classification method uses the feature
vectors gathered during training to find the k nearest neighbors in
an n-dimensional space (sketched below).
Classification Methods Of Hand
Gestures
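A minimal k-NN sketch with scikit-learn. The two-dimensional feature vectors (say, contour compactness and fingertip count) and their labels are invented for illustration; real systems use richer descriptors.

```python
from sklearn.neighbors import KNeighborsClassifier

# Toy training data: [compactness, fingertip_count] per sample
train_features = [[0.9, 5], [0.85, 5], [0.3, 0], [0.25, 1], [0.6, 2]]
train_labels = ["open_hand", "open_hand", "fist", "fist", "peace"]

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(train_features, train_labels)
print(knn.predict([[0.28, 0]]))   # -> ['fist'] (majority of the 3 neighbors)
```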
 Hidden Markov Models
 The Hidden Markov Model (HMM) classifiers belong to the class
of trainable classifiers.
 It represents a statistical model in which the most probable
matching gesture class is determined for a given feature vector,
based on the training data.
 Multi Layer Perceptron
 A Multi Layer Perceptron (MLP) classifier is based on a neural
network.
 It represents a trainable classifier (similar to Hidden Markov
Models).
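For comparison, here are the same toy features classified with a small MLP via scikit-learn; the hidden-layer size and iteration count are arbitrary choices for the example, and a sequence-aware HMM would instead need time-series data.

```python
from sklearn.neural_network import MLPClassifier

train_features = [[0.9, 5], [0.85, 5], [0.3, 0], [0.25, 1], [0.6, 2]]
train_labels = ["open_hand", "open_hand", "fist", "fist", "peace"]

# One small hidden layer; the network learns a decision boundary
# instead of memorizing neighbors as k-NN does.
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(train_features, train_labels)
print(mlp.predict([[0.88, 5]]))   # expected: 'open_hand' on this toy data
```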
 Developed by Jan.
 It uses the OpenCV library.
 OpenCV is a cross-platform computer vision library developed for
real-time image processing. It provides implementations of a wide
range of vision algorithms.
 Jan utilized the concept that, in order for a computer to recognize
something on the screen, the object to be recognized must be
described in mathematical terms.
 For this, the object's contours are needed; these can then be
turned into a row of numbers that can be stored in memory or
computed with further.
Static Hand Gesture Recognition
Software
 Contours represent a simplified version of a real video image that
can be compared against some pattern stored in memory.
(Figure: contours of the hand gesture)
Steps described to implement visual recognition:
 Scan the image of interest.
 Subtract the background (i.e. everything we are not interested
in).
 Find and store contours of the given objects.
 Represent the contours in a given mathematical way.
 Compare the simplified mathematical representation to a pattern
stored in computer’s memory.
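The five steps map almost one-to-one onto OpenCV calls. The sketch below (OpenCV 4.x assumed) stands in for Jan's actual code, which the slides do not reproduce: it uses Otsu thresholding for the background step and Hu moments as the "row of numbers" describing a contour.

```python
import cv2
import numpy as np

def recognize_gesture(frame_bgr, patterns):
    # 1. Scan the image of interest
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # 2. Subtract the background (simple global threshold here)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 3. Find and store contours of the given objects
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    # 4. Represent the contour mathematically: 7 Hu moments, invariant
    #    to translation, scale, and rotation
    hu = cv2.HuMoments(cv2.moments(hand)).flatten()
    # 5. Compare against patterns stored in memory: {name: hu_vector}
    return min(patterns, key=lambda name: np.linalg.norm(hu - patterns[name]))
```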
Jan describes feature detection in two ways:
 Color cues
 Background Segmentation
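Of the two, background segmentation can be sketched with OpenCV's built-in MOG2 subtractor, which maintains a running statistical model of the static scene; the parameters below are illustrative, and the capture index assumes a default webcam. (The color-cue variant was sketched earlier.)

```python
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

cap = cv2.VideoCapture(0)              # default webcam (assumed)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = subtractor.apply(frame)  # white pixels = moving foreground (the hand)
    cv2.imshow("foreground", fg_mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```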
 Sixth Sense is a wearable gestural interface that
supplements the physical world around us with digital
information and lets us use natural hand gestures to
interact with that information.
 It comprises a pocket projector, a mirror, and a camera. The
projector and the camera are connected to a mobile wearable
device.
 It enables us to interact with the available digital information
using natural hand gestures, and it automatically recognizes
objects and retrieves information related to them.
Sixth Sense Technology
Components of Sixth Sense device
 This application uses the concept of pattern recognition using
clusters.
 The basic pattern recognition technique is to derive several
linear/nonlinear boundaries to separate the feature space into
multiple clusters.
 Fewer clusters generally lead to a higher recognition rate.
Security Elevator Using Gesture &
Face Recognition
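The slides do not show the elevator's actual algorithm; as a stand-in, the scikit-learn sketch below separates a toy feature space into two clusters, illustrating why a few well-separated clusters are easier to recognize than many overlapping ones.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy feature space: two well-separated gesture clusters
features = np.array([[1.0, 1.1], [0.9, 1.0], [1.1, 0.9],   # gesture A
                     [5.0, 5.2], [5.1, 4.9], [4.8, 5.0]])  # gesture B
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
print(kmeans.predict([[1.05, 1.0], [5.05, 5.1]]))  # falls in different clusters
```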
Applications
Surface-less computing
Motion capture
 By using proper sensors worn on the
body of a patient and by reading the
values from those sensors, robots
can assist in patient rehabilitation.
Socially assistive robotics
Immersive gaming technology
 In video games, as on the Xbox One
with its Kinect sensor, the user is the
controller and performs all the
actions.