Gesture-Based Human-Computer Interaction
Using Kinect to Interact with a Computer for Gaming and Entertainment
Prepared by: 
Mirza Akif Israr
Introduction 
• The development of ubiquitous computing and the need to communicate in a more natural, flexible, efficient yet powerful way have rendered most current user-interaction approaches, which rely on the keyboard, mouse, and pen, insufficient.
• Human-computer interaction technology has seen significant changes over the years, ranging from text-based UIs that rely heavily on the keyboard as an input device to 2-D graphical interfaces driven by the mouse.
Gesture Recognition 
• Gesture recognition is a technology that achieves dynamic human-machine interaction without requiring physical touch or contact-based input mechanisms.
• Its main goal is to create a system capable of interpreting specific human gestures via mathematical algorithms and using them to convey meaningful information or to control devices.
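As a concrete illustration of the last point, the sketch below maps recognized gestures to device-control actions in C#; the gesture names and the actions they trigger are assumptions chosen for illustration, not taken from any particular system.

using System;
using System.Collections.Generic;

enum Gesture { SwipeLeft, SwipeRight, Push, WaveHand }

static class GestureCommands
{
    // Each recognized gesture is translated into a meaningful command for the device.
    static readonly Dictionary<Gesture, Action> Commands = new Dictionary<Gesture, Action>
    {
        { Gesture.SwipeLeft,  () => Console.WriteLine("Previous menu item") },
        { Gesture.SwipeRight, () => Console.WriteLine("Next menu item") },
        { Gesture.Push,       () => Console.WriteLine("Select / confirm") },
        { Gesture.WaveHand,   () => Console.WriteLine("Pause the game") }
    };

    // Called by the recognizer whenever a gesture has been classified.
    public static void Dispatch(Gesture gesture)
    {
        if (Commands.TryGetValue(gesture, out Action action)) action();
    }
}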
Tracking Technologies 
• The main requirement for supporting gesture recognition is the tracking technology used to obtain the input data. The approaches used generally fall into two major categories: glove-based approaches and vision-based approaches.
Glove-based approaches 
A common technique for hand-pose tracking is to instrument the hand with a glove. Sensors in the glove send input to the computer about hand position, orientation, and the flex of the fingers, using magnetic or inertial tracking devices.
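For illustration only, the record below sketches the kind of per-sample data such a glove-based tracker might report; the field names are assumptions, not a real glove API.

// Field names are illustrative assumptions, not a real glove API.
struct GloveSample
{
    public float PosX, PosY, PosZ; // hand position from magnetic or inertial tracking
    public float Yaw, Pitch, Roll; // hand orientation
    public float[] FingerFlex;     // one flex value per finger, e.g. 0 (straight) to 1 (fully bent)
}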
Glove-based approaches 
• While it is easier to collect hand configuration and movement data with this approach, the major drawback is that the required devices are quite expensive and cumbersome. In addition, the ease and naturalness with which the user can interact with the computer-controlled environment is hampered by the load of cables attached to the user.
Vision-based approaches 
• Vision-based approaches offer a more natural form of HCI, as they require no physical contact with any device.
• Camera(s) placed at a fixed location, or on a mobile platform in the environment, capture the input images, usually at a frame rate of 30 Hz or more.
• To recognize human gestures, these images are interpreted to produce visual features, which can then be used to infer human activity.
Gesture Recognition System 
• Gesture recognition is a complex task involving many aspects. While the methodology and recognition phases may vary depending on the application area, a typical gesture recognition system involves such processes as data acquisition, gesture modeling, feature extraction, and gesture recognition.
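The sketch below outlines that typical pipeline in C#. All type and method names are illustrative placeholders for the four stages, not part of any particular SDK.

// The four stages below mirror the processes named in the bullet above.
interface IDataSource       { float[] Acquire(); }              // data acquisition: grab a camera/depth/skeleton frame
interface IGestureModel     { float[] Normalize(float[] raw); } // gesture modeling: e.g. scale and centre the pose
interface IFeatureExtractor { float[] Extract(float[] pose); }  // feature extraction: joint angles, velocities, ...
interface IRecognizer       { string Classify(float[] feats); } // gesture recognition: template matching, HMM, ...

static class GesturePipeline
{
    // Runs one pass of the pipeline; returns null when no gesture is detected.
    public static string ProcessOnce(IDataSource source, IGestureModel model,
                                     IFeatureExtractor extractor, IRecognizer recognizer)
    {
        float[] raw = source.Acquire();
        float[] pose = model.Normalize(raw);
        float[] features = extractor.Extract(pose);
        return recognizer.Classify(features);
    }
}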
Microsoft’s Kinect 
• Recently, gesture recognition has been integrated into various consumer devices for entertainment purposes. An example of such a device is Microsoft's Kinect, which allows a user to employ gestures that are typically intuitive and relatively simple to perform various tasks, such as controlling games or starting a movie.
The Kinect Sensor 
• The Kinect sensor is a motion-sensing input device that was originally released in November 2010 for use with the Xbox 360, but has since been opened up for commercial use with Windows PCs.
• The Kinect sensor has the following properties and functions: 
• An RGB camera that stores three-channel data at 640x480 resolution and 30 Hz (a 1280x960 mode is available at a lower frame rate). The camera's field of view, as specified by Microsoft, is 43° vertical by 57° horizontal, and the system can measure distance with roughly 1 cm accuracy at a range of 2 meters.
• An infrared (IR) emitter and an IR depth sensor used to capture the depth image.
• An array of four microphones that captures sound and supports locating its source.
• A tilt motor, which allows the camera angle to be changed without physical interaction, and a three-axis accelerometer, which can be used to determine the current orientation of the Kinect.
Hardware diagram of the Kinect sensor: one microphone on one side and three on the other (the four-microphone array), a status LED, a CMOS color sensor (RGB imaging), a three-axis accelerometer, a motorized tilting base, and a CMOS IR sensor paired with an IR light source, together used for 3D depth sensing.
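As a hedged example, assuming the Kinect for Windows SDK 1.x managed API in C#, the snippet below shows how the streams and tilt motor described above are enabled; it is a minimal sketch, not a complete application.

using System.Linq;
using Microsoft.Kinect;

static class KinectSetup
{
    public static KinectSensor StartSensor()
    {
        // Pick the first Kinect that reports itself as connected.
        KinectSensor sensor = KinectSensor.KinectSensors
            .FirstOrDefault(s => s.Status == KinectStatus.Connected);
        if (sensor == null) return null;

        sensor.ColorStream.Enable(ColorImageFormat.RgbResolution640x480Fps30); // RGB camera
        sensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);    // IR depth sensor
        sensor.SkeletonStream.Enable();                                         // body tracking
        sensor.Start();

        sensor.ElevationAngle = 10; // tilt motor: camera angle in degrees, no physical interaction needed
        return sensor;
    }
}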
Hardware and Software Requirements
• According to Microsoft, the PC to be used with the Kinect sensor must have the following minimum capabilities: (a) a 32-bit (x86) or 64-bit (x64) processor, (b) a dual-core 2.66 GHz or faster processor, (c) a USB 2.0 bus dedicated to the Kinect, and (d) 2 GB of RAM.
• To access the Kinect's capabilities, the following software must also be installed on the developer's PC: Microsoft Visual Studio 2010/2012 Express or another Visual Studio edition. The development programming languages that can be used include C++, C# (C-Sharp), and Visual Basic.
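Building on the setup sketch above, and again assuming the Kinect for Windows SDK 1.x in C#, the example below reads skeleton frames and detects a simple "right hand above head" gesture; the gesture rule itself is an illustrative assumption, not part of the SDK.

using System;
using Microsoft.Kinect;

class HandRaiseDetector
{
    public void Attach(KinectSensor sensor)
    {
        sensor.SkeletonStream.Enable();
        sensor.SkeletonFrameReady += OnSkeletonFrameReady;
    }

    private void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;
            Skeleton[] skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);

            foreach (Skeleton skeleton in skeletons)
            {
                if (skeleton.TrackingState != SkeletonTrackingState.Tracked) continue;
                SkeletonPoint hand = skeleton.Joints[JointType.HandRight].Position;
                SkeletonPoint head = skeleton.Joints[JointType.Head].Position;
                if (hand.Y > head.Y) // simple illustrative rule: right hand higher than the head
                    Console.WriteLine("Gesture detected: right hand raised");
            }
        }
    }
}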
Limitations 
• The Kinect has the following limitations, which stem primarily from its optical components:
• For smooth skeleton tracking, the user's distance from the sensor must be between 1 m and 3 m.
• The Kinect cannot perform finger detection. This is due to the low resolution of the Kinect depth map, which works well when tracking a large object such as the human body but makes it difficult to track the human hand (which occupies a very small portion of the image), especially gestures made with the fingers.
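As a small illustrative helper under the same SDK 1.x assumption, the check below tests whether a tracked skeleton lies within the 1 m to 3 m range noted above.

using Microsoft.Kinect;

static class RangeCheck
{
    // Position.Z is the skeleton's distance from the sensor in metres.
    public static bool IsInTrackingRange(Skeleton skeleton)
    {
        float z = skeleton.Position.Z;
        return z >= 1.0f && z <= 3.0f;
    }
}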
