International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056
Volume: 04 Issue: 11 | Nov -2017 www.irjet.net p-ISSN: 2395-0072
Control Buggy using Leap Sensor Camera in Data Mining Domain
Prof Bhadkumbhe S.M.1, Kurup Amitha2, Jadhav Snehal3, Makar Shraddha4, Ghule Priyanka5
Dept. of Computer Engineering, PDEA’s College, Manjari (BK), Maharashtra, India
---------------------------------------------------------------------***---------------------------------------------------------------------
Abstract - This paper describes a method for controlling a buggy with hand gestures using the Leap Motion camera. The Leap is used as the motion controller, and the resulting operation is carried out by a buggy. The gesture for each action to be performed is stored in a database; when a gesture is performed, six points are extracted from it, from which fifteen features are computed, and these fifteen features are compared with the stored gestures using a similarity algorithm. The system identifies the matching gesture, and the buggy performs the corresponding action. The buggy has two direct-current motors connected to a microcontroller via a driver IC, and the microcontroller is connected to the PC by a MAX232 serial communication link. The system reduces the human effort involved in driving a car in real life (a buggy stands in for the car). It does not make driving 100% automatic, but it reduces the effort to a great extent and raises the efficiency of the system.
Key Words: Leap Motion, gesture, Euclidean distance, cosine similarity, trajectory feature, gesture recognition
1. INTRODUCTION
Nowadays the use of road transport such as cars has increased greatly. People spend much of their energy driving from one locality to another, over both short and long distances, so this system is designed to reduce the effort required of the driver: in our case, gestures are used to control the buggy/car. A gesture is a movement or action performed by the body, limbs, or hands to express an idea or sentiment; in simple words, gesturing means talking with the hands. Gestures can be captured by different devices; in our case a Leap Motion camera controller is used to recognize the gestures performed by the hand, and the recognized gestures drive and control the motion of the buggy.
In order to send the correct command to the buggy, the system requires a device that can capture an image of the gesture. The Leap camera is used to capture the actual gesture performed by the user; in other words, the Leap is simply a device that provides input to the system. The heart of the device consists of two cameras and three infrared LEDs. It tracks infrared light with a wavelength of about 850 nm, outside the visible spectrum. The viewing range of the device is 2.6 feet. The data is in grayscale form and is separated into left and right camera images.
As soon as the image data is streamed to the computer, the mathematical processing takes place. The Leap Motion service is the software on the computer that processes the images.
Fig 1: Leap Motion coordinates.
2. LITERATURE REVIEW
Gesture Control of Drone Using a Motion Controller [1] provides an easy way to interact and communicate with a machine. Human gestures are used to interact with and control an AR Drone. The actions performed by the drone are lifting, moving, and yawing; it can also perform flips, for which a Python script is used. The Leap Motion Controller uses its two monochromatic infrared (IR) cameras and three infrared LEDs to identify any gesture performed up to a distance of 1 meter (about 3 feet) directly above the device; in effect it creates a hemispherical volume of about 1 meter around itself and recognizes any hand gesture within that volume.
A lot of research has been done on recognition of hand gestures, and several recognition techniques have been proposed. A Gesture Guided Moving Robot with Microsoft Kinect [2] demonstrates a gesture-guided moving robot system that is controlled by human gestures and keeps step with its controller at a comparatively constant distance. It combines two programmable devices, an iRobot platform and a Microsoft Kinect, and a few straightforward gestures are enough to make the robot move away, move closer, or keep step with the walking person.
Application of Hand Detection Algorithm in Robot Control [3] presents a vision-based robot control system. A hand detection and tracking algorithm is used to control the robot manipulator: the detected points and their paths are converted into a set of orientation and location values so as to achieve the appropriate motion of the manipulator. The drawback is that, for accuracy, appropriate input conditions must be provided.
Indian Sign Language to Forecast Text using Leap Motion Sensor and RF Classifier [4] proposes a system for recognizing Indian Sign Language (ISL) using the Leap Motion sensing element. The device extracts important attributes from the dataset, such as the angles between fingers, which can be retrieved with high accuracy and used for robust prediction. A feature classifier is employed to identify which gesture is being performed; here, a Random Forest (RF) classifier model is used for feature classification. A block-list classifier removes all non-gesture frames, after which the correct gesture frames are recognized by the RF classifier to form a meaningful word, which is then shown on a digital display. The downside of this approach is that every country and region has its own sign language, so the model is useful only to people who know Indian Sign Language.
In a robot-arm control system using the Leap Motion controller [5], the motion controller's internal elements are two digital cameras, which track the motion of the hands and fingers, and three infrared LEDs, which detect the distance between the user's hands and the Leap Motion controller so that the robot arm can be controlled. A processor inside the Leap Motion controller processes the visual information and sends the hand and finger movement data to the PC. The data, formatted as distances and position coordinates of the hand and fingers, is then sent to the microcontroller for signal processing, which issues control commands to drive the servo motors that move the robotic arm. The system consists of four main parts:
 The signal detection part: a Leap Motion controller
 The signal processing part, using JavaScript and Node.js
 The microcontroller
 The robotic arm
3. PROPOSED SYSTEM
Fig 2: Flow chart
We propose a system in which the gesture made by the user is detected by the Leap Motion camera and the corresponding action is performed by the buggy. In this system, 6 points are extracted from the gesture performed above (or incident to) the Leap Motion camera, and from these 6 points the features required for detecting the gesture are calculated. Whereas the previous system identified only 12 features, this system selects a total of 15 features, making it more efficient. The proposed system is partitioned into 4 main steps:
1. Leap interfacing
2. Preprocessing and point extraction
3. Feature extraction
4. Feature comparison
Fig 3: Block diagram
1. Leap interfacing
The Leap Motion software runs as a Windows service. Leap-enabled applications access the Leap Motion service to receive tracking data; the Leap Motion SDK provides 2 APIs:
a) Native interface
b) WebSocket interface
a) Native interface:
A dynamic library (a set of small programs that can be loaded whenever needed) used to build new Leap-enabled applications.
b) WebSocket interface:
The WebSocket interface and a JavaScript client library are used together to develop Leap-enabled web applications.
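As an illustration of the WebSocket route, the sketch below connects to the local Leap Motion service and reads one tracking frame. The endpoint and frame field names follow the published Leap WebSocket protocol (the service listens on localhost port 6437); the `websocket-client` package and the protocol version in the URL are assumptions of this sketch rather than details given in the paper.

```python
# Minimal sketch: read one tracking frame from the Leap Motion service
# over its WebSocket interface. Assumes the Leap service is running
# locally and websocket-client is installed (pip install websocket-client).
import json
import websocket  # from the websocket-client package

ws = websocket.create_connection("ws://127.0.0.1:6437/v6.json")
try:
    while True:
        msg = json.loads(ws.recv())
        if "hands" in msg:  # skip the initial version/handshake message
            break
    for hand in msg["hands"]:
        print("palm position (x, y, z):", hand["palmPosition"])
    for finger in msg.get("pointables", []):
        print("fingertip position:", finger["tipPosition"])
finally:
    ws.close()
```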
2. Preprocessing and point extraction
Initially, the features for each gesture and the action to be performed are stored in the database as the dataset for future recognition. As soon as a gesture is performed by the user in front of the Leap camera, extraction of the 6 points takes place. These 6 points are the centroids of the five fingers (thumb, index finger, middle finger, ring finger, and little finger) and the centroid of the palm.
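A minimal sketch of this step, reusing the WebSocket frame format shown earlier: it pulls the palm centroid and the five fingertip positions out of one frame to form the six points. Field names such as `palmPosition`, `tipPosition`, and the finger `type` index come from the Leap WebSocket frame format, not from the paper.

```python
def extract_points(frame):
    """Return the six (x, y, z) points used by the system -- the palm
    centroid plus the five fingertip centroids -- or None if no hand
    is visible in the frame."""
    hands = frame.get("hands", [])
    if not hands:
        return None
    hand = hands[0]
    points = [tuple(hand["palmPosition"])]  # point 1: palm centroid
    # Fingers belonging to this hand, ordered thumb..little finger.
    fingers = sorted(
        (p for p in frame.get("pointables", []) if p.get("handId") == hand["id"]),
        key=lambda p: p.get("type", 0),
    )
    points += [tuple(f["tipPosition"]) for f in fingers]  # points 2-6
    return points if len(points) == 6 else None
```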
3. Feature extraction
The features required by the system are extracted from the 6 points in order to analyze the action to be performed. A total of 15 features are selected for identifying the action; the extracted features are the distances between all pairs of points. Given the 6 points p1(x1, y1, z1), ..., p6(x6, y6, z6), the Euclidean distance formula is applied to find the distance between any two points pi and pj:

d(pi, pj) = √((xi − xj)² + (yi − yj)² + (zi − zj)²)
Fig 4: Points extraction
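Note that 6 points give exactly C(6,2) = 15 point pairs, which is where the 15 features come from. A minimal sketch of the feature extraction, applying the distance formula above to every pair:

```python
import math
from itertools import combinations

def euclidean(p, q):
    """Euclidean distance between 3-D points p = (x, y, z) and q."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def feature_vector(points):
    """The 15 pairwise distances among the 6 extracted points."""
    return [euclidean(p, q) for p, q in combinations(points, 2)]
```

Pairwise distances do not change when the hand is translated or rotated as a whole, which makes them a stable description of the hand's shape.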
4. Feature comparison
Once the 15 features are extracted for a gesture, a feature vector is formed from them. The current feature vector is then compared with the feature vectors in the dataset, and the stored gesture whose value is nearest to that of the current gesture gives the action to be performed. To compare the vectors and find the appropriate action for the gesture, the cosine similarity formula is used:

cos(θ) = (A · B) / (‖A‖ ‖B‖) = Σi AiBi / (√(Σi Ai²) √(Σi Bi²))

where Ai and Bi are the components of vectors A and B respectively.
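A minimal sketch of this comparison step, assuming the stored dataset is a mapping from action labels to their reference feature vectors (the dictionary layout and labels are illustrative, not from the paper):

```python
import math

def cosine_similarity(a, b):
    """cos(theta) between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_gesture(current, dataset):
    """Return the label of the stored gesture most similar to the
    current feature vector (cosine similarity closest to 1)."""
    return max(dataset, key=lambda label: cosine_similarity(current, dataset[label]))
```

For example, with dataset = {"forward": f1, "stop": f2, ...}, match_gesture(feature_vector(points), dataset) returns the action the buggy should perform.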
4. HARDWARE USED
1. ATmega32
Fig 5: ATmega32
The ATmega32 is an 8-bit microcontroller from Atmel's megaAVR family. It is based on an enhanced RISC architecture with 131 instructions, almost all of which execute in a single clock cycle, and it operates at a maximum clock frequency of 16 MHz.
2. MAX232
Fig 6: MAX232
The MAX232 IC is used to convert TTL logic levels to RS232 logic levels during serial communication between a microcontroller and a computer. The controller operates at TTL logic levels (0-5 V), whereas serial communication on the computer follows the RS232 standard (-25 V to +25 V). This mismatch makes a direct connection between the two impractical, so the MAX232 sits between them as a level translator.
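On the PC side, the recognized action travels down this serial link to the ATmega32. A hedged sketch using pyserial; the port name, baud rate, and single-byte command codes are assumptions for illustration, since the paper does not specify the serial protocol:

```python
import serial  # pyserial (pip install pyserial)

# Hypothetical one-byte command codes for the buggy's actions.
COMMANDS = {"forward": b"F", "backward": b"B",
            "left": b"L", "right": b"R", "stop": b"S"}

# Port name and baud rate are assumptions; adjust for the actual setup
# (e.g. "/dev/ttyS0" on Linux).
with serial.Serial("COM1", baudrate=9600, timeout=1) as link:
    link.write(COMMANDS["forward"])  # drive the buggy forward
```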
3. ULN2803
Fig 7: ULN2803
The ULN28xx high-voltage, high-current Darlington arrays are used for interfacing between low-level logic circuitry and multiple peripheral power loads, such as solenoids, magnetic print hammers, and heaters. All devices in the family feature open-collector outputs with integral clamp diodes.
4. Sensor used (IR Sensor)
Fig 8: IR Sensor
The IR sensor is a device that detects the IR radiation falling on it. There are several types of IR receivers, built (or buildable) to suit the application. Proximity sensors (used in mobile phones and edge-avoiding robots), contrast sensors (used in line-following robots), and obstacle counters/sensors (used for keeping track of goods and in theft alarms) are some examples that use IR sensors.
A typical IR sensor module has three pins:
 VCC - to power the sensor
 GND - the negative reference
 Output - the analog output of the sensor (some sensors may have more than one output terminal)
In this system the IR sensor is used to detect obstacles in front of the buggy/car.
5. CONCLUSIONS
We have proposed a framework to control a buggy using hand gestures. In this study we used a hand motion dataset that includes numerous gesture representations, and a Leap Motion sensor camera to observe hand motion. To compare the current gesture with the stored gestures, we extract a feature set whose features are calculated with the Euclidean distance formula. This feature set is compared with the stored data using cosine similarity, and the buggy takes the appropriate action.
We used a 3D Leap Motion sensor camera and extracted features of the hand and fingers for maximum accuracy. For detecting gestures in 3D, AI-based algorithms could be explored in further study.
REFERENCES
[1] Ayanava Sarkar, Ganesh Ram R.K., Ketul Arvindbhai Patel, Geet Krishna Capoor, “Gesture Control of Drone Using a Motion Controller”, BITS Pilani, Dubai Campus, 2016.
[2] Qingyu Li, Panlong Yang, “A Gesture Guided Moving Robot with Microsoft Kinect”, College of Communication Engineering, PLA University of Science and Technology, 2013.
[3] Tomasz Grzejszczak, Adrian Łęgowski, Michał Niezabitowski, “Application of Hand Detection Algorithm in Robot Control”, Akademicka 16 Street, 44-100 Gliwice, Poland, 2016.
[4] Poonam Chavan, Tushar Ghorpade, Puja Padiya, “Indian Sign Language to Forecast Text using Leap Motion Sensor and RF Classifier”, Ramrao Adik Institute of Technology, Nerul, Navi Mumbai, Maharashtra 400706, 2017.
[5] Y. Pititeeraphob, P. Choitkuman, N. Thongpance, K. Kullatham, Ch. Pintavirooj, “Robot-Arm Control System using Leap Motion Controller”, Rangsit University, Pathumthani, Thailand, 2017.
[6] Shogo Okada, Kazuhiro Otsuka, “Recognizing Words from Gestures: Discovering Gesture Descriptors Associated with Spoken Utterances”, School of Computing, Tokyo Institute of Technology, Japan; NTT Communication Science Laboratories, Japan, 2017.
[7] A. Marcos-Ramiro et al., “Body Communicative Cue Extraction for Conversational Analysis”, in IEEE Int. Conf. on Automatic Face and Gesture Recognition, 2013.
[8] A. Joshi et al., “A Random Forest Approach to Segmenting and Classifying Gestures”, in IEEE Int. Conf. on Automatic Face and Gesture Recognition, volume 1, 2015.