International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056
Volume: 09 Issue: 01 | Jan 2022 www.irjet.net p-ISSN: 2395-0072
© 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 1557
HAND GESTURE CONTROLLED MOUSE
Yash Mali1, Himani Malani2, Nishad Mahore3, Rushikesh Mali4
1-4 Department of Engineering Sciences and Humanities, Vishwakarma Institute of Technology, Pune 411037
---------------------------------------------------------------------***----------------------------------------------------------------------
Abstract - This paper proposes a virtually controlled mouse system that uses machine learning technology. The software allows the user to operate the mouse using hand gestures. The system requires only a webcam as an input device. The tools required are Python and several of its libraries, such as OpenCV, NumPy and math. The hand gestures made by the user are analysed by the system to perform a particular task. Thus, we propose a simple way to make human-machine interaction easier.
Key Words: Human-machine interactions, Hand Gestures, Hand Tracking, Python, Gesture Recognition.
1. INTRODUCTION
In the past few years, human-machine interaction has become very significant. In a world where machines play a necessary role in almost every aspect of our lives, it is important to cut back on manual machine handling as much as possible and to make human-machine interactions simple and less laborious. Since there is a drastic difference between human and machine language, it is crucial to find a common ground for interacting with machines. One such way is by using hand gestures. Hand gestures are an efficient means of communication among humans, and sometimes even animals, so finding a way to use them to communicate with machines is only logical. A gesture recognition system provides a natural, innovative and modern way of non-verbal communication, with a wide range of applications in human-machine interaction and sign language. The setup consists of a single camera that captures the gesture formed by the user and takes it as input. The primary goal of gesture recognition is to create a system that can identify specific human gestures and use them to convey information for machine control. Thus, the system can be operated using hand gestures, without a keyboard or mouse. The first step is frame capture, after which the hand gestures made by the user are processed and the corresponding task is performed by the PC. The range of tasks that may be performed using the proposed software varies from dragging the mouse to drawing on the screen using hand gestures. Other functions include right click, left click, volume control, etc.
2. LITERATURE SURVEY
Current virtually controlled mouse systems use a colour recognition technique. The present virtual mouse control systems consist of simple mouse operations employing a hand recognition system, with which the mouse pointer, left click, right click, drag, and so on can be controlled.
Abhilash S, Lisho Thomas, Naveen Wilson and Chaithanya C have proposed in their paper a system that uses nothing more than a low-resolution webcam acting as a sensor, which is able to track the user's hand, wearing coloured caps, in two dimensions.
Vijay Kumar Sharma, Vimal Kumar, Md. Iqbal, Sachin Tawara and Vishal Jayaswal have created a system that controls cursor movement using two fingers. They have proposed a model that controls the movements of the mouse using the two fingers and the gestures made by them.
In the paper "Fusion of Information From Data Glove and a Camera for Hand Gesture Recognition" by Ramakant, the author proposed a model that uses an algorithm for hand gesture recognition based on the information received from a data glove and a camera for all static hand gestures.
3. METHODOLOGY
The software is divided into 4 modules:
A. Hand-tracking Module: We first enable the webcam for video capturing. The webcam captures frames continuously so that no gesture is missed. The system reads each frame and converts it from one colour space to another. The coordinates of the hand landmarks are then calculated in terms of the frame width and height and converted into pixels. The hand landmarks are then displayed with the help of coloured markers.
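The paper does not name the landmark detector it relies on; the 21 hand points described in Section 5.1 match MediaPipe Hands, so the minimal sketch below assumes that library alongside OpenCV for capture, colour conversion and display. It is an illustrative reconstruction of the steps listed above, not the authors' exact code.

import cv2
import mediapipe as mp

# assumption: MediaPipe Hands supplies the 21 landmarks mentioned in Section 5.1
mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)                      # enable the webcam
with mp_hands.Hands(max_num_hands=1) as hands:
    while cap.isOpened():
        ok, frame = cap.read()                 # read one frame of the live stream
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # colour conversion step
        result = hands.process(rgb)
        if result.multi_hand_landmarks:
            h, w, _ = frame.shape
            for hand in result.multi_hand_landmarks:
                # convert normalised landmark coordinates into pixel coordinates
                pixels = [(int(lm.x * w), int(lm.y * h)) for lm in hand.landmark]
                # draw coloured markers and the lines joining them
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Hand Tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
cap.release()
cv2.destroyAllWindows()

The pixel coordinates collected here are what the volume, painter and mouse modules below consume.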
B. Volume Control: The function of this module is to control the volume of the computer using hand gestures. We first import the hand-tracking module to enable the system to capture, analyse and decipher the hand gestures made by the user, and to calculate the coordinates of the hand landmarks. We then set the minimum and maximum volume range of the system and map the landmark coordinates in proportion to the previously declared range. Thus the gap between the thumb and the forefinger of the user's hand determines the volume of the computer.
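The paper does not state which API changes the system volume; a common choice on Windows is pycaw, assumed in the sketch below, together with a landmark list produced by the hand-tracking module. NumPy's interp maps the thumb-index gap onto the declared volume range, as described above; the 30-250 pixel gesture range is an assumption.

import math
import numpy as np
from ctypes import cast, POINTER
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume   # assumed volume backend

# access the default speaker endpoint and its volume range
devices = AudioUtilities.GetSpeakers()
interface = devices.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
volume = cast(interface, POINTER(IAudioEndpointVolume))
vol_min, vol_max = volume.GetVolumeRange()[:2]          # e.g. about -65 dB to 0 dB

def set_volume_from_landmarks(landmarks):
    """Map the thumb-tip / index-tip gap onto the system volume range.

    landmarks: list of 21 (x, y) pixel coordinates from the hand-tracking module.
    """
    x1, y1 = landmarks[4]        # thumb tip
    x2, y2 = landmarks[8]        # index fingertip
    gap = math.hypot(x2 - x1, y2 - y1)
    # 30-250 px is an assumed usable range for the pinch gesture
    level = np.interp(gap, [30, 250], [vol_min, vol_max])
    volume.SetMasterVolumeLevel(level, None)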
C. Virtual Painter: This module enables the user to draw on the screen using their fingers. Again, we first import the hand-tracking module. We use some header images to make the interface more interactive and reuse some previously
defined functions from the hand-tracking module. We then define two modes: selection mode, in which the user selects the colour of the brush, and drawing mode, in which the user draws on the board.
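A sketch of this two-mode logic is given below. The fingers_up flags, the 100-pixel header strip and the colour positions are illustrative assumptions; the paper only specifies that selection mode picks the brush colour and drawing mode draws on the canvas.

import cv2
import numpy as np

canvas = np.zeros((480, 640, 3), np.uint8)     # blank board to draw on
draw_color = (0, 0, 255)                       # current brush colour
prev_x, prev_y = 0, 0

def update_painter(frame, landmarks, fingers):
    """landmarks: 21 (x, y) pixel points; fingers: five up/down flags (thumb..pinky)."""
    global prev_x, prev_y, draw_color
    x1, y1 = landmarks[8]                      # index fingertip
    if fingers[1] and fingers[2]:              # selection mode: index + middle up
        prev_x, prev_y = 0, 0                  # break the current stroke
        if y1 < 100:                           # fingertip inside the header strip
            draw_color = (255, 0, 0) if x1 < 320 else (0, 255, 0)
        print("Selection mode")
    elif fingers[1]:                           # drawing mode: index finger only
        if prev_x == 0 and prev_y == 0:
            prev_x, prev_y = x1, y1
        cv2.line(canvas, (prev_x, prev_y), (x1, y1), draw_color, 10)
        prev_x, prev_y = x1, y1
        print("Drawing mode")
    # overlay the canvas on the live video frame
    return cv2.addWeighted(frame, 0.7, canvas, 0.3, 0)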
D. Mouse Control: With this module, we can control the mouse of the computer and perform basic mouse functions using just our first two fingers. In this module, we use "autopy", a third-party Python library that lets the user perform mouse operations easily and efficiently. Using this module, the user can control basic mouse operations such as movement, scrolling and clicking with hand gestures.
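A minimal sketch of the mouse module's core idea follows, using autopy for cursor movement and clicking. The 640x480 capture resolution, smoothing factor, pinch threshold and the landmark list passed in are illustrative assumptions; only movement and a pinch-based left click are shown here.

import math
import numpy as np
import autopy

screen_w, screen_h = autopy.screen.size()
cam_w, cam_h = 640, 480          # assumed capture resolution
smooth = 5                       # smoothing factor to reduce cursor jitter
cur_x, cur_y = 0.0, 0.0

def control_mouse(landmarks):
    """Move the cursor with the index fingertip; pinch index and middle to click."""
    global cur_x, cur_y
    ix, iy = landmarks[8]        # index fingertip
    mx, my = landmarks[12]       # middle fingertip
    # map camera-frame coordinates proportionally onto the screen
    target_x = np.interp(ix, (0, cam_w), (0, screen_w - 1))
    target_y = np.interp(iy, (0, cam_h), (0, screen_h - 1))
    cur_x += (target_x - cur_x) / smooth
    cur_y += (target_y - cur_y) / smooth
    autopy.mouse.move(cur_x, cur_y)
    if math.hypot(mx - ix, my - iy) < 30:    # fingertips brought together
        autopy.mouse.click()                 # left click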
4. DESIGN
4.1. Video Capturing: To decipher hand gestures, we first have to capture live images of the hand gestures made by the user. This is done using a webcam, which continuously provides a sequence of images at a particular frame rate (frames per second, FPS).
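For illustration, this capture stage reduces to a few OpenCV calls; device index 0 and the printed properties are assumptions about a typical setup.

import cv2

cap = cv2.VideoCapture(0)                          # default webcam
fps = cap.get(cv2.CAP_PROP_FPS)                    # nominal frames per second
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
print(f"Capturing at ~{fps:.0f} FPS, {width}x{height} px")
ok, frame = cap.read()                             # one live image of the sequence
cap.release()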
4.2. Calculation of coordinates of hand landmarks:
After capturing the gestures, the system analyses the
images and then calculates the coordinates of hand
landmarks.
4.3. Tracking the Cursor: After determining the coordinates, the mouse driver is accessed and the coordinates are sent to the cursor, which positions itself accordingly. As a result, the mouse moves proportionally across the screen as the user moves their hand across the camera's field of view.
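As a worked example of this proportional mapping, with a 640 by 480 camera frame and a 1920 by 1080 screen (illustrative resolutions), an index fingertip detected at pixel (160, 120) is sent to the cursor as (160/640 × 1920, 120/480 × 1080) = (480, 270).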
5. RESULTS AND DISCUSSIONS
5.1 Hand Tracking Module: We can observe that the video is captured and shown in a separate window. The video highlights the hand above the wrist with 21 landmark points and lines joining them. The points and the connecting lines are highlighted in different colours.
Fig-1: Hand Tracking using the algorithm.
5.2 Volume Control Module: We can observe that the software uses the hand-tracking module to detect the 21 points on the hand. It then focuses on the index finger and thumb and highlights them with a white line joining the two points. There is also one more point in the middle of this white line whose colour changes when the volume changes: it indicates the maximum, optimum and minimum volume range by turning red, green and blue respectively.
Fig-2: Volume Control using the algorithm
5.3 Virtual Painter Module: We can observe that here, too, the software uses the hand-tracking module and tracks the 21 points of the hand. We can also observe the header image used to make the interface more interactive and the canvas used to draw or write. The active mode, Drawing mode or Selection mode, is printed on the terminal automatically.
Fig-3: Virtual paint on screen using the algorithm
5.4 Mouse Control Module: We can observe that in this module the working and the display window are very similar to the hand-tracking module, but as we move the index finger, the cursor on the screen moves as well. This shows that we can control the cursor through the movement of the index finger.
Fig-4: Mouse cursor control using the algorithm.
6. LIMITATIONS
The user has to give some time to get accustomed to this type of system, since at the beginning there will be a little difficulty in controlling the cursor. Hence, it takes some time to get familiar with the system. The module can track only one hand, so there may be a problem when there is more than one hand in the frame. The hand-tracking software also requires an optimum level of light and brightness, which limits its use in places with low light.
7. FUTURE SCOPE
The virtual mouse control software can be used in various ways in the coming future. The virtual painter module can help teachers and professors elaborate their teaching to students in an interactive way. It can also be used by professionals to make presentations more understandable. The mouse control module can take human-computer interaction to the next level. People with conditions such as paralysis who were unable to use computers may now be able to do so, since they can control the operations using just their fingers.
8. CONCLUSION
The main purpose of this paper is to implement the idea of a virtual mouse. The main tasks performed and controlled by the virtual mouse are clicking, double clicking, adjusting the volume, and virtual painting. From the above discussion it can be concluded that the virtual mouse plays an important role in interacting with the computer as a virtual machine, and it also reduces hardware cost by eliminating the use of a physical mouse, which further helps to reduce and avoid wrist-related damage such as carpal tunnel syndrome (CTS).
9. ACKNOWLEDGEMENT
We would like to thank Vishwakarma Institute of Technology for presenting us this opportunity. We would also like to thank our guide, Prof. Manisha More, for her guidance.
REFERENCES
1. Riza Sande, Neha Marathe, Neha Bhegade, Akanksha Lugade, Prof. S. S. Jogdand, PCET's Pimpri Chinchwad Polytechnic. International Journal of Advanced Research in Science, Engineering and Technology, Vol. 8, Issue 4, April 2021.
2. Vijay Kumar Sharma, Vimal Kumar, Md. Iqbal, Sachin Tawara, Vishal Jayaswal, Department of Computer Science and Engineering, MIET, Meerut. GIS Science Journal, ISSN: 1869-9391.
3. Sachin Tawara, Vishal Jayaswal, Department of Computer Science and Engineering, MIET, Meerut. GIS Science Journal, ISSN: 1869-9391, Volume 7, Issue 12, 2020.
4. Abhilash S, Lisho Thomas, Naveen Wilson, Chaitanya, Department of Computer Science Engineering, Mar Athanasius College of Engineering.
5. Naveen Wilson, Chaitanya, Mar Athanasius College of Engineering. International Research Journal of Engineering and Technology (IRJET), Volume 05, Issue 04, April 2018.
6. Ramakant, Rajiv Gandhi University of Knowledge and Technologies. International Journal of Engineering Research and Technology (IJERT), ISSN: 2278-0181, Vol. 3, Issue 3, March 2014.
7. Jeong H-J, Lee M-J, Lee C.E, Sangjin Kim, Ha Y-G. Journal of Internet Technology, 10.6138/JIT.2017.18.4.20160130b.
8. Paula Trigueiros, Fernando Ribeiro, Luis Reis. Iberian Conference on Information Systems and Technologies (CISTI), 9789899624764.
9. Delip K, Chandrashekhar M, Hameed Badhusha Irfan F, Kalaiarasi P. International Journal of Creative Research Thoughts (IJCRT), ISSN: 2320-2882, Volume 8, Issue 3, March 2020.
10. Indrajit Bastapure, Tanmai Aurangabadkar, Revati Tavare, AISSMS Institute of Information Technology. International Journal of Engineering Research and Technology (IJERT), ISSN: 2278-0181, Volume 8, Issue 3, March 2014.
11. Cynthia Tuscano, Blossom Lopes, Stephina Machado, Pradnya Rane, St. Francis Institute of Technology. International Journal of Modern Engineering Research (IJMER), ISSN: 2249-6645, Vol. 3, Issue 2, March-April 2013.