Journal of Physics: Conference Series
PAPER • OPEN ACCESS
Social Service Robot using Gesture recognition technique
To cite this article: D. Jessintha et al 2023 J. Phys.: Conf. Ser. 2466 012020. doi:10.1088/1742-6596/2466/1/012020
Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution
of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
Published under licence by IOP Publishing Ltd
Social Service Robot using Gesture recognition technique
D. Jessintha1, P. Praveen Kumar2, S. Jaisiva3, T. Ananth Kumar4*, Christo Ananth5
1 Electronics and Communication Engineering, Easwari Engineering College, Chennai, India
2 Department of Information Technology, Sri Manakula Vinayagar Engineering College, Puducherry, India
3 Department of EEE, M. Kumarasamy College of Engineering, Karur, Tamil Nadu, India
4 Computer Science and Engineering, IFET College of Engineering, Tamil Nadu, India
5 Department of Natural and Exact Sciences, Samarkand State University, Uzbekistan
*tananthkumar@ifet.ac.in
Abstract. A robot is a machine that can automatically perform a task or a series of tasks based on its programming and environment. Robots are artificially built machines or devices that can perform activities with great accuracy and precision while minimizing time. Service robots are technologically advanced machines deployed to carry out and maintain certain activities, and research findings show that serving robots are now being deployed worldwide. Social robotics is one such field, centered on interaction between humans and an artificially built machine; these machines interact with humans and can also understand social terms and words. Modernization has brought changes in design and mechanisms owing to the continual growth in technology and innovation. Food industries are therefore dynamically adapting to new developments in automation to reduce human workload and increase the quality of service. Deploying a robot in the food industry to aid deaf and mute people who face social constraints has been a growing challenge for engineers over the last few decades. Moreover, a contactless, speedy service system that accomplishes its task with utmost precision and reduced complexity is a feat yet to be perfected. The proposed system preserves personal hygiene, improves quality of service, and reduces labour costs.
1. Introduction
Social service robots, in contrast to their industrial counterparts, have a definite role to play. Frequent enhancements in technology and innovation have led to continual development in robotic systems[1][2]. Intelligent robotic systems deploy technologies such as gesture and voice recognition to reduce human workload while maintaining the quality of the service rendered. The robot is summoned using a button embedded in an RF transmitter module. After receiving the signal, the robot uses an attached color sensor and the line-following technique to move along a pre-determined path to its destination[3-5]. A gesture recognition system programmed on a Raspberry Pi 4 captures the gesture shown by the user and serves the desired particulars, such as water, tea, coffee, or other beverages, without any physical contact. On receiving the relevant inputs from the user, the robot arm, designed using MG996r and SG90 servos, grabs the
cup and places it beneath the dispenser. The custom-built dispenser, driven through the L298N and a submersible water pump, transfers the requested beverage to the cup, and the robotic arm then serves the order on a tray. The methodologies used are described in Section 3, and the hardware modules used in this design are discussed in Section 4. Through this method, the effort such users would otherwise need to obtain their order is removed. Robots help reduce the physical stress on employees and maintain a cleaner ambiance without compromising food quality. A social service robot should aid people in diverse settings, such as a shop, restaurant, hospital, or even the home. The sole purpose of this robot is to improve robot-human interaction through sophisticated technological advances. This paper describes the basic operation, which can be applied to various places for specific tasks[5-9].
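For clarity, the overall flow can be expressed as a short control loop. The Python sketch below is purely illustrative: every helper function is a hypothetical stub standing in for one of the hardware modules detailed in Sections 3 and 4.

# Illustrative control flow of the service robot. Every helper below is a
# hypothetical stub; a real deployment would replace each body with driver code.

def wait_for_rf_summon():      # 27 MHz receiver, Section 3.1
    return "table-red"

def follow_colored_line(table):  # TCS3200 line follower, Section 3.2
    print(f"following line to {table}")

def recognize_finger_count():  # camera + OpenCV, Section 3.3
    return 2

def send_count_to_arduino(n):  # I2C link to the Uno, Section 3.4
    print(f"sending count {n} over I2C")

def serve_once():
    table = wait_for_rf_summon()
    follow_colored_line(table)
    count = recognize_finger_count()
    send_count_to_arduino(count)
    # The Arduino then drives the arm and dispenser (Sections 3.5 and 3.6).

serve_once()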
2. Related Works
Due to technological advances, robots are becoming more common. Soon, we may see intelligent
robots helping people in need. Human-robot interaction research is becoming more critical as it
integrates hand sign recognition components[10]. Hand gesture recognition may make human-robot
communication more natural. This may help humans and robots collaborate to increase application
efficiency and avoid problems[11]. De Smedt et al. classified hand gestures using the SVM algorithm with skeletal and depth data[12]. Nunez et al. classified hand gestures using a combination of HMM and SVM classifiers; image acquisition and segmentation used Kinect sensor data and a skeleton-based method[13].
According to Tang et al. [14], hand gestures can be tracked in real-time using a recursive connected
component algorithm and the 3D geodesic distance of pixelated hand skeletons. Praveen Kumar et al. studied nonverbal communication, using an R-CNN to improve avocado harvesting in a simulated workspace[15]; this improved efficiency. A robot located workers and determined whether they needed help by recognizing human activity, hand gestures, or flags; CNN/LSTM (long short-term memory) networks were used, and the harvesting robot learned from the workers' hand signals. Hand gesture recognition results may vary with image color and texture: skin color differs between people and regions, and lighting affects both color and texture. Shape-based features can also recognize hand gestures. We took a different approach: on a typical hand, thumb and finger lengths are about the same on both hands. The frame rate of hand-shape-based gesture recognition is comparable to most existing systems, and the number and accuracy of recognized gestures were among the best.
saying to collaborate effectively with them. Humans and robots must communicate using natural
gestures in HRC manufacturing[16]. The hand is differentiated from its surroundings using a skin
tone. Principal component analysis categorized all eight static gestures. Pishardy et al. use a restricted
coulomb energy (RCE) neural network to separate a hand from an image. To train a second RCE
neural network to recognize static hand gestures, measure the hand-to-arm distance and number of
spread fingers. 95% accuracy is possible with an eight-size gesture lexicon[17]. A color camera
captures real-time images in full color. In Otsu segmentation, the Y-Cb-Cr color space distinguishes
the moving hand from the constant background. The k-curvature algorithm [18] determines an image's
high and low points, the peaks and valleys of the contour. A gesture's peak-to-valley ratio determines its group, and classification can be 95.2% accurate; the system recognizes six gestures. A single limiting assumption restricts its throughput: the hand is taken to be the part of the scene most accessible to the camera. The hand's orientation can be determined using a vector from the hand's center to its farthest point. The robot's movement is controlled by the hand's orientation, while its straight-line velocity is determined by the hand's distance within the image.
Makrini et al. found that hand shape can control a robot. Instead of using high-level depth and color
features, all intensity values in a box around the hand are used[19]. This replaces feature extraction.
The average neighborhood margin maximization (ANMM) algorithm reduces the dimensionality of the feature space, and four hand gestures are classified using the nearest-neighbor classifier. Wadhawan et al. [20] classify six hand gestures using a geometric property: skin color is evaluated in color images to locate the hand, the distance between the hand's center and each outline point is calculated, and a gesture's number of peaks and valleys determines its category. Zhang et al. created a system that
recognizes hand gestures online in real time[21]. After the person's body is segmented out, the chamfer distance aligns a hand template with the image's edges, and a realistic skin color model is then applied to the hand. Tracing the hand's center across multiple frames yields the final feature vector, which encodes hand location and motion; hidden Markov models learn the hand movements. Because color cameras offer high frame rates, most current techniques use video sequences. Human-robot interaction
(HRI) has become a focus of research in computer vision and robotics due to the broad range of
applications in the field of human-computer interaction (HCI).
3. Working Prototype
The robot, which comprises several modules, is initially stationed at the origin point of the area in which it is deployed. The following techniques demonstrate the basic idea of this project. Figure 1 shows the block diagram of the proposed system.
3.1. Transmission and Reception
A transmitter-receiver pair is enclosed in the base of the robot and operates at a frequency of 27 MHz. Radio-frequency communication transfers signals from the transmitter to the receiver by adjusting current and voltage parameters that alter the oscillation rate, and such links can operate anywhere from about 20 kHz to around 200 GHz. Here, when the robot is summoned from the user's side, an RF signal is transmitted from the transmitter situated at the table to the receiver housed within the base of the robot. This receiver module is connected to the microcontroller, which initiates the robot's movement. Consequently, the base of the robot moves to the table where the user is seated, guided by the TCS3200 module.
Figure 1. Block diagram of the proposed system
3.2. Line follower with color sensor
After the signal is successfully received, the robot must reach the table from which it was summoned. DC motors, powered by the motor driver, make forward and backward movement possible depending on commands from the microcontroller. The base of the robot therefore has to plan its path, and it uses a color sensor to verify the path it takes. The
color sensor module is calibrated to detect two colors, Red and Blue. The path to the table is laid using
these two colors. Applying the principles of the line follower technique, the robot has to sense the
right color and reach the table where the person is seated.
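The red/blue decision underlying this technique can be illustrated with a small sketch. The Python code below shows one plausible decision rule, assuming per-channel color intensities are available; in the prototype this logic runs on the Arduino rather than on the Pi, and the 1.5 ratio is an assumed tuning value.

# Illustrative red/blue line-following decision. Assumption: the color
# sensor yields per-channel intensity readings; in the prototype this
# logic runs on the Arduino, not in Python.

def classify_color(red, green, blue):
    """Classify a sensor reading as 'red', 'blue', or 'floor'."""
    if red > 1.5 * blue and red > green:
        return "red"
    if blue > 1.5 * red and blue > green:
        return "blue"
    return "floor"

def steer(target, reading):
    """Follow the line whose color matches the summoning button."""
    if classify_color(*reading) == target:
        return "forward"
    return "search"   # sweep left/right until the target color reappears

print(steer("red", (220, 60, 40)))   # -> forward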
3.3. Gesture recognition
The camera mounted at the head of the robot captures a frame containing a real-time image of the hand. The frame's threshold is adjusted manually from its default value according to the area's lighting conditions, blur is applied to suppress unwanted regions, and the frame is converted to grayscale. Finally, the microcomputer interprets the gesture shown in the frame as a count, which is stored in a list.
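A minimal sketch of this pipeline, in the spirit of the finger-detection code referenced in [1], is given below. The blur kernel size, threshold value, and defect-depth cutoff are assumed tuning parameters, not values taken from the prototype.

# Minimal sketch of the gesture pipeline: grayscale, blur, threshold,
# contour analysis. Parameter values are assumptions to be tuned to the
# room's lighting.
import cv2

def finger_count(frame, thresh_val=60):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (41, 41), 0)           # suppress noise
    _, mask = cv2.threshold(blur, thresh_val, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)            # largest blob = hand
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Each deep convexity defect is a valley between two raised fingers;
    # note a single raised finger produces no defect under this simple rule.
    valleys = sum(1 for i in range(defects.shape[0])
                  if defects[i, 0, 3] > 10000)           # depth cutoff (assumed)
    return valleys + 1 if valleys else 0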
3.4. I2C interfacing
After the correct count is obtained, the value is passed to the microcontroller. The microcontroller is powered by a 5 V rechargeable lithium-polymer battery, whereas the Raspberry Pi 4 uses 3.3 V logic on its GPIO bus, so level compatibility must be kept in mind when wiring the bus. The essential difference between the Arduino and the Pi is the number of I/O ports they contain; the Arduino in particular offers several analog and digital ports, which can also handle interrupts and timing circuits depending on their usage. Therefore, to utilize their maximum potential, the two boards are interfaced so that they can communicate with each other, using library functions available in Python and the Arduino IDE.
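On the Pi side, such a transfer takes only a few lines of Python. The sketch below assumes the smbus2 library and a slave address of 0x08; the actual address and message format used in the prototype are not specified here.

# Sketch of the Pi-side I2C transfer. The slave address 0x08 is an
# assumption; the Arduino sketch would register the same address with
# Wire.begin(8).
from smbus2 import SMBus

ARDUINO_ADDR = 0x08   # assumed 7-bit I2C address of the Uno
I2C_BUS = 1           # /dev/i2c-1, the Pi 4's default GPIO I2C bus

def send_finger_count(count):
    """Send the recognized finger count (0-5) to the Arduino."""
    with SMBus(I2C_BUS) as bus:
        bus.write_byte(ARDUINO_ADDR, count & 0xFF)

send_finger_count(3)   # e.g. three fingers -> third beverage option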
3.5. Dispenser
The dispenser is built around a 12 V DC pump, which offers more rotations per minute than a 9 V pump. Using a submersible water pump minimizes expense and makes replacement easy if the pump fails. The pipe wound around the DC motor must be checked and analyzed before use; its dimensions should be chosen according to the liquid that flows through it and the pressure needed to push the fluid from the storage unit to the cup. The dispenser can serve up to two different drinks. The pipe used in the prototype is 5 mm in diameter and 30 cm long. By adjusting the flow rate, the amount of liquid poured into the cup can be increased or decreased; this prototype can fill 100 ml into the cup in about 3 seconds.
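These figures imply a flow rate of roughly 33 ml/s, so the pump run time for any requested volume is a one-line calculation, as the helper below shows. In the prototype the microcontroller, not Python, actually switches the L298N channel.

# Dispense-time helper based on the figures quoted above: 100 ml in about
# 3 s implies a flow rate near 33 ml/s. This only computes how long to
# energize the pump for a requested volume.
FLOW_ML_PER_S = 100 / 3.0   # approximate rate measured on the prototype

def pump_run_time(volume_ml, flow=FLOW_ML_PER_S):
    """Seconds to keep the pump energized for the requested volume."""
    return volume_ml / flow

print(round(pump_run_time(150), 2))   # -> 4.5 s for a 150 ml pour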
3.6. Robotic arm
A robotic arm placed to the right of the dispenser positions the cup below the dispenser at a fixed coordinate. After the cup is filled, the arm serves it on top of a tray. The arm is manipulated by the microcontroller and actuated with a metal-gear servo motor. Constructed with two degrees of freedom, it can lift heavy objects, so the user can place heavier cups between its claws. A cup holder can be attached if the user requires it; for now, the cup is placed in the robot's gripper to receive the order. This spares the user any doubt about the cleanliness of the cup.
4. Module Description
4.1. Module 1: Raspberry PI 4
The proposed prototype is shown in figure 2. The Raspberry Pi 4 is the latest iteration of the Pi series of microcomputers. It has a 64-bit quad-core processor and offers wireless LAN, Ethernet, and Bluetooth connectivity. These cores operate at a
synchronized speed of 1.5 GHz. It is a computer with one of the smallest form factors available, and in this prototype it coordinates the working of every module.
Consequently, the Pi transfers data between itself and the microcontroller, establishing two-way communication termed I2C (Inter-Integrated Circuit) communication. It also processes the finger count captured by the camera, schedules tasks for each module connected to the microcontroller, and handles interrupts. However, due to its small form factor, there are some notable drawbacks, such as processing delays and thermal throttling. The Pi can also be used as a microcontroller by connecting an external module.
Figure 2. Proposed Prototype
4.2. Module 2: Arduino UNO
The Arduino UNO is a microcontroller board with a total of 20 input/output pins, divided into digital and analog pins, built around the ATmega328 chip. The prototype uses only the digital section of the board, which is powered by a 5 V rechargeable lithium-polymer battery. The Arduino offers more readily usable digital and analog pins than the microcomputer, so it manipulates dependent modules such as the motor drivers and servo motors while keeping the
expenses to a minimum. Apart from hardware manipulation, the PWM pins present in the digital
section of this module also help in timing and interrupt handling.
4.3. Module 3: RF Transceiver
The 27 MHz remote control is a two-channel, button-type transmitter that sends signals to the receiver using radio waves. The left button corresponds to the red line and the right button to the blue line. Each button behaves as a switch: when the right button is pushed, its signal goes HIGH while the other stays LOW, and the left button works the same way. Accordingly, when a control button is pushed, the transmitter sends the corresponding electrical pulses through the air. The transmitter has its own power source, usually a 9 to 12 V battery; without the battery, it cannot send radio-frequency signals to the receiver. Radio frequency over short distances offers good noise immunity and is used to command the robot to set the destination towards which it then moves. Using 27 MHz also helps minimize latency and power consumption.
4.4. Module 4: Motor driver
The L298N contains two H-bridge circuits for controlling low-current-rated motors. Connected to the microcontroller, this high-power module can drive two DC motors (or one stepper motor) with directional and speed control. The prototype uses it to operate the DC motors and pump connected to it, reducing the space needed for the setup.
4.5. Module 5: camera
The Logitech QuickCam Notebook Deluxe is a digital camera with 640×480 resolution and a 0.3-megapixel CMOS sensor that captures images and records real-time video by allowing light to pass through the lens. Here, the camera captures a real-time image of the hand within a region of interest spanning 0.5 of the frame in the X direction and 0.8 in the Y direction, and the result is sent to the microcomputer for further processing.
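Reading the 0.5 and 0.8 values as fractions of the frame (an assumption, since the units in the original description are unclear), the crop can be expressed in a few lines of OpenCV; the camera index and output file below are also assumptions.

# Sketch of the region-of-interest crop described above, treating 0.5 and
# 0.8 as fractions of the frame width and height (assumed interpretation).
import cv2

cap = cv2.VideoCapture(0)                 # assumed camera index
ok, frame = cap.read()
if ok:
    h, w = frame.shape[:2]
    roi = frame[0:int(0.8 * h), int(0.5 * w):w]   # right half, top 80%
    cv2.imwrite("roi.png", roi)
cap.release()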
4.6. Module 6: Submersible DC Pump
A submersible pump is a form of a pressure pump that can be fully submerged in water. Using a shaft,
attached to the DC motor, the water is pumped out through the end pipe. The DC motor is protected
from water by an insulated coating made of plastic. The pipe used in the prototype is 5mm in diameter
and 30cm in length. By adjusting the rate of flow, the amount of liquid dispensed into the cup can be
increased or decreased. This prototype can fill 100 ml into the cup for about 3 seconds.
4.7. Module 7: Servo Motor (SG90)
Servo motors are high-torque actuators widely employed in fields such as robotics and automation. Because of their current requirements, the power supply must be sufficient to drive all the servos connected to the microcontroller. In this prototype, the SG90 module controls the action of the robotic claw; its geared shaft can be electrically commanded in 1-degree steps. The robotic claw opens to a default value of 55 degrees, after which it returns to its initial position.
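Assuming the standard hobby-servo convention of a 50 Hz signal whose pulse width varies between roughly 1 ms (0 degrees) and 2 ms (180 degrees), the claw's 55-degree default maps to a pulse of about 1.3 ms, as the helper below computes. The 1-2 ms range is the usual convention, not a value taken from the prototype.

# Standard hobby-servo timing applied to the SG90 claw (assumed 1-2 ms
# pulse range over 0-180 degrees at 50 Hz).
def pulse_width_us(angle_deg):
    """Pulse width in microseconds for a target angle (0-180 degrees)."""
    return 1000 + (angle_deg / 180.0) * 1000

print(pulse_width_us(55))   # -> ~1305.6 us for the claw's default position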
4.8. Module 8: Servo Motor (MG996r)
Similar to the SG90, the MG996r is a metal-gear servo motor whose geared shaft is built from metal and electrically controlled, allowing it to lift heavier, sturdier objects with ease. The prototype uses it to drive the robotic arm: since the arm is custom-built and bulky, the metal gearing keeps the servo from being damaged when a heavier object is lifted.
5. Results and Discussion
The proposed prototype helps to tackle challenges in robot-human interaction. A remote-
controlled RF signal is first transmitted to the receiver embedded in the robot's base. After setting its
destination coordinate, using the line follower technique along with the help of a color sensor, limited
to two colors, the robot can move towards the table where the signal was first transmitted. The flow is
then shifted to the gesture recognition system that helps to detect the fingers shown within the frame
captured using a camera. This framework consists of the representation and the decision processes.
The representation process converts the raw numerical data into a form adapted to the decision
process, which then relays this information to the next stage. The robot therefore has to be placed in surroundings with good lighting conditions.
Consequently, the robotic hand, built with 2 degrees of freedom, activates, thus grabbing the cup
within its gripper and swiveling to place it below the dispenser at a particular assigned coordinate. The
custom-built dispenser pours the requested beverage, after which the claw grabs the cup again and serves it on a tray. The amount of beverage poured into the cup can be adjusted manually. This prototype is an uncomplicated model that requires further modification and customization before deployment in real-world scenarios; nevertheless, it represents a step that demanded intensive engineering and programming.
6. Conclusion
Gesture recognition is a topic in computer science and language technology concerned with interpreting human gestures via mathematical algorithms. It is a field where researchers are actively working to break the barrier between human
and robot interaction. The need for handheld devices can be reduced by employing this concept of
gesture recognition which opens up an avenue of newer specialized interactive devices. Our project
thus helps bridge the gap between robots and humans using this technology. It serves as a gateway for those who are deaf and mute, who can now engage in social interaction without needing
a third person to aid them. Due to the recent pandemic, vendors must adopt innovative strategic ways
of serving and attending to guests. Moreover, even in hospitals and houses, older adults need
assistance to get beverages to satisfy their thirst. By employing this prototype, these issues can be
overcome with ease. Socially and physically challenged people can also use such a machine to serve
their particulars without needing another human being in their care. A contactless form of interaction
is achieved, thus limiting the spread of germs and viruses. This ensures cleanliness and a hygienic way
of serving the ordered particulars. As contactless forms of service spread in the near future, user satisfaction and quality of service can be expected to increase substantially. Despite its limitations, this prototype can be
further enhanced with technologies such as AI and Machine learning to detect surrounding objects and
Computer Vision to aid the robot in dynamic room mapping.
References
[1] https://github.com/lzane/Fingers-Detection-using-OpenCV-and-Python, accessed on 1st Dec 2022.
[2] Masuda T, Misaki D. Development of Japanese Green Tea Serving Robot “T-Bartender”.
Proceeding of the IEEE International Conference on Mechatronics & Automation; July
2005; Niagara Falls, Canada. Fukuroi-shi, Toyosawa, Japan: Department of Mechanical
Engineering, Shizuoka Institute of Science and Technology; 2005. p. 1069-1074.
[3] Takahashi Y, Hosokawa M, Mochizuki T. Tea Serving Robot. SICE; 29-31 July 1997.
Tokushima, Japan. p. 1111-1114.
[4] Bannach, D., Amft, O., Kunze, K.S., Heinz, E.A., Tröster, G., and Lukowicz, P. Waving real
hand gestures recorded by wearable motion sensors to a virtual car and driver in a mixed-
reality parking game. In Proceedings of the Second IEEE Symposium on Computational
Intelligence and Games (Honolulu, Apr. 1--5, 2007), 32—39
[5] Chen, Y.T. and Tseng, K.T. Developing a multiple-angle hand-gesture-recognition system for
human-machine interactions. In Proceedings of the 33rd Annual Conference of the IEEE
Industrial Electronics Society (Taipei, Nov. 5--8, 2007), 489—492
[6] Niemelä, Marketta, Päivi Heikkilä, and Hanna Lammi. "A social service robot in a shopping
mall: expectations of the management, retailers and consumers." In Proceedings of the
Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction,
pp. 227-228. 2017.
[7] Chang, Woojung, and Kyoungmi Kate Kim. "Appropriate service robots in exchange and
communal relationships." Journal of Business Research 141 (2022): 462-474.
[8] Tojib, Dewi, Ting Hin Ho, Yelena Tsarenko, and Iryna Pentina. "Service robots or human staff?
The role of performance goal orientation in service robot adoption." Computers in Human
Behavior (2022): 107339.
[9] Takanokura, Masato, Ren Kurashima, Tsubasa Ohhira, Yoshihiro Kawahara, and Mitsuharu
Ogiya. "Implementation and user acceptance of social service robot for an elderly care
program in a daycare facility." Journal of Ambient Intelligence and Humanized Computing
(2021): 1-10.
[10] Prasanalakshmi, B. "Deep Regression hybridized Neural Network in human stress detection." In
2022 International Conference on Smart Technologies and Systems for Next Generation
Computing (ICSTSN), pp. 1-5. IEEE, 2022.
[11] Sabapathy, Sundaresan, Surendar Maruthu, Suresh Kumar Krishnadhas, Ananth Kumar
Tamilarasan, and Nishanth Raghavan. "Competent and Affordable Rehabilitation Robots for
Nervous System Disorders Powered with Dynamic CNN and HMM." Intelligent Systems for
Rehabilitation Engineering (2022): 57-93.
[12] De Smedt, Quentin, Hazem Wannous, and Jean-Philippe Vandeborre. "Skeleton-based dynamic
hand gesture recognition." In Proceedings of the IEEE Conference on Computer Vision and
Pattern Recognition Workshops, pp. 1-9. 2016.
[13] Nunez, Juan C., Raul Cabido, Juan J. Pantrigo, Antonio S. Montemayor, and Jose F. Velez.
"Convolutional neural networks and long short-term memory for skeleton-based human
activity and hand gesture recognition." Pattern Recognition 76 (2018): 80-94.
[14] Tang, Danhang, Hyung Jin Chang, Alykhan Tejani, and Tae-Kyun Kim. "Latent regression
forest: structured estimation of 3d hand poses." IEEE Transactions on Pattern Analysis and
Machine Intelligence 39, no. 7 (2016): 1374-1387.
[15] Kumar, P. Praveen, T. Ananth Kumar, R. Rajmohan, and M. Pavithra. "AI-Based Robotics in E-
Healthcare Applications." In Intelligent Interactive Multimedia Systems for E-Healthcare
Applications, pp. 249-269. Apple Academic Press, 2022.
[16] Yao, Yuan, and Yun Fu. "Contour model-based hand-gesture recognition using the Kinect
sensor." IEEE Transactions on Circuits and Systems for Video Technology 24, no. 11
(2014): 1935-1944.
[17] Pisharady, Pramod Kumar, and Martin Saerbeck. "Recent methods and databases in vision-
based hand gesture recognition: A review." Computer Vision and Image Understanding 141
(2015): 152-165.
[18] Barman, H., Gösta H. Granlund, and Hans Knutsson. "A new approach to curvature estimation
and description." In Third International Conference on Image Processing and its
Applications, 1989., pp. 54-58. IET, 1989.
[19] El Makrini, Ilias, Shirley A. Elprama, Jan Van den Bergh, Bram Vanderborght, Albert-Jan
Knevels, Charlotte IC Jewell, Frank Stals et al. "Working with walt: How a cobot was
developed and inserted on an auto assembly line." IEEE Robotics & Automation Magazine
25, no. 2 (2018): 51-58.
[20] Wadhawan, Ankita, and Parteek Kumar. "Sign language recognition systems: A decade
systematic literature review." Archives of Computational Methods in Engineering 28, no. 3
(2021): 785-813.
[21] Zhang, Jie, Xiao‐Qing Xu, Jun Liu, Lei Li, and Qiong‐Hua Wang. "Three‐dimensional
interaction and autostereoscopic display system using gesture recognition." Journal of the
Society for Information Display 21, no. 5 (2013): 203-208.

More Related Content

PDF
L01117074
PDF
HCI for Real world Applications
PDF
Arduino Based Hand Gesture Controlled Robot
PDF
TOUCHLESS ECOSYSTEM USING HAND GESTURES
PDF
A novel visual tracking scheme for unstructured indoor environments
PDF
Human-machine interactions based on hand gesture recognition using deep learn...
PDF
A Survey on Virtual Whiteboard-A Gesture Controlled Pen-free Tool
PDF
EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...
L01117074
HCI for Real world Applications
Arduino Based Hand Gesture Controlled Robot
TOUCHLESS ECOSYSTEM USING HAND GESTURES
A novel visual tracking scheme for unstructured indoor environments
Human-machine interactions based on hand gesture recognition using deep learn...
A Survey on Virtual Whiteboard-A Gesture Controlled Pen-free Tool
EMBED SYSTEM FOR ROBOTIC ARM WITH 3 DEGREE OF FREEDOM CONTROLLER USING COMPUT...

Similar to Social Service Robot using Gesture recognition technique (20)

PPT
Swarm ROBOTICS
PDF
Human Centered Robot Systems Cognition Interaction Technology 1st Edition Tho...
PDF
Tool delivery robot using convolutional neural network
PDF
IRJET - Paint using Hand Gesture
PDF
IRJET - Gesture Controlled Home Automation using CNN
PDF
Gesture Recognition System
PDF
A Review On AI Vision Robotic Arm Using Raspberry Pi
PDF
Accessing Operating System using Finger Gesture
PDF
Visual control system for grip of glasses oriented to assistance robotics
PDF
Real Time Hand Gesture Recognition Based Control of Arduino Robot
PDF
IRJET- Survey Paper on Vision based Hand Gesture Recognition
PDF
Airflow Canvas in Deep learning (Convolutional neural network)
PDF
Gesture Recognition System using Computer Vision
PDF
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
PDF
IRJET- Survey on Sign Language and Gesture Recognition System
PDF
Controlling Computer using Hand Gestures
PDF
Hand gesture recognition using machine learning algorithms
PDF
Indian Sign Language Recognition using Vision Transformer based Convolutional...
PDF
Password Based Hand Gesture Controlled Robot
PDF
VIRTUAL PAINT APPLICATION USING HAND GESTURES
Swarm ROBOTICS
Human Centered Robot Systems Cognition Interaction Technology 1st Edition Tho...
Tool delivery robot using convolutional neural network
IRJET - Paint using Hand Gesture
IRJET - Gesture Controlled Home Automation using CNN
Gesture Recognition System
A Review On AI Vision Robotic Arm Using Raspberry Pi
Accessing Operating System using Finger Gesture
Visual control system for grip of glasses oriented to assistance robotics
Real Time Hand Gesture Recognition Based Control of Arduino Robot
IRJET- Survey Paper on Vision based Hand Gesture Recognition
Airflow Canvas in Deep learning (Convolutional neural network)
Gesture Recognition System using Computer Vision
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
IRJET- Survey on Sign Language and Gesture Recognition System
Controlling Computer using Hand Gestures
Hand gesture recognition using machine learning algorithms
Indian Sign Language Recognition using Vision Transformer based Convolutional...
Password Based Hand Gesture Controlled Robot
VIRTUAL PAINT APPLICATION USING HAND GESTURES
Ad

More from Christo Ananth (20)

PDF
Call for Papers - Journal of Electrical Systems (JES), E-ISSN: 1112-5209, ind...
PDF
Call for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in Scopus
PDF
Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...
PDF
Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...
PDF
Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...
PDF
Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...
PDF
Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...
PDF
Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...
PDF
Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...
PDF
Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...
PDF
Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...
PDF
Call for Papers - International Journal of Intelligent Systems and Applicatio...
PDF
Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...
PDF
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
PDF
Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...
PDF
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
PDF
Virtual Science Is A New Scientific Paradigm
PDF
Wind Energy Harvesting: Technological Advances and Environmental Impacts
PDF
Hydrogen Economy: Opportunities and Challenges for a Sustainable Future
PDF
The Economics of Transitioning to Renewable Energy Sources
Call for Papers - Journal of Electrical Systems (JES), E-ISSN: 1112-5209, ind...
Call for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in Scopus
Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...
Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...
Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...
Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...
Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...
Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...
Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...
Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...
Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...
Call for Papers - International Journal of Intelligent Systems and Applicatio...
Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Virtual Science Is A New Scientific Paradigm
Wind Energy Harvesting: Technological Advances and Environmental Impacts
Hydrogen Economy: Opportunities and Challenges for a Sustainable Future
The Economics of Transitioning to Renewable Energy Sources
Ad

Recently uploaded (20)

PPT
Project quality management in manufacturing
PPTX
UNIT-1 - COAL BASED THERMAL POWER PLANTS
PPTX
Welding lecture in detail for understanding
PDF
Evaluating the Democratization of the Turkish Armed Forces from a Normative P...
PDF
Operating System & Kernel Study Guide-1 - converted.pdf
PPTX
M Tech Sem 1 Civil Engineering Environmental Sciences.pptx
PDF
TFEC-4-2020-Design-Guide-for-Timber-Roof-Trusses.pdf
PDF
The CXO Playbook 2025 – Future-Ready Strategies for C-Suite Leaders Cerebrai...
PPTX
Foundation to blockchain - A guide to Blockchain Tech
PPTX
Sustainable Sites - Green Building Construction
PPTX
OOP with Java - Java Introduction (Basics)
PPTX
Geodesy 1.pptx...............................................
PDF
Digital Logic Computer Design lecture notes
PDF
Enhancing Cyber Defense Against Zero-Day Attacks using Ensemble Neural Networks
PDF
BMEC211 - INTRODUCTION TO MECHATRONICS-1.pdf
PDF
PRIZ Academy - 9 Windows Thinking Where to Invest Today to Win Tomorrow.pdf
PPTX
Recipes for Real Time Voice AI WebRTC, SLMs and Open Source Software.pptx
PPTX
Engineering Ethics, Safety and Environment [Autosaved] (1).pptx
PDF
Automation-in-Manufacturing-Chapter-Introduction.pdf
PPTX
IOT PPTs Week 10 Lecture Material.pptx of NPTEL Smart Cities contd
Project quality management in manufacturing
UNIT-1 - COAL BASED THERMAL POWER PLANTS
Welding lecture in detail for understanding
Evaluating the Democratization of the Turkish Armed Forces from a Normative P...
Operating System & Kernel Study Guide-1 - converted.pdf
M Tech Sem 1 Civil Engineering Environmental Sciences.pptx
TFEC-4-2020-Design-Guide-for-Timber-Roof-Trusses.pdf
The CXO Playbook 2025 – Future-Ready Strategies for C-Suite Leaders Cerebrai...
Foundation to blockchain - A guide to Blockchain Tech
Sustainable Sites - Green Building Construction
OOP with Java - Java Introduction (Basics)
Geodesy 1.pptx...............................................
Digital Logic Computer Design lecture notes
Enhancing Cyber Defense Against Zero-Day Attacks using Ensemble Neural Networks
BMEC211 - INTRODUCTION TO MECHATRONICS-1.pdf
PRIZ Academy - 9 Windows Thinking Where to Invest Today to Win Tomorrow.pdf
Recipes for Real Time Voice AI WebRTC, SLMs and Open Source Software.pptx
Engineering Ethics, Safety and Environment [Autosaved] (1).pptx
Automation-in-Manufacturing-Chapter-Introduction.pdf
IOT PPTs Week 10 Lecture Material.pptx of NPTEL Smart Cities contd

Social Service Robot using Gesture recognition technique

  • 1. Journal of Physics: Conference Series PAPER • OPEN ACCESS Social Service Robot using Gesture recognition technique To cite this article: D. Jessintha et al 2023 J. Phys.: Conf. Ser. 2466 012020 View the article online for updates and enhancements. You may also like Scene Understanding Technology of Intelligent Customer Service Robot Based on Deep Learning Jianfeng Zhong - Path Planning in Service Robot Based on Improved A* Algorithm Junyi Cao and Jie Liu - A Review on Service Robots: Mechanical Design and Localization System M. Q. Bakri, A. H. Ismail, M.S.M. Hashim et al. - This content was downloaded from IP address 213.230.107.211 on 12/04/2023 at 17:25
  • 2. Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI. Published under licence by IOP Publishing Ltd 4th National Conference on Communication Systems (NCOCS 2022) Journal of Physics: Conference Series 2466(2023) 012020 IOP Publishing doi:10.1088/1742-6596/2466/1/012020 1 Social Service Robot using Gesture recognition technique D. Jessintha1 , P. Praveen kumar2 , S. Jaisiva3 , T. Ananth kumar4 *, Christo Ananth5 1 Electronics and Communication Engineering, Easwari Engineering College, Chennai, India 2 Deparment of Information Tehncology, Sri Manakulavinayagar Engineering college , Puducherry, India. 3 Department of EEE, M.Kumarasamy College of Engineering, Karur,Tamilnadu, India 4 Computer Science and Engineering, IFET College of Engineering, Tamilnadu, India. 5 Department of Natural and Exact Sciences, Samarkand State University, Uzbekistan *tananthkumar@ifet.ac.in Abstract. A robot is a machine that can automatically do a task or a series of tasks based on its programming and environment. They are artificially built machines or devices that can perform activities with utmost accuracy and precision minimizing time constraints. Service robots are technologically advanced machines deployed to service and maintain certain activities. Research findings convey the essential fact that serving robots are now being deployed worldwide. Social robotics is one such field that heavily involves an interaction between humans and an artificially built machine. These man-built machines interact with humans and can also understand social terms and words. Modernization has bought changes in design and mechanisms due to this ever-lasting growth in technology and innovation. Therefore, food industries are also dynamically adapting to the new changes in the field of automation to reduce human workload and increase the quality of service. Deployment of a robot in the food industries which help to aid deaf and mute people who face social constraints is an ever- growing challenge faced by engineers for the last few decades. Moreover, a contactless form of speedy service system which accomplishes its task with at most precision and reduced complexity is a feat yet to be perfected. Preservation of personal hygiene, a better quality of service, and reduced labour costs is achieved. 1. Introduction Social service robots, in contrast to their industrial counterparts, have a definite role to play. The frequent enhancements in the technological field and innovation led to an everlasting development in robotic systems[1][2]. Intelligent Robotic Systems deploy technologies such as gesture and voice recognition to overcome human workload as well as to maintain the quality of service thus rendered. The robot is summoned using a button embedded in an RF transmitter module. After the reception of the signal, a color sensor is attached to the robot, which utilizes the technique of line following, and can move along a pre-determined path to reach its destination[3-5]. Constructing a gesture recognition system programmed using Raspberry Pi 4 to capture the gesture shown by people can help serve the desired particulars such as water, tea, coffee, or other beverages without any physical contact. On receiving relevant inputs from the user, the robot arm designed using MG996r and SG90 can grab the
  • 3. 4th National Conference on Communication Systems (NCOCS 2022) Journal of Physics: Conference Series 2466(2023) 012020 IOP Publishing doi:10.1088/1742-6596/2466/1/012020 2 cup and place it beneath the dispenser. The custom-built dispenser using L298N, the submersible water pump, transfers the requested beverage to the cup. The robotic arm thereby serves the requested particulars on a tray. The methodologies used are interpreted in Section 2. The hardware modules used for this design are discussed in Section 3. The efforts required for such people to get their particulars without any hassle can be overcome through this method. Robots help reduce the physical stress level of an employee and maintain a cleaner ambiance without compromising the food quality. A social service robot should aid people in diverse conditions, such as a shop, restaurant, hospital, or even home. The sole purpose of this robot is to increase robot - humans interaction through sophisticated technological advancements. This paper interprets the basic operation which can be applied to various places for specific tasks[5-9]. 2. Related Works Due to technological advances, robots are becoming more common. Shortly, we may see intelligent robots helping people in need. Human-robot interaction research is becoming more critical as it integrates hand sign recognition components[10]. Hand gesture recognition may make human-robot communication more natural. This may help humans and robots collaborate to increase application efficiency and avoid problems[11]. De Smedt et al. classified hand gestures using the SVM algorithm and skeletal and depth data[12]. Nunez et al. classified hand gestures using an HMM and SVM (HMM). Image acquisition and division used Kinect sensor data and a skeleton-based method[13]. According to Tang et al. [14], hand gestures can be tracked in real-time using a recursive connected component algorithm and pixelated hand skeletons' 3D geodesic distance. Praveen kumar et al. studied nonverbal communication and an R-CNN to improve avocado harvesting in a simulated workspace[15]. This improved efficiency. A robot located workers and determined if they needed help by recognizing human activity, hand gestures, or flags. et al. CNN/LSTM networks were used (long short-term memory). The harvester learned from his hands. Hand gesture recognition results may vary based on image color texture. Due to differences in skin color between people and countries, results may vary. Light affects color and texture. Shape-based features can also recognize hand gestures. We took a different approach. Normal thumb and finger lengths are about the same on both hands. Hand shape-based gesture recognition frame rate is comparable to most existing systems. The number and accuracy of recognized gestures were among the best. Robots must understand what humans are saying to collaborate effectively with them. Humans and robots must communicate using natural gestures in HRC manufacturing[16]. The hand is differentiated from its surroundings using a skin tone. Principal component analysis categorized all eight static gestures. Pishardy et al. use a restricted coulomb energy (RCE) neural network to separate a hand from an image. To train a second RCE neural network to recognize static hand gestures, measure the hand-to-arm distance and number of spread fingers. 95% accuracy is possible with an eight-size gesture lexicon[17]. A color camera captures real-time images in full color. 
In Otsu segmentation, the Y-Cb-Cr color space distinguishes the moving hand from the constant background. The k-curvature algorithm [18] determines an image's high and low points. A gesture's peak-to-valley ratio determines its group. The judgments can be 95.2% accurate. This system recognizes six gestures. One factor means this cannot increase work output. They assume that hands are the camera's most accessible part. The hand's orientation can be determined using a vector from the center to its farthest point. The robot's movement is controlled by the hand's orientation, while its straight-line velocity is determined by its distance from the image. El Makrini et al. found that hand shape can control a robot. Instead of using high-level depth and color features, all intensity values in a box around the hand are used[19]. This replaces feature extraction. The average neighborhood margin maximization algorithm reduces feature space dimensions (ANMM). Four hand gestures are classified using the nearest neighbor classifier. Wadhawan et al. [20] classify six hand gestures using a geometric property. In color images, hand skin color is evaluated to identify it. Calculate the distance between the hand's center and each outline point. A gesture's number of peaks and valleys determines its category. Zhang et al. created a system that
  • 4. 4th National Conference on Communication Systems (NCOCS 2022) Journal of Physics: Conference Series 2466(2023) 012020 IOP Publishing doi:10.1088/1742-6596/2466/1/012020 3 recognizes hand gestures online in real time[21]. The chamfer distance aligns a hand template with an image's edges after removing a person's body. This step follows body division. The hand is then given a realistic skin color model. Tracing the hand's center across multiple frames reveals the final feature. Vector shows hand location and motion. Hidden Markov models learn hand movements. Due to color cameras' high frame rate, most current techniques use video sequences. Human-robot interaction (HRI) has become a focus of research in computer vision and robotics due to the broad range of applications in the field of human-computer interaction (HCI). 3. Working Prototype The robot, which comprises different modules, is primarily stationed at the origin of the desired area intended to be used. The following techniques are adapted to demonstrate the basic idea of this project. Figure 1 shows the block diagram of the proposed system. 3.1 Transmission and Reception A transmitter-receiver is enclosed in the base of the robot. This is operated at a frequency of 27 MHz. Radiofrequency utilizes the principles of radio waves to transfer signals to the receiver from the transmitter by adjusting the current and voltage parameters that alter the oscillation rate. It can send signals from 20KHz to around 200GHz. This concept is utilized such that when the robot is summoned at the user's side, an RF signal is transmitted from the transmitter situated at the table to the receiver, housed within the base of the robot. This receiver module is connected to the microcontroller that initiates the robot's movement. Consequently, the base of the robot moves to the respective table where the user is seated using TCS3200 module. Figure 1. Block diagram of the Proposed system 3.2. Line follower with color sensor After the successful reception of the signal, the robot is destined to reach the respective table where it was summoned. DC motors are powered by the motor driver that makes forward and backward movement possible depending on the command given to it by the microcontroller. Therefore, the base of the robot has to plan its path. It utilizes a color sensor to check the path that the robot takes. The
  • 5. 4th National Conference on Communication Systems (NCOCS 2022) Journal of Physics: Conference Series 2466(2023) 012020 IOP Publishing doi:10.1088/1742-6596/2466/1/012020 4 color sensor module is calibrated to detect two colors, Red and Blue. The path to the table is laid using these two colors. Applying the principles of the line follower technique, the robot has to sense the right color and reach the table where the person is seated. 3.3. Gesture recognition The camera encompassed at the head of the robot helps to capture a frame containing the real-time image of the hand. The threshold of the frame is adjusted manually from its default value. Blur is introduced into the frame to cancel out unnecessary areas after adjusting the threshold according to the area's lighting conditions. It is then converted to a grayscale version. Finally, the image is then extrapolated using the microcomputer that interprets the following image shown in the frame to a value that is stored in a list. 3.4. I2 C interfacing After receiving the correct count, this value is passed to the microcontroller. The microcontroller is powered by a 5v rechargeable lithium-polymer battery, whereas the Raspberry pi 4 utilizes a 3.3v on its GPIO bus. The essential difference between the Arduino and Pi is the number of I/O ports they contain, especially Arduino, which contains several analog and digital ports. They can also be used to handle interrupts and timing circuits based on their usage. Therefore to utilize their maximum potential, they should be interfaced in such a way that they can communicate with each other. This is done using pre-built user-defined functions present in Python and Arduino IDE. 3.5. Dispenser The dispenser is built using a 12v DC pump, unlike a 9v pump with fewer rotations per minute. Being a submersible type water pump minimizes the expenses and becomes easier to replace if the pump does not function properly. The pipe wound around the DC motor must be checked and analyzed before use. Dimensions of the pipe are to be altered depending on the liquid that flows through it and the pressure applied to push the fluid from the storage unit to the cup. It can dispense up to two different drinks. The pipe used in the prototype is 5mm in diameter and 30cm in length. By adjusting the flow rate, the amount of liquid poured into the cup can be increased or decreased. This prototype can fill 100 ml into the cup for about 3 seconds. 3.6. Robotic arm A robotic arm is placed beside the dispenser to the right, which helps to place the cup below the dispenser at a fixed coordinate. After the cup is filled, the robotic arm then serves this cup on top of a tray. The robotic arm is manipulated using a microcontroller and is maneuvered using a metal gear servo motor. Being constructed with two degrees of freedom, it can lift heavy objects. Accordingly, the user can place heavier cups between its claws. This prototype can be attached to a cup holder if the user requires it. Temporarily the cup has to be placed on the robot's hands to receive the particulars. This eliminates the hassle of the user doubting the cleanliness of the cup used. 4. Module Description 4.1. Module 1: Raspberry PI 4 The proposed prototype is shown in figure 1. The Raspberry pi 4 is the updated version of the Pi series of microcomputers that was released a few years ago. It is a 64-bit processor having four cores. Some of the features are wireless, Ethernet, and Bluetooth connectivity. These cores operate at a
  • 6. 4th National Conference on Communication Systems (NCOCS 2022) Journal of Physics: Conference Series 2466(2023) 012020 IOP Publishing doi:10.1088/1742-6596/2466/1/012020 5 synchronized speed of 1.5GHz. This is a computer with the smallest size factor possible. This prototype coordinates the working of each module. Consequently, the Pi transfers data between itself and the microcontroller establishing a two-way communication termed I2 C (Inter-Integrating Circuits) Communication. Therefore, it also helps in processing the finger count captured by the camera, scheduling tasks for each module connected to the microcontroller, and handling interrupts. However, due to its small form factor, there are some notable discrepancies, such as time delay and thermal latency issues. Moreover, Pi can also be used as a microcontroller by connecting an external module. Figure 2. Proposed Prototype 4.2. Module 2: Arduino UNO Arduino UNO is a microcontroller that contains a total of 20 input/output pins which are separated as digital and analog pins. ATmega328 is the chip built into it. The prototype only utilizes the digital section of the module and is powered by a 5v rechargeable lithium-polymer battery. Arduino is a microcontroller with more digital and analog pins than a microcomputer. Therefore, it helps to manipulate certain dependent modules, such as the Motor drivers and Servo motors, keeping the
  • 7. 4th National Conference on Communication Systems (NCOCS 2022) Journal of Physics: Conference Series 2466(2023) 012020 IOP Publishing doi:10.1088/1742-6596/2466/1/012020 6 expenses to a minimum. Apart from hardware manipulation, the PWM pins present in the digital section of this module also help in timing and interrupt handling. 4.3. Module 3: RF Transceiver The 27MHz remotely controlled two-way channel button type transmitter that transmits the signal to the receiver uses the principles of radio waves. The left button signifies the Red color line, and the right button signifies the blue color line. The buttons work with the principles of that of a switch. When the right button is pushed, the signal sent is HIGH, keeping the other signal LOW. The working of the left button is similar to that of the right button. Accordingly, when we push the control button, the transmitter sends specific electrical pulses corresponding to that action through the air. The transmitter has its power source, usually in the form of a 9 to 12-volt battery. Radiofrequency, a short distant signal with good noise immunity and resistance, is used to command the robot to set its destination towards which it later moves. Utilizing a 27 MHz frequency also helps minimize latency issues and power consumption. Without the battery, the transmitter cannot send the radio frequency signals to the receiver. 4.4. Module 4: Motor driver L298N utilizes two H-Bridge mechanisms to control the low current-rated motor. This high-power module is connected to the microcontroller that can control up to four DC circuits, such as DC and Stepper Motors, with directional and speed control. This prototype utilizes this particular module to enhance the operability of the DC motors and pumps connected with it, reducing the space needed to encompass the setup. 4.5. Module 5: camera Logitech Quickcam Notebook Delux is an optical digital 640X480 resolution camera. The sensor is a 0.3 megapixel CMOS type sensor able to capture images and record real-time videos by allowing light to pass through the digital lens. Here, this camera module helps in capturing a real-time image of the hand within a frame of dimension 0.5cm in the X direction and 0.8cm towards the Y direction. This information is then sent to the microcomputer where it is further processed. 4.6. Module 6: Submersible DC Pump A submersible pump is a form of a pressure pump that can be fully submerged in water. Using a shaft, attached to the DC motor, the water is pumped out through the end pipe. The DC motor is protected from water by an insulated coating made of plastic. The pipe used in the prototype is 5mm in diameter and 30cm in length. By adjusting the rate of flow, the amount of liquid dispensed into the cup can be increased or decreased. This prototype can fill 100 ml into the cup for about 3 seconds. 4.7. Module 7: Servo Motor (Sg90) Servo motors are mechanical shaft like devices possessing high torque employed in fields such as robotics and automation. Due to high current requirements, the power supply connected to it must be sufficient enough to power all the servos connected to the microcontroller. In this prototype, the SG90 module helps to control the action of the robotic claw. The geared shaft in the servo motor rotates 1 degree at a time which can be electrically controlled. The robotic claw is set at a default value of 55 degrees after which it will come back to its initial position.
4.8. Module 8: Servo Motor (MG996R)
Like the SG90, the MG996R is an electrically controlled servo motor, but its geared shaft is built from metal, letting it lift heavier and sturdier objects with ease. The prototype uses it in the robotic arm: since the arm is custom-built and bulky, the metal gearing preserves the servo's function without damage when a heavier object is lifted.

5. Results and Discussion
The proposed prototype tackles the problem of interaction between robot and human. A remote-controlled RF signal is first transmitted to the receiver embedded in the robot's base. After setting its destination coordinate, the robot uses the line-follower technique, aided by a colour sensor limited to two colours, to move toward the table from which the signal was transmitted. Control then shifts to the gesture recognition system, which detects the fingers shown within the frame captured by the camera. This framework consists of a representation process and a decision process: the representation process converts the raw numerical data into a form adapted to the decision process, which then relays the result to the next stage (a minimal sketch of such a pipeline is given below). The robot therefore has to be placed in surroundings with good lighting. The robotic hand, built with two degrees of freedom, then activates, grabbing the cup in its gripper and swivelling to place it below the dispenser at an assigned coordinate. The custom-built dispenser pours the required items, after which the claw grabs the cup again and serves it on a tray; the quantity of beverage poured into the cup can be adjusted manually. The prototype is thus an uncomplicated model that requires further modification and customization before deployment in real-world scenarios. Nevertheless, achieving even this step involved intense engineering and programming.
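To make the representation/decision split concrete, below is a minimal finger-counting sketch in Python with OpenCV, in the spirit of the convexity-defect approach of reference [1]. The Otsu thresholding and the assumption that the hand is the largest contour are illustrative choices, not the paper's exact algorithm.

import cv2
import numpy as np

def count_fingers(frame):
    """Representation: binarize the hand region; decision: count convexity defects."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (35, 35), 0)
    _, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)      # assume the hand is the largest blob
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    valleys = 0
    for s, e, f, _ in defects[:, 0]:
        a = np.linalg.norm(hand[e][0] - hand[s][0])
        b = np.linalg.norm(hand[f][0] - hand[s][0])
        c = np.linalg.norm(hand[e][0] - hand[f][0])
        cos_angle = np.clip((b**2 + c**2 - a**2) / (2 * b * c + 1e-9), -1.0, 1.0)
        if np.arccos(cos_angle) < np.pi / 2:       # a sharp valley between two fingers
            valleys += 1
    return valleys + 1 if valleys else 0

cap = cv2.VideoCapture(0)                          # the USB webcam described above
ok, frame = cap.read()
if ok:
    print("fingers:", count_fingers(frame))
cap.release()

The requirement for good lighting noted above follows directly from this design: the thresholding step only separates the hand cleanly when the scene is evenly lit.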
6. Conclusion
Gesture recognition is a topic in language technology concerned with interpreting human gestures via mathematical algorithms, and it is a field in which researchers are actively working to break the barrier between human and robot interaction. Employing gesture recognition reduces the need for handheld devices and opens an avenue for newer, specialized interactive devices. Our project thus helps bridge the gap between robots and humans using this technology. It serves as a gateway for those who are deaf and mute: they can now engage in social interaction without needing a third person to aid them. Owing to the recent pandemic, vendors must adopt innovative strategies for serving and attending to guests; moreover, in hospitals and homes, older adults often need assistance in getting beverages. By employing this prototype, these issues can be overcome with ease. Socially and physically challenged people can also use such a machine to serve their orders without needing another person in their care. A contactless form of interaction is achieved, limiting the spread of germs and viruses and ensuring a clean and hygienic way of serving the ordered items. As contactless forms of service spread in the near future, user satisfaction and quality of service can be expected to rise accordingly. Despite its limitations, the prototype can be further enhanced with technologies such as AI and machine learning to detect surrounding objects, and computer vision to aid the robot in dynamic room mapping.

References
[1] Fingers-Detection-using-OpenCV-and-Python, https://github.com/lzane/Fingers-Detection-using-OpenCV-and-Python, accessed 10 Nov 2022.
[2] Masuda, T., and D. Misaki. "Development of Japanese Green Tea Serving Robot 'T-Bartender'." In Proceedings of the IEEE International Conference on Mechatronics & Automation, Niagara Falls, Canada, July 2005, pp. 1069-1074.
[3] Takahashi, Y., M. Hosokawa, and T. Mochizuki. "Tea Serving Robot." In Proceedings of SICE, Tokushima, Japan, 29-31 July 1997, pp. 1111-1114.
[4] Bannach, D., O. Amft, K. S. Kunze, E. A. Heinz, G. Tröster, and P. Lukowicz. "Waving real-hand gestures recorded by wearable motion sensors to a virtual car and driver in a mixed-reality parking game." In Proceedings of the Second IEEE Symposium on Computational Intelligence and Games, Honolulu, April 1-5, 2007, pp. 32-39.
[5] Chen, Y. T., and K. T. Tseng. "Developing a multiple-angle hand-gesture-recognition system for human-machine interactions." In Proceedings of the 33rd Annual Conference of the IEEE Industrial Electronics Society, Taipei, November 5-8, 2007, pp. 489-492.
[6] Niemelä, Marketta, Päivi Heikkilä, and Hanna Lammi. "A social service robot in a shopping mall: expectations of the management, retailers and consumers." In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, pp. 227-228. 2017.
[7] Chang, Woojung, and Kyoungmi Kate Kim. "Appropriate service robots in exchange and communal relationships." Journal of Business Research 141 (2022): 462-474.
[8] Tojib, Dewi, Ting Hin Ho, Yelena Tsarenko, and Iryna Pentina. "Service robots or human staff? The role of performance goal orientation in service robot adoption." Computers in Human Behavior (2022): 107339.
[9] Takanokura, Masato, Ren Kurashima, Tsubasa Ohhira, Yoshihiro Kawahara, and Mitsuharu Ogiya. "Implementation and user acceptance of social service robot for an elderly care program in a daycare facility." Journal of Ambient Intelligence and Humanized Computing (2021): 1-10.
[10] Prasanalakshmi, B. "Deep Regression hybridized Neural Network in human stress detection." In 2022 International Conference on Smart Technologies and Systems for Next Generation Computing (ICSTSN), pp. 1-5. IEEE, 2022.
[11] Sabapathy, Sundaresan, Surendar Maruthu, Suresh Kumar Krishnadhas, Ananth Kumar Tamilarasan, and Nishanth Raghavan. "Competent and Affordable Rehabilitation Robots for Nervous System Disorders Powered with Dynamic CNN and HMM." Intelligent Systems for Rehabilitation Engineering (2022): 57-93.
[12] De Smedt, Quentin, Hazem Wannous, and Jean-Philippe Vandeborre. "Skeleton-based dynamic hand gesture recognition." In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp. 1-9. 2016.
[13] Nunez, Juan C., Raul Cabido, Juan J. Pantrigo, Antonio S. Montemayor, and Jose F. Velez. "Convolutional neural networks and long short-term memory for skeleton-based human activity and hand gesture recognition." Pattern Recognition 76 (2018): 80-94.
[14] Tang, Danhang, Hyung Jin Chang, Alykhan Tejani, and Tae-Kyun Kim. "Latent regression forest: structured estimation of 3D hand poses." IEEE Transactions on Pattern Analysis and Machine Intelligence 39, no. 7 (2016): 1374-1387.
[15] Kumar, P. Praveen, T. Ananth Kumar, R. Rajmohan, and M. Pavithra. "AI-Based Robotics in E-Healthcare Applications." In Intelligent Interactive Multimedia Systems for E-Healthcare Applications, pp. 249-269. Apple Academic Press, 2022.
[16] Yao, Yuan, and Yun Fu. "Contour model-based hand-gesture recognition using the Kinect sensor." IEEE Transactions on Circuits and Systems for Video Technology 24, no. 11 (2014): 1935-1944.
[17] Pisharady, Pramod Kumar, and Martin Saerbeck. "Recent methods and databases in vision-based hand gesture recognition: A review." Computer Vision and Image Understanding 141 (2015): 152-165.
[18] Barman, H., Gösta H. Granlund, and Hans Knutsson. "A new approach to curvature estimation and description." In Third International Conference on Image Processing and its Applications, pp. 54-58. IET, 1989.
[19] El Makrini, Ilias, Shirley A. Elprama, Jan Van den Bergh, Bram Vanderborght, Albert-Jan Knevels, Charlotte I. C. Jewell, Frank Stals et al. "Working with Walt: How a cobot was developed and inserted on an auto assembly line." IEEE Robotics & Automation Magazine 25, no. 2 (2018): 51-58.
[20] Wadhawan, Ankita, and Parteek Kumar. "Sign language recognition systems: A decade systematic literature review." Archives of Computational Methods in Engineering 28, no. 3 (2021): 785-813.
[21] Zhang, Jie, Xiao-Qing Xu, Jun Liu, Lei Li, and Qiong-Hua Wang. "Three-dimensional interaction and autostereoscopic display system using gesture recognition." Journal of the Society for Information Display 21, no. 5 (2013): 203-208.