International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056
Volume: 09 Issue: 11 | Nov 2022 www.irjet.net p-ISSN: 2395-0072
© 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 250
Multi-faceted Wheelchair control Interface
Dr. Deepa Karuppaiah1, Srikanth Balakrishna2, Prarthana Shiwakoti3
1Assistant Professor (Senior) SCOPE, Vellore Institute of Technology, Vellore, Tamil Nadu, India
2,3U.G. Student of Computer Science and Engineering Department, Vellore Institute of Technology, Vellore, Tamil
Nadu, India
---------------------------------------------------------------------***--------------------------------------------------------------------
Abstract - As the population of the elderly and the disabled grows, so does the demand for care and support equipment to enhance their quality of life. For the past 20 years, the most popular mobility aid for people with limited mobility has been the electric powered wheelchair (EPW); more recently, the intelligent EPW, also known as an intelligent wheelchair (IW), has attracted significant attention as a new technology to meet users' varied needs. Elderly and disabled people face many difficulties in performing even the simplest day-to-day tasks, and many of them rely on others or on conventional technologies such as wheelchairs to accomplish them. With the help of modern technology and the advent of voice-enabled applications and devices, we can build tools that help them interact with society and move more smoothly through everyday activities. A major problem they face is reaching the wheelchair in the first place; to address this, we propose a mobile application that lets the user locate the wheelchair and navigate it towards themselves whenever they need it. The primary goal of the interactive user-operated wheelchair system is to provide a user-friendly interface with two ways of interacting with the wheelchair: entering the choice of direction through a touchscreen (haptic input) and giving voice input through a speech recognition module.
Key Words: Support; Intelligent wheelchair; Mobile application; Interact; Voice recognition; Haptic
1. INTRODUCTION
This project applies technology to the wheelchair as an assistive technology, with the aim of making life more independent, fruitful, and joyful for dependent and disabled people. The primary goal of the interactive user-operated wheelchair system is to provide a user-friendly interface with two ways of interacting with the wheelchair: entering the choice of direction through a touchscreen (haptic input) and giving voice input through a speech recognition module. The device is made to allow a person to operate a wheelchair with their voice. The project aims to facilitate the movement of older persons who are unable to move well and of disabled or handicapped people, and it is hoped that this approach will make daily movement less of a burden for them. Speech recognition is a crucial piece of technology that will enable new forms of human-machine interaction; by applying speech recognition to the movement of a wheelchair, many of the issues these users confront can be addressed. Using the smartphone as a middleman, or interface, makes this practical. For this project, interfaces have been built for a program that can detect speech, control chair movement, and handle graphical commands. The wheelchair is fitted with motors and can be moved using vocal commands. It is crucial for a motorized wheelchair to be able to avoid obstacles automatically and in real time so that it can travel quickly. Through research and careful design, the wheelchair control is developed for safe and effective use, providing independence and self-directed mobility.
2. Literature Review
Numerous studies have found that people with impairments can benefit from independent mobility, which includes powered wheelchairs, manual wheelchairs, and walkers. Independent mobility improves educational and employment prospects, lessens dependence on others, and fosters feelings of independence. It is also crucial in laying the groundwork for much of young children's early learning [1]. A lack of exploration and control frequently produces a cycle of deprivation and lack of drive that develops into learned helplessness. For older individuals, independent movement is a crucial component of self-esteem and aids in "ageing in place." Because movement is necessary for many everyday activities, mobility problems contribute to limitations in activities of daily living (ADL) and instrumental ADL [2]. Reduced opportunities for socialization caused by mobility issues frequently lead to social isolation and a variety of mental health problems. While traditional manual or self-automated wheelchairs can often meet the needs of people with impairments, certain members of the disabled community find it difficult or impossible to use wheelchairs independently.

In paper [3], the authors describe a smart wheelchair navigation system that accepts multiple kinds of input, so that users with different types of disabilities can use the same system. A simple, low-cost ATmega32L microcontroller from the AVR family is used for processing. The microcontroller compares the sensor values with saved threshold values; when a value falls below the threshold, it transmits a stop command to the H-bridge motor driver circuit to avoid a possible collision. The controller receives navigational commands from an Android phone connected through Bluetooth. The user can give voice commands to the Android device, which are compared with saved commands, and the corresponding value is passed to the controller.
In paper [4], the authors describe a movement- and voice-controlled wheelchair that can take a paraplegic user wherever they wish to go by following voice commands. The design helps handicapped people without requiring the strength and effort needed to propel a conventional wheelchair, by letting them command it with their voice, with a facility to control the speed of the wheelchair. The brain of the proposed system is a microcontroller. A voice-to-text app captures the voice input from the user and passes it to the microcontroller via a Bluetooth EGBT45ML module. The voice is converted to machine code, which is given to the motor driver to control the movement of the wheelchair.
Paper [5] presents an automatic wheelchair that uses voice recognition. A voice-controlled wheelchair helps physically disabled people who cannot control the movements of their hands. The powered wheelchair depends on motors for locomotion and on voice recognition for commands. The circuit comprises an Arduino, an HM2007 voice recognition module, and motors. The voice recognition module recognizes the user's command and provides the corresponding coded data stored in its memory to the Arduino microcontroller, which controls the locomotion accordingly. The HM2007 module performs the speech processing and passes the result to the Arduino, which is programmed with the corresponding locomotion commands. Locomotion is controlled through an L293D driver and a relay, which drive the motors. The software is written for the Arduino, and the hardware components parse the voice commands and translate them into motor actions.
Paper [6] describes a wheelchair system with a user-friendly touch-screen interface. The device lets disabled users travel automatically to their destination along predefined paths in an indoor environment. Using a touch screen requires less muscle movement and less muscular effort than the self-propelled wheelchairs that have been used for ages. The ability to choose between a manual operating mode and a predefined operating mode demonstrates the wheelchair's capacity to operate in multiple environments, and an obstacle avoidance facility enables safe driving in unknown as well as dynamic environments.
In paper [7], a wheelchair robot model contains a built-in microcontroller and an eyeball-sensing system that performs the right, left, forward, and reverse operations. The wheelchair is designed so that it can move freely without external support or dependency; through this feature, patients can move their wheelchair as they desire.
In paper [8], the design of an automated wheelchair for people suffering from total or partial paralysis is presented. This wheelchair design allows self-control using an infrared sensor and a microcontroller (Arduino Uno) based circuit with an emergency message transmitting system. The design not only prevents stress on the body during movement in all four directions but is also affordable for low-income households.
Robotic wheelchairs enhance manual wheelchairs by introducing locomotion controls. These devices can ease the lives of many disabled people, particularly those with severe impairments, by increasing their range of mobility [9]. Such robotic enhancements particularly benefit people who cannot use their hands and legs. In this project we have developed a voice-controlled wheelchair that aims to address the above problems. The wheelchair can be controlled with a joystick as well as with voice commands: the user simply says the direction, or moves the button for that direction, and the wheelchair moves in the desired direction. In the hardware development, an HM2007 voice recognition module performs the speech processing and passes the result to an Arduino, which is programmed with the corresponding locomotion commands [10].
3. Gap analysis
Previous researchers have designed wheelchairs that require heavy, complex hardware such as two-axis joysticks, large brakes, and large tires. Together, these components make mobility more difficult for the patient using the wheelchair: with dedicated hardware peripherals, elderly and disabled users must expend considerable physical effort to move. Moreover, even though these designs are heavy and complicated, they offer performance similar to lighter, software-based models. Most of the papers described above lack the concept of integrating software with the hardware for efficient mobility. With software, we can not only design a lightweight, minimalistic wheelchair but also embed technologies such as artificial intelligence to add features like autopilot. As a result, the majority of the previously proposed models did not perform as efficiently as they were intended to.
4. Proposed system
4.1 Flowcharts
The wheelchair can be controlled in two ways: through the user's voice and through an Android app.
The first flowchart demonstrates how voice instructions are used to operate the wheelchair. First, a Bluetooth connection is established between the wheelchair and the user's voice application. The user then utters particular commands through the application. The spoken word is validated and converted to text using the Google voice service. The controller analyses the resulting text, verifies that the input is valid, and then provides the motor drivers with precise instructions for movement to the left, right, straight ahead, backward, or to stop.
Fig 1: Flowchart for User Voice Command
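To make the voice path concrete, the sketch below is a minimal illustration rather than the project's actual code: it assumes the Python SpeechRecognition and pyserial packages, a hypothetical Bluetooth serial port name, and single-character command codes that the paper does not specify.

```python
import serial                    # pyserial; assumed transport to the wheelchair's Bluetooth module
import speech_recognition as sr  # SpeechRecognition package (Google Web Speech backend)

# Hypothetical single-character command codes; the paper does not define the exact protocol.
COMMANDS = {"left": b"L", "right": b"R", "forward": b"F", "backward": b"B", "stop": b"S"}

def listen_and_send(port="/dev/rfcomm0", baud=9600):
    """Capture one utterance, convert it to text via the Google service, forward a command byte."""
    recognizer = sr.Recognizer()
    link = serial.Serial(port, baud, timeout=1)
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)   # helps in lightly noisy rooms
        audio = recognizer.listen(source)
    try:
        word = recognizer.recognize_google(audio).strip().lower()
    except (sr.UnknownValueError, sr.RequestError):
        return None                                    # unrecognized speech or no internet access
    code = COMMANDS.get(word)
    if code:                                           # only the five valid directions are transmitted
        link.write(code)
    return word

if __name__ == "__main__":
    print(listen_and_send())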
In the second part of the project, the user operates an Android device with a GUI app and sends commands by clicking on a particular button; the wheelchair responds as instructed, and the commands produce the corresponding movement. An internet or Wi-Fi connection is required for this module. The microcontroller first verifies that the input is valid before giving the motor drivers specific instructions for moving left, right, straight ahead, backward, or stopping altogether.
Fig 2: Flowchart for GUI Command
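The project's GUI is an Android app; as a hedged desktop stand-in, the Tkinter sketch below only illustrates the press-to-move, release-to-stop behaviour over an assumed Bluetooth serial port, reusing the same hypothetical command bytes as in the previous sketch.

```python
import tkinter as tk
import serial  # pyserial; the port name is an assumption

link = serial.Serial("/dev/rfcomm0", 9600, timeout=1)  # assumed Bluetooth serial link

def make_button(parent, label, code):
    """Create a button that sends a command byte on press and a STOP byte on release."""
    btn = tk.Button(parent, text=label, width=12)
    btn.bind("<ButtonPress-1>", lambda _e: link.write(code))
    btn.bind("<ButtonRelease-1>", lambda _e: link.write(b"S"))  # releasing the button halts the chair
    return btn

root = tk.Tk()
root.title("Wheelchair control (illustrative)")
for row, (label, code) in enumerate(
    [("Forward", b"F"), ("Backward", b"B"), ("Left", b"L"), ("Right", b"R"), ("Stop", b"S")]
):
    make_button(root, label, code).grid(row=row, column=0, padx=8, pady=4)
root.mainloop()
```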
4.2 Block Diagram and explanation
Fig 3: Block diagram of the proposed model
The application runs on the Android device with the GUI app; the user sends a command by clicking on a specific button, and the wheelchair behaves accordingly. The commands produce the movement of the wheelchair, and the Bluetooth connection is used to transmit the instructions to the actuators in the wheelchair. The microcontroller checks for valid input and then gives a specific instruction to the motor drivers for movement to the left, right, straight ahead, or backward, and otherwise stops.
Android acts as middleware: it offers connectivity through Bluetooth and sends wireless data over the connection. The Android device runs a voice recognition app to receive voice commands. The app first decodes the set of instructions from the voice commands and then sends it to the Bluetooth serial module. The microcontroller receives the set of instructions and compares it with the predefined conditions that are set as the baseline. If a condition is satisfied, a signal is sent to the driver circuit, which converts the output pulse signal from the microcontroller into an electrical signal to drive the wheel, that is, to give the mechanical instructions to the wheel's DC motors.

There is also another way to give directions and other controls: by pressing a finger against the various quadrants of the app screen, each of which is programmed with a different value for a different direction.
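The firmware itself is not listed in the paper; purely for readability, the sketch below restates the described compare-and-dispatch logic in Python (the real implementation runs on the microcontroller), again using assumed command bytes and a stand-in motor driver object.

```python
# Illustrative only: the actual logic runs on the microcontroller, but the
# comparison-and-dispatch step described above is the same.
VALID = {b"L": "turn_left", b"R": "turn_right", b"F": "forward", b"B": "backward", b"S": "stop"}

def handle_byte(byte: bytes, drive) -> str:
    """Map one received command byte to a motor-driver action; unknown input stops the chair."""
    action = VALID.get(byte, "stop")      # any unrecognized byte is treated as STOP for safety
    getattr(drive, action)()              # e.g. drive.forward() toggles the driver-circuit inputs
    return action

class PrintDriver:
    """Stand-in for the motor driver circuit, used only to demonstrate the dispatch."""
    def forward(self): print("both motors forward")
    def backward(self): print("both motors reverse")
    def turn_left(self): print("right motor forward, left motor stopped")
    def turn_right(self): print("left motor forward, right motor stopped")
    def stop(self): print("both motors stopped")

if __name__ == "__main__":
    for b in (b"F", b"L", b"x", b"S"):
        handle_byte(b, PrintDriver())
```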
4.3 User Interface
The user interface is designed in compliance with Shneiderman's eight golden rules, Fitts' law, and Nielsen's ten usability heuristics. We use a simple interface that prioritizes the arrow buttons for direction, making them larger. According to Fitts' law, the time the user needs to acquire the direction buttons therefore decreases, because it falls as the width of the buttons increases. The direction options in the user interface are laid out according to real-world directions for ease of use, and the voice option makes the process far simpler for users with low or no vision. Standard settings and positions for the buttons have been used, so the user needs little or no adaptation to the interface.
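For reference, the relation being invoked here is the Shannon formulation of Fitts' law, where MT is the time to reach a target, D is the distance to the target, W is its width, and a and b are empirically fitted constants:

MT = a + b \log_2\left(\frac{D}{W} + 1\right)

Strictly, MT decreases logarithmically rather than in exact inverse proportion as W grows, which is why enlarging the arrow buttons shortens selection time.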
Fig 4: Login screen
Fig 5: Home page
Fig 6: Bluetooth connection
Fig 7: Battery info
Fig 8: Emergency contact
Fig 9: Quick actions
The user can immediately stop the wheelchair at any point during their journey by speaking or clicking. The user may run into an error, for example if the voice-enabled system is unable to recognize a command; when this happens, the system reads out the five commands that the user can use: LEFT, RIGHT, FORWARD, BACKWARD, and STOP. The user interface favours recognition over recall by presenting the available directions directly rather than asking the user to type them, so the user only has to recognize and select. By taking input in multiple ways, the interface remains usable for people with various types of disability.
4.4 Interaction Model
Abowd & Beale’s Model is used as interaction framework.
Abowd and Beale’s interaction framework identifies system
and user components which communicate via the input and
output components of aninterface.Similarcyclical processes
are taken in this communication from the user's task
formulation to the system's execution and presentation of
the task to the user's observation of the task's outcomes,
from which the user can then design new tasks.
In the proposed system, the user is the person using the wheelchair application interface on their phone. Input is given either through the smartphone's touchscreen (haptic input) or through voice. The system takes as input commands from the user either to move in a certain direction or to stop the movement of the wheelchair. Code written with the Arduino IDE converts the commands into the actual actions performed by the wheels through the motors. The output is the required movement of the wheelchair.
4.5 Testing the model
The unit testing method was employed to generate the test cases. Unit testing involves isolating different sections of the code and checking them for bugs and errors against the different possible inputs to the application. Various outputs were observed for the different inputs as a result of the tests.
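As an illustration of what such a unit test could look like for the command-parsing layer, the sketch below uses Python's unittest module; the parse_command helper and its command codes are assumptions made for this example, not the project's actual API.

```python
import unittest

# Assumed parsing helper under test; in the project this layer sits between the
# recognizer/GUI and the Bluetooth transmitter.
def parse_command(text: str):
    mapping = {"left": b"L", "right": b"R", "forward": b"F", "backward": b"B", "stop": b"S"}
    return mapping.get(text.strip().lower())

class TestParseCommand(unittest.TestCase):
    def test_valid_directions(self):
        self.assertEqual(parse_command("Forward"), b"F")
        self.assertEqual(parse_command(" stop "), b"S")

    def test_noise_and_unrelated_phrases_are_rejected(self):
        self.assertIsNone(parse_command("great day"))   # mirrors the failed case in Section 5
        self.assertIsNone(parse_command(""))

if __name__ == "__main__":
    unittest.main()
```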
5. Results and discussion
Fig 10: Tkinter GUI panel for voice recognition
Test case                   | Result
----------------------------|--------
Haptic input: Left          | Success
Haptic input: Right         | Success
Haptic input: Forward       | Success
Haptic input: Backward      | Success
Haptic input: Stop          | Success
Haptic input: Multiple keys | Failed
Audio input: Left           | Success
Audio input: Right          | Success
Audio input: Forward        | Success
Audio input: Backward       | Success
Audio input: Stop           | Success
Audio input: Noise          | Failed
Audio input: "Great day"    | Failed
Fig 11: Sample output for commands from buttons and voice
The project was tested for parsing and transmission of user input. Haptic input lets the user hold down a button and transmit the corresponding movement signal via Bluetooth for as long as the button is held; the signal ceases when the button is released. This design is convenient and intuitive to work with. The voice recognition functionality allows the user to toggle the movement state by saying key activation words such as 'forward' and 'stop'. Once a voice command is recognized, the corresponding movement signal is transmitted continuously until interrupted by another user input. Voice recognition performance was found to depend on the phone's microphone quality and varied from phone to phone; using earphone or headphone microphones improved command recognition. The noise tolerance of the system was tested in a room by playing music and adding multiple audio sources. The system proved quite sensitive to noise: it attempted to parse and recognize words from the surrounding sound and failed to recognize commands when the music was about as loud as the tester's voice. Performance was satisfactory in lightly noisy environments and when the microphone was held close while speaking.
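A small sketch of the latched behaviour described above, under the assumption that the sender simply keeps re-transmitting the last recognized command at a fixed interval until a new command replaces it (the paper does not specify the exact mechanism):

```python
import threading
import time

class LatchedCommandSender:
    """Re-sends the last voice command until a new one replaces it (illustrative only)."""
    def __init__(self, send, period=0.2):
        self._send = send          # callable taking a command byte, e.g. a serial write
        self._period = period
        self._current = b"S"       # start in the stopped state
        self._lock = threading.Lock()
        threading.Thread(target=self._loop, daemon=True).start()

    def set_command(self, code: bytes):
        with self._lock:
            self._current = code   # a newly recognized word simply swaps the latched command

    def _loop(self):
        while True:
            with self._lock:
                self._send(self._current)
            time.sleep(self._period)

# Usage with a stand-in sender; in the real system this would be the Bluetooth serial link.
if __name__ == "__main__":
    sender = LatchedCommandSender(lambda code: print("tx", code))
    sender.set_command(b"F")
    time.sleep(1)
    sender.set_command(b"S")
    time.sleep(0.5)
```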
6. Conclusion and Future work
In this project we developed a user-friendly graphical user interface for operating a wheelchair from instructions given through a smartphone. We integrated a simple, minimalistic user interface with a voice recognition feature that moves the chair when the user simply speaks a command. In the future we plan to collaborate with mechanical experts to build a fully functional voice-enabled wheelchair. Another main goal is to expand the software with additional functionality such as speed control or AI-powered obstacle detection using the phone's built-in camera. We also seek NGOs that support the cause of providing easy movement to elderly and handicapped people. An implementation of the final product can help disabled people live an easier and better life.
7. References
[1] C.-T. Tan and B. C. J. Moore, "Perception of nonlinear distortion by hearing-impaired people," International Journal of Audiology, vol. 47, no. 5, pp. 246-256, 2008.
[2] S. Oberle and A. Kaelin, "Recognition of acoustical alarm signals for the profoundly deaf using hidden Markov models," in IEEE International Symposium on Circuits and Systems (Hong Kong), pp. 2285-2288, 1995.
[3] A. Shawki and Z. J., "A smart reconfigurable visual system for the blind," in Proceedings of the Tunisian-German Conference on Smart Systems and Devices, 2001.
[4] A. K. Dalsaniya and D. H. Gawali, "Smart phone based wheelchair navigation and home automation for disabled," in 2016 10th International Conference on Intelligent Systems and Control (ISCO), 2016. doi:10.1109/isco.2016.7727033
[5] R. J. Leela, A. Joshi, B. Agasthiya, U. K. Aarthiee, E. Jameela, and S. Varshitha, "Android Based Automated Wheelchair Control," in 2017 Second International Conference on Recent Trends and Challenges in Computational Models (ICRTCCM), 2017. doi:10.1109/icrtccm.2017.44
[6] V. G. Posugade, K. K. Shedge, and C. S. Tikhe, "Touch-Screen Based Wheelchair System," International Journal of Engineering Research and Applications (IJERA).
[7] S. Samanta and R. Dayana, "Sensor Based Eye Controlled Automated Wheelchair," International Journal for Research in Emerging Science and Technology.
[8] N. Nowshin, M. M. Rashid, T. Akhtar, and N. Akhtar, "Infrared Sensor Controlled Wheelchair for Physically Disabled People," Advances in Intelligent Systems and Computing, pp. 847-855, 2018. doi:10.1007/978-3-030-02683-7_60
[9] R. S. Nipanikar, V. Gaikwad, C. Choudhari, R. Gosavi, and V. Harne, "Automatic wheelchair for physically disabled persons," 2013.
[10] S. D. Suryawanshi, J. S. Chitode, and S. S. Pethakar, "Voice Operated Intelligent Wheelchair," 2013.
BIOGRAPHIES
Dr. Deepa Karuppaiah
Assistant Professor (Senior)
SCOPE, 9791570158
deepa.k@vit.ac.in
VIT, Vellore,
Tamil Nadu
Mr. Srikanth Balakrishna
UG Student VIT, Vellore,
Tamil Nadu
Ms. Prarthana Shiwakoti
UG Student VIT, Vellore,
Tamil Nadu
