Kyungpook National University
Pose Estimation of a Mobile Robot Based on Fusion of IMU Data and Vision Data Using an Extended Kalman Filter
Mary B. Alatise and Gerhard P. Hancke
Sensors Journal, September 21, 2017
Motivation
The objective of this paper is to estimate the pose of a mobile robot. The paper fuses IMU sensor data with vision data to determine the robot's position accurately.
Why we need fusion of the IMU sensor and vision
 IMU sensor: accumulated drift; requires sensor calibration
 Vision: sensitive to illumination; visual occlusion; blurred features under fast and unpredictable motions; limited field of view of the camera
 Use the complementary features of IMU-sensor and vision-based methods.
 Advantages: easy implementation, low cost, fast computation, improved accuracy
Proposed Method
Overview of the stages of IMU and Vision fusion
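The fusion stage in the overview can be illustrated with a minimal Kalman-filter loop: the IMU acceleration drives the prediction step and a vision position fix drives the correction step. This is a simplified sketch with a hypothetical constant-velocity state, not the paper's exact EKF state model:

```python
import numpy as np

# Illustrative sketch (not the paper's model): state x = [position, velocity].
def ekf_predict(x, P, accel, dt, q=0.05):
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    x = F @ x + B * accel                   # propagate state with IMU reading
    P = F @ P @ F.T + q * np.eye(2)         # inflate covariance (process noise)
    return x, P

def ekf_update(x, P, z_pos, r=0.01):
    H = np.array([[1.0, 0.0]])              # vision observes position only
    y = z_pos - H @ x                       # innovation
    S = H @ P @ H.T + r                     # innovation covariance
    K = P @ H.T / S                         # Kalman gain (scalar measurement)
    x = x + (K * y).ravel()                 # correct state toward the fix
    P = (np.eye(2) - K @ H) @ P             # shrink covariance
    return x, P

x, P = np.zeros(2), np.eye(2)
x, P = ekf_predict(x, P, accel=0.2, dt=0.1)
x, P = ekf_update(x, P, z_pos=0.05)
```

The update pulls the drifting IMU-propagated state toward the vision fix, which is the mechanism by which the fusion limits accumulated drift.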
Hardware used for the experiment:
(a) Arduino 101 microcontroller; (b) LinkSprite camera; (c) 4WD robot platform
Results and Discussion
Simulated Results of Object Detection and Recognition in an Image
Pipeline: proposed object (query image) and training image (RGB) → convert the image from RGB to grayscale → remove lens distortion → match features, yielding outliers and inliers → keep only the inliers → draw a display box around the recognized object.
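The inlier/outlier split in that pipeline can be illustrated with a minimal RANSAC sketch. It fits a hypothetical 2-D translation between matched keypoints, a much simpler model than the geometry estimated from SURF matches in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def ransac_translation(src, dst, iters=100, tol=0.5):
    """Fit a 2-D translation between matched keypoints, rejecting outliers."""
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        i = rng.integers(len(src))                  # minimal sample: one match
        t = dst[i] - src[i]                         # candidate translation
        inliers = np.linalg.norm(dst - (src + t), axis=1) < tol
        if inliers.sum() > best_inliers.sum():      # keep the best consensus set
            best_inliers = inliers
    # refine the translation over all inliers of the best hypothesis
    t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return t, best_inliers

# 20 matches agreeing on a (2, 1) shift, plus 5 grossly wrong matches
src = rng.uniform(0, 10, size=(25, 2))
dst = src + np.array([2.0, 1.0])
dst[20:] += rng.uniform(5, 10, size=(5, 2))         # corrupt the last 5 matches
t, inliers = ransac_translation(src, dst)
```

Hypotheses drawn from bad matches gather almost no support, so the consensus set recovers exactly the true correspondences, which is why the recognized-object box is drawn from inliers only.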
Analysis of Experimental Results
Euler angles from IMU. Roll: red; pitch: green; yaw: blue.
 The figure shows the robot travelling on a flat surface.
 The roll and pitch angles remained close to zero until there was a change in direction at about 49 s.
 At the point where the robot turned 90 degrees to the right, the yaw angle was 91.25 degrees.
 The maximum values obtained for the pitch and roll angles were 15 degrees and 18 degrees, respectively.
 From the experimental results, it can be concluded that Euler angles were a good choice for this experiment, because the pitch angle never reached ±90 degrees, so gimbal lock did not occur.
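The tilt angles and the gimbal-lock concern can be made concrete with the standard accelerometer tilt formulas (a generic sketch, not taken from the paper):

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Tilt angles (degrees) from a static accelerometer reading of gravity."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return math.degrees(roll), math.degrees(pitch)

def near_gimbal_lock(pitch_deg, margin_deg=5.0):
    """Euler angles degenerate as pitch approaches +/-90 degrees."""
    return abs(abs(pitch_deg) - 90.0) < margin_deg

roll, pitch = roll_pitch_from_accel(0.0, 0.0, 9.81)   # level sensor: both ~0
```

With the experiment's maximum pitch of 15 degrees and roll of 18 degrees, such a check stays far from the ±90-degree singularity, consistent with the slide's conclusion.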
Orientation results for fused sensors
 When the robot made a 90-degree right turn:
 the yaw value from the IMU was 91.25 degrees;
 the yaw value from vision was 88 degrees;
 the pitch value was 1.2 degrees;
 the roll value was 4 degrees.
 From these results, it can be concluded that the proposed method reduces the accumulated errors and drift.
Figure panels: roll angles, pitch angles, yaw angles.
Comparison of the three directions of the mobile robot from the vision method
Experimental position in the XYZ directions from vision data.
 The figure shows a distinct estimate of the mobile robot's position.
 The position estimate based on the reference object in the image is relative to the position of the mobile robot and the world coordinate frame, with the median vector of the planar object along the Z-axis close to 1 and -1.
 From the experimental position results, it can be concluded that the combination of the SURF and RANSAC algorithms can be used for accurate position estimation.
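Position from a known planar reference object ultimately rests on the pinhole camera model. A minimal range sketch, using hypothetical focal length and marker size rather than the paper's calibration values:

```python
def range_from_reference(focal_px, real_width_m, pixel_width):
    """Pinhole-camera distance to a planar object of known physical width."""
    # similar triangles: Z / real_width = focal / pixel_width
    return focal_px * real_width_m / pixel_width

# hypothetical values: 800 px focal length, 0.20 m wide marker, 40 px in image
z = range_from_reference(800.0, 0.20, 40.0)
```

The inlier matches from SURF and RANSAC give the reliable pixel measurements that make such a geometric back-projection trustworthy.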
Ground truth data
 The ground truth data were collected with an external camera placed in the experimental environment.
 The camera was placed on flat terrain at a distance of 4.90 m from the mobile robot.
Ground truth system based on a camera
Comparison
Trajectory in the XY plane
Position corresponding to the
trajectory
Performance: Accuracy
 The accuracy of the proposed method is evaluated by the root mean square error (RMSE).
 The figures show the difference between the ground truth and the proposed method.
 The maximum position error is 0.145 m.
 The maximum orientation error is 0.95 degrees.
RMSE for position RMSE for orientation
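The RMSE metric used here is straightforward to compute from paired estimate and ground-truth series; a minimal sketch:

```python
import numpy as np

def rmse(estimate, ground_truth):
    """Root mean square error between an estimated and a ground-truth series."""
    e = np.asarray(estimate, dtype=float) - np.asarray(ground_truth, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

# e.g. estimated vs surveyed positions along one axis (metres)
err = rmse([0.10, 0.52, 1.01], [0.00, 0.50, 1.00])
```

The same formula applies per axis for position (metres) and per Euler angle for orientation (degrees), which is how the 0.145 m and 0.95 degree maxima are reported.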
RMSE of position and orientation
 The position error increases slightly over time.
 Both the pitch and yaw error angles decrease as time increases.
 The roll error increased gradually from the start until about 80 s and then decreased.
CONCLUSION
 The method fuses computer vision with an IMU sensor.
 For object recognition, the SURF and RANSAC algorithms were used to detect and match features in images.
 Experimental results show that the proposed method performs better than either sensor used individually.
 The proposed method gives high position accuracy.
 The error values of the proposed method are reasonable for indoor localization.