Vector Distance Transform Maps for
Autonomous Mobile Robot Navigation
Janindu Arukgoda
20 December 2019
Content
Background
List of Contributions
Contribution 1 → Contribution 6
Conclusions
List of Publications
Background
Fundamental questions of autonomous mobile robot navigation:
Where am I?
Where am I going?
How do I get there?
(Hugh Durrant-Whyte and John Leonard, 1989)
Common requirement: a map
Background
Map
Mathematical representation of the environment
Machine understandable
Required to answer all three questions
Examples
Feature maps (landmark maps, point cloud maps)
Occupancy grid maps (OGM)
Distance transform (DT) maps
Background
Map representations (feature maps, OGM, DT maps) compared on four properties:
Clearly defined occupied and free spaces
Analytical observation equation
Continuous representation
No feature extraction / loss of information from sensor measurement
Background
Distance Transforms
Originated in the image processing domain
For a 2D binary image where pixels belong either to the set S, the set of 1's that belong to objects, or the set S̄, the set of 0's that belong to the background, a distance map or distance transform L(S) is an image defined as

L(x) = min d[(i, j), S̄]  ∀x = (i, j) ∈ S  (1)

The distance function d is positive definite and symmetric, and satisfies the triangle inequality.
Background
Distance Transform Variants
Unsigned DT: distance to the closest object
Signed DT: distance assigned a sign based on the concept of inside vs outside
Vector DT: distance described using orthonormal vectors
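As a concrete illustration of the three variants, here is a minimal sketch assuming SciPy is available; the array names and the inside/outside sign convention are illustrative assumptions, not the thesis implementation:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def dt_variants(occupied):
    """occupied: 2D bool array, True on obstacle cells."""
    free = ~occupied
    # Unsigned DT: Euclidean distance from every cell to the nearest obstacle.
    udt, nearest = distance_transform_edt(free, return_indices=True)
    # Signed DT: one common convention negates the distance inside obstacles.
    sdt = udt - distance_transform_edt(occupied)
    # Vector DT: per-cell offset to the nearest obstacle cell (row/column
    # offsets, loosely the "x"/"y" components), so that
    # udt == sqrt(vdt_x**2 + vdt_y**2) holds everywhere.
    ii, jj = np.indices(occupied.shape)
    vdt_x = nearest[0] - ii
    vdt_y = nearest[1] - jj
    return udt, sdt, vdt_x, vdt_y
```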
Content
Background
List of Contributions
Contribution 1 → Contribution 6
Conclusions
List of Publications
List of Contributions
1. Vector distance transforms (VDT) for environment
representation
2. Sensor model on VDT maps
3. Optimization framework for localization on VDT maps
4. EKF framework for localization on VDT maps
5. Active localization using sparse range measurements
6. Capturing the uncertainty of VDT maps
Content
Background
List of Contributions
Contribution 1 : VDT for Environment Representation
Conclusions
List of Publications
VDT for Environment Representation
Figure: Box-shaped environment
VDT for Environment Representation
Figure: (a) UDT, (b) SDT, (c) x component of VDT and (d) y component of VDT for the box-shaped environment
Behavior of Distance Transforms
Figure: (a) UDT, (b) SDT, (c) x component of VDT and (d) y component of VDT along the dotted line in the box-shaped environment
Behavior of Interpolated Distance Transforms
Figure: Behaviour of cubic spline interpolations of (a) UDT, (b) SDT, (c) x component of VDT and (d) y component of VDT close to the boundary
Gradients of Distance Transforms at the Boundary
Figure: Cubic spline interpolations of the gradients along the x direction of (a) UDT, (b) SDT, (c) VDT x and (d) VDT y close to the boundary
Behavior of Distance Transforms at Cut Locus
Figure: Cubic spline interpolations of the gradients along the x direction of (a) UDT, (b) SDT, (c) VDT x and (d) VDT y close to the cut locus
VDT for Environment Representation
UDT, SDT and VDT compared on three properties:
Capable of representing complex 2D environments
Continuous gradients on boundaries
Continuous function on cut locus
Content
Background
List of Contributions
Contribution 2 : Sensor Model for VDT Maps
Conclusions
List of Publications
Sensor Model for VDT Maps
A sensor model associates sensor measurements with the map, given the robot state.
Likelihood of an observation given the robot state:

P(z | x, m)  (2)

z : observation
x : robot state
m : map / internal representation of the environment
Sensor Model for VDT Maps
Given a VDT map, an observation zi ∈ Z, zi = {ri, θi}, and an estimate of the robot pose XR = (xR, yR, φR), what is the disparity between the actual and the expected observation?

Xo = [xoi ; yoi] = [xR + ri sin(θi + φR) ; yR + ri cos(θi + φR)]  (3)
Sensor Model for VDT Maps
dVDT = VDT(Xo) = [DTx(Xo1), DTy(Xo1), …, DTx(Xoi), DTy(Xoi), …, DTx(Xon), DTy(Xon)]ᵀ  (4)
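A minimal sketch of Eqs. (3)-(4), assuming the two VDT components are stored as interpolable grids (queried here through SciPy's RegularGridInterpolator); the function and variable names are assumptions for illustration:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def d_vdt(pose, ranges, bearings, dtx_interp, dty_interp):
    """Stack DT_x and DT_y values at the projected laser end points."""
    xr, yr, phi = pose
    # Project each range-bearing measurement to a world point (Eq. 3).
    xo = xr + ranges * np.sin(bearings + phi)
    yo = yr + ranges * np.cos(bearings + phi)
    pts = np.stack([xo, yo], axis=1)
    # Interleave the two VDT components per point (Eq. 4).
    return np.ravel(np.column_stack([dtx_interp(pts), dty_interp(pts)]))
```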
Sensor Model for VDT Maps
Assuming
1. Each range measurement within a laser scan is independent
2. Range measurement noise σr is the only contributing factor to the sensor noise
the sensitivity of dVDT to sensor noise is

ΣdVDT = diag(σ²DTx,r1, σ²DTy,r1, …, σ²DTx,ri, σ²DTy,ri, …)  (5)
Sensor Model for VDT Maps
σ²DTx,ri = JDTx,ri · σ²r · JDTx,riᵀ
σ²DTy,ri = JDTy,ri · σ²r · JDTy,riᵀ  (6)

JDTx,ri = ∂DTx/∂r |ri = ∂DTx/∂xoi · ∂xoi/∂r + ∂DTx/∂yoi · ∂yoi/∂r, evaluated at (xoi, yoi, ri)
JDTy,ri = ∂DTy/∂r |ri = ∂DTy/∂xoi · ∂xoi/∂r + ∂DTy/∂yoi · ∂yoi/∂r, evaluated at (xoi, yoi, ri)  (7)
Sensor Model for VDT Maps
A scalar disparity measurement between the expected and the actual observations, inspired by the Chamfer distance:

dVCD = Σ_{i=1}^{2n} dVDT(i)² = Σ_{i=1}^{n} [DTx(Xoi)² + DTy(Xoi)²]  (8)
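Continuing the sketch above, the scalar Vector Chamfer Distance of Eq. (8) is just the squared norm of the stacked VDT vector (again an illustrative sketch, reusing the assumed d_vdt helper from earlier):

```python
import numpy as np

def d_vcd(pose, ranges, bearings, dtx_interp, dty_interp):
    """Eq. (8): sum of squared VDT components over all projected points."""
    d = d_vdt(pose, ranges, bearings, dtx_interp, dty_interp)
    return float(np.sum(d ** 2))
```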
Content
Background
List of Contributions
Contribution 3 : Optimization Framework for Localization
Conclusions
List of Publications
Localization
“Where am I?” == Localization
Given
a map
sensor measurements
prior information
figure out where I am.
Optimization Framework for Localization
Given
S : sensor measurements
M : map
XR = (xR, yR, φR) : hypothesized robot pose
If a function f can evaluate the disparity between XR and the observations, localization becomes a minimization problem:

X̂R = argmin_{xR, yR, φR} f(S, M)  (9)

Is the dVCD scalar disparity measurement suitable for this?
Optimization Framework for Localization
Figure: Behavior of dV CD around the true robot position with the
orientation fixed at true value
Optimization Framework for Localization
Figure: Behavior of dV CD around the true robot orientation with
the position fixed at true value
Optimization Framework for Localization
X̂R = argmin_{xR, yR, φR} dVCD(Xo, DTv)  (10)

Initial guess required
Similar to C-LOG
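A minimal localization sketch for Eq. (10): minimize dVCD over the pose starting from an initial guess. scipy.optimize is an assumed stand-in for whichever solver the thesis used; a derivative-free method keeps the sketch short, although the smooth VDT also supports gradient-based solvers:

```python
from scipy.optimize import minimize

def localize(x0, ranges, bearings, dtx_interp, dty_interp):
    """Eq. (10): pose that minimizes d_VCD, from initial guess x0."""
    cost = lambda pose: d_vcd(pose, ranges, bearings, dtx_interp, dty_interp)
    res = minimize(cost, x0, method="Nelder-Mead")
    return res.x  # estimated (x_R, y_R, phi_R)
```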
Optimization Framework for Localization
Uncertainty estimation
Sensitivity of the pose estimate to the error in the sensor measurement, Σz = diag(σ²r)
Pose derived using non-linear least squares optimization
Use the implicit function theorem:

cov(X̂R) = J · Σz · Jᵀ  (11)
Optimization Framework for Localization
J = −H⁻¹ · [∂²dVCD/∂xr∂r, ∂²dVCD/∂yr∂r, ∂²dVCD/∂φr∂r]ᵀ  (12)

H = [∂²dVCD/∂x²r, ∂²dVCD/∂xr∂yr, ∂²dVCD/∂xr∂φr ;
     ∂²dVCD/∂xr∂yr, ∂²dVCD/∂y²r, ∂²dVCD/∂yr∂φr ;
     ∂²dVCD/∂xr∂φr, ∂²dVCD/∂yr∂φr, ∂²dVCD/∂φ²r]  (13)

Calculating these derivatives is possible because of the good behavior of the VDT at the boundary
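One way to realize Eqs. (11)-(13) numerically is sketched below, with central finite differences standing in for the analytical derivatives that the smooth VDT makes available (an assumption made for brevity, not the thesis approach; d_vcd is the helper sketched earlier):

```python
import numpy as np

def pose_covariance(x_hat, ranges, bearings, dtx_i, dty_i, sigma_r, eps=1e-4):
    """Implicit-function-theorem pose covariance around the optimum x_hat."""
    x_hat = np.asarray(x_hat, float)
    ranges = np.asarray(ranges, float)
    f = lambda p, r: d_vcd(p, r, bearings, dtx_i, dty_i)

    def d2(ip, jp=None, jr=None):
        # Central-difference second derivative w.r.t. pose element ip and
        # either pose element jp (for H) or range element jr (for Eq. 12).
        e1 = np.zeros(3); e1[ip] = eps
        if jp is not None:
            e2 = np.zeros(3); e2[jp] = eps
            return (f(x_hat+e1+e2, ranges) - f(x_hat+e1-e2, ranges)
                    - f(x_hat-e1+e2, ranges) + f(x_hat-e1-e2, ranges)) / (4*eps**2)
        e2 = np.zeros_like(ranges); e2[jr] = eps
        return (f(x_hat+e1, ranges+e2) - f(x_hat+e1, ranges-e2)
                - f(x_hat-e1, ranges+e2) + f(x_hat-e1, ranges-e2)) / (4*eps**2)

    H = np.array([[d2(i, jp=j) for j in range(3)] for i in range(3)])    # Eq. 13
    D = np.array([[d2(i, jr=k) for k in range(len(ranges))] for i in range(3)])
    J = -np.linalg.solve(H, D)                                           # Eq. 12
    return J @ (sigma_r**2 * np.eye(len(ranges))) @ J.T                  # Eq. 11
```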
Optimization Framework for Localization
Simulation dataset: ground truth available
Figure: Trajectory estimate of the optimization framework
Optimization Framework for Localization
Figure: Optimization pose estimate error
Optimization Framework for Localization
Localization of a UAV in Kentland, VA, USA for MBZIRC 2017
Flight controller estimates roll and pitch
Height obtained from a range sensor
3DOF localization using monocular camera
Edge map of the flat ground terrain represented using a
VDT
Optimization Framework for Localization
Edge pixels (λi, µi) are projected to the ground using the hypothesized robot position (xr, yr, zr), orientation R, and focal length f:

xoi = [xoi ; yoi] = [xr + zr (λi R1,1 + µi R1,2 − f R1,3)/(λi R3,1 − µi R3,2 + f R3,3) ;
                     yr + zr (λi R2,1 + µi R2,2 − f R2,3)/(λi R3,1 − µi R3,2 + f R3,3)]  (14)

Figure: Bogey 5
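A direct transcription of Eq. (14) as a sketch (0-based indexing, so R[0, 0] is R1,1; the function name is an assumption):

```python
import numpy as np

def project_edge_pixel(xr, yr, zr, R, f, lam, mu):
    """Project edge pixel (lam, mu) onto the flat ground plane (Eq. 14)."""
    den = lam * R[2, 0] - mu * R[2, 1] + f * R[2, 2]
    xo = xr + zr * (lam * R[0, 0] + mu * R[0, 1] - f * R[0, 2]) / den
    yo = yr + zr * (lam * R[1, 0] + mu * R[1, 1] - f * R[1, 2]) / den
    return xo, yo
```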
Optimization Framework for Localization
Figure: UAV trajectory
Optimization Framework for Localization
Figure: Optimization pose error
Content
Background
List of Contributions
Contribution 4 : EKF Framework for Localization
Conclusions
List of Publications
EKF Framework for Localization
Prediction using the motion model X̂k|k−1 = g(X̂k−1, Uk):

[x̂k|k−1 ; ŷk|k−1 ; φ̂k|k−1] = [x̂k−1 + ∆T · υk · cos(φ̂k−1) ; ŷk−1 + ∆T · υk · sin(φ̂k−1) ; φ̂k−1 + ∆T · ωk]  (15)

X̂Rk−1 : robot pose at step k − 1
Uk = [υk, ωk] : odometry
∆T : time between steps k − 1 and k
EKF Framework for Localization
Covariance propagation:

Pk|k−1 = Gx · Pk−1 · Gxᵀ + Gu · Q · Guᵀ  (16)

Gx = ∂g/∂X |X̂k−1,Uk = [1, 0, −∆T·υk·sin(φk−1) ; 0, 1, ∆T·υk·cos(φk−1) ; 0, 0, 1]  (17)

Gu = ∂g/∂U |X̂k−1,Uk = [∆T·cos(φk−1), 0 ; ∆T·sin(φk−1), 0 ; 0, ∆T]  (18)

Pk−1 : pose covariance at step k − 1
Q : odometry noise
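The prediction step of Eqs. (15)-(18) as a minimal sketch (variable names are assumptions):

```python
import numpy as np

def ekf_predict(x, P, u, Q, dT):
    """EKF prediction with the unicycle motion model."""
    xr, yr, phi = x
    v, w = u
    x_pred = np.array([xr + dT * v * np.cos(phi),
                       yr + dT * v * np.sin(phi),
                       phi + dT * w])                      # Eq. 15
    Gx = np.array([[1.0, 0.0, -dT * v * np.sin(phi)],
                   [0.0, 1.0,  dT * v * np.cos(phi)],
                   [0.0, 0.0,  1.0]])                      # Eq. 17
    Gu = np.array([[dT * np.cos(phi), 0.0],
                   [dT * np.sin(phi), 0.0],
                   [0.0, dT]])                             # Eq. 18
    P_pred = Gx @ P @ Gx.T + Gu @ Q @ Gu.T                 # Eq. 16
    return x_pred, P_pred
```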
EKF Framework for Localization
Observation model
Standard EKF calculates an expected measurement
Innovation is the disparity between expected and actual
measurements
In a VDT map, this disparity is implicitly captured in the
map
h(X̂Rk|k−1, zk) = dVDT(Xok) = 0  (19)
EKF Framework for Localization
Innovation:

νk = h(X̂Rk|k−1, zk) = dVDT(Xok)  (20)

A vector of distance values at the projected laser end points
EKF Framework for Localization
Innovation covariance:

Sk = Hz · Σz · Hzᵀ + Hx · Pk|k−1 · Hxᵀ  (21)
Hz = ∂h/∂z |X̂k|k−1,z  (22)
Hx = ∂h/∂X |X̂k|k−1,z  (23)
EKF Framework for Localization
Calculation of Kalman gain and update equations follow the
standard EKF.
Kk = Pk|k−1 · Hxᵀ · Sk⁻¹  (24)
X̂k = X̂k|k−1 + Kk · νk  (25)
Pk = Pk|k−1 − Kk · Sk · Kkᵀ  (26)
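Eqs. (21)-(26) as a sketch. The innovation is the stacked VDT vector itself, since the expected measurement on a VDT map is zero at the boundary; Hx and Hz are the Jacobians of dVDT with respect to the pose and the ranges (finite differences would do in a sketch):

```python
import numpy as np

def ekf_update(x_pred, P_pred, nu, Hx, Hz, Sigma_z):
    """EKF update with the VDT innovation vector nu (Eq. 20)."""
    S = Hz @ Sigma_z @ Hz.T + Hx @ P_pred @ Hx.T           # Eq. 21
    K = P_pred @ Hx.T @ np.linalg.inv(S)                   # Eq. 24
    x = x_pred + K @ nu                                    # Eq. 25
    P = P_pred - K @ S @ K.T                               # Eq. 26
    return x, P
```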
EKF Framework for Localization
Simulation dataset: ground truth available
Figure: Trajectory estimate of the EKF framework
EKF Framework for Localization
Figure: EKF pose estimate error
EKF Framework for Localization
Algorithm | Position MAE (m) | Orientation MAE (rad)
Optimization (proposed) | 0.0181 ± 0.0126 | 0.0024 ± 0.0037
EKF (proposed) | 0.0235 ± 0.0223 | 0.0161 ± 0.0280
Optimization (C-LOG) | 0.0099 ± 0.0141 | 0.0042 ± 0.0167
EKF (UDT) | 0.0227 ± 0.0251 | 0.0157 ± 0.0211
Particle Filter (AMCL) | 0.0296 ± 0.0243 | 0.0252 ± 0.0431
Table: Mean Absolute Errors : Simulation

Algorithm | Position MSE (m² × 10⁻³) | Orientation MSE (rad² × 10⁻³)
Optimization (proposed) | 0.4847 ± 1.700 | 0.0192 ± 0.2636
EKF (proposed) | 1.1000 ± 2.6000 | 1.0000 ± 2.5000
Optimization (C-LOG) | 0.299 ± 2.167 | 0.2975 ± 1.7000
EKF (UDT) | 1.146 ± 4.492 | 1.4000 ± 3.7000
Particle Filter (AMCL) | 1.464 ± 2.078 | 2.5000 ± 7.1000
Table: Mean Squared Errors : Simulation
EKF Framework for Localization
Localization of a mobility scooter
Ground surface structure observations
Assume locally flat terrain
Tilted depth camera
Landmark based bearing-only EKF localization
implemented by Maleen Jayasuriya
EKF Framework for Localization
Figure: PMD localization framework
EKF Framework for Localization
Figure: (a) Ground surface image and (b) the corresponding HED output image
EKF Framework for Localization
Figure: Stitched ground surface map
EKF Framework for Localization
Figure: Trajectory estimate in Glebe, Sydney
Content
Background
List of Contributions
Contribution 5 : Active Localization
Conclusions
List of Publications
Active Localization
LiDARs are high-frequency, high-resolution sensors
The EKF framework can handle sparse measurements - even a single range measurement
Using a rotating single point laser to localize
Active localization - determining the direction of the single point laser
Based on information gain
Active Localization
Figure: Active localization framework
Active Localization
Prediction step
standard
Observation step
similar to the previously presented step
The observation vector contains only two elements
Update step
standard
Active Localization
The sensor rotation range is divided into n segments
Once a segment is selected as the goal, the sensor rotates from its current position to the farthest end of the selected segment in a sweeping motion
The sensor continuously observes the environment during the sweep
Once the sensor reaches the goal, a new goal is calculated
Active Localization
Calculating the goal
Given the current position estimate and uncertainty:
Ray trace at a fixed angular resolution to generate hypothetical observations
Update the pose uncertainty using the hypothetical observations
Pick the segment that produces the best cost function value (see the sketch below)
For the i-th segment, the cost function is defined as

cost(i) = trace(Pki) + λ · d  (27)

Pki : uncertainty of the pose estimate updated with the hypothetical observations in the i-th segment
d : angular distance between the current position and the i-th segment
λ : tuning parameter
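A sketch of the goal selection loop around Eq. (27). ray_trace and hypothetical_update are assumed helper names standing in for the thesis' ray caster and EKF covariance update, and each segment is assumed to expose its center angle:

```python
import numpy as np

def select_goal(x_hat, P, segments, current_angle, lam,
                ray_trace, hypothetical_update):
    """Pick the segment index minimizing cost(i) = trace(P_ki) + lam * d."""
    best, best_cost = None, np.inf
    for i, seg in enumerate(segments):
        z_hyp = ray_trace(x_hat, seg)                # hypothetical observations
        P_i = hypothetical_update(x_hat, P, z_hyp)   # updated pose uncertainty
        d = abs(seg.center - current_angle)          # angular travel distance
        cost = np.trace(P_i) + lam * d               # Eq. 27
        if cost < best_cost:
            best, best_cost = i, cost
    return best
```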
Active Localization
Experiment
Willow Garage environment in Gazebo
8 segments covering 360°
Range limited to 3 m
Video
Content
Background
List of Contributions
Contribution 6 : Uncertain VDT Maps
Conclusions
List of Publications
Uncertain VDT Maps
Desired properties of a map built from point cloud data
Continuous
Captures uncertainty of map building process
Simpler Sensor Model
Uncertain VDT Maps
Problem definition :
Given
a set of noisy robot poses
noisy observations obtained at said poses
how to represent the environment using the observations while
capturing the uncertainties?
Uncertain VDT Maps
If u(x), x ∈ ℝ², is the distance from point x to a set of points S,
u(x) ⟹ UDT
u satisfies the Eikonal equation

|∇u| = 1,  u|S = 0  (28)
Uncertain VDT Maps
Given the relationship between UDT and VDT

DTu(x, y)² = DTx(x, y)² + DTy(x, y)²  (29)

a PDE can be derived as

[(DTx ∂DTx/∂x + DTy ∂DTy/∂x)² + (DTx ∂DTx/∂y + DTy ∂DTy/∂y)²] / (DT²x + DT²y) = 1  (30)

with boundary conditions

DTx(x, y) = 0, DTy(x, y) = 0, ∀(x, y) ∈ S  (31)
Uncertain VDT Maps
Parametric approximation of a solution to a PDE:

𝓛u = f, x ∈ Ω  (32)
𝓑u = g, x ∈ ∂Ω  (33)

f, g : given functions
𝓛, 𝓑 : differential operators
∂Ω : boundary of the bounded open domain Ω

If a parametric function ua(x, βi) exists that is arbitrarily close to u, the βi can be optimized to obtain u = ua(x, βi) using the objective function h:

h = ∫Ω ||𝓛ua − f||² dV + ∫∂Ω ||𝓑ua − g||² dS  (34)
Uncertain VDT Maps
Cubic B-spline approximation of distance transforms:

P(u, w) = (1/6)² [u³, u², u, 1] M P Mᵀ [w³, w², w, 1]ᵀ  (35)

M = [−1, 3, −3, 1 ; 3, −6, 3, 0 ; −3, 0, 3, 0 ; 1, 4, 1, 0]  (36)

P = [P00, P01, P02, P03 ; P10, P11, P12, P13 ; P20, P21, P22, P23 ; P30, P31, P32, P33]  (37)
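Evaluating one patch of Eqs. (35)-(37) reduces to a pair of 4×4 matrix products; a minimal sketch, where P is a 4×4 window of the DTx or DTy control grid and (u, w) are the local coordinates in [0, 1]:

```python
import numpy as np

# Uniform cubic B-spline basis matrix (Eq. 36).
M = np.array([[-1,  3, -3, 1],
              [ 3, -6,  3, 0],
              [-3,  0,  3, 0],
              [ 1,  4,  1, 0]], dtype=float)

def bspline_patch(P, u, w):
    """Surface value at (u, w) from a 4x4 control-point window (Eq. 35)."""
    U = np.array([u**3, u**2, u, 1.0])
    W = np.array([w**3, w**2, w, 1.0])
    return (1.0 / 36.0) * (U @ M @ P @ M.T @ W)
```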
Uncertain VDT Maps
hv = ∫Ω [((DTx ∂DTx/∂x + DTy ∂DTy/∂x)² + (DTx ∂DTx/∂y + DTy ∂DTy/∂y)²) / (DT²x + DT²y) − 1]² dV + ∫∂Ω (DT²x + DT²y) dS  (38)

Optimize for the control points Pij using hv
Neural networks may also be candidates for the approximation function
Uncertain VDT Maps
Integrating uncertainty
Each point in a point cloud Xo is

Xoi = [xoi ; yoi] = [xr + ri cos(θi − φr) ; yr + ri sin(θi − φr)]  (39)

x̂R = (xr, yr, φr)ᵀ : robot pose
cov(x̂R) : robot pose uncertainty
Srθ = {(ri, θi)} : laser range-bearing observations
σr : laser range measurement noise
Uncertain VDT Maps
Uncertainty of each point in the point cloud:

cov(Xoi) = (∂Xoi/∂x̂R) · cov(x̂R) · (∂Xoi/∂x̂R)ᵀ + (∂Xoi/∂r) · σ²r · (∂Xoi/∂r)ᵀ  (40)
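Eqs. (39)-(40) for a single observation, written out with the Jacobians expanded analytically (the sign convention θi − φr follows Eq. 39; names are illustrative):

```python
import numpy as np

def point_with_cov(x_hat, cov_x, r, theta, sigma_r):
    """Project one range-bearing measurement and propagate its uncertainty."""
    xr, yr, phi = x_hat
    c, s = np.cos(theta - phi), np.sin(theta - phi)
    p = np.array([xr + r * c, yr + r * s])               # Eq. 39
    J_pose = np.array([[1.0, 0.0,  r * s],               # d(p)/d(pose)
                       [0.0, 1.0, -r * c]])
    J_r = np.array([[c], [s]])                           # d(p)/d(r)
    cov_p = J_pose @ cov_x @ J_pose.T + sigma_r**2 * (J_r @ J_r.T)  # Eq. 40
    return p, cov_p
```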
Uncertain VDT Maps
Weights for observations based on uncertainty, ρoi, calculated as:

ρoi = 1 / [(∂(DT²x + DT²y)/∂Xoi) · cov(Xoi) · (∂(DT²x + DT²y)/∂Xoi)ᵀ]  (41)

Weights for the Eikonal equation, ρei, are empirically determined
Uncertain VDT Maps
Objective function with weights incorporated:

hv∗ = ∫Ω ρei [((DTx ∂DTx/∂x + DTy ∂DTy/∂x)² + (DTx ∂DTx/∂y + DTy ∂DTy/∂y)²) / (DT²x + DT²y) − 1]² dV + ∫∂Ω ρoi (DT²x + DT²y) dS  (42)
Uncertain VDT Maps
Estimating map uncertainty
P∗x,ij : optimized values of the control points for DTx
P∗y,ij : optimized values of the control points for DTy

cov(P∗x,ij) = Jx · cov(Xoi) · Jxᵀ  (43)
cov(P∗y,ij) = Jy · cov(Xoi) · Jyᵀ  (44)
Jx = −H⁻¹ · ∂²hv∗/(∂P∗x,ij ∂rk)  (45)

(implicit function theorem)
Uncertain VDT Maps
Experiment 1
Simulated robot equipped with a 180° FOV LiDAR
Figure: Environment and robot poses
Uncertain VDT Maps
Measurement | Noise Parameter
Position estimates | σx = 0.01 m, σy = 0.01 m
Orientation estimates | σφ = 0.005 rad
Range measurements | σr = 0.02 m
Table: Noise parameters used in Experiment 1
Uncertain VDT Maps
Figure: Noisy point cloud
Uncertain VDT Maps
Figure: Optimal VDT ((a) DTx, (b) DTy) at 0.2 m control point resolution
Uncertain VDT Maps
Figure: Experiment 1 : Zero contour of the approximated VDT,
observations and true boundary
Uncertain VDT Maps
Figure: Error of the VDT approximation bounded by uncertainty
Uncertain VDT Maps
Experiment 2
Turtlebot in Gazebo equipped with a 270° FOV LiDAR
Navigating through the Willow Garage office

Measurement | Noise Parameter
Position estimates | σx = 0.05 m, σy = 0.05 m
Orientation estimates | σφ = 0.005 rad
Range measurements | σr = 0.02 m
Table: Noise parameters for Experiment 2
Uncertain VDT Maps
Figure: A corridor in the Gazebo Willow Garage environment
Uncertain VDT Maps
Figure: Point cloud map
Uncertain VDT Maps
Figure: OGM of the office space corridor derived from the VDT
Uncertain VDT Maps
Demonstrated the use of the Eikonal equation and boundary conditions to build a map
The best choice of approximation function remains an open question
Practical use needs improvements such as handling the cut locus efficiently and reducing the computational burden using submaps
Content
Background
List of Contributions
Contribution 1 → Contribution 6
Conclusions
List of Publications
Conclusion
Vector Distance Transform:
Continuous
Implicitly captures the geometry
Behavior along the boundary makes it preferable to the UDT
Preferable over the SDT for unstructured environments
Localization using optimization or EKF with different types of sensors
Uncertainty of mapping data can be embedded
Content
Background
List of Contributions
Contribution 1 → Contribution 6
Conclusions
List of Publications
Publication Matrix
The six contributions (VDT maps, sensor model, optimization localization, EKF localization, active localization, uncertain VDT maps) map onto the publications at ACRA 2017, ICIIS 2017, AIM 2019, CASE 2019 and ICRA 2020.
List of Publications
Main Publications
Arukgoda, J., Ranasinghe, R., Dantanarayana, L.,
Dissanayake, G. and Furukawa, T., 2017. Vector Distance
Function Based Map Representation for Robot
Localisation. In The Australian Conference on Robotics
and Automation (ACRA), (Vol. 12). ARAA. ISBN:
978-0-9807404-8-6 ISSN: 1448-2053
Ranasinghe, R., Dissanayake, G., Furukawa, T.,
Arukgoda, J. and Dantanarayana, L., 2017, December.
Environment representation for mobile robot localisation.
In 2017 IEEE International Conference on Industrial and
Information Systems (ICIIS), (pp. 1-6). IEEE. doi:
10.1109/ICIINFS.2017.8300384
List of Publications
Main Publications ctd...
Arukgoda, J., Ranasinghe, R. and Dissanayake, G., 2019,
July. Robot Localisation in 3D Environments Using
Sparse Range Measurements. In 2019 IEEE/ASME
International Conference on Advanced Intelligent
Mechatronics (AIM), (pp. 551-558). IEEE. doi:
10.1109/AIM.2019.8868466
Arukgoda, J., Ranasinghe, R. and Dissanayake, G., 2019,
August. Representation of Uncertain Occupancy Maps
with High Level Feature Vectors. In 2019 IEEE 15th
International Conference on Automation Science and
Engineering (CASE), (pp. 1035-1041). IEEE. doi:
10.1109/COASE.2019.8842965
List of Publications
Under Review
Jayasuriya, M., Arukgoda, J., Ranasinghe, R. and
Dissanayake, G., 2020, May. Localising PMDs through
CNN Based Perception of Urban Streets. Under review in
2020 International Conference on Robotics and
Automation (ICRA).
List of Publications
Other Publications During Candidature
Perera, A., Arukgoda, J., Ranasinghe, R. and
Dissanayake, G., 2017, September. Localization System
for Carers to Track Elderly People in Visits to a Crowded
Shopping Mall. In 2017 International Conference on
Indoor Positioning and Indoor Navigation (IPIN), (pp.
1-8). IEEE. doi: 10.1109/IPIN.2017.8115936
Unicomb, J., Dantanarayana, L., Arukgoda, J., Ranasinghe, R., Dissanayake, G. and Furukawa, T., 2017, September. Distance function based 6DOF localization for unmanned aerial vehicles in GPS denied environments. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), (pp. 5292-5297). IEEE. doi: 10.1109/IROS.2017.8206421
List of Publications
Other Publications During Candidature ctd...
Hodges, J., Attia, T., Arukgoda, J., Kang, C., Cowden,
M., Doan, L., Ranasinghe, R., Abdelatty, K., Dissanayake,
G. and Furukawa, T., 2019. Multistage Bayesian
Autonomy for High-precision Operation in a Large Field.
Journal of Field Robotics, (Vol 36 (1)), (pp.183-203).
doi: 10.1002/rob.21829