International Journal of Artificial Intelligence & Applications (IJAIA), Vol. 5, No. 5, September 2014 
COMPLEMENTARY VISION BASED DATA FUSION 
FOR ROBUST POSITIONING AND DIRECTED FLIGHT 
OF AN AUTONOMOUS QUADROCOPTER 
Nils Gageik1, Eric Reinthal2, Paul Benz3 and Sergio Montenegro4 
Chair of Computer Science 8, University of Würzburg, Germany 
ABSTRACT 
The present paper describes an improved 4 DOF (x/y/z/yaw) vision based positioning solution for fully 6 
DOF autonomous UAVs, optimised in terms of computation and development costs as well as robustness 
and performance. The positioning system combines Fourier transform-based image registration (Fourier 
Tracking) and differential optical flow computation to overcome the drawbacks of a single approach. The 
first method is capable of recognizing movement in four degrees of freedom under variable lighting 
conditions, but suffers from a low sample rate and high computational costs. Differential optical flow 
computation, on the other hand, enables a very high sample rate to gain control robustness. This method, 
however, is limited to translational movement only and performs poorly in bad lighting conditions. A reliable 
positioning system for autonomous flights with free heading is obtained by fusing both techniques. 
Although the vision system can measure the variable altitude during flight, infrared and ultrasonic sensors 
are used for robustness. This work is part of the AQopterI8 project, which aims to develop an autonomous 
flying quadrocopter for indoor applications and makes autonomous directed flight possible. 
KEYWORDS 
Autonomous UAV, Quadrocopter, Quadrotor, Vision Based, Positioning, Data Fusion, Directed Flight 
1. INTRODUCTION 
Although solutions for vision based positioning to enable autonomous flight of UAVs (Unmanned Aerial 
Vehicles) exist nowadays, these solutions suffer from different inherent drawbacks. The main drawbacks 
are a high computational burden [1, 2], a low sample rate [3], limitations to translational movement 
[11, 12], and poor robustness to varying lighting conditions [4]. Despite the considerable rise of 
computing power during the last decade, computational burden is still a concern for autonomous UAVs. 
High computing capacity results in a significant increase in weight as well as power consumption of a 
UAV. Flight time is thereby reduced by a noticeable factor. This has led to several scientific works on 
UAVs where the computation for the positioning system is done on external hardware. These approaches, 
however, break the requirements for a fully autonomous system. Our proposed positioning system focuses 
on computational efficiency with acceptable hardware load in order to deploy relatively lightweight 
on-board hardware with low power consumption. The system therefore provides fully autonomous 
positioning without the need for external systems. Hence fully autonomous directed flight becomes 
possible, which is required for passing through narrow openings such as doors and windows with an 
on-board collision avoidance system having a preferential direction. This is an alternative to laser 
based solutions [16, 17]. 
DOI : 10.5121/ijaia.2014.5501
The following study deals with the concept, implementation and evaluation of merging two vision 
based methods for positioning of an autonomous UAV. The next two sections describe the two 
methods used here, explaining each on its own. The fourth section describes the overall concept 
of merging the two systems with a focus on the data fusion and is followed by a section about its 
implementation. The paper ends with sections on evaluation and discussion. 
2. FOURIER TRACKING (METHOD 1) 
2.1. Overview 
Fourier-transform based image registration (short: Fourier Tracking) is the process of determining 
the geometric correspondence of two images. With a camera mounted on the UAV and pointing 
vertically to the ground, the motion of the UAV can be computed by continuous registration of 
succeeding images. Under the constraint that the camera keeps facing the surface plane, 
succeeding images are related by affine transformations with respect to translation, rotation and scaling. By 
registering the images, the parameters of the transformation are gained, which permit conclusions about 
the movement of the UAV. Image registration is done in Fourier space to reduce computational 
costs and to improve robustness against changing light conditions. 
Figure 1. Fourier-transform based image registration 
Figure 1 pictures the simplified concept of the implemented Fourier Tracking. To gain robustness 
under variable lighting conditions, a histogram equalization is applied to each image before 
processing. This gives the positioning system the ability to function under bad lighting 
conditions as well as with surfaces having very low contrast. 
Next, a Log-Polar Transformation (LPT) and a phase correlation are executed to obtain the yaw angle 
and the scaling factor. The scaling factor, which corresponds to the height or change in height, 
together with the yaw angle is required for the affine transformation, which transforms the first 
image i0 to i0', so that it fits to the second image i1. Now both images correspond in height and 
yaw angle to one another, and the second phase correlation can determine the translation between both 
images. This translation corresponds to the position change of the quadrocopter. 
Each LPT block consists of an FT (Fourier Transform), a Log-Polar Transformation (section 2.3) 
and a high-pass filter. The coordinate change to logarithmic polar coordinates induces distortion 
in the transformed image since the transform is non-uniform. To prevent the phase correlation from 
performing poorly, a high-pass filter is applied after the transform [6]. 
Each phase correlation (section 2.2) requires three Fourier transforms (i0, i1, inverse). 
2.2. Fourier Transform and Cross-Power-Spectrum (CPS) 
The concept uses extended phase correlation [5] for registering images which are translated, 
rotated and scaled with respect to each other. 
Let $i(x, y)$ be an image in the spatial domain and $I(u, v)$ the corresponding image in the Fourier 
domain, thus 

$$\mathcal{F}\{i(x, y)\} = I(u, v) \qquad (2.1)$$

Two images $i_1$ and $i_2$, which are translated by $(\Delta x, \Delta y)$, have the following dependence 
(neglecting border effects): 

$$i_2(x, y) = i_1(x + \Delta x,\; y + \Delta y) \qquad (2.2)$$

Due to the shift theorem of the Fourier transform, this dependence can be written in Fourier space as: 

$$I_2(u, v) = e^{\,j 2\pi (u \Delta x + v \Delta y)} \, I_1(u, v) \qquad (2.3)$$

The phase correlation allows this translation to be computed by using the cross-power-spectrum 
(CPS) of the two images. The CPS is defined as: 

$$CPS(u, v) = \frac{I_1(u, v)\, I_2^*(u, v)}{\left| I_1(u, v)\, I_2^*(u, v) \right|} \qquad (2.4)$$

where $I_2^*(u, v)$ denotes the complex conjugate of $I_2(u, v)$. By inserting formula (2.3) in (2.4), 
one can show that the CPS corresponds to the translational difference of the two images: 

$$CPS(u, v) = e^{-j 2\pi (u \Delta x + v \Delta y)} \qquad (2.5)$$

The CPS is technically the convolution of the two images in Fourier space. By taking the inverse 
Fourier transform of the CPS, the translation $(\Delta x, \Delta y)$ is obtained, because the exponential term 
of equation (2.3) corresponds to the delta function: 

$$\mathcal{F}^{-1}\{CPS\} = \mathcal{F}^{-1}\{e^{-j 2\pi (u \Delta x + v \Delta y)}\} = \delta(x - \Delta x,\; y - \Delta y) \qquad (2.6)$$

For two overlapping images, equation (2.2) is only satisfied in the overlapping area. The 
non-corresponding areas add distortion to the CPS, which therefore differs from an exact delta impulse. 
The real CPS describes a correlation surface of the images, with its peak at the delta impulse 
from equation (2.6). This peak gives the translation between $i_1$ and $i_2$: 

$$(\Delta x, \Delta y) = \arg\max_{x, y}\; \mathcal{F}^{-1}\{CPS\} \qquad (2.7)$$

With this concept only horizontally translated images can be registered. A preceding rotation and 
scaling compensation is therefore required. 
2.3. Polar-Coordinates 
By applying a coordinate change to logarithmic polar coordinates $(\log \rho, \theta)$, rotation and scaling 
can be computed using phase correlation as well. This concept is derived from Reddy and 
Chatterji [6]. 

Let $i_2(x, y)$ be rotated by $\theta_0$ and translated by $(\Delta x, \Delta y)$ with respect to $i_1(x, y)$. Their variables are related by: 

$$i_2(x, y) = i_1(x \cos\theta_0 + y \sin\theta_0 - \Delta x,\; -x \sin\theta_0 + y \cos\theta_0 - \Delta y) \qquad (2.8)$$

Applying the Fourier transform, we get: 

$$I_2(u, v) = e^{-j 2\pi (u \Delta x + v \Delta y)} \, I_1(u', v') \qquad (2.9)$$

Due to the rotation property of the Fourier transform, the variable relation is given by the 
following equation: 

$$u' = u \cos\theta_0 + v \sin\theta_0, \qquad v' = -u \sin\theta_0 + v \cos\theta_0 \qquad (2.10)$$

The magnitudes of the Fourier transforms differ only by rotation with a rotation centre in the 
middle of the images. This corresponds to a translation in polar coordinates, since rotation around 
the image centre only affects the $\theta$-coordinate. Applying a coordinate change to polar 
coordinates, the magnitudes $M_1$ and $M_2$ of the Fourier transforms are related by: 

$$M_2(\rho, \theta) = M_1(\rho,\; \theta - \theta_0) \qquad (2.11)$$

If $i_2$ is also scaled with respect to $i_1$ by $\sigma$, which only affects the $\rho$-coordinate in polar coordinates, equation 
(2.11) extends to: 

$$M_2(\rho, \theta) = \frac{1}{\sigma}\, M_1\!\left(\frac{\rho}{\sigma},\; \theta - \theta_0\right) \qquad (2.12)$$

Assuming a minimal scale change between two images, the factor $\frac{1}{\sigma}$ may be neglected. 
Switching to logarithmic polar coordinates gives: 

$$M_2(\log\rho, \theta) = \frac{1}{\sigma}\, M_1(\log\rho - \log\sigma,\; \theta - \theta_0) \qquad (2.13)$$

By substituting $\xi = \log\rho$ and $d = \log\sigma$, we obtain the following formula: 

$$M_2(\xi, \theta) = M_1(\xi - d,\; \theta - \theta_0) \qquad (2.14)$$

These translations $d$ and $\theta_0$ can be obtained by applying phase correlation, where $\sigma = e^{d}$ 
corresponds to the scaling and $\theta_0$ to the rotation of the two input images. Once rotation and scale 
change are computed, the second image is scaled and rotated by the computed values. Thus, the 
transformed image and the first input image differ merely in translation. This translation is 
obtained by applying another phase correlation. 
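The rotation-and-scale recovery of equations (2.11)-(2.14) can be sketched as follows. The sampling grid, nearest-neighbour interpolation and sign conventions are illustrative assumptions; the high-pass filter and the Hanning window described in the text are omitted for brevity.

```python
import numpy as np

def log_polar_magnitude(img, n_rho=128, n_theta=180):
    """Resample the centred FFT magnitude of `img` onto a (log rho, theta) grid
    with nearest-neighbour sampling, as needed for eqs. (2.11)-(2.14)."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    cy, cx = mag.shape[0] / 2.0, mag.shape[1] / 2.0
    max_r = min(cx, cy)
    log_r = np.linspace(0.0, np.log(max_r), n_rho)
    theta = np.linspace(0.0, np.pi, n_theta, endpoint=False)  # half plane suffices (symmetry)
    r = np.exp(log_r)[:, None]
    ys = (cy + r * np.sin(theta)[None, :]).astype(int).clip(0, mag.shape[0] - 1)
    xs = (cx + r * np.cos(theta)[None, :]).astype(int).clip(0, mag.shape[1] - 1)
    return mag[ys, xs]                                        # shape (n_rho, n_theta)

def rotation_and_scale(i1, i2, n_rho=128, n_theta=180):
    """Recover rotation (deg) and scale between i1 and i2 via phase correlation
    on the log-polar FFT magnitudes (eq. 2.14)."""
    m1 = log_polar_magnitude(i1, n_rho, n_theta)
    m2 = log_polar_magnitude(i2, n_rho, n_theta)
    cps = np.fft.fft2(m1) * np.conj(np.fft.fft2(m2))
    cps /= np.abs(cps) + 1e-12
    corr = np.real(np.fft.ifft2(cps))
    d_idx, t_idx = np.unravel_index(np.argmax(corr), corr.shape)
    if d_idx > n_rho // 2:
        d_idx -= n_rho
    if t_idx > n_theta // 2:
        t_idx -= n_theta
    max_r = min(i1.shape) / 2.0
    scale = np.exp(d_idx * np.log(max_r) / (n_rho - 1))   # sigma = e^d (eq. 2.14); sign depends on registration direction
    rotation = t_idx * 180.0 / n_theta                    # theta_0 in degrees, same caveat
    return rotation, scale
```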
2.4. Implementation details 
Once two images are registered, their translation, scaling and rotation are transformed into the 
coordinate system of the UAV, which leads to the change in position. Let $i_0$ be taken at time $t_0$ 
and $i_1$ slightly thereafter at time $t_1$. The position of the UAV in the surface plane at time $t_0$ equals 
$(x_0, y_0)$, the altitude is $h_0$ and the heading is given by $\psi_0$. 

The rotation of the images corresponds directly to the change in heading of the UAV, leading to 

$$\psi(t_1) = \psi_0 + \theta_0 \qquad (2.15)$$

The scaling of the images represents the relative change in altitude: 

$$h(t_1) = \frac{h_0}{\sigma} \qquad (2.16)$$

The translation of the image registration is obtained in pixels. Therefore, a calibration factor $\alpha$ has 
to be determined to convert the translation into meters. This factor $\alpha$ is constant for a given camera 
and resolution. The ratio of pixels to meters is also affected by the altitude of the UAV, since the field of 
view of the camera changes linearly with the distance to the camera scene. Thus the translation in 
meters is given by: 

$$(\Delta x_1, \Delta y_1) = \alpha \, h_0 \, (\Delta x, \Delta y) \qquad (2.17)$$

This translation is aligned in camera coordinates and has to be rotated into UAV coordinates with 
the angle of heading at time $t_0$: 

$$\begin{pmatrix} \Delta x(t_1) \\ \Delta y(t_1) \end{pmatrix} = \begin{pmatrix} \cos\psi_0 & \sin\psi_0 \\ -\sin\psi_0 & \cos\psi_0 \end{pmatrix} \begin{pmatrix} \Delta x_1 \\ \Delta y_1 \end{pmatrix} \qquad (2.18)$$
By continuous registration of images, the movement in the surface plane, the altitude and the 
heading of the UAV can be measured iteratively. Each registration with the given 
concept (Fig. 1) requires eight Fourier transforms, representing most of the computational load of the 
positioning system. To reduce computational costs, the Fourier transform is implemented using the Fast 
Fourier Transform. For a square image of $N \times N$ pixels, the transform can be computed 
in $O(N^2 \log N)$. [7] 
In order to reduce the computation time further, the symmetry of the Fourier transform is exploited. 
Since the Fourier transform is designed for complex input, real input sequences such as images lead 
to complex-conjugate symmetry in Fourier space. Therefore it is sufficient to compute merely 
the upper half of the CPS. To gain further precision, the peak location of the CPS is computed 
with a weighted average around the absolute peak. This gives sub-pixel accuracy for the 
translation, which also improves the robustness due to higher accuracy in the rotation-and-scale 
transform. 
Since images are finite, the Fourier transform induces border effects leading to miscalculations in the 
CPS. Applying a Hanning window before each transform minimizes these effects by blackening 
out the edge regions of the images. This increases the reliability of the phase correlation significantly. 
[8] 
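The windowing and the weighted-average peak refinement can be sketched as follows; the 3x3 neighbourhood is an assumption, as the paper does not state the exact window used for the average.

```python
import numpy as np

def hanning_2d(shape):
    """Separable 2-D Hanning window used to suppress border effects."""
    return np.outer(np.hanning(shape[0]), np.hanning(shape[1]))

def subpixel_peak(corr, py, px):
    """Refine an integer peak (py, px) of the correlation surface with a
    weighted average over its 3x3 neighbourhood (sub-pixel accuracy)."""
    ys, xs = np.mgrid[py - 1:py + 2, px - 1:px + 2]
    w = corr[ys % corr.shape[0], xs % corr.shape[1]]  # wrap at borders
    w = np.clip(w, 0, None)                           # ignore negative correlation values
    total = w.sum() + 1e-12
    return (w * ys).sum() / total, (w * xs).sum() / total
```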
3. DIFFERENTIAL OPTICAL FLOW COMPUTATION (METHOD 2) 

3.1. Overview 

A common method for differential optical flow computation is the Lucas-Kanade method [9]. It 
can be realized with the algorithm of Srinivasan [10]. For simplification it is presumed that the 
change in the picture is only a translation of at most one pixel per frame. Under these 
constraints, the searched optical flow values $u$ and $v$ in the x- and y-axis between two pictures can 
be determined by equation (3.1): 

$$\begin{pmatrix} u \\ v \end{pmatrix} = \begin{pmatrix} \sum P_x^2 & \sum P_x P_y \\ \sum P_x P_y & \sum P_y^2 \end{pmatrix}^{-1} \begin{pmatrix} \sum P_t P_x \\ \sum P_t P_y \end{pmatrix} \qquad (3.1)$$

with $P_x(i,j)$, $P_y(i,j)$ and $P_t(i,j)$ being the partial intensity derivatives of point $P(i,j)$ in the x- or y-
axis or after time t, respectively: 

$$P_x(i,j) = P_2 - P_1, \qquad P_y(i,j) = P_4 - P_3, \qquad P_t(i,j) = P_t - P_{t-1} \qquad (3.2)$$

The partial intensity derivatives are computed from the neighbouring pixels to the left ($P_1$), right 
($P_2$), up ($P_4$) and down ($P_3$) and the previous picture ($P_{t-1}$) (Fig. 2). 

Figure 2. Centre pixel $P_t$ or $P_{t-1}$ in yellow and neighbours in red [11] 

3.2. Implementation details 

In this work the ADNS 3080, an optical flow or optical mouse sensor, is used together with a 
proper lens [12]. This sensor works with an internal sample rate of up to 6400 fps and provides 
good results under good lighting conditions [4], but it cannot handle rotations. Furthermore the 
sensor fails if there is too much light (as in the case of outdoor use) or too little light. To overcome these 
drawbacks, the Fourier Tracking has been added to the system. 

Since there is no information in the data sheet about which method is implemented on this sensor, 
this is not absolutely clear. But because of its characteristics and drawbacks, the high frame rate, the 
mentioned problems and the limitation to translations, it is considered to be either method 2 or a 
method with similar characteristics. 
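For illustration, a least-squares estimate of a single global flow vector following equations (3.1) and (3.2) could look like the sketch below. The axis and sign conventions are assumptions, the frame is assumed to contain enough texture for the matrix to be invertible, and the ADNS 3080 performs its computation in hardware, so this is not the sensor's actual implementation.

```python
import numpy as np

def differential_flow(prev, curr):
    """Least-squares estimate of a global (u, v) flow of at most ~1 px between
    two grayscale frames, following eqs. (3.1)-(3.2)."""
    p = curr.astype(float)
    px = np.zeros_like(p)
    py = np.zeros_like(p)
    pt = np.zeros_like(p)
    px[1:-1, 1:-1] = p[1:-1, 2:] - p[1:-1, :-2]            # right (P2) - left (P1)
    py[1:-1, 1:-1] = p[2:, 1:-1] - p[:-2, 1:-1]            # vertical neighbour difference (P4 - P3), row direction assumed
    pt[1:-1, 1:-1] = p[1:-1, 1:-1] - prev.astype(float)[1:-1, 1:-1]
    A = np.array([[np.sum(px * px), np.sum(px * py)],
                  [np.sum(px * py), np.sum(py * py)]])
    b = np.array([np.sum(pt * px), np.sum(pt * py)])
    u, v = np.linalg.solve(A, b)                           # eq. (3.1)
    return u, v
```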
4. CONCEPT 
4.1. Overview 
The main motivation for this work was the limitation of method 2 (optical flow) to translations. 
Any yaw rotation is interpreted as a translation and therefore cannot be executed without 
accumulating a high position error. After any yaw rotation with position hold, the system would 
hold position at a different location. Therefore method 1 (Fourier Tracking) was designed in a 
way that it can handle yaw rotations, but it suffers from a high computational burden. 
Since the computation power on board the quadrocopter is very limited and is also required for 
other applications like mapping, object and obstacle detection, and since method 2 is sufficient in 
most cases, method 1 should only be activated for error correction after rotating. This concept idea 
is called rotation compensation (4.2). 
Besides this, method 2 also fails in bad lighting conditions. In this case method 1 is also activated, 
because it can handle bad lighting conditions. The data of both methods are then merged, which is 
called dynamic complementary data fusion (4.3). It is called dynamic, because method 1 is 
automatically activated if method 2 produces bad results. 
Figure 3. Concept 
The overall concept of the system is illustrated in Figure 3. Both sensor systems send their data, the 
last measured position P and the quality Q of the last measurement, to the complementary data 
fusion, where these data are merged. Depending on the quality data QOF of the optical flow system 
or the rotation state, the execution controller then activates or deactivates the Fourier Tracking 
and the dynamic complementary data fusion. The rotation state can be either rotating or not 
rotating. 
The Fourier Tracking uses a space-fixed frame, while the optical flow and the quadrocopter with 
its position control can only operate in a body-fixed frame. That is why the results have to be 
transformed from one frame to the other. Furthermore, both methods are performed on different 
processors, because the Fourier Tracking needs high computation power, while the ADNS uses SPI 
and is therefore best connected to the microcontroller, which also performs the control loop of the 
system and drives the motors. This means the results of the Fourier Tracking also have to be sent 
to the microcontroller. To relieve the microcontroller, the data fusion is performed on the 
computer with high computation power. 
4.2. Rotation Compensation 
The idea of the rotation compensation is to start the Fourier Tracking whenever the quadrocopter 
is going to perform a yaw rotation and stop the Fourier Tracking after the rotation is finished. 
Then the position error, which occurs while rotating, is corrected. In this case the Data Fusion 
simply discards the erroneous optical flow measurements and uses only the measurements from 
the Fourier Tracking. 
The advantage of this method is that it is easy to implement and enables the quadrocopter to 
perform yaw rotations and correct errors after rotating. It can be improved by updating the 
position during the rotation, but then the transformation as well as the correction has to be 
executed more often. This means more communication and work load for the microcontroller, and 
it could also lead to instability of the control because of jumps in the actual value caused by 
position corrections. 
4.3. Dynamic Complementary Data Fusion 
The idea of the dynamic complementary data fusion is to activate and use the Fourier Tracking 
only when required and to incorporate both measurements in a complementary way depending on 
their quality (formula 4.1). To ensure that erroneous measurements from the optical flow sensor are not 
used during rotation, the weight $w$ is set to zero in this case. 

$$P_{new} = P_{old} + w \cdot \Delta P_{OF} + (1 - w) \cdot \Delta P_{FT} \qquad (4.1)$$

Three states can be distinguished depending on the quality $Q_{OF}$ of the optical flow. In the first state the 
quality $Q_{OF}$ is so low that only the Fourier Tracking is used, so $w$ is set to zero. In the second state 
the quality $Q_{OF}$ is so high that the Fourier Tracking is deactivated, so $w$ is set to one. In the third 
state $w$ is computed from the ratio of the previously normalized and scaled qualities (formula 4.2): 

$$w = \frac{\tilde{Q}_{OF}}{\tilde{Q}_{OF} + \tilde{Q}_{FT}} \qquad (4.2)$$
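A minimal sketch of this fusion logic is given below; the quality thresholds and the normalisation of the qualities are illustrative assumptions, not the values used on the quadrocopter.

```python
def fuse_position(p_old, dp_of, dp_ft, q_of, q_ft, rotating,
                  q_low=0.2, q_high=0.8):
    """Dynamic complementary fusion of optical-flow and Fourier-Tracking
    position increments (eqs. 4.1-4.2). Qualities are assumed to be
    normalized to [0, 1]; q_low/q_high are hypothetical thresholds."""
    if rotating or q_of <= q_low:        # state 1: trust Fourier Tracking only
        w = 0.0
    elif q_of >= q_high:                 # state 2: Fourier Tracking deactivated
        w = 1.0
    else:                                # state 3: quality-weighted blend
        w = q_of / (q_of + q_ft)         # eq. 4.2
    return p_old + w * dp_of + (1.0 - w) * dp_ft   # eq. 4.1
```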
4.4. Directed Flight 
Directed flight means that the nose of the quadrocopter is always directed into the flight direction, as 
it is known from airplanes. This becomes necessary if the quadrocopter is no longer symmetrical, 
but has a preferential direction, because of a fixed-mounted PMD camera or a stereo vision system for 
collision avoidance. With this configuration, to fly through a narrow opening like a window or a 
door, the flight direction, the preferential direction and the yaw set value of the quadrocopter have 
to be the same. 
To realize this, the quadrocopter can simply be rotated about the yaw axis. For a directed flight from 
the space-fixed position $P_1 = (x_1, y_1)$ to $P_2 = (x_2, y_2)$, the yaw set value $\psi$ can be computed using 
formula (4.3): 

$$\psi = \arccos\!\left(\frac{x_2 - x_1}{\sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}}\right) \cdot \operatorname{sgn}(y_2 - y_1) \qquad (4.3)$$
5. IMPLEMENTATION 
5.1. Hardware Design 
The overall hardware design of the full system is shown in Figure 4. The red dashed line 
separates the components connected to the AVR (upper part) from those connected to the LP-180 
Pico-ITX board. The minimum components required for this study are outlined in red. 
The system uses the IMU3000 for orientation computation, while the MinIMU-9 v2 serves as a 
backup. Two infrared sensors, one ultrasonic sensor, a pressure sensor and the IMU are fused for the 
computation of the height over ground [13]. The 6 DOF (degrees of freedom) control of the autonomous 
quadrocopter is executed on the AVR32 UC3A1512 microcontroller with a 10 ms sample time [4]. 
Method 1 is implemented on the LP-180 and the C920 webcam from Logitech is used as the Fourier 
Tracking sensor. The C270 webcam has also been tested for this application, but showed very 
disappointing performance under dynamic movements. 
The second position sensor is the ADNS 3080. The dynamic complementary data fusion for the 
position computation of both methods and the navigation are executed on the LP-180. The fused 
position, if Fourier Tracking is active, and the position set value are sent to the UC3A1512 via 
USART/RS232. This also applies to the set value for yaw. 
Figure 4. Hardware Design
In this work the object detection and the obstacle detection are not used, thus these sensors (except 
the C270 webcams) are not connected to the quadrocopter evaluated for this paper. The other 
sensors are attached to the system, though they are not used here. 
5.2. Software Design 
Figure 5 shows the simplified design of the software on the LP-180 for directed flight. 
The position POF measured by the optical flow sensor, its quality QOF and the current orientation q 
as a quaternion are sent from the AVR to the LP-180 every 10 ms. This information is processed by 
the Pose Receiver method, which then executes the Execution Controller. 
Figure 5. Software Design 
The Execution Controller activates or deactivates the Fourier Tracking depending on the quality 
QOF. It also updates the Complementary Data Fusion by incorporating the new position POF every 
10 ms. Subsequently the Waypoint Commanding function is executed, which realizes the directed 
navigation. While doing this, the quadrocopter always first yaws to face the next waypoint and 
then approaches it. Hence, it rotates, translates, rotates, translates, and so on, until the last 
waypoint has been reached. 
The Waypoint Commanding function consists of four states and a Boolean flag rotation, 
indicating whether a translation or a rotation has to be performed next (a simplified sketch follows this list): 
• Navigation Off: 
This is the initial state and directed navigation is not possible. The rotation flag is set 
to true. User commands are necessary to set up the waypoint list and to switch to the 
Navigation On state. 
• Navigation On: 
In this state navigation is generally possible. If the last waypoint has not been reached, 
the next waypoint or rotation is processed. This means the next state of Waypoint 
Commanding is Waypoint Control or Rotation Control, depending on the rotation flag. 
The according set values are also sent to the AVR, which changes the set value of the 
according controller. In case of a rotation a signal is also sent to the Execution Controller 
to activate the Fourier Tracking. 
• Waypoint Control: 
The waypoint control state checks whether the current waypoint has been reached or not. If yes, 
it switches back to the Navigation On state. If a waypoint has not been reached after 3 seconds, 
the set values are sent again to the AVR. This is important in case a command via the 
USART/RS232 communication link gets lost. Otherwise the LP-180 would wait (forever) for the 
AVR to reach a certain waypoint, while the AVR would not be able to react properly: it would 
control to a different position and the navigation would stall. This procedure is also much simpler 
and more robust than using acknowledgements. For safety all commands are sent twice and 
together with a checksum, so that invalid commands can be discarded. 
• Rotation Control: 
By analogy to the Waypoint Control, the Rotation Control state checks whether the rotation is 
finished. If so, it sends a signal to the Execution Controller to deactivate the Fourier 
Tracking. Then the position is corrected using the Fourier Tracking and it is updated on 
the AVR. 
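The following sketch condenses this state machine; the interface (flags, return values, waypoint index) is illustrative, and the communication with the AVR and the Execution Controller is reduced to comments.

```python
from enum import Enum, auto

class WpState(Enum):
    NAVIGATION_OFF = auto()
    NAVIGATION_ON = auto()
    WAYPOINT_CONTROL = auto()
    ROTATION_CONTROL = auto()

def waypoint_commanding_step(state, rotation_flag, waypoints, wp_index,
                             at_waypoint, rotation_done):
    """One step of the Waypoint Commanding state machine described above.
    Returns the new (state, rotation_flag, wp_index)."""
    if state is WpState.NAVIGATION_OFF:
        return state, True, wp_index                         # wait for user commands
    if state is WpState.NAVIGATION_ON:
        if wp_index >= len(waypoints):
            return WpState.NAVIGATION_OFF, True, wp_index    # last waypoint reached
        # Rotate first, then translate towards the current waypoint
        # (set values would be sent to the AVR here).
        nxt = WpState.ROTATION_CONTROL if rotation_flag else WpState.WAYPOINT_CONTROL
        return nxt, rotation_flag, wp_index
    if state is WpState.WAYPOINT_CONTROL:
        if at_waypoint:
            return WpState.NAVIGATION_ON, True, wp_index + 1 # next: rotation
        return state, rotation_flag, wp_index                # set values resent periodically
    if state is WpState.ROTATION_CONTROL:
        if rotation_done:
            # Fourier Tracking deactivated, position corrected on the AVR.
            return WpState.NAVIGATION_ON, False, wp_index    # next: translation
        return state, rotation_flag, wp_index
```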
The Fourier Tracking is the implementation of method 1 (section 2). The Complementary Data 
Fusion is the implementation of the concept described in section 4.3. It is executed after 
every Fourier Tracking sample, but not during rotation. Position updates are sent to the AVR 
every 100 ms or 500 ms, depending on the quality QOF, because too many position 
updates disturb the controller due to jumps, feedback and delay issues. 
The software on the LP-180 uses Qt [14], OpenCV and FFT libraries and is implemented in C++. 
The software on the AVR is implemented using AVR32 Studio and C. 
6. EVALUATION 
The system has been extensively evaluated. In three different scenarios the system has been 
tracked with the Optical Tracking System OptiTrack from Natural Point (referenced as OTS) with 
five Flex 3 cameras [15] to get a reference position, while different autonomous flight scenarios 
have been executed. For position estimation and control, exclusively the mentioned on-board 
optical sensors with the described data fusion have been used (referenced as EST). In total 70 
experiments (trials) have been documented and the most representative results were selected to 
illustrate the behaviour here. 
Table 1. Final Position Errors for P1 and P2 

Position                    P1                              P2
Error          Ex [cm]  Ey [cm]  Fp [cm]     Ex [cm]  Ey [cm]  Fp [cm]
Trial 1          16.6      5.9    17.6        -15.4     -6.0    16.5
Trial 2         -13.1    -20.0    23.9         -6.1     -3.8     7.2
Trial 3         -22.0    -15.5    26.9        -26.3     -0.7    26.3
Trial 4          -8.4    -17.1    19.1        -10.8    -13.2    17.1
Trial 5         -11.6    -17.4    20.9        -11.9     -4.2    12.6
Mean             -7.7    -12.82   21.68       -14.1     -5.58   15.94
6.1. Single Rotation Compensation (Headed Flight) 
In this setup the quadrocopter is tracked while performing a directed flight consisting of a single 
waypoint. It flies from the initial position P0 = (0m, 0m) to P1 = (0m, 1.5m) or P2 = (1.5m, 1.5m). 
Each experiment was repeated five times with the same settings and all results showed very 
similar behaviour. The position was measured before the rotation and after 20s (P1) or 10s (P2), 
when the rotation and translation were already finished. The position error is the difference 
between the position change of OTS and EST and can be seen in Table 1. Ex and Ey are the errors 
in the x-axis and y-axis, respectively. Fp is the total 2D position error. From this data it can be 
concluded that the error of the positioning system after the mentioned manoeuvre is in the range of 
12-27cm with a mean of about 19cm. This is already quite high after such a short period of time, 
but many effects such as wrong scaling and misalignment of OTS and EST increase this error. 
Therefore more data has to be taken into account to come to a conclusion. 
As all experiments showed similar results, two experiments from both scenarios are illustrated in 
Figure 6 and Figure 7 to discuss this setup in more detail. The graphs show that after about 2s 
the rotation is finished and after about 5s the set point is reached. The green line shows that the 
Fourier Tracking is activated before the rotation and deactivated after the rotation is finished. 
Then the translation is executed and the system reaches the set position with a first overshoot of 
about 30-50cm. This overshoot is caused by the PID controller and its parameters, but also by 
the fact that the EST measures a smaller position than the OTS. Taking both graphs into 
account together, it can be derived that the overshoot is also affected by what happens on the 
other axis. 
The system has a 2° mechanical misalignment of the x-axis towards the ground level (over pitch). That 
is the reason why the system does not lift straight, but flies forward. This explains a 25cm 
position error along the x-axis after starting (Fig. 7). 
Figure 6. P1 Trial 5 (Fourier Tracking Off: FA = 0; Fourier Tracking On: FA = 100) 
Figure 7. P2 Trial 4 
6.2. Multiple Rotation Compensation (Headed Flight) 
In this setup the quadrocopter is tracked while performing a directed flight consisting of a set of 
waypoints. Compared to the first setup the quadrocopter now performs multiple iterative rotations 
and directed translations. As waypoint sets, a square (Fig. 8) comprising 4 waypoints and a 
Nikolaus house (Fig. 10) comprising 8 waypoints have been used. 
Again the position error and the time have been documented (Table 2). The position error is in the 
range of 12-34cm, with one exception of 55cm. The mean is about 28cm for the square and about 
31cm for the Nikolaus house. This means that with 4 or 8 times more waypoints, resulting in as many 
additional translations and rotations, the error increases, but not proportionally to the time or 
number of waypoints. 
Table 2. Final Position Errors for Square and Nikolaus House Flight 

Position                     Square                              Nikolaus House
Error          Ex [cm]  Ey [cm]  t [s]  Fp [cm]     Ex [cm]  Ey [cm]  t [s]  Fp [cm]
Trial 1          11.5    -21.8     25     24.7        -31.5    -11.2     75     33.4
Trial 2          11.6    -22.6     25     25.4         21.1     10.5     55     23.5
Trial 3         -13.6    -18.2     20     22.8         28.3    -11.5    140     30.6
Trial 4           8.7    -32.9     35     34.0         -7.4     10.5     70     12.9
Trial 5          18.8    -25.4     25     31.6        -53.3    -13.4     50     55.0
Mean              7.4    -24.8     26     27.7         -8.6     -3.0     78     31.0
The big error of 55cm in trial 5 of the Nikolaus house made a closer analysis necessary. The 
data indicate that the error occurred during a fast movement. The reason for this and a solution 
need to be found in a further investigation. 
Figure 8. Square Waypoints 
Figure 9. Directed Square Flight: Position (Trial 2) tracked with on-board estimation (EST) and 
OTS during the different phases start, rotation and translation 
Again two trials are illustrated to explain the behaviour of the system in more detail. Figure 9 
shows the tracked position (EST, OTS) of the quadrotor during the square flight. The tracks are 
coloured in three different colours for the start, rotation and translation phases. It can clearly be 
seen that during rotation a position error occurs, which is corrected afterwards, so that the system 
correctly approaches the next waypoint. 
Figure 11 shows the tracked position (OTS) during the flight of a Nikolaus house. Again the 
greatest deviations from the set values occur during orientation. Figure 12 shows the 
corresponding yaw angle. The system adopts the new yaw angle quickly (within a few seconds). 
The rotation speed is limited by the fact that the Fourier Tracking sensor and system 
cannot handle faster rotations. During translation the yaw angle is constant and therefore the 
Fourier Tracking can be deactivated. 
Figure 10. Nikolaus House Waypoints 
Figure 11. Directed Nikolaus House Flight: Position (Trial 2) tracked with OTS during the different 
phases start, rotation and translation 
Figure 12. Directed Nikolaus House: Corresponding Yaw Angle (Trial 2) 
Figure 13. Effect of a Sensor Failure on Position Error under critical lighting conditions 
6.3. Dynamic Weighted Data Fusion 
In the last setup the system has been evaluated under varying lighting conditions to investigate 
and demonstrate this effect and the system's behaviour on changes. Figure 14 illustrates the four 
different investigated lighting conditions. Good conditions are those which are the normal operating 
conditions for the ADNS-3080, with relatively much but not too much light. This is the light you 
normally have in a bright room. Bad conditions are those which make the ADNS fail sometimes, 
though there is still much light in the room. In our case we switched off the direct top light and 
there was only indirect light left. For the experiments with Very Bad Light conditions, we 
switched off most lights, but kept some lights on, so in the test section there was also a brightness 
transition from dark to bright. For Total Bad conditions, all lights were switched off 
and only some daylight came through the curtains. 
Figure 13 shows the effect of a sensor failure because of bad lighting conditions. After about 120s 
even the Fourier Tracking failed, resulting in a large position error. The relationship between 
position error and sensor failure because of low quality can be clearly seen here, though it is also 
shown that the system does not fail completely, but follows the further movement correctly with 
an approximately constant error. 
During all experiments of this setup the quadrotor is in position hold and again 5 trials have been 
documented for each lighting condition. During the experiments it could be seen that the 
quadrotor became more dynamic under worse lighting conditions. That is why, besides the position 
error, this time also the standard deviation after 60s is computed. 
Figure 14. Different Lighting Conditions. Top: Good (left), Bad (right); 
Bottom: Very Bad (left), Total Bad (right) 
Figure 15 summarizes the results of this setup (illustration of Tables 3 and 4). Figure 15 and Table 3 
prove that the position error is significantly higher under worse lighting conditions compared to 
good conditions. The position error under worse conditions is about 4 times higher. Rather 
unexpected is the result that the position error under bad conditions is higher than the error under 
very bad or total bad conditions. Under bad conditions the Fourier Tracking is often activated and 
deactivated and both positions are fused following formulas 4.1 and 4.2. It can be concluded that a 
problem with the data fusion still exists and that the parameters need to be tuned further. It might be 
that wrong ADNS sensor values are used or that an error is incorporated into the system because 
of a wrong synchronisation of both systems, as both sensors are running on different processors 
with a different sample time and delay before their data fusion. This problem becomes more 
complicated as the Fourier Tracking system dynamically activates and deactivates. These effects 
were taken into account, but either further tuning of the data fusion under changing light conditions is 
required or the presented solution is not fully suitable. Despite this error the system performed 
better under bad lighting conditions compared to the worse lighting conditions (very bad and total 
bad), because of the following two facts. 
Figure 15. Effect of Different Lighting Conditions on Position Error and Standard Deviation 
The entries of Table 3 containing an 'X' represent experiments which could not be finished properly, 
because the positioning system failed in a way that the quadrotor left the test section. This happened 
one time each under very bad and total bad conditions. In these cases the Fourier Tracking also failed 
(compare Fig. 13). The experiment was then repeated and it worked in all other cases. 
Table 3. Position Errors under different (difficult) lighting conditions, P = √(X² + Y²) 

Errors         Trial 1         Trial 2         Trial 3         Trial 4         Trial 5      Mean
[cm]          X    Y    P     X    Y    P     X    Y    P     X    Y    P     X    Y    P
Good         -1  -13   13    -3   -3    4     1   -2    2    -5    0    5     2    5    6      6
Bad         -47  -44   64   -31   -6   31   -48  -32   58   -28  -32   43    -2  -15   15     35
Very Bad      X    X    X    -8  -17   18    -4  -67   67   -12  -21   24    -6  -11   13     23
Total Bad     8  -35   36     2   -8    8     X    X    X   -16   13   20   -43  -67   80     28
Besides this, the quadrotor had more difficulty flying calmly as the lighting conditions became 
worse. This is shown in Table 4 by the increase in the standard deviation of the position (OTS). 
The reason for this difficulty is the fact that with worse conditions the ADNS failed more and 
more, until it failed totally and the controller had to rely on the low-sampled Fourier Tracking only. 
To prevent instability, the control parameters for both cases are set very similar, and this might be 
tuned further. 
Table 4. Standard deviation under different (difficult) lighting conditions 

Standard          Trial 1      Trial 2      Trial 3      Trial 4      Trial 5
deviation [cm]     X    Y      X    Y       X    Y       X    Y       X    Y      Mean
Good               6   19     16   18      12   13      21   18      11   13      14.7
Bad               19   29     12   14      23   24      18   20      26   15      19.6
Very Bad          20   26     23   20      24   46      15   25      16   23      23.8
Total Bad         28   36     19   19      24   22      28   22      24   29      25.1
Furthermore, a switch in the lighting conditions also introduces difficulties for the control, which 
is shown in Figure 16. In this experiment the conditions have been changed every 10-20 seconds 
in the following order: good - bad - very bad - bad - good. 
Figure 16. System behaviour and performance under changing lighting conditions 
(Fourier Tracking Off: FA = 0; Fourier Tracking On: FA = 50) 
It can clearly be seen that the system becomes unsteady under worsening and changing conditions and 
even more so when the light brightens up again. This experiment also shows that the positioning 
system can handle these critical changes in the lighting conditions. 
7. CONCLUSION AND PERSPECTIVE 
The evaluation proved that the system is capable of a fully autonomous directed flight and that the 
presented complementary vision based data fusion is sufficient for controlling a quadrotor in an 
autonomous flight. Even autonomous position hold under very bad lighting conditions is possible. 
In spite of this, there is still room for optimization and the accuracy as well as the reliability of 
the system need to be further improved. The control accuracy can be improved by updating the 
position during rotation. Position errors can be reduced by optimizing the data fusion, its 
parameters or the concept, but probably also by reducing the weight of the ADNS under erroneous 
conditions. Still it has been shown that even under bad lighting conditions the ADNS improves 
the control behaviour. 
However, the reliability of the system can only be significantly improved by adding other, 
non-optical sensors for positioning, like radar or ultrasonic sensors, since optical sensors depend 
inherently on light. For outdoor applications, GPS would also be possible. 
The directed flight can now be combined with other optical systems - like a PMD 
camera or camera based stereo-optical distance determination - to fly fully 
autonomously through narrow openings like windows or doors. 
ACKNOWLEDGEMENTS 
The author would like to thank Diana Baeva and Qasim Ali for reviewing this paper. This work 
was funded by the IHK Würzburg-Schweinfurt and the Universitätsbund Würzburg. This 
publication was funded by the German Research Foundation (DFG) and the University of 
Wuerzburg in the funding program Open Access Publishing. 
REFERENCES 
[1] Kendoul F. et al, Optical-flow based vision system for autonomous 3D localization and control of 
small aerial vehicles, Robotics and Autonomous Systems 2009, Elsevier 
[2] Herisse B. et al, Hovering flight and vertical landing control of a VTOL Unmanned Aerial Vehicle 
using Optical Flow, 2008 IEEE International Conference on Intelligent Robots and Systems 
[3] Reinthal E., Positionsbestimmung eines autonomen Quadrokopters durch Bildverarbeitung, 2014, BA 
Thesis, University of Wuerzburg 
[4] Gageik, N., Autonomous UAV with Optical Flow Sensor for Positioning and Navigation, 2013, 
International Journal of Advanced Robotic Systems, INTECH 
[5] Averbuch A. and Keller Y., A Unified Approach to FFT Based Image Registration, 2002, Tel Aviv 
University 
[6] Reddy B. S. and Chatterji B. N., An FFT-Based Technique for Translation, Rotation, and 
Scale-Invariant Registration, 1996, IEEE Transactions on Image Processing, vol. 5, no. 8 
[7] Arens T. et al, Mathematik, 2008, Heidelberg, Spektrum Akademischer Verlag 
[8] Jähne B., Practical Handbook on Image Processing for Scientific Applications, 1997, Boca Raton 
CRC Press LLC 
[9] Lucas, B. and Kanade, T. 1981. An iterative image registration technique with an application to stereo 
vision. In Proceedings of the International Joint Conference on Artificial Intelligence, pp. 674–679. 
[10] Srinivasan M., An image-interpolation technique for the computation of optic flow and egomotion, 
Biological Cybernetics, 1994, Springer-Verlag 
[11] Strohmeier M., Implementierung und Evaluierung einer Positionsregelung unter Verwendung des 
optischen Flusses, Würzburg 2012, BA Thesis 
[12] ADNS-3080 High-Performance Optical Mouse Sensor, Data Sheet, Avago Technologies, 
http://www.avagotech.com 
[13] Gageik N., Rothe J., Montenegro S., Data Fusion Principles for Height Control and Autonomous 
Landing of a Quadrocopter, UAVveek 2012 
[14] Qt Project, http://qt.digia.com 
[15] Natural Point, OptiTrack, www.naturalpoint.com/optitrack/ 
[16] Ascending Technologies, Research Price List, 2013, Krailling, Germany, www.asctec.de 
[17] Shen, S., Autonomous Multi-Floor Indoor Navigation with a Computationally Constrained MAV, 
International Conference on Robotics and Automation, 2011, Shanghai, IEEE 
AUTHORS 
Dipl.-Ing. Nils Gageik is working as a research assistant and PhD student at the Chair of 
Aerospace Information Technology at the University of Wuerzburg. He received his diploma 
in Computer Engineering from RWTH Aachen University in 2010. 
B.Sc. Eric Reinthal is a Master student in the international Spacemaster program. He received 
his bachelor degree in 2014 from the University of Wuerzburg. 
B.Sc. Paul Benz is a Master student at the University of Wuerzburg. He received his 
bachelor degree in 2013 from the University of Wuerzburg. 
Prof. Dr. Sergio Montenegro is holder of the Chair of Aerospace Information Technology at 
the University of Wuerzburg. 

More Related Content

PDF
N045077984
PDF
An Unmanned Rotorcraft System with Embedded Design
PDF
06466595
PDF
Image Registration Methode in Radar Interferometry
PDF
MULTIPLE REGION OF INTEREST TRACKING OF NON-RIGID OBJECTS USING DEMON'S ALGOR...
PDF
Multiple region of interest tracking of non rigid objects using demon's algor...
PDF
ADAPTIVE, SCALABLE, TRANSFORMDOMAIN GLOBAL MOTION ESTIMATION FOR VIDEO STABIL...
PDF
Ak03302260233
N045077984
An Unmanned Rotorcraft System with Embedded Design
06466595
Image Registration Methode in Radar Interferometry
MULTIPLE REGION OF INTEREST TRACKING OF NON-RIGID OBJECTS USING DEMON'S ALGOR...
Multiple region of interest tracking of non rigid objects using demon's algor...
ADAPTIVE, SCALABLE, TRANSFORMDOMAIN GLOBAL MOTION ESTIMATION FOR VIDEO STABIL...
Ak03302260233

What's hot (20)

PDF
Simulation of Robot Manipulator Trajectory Optimization Design
PPT
Time History Analysis With Recorded Accelerograms
PDF
Ravasi_etal_EAGE2015b
PDF
Parallel implementation of geodesic distance transform with application in su...
PDF
Elements Space and Amplitude Perturbation Using Genetic Algorithm for Antenna...
PDF
Attitude determination of multirotors using camera
PDF
PDF
A0420105
PDF
SIMMECHANICS VISUALIZATION OF EXPERIMENTAL MODEL OVERHEAD CRANE, ITS LINEARIZ...
PDF
G143741
PDF
Termpaper ai
PDF
High speed cordic design for fixed angle of rotation
PDF
Optimization Of K-Means Clustering For DECT Using ACO
PDF
Smear correction of highly variable,
PDF
Interferogram Filtering Using Gaussians Scale Mixtures in Steerable Wavelet D...
PDF
HARDWARE EFFICIENT SCALING FREE VECTORING AND ROTATIONAL CORDIC FOR DSP APPLI...
PDF
Iaetsd a modified image fusion approach using guided filter
PPT
Intelligent back analysis using data from the instrument (poster)
PDF
Time Multiplexed VLSI Architecture for Real-Time Barrel Distortion Correction...
PDF
DICTA 2017 poster
Simulation of Robot Manipulator Trajectory Optimization Design
Time History Analysis With Recorded Accelerograms
Ravasi_etal_EAGE2015b
Parallel implementation of geodesic distance transform with application in su...
Elements Space and Amplitude Perturbation Using Genetic Algorithm for Antenna...
Attitude determination of multirotors using camera
A0420105
SIMMECHANICS VISUALIZATION OF EXPERIMENTAL MODEL OVERHEAD CRANE, ITS LINEARIZ...
G143741
Termpaper ai
High speed cordic design for fixed angle of rotation
Optimization Of K-Means Clustering For DECT Using ACO
Smear correction of highly variable,
Interferogram Filtering Using Gaussians Scale Mixtures in Steerable Wavelet D...
HARDWARE EFFICIENT SCALING FREE VECTORING AND ROTATIONAL CORDIC FOR DSP APPLI...
Iaetsd a modified image fusion approach using guided filter
Intelligent back analysis using data from the instrument (poster)
Time Multiplexed VLSI Architecture for Real-Time Barrel Distortion Correction...
DICTA 2017 poster
Ad

Similar to COMPLEMENTARY VISION BASED DATA FUSION FOR ROBUST POSITIONING AND DIRECTED FLIGHT OF AN AUTONOMOUS QUADROCOPTER (20)

PPTX
sodapdf-converzxXxccccCCCCCCCSsted (1).pptx
PDF
Lecture 7 (Digital Image Processing)
PDF
Lecture 14 Properties of Fourier Transform for 2D Signal
PDF
Jung.Rapport
PDF
Multi-hypothesis projection-based shift estimation for sweeping panorama reco...
PDF
The Technology Research of Camera Calibration Based On LabVIEW
PPTX
Super Resolution of Image
PDF
Multi-hypothesis projection-based shift estimation for sweeping panorama reco...
PDF
Improving image resolution through the cra algorithm involved recycling proce...
PDF
IMPROVING IMAGE RESOLUTION THROUGH THE CRA ALGORITHM INVOLVED RECYCLING PROCE...
PPT
Chapter-05c-Image-Restoration-(Reconstruction-from-Projections).ppt
PPT
Automatic image mosaicing an approach based on fft
PDF
視訊訊號處理與深度學習應用
PDF
Wave optics analysis of camera image
PDF
Wave Optics Analysis of Camera Image Formation With Respect to Rectangular Ap...
PPTX
Digital Image Morphing through Field Morphing
PDF
“Next-generation Computer Vision Methods for Automated Navigation of Unmanned...
PPTX
CBM Variable Speed Machinery
PPT
2D Geometric_Transformations in graphics.ppt
sodapdf-converzxXxccccCCCCCCCSsted (1).pptx
Lecture 7 (Digital Image Processing)
Lecture 14 Properties of Fourier Transform for 2D Signal
Jung.Rapport
Multi-hypothesis projection-based shift estimation for sweeping panorama reco...
The Technology Research of Camera Calibration Based On LabVIEW
Super Resolution of Image
Multi-hypothesis projection-based shift estimation for sweeping panorama reco...
Improving image resolution through the cra algorithm involved recycling proce...
IMPROVING IMAGE RESOLUTION THROUGH THE CRA ALGORITHM INVOLVED RECYCLING PROCE...
Chapter-05c-Image-Restoration-(Reconstruction-from-Projections).ppt
Automatic image mosaicing an approach based on fft
視訊訊號處理與深度學習應用
Wave optics analysis of camera image
Wave Optics Analysis of Camera Image Formation With Respect to Rectangular Ap...
Digital Image Morphing through Field Morphing
“Next-generation Computer Vision Methods for Automated Navigation of Unmanned...
CBM Variable Speed Machinery
2D Geometric_Transformations in graphics.ppt
Ad

Recently uploaded (20)

PDF
Unlocking AI with Model Context Protocol (MCP)
PDF
Encapsulation theory and applications.pdf
PDF
Assigned Numbers - 2025 - Bluetooth® Document
PPTX
Big Data Technologies - Introduction.pptx
PPTX
KOM of Painting work and Equipment Insulation REV00 update 25-dec.pptx
PDF
Electronic commerce courselecture one. Pdf
PDF
Chapter 3 Spatial Domain Image Processing.pdf
PDF
Peak of Data & AI Encore- AI for Metadata and Smarter Workflows
PDF
MIND Revenue Release Quarter 2 2025 Press Release
DOCX
The AUB Centre for AI in Media Proposal.docx
PDF
Review of recent advances in non-invasive hemoglobin estimation
PDF
Building Integrated photovoltaic BIPV_UPV.pdf
PDF
Architecting across the Boundaries of two Complex Domains - Healthcare & Tech...
PDF
Encapsulation_ Review paper, used for researhc scholars
PDF
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
PDF
NewMind AI Weekly Chronicles - August'25-Week II
PDF
Reach Out and Touch Someone: Haptics and Empathic Computing
PPTX
Cloud computing and distributed systems.
PDF
cuic standard and advanced reporting.pdf
PDF
A comparative analysis of optical character recognition models for extracting...
Unlocking AI with Model Context Protocol (MCP)
Encapsulation theory and applications.pdf
Assigned Numbers - 2025 - Bluetooth® Document
Big Data Technologies - Introduction.pptx
KOM of Painting work and Equipment Insulation REV00 update 25-dec.pptx
Electronic commerce courselecture one. Pdf
Chapter 3 Spatial Domain Image Processing.pdf
Peak of Data & AI Encore- AI for Metadata and Smarter Workflows
MIND Revenue Release Quarter 2 2025 Press Release
The AUB Centre for AI in Media Proposal.docx
Review of recent advances in non-invasive hemoglobin estimation
Building Integrated photovoltaic BIPV_UPV.pdf
Architecting across the Boundaries of two Complex Domains - Healthcare & Tech...
Encapsulation_ Review paper, used for researhc scholars
Build a system with the filesystem maintained by OSTree @ COSCUP 2025
NewMind AI Weekly Chronicles - August'25-Week II
Reach Out and Touch Someone: Haptics and Empathic Computing
Cloud computing and distributed systems.
cuic standard and advanced reporting.pdf
A comparative analysis of optical character recognition models for extracting...

COMPLEMENTARY VISION BASED DATA FUSION FOR ROBUST POSITIONING AND DIRECTED FLIGHT OF AN AUTONOMOUS QUADROCOPTER

  • 1. International Journal of Artificial Intelligence & Applications (IJAIA), Vol. 5, No. 5, September 2014 COMPLEMENTARY VISION BASED DATA FUSION FOR ROBUST POSITIONING AND DIRECTED FLIGHT OF AN AUTONOMOUS QUADROCOPTER Nils Gageik1, Eric Reinthal2, Paul Benz3and Sergio Montenegro4 Chair of Computer Science 8, University of Würzburg, Germany ABSTRACT The present paper describes an improved 4 DOF (x/y/z/yaw) vision based positioning solution for fully 6 DOF autonomous UAVs, optimised in terms of computation and development costs as well as robustness and performance. The positioning system combines Fourier transform-based image registration (Fourier Tracking) and differential optical flow computation to overcome the drawbacks of a single approach. The first method is capable of recognizing movement in four degree of freedom under variable lighting conditions, but suffers from low sample rate and high computational costs. Differential optical flow computation, on the other hand, enables a very high sample rate to gain control robustness. This method, however, is limited to translational movement only and performs poor in bad lighting conditions. A reliable positioning system for autonomous flights with free heading is obtained by fusing both techniques. Although the vision system can measure the variable altitude during flight, infrared and ultrasonic sensors are used for robustness. This work is part of the AQopterI8 project, which aims to develop an autonomous flying quadrocopter for indoor application and makes autonomous directed flight possible. KEYWORDS Autonomous UAV, Quadrocopter, Quadrotor, Vision Based, Positioning, Data Fusion, Directed Flight 1. INTRODUCTION In spite of the fact that nowadays exist solutions for vision based positioning to enable autonomous flight of UAV’s (Unmanned Aerial Vehicles), these solutions suffer from different inherent drawbacks. Main drawbacks are high computational burden [1, 2] and low sample rate [3],limitations to translational movement [11, 12], and bad robustness to varying lighting conditions[4]. Despite the considerable rise of computing power during the last decade, computational burden is still a concern for autonomous UAVs. High computing capacity results in a significant increase in weight as well as power consumption of a UAV. Flight-time is thereby reduced by a noticeable factor. This lead to several scientific works on UAVs, where computation for the positioning system is done on external hardware. These approaches, however, break the requirements for a fully autonomous system. Our proposed positioning system focuses on computational efficiency with acceptable hardware load to deploy relatively lightweight on-board- hardware with low power consumption. The system therefore provides fully autonomous positioning without the need of external systems. Hence fully autonomous directed flight becomes possible, that is required for passing through narrow openings such as doors and windows with an on-Board collision avoidance system having preferential direction. This is an alternative to laser based solutions [16-17]. DOI : 10.5121/ijaia.2014.5501 1
  • 2. International Journal of Artificial Intelligence & Applications (IJAIA), Vol. 5, No. 5, September 2014 The following study deals with the concept, implementation and evaluation of merging two vision based methods for positioning of an autonomous UAV. First two sections describe the two methods used here, explaining each on its own. The fourth section describes the overall concept of merging the two systems with focus on the data fusion and is followed by a section about its implementation. The paper ends with sections on evaluation and discussion. 2 2. FOURIER TRACKING (METHOD 1) 2.1. Overview Fourier-transform based image registration (short: Fourier Tracking)is the process of determining the geometric correspondence of two images. With a camera mounted on the UAV and pointing vertically to the ground, the motion of the UAV can be computed by continuous registration of succeeding images. With the constraint that the focus of the camera stays in the plane of surface, succeeding images are affine transformations with respect to translation, rotation and scaling. By registering the images, the parameters of the transformation are gained, which permit to conclude the movement of the UAV. Image registration is done in Fourier space to reduce computational costs and to improve robustness against changing light conditions. Figure 1.Fourier-transform based image registration Figure 1 pictures the simplified concept of the implemented Fourier Tracking. To gain robustness under variable lighting conditions, a histogram equalization is applied to each image before processing. This gives the positioning system the possibility to function under bad lighting conditions as well as with surfaces having very low contrast. Next Log-Polar-Transformation (LPT) and Phase Correlation are executed to gain the yaw angle and scaling factor. The scaling factor, which corresponds to the height or change in height, together with the yaw angle is required for the affine transformation, which transforms the first image i0 to i0’, so that it fits to the second image i1. Now both images correspond in height and yaw angle to another and the second phase correlation can determine the translation between both images. This translation corresponds to the position change of the quadrocopter. Each LPT block consists of a FT (Fourier Transform), a Log-Polar-Transformation(section 2.3). and a high pass filter. The coordinate-change to logarithmic polar coordinates induces distortion
  • 3. International Journal of Artificial Intelligence & Applications (IJAIA), Vol. 5, No. 5, September 2014 in the transformed image since the transform is non-uniform. To overcome bad performing of the phase correlation, a high-pass filter is applied after the transform. [6] ∗ , denotes the complex conjugate of , . By inserting formula (2.3) in (2.4), 3 Each Phase correlation (section 2.2) requires three Fourier transforms (i0, i1, inverse). 2.2. Fourier Transform and Cross-Power-Spectrum (CPS) The concept uses extended phase correlation [5] for registering images which are translated, rotated and scaled with respect to each other. Let , be an image in spatial domain and , the corresponding image in Fourier domain thus ℱ = (2.1) Two images and , which are translated by , , own the following dependence (neglecting border-effects): , = +, + (2.2) Due to the shift-theorem of the Fourier transform, this dependence can be written in Fourier space as: , = , (2.3) The phase correlation allows to compute this translation by using the cross-power-spectrum (CPS) of the two images. The CPS is defined as: = ∗ , , , , (2.4) where one can show that the CPS corresponds to the translational difference of the two images: = (2.5) The CPS is technically the convolution of the two images in Fourier space. By taking the inverse Fourier transform of the CPS, the translation , is obtained, because the exponential term of equation (2.3) corresponds to the delta-function. ℱ # $ = ℱ % = ' − , − (2.6) For two overlapping images, equation (2.2) is only satisfied in the overlapping area. The non-corresponding areas add distortion to the CPS, which therefore differs from an exact delta-impulse. The real CPS describes a correlation surface of the images, with its peak at the delta-impulse from equation (2.6). This peak gives the translation between and : )* = +, , ℱ # $ (2.7) With this concept only horizontal translated images can be registered. A preceding rotation and scaling compensation is therefore required.
2.3. Polar Coordinates

By applying a coordinate change to logarithmic polar coordinates (log ρ, φ), rotation and scaling can be computed using phase correlation as well. This concept is derived from Reddy and Chatterji [6]. Let f1(x, y) be a replica of f0(x, y), rotated by θ and translated by (dx, dy). Their variables are related by:

f1(x, y) = f0(x·cos θ + y·sin θ − dx, −x·sin θ + y·cos θ − dy)   (2.8)

Applying the Fourier transform, we get:

F1(u, v) = exp(−i·2π·(u·dx + v·dy)) · F0(u·cos θ + v·sin θ, −u·sin θ + v·cos θ)   (2.9)

Due to the rotation property of the Fourier transform, the frequency variables are related by the following equations:

u' = u·cos θ + v·sin θ
v' = −u·sin θ + v·cos θ   (2.10)

The magnitudes of the Fourier transforms therefore differ only by a rotation with the rotation centre in the middle of the images. This corresponds to a translation in polar coordinates, since a rotation around the image centre only affects the φ-coordinate. Applying a coordinate change to polar coordinates, the magnitudes M0 and M1 of the Fourier transforms are related by:

M1(ρ, φ) = M0(ρ, φ − θ)   (2.11)

If f0 is also scaled to f1 by the factor s, which only affects the ρ-coordinate in polar coordinates, equation (2.11) extends to:

M1(ρ, φ) = (1/s) · M0(ρ/s, φ − θ)   (2.12)

Assuming a minimal scale change between two images, the factor 1/s may be neglected. Switching to logarithmic polar coordinates gives:

M1(log ρ, φ) = M0(log ρ − log s, φ − θ)   (2.13)

By substituting ξ = log ρ and σ = log s, we obtain the following formula:

M1(ξ, φ) = M0(ξ − σ, φ − θ)   (2.14)

These translations σ and θ can be obtained by applying phase correlation, where s = exp(σ) corresponds to the scaling and θ to the rotation of the two input images. Once rotation and scale change are computed, the second image is scaled and rotated by the computed values. Thus, the transformed image and the first input image differ merely in translation. This translation is obtained by applying another phase correlation.
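The rotation-and-scale recovery of equations (2.11) to (2.14) can likewise be sketched with standard OpenCV primitives: the magnitude spectra are resampled to log-polar coordinates and phase-correlated, and the measured shift is converted back into a rotation angle and a scale factor. This is a simplified stand-in rather than the authors' own FFT pipeline: it omits the high-pass filter of Figure 1, assumes even image dimensions, follows OpenCV's warpPolar axis conventions, and all names as well as the sign of the rotation are illustrative.

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <algorithm>
#include <cmath>
#include <vector>

// Magnitude spectrum with the zero frequency shifted to the image centre,
// as required before the log-polar resampling (even image size assumed).
static cv::Mat centredMagnitudeSpectrum(const cv::Mat& img)
{
    cv::Mat f, F;
    img.convertTo(f, CV_32F);
    cv::dft(f, F, cv::DFT_COMPLEX_OUTPUT);
    std::vector<cv::Mat> planes(2);
    cv::split(F, planes);
    cv::Mat mag;
    cv::magnitude(planes[0], planes[1], mag);
    mag += 1.0f;
    cv::log(mag, mag);                    // compress the dynamic range of the spectrum
    // fftshift: swap the diagonal quadrants so that frequency (0,0) is centred.
    int cx = mag.cols / 2, cy = mag.rows / 2;
    cv::Mat q0(mag, cv::Rect(0, 0, cx, cy)),  q1(mag, cv::Rect(cx, 0, cx, cy));
    cv::Mat q2(mag, cv::Rect(0, cy, cx, cy)), q3(mag, cv::Rect(cx, cy, cx, cy));
    cv::Mat tmp;
    q0.copyTo(tmp); q3.copyTo(q0); tmp.copyTo(q3);
    q1.copyTo(tmp); q2.copyTo(q1); tmp.copyTo(q2);
    return mag;
}

// Recover rotation (degrees) and scale between two frames via eq. (2.14):
// a translation in the log-polar resampled magnitude spectra.
void rotationAndScale(const cv::Mat& img0, const cv::Mat& img1,
                      double& rotationDeg, double& scale)
{
    cv::Mat m0 = centredMagnitudeSpectrum(img0);
    cv::Mat m1 = centredMagnitudeSpectrum(img1);

    cv::Point2f centre(m0.cols / 2.0f, m0.rows / 2.0f);
    double maxRadius = std::min(centre.x, centre.y);
    cv::Mat lp0, lp1;
    cv::warpPolar(m0, lp0, m0.size(), centre, maxRadius,
                  cv::INTER_LINEAR | cv::WARP_POLAR_LOG);
    cv::warpPolar(m1, lp1, m1.size(), centre, maxRadius,
                  cv::INTER_LINEAR | cv::WARP_POLAR_LOG);

    // Phase correlation in log-polar space: the x-shift maps to log(scale),
    // the y-shift maps to the rotation angle.
    cv::Point2d shift = cv::phaseCorrelate(lp0, lp1);
    double logBase = std::log(maxRadius) / lp0.cols;   // sampling of the log-radius axis
    scale = std::exp(shift.x * logBase);
    rotationDeg = shift.y * 360.0 / lp0.rows;
}

After compensating the second image with the recovered angle and scale, the translation follows from one further phase correlation, exactly as described above.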
2.4. Implementation details

Once two images are registered, their translation, scaling and rotation are transformed into the coordinate system of the UAV, which leads to the change in position. Let f0 be taken at time t0 and f1 slightly thereafter at time t1. The position of the UAV in the surface plane at time t0 equals (x, y), the altitude is h and the heading is given by ψ. The rotation of the images corresponds directly to the change in heading of the UAV, leading to

ψ(t1) = ψ(t0) + θ   (2.15)

The scaling of the images represents the relative change in altitude:

h(t1) = h(t0) · s   (2.16)

The translation of the image registration is obtained in pixels. Therefore, a calibration factor α has to be determined to convert the translation into metres. This factor α is constant for a given camera and resolution. The ratio of pixels to metres is also affected by the altitude of the UAV, since the field of view of the camera changes linearly with the distance to the camera scene. Thus the translation in metres is given by:

(dx, dy) = α · h · (du, dv)   (2.17)

where (du, dv) is the translation in pixels. This translation is aligned in camera coordinates and has to be rotated into UAV coordinates with the heading angle at time t1:

( x(t1) )   ( x(t0) )   ( cos ψ(t1)   −sin ψ(t1) )   ( dx )
( y(t1) ) = ( y(t0) ) + ( sin ψ(t1)    cos ψ(t1) ) · ( dy )   (2.18)

By continuous registration of images, the movement in the surface plane, the altitude and the heading of the UAV can be measured iteratively.

Each registration with the given concept (Fig. 1) requires eight Fourier transforms, representing the main computational load of the positioning system. To reduce computational costs, the Fourier transform is implemented as a Fast Fourier Transform; for a square image of N × N pixels, the transform can then be computed in O(N² log N) [7]. In order to reduce computation time further, the symmetry of the Fourier transform is exploited. Since the Fourier transform is designed for complex input, real input sequences such as images lead to complex-conjugate symmetry in Fourier space. Therefore it is sufficient to compute merely the upper half of the CPS.

To gain further precision, the peak location of the CPS is computed with a weighted average around the absolute peak. This gives sub-pixel accuracy for the translation, which also improves the robustness due to higher accuracy in the rotation-and-scale transform. Since images are finite, the Fourier transform induces border effects leading to miscalculations in the CPS. Applying a Hanning window before each transform minimizes these effects by blacking out the edge regions of the images. This increases the reliability of the phase correlation significantly [8].
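Written out as code, equations (2.15) to (2.18) amount to only a few lines. The sketch below is our own compact rendering of that pose update, not the original flight software; the structure, the names and the sign convention of the rotation matrix are illustrative assumptions.

#include <cmath>

// UAV state in the surface plane: position (x, y) in metres, altitude h in
// metres, heading psi in radians.
struct Pose { double x, y, h, psi; };

// Fold one registration result into the pose (eqs. 2.15-2.18).
// theta: image rotation [rad], s: image scaling factor, (du, dv): image
// translation [pixel], alpha: pixel-to-metre calibration factor at 1 m altitude.
Pose integrateRegistration(const Pose& p, double theta, double s,
                           double du, double dv, double alpha)
{
    Pose q = p;
    q.psi = p.psi + theta;               // (2.15) heading change = image rotation
    q.h   = p.h * s;                     // (2.16) relative altitude change = image scale

    // (2.17) translation in metres, camera frame; the field of view grows
    // linearly with the altitude.
    double dxCam = alpha * p.h * du;
    double dyCam = alpha * p.h * dv;

    // (2.18) rotate the camera-frame translation into the space-fixed frame
    // with the heading valid at the time of the second image.
    q.x = p.x + std::cos(q.psi) * dxCam - std::sin(q.psi) * dyCam;
    q.y = p.y + std::sin(q.psi) * dxCam + std::cos(q.psi) * dyCam;
    return q;
}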
3. DIFFERENTIAL OPTICAL FLOW COMPUTATION (METHOD 2)

3.1. Overview

A common method for differential optical flow computation is the Lucas-Kanade method [9]. It can be realized with the algorithm of Srinivasan [10]. For simplification it is presumed that the change in the picture is only a translation of at most one pixel per frame. Under these constraints, the searched optical flow values u and v in the x- and y-axis between two pictures can be determined by equation (3.1):

( u )   ( ΣΣ Px·Px   ΣΣ Px·Py )⁻¹   ( ΣΣ Px·Pt )
( v ) = ( ΣΣ Px·Py   ΣΣ Py·Py )   · ( ΣΣ Py·Pt )   (3.1)

with the double sums taken over all pixels (i, j) and with Px(i, j), Py(i, j) and Pt(i, j) being the partial intensity derivatives of point P(i, j) in the x- or y-axis or after time t, respectively:

Px(i, j) = P1 − P2
Py(i, j) = P4 − P3   (3.2)
Pt(i, j) = Pt − Pt−1

The partial intensity derivatives are computed from the neighbouring pixels to the left (P1), right (P2), up (P4) and down (P3) and from the previous picture Pt−1 (Fig. 2).

Figure 2. Centre pixel Pt or Pt−1 in yellow and neighbours in red [11]

3.2. Implementation details

In this work the ADNS-3080, an optical flow or optical mouse sensor, is used together with a proper lens [12]. This sensor works with an internal sample rate of up to 6400 fps and provides good results under good lighting conditions [4], but it cannot handle rotations. Furthermore, the sensor fails if there is too much light (as in the case of outdoors) or too little light. To overcome these drawbacks, the Fourier Tracking has been added to the system. Since there is no information in the data sheet, it is not absolutely clear which method is implemented on this sensor. Because of its characteristics and drawbacks, the high frame rate and the limitation to translations, it is considered to be either method 2 or a method with similar characteristics.
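Equations (3.1) and (3.2) reduce to a 2x2 linear system that can be solved in closed form. The following sketch only illustrates that computation under the stated one-pixel-translation assumption; it is not the firmware of the ADNS-3080 (whose internal algorithm is unknown, as noted above), and all names as well as the sign conventions of the derivatives are our own.

#include <vector>
#include <cstddef>

// Differential optical flow over a whole frame under the assumptions of
// section 3: pure translation of at most one pixel between frames.
// prev and curr are grayscale frames of size w x h in row-major order.
// The derivatives Px, Py, Pt follow eq. (3.2); u and v solve the 2x2
// least-squares system of eq. (3.1).
void denseTranslationFlow(const std::vector<float>& prev,
                          const std::vector<float>& curr,
                          int w, int h, double& u, double& v)
{
    double sxx = 0, sxy = 0, syy = 0, sxt = 0, syt = 0;
    for (int j = 1; j < h - 1; ++j) {
        for (int i = 1; i < w - 1; ++i) {
            const std::size_t idx = static_cast<std::size_t>(j) * w + i;
            double px = curr[idx - 1] - curr[idx + 1];   // P1 - P2 (left - right)
            double py = curr[idx - w] - curr[idx + w];   // P4 - P3 (up - down)
            double pt = curr[idx] - prev[idx];           // Pt - Pt-1
            sxx += px * px;  sxy += px * py;  syy += py * py;
            sxt += px * pt;  syt += py * pt;
        }
    }
    double det = sxx * syy - sxy * sxy;
    if (det == 0.0) { u = v = 0.0; return; }             // no texture: flow undefined
    // Closed-form solution of the 2x2 normal equations (eq. 3.1).
    u = ( syy * sxt - sxy * syt) / det;
    v = (-sxy * sxt + sxx * syt) / det;
}

On a poorly textured or poorly lit surface the determinant of this system approaches zero, which fits the low quality values the optical flow sensor reports in exactly those situations.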
4. CONCEPT

4.1. Overview

The main motivation for this work was the limitation of method 2 (optical flow) to translations. Any yaw rotation is interpreted as a translation and therefore cannot be executed without accumulating a high position error: after a yaw rotation with position hold, the system would hold its position at a different location. Therefore method 1 (Fourier Tracking) was designed in a way that can handle yaw rotations, but it suffers from a high computational burden. Since computation power on board the quadrocopter is very limited and is also required for other applications like mapping, object and obstacle detection, and since method 2 is sufficient in most cases, method 1 should only be activated for error correction after rotating. This concept is called rotation compensation (section 4.2). Besides this, method 2 also fails in bad lighting conditions. In this case method 1 is activated as well, because it can handle bad lighting conditions. The data of both methods are then merged, which is called dynamic complementary data fusion (section 4.3). It is called dynamic because method 1 is automatically activated if method 2 produces bad results.

Figure 3. Concept

The overall concept of the system is illustrated in Figure 3. Both sensor systems send their data, the last measured position P and the quality Q of the last measurement, to the complementary data fusion, where these data are merged. Depending on the quality QOF of the optical flow system or on the rotation state, the execution controller then activates or deactivates the Fourier Tracking and the dynamic complementary data fusion. The rotation state can be either rotating or not rotating. The Fourier Tracking uses a space-fixed frame, while the optical flow and the quadrocopter with its position control can only operate in a body-fixed frame. Therefore the results have to be transformed from one frame into the other. Furthermore, both methods are executed on different processors: the Fourier Tracking needs high computation power, while the ADNS uses SPI and is therefore best connected to the microcontroller, which also executes the control loop of the system and drives the motors. This means the results of the Fourier Tracking also have to be sent to the microcontroller. To relieve the microcontroller, the data fusion is performed on the computer with the high computation power.
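The switching rule of the execution controller described above is compact enough to state explicitly. The snippet below is a hypothetical sketch of that decision; the threshold and all names are assumptions made for illustration, not values taken from the actual system.

// Decide whether the Fourier Tracking has to be activated (section 4.1):
// it is needed while rotating and whenever the optical flow quality is too
// low to be trusted. The threshold is a hypothetical tuning parameter.
bool fourierTrackingRequired(double qualityOpticalFlow, bool rotating)
{
    const double kMinUsableQuality = 0.3;   // assumed value, tuned per sensor
    return rotating || qualityOpticalFlow < kMinUsableQuality;
}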
4.2. Rotation Compensation

The idea of the rotation compensation is to start the Fourier Tracking whenever the quadrocopter is going to perform a yaw rotation and to stop it after the rotation is finished. Then the position error, which occurs while rotating, is corrected. In this case the data fusion simply discards the erroneous optical flow measurements and uses only the measurements from the Fourier Tracking. The advantage of this method is that it is easy to implement and enables the quadrocopter to perform yaw rotations and correct the error after rotating. It could be improved by updating the position during the rotation, but then the transformation as well as the correction would have to be executed more often. This means more communication and work load for the microcontroller, and it could also lead to instability of the control because of jumps of the actual value caused by the position corrections.

4.3. Dynamic Complementary Data Fusion

The idea of the dynamic complementary data fusion is to activate and use the Fourier Tracking only when required and to incorporate both measurements in a complementary way, weighted by their qualities (formula 4.1). To ensure that erroneous measurements from the optical flow sensor are not used during rotation, the weight w is set to zero in this case.

P(t) = P(t−1) + w · ΔP_OF + (1 − w) · ΔP_FT   (4.1)

Three states can be distinguished depending on the quality of the optical flow QOF. In the first state the quality QOF is so low that only the Fourier Tracking is used, so w is set to zero. In the second state the quality QOF is so high that the Fourier Tracking is deactivated, so w is set to one. In the third state w is computed from the relationship of the previously normalised and scaled qualities (formula 4.2):

w = Q_OF / (Q_OF + Q_FT)   (4.2)

4.4. Directed Flight

Directed flight means that the nose of the quadrocopter is always directed into the flight direction, as it is known from airplanes. This becomes necessary if the quadrocopter is no longer symmetrical but has a preferential direction, because of a fix-mounted PMD camera or a stereo vision system for collision avoidance. With this configuration, to fly through a narrow opening like a window or a door, the flight direction, the preferential direction and the yaw set value of the quadrocopter have to be the same. To realize this, the quadrocopter is simply rotated about the yaw axis. For a directed flight from the space-fixed position P1 = (x1, y1) to P2 = (x2, y2), the yaw set value ψset can be computed using formula (4.3):

ψset = arccos( (x2 − x1) / |P2 − P1| ) · sgn(y2 − y1)   (4.3)
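To make formulas (4.1) to (4.3) concrete, the following sketch implements the weighting and the yaw set value computation. It is a minimal illustration with names chosen by us; the qualities are assumed to be already normalised and scaled as described above, and atan2 is used as the numerically convenient equivalent of the arccos/sign form of (4.3) (the two differ only in the degenerate case y2 = y1 with x2 < x1).

#include <cmath>

// Dynamic complementary data fusion (eqs. 4.1/4.2): the new position is the
// previous position plus a blend of the two measured position increments,
// weighted by the normalised sensor qualities.
double fusePosition(double prevPos, double dOpticalFlow, double dFourier,
                    double qOF, double qFT, bool rotating)
{
    double w;                                    // weight of the optical flow increment
    if (rotating || qOF <= 0.0)      w = 0.0;    // state 1: optical flow not usable
    else if (qFT <= 0.0)             w = 1.0;    // state 2: Fourier Tracking deactivated
    else                             w = qOF / (qOF + qFT);    // state 3: eq. (4.2)
    return prevPos + w * dOpticalFlow + (1.0 - w) * dFourier;  // eq. (4.1)
}

// Yaw set value for a directed flight from P1 to P2 (eq. 4.3): the nose is
// turned into the flight direction before translating.
double directedFlightYaw(double x1, double y1, double x2, double y2)
{
    return std::atan2(y2 - y1, x2 - x1);
}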
5. IMPLEMENTATION

5.1. Hardware Design

The overall hardware design of the full system is shown in Figure 4. The red dashed line separates the components connected to the AVR (upper part) from those connected to the LP-180 Pico-ITX board. The minimum components required for this study have a red border. The system uses the IMU3000 for orientation computation, while the MinIMU-9 v2 serves as backup. Two infrared sensors, one ultrasonic sensor, a pressure sensor and the IMU are fused for the height-over-ground computation [13]. The 6 DOF (degrees of freedom) control of the autonomous quadrocopter is executed on the AVR32 UC3A1512 microcontroller with a 10 ms sample time [4]. Method 1 is implemented on the LP-180, and the C920 webcam from Logitech is used as the Fourier Tracking sensor. The C270 webcam has also been tested for this application, but showed very disappointing performance under dynamic movements. The second position sensor is the ADNS 3080. The dynamic complementary data fusion for the position computation of both methods as well as the navigation are executed on the LP-180. The fused position, if Fourier Tracking is active, and the position set value are sent to the UC3A1512 via USART/RS232. The same applies to the set value for yaw.

Figure 4. Hardware Design
In this work the object detection and the obstacle detection are not used; thus these sensors (except the C270 webcams) are not connected to the evaluated quadrocopter for this paper. The other sensors are attached to the system, though they are not used here.

5.2. Software Design

Figure 5 shows the simplified design of the software on the LP-180 for directed flight. The position POF measured by the optical flow sensor, its quality QOF and the current orientation q as quaternion are sent from the AVR to the LP-180 every 10 ms. This information is processed by the Pose Receiver method, which then executes the Execution Controller.

Figure 5. Software Design

The Execution Controller activates or deactivates the Fourier Tracking depending on the quality QOF. It also updates the Complementary Data Fusion by incorporating the new position POF every 10 ms. Subsequently the Waypoint Commanding function is executed, which realizes the directed navigation. While doing this, the quadrocopter always first yaws to face the next waypoint and then approaches it. Hence, it rotates, translates, rotates, translates, and so on, until the last waypoint has been reached. The Waypoint Commanding function consists of four states and a Boolean flag rotation, indicating whether a translation or a rotation has to be performed next:

• Navigation Off: This is the initial state; directed navigation is not possible. The rotation flag is set to true. User commands are necessary to set up the waypoint list and to switch to the Navigation On state.

• Navigation On: In this state navigation is generally possible. If the last waypoint has not been reached, the next waypoint or rotation is processed. This means the next state of Waypoint Commanding is Waypoint Control or Rotation Control, depending on the rotation flag. The corresponding set values are also sent to the AVR, which changes the set value of the corresponding controller. In case of a rotation, a signal is also sent to the Execution Controller to activate the Fourier Tracking.
• Waypoint Control: The waypoint control state checks whether the current waypoint has been reached. If yes, it switches back to the Navigation On state. As long as the waypoint has not been reached, the set values are sent again to the AVR every 3 seconds. This is important in case a command gets lost on the USART/RS232 communication link: otherwise the LP-180 would wait (forever) for the AVR to reach a certain waypoint, while the AVR would not be able to react properly, would control to a different position and the navigation would get stuck. This procedure is also much simpler and more robust than using acknowledgements. For safety, all commands are sent twice and together with a checksum, so that invalid commands can be discarded.

• Rotation Control: By analogy to the Waypoint Control, the Rotation Control state checks whether the rotation is finished. If so, it sends a signal to the Execution Controller to deactivate the Fourier Tracking. Then the position is corrected using the Fourier Tracking and updated on the AVR.

The Fourier Tracking is the implementation of method 1 (chapter 2). The Complementary Data Fusion is the implementation of the concept described in chapter 4.3. It is executed after every Fourier Tracking sample, but not during rotation. Position updates are sent to the AVR every 100 ms or 500 ms, depending on the quality QOF, because too many position updates disturb the controller due to jumps, feedback and delay issues. The software on the LP-180 uses Qt [14], OpenCV and FFT libraries and is implemented in C++. The software on the AVR is implemented using AVR32 Studio and C.

6. EVALUATION

The system has been extensively evaluated. In three different scenarios the system has been tracked with the Optical Tracking System OptiTrack from NaturalPoint (referenced as OTS) with five Flex 3 cameras [15] to obtain a reference position, while different autonomous flight scenarios have been executed. For position estimation and control, exclusively the mentioned on-board optical sensors with the described data fusion have been used (referenced as EST). In total 70 experiments (trials) have been documented, and the most representative results were selected to illustrate the behaviour here.

Table 1. Final Position Errors for P1 and P2

                     P1                          P2
            Ex [cm]  Ey [cm]  Fp [cm]   Ex [cm]  Ey [cm]  Fp [cm]
Trial 1       16.6      5.9     17.6     -15.4     -6.0     16.5
Trial 2      -13.1    -20.0     23.9      -6.1     -3.8      7.2
Trial 3      -22.0    -15.5     26.9     -26.3     -0.7     26.3
Trial 4       -8.4    -17.1     19.1     -10.8    -13.2     17.1
Trial 5      -11.6    -17.4     20.9     -11.9     -4.2     12.6
Mean          -7.7    -12.82    21.68    -14.1     -5.58    15.94
6.1. Single Rotation Compensation (Headed Flight)

In this setup the quadrocopter is tracked while performing a directed flight consisting of a single waypoint. It flies from the initial position P0 = (0 m, 0 m) to P1 = (0 m, 1.5 m) or P2 = (1.5 m, 1.5 m). Each experiment was repeated five times with the same settings and all results showed very similar behaviour. The position was measured before the rotation and after 20 s (P1) or 10 s (P2), when the rotation and translation were already finished. The position error is the difference between the position changes of OTS and EST and can be seen in Table 1. Ex and Ey are the errors along the x-axis and y-axis, respectively; Fp is the total 2D position error.

From this data it can be concluded that the error of the position system after the mentioned manoeuvre is in the range of 12-27 cm with a mean of about 19 cm. This is already quite high after such a short period of time, but several effects such as wrong scaling and misalignment of OTS and EST increase this error. Therefore more data has to be taken into account to come to a conclusion. As all experiments showed similar results, two experiments from both scenarios are illustrated in Figure 6 and Figure 7 to discuss this setup in more detail.

The graphs show that after about 2 s the rotation is finished and after about 5 s the set point is reached. The green line shows that the Fourier Tracking is activated before the rotation and deactivated after the rotation is finished. Then the translation is executed and the system reaches the set position with a first overshoot of about 30-50 cm. This overshoot is caused by the PID controller and its parameters, but also by the fact that the EST measures a smaller position than the OTS. Taking both graphs into account together, it can be derived that the overshoot is also affected by what happens on the other axis. The system has a 2° mechanical misalignment of the x-axis towards the ground level (over pitch). That is the reason why the system does not lift straight, but flies forward. This explains a 25 cm position error along the x-axis after starting (Fig. 7).

Figure 6. P1 Trial 5 (Fourier Tracking Off: FA = 0; Fourier Tracking On: FA = 100)
Figure 7. P2 Trial 4

6.2. Multiple Rotation Compensation (Headed Flight)

In this setup the quadrocopter is tracked while performing a directed flight consisting of a set of waypoints. Compared to the first setup, the quadrocopter now performs multiple iterative rotations and directed translations. As waypoint sets, a square (Fig. 8) comprising 4 waypoints and a Nikolaus house (Fig. 10) comprising 8 waypoints have been used.
Again the position error and the time have been documented (Table 2). The position error is in the range of 12-34 cm, with one exception of 55 cm. The mean is about 28 cm for the square and about 31 cm for the Nikolaus house. This means that with 4 or 8 times as many waypoints, resulting in as many additional translations and rotations, the error increases, but not proportionally to the time or the number of waypoints.

Table 2. Final Position Errors for Square and Nikolaus House Flight

                      Square                           Nikolaus House
          Ex [cm]  Ey [cm]  t [s]  Fp [cm]   Ex [cm]  Ey [cm]  t [s]  Fp [cm]
Trial 1     11.5    -21.8     25     24.7      -31.5    -11.2     75     33.4
Trial 2     11.6    -22.6     25     25.4       21.1     10.5     55     23.5
Trial 3    -13.6    -18.2     20     22.8       28.3    -11.5    140     30.6
Trial 4      8.7    -32.9     35     34.0       -7.4     10.5     70     12.9
Trial 5     18.8    -25.4     25     31.6      -53.3    -13.4     50     55.0
Mean         7.4    -24.8     26     27.7       -8.6     -3.0     78     31.0

The big error of 55 cm in trial 5 of the Nikolaus house made a closer analysis necessary. The data indicate that the error occurred during a fast movement. The reason for this and a solution need to be found in a further investigation.

Figure 8. Square Waypoints
Figure 9. Directed Square Flight: Position (Trial 2) tracked with on-board estimation (EST) and OTS during the phases start, rotation and translation

Again two trials are illustrated to explain the behaviour of the system in more detail. Figure 9 shows the tracked position (EST, OTS) of the quadrotor during the square flight. The tracks are coloured in three different colours for the start, rotation and translation phases. It can clearly be seen that during rotation a position error occurs, which is corrected afterwards, so that the system approaches the next waypoint correctly.
Figure 11 shows the tracked position (OTS) during the flight of a Nikolaus house. Again the greatest deviations from the set values occur during the rotations. Figure 12 shows the corresponding yaw angle. The system adopts the closer yaw angle quickly (within a few seconds); the rotation speed is limited by the fact that the Fourier Tracking sensor and system cannot handle faster rotations. During translation the yaw angle is constant and therefore the Fourier Tracking can be deactivated.

Figure 10. Nikolaus House Waypoints
Figure 11. Directed Nikolaus House Flight: Position (Trial 2) tracked with OTS during the phases start, rotation and translation
Figure 12. Directed Nikolaus House: Corresponding Yaw Angle (Trial 2)
Figure 13. Effect of a Sensor Failure on Position Error under critical lighting conditions

6.3. Dynamic Weighted Data Fusion

In the last setup the system has been evaluated under varying lighting conditions to investigate and demonstrate this effect and the system's behaviour on changes. Figure 14 illustrates the four different investigated lighting conditions. Good conditions are the normal operating conditions of the ADNS-3080, with relatively much but not too much light. This is the light you
normally have in a bright room. Bad conditions are those which make the ADNS fail sometimes, though there is still much light in the room; in our case we switched off the direct top light and only indirect light was left. For the experiments with Very Bad conditions, we switched off most lights but kept some on, so that in the test section there was also a brightness gradient from dark to bright. For the Total Bad conditions, all lights were switched off and only some daylight came through the curtains.

Figure 13 shows the effect of a sensor failure because of bad lighting conditions. After about 120 s even the Fourier Tracking failed, resulting in a large position error. The relationship between position error and sensor failure because of low quality can be clearly seen here, though it is also shown that the system does not fail totally, but follows the further movement correctly with an approximately constant error.

During all experiments of this setup the quadrotor is on position hold, and again 5 trials have been documented for each lighting condition. During the experiments it could be seen that the quadrotor became more dynamic under worse lighting conditions. That is why, besides the position error, this time also the standard deviation is computed after 60 s.

Figure 14. Different Lighting Conditions. Top: Good (left), Bad (right); Bottom: Very Bad (left), Total Bad (right)

Figure 15 summarizes the results of this setup (illustration of Tables 3 and 4). Figure 15 and Table 3 prove that the position error is significantly higher under worse lighting conditions compared to good conditions: the position error under worse conditions is about 4 times higher. Rather unexpected is the result that the position error under bad conditions is higher than the error under
very bad or total bad conditions. Under bad conditions the Fourier Tracking is often activated and deactivated and both positions are fused following formulas 4.1 and 4.2. It can be concluded that a problem with the data fusion still exists and the parameters need to be tuned further. It might be that wrong ADNS sensor values are used or that an error is incorporated into the system because of a wrong synchronisation of both systems, as both sensors are running on different processors with different sample times and delays before their data fusion. This problem becomes more intricate as the Fourier Tracking system dynamically activates and deactivates. These effects were taken into account, but further tuning of the data fusion under changing light conditions is required, or the presented solution is not fully suitable. Despite this error the system performed better under bad lighting conditions than under the worse lighting conditions (very bad and total bad), because of the following two facts.

Figure 15. Effect of Different Lighting Conditions on Position Error and Standard Deviation

The entries of Table 3 containing an 'X' represent experiments which could not be finished properly, because the position system failed in a way that the quadrotor left the test section. This happened one time under very bad and one time under total bad conditions. In these cases the Fourier Tracking also failed (compare Fig. 13). The experiment was then repeated and it worked in all other cases.

Table 3. Position Errors under different (difficult) lighting conditions, P = √(X² + Y²)

Errors [cm]   Trial 1        Trial 2        Trial 3        Trial 4        Trial 5        Mean
              X    Y    P    X    Y    P    X    Y    P    X    Y    P    X    Y    P     P
Good         -1  -13   13   -3   -3    4    1   -2    2   -5    0    5    2    5    6     6
Bad         -47  -44   64  -31   -6   31  -48  -32   58  -28  -32   43   -2  -15   15    35
Very Bad      X    X    X   -8  -17   18   -4  -67   67  -12  -21   24   -6  -11   13    23
Total Bad     8  -35   36    2   -8    8    X    X    X  -16   13   20  -43  -67   80    28
Besides this, the quadrotor had more problems to fly calmly as the lighting conditions became worse. This is shown in Table 4 by the increase of the standard deviation of the position (OTS). The reason for this difficulty is that with worse conditions the ADNS failed more and more, until it failed totally and the controller had to rely on the low-sampled Fourier Tracking only. To prevent instability, the control parameters for both cases are set very similar; this might be tuned further.

Table 4. Standard deviation under different (difficult) lighting conditions

Standard deviation [cm]   Trial 1      Trial 2      Trial 3      Trial 4      Trial 5     Mean
                          X     Y      X     Y      X     Y      X     Y      X     Y
Good                      6    19     16    18     12    13     21    18     11    13     14.7
Bad                      19    29     12    14     23    24     18    20     26    15     19.6
Very Bad                 20    26     23    20     24    46     15    25     16    23     23.8
Total Bad                28    36     19    19     24    22     28    22     24    29     25.1

Furthermore, a switch in the lighting conditions also introduces difficulties for the control, which is shown in Figure 16. In this experiment the conditions have been changed every 10-20 seconds in the following procedure: good - bad - very bad - bad - good.

Figure 16. System behaviour and performance under changing lighting conditions (Fourier Tracking Off: FA = 0; Fourier Tracking On: FA = 50)

It can clearly be seen that the system becomes fitful under worsening conditions and even more so after the light brightens up again. This experiment also shows that the position system can handle these critical changes in the lighting conditions.

7. CONCLUSION AND PERSPECTIVE

The evaluation proved that the system is capable of fully autonomous directed flight and that the presented complementary vision based data fusion is sufficient for controlling a quadrotor in an autonomous flight. Even autonomous position hold under very bad lighting conditions is possible. In spite of this, there is still space for optimization, and the accuracy as well as the reliability of the system need to be further improved. The control accuracy can be improved by updating the position during rotation. Position errors can be reduced by optimizing the data fusion, its parameters or the concept, most likely by reducing the weight of the ADNS under erroneous
conditions. Still, it has been shown that even under bad lighting conditions the ADNS improves the control behaviour. However, the reliability of the system can only be significantly improved by adding other, non-optical positioning sensors like radar or ultrasonic sensors, since optical sensors inherently depend on light. For outdoor applications, GPS would also be possible. The directed flight can now be combined with other optical systems, like a PMD camera or camera-based stereo-optical distance determination, to fly fully autonomously through narrow openings like windows or doors.

ACKNOWLEDGEMENTS

The authors would like to thank Diana Baeva and Qasim Ali for reviewing this paper. This work was funded by the IHK Würzburg-Schweinfurt and the Universitätsbund Würzburg. This publication was funded by the German Research Foundation (DFG) and the University of Wuerzburg in the funding programme Open Access Publishing.

REFERENCES

[1] Kendoul F. et al., Optical-flow based vision system for autonomous 3D localization and control of small aerial vehicles, Robotics and Autonomous Systems, 2009, Elsevier
[2] Herisse B. et al., Hovering flight and vertical landing control of a VTOL Unmanned Aerial Vehicle using Optical Flow, 2008 IEEE International Conference on Intelligent Robots and Systems
[3] Reinthal E., Positionsbestimmung eines autonomen Quadrokopters durch Bildverarbeitung, 2014, BA Thesis, University of Wuerzburg
[4] Gageik N., Autonomous UAV with Optical Flow Sensor for Positioning and Navigation, 2013, International Journal of Advanced Robotic Systems, INTECH
[5] Averbuch A. and Keller Y., A Unified Approach to FFT Based Image Registration, 2002, Tel Aviv University
[6] Reddy B. S. and Chatterji B. N., An FFT-Based Technique for Translation, Rotation, and Scale-Invariant Registration, 1996, IEEE Transactions on Image Processing, vol. 5, no. 8
[7] Arens T. et al., Mathematik, 2008, Heidelberg, Spektrum Akademischer Verlag
[8] Jähne B., Practical Handbook on Image Processing for Scientific Applications, 1997, Boca Raton, CRC Press LLC
[9] Lucas B. and Kanade T., An iterative image registration technique with an application to stereo vision, Proceedings of the International Joint Conference on Artificial Intelligence, 1981, pp. 674-679
[10] Srinivasan M., An image-interpolation technique for the computation of optic flow and egomotion, Biological Cybernetics, 1994, Springer-Verlag
[11] Strohmeier M., Implementierung und Evaluierung einer Positionsregelung unter Verwendung des optischen Flusses, 2012, BA Thesis, University of Wuerzburg
[12] ADNS-3080 High-Performance Optical Mouse Sensor, Data Sheet, Avago Technologies, www.avagotech.com
[13] Gageik N., Rothe J. and Montenegro S., Data Fusion Principles for Height Control and Autonomous Landing of a Quadrocopter, UAVweek 2012
[14] Qt Project, qt.digia.com
[15] NaturalPoint, OptiTrack, www.naturalpoint.com/optitrack/
[16] Ascending Technologies, Research Price List, 2013, Krailling, Germany, www.asctec.de
[17] Shen S., Autonomous Multi-Floor Indoor Navigation with a Computationally Constrained MAV, International Conference on Robotics and Automation, 2011, Shanghai, IEEE
AUTHORS

Dipl.-Ing. Nils Gageik is working as a research assistant and PhD student at the Chair of Aerospace Information Technology at the University of Wuerzburg. He received his diploma in Computer Engineering from RWTH Aachen University in 2010.

B.Sc. Eric Reinthal is a Master student in the international SpaceMaster programme. He received his bachelor degree in 2014 at the University of Wuerzburg.

B.Sc. Paul Benz is a Master student at the University of Wuerzburg. He received his bachelor degree in 2013 at the University of Wuerzburg.

Prof. Dr. Sergio Montenegro is the holder of the Chair of Aerospace Information Technology at the University of Wuerzburg.