Wavelet based Image Fusion
Term Paper
Report
CE 672
Machine Data Processing of Remotely Sensed Data
Submitted by: -
Umed Paliwal
10327774
Abstract
The objective of image fusion is to combine information from multiple images of the same
scene. The result of image fusion is a new image which is more suitable for human and
machine perception or further image-processing tasks such as segmentation, feature
extraction and object recognition. Different fusion methods have been proposed in literature,
including multiresolution analysis. This paper is on image fusion based on wavelet
decomposition, i.e. a multiresolution image fusion approach. We can fuse images with the
same or different resolution level, i.e. range sensing, visual CCD, infrared, thermal or
medical. Over the past decade, a significant amount of research has been conducted
concerning the application of wavelet transforms in image fusion. In this paper, an
introduction to wavelet transform theory and an overview of image fusion technique are
given. The results from wavelet-based methods can also be improved by applying more
sophisticated models for injecting detail information; however, these schemes often have
greater set-up requirements.
Contents
1. Introduction
2. Wavelet transform theory
   2.1. Wavelet family
      2.1.1. Wavelet functions
      2.1.2. Scaling functions
   2.2. Wavelet transforms
      2.2.1. Continuous wavelet transform
      2.2.2. Discrete wavelet transform
         2.2.2.1. Decimated
         2.2.2.2. Undecimated
         2.2.2.3. Non-separated
   2.3. Image fusion
      2.3.1. Image fusion scheme
3. Methodology
4. Results
5. Conclusion
6. References
7. MATLAB Code
Table of Figures
Figure 2.1: Three-level one-dimensional discrete wavelet transform
Figure 2.2: Image at decomposition levels 1 and 2
Figure 3.1: Flowchart of the methodology
Figure 4.1: The design of the user interface
Figure 5.1: Fusion example
1. Introduction
It is often desirable to fuse images from different sources, acquired at different times, or
otherwise having different characteristics. There are various methods that have been
developed to perform image fusion. The standard image fusion techniques, such as those that
use IHS, PCA, and Brovey transforms, however, can often produce poor results, at least in
comparison with the ideal output of the fusion. New approaches, or improvements on
existing approaches, are regularly being proposed that address particular problems with the
standard techniques. Most recently, the potential benefits of wavelet-based image fusion
methods have been explored in a variety of fields and for a variety of purposes.
Wavelet theory has developed since the beginning of the last century. It was first applied to
signal processing in the 1980's, and over the past decade it has been recognized as having
great potential in image processing applications. Wavelet transforms are essentially
extensions of the idea of high pass filtering. In visual terms, image detail is a result of high
contrast between features, for example a light rooftop and dark ground, and high contrast in
the spatial domain corresponds to high values in the frequency domain. Frequency
information can be extracted by applying Fourier transforms, however it is then no longer
associated with any spatial information. Wavelet transforms can therefore be more useful
than Fourier transforms, since they are based on functions that are localized in both space
and frequency. The detail information that is extracted from one image using wavelet
transforms can be injected into another image using one of a number of methods, for example
substitution, addition, or a selection method based on either frequency or spatial context.
Furthermore, the wavelet function used in the transform can be designed to have specific
properties that are useful in the particular application of the transform.
Experiments with wavelet-based fusion schemes have, for the most part, produced positive
results, although there are some negative aspects, such as the introduction of artifacts in the
fused image when decimated algorithms are used. In earlier studies, wavelet-based schemes
were generally assessed in comparison to standard schemes; more recent studies propose
hybrid schemes, which use wavelets to extract the detail information from one image and
standard image transformations to inject it into another image, or propose improvements in
the method of injecting information. These approaches seem to achieve better results than
either the standard image fusion schemes (e.g. IHS, PCA) or standard wavelet-based image
fusion schemes (e.g. substitution, addition); however, they involve greater computational
complexity.
Wavelet theory and wavelet analysis form a relatively recent branch of mathematics. The first,
and simplest, wavelet was developed by Alfred Haar in 1909. The Haar wavelet belongs to
the group of wavelets known as Daubechies wavelets, which are named after Ingrid
Daubechies, who proved the existence of wavelet families whose scaling functions have
certain useful properties, namely compact support over an interval, at least one
non-vanishing moment, and orthogonal translates. Because of its simplicity, the Haar wavelet is
useful for illustrating the basic concepts of wavelet theory but has limited utility in
applications.
\phi_{\mathrm{Haar}}(x) = \begin{cases} 1, & 0 \le x < 1 \\ 0, & \text{otherwise} \end{cases} \qquad (1)

\psi_{\mathrm{Haar}}(x) = \begin{cases} 1, & 0 \le x < 1/2 \\ -1, & 1/2 \le x < 1 \\ 0, & \text{otherwise} \end{cases} \qquad (2)
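The Haar pair in Eqs. (1) and (2) can be inspected numerically. The following is a minimal sketch assuming the MATLAB Wavelet Toolbox function wavefun, which approximates the scaling and wavelet functions by cascade iteration:

[phi, psi, xval] = wavefun('haar', 8);   % 8 refinement iterations
subplot(1,2,1); plot(xval, phi); title('Haar scaling function, Eq. (1)');
subplot(1,2,2); plot(xval, psi); title('Haar wavelet, Eq. (2)');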
Over the past decade, there has been an increasing amount of research into the applications of
wavelet transforms to remote sensing, particularly in image fusion. It has been found that
wavelets can be used to extract detail information from one image and inject it into another,
since this information is contained in high frequencies and wavelets can be used to select a set
of frequencies in both time and space. The resulting merged image, which can in fact be a
combination of any number of images, contains the best characteristics of all the original
images.
2. Wavelet Transform Theory
2.1. Wavelet family
Wavelets can be described in terms of two groups of functions: wavelet functions and scaling
functions. It is also common to refer to them as families: the wavelet function is the “mother”
wavelet, the scaling function is the “father” wavelet, and transformations of the parent
wavelets are “daughter” and “son” wavelets.
2.1.1. Wavelet functions
Generally, a wavelet family is described in terms of its mother wavelet, denoted as ψ(x). The
mother wavelet must satisfy certain conditions to ensure that its wavelet transform is stably
invertible. These conditions are:
\int_{-\infty}^{\infty} \lvert \psi(x) \rvert^{2}\, dx = 1 \qquad (3)

\int_{-\infty}^{\infty} \lvert \psi(x) \rvert\, dx < \infty \qquad (4)

\int_{-\infty}^{\infty} \psi(x)\, dx = 0 \qquad (5)
The conditions specify that the function must be an element of L2(R), and in fact must have
normalized energy, that it must be an element of L1(R), and that it have zero mean. The third
condition allows the addition of wavelet coefficients without changing the total flux of the
signal. Other conditions might be specified according to the application. For example, the
wavelet function might need to be continuous, or continuously differentiable, or it might
need to have compact support over a specific interval, or a certain number of vanishing
moments. Each of these conditions affects the results of the wavelet transform.
To apply a wavelet function, it must be scaled and translated. Generally, a normalization
factor is also applied so that the daughter wavelet inherits all of the properties of the mother
wavelet. A daughter wavelet \psi_{a,b}(x) is defined by the equation

\psi_{a,b}(x) = \lvert a \rvert^{-1/2}\, \psi\!\left( \frac{x-b}{a} \right) \qquad (6)
where a, b ∈ R and a ≠ 0; a is called the scaling or dilation factor and b is called the translation
factor. In most practical applications it is necessary to place limits on the values of a and b. A
common choice is a = 2^{-j} and b = 2^{-j}k, where j and k are integers. The resulting equation is

\psi_{j,k}(x) = 2^{j/2}\, \psi(2^{j}x - k) \qquad (7)
This choice of dilation and translation factors is called dyadic sampling. Changing j by one
corresponds to changing the dilation by a factor of two, and changing k by one corresponds
to a shift of 2^{-j}.
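As an illustration of Eqs. (6) and (7), the sketch below builds dyadic daughter wavelets from the Haar mother wavelet of Eq. (2) in plain MATLAB; the function-handle names are chosen only for this example:

psi    = @(x) (x >= 0 & x < 0.5) - (x >= 0.5 & x < 1);   % Haar mother wavelet, Eq. (2)
psi_jk = @(x, j, k) 2^(j/2) * psi(2^j * x - k);          % dyadic daughter wavelet, Eq. (7)
x = linspace(-1, 2, 1000);
plot(x, psi_jk(x,0,0), x, psi_jk(x,1,0), x, psi_jk(x,1,1));
legend('j=0, k=0', 'j=1, k=0', 'j=1, k=1');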
2.1.2. Scaling functions
In discrete wavelet transforms, a scaling function, or father wavelet, is needed to cover the
low frequencies. If the mother wavelet is regarded as a high pass filter then the father
wavelet, denoted as \phi(x), should be a low pass filter. To ensure that this is the case, its
zeroth moment cannot vanish. It is useful to specify that, in fact, the father wavelet have a
zeroth moment, or mean, equal to one:

\int_{-\infty}^{\infty} \phi(x)\, dx = 1 \qquad (8)
Multiresolution analysis makes use of a closed and nested sequence of subspaces, which is
dense in L^2(R): each subsequent subspace is at a higher resolution and contains all the
subspaces at lower resolutions. Since the father wavelet is in V_0, it, as well as the mother
wavelet, can be expressed as a linear combination of the basis functions \phi_{1,k}(x) of V_1:

\psi(x) = \sum_{k} h_k\, \phi_{1,k}(x) \qquad (9)

\phi(x) = \sum_{k} l_k\, \phi_{1,k}(x) \qquad (10)

where the h_k are the high-pass (wavelet) filter coefficients and the l_k are the low-pass
(scaling) filter coefficients.
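For the Haar case the two-scale relation of Eq. (10) has l_0 = l_1 = 1/sqrt(2); these coefficient values are the standard Haar ones, stated here as an assumption since the text does not list them. A plain-MATLAB sketch verifying the relation on a grid:

phi  = @(x) double(x >= 0 & x < 1);        % Haar father wavelet
phi1 = @(x, k) sqrt(2) * phi(2*x - k);     % basis functions of V_1
x = linspace(-0.5, 1.5, 1000);
lhs = phi(x);
rhs = (1/sqrt(2))*phi1(x,0) + (1/sqrt(2))*phi1(x,1);   % right side of Eq. (10)
max(abs(lhs - rhs))                        % returns 0: the relation holds pointwise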
2.2. Wavelet transforms
Wavelet transforms provide a framework in which a signal is decomposed, with each level
corresponding to a coarser resolution, or lower frequency band. There are two main groups
of transforms, continuous and discrete. Discrete transforms are more commonly used and can
be subdivided into various categories. Although a review of the literature produces a number
of different names and approaches for wavelet transformations, most fall into one of the
following three categories: decimated, undecimated, and non-separated.
2.2.1. Continuous wavelet transform
A continuous wavelet transform is performed by taking the inner product of the signal with
the dilated and translated wavelet functions. The dilation and translation factors are elements of the real line. For a
particular dilation a and translation b, the wavelet coefficient Wf(a,b) for a signal f can be
calculated as
W_f(a,b) = \langle f, \psi_{a,b} \rangle = \int_{-\infty}^{\infty} f(x)\, \psi_{a,b}^{*}(x)\, dx \qquad (11)
Wavelet coefficients represent the information contained in a signal at the corresponding
dilation and translation. The original signal can be reconstructed by applying the inverse
transform:
f(x) = \frac{1}{C_w} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} W_f(a,b)\, \psi_{a,b}(x)\, db\, \frac{da}{a^2} \qquad (12)
where Cw is the normalization factor of the mother wavelet. Although the continuous wavelet
transform is simple to describe mathematically, both the signal and the wavelet function
must have closed forms, making it difficult or impractical to apply. The discrete wavelet
transform is used instead.
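For a sampled signal, Eq. (11) can instead be evaluated by numerical integration. The sketch below computes a single coefficient Wf(a,b) in plain MATLAB for the real Haar wavelet (so the complex conjugate can be dropped); the signal is an arbitrary example:

psi    = @(x) (x >= 0 & x < 0.5) - (x >= 0.5 & x < 1);  % Haar mother wavelet
f      = @(x) sin(2*pi*x);                              % example signal
a = 0.5; b = 0.25;                                      % dilation and translation
psi_ab = @(x) abs(a)^(-1/2) * psi((x - b)/a);           % daughter wavelet, Eq. (6)
x  = linspace(0, 2, 10000);
Wf = trapz(x, f(x) .* psi_ab(x))                        % wavelet coefficient Wf(a,b), Eq. (11)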
2.2.2. Discrete wavelet transform
The term discrete wavelet transform (DWT) is a general term, encompassing several different
methods. It must be noted that the signal itself is continuous; discrete refers to discrete sets of
dilation and translation factors and discrete sampling of the signal. For simplicity, it will be
assumed that the dilation and translation factors are chosen so as to have dyadic sampling,
but the concepts can be extended to other choices of factors.
At a given scale J, a finite number of translations are used in applying multiresolution
analysis to obtain a finite number of scaling and wavelet coefficients. The signal can be
represented in terms of these coefficients as
f(x) = \sum_{k} c_{Jk}\, \phi_{Jk}(x) + \sum_{j=1}^{J} \sum_{k} d_{jk}\, \psi_{jk}(x) \qquad (13)
where the c_{Jk} are the scaling coefficients and the d_{jk} are the wavelet coefficients. The
first term in Eq. (13) gives the low-resolution approximation of the signal while the second
term gives the detail information at resolutions from the original down to the current
resolution J. The
process of applying the DWT can be represented as a bank of filters. At each level of
decomposition, the signal is split into high frequency and low frequency components; the low
frequency components can be further decomposed until the desired resolution is reached.
When multiple levels of decomposition are applied, the process is referred to as
multiresolution decomposition. In practice when wavelet decomposition is used for image
fusion, one level of decomposition can be sufficient, but this depends on the ratio of the
spatial resolutions of the images being fused (for dyadic sampling, a 1:2 ratio is needed).
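In toolbox terms, Eq. (13) corresponds to the coefficient vector returned by a multilevel decomposition. A minimal sketch, assuming the MATLAB Wavelet Toolbox routines wavedec, appcoef, detcoef and waverec:

s = rand(1, 256);                          % example signal
[C, L] = wavedec(s, 3, 'db1');             % three-level decomposition (cf. Fig. 2.1)
cA3 = appcoef(C, L, 'db1', 3);             % scaling coefficients c_Jk at the coarsest level
dD1 = detcoef(C, L, 1);                    % wavelet coefficients d_jk at the finest level
max(abs(waverec(C, L, 'db1') - s))         % exact reconstruction, on the order of 1e-15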
2.2.2.1. Decimated
The conventional DWT can be applied using either a decimated or an undecimated
algorithm. In the decimated algorithm, the signal is down-sampled after each level of
transformation. In the case of a two-dimensional image, downsampling is performed by
keeping one out of every two rows and columns, making the transformed image one quarter
of the original size and half the original resolution. The decimated algorithm can therefore be
represented visually as a pyramid, where the spatial resolution becomes coarser as the image
becomes smaller. Further discussion of the DWT will be primarily with respect to
two-dimensional images, keeping in mind that the concepts can be simplified to the
one-dimensional case.
The wavelet and scaling filters are one-dimensional, necessitating a two-stage process for
each level in the multiresolution analysis: the filtering and downsampling are first applied to
the rows of the image and then to its columns. This produces four images at the lower
resolution, one approximation image and three wavelet coefficient, or detail, images. A, HD,
VD, and DD are the sub-images produced after one level of transformation. The A sub-image
is the approximation image and results from applying the scaling or low-pass filter to both
rows and columns. A subsequent level of transformation would be applied only to this sub-
image. The HD sub-image contains the horizontal details (from low-pass on rows, high-pass
on columns), the VD sub-image contains the vertical details (from high-pass on rows,
low-pass on columns) and the DD sub-image contains the diagonal details (from high-pass, or
wavelet filter, on both rows and columns).
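One level of this row/column filtering is available as dwt2 in the MATLAB Wavelet Toolbox. The sketch below assumes that toolbox plus the Image Processing Toolbox test image cameraman.tif, and displays the four sub-images described above:

X = im2double(imread('cameraman.tif'));    % example grayscale image
[A, HD, VD, DD] = dwt2(X, 'db1');          % one decimated decomposition level
subplot(2,2,1); imshow(A, []);  title('A');
subplot(2,2,2); imshow(HD, []); title('HD');
subplot(2,2,3); imshow(VD, []); title('VD');
subplot(2,2,4); imshow(DD, []); title('DD');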
Fig. 2.1. Three-level one-dimensional discrete wavelet transform
Fig. 2.2. a) Image at first decomposition level; b) image at second decomposition level
The decimated algorithm is not shift-invariant, which means that it is sensitive to shifts of the
input image. The decimation process also has a negative impact on the linear continuity of
spatial features that do not have a horizontal or vertical orientation. These two factors tend to
introduce artifacts when the algorithm is used in applications such as image fusion.
Image source: G. Pajares, J.M. de la Cruz, A wavelet-based image fusion tutorial, Pattern
Recognition, Vol. 37, 2004, 1855-1872.
2.2.2.2. Undecimated.
The undecimated algorithm addresses the problem of shift variance. It does so by suppressing
the down-sampling step of the decimated algorithm and instead up-sampling the filters by
inserting zeros between the filter coefficients. Algorithms in which the filter is up-sampled
are called “à trous”, meaning “with holes”. As with the decimated algorithm, the filters are
applied first to the rows and then to the columns. In this case, however, although the four
images produced (one approximation and three detail images) are at half the resolution of the
original, they are the same size as the original image. The approximation images from the
undecimated algorithm are therefore represented as levels in a parallelepiped, with the
spatial resolution becoming coarser at each higher level and the size remaining the same. The
undecimated algorithm is redundant, meaning some detail information may be retained in
adjacent levels of transformation. It also requires more space to store the results of each level
of transformation and, although it is shift-invariant, it does not resolve the problem of feature
orientation.
A previous level of approximation, resolution J-1, can be
reconstructed exactly by applying the inverse transform to all four images at resolution J and
combining the resulting images. Essentially, the inverse transform involves the same steps as
the forward transform, but they are applied in the reverse order. In the decimated case, this
means up-sampling the approximation and detail images and applying reconstruction filters,
which are inverses of the decomposition scaling and wavelet filters, first by columns and then
by rows. For example, first the columns of the VD image would be up-sampled and the
inverse scaling filter would be applied, then the rows would be up-sampled and the inverse
wavelet filter would be applied. The original image is reconstructed by applying the inverse
transform to each deconstructed level in turn, starting from the level at the coarsest
resolution, until the original resolution is reached. Reconstruction in the undecimated case is
similar, except that instead of up-sampling the images, the filters are down-sampled before
each application of the inverse filters.
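The undecimated transform and its exact inverse can be exercised with the stationary wavelet transform routines swt2 and iswt2, assuming the MATLAB Wavelet Toolbox; note that the image side length must be divisible by 2^N for N levels:

X = im2double(imread('cameraman.tif'));    % 256 x 256, so two levels are allowed
[A, H, V, D] = swt2(X, 2, 'db1');          % all sub-images keep the original size
Xrec = iswt2(A, H, V, D, 'db1');           % inverse transform
max(abs(Xrec(:) - X(:)))                   % exact reconstruction, on the order of 1e-15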
Shift-invariance is necessary in order to compare and combine wavelet coefficient images.
Without shift invariance, slight shifts in the input signal will produce variations in the
wavelet coefficients that might introduce artifacts in the reconstructed image. Shift variance
is caused by the decimation process, and can be resolved by using the undecimated
algorithm. However, the other problem with standard discrete wavelet transforms is the poor
directional selectivity, meaning poor representation of features with orientations that are not
horizontal or vertical, which is a result of separate filtering in these directions (Kingsbury,
1999).
2.2.2.3. Non-separated.
One approach for dealing with shift variance is to use a non-separated, two-dimensional
wavelet filter derived from the scaling function. This produces only two images, one
approximation image, also called the scale frame, and one detail image, called the wavelet
plane. The wavelet plane is computed as the difference between the original and the
approximation images and contains all the detail lost as a result of the wavelet
decomposition. As with the undecimated DWT, a coarser approximation is achieved by
upsampling the filter at each level of decomposition; correspondingly, the filter is
down-sampled at each level of reconstruction. Some redundancy between adjacent levels of
decomposition is possible in this approach, but since it is not decimated, it is shift-invariant,
and since it does not involve separate filtering in the horizontal and vertical directions, it
better preserves feature orientation.
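A minimal sketch of one non-separated decomposition level follows; the B3-spline smoothing kernel is a common "à trous" choice but is an assumption here, since the text does not fix a particular filter:

h = [1 4 6 4 1] / 16;                      % 1-D B3-spline filter
H = h' * h;                                % applied as a single 2-D kernel
X = im2double(imread('cameraman.tif'));
A = conv2(X, H, 'same');                   % scale frame (approximation image)
W = X - A;                                 % wavelet plane (detail image)
max(abs(X(:) - (A(:) + W(:))))             % reconstruction is exact by construction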
2.3 Image fusion
The objective of image fusion is to produce a single image containing the best aspects of the
fused images. Some desirable aspects include high spatial resolution and high spectral
resolution (multispectral and panchromatic satellite images), areas in focus (microscopy
images), functional and anatomic information (medical images), different spectral
information (optical and infrared images), or colour information and texture information
(multispectral and synthetic aperture radar images).
2.3.1 Image Fusion Scheme
The wavelet transform contains the low-high, high-low and high-high bands of the image at
different scales, plus the low-low band of the image at the coarsest level. Except for the
low-low band, which has all positive transform values, the other bands contain values that
fluctuate around zero; the larger (absolute) transform values in these bands correspond to
sharper brightness changes and thus to the salient features in the image such as edges, lines,
and region boundaries. Therefore, a good integration rule is to select the larger (in absolute
value) of the two wavelet coefficients at each point. Subsequently, a composite image is
constructed by performing an inverse wavelet transform on the combined transform coefficients.
Since the wavelet transform provides both spatial and frequency domain localization, the
effect of the maximum fusion rule can be illustrated in the following two scenarios. If the same
object appears more distinctly (in other words, with better contrast) in image A than in image
B, after fusion the object from image A will be preserved. In a different scenario, suppose the
outer boundary of the object appears more distinctly in image A while the inner boundary of
the object appears more distinctly in image B. As a result, the object in image A looks visually
larger than the corresponding object in image B. In this case the wavelet transform
coefficients of the object in images A and B will be dominant at the different resolution levels.
Based on the maximum selection rule, both the outer structure from image A and the inner
structure from image B will be preserved in the fused image.
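The maximum selection rule can be sketched for one decomposition level with dwt2/idwt2 (MATLAB Wavelet Toolbox); averaging the low-low band is an assumption for this sketch, since the rule above applies to the detail bands:

X1 = im2double(imread('image1.jpg'));      % registered source images
X2 = im2double(imread('image2.jpg'));
[A1,H1,V1,D1] = dwt2(X1, 'db1');
[A2,H2,V2,D2] = dwt2(X2, 'db1');
pick = @(a,b) a.*(abs(a) >= abs(b)) + b.*(abs(a) < abs(b));   % larger absolute coefficient
XF = idwt2((A1+A2)/2, pick(H1,H2), pick(V1,V2), pick(D1,D2), 'db1');
imshow(XF);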
3. Methodology
The steps involved in fusing two images through the wavelet transform are listed below; a
single-call sketch using the toolbox fusion routine follows the flowchart.
1) Get the images to be fused
2) Apply the wavelet transform on both the images through chosen wavelet at the
desired level
3) Get the approximation and detail coefficients for both the images
4) Merge the coefficients by desired fusion rule
5) Apply Inverse discrete wavelet transform on the merged coefficients and get the
fused image
Fig.3.1 Flowchart of the Methodology
Image source: H. Li, B.S. Manjunath, S.K. Mitra, Multisensor Image Fusion Using the
Wavelet Transform, Graphical Models and Image Processing, Vol. 57, No. 3, 1995, 235-245.
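The five steps collapse into a single call when the Wavelet Toolbox routine wfusimg, which the GUI code in Section 7 also uses, is available; the wavelet name, level, and fusion rules below are example choices:

X1 = imread('image1.jpg');
X2 = imread('image2.jpg');
XFUS = wfusimg(X1, X2, 'db1', 2, 'max', 'max');   % wavelet, level, approx. rule, detail rule
imshow(XFUS, []);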
4. Results
Based on the methodology stated above, a MATLAB program with a graphical user interface
has been written to fuse the images.
4.1 User Interface
4.1.1 Wavelets
A choice of six wavelets is given to the user:
1) Daubechies – db1
2) Coiflets – coif1
3) Symlets – sym2
4) Discrete Meyer – dmey
5) Biorthogonal – bior1.1
6) Reverse biorthogonal – rbio1.1
The user can select up to 10 decomposition levels.
4.1.2 Fusion Method
The user has six options for the fusion method, for both the approximation coefficients and
the detail coefficients: maximum, minimum, mean, Image 1, Image 2, and random. These
merge the two approximation or detail structures obtained from Images 1 and 2 element-wise
by taking the maximum, the minimum, the mean, the first element, the second element, or a
randomly chosen element, respectively.
Fig. 4.1 The design of the user interface
Shown below are the input images and the output fused image. As can be seen, in image 1 the
left portion of the image is blurred and in image 2 the middle portion is blurred. The output
image is much better than both input images.
Fig. 5.1. a) and b) Input images; c) fused image using the biorthogonal wavelet transform,
four levels of decomposition, minimum rule for approximation coefficients and maximum
rule for detail coefficients.
5. Conclusions
Wavelet transforms isolate frequencies in both time and space, allowing detail information to
be easily extracted from satellite imagery. A number of different schemes have been proposed
to inject this detail information into multispectral imagery, ranging from simple substitution
to complex formulas based on the statistical properties of the imagery. While even the
simplest wavelet-based fusion scheme tends to produce better results than standard fusion
schemes such as IHS and PCA, further improvement is evident with more sophisticated
wavelet-based fusion schemes. The drawback is that there is greater computational
complexity, and often parameters must be set up before the fusion scheme can be applied.
Another strategy for improving the quality of results is to combine a standard fusion scheme
with a wavelet-based fusion scheme; however, this approach also has limitations: IHS, for instance, can
only be applied to three bands at a time. The type of algorithm that is used to apply the
wavelet transform can also affect the quality of the result: decimation disturbs the linear
continuity of spatial features and introduces artifacts in the fused image, whereas
non-decimated algorithms require more memory space during processing but do not introduce as
many artifacts. Each wavelet-based fusion scheme has its own set of advantages and
limitations. More comprehensive testing is required in order to assess fully under what
conditions each one is most appropriate.
6. References
G. Pajares, J.M. de la Cruz, A wavelet-based image fusion tutorial, Pattern Recognition,
Vol. 37, 2004, 1855-1872.
V.P.S. Naidu, J.R. Raol, Pixel-level image fusion using wavelets and principal component
analysis, Defence Science Journal, Vol. 58, No. 3, May 2008, 338-352.
E.J. Stollnitz, T.D. DeRose, D.H. Salesin, Wavelets for Computer Graphics: Theory and
Applications, Morgan Kaufmann Publishers Inc.
H. Li, B.S. Manjunath, S.K. Mitra, Multisensor Image Fusion Using the Wavelet Transform,
Graphical Models and Image Processing, Vol. 57, No. 3, 1995, 235-245.
7. MATLAB Code
function varargout = wavelet(varargin)
% WAVELET MATLAB code for wavelet.fig
%   WAVELET, by itself, creates a new WAVELET or raises the existing
%   singleton*.
%
%   H = WAVELET returns the handle to a new WAVELET or the handle to
%   the existing singleton*.
%
%   WAVELET('CALLBACK',hObject,eventData,handles,...) calls the local
%   function named CALLBACK in WAVELET.M with the given input arguments.
%
%   WAVELET('Property','Value',...) creates a new WAVELET or raises the
%   existing singleton*. Starting from the left, property value pairs are
%   applied to the GUI before wavelet_OpeningFcn gets called. An
%   unrecognized property name or invalid value makes property application
%   stop. All inputs are passed to wavelet_OpeningFcn via varargin.
%
%   *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
%   instance to run (singleton)".
%
%   See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help wavelet
% Last Modified by GUIDE v2.5 29-Mar-2014 21:32:17

% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @wavelet_OpeningFcn, ...
                   'gui_OutputFcn',  @wavelet_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT
% --- Executes just before wavelet is made visible.
function wavelet_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to wavelet (see VARARGIN)

% Choose default command line output for wavelet
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes wavelet wait for user response (see UIRESUME)
% uiwait(handles.figure1);
% --- Outputs from this function are returned to the command line.
function varargout = wavelet_OutputFcn(hObject, eventdata, handles)
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;
% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
axes(handles.axes1)       % display the first input image
imshow('image1.jpg');

% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
axes(handles.axes2)       % display the second input image
imshow('image2.jpg');
% --- Executes on selection change in popupmenu1.
function popupmenu1_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = cellstr(get(hObject,'String')) returns popupmenu1
%        contents as cell array
%        contents{get(hObject,'Value')} returns selected item from popupmenu1

% Map the menu selection to a wavelet name for wfusimg
a = get(handles.popupmenu1, 'value');
switch a
    case 1
        handles.x1 = 'db1';
    case 2
        handles.x1 = 'coif1';
    case 3
        handles.x1 = 'sym2';
    case 4
        handles.x1 = 'dmey';
    case 5
        handles.x1 = 'bior1.1';
    case 6
        handles.x1 = 'rbio1.1';
end
guidata(hObject, handles)
% --- Executes during object creation, after setting all properties.
function popupmenu1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end
% --- Executes on selection change in popupmenu2.
function popupmenu2_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = cellstr(get(hObject,'String')) returns popupmenu2
%        contents as cell array
%        contents{get(hObject,'Value')} returns selected item from popupmenu2

% The menu value equals the chosen decomposition level, so store it directly
b = get(handles.popupmenu2, 'value');
handles.x2 = b;
guidata(hObject,handles)
% --- Executes during object creation, after setting all properties.
function popupmenu2_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end
% --- Executes on button press in pushbutton3.
function pushbutton3_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% --- Executes on selection change in popupmenu3.
function popupmenu3_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = cellstr(get(hObject,'String')) returns popupmenu3
%        contents as cell array
%        contents{get(hObject,'Value')} returns selected item from popupmenu3

% Map the menu selection to the fusion rule for the approximation coefficients
c = get(handles.popupmenu3, 'value');
switch c
    case 1
        handles.x3 = 'max';
    case 2
        handles.x3 = 'min';
    case 3
        handles.x3 = 'mean';
    case 4
        handles.x3 = 'img1';
    case 5
        handles.x3 = 'img2';
    case 6
        handles.x3 = 'rand';
end
guidata(hObject, handles)
% --- Executes during object creation, after setting all properties.
function popupmenu3_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end
% --- Executes on selection change in popupmenu4.
function popupmenu4_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = cellstr(get(hObject,'String')) returns popupmenu4
%        contents as cell array
%        contents{get(hObject,'Value')} returns selected item from popupmenu4

% Map the menu selection to the fusion rule for the detail coefficients
d = get(handles.popupmenu4, 'value');
switch d
    case 1
        handles.x4 = 'max';
    case 2
        handles.x4 = 'min';
    case 3
        handles.x4 = 'mean';
    case 4
        handles.x4 = 'img1';
    case 5
        handles.x4 = 'img2';
    case 6
        handles.x4 = 'rand';
end
guidata(hObject, handles)
% --- Executes during object creation, after setting all properties.
function popupmenu4_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end
% --- Executes on button press in pushbutton4.
function pushbutton4_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
p = handles.x1;     % wavelet name
q = handles.x2;     % decomposition level
r = handles.x3;     % fusion rule for the approximation coefficients
s = handles.x4;     % fusion rule for the detail coefficients
X1 = imread('image1.jpg');
X2 = imread('image2.jpg');
XFUS = wfusimg(X1, X2, p, q, r, s)/255;   % rescale to [0,1] for display
axes(handles.axes5)
imshow(XFUS);
More Related Content

PPTX
Wavelet based image fusion
PDF
Image Fusion
PPTX
Underwater imaging
PPTX
IMAGE FUSION IN IMAGE PROCESSING
PPTX
Region based segmentation
PPTX
Hit and-miss transform
PPTX
Histogram Processing
PDF
Optical fiber communication Part 1 Optical Fiber Fundamentals
Wavelet based image fusion
Image Fusion
Underwater imaging
IMAGE FUSION IN IMAGE PROCESSING
Region based segmentation
Hit and-miss transform
Histogram Processing
Optical fiber communication Part 1 Optical Fiber Fundamentals

What's hot (20)

PPTX
Intensity Transformation
PPTX
Image Representation & Descriptors
PDF
8. introduction to small scale fading
PPTX
Watershed Segmentation Image Processing
PPTX
Characterization of Photonic Crystal Fiber
PPTX
Image enhancement
PDF
CV_Chap 6 Motion Representation
PPTX
Image compression standards
PDF
Digital Image Processing: Image Segmentation
PPT
PPT
Digitized images and
PPTX
Point processing
PPT
PPT
image enhancement
PPTX
Optical Fiber connectors
PPSX
Image Enhancement in Spatial Domain
PPTX
Modulation techniques
PDF
CV_1 Introduction of Computer Vision and its Application
PPT
Sharpening using frequency Domain Filter
Intensity Transformation
Image Representation & Descriptors
8. introduction to small scale fading
Watershed Segmentation Image Processing
Characterization of Photonic Crystal Fiber
Image enhancement
CV_Chap 6 Motion Representation
Image compression standards
Digital Image Processing: Image Segmentation
Digitized images and
Point processing
image enhancement
Optical Fiber connectors
Image Enhancement in Spatial Domain
Modulation techniques
CV_1 Introduction of Computer Vision and its Application
Sharpening using frequency Domain Filter
Ad

Viewers also liked (20)

PPTX
Comparison of image fusion methods
PPT
Fusion Imaging Overview
PDF
A New Approach of Medical Image Fusion using Discrete Wavelet Transform
PDF
Image Fusion Based On Wavelet And Curvelet Transform
PPTX
FUSION IMAGING
PPTX
Wavelet
PPTX
discrete wavelet transform
PDF
iaetsd Image fusion of brain images using discrete wavelet transform
DOCX
Multifocus image fusion based on nsct
PDF
Novel image fusion techniques using global and local kekre wavelet transforms
PDF
Ijmet 07 06_005
PDF
Iaetsd a modified image fusion approach using guided filter
PDF
A comparison between scilab inbuilt module and novel method for image fusion
PDF
Multimodality medical image fusion using improved contourlet transformation
PDF
Comparative study on image fusion methods in spatial domain
PPTX
Skin tone based steganography
PPTX
Image compression Algorithms
PPT
Wavelet Based Feature Extraction Scheme Of Eeg Waveform
PPTX
Introduction to wavelet transform
PDF
Multimodal Medical Image Fusion Based On SVD
Comparison of image fusion methods
Fusion Imaging Overview
A New Approach of Medical Image Fusion using Discrete Wavelet Transform
Image Fusion Based On Wavelet And Curvelet Transform
FUSION IMAGING
Wavelet
discrete wavelet transform
iaetsd Image fusion of brain images using discrete wavelet transform
Multifocus image fusion based on nsct
Novel image fusion techniques using global and local kekre wavelet transforms
Ijmet 07 06_005
Iaetsd a modified image fusion approach using guided filter
A comparison between scilab inbuilt module and novel method for image fusion
Multimodality medical image fusion using improved contourlet transformation
Comparative study on image fusion methods in spatial domain
Skin tone based steganography
Image compression Algorithms
Wavelet Based Feature Extraction Scheme Of Eeg Waveform
Introduction to wavelet transform
Multimodal Medical Image Fusion Based On SVD
Ad

Similar to Wavelet based image fusion (20)

PDF
International Journal of Engineering Research and Development (IJERD)
PPTX
PPT Image Analysis(IRDE, DRDO)
PPT
Lec11.ppt
PDF
PDF
Discrete wavelet transform using matlab
PDF
A Review on Image Denoising using Wavelet Transform
PPT
3rd sem ppt for wavelet
PPTX
Curved Wavelet Transform For Image Denoising using MATLAB.
PDF
Wavelet Signal Processing
PDF
Hq3114621465
PDF
R044120124
PDF
50120140504008
PDF
Ijri ece-01-02 image enhancement aided denoising using dual tree complex wave...
PPTX
Wavelet transform in two dimensions
PPTX
Wavelet Based Image Compression Using FPGA
PDF
Different Image Fusion Techniques –A Critical Review
PDF
Comparative Analysis of Dwt, Reduced Wavelet Transform, Complex Wavelet Trans...
PPT
Digital Image Processing_ ch3 enhancement freq-domain
PDF
Applications of Wavelet Transform
PDF
Dk33669673
International Journal of Engineering Research and Development (IJERD)
PPT Image Analysis(IRDE, DRDO)
Lec11.ppt
Discrete wavelet transform using matlab
A Review on Image Denoising using Wavelet Transform
3rd sem ppt for wavelet
Curved Wavelet Transform For Image Denoising using MATLAB.
Wavelet Signal Processing
Hq3114621465
R044120124
50120140504008
Ijri ece-01-02 image enhancement aided denoising using dual tree complex wave...
Wavelet transform in two dimensions
Wavelet Based Image Compression Using FPGA
Different Image Fusion Techniques –A Critical Review
Comparative Analysis of Dwt, Reduced Wavelet Transform, Complex Wavelet Trans...
Digital Image Processing_ ch3 enhancement freq-domain
Applications of Wavelet Transform
Dk33669673

Recently uploaded (20)

PDF
PRIZ Academy - 9 Windows Thinking Where to Invest Today to Win Tomorrow.pdf
PPTX
additive manufacturing of ss316l using mig welding
PPT
Mechanical Engineering MATERIALS Selection
PPTX
FINAL REVIEW FOR COPD DIANOSIS FOR PULMONARY DISEASE.pptx
PPTX
UNIT 4 Total Quality Management .pptx
PPTX
Infosys Presentation by1.Riyan Bagwan 2.Samadhan Naiknavare 3.Gaurav Shinde 4...
PDF
Embodied AI: Ushering in the Next Era of Intelligent Systems
PPTX
Sustainable Sites - Green Building Construction
PDF
TFEC-4-2020-Design-Guide-for-Timber-Roof-Trusses.pdf
PPTX
Recipes for Real Time Voice AI WebRTC, SLMs and Open Source Software.pptx
PPTX
MET 305 2019 SCHEME MODULE 2 COMPLETE.pptx
PPTX
Engineering Ethics, Safety and Environment [Autosaved] (1).pptx
PPTX
OOP with Java - Java Introduction (Basics)
PPTX
Lecture Notes Electrical Wiring System Components
PPTX
Geodesy 1.pptx...............................................
PPTX
IOT PPTs Week 10 Lecture Material.pptx of NPTEL Smart Cities contd
PDF
Operating System & Kernel Study Guide-1 - converted.pdf
PDF
Automation-in-Manufacturing-Chapter-Introduction.pdf
PPTX
CYBER-CRIMES AND SECURITY A guide to understanding
PPTX
UNIT-1 - COAL BASED THERMAL POWER PLANTS
PRIZ Academy - 9 Windows Thinking Where to Invest Today to Win Tomorrow.pdf
additive manufacturing of ss316l using mig welding
Mechanical Engineering MATERIALS Selection
FINAL REVIEW FOR COPD DIANOSIS FOR PULMONARY DISEASE.pptx
UNIT 4 Total Quality Management .pptx
Infosys Presentation by1.Riyan Bagwan 2.Samadhan Naiknavare 3.Gaurav Shinde 4...
Embodied AI: Ushering in the Next Era of Intelligent Systems
Sustainable Sites - Green Building Construction
TFEC-4-2020-Design-Guide-for-Timber-Roof-Trusses.pdf
Recipes for Real Time Voice AI WebRTC, SLMs and Open Source Software.pptx
MET 305 2019 SCHEME MODULE 2 COMPLETE.pptx
Engineering Ethics, Safety and Environment [Autosaved] (1).pptx
OOP with Java - Java Introduction (Basics)
Lecture Notes Electrical Wiring System Components
Geodesy 1.pptx...............................................
IOT PPTs Week 10 Lecture Material.pptx of NPTEL Smart Cities contd
Operating System & Kernel Study Guide-1 - converted.pdf
Automation-in-Manufacturing-Chapter-Introduction.pdf
CYBER-CRIMES AND SECURITY A guide to understanding
UNIT-1 - COAL BASED THERMAL POWER PLANTS

Wavelet based image fusion

  • 1. 1  |  P a g e     Wavelet based Image Fusion Term Paper Report CE 672 Machine Data Processing of Remotely Sensed Data Submitted by: - Umed Paliwal 10327774
  • 2. 2  |  P a g e     Abstract The objective of image fusion is to combine information from multiple images of the same scene. The result of image fusion is a new image which is more suitable for human and machine perception or further image-processing tasks such as segmentation, feature extraction and object recognition. Different fusion methods have been proposed in literature, including multiresolution analysis. This paper is on image fusion based on wavelet decomposition, i.e. a multiresolution image fusion approach. We can fuse images with the same or different resolution level, i.e. range sensing, visual CCD, infrared, thermal or medical. Over the past decade, a significant amount of research has been conducted concerning the application of wavelet transforms in image fusion. In this paper, an introduction to wavelet transform theory and an overview of image fusion technique are given. The results from wavelet-based methods can also be improved by applying more sophisticated models for injecting detail information; however, these schemes often have greater set-up requirements.
  • 3. 3  |  P a g e     Contents 1. Introduction ………………………………………………………………………………....5 2. Wavelet transform theory......................................................................................................6 2.1. Wavelet family.......................................................................................................6 2.1.1. Wavelet functions…………………………………………………6 2.1.2. Scaling functions..............................................................................7 2.2. Wavelet transforms................................................................................. ……….7 2.2.1. Continuous wavelet transform..........................................................7 2.2.2. Discrete wavelet transform.................................................................8 2.2.2.1. Decimated………………………………………………....8 2.2.2.2. Undecimated……………………………………………...9 2.2.2.3. Non-separated…………………………………………....10 2.3 Image Fusion……………………………………………………………………..10 2.3.1 Image Fusion Scheme………………………………………………..11 3. Methodology………………………………………………………………………………...11 4. Results………………………………………………………………………………………..11 5. Conclusion…………………………………………………………………………………...13 6. References……………………………………………………………………………………14 7. MATLAB Code……………………………………………………………………………...15
  • 4. 4  |  P a g e     Table of Figures Figure Title Page No. 2.1 Three-Level one-dimensional discrete wavelet transform 9 2.2 Image at Decomposition Level 1 and 2 9 3.1 Flowchart of the Methodology 11 4.1 The design of user interface 12 5.1 Fusion Example 13
  • 5. 5  |  P a g e     1 . Introduction It is often desirable to fuse images from different sources, acquired at different times, or otherwise having different characteristics. There are various methods that have been developed to perform image fusion. The standard image fusion techniques, such as those that use IHS, PCA, and Brovey transforms, however, can often produce poor results, at least in comparison with the ideal output of the fusion. New approaches, or improvements on existing approaches, are regularly being proposed that address particular problems with the standard techniques. Most recently, the potential benefits of wavelet-based image fusion methods have been explored in a variety of fields and for a variety of purposes. Wavelet theory has developed since the beginning of the last century. It was first applied to signal processing in the 1980's, and over the past decade it has been recognized as having great potential in image processing applications. Wavelet transforms are essentially extensions of the idea of high pass filtering. In visual terms, image detail is a result of high contrast between features, for example a light rooftop and dark ground, and high contrast in the spatial domain corresponds to high values in the frequency domain. Frequency information can be extracted by applying Fourier transforms, however it is then no longer associated with any spatial information. Wavelet transforms can therefore be more useful than Fourier transforms, since they are based on functions that are localized in both space and frequency. The detail information that is extracted from one image using wavelet transforms can be injected into another image using one of a number of methods, for example substitution, addition, or a selection method based on either frequency or spatial context. Further- more, the wavelet function used in the transform can be designed to have specific properties that are useful in the particular application of the transform . Experiments with wavelet-based fusion schemes have, for the most part, produced positive results, although there are some negative aspects, such as the introduction of artifacts in the fused image when decimated algorithms are used. In earlier studies , wavelet- based schemes were generally assessed in comparison to standard schemes; more recent studies propose hybrid schemes, which use wavelets to extract the detail information from one image and standard image transformations to inject it into another image, or propose improvements in the method of injecting information. These approaches seem to achieve better results than either the standard image fusion schemes (e.g. IHS, PCA) or standard wavelet-based image fusion schemes (e.g. substitution, addition); however, they involve greater computational complexity. Wavelet theory and wavelet analysis is a relatively recent branch of mathematics. The first, and simplest, wavelet was developed by Alfred Haar in 1909. The Haar wavelet belongs to the group of wavelets known as Daubechies wavelets, which are named after Ingrid Daubechies, who proved the existence of wavelet families whose scaling functions have certain useful properties, namely compact support over an interval, at least one non- vanishing moment, and orthogonal translates. Because of its simplicity, the Haar wavelet is useful for illustrating the basic concepts of wavelet theory but has limited utility in applications. !Haar(x) = 1,0!x<1 0,otherwise " # $ (1) !Haar(x) = 1,0 ! x <1/ 2 "1,1/ 2 ! x <1 0,otherwise # $ % & % (2)
  • 6. 6  |  P a g e     Over the past decade, there has been an increasing amount of research into the applications of wavelet transforms to remote sensing, particularly in image fusion. It has been found that wavelets can be used to extract detail information from one image and inject it into another, since this information is contained in high frequencies and wavelets can be used to select a set of frequencies in both time and space. The resulting merged image, which can in fact be a combination of any number of images, contains the best characteristics of all the original images. 2. Wavelet Transform Theory 2.1. Wavelet family Wavelets can be described in terms of two groups of functions: wavelet functions and scaling functions. It is also common to refer to them as families: the wavelet function is the “mother” wavelet, the scaling function is the “father” wavelet, and transformations of the parent wavelets are “daughter” and “son” wavelets. 2.1.1. Wavelet functions Generally, a wavelet family is described in terms of its mother wavelet, denoted as Ψ(x). The mother wavelet must satisfy certain conditions to ensure that its wavelet transform is stably invertible. These conditions are: !(x)! 2 dx =1 !(x) dx < "! !(x)dx = 0! 3&4&5 The conditions specify that the function must be an element of L2(R), and in fact must have normalized energy, that it must be an element of L1(R), and that it have zero mean. The third condition allows the addition of wavelet coefficients without changing the total flux of the signal. Other conditions might be specified according to the application. For example, the wavelet function might need to be continuous, or continuously differentiable, or it might need to have compact support over a specific interval, or a certain number of vanishing moments. Each of these conditions affects the results of the wavelet transform. To apply a wavelet function, it must be scaled and translated. Generally, a normalization factor is also applied so that the daughter wavelet inherits all of the properties of the mother wavelet. A daughter wavelet a,b(x) is defined by the equation !a, b(x) = a!1/2 !((x ! b) / a) (6) where a, b ∈ R and a ≠ 0; a is called the scaling or dilation factor and b is called the translation factor. In most practical applications it is necessary to place limits on the values of a and b. A common choice isa = 2—j and b = 2-k where j and k are integers. The resulting equation is !j,k (x) = 21/2 !(2j x ! k) (7) This choice for dilation and translation factors is called a dyadic sampling. Changing j by one corresponds to changing the dilation by a factor of two, and changing k by one corresponds to a shift of 2j.
  • 7. 7  |  P a g e     2.1.2. Scaling functions In discrete wavelet transforms, a scaling function, or father wavelet, is needed to cover the low frequencies. If the mother wavelet is regarded as a high pass filter then the father wavelet, denoted as (x), should be a low pass filter. To ensure that this is the case, it cannot have any vanishing moments. It is useful to specify that, in fact, the father wavelet have a zeroth moment, or mean, equal to one: !(x)dx =1! (8) Multiresolution analysis makes use of a closed and nested sequence of sub- spaces ,which is dense in L2(R) : each subsequent subspace is at a higher resolution and contains all the subspaces at lower resolutions. Since the father wavelet is in V0, it, as well as the mother wavelet, can be expressed as linear combinations of the basis functions for V1, k(x) : !(x) = hk"i,k (x) k ! (9) !(x) = lk!i,k (x) k ! (10) 2.2. Wavelet transforms Wavelet transforms provide a framework in which a signal is decomposed, with each level corresponding to a coarser resolution, or lower frequency band. There are two main groups of transforms, continuous and discrete. Discrete transforms are more commonly used and can be subdivided in various categories. Although a review of the literature produces a number of different names and approaches for wavelet transformations, most fall into one of the following three categories: decimated, undecimated, and non- separated. 2.2.1. Continuous wavelet transform A continuous wavelet transform is performed by applying an inner product to the signal and the wavelet functions. The dilation and translation factors are elements of the real line. For a particular dilation a and translation b, the wavelet coefficient Wf(a,b) for a signal f can be calculated as Wf (a,b) = f,!a,b = f (x)!! a,b (x)dx (11) Wavelet coefficients represent the information contained in a signal at the corresponding dilation and translation. The original signal can be reconstructed by applying the inverse transform: f (x) = 1 Cw Wf (a,b)!a,b (x)db da a2 !" " # (12) where Cw is the normalization factor of the mother wavelet. Although the continuous wavelet transform is simple to describe mathematically, both the signal and the wavelet function must have closed forms, making it difficult or impractical to apply. The discrete wavelet is used instead.
  • 8. 8  |  P a g e     2.2.2. Discrete wavelet transform The term discrete wavelet transform (DWT) is a general term, encompassing several different methods. It must be noted that the signal itself is continuous; discrete refers to discrete sets of dilation and translation factors and discrete sampling of the signal. For simplicity, it will be assumed that the dilation and translation factors are chosen so as to have dyadic sampling, but the concepts can be extended to other choices of factors. At a given scale J, a finite number of translations are used in applying multiresolution analysis to obtain a finite number of scaling and wavelet coefficients. The signal can be represented in terms of these coefficients as f (x) = cJk!Jk (x)+ djk"jk (x) k ! j=1 ! k ! (13) where cjk are the scaling coefficients and djk are the wavelet coefficients. The first term in Eq. (14) gives the low-resolution approximation of the signal while the second term gives the detailed information at resolutions from the original down to the current resolution J. The process of applying the DWT can be represented as a bank of filters. At each level of decomposition, the signal is split into high frequency and low frequency components; the low frequency components can be further decomposed until the desired resolution is reached. When multiple levels of decomposition are applied, the process is referred to as multiresolution decomposition. In practice when wavelet decomposition is used for image fusion, one level of decomposition can be sufficient, but this depends on the ratio of the spatial resolutions of the images being fused (for dyadic sampling, a 1:2 ratio is needed). 2.2.2.1. Decimated The conventional DWT can be applied using either a decimated or an undecimated algorithm. In the decimated algorithm, the signal is down- sampled after each level of transformation. In the case of a two-dimensional image, downsampling is performed by keeping one out of every two rows and columns, making the transformed image one quarter of the original size and half the original resolution. The decimated algorithm can therefore be represented visually as a pyramid, where the spatial resolution becomes coarser as the image becomes smaller Further discussion of the DWT will be primarily with respect to two- dimensional images, keeping in mind that the concepts can be simplified to the one- dimensional case. The wavelet and scaling filters are one-dimensional, necessitating a two-stage process for each level in the multiresolution analysis: the filtering and downsampling are first applied to the rows of the image and then to its columns. This produces four images at the lower resolution, one approximation image and three wavelet coefficient, or detail, images. A, HD, VD, and DD are the sub-images produced after one level of transformation. The A sub-image is the approximation image and results from applying the scaling or low-pass filter to both rows and columns. A subsequent level of transformation would be applied only to this sub- image. The HD sub- image contains the horizontal details (from low-pass on rows, high-pass on columns), the VD sub-image contains the vertical details (from high-pass on rows, lows- pass on columns) and the DD sub-image contains the diagonal details (from high-pass, or wavelet filter, on both rows and columns).
Fig. 2.1. Three-level one-dimensional discrete wavelet transform.
Fig. 2.2. a) Image at the first decomposition level; b) image at the second decomposition level.

The decimated algorithm is not shift-invariant, meaning that it is sensitive to shifts of the input image. The decimation process also has a negative impact on the linear continuity of spatial features that do not have a horizontal or vertical orientation. These two factors tend to introduce artifacts when the algorithm is used in applications such as image fusion.

Image source: G. Pajares, J.M. de la Cruz, A wavelet-based image fusion tutorial, Pattern Recognition 37 (2004) 1855-1872.
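The shift-variance of the decimated transform is easy to observe numerically. In the sketch below (an illustration assuming the Wavelet Toolbox dwt function), shifting the input by one sample changes the detail coefficient values themselves, rather than merely shifting them, because down-sampling fixes which samples survive.

% Demonstration of shift-variance in the decimated 1-D DWT.
x  = [zeros(1,8) 1 zeros(1,7)];          % unit impulse, length 16
x1 = circshift(x, [0 1]);                % the same signal shifted one sample
[cA,  cD ] = dwt(x,  'db2');
[cA1, cD1] = dwt(x1, 'db2');
% Shift-invariance would make cD1 a shifted copy of cD; instead the
% coefficient values differ, as the nonzero residual below shows:
disp(max(abs(cD1 - circshift(cD, [0 1]))))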
2.2.2.2. Undecimated

The undecimated algorithm addresses the lack of shift-invariance. It does so by suppressing the down-sampling step of the decimated algorithm and instead up-sampling the filters by inserting zeros between the filter coefficients. Algorithms in which the filter is up-sampled are called "à trous", meaning "with holes". As with the decimated algorithm, the filters are applied first to the rows and then to the columns. In this case, however, although the four images produced (one approximation and three detail images) are at half the resolution of the original, they are the same size as the original image. The approximation images from the undecimated algorithm are therefore represented as levels in a parallelepiped, with the spatial resolution becoming coarser at each higher level while the size remains the same. The undecimated algorithm is redundant, meaning some detail information may be retained in adjacent levels of transformation. It also requires more space to store the results of each level of transformation and, although it is shift-invariant, it does not resolve the problem of feature orientation.

A previous level of approximation, at resolution J-1, can be reconstructed exactly by applying the inverse transform to all four images at resolution J and combining the resulting images. Essentially, the inverse transform involves the same steps as the forward transform, applied in the reverse order. In the decimated case, this means up-sampling the approximation and detail images and applying reconstruction filters, which are inverses of the decomposition scaling and wavelet filters, first by columns and then by rows. For example, first the columns of the VD image would be up-sampled and the inverse scaling filter applied; then the rows would be up-sampled and the inverse wavelet filter applied. The original image is reconstructed by applying the inverse transform to each decomposed level in turn, starting from the level at the coarsest resolution, until the original resolution is reached. Reconstruction in the undecimated case is similar, except that instead of up-sampling the images, the filters are down-sampled before each application of the inverse filters.

Shift-invariance is necessary in order to compare and combine wavelet coefficient images. Without it, slight shifts in the input signal produce variations in the wavelet coefficients that might introduce artifacts in the reconstructed image. Shift-variance is caused by the decimation process and can be resolved by using the undecimated algorithm. However, the other problem with standard discrete wavelet transforms remains: poor directional selectivity, that is, poor representation of features whose orientation is neither horizontal nor vertical, which results from the separate filtering in these two directions (Kingsbury, 1999).
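The undecimated algorithm corresponds to the stationary wavelet transform of the Wavelet Toolbox. A minimal sketch follows; note that swt2 requires the image dimensions to be divisible by 2^N, hence the cropping step (image1.jpg is again an assumption from Section 7).

% One level of the undecimated (stationary) 2-D wavelet transform.
X = double(imread('image1.jpg'));            % assumed grayscale
X = X(1:2*floor(end/2), 1:2*floor(end/2));   % crop to even dimensions (N = 1)
N = 1;                                       % number of decomposition levels
[A, H, V, D] = swt2(X, N, 'db1');            % all outputs are the SAME size
                                             % as X: no decimation is applied
XR = iswt2(A, H, V, D, 'db1');               % exact reconstruction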
2.2.2.3. Non-separated

One approach for dealing with both shift variance and poor directional selectivity is to use a non-separated, two-dimensional wavelet filter derived from the scaling function. This produces only two images: one approximation image, also called the scale frame, and one detail image, called the wavelet plane. The wavelet plane is computed as the difference between the original and the approximation images and contains all the detail lost as a result of the wavelet decomposition. As with the undecimated DWT, a coarser approximation is achieved by up-sampling the filter at each level of decomposition; correspondingly, the filter is down-sampled at each level of reconstruction. Some redundancy between adjacent levels of decomposition is possible in this approach, but since it is not decimated, it is shift-invariant, and since it does not involve separate filtering in the horizontal and vertical directions, it better preserves feature orientation.
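The report does not name a specific non-separated filter; a common choice in the fusion literature, assumed here purely for illustration, is the à trous scheme with a B3-spline kernel, in which each wavelet plane is simply the difference between successive approximations.

% Sketch of an "a trous" decomposition (B3-spline kernel assumed).
X  = double(imread('image1.jpg'));   % assumed grayscale
h  = [1 4 6 4 1]/16;                 % 1-D B3-spline filter
H2 = h' * h;                         % 2-D smoothing mask applied in one pass
A1 = conv2(X, H2, 'same');           % first (coarser) approximation
w1 = X - A1;                         % wavelet plane: all detail lost in A1
% For the next level, the mask would be up-sampled by inserting zeros
% between its coefficients before convolving with A1.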
2.3 Image fusion

The objective of image fusion is to produce a single image containing the best aspects of the fused images. Some desirable aspects include high spatial resolution and high spectral resolution (multispectral and panchromatic satellite images), areas in focus (microscopy images), functional and anatomical information (medical images), different spectral information (optical and infrared images), or colour information and texture information (multispectral and synthetic aperture radar images).

2.3.1 Image Fusion Scheme

The wavelet transform contains the low-high bands, the high-low bands and the high-high bands of the image at different scales, plus the low-low band of the image at the coarsest level. Except for the low-low band, which has all positive transform values, all other bands contain transform values that fluctuate around zero; the larger (absolute) values in these bands correspond to sharper brightness changes and thus to the salient features of the image, such as edges, lines, and region boundaries. A good integration rule is therefore to select, at each point, the wavelet coefficient with the larger absolute value of the two. Subsequently, a composite image is constructed by performing an inverse wavelet transform on the combined transform coefficients. Since the wavelet transform provides both spatial and frequency domain localization, the effect of the maximum fusion rule can be illustrated in two scenarios. If the same object appears more distinctly (in other words, with better contrast) in image A than in image B, the object from image A will be preserved in the fused image. In a different scenario, suppose the outer boundary of an object appears more distinctly in image A while its inner boundary appears more distinctly in image B, so that the object in image A looks visually larger than the corresponding object in image B. In this case the wavelet transform coefficients of the object in images A and B will be dominant at different resolution levels, and under the maximum selection rule both the outer structure from image A and the inner structure from image B will be preserved in the fused image.

3. Methodology

The steps involved in the fusion of images through the wavelet transform are given below; a minimal MATLAB sketch of these steps follows the flowchart.
1) Get the images to be fused.
2) Apply the wavelet transform to both images with the chosen wavelet at the desired level.
3) Get the approximation and detail coefficients for both images.
4) Merge the coefficients by the desired fusion rule.
5) Apply the inverse discrete wavelet transform to the merged coefficients to get the fused image.

Fig. 3.1 Flowchart of the Methodology
Image source: H. Li, B.S. Manjunath, S.K. Mitra, Multisensor image fusion using the wavelet transform, Graphical Models and Image Processing 57 (3) (1995) 235-245.
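The sketch below implements steps 1-5 with a single decomposition level, using the mean rule for the approximation coefficients and the maximum-absolute-value rule of Section 2.3.1 for the detail coefficients. The file names and the db1 wavelet are assumptions carried over from the code in Section 7; the inputs are assumed registered, grayscale, and of equal size.

% Steps 1-5 with one decomposition level (requires the Wavelet Toolbox).
X1 = double(imread('image1.jpg'));        % step 1: registered input images
X2 = double(imread('image2.jpg'));
[A1, H1, V1, D1] = dwt2(X1, 'db1');       % steps 2-3: decompose both images
[A2, H2, V2, D2] = dwt2(X2, 'db1');
% Element-wise selection of the coefficient with larger absolute value:
pick = @(c1, c2) c1.*(abs(c1) >= abs(c2)) + c2.*(abs(c1) < abs(c2));
A = (A1 + A2)/2;                          % step 4: mean for approximations,
H = pick(H1, H2);                         %         max(|.|) for details
V = pick(V1, V2);
D = pick(D1, D2);
XF = idwt2(A, H, V, D, 'db1');            % step 5: inverse transform
imshow(uint8(XF));                        % display the fused image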
4. Results

Based on the methodology stated above, a MATLAB code with a graphical user interface has been written to fuse the images.

4.1 User Interface
4.1.1 Wavelets

The user is given a choice of wavelets:
1) Daubechies - db1
2) Coiflets - coif1
3) Symlets - sym2
4) Discrete Meyer - dmey
5) Biorthogonal - bior1.1
6) Reverse biorthogonal - rbio1.1
The user can select up to 10 decomposition levels.

4.1.2 Fusion Method

The user has six options for the fusion method, for both the approximation coefficients and the detail coefficients: maximum, minimum, mean, Image 1, Image 2, and random. These merge the two approximation or detail structures obtained from Images 1 and 2 element-wise by taking, respectively, the maximum, the minimum, the mean, the first element, the second element, or a randomly chosen element.

Fig. 4.1 The design of the user interface
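Internally, the GUI delegates the fusion to the Wavelet Toolbox function wfusimg (see Section 7), so the same result can be reproduced from the command line. The parameter values below are assumed to match the example of Fig. 5.1 (biorthogonal wavelet, four levels, minimum rule for approximations, maximum rule for details).

% Command-line equivalent of the GUI fusion (Wavelet Toolbox).
X1 = imread('image1.jpg');
X2 = imread('image2.jpg');
XFUS = wfusimg(X1, X2, 'bior1.1', 4, 'min', 'max');
imshow(XFUS/255);     % wfusimg returns a double matrix; scale for display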
Shown below are the input images and the output fused image. As can be seen, the left portion of image 1 is blurred, while the middle portion of image 2 is blurred. The fused output is visibly sharper than either input.

Fig. 5.1 a) and b) Input images; c) fused image using the biorthogonal wavelet transform with four levels of decomposition, the minimum rule for approximation coefficients and the maximum rule for detail coefficients.

5. Conclusions

Wavelet transforms isolate frequencies in both time and space, allowing detail information to be easily extracted from satellite imagery. A number of different schemes have been proposed to inject this detail information into multispectral imagery, ranging from simple substitution to complex formulas based on the statistical properties of the imagery. While even the simplest wavelet-based fusion scheme tends to produce better results than standard fusion schemes such as IHS and PCA, further improvement is evident with more sophisticated wavelet-based fusion schemes. The drawbacks are greater computational complexity and, often, parameters that must be set before the fusion scheme can be applied. Another strategy for improving the quality of results is to combine a standard fusion scheme with a wavelet-based fusion scheme; however, this also has limitations: IHS, for instance, can only be applied to three bands at a time.
The type of algorithm used to apply the wavelet transform can also affect the quality of the result: decimation disturbs the linear continuity of spatial features and introduces artifacts in the fused image, whereas non-decimated algorithms require more memory during processing but do not introduce as many artifacts. Each wavelet-based fusion scheme has its own set of advantages and limitations; more comprehensive testing is required to assess fully under what conditions each one is most appropriate.

6. References

G. Pajares, J.M. de la Cruz, A wavelet-based image fusion tutorial, Pattern Recognition 37 (2004) 1855-1872.
V.P.S. Naidu, J.R. Raol, Pixel-level image fusion using wavelets and principal component analysis, Defence Science Journal 58 (3) (2008) 338-352.
E.J. Stollnitz, T.D. DeRose, D.H. Salesin, Wavelets for Computer Graphics: Theory and Applications, Morgan Kaufmann Publishers Inc., 1996.
H. Li, B.S. Manjunath, S.K. Mitra, Multisensor image fusion using the wavelet transform, Graphical Models and Image Processing 57 (3) (1995) 235-245.
N. Kingsbury, Image processing with complex wavelets, Philosophical Transactions of the Royal Society of London A 357 (1999) 2543-2560.
7. MATLAB Code

function varargout = wavelet(varargin)
% WAVELET MATLAB code for wavelet.fig
%      WAVELET, by itself, creates a new WAVELET or raises the existing
%      singleton*.
%
%      H = WAVELET returns the handle to a new WAVELET or the handle to
%      the existing singleton*.
%
%      WAVELET('CALLBACK',hObject,eventData,handles,...) calls the local
%      function named CALLBACK in WAVELET.M with the given input arguments.
%
%      WAVELET('Property','Value',...) creates a new WAVELET or raises the
%      existing singleton*. Starting from the left, property value pairs are
%      applied to the GUI before wavelet_OpeningFcn gets called. An
%      unrecognized property name or invalid value makes property application
%      stop. All inputs are passed to wavelet_OpeningFcn via varargin.
%
%      *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
%      instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help wavelet

% Last Modified by GUIDE v2.5 29-Mar-2014 21:32:17

% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @wavelet_OpeningFcn, ...
                   'gui_OutputFcn',  @wavelet_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before wavelet is made visible.
function wavelet_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to wavelet (see VARARGIN)

% Choose default command line output for wavelet
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes wavelet wait for user response (see UIRESUME)
% uiwait(handles.figure1);

% --- Outputs from this function are returned to the command line.
function varargout = wavelet_OutputFcn(hObject, eventdata, handles)
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;

% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% Display the first input image in the left axes.
axes(handles.axes1)
imshow('image1.jpg');

% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
% Display the second input image in the right axes.
axes(handles.axes2)
imshow('image2.jpg');

% --- Executes on selection change in popupmenu1.
function popupmenu1_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Map the popup selection to a wavelet name accepted by wfusimg.
a = get(handles.popupmenu1, 'value');
switch a
    case 1
        handles.x1 = 'db1';
    case 2
        handles.x1 = 'coif1';
    case 3
        handles.x1 = 'sym2';
    case 4
        handles.x1 = 'dmey';
    case 5
        handles.x1 = 'bior1.1';
    case 6
        handles.x1 = 'rbio1.1';
end
guidata(hObject, handles)

% --- Executes during object creation, after setting all properties.
function popupmenu1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu1 (see GCBO)
% handles    empty - handles not created until after all CreateFcns called

% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on selection change in popupmenu2.
function popupmenu2_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu2 (see GCBO)
% handles    structure with handles and user data (see GUIDATA)

% The popup index is used directly as the decomposition level (1-10).
b = get(handles.popupmenu2, 'value');
handles.x2 = b;
guidata(hObject, handles)

% --- Executes during object creation, after setting all properties.
function popupmenu2_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu2 (see GCBO)
% handles    empty - handles not created until after all CreateFcns called

% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton3.
function pushbutton3_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton3 (see GCBO)
% No action is assigned to this button.

% --- Executes on selection change in popupmenu3.
function popupmenu3_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu3 (see GCBO)
% handles    structure with handles and user data (see GUIDATA)

% Map the popup selection to the fusion method for the approximation
% coefficients.
c = get(handles.popupmenu3, 'value');
switch c
    case 1
        handles.x3 = 'max';
    case 2
        handles.x3 = 'min';
    case 3
        handles.x3 = 'mean';
    case 4
        handles.x3 = 'img1';
    case 5
        handles.x3 = 'img2';
    case 6
        handles.x3 = 'rand';
end
guidata(hObject, handles)

% --- Executes during object creation, after setting all properties.
function popupmenu3_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu3 (see GCBO)
% handles    empty - handles not created until after all CreateFcns called

% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on selection change in popupmenu4.
function popupmenu4_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu4 (see GCBO)
% handles    structure with handles and user data (see GUIDATA)

% Map the popup selection to the fusion method for the detail coefficients.
% All cases set handles.x4, the detail-rule field read by pushbutton4.
d = get(handles.popupmenu4, 'value');
switch d
    case 1
        handles.x4 = 'max';
    case 2
        handles.x4 = 'min';
    case 3
        handles.x4 = 'mean';
    case 4
        handles.x4 = 'img1';
    case 5
        handles.x4 = 'img2';
    case 6
        handles.x4 = 'rand';
end
guidata(hObject, handles)

% --- Executes during object creation, after setting all properties.
function popupmenu4_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu4 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called

% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton4.
function pushbutton4_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton4 (see GCBO)
% handles    structure with handles and user data (see GUIDATA)

% Collect the user's choices and fuse the two images.
p = handles.x1;                           % wavelet name
q = handles.x2;                           % decomposition level
r = handles.x3;                           % fusion rule for approximations
s = handles.x4;                           % fusion rule for details
X1 = imread('image1.jpg');
X2 = imread('image2.jpg');
XFUS = wfusimg(X1, X2, p, q, r, s)/255;   % scale result to [0,1] for display
axes(handles.axes5)
imshow(XFUS);
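To try the interface, the GUIDE-generated files wavelet.m and wavelet.fig should be on the MATLAB path together with the two input images, which the callbacks above assume are named image1.jpg and image2.jpg; typing wavelet at the command prompt then launches the GUI.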