Unsupervised Multispectral Image Classification by Fuzzy Hidden Markov Chains Model for SPOT HRV Images
Faiza DAKKA, Ahmed HAMMOUCH & Driss ABOUTAJDINE
International Journal Of Image Processing (IJIP), Volume (5) : Issue (4) : 2011 446
Faiza DAKKA dakka_f@yahoo.fr
Faculty of Sciences, Department of Physics,
Mohamed V-Agdal University
Rabat B.P. 1014, Morocco
Ahmed HAMMOUCH
GIT-LGE Laboratory, ENSET,
Mohamed V-Souissi University
Rabat, B.P. 6207, Morocco
Driss ABOUTAJDINE
Faculty of Sciences, Department of Physics,
Mohamed V-Agdal University
Rabat B.P. 1014, Morocco
Abstract
This paper deals with the unsupervised classification of multi-spectral images; we propose to use
a new vectorial fuzzy version of Hidden Markov Chains (HMC).
The main characteristic of the proposed model is to allow the coexistence of crisp pixels
(obtained with the uncertainty measure of the model) and fuzzy pixels (obtained with the
fuzzy measure of the model) in the same image. Crisp and fuzzy multi-dimensional densities
can then be estimated in the classification process, according to the assumption considered
to model the statistical links between the layers of the multi-band image. The efficiency of the
proposed method is illustrated on synthetic and real SPOT HRV images of the region of Rabat.
A comparison of the two methods, fuzzy HMC and classical HMC, is also provided; the
classification results show the interest of the fuzzy HMC method.
Keywords: Bayesian Image Classification, Markov Chains, Fuzzy Hidden Markov,
Unsupervised Classification.
1 INTRODUCTION
Hidden Markov models are used in many forms and with different spatial structures such as
chains, fields or trees [3]. The popularity of these models is illustrated by the multitude and
diversity of applications that have been proposed, in signal processing (notably speech
recognition) as well as in image processing. In particular, the hidden Markov chain (HMC)
model has been successfully used for image classification [3].
In this case, 2D images are first transformed into a 1D sequence through a Hilbert-Peano scan [5]
to fit the one-dimensional structure of a chain (Fig. 1(a)). The interest of this model comes
from the fact that when the hidden process X = (X1, ..., XN) can be represented by a finite
Markov chain and when the structure of the noise is not too complex, X can be reconstructed
from the observed process Y = (Y1, ..., YN), Yn ∈ R, alone, using different classification criteria
such as the MAP (Maximum A Posteriori) or the MPM (Maximal Posterior Mode). It is
sometimes interesting to take into account not only the uncertainty in the observations (often
due to noise), but also their imprecision (fuzziness). This imprecision may come from sensor
motion at the time of acquisition, or from the surface/volume effect found in
medical and satellite imaging. The latter effect reflects the fact that a pixel of the image is the
result of an integration over a surface that is not necessarily homogeneous, where several
components are mixed in different and unknown proportions. In terms of classification, such
pixels are ambiguous, and it is sometimes better not to get a "hard response" from the classifier
(e.g., class 0 or class 1) but a "fuzzy response" measured by a value in the interval ]0,1[. This
response reflects the ambiguity of the pixel, which can be observed, and carries interesting
information: indeed, the membership degree can be interpreted as the mixing ratio of class 0
versus class 1 in the pixel, or as the degree of belonging to class 0.
This paper is organized as follows. The HMC model is briefly presented in Section 2. Section 3
details the fuzzy HMC model for image classification, as well as its parameter estimation,
carried out with an extension of the Iterative Conditional Estimation (ICE) algorithm [1, 2].
Section 4 illustrates the comparative results of unsupervised classification. Finally, Section 5
reviews the theoretical and experimental work and offers some perspectives.
2 UNSUPERVISED CLASSIFICATION USING HMC
This section recalls the HMC model [6,7] and its use for unsupervised image classification [8].
The HMC model can be adapted to a 2D analysis through a Hilbert-Peano scan of the image,
see Fig. 1. Hence all estimation and classification processing is applied to the 1D sequence,
and the segmented 2D image is reconstructed by applying a reverse Hilbert-Peano scan to the
1D classified sequence.
FIGURE 1: Hilbert-Peano scan construction for an 8*8 image. This scan is used to transform a
2D image into a 1D signal (y), and conversely.
FIGURE 2: Hilbert-Peano scan for an 8*8 multi-component image (M = number of layers in the
image, N = number of pixels in each layer).
The Hilbert-Peano scan makes it possible to take the neighborhood of the pixel of interest into
account [6]. Fig. 2 shows the scan for an 8 × 8 multi-component image with M layers.
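As an illustration of this transformation, the following Python sketch (an assumption of this edit, not code from the paper) builds the Hilbert-Peano ordering of an n × n image (n a power of two), flattens the image into a 1D chain, and reconstructs it with the inverse scan. For a multi-component image (Fig. 2), the same path would simply be applied to each of the M layers.

```python
import numpy as np

def _rotate(s, x, y, rx, ry):
    """Rotate/flip a quadrant so that the curve stays continuous."""
    if ry == 0:
        if rx == 1:
            x, y = s - 1 - x, s - 1 - y
        x, y = y, x
    return x, y

def hilbert_index_to_xy(n, d):
    """Map position d along the Hilbert-Peano curve to (x, y) in an n x n grid (n power of 2)."""
    x = y = 0
    s, t = 1, d
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        x, y = _rotate(s, x, y, rx, ry)
        x, y = x + s * rx, y + s * ry
        t //= 4
        s *= 2
    return x, y

def image_to_chain(img):
    """2D image -> 1D chain following the Hilbert-Peano scan (Fig. 1)."""
    n = img.shape[0]
    path = [hilbert_index_to_xy(n, d) for d in range(n * n)]
    return np.array([img[y, x] for x, y in path]), path

def chain_to_image(chain, path, n):
    """Inverse scan: 1D classified chain -> segmented 2D image."""
    img = np.empty((n, n), dtype=chain.dtype)
    for value, (x, y) in zip(chain, path):
        img[y, x] = value
    return img

# Example on an 8*8 image, as in Fig. 1
img = np.arange(64).reshape(8, 8)
chain, path = image_to_chain(img)
assert np.array_equal(chain_to_image(chain, path, 8), img)
```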
A. The HMC model
An image y = {y1,…, yN}, N being the total number of pixels, is considered as a realization of
the 1D observed process Y = {Y1,…, YN}, where each Yn is a real-valued random variable. The
segmented image x = {x1,…, xN} is considered as a realization of a hidden process X =
{X1,…, XN}, where each Xn is a discrete random variable taking its values in Ω = {1,…, K}. In
classical HMC modeling, X is assumed to be a Markov chain, i.e. p(xn+1 | xn,…, x1) = p(xn+1 | xn).
The distribution of X is consequently determined by the distribution of X1, denoted by π_k = p(X1 = k),
and by the set of transition matrices (A_n)_{1≤n≤N} whose entries are a^n_{ij} = p(X_{n+1} = j | X_n = i).
We further assume that X is a stationary Markov chain, i.e. the entry a^n_{ij} = a_{ij} does not depend
on n. With the two following additional properties:
(i) the Yn are independent conditionally on X, i.e. p(y | x) = \prod_{n=1}^{N} p(y_n | x), and
(ii) p(y_n | x) = p(y_n | x_n),
the distribution of the pairwise process (X, Y) can be written as

p(x, y) = \pi_{x_1} f_{x_1}(y_1) \prod_{n=2}^{N} a_{x_{n-1} x_n} f_{x_n}(y_n),

with f_{x_n}(y_n) = p(y_n | x_n) the data-driven densities, which are assumed to be Gaussian in this
work [8].
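To make the model concrete, here is a small Python sketch (a hypothetical toy illustration, not the authors' code) that samples a stationary HMC X and Gaussian observations Y according to the pairwise distribution above.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gaussian_hmc(N, pi, A, means, sigmas):
    """Sample x ~ stationary Markov chain (pi, A) and y_n ~ N(means[x_n], sigmas[x_n]^2)."""
    K = len(pi)
    x = np.empty(N, dtype=int)
    x[0] = rng.choice(K, p=pi)
    for n in range(1, N):
        x[n] = rng.choice(K, p=A[x[n - 1]])   # p(x_n | x_{n-1}) = a_{x_{n-1} x_n}
    y = rng.normal(means[x], sigmas[x])       # p(y_n | x_n) Gaussian, cf. f_{x_n}(y_n)
    return x, y

# Two classes, strongly persistent chain, noisy observations
pi = np.array([0.5, 0.5])
A = np.array([[0.95, 0.05],
              [0.05, 0.95]])
x, y = sample_gaussian_hmc(1000, pi, A, means=np.array([0.0, 2.0]), sigmas=np.array([1.0, 1.0]))
```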
B. Classification:
The estimation of X from Y can be done by applying the MPM Bayesian criterion:
\forall n \in [1, \dots, N], \quad \hat{x}_n^{MPM}(y) = \arg\max_{k \in \Omega} \xi_n(k)    (1)
with ξn(k) = p (Xn = k | y) the marginal a posteriori probabilities.
The HMC model allows explicit computation of the MPM solution using the well-known
Baum’s “forward” αn(k) and “backward” βn(k) probabilities [10], modified by Devijver [9] for
computational reasons:
\alpha_n(k) \approx P(X_n = k, Y_1 = y_1, \dots, Y_n = y_n)    (2)
\beta_n(k) \approx P(Y_{n+1} = y_{n+1}, \dots, Y_N = y_N \mid X_n = k)    (3)
In the following, we use the numerically stable forward-backward recursions resulting from
these approximations.

Forward initialization, for 1 ≤ i ≤ K:

\alpha_1(i) = \frac{\pi_i f_i(y_1)}{\sum_{1 \le j \le K} \pi_j f_j(y_1)}    (4)

Forward induction, for n = 2,…, N and 1 ≤ i ≤ K:

\alpha_n(i) = \frac{f_i(y_n) \sum_{j} \alpha_{n-1}(j)\, a_{ji}}{\sum_{k} f_k(y_n) \sum_{j} \alpha_{n-1}(j)\, a_{jk}}    (5)

Backward initialization, for 1 ≤ i ≤ K:

\beta_N(i) = 1

Backward induction, for n = N−1,…, 1 and 1 ≤ i ≤ K:

\beta_n(i) = \frac{\sum_{j} a_{ij}\, f_j(y_{n+1})\, \beta_{n+1}(j)}{\sum_{k} f_k(y_{n+1}) \sum_{j} \alpha_n(j)\, a_{jk}}    (6)
It can be shown that the marginal a posteriori probabilities involved in the MPM classification
can be written as

\xi_n(i) = \alpha_n(i)\, \beta_n(i),    (7)

and the joint a posteriori probabilities \Psi_n(i, j) = P(X_n = i, X_{n+1} = j \mid Y = y) as

\Psi_n(i, j) = \frac{\alpha_n(i)\, a_{ij}\, f_j(y_{n+1})\, \beta_{n+1}(j)}{\sum_{k} f_k(y_{n+1}) \sum_{l} \alpha_n(l)\, a_{lk}}.    (8)
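The Python sketch below implements these normalized recursions and the MPM rule of eq. (1); it is a minimal illustration under the notation above (f is assumed to be the N×K matrix of data-driven likelihoods f_k(y_n)), not the authors' implementation.

```python
import numpy as np

def forward_backward(pi, A, f):
    """Normalized forward-backward recursions, eqs. (4)-(6).
    pi: (K,) initial probabilities, A: (K, K) transition matrix,
    f:  (N, K) data-driven likelihoods f_k(y_n)."""
    N, K = f.shape
    alpha = np.zeros((N, K))
    beta = np.ones((N, K))
    alpha[0] = pi * f[0] / np.sum(pi * f[0])                    # eq. (4)
    for n in range(1, N):
        a = (alpha[n - 1] @ A) * f[n]
        alpha[n] = a / a.sum()                                  # eq. (5)
    for n in range(N - 2, -1, -1):
        denom = ((alpha[n] @ A) * f[n + 1]).sum()
        beta[n] = (A @ (f[n + 1] * beta[n + 1])) / denom        # eq. (6)
    return alpha, beta

def mpm_classification(alpha, beta):
    """MPM criterion, eqs. (1) and (7): x_n = argmax_k xi_n(k) with xi = alpha * beta."""
    xi = alpha * beta
    return xi / xi.sum(axis=1, keepdims=True), xi.argmax(axis=1)

def joint_posterior(alpha, beta, A, f):
    """Joint a posteriori probabilities psi_n(i, j), eq. (8)."""
    psi = alpha[:-1, :, None] * A[None] * (f[1:] * beta[1:])[:, None, :]
    return psi / psi.sum(axis=(1, 2), keepdims=True)
```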
C. Estimation:
Before classification, all the parameters involved in the HMC model have to be estimated:

\theta = \{\pi_k, a_{kl}, f_k\}_{k, l \in \Omega}    (9)
One well-known solution is to use the iterative EM procedure [11], which aims at optimizing the
log-likelihood of the data, according to the steps described in Algorithm 1, with the following
update equations:

\forall k \in \Omega, \quad \hat{\pi}_k^{[q]} = \frac{1}{N} \sum_{n=1}^{N} \xi_n^{[q]}(k)    (10)

\forall k, l \in \Omega, \quad \hat{a}_{kl}^{[q]} = \frac{\sum_{n=1}^{N-1} \Psi_n^{[q]}(k, l)}{(N-1)\, \hat{\pi}_k^{[q]}}    (11)

\forall k \in \Omega, \quad \hat{\mu}_k^{[q]} = \frac{\sum_{n=1}^{N} \xi_n^{[q]}(k)\, y_n}{N\, \hat{\pi}_k^{[q]}}    (12)

\forall k \in \Omega, \quad (\hat{\sigma}_k^{[q]})^2 = \frac{\sum_{n=1}^{N} \xi_n^{[q]}(k)\, (y_n - \hat{\mu}_k^{[q]})^2}{N\, \hat{\pi}_k^{[q]}}    (13)
Algorithm 1: HMC parameter estimation using EM.
q ← 0: initialize the parameters θ[0]
repeat
  q ← q + 1
  - Compute the "forward" α_n^{[q]} and "backward" β_n^{[q]} probabilities using equations (4)-(6).
  - Compute the a posteriori probabilities ξ_n^{[q]}(k) using eq. (7) and Ψ_n^{[q]}(k, l) using eq. (8).
  - Estimate the HMC parameters θ[q] using equations (10) and (11) for the Markov parameters and
    equations (12) and (13) for the Gaussian data-driven parameters.
until |θ[q] − θ[q−1]| < threshold.
The iterative estimation of the parameters is stopped when they no longer vary significantly.
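A compact Python sketch of one EM re-estimation step, directly transcribing eqs. (10)-(13) for a single layer, is given below (the function and variable names are this edit's assumptions, not the authors'). Iterating this step together with the forward-backward pass of the previous section reproduces Algorithm 1.

```python
import numpy as np

def em_reestimate(xi, psi, y):
    """One EM re-estimation step for a stationary Gaussian HMC, eqs. (10)-(13).
    xi:  (N, K)      marginal a posteriori probabilities xi_n(k)
    psi: (N-1, K, K) joint a posteriori probabilities psi_n(k, l)
    y:   (N,)        scalar observations of one layer."""
    N, K = xi.shape
    pi_hat = xi.mean(axis=0)                                                 # eq. (10)
    a_hat = psi.sum(axis=0) / ((N - 1) * pi_hat[:, None])                    # eq. (11)
    a_hat /= a_hat.sum(axis=1, keepdims=True)                                # numerical safeguard: keep rows stochastic
    mu_hat = (xi * y[:, None]).sum(axis=0) / (N * pi_hat)                    # eq. (12)
    var_hat = (xi * (y[:, None] - mu_hat) ** 2).sum(axis=0) / (N * pi_hat)   # eq. (13)
    return pi_hat, a_hat, mu_hat, var_hat
```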
3 UNSUPERVISED CLASSIFICATION USING FUZZY HMC
A. Fuzzy Markov chains model
We consider a multi-component image of M layers. According to the Hilbert-Peano scan, we
get N series of M data, denoted by y = {y_1,…, y_N}, where y_n = (y_n^{[1]},…, y_n^{[M]})^t, 1 ≤ n ≤ N.
In the classical HMC approach, the aim is to classify each y_n ∈ R^M into a set of K classes, the
state space Ω = {ω_1,…, ω_K}, in order to obtain the segmented chain x = {x_1,…, x_N} (Fig. 2). The
segmented image is then reconstructed from x using an inverse Hilbert-Peano scan.
For the sake of simplicity, we confine our study to the K = 2 case, i.e. Ω = {0, 1}.
In the fuzzy HMC context, the range of x_n is now the interval Ω = [0, 1]. In the following, ε_n will
denote a realization of the random variable X_n and we adopt the notation:
ε_n = 0 if the pixel is from class 0,
ε_n ∈ ]0, 1[ if the pixel is a fuzzy one,
ε_n = 1 if the pixel is from class 1.
B. Probabilities in fuzzy Markov chains context
As stated previously, each x_n takes its value in two types of sets: a hard one, {0, 1}, and a
fuzzy one defined over the range ]0, 1[.
Let δ_0 and δ_1 be the Dirac weights on 0 and 1, and λ the Lebesgue measure on ]0, 1[.
By taking ν = δ_0 + δ_1 + λ as a measure on Ω, the distribution of X_n can be defined by a
density h on Ω with respect to ν.
If we assume that X is homogeneous and that the distribution of each X_n is uniform on the fuzzy
class, P(X_n = ε_n) = h(ε_n) = π_{ε_n} can be written:
h(ε_n = 0) = π_0;
h(ε_n = 1) = π_1;
h(ε_n) = π_{]0,1[}, ε_n ∈ ]0, 1[;
with π_0 + π_1 + π_{]0,1[} = 1.
We can now detail the new expression for the transition probabilities t_{ε_{n-1}, ε_n} of the Markov
chain:

t_{\varepsilon_{n-1}, \varepsilon_n} = P(X_n = \varepsilon_n \mid X_{n-1} = \varepsilon_{n-1})
  = P(X_n = 0 \mid X_{n-1} = \varepsilon_{n-1})\, \delta_0(\varepsilon_n)
  + P(X_n = \varepsilon_n \mid X_{n-1} = \varepsilon_{n-1})\, \iota_{]0,1[}(\varepsilon_n)
  + P(X_n = 1 \mid X_{n-1} = \varepsilon_{n-1})\, \delta_1(\varepsilon_n),

\forall \varepsilon_{n-1}, \varepsilon_n \in \Omega and \forall n \in \{2, \dots, N\},
with t_{\varepsilon_{n-1}, \varepsilon_n} \ge 0 and \int_{\Omega} t_{\varepsilon_{n-1}, \varepsilon_n}\, d\varepsilon_n = 1.
The distribution of the hidden chain is then

P(X = x) = \pi_{\varepsilon_1} \prod_{n=2}^{N} t_{\varepsilon_{n-1}, \varepsilon_n}.
C. Multi-component fuzzy HMC implementation
ε_n is a realization of the random variable X_n, and each y_n is a realization of a random vector
Y_n = (y_n^{[1]},…, y_n^{[M]})^t. Thus the problem is to estimate the unobserved realization x of the
random process X from the observed realization y of the random process Y = {Y_1,…, Y_N}.
Similarly to classical HMC, multi-component fuzzy HMC based image classification methods
consider the two following assumptions:
H1: the random variables Y1,…,YN are independent conditionally on X;
H2: the distribution of each Yn conditionally on X is equal to its distribution conditionally on Xn.
It is important to note that the random variables (Y_n^{[m]})_{1≤m≤M} are not assumed to be
mutually independent conditionally on X_n.
Assuming that the distributions of (X_n, Y_n, X_{n+1}, Y_{n+1}) are independent of n, each state ε_n of the
state space (i.e. the hard classes {0,1}, as well as the fuzzy class ]0,1[) is associated with a
distribution characterizing the M-dimensional observations y_n:

f_{\varepsilon_n}(y_n) = P(Y_n = y_n \mid X_n = \varepsilon_n).    (14)
Given an observed sequence y = {y_1,…, y_N}, the joint state-observation probability is given by:

P(X = x, Y = y) = \pi_{\varepsilon_1} f_{\varepsilon_1}(y_1) \prod_{n=2}^{N} t_{\varepsilon_{n-1}, \varepsilon_n}\, f_{\varepsilon_n}(y_n).    (15)
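Once the fuzzy states have been quantized (see Section 3.D below), eq. (15) can be evaluated numerically; the sketch below is a hypothetical helper, not the paper's code, and assumes the quantized prior pi and transition kernel t already incorporate the quantization weights.

```python
import numpy as np

def joint_log_prob(states, y, pi, t, densities):
    """log P(X = x, Y = y) of eq. (15) for a quantized fuzzy HMC.
    states:    (N,) indices into the quantized state set {0, fuzzy levels..., 1}
    pi:        (S,) prior probabilities of the quantized states
    t:         (S, S) quantized transition kernel
    densities: list of S callables, densities[s](y_n) -> f_eps_s(y_n)."""
    lp = np.log(pi[states[0]]) + np.log(densities[states[0]](y[0]))
    for n in range(1, len(states)):
        lp += np.log(t[states[n - 1], states[n]]) + np.log(densities[states[n]](y[n]))
    return lp
```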
In unsupervised classification, the distribution P (X = x,Y = y) is unknown and must first be
estimated in order to apply a Bayesian classification technique (MAP or MPM). Therefore the
following sets of parameters need to be estimated:
1) The set Γ characterizing the fuzzy Markov chain parameters, i.e. the initial probability
vector π = {π_ε}_{ε∈Ω} and the transition probabilities t_{ε_{n-1}, ε_n}, ∀ ε_{n-1}, ε_n ∈ Ω.
2) The set ∆ regrouping the parameters of the M-dimensional distributions presented in (14),
i.e. the distributions associated with the hard classes and the fuzzy one.
D. Fuzzy Markov Chain parameters estimation
For the estimation of the parameters in Γ, we propose to use an adaptation of the general ICE
algorithm [12], which is an alternative to the well-known Expectation-Maximization (EM)
algorithm. In fact, ICE does not refer to the likelihood; it is based on the conditional
expectation of some estimators built from the complete data (x, y). It is an iterative method which
produces a sequence of estimates θ^q of the parameter θ as follows:
1) initialization θ^0, obtained with an initial classification algorithm (the k-means algorithm);
2) computation of \theta^{q+1} = E[\hat{\theta}(X, y) \mid Y = y], where \hat{\theta}(x, y) is an estimator of θ from the complete data;
3) stop the algorithm when \theta^{q+1} \approx \theta^{q}.
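A generic sketch of this ICE loop is given below (an assumption of this edit; `sample_posterior_x` and `estimate_from_complete_data` are placeholder callables for the posterior simulation of X given Y and for the complete-data estimator θ̂).

```python
import numpy as np

def ice(y, theta0, sample_posterior_x, estimate_from_complete_data, n_iter=10, n_draws=1):
    """Iterative Conditional Estimation: theta^{q+1} = E[theta_hat(X, y) | Y = y],
    approximated by averaging the complete-data estimator over posterior draws of X."""
    theta = theta0                                   # e.g. a dict of arrays from a k-means classification
    for _ in range(n_iter):
        draws = [estimate_from_complete_data(sample_posterior_x(y, theta), y)
                 for _ in range(n_draws)]
        # component-wise empirical mean of the parameter estimates
        theta = {name: np.mean([d[name] for d in draws], axis=0) for name in theta}
    return theta
```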
This section is not intended to give a complete description of the ICE algorithm in the HMC
context; interested readers may consult [5]. Similarly to the classical case, the parameters in
Γ can be calculated analytically by using the Baum-Welch algorithm [13]:
for the hard classes, the classical normalized Baum-Welch probabilities [14] can be used
directly;
for the fuzzy class, the forward and backward probabilities are defined by:
\alpha_{n+1}(\xi) \propto f_{\xi}(y_{n+1}) \int_{]0,1[} \alpha_n(\varsigma)\, t_{\varsigma, \xi}\, d\varsigma    (16)

\beta_n(\xi) \propto \int_{]0,1[} t_{\xi, \varsigma}\, f_{\varsigma}(y_{n+1})\, \beta_{n+1}(\varsigma)\, d\varsigma    (17)
The integrals above cannot be calculated analytically, so numerical integration must be
performed. Hence, the continuous interval ]0,1[ is partitioned into a given number F of
sub-intervals. We thus reduce the domain of the fuzzy membership degree ]0,1[ to F attributed
values, corresponding to the midpoint of each sub-interval (see Fig. 3), which results in a
quantization of the fuzzy measure on ]0,1[.
We thus obtain F "discrete" fuzzy classes, whose values correspond to the midpoints of the
considered sub-intervals. For example, F = 2 implies ε ∈ {0.25, 0.75} and F = 3 implies
ε ∈ {0.165, 0.5, 0.825}. The larger F is, the more accurate the parameter estimation.
FIGURE 3: Partition of the continuous interval ]0,1[ into F = 4 sub-intervals. The attributed
values ε correspond to the midpoint of the sub-interval of interest.
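The quantization and the resulting numerical approximation of the integrals in (16)-(17) can be sketched as follows. This is an illustrative assumption of this edit: a midpoint rule is used, with the measure ν giving weight 1 to each hard class and weight 1/F to each fuzzy level (the paper's quoted values for F = 3 differ marginally from the exact midpoints).

```python
import numpy as np

def fuzzy_levels(F):
    """Attributed values: midpoints of the F sub-intervals that quantize ]0,1[ (Fig. 3)."""
    return np.array([(2 * i + 1) / (2 * F) for i in range(F)])

def quantized_states_and_weights(F):
    """Quantized state set {0, eps_1, ..., eps_F, 1} with the weights of the measure
    nu = delta_0 + delta_1 + Lebesgue: weight 1 for the hard classes, 1/F per fuzzy level."""
    states = np.concatenate(([0.0], fuzzy_levels(F), [1.0]))
    weights = np.concatenate(([1.0], np.full(F, 1.0 / F), [1.0]))
    return states, weights

def fuzzy_forward_step(alpha_n, f_next, t, weights):
    """Quantized version of eq. (16): alpha_{n+1}(xi) ∝ f_xi(y_{n+1}) * sum_s w_s alpha_n(s) t(s, xi)."""
    a = f_next * ((alpha_n * weights) @ t)
    return a / a.sum()

# fuzzy_levels(2) -> [0.25, 0.75], fuzzy_levels(4) -> [0.125, 0.375, 0.625, 0.875]
```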
E. Multi-dimensional density estimation
At each ICE iteration, we need to estimate the multidimensional densities f_{ε_n}(y_n). Several
strategies from multivariate data analysis are available, depending on the assumptions made
on the statistical links between the layers, and on the choice of the shape of the one-dimensional
densities. If independence between the layers is assumed, f_{ε_n}(y_n) is the product of M densities
g_{ε_n}^1,…, g_{ε_n}^M defined on R:

f_{\varepsilon_n}(y_n) = \prod_{m=1}^{M} g_{\varepsilon_n}^{m}(y_n^{[m]}).

For example, if we consider that the f_{ε_n}(y_n) are multidimensional Gaussian densities,
parameter estimation can easily be achieved from the first and second moments of an
M-dimensional sample. Denoting by N(m, σ²) the normal distribution with mean m and variance σ²,
the pdfs of the hard classes are then expressed according to:
ε_n = 0 : N(m_0, σ_0²),
ε_n = 1 : N(m_1, σ_1²).
The parameters ∆ = {m_0, m_1, σ_0, σ_1} can be estimated by computing the empirical mean of
several estimates, according to

\theta^{q+1} = \frac{1}{L} \sum_{l=1}^{L} \hat{\theta}(x_l, y),

where x_l is an a posteriori realization of X conditionally on Y. It can be shown that X | Y is a
non-homogeneous Markov chain whose parameters can be computed from the forward and
backward probabilities (16) and (17).
The definition of the fuzzy measure A, "the pixel belongs to class 1", corresponding to the
fuzzy class ]0,1[, and of its fuzzy membership function µ_A, allows us to estimate the fuzzy
parameters of the set ∆ in this new context. The proposed fuzzy membership function µ_A is
defined by:

\mu_A(m) = \begin{cases} \dfrac{m - m_0}{m_1 - m_0} & \forall m \in [m_0, m_1] \\ 0 & \text{elsewhere} \end{cases}
Accordingly, the parameters of the Gaussian pdf for the fuzzy class can then be estimated by:

m_{\varepsilon_n} = (1 - \varepsilon_n)\, m_0 + \varepsilon_n\, m_1,
\sigma_{\varepsilon_n}^2 = (1 - \varepsilon_n)\, \sigma_0^2 + \varepsilon_n\, \sigma_1^2.
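A small Python sketch of these interpolated parameters, together with the layer-independence product of Section 3.E, is given below (illustrative code under this paper's notation; scipy availability and the example values are assumptions of this edit).

```python
import numpy as np
from scipy.stats import norm

def fuzzy_gaussian_params(eps, m0, m1, var0, var1):
    """Mean and variance of the Gaussian pdf attached to a fuzzy level eps in ]0,1[."""
    return (1.0 - eps) * m0 + eps * m1, (1.0 - eps) * var0 + eps * var1

def density_independent_layers(y_n, means, variances):
    """f_eps(y_n) as the product of M one-dimensional Gaussian densities (one per layer)."""
    return float(np.prod(norm.pdf(y_n, loc=means, scale=np.sqrt(variances))))

# Example: density of a 3-layer pixel for the fuzzy level eps = 0.25
m_eps, v_eps = fuzzy_gaussian_params(0.25, m0=np.array([10., 20., 30.]),
                                     m1=np.array([40., 50., 60.]),
                                     var0=np.array([4., 4., 4.]), var1=np.array([9., 9., 9.]))
p = density_independent_layers(np.array([15., 28., 41.]), m_eps, v_eps)
```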
4 MULTI-SPECTRAL IMAGE CLASSIFICATION
This section evaluates the two classification approaches and the benefit brought by the fuzzy
HMC model.
A. Validation
To validate the proposed approach we consider the synthetic image of Figure 4, in which we
considered hard classes (the opaque background and the umbrella) and the gradient between
these classes as the fuzzy zone (representing the border). Fig. 4 presents the results of the two
classification methods, based on the HMC model and on the fuzzy HMC model.
We can see that the image of Figure 4 is classified much more precisely with the fuzzy HMC.
Taking a higher number of classes into account in the classical HMC model is of little interest,
because the additional classes tend to specialize on individual pixels.
FIGURE 4: The synthetic image and its classification results with the HMC model (4 classes
and 5 classes) and the fuzzy HMC model (F = 2 and F = 3).
B. SPOT HRV Images Classification
1. The studied images
The image chosen for the study is that of the forest region of Rabat, Morocco. This forest is an
ecological unit consisting of typical natural vegetation and reforestation.
Figure 5 shows the satellite images studied and their multi-spectral images (green, red and
near infrared) of the Rabat forest.
FIGURE 5: The two images studied (a and b) and their multi-spectral images (c, d and e) for
image (a) and (f, g and h) for image (b).
The SPOT HRV (XS) image of this region was acquired in December 1999 with a spatial
resolution of 20 m. From this image we extracted two test areas of size 140x90 pixels, occupied
mainly by a natural forest of cork oaks and by plantations of eucalyptus, pine and acacia.
The ground truth was obtained from the official forest management plans, followed by field
verification. These areas are characterized by heterogeneity in the distribution of forest stands;
discriminating between the different textures is not obvious to a human observer.
2. Experimental protocol
In all experiments, parameter initialization was done with a k-means classifier. The ICE
algorithm was stopped after 10 iterations, assuming it had converged.
FIGURE 6: (a) and (c) Ground-truth maps. (b) and (d) Classification results with HMC (6 classes).
Figure 6 shows the ground-truth images (a and c) and the HMC classification results on both
multi-spectral images of the Rabat forest (b and d). The computation time is 339.246 seconds.
In this application, the "Sea" or the "Acacia" were considered as crisp class 0 and the
"Eucalyptus" or the "Cork-oak" as crisp class 1. The fuzzy measure A thus corresponds to: "the
pixel belongs to the sea-free class". The classification results depend on the partition of ]0,1[;
the choice of the F sub-intervals implies different values of the fuzzy measure ε, e.g. F = 2
implies ε ∈ {0.25, 0.75}, F = 3 implies ε ∈ {0.165, 0.5, 0.825}, F = 4 implies ε ∈ {0.125, 0.375,
0.625, 0.875}.
Figure 7 shows the ground-truth images (a and d) and the classification results obtained
with the fuzzy vectorial HMC model for F = 2 sub-intervals (b and e) and F = 3 sub-intervals (c and
f). The computation time is 545.636 seconds.
FIGURE 7: (a) and (d) Ground-truth maps. Classification results with fuzzy HMC for 4 crisp
classes and 2 fuzzy classes (K=4, F=2), (b) and (e), and for 3 crisp classes and 3 fuzzy classes
(K=3, F=3), (c) and (f).
The computational complexity involved in the fuzzy vectorial HMC model is noticeably higher than
that of the classical one. One can observe that a larger number of fuzzy sub-intervals provides a
finer characterization of the observed scene. Furthermore, the classification results produced by
the fuzzy vectorial HMC model are more homogeneous: the global shape and the frontiers of the
classes seem to be better defined. This model clearly characterizes the frontiers between classes
and produces a reliable classification, which illustrates the interest of the vectorial fuzzy HMC model.
5 CONCLUSION
In this work, we propose to extend the fuzzy HMC model to unsupervised multi-spectral
image classification. In the multi-component case, it is interesting to take into account not only
the uncertainty measure of the noisy observation (characteristic of the probabilistic approach in
classical HMC), but also the imprecision measure of this observation (characteristic of fuzzy
approaches [6]). By adding a fuzzy measure to a statistical model, we obtain an original
model, different from both classical and fuzzy models. Indeed, it preserves the robustness of
statistical classification (based on measures of uncertainty) and enriches it with the fuzzy
characteristic (a measure of imprecision). The experimental results confirm the validity of the
proposed approach. The vectorial fuzzy HMC model seems promising in the field of
multi-component image classification, thanks to the ability of the imprecision measure to take
the multivariate information into account. We believe that another model, strictly more general
than the HMC and called the pairwise Markov chain [15], could also benefit from the fuzzy
extension proposed in this paper.
REFERENCES
[1] W. Pieczynski. Statistical image segmentation. Mach. Graph. and Vis., 1 : 261-268, 1992.
[2] H. Caillol, W. Pieczynski, A. Hillion. Estimation of fuzzy Gaussian mixture and
unsupervised statistical image segmentation. IEEE Trans. on Im. Proc., 6(3) : 425-
440, 1997.
[3] W. Pieczynski. Modèles de Markov en traitement d'images. Traitement du Signal, 20(3):
255-278, 2003.
[4] B. Benmiloud and W. Pieczynski. Estimation des paramètres dans les CMC et
segmentation d'images. Traitement du Signal, 12(5): 433-454, 1995.
[5] W. Skarbek. Generalized Hilbert scan in image printing. In R. Klette and W. G. Kropatsch,
editors, Theoretical Foundations of Computer Vision. Akademik Verlag, Berlin, 1992.
[6] S. Derrode, G. Mercier, and W. Pieczynski, “Unsupervised change detection in SAR
images using a multicomponent HMC model,” in MultiTemp’03, (Ispra, Italy) (2003).
[7] L. R. Rabiner, “A tutorial on HMMs and selected applications in speech recognition,”
Proc. IEEE 77(2), 257–286 (1989). [doi:10.1109/5.18626].
[8] N. Giordana and W. Pieczynski, “Estimation of generalized multisensor hidden Markov
chains and unsupervised image segmentation,” IEEE Trans. Pattern Anal. Mach.
Intell.19(5), 465–475 (1997). [doi:10.1109/34.589206].
[9] P.-A. Devijver, “Baum’s Forward-Backward algorithm revisited,” Pattern Recogn. Lett.
3,369–373 (1985).
[10]L. Baum, T. Petrie, G. Soules, and N. Weiss, “A maximization technique occurring in the
statistical analysis of probabilistic functions of Markov chains,” Ann. Math. Stat. 41(1),
164–171 (1970).
[11]A. P. Dempster, N. M. Laird, and D. B. Rubin, “Maximum likelihood from incomplete data
via the EM algorithm,” J. Roy. Stat. Soc. 39(1), 1–38 (1977).
[12]W. Pieczynski, “Statistical image segmentation,” Mach. Graph. and Vis.,vol. 1, pp. 261–
268, 1992.
[13]L. Rabiner, “A tutorial on hidden Markov models and selected applications in speech
recognition,” in Proc. IEEE, vol. 77, no. 2, Feb. 1989, pp. 257–286.
[14] P. A. Devijver, "Baum's forward-backward algorithm revisited," Pattern Recognition
Letters, vol. 3, pp. 369-373, December 1985.
[15] S. Derrode, C. Carincotte, and J.-M. Boucher. Unsupervised image segmentation based
on high-order hidden Markov chains. In IEEE ICASSP, Montreal, Canada, 17-21 May 2004.