Grid-based Nonlinear Estimation and Its Applications

Bin Jia
Intelligent Fusion Technology, Inc., Germantown, Maryland, USA

Ming Xin
Department of Mechanical and Aerospace Engineering, University of Missouri, Columbia, Missouri, USA

A SCIENCE PUBLISHERS BOOK
Preface
Estimation (prediction, filtering, and smoothing) is of great importance to
virtually all engineering and science disciplines that require inference, learning,
information fusion, identification, and retrieval of unknown dynamic system
states and parameters. The theory and practice of linear estimation of Gaussian
dynamic systems have been well established and successful. Although optimal
estimation of nonlinear dynamic systems has been studied for decades, it is
still a very challenging and unresolved problem. In general, approximations
are inevitable in order to design any applicable nonlinear estimation algorithm.
This monograph aims to provide a unified nonlinear estimation framework
from the Bayesian perspective and to develop systematic estimation algorithms
with grid-based numerical rules. It presents a common approach to nonlinear
Gaussian estimation by emphasizing that all Gaussian-assumed estimation
algorithms in the literature distinguish themselves merely by the way of
approximating Gaussian weighted integrals. A variety of numerical rules to
generate deterministic grid points and weights can be utilized to approximate
such Gaussian integrals and yield a family of nonlinear Gaussian estimation
algorithms, which can be unified in the same Bayesian framework. A unique
feature of this book is to reveal the close relationships among these nonlinear
estimators such as the unscented Kalman filter, cubature Kalman filter, Gauss-
Hermite quadrature filter, and sparse-grid quadrature filter. It is shown that
by certain grid selection rules, one filter can be constructed from the other, or
vice versa. The readers can learn the advantages and disadvantages of these
estimation techniques and have a guideline to select the suitable one to solve
their problems, because estimation accuracy and computation complexity can
be analytically given and easily controlled. Several important applications are
presented to demonstrate the capability of the grid-based nonlinear estimation
including multiple sensor estimation, uncertainty propagation, target tracking, and
navigation. The integration of the grid-based estimation techniques and the parallel
computing models, such as MapReduce and graphics processing units (GPU),
is briefly introduced. Pseudo-code is provided for the key algorithms, allowing
interested readers to develop their own programs for their estimation problems.
Contents

Preface
1. Introduction
   1.1 Random Variables and Random Process
   1.2 Gaussian Distribution
   1.3 Bayesian Estimation
   References
2. Linear Estimation of Dynamic Systems
   2.1 Linear Discrete-Time Kalman Filter
   2.2 Information Kalman Filter
   2.3 The Relation Between the Bayesian Estimation and Kalman Filter
   2.4 Linear Continuous-Time Kalman Filter
   References
3. Conventional Nonlinear Filters
   3.1 Extended Kalman Filter
   3.2 Iterated Extended Kalman Filter
   3.3 Point-Mass Filter
   3.4 Particle Filter
   3.5 Combined Particle Filter
       3.5.1 Marginalized Particle Filter
       3.5.2 Gaussian Filter Aided Particle Filter
   3.6 Ensemble Kalman Filter
   3.7 Zakai Filter and Fokker Planck Equation
   3.8 Summary
   References
4. Grid-based Gaussian Nonlinear Estimation
   4.1 General Gaussian Approximation Nonlinear Filter
   4.2 General Gaussian Approximation Nonlinear Smoother
   4.3 Unscented Transformation
   4.4 Gauss-Hermite Quadrature
   4.5 Sparse-Grid Quadrature
   4.6 Anisotropic Sparse-Grid Quadrature and Accuracy Analysis
       4.6.1 Anisotropic Sparse-Grid Quadrature
       4.6.2 Analysis of Accuracy of the Anisotropic Sparse-Grid Quadrature
   4.7 Spherical-Radial Cubature
   4.8 The Relations Among Unscented Transformation, Sparse-Grid Quadrature, and Cubature Rule
       4.8.1 From the Spherical-Radial Cubature Rule to the Unscented Transformation
       4.8.2 The Connection between the Quadrature Rule and the Spherical Rule
       4.8.3 The Relations Between the Sparse-Grid Quadrature Rule and the Spherical-Radial Cubature Rule
   4.9 Positive Weighted Quadrature
   4.10 Adaptive Quadrature
       4.10.1 Global Measure of Nonlinearity for Stochastic Systems
       4.10.2 Local Measure of Nonlinearity for Stochastic Systems
   4.11 Summary
   References
5. Nonlinear Estimation: Extensions
   5.1 Grid-based Continuous-Discrete Gaussian Approximation Filter
   5.2 Augmented Grid-based Gaussian Approximation Filter
   5.3 Square-root Grid-based Gaussian Approximation Filter
   5.4 Constrained Grid-based Gaussian Approximation Filter
       5.4.1 Interval-constrained Unscented Transformation
       5.4.2 Estimation Projection and Constrained Update
   5.5 Robust Grid-based Gaussian Approximation Filter
       5.5.1 Huber-based Filter
       5.5.2 H∞ Filter
   5.6 Gaussian Mixture Filter
   5.7 Simplified Grid-based Gaussian Mixture Filter
   5.8 Adaptive Gaussian Mixture Filter
   5.9 Interacting Multiple Model Filter
   5.10 Summary
   References
1. Introduction
In a data inundated world, extracting useful information from the data becomes
indispensable in all science and engineering disciplines and even in people’s
daily life. Systems that extract signals from noisy measurements have to
face the challenges of sensor constraints, random disturbances, and lack of
precise knowledge of the underlying physical process. Estimation theories
and techniques are at the heart of such data processing systems. The data
processing methods can be traced back to Gauss (Gelb 1974), who originated
the deterministic least-squares method and employed it in a relatively simple
orbit determination problem. After more than 100 years, Fisher (Fisher 1912)
developed the maximum likelihood estimation method based on probability
density functions. As a remarkable milestone, Wiener (Wiener 1949) proposed
statistically optimal filters in the frequency domain for the continuous-time
problem. Nevertheless, his approach provided optimal estimates only in the
steady state and was constrained to statistically stationary processes.
problem was investigated by Kolmogorov (Kolmogorov 1941) during the
same time period. The most significant breakthrough of the estimation theory
took place when Kalman (Kalman 1960) introduced the state-space model and
optimal recursive filtering techniques based on the time domain formulations.
It is this celebrated Kalman filter that makes possible digital computer
implementation of the estimation algorithm, and the tremendous success in a
wide range of applications.
An estimator is a procedure that processes measurements to deduce an
estimate of the state or parameters of a system from the prior (deterministic
or statistical) knowledge of the system, the measurements, and the initial conditions.
The estimation is usually obtained by minimizing the estimation error in a
well-defined statistical sense. Three types of estimation problems have been
extensively investigated based on the times at which the measurement is
obtained and estimation is processed. When an estimate is obtained at the
same time as the last measurement point, it is a filtering problem; when the
estimation takes place within the time interval of available measurement data, it
is a smoothing problem; and when the estimation is processed at the time after
the last available measurement, it is a prediction problem. This book will cover
all three estimation problems, with relatively more attention to the first
one since a large spectrum of estimation applications are filtering problems.
In most estimation applications, measurements and observations are
expressed as numerical quantities and they typically exhibit uncertain variability
every time they are repeated. These numerical-valued random quantities are
usually called random variables. The concept of the random variable is
central to all the estimation concepts. For example, based on a probability
space on which the random variable is defined, probability distributions and
probability density functions can be defined.
1.1 Random Variables and Random Process
Definition 1.1. Random Vector $\mathbf{x} = [x_1, \ldots, x_n]^T$

Given a probability space $(\Omega, \mathcal{A}, P)$, where $\Omega$ is the sample space, $\mathcal{A}$ is the $\sigma$-algebra of $\Omega$, and $P$ is a probability measure on $\mathcal{A}$, a random vector $\mathbf{x}(\cdot): \Omega \to \mathbb{R}^n$ is a real-valued function that maps a sample point $\omega \in \Omega$ into a point in $\mathbb{R}^n$ such that $A = \{\omega : \mathbf{x}(\omega) \le \boldsymbol{x}\} \in \mathcal{A}$ for any $\boldsymbol{x} = [x_1, \ldots, x_n]^T \in \mathbb{R}^n$.
Note that the lowercase boldface letter $\mathbf{x}$ denotes the random vector while the lowercase boldface italic letter $\boldsymbol{x}$ denotes the realization of the random vector. A random variable is a univariate random vector and is denoted by a non-boldface letter.
Definition 1.2. Probability Distribution Function

A real scalar-valued function
$$F_{\mathbf{x}}(\boldsymbol{x}) \triangleq P\{\omega : \mathbf{x}(\omega) \le \boldsymbol{x}\} = P(\mathrm{x}_1 \le x_1, \mathrm{x}_2 \le x_2, \ldots, \mathrm{x}_n \le x_n)$$
is called a probability distribution function, where $P$ denotes the probability. It is also referred to as the joint probability distribution function of $x_1, \ldots, x_n$.
Definition 1.3. Probability Density Function (PDF)

The derivative of the probability distribution function $F_{\mathbf{x}}(\boldsymbol{x})$ is called the probability density function $p(\boldsymbol{x})$ of the random vector $\mathbf{x}$, i.e.,
$$p(\boldsymbol{x}) \triangleq \frac{dF_{\mathbf{x}}(\boldsymbol{x})}{d\boldsymbol{x}} \quad (1.1)$$
or
$$F_{\mathbf{x}}(\boldsymbol{x}) = \int_{-\infty}^{x_1}\int_{-\infty}^{x_2}\cdots\int_{-\infty}^{x_n} p(u_1, \ldots, u_n)\, du_1 \cdots du_n \quad (1.2)$$

In the subsequent text, we will use a simpler notation
$$\int_{-\infty}^{x_1}\int_{-\infty}^{x_2}\cdots\int_{-\infty}^{x_n} p(u_1, \ldots, u_n)\, du_1 \cdots du_n = \int_{-\infty}^{\boldsymbol{x}} p(\mathbf{u})\, d\mathbf{u} \quad (1.3)$$
to represent an integral with respect to a random vector. $p(\boldsymbol{x})$ satisfies $\int_{-\infty}^{\infty} p(\boldsymbol{x})\, d\boldsymbol{x} = 1$.
Definition 1.4. Expectation

The expectation or mean of a random vector $\mathbf{x} = [x_1, \ldots, x_n]^T$ is defined to be the vector of expectations of its elements, i.e.,
$$E[\mathbf{x}] = \begin{bmatrix} E[x_1] \\ \vdots \\ E[x_n] \end{bmatrix} = \begin{bmatrix} \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} x_1\, p(\boldsymbol{x})\, dx_1 \cdots dx_n \\ \vdots \\ \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} x_n\, p(\boldsymbol{x})\, dx_1 \cdots dx_n \end{bmatrix} \quad (1.4)$$

In a vector representation,
$$E[\mathbf{x}] \triangleq \int_{-\infty}^{\infty} \boldsymbol{x}\, p(\boldsymbol{x})\, d\boldsymbol{x} \triangleq \mathbf{m}_{\mathbf{x}} \quad (1.5)$$

The expectation satisfies the linearity property:
$$E[c\mathbf{x}] = cE[\mathbf{x}] \quad (1.6)$$
$$E[\mathbf{x} + \mathbf{y}] = E[\mathbf{x}] + E[\mathbf{y}] \quad (1.7)$$
where $c$ is a constant scalar. A more general linearity property can be derived from (1.6) and (1.7):
$$E[\mathbf{A}\mathbf{X}\mathbf{B} + \mathbf{C}] = \mathbf{A}E[\mathbf{X}]\mathbf{B} + \mathbf{C} \quad (1.8)$$
where $\mathbf{A}$, $\mathbf{B}$, and $\mathbf{C}$ are constant matrices and $\mathbf{X}$ is a random matrix; these matrices are assumed to have compatible dimensions.

The expectation of a function $\mathbf{y}(\mathbf{x})$ of the random vector $\mathbf{x}$ is
$$E[\mathbf{y}] \triangleq \int_{-\infty}^{\infty} \mathbf{y}(\boldsymbol{x})\, p(\boldsymbol{x})\, d\boldsymbol{x} \triangleq \mathbf{m}_{\mathbf{y}} \quad (1.9)$$

$E[\cdot]$ can be used to derive other statistics of the random variable/vector. For a random variable $x$,
$$E[x^2] = \int_{-\infty}^{\infty} x^2\, p(x)\, dx \quad (1.10)$$
is called the mean square or second moment,
$$E[x^n] = \int_{-\infty}^{\infty} x^n\, p(x)\, dx \quad (1.11)$$
is the $n$th-order moment, and
$$E[(x - m_x)^n] = \int_{-\infty}^{\infty} (x - m_x)^n\, p(x)\, dx \quad (1.12)$$
is the $n$th-order central moment.
Definition 1.5. Conditional Probability Density

The conditional probability density can be represented by
$$p(\boldsymbol{x}\,|\,\boldsymbol{y}) \triangleq \frac{p(\boldsymbol{x}, \boldsymbol{y})}{p(\boldsymbol{y})} \quad (1.13)$$

Another expression of the conditional probability density is in the form of Bayes' rule:
$$p(\boldsymbol{x}\,|\,\boldsymbol{y}) = \frac{p(\boldsymbol{x}, \boldsymbol{y})}{p(\boldsymbol{y})} = \frac{p(\boldsymbol{y}\,|\,\boldsymbol{x})\, p(\boldsymbol{x})}{p(\boldsymbol{y})} = \frac{p(\boldsymbol{y}\,|\,\boldsymbol{x})\, p(\boldsymbol{x})}{\int_{-\infty}^{\infty} p(\boldsymbol{y}\,|\,\boldsymbol{x})\, p(\boldsymbol{x})\, d\boldsymbol{x}} \quad (1.14)$$
where the denominator can be regarded as a term to normalize the expression,
so that the total integral of the conditional probability density is unity. Note
that the integrand in the denominator is of the same form as the numerator.
Through the conditional probability density, inter-relationships among random variables can be revealed.
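Equation (1.14) can be made concrete on a discrete grid, where the normalizing denominator becomes a finite sum. The Python sketch below assumes an illustrative scalar Gaussian prior $p(x) = N(0, 4)$ and measurement model $y = x + n$ with $n \sim N(0, 1)$; these choices are not from the text.

```python
import math

xs = [i * 0.1 for i in range(-50, 51)]        # grid over [-5, 5]
dx = 0.1

def gauss(u, m, var):
    return math.exp(-0.5 * (u - m) ** 2 / var) / math.sqrt(2 * math.pi * var)

prior = [gauss(x, 0.0, 4.0) for x in xs]      # assumed prior p(x) = N(0, 4)
y = 1.2                                       # assumed observation of y = x + n, n ~ N(0, 1)
likelihood = [gauss(y, x, 1.0) for x in xs]   # p(y|x)

# Denominator of (1.14): the evidence p(y), approximated by a grid sum.
evidence = sum(l * q for l, q in zip(likelihood, prior)) * dx
posterior = [l * q / evidence for l, q in zip(likelihood, prior)]

total = sum(posterior) * dx                   # equals 1 by construction
post_mean = sum(x * q for x, q in zip(xs, posterior)) * dx
# post_mean approximates the conjugate-Gaussian value 4*y/(4 + 1) = 0.96
```

Normalizing by the evidence is exactly what makes the total integral of the conditional density unity, as noted above.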
Definition 1.6. Independence

Given a probability space $(\Omega, \mathcal{A}, P)$, two random vectors $\mathbf{x}$ and $\mathbf{y}$ are independent if
$$P\left(\{\omega : \mathbf{x}(\omega) \in A\} \cap \{\omega : \mathbf{y}(\omega) \in B\}\right) = P\{\omega : \mathbf{x}(\omega) \in A\}\, P\{\omega : \mathbf{y}(\omega) \in B\} \quad (1.15)$$
for all $A, B \in \mathcal{A}$. In terms of probability distribution functions,
$$F_{\mathbf{x},\mathbf{y}}(\boldsymbol{x}, \boldsymbol{y}) = F_{\mathbf{x}}(\boldsymbol{x})\, F_{\mathbf{y}}(\boldsymbol{y}) \quad (1.16)$$
or, in terms of the probability density function,
$$p(\boldsymbol{x}, \boldsymbol{y}) = p(\boldsymbol{x})\, p(\boldsymbol{y}) \quad (1.17)$$
Definition 1.7. Correlation

The correlation matrix or the second moment of $\mathbf{x}$ is defined by
$$\mathbf{C}(\mathbf{x}) \triangleq E[\mathbf{x}\mathbf{x}^T] \quad (1.18)$$
with the $(i, j)$ entry $E[x_i x_j]$ representing the correlation between $x_i$ and $x_j$.

Note that $\mathbf{C}$ is a symmetric matrix. $E[x_i x_j] = E[x_i]E[x_j]$ if $x_i$ and $x_j$ are uncorrelated.

If two random variables $x_i$ and $x_j$ are independent, they are uncorrelated. The property of being uncorrelated is weaker than the property of independence. For independent random variables, $E[f(x_i)g(x_j)] = E[f(x_i)]\,E[g(x_j)]$ for any functions $f(x_i)$ and $g(x_j)$, whereas for uncorrelated random variables this is only required to hold for $f(x_i) = x_i$ and $g(x_j) = x_j$.
Definition 1.8. Covariance

Define the covariance matrix or the second central moment of $\mathbf{x}$ by
$$\mathbf{P}(\mathbf{x}) \triangleq E\left[(\mathbf{x} - \mathbf{m}_{\mathbf{x}})(\mathbf{x} - \mathbf{m}_{\mathbf{x}})^T\right] \quad (1.19)$$
$\mathbf{P}_{ij}(\mathbf{x}) = E[(x_i - m_{x_i})(x_j - m_{x_j})]$ is called the cross-covariance of the $i$th and $j$th components of $\mathbf{x}$.

It can be shown that
$$\mathbf{P}(\mathbf{x}) = \mathbf{C}(\mathbf{x}) - \mathbf{m}_{\mathbf{x}}\mathbf{m}_{\mathbf{x}}^T \quad (1.20)$$

We see that the covariance and correlation matrices are equal if and only if the mean vector is zero. Note that the diagonal entries of $\mathbf{P}(\mathbf{x})$ are the variances of the random variables $x_i$, i.e., $Var(x_i) \triangleq E\left[(x_i - E[x_i])^2\right] = \mathbf{P}_{ii}(\mathbf{x})$. For $i \neq j$, $\mathbf{P}_{ij}(\mathbf{x}) = 0$ if and only if $x_i$ and $x_j$ are uncorrelated. Thus, $\mathbf{P}(\mathbf{x})$ is a diagonal matrix if $x_i$ and $x_j$ are uncorrelated. The square root of a variance $\mathbf{P}_{ii}(\mathbf{x})$ is termed the standard deviation of $x_i$.

If a random vector $\mathbf{x}$ has a covariance matrix $\mathbf{P}(\mathbf{x})$, then $\mathbf{y} = \mathbf{A}\mathbf{x}$ has a covariance matrix $\mathbf{A}\mathbf{P}(\mathbf{x})\mathbf{A}^T$.
Cross-covariance

If $\mathbf{x} = [x_1, \ldots, x_n]^T$ and $\mathbf{z} = [z_1, \ldots, z_p]^T$ are both random vectors with respective means $\mathbf{m}_{\mathbf{x}}$ and $\mathbf{m}_{\mathbf{z}}$, then their cross-covariance matrix is the $n \times p$ matrix
$$\mathbf{P}(\mathbf{x}, \mathbf{z}) \triangleq E\left[(\mathbf{x} - \mathbf{m}_{\mathbf{x}})(\mathbf{z} - \mathbf{m}_{\mathbf{z}})^T\right] \quad (1.21)$$

The two random vectors $\mathbf{x}$ and $\mathbf{z}$ are uncorrelated if $\mathbf{P}(x_i, z_j) = 0$ for all $i = 1, \ldots, n$ and all $j = 1, \ldots, p$. This is equivalent to the condition that $\mathbf{P}(\mathbf{x}, \mathbf{z}) = \mathbf{0}$ be an $n \times p$ zero matrix.
Definition 1.9. Stochastic Process

A stochastic process is a generalization of a random variable and can be considered as an infinite collection of random variables. It is a set of random variables, $\mathbf{x}(\omega, t)$, indexed by the time $t$ and defined on a probability space $(\Omega, \mathcal{A}, P)$.
The random process can be thought of as a function of the sample point
ω and time t (Maybeck 1979). A random process is a set of time functions,
each of which is one possible outcome, ω, out of Ω. A random variable is a
real number, while a random process is a continuously indexed collection of
real numbers or a real-valued function. If t is fixed, x(ω, t) results in a random
variable. If ω is fixed, x(ω, t) becomes a realization of the random process. In
estimation or filtering, the statistical properties of interest for random processes
include how their statistics and joint statistics are related across time or between
components of vectors. To simplify the notation without causing confusion,
we will denote a random process by x(t).
The statistical properties of a random process can be similarly defined as
those for random variables/vectors by including the time t.
The expected value or mean can be given by
$$\mathbf{m}_{\mathbf{x}}(t) \triangleq E[\mathbf{x}(t)] = \int_{-\infty}^{\infty} \boldsymbol{x}(t)\, p(\boldsymbol{x}(t))\, d\boldsymbol{x}(t) \quad (1.22)$$

A process $\mathbf{x}(t)$ is considered time-independent if for any distinct times $t_1, t_2, t_3, \ldots, t_i$,
$$p(\mathbf{x}(t_1), \mathbf{x}(t_2), \ldots, \mathbf{x}(t_i)) = p(\mathbf{x}(t_1))\, p(\mathbf{x}(t_2)) \cdots p(\mathbf{x}(t_i)) \quad (1.23)$$
The autocorrelation of the random process $\mathbf{x}(t)$ between any two times $t_1$ and $t_2$ can be defined as
$$\mathbf{C}(\mathbf{x}) \triangleq E\left[\mathbf{x}(t_1)\mathbf{x}^T(t_2)\right] \quad (1.24)$$
The diagonal elements of $\mathbf{C}(\mathbf{x})$ when $t_1 = t_2 = t$ are the average power of $\mathbf{x}(t)$, i.e., $E[x^2(t)]$.

The autocovariance of $\mathbf{x}(t)$ can be defined by
$$\mathbf{P}(\mathbf{x}) \triangleq E\left[(\mathbf{x}(t_1) - \mathbf{m}_{\mathbf{x}}(t_1))(\mathbf{x}(t_2) - \mathbf{m}_{\mathbf{x}}(t_2))^T\right] \quad (1.25)$$

The cross-correlation between two random processes $\mathbf{x}(t) \in \mathbb{R}^n$ and $\mathbf{z}(t) \in \mathbb{R}^p$ is defined by an $n \times p$ matrix
$$\mathbf{C}(\mathbf{x}, \mathbf{z}) = E\left[\mathbf{x}(t_1)\mathbf{z}^T(t_2)\right] \quad (1.26)$$

The cross-covariance between $\mathbf{x}(t)$ and $\mathbf{z}(t)$ can be similarly defined by an $n \times p$ matrix
$$\mathbf{P}(\mathbf{x}, \mathbf{z}) \triangleq E\left[(\mathbf{x}(t_1) - \mathbf{m}_{\mathbf{x}}(t_1))(\mathbf{z}(t_2) - \mathbf{m}_{\mathbf{z}}(t_2))^T\right] \quad (1.27)$$
Two random processes $\mathbf{x}(t)$ and $\mathbf{z}(t)$ are uncorrelated if their cross-covariance matrix is identically zero for all $t_1$ and $t_2$. They are called orthogonal if their cross-correlation matrix is identically zero. A white noise is an example of an uncorrelated random process.
A particularly important characteristic of a random process is whether or not it is stationary. Stationarity is the property of a random process that guarantees that its statistical properties, such as its mean, variance, and moments, will not change over time. The random process $\mathbf{x}(t)$ is strict-sense stationary if its PDF is invariant with respect to a time shift (Papoulis 2002), i.e.,
$$p(\mathbf{x}(t_1), \mathbf{x}(t_2), \ldots, \mathbf{x}(t_i)) = p(\mathbf{x}(t_1 + \tau), \mathbf{x}(t_2 + \tau), \ldots, \mathbf{x}(t_i + \tau)) \quad (1.28)$$
The random process $\mathbf{x}(t)$ is wide-sense stationary if
$$E[\mathbf{x}(t)] = \text{constant} \quad (1.29a)$$
$$E\left[\mathbf{x}(t_1)\mathbf{x}^T(t_2)\right] = \mathbf{C}(t_2 - t_1) = \mathbf{C}(\tau) \quad (1.29b)$$
where $\mathbf{C}$ is a matrix whose value depends only on the time difference $\tau = t_2 - t_1$.

A wide-sense stationary process implies that the first-order and second-order statistics of $\mathbf{x}(t)$ are independent of the time origin, while a strict-sense stationary process implies that all statistics do not depend on the time origin.
A more restrictive class of stationary processes of particular interest is the ergodic processes (Gelb 1974). A process is ergodic if any statistic calculated by
averaging over all members of the ensemble of samples at a fixed time can
be calculated equivalently by time-averaging over any single representative
member of the ensemble. In other words, a sampled function x(t) is ergodic
if its time-averaged statistics equal its ensemble averages. In practice, almost
all empirical results for stationary processes are derived from tests of a single
function with the ergodic assumption.
One of the most important random processes in estimation is the Markov
process. A random process x(t) is called a Markov process if its future state
distribution, conditioned on knowledge of its present state, does not rely on its
previous states. Specifically, for the times $t_1 < t_2 < t_3 < \cdots < t_i$,
$$P\left(\mathbf{x}(t_i) \le \boldsymbol{x}_i \,\middle|\, \mathbf{x}(t_{i-1}), \ldots, \mathbf{x}(t_1)\right) = P\left(\mathbf{x}(t_i) \le \boldsymbol{x}_i \,\middle|\, \mathbf{x}(t_{i-1})\right) \quad (1.30)$$
Similarly, for discrete-time systems, a random process $\mathbf{x}_k$ is called a Markov sequence if
$$P\left(\mathbf{x}_i \le \boldsymbol{x}_i \,\middle|\, \mathbf{x}_k;\ 1 \le k \le i-1\right) = P\left(\mathbf{x}_i \le \boldsymbol{x}_i \,\middle|\, \mathbf{x}_{i-1}\right) \quad (1.31)$$

The solution to a general first-order differential equation
$$\frac{d\mathbf{x}(t)}{dt} = \mathbf{g}(\mathbf{x}, t) + \mathbf{w}(t) \quad (1.32)$$
with an independent random process $\mathbf{w}(t)$ as a forcing function is a Markov process. If we impose the restriction that the PDFs of $\mathbf{w}(t)$, and consequently of $\mathbf{x}(t)$, are Gaussian, the process $\mathbf{x}(t)$ is called a Gauss-Markov process.
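A discrete-time analogue makes the idea concrete. The Python sketch below simulates a first-order Gauss-Markov sequence $x_k = a x_{k-1} + w_{k-1}$ with white Gaussian $w_k$ and checks its wide-sense-stationary variance $q/(1 - a^2)$; the coefficient $a$ and noise variance $q$ are illustrative assumptions, not values from the text.

```python
import math
import random

random.seed(1)
a, q = 0.9, 0.5                    # assumed coefficient and process-noise variance
x, traj = 0.0, []
for _ in range(200000):
    # First-order Gauss-Markov recursion: x_k = a*x_{k-1} + w_{k-1}.
    x = a * x + random.gauss(0.0, math.sqrt(q))
    traj.append(x)

# For |a| < 1 the sequence is wide-sense stationary with variance q/(1 - a^2).
var_theory = q / (1 - a * a)
var_sample = sum(v * v for v in traj) / len(traj)   # the stationary mean is zero
```

The long-run sample variance approaches the theoretical stationary value, illustrating that a Gauss-Markov process is fully characterized by its first two moments.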
1.2 Gaussian Distribution
The most important probability density function of a random variable is the Gaussian or normal distribution. As a consequence of the central limit theorem, the Gaussian density is a good approximation of the probability density function of a sum of many independent random variables, whether the random variables are continuous or discrete. The Gaussian random variable/vector is very useful and of practical importance to the estimation problem because (1) it provides an adequate model for many random behaviors exhibited in nature, and (2) it is a tractable mathematical model for estimation algorithms. A normal distribution of a random variable can be represented by
$$p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{1}{2}\frac{(x - m_x)^2}{\sigma^2}\right) \quad (1.33)$$
where $m_x$ is the mean of the random variable $x$, $\sigma$ is its standard deviation, and $\sigma^2$ is its variance; $\exp$ denotes the exponential function. We usually use $N(m_x, \sigma^2)$ to represent the normal distribution. If $m_x = 0$ and $\sigma^2 = 1$, it is called a standard normal distribution, $x \sim N(0, 1)$. As can be seen, the Gaussian density function is completely defined by the two parameters $m_x$ and $\sigma^2$. Unlike other density functions, higher-order moments are not required to generate a complete description of the Gaussian density function.

The moments of a standard normal distribution can be easily calculated:
$$E[x^n] = \begin{cases} 1 \cdot 3 \cdots (n-3)(n-1), & n \text{ even} \\ 0, & n \text{ odd} \end{cases} \quad (1.34)$$
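Equation (1.34) is easy to verify numerically. The Python sketch below approximates $E[x^n]$ for a standard normal with a midpoint Riemann sum; the integration limits and step count are illustrative choices where the $N(0,1)$ tails are negligible.

```python
import math

def std_normal_moment(n, steps=200000, lim=10.0):
    # Midpoint Riemann sum of x^n times the N(0,1) density over [-lim, lim].
    h = 2 * lim / steps
    total = 0.0
    for i in range(steps):
        x = -lim + (i + 0.5) * h
        total += x ** n * math.exp(-0.5 * x * x)
    return total * h / math.sqrt(2.0 * math.pi)

def odd_product(n):
    # 1 * 3 * ... * (n-1) for even n, matching Eq. (1.34).
    out = 1
    for k in range(1, n, 2):
        out *= k
    return out

m2 = std_normal_moment(2)   # close to odd_product(2) = 1
m4 = std_normal_moment(4)   # close to odd_product(4) = 3
m5 = std_normal_moment(5)   # odd moment, close to 0
```

For example, $E[x^4] = 1 \cdot 3 = 3$ and $E[x^6] = 1 \cdot 3 \cdot 5 = 15$, while every odd moment vanishes by symmetry.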
Univariate Gaussian or normal random variables can be generalized to random vectors. The probability density function of a Gaussian (normal) random vector can be represented by
$$p(\boldsymbol{x}) = \frac{1}{(2\pi)^{n/2}\,|\mathbf{P}|^{1/2}}\exp\left(-\frac{1}{2}\left[\boldsymbol{x} - \mathbf{m}_{\mathbf{x}}\right]^T \mathbf{P}^{-1}\left[\boldsymbol{x} - \mathbf{m}_{\mathbf{x}}\right]\right) \quad (1.35)$$
where $|\cdot|$ denotes the determinant of a matrix. The covariance matrix $\mathbf{P}$ is assumed to be positive definite.
Definition 1.10. Jointly Gaussian Distribution

A random vector $\mathbf{x} = [x_1, \ldots, x_n]^T$ is Gaussian or normal if every linear combination of its components, $\sum_{i=1}^{n} c_i x_i$, is a scalar Gaussian random variable. $x_1, \ldots, x_n$ are then also called jointly Gaussian or jointly normal. Here, any constant random variable is considered to be Gaussian in order for this definition to make sense when all $c_i = 0$ or when $\mathbf{x}$ has a singular covariance matrix.

If $\mathbf{x} = [x_1, \ldots, x_n]^T$ is a Gaussian random vector with mean $\mathbf{m}_{\mathbf{x}}$ and covariance matrix $\mathbf{P}_{\mathbf{x}}$, we denote it as $\mathbf{x} \sim N(\mathbf{m}_{\mathbf{x}}, \mathbf{P}_{\mathbf{x}})$.

If the $x_i$ are independent Gaussian random variables, every linear combination of the $x_i$ is Gaussian; then $\mathbf{x} = [x_1, \ldots, x_n]^T$ is a Gaussian vector.
Any subvector of a Gaussian vector is a Gaussian vector.
If $\mathbf{x} = [x_1, \ldots, x_n]^T$ is a Gaussian random vector with mean $\mathbf{m}_{\mathbf{x}}$ and covariance matrix $\mathbf{P}_{\mathbf{x}}$, then $\mathbf{C}\mathbf{x} + \mathbf{b}$ is a Gaussian random vector for any $p \times n$ matrix $\mathbf{C}$ and any $p$-vector $\mathbf{b}$.

Note that $\mathbf{y} = \mathbf{C}\mathbf{x}$ being Gaussian does not necessarily imply that $\mathbf{x}$ is Gaussian. If $\mathbf{C}$ is invertible, $\mathbf{x} = \mathbf{C}^{-1}\mathbf{y}$ is Gaussian.

If the components of a random vector are uncorrelated, then its covariance matrix is diagonal. In general, this does not mean that the components of the random vector are independent. However, if $\mathbf{x}$ is a Gaussian random vector, then its uncorrelated components are also independent. In other words, two jointly Gaussian random vectors that are uncorrelated are also independent.
If $\mathbf{x}$ is a Gaussian random vector, $\mathbf{x} \sim N(\mathbf{m}_{\mathbf{x}}, \mathbf{P}_{\mathbf{x}})$, the linear transformation
$$\mathbf{y} = \left(\sqrt{\mathbf{P}_{\mathbf{x}}}\right)^{-1}(\mathbf{x} - \mathbf{m}_{\mathbf{x}})$$
results in a random vector $\mathbf{y}$ that follows a standard normal distribution, i.e., $\mathbf{y} \sim N(\mathbf{0}, \mathbf{I})$. This transformation is called the decorrelating transformation (Gubner 2006, Kailath 2000), or stochastic decomposition (Arasaratnam and Haykin 2007). $\mathbf{y}$ will be a Gaussian random vector with uncorrelated and therefore independent components. $\sqrt{\mathbf{P}_{\mathbf{x}}}$ is the square root of the covariance matrix $\mathbf{P}_{\mathbf{x}}$, which can be obtained by many matrix decomposition techniques that factorize a covariance matrix $\mathbf{P}_{\mathbf{x}}$ in the form $\mathbf{P}_{\mathbf{x}} = \mathbf{S}\mathbf{S}^T$ with $\mathbf{S} = \sqrt{\mathbf{P}_{\mathbf{x}}}$, e.g., the Cholesky decomposition, the singular value decomposition (SVD), and the eigenvector decomposition.
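The decorrelating transformation can be sketched for a $2 \times 2$ example with a hand-rolled Cholesky factorization $\mathbf{P}_{\mathbf{x}} = \mathbf{S}\mathbf{S}^T$; the covariance matrix $\mathbf{P}$ below is an illustrative assumption. The covariance of $\mathbf{y} = \mathbf{S}^{-1}(\mathbf{x} - \mathbf{m}_{\mathbf{x}})$ is $\mathbf{S}^{-1}\mathbf{P}\mathbf{S}^{-T}$, which the code confirms to be the identity.

```python
import math

P = [[4.0, 1.2], [1.2, 1.0]]       # assumed covariance matrix P_x

# Cholesky factor S of P = S S^T for the 2x2 symmetric positive-definite case.
s11 = math.sqrt(P[0][0])
s21 = P[1][0] / s11
s22 = math.sqrt(P[1][1] - s21 * s21)
S = [[s11, 0.0], [s21, s22]]

# Sanity check: S S^T reproduces P.
SST = [[S[i][0] * S[j][0] + S[i][1] * S[j][1] for j in range(2)]
       for i in range(2)]

# Covariance of y = S^{-1}(x - m_x) is S^{-1} P S^{-T}: should be the identity.
Sinv = [[1.0 / s11, 0.0], [-s21 / (s11 * s22), 1.0 / s22]]
tmp = [[sum(Sinv[i][k] * P[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
cov_y = [[sum(tmp[i][k] * Sinv[j][k] for k in range(2)) for j in range(2)]
         for i in range(2)]
```

In practice one would call a library routine (e.g., a standard Cholesky factorization) rather than hand-code the factor, but the 2 x 2 case makes the whitening effect easy to verify by hand.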
1.3 Bayesian Estimation
There are two commonly used models for the estimation procedure. The first
model is called non-Bayesian or Fisher estimation (Bar-Shalom et al. 2001), assuming that the parameters to be estimated are nonrandom and constant over the observation interval but the observations are noisy. The second model is called Bayesian estimation, assuming that the parameters to be estimated are random variables with a prior probability distribution, and the observations are noisy as well.
Bayesian estimation is more powerful in state estimation for dynamic systems. It is based on Bayes' rule
$$p(\boldsymbol{x}\,|\,\boldsymbol{y}) = \frac{p(\boldsymbol{y}\,|\,\boldsymbol{x})\, p(\boldsymbol{x})}{p(\boldsymbol{y})} \quad (1.36)$$
to estimate the random parameter $\mathbf{x}$ from the noisy observation $\mathbf{y}$. The estimation procedure starts with prior statistics or a belief statement of the random parameter $\mathbf{x}$, i.e., $p(\boldsymbol{x})$. As measurements are made, the prior PDF $p(\boldsymbol{x})$ is transformed to the posterior distribution $p(\boldsymbol{x}\,|\,\boldsymbol{y})$ utilizing the likelihood $p(\boldsymbol{y}\,|\,\boldsymbol{x})$. $p(\boldsymbol{y})$ is called the evidence; it scales the posterior to assure
its integral is unity. As more measurements are used, a posterior distribution
p(x|y) is improved such that it results in a sharper peak closer to the true
parameter/state. Once the posterior distribution is determined, all statistical
inferences or estimates can be made. This procedure can be illustrated in the
following discrete-time dynamic state estimation problem.
Consider a class of nonlinear discrete-time dynamical systems:
$$\mathbf{x}_k = \mathbf{f}(\mathbf{x}_{k-1}) + \boldsymbol{\nu}_{k-1} \quad (1.37)$$
with an observation/measurement function that connects the state vector with the observation:
$$\mathbf{y}_k = \mathbf{h}(\mathbf{x}_k) + \mathbf{n}_k \quad (1.38)$$
where $\mathbf{x}_k \in \mathbb{R}^n$; $\mathbf{y}_k \in \mathbb{R}^m$; $\boldsymbol{\nu}_{k-1}$ and $\mathbf{n}_k$ are independent white Gaussian process noise and measurement noise with covariances $\mathbf{Q}_{k-1}$ and $\mathbf{R}_k$, respectively.
It is assumed that the discrete-time dynamic process (1.37) is a first-order Markov process in which the current state depends only on the previous state. The specific estimation problem is to estimate the state vector $\mathbf{x}_k$ based on the sequence of observation vectors $\mathbf{y}_{1:k} \triangleq \{\mathbf{y}_1, \ldots, \mathbf{y}_k\}$.

The estimation can be performed in the Bayesian estimation framework by turning the problem into an estimation of the conditional posterior density $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k})$. Recursive prediction and update procedures can be developed for the estimation of $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k})$.
Applying Bayes' rule (1.36) to the posterior distribution $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k})$ yields
$$p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k}) = \frac{p(\mathbf{y}_{1:k}\,|\,\mathbf{x}_k)\, p(\mathbf{x}_k)}{p(\mathbf{y}_{1:k})} \quad (1.39)$$
$p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k})$ is the PDF of $\mathbf{x}_k$ conditioned on all observations up to and including the current observation. Equation (1.39) can be rewritten as
$$p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k}) = \frac{p(\mathbf{y}_k, \mathbf{y}_{1:k-1}\,|\,\mathbf{x}_k)\, p(\mathbf{x}_k)}{p(\mathbf{y}_k, \mathbf{y}_{1:k-1})} = \frac{p(\mathbf{y}_k\,|\,\mathbf{y}_{1:k-1}, \mathbf{x}_k)\, p(\mathbf{y}_{1:k-1}\,|\,\mathbf{x}_k)\, p(\mathbf{x}_k)}{p(\mathbf{y}_k\,|\,\mathbf{y}_{1:k-1})\, p(\mathbf{y}_{1:k-1})} \quad (1.40)$$
Applying Bayes' rule to $p(\mathbf{y}_{1:k-1}\,|\,\mathbf{x}_k)$ yields
$$\begin{aligned}
p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k}) &= \frac{p(\mathbf{y}_k\,|\,\mathbf{y}_{1:k-1}, \mathbf{x}_k)\, p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1})\, p(\mathbf{y}_{1:k-1})\, p(\mathbf{x}_k)}{p(\mathbf{x}_k)\, p(\mathbf{y}_k\,|\,\mathbf{y}_{1:k-1})\, p(\mathbf{y}_{1:k-1})} \\
&= \frac{p(\mathbf{y}_k\,|\,\mathbf{y}_{1:k-1}, \mathbf{x}_k)\, p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1})}{p(\mathbf{y}_k\,|\,\mathbf{y}_{1:k-1})} \\
&= \frac{p(\mathbf{y}_k\,|\,\mathbf{x}_k)\, p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1})}{p(\mathbf{y}_k\,|\,\mathbf{y}_{1:k-1})}
\end{aligned} \quad (1.41)$$
Note that the last equality is obtained because, from the observation
Eq. (1.38), the observation at time k does not depend on the observations at
times 1: k – 1.
Since the denominator $p(\mathbf{y}_k\,|\,\mathbf{y}_{1:k-1})$ satisfies
$$p(\mathbf{y}_k\,|\,\mathbf{y}_{1:k-1}) = \int p(\mathbf{y}_k\,|\,\mathbf{x}_k)\, p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1})\, d\mathbf{x}_k \quad (1.42)$$
Equation (1.41) becomes
$$p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k}) = \frac{p(\mathbf{y}_k\,|\,\mathbf{x}_k)\, p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1})}{\int p(\mathbf{y}_k\,|\,\mathbf{x}_k)\, p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1})\, d\mathbf{x}_k} \quad (1.43)$$
where $p(\mathbf{y}_k\,|\,\mathbf{x}_k)$ is the likelihood function and $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1})$ can be obtained from the Chapman-Kolmogorov equation (Papoulis 2002)
$$p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1}) = \int p(\mathbf{x}_k\,|\,\mathbf{x}_{k-1})\, p(\mathbf{x}_{k-1}\,|\,\mathbf{y}_{1:k-1})\, d\mathbf{x}_{k-1} \quad (1.44)$$
where $p(\mathbf{x}_k\,|\,\mathbf{x}_{k-1})$ is the predictive PDF.
Equations (1.43) and (1.44) establish a recursive Bayesian filtering procedure. To initialize the filter, a prior posterior density $p(\mathbf{x}_{k-1}\,|\,\mathbf{y}_{1:k-1})$ can be assumed. The conditional PDF $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1})$ satisfies the Chapman-Kolmogorov Eq. (1.44). When the observation at time $k$ is available, the posterior conditional PDF $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k})$ can be calculated from Eq. (1.43).
Equation (1.44) is called the prediction formula while Eq. (1.43) is called the update formula. The moments, such as the mean and covariance, can be calculated from the PDF $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k})$. The filtering algorithm described previously can be presented in a block diagram, as shown in Fig. 1.1. Initially, $p(\mathbf{x}_{k-1}\,|\,\mathbf{y}_{1:k-1})$ is assumed to be known at time $k-1$, and then $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k-1})$ can be computed using Eq. (1.44). When the measurement at time $k$ arrives, the posterior density $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k})$ at time $k$ is calculated by the Bayesian update formula in Eq. (1.43). Then, the mean and covariance of the state at time $k$ can be calculated from $p(\mathbf{x}_k\,|\,\mathbf{y}_{1:k})$. State estimation can be recursively carried out by this procedure.
The recursive propagation of the posterior density given by Eqs. (1.43)
and (1.44) is only a conceptual solution, in the sense that (in general) it cannot
be determined analytically. Thus, one has to use approximations or suboptimal Bayesian algorithms. One focus of this book is to use various grid-based methods to approximate such a Bayesian estimation algorithm.
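As a preview of such approximations, the recursion (1.43)-(1.44) can be evaluated directly on a fixed grid of state values, which is essentially the point-mass filter discussed in Chapter 3. The Python sketch below uses an assumed scalar linear-Gaussian system $x_k = 0.8\,x_{k-1} + \nu_{k-1}$, $y_k = x_k + n_k$; the model, noise variances, and grid limits are illustrative only.

```python
import math
import random

random.seed(3)
Q, R = 0.3, 0.2                        # assumed process/measurement noise variances
xs = [i * 0.1 for i in range(-60, 61)] # state grid over [-6, 6]
dx = 0.1

def gauss(u, m, var):
    return math.exp(-0.5 * (u - m) ** 2 / var) / math.sqrt(2 * math.pi * var)

p = [gauss(x, 0.0, 1.0) for x in xs]   # prior p(x_0) = N(0, 1) on the grid

x_true, ests = 1.0, []
for _ in range(20):
    # Simulate the true system and a noisy measurement.
    x_true = 0.8 * x_true + random.gauss(0.0, math.sqrt(Q))
    y = x_true + random.gauss(0.0, math.sqrt(R))
    # Prediction, Eq. (1.44): the Chapman-Kolmogorov integral as a grid sum.
    pred = [sum(gauss(xj, 0.8 * xi, Q) * pi for xi, pi in zip(xs, p)) * dx
            for xj in xs]
    # Update, Eq. (1.43): multiply by the likelihood p(y|x) and normalize.
    post = [gauss(y, xj, R) * pj for xj, pj in zip(xs, pred)]
    c = sum(post) * dx                 # grid approximation of Eq. (1.42)
    p = [v / c for v in post]
    ests.append(sum(xj * pj for xj, pj in zip(xs, p)) * dx)  # posterior mean

norm = sum(p) * dx                     # the posterior integrates to 1
```

Each prediction step costs $O(N^2)$ in the number of grid points, which is one reason later chapters replace the dense grid with carefully chosen deterministic quadrature points.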
References

Arasaratnam, I. and S. Haykin. 2007. Discrete nonlinear filtering algorithms using Gauss-Hermite quadrature. Proceedings of the IEEE 95: 953–977.
Bar-Shalom, Y., X. Li and T. Kirubarajan. 2001. Estimation with Applications to Tracking and Navigation: Theory, Algorithms and Software. John Wiley & Sons, Inc., New York.
Fisher, R.A. 1912. On an absolute criterion for fitting frequency curves. Messenger of Mathematics 41: 155.
Gelb, A. 1974. Applied Optimal Estimation. MIT Press, Cambridge.
Gubner, J.A. 2006. Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press, New York.
Kailath, T., A.H. Sayed and B. Hassibi. 2000. Linear Estimation. Prentice Hall, Upper Saddle River.
Kalman, R.E. 1960. A new approach to linear filtering and prediction problems. Journal of Basic Engineering 82: 35–45.
Kolmogorov, A.N. 1941. Interpolation und Extrapolation von stationären zufälligen Folgen. Bull. Acad. Sci. USSR, Ser. Math. 5: 3–14.
Maybeck, P.S. 1979. Stochastic Models, Estimation, and Control, Vol. 1, Mathematics in Science and Engineering Series. Academic Press, New York.
Papoulis, A. 2002. Probability, Random Variables, and Stochastic Processes. 4th Edition, McGraw-Hill, New York.
Wiener, N. 1949. The Extrapolation, Interpolation and Smoothing of Stationary Time Series. John Wiley & Sons, Inc., New York.
Fig. 1.1 Bayesian filtering framework.
2. Linear Estimation of Dynamic Systems
In the previous chapter, we described the estimation problem from the Bayesian perspective. An estimation problem is usually hard to solve analytically for general stochastic dynamic systems, so only conceptual algorithms are given therein. The difficulties stem from the nonlinear dynamics and the computation of statistics that involves integrals of large dimensionality with respect to arbitrary PDFs. However, under the Gaussian and linearity assumptions, the estimation problems become tractable. The celebrated Kalman filter gained its great success because the recursive Kalman filtering algorithm, formulated in the time-domain state-space representation for linear dynamic systems with Gaussian statistics, can be readily implemented in digital computers. Numerous books on the Kalman filter have been published. Hence, this chapter provides a concise review of the Kalman filter without delving into its theoretical development and properties. We first present the most widely used discrete-time linear Kalman filter in Section 2.1, and then derive one of its variants, the information Kalman filter, in Section 2.2. Although the Kalman filter was not originally derived from Bayesian estimation, we will show in Section 2.3 that the Kalman filter is equivalent to the Bayesian estimation for a Gaussian and linear system. The continuous-time Kalman–Bucy filter is also summarized at the end of this chapter for completeness, although it is not much used in practice.
2.1 Linear Discrete-Time Kalman Filter
Consider a linear dynamic system described by the discrete-time state-space representation
\[
\mathbf{x}_k = \mathbf{F}_{k-1}\mathbf{x}_{k-1} + \mathbf{v}_{k-1} \tag{2.1}
\]
\[
\mathbf{y}_k = \mathbf{H}_k\mathbf{x}_k + \mathbf{n}_k \tag{2.2}
\]
where \(\mathbf{x}_k \in \mathbb{R}^{n_x}\) is the system state vector and \(\mathbf{y}_k \in \mathbb{R}^{n_p}\) is the measurement; \(n_x\) and \(n_p\) are the state and measurement dimensions, respectively. The additive process noise \(\mathbf{v}_{k-1} \in \mathbb{R}^{n_x}\) and measurement noise \(\mathbf{n}_k \in \mathbb{R}^{n_p}\) are assumed to be zero-mean, Gaussian, white noise.
The covariance matrices for \(\mathbf{v}_k\) and \(\mathbf{n}_k\) are given by
\[
E\big[\mathbf{v}_k\mathbf{v}_i^T\big] = \begin{cases} \mathbf{Q}_k, & i = k \\ \mathbf{0}, & i \neq k \end{cases}, \qquad
E\big[\mathbf{n}_k\mathbf{n}_i^T\big] = \begin{cases} \mathbf{R}_k, & i = k \\ \mathbf{0}, & i \neq k \end{cases} \tag{2.3}
\]
It is assumed that \(\mathbf{v}_k\), \(\mathbf{n}_k\), and \(\mathbf{x}_0\) are uncorrelated with one another, i.e.,
\[
E\big[\mathbf{v}_i\mathbf{n}_k^T\big] = \mathbf{0} \text{ for all } k \text{ and } i, \quad
E\big[\mathbf{v}_k\mathbf{x}_0^T\big] = \mathbf{0} \text{ for all } k, \quad
E\big[\mathbf{n}_k\mathbf{x}_0^T\big] = \mathbf{0} \text{ for all } k \tag{2.4}
\]
It can be shown that the discrete-time state-space system (2.1), (2.2) is a Gauss–Markov model. The filtering problem is to recover the system state \(\mathbf{x}_k\) from the measurements \(\mathbf{y}_k\). There are several different ways to derive the linear Kalman filter for this system according to different optimization criteria. We give the derivation based on the minimum mean square error (MMSE). Under the MMSE criterion, the conditional mean is the optimal estimate (Bar-Shalom et al. 2001), which is true not only for linear systems. We will now examine the conditional mean and covariance of the estimates.
We define two types of estimates based on the information upon which they are conditioned. The first one is the conditional mean
\[
\hat{\mathbf{x}}_{k|k} \triangleq E\big[\mathbf{x}_k \mid \mathbf{y}_{1:k}\big] \tag{2.5}
\]
based on \(\mathbf{y}_{1:k} \triangleq [\mathbf{y}_1, \ldots, \mathbf{y}_k]\), the measurement history up to time k. The estimate \(\hat{\mathbf{x}}_{k|k}\) is usually called the a posteriori estimate.

The second estimate is the propagated or predicted estimate through the given dynamics
\[
\hat{\mathbf{x}}_{k|k-1} \triangleq E\big[\mathbf{x}_k \mid \mathbf{y}_{1:k-1}\big] \tag{2.6}
\]
which describes how the state evolves between measurements. It is usually called the a priori estimate. The a priori covariance \(\mathbf{P}_{k|k-1}\) and the a posteriori covariance \(\mathbf{P}_{k|k}\) can be similarly defined.
The Kalman filtering includes the propagation/prediction step and the measurement update step. Assume that the conditional mean estimate \(\hat{\mathbf{x}}_{k-1|k-1}\) and the conditional covariance \(\mathbf{P}_{k-1|k-1}\) at time k – 1 are available. The predicted estimate can be derived as follows:
\[
\begin{aligned}
\hat{\mathbf{x}}_{k|k-1} &= E\big[\mathbf{x}_k \mid \mathbf{y}_{1:k-1}\big] \\
&= E\big[\mathbf{F}_{k-1}\mathbf{x}_{k-1} + \mathbf{v}_{k-1} \mid \mathbf{y}_{1:k-1}\big] \\
&= \mathbf{F}_{k-1}E\big[\mathbf{x}_{k-1} \mid \mathbf{y}_{1:k-1}\big] + E\big[\mathbf{v}_{k-1} \mid \mathbf{y}_{1:k-1}\big] \\
&= \mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} + \mathbf{0}
\end{aligned} \tag{2.7}
\]
Therefore,
\[
\hat{\mathbf{x}}_{k|k-1} = \mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \tag{2.8}
\]
The predicted covariance becomes
\[
\begin{aligned}
\mathbf{P}_{k|k-1} &= E\big[(\mathbf{x}_k - \hat{\mathbf{x}}_{k|k-1})(\mathbf{x}_k - \hat{\mathbf{x}}_{k|k-1})^T\big] \\
&= E\big[(\mathbf{F}_{k-1}\mathbf{x}_{k-1} + \mathbf{v}_{k-1} - \mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1})(\mathbf{F}_{k-1}\mathbf{x}_{k-1} + \mathbf{v}_{k-1} - \mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1})^T\big] \\
&= \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}
\end{aligned} \tag{2.9}
\]
where \(\mathbf{P}_{k-1|k-1} = E\big[(\mathbf{x}_{k-1} - \hat{\mathbf{x}}_{k-1|k-1})(\mathbf{x}_{k-1} - \hat{\mathbf{x}}_{k-1|k-1})^T\big]\). Note that the last equality is obtained because \((\mathbf{x}_{k-1} - \hat{\mathbf{x}}_{k-1|k-1})\) and \(\mathbf{v}_{k-1}\) are uncorrelated, i.e.,
\[
E\big[\mathbf{F}_{k-1}(\mathbf{x}_{k-1} - \hat{\mathbf{x}}_{k-1|k-1})\mathbf{v}_{k-1}^T\big] = \mathbf{F}_{k-1}E\big[(\mathbf{x}_{k-1} - \hat{\mathbf{x}}_{k-1|k-1})\mathbf{v}_{k-1}^T\big] = \mathbf{0} \tag{2.10}
\]
The estimate and covariance have been propagated from stage k – 1 to k. Next, at stage k, the conditional mean and covariance are to be updated by incorporating the current measurement \(\mathbf{y}_k\). To this end, define the innovation
\[
\tilde{\mathbf{y}}_k = \mathbf{y}_k - \mathbf{H}_k\hat{\mathbf{x}}_{k|k-1} \tag{2.11}
\]
The linear update equation is given by
\[
\hat{\mathbf{x}}_{k|k} = \hat{\mathbf{x}}_{k|k-1} + \mathbf{K}_k\tilde{\mathbf{y}}_k \tag{2.12}
\]
Note that this linear relationship between the estimate and measurement is a natural result of the Gaussian assumption and the choice of the conditional mean as the estimate (Bar-Shalom et al. 2001). \(\mathbf{K}_k\) is called the Kalman gain and can be derived from the MMSE criterion as follows.
The estimation error is given by
\[
\begin{aligned}
\mathbf{e}_{k|k} &= \mathbf{x}_k - \hat{\mathbf{x}}_{k|k} \\
&= \mathbf{F}_{k-1}\mathbf{e}_{k-1|k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{F}_{k-1}\mathbf{e}_{k-1|k-1} + \mathbf{v}_{k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{v}_{k-1} - \mathbf{K}_k\mathbf{n}_k \\
&= \big(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k\big)\mathbf{F}_{k-1}\mathbf{e}_{k-1|k-1} + \big(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k\big)\mathbf{v}_{k-1} - \mathbf{K}_k\mathbf{n}_k
\end{aligned} \tag{2.13}
\]
Next, the covariance matrix becomes
\[
\begin{aligned}
\mathbf{P}_{k|k} &= E\big[\mathbf{e}_{k|k}\mathbf{e}_{k|k}^T\big] \\
&= \big(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k\big)\big(\mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}\big)\big(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k\big)^T + \mathbf{K}_k\mathbf{R}_k\mathbf{K}_k^T \\
&= \big(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k\big)\mathbf{P}_{k|k-1}\big(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k\big)^T + \mathbf{K}_k\mathbf{R}_k\mathbf{K}_k^T \\
&= \mathbf{P}_{k|k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k|k-1} - \mathbf{P}_{k|k-1}\mathbf{H}_k^T\mathbf{K}_k^T + \mathbf{K}_k\big(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\big)\mathbf{K}_k^T
\end{aligned} \tag{2.14}
\]
The optimal Kalman gain can be found by letting \(\partial\,\mathrm{Tr}(\mathbf{P}_{k|k})/\partial\mathbf{K}_k = \mathbf{0}\). Note that
\[
\frac{\partial\,\mathrm{Tr}\big(\mathbf{P}_{k|k}\big)}{\partial\mathbf{K}_k} = -2\mathbf{P}_{k|k-1}\mathbf{H}_k^T + 2\mathbf{K}_k\big(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\big) \tag{2.15}
\]
Setting this derivative to zero and dividing by 2, we have
\[
\mathbf{K}_k = \mathbf{P}_{k|k-1}\mathbf{H}_k^T\big(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\big)^{-1} \tag{2.16}
\]
Using (2.16) in (2.14), the covariance matrix \(\mathbf{P}_{k|k}\) becomes
\[
\begin{aligned}
\mathbf{P}_{k|k} &= \mathbf{P}_{k|k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k|k-1} - \mathbf{P}_{k|k-1}\mathbf{H}_k^T\mathbf{K}_k^T + \mathbf{K}_k\big(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\big)\mathbf{K}_k^T \\
&= \mathbf{P}_{k|k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k|k-1} - \mathbf{P}_{k|k-1}\mathbf{H}_k^T\mathbf{K}_k^T + \mathbf{P}_{k|k-1}\mathbf{H}_k^T\mathbf{K}_k^T \\
&= \mathbf{P}_{k|k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k|k-1}
\end{aligned} \tag{2.17}
\]
The linear discrete-time Kalman filter is summarized in Table 2.1.
2.2 Information Kalman Filter
The Kalman filter equations can be presented in many different forms by
algebraic manipulations. One of the alternative forms is called the information
filter. It is computationally more efficient to handle the measurement update
when the measurement sets are much larger than the state-space dimension
because it can reduce the matrix inversion complexity in the Kalman filter
algorithm.
Recall the Kalman gain
\[
\mathbf{K}_k = \mathbf{P}_{k|k-1}\mathbf{H}_k^T\big(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\big)^{-1}
\]
Post-multiplying this equation by \(\big(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\big)\) yields
\[
\mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{K}_k\mathbf{R}_k = \mathbf{P}_{k|k-1}\mathbf{H}_k^T \tag{2.18}
\]
Recall the covariance update equation:
\[
\mathbf{P}_{k|k} = \mathbf{P}_{k|k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k|k-1}
\]
It can be rewritten as
\[
\mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k|k-1} = \mathbf{P}_{k|k-1} - \mathbf{P}_{k|k} \tag{2.19}
\]
Substituting this equation into (2.18) yields
\[
\big(\mathbf{P}_{k|k-1} - \mathbf{P}_{k|k}\big)\mathbf{H}_k^T + \mathbf{K}_k\mathbf{R}_k = \mathbf{P}_{k|k-1}\mathbf{H}_k^T \tag{2.20}
\]
Table 2.1 Discrete-time Kalman filter.

Discrete-time linear dynamics and measurement model:
\[
\mathbf{x}_k = \mathbf{F}_{k-1}\mathbf{x}_{k-1} + \mathbf{v}_{k-1} \tag{2.1}
\]
\[
\mathbf{y}_k = \mathbf{H}_k\mathbf{x}_k + \mathbf{n}_k \tag{2.2}
\]
Assumptions: \(\mathbf{v}_{k-1}\) and \(\mathbf{n}_k\) are white Gaussian process and measurement noise, respectively, with \(E[\mathbf{v}_{k-1}\mathbf{v}_{k-1}^T] = \mathbf{Q}_{k-1}\) and \(E[\mathbf{n}_k\mathbf{n}_k^T] = \mathbf{R}_k\); \(\mathbf{v}_{k-1}\) and \(\mathbf{n}_k\) are uncorrelated with \(\mathbf{x}_0\) and with each other.

Initialization: \(\hat{\mathbf{x}}_{0|0}\) and \(\mathbf{P}_{0|0}\).

Prediction:
\[
\hat{\mathbf{x}}_{k|k-1} = \mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \tag{2.8}
\]
\[
\mathbf{P}_{k|k-1} = \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1} \tag{2.9}
\]
Update:
\[
\hat{\mathbf{x}}_{k|k} = \hat{\mathbf{x}}_{k|k-1} + \mathbf{K}_k\big(\mathbf{y}_k - \mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\big) \tag{2.12}
\]
\[
\mathbf{P}_{k|k} = \mathbf{P}_{k|k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k|k-1} \tag{2.17}
\]
where
\[
\mathbf{K}_k = \mathbf{P}_{k|k-1}\mathbf{H}_k^T\big(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\big)^{-1} \tag{2.16}
\]
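The algorithm in Table 2.1 can be transcribed almost line for line into code. In the following sketch, the constant-velocity model and all numerical values are illustrative assumptions, not taken from the text:

```python
# One cycle of the discrete-time Kalman filter of Table 2.1.
# The constant-velocity model and values below are illustrative only.
import numpy as np

def kf_step(x, P, y, F, H, Q, R):
    # Prediction, Eqs. (2.8) and (2.9)
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Kalman gain, Eq. (2.16)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    # Update, Eqs. (2.12) and (2.17)
    x_upd = x_pred + K @ (y - H @ x_pred)
    P_upd = P_pred - K @ H @ P_pred
    return x_upd, P_upd

# 2-state constant-velocity example; only the position is measured
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.25]])

x, P = np.zeros(2), 10.0 * np.eye(2)          # diffuse initial estimate
for y in [0.9, 2.1, 2.9, 4.2]:                # noisy position readings
    x, P = kf_step(x, P, np.array([y]), F, H, Q, R)
```

Even though only positions are measured, the filter infers the velocity through the dynamics model, which is the practical appeal of the state-space formulation.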
It results in another Kalman gain expression
\[
\mathbf{K}_k = \mathbf{P}_{k|k}\mathbf{H}_k^T\mathbf{R}_k^{-1} \tag{2.21}
\]
Substituting this equation into the covariance update Eq. (2.17) and post-multiplying the result by \(\mathbf{P}_{k|k-1}^{-1}\) yields
\[
\mathbf{I} - \mathbf{P}_{k|k}\mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{H}_k = \mathbf{P}_{k|k}\mathbf{P}_{k|k-1}^{-1} \tag{2.22}
\]
Rearranging this equation gives
\[
\mathbf{P}_{k|k}\mathbf{P}_{k|k-1}^{-1} + \mathbf{P}_{k|k}\mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{H}_k = \mathbf{I} = \mathbf{P}_{k|k}\mathbf{P}_{k|k}^{-1} \tag{2.23}
\]
Factoring out the common factor \(\mathbf{P}_{k|k}\) leads to
\[
\mathbf{P}_{k|k}^{-1} = \mathbf{P}_{k|k-1}^{-1} + \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{H}_k \tag{2.24}
\]
The inverse of the covariance matrix in (2.24) is called the information matrix. Let's define the information matrix as \(\mathbf{Y} = \mathbf{P}^{-1}\). Then, the information update Eq. (2.24) becomes
\[
\mathbf{Y}_{k|k} = \mathbf{Y}_{k|k-1} + \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{H}_k \tag{2.25}
\]
For the covariance prediction, recall
\[
\mathbf{P}_{k|k-1} = \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}
\]
which can be rewritten as
\[
\mathbf{P}_{k|k-1}^{-1} = \big(\mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}\big)^{-1} \tag{2.26}
\]
Using the matrix inversion lemma
\[
\big(\mathbf{M} + \mathbf{N}\big)^{-1} = \mathbf{M}^{-1} - \mathbf{M}^{-1}\mathbf{N}\big(\mathbf{I} + \mathbf{M}^{-1}\mathbf{N}\big)^{-1}\mathbf{M}^{-1} \tag{2.27}
\]
with \(\mathbf{M} = \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T\) and \(\mathbf{N} = \mathbf{Q}_{k-1}\), and defining
\[
\boldsymbol{\Gamma}_{k-1} = \big(\mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T\big)^{-1} = \mathbf{F}_{k-1}^{-T}\mathbf{P}_{k-1|k-1}^{-1}\mathbf{F}_{k-1}^{-1} \tag{2.28}
\]
Equation (2.26) becomes
\[
\mathbf{Y}_{k|k-1} = \mathbf{P}_{k|k-1}^{-1} = \boldsymbol{\Gamma}_{k-1} - \boldsymbol{\Gamma}_{k-1}\big(\boldsymbol{\Gamma}_{k-1} + \mathbf{Q}_{k-1}^{-1}\big)^{-1}\boldsymbol{\Gamma}_{k-1} \tag{2.29}
\]
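A quick numerical spot check of Eq. (2.29): the information-matrix prediction should agree with directly inverting the covariance prediction \(\mathbf{P}_{k|k-1} = \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}\). The random symmetric positive-definite matrices below are illustrative test data, not from the text:

```python
# Check that Eq. (2.29) equals the inverse of Eq. (2.9) on random data.
import numpy as np

rng = np.random.default_rng(0)
n = 3
F = rng.standard_normal((n, n))               # generically invertible
A = rng.standard_normal((n, n))
P = A @ A.T + n * np.eye(n)                   # SPD prior covariance
B = rng.standard_normal((n, n))
Q = B @ B.T + n * np.eye(n)                   # SPD process noise covariance

G = np.linalg.inv(F @ P @ F.T)                # Gamma_{k-1}, Eq. (2.28)
Y_pred = G - G @ np.linalg.inv(G + np.linalg.inv(Q)) @ G   # Eq. (2.29)
Y_direct = np.linalg.inv(F @ P @ F.T + Q)     # inverse of Eq. (2.9)

err = np.max(np.abs(Y_pred - Y_direct))
```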
The state update and prediction equations in the information form can also be derived. Recall
\[
\hat{\mathbf{x}}_{k|k} = \hat{\mathbf{x}}_{k|k-1} + \mathbf{K}_k\big(\mathbf{y}_k - \mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\big)
\]
Pre-multiplying this equation by \(\mathbf{P}_{k|k}^{-1}\) yields
\[
\mathbf{P}_{k|k}^{-1}\hat{\mathbf{x}}_{k|k} = \mathbf{P}_{k|k}^{-1}\hat{\mathbf{x}}_{k|k-1} + \mathbf{P}_{k|k}^{-1}\mathbf{K}_k\big(\mathbf{y}_k - \mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\big) \tag{2.30}
\]
Defining an information state vector \(\hat{\mathbf{y}} = \mathbf{P}^{-1}\hat{\mathbf{x}}\), and substituting Eq. (2.21) into Eq. (2.30) lead to
\[
\begin{aligned}
\hat{\mathbf{y}}_{k|k} &= \mathbf{P}_{k|k}^{-1}\hat{\mathbf{x}}_{k|k-1} + \mathbf{H}_k^T\mathbf{R}_k^{-1}\big(\mathbf{y}_k - \mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\big) \\
&= \big(\mathbf{P}_{k|k}^{-1} - \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{H}_k\big)\hat{\mathbf{x}}_{k|k-1} + \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{y}_k \\
&= \mathbf{P}_{k|k-1}^{-1}\hat{\mathbf{x}}_{k|k-1} + \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{y}_k \\
&= \hat{\mathbf{y}}_{k|k-1} + \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{y}_k
\end{aligned} \tag{2.31}
\]
This is the measurement update equation for the information state.

Pre-multiplying the state prediction equation \(\hat{\mathbf{x}}_{k|k-1} = \mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1}\) by \(\mathbf{P}_{k|k-1}^{-1}\) yields
\[
\mathbf{P}_{k|k-1}^{-1}\hat{\mathbf{x}}_{k|k-1} = \mathbf{P}_{k|k-1}^{-1}\mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \tag{2.32}
\]
Using the definition of the information state and the information matrix prediction Eq. (2.29) in Eq. (2.32) leads to
\[
\begin{aligned}
\hat{\mathbf{y}}_{k|k-1} &= \Big[\boldsymbol{\Gamma}_{k-1} - \boldsymbol{\Gamma}_{k-1}\big(\boldsymbol{\Gamma}_{k-1} + \mathbf{Q}_{k-1}^{-1}\big)^{-1}\boldsymbol{\Gamma}_{k-1}\Big]\mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&= \Big[\mathbf{I} - \boldsymbol{\Gamma}_{k-1}\big(\boldsymbol{\Gamma}_{k-1} + \mathbf{Q}_{k-1}^{-1}\big)^{-1}\Big]\boldsymbol{\Gamma}_{k-1}\mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&= \Big[\mathbf{I} - \boldsymbol{\Gamma}_{k-1}\big(\boldsymbol{\Gamma}_{k-1} + \mathbf{Q}_{k-1}^{-1}\big)^{-1}\Big]\mathbf{F}_{k-1}^{-T}\mathbf{P}_{k-1|k-1}^{-1}\mathbf{F}_{k-1}^{-1}\mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&= \Big[\mathbf{I} - \boldsymbol{\Gamma}_{k-1}\big(\boldsymbol{\Gamma}_{k-1} + \mathbf{Q}_{k-1}^{-1}\big)^{-1}\Big]\mathbf{F}_{k-1}^{-T}\hat{\mathbf{y}}_{k-1|k-1}
\end{aligned} \tag{2.33}
\]
This is the information state prediction equation. The information form of the Kalman filter can be summarized in Table 2.2.
Both the discrete-time Kalman filter and the information filter generate identical results. When there are many more measurement sets than states, the information form can avoid the large matrix inversion \(\big(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\big)^{-1}\) needed in the Kalman gain calculation. Suppose there are m measurement sets and the \(\mathbf{R}_k\) matrix is block diagonal. Equation (2.25) can be rewritten as
\[
\mathbf{Y}_{k|k} = \mathbf{Y}_{k|k-1} + \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{H}_k
= \mathbf{Y}_{k|k-1} +
\begin{bmatrix} \mathbf{H}_{k,1}^T & \mathbf{H}_{k,2}^T & \cdots & \mathbf{H}_{k,m}^T \end{bmatrix}
\begin{bmatrix}
\mathbf{R}_{k,1}^{-1} & \mathbf{0} & \cdots & \mathbf{0} \\
\mathbf{0} & \mathbf{R}_{k,2}^{-1} & \cdots & \mathbf{0} \\
\vdots & \vdots & \ddots & \vdots \\
\mathbf{0} & \mathbf{0} & \cdots & \mathbf{R}_{k,m}^{-1}
\end{bmatrix}
\begin{bmatrix} \mathbf{H}_{k,1} \\ \mathbf{H}_{k,2} \\ \vdots \\ \mathbf{H}_{k,m} \end{bmatrix} \tag{2.34}
\]
The block-diagonal \(\mathbf{R}_k\) means that each measurement set \(\mathbf{y}_{k,i}\ (i = 1, \ldots, m)\) at time step k is uncorrelated with the other measurement sets. Note that if the original \(\mathbf{R}_k\) is not in this form, we can always perform the decorrelating transformation or stochastic decomposition mentioned in Section 1.2 to make \(\mathbf{R}_k\) block diagonal (Kailath et al. 2000).
Equation (2.34) can be rewritten as
\[
\mathbf{Y}_{k|k} = \mathbf{Y}_{k|k-1} + \sum_{i=1}^{m} \mathbf{H}_{k,i}^T\mathbf{R}_{k,i}^{-1}\mathbf{H}_{k,i} \tag{2.35}
\]
This form gives an efficient way to update the information matrix by processing the measurement sets one at a time until all the measurement sets at time k are assimilated. The results are the same as if we process all measurement sets at once as a large measurement vector. It can be applied to the information state update Eq. (2.31) in the same way. After this one-at-a-time measurement update, the prediction step can proceed as usual.

Table 2.2 Information filter.

Discrete-time linear dynamics and measurement model:
\[
\mathbf{x}_k = \mathbf{F}_{k-1}\mathbf{x}_{k-1} + \mathbf{v}_{k-1} \tag{2.1}
\]
\[
\mathbf{y}_k = \mathbf{H}_k\mathbf{x}_k + \mathbf{n}_k \tag{2.2}
\]
Assumptions: \(\mathbf{v}_{k-1}\) and \(\mathbf{n}_k\) are white Gaussian process and measurement noise, respectively, with \(E[\mathbf{v}_{k-1}\mathbf{v}_{k-1}^T] = \mathbf{Q}_{k-1}\) and \(E[\mathbf{n}_k\mathbf{n}_k^T] = \mathbf{R}_k\); \(\mathbf{v}_{k-1}\) and \(\mathbf{n}_k\) are uncorrelated with \(\mathbf{x}_0\) and with each other.

Initialization: \(\hat{\mathbf{x}}_{0|0}\) and \(\mathbf{P}_{0|0}\), with the information matrix \(\mathbf{Y}_{0|0} = \mathbf{P}_{0|0}^{-1}\) and the information state vector \(\hat{\mathbf{y}}_{0|0} = \mathbf{Y}_{0|0}\hat{\mathbf{x}}_{0|0}\).

Prediction:
\[
\hat{\mathbf{y}}_{k|k-1} = \Big[\mathbf{I} - \boldsymbol{\Gamma}_{k-1}\big(\boldsymbol{\Gamma}_{k-1} + \mathbf{Q}_{k-1}^{-1}\big)^{-1}\Big]\mathbf{F}_{k-1}^{-T}\hat{\mathbf{y}}_{k-1|k-1} \tag{2.33}
\]
\[
\mathbf{Y}_{k|k-1} = \boldsymbol{\Gamma}_{k-1} - \boldsymbol{\Gamma}_{k-1}\big(\boldsymbol{\Gamma}_{k-1} + \mathbf{Q}_{k-1}^{-1}\big)^{-1}\boldsymbol{\Gamma}_{k-1} \tag{2.29}
\]
where \(\boldsymbol{\Gamma}_{k-1} = \mathbf{F}_{k-1}^{-T}\mathbf{P}_{k-1|k-1}^{-1}\mathbf{F}_{k-1}^{-1}\).

Update:
\[
\hat{\mathbf{y}}_{k|k} = \hat{\mathbf{y}}_{k|k-1} + \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{y}_k \tag{2.31}
\]
\[
\mathbf{Y}_{k|k} = \mathbf{Y}_{k|k-1} + \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{H}_k \tag{2.25}
\]
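The one-at-a-time assimilation of Eq. (2.35) can be checked numerically: processing m uncorrelated measurement sets sequentially must give the same information matrix and information state as one batch update using Eq. (2.31) with the stacked measurement vector. The dimensions and data below are made up for illustration:

```python
# Sequential vs. batch information update for block-diagonal R_k, Eq. (2.35).
# All dimensions and data here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
nx, blocks = 2, [1, 2, 1]                      # m = 3 measurement sets
H = [rng.standard_normal((p, nx)) for p in blocks]
R = [np.eye(p) * (i + 1) for i, p in enumerate(blocks)]
y = [rng.standard_normal(p) for p in blocks]

Y = np.eye(nx)                                 # predicted Y_{k|k-1}
yi = np.zeros(nx)                              # predicted information state

# sequential assimilation: one measurement set at a time
Y_seq, yi_seq = Y.copy(), yi.copy()
for Hi, Ri, yk in zip(H, R, y):
    Y_seq = Y_seq + Hi.T @ np.linalg.inv(Ri) @ Hi    # matrix contribution
    yi_seq = yi_seq + Hi.T @ np.linalg.inv(Ri) @ yk  # state contribution

# batch update with the stacked measurement model
Hb, yb = np.vstack(H), np.concatenate(y)
Rb = np.zeros((sum(blocks), sum(blocks)))
i = 0
for Ri in R:                                   # assemble block-diagonal R_k
    p = Ri.shape[0]
    Rb[i:i + p, i:i + p] = Ri
    i += p
Y_batch = Y + Hb.T @ np.linalg.inv(Rb) @ Hb
yi_batch = yi + Hb.T @ np.linalg.inv(Rb) @ yb
```

Only small \(p_i \times p_i\) inverses are needed in the sequential form, which is the computational advantage discussed above.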
There is another form of the information filter given by Mutambara (1998) for the measurement update:
\[
\hat{\mathbf{y}}_{k|k} = \hat{\mathbf{y}}_{k|k-1} + \mathbf{i}_k \tag{2.36}
\]
\[
\mathbf{Y}_{k|k} = \mathbf{Y}_{k|k-1} + \mathbf{I}_k \tag{2.37}
\]
where \(\mathbf{i}_k = \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{y}_k\) and \(\mathbf{I}_k = \mathbf{H}_k^T\mathbf{R}_k^{-1}\mathbf{H}_k\) are called the information state contribution and the information matrix contribution, respectively.
The information filter is especially useful when the measurements come
from multiple sensors, which will be revisited in Chapter 6 for multiple sensor
estimation application.
2.3 The Relation Between the Bayesian Estimation and Kalman Filter

In Section 2.1, the linear discrete Kalman filter was derived by minimizing the MSE. In this section, we will show the relationship between this filter and the Bayesian estimation in the prediction and update steps (Šimandl 2006).
Prediction

Based on the Chapman–Kolmogorov Eq. (1.44), we have
\[
\begin{aligned}
p(\mathbf{x}_k \mid \mathbf{y}_{1:k-1}) &= \int p(\mathbf{x}_k \mid \mathbf{x}_{k-1})\, p(\mathbf{x}_{k-1} \mid \mathbf{y}_{1:k-1})\, d\mathbf{x}_{k-1} \\
&= \int N\big(\mathbf{x}_k; \mathbf{F}_{k-1}\mathbf{x}_{k-1}, \mathbf{Q}_{k-1}\big)\, N\big(\mathbf{x}_{k-1}; \hat{\mathbf{x}}_{k-1|k-1}, \mathbf{P}_{k-1|k-1}\big)\, d\mathbf{x}_{k-1} \\
&= a \int \exp\!\Big(-\frac{1}{2}b\Big)\, d\mathbf{x}_{k-1}
\end{aligned} \tag{2.38}
\]
where
\[
a = \frac{1}{(2\pi)^{n_x/2}\det\big(\mathbf{Q}_{k-1}\big)^{1/2}\,(2\pi)^{n_x/2}\det\big(\mathbf{P}_{k-1|k-1}\big)^{1/2}} \tag{2.39}
\]
and
\[
\begin{aligned}
b &= \big(\mathbf{x}_k - \mathbf{F}_{k-1}\mathbf{x}_{k-1}\big)^T\mathbf{Q}_{k-1}^{-1}\big(\mathbf{x}_k - \mathbf{F}_{k-1}\mathbf{x}_{k-1}\big) + \big(\mathbf{x}_{k-1} - \hat{\mathbf{x}}_{k-1|k-1}\big)^T\mathbf{P}_{k-1|k-1}^{-1}\big(\mathbf{x}_{k-1} - \hat{\mathbf{x}}_{k-1|k-1}\big) \\
&= \mathbf{x}_{k-1}^T\big(\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1} + \mathbf{P}_{k-1|k-1}^{-1}\big)\mathbf{x}_{k-1}
 - \mathbf{x}_{k-1}^T\big(\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{x}_k + \mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1}\big) \\
&\quad - \big(\mathbf{x}_k^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1} + \hat{\mathbf{x}}_{k-1|k-1}^T\mathbf{P}_{k-1|k-1}^{-1}\big)\mathbf{x}_{k-1}
 + \mathbf{x}_k^T\mathbf{Q}_{k-1}^{-1}\mathbf{x}_k + \hat{\mathbf{x}}_{k-1|k-1}^T\mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&= \mathbf{x}_{k-1}^T\mathbf{A}\mathbf{x}_{k-1} - \mathbf{x}_{k-1}^T\mathbf{b} - \mathbf{b}^T\mathbf{x}_{k-1} + c
\end{aligned} \tag{2.40}
\]
where \(\mathbf{A} = \mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1} + \mathbf{P}_{k-1|k-1}^{-1}\), \(\mathbf{b} = \mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{x}_k + \mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1}\), and \(c = \mathbf{x}_k^T\mathbf{Q}_{k-1}^{-1}\mathbf{x}_k + \hat{\mathbf{x}}_{k-1|k-1}^T\mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1}\).
Equation (2.40) can be rewritten as
\[
b = \big(\mathbf{x}_{k-1} - \mathbf{A}^{-1}\mathbf{b}\big)^T\mathbf{A}\big(\mathbf{x}_{k-1} - \mathbf{A}^{-1}\mathbf{b}\big) - \mathbf{b}^T\mathbf{A}^{-1}\mathbf{b} + c \tag{2.41}
\]
and Eq. (2.38) can then be rewritten as
\[
\begin{aligned}
p(\mathbf{x}_k \mid \mathbf{y}_{1:k-1}) &= a \int \exp\!\Big(-\frac{1}{2}b\Big)\, d\mathbf{x}_{k-1} \\
&= a \int \exp\!\Big(-\frac{1}{2}\Big[\big(\mathbf{x}_{k-1} - \mathbf{A}^{-1}\mathbf{b}\big)^T\mathbf{A}\big(\mathbf{x}_{k-1} - \mathbf{A}^{-1}\mathbf{b}\big) - \mathbf{b}^T\mathbf{A}^{-1}\mathbf{b} + c\Big]\Big)\, d\mathbf{x}_{k-1} \\
&= a \exp\!\Big(-\frac{1}{2}\big(c - \mathbf{b}^T\mathbf{A}^{-1}\mathbf{b}\big)\Big)
 \int \exp\!\Big(-\frac{1}{2}\big(\mathbf{x}_{k-1} - \mathbf{A}^{-1}\mathbf{b}\big)^T\mathbf{A}\big(\mathbf{x}_{k-1} - \mathbf{A}^{-1}\mathbf{b}\big)\Big)\, d\mathbf{x}_{k-1} \\
&= a \exp\!\Big(-\frac{1}{2}\big(c - \mathbf{b}^T\mathbf{A}^{-1}\mathbf{b}\big)\Big)\,(2\pi)^{n_x/2}\det\big(\mathbf{A}^{-1}\big)^{1/2}
\end{aligned} \tag{2.42}
\]
Re-organizing \(c - \mathbf{b}^T\mathbf{A}^{-1}\mathbf{b}\) yields
\[
\begin{aligned}
c - \mathbf{b}^T\mathbf{A}^{-1}\mathbf{b}
&= \mathbf{x}_k^T\mathbf{Q}_{k-1}^{-1}\mathbf{x}_k + \hat{\mathbf{x}}_{k-1|k-1}^T\mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&\quad - \big(\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{x}_k + \mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1}\big)^T\mathbf{A}^{-1}\big(\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{x}_k + \mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1}\big) \\
&= \mathbf{x}_k^T\big(\mathbf{Q}_{k-1}^{-1} - \mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1}\mathbf{A}^{-1}\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\big)\mathbf{x}_k
 - \mathbf{x}_k^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1}\mathbf{A}^{-1}\mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&\quad - \hat{\mathbf{x}}_{k-1|k-1}^T\mathbf{P}_{k-1|k-1}^{-1}\mathbf{A}^{-1}\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{x}_k
 + \hat{\mathbf{x}}_{k-1|k-1}^T\big(\mathbf{P}_{k-1|k-1}^{-1} - \mathbf{P}_{k-1|k-1}^{-1}\mathbf{A}^{-1}\mathbf{P}_{k-1|k-1}^{-1}\big)\hat{\mathbf{x}}_{k-1|k-1}
\end{aligned} \tag{2.43}
\]
Because \(p(\mathbf{x}_k \mid \mathbf{y}_{1:k-1})\) is also a Gaussian distribution, it can be defined as
\[
p(\mathbf{x}_k \mid \mathbf{y}_{1:k-1}) = N\big(\mathbf{x}_k; \hat{\mathbf{x}}_{k|k-1}, \mathbf{P}_{k|k-1}\big)
= \frac{\exp\!\Big(-\frac{1}{2}\big(\mathbf{x}_k - \hat{\mathbf{x}}_{k|k-1}\big)^T\mathbf{P}_{k|k-1}^{-1}\big(\mathbf{x}_k - \hat{\mathbf{x}}_{k|k-1}\big)\Big)}{(2\pi)^{n_x/2}\det\big(\mathbf{P}_{k|k-1}\big)^{1/2}} \tag{2.44}
\]
Comparing the first quadratic term in Eq. (2.43) with the exponential term in (2.44), we have
\[
\mathbf{Q}_{k-1}^{-1} - \mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1}\mathbf{A}^{-1}\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1} = \mathbf{P}_{k|k-1}^{-1} \tag{2.45}
\]
Using the expression of \(\mathbf{A}\) in the left-hand side of Eq. (2.45) gives
\[
\mathbf{Q}_{k-1}^{-1} - \mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1}\mathbf{A}^{-1}\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}
= \mathbf{Q}_{k-1}^{-1} - \mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1}\big(\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1} + \mathbf{P}_{k-1|k-1}^{-1}\big)^{-1}\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1} \tag{2.46}
\]
Using the matrix inversion lemma (Kailath et al. 2000),
\[
\big(\mathbf{A} + \mathbf{B}\mathbf{C}\mathbf{D}\big)^{-1} = \mathbf{A}^{-1} - \mathbf{A}^{-1}\mathbf{B}\big(\mathbf{D}\mathbf{A}^{-1}\mathbf{B} + \mathbf{C}^{-1}\big)^{-1}\mathbf{D}\mathbf{A}^{-1} \tag{2.47}
\]
Equation (2.46) becomes
\[
\mathbf{Q}_{k-1}^{-1} - \mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1}\mathbf{A}^{-1}\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1} = \big(\mathbf{Q}_{k-1} + \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T\big)^{-1} \tag{2.48}
\]
Hence, we have
\[
\mathbf{P}_{k|k-1} = \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1} \tag{2.49}
\]
which is Eq. (2.9), the covariance prediction of the discrete-time Kalman filter.
Matching the second term in Eq. (2.43) with the corresponding term in (2.44) yields
\[
\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1}\mathbf{A}^{-1}\mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1} = \mathbf{P}_{k|k-1}^{-1}\hat{\mathbf{x}}_{k|k-1} \tag{2.50}
\]
Substituting the expression of \(\mathbf{A}\) into the left-hand side of Eq. (2.50) and applying the matrix inversion lemma (2.47) leads to
\[
\begin{aligned}
&\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1}\big(\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1} + \mathbf{P}_{k-1|k-1}^{-1}\big)^{-1}\mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&= \mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1}\Big[\mathbf{P}_{k-1|k-1} - \mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T\big(\mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}\big)^{-1}\mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\Big]\mathbf{P}_{k-1|k-1}^{-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&= \mathbf{Q}_{k-1}^{-1}\Big[\mathbf{I} - \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T\big(\mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}\big)^{-1}\Big]\mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&= \big(\mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}\big)^{-1}\mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&= \mathbf{P}_{k|k-1}^{-1}\mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1}
\end{aligned} \tag{2.51}
\]
By comparing Eq. (2.50) with Eq. (2.51), we have
\[
\hat{\mathbf{x}}_{k|k-1} = \mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \tag{2.52}
\]
which is Eq. (2.8), the state prediction of the discrete-time Kalman filter.
The summation of the last two terms in Eq. (2.43) is given by
\[
\begin{aligned}
&\hat{\mathbf{x}}_{k-1|k-1}^T\big(\mathbf{P}_{k-1|k-1}^{-1} - \mathbf{P}_{k-1|k-1}^{-1}\mathbf{A}^{-1}\mathbf{P}_{k-1|k-1}^{-1}\big)\hat{\mathbf{x}}_{k-1|k-1} \\
&= \hat{\mathbf{x}}_{k-1|k-1}^T\Big[\mathbf{P}_{k-1|k-1}^{-1} - \mathbf{P}_{k-1|k-1}^{-1}\big(\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1} + \mathbf{P}_{k-1|k-1}^{-1}\big)^{-1}\mathbf{P}_{k-1|k-1}^{-1}\Big]\hat{\mathbf{x}}_{k-1|k-1} \\
&= \hat{\mathbf{x}}_{k-1|k-1}^T\mathbf{F}_{k-1}^T\big(\mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}\big)^{-1}\mathbf{F}_{k-1}\hat{\mathbf{x}}_{k-1|k-1} \\
&= \hat{\mathbf{x}}_{k|k-1}^T\mathbf{P}_{k|k-1}^{-1}\hat{\mathbf{x}}_{k|k-1}
\end{aligned} \tag{2.53}
\]
where the second equality uses the matrix inversion lemma (2.47) and the third uses Eqs. (2.49) and (2.52).
Considering Eqs. (2.49)–(2.53), the exponential terms in Eq. (2.42) and Eq. (2.44) are the same. It remains to show that the constants in Eqs. (2.42) and (2.44) are also the same.
\[
\begin{aligned}
a\,(2\pi)^{n_x/2}\det\big(\mathbf{A}^{-1}\big)^{1/2}
&= \frac{(2\pi)^{n_x/2}\,\det\big(\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1} + \mathbf{P}_{k-1|k-1}^{-1}\big)^{-1/2}}{(2\pi)^{n_x/2}\det\big(\mathbf{Q}_{k-1}\big)^{1/2}\,(2\pi)^{n_x/2}\det\big(\mathbf{P}_{k-1|k-1}\big)^{1/2}} \\
&= \frac{1}{(2\pi)^{n_x/2}\Big[\det\big(\mathbf{Q}_{k-1}\big)\det\big(\mathbf{P}_{k-1|k-1}\big)\det\big(\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1} + \mathbf{P}_{k-1|k-1}^{-1}\big)\Big]^{1/2}} \\
&= \frac{1}{(2\pi)^{n_x/2}\Big[\det\big(\mathbf{Q}_{k-1}\big)\det\big(\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\mathbf{F}_{k-1} + \mathbf{I}\big)\Big]^{1/2}}
\end{aligned} \tag{2.54}
\]
By using the identity
\[
\det\big(\mathbf{I} + \mathbf{A}\mathbf{B}\big) = \det\big(\mathbf{I} + \mathbf{B}\mathbf{A}\big) \tag{2.55}
\]
Equation (2.54) becomes
\[
\begin{aligned}
a\,(2\pi)^{n_x/2}\det\big(\mathbf{A}^{-1}\big)^{1/2}
&= \frac{1}{(2\pi)^{n_x/2}\Big[\det\big(\mathbf{Q}_{k-1}\big)\det\big(\mathbf{I} + \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T\mathbf{Q}_{k-1}^{-1}\big)\Big]^{1/2}} \\
&= \frac{1}{(2\pi)^{n_x/2}\det\big(\mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}\big)^{1/2}} \\
&= \frac{1}{(2\pi)^{n_x/2}\det\big(\mathbf{P}_{k|k-1}\big)^{1/2}}
\end{aligned} \tag{2.56}
\]
Hence, the constant parts in Eqs. (2.42) and (2.44) are the same. Since both the exponential terms and the constants in Eqs. (2.42) and (2.44) are the same, Eqs. (2.42) and (2.44) are equivalent.
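The constant computation in Eqs. (2.54)–(2.56), including the determinant identity (2.55), can also be spot-checked numerically. The following sketch assumes NumPy; the sizes and helper `random_spd` are hypothetical:

```python
import numpy as np

# Spot-check of Eqs. (2.54)-(2.56): using det(I + AB) = det(I + BA), the
# normalizing constant a reduces to 1 / ((2 pi)^(n/2) det(F P F^T + Q)^(1/2)).
rng = np.random.default_rng(2)
n = 3

def random_spd(d):
    M = rng.standard_normal((d, d))
    return M @ M.T + d * np.eye(d)

P = random_spd(n)                    # P_{k-1|k-1}
Q = random_spd(n)                    # Q_{k-1}
F = rng.standard_normal((n, n))      # F_{k-1}

# Eq. (2.55) holds even for non-square factors:
A1 = rng.standard_normal((2, 5))
B1 = rng.standard_normal((5, 2))
assert np.isclose(np.linalg.det(np.eye(2) + A1 @ B1),
                  np.linalg.det(np.eye(5) + B1 @ A1))

# Constant of Eq. (2.54) vs. its closed form in Eq. (2.56):
A = F.T @ np.linalg.inv(Q) @ F + np.linalg.inv(P)
a = ((2 * np.pi) ** (n / 2) * np.linalg.det(np.linalg.inv(A)) ** 0.5
     / ((2 * np.pi) ** n * (np.linalg.det(Q) * np.linalg.det(P)) ** 0.5))
target = 1.0 / ((2 * np.pi) ** (n / 2)
                * np.linalg.det(F @ P @ F.T + Q) ** 0.5)
assert np.isclose(a, target)
```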
Update
Using the Bayesian equation, we have
$$
\begin{aligned}
p\left(\mathbf{x}_k|\mathbf{y}_k\right)&=\frac{p\left(\mathbf{y}_k|\mathbf{x}_k\right)p\left(\mathbf{x}_k|\mathbf{y}_{k-1}\right)}{p\left(\mathbf{y}_k|\mathbf{y}_{k-1}\right)} \\
&=\frac{N\left(\mathbf{y}_k;\mathbf{H}_k\mathbf{x}_k,\mathbf{R}_k\right)N\left(\mathbf{x}_k;\hat{\mathbf{x}}_{k|k-1},\mathbf{P}_{k|k-1}\right)}{N\left(\mathbf{y}_k;\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1},\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)} \\
&=\frac{\left(2\pi\right)^{n_z/2}\det\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{1/2}}{\left(2\pi\right)^{n_z/2}\det\left(\mathbf{R}_k\right)^{1/2}\left(2\pi\right)^{n_x/2}\det\left(\mathbf{P}_{k|k-1}\right)^{1/2}} \\
&\quad\times\exp\left[-\frac{1}{2}\left(\mathbf{y}_k-\mathbf{H}_k\mathbf{x}_k\right)^{T}\mathbf{R}_k^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\mathbf{x}_k\right)-\frac{1}{2}\left(\mathbf{x}_k-\hat{\mathbf{x}}_{k|k-1}\right)^{T}\mathbf{P}_{k|k-1}^{-1}\left(\mathbf{x}_k-\hat{\mathbf{x}}_{k|k-1}\right)\right. \\
&\qquad\left.+\frac{1}{2}\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right)^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right)\right]
\end{aligned}
\tag{2.57}
$$
It can be seen that \(p\left(\mathbf{x}_k|\mathbf{y}_k\right)\) also satisfies the normal distribution. Hence, we assume,
$$
p\left(\mathbf{x}_k|\mathbf{y}_k\right)=N\left(\mathbf{x}_k;\hat{\mathbf{x}}_{k|k},\mathbf{P}_{k|k}\right)=\frac{\exp\left[-\dfrac{1}{2}\left(\mathbf{x}_k-\hat{\mathbf{x}}_{k|k}\right)^{T}\mathbf{P}_{k|k}^{-1}\left(\mathbf{x}_k-\hat{\mathbf{x}}_{k|k}\right)\right]}{\left(2\pi\right)^{n_x/2}\det\left(\mathbf{P}_{k|k}\right)^{1/2}}
\tag{2.58}
$$
26 Grid-based Nonlinear Estimation and Its Applications
Equation (2.57) can be rewritten as
$$
p\left(\mathbf{x}_k|\mathbf{y}_k\right)=\bar{C}\cdot e^{-\frac{1}{2}\bar{D}}
\tag{2.59}
$$
where
$$
\bar{C}=\frac{\left(2\pi\right)^{n_z/2}\det\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{1/2}}{\left(2\pi\right)^{n_z/2}\det\left(\mathbf{R}_k\right)^{1/2}\left(2\pi\right)^{n_x/2}\det\left(\mathbf{P}_{k|k-1}\right)^{1/2}}
\tag{2.60}
$$
$$
\begin{aligned}
\bar{D}&=\left(\mathbf{y}_k-\mathbf{H}_k\mathbf{x}_k\right)^{T}\mathbf{R}_k^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\mathbf{x}_k\right)+\left(\mathbf{x}_k-\hat{\mathbf{x}}_{k|k-1}\right)^{T}\mathbf{P}_{k|k-1}^{-1}\left(\mathbf{x}_k-\hat{\mathbf{x}}_{k|k-1}\right) \\
&\quad-\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right)^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right) \\
&=\mathbf{x}_k^{T}\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)\mathbf{x}_k-\mathbf{x}_k^{T}\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{y}_k+\mathbf{P}_{k|k-1}^{-1}\hat{\mathbf{x}}_{k|k-1}\right) \\
&\quad-\left(\mathbf{y}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\hat{\mathbf{x}}_{k|k-1}^{T}\mathbf{P}_{k|k-1}^{-1}\right)\mathbf{x}_k \\
&\quad+\mathbf{y}_k^{T}\left[\mathbf{R}_k^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\right]\mathbf{y}_k+\mathbf{y}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1} \\
&\quad+\hat{\mathbf{x}}_{k|k-1}^{T}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{y}_k \\
&\quad+\hat{\mathbf{x}}_{k|k-1}^{T}\left[\mathbf{P}_{k|k-1}^{-1}-\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\right]\hat{\mathbf{x}}_{k|k-1}
\end{aligned}
\tag{2.61}
$$
Similarly, Eq. (2.58) can be rewritten as,
$$
p\left(\mathbf{x}_k|\mathbf{y}_k\right)=C\cdot e^{-\frac{1}{2}D}
\tag{2.62}
$$
where
$$
C=\frac{1}{\left(2\pi\right)^{n_x/2}\det\left(\mathbf{P}_{k|k}\right)^{1/2}}
\tag{2.63}
$$
$$
\begin{aligned}
D&=\left(\mathbf{x}_k-\hat{\mathbf{x}}_{k|k}\right)^{T}\mathbf{P}_{k|k}^{-1}\left(\mathbf{x}_k-\hat{\mathbf{x}}_{k|k}\right) \\
&=\mathbf{x}_k^{T}\mathbf{P}_{k|k}^{-1}\mathbf{x}_k-\mathbf{x}_k^{T}\mathbf{P}_{k|k}^{-1}\hat{\mathbf{x}}_{k|k}-\hat{\mathbf{x}}_{k|k}^{T}\mathbf{P}_{k|k}^{-1}\mathbf{x}_k+\hat{\mathbf{x}}_{k|k}^{T}\mathbf{P}_{k|k}^{-1}\hat{\mathbf{x}}_{k|k}
\end{aligned}
\tag{2.64}
$$
To show the equivalence of Eq. (2.57) and Eq. (2.58), we first show that \(\bar{D}=D\). By comparing the first terms of Eq. (2.61) and Eq. (2.64), we have,
$$
\mathbf{P}_{k|k}^{-1}=\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}
\tag{2.65}
$$
which is true because it is the information update Eq. (2.24).
By comparing the second term of Eq. (2.61) with the third term of
Eq. (2.64), we have,
$$
\mathbf{P}_{k|k}^{-1}\hat{\mathbf{x}}_{k|k}=\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{y}_k+\mathbf{P}_{k|k-1}^{-1}\hat{\mathbf{x}}_{k|k-1}
\tag{2.66}
$$
which is true as well because it is the information state update Eq. (2.31).
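The information updates (2.65) and (2.66) can be checked against the familiar gain-form Kalman update. The sketch below assumes NumPy; the 4-state / 2-measurement model and `random_spd` are hypothetical:

```python
import numpy as np

# Spot-check of Eqs. (2.65)-(2.66): the inverse of the updated covariance
# and the scaled updated mean match the information-form expressions.
rng = np.random.default_rng(3)
n, m = 4, 2

def random_spd(d):
    M = rng.standard_normal((d, d))
    return M @ M.T + d * np.eye(d)

P = random_spd(n)                    # P_{k|k-1}
R = random_spd(m)                    # R_k
H = rng.standard_normal((m, n))      # H_k
xh = rng.standard_normal(n)          # xhat_{k|k-1}
y = rng.standard_normal(m)           # y_k

S = H @ P @ H.T + R                  # innovation covariance
K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
P_upd = P - K @ H @ P                # P_{k|k}, gain form
x_upd = xh + K @ (y - H @ xh)        # xhat_{k|k}, gain form

Ri = np.linalg.inv(R)
# Eq. (2.65): P_{k|k}^{-1} = H^T R^{-1} H + P_{k|k-1}^{-1}
assert np.allclose(np.linalg.inv(P_upd), H.T @ Ri @ H + np.linalg.inv(P))
# Eq. (2.66): P_{k|k}^{-1} xhat_{k|k} = H^T R^{-1} y + P_{k|k-1}^{-1} xhat_{k|k-1}
assert np.allclose(np.linalg.inv(P_upd) @ x_upd,
                   H.T @ Ri @ y + np.linalg.inv(P) @ xh)
```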
Note that, by Eq. (2.66), the second and third terms of Eq. (2.61) and Eq. (2.64) are also the same. Thus, it remains to prove that the summation of the last four terms in Eq. (2.61) is equivalent to the last term of Eq. (2.64), i.e.,
$$
\begin{aligned}
&\mathbf{y}_k^{T}\left[\mathbf{R}_k^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\right]\mathbf{y}_k+\mathbf{y}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1} \\
&\quad+\hat{\mathbf{x}}_{k|k-1}^{T}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{y}_k \\
&\quad+\hat{\mathbf{x}}_{k|k-1}^{T}\left[\mathbf{P}_{k|k-1}^{-1}-\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\right]\hat{\mathbf{x}}_{k|k-1}=\hat{\mathbf{x}}_{k|k}^{T}\mathbf{P}_{k|k}^{-1}\hat{\mathbf{x}}_{k|k}
\end{aligned}
\tag{2.67}
$$
By using Eq. (2.66), we have
$$
\begin{aligned}
\hat{\mathbf{x}}_{k|k}&=\mathbf{P}_{k|k}\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{y}_k+\mathbf{P}_{k|k-1}^{-1}\hat{\mathbf{x}}_{k|k-1}\right) \\
&=\mathbf{P}_{k|k}\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{y}_k+\mathbf{P}_{k|k}\left(\mathbf{P}_{k|k}^{-1}-\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k\right)\hat{\mathbf{x}}_{k|k-1} \\
&=\mathbf{P}_{k|k}\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{y}_k+\hat{\mathbf{x}}_{k|k-1}-\mathbf{P}_{k|k}\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1} \\
&=\hat{\mathbf{x}}_{k|k-1}+\mathbf{P}_{k|k}\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right)
\end{aligned}
\tag{2.68}
$$
Using Eq. (2.65) and the matrix inversion lemma, \(\mathbf{P}_{k|k}\) can be rewritten as
$$
\begin{aligned}
\mathbf{P}_{k|k}&=\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)^{-1} \\
&=\mathbf{P}_{k|k-1}-\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}
\end{aligned}
\tag{2.69}
$$
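This form of the matrix inversion lemma is easy to verify numerically. The sketch below assumes NumPy; the dimensions and `random_spd` helper are hypothetical:

```python
import numpy as np

# Numerical check of the matrix inversion lemma as used in Eq. (2.69):
# (H^T R^{-1} H + P^{-1})^{-1} = P - P H^T (H P H^T + R)^{-1} H P.
rng = np.random.default_rng(4)
n, m = 5, 3

def random_spd(d):
    M = rng.standard_normal((d, d))
    return M @ M.T + d * np.eye(d)

P = random_spd(n)                    # P_{k|k-1}
R = random_spd(m)                    # R_k
H = rng.standard_normal((m, n))      # H_k

lhs = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(P))
rhs = P - P @ H.T @ np.linalg.inv(H @ P @ H.T + R) @ H @ P
assert np.allclose(lhs, rhs)
```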
Using Eq. (2.69), Eq. (2.68) can be rewritten as
$$
\begin{aligned}
\hat{\mathbf{x}}_{k|k}&=\hat{\mathbf{x}}_{k|k-1}+\mathbf{P}_{k|k}\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right) \\
&=\hat{\mathbf{x}}_{k|k-1}+\left[\mathbf{P}_{k|k-1}-\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\right]\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right) \\
&=\hat{\mathbf{x}}_{k|k-1}+\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left[\mathbf{I}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\right]\mathbf{R}_k^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right) \\
&=\hat{\mathbf{x}}_{k|k-1}+\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\left[\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)-\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\right]\mathbf{R}_k^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right) \\
&=\hat{\mathbf{x}}_{k|k-1}+\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right)
\end{aligned}
\tag{2.70}
$$
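The gain equivalence derived in Eq. (2.70) — the information-form gain \(\mathbf{P}_{k|k}\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\) equals the familiar gain \(\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\) — can be confirmed numerically. NumPy assumed; the sizes are hypothetical:

```python
import numpy as np

# Check of the gain identity underlying Eq. (2.70):
# P_{k|k} H^T R^{-1} == P_{k|k-1} H^T (H P_{k|k-1} H^T + R)^{-1}.
rng = np.random.default_rng(5)
n, m = 4, 2

def random_spd(d):
    M = rng.standard_normal((d, d))
    return M @ M.T + d * np.eye(d)

P = random_spd(n)                    # P_{k|k-1}
R = random_spd(m)                    # R_k
H = rng.standard_normal((m, n))      # H_k

S = H @ P @ H.T + R
P_upd = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(P))  # P_{k|k}
K_info = P_upd @ H.T @ np.linalg.inv(R)   # gain in information form
K_std = P @ H.T @ np.linalg.inv(S)        # gain as in Eq. (2.70)
assert np.allclose(K_info, K_std)
```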
41. 28 Grid-based Nonlinear Estimation and Its Applications
Let’s define \(\mathbf{K}_k=\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\) and \(\mathbf{Y}_k=\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\), then the last term of Eq. (2.64) becomes
$$
\begin{aligned}
\hat{\mathbf{x}}_{k|k}^{T}\mathbf{P}_{k|k}^{-1}\hat{\mathbf{x}}_{k|k}&=\left[\hat{\mathbf{x}}_{k|k-1}+\mathbf{K}_k\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right)\right]^{T}\mathbf{Y}_k\left[\hat{\mathbf{x}}_{k|k-1}+\mathbf{K}_k\left(\mathbf{y}_k-\mathbf{H}_k\hat{\mathbf{x}}_{k|k-1}\right)\right] \\
&=\mathbf{y}_k^{T}\mathbf{K}_k^{T}\mathbf{Y}_k\mathbf{K}_k\mathbf{y}_k+\mathbf{y}_k^{T}\mathbf{K}_k^{T}\mathbf{Y}_k\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)\hat{\mathbf{x}}_{k|k-1}+\hat{\mathbf{x}}_{k|k-1}^{T}\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)^{T}\mathbf{Y}_k\mathbf{K}_k\mathbf{y}_k \\
&\quad+\hat{\mathbf{x}}_{k|k-1}^{T}\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)^{T}\mathbf{Y}_k\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)\hat{\mathbf{x}}_{k|k-1}
\end{aligned}
\tag{2.71}
$$
Substituting Eq. (2.71) into Eq. (2.67), the resultant equation is satisfied if the following three equations are fulfilled,
$$
\mathbf{R}_k^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}=\mathbf{K}_k^{T}\mathbf{Y}_k\mathbf{K}_k
\tag{2.72a}
$$
$$
\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k=\mathbf{K}_k^{T}\mathbf{Y}_k\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)
\tag{2.72b}
$$
$$
-\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}=\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)^{T}\mathbf{Y}_k\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)
\tag{2.72c}
$$
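Before proving these three identities algebraically, they can be verified numerically on a random instance. NumPy assumed; dimensions and `random_spd` are hypothetical:

```python
import numpy as np

# Numerical verification of the three identities (2.72a)-(2.72c).
rng = np.random.default_rng(6)
n, m = 4, 3

def random_spd(d):
    M = rng.standard_normal((d, d))
    return M @ M.T + d * np.eye(d)

P = random_spd(n)                    # P_{k|k-1}
R = random_spd(m)                    # R_k
H = rng.standard_normal((m, n))      # H_k

Si = np.linalg.inv(H @ P @ H.T + R)  # (H P H^T + R)^{-1}
K = P @ H.T @ Si                     # K_k
Y = H.T @ np.linalg.inv(R) @ H + np.linalg.inv(P)  # Y_k
I_n = np.eye(n)

assert np.allclose(np.linalg.inv(R) - Si, K.T @ Y @ K)            # (2.72a)
assert np.allclose(Si @ H, K.T @ Y @ (I_n - K @ H))               # (2.72b)
assert np.allclose(-H.T @ Si @ H + np.linalg.inv(P),
                   (I_n - K @ H).T @ Y @ (I_n - K @ H))           # (2.72c)
```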
To prove (2.72a), we have,
$$
\begin{aligned}
&\mathbf{R}_k^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}-\mathbf{K}_k^{T}\mathbf{Y}_k\mathbf{K}_k \\
&\quad=\mathbf{R}_k^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1} \\
&\qquad-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}
\end{aligned}
\tag{2.73}
$$
In Eq. (2.73), the term
$$
\begin{aligned}
&\mathbf{R}_k^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1} \\
&\qquad-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1} \\
&\quad=\mathbf{R}_k^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1} \\
&\qquad-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1} \\
&\quad=\mathbf{R}_k^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k-\mathbf{R}_k\right)\mathbf{R}_k^{-1} \\
&\quad=\mathbf{R}_k^{-1}-\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}-\mathbf{R}_k^{-1}+\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1} \\
&\quad=\mathbf{0}
\end{aligned}
\tag{2.74}
$$
Hence, Eq. (2.72a) is verified. Next, we will verify Eq. (2.72b).
$$
\begin{aligned}
\mathbf{K}_k^{T}\mathbf{Y}_k\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)&=\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)\left[\mathbf{I}-\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\right] \\
&=\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\left[\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right. \\
&\qquad\left.-\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\right]
\end{aligned}
\tag{2.75}
$$
To prove Eq. (2.72b), we have,
$$
\begin{aligned}
&\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k-\mathbf{K}_k^{T}\mathbf{Y}_k\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right) \\
&\quad=\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\left[\mathbf{P}_{k|k-1}^{-1}-\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k-\mathbf{P}_{k|k-1}^{-1}\right. \\
&\qquad\left.+\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\right] \\
&\quad=\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\left[\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k-\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k\right]
\end{aligned}
\tag{2.76}
$$
The term
$$
\begin{aligned}
&\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k-\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k \\
&\quad=\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{H}_k^{T}\right)\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k-\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k \\
&\quad=\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k-\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k \\
&\quad=\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k-\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k \\
&\quad=\mathbf{0}
\end{aligned}
\tag{2.77}
$$
Hence, Eq. (2.76) is equal to 0 and Eq. (2.72b) is verified. Finally, for Eq. (2.72c), we have
$$
\begin{aligned}
\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)^{T}\mathbf{Y}_k\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)
&=\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)^{T}\left[\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right. \\
&\qquad\left.-\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\right] \\
&=\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)^{T}\left(\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}-\mathbf{H}_k^{T}\mathbf{R}_k^{-1}\mathbf{H}_k\right) \\
&=\left(\mathbf{I}-\mathbf{K}_k\mathbf{H}_k\right)^{T}\mathbf{P}_{k|k-1}^{-1} \\
&=\mathbf{P}_{k|k-1}^{-1}-\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{P}_{k|k-1}^{-1} \\
&=-\mathbf{H}_k^{T}\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^{T}+\mathbf{R}_k\right)^{-1}\mathbf{H}_k+\mathbf{P}_{k|k-1}^{-1}
\end{aligned}
$$
where the second equality follows from Eq. (2.77). Hence, Eq. (2.72c) is verified, and therefore \(\bar{D}=D\).
T T T T
k k k k k k k k k k k k k k k
−
−
− −
−
− − −
− −
−
− − − −
− −
− − −
− +
− + − +
+ + +
+ + +
H H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P H R H
H H P H R H P H H P H R
( )
( )
( ) ( )
( )
1
1 1
| 1 | 1
1
1 1
| 1 | 1
1 1
1
| 1 | 1 | 1
1 1
| 1 | 1 | 1
k
T T
k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k
− −
− −
−
− −
− −
− −
−
− − −
− −
− − −
+ + −
= − +
− + − +
+ +
H
H H P H R H P
H R H H H P H R H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P
( )
( ) ( )
1
| 1
1 1
| 1 | 1 | 1
T
k k k k
T T T T
k k k k k k k k k k k k k k k k
−
−
− −
− − −
+
+ + +
H R H
H H P H R H P H H P H R H
) ( ) ( )
( )
( )
( ) ( )
| 1 | 1
| 1 | 1
1 1 1 1
| 1 | 1 | 1 | 1
T T
k k k k k k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
k k k k k
− −
− −
− − − −
− − − −
− − + + −
=
− − + + + −
+ − + +
− +
I K H Y I K H H H P H R H P
Y H K Y Y K H H K Y K H H H P H R H P
H R H P H P H H P H R H R H P
H R H P
( ) ( )
( )
( ) ( ) ( )
( )
( )
1 | 1 | 1
1 1
1 1
| 1 | 1 | 1 | 1 | 1
| 1 | 1
1 1 1
| 1 | 1 | 1
T T
k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k k k k
T T
k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k
− − −
− −
− −
− − − − −
− −
− − −
− − −
+ + + +
+ + −
+ − +
P H H P H R H
H P H H P H R H R H P P H H P H R H
H H P H R H P
H R H P H H P H R H P H R H ( )
( ) ( )
( ) ( )
( ) ( )
| 1
1 1
| 1 | 1 | 1
1 1
| 1 | 1 | 1 | 1
| 1 | 1 | 1
T T
k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k k
T T T T
k k k k k k k k k k k k k k k
− −
− − −
− −
− − − −
− −
− − −
− +
− + − +
+ + +
+ + +
H H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P H R H
H H P H R H P H H P H R
( )
( )
( ) ( )
( )
| 1 | 1
1 1
| 1 | 1
1 1
| 1 | 1 | 1
| 1 | 1 | 1
T T
k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k
− −
− −
− −
− −
− − −
− − −
+ + −
= − +
− + − +
+ +
H H P H R H P
H R H H H P H R H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P
( )
( ) ( )
| 1
1 1
| 1 | 1 | 1
k k k k
T T T T
k k k k k k k k k k k k k k k k
− −
− − −
+ + +
H R H
H H P H R H P H H P H R H
( ) ( ) ( )
( )
( )
( ) ( )
1 1
| 1 | 1
1 1
| 1 | 1
1
1 1 1 1
| 1 | 1 | 1 | 1
1
|
T T T
k k k k k k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k k
T
T T T T T
k k k k k k k k k k k k k k k k k k k
T
k k k k k
− −
− −
− −
− −
−
− − − −
− − − −
−
− − + + −
=
− − + + + −
= + − + +
− +
I K H Y I K H H H P H R H P
Y H K Y Y K H H K Y K H H H P H R H P
H R H P H P H H P H R H R H P
H R H P
( ) ( )
( )
( ) ( ) ( )
( )
( )
1
1
1 | 1 | 1
1 1
1 1
| 1 | 1 | 1 | 1 | 1
1 1
| 1 | 1
1
1 1 1
| 1 | 1 | 1
T T
k k k k k k k k k
T
T T T T T T
k k k k k k k k k k k k k k k k k k k k k k k
T T
k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k
−
−
− − −
− −
− −
− − − − −
− −
− −
−
− − −
− − −
+
+ + + +
+ + −
= + − +
P H H P H R H
H P H H P H R H R H P P H H P H R H
H H P H R H P
H R H P H H P H R H P H R H ( )
( ) ( )
( ) ( )
( ) ( )
1
| 1
1 1
1
| 1 | 1 | 1
1 1
1
| 1 | 1 | 1 | 1
1
| 1 | 1 | 1
T T
k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k k
T T T T
k k k k k k k k k k k k k k k
−
−
− −
−
− − −
− −
−
− − − −
− −
− − −
− +
− + − +
+ + +
+ + +
H H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P H R H
H H P H R H P H H P H R
( )
( )
( ) ( )
( )
1
1 1
| 1 | 1
1
1 1
| 1 | 1
1 1
1
| 1 | 1 | 1
1 1
| 1 | 1 | 1
k
T T
k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k
− −
− −
−
− −
− −
− −
−
− − −
− −
− − −
+ + −
= − +
− + − +
+ +
H
H H P H R H P
H R H H H P H R H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P
( )
( ) ( )
1
| 1
1 1
| 1 | 1 | 1
T
k k k k
T T T T
k k k k k k k k k k k k k k k k
−
−
− −
− − −
+
+ + +
H R H
H H P H R H P H H P H R H
) ( ) ( )
( )
( )
( ) ( )
1 1
| 1 | 1
1 1
| 1 | 1
1
1 1 1 1
| 1 | 1 | 1 | 1
1
|
T T T
k k k k k k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k
T
T T T T
k k k k k k k k k k k k k k k k k
k k k k
− −
− −
− −
− −
−
− − −
− − − −
−
− + + −
− + + + −
+ − + +
+
H Y I K H H H P H R H P
H K Y Y K H H K Y K H H H P H R H P
H P H P H H P H R H R H P
R H P
) ( )
( )
( ) ( ) ( )
( )
( )
1
1
1 | 1 | 1
1 1
1 1
| 1 | 1 | 1 | 1 | 1
1 1
| 1 | 1
1
1 1 1
| 1 | 1 | 1
T T
k k k k k k k k k
T
T T T T T
k k k k k k k k k k k k k k k k k k k k k k
T
k k k k k k k k
T T T T
k k k k k k k k k k k k k k k
−
−
− − −
− −
− −
− − − − −
− −
− −
−
− −
− − −
+
+ + +
+ −
+ − +
P H H P H R H
P H H P H R H R H P P H H P H R H
H P H R H P
H P H H P H R H P H R H ( )
( ) ( )
( ) ( )
( ) ( )
1
| 1
1 1
1
| 1 | 1 | 1
1 1
1
| 1 | 1 | 1 | 1
1
| 1 | 1 | 1
T T
k k k k k k k
T T T T
k k k k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k k
T T T
k k k k k k k k k k k k k k
−
−
− −
−
− − −
− −
−
− − − −
− −
− − −
− +
+ − +
+ +
+ +
H H P H R H
R H P H H P H R H H H P H R H
H P H R H P H R H P H H P H R H
H P H R H P H H P H R
( )
( )
( ) ( )
( )
1
1 1
| 1 | 1
1
1 1
| 1 | 1
1 1
1
| 1 | 1 | 1
1 1
| 1 | 1 | 1
k
T
k k k k k k k k
T T T T
k k k k k k k k k k k k k
T T T T
k k k k k k k k k k k k k k k k k k
T T T T
k k k k k k k k k k k k k k k k
− −
− −
− −
− −
− −
−
− − −
− −
− − −
+ −
− +
+ − +
+
H
H P H R H P
H H H P H R H P H R H
R H P H H P H R H H H P H R H
H P H R H P H R H P H H P
( )
( ) ( )
1
| 1
1 1
| 1 | 1 | 1
T
k k k k
T T T
k k k k k k k k k k k k k k k
−
−
− −
− − −
+
+ +
H R H
H P H R H P H H P H R H
( ) ( ) ( )
( )
( )
( ) ( )
1 1
| 1 | 1
1 1
| 1 | 1
1
1 1 1 1
| 1 | 1 | 1 | 1
1
|
T T T
k k k k k k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k k
T
T T T T T
k k k k k k k k k k k k k k k k k k k
T
k k k k k
− −
− −
− −
− −
−
− − − −
− − − −
−
− − + + −
=
− − + + + −
= + − + +
− +
I K H Y I K H H H P H R H P
Y H K Y Y K H H K Y K H H H P H R H P
H R H P H P H H P H R H R H P
H R H P
( ) ( )
( )
( ) ( ) ( )
( )
( )
1
1
1 | 1 | 1
1 1
1 1
| 1 | 1 | 1 | 1 | 1
1 1
| 1 | 1
1
1 1 1
| 1 | 1 | 1
T T
k k k k k k k k k
T
T T T T T T
k k k k k k k k k k k k k k k k k k k k k k k
T T
k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k
−
−
− − −
− −
− −
− − − − −
− −
− −
−
− − −
− − −
+
+ + + +
+ + −
= + − +
P H H P H R H
H P H H P H R H R H P P H H P H R H
H H P H R H P
H R H P H H P H R H P H R H ( )
( ) ( )
( ) ( )
( ) ( )
1
| 1
1 1
1
| 1 | 1 | 1
1 1
1
| 1 | 1 | 1 | 1
1
| 1 | 1 | 1
T T
k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k k
T T T T
k k k k k k k k k k k k k k k
−
−
− −
−
− − −
− −
−
− − − −
− −
− − −
− +
− + − +
+ + +
+ + +
H H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P H R H
H H P H R H P H H P H R
( )
( )
( ) ( )
( )
1
1 1
| 1 | 1
1
1 1
| 1 | 1
1 1
1
| 1 | 1 | 1
1 1
| 1 | 1 | 1
k
T T
k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k
− −
− −
−
− −
− −
− −
−
− − −
− −
− − −
+ + −
= − +
− + − +
+ +
H
H H P H R H P
H R H H H P H R H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P
( )
( ) ( )
1
| 1
1 1
| 1 | 1 | 1
T
k k k k
T T T T
k k k k k k k k k k k k k k k k
−
−
− −
− − −
+
+ + +
H R H
H H P H R H P H H P H R H
( ) ( ) ( )
( )
( )
( ) ( )
1 1
| 1 | 1
1 1
| 1 | 1
1
1 1 1 1
| 1 | 1 | 1 | 1
1
|
T T T
k k k k k k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k k
T
T T T T T
k k k k k k k k k k k k k k k k k k k
T
k k k k k
− −
− −
− −
− −
−
− − − −
− − − −
−
− − + + −
=
− − + + + −
= + − + +
− +
I K H Y I K H H H P H R H P
Y H K Y Y K H H K Y K H H H P H R H P
H R H P H P H H P H R H R H P
H R H P
( ) ( )
( )
( ) ( ) ( )
( )
( )
1
1
1 | 1 | 1
1 1
1 1
| 1 | 1 | 1 | 1 | 1
1 1
| 1 | 1
1
1 1 1
| 1 | 1 | 1
T T
k k k k k k k k k
T
T T T T T T
k k k k k k k k k k k k k k k k k k k k k k k
T T
k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k
−
−
− − −
− −
− −
− − − − −
− −
− −
−
− − −
− − −
+
+ + + +
+ + −
= + − +
P H H P H R H
H P H H P H R H R H P P H H P H R H
H H P H R H P
H R H P H H P H R H P H R H ( )
( ) ( )
( ) ( )
( ) ( )
1
| 1
1 1
1
| 1 | 1 | 1
1 1
1
| 1 | 1 | 1 | 1
1
| 1 | 1 | 1
T T
k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
T T T T T T
k k k k k k k k k k k k k k k k k k k k k
T T T T
k k k k k k k k k k k k k k k
−
−
− −
−
− − −
− −
−
− − − −
− −
− − −
− +
− + − +
+ + +
+ + +
H H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P H R H
H H P H R H P H H P H R
( )
( )
( ) ( )
( )
1
1 1
| 1 | 1
1
1 1
| 1 | 1
1 1
1
| 1 | 1 | 1
1 1
| 1 | 1 | 1
k
T T
k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k k k
T T T T T
k k k k k k k k k k k k k k k k k
− −
− −
−
− −
− −
− −
−
− − −
− −
− − −
+ + −
= − +
− + − +
+ +
H
H H P H R H P
H R H H H P H R H P H R H
H R H P H H P H R H H H P H R H
H H P H R H P H R H P H H P
( )
( ) ( )
1
| 1
1 1
| 1 | 1 | 1
T
k k k k
T T T T
k k k k k k k k k k k k k k k k
−
−
− −
− − −
+
+ + +
H R H
H H P H R H P H H P H R H
(2.78)
( )
( ) ( )
( )
( )
( )
1
| 1 | 1
1
| 1 | 1
1
1 | 1 | 1
| 1 |
| 1
1
| 1 | 1
| 1
T T
k k k k k k k k k k k
T T T
k k k k k k k k k k
T T
k k k k k k k k k k
T T
k k k k k k k k k
T
k k k k k
T T T
k k k k k k k k k
T
k k k k
−
− −
−
− −
−
− − −
−
−
−
− −
−
+ +
− +
− +
= +
− +
+
+
H P H R R H P H R
H P H R H P H R
H P H R R H P H
H H P H R H P
H P H R
H P H R H P H
H P H
( )
1
1
T
k k k
−
− +
H R H
) ( )
( )
)
)
1
1 | 1
1
1 | 1
1
1 | 1
|
1
1
1 | 1
1
T T
k k k k k k k k
T T
k k k k k k k
T T
k k k k k k k k
k k k
T
k k k
T T
k k k k k k
T
k
−
−
−
−
−
− −
−
−
−
+ +
+
+
+
H R R H P H R
H R H P H R
H R R H P H
H P
H R
H R H P H
H
( )
1
1
T
k k k
−
− +
H R H
45. 32 Grid-based Nonlinear Estimation and Its Applications
The term in the bracket is given by
$$\begin{aligned}
&\mathbf{H}_k^T\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right)^{-1} - \mathbf{H}_k^T\mathbf{R}_k^{-1} + \mathbf{H}_k^T\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T\mathbf{R}_k^{-1} \\
&= \mathbf{H}_k^T\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right)^{-1}\left(\mathbf{R}_k + \mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T\right)\mathbf{R}_k^{-1} - \mathbf{H}_k^T\mathbf{R}_k^{-1} \\
&= \mathbf{H}_k^T\mathbf{R}_k^{-1} - \mathbf{H}_k^T\mathbf{R}_k^{-1} \\
&= \mathbf{0} \quad (2.79)
\end{aligned}$$
Hence, Eq. (2.78) is equal to 0, and (2.72c) is verified.
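The gain identity just verified can be spot-checked numerically. The sketch below uses arbitrarily chosen scalar values (not from the text) and confirms that the two gain expressions coincide.

```python
# Scalar spot-check of the gain identity K = P H^T (H P H^T + R)^(-1) = P_post H^T R^(-1).
# The numbers P, H, R below are arbitrary illustrative values, not from the text.
P = 2.0    # prior covariance P_{k|k-1}
H = 1.5    # measurement matrix H_k
R = 0.5    # measurement noise covariance R_k

S = H * P * H + R                  # innovation covariance H P H^T + R
K_standard = P * H / S             # P_{k|k-1} H^T (H P H^T + R)^(-1)

P_post = P - P * H / S * H * P     # P_{k|k} from the matrix inversion lemma form
K_information = P_post * H / R     # P_{k|k} H^T R^(-1)

assert abs(K_standard - K_information) < 1e-12
```

In this scalar case both expressions evaluate to 0.6, as the identity requires.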
We have proved that $D = \tilde{D}$. Next, we will show $C = \tilde{C}$.
$$(2\pi)^{-n_z/2}\det\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right)^{-1/2} = \frac{(2\pi)^{-n_z/2}(2\pi)^{-n_x/2}\det\left(\mathbf{R}_k\right)^{-1/2}\det\left(\mathbf{P}_{k|k-1}\right)^{-1/2}}{(2\pi)^{-n_x/2}\det\left(\mathbf{P}_{k|k}\right)^{-1/2}} \quad (2.80)$$
It is equivalent to proving
$$\frac{\det\left(\mathbf{R}_k\right)\det\left(\mathbf{P}_{k|k-1}\right)}{\det\left(\mathbf{P}_{k|k}\right)} = \det\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right). \quad (2.81)$$
Using Eq. (2.69), we have,
$$\begin{aligned}
\det\left(\mathbf{P}_{k|k}\right) &= \det\left(\mathbf{P}_{k|k-1}\left[\mathbf{I} - \mathbf{H}_k^T\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\right]\right) \\
&= \det\left(\mathbf{P}_{k|k-1}\right)\det\left(\mathbf{I} - \left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right)^{-1}\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T\right) \\
&= \det\left(\mathbf{P}_{k|k-1}\right)\det\left(\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right)^{-1}\mathbf{R}_k\right) \\
&= \frac{\det\left(\mathbf{P}_{k|k-1}\right)\det\left(\mathbf{R}_k\right)}{\det\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right)}
\end{aligned} \quad (2.82)$$
Note that Eq. (2.55) is used in Eq. (2.82).
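The determinant identity (2.82) can likewise be checked numerically. In the scalar case the determinant of each matrix is the matrix itself, so a few lines suffice (the values below are illustrative only).

```python
# Scalar check of det(P_post) = det(P) det(R) / det(H P H^T + R); in 1-D, det(x) = x.
P, H, R = 2.0, 1.5, 0.5            # arbitrary illustrative values
S = H * P * H + R                  # H P H^T + R
P_post = P - P * H / S * H * P     # posterior covariance from the matrix inversion lemma
assert abs(P_post - P * R / S) < 1e-12
```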
46. Linear Estimation of Dynamic Systems 33
In summary, Eq. (2.57) and Eq. (2.58) are equivalent. The update equation
of the discrete-time Kalman filter is the same as the Bayesian estimation.
2.4 Linear Continuous-Time Kalman Filter
The continuous-time Kalman filter was developed by Kalman and Bucy after the
discrete-time Kalman filter. It is not as widely used in practice as the discrete-
time filter because of the prevalent implementation of the Kalman filter in real-
time systems with digital computers. However, it provides valuable conceptual
and theoretical perspectives on the filtering technique. The continuous-time
Kalman filter can be readily derived as a limiting case of its discrete-time
counterpart as the sampling time becomes very small. The detailed derivation
is omitted in this book and it can be seen in other references, e.g. (Crassidis and
Junkins 2012, Brown and Hwang 2012). The summary of the linear continuous-
time Kalman filter is provided in Table 2.3.
The covariance Eq. (2.85) is called the continuous Riccati equation. To
implement the continuous-time Kalman filter, first, given the initial conditions
for the state estimate x̂0 and error covariance P0, the Kalman gain is calculated
from Eq. (2.87). Next, the covariance Eq. (2.85) and the state estimate Eq. (2.86)
are numerically integrated forward in time using the continuous measurement
y(t) while in the meantime, updating the Kalman gain with the current
Table 2.3 Continuous-time Kalman filter.

Continuous-time linear dynamics and measurement model:
$$\dot{\mathbf{x}}(t) = \mathbf{A}(t)\mathbf{x}(t) + \mathbf{v}(t) \quad (2.83)$$
$$\mathbf{y}(t) = \mathbf{H}(t)\mathbf{x}(t) + \mathbf{n}(t) \quad (2.84)$$
where x(t) is the continuous random process and y(t) is the continuous measurement.
Assumptions: v(t) and n(t) are white Gaussian process and measurement noise, respectively, with
$$E\left[\mathbf{v}(t)\mathbf{v}^T(\tau)\right] = \mathbf{Q}(t)\delta(t-\tau), \qquad E\left[\mathbf{n}(t)\mathbf{n}^T(\tau)\right] = \mathbf{R}(t)\delta(t-\tau)$$
where δ(t − τ) is the Dirac delta function. v(t) and n(t) are uncorrelated with x0 and with each other.
Initialization: $\hat{\mathbf{x}}(t_0) = \hat{\mathbf{x}}_0$ and $\mathbf{P}(t_0) = \mathbf{P}_0$.
Covariance update:
$$\dot{\mathbf{P}}(t) = \mathbf{A}(t)\mathbf{P}(t) + \mathbf{P}(t)\mathbf{A}^T(t) - \mathbf{P}(t)\mathbf{H}^T(t)\mathbf{R}^{-1}(t)\mathbf{H}(t)\mathbf{P}(t) + \mathbf{Q}(t) \quad (2.85)$$
State estimate:
$$\dot{\hat{\mathbf{x}}}(t) = \mathbf{A}(t)\hat{\mathbf{x}}(t) + \mathbf{K}(t)\left[\mathbf{y}(t) - \mathbf{H}(t)\hat{\mathbf{x}}(t)\right] \quad (2.86)$$
where
$$\mathbf{K}(t) = \mathbf{P}(t)\mathbf{H}^T(t)\mathbf{R}^{-1}(t) \quad (2.87)$$
covariance P(t). The integration continues until the final measurement time
is reached. It can be proved that if R(t) is positive definite and Q(t) is at least
positive semi-definite, the continuous-time Kalman filter is stable (Crassidis
and Junkins 2012).
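As a sketch of that integration procedure, the fragment below Euler-integrates the scalar version of the Riccati equation (2.85) for an assumed system with a = −1, h = 1, q = r = 1 (all values illustrative); the covariance settles at the positive root of the steady-state equation, √2 − 1.

```python
import math

# Scalar continuous-time Riccati equation: Pdot = 2aP - P^2 h^2 / r + q,
# integrated with forward Euler; a, h, q, r are illustrative values.
a, h, q, r = -1.0, 1.0, 1.0, 1.0
P, dt = 1.0, 1e-3                  # initial covariance and integration step

for _ in range(10000):             # integrate over t in [0, 10]
    Pdot = 2.0 * a * P - P * P * h * h / r + q
    P += dt * Pdot

K = P * h / r                      # Kalman gain, Eq. (2.87)
P_steady = math.sqrt(2.0) - 1.0    # positive root of P^2 + 2P - 1 = 0
assert abs(P - P_steady) < 1e-4
```

In a full implementation the state estimate (2.86) would be integrated alongside the covariance, with K(t) recomputed at every step from the current P(t).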
It is worth noting the difference between the discrete-time covariance
matrices Qk
and Rk
, and continuous-time covariance matrices Q(t) and R(t).
They play the same role but have different numerical values. The relationship
among them is related to the sampling interval Δt:
$$\mathbf{Q}_k = \mathbf{Q}(t)\Delta t \quad (2.88)$$
$$\mathbf{R}_k = \frac{\mathbf{R}(t)}{\Delta t} \quad (2.89)$$
It may seem counterintuitive to have the discrete measurement covariance
approach ∞ as Δt → 0. However, this is offset by the sampling rate approaching
infinity at the same time (Brown and Hwang 2012).
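A small numeric sketch of Eqs. (2.88)–(2.89), with illustrative Q and R values: as Δt shrinks, Q_k vanishes while R_k blows up, but their product stays invariant, which is one way to see the offsetting effect just described.

```python
# Relationship Q_k = Q(t)*dt and R_k = R(t)/dt for shrinking sampling intervals.
Q, R = 0.1, 0.04                   # illustrative continuous-time noise intensities

for dt in (0.1, 0.01, 0.001):
    Qk = Q * dt                    # Eq. (2.88): discrete process noise covariance
    Rk = R / dt                    # Eq. (2.89): discrete measurement noise covariance
    # The growth of R_k is exactly offset by the shrinking of Q_k:
    assert abs(Qk * Rk - Q * R) < 1e-15

assert Q * 0.001 < Q * 0.1         # Q_k decreases with dt
assert R / 0.001 > R / 0.1         # R_k increases as dt -> 0
```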
It is also noted that as Δt → 0, $\mathbf{P}_{k|k-1} \to \mathbf{P}_{k-1|k-1}$ from the covariance prediction equation $\mathbf{P}_{k|k-1} = \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1}$. Therefore, there is no need to make a distinction between a priori and a posteriori covariance matrices in the continuous filter.
References
Bar-Shalom, Y., X. Li and T. Kirubarajan. 2001. Estimation with Application to Tracking and
Navigation: Theory, Algorithms and Software. John Wiley & Sons, Inc., New York.
Brown, R.G. and P.Y.C. Hwang. 2012. Introduction to Random Signals and Applied Kalman
Filtering. 4th Edition. John Wiley & Sons, New York.
Crassidis, J. and J. Junkins. 2012. Optimal Estimation of Dynamic Systems. 2nd Edition, CRC
Press, Boca Raton.
Kailath, T., A.H. Sayed and B. Hassibi. 2000. Linear Estimation. Prentice Hall, Upper Saddle River.
Mutambara, G.O. 1998. Decentralized Estimation and Control for Multisensor Systems. 1st
Edition, CRC Press, Boca Raton.
Šimandl, M. 2006. Lecture Notes on State Estimation of Nonlinear Non-Gaussian Stochastic
Systems. Retrieved from semanticscholar.org.
3 Conventional Nonlinear Filters
Most of the estimation problems in practice involve nonlinear models, including nonlinear dynamics and nonlinear measurement functions of the states. The state estimation of such nonlinear systems becomes very difficult because many useful statistical properties that follow from linearity and Gaussianity no longer hold after a nonlinear transformation. As illustrated in the Bayesian approach of Section 1.3, the estimation requires the construction of the conditional PDF because the posterior PDF embodies all available statistical information. This makes it a formidable task to find an optimal estimate analytically and at the same time develop a filtering algorithm implementable in real-time applications. Therefore, approximate filters are necessary. The research in this respect has been conducted for decades. A variety of nonlinear filtering algorithms have been derived and applied based on different approximation techniques.
In this chapter, some conventional nonlinear filters that have been well
recognized and extensively applied are presented. The first nonlinear filter
is a natural extension of the Kalman filter derived from the linearization of
nonlinear dynamic and measurement equations, which is called the extended
Kalman filter (EKF) and its variants. The EKF, in spite of not being optimum,
has been successfully applied to many nonlinear systems over the past decades.
It is based on the notion that the true state is sufficiently close to the estimated
state. Thus, the error dynamics can be represented fairly accurately by the first-
order Taylor series expansion. The second class of nonlinear filters is the PDF
approximation based method including the point-mass filter and the particle
filter, which use an analytically designed deterministic grid and a large number
of random Monte Carlo samples to approximate PDF, respectively. Another
sampling point based nonlinear filter, the ensemble Kalman filter, is reviewed as
a simple extension of the classical Kalman filter to solve large-scale nonlinear
estimation problems. Two continuous-time nonlinear filtering methods based
on the Zakai equation and Fokker-Planck equation are presented at the end.
3.1 Extended Kalman Filter
The EKF adapts the linear Kalman filter so that it can be applied to nonlinear
problems. It is an analytical approximation based on the linearization of the
nonlinear dynamics and the nonlinear measurement model. For common
applications, the dynamic system can be presented by either the continuous-time
system or discrete-time system. The measurement equation is usually described
by the discrete-time equation. Although we mainly use the discrete-time system
description to introduce the estimation methods, the continuous-discrete time
system can also use the result via discretization.
We consider a class of nonlinear discrete-time dynamical systems described
by:
$$\mathbf{x}_k = \mathbf{f}_{k-1}\left(\mathbf{x}_{k-1}\right) + \mathbf{v}_{k-1} \quad (3.1)$$
$$\mathbf{y}_k = \mathbf{h}_k\left(\mathbf{x}_k\right) + \mathbf{n}_k \quad (3.2)$$
where $\mathbf{x}_k \in \mathbb{R}^{n_x}$ and $\mathbf{y}_k \in \mathbb{R}^m$; $\mathbf{v}_{k-1}$ and $\mathbf{n}_k$ are independent white Gaussian process noise and measurement noise with covariance $\mathbf{Q}_{k-1}$ and $\mathbf{R}_k$, respectively.
The EKF is based on the assumption that local linearization is a sufficient
description of nonlinearity. The filtering algorithm follows the same prediction
and update steps.
The predicted state and covariance can be approximated by (Gelb 1974)
$$\hat{\mathbf{x}}_{k|k-1} = \mathbf{f}_{k-1}\left(\hat{\mathbf{x}}_{k-1|k-1}\right) \quad (3.3)$$
$$\mathbf{P}_{k|k-1} = \mathbf{F}_{k-1}\mathbf{P}_{k-1|k-1}\mathbf{F}_{k-1}^T + \mathbf{Q}_{k-1} \quad (3.4)$$
where $\hat{\mathbf{x}}_{k-1|k-1}$ is the a posteriori estimate at the time $k-1$, and $\mathbf{F}_{k-1}$ is the Jacobian matrix of the nonlinear function $\mathbf{f}$ evaluated at $\hat{\mathbf{x}}_{k-1|k-1}$:
$$\mathbf{F}_{k-1} \triangleq \left.\frac{\partial \mathbf{f}_{k-1}}{\partial \mathbf{x}}\right|_{\hat{\mathbf{x}}_{k-1|k-1}} \quad (3.5)$$
The updated state and covariance are given by (Gelb 1974)
$$\hat{\mathbf{x}}_{k|k} = \hat{\mathbf{x}}_{k|k-1} + \mathbf{K}_k\left(\mathbf{y}_k - \hat{\mathbf{y}}_k\right) \quad (3.6)$$
$$\mathbf{P}_{k|k} = \mathbf{P}_{k|k-1} - \mathbf{K}_k\mathbf{H}_k\mathbf{P}_{k|k-1} \quad (3.7)$$
where the predicted observation $\hat{\mathbf{y}}_k$ is given by
$$\hat{\mathbf{y}}_k = \mathbf{h}_k\left(\hat{\mathbf{x}}_{k|k-1}\right) \quad (3.8)$$
$\mathbf{H}_k$ is the Jacobian matrix of the nonlinear function $\mathbf{h}$ evaluated at $\hat{\mathbf{x}}_{k|k-1}$:
$$\mathbf{H}_k \triangleq \left.\frac{\partial \mathbf{h}_k}{\partial \mathbf{x}}\right|_{\hat{\mathbf{x}}_{k|k-1}} \quad (3.9)$$
The Kalman gain $\mathbf{K}_k$ is given by
$$\mathbf{K}_k = \mathbf{P}_{k|k-1}\mathbf{H}_k^T\left(\mathbf{H}_k\mathbf{P}_{k|k-1}\mathbf{H}_k^T + \mathbf{R}_k\right)^{-1} \quad (3.10)$$
There are some observations about the EKF:
(1) The Kalman gain $\mathbf{K}_k$ and the resultant error covariance matrix $\mathbf{P}_{k|k}$ are random variables because they are dependent on the estimate $\hat{\mathbf{x}}_{k|k-1}$. Consequently, $\mathbf{K}_k$ and $\mathbf{P}_{k|k}$ have to be calculated in real time, while for the linear Kalman filter they can be precomputed and stored before the measurements are obtained.
(2) The EKF may degrade or diverge if the initial estimate $\hat{\mathbf{x}}_{0|0}$ is poor, because it is the reference about which the linearization takes place. If the error in $\hat{\mathbf{x}}_{0|0}$ is large, the first-order approximation based on linearization will be poor. This error can be propagated through the prediction and update steps and cause further performance degradation, especially in the presence of large nonlinearity.
(3) Since the covariance matrix $\mathbf{P}_{k|k}$ is only an approximation to the true covariance matrix, there is no guarantee that the actual estimate will be close to the truly optimal estimate. However, the EKF has been used in a large number of practical applications with great success. It is usually the first try for nonlinear estimation problems.
Several EKF variants have been proposed to improve the EKF performance. One of the methods is to apply local iterations to repeatedly calculate $\mathbf{P}_{k|k}$, $\mathbf{K}_k$, and $\hat{\mathbf{x}}_{k|k}$, each time linearizing about the most recent estimate. This method is called the iterated EKF.
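To make Eqs. (3.3)–(3.10) concrete, here is a minimal scalar EKF sketch for a hypothetical system x_k = 0.9x + 0.5 sin(x) with direct measurement y_k = x_k; the model, gains, and noise levels are illustrative, not from the text. Run on noise-free data, the estimate converges to the true state.

```python
import math

# Minimal scalar EKF (Eqs. 3.3-3.10) for an illustrative nonlinear system.
def f(x):                          # nonlinear dynamics (hypothetical model)
    return 0.9 * x + 0.5 * math.sin(x)

def F_jac(x):                      # Jacobian of f, Eq. (3.5)
    return 0.9 + 0.5 * math.cos(x)

Qv, Rn = 0.01, 0.01                # assumed noise covariances
x_true, x_hat, P = 1.0, 2.0, 1.0   # truth, initial estimate, initial covariance

for _ in range(15):
    x_true = f(x_true)             # noise-free truth, for illustration only
    y = x_true                     # measurement h(x) = x, no measurement noise

    # Prediction, Eqs. (3.3)-(3.4): linearize about the previous estimate
    F = F_jac(x_hat)
    x_hat = f(x_hat)
    P = F * P * F + Qv

    # Update, Eqs. (3.6)-(3.7), with H = 1 so the Jacobian of h is trivial
    K = P / (P + Rn)               # Kalman gain, Eq. (3.10)
    x_hat = x_hat + K * (y - x_hat)
    P = P - K * P

assert abs(x_hat - x_true) < 1e-2
assert P > 0.0
```

Note how K and P depend on the running estimate (observation (1) above) and therefore cannot be precomputed.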
3.2 Iterated Extended Kalman Filter
Note that the update equations of the EKF are functions of the estimate $\hat{\mathbf{x}}_{k|k-1}$, i.e.,
$$\hat{\mathbf{x}}_{k|k} = \hat{\mathbf{x}}_{k|k-1} + \mathbf{P}_{k|k-1}\mathbf{H}_k^T\left(\hat{\mathbf{x}}_{k|k-1}\right)\left[\mathbf{H}_k\left(\hat{\mathbf{x}}_{k|k-1}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^T\left(\hat{\mathbf{x}}_{k|k-1}\right) + \mathbf{R}_k\right]^{-1}\left[\mathbf{y}_k - \mathbf{h}_k\left(\hat{\mathbf{x}}_{k|k-1}\right)\right] \quad (3.11)$$
$$\mathbf{P}_{k|k} = \mathbf{P}_{k|k-1} - \mathbf{P}_{k|k-1}\mathbf{H}_k^T\left(\hat{\mathbf{x}}_{k|k-1}\right)\left[\mathbf{H}_k\left(\hat{\mathbf{x}}_{k|k-1}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^T\left(\hat{\mathbf{x}}_{k|k-1}\right) + \mathbf{R}_k\right]^{-1}\mathbf{H}_k\left(\hat{\mathbf{x}}_{k|k-1}\right)\mathbf{P}_{k|k-1} \quad (3.12)$$
The reference about which the linearization is performed is $\hat{\mathbf{x}}_{k|k-1}$, which may not be accurate. Hence, the estimation can be improved by updating the linearization reference point iteratively with $\hat{\mathbf{x}}_{k|k}$. Specifically, the update equations are rewritten as
$$\hat{\mathbf{x}}_{k|k}^{i+1} = \hat{\mathbf{x}}_{k|k-1} + \mathbf{P}_{k|k-1}\mathbf{H}_k^T\left(\hat{\mathbf{x}}_{k|k}^{i}\right)\left[\mathbf{H}_k\left(\hat{\mathbf{x}}_{k|k}^{i}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^T\left(\hat{\mathbf{x}}_{k|k}^{i}\right) + \mathbf{R}_k\right]^{-1}\left[\mathbf{y}_k - \mathbf{h}_k\left(\hat{\mathbf{x}}_{k|k}^{i}\right)\right] \quad (3.13)$$
$$\mathbf{P}_{k|k}^{i+1} = \mathbf{P}_{k|k-1} - \mathbf{P}_{k|k-1}\mathbf{H}_k^T\left(\hat{\mathbf{x}}_{k|k}^{i}\right)\left[\mathbf{H}_k\left(\hat{\mathbf{x}}_{k|k}^{i}\right)\mathbf{P}_{k|k-1}\mathbf{H}_k^T\left(\hat{\mathbf{x}}_{k|k}^{i}\right) + \mathbf{R}_k\right]^{-1}\mathbf{H}_k\left(\hat{\mathbf{x}}_{k|k}^{i}\right)\mathbf{P}_{k|k-1} \quad (3.14)$$
where $\hat{\mathbf{x}}_{k|k}^{i}$ and $\mathbf{P}_{k|k}^{i}$ denote the mean and covariance update at the $i$th iteration, respectively. The iteration is stopped when it converges, i.e., $\|\hat{\mathbf{x}}_{k|k}^{i+1} - \hat{\mathbf{x}}_{k|k}^{i}\| < \varepsilon$, with $\varepsilon$ being a given threshold.
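A scalar sketch of the iterated update for a hypothetical quadratic measurement h(x) = x² (all values illustrative). Note that this sketch uses the common Gauss-Newton form of the iteration, which adds the relinearization term H(x̂ⁱ)(x̂_{k|k−1} − x̂ⁱ) to the innovation; it stops on the convergence criterion above.

```python
# Iterated EKF measurement update for an illustrative scalar model y = x^2 + n.
# Gauss-Newton form: the innovation includes the relinearization correction
# H(x_i) * (x_prior - x_i), a standard variant of the iterated update.
def h(x):
    return x * x

def H_jac(x):
    return 2.0 * x

x_prior, P, R = 2.5, 0.25, 0.01    # prior mean/covariance, measurement noise variance
y = 4.0                            # measurement generated at true state x = 2
eps = 1e-9

x_i = x_prior
for _ in range(100):
    H = H_jac(x_i)
    K = P * H / (H * P * H + R)
    x_next = x_prior + K * (y - h(x_i) - H * (x_prior - x_i))
    if abs(x_next - x_i) < eps:    # convergence test ||x^{i+1} - x^i|| < eps
        x_i = x_next
        break
    x_i = x_next

assert abs(x_i - 2.0) < 0.01       # iterate settles close to the true state
```

A plain EKF update from the same prior lands noticeably farther from the truth; the iterations relinearize h about successively better reference points.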
Remark 3.1: The EKF is derived by retaining the first-order terms in the Taylor series expansion of the nonlinear dynamics and measurement function. Thus, it is possible to improve the accuracy of estimation by including higher-order terms, as in the second-order EKF. However, such filters involve considerably more complex solutions, and one has to weigh the extra computational cost before using them.
Another class of nonlinear filtering methods involves approximation of the
a posteriori PDF directly. It does not rely on the Gaussian assumption and can
be applied to more general nonlinear estimation problems. In the following, two
types of such filters are introduced. The first one is the point-mass filter based
on deterministic grid points, and the second one is the particle filter based on
the sequential Monte Carlo method.
3.3 Point-Mass Filter
The key idea of the point-mass filter (PMF) is to represent the PDF by a grid
of points G0
, a set of volume masses for each point’s neighborhood, M0
, and a
set of probability density values at each point, V0|0
. Using such representations,
the Bayesian estimation equations can be rewritten. The PMF can be briefly
described as follows.
Initialization
The initial PDF $p(\mathbf{x}_{0|0})$ is described by a grid of points $G_0 = \left\{\mathbf{p}_0^{(i)};\, \mathbf{p}_0^{(i)} \in \mathbb{R}^{n_x},\, i = 1, \ldots, N_0\right\}$, a set of volume masses for each point's neighborhood $M_0 = \left\{\Delta\left(\mathbf{p}_0^{(i)}\right),\, i = 1, \ldots, N_0\right\}$, and a set of probability density values at each point $V_{0|0} = \left\{v_{0|0}^{(i)};\, v_{0|0}^{(i)} = p_{0|0}\left(\mathbf{p}_0^{(i)}\right),\, \mathbf{p}_0^{(i)} \in G_0,\, i = 1, \ldots, N_0\right\}$. $N_0$ is the initial number of grid points.
Prediction of grid
Each point of the grid is propagated by
$$\mathbf{p}_{k|k-1}^{(i)} = \mathbf{f}_{k-1}\left(\mathbf{p}_{k-1}^{(i)}\right) \quad (3.15)$$
Due to the nonlinear transformation, the grid structure of $\left\{\mathbf{p}_{k|k-1}^{(i)};\, \mathbf{p}_{k|k-1}^{(i)} \in \mathbb{R}^{n_x},\, i = 1, \ldots, N_{k-1}\right\}$ is not the same as $G_{k-1}$.
Grid Redefinition
The grid $\left\{\mathbf{p}_{k|k-1}^{(i)};\, i = 1, \ldots, N_{k-1}\right\}$ is refined to the grid $G_k = \left\{\mathbf{p}_k^{(j)};\, \mathbf{p}_k^{(j)} \in \mathbb{R}^{n_x},\, j = 1, \ldots, N_k\right\}$ to maintain the structure of the grid $G_{k-1}$ and to account for the effect of process noise on the spread of the PDF.
Prediction of the density value
Given $V_{k-1|k-1}$, the predicted density value for the grid point $\mathbf{p}_k^{(j)}$ is
$$v_{k|k-1}^{(j)} = \sum_{i=1}^{N_{k-1}} \Delta\left(\mathbf{p}_{k-1}^{(i)}\right) \cdot v_{k-1|k-1}^{(i)} \cdot p_v\left(\mathbf{p}_k^{(j)} - \mathbf{p}_{k|k-1}^{(i)}\right), \quad j = 1, \ldots, N_k \quad (3.16)$$
where $N_k$ is the number of grid points at the time $k$ and $p_v(\cdot)$ denotes the PDF of the process noise. Note that $M_{k-1}$ can be determined when the grid of points $G_{k-1}$ is known.
Update of the density value
The posterior density value at the $j$th grid point $\mathbf{p}_k^{(j)}$ is given by
$$v_{k|k}^{(j)} = c_k^{-1} \cdot v_{k|k-1}^{(j)} \cdot p_n\left(\mathbf{y}_k - \mathbf{h}_k\left(\mathbf{p}_k^{(j)}\right)\right), \quad j = 1, \ldots, N_k \quad (3.17)$$
where $c_k = \sum_{j=1}^{N_k} \Delta\left(\mathbf{p}_k^{(j)}\right) \cdot v_{k|k-1}^{(j)} \cdot p_n\left(\mathbf{y}_k - \mathbf{h}_k\left(\mathbf{p}_k^{(j)}\right)\right)$ and $p_n(\cdot)$ denotes the PDF of the measurement noise.
From the above equations, one can see that the design of the grid $G_k = \left\{\mathbf{p}_k^{(j)};\, \mathbf{p}_k^{(j)} \in \mathbb{R}^{n_x},\, j = 1, \ldots, N_k\right\}$ is critical for the PMF. The floating grid technique was proposed in (Šimandl et al. 2002, 2006), where a rectangular equally-spaced grid is used. In addition, the grid is shifted and rotated based on the predicted expected value and the predicted covariance matrix.
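A minimal 1-D sketch of Eqs. (3.15)–(3.17) on a fixed, equally spaced grid (a deliberate simplification of the floating grid; the linear dynamics, Gaussian noises, and all numbers are illustrative):

```python
import math

def gauss(x, sigma):               # zero-mean Gaussian PDF
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Fixed equally spaced grid on [-5, 5]; delta is the volume mass of every point.
N = 201
delta = 10.0 / (N - 1)
grid = [-5.0 + i * delta for i in range(N)]

sig_v, sig_n = 0.5, 0.2            # process / measurement noise std (illustrative)
f = lambda x: 0.9 * x              # dynamics; the measurement model is h(x) = x

# Initial density values: N(0, 1) evaluated at the grid points.
v = [gauss(p, 1.0) for p in grid]

# Prediction, Eqs. (3.15)-(3.16): propagate the points, then convolve with p_v.
prop = [f(p) for p in grid]
v_pred = [sum(delta * v[i] * gauss(grid[j] - prop[i], sig_v) for i in range(N))
          for j in range(N)]

# Update, Eq. (3.17): multiply by the measurement likelihood and normalize.
y = 1.0
like = [gauss(y - p, sig_n) for p in grid]
c = sum(delta * v_pred[j] * like[j] for j in range(N))
v_post = [v_pred[j] * like[j] / c for j in range(N)]

mean = sum(delta * grid[j] * v_post[j] for j in range(N))
assert abs(mean - 1.0) < 0.2       # posterior mass concentrates near the measurement
```

The double sum in the prediction step is where the computational load noted in Remark 3.2 comes from: it scales with the square of the number of grid points.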
Remark 3.2: In order to achieve good accuracy and efficiency, the PMF needs
the sophisticated design of the grid (Šimandl et al. 2006). The computational
load is high compared to other methods, such as the particle filter.
3.4 Particle Filter
In the Bayesian estimation, the prediction and update of the estimate involve
integrals of PDFs in both the Chapman-Kolmogorov equation and the Bayesian
update rule. These integrals can be calculated by the Monte Carlo method.
Assume that we are able to generate $N_s$ independent and identically distributed random particles $\left\{\mathbf{x}_k^{(i)};\, i = 1, \ldots, N_s\right\}$ according to the posterior PDF $p(\mathbf{x}_k|\mathbf{y}_{1:k})$. An approximation of this distribution is given by
$$p_{N_s}(\mathbf{x}_k | \mathbf{y}_{1:k}) = \frac{1}{N_s}\sum_{i=1}^{N_s} \delta\left(\mathbf{x}_k - \mathbf{x}_k^{(i)}\right) \quad (3.18)$$
Then, the integral $I(f) = \int f(\mathbf{x}_k)\, p(\mathbf{x}_k|\mathbf{y}_{1:k})\, d\mathbf{x}_k$ can be approximated by
$$I_{N_s}(f) = \int f(\mathbf{x}_k)\, p_{N_s}(\mathbf{x}_k|\mathbf{y}_{1:k})\, d\mathbf{x}_k = \frac{1}{N_s}\sum_{i=1}^{N_s} f\left(\mathbf{x}_k^{(i)}\right) \quad (3.19)$$
using the set of random particles $\left\{\mathbf{x}_k^{(i)};\, i = 1, \ldots, N_s\right\}$. When the number of particles $N_s$ goes to infinity, $I_{N_s}(f)$ will approach the true $I(f)$.
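A quick sketch of Eq. (3.19): estimating E[x²] = 1 from samples drawn directly from N(0, 1). The integrand, sample size, and seed are illustrative.

```python
import random

random.seed(0)
Ns = 200_000
# Particles drawn directly from the "posterior" p = N(0, 1).
samples = [random.gauss(0.0, 1.0) for _ in range(Ns)]

# Eq. (3.19): I_Ns(f) = (1/Ns) * sum f(x_i), here with f(x) = x^2, true I(f) = 1.
I_hat = sum(x * x for x in samples) / Ns
assert abs(I_hat - 1.0) < 0.05
```

The estimation error shrinks as 1/√Ns regardless of the state dimension, which is the dimension-independent convergence rate discussed below.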
The merit of the particle filter is that, by the central limit theorem, its
convergence rate is independent of the dimension of the integrand. In
contrast, the convergence rate of any deterministic numerical integration
method decreases as the dimension increases (Arulampalam et al. 2002).
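The Monte Carlo approximation in Eq. (3.19) can be sketched directly. The standard-Gaussian target and the test function below are illustrative assumptions, not from the book:

```python
import random

def mc_integral(f, sample, n_s):
    """Monte Carlo estimate I_Ns(f) = (1/Ns) * sum_i f(x_i), with the
    x_i drawn i.i.d. from the target density by `sample` (Eq. 3.19)."""
    return sum(f(sample()) for _ in range(n_s)) / n_s

# Example: with p(x) a standard Gaussian, I(f) for f(x) = x^2 is the
# second moment, exactly 1. The estimation error shrinks like
# 1/sqrt(Ns) regardless of the dimension of x.
rng = random.Random(0)
est = mc_integral(lambda x: x * x, lambda: rng.gauss(0.0, 1.0), n_s=200_000)
```

With 200,000 particles the estimate lands within a few thousandths of the true value, which is the dimension-independent \( 1/\sqrt{N_s} \) behaviour the text refers to.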
Nevertheless, for a general non-Gaussian, multivariate, or multi-modal
PDF \( p(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \), it is difficult to generate samples directly from the posterior.
Alternatively, we can utilize the concept of importance sampling,
which is a method to compute expectations with respect to one distribution
using random samples drawn from another. Specifically, we choose a proposal
PDF \( q(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \) that approximates the target posterior distribution
\( p(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \) as closely as possible. This proposal PDF is referred to as the
importance sampling distribution, since it samples the target distribution
non-uniformly to give “more importance” to some values of
\( p(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \) than others. \( q(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \) should be easy to sample from, and its support
should cover that of \( p(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \); that is, the samples drawn from
\( q(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \) should cover the same region (or more) as the samples of
\( p(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \). The PDFs \( p(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \)
and \( q(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \) have the same support if the following condition is satisfied:
\[
p(\mathbf{x}_k \mid \mathbf{y}_{1:k}) > 0
\;\Rightarrow\;
q(\mathbf{x}_k \mid \mathbf{y}_{1:k}) > 0,
\quad \forall\, \mathbf{x}_k \in \mathbb{R}^{n_x}
\qquad (3.20)
\]
This is a necessary condition for the importance sampling theory to hold
(Ristic et al. 2004).
We can relate \( p(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \) to \( q(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \)
at every \( \mathbf{x}_k \) through a scaling factor
\[
\tilde{w}(\mathbf{x}_k)
= \frac{p(\mathbf{x}_k \mid \mathbf{y}_{1:k})}{q(\mathbf{x}_k \mid \mathbf{y}_{1:k})}
\qquad (3.21)
\]
Hence,
\[
E_p\!\left[f(\mathbf{x}_k)\right]
= \frac{\int f(\mathbf{x}_k)\, \tilde{w}(\mathbf{x}_k)\, q(\mathbf{x}_k \mid \mathbf{y}_{1:k})\, d\mathbf{x}_k}
       {\int \tilde{w}(\mathbf{x}_k)\, q(\mathbf{x}_k \mid \mathbf{y}_{1:k})\, d\mathbf{x}_k}
\qquad (3.22)
\]
where \( E_p[\cdot] \) denotes the expected value with respect to the PDF
\( p(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \).
If \( N_s \) particles \( \{\mathbf{x}_k^{(i)},\ i = 1, \ldots, N_s\} \) are generated from
\( q(\mathbf{x}_k \mid \mathbf{y}_{1:k}) \), Eq. (3.22)
can be rewritten as
\[
E_p\!\left[f(\mathbf{x}_k)\right]
\approx \frac{\frac{1}{N_s}\sum_{i=1}^{N_s} f\!\left(\mathbf{x}_k^{(i)}\right) \tilde{w}\!\left(\mathbf{x}_k^{(i)}\right)}
             {\frac{1}{N_s}\sum_{i=1}^{N_s} \tilde{w}\!\left(\mathbf{x}_k^{(i)}\right)}
= \sum_{i=1}^{N_s} w\!\left(\mathbf{x}_k^{(i)}\right) f\!\left(\mathbf{x}_k^{(i)}\right)
\qquad (3.23)
\]
where
\[
w\!\left(\mathbf{x}_k^{(i)}\right)
= \frac{\tilde{w}\!\left(\mathbf{x}_k^{(i)}\right)}
       {\sum_{j=1}^{N_s} \tilde{w}\!\left(\mathbf{x}_k^{(j)}\right)}
\qquad (3.24)
\]
Hence, a posterior PDF can be approximated by a set of particles and
weights for times up to \( k-1 \):
\[
p(\mathbf{x}_{k-1} \mid \mathbf{y}_{1:k-1})
\approx \sum_{i=1}^{N_s} w_{k-1}^{(i)}\,
\delta\!\left(\mathbf{x}_{k-1} - \mathbf{x}_{k-1|k-1}^{(i)}\right)
\qquad (3.25)
\]
with
\[
w_{k-1}^{(i)} \propto \tilde{w}_{k-1}^{(i)}
= \frac{p\!\left(\mathbf{x}_{k-1|k-1}^{(i)}\right)}
       {q\!\left(\mathbf{x}_{k-1|k-1}^{(i)}\right)}
\qquad (3.26)
\]
The importance sampling method can be modified to calculate an
approximation of the current PDF without modifying past simulated trajectories.
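The weighted delta-mixture form of Eq. (3.25) makes posterior moments immediate to read off. The particle values and weights below are purely illustrative:

```python
def normalize(w_tilde):
    """Normalize importance weights to sum to 1, realizing the
    proportionality in Eq. (3.26)."""
    total = sum(w_tilde)
    return [w / total for w in w_tilde]

def posterior_mean(particles, weights):
    """E[x] under p(x|y) ~= sum_i w_i * delta(x - x_i)  (Eq. 3.25):
    the integral against the delta mixture collapses to a weighted sum."""
    return sum(w * x for w, x in zip(weights, particles))

# Illustrative particle set with unnormalized weights.
particles = [0.0, 1.0, 2.0]
weights = normalize([1.0, 2.0, 1.0])
mean = posterior_mean(particles, weights)  # (0.25*0 + 0.5*1 + 0.25*2) = 1.0
```

Any other moment (covariance, higher moments) follows the same pattern: replace \( x_i \) in the weighted sum by the corresponding function of the particle.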
owed to her own dignity; and was best calculated, if he retained one
spark of sensibility or discernment, to convince him that her
sentiments had undergone an irrevocable change. This method,
therefore, she determined to pursue; making, with a sigh, this grand
proviso, that she should find it practicable.
Mrs De Courcy, who guessed the current of her thoughts, suffered it
to proceed without interruption; and it was not till Laura relaxed her
brow, and raised her head, like one who has taken his resolution,
that her companion, stopping, complained of fatigue; proposing, as
her own carriage was not in waiting, to borrow Lady Pelham's, and
return home, leaving the other ladies to be conveyed in Mrs
Penelope's sociable to Norwood, where the party was to dine. Not
willing to direct the proposal to Laura, upon whose account chiefly it
was made, she then turned to Mrs Penelope, and inquired whether
she did not feel tired with her walk; but that lady, who piqued
herself upon being a hale active woman of her age, declared herself
able for much greater exertion, and would walk, she said, till she
had secured an appetite for dinner. Laura, who had modestly held
back till Mrs Penelope's decision was announced, now eagerly
offered her attendance, which Mrs De Courcy, with a little
dissembled hesitation, accepted, smiling to perceive how well she
had divined her young favourite's inclinations.
The whole party attended them to the spot where the carriages
were waiting. On reaching them, Mr Bolingbroke, handing in Mrs De
Courcy, left Laura's side for the first time free to Hargrave, who
instantly occupied it; while Montague, the drops standing on his
forehead, found himself shackled between Mrs Penelope and Miss
Bolingbroke. 'Ever dear, ever revered Miss Montreville'—Hargrave
began in an insinuating whisper. 'Sir!' cried Laura, starting with
indignant surprise. 'Nay, start not,' continued he in an under voice; 'I
have much, much to say. Lady Pelham allows me to visit Walbourne;
will you permit me to'—Laura had not yet studied her lesson of easy
civility, and therefore the courtesy of a slight inclination of the head
was contradicted by the tone in which she interrupted him, saying, 'I
never presume, Sir, to select Lady Pelham's visitors.'
She had reached the door of the carriage, and Hargrave took her
hand to assist her in entering. Had Laura been prepared, she would
have suffered him, though reluctantly, to do her this little service;
but he took her unawares, and snatching back her hand as from the
touch of a loathsome reptile, she sprang, unassisted, into her seat.
As the carriage drove off, Mrs De Courcy again apologized for
separating Laura from her companions; 'though I know not,' added
she, 'whether I should not rather take credit for withdrawing you
from such dangerous society. All ladies who have stray hearts must
guard them either in person or by proxy, since this formidable
Colonel Hargrave has come among us.' 'He has fortunately placed
the more respectable part of us in perfect security,' returned Laura,
with a smile and voice of such unembarrassed simplicity as fully
satisfied her examiner.
Had Laura spent a lifetime in studying to give pain, which, indeed,
was not in all her thoughts, she could not have inflicted a sharper
sting on the proud heart of Hargrave, than by the involuntary look
and gesture with which she quitted him. The idea of inspiring with
disgust, unmixed irresistible disgust, the woman upon whose
affections, or rather upon whose passions, he had laboured so
zealously and so long, had ever been more than he could bear, even
when the expression of her dislike had no witness; but now she had
published it to chattering misses and prying old maids, and more
favoured rivals. Hargrave bit his lip till the blood came; and, if the
lightning of the eye could scathe, his wrath had been far more
deadly to others.
After walking for some minutes surly and apart, he began to comfort
himself with the hopes of future revenge. 'She had loved him,
passionately loved him, and he was certain she could not be so
utterly changed. Her behaviour was either all affectation, or a
conceit of the strength of her own mind, which all these clever
women were so vain of. But the spark still lurked somewhere,
whatever she might imagine, and if he could turn her own weapons
against herself.'—Then, recollecting that he had resolved to cultivate
Lady Pelham, he resumed his station by her side, and was again the
courtly, insinuating Colonel Hargrave.
Hargrave had lately acquired a friend, or rather an adviser (the
dissolute have no friends), who was admirably calculated to supply
the deficiencies of his character as a man of pleasure. Indeed,
except in so far as pleasure was his constant aim, no term could,
with less justice, have been applied to Hargrave; for his life was
chiefly divided between the goadings of temptations to which he
himself lent arms, and the pangs of self-reproach which he could not
exclude, and would not render useful. The strait and narrow way he
never had a thought of treading, but his wanderings were more
frequent than he intended, his returns more lingering. The very
strength of his passions made him incapable of deep or persevering
deceit; he was humane to the suffering that pressed itself on his
notice, if it came at a convenient season; and he was disinterested,
if neglect of gold deserve the name. Lambert, his new adviser, had
no passions, no humanity, no neglect of gold. He was a gamester.
The practice of this profession, for, though a man of family and
fortune he made it a profession, had rendered him skilful to discern,
and remorseless to use the weaknesses of his fellow creatures. His
estate lay contiguous to —, the little town where Hargrave had been
quartered when he visited at Norwood; but the year which Hargrave
passed at — was spent by Lambert almost entirely alone in London.
He had returned however to the country, had been introduced to
Hargrave, and had just fixed upon him as an easy prey, when the
soldier was saved for a time, by receiving intimation of his
promotion, and orders to join his regiment in a distant county.
They met again in an evil hour, just as Hargrave had half-determined
to abandon as fruitless his search after Laura. The necessity of a
stimulant was as strong as ever. Another necessity too was strong,
for £10,000 of damages had been awarded to Lord Bellamer;
Hargrave could not easily raise the money, and Lord Lincourt refused
to advance a shilling. 'A pretty expensive pleasure has this Lady
Bellamer been to me,' said Hargrave, bestowing on her Ladyship a
coarse enough epithet; for even fine gentlemen will sometimes call
women what they have found them to be. He was prevailed on to try
the gaming-table for the supply of both his wants, and found that
pleasure fully twice as expensive. His friend introduced him to some
of those accommodating gentlemen who lend money at illegal
interest, and was generous enough to supply him when they would
venture no more upon an estate in reversion. Lambert had
accidentally heard of the phœnix which had appeared at Walbourne;
and, on comparing the description he received of her with that to
which with politic patience he had often listened, he had no doubt of
having found the object of Hargrave's search. But, as it did not suit
his present views that the lover should renew the pursuit, he dropt
not a hint of his discovery, listening, with a gamester's insensibility,
to the regrets which burst forth amidst the struggles of expiring
virtue, for her whose soft influence would have led to peace and
honour.
At last a dispute arising between the worthy Mr Lambert and his
respectable coadjutors, as to the partition of the spoil, it occurred to
him that he could more effectually monopolize his prey in the
country; and thither accordingly he was called by pressing business.
There he was presently so fortunate as to discover a Miss
Montreville, on whose charms he descanted in a letter to Hargrave in
such terms, that, though he averred she could not be Hargrave's
Miss Montreville, Hargrave was sure she could be no other; and, as
his informer expected, arrived in ——shire as soon as a chaise and
four could convey him thither.
Lambert had now a difficult game to play, for he had roused the
leading passion, and the collateral one could act but feebly; but they
who often tread the crooked path, find pleasure in its intricacy,
vainly conceiting that it gives proof of their sagacity, and Lambert
looked with pleasure on the obstacles in his way. He trusted, that
while the master-spirit detained Hargrave within the circle of
Walbourne, he might dexterously practise with the lesser imp of evil.
Had his letter afforded a clue to Laura's residence, Hargrave would
have flown directly to Walbourne, but he was first obliged to stop at
—; and Lambert, with some difficulty, persuaded him, that, as he
was but slightly known to Lady Pelham, and probably in disgrace
with her protegée, it would be more politic to delay his visit, and first
meet them at Lord —'s, where he had information that they were to
go on the following day. 'You will take your girl at unawares,' said
he, 'if she be your girl; and that is no bad way of feeling your
ground.' The vanity of extorting from Laura's surprise some
unequivocal token of his power prevailed on the lover to delay the
interview till the morning; and, after spending half the evening in
dwelling upon the circumstances of his last unexpected meeting with
her, which distance softened in his imagination to more than its
actual tenderness, he, early in the morning set out with Lambert for
—, where he took post in the hermitage, as a place which no
stranger omitted to visit.
Growing weary of waiting, he dispatched Lambert as a scout; and,
lest he should miss Laura, remained himself in the hermitage, till his
emissary brought him information that the party were in the picture
gallery. Thither he hastened; but the party had already left the
house, and thus had Laura accidental warning of his approach. No
reception could have been so mortifying to him, who was prepared
to support her sinking under the struggle of love and duty, of
jealousy and pride. No struggle was visible; or, if there was, it was
but a faint strife between native courtesy and strong dislike. He had
boasted to Lambert of her tenderness; the specimen certainly was
not flattering. Most of her companions were little more gracious. De
Courcy paid him no more attention than bare civility required.—With
the Bolingbrokes he was unacquainted, but the character of his
companion was sufficient reason for their reserve. Lady Pelham was
the only person present who soothed his wounded vanity. Pleased
with the prospect of unravelling the mystery into which she had
pried so long in vain, charmed with the easy gallantry and adroit
flattery of which Hargrave, in his cooler moments, was consummate
master, she accepted his attentions with great cordiality; while he
had the address tacitly to persuade her that they were a tribute to
her powers of entertaining.
Before they parted, she had converted her permission to visit
Walbourne into a pressing invitation, nay, had even hinted to De
Courcy the propriety of asking the Colonel to join the dinner party
that day at Norwood. The hint, however, was not taken; and
therefore, in her way home, Lady Pelham indulged her fellow-
travellers with sundry moral and ingenious reflections concerning the
folly of being 'righteous over much;' and on the alluring accessible
form of the true virtue, contrasted with the repulsive, bristly,
hedgehog-like make of the false. Indeed, it must be owned, that for
the rest of the evening her Ladyship's conversation was rather
sententious than agreeable; but the rest of the party, in high good
humour, overlooked her attacks, or parried them in play.
Montague had watched the cold composure of Laura on Hargrave's
first accosting her, and seen the gesture which repulsed him at
parting; and though in the accompanying look he lost volumes, his
conclusions, on the whole, were favourable. Still a doubt arose,
whether her manner sprung not from the fleeting resentment of
affection; and he was standing mournfully calculating the effects of
Hargrave's perseverance, when his mother, in passing him as she
followed her guests to the eating-room, said, in an emphatical
whisper, 'I am satisfied. There is no worm in the bud.'
Mrs De Courcy's encouraging assertion was confirmed by the
behaviour of Laura herself; for she maintained her usual serene
cheerfulness; nor could even the eye of love detect more than one
short fit of distraction; and then the subject of thought seemed any
thing rather than pleasing retrospect, or glad anticipation. The
company of his friends, Harriet's pointedly favourable reception of Mr
Bolingbroke's assiduities, and the rise of his own hopes, all enlivened
Montague to unusual vivacity, and led him to a deed of daring which
he had often projected, without finding courage to perform it. He
thought, if he could speak of Hargrave to Laura, and watch her
voice, her eye, her complexion, all his doubts would be solved. With
this view, contriving to draw her a little apart, he ventured, for the
first time, to name his rival; mentioned Lady Pelham's hint; and,
faltering, asked Laura whether he had not done wrong in resisting it.
'Really,' answered Laura with a very naïve smile, and a very faint
blush, 'I don't wonder you hesitate in offering me such a piece of
flattery as to ask my opinion.'
'Do not tax me with flattering you,' said De Courcy earnestly; 'I
would as soon flatter an apostle; but tell me candidly what you
think.'
'Then, candidly,' said Laura, raising her mild unembarrassed eye to
his, 'I think you did right, perfectly right, in refusing your
countenance to a person of Colonel Hargrave's character. While vice
is making her encroachments on every hand, it is not for the friends
of virtue to remove the ancient landmarks.'
Though this was one of the stalest pieces of morality that ever
Montague had heard Laura utter, he could scarcely refrain from
repaying it by clasping her to his heart. Convinced that her affections
were free, he could not contain his rapture, but exclaimed, 'Laura,
you are an angel! and, if I did not already love beyond all power of
expression, I should be'—He raised his eyes to seek those of Laura,
and met his mother's, fixed on him with an expression that
compelled him to silence.—'You should be in love with me;' said
Laura, laughing, and filling up the sentence as she imagined it was
meant to conclude. 'Well, I shall be content with the second place.'
Mrs De Courcy, who had approached them, now spoke on some
indifferent subject, and saved her son from a very awkward attempt
at explanation. She drew her chair close to Laura, and soon engaged
her in a conversation so animated, that Montague forgot his
embarrassment, and joined them with all his natural ease and
cheerfulness. The infection of his ease and cheerfulness Laura had
ever found irresistible. Flashes of wit and genius followed the
collision of their minds; and the unstudied eloquence, the poetic
imagery of her style, sprung forth at his touch, like blossoms in the
steps of the fabled Flora.
Happy with her friends, Laura almost forgot the disagreeable
adventure of the morning; and, every look and word mutually
bestowing pleasure, the little party were as happy as affection and
esteem could make them, when Lady Pelham, with an aspect like a
sea fog, and a voice suitably forbidding, inquired whether her niece
would be pleased to go home, or whether she preferred sitting
chattering there all night. Laura, without any sign of noticing the
rudeness of this address, rose, and said she was quite ready to
attend her Ladyship. In vain did the De Courcys entreat her to
prolong her visit till the morning. To dare to be happy without her
concurrence, was treason against Lady Pelham's dignity; and
unfortunately she was not in a humour to concur in the joy of any
living thing. De Courcy's reserve towards her new favourite she
considered as a tacit reproof of her own cordiality; and she had just
such a conviction that the reproof was deserved, as to make her
thoroughly out of humour with the reprover, with herself, and
consequently with everybody else. Determined to interrupt pleasure
which she would not share, the more her hosts pressed her stay, the
more she hastened her departure; and she mingled her indifferent
good nights to them with more energetic reprimands to the
tardiness of her coachman.
'Thank heaven,' said she, thrusting herself into the corner of her
carriage with that jerk in her motion which indicates a certain degree
of irritation, 'to-morrow we shall probably see a civilized being.' A
short pause followed. Laura's plain integrity and prudence had
gained such ascendancy over Lady Pelham, that her niece's opinion
was to her Ladyship a kind of second conscience, having, indeed,
much the same powers as the first. Its sanction was necessary to
her quiet, though it had not force to controul her actions. On the
present occasion she wished, above all things, to know Laura's
sentiments; but she would not condescend to ask them directly.
'Colonel Hargrave's manners are quite those of a gentleman,' she
resumed. The remark was entirely ineffectual; for Laura coolly
assented, without inquiring whether he were the civilized being
whom Lady Pelham expected to see. Another pause. 'Colonel
Hargrave will be at Walbourne to-morrow,' said Lady Pelham, the
tone of her voice sharpening with impatience. 'Will he, Ma'am?'
returned Laura, without moving a muscle. 'If Miss Montreville has no
objections,' said Lady Pelham, converting by a toss of her head and
a twist of her upper lip, the words of compliment into an insult.
'Probably,' said Laura, with a smile, 'my objections would make no
great difference.'—'Oh, to be sure!' returned Lady Pelham, 'it would
be lost labour to state them to such an obstinate, unreasonable
person as I am! Well, I believe you are the first who ever accused
me of obstinacy.' If Lady Pelham expected a compliment to her
pliability, she was disappointed; for Laura only answered, 'I shall
never presume to interfere in the choice of your Ladyship's visitors.'
That she should be thus compelled to be explicit was more than
Lady Pelham's temper could endure. Her eyes flashing with rage,
'Superlative humility indeed!' she exclaimed with a sneer; but, awed,
in spite of herself, from the free expression of her fury, she muttered
it within her shut teeth in a sentence of which the words 'close' and
'jesuitical' alone reached Laura's ear. A long and surly silence
followed; Lady Pelham's pride and anger struggling with her desire
to learn the foundation and extent of the disapprobation which she
suspected that her conduct excited. The latter, at last, partly
prevailed; though Lady Pelham still disclaimed condescending to
direct consultation.
'Pray, Miss Montreville,' said she, 'if Colonel Hargrave's visits were to
you, what mighty objections might your sanctity find to them?'—
Laura had long ago observed that a slight exertion of her spirit was
the best quietus to her aunt's ill humour; and, therefore, addressing
her with calm austerity, she said, 'Any young woman, Madam, who
values her reputation, might object to Colonel Hargrave's visits,
merely on the score of prudence. But even my "superlative humility"
does not reconcile me to company which I despise; and my
"sanctity," as your Ladyship is pleased to call it, rather shrinks from
the violator of laws divine and human.'
Lady Pelham withdrew her eyes to escape a glance which they never
could stand; but, bridling, she said, 'Well, Miss Montreville, I am
neither young nor sanctimonious, therefore your objections cannot
apply to Colonel Hargrave's visits to me; and I am determined,'
continued she, speaking as if strength of voice denoted strength of
resolution, 'I am determined, that I will not throw away the society
of an agreeable man, to gratify the whims of a parcel of narrow-
minded bigots.'
To this attack Laura answered only with a smile. She smiled to see
herself classed with the De Courcys; for she had no doubt that they
were the 'bigots' to whom Lady Pelham referred. She smiled, too, to
observe that the boasted freedom of meaner minds is but a poor
attempt to hide from themselves the restraint imposed by the
opinions of the wise and good.
The carriage stopped, and Laura took sanctuary in her own
apartment; but at supper she met her aunt with smiles of unaffected
complacency, and, according to the plan which she invariably
pursued, appeared to have forgotten Lady Pelham's fit of spleen; by
that means enabling her aunt to recover from it with as little
expence to her pride as possible.
CHAPTER XXV
Lady Pelham was not disappointed in her expectation of seeing
Colonel Hargrave on the following day. He called at Walbourne while
her Ladyship was still at her toilette; and was shown into the
drawing-room, where Laura had already taken her station. She rose
to receive him, with an air which shewed that his visit gave her
neither surprise nor pleasure; and, motioning him to a distant seat,
quietly resumed her occupation. Hargrave was a little disconcerted.
He expected that Laura would shun him, with marks of strong
resentment, or perhaps with the agitation of offended love; and he
was prepared for nothing but to entreat the audience which she now
seemed inclined to offer him.
Lovers are so accustomed to accuse ladies of cruelty, and to find
ladies take pleasure in being so accused, that unlooked-for kindness
discomposes them; and a favour unhoped is generally a favour
undesired. The consciousness of ill desert, the frozen serenity of
Laura's manner, deprived Hargrave of courage to use the opportunity
which she seemed voluntarily to throw in his way. He hesitated, he
faltered; while, all unlike her former self, Laura appeared determined
that he should make love, for she would not aid his dilemma even by
a comment on the weather. All the timidity which formerly marked
her demeanour, was now transferred to his; and, arranging her work
with stoical composure, she raised her head to listen, as Hargrave
approaching her stammered out an incoherent sentence expressive
of his unalterable love, and his fears that he had offended almost
beyond forgiveness.
Laura suffered him to conclude without interruption; then answered,
in a voice mild but determined, 'I had some hopes, Sir, from your
knowledge of my character and sentiments, that, after what has
passed, you could have entertained no doubts on this subject.—Yet,
lest even a shadow of suspense should rest on your mind, I have
remained here this morning on purpose to end it. I sincerely grieve
to hear that you still retain the partiality you have been pleased to
express, since it is now beyond my power to make even the least
return.'
The utmost bitterness of reproach would not have struck so chilly on
the heart of Hargrave as these words, and the manner in which they
were uttered. From the principles of Laura he had indeed dreaded
much; but he had feared nothing from her indifference. He had
feared that duty might obtain a partial victory; but he had never
doubted that inclination would survive the struggle. With a mixture
of doubt, surprise, and anguish, he continued to gaze upon her after
she was silent; then starting, he exclaimed—'I will not believe it; it is
impossible. Oh, Laura, choose some other way to stab, for I cannot
bear this!'—'It pains me,' said Laura, in a voice of undissembled
concern, 'to add disappointment to the pangs which you cannot but
feel; yet it were most blameable now to cherish in you the faintest
expectation.'—'Stop,' cried Hargrave, vehemently, 'if you would not
have me utterly undone. I have never known peace or innocence but
in the hope of your love; leave me a dawning of that hope, however
distant. Nay, do not look as if it were impossible. When you thought
me a libertine, a seducer—all that you can now think me, you
suffered me to hope. Let me but begin my trial now, and all woman-
kind shall not lure me from you.'
'Ah,' said Laura, 'when I dreamt of the success of that trial, a strange
infatuation hung over me. Now it has passed away for ever. Sincerely
do I wish and pray for your repentance, but I can no longer offer to
reward it. My desire for your reformation will henceforth be as
disinterested as sincere.'
Half distracted with the cutting calmness of her manner, so changed
since the time when every feature spoke the struggles of the heart,
when the mind's whole strength seemed collected to resist its
tenderness, Hargrave again vehemently refused to believe in her
indifference. ''Tis but a few short months,' he cried, grasping her
hand with a violence that made her turn pale; ''tis but a few short
months since you loved me with your whole soul, since you said that
your peace depended upon my return to virtue. And dare you
answer it to yourself to cast away the influence, the only influence
that can secure me?'
'If I have any influence with you,' returned Laura, with a look and an
attitude of earnest entreaty, 'let it but this once prevail, and then be
laid aside for ever. Let me persuade you to the review of your
conduct; to the consideration of your prospects as an accountable
being, of the vengeance that awaits the impenitent, of the escape
offered in the gospel. As you value your happiness, let me thus far
prevail. Or if it will move you more,' continued she, the tears
gushing from her eyes, 'I will beseech you to grant this, my only
request; in memory of a love that mourned your unworthiness
almost unto death.'
The sight of her emotions revived Hargrave's hopes; and, casting
himself at her feet, he passionately declared, while she shuddered at
the impious sentiment, that he asked no heaven but her love, and
cared not what were his fate if she were lost. 'Ah, Sir,' said she, with
pious solemnity, 'believe me, the time is not distant when the
disappointment of this passion will seem to you a sorrow light as the
baffled sports of childhood. Believe the testimony of one who but
lately drew near to the gates of the grave. On a death-bed, guilt
appears the only real misery; and lesser evils are lost amidst its
horror like shadows in a midnight-gloom.'
The ideas which Laura was labouring to introduce into the mind of
Hargrave were such as he had of late too successfully endeavoured
to exclude. They had intruded like importunate creditors; till, oft
refused admittance, they had ceased to return. The same arts which
he had used to disguise from himself the extent of his criminality, he
now naturally employed to extenuate it in the sight of Laura. He
assured her that he was less guilty than she supposed; that she
could form no idea of the force of the temptation which had
overcome him; that Lady Bellamer was less the victim of his
passions than of her own; he vehemently protested that he despised
and abhorred the wanton who had undone him; and that, even in
the midst of a folly for which he now execrated himself, his
affections had never wandered from their first object. While he
spoke, Laura in confusion cast down her eyes, and offended
modesty suffused her face and neck with crimson. She could indeed
form no idea of a heart which, attached to one woman, could find
any temptation in the allurements of another. But when he ended,
virtuous indignation flashing in her countenance, 'For shame, Sir!'
said she. 'If any thing could degrade you in my eyes it were this
mean attempt to screen yourself behind the partner of your
wickedness. Does it lessen your guilt that it had not even the poor
excuse of passion; or think you that, even in the hours of a
weakness for which you have given me such just reason to despise
myself, I could have prized the affections of a heart so depraved?
You say you detest your crime; I fear you only detest its
punishment; for, were you really repentant, my opinion, the opinion
of the whole world, would seem to you a trifle unworthy of regard,
and the utmost bitterness of censure be but an echo to your own
self-upbraidings.'
Hargrave had no inclination to discuss the nature of repentance. His
sole desire was to wrest from Laura some token, however slight, of
returning tenderness. For this purpose he employed all the
eloquence which he had often found successful in similar attempts.
But no two things can be more different in their effects, than the
language of passion poured into the sympathizing bosom of mutual
love, or addressed to the dull ear of indifference. The expressions
which Laura once thought capable of warming the coldest heart
seemed now the mere ravings of insanity; the lamentations which
she once thought might have softened rocks, now appeared the
weak complainings of a child for his lost toy. With a mixture of pity
and disgust she listened and replied; till the entrance of Lady Pelham
put a period to the dialogue, and Laura immediately quitted the
room.
Lady Pelham easily perceived that the conversation had been
particular; and Hargrave did not long leave her in doubt as to the
subject. He acquainted her with his pretensions to Laura, and
begged her sanction to his addresses; assuring her that his
intercourse with Lady Bellamer was entirely broken off, and that his
marriage would secure his permanent reformation. He complimented
Lady Pelham upon her liberality of sentiment and knowledge of the
world; from both of which he had hopes, he said, that she would not
consider one error as sufficient to blast his character. Lady Pelham
made a little decent hesitation on the score of Lady Bellamer's prior
claims; but was assured that no engagement had ever subsisted
there. 'She hoped Lord Lincourt would not be averse.' She was told
that Lord Lincourt anxiously desired to see his nephew settled. 'She
hoped Colonel Hargrave was resolved that his married life should be
irreproachable. Laura had a great deal of sensibility, it would break
her heart to be neglected; and Lady Pelham was sure, that in that
case the thought of having consented to the dear child's misery
would be more than she could support!' Her Ladyship was
vanquished by an assurance, that for Laura to be neglected by her
happy husband was utterly impossible.
'Laura's inclinations then must be consulted; every thing depended
upon her concurrence, for the sweet girl had really so wound herself
round Lady Pelham's heart, that positively her Ladyship could not
bear to give her a moment's uneasiness, or to press her upon a
subject to which she was at all averse.' And, strange as it may seem,
Lady Pelham at that moment believed herself incapable of
distressing the person whom, in fact, she tormented with ceaseless
ingenuity! Hargrave answered by confessing his fears that he was
for the present less in favour than he had once been; but he
disclosed Laura's former confessions of partiality, and insinuated his
conviction that it was smothered rather than extinguished.
Lady Pelham could now account for Laura's long illness and low
spirits; and she listened with eager curiosity to the solution of the
enigma, which had so long perplexed her. She considered whether
she should relate to the lover the sorrows he had caused. She
judged (for Lady Pelham often judged properly) that it would be
indelicate thus to proclaim to him the extent of his power; but, with
the usual inconsistency between her judgment and her practice, in
half an hour she had informed him of all that she had observed, and
hinted all that she suspected. Hargrave listened, was convinced, and
avowed his conviction that Lady Pelham's influence was alone
necessary to secure his success. Her Ladyship said, 'that she should
feel some delicacy in using any strong influence with her niece, as
the amiable orphan had no friend but herself, had owed somewhat
to her kindness, and might be biassed by gratitude against her own
inclination. The fortune which she meant to bequeath to Laura might
by some be thought to confer a right to advise; but, for her part, she
thought her little all was no more than due to the person whose
tender assiduities filled the blank which had been left in her
Ladyship's maternal heart by the ingratitude and disobedience of her
child.' This sentiment was pronounced in a tone so pathetic, and in
language so harmonious, that, though it did not for a moment
impose upon her hearer, it deceived Lady Pelham herself; and she
shed tears, which she actually imagined to be forced from her by the
mingled emotions of gratitude and of disappointed tenderness.
Lady Pelham had now entered on a subject inexhaustible; her own
feelings, her own misfortunes, her own dear self. Hargrave, who in
his hours of tolerable composure was the most polite of men,
listened, or appeared to listen, with unconquerable patience, till he
fortunately recollected an appointment which his interest in her
Ladyship's conversation had before banished from his mind; when
he took his leave, bearing with him a very gracious invitation to
repeat his visit.
With him departed Lady Pelham's fit of sentimentality; and, in five
minutes, she had dried her eyes, composed the paragraph which
was to announce the marriage of Lord Lincourt (for she killed off the
old peer without ceremony) to the lovely heiress of the amiable Lady
Pelham; taken possession of her niece's barouche and four, and
heard herself announced as the benefactress of this new wonder of
the world of fashion. She would cut off her rebellious daughter with
a shilling; give her up to the beggary and obscurity which she had
chosen, and leave her whole fortune to Lady Lincourt; for so, in the
fulness of her content, she called Laura. After some time enjoying
her niece's prospects, or to speak more justly her own, she began to
think of discovering how near they might be to their
accomplishment; and, for this purpose, she summoned Laura to a
conference.
Lady Pelham loved nothing on earth but herself; yet vanity, gratified
curiosity, and, above all, the detection of a mere human weakness
reducing Laura somewhat more to her own level awakened in her
breast an emotion resembling affection; as, throwing her arms round
her niece, she, in language half sportive, half tender, declared her
knowledge of Laura's secret, and reproached her with having
concealed it so well. Insulted, wronged, and forsaken by Hargrave,
Laura had kept his secret inviolable, for she had no right to disclose
it; but she scorned, by any evasion, to preserve her own. Glowing
with shame and mortification, she stood silently shrinking from Lady
Pelham's looks; till, a little recovering herself, she said, 'I deserve to
be thus humbled for my folly in founding my regards, not on the
worth of their object, but on my own imagination; and more, if it be
possible, do I deserve, for exposing my weakness to one who has
been so ungenerous as to boast of it. But it is some compensation to
my pride,' continued she, raising her eyes, 'that my disorder is cured
beyond the possibility of relapse.' Lady Pelham smiled at Laura's
security, which she did not consider as an infallible sign of safety. It
was in vain that Laura proceeded solemnly to protest her
indifference. Lady Pelham could allow for self-deceit in another's
case, though she never suspected it in her own. Vain were Laura's
comments upon Hargrave's character; they were but the fond
revilings of offended love. Laura did not deny her former preference;
she even owned that it was the sudden intelligence of Hargrave's
crimes which had reduced her to the brink of the grave; therefore
Lady Pelham was convinced that a little perseverance would fan the
smothered flame; and perseverance, she hoped, would not be
wanting. Nevertheless, as her Ladyship balanced her fondness for
contradicting by her aversion to being contradicted, and as Laura
was too much in earnest to study the qualifying tone, the conference
concluded rather less amicably than it began; though it ended by
Lady Pelham's saying, not very consistently with her sentiments an
hour before, that she would never cease to urge so advantageous a
match, conceiving that she had a right to influence the choice of one
whom she would make the heiress of forty thousand pounds. Laura
was going to insist that all influence would be ineffectual, but her
aunt quitted her without suffering her to reply. She would have
followed to represent the injustice of depriving Mrs Herbert of her
natural rights; but she desisted on recollecting that Lady Pelham's
purposes were like wedges, never fixed but by resistance.
The time had been when Lady Pelham's fortune would have seemed
to Hargrave as dust in the balance, joined with the possession of
Laura. He had gamed, had felt the want of money; and money was
no longer indifferent to him. But Laura's dower was still light in his
estimation, compared with its weight in that of Lambert, to whom he
accidentally mentioned Lady Pelham's intention. That prudent person
calculated that £40,000 would form a very handsome addition to a
fund upon which he intended to draw pretty freely. He had little
doubt of Hargrave's success; he had never known any woman with
whom such a lover could fail. He thought he could lead his friend to
bargain for immediate possession of part of his bride's portion, and,
for certainty of the rest in reversion, before parting with his liberty.
He allowed two, or perhaps even three months for the duration of
Laura's influence; during which time he feared he should have little
of her husband's company at the gaming-table; but from
thenceforth, he judged that the day would be his own, and that he
should soon possess himself of Hargrave's property, so far as it was
alienable. He considered that, in the meantime, Laura would furnish
attraction sufficient to secure Hargrave's stay at —, and he trusted
to his own dexterity for improving that circumstance to the best
advantage. He failed not, therefore, to encourage the lover's hopes,
and bestowed no small ridicule on the idea that a girl of nineteen
should desert a favourite on account of his gallantry.
Cool cunning would engage with fearful odds against imprudence, if
it could set bounds to the passions, as well as direct their course.
But it is often deceived in estimating the force of feelings which it
knows only by their effects. Lambert soon found that he had opened
the passage to a torrent which bore all before it. The favourite
stimulus found, its temporary substitute was almost disregarded;
and Hargrave, intoxicated with his passion, tasted sparingly of the
poisoned cup which his friend designed for him. His time and
thoughts were again devoted to Laura, and gaming was only sought
as a relief from the disappointment and vexation which generally
attended his pursuit. The irritation of his mind, however, made
amends for the lessened number of opportunities for plundering
him, by rendering it easier to take advantage of those which
remained.
The insinuating manners and elegant person of Hargrave gained
daily on the favour of Lady Pelham; for the great as well as the little
vulgar are the slaves of mere externals. She permitted his visits at
home and his attendance abroad, expatiating frequently on the
liberality of sentiment which she thus displayed. At first these
encomiums on her own conduct were used only to disguise from
herself and others her consciousness of its impropriety; but she
repeated them till she actually believed them just, and considered
herself as extending a charitable hand to rescue an erring brother
from the implacable malignity of the world.
She was indefatigable in her attempts to promote his success with
Laura. She lost no opportunity of pressing the subject. She
obstinately refused to be convinced of the possibility of overcoming
a strong prepossession. Laura, in an evil hour for herself,
thoughtlessly replied, that affection was founded on the belief of
excellence, and must of course give way when the foundation was
removed. This observation had just fallacy sufficient for Lady
Pelham's purpose. She took it for her text, and harangued upon it
with all the zeal and perseverance of disputation. She called it
Laura's theory; and insisted that, like other theorists, she would shut
her eyes against the plainest facts, nay, stifle the feelings of her own
mind, rather than admit what might controvert her opinion. She
cited all the instances which her memory could furnish of
agricultural, and chemical, and metaphysical theorism; and, with
astonishing ingenuity, contrived to draw a parallel between each of
them and Laura's case. It was in vain that Laura qualified, almost
retracted her unlucky observation. Her adversary would not suffer
her to desert the untenable ground. Delighted with her victory, she
returned again and again to the attack, after the vanquished had
appealed to her mercy; and much more than 'thrice she slew the
slain.'
Sick of arguing about the possibility of her indifference, Laura at
length confined herself to simple assertions of the fact. Lady Pelham
at first merely refused her belief; and, with provoking pity, rallied her
niece upon her self-deceit; but, finding that she corroborated her
words by a corresponding behaviour to Hargrave, her Ladyship's
temper betrayed its accustomed infirmity. She peevishly reproached
Laura with taking a coquettish delight in giving pain; insisted that
her conduct was a tissue of cruelty and affectation; and upbraided
her with disingenuousness in pretending an indifference which she
could not feel. 'And does your Ladyship communicate this opinion to
Colonel Hargrave?' said Laura, one day, fretted almost beyond her
patience by a remonstrance of two hours' continuance. 'To be sure I
do,' returned Lady Pelham. 'In common humanity I will not allow him
to suffer more from your perverseness than I can avoid.' 'Well,
Madam,' said Laura, with a sigh and a shrug of impatient
resignation, 'nothing remains but that I shew a consistency, which,
at least is not common to affectation.'
Lady Pelham's representations had their effect upon Hargrave. They
brought balm to his wounded pride, and he easily suffered them to
counteract the effect of Laura's calm and uniform assurances of her
indifference. While he listened to these, her apparent candour and
simplicity, the regret she expressed at the necessity of giving pain,
brought temporary conviction to his mind; and, with transports of
alternate rage and grief, he now execrated her inconstancy, then his
own unworthiness; now abjured her, then the vices which had
deprived him of her affection. But the joint efforts of Lady Pelham
and Lambert always revived hopes sufficient to make him continue a
pursuit which he had not indeed the fortitude to relinquish.
His love (if we must give that name to a selfish desire, mingled at
times with every ungentle feeling), had never been so ardent. The
well-known principle of our nature which adds charms to what is
unattainable, lent new attractions to Laura's really improved
loveliness. The smile which was reserved for others seemed but the
more enchanting; the hand which he was forbidden to touch seemed
but the more soft and snowy; the form which was kept sacred from
his approach, bewitched him with more resistless graces. Hargrave
had been little accustomed to suppress any of his feelings, and he
gave vent to this with an entire neglect of the visible uneasiness
which it occasioned to its subject. He employed the private
interviews, which Lady Pelham contrived to extort for him, in the
utmost vehemence of complaint, protestation, and entreaty. He
laboured to awaken the pity of Laura; he even condescended to
appeal to her ambition; and persevered, in spite of unequivocal
denials, till Laura, disgusted, positively refused ever again to admit
him without witnesses.
His public attentions were, if possible, still more distressing to her.
Encouraged by Lady Pelham, he, notwithstanding the almost
repulsive coldness of Laura's manner, became her constant
attendant. He pursued her wherever she went; placed himself, in
defiance of propriety, so as to monopolize her conversation; and
seemed to have laid aside all his distinguishing politeness, while he
neglected every other woman to devote his assiduities to her alone.
He claimed the station by her side till Laura had the mortification to
observe that others resigned it at his approach; he snatched every
opportunity of whispering his adulations in her ear; and, far from
affecting any concealment in his preference, seemed to claim the
character of her acknowledged adorer. It is impossible to express the
vexation with which Laura endured this indelicate pre-eminence. Had
Hargrave been the most irreproachable of mankind, she would have
shrunk from such obtrusive marks of his partiality; but her sense of
propriety was no less wounded by the attendance of such a
companion, than her modesty was shocked by her being thus
dragged into the notice, and committed to the mercy of the public.
The exclusive attentions of the handsome Colonel Hargrave, the
mirror of gallantry, the future Lord Lincourt, were not, however
undesired, to be possessed unenvied. Those who unsuccessfully
angled for his notice, avenged themselves on her to whom they
imputed their failure, by looks of scorn, and by sarcastic remarks,
which they sometimes contrived should reach the ear of the
innocent object of their malice. Laura, unspeakably averse to being
the subject of even laudatory observation, could sometimes scarcely
restrain the tears of shame and mortification that were wrung from
her by attacks which she could neither resent nor escape. In spite of
the natural sweetness of her temper, she was sometimes tempted to
retort upon Colonel Hargrave the vexation which he caused to her:
and his officiousness almost compelled her to forsake the civility
within the bounds of which she had determined to confine her
coldness.
He complained bitterly of this treatment, and reproached her with
taking ungenerous advantage of his passion. 'Why then,' said she,