STOCHASTIC MODELING AND CONTROL

Edited by Ivan Ganchev Ivanov
Stochastic Modeling and Control
http://dx.doi.org/10.5772/2567
Edited by Ivan Ganchev Ivanov
Contributors
J. Linares-Pérez, R. Caballero-Águila, I. García-Garrido, Uwe Küchler, Vyacheslav A. Vasiliev,
Serena Doria, Ricardo López-Ruiz, Jaime Sañudo, David Opeyemi, Tze Leung Lai, Tiong Wee
Lim, Jingtao Shi, Ivan Ivanov, Sergey V. Sokolov, Nicholas A. Nechval, Maris Purgailis, Ming Xu,
Jiamin Zhu, Tian Tan, Shijie Xu, Vladimir Šimović, Siniša Fajt, Miljenko Krhen, Raúl Fierro, Ying
Shen, Hui Zhang
Published by InTech
Janeza Trdine 9, 51000 Rijeka, Croatia
Copyright © 2012 InTech
All chapters are Open Access distributed under the Creative Commons Attribution 3.0 license,
which allows users to download, copy and build upon published articles even for commercial
purposes, as long as the author and publisher are properly credited, which ensures maximum
dissemination and a wider impact of our publications. After this work has been published by
InTech, authors have the right to republish it, in whole or part, in any publication of which they
are the author, and to make other personal use of the work. Any republication, referencing or
personal use of the work must explicitly identify the original source.
Notice
Statements and opinions expressed in the chapters are those of the individual contributors and
not necessarily those of the editors or publisher. No responsibility is accepted for the accuracy
of information contained in the published chapters. The publisher assumes no responsibility for
any damage or injury to persons or property arising out of the use of any materials,
instructions, methods or ideas contained in the book.
Publishing Process Manager Dimitri Jelovcan
Typesetting InTech Prepress, Novi Sad
Cover InTech Design Team
First published November, 2012
Printed in Croatia
A free online edition of this book is available at www.intechopen.com
Additional hard copies can be obtained from orders@intechopen.com
Stochastic Modeling and Control, Edited by Ivan Ganchev Ivanov
p. cm.
ISBN 978-953-51-0830-6
Contents
Preface IX
Chapter 1 Design of Estimation Algorithms from an Innovation
Approach in Linear Discrete-Time Stochastic Systems
with Uncertain Observations 1
J. Linares-Pérez, R. Caballero-Águila and I. García-Garrido
Chapter 2 On Guaranteed Parameter Estimation of Stochastic
Delay Differential Equations by Noisy Observations 23
Uwe Küchler and Vyacheslav A. Vasiliev
Chapter 3 Coherent Upper Conditional Previsions Defined
by Hausdorff Outer Measures to Forecast
in Chaotic Dynamical Systems 51
Serena Doria
Chapter 4 Geometrical Derivation of Equilibrium Distributions
in Some Stochastic Systems 63
Ricardo López-Ruiz and Jaime Sañudo
Chapter 5 Stochastic Modelling of Structural Elements 81
David Opeyemi
Chapter 6 Singular Stochastic Control in Option Hedging
with Transaction Costs 103
Tze Leung Lai and Tiong Wee Lim
Chapter 7 Stochastic Control for Jump Diffusions 119
Jingtao Shi
Chapter 8 Iterations for a General Class of Discrete-Time
Riccati-Type Equations: A Survey and Comparison 147
Ivan Ivanov
Chapter 9 Stochastic Observation Optimization on the Basis
of the Generalized Probabilistic Criteria 171
Sergey V. Sokolov
Chapter 10 Stochastic Control and Improvement of Statistical Decisions
in Revenue Optimization Systems 185
Nicholas A. Nechval and Maris Purgailis
Chapter 11 Application of Stochastic Control into Optimal Correction
Maneuvers for Transfer Trajectories 211
Ming Xu, Jiamin Zhu, Tian Tan and Shijie Xu
Chapter 12 Stochastic Based Simulations and Measurements of Some
Objective Parameters of Acoustic Quality: Subjective
Evaluation of Room Acoustic Quality with Acoustics
Optimization in Multimedia Classroom
(Analysis with Application) 233
Vladimir Šimović, Siniša Fajt and Miljenko Krhen
Chapter 13 Discrete-Time Stochastic Epidemic Models
and Their Statistical Inference 253
Raúl Fierro
Chapter 14 Identifiability of Quantized Linear Systems 279
Ying Shen and Hui Zhang
Preface
Stochastic control plays an important role in many scientific and applied disciplines.
The goal of this book is to collect a group of outstanding investigations in various
aspects of stochastic systems and their behavior.
Linear discrete-time stochastic systems with uncertain observations (Chapter 1), and
parameter estimation of stochastic delay differential equations by noisy observations
(Chapter 2), are considered at the beginning of the book. The model of coherent upper
conditional prevision, analyzed in Chapter 3, can be used to forecast in a chaotic system.
A stochastic control problem where the system is governed by a nonlinear stochastic
differential equation with jumps is explored in Chapter 7; there, the control is allowed
to enter both the diffusion and jump terms.
A geometrical interpretation of different multi-agent systems evolving in phase space
under the hypothesis of equiprobability is introduced, and some new results for
statistical systems are obtained, in Chapter 4. The process and procedures for stochastic
modelling of structural elements are analyzed in Chapter 5.
The next three chapters are concerned with financial applications: a singular stochastic
control model in European option hedging with transaction costs is studied in Chapter
6, and a stochastic optimal control problem for jump diffusions, where the controlled
stochastic system is driven by both a Brownian motion and a Poisson random measure, is
investigated in Chapter 7. More precisely, the relationship between the stochastic
maximum principle and the dynamic programming principle for the stochastic optimal
control problem of jump diffusions is derived. Iterations for computing the maximal
and stabilizing solutions of the discrete-time generalized algebraic Riccati equations,
together with their numerical properties, are considered in Chapter 8.
The remaining chapters can be considered as a collection of several applications of
optimal control problems: a synthesis problem of the optimal control of the
observation process based on generalized probabilistic criteria (Chapter 9), and a
problem of suggesting improvements of statistical decisions in revenue management
systems under parametric uncertainty (Chapter 10). In Chapter 11, an optimal
correction maneuver strategy is proposed from the viewpoint of stochastic control to
track two typical and interesting transfer trajectories, for orbital rendezvous and a Halo
orbit, respectively. The next application is a problem of acoustics optimization in a
multimedia classroom: in Chapter 12, stochastic-based simulations and measurements
of some objective parameters of acoustic quality, together with a subjective evaluation
of room acoustic quality, are analyzed. A discrete-time stochastic epidemic model is
introduced and examined from a statistical point of view in Chapter 13. The last
chapter, Chapter 14, studies the parameter identifiability of quantized linear systems
with Gauss-Markov parameters from an information-theoretic point of view.
Prof. Ivan Ganchev Ivanov
Head of the Department of Statistics and Econometrics,
Faculty of Economics and Business Administration,
Sofia University "St. Kl. Ohridski", Sofia,
Bulgaria
Chapter 1
Design of Estimation Algorithms from an
Innovation Approach in Linear Discrete-Time
Stochastic Systems with Uncertain Observations
J. Linares-Pérez, R. Caballero-Águila and I. García-Garrido
Additional information is available at the end of the chapter
http://dx.doi.org/10.5772/45777
1. Introduction
The least-squares estimation problem in linear discrete-time stochastic systems in which the
signal to be estimated is always present in the observations has been widely treated; as is well
known, the Kalman filter [12] provides the least-squares estimator when the additive noises
and the initial state are Gaussian and mutually independent.
Nevertheless, in many real situations the measurement device or the transmission
mechanism can be subject to random failures, generating observations in which the state
appears randomly or which may consist of noise only due, for example, to component
or interconnection failures, intermittent failures in the observation mechanism, fading
phenomena in propagation channels, accidental loss of some measurements or data
inaccessibility at certain times. In these situations where it is possible that information
concerning the system state vector may or may not be contained in the observations, at each
sampling time, there is a positive probability (called false alarm probability) that only noise is
observed and, hence, that the observation does not contain the transmitted signal, but it is
not generally known whether the observation used for estimation contains the signal or it
is only noise. To describe this interrupted observation mechanism (uncertain observations),
the observation equation, with the usual additive measurement noise, is formulated by
multiplying the signal function at each sampling time by a binary random variable taking
the values one and zero (Bernoulli random variable); the value one indicates that the
measurement at that time contains the signal, whereas the value zero reflects the fact that
the signal is missing and, hence, the corresponding observation is only noise. So, the
observation equation involves both an additive and a multiplicative noise, the latter modeling
the uncertainty about the signal being present or missing at each observation.
Linear discrete-time systems with uncertain observations have been widely used in estimation
problems related to the above practical situations (which commonly appear, for example, in
Communication Theory). Due to the multiplicative noise component, even if the additive
noises are Gaussian, systems with uncertain observations are always non-Gaussian and hence,
as occurs in other kinds of non-Gaussian linear systems, the least-squares estimator is not a
linear function of the observations and, generally, it is not easily obtainable by a recursive
algorithm; for this reason, research on this kind of systems has focused special attention on
the search for suboptimal estimators for the signal (mainly linear ones).
In some cases, the variables modeling the uncertainty in the observations can be assumed
to be independent and, then, the distribution of the multiplicative noise is fully determined
by the probability that each particular observation contains the signal. As shown by
Nahi [17] (who was the first to analyze the least-squares linear filtering problem in this
kind of systems, assuming that the state and observation additive noises are uncorrelated),
the knowledge of the aforementioned probabilities allows one to derive estimation algorithms
with a recursive structure similar to the Kalman filter. Later on, Monzingo [16] completed
these results by analyzing the least-squares smoothing problem and, subsequently, [3] and [4]
generalized the least-squares linear filtering and smoothing algorithms considering that the
additive noises of the state and the observation are correlated.
However, there exist many real situations where this independence assumption of the
Bernoulli variables modeling the uncertainty is not satisfied; for example, in signal
transmission models with stand-by sensors in which any failure in the transmission is detected
immediately and the old sensor is then replaced, thus avoiding the possibility of the signal
being missing in two successive observations. This different situation was considered by
[9] by assuming that the variables modeling the uncertainty are correlated at consecutive
time instants, and the proposed least-squares linear filtering algorithm provides the signal
estimator at any time from those in the two previous instants. Later on, the state estimation
problem in discrete-time systems with uncertain observations, has been widely studied under
different hypotheses on the additive noises involved in the state and observation equations
and, also, under several hypotheses on the multiplicative noise modeling the uncertainty in
the observations (see e.g. [22] - [13], among others).
On the other hand, there are many engineering application fields (for example, in
communication systems) where sensor networks are used to obtain all the available
information on the system state and its estimation must be carried out from the observations
provided by all the sensors (see [6] and references therein). Most papers concerning systems
with uncertain observations transmitted by multiple sensors assume that all the sensors have
the same uncertainty characteristics. In recent years, this situation has been generalized by
several authors considering uncertain observations whose statistical properties are assumed
not to be the same for all the sensors. This is a realistic assumption in several application fields,
for instance, in networked communication systems involving heterogeneous measurement
devices (see e.g. [14] and [8], among others). In [7] it is assumed that the uncertainty in
each sensor is modeled by a sequence of independent Bernoulli random variables, whose
statistical properties are not necessarily the same for all the sensors. Later on, in [10] and
[1] the independence restriction is weakened; specifically, different sequences of Bernoulli
random variables correlated at consecutive sampling times are considered to model the
uncertainty at each sensor. This form of correlation covers practical situations where the
signal cannot be missing in two successive observations. In [2] the least-squares linear and
quadratic problems are addressed when the Bernoulli variables describing the uncertainty in
2 Stochastic Modeling and Control
Design of Estimation Algorithms from an Innovation Approach in Linear Discrete-Time Stochastic Systems with Uncertain Observations 3
the observations are correlated at instants that differ by two units of time. This study covers
more general practical situations, for example, in sensor networks where sensor failures may
happen and a failed sensor is replaced not immediately, but two sampling times after having
failed. However, even if it is assumed that any failure in the transmission results from sensor
failures, usually the failed sensor may not be replaced immediately but after m instants of
time; in such situations, correlation among the random variables modeling the uncertainty
in the observations at times k and k + m must be considered and new algorithms must be
deduced.
The current chapter is concerned with the state estimation problem for linear discrete-time
systems with uncertain observations when the uncertainty at any sampling time k depends
only on the uncertainty at the previous time k − m; this form of correlation allows us
to consider certain models in which the signal cannot be missing in m + 1 consecutive
observations.
The random interruptions in the observation process are modeled by a sequence of Bernoulli
variables (at each time, the value one of the variable indicates that the measurement is the
current system output, whereas the value zero reflects that only noise is available), which
are correlated only at the sampling times k − m and k. Recursive algorithms for the filtering
and fixed-point smoothing problems are proposed by using an innovation approach; this
approach, based on the fact that the innovation process can be obtained by a causal and
invertible operation on the observation process, consists of obtaining the estimators as a linear
combination of the innovations and simplifies considerably the derivation of the estimators
due to the fact that the innovations constitute a white process.
The chapter is organized as follows: in Section 2 the system model is described; more
specifically, we introduce the linear state transition model perturbed by a white noise,
and the observation model affected by an additive white noise and a multiplicative noise
describing the uncertainty. Also, the pertinent hypotheses to address the least-squares linear
estimation problem are established. In Section 3 this estimation problem is formulated
using an innovation approach. Next, in Section 4, recursive algorithms for the filter and
fixed-point smoother are derived, including recursive formulas for the estimation error
covariance matrices. Finally, the performance of the proposed estimators is illustrated in
Section 5 by a numerical simulation example, where a two-dimensional signal is estimated
and the estimation accuracy is analyzed for different values of the uncertainty probability and
several values of the time period m.
2. Model description
Consider linear discrete-time stochastic systems with uncertain observations coming from
multiple sensors, whose mathematical modeling is accomplished by the following equations.
The state equation is given by
$$x_k = F_{k-1} x_{k-1} + w_{k-1}, \qquad k \ge 1, \qquad (1)$$

where $\{x_k;\ k \ge 0\}$ is an $n$-dimensional stochastic process representing the system state, $\{w_k;\ k \ge 0\}$ is a white noise process and $F_k$, for $k \ge 0$, are known deterministic matrices.
We consider scalar uncertain observations $\{y_k^i;\ k \ge 1\}$, $i = 1, \ldots, r$, coming from $r$ sensors and perturbed by noises whose statistical properties are not necessarily the same for all the sensors. Specifically, we assume that, in each sensor and at any time $k$, the observation $y_k^i$, perturbed by an additive noise, can have no information about the state (thus being only noise) with a known probability. That is,

$$y_k^i = \begin{cases} H_k^i x_k + v_k^i, & \text{with probability } \bar{\theta}_k^i, \\ v_k^i, & \text{with probability } 1 - \bar{\theta}_k^i, \end{cases}$$

where, for $i = 1, \ldots, r$, $\{v_k^i;\ k \ge 1\}$ is the observation additive noise process of the $i$-th sensor and $H_k^i$, for $k \ge 1$, are known deterministic matrices of compatible dimensions. If we introduce $\{\theta_k^i;\ k \ge 1\}$, $i = 1, \ldots, r$, sequences of Bernoulli random variables with $P[\theta_k^i = 1] = \bar{\theta}_k^i$, the observations of the state can be rewritten as

$$y_k^i = \theta_k^i H_k^i x_k + v_k^i, \qquad k \ge 1,\ i = 1, \ldots, r. \qquad (2)$$
Remark 1. If $\theta_k^i = 1$, which occurs with known probability $\bar{\theta}_k^i$, the state $x_k$ is present in the observation $y_k^i$ coming from the $i$-th sensor at time $k$, whereas if $\theta_k^i = 0$ such observation only contains additive noise, $v_k^i$, which occurs with probability $1 - \bar{\theta}_k^i$. This probability is called the false alarm probability and it represents the probability that only noise is observed or, equivalently, that $y_k^i$ does not contain the state.
The aim is to address the state estimation problem considering all the available observations coming from the $r$ sensors. For convenience, denoting $y_k = (y_k^1, \ldots, y_k^r)^T$, $v_k = (v_k^1, \ldots, v_k^r)^T$, $H_k = (H_k^{1T}, \ldots, H_k^{rT})^T$ and $\Theta_k = \mathrm{Diag}(\theta_k^1, \ldots, \theta_k^r)$, Equation (2) is equivalent to the following stacked observation equation

$$y_k = \Theta_k H_k x_k + v_k, \qquad k \ge 1. \qquad (3)$$
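As a quick illustration, Equations (1) and (3) can be simulated directly. The sketch below (Python/NumPy) generates a state trajectory and its uncertain observations; all numerical values are illustrative choices of our own, and, for simplicity of the sketch, the Bernoulli variables are drawn independently rather than with the lag-$m$ correlation studied in this chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

n, r, T = 2, 2, 200                           # state dimension, sensors, horizon
F = np.array([[0.95, 0.10], [0.00, 0.90]])    # F_k (taken time-invariant here)
H = np.eye(r, n)                              # stacked H_k, one row per sensor
Q = 0.1 * np.eye(n)                           # Cov[w_k]
R = 0.5 * np.eye(r)                           # Cov[v_k]
theta_bar = np.array([0.8, 0.6])              # P[theta^i_k = 1] for each sensor

x = np.zeros(n)
states, obs, thetas = [], [], []
for k in range(T):
    x = F @ x + rng.multivariate_normal(np.zeros(n), Q)    # Eq. (1)
    theta = (rng.random(r) < theta_bar).astype(float)      # multiplicative noise
    v = rng.multivariate_normal(np.zeros(r), R)
    obs.append(np.diag(theta) @ H @ x + v)                 # Eq. (3)
    states.append(x)
    thetas.append(theta)

print(np.mean(thetas, axis=0))    # empirical presence frequencies per sensor
```

The empirical frequencies printed at the end should be close to `theta_bar`, confirming the interpretation of $\bar{\theta}_k^i$ as the probability that sensor $i$ actually transmits the signal.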
2.1. Model hypotheses
In order to analyze the least-squares linear estimation problem of the state $x_k$ from the observations $y_1, \ldots, y_L$, with $L \ge k$, some considerations must be taken into account. On the one hand, it is known that the linear estimator of $x_k$ is the orthogonal projection of $x_k$ onto the space of $n$-dimensional random variables obtained as linear transformations of the observations $y_1, \ldots, y_L$, which requires the existence of the second-order moments of such observations. On the other hand, we consider that the variables describing the uncertainty in the observations are correlated at instants that differ by $m$ units of time, to cover many practical situations where the independence assumption on such variables is not realistic. Specifically, the following hypotheses are assumed:

Hypothesis 1. The initial state $x_0$ is a random vector with $E[x_0] = \bar{x}_0$ and $\mathrm{Cov}[x_0] = P_0$.

Hypothesis 2. The state noise $\{w_k;\ k \ge 0\}$ is a zero-mean white sequence with $\mathrm{Cov}[w_k] = Q_k$, $\forall k \ge 0$.
Hypothesis 3. The observation additive noise $\{v_k;\ k \ge 1\}$ is a zero-mean white process with $\mathrm{Cov}[v_k] = R_k$, $\forall k \ge 1$.

Hypothesis 4. For $i = 1, \ldots, r$, $\{\theta_k^i;\ k \ge 1\}$ is a sequence of Bernoulli random variables with $P[\theta_k^i = 1] = \bar{\theta}_k^i$. For $i, j = 1, \ldots, r$, the variables $\theta_k^i$ and $\theta_s^j$ are independent for $|k - s| \neq 0, m$, and $\mathrm{Cov}[\theta_k^i, \theta_s^j]$ are known for $|k - s| = 0, m$. Defining $\theta_k = (\theta_k^1, \ldots, \theta_k^r)^T$, the covariance matrices of $\theta_k$ and $\theta_s$ will be denoted by $K_{k,s}^{\theta}$.
Finally, we assume the following hypothesis on the independence of the initial state and
noises:
Hypothesis 5. The initial state $x_0$ and the noise processes $\{w_k;\ k \ge 0\}$, $\{v_k;\ k \ge 1\}$ and $\{\theta_k;\ k \ge 1\}$ are mutually independent.
Remark 2. For the derivation of the estimation algorithms a matrix product, called the Hadamard product, which is simpler than the conventional product, will be considered. Let $A, B \in \mathcal{M}_{m \times n}$; the Hadamard product (denoted by $\circ$) of $A$ and $B$ is defined as $[A \circ B]_{ij} = A_{ij} B_{ij}$. From this definition, the following property, which will be needed later, is easily deduced (see [7]).

For any random matrix $G_{m \times m}$ independent of $\{\Theta_k;\ k \ge 1\}$, the following equality is satisfied:

$$E[\Theta_k G_{m \times m} \Theta_s] = E[\theta_k \theta_s^T] \circ E[G_{m \times m}].$$

In particular, denoting $\bar{\Theta}_k = E[\Theta_k]$, it is immediately clear that

$$E[(\Theta_k - \bar{\Theta}_k) G_{m \times m} (\Theta_s - \bar{\Theta}_s)] = K_{k,s}^{\theta} \circ E[G_{m \times m}]. \qquad (4)$$
Remark 3. Several authors assume that the observations available for the estimation come either from multiple sensors with identical uncertainty characteristics or from a single sensor (see [20] for the case when the uncertainty is modeled by independent variables, and [19] for the case when such variables are correlated at consecutive sampling times). Nevertheless, in recent years, this situation has been generalized by some authors considering multiple sensors featuring different uncertainty characteristics (see e.g. [7] for the case of independent uncertainty, and [1] for situations where the uncertainty in each sensor is modeled by variables correlated at consecutive sampling times). We analyze the state estimation problem for the class of linear discrete-time systems with uncertain observations (3) which, as established in Hypothesis 4, are characterized by the fact that the uncertainty at any sampling time $k$ depends only on the uncertainty at the previous time $k - m$; this form of correlation allows us to consider certain models in which the signal cannot be absent in $m + 1$ consecutive observations.
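One hypothetical mechanism producing this kind of lag-$m$ dependence is a sensor that, once failed, is replaced exactly $m$ instants later: if $\theta_{k-m} = 0$ then $\theta_k = 1$. The sketch below (Python, parameters arbitrary) generates such a sequence and confirms that the signal is then never missing in $m + 1$ consecutive observations; note this simple scheme is only an illustration, not the general covariance structure assumed in Hypothesis 4.

```python
import numpy as np

rng = np.random.default_rng(2)
m, p, T = 3, 0.7, 10_000          # replacement delay, presence probability, horizon

theta = np.ones(T, dtype=int)
for k in range(T):
    if k >= m and theta[k - m] == 0:
        theta[k] = 1               # the sensor that failed at time k - m is replaced
    else:
        theta[k] = int(rng.random() < p)

# a run of m + 1 zeros would contain theta_j = theta_{j+m} = 0: impossible here
runs = ''.join(map(str, theta)).split('1')
print(max(len(s) for s in runs))   # longest run of zeros, at most m
```

Any run of $m + 1$ consecutive zeros would contain two zeros $m$ instants apart, which the replacement rule forbids, so the longest run of missing observations is at most $m$.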
3. Least-squares linear estimation problem

As mentioned above, our aim in this chapter is to obtain, by recursive formulas, the least-squares linear estimator, $\widehat{x}_{k/L}$, of the signal $x_k$ based on the observations $\{y_1, \ldots, y_L\}$, with $L \ge k$. Specifically, the problem is to derive recursive algorithms for the least-squares linear filter ($L = k$) and fixed-point smoother (fixed $k$ and $L > k$) of the state using uncertain observations (3). For this purpose, we use an innovation approach as described in [11].
Since the observations are generally nonorthogonal vectors, we use the Gram-Schmidt orthogonalization procedure to transform the set of observations $\{y_1, \ldots, y_L\}$ into an equivalent set of orthogonal vectors $\{\nu_1, \ldots, \nu_L\}$; equivalent in the sense that they both generate the same linear subspace; that is,

$$\mathcal{L}(y_1, \ldots, y_L) = \mathcal{L}(\nu_1, \ldots, \nu_L) = \mathcal{L}_L.$$

Let $\{\nu_1, \ldots, \nu_{k-1}\}$ be the set of orthogonal vectors satisfying $\mathcal{L}(\nu_1, \ldots, \nu_{k-1}) = \mathcal{L}(y_1, \ldots, y_{k-1})$; the next orthogonal vector, $\nu_k$, corresponding to the new observation $y_k$, is obtained by projecting $y_k$ onto $\mathcal{L}_{k-1}$; specifically,

$$\nu_k = y_k - \mathrm{Proj}\{y_k \text{ onto } \mathcal{L}_{k-1}\},$$

and, because of the orthogonality of $\{\nu_1, \ldots, \nu_{k-1}\}$, the above projection can be found by projecting $y_k$ along each of the previously found orthogonal vectors $\nu_i$, for $i \le k - 1$:

$$\mathrm{Proj}\{y_k \text{ onto } \mathcal{L}_{k-1}\} = \sum_{i=1}^{k-1} \mathrm{Proj}\{y_k \text{ along } \nu_i\} = \sum_{i=1}^{k-1} E[y_k \nu_i^T] \left( E[\nu_i \nu_i^T] \right)^{-1} \nu_i.$$

Since the projection of $y_k$ onto $\mathcal{L}_{k-1}$ is $\widehat{y}_{k/k-1}$, the one-stage least-squares linear predictor of $y_k$, we have that

$$\widehat{y}_{k/k-1} = \sum_{i=1}^{k-1} T_{k,i} \Pi_i^{-1} \nu_i, \qquad k \ge 2, \qquad (5)$$

where $T_{k,i} = E[y_k \nu_i^T]$ and $\Pi_i = E[\nu_i \nu_i^T]$ is the covariance of $\nu_i$.

Consequently, by starting with $\nu_1 = y_1 - E[y_1]$, the orthogonal vectors $\nu_k$ are determined by $\nu_k = y_k - \widehat{y}_{k/k-1}$, for $k \ge 2$. Hence, $\nu_k$ can be considered as the new information, or the innovation, in $y_k$ given $\{y_1, \ldots, y_{k-1}\}$.

In summary, the observation process $\{y_k;\ k \ge 1\}$ has been transformed into an equivalent white noise $\{\nu_k;\ k \ge 1\}$ known as the innovation process. Taking into account that both processes satisfy

$$\nu_i \in \mathcal{L}(y_1, \ldots, y_i) \quad \text{and} \quad y_i \in \mathcal{L}(\nu_1, \ldots, \nu_i), \qquad \forall i \ge 1,$$

we conclude that such processes are related to each other by a causal and causally invertible linear transformation, thus making the innovation process be uniquely determined by the observations.

This consideration allows us to state that the least-squares linear estimator of the state based on the observations, $\widehat{x}_{k/L}$, is equal to the least-squares linear estimator of the state based on the innovations $\{\nu_1, \ldots, \nu_L\}$. Thus, projecting $x_k$ separately onto each $\nu_i$, $i \le L$, the following general expression for the estimator $\widehat{x}_{k/L}$ is obtained:

$$\widehat{x}_{k/L} = \sum_{i=1}^{L} S_{k,i} \Pi_i^{-1} \nu_i, \qquad k \ge 1, \qquad (6)$$

where $S_{k,i} = E[x_k \nu_i^T]$. This expression is the starting point to derive the recursive filtering and fixed-point smoothing algorithms in the next section.
4. Least-squares linear estimation recursive algorithms

In this section, using an innovation approach, recursive algorithms are proposed for the filter, $\widehat{x}_{k/k}$, and the fixed-point smoother, $\widehat{x}_{k/L}$, for fixed $k$ and $L > k$.

4.1. Linear filtering algorithm

In view of the general expression (6) for $L = k$, it is clear that the state filter, $\widehat{x}_{k/k}$, is obtained from the one-stage state predictor, $\widehat{x}_{k/k-1}$, by

$$\widehat{x}_{k/k} = \widehat{x}_{k/k-1} + S_{k,k} \Pi_k^{-1} \nu_k, \qquad k \ge 1; \qquad \widehat{x}_{0/0} = \bar{x}_0. \qquad (7)$$

Hence, an equation for the predictor $\widehat{x}_{k/k-1}$ in terms of the filter $\widehat{x}_{k-1/k-1}$, together with expressions for the innovation $\nu_k$, its covariance matrix $\Pi_k$ and the matrix $S_{k,k}$, are required.

State predictor $\widehat{x}_{k/k-1}$. From Hypotheses 2 and 5, it is immediately clear that the filter of the noise $w_{k-1}$ is $\widehat{w}_{k-1/k-1} = E[w_{k-1}] = 0$ and hence, taking into account Equation (1), we have

$$\widehat{x}_{k/k-1} = F_{k-1} \widehat{x}_{k-1/k-1}, \qquad k \ge 1. \qquad (8)$$
Innovation $\nu_k$. We will now obtain an explicit formula for the innovation, $\nu_k = y_k - \widehat{y}_{k/k-1}$, or equivalently for the one-stage predictor of the observation, $\widehat{y}_{k/k-1}$. For this purpose, taking into account (5), we start by calculating $T_{k,i} = E[y_k \nu_i^T]$, for $i \le k - 1$.

From the observation equation (3) and Hypotheses 3 and 5, it is clear that

$$T_{k,i} = E\left[\Theta_k H_k x_k \nu_i^T\right], \qquad i \le k - 1. \qquad (9)$$

Now, for $k \le m$, or $k > m$ and $i < k - m$, Hypotheses 4 and 5 guarantee that $\Theta_k$ is independent of the innovations $\nu_i$, and then we have that $T_{k,i} = \bar{\Theta}_k H_k E[x_k \nu_i^T] = \bar{\Theta}_k H_k S_{k,i}$. So, after some manipulations, we obtain:

I. For $k \le m$, it is satisfied that $\widehat{y}_{k/k-1} = \bar{\Theta}_k H_k \sum_{i=1}^{k-1} S_{k,i} \Pi_i^{-1} \nu_i$, and using (6) for $L = k - 1$ it is obvious that

$$\widehat{y}_{k/k-1} = \bar{\Theta}_k H_k \widehat{x}_{k/k-1}, \qquad k \le m. \qquad (10)$$

II. For $k > m$, we have that $\widehat{y}_{k/k-1} = \bar{\Theta}_k H_k \sum_{i=1}^{k-(m+1)} S_{k,i} \Pi_i^{-1} \nu_i + \sum_{i=1}^{m} T_{k,k-i} \Pi_{k-i}^{-1} \nu_{k-i}$ and, adding and subtracting $\sum_{i=1}^{m} \bar{\Theta}_k H_k S_{k,k-i} \Pi_{k-i}^{-1} \nu_{k-i}$, the following equality holds:

$$\widehat{y}_{k/k-1} = \bar{\Theta}_k H_k \sum_{i=1}^{k-1} S_{k,i} \Pi_i^{-1} \nu_i + \sum_{i=1}^{m} \left( T_{k,k-i} - \bar{\Theta}_k H_k S_{k,k-i} \right) \Pi_{k-i}^{-1} \nu_{k-i}, \qquad k > m. \qquad (11)$$

Next, we determine an expression for $T_{k,k-i} - \bar{\Theta}_k H_k S_{k,k-i}$, for $1 \le i \le m$.
Taking into account (9), it follows that

$$T_{k,k-i} - \bar{\Theta}_k H_k S_{k,k-i} = E\left[(\Theta_k - \bar{\Theta}_k) H_k x_k \nu_{k-i}^T\right], \qquad 1 \le i \le m, \qquad (12)$$

or, equivalently,

$$T_{k,k-i} - \bar{\Theta}_k H_k S_{k,k-i} = E\left[(\Theta_k - \bar{\Theta}_k) H_k x_k y_{k-i}^T\right] - E\left[(\Theta_k - \bar{\Theta}_k) H_k x_k \widehat{y}_{k-i/k-(i+1)}^T\right].$$

To calculate the first expectation, we use again (3) for $y_{k-i}$ and, from Hypotheses 3 and 5, we have that

$$E\left[(\Theta_k - \bar{\Theta}_k) H_k x_k y_{k-i}^T\right] = E\left[(\Theta_k - \bar{\Theta}_k) H_k x_k x_{k-i}^T H_{k-i}^T \Theta_{k-i}\right],$$

which, using Property (4), yields

$$E\left[(\Theta_k - \bar{\Theta}_k) H_k x_k y_{k-i}^T\right] = K_{k,k-i}^{\theta} \circ \left( H_k E[x_k x_{k-i}^T] H_{k-i}^T \right).$$

Now, denoting $D_k = E[x_k x_k^T]$ and $F_{k,i} = F_{k-1} \cdots F_i$, from Equation (1) it is clear that $E[x_k x_{k-i}^T] = F_{k,k-i} D_{k-i}$, and hence

$$E\left[(\Theta_k - \bar{\Theta}_k) H_k x_k y_{k-i}^T\right] = K_{k,k-i}^{\theta} \circ \left( H_k F_{k,k-i} D_{k-i} H_{k-i}^T \right),$$

where $D_k$ can be recursively obtained by

$$D_k = F_{k-1} D_{k-1} F_{k-1}^T + Q_{k-1}, \qquad k \ge 1; \qquad D_0 = P_0 + \bar{x}_0 \bar{x}_0^T. \qquad (13)$$
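Recursion (13) propagates the second-order moment $D_k = E[x_k x_k^T]$ without using any observation data. The sketch below (Python/NumPy, with illustrative parameters of our own choosing) runs the recursion and cross-checks the result against a Monte Carlo estimate.

```python
import numpy as np

rng = np.random.default_rng(4)
n, T = 2, 50
F = np.array([[0.95, 0.10], [0.00, 0.90]])
Q = 0.1 * np.eye(n)
x0_mean = np.array([1.0, -1.0])
P0 = np.eye(n)

# Eq. (13): D_k = F_{k-1} D_{k-1} F_{k-1}^T + Q_{k-1},  D_0 = P_0 + x0 x0^T
D = P0 + np.outer(x0_mean, x0_mean)
for _ in range(T):
    D = F @ D @ F.T + Q

# Monte Carlo estimate of E[x_T x_T^T] from N simulated paths of Eq. (1)
N = 200_000
x = rng.multivariate_normal(x0_mean, P0, N)
for _ in range(T):
    x = x @ F.T + rng.multivariate_normal(np.zeros(n), Q, N)
mc = x.T @ x / N

print(np.max(np.abs(mc - D)))      # small Monte Carlo error
```

The maximum entrywise deviation between the recursion and the sample average is of order $N^{-1/2}$, as expected.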
Summarizing, we have that

T_{k,k-i} - \bar{\Theta}_k H_k S_{k,k-i} = K^{\theta}_{k,k-i} \circ (H_k F_{k,k-i} D_{k-i} H_{k-i}^T) - E[(\Theta_k - \bar{\Theta}_k) H_k x_k \hat{y}_{k-i/k-(i+1)}^T],  1 \le i \le m.  (14)

Taking into account the correlation hypothesis of the variables describing the uncertainty, the right-hand side of this equation is calculated differently for i = m and for i < m, as shown below.
(a) For i = m, since \Theta_k is independent of the innovations \nu_i for i < k-m, we have that E[(\Theta_k - \bar{\Theta}_k) H_k x_k \hat{y}_{k-m/k-(m+1)}^T] = 0, and from (14)

T_{k,k-m} - \bar{\Theta}_k H_k S_{k,k-m} = K^{\theta}_{k,k-m} \circ (H_k F_{k,k-m} D_{k-m} H_{k-m}^T).  (15)

(b) For i < m, from Hypothesis 4, K^{\theta}_{k,k-i} = 0 and, hence, from (14)

T_{k,k-i} - \bar{\Theta}_k H_k S_{k,k-i} = -E[(\Theta_k - \bar{\Theta}_k) H_k x_k \hat{y}_{k-i/k-(i+1)}^T].
Now, from expression (5),

\hat{y}_{k-i/k-(i+1)} = \sum_{j=1}^{k-(i+1)} T_{k-i,j} \Pi_j^{-1} \nu_j,

and using again that \Theta_k is independent of \nu_j for j \ne k-m, it is deduced that

T_{k,k-i} - \bar{\Theta}_k H_k S_{k,k-i} = -E[(\Theta_k - \bar{\Theta}_k) H_k x_k \nu_{k-m}^T] \Pi_{k-m}^{-1} T_{k-i,k-m}^T

or, equivalently, from (12) for i = m, (15) and denoting

\Psi_{k,k-m} = K^{\theta}_{k,k-m} \circ (H_k F_{k,k-m} D_{k-m} H_{k-m}^T) \Pi_{k-m}^{-1},

we have that

T_{k,k-i} - \bar{\Theta}_k H_k S_{k,k-i} = -\Psi_{k,k-m} T_{k-i,k-m}^T,  i < m.  (16)
Next, substituting (15) and (16) into (11) and using (6) for \hat{x}_{k/k-1}, it is concluded that

\hat{y}_{k/k-1} = \bar{\Theta}_k H_k \hat{x}_{k/k-1} + \Psi_{k,k-m} \Big( \nu_{k-m} - \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} \nu_{k-i} \Big),  k > m.  (17)

Finally, using (3) and (16) and taking into account that, from (1), S_{k,k-i} = F_{k,k-i} S_{k-i,k-i}, the matrices T_{k,k-i} in (17) are obtained by

T_{k,k-i} = \bar{\Theta}_k H_k F_{k,k-i} S_{k-i,k-i},  2 \le k \le m,  1 \le i \le k-1,
T_{k,k-i} = \bar{\Theta}_k H_k F_{k,k-i} S_{k-i,k-i} - \Psi_{k,k-m} T_{k-i,k-m}^T,  k > m,  1 \le i \le m-1.
Matrix S_{k,k}. Since \nu_k = y_k - \hat{y}_{k/k-1}, we have that S_{k,k} = E[x_k \nu_k^T] = E[x_k y_k^T] - E[x_k \hat{y}_{k/k-1}^T]. Next, we calculate these expectations.

I. From Equation (3) and the independence hypothesis, it is clear that E[x_k y_k^T] = D_k H_k^T \bar{\Theta}_k, \forall k \ge 1, where D_k = E[x_k x_k^T] is given by (13).

II. To calculate E[x_k \hat{y}_{k/k-1}^T], the correlation hypothesis of the random variables \theta_k must be taken into account and two cases must be considered:

(a) For k \le m, from (10) we obtain

E[x_k \hat{y}_{k/k-1}^T] = E[x_k \hat{x}_{k/k-1}^T] H_k^T \bar{\Theta}_k.

By using the orthogonal projection lemma, which assures that E[x_k \hat{x}_{k/k-1}^T] = D_k - P_{k/k-1}, where P_{k/k-1} = E[(x_k - \hat{x}_{k/k-1})(x_k - \hat{x}_{k/k-1})^T] is the prediction error covariance matrix, we get

E[x_k \hat{y}_{k/k-1}^T] = (D_k - P_{k/k-1}) H_k^T \bar{\Theta}_k,  k \le m.
(b) For k > m, from (17) it follows that

E[x_k \hat{y}_{k/k-1}^T] = E[x_k \hat{x}_{k/k-1}^T] H_k^T \bar{\Theta}_k + E[x_k \nu_{k-m}^T] \Psi_{k,k-m}^T - E\Big[ x_k \Big( \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} \nu_{k-i} \Big)^T \Big] \Psi_{k,k-m}^T;

hence, using again the orthogonal projection lemma and taking into account that S_{k,k-i} = E[x_k \nu_{k-i}^T], for 1 \le i \le m, it follows that

E[x_k \hat{y}_{k/k-1}^T] = (D_k - P_{k/k-1}) H_k^T \bar{\Theta}_k + S_{k,k-m} \Psi_{k,k-m}^T - \sum_{i=1}^{m-1} S_{k,k-i} \Pi_{k-i}^{-1} T_{k-i,k-m} \Psi_{k,k-m}^T,  k > m.

Then, substituting these expectations in the expression of S_{k,k} and simplifying, it is clear that

S_{k,k} = P_{k/k-1} H_k^T \bar{\Theta}_k,  1 \le k \le m,

S_{k,k} = P_{k/k-1} H_k^T \bar{\Theta}_k - \Big( S_{k,k-m} - \sum_{i=1}^{m-1} S_{k,k-i} \Pi_{k-i}^{-1} T_{k-i,k-m} \Big) \Psi_{k,k-m}^T,  k > m.  (18)
Now, an expression for the prediction error covariance matrix, P_{k/k-1}, is necessary. From Equation (1), it is immediately clear that

P_{k/k-1} = F_{k-1} P_{k-1/k-1} F_{k-1}^T + Q_{k-1},  k \ge 1,

where P_{k/k} = E[(x_k - \hat{x}_{k/k})(x_k - \hat{x}_{k/k})^T] is the filtering error covariance matrix. From Equation (7), it is concluded that

P_{k/k} = P_{k/k-1} - S_{k,k} \Pi_k^{-1} S_{k,k}^T,  k \ge 1;  P_{0/0} = P_0.
Covariance matrix of the innovation, \Pi_k = E[\nu_k \nu_k^T]. From the orthogonal projection lemma, the covariance matrix of the innovation is obtained as \Pi_k = E[y_k y_k^T] - E[\hat{y}_{k/k-1} \hat{y}_{k/k-1}^T]. From (3) and using Property (4), we have that

E[y_k y_k^T] = E[\theta_k \theta_k^T] \circ (H_k D_k H_k^T) + R_k,  k \ge 1.

To obtain E[\hat{y}_{k/k-1} \hat{y}_{k/k-1}^T], two cases must be distinguished again, due to the correlation hypothesis of the Bernoulli variables \theta_k:

I. For k \le m, Equation (10) and Property (4) yield

E[\hat{y}_{k/k-1} \hat{y}_{k/k-1}^T] = (\bar{\theta}_k \bar{\theta}_k^T) \circ (H_k E[\hat{x}_{k/k-1} \hat{x}_{k/k-1}^T] H_k^T),

and, in view of the orthogonal projection lemma,

E[\hat{y}_{k/k-1} \hat{y}_{k/k-1}^T] = (\bar{\theta}_k \bar{\theta}_k^T) \circ (H_k (D_k - P_{k/k-1}) H_k^T),  k \le m.
II. For k > m, an analogous reasoning, but using now Equation (17), yields

E[\hat{y}_{k/k-1} \hat{y}_{k/k-1}^T] = (\bar{\theta}_k \bar{\theta}_k^T) \circ (H_k (D_k - P_{k/k-1}) H_k^T) + \Psi_{k,k-m} \Pi_{k-m} \Psi_{k,k-m}^T
  + \Psi_{k,k-m} \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} \Pi_{k-i} \Pi_{k-i}^{-1} T_{k-i,k-m} \Psi_{k,k-m}^T
  + \bar{\Theta}_k H_k E[\hat{x}_{k/k-1} \nu_{k-m}^T] \Psi_{k,k-m}^T + \Psi_{k,k-m} E[\nu_{k-m} \hat{x}_{k/k-1}^T] H_k^T \bar{\Theta}_k
  - \bar{\Theta}_k H_k E\Big[ \hat{x}_{k/k-1} \sum_{i=1}^{m-1} \nu_{k-i}^T \Pi_{k-i}^{-1} T_{k-i,k-m} \Big] \Psi_{k,k-m}^T
  - \Psi_{k,k-m} \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} E[\nu_{k-i} \hat{x}_{k/k-1}^T] H_k^T \bar{\Theta}_k.

Next, again from the orthogonal projection lemma, E[\hat{x}_{k/k-1} \nu_{k-i}^T] = E[x_k \nu_{k-i}^T] = S_{k,k-i}, for 1 \le i \le m, and therefore
E[\hat{y}_{k/k-1} \hat{y}_{k/k-1}^T] = (\bar{\theta}_k \bar{\theta}_k^T) \circ (H_k (D_k - P_{k/k-1}) H_k^T) + \Psi_{k,k-m} \Pi_{k-m} \Psi_{k,k-m}^T
  + \Psi_{k,k-m} \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} T_{k-i,k-m} \Psi_{k,k-m}^T
  + \bar{\Theta}_k H_k \Big( S_{k,k-m} \Psi_{k,k-m}^T - \sum_{i=1}^{m-1} S_{k,k-i} \Pi_{k-i}^{-1} T_{k-i,k-m} \Psi_{k,k-m}^T \Big)
  + \Big( \Psi_{k,k-m} S_{k,k-m}^T - \Psi_{k,k-m} \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} S_{k,k-i}^T \Big) H_k^T \bar{\Theta}_k.
Finally, from Equation (18), we have

S_{k,k-m} \Psi_{k,k-m}^T - \sum_{i=1}^{m-1} S_{k,k-i} \Pi_{k-i}^{-1} T_{k-i,k-m} \Psi_{k,k-m}^T = -(S_{k,k} - P_{k/k-1} H_k^T \bar{\Theta}_k),

and hence

E[\hat{y}_{k/k-1} \hat{y}_{k/k-1}^T] = (\bar{\theta}_k \bar{\theta}_k^T) \circ (H_k (D_k - P_{k/k-1}) H_k^T) + \Psi_{k,k-m} \Pi_{k-m} \Psi_{k,k-m}^T
  + \Psi_{k,k-m} \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} T_{k-i,k-m} \Psi_{k,k-m}^T
  - \bar{\Theta}_k H_k (S_{k,k} - P_{k/k-1} H_k^T \bar{\Theta}_k) - (S_{k,k}^T - \bar{\Theta}_k H_k P_{k/k-1}) H_k^T \bar{\Theta}_k,  k > m.
Finally, since K^{\theta}_{k,k} = E[\theta_k \theta_k^T] - \bar{\theta}_k \bar{\theta}_k^T, the above expectations lead to the following expressions for the innovation covariance matrices:

\Pi_k = K^{\theta}_{k,k} \circ (H_k D_k H_k^T) + R_k + \bar{\Theta}_k H_k S_{k,k},  k \le m,
\Pi_k = K^{\theta}_{k,k} \circ (H_k D_k H_k^T) + R_k - \Psi_{k,k-m} \Big( \Pi_{k-m} + \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} T_{k-i,k-m} \Big) \Psi_{k,k-m}^T
  + \bar{\Theta}_k H_k S_{k,k} + S_{k,k}^T H_k^T \bar{\Theta}_k - \bar{\Theta}_k H_k P_{k/k-1} H_k^T \bar{\Theta}_k,  k > m.
All these results are summarized in the following theorem.

Theorem 1. The linear filter, \hat{x}_{k/k}, of the state x_k is obtained as

\hat{x}_{k/k} = \hat{x}_{k/k-1} + S_{k,k} \Pi_k^{-1} \nu_k,  k \ge 1;  \hat{x}_{0/0} = \bar{x}_0,

where the state predictor, \hat{x}_{k/k-1}, is given by

\hat{x}_{k/k-1} = F_{k-1} \hat{x}_{k-1/k-1},  k \ge 1.

The innovation process satisfies

\nu_k = y_k - \bar{\Theta}_k H_k \hat{x}_{k/k-1},  k \le m,

\nu_k = y_k - \bar{\Theta}_k H_k \hat{x}_{k/k-1} - \Psi_{k,k-m} \Big( \nu_{k-m} - \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} \nu_{k-i} \Big),  k > m,

where \bar{\Theta}_k = E[\Theta_k] and \Psi_{k,k-m} = K^{\theta}_{k,k-m} \circ (H_k F_{k,k-m} D_{k-m} H_{k-m}^T) \Pi_{k-m}^{-1}, with \circ the Hadamard product, F_{k,i} = F_{k-1} \cdots F_i, and D_k = E[x_k x_k^T] recursively obtained by

D_k = F_{k-1} D_{k-1} F_{k-1}^T + Q_{k-1},  k \ge 1;  D_0 = P_0 + \bar{x}_0 \bar{x}_0^T.

The matrices T_{k,k-i} are given by

T_{k,k-i} = \bar{\Theta}_k H_k F_{k,k-i} S_{k-i,k-i},  2 \le k \le m,  1 \le i \le k-1,
T_{k,k-i} = \bar{\Theta}_k H_k F_{k,k-i} S_{k-i,k-i} - \Psi_{k,k-m} T_{k-i,k-m}^T,  k > m,  1 \le i \le m-1.
The covariance matrix of the innovation, \Pi_k = E[\nu_k \nu_k^T], satisfies

\Pi_k = K^{\theta}_{k,k} \circ (H_k D_k H_k^T) + R_k + \bar{\Theta}_k H_k S_{k,k},  k \le m,

\Pi_k = K^{\theta}_{k,k} \circ (H_k D_k H_k^T) + R_k - \Psi_{k,k-m} \Big( \Pi_{k-m} + \sum_{i=1}^{m-1} T_{k-i,k-m}^T \Pi_{k-i}^{-1} T_{k-i,k-m} \Big) \Psi_{k,k-m}^T
  + \bar{\Theta}_k H_k S_{k,k} + S_{k,k}^T H_k^T \bar{\Theta}_k - \bar{\Theta}_k H_k P_{k/k-1} H_k^T \bar{\Theta}_k,  k > m.

The matrix S_{k,k} is determined by the following expressions:

S_{k,k} = P_{k/k-1} H_k^T \bar{\Theta}_k,  k \le m,

S_{k,k} = P_{k/k-1} H_k^T \bar{\Theta}_k - \Big( S_{k,k-m} - \sum_{i=1}^{m-1} S_{k,k-i} \Pi_{k-i}^{-1} T_{k-i,k-m} \Big) \Psi_{k,k-m}^T,  k > m,

where P_{k/k-1}, the prediction error covariance matrix, is obtained by

P_{k/k-1} = F_{k-1} P_{k-1/k-1} F_{k-1}^T + Q_{k-1},  k \ge 1,

with P_{k/k}, the filtering error covariance matrix, satisfying

P_{k/k} = P_{k/k-1} - S_{k,k} \Pi_k^{-1} S_{k,k}^T,  k \ge 1;  P_{0/0} = P_0.  (19)
4.2. Linear fixed-point smoothing algorithm

The following theorem provides a recursive fixed-point smoothing algorithm to obtain the least-squares linear estimator, \hat{x}_{k/k+N}, of the state x_k based on the observations {y_1, ..., y_{k+N}}, for k \ge 1 fixed and N \ge 1. Moreover, to measure the estimation accuracy, a recursive formula for the error covariance matrices, P_{k/k+N} = E[(x_k - \hat{x}_{k/k+N})(x_k - \hat{x}_{k/k+N})^T], is derived.
Theorem 2. For each fixed k \ge 1, the fixed-point smoothers, \hat{x}_{k/k+N}, N \ge 1, are calculated by

\hat{x}_{k/k+N} = \hat{x}_{k/k+N-1} + S_{k,k+N} \Pi_{k+N}^{-1} \nu_{k+N},  N \ge 1,  (20)

whose initial condition is the filter, \hat{x}_{k/k}, given in (7).

The matrices S_{k,k+N} are calculated from

S_{k,k+N} = (D_k F_{k+N,k}^T - M_{k,k+N-1} F_{k+N-1}^T) H_{k+N}^T \bar{\Theta}_{k+N},  k \le m-N,  N \ge 1,

S_{k,k+N} = (D_k F_{k+N,k}^T - M_{k,k+N-1} F_{k+N-1}^T) H_{k+N}^T \bar{\Theta}_{k+N}
  - \Big( S_{k,k+N-m} - \sum_{i=1}^{m-1} S_{k,k+N-i} \Pi_{k+N-i}^{-1} T_{k+N-i,k+N-m} \Big) \Psi_{k+N,k+N-m}^T,  k > m-N,  N \ge 1,  (21)

where the matrices M_{k,k+N} satisfy the following recursive formula:

M_{k,k+N} = M_{k,k+N-1} F_{k+N-1}^T + S_{k,k+N} \Pi_{k+N}^{-1} S_{k+N,k+N}^T,  N \ge 1;  M_{k,k} = D_k - P_{k/k}.  (22)

The innovations \nu_{k+N}, their covariance matrices \Pi_{k+N}, and the matrices T_{k+N,k+N-i}, \Psi_{k+N,k+N-m}, D_k and P_{k/k} are given in Theorem 1.

Finally, the fixed-point smoothing error covariance matrix, P_{k/k+N}, verifies

P_{k/k+N} = P_{k/k+N-1} - S_{k,k+N} \Pi_{k+N}^{-1} S_{k,k+N}^T,  N \ge 1,  (23)

with initial condition the filtering error covariance matrix, P_{k/k}, given by (19).
Proof. From the general expression (6), for each fixed k ≥ 1, the recursive relation (20) is
immediately clear.
Now, we need to prove (21) for S_{k,k+N} = E[x_k \nu_{k+N}^T] = E[x_k y_{k+N}^T] - E[x_k \hat{y}_{k+N/k+N-1}^T]; thus both expectations must be calculated.

I. From Equation (3), taking into account that E[x_k x_{k+N}^T] = D_k F_{k+N,k}^T and using that \Theta_{k+N} and v_{k+N} are independent of x_k, we obtain

E[x_k y_{k+N}^T] = D_k F_{k+N,k}^T H_{k+N}^T \bar{\Theta}_{k+N},  N \ge 1.

II. Based on expressions (10) and (17) for \hat{y}_{k+N/k+N-1}, which differ depending on whether k + N \le m or k + N > m, two cases must be considered:

(a) For k \le m - N, using (10) for \hat{y}_{k+N/k+N-1} together with (8) for \hat{x}_{k+N/k+N-1}, we have that

E[x_k \hat{y}_{k+N/k+N-1}^T] = M_{k,k+N-1} F_{k+N-1}^T H_{k+N}^T \bar{\Theta}_{k+N},

where M_{k,k+N-1} = E[x_k \hat{x}_{k+N-1/k+N-1}^T].
(b) For k > m - N, by following a reasoning similar to the previous one, but starting from (17), we get

E[x_k \hat{y}_{k+N/k+N-1}^T] = M_{k,k+N-1} F_{k+N-1}^T H_{k+N}^T \bar{\Theta}_{k+N}
  + \Big( S_{k,k+N-m} - \sum_{i=1}^{m-1} S_{k,k+N-i} \Pi_{k+N-i}^{-1} T_{k+N-i,k+N-m} \Big) \Psi_{k+N,k+N-m}^T.
Then, the replacement of the above expectations in S_{k,k+N} leads to expression (21).

The recursive relation (22) for M_{k,k+N} = E[x_k \hat{x}_{k+N/k+N}^T] is immediately clear from (7) for \hat{x}_{k+N/k+N}, and its initial condition M_{k,k} = E[x_k \hat{x}_{k/k}^T] is calculated taking into account that, from the orthogonality, E[x_k \hat{x}_{k/k}^T] = E[\hat{x}_{k/k} \hat{x}_{k/k}^T] = D_k - P_{k/k}.

Finally, since P_{k/k+N} = E[x_k x_k^T] - E[\hat{x}_{k/k+N} \hat{x}_{k/k+N}^T], using (20) and taking into account that \hat{x}_{k/k+N-1} is uncorrelated with \nu_{k+N}, we have

P_{k/k+N} = E[x_k x_k^T] - E[\hat{x}_{k/k+N-1} \hat{x}_{k/k+N-1}^T] - S_{k,k+N} \Pi_{k+N}^{-1} S_{k,k+N}^T,  N \ge 1,

and, consequently, expression (23) holds. □
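As a quick consistency check of Theorem 2, consider the degenerate case \theta_k \equiv 1 (no uncertainty), for which K^{\theta} = 0, \Psi = 0 and every instant behaves as in the case k \le m; recursions (20)-(23) then reduce to the classical Kalman fixed-point smoother. The Python sketch below (scalar, time-invariant model; all names and numerical values are our assumptions for illustration) runs a standard Kalman filter and then smooths a fixed point k0:

```python
import numpy as np

# Model and experiment parameters (illustrative choices)
f, h, q, r = 0.9, 1.0, 0.2, 0.5
p0, n, k0, n_smooth = 1.0, 60, 30, 10

rng = np.random.default_rng(3)
x = np.zeros(n + 1); y = np.zeros(n + 1)
for k in range(1, n + 1):
    x[k] = f * x[k - 1] + rng.normal(0.0, np.sqrt(q))
    y[k] = h * x[k] + rng.normal(0.0, np.sqrt(r))

# Standard Kalman filter, storing the quantities Theorem 2 consumes
xf, Pf, D = np.zeros(n + 1), np.zeros(n + 1), np.zeros(n + 1)
nu, Pi, Sd = np.zeros(n + 1), np.zeros(n + 1), np.zeros(n + 1)
Pf[0], D[0] = p0, p0
for k in range(1, n + 1):
    xp, Pp = f * xf[k - 1], f * Pf[k - 1] * f + q
    D[k] = f * D[k - 1] * f + q
    Sd[k] = Pp * h                        # S_{k,k} = P_{k/k-1} H^T (theta_bar = 1)
    Pi[k] = h * Pp * h + r
    nu[k] = y[k] - h * xp
    xf[k] = xp + Sd[k] * nu[k] / Pi[k]
    Pf[k] = Pp - Sd[k] ** 2 / Pi[k]

# Fixed-point smoother at time k0: recursions (20)-(23) with (21), first case
xs, Ps = xf[k0], Pf[k0]
M = D[k0] - Pf[k0]                        # M_{k,k} = D_k - P_{k/k}, (22)
for N in range(1, n_smooth + 1):
    j = k0 + N
    S = (D[k0] * f ** N - M * f) * h      # S_{k,k+N}, (21)
    xs += S * nu[j] / Pi[j]               # (20)
    Ps -= S ** 2 / Pi[j]                  # (23)
    M = M * f + S * Sd[j] / Pi[j]         # (22)
print(Ps, Pf[k0])
```

Each pass through the loop uses one later innovation, so Ps decreases monotonically from the filtering error variance Pf[k0], mirroring the behaviour discussed in the numerical example of Section 5.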
5. Numerical simulation example
In this section, we present a numerical example to show the performance of the recursive algorithms proposed in this chapter. To illustrate the effectiveness of the proposed estimators, we ran a MATLAB program which, at each iteration, simulates the state and the observed values and provides the filtering and fixed-point smoothing estimates, as well as the corresponding error covariance matrices, which give a measure of the estimators' accuracy.
Consider a two-dimensional state process, {x_k; k \ge 0}, generated by the following first-order autoregressive model:

x_k = \Big( 1 + 0.2 \sin\frac{(k-1)\pi}{50} \Big) \begin{pmatrix} 0.8 & 0 \\ 0.9 & 0.2 \end{pmatrix} x_{k-1} + w_{k-1},  k \ge 1,

with the following hypotheses:

• The initial state, x_0, is a zero-mean Gaussian vector with covariance matrix P_0 = \begin{pmatrix} 0.1 & 0 \\ 0 & 0.1 \end{pmatrix}.

• The process {w_k; k \ge 0} is a zero-mean white Gaussian noise with covariance matrices Q_k = \begin{pmatrix} 0.36 & 0.3 \\ 0.3 & 0.25 \end{pmatrix}, \forall k \ge 0.
Suppose that the scalar observations come from two sensors according to the following observation equations:

y^i_k = \theta^i_k H^i_k x_k + v^i_k,  k \ge 1,  i = 1, 2,

where {v^i_k; k \ge 1}, i = 1, 2, are zero-mean independent white Gaussian processes with variances R^1_k = 0.5 and R^2_k = 0.9, \forall k \ge 1, respectively.
According to our theoretical model, it is assumed that, for each sensor, the uncertainty at time k depends only on the uncertainty at time k - m. The variables \theta^i_k, i = 1, 2, modeling this type of uncertainty correlation in the observation process are built from two independent sequences of independent Bernoulli random variables, {\gamma^i_k; k \ge 1}, i = 1, 2, with constant probabilities P[\gamma^i_k = 1] = \gamma^i. Specifically, the variables \theta^i_k are defined as follows:

\theta^i_k = 1 - \gamma^i_{k+m}(1 - \gamma^i_k),  i = 1, 2.

So, if \theta^i_k = 0, then \gamma^i_{k+m} = 1 and \gamma^i_k = 0, and hence \theta^i_{k+m} = 1; this fact guarantees that, if the state is absent at time k, the observation at time k + m necessarily contains the state. Therefore, there cannot be more than m consecutive observations consisting of noise only.
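This mechanism is easy to check by simulation. The following Python sketch (a single sensor is assumed, and the variable names, seed, sample size, \gamma = 0.4 and m = 3 are our choices for illustration) generates \theta_k = 1 - \gamma_{k+m}(1 - \gamma_k) and verifies that no more than m consecutive zeros occur:

```python
import numpy as np

rng = np.random.default_rng(0)
m, gamma, n = 3, 0.4, 20000
# Driving Bernoulli sequence gamma_k, indices 1..n+m
g = rng.random(n + m + 1) < gamma
theta = np.array([1 - g[k + m] * (1 - g[k]) for k in range(1, n + 1)])

# Longest run of zeros (observations containing noise only)
max_run, run = 0, 0
for t in theta:
    run = run + 1 if t == 0 else 0
    max_run = max(max_run, run)

print(max_run <= m)      # True: at most m consecutive state-free observations
print(theta.mean())      # close to 1 - gamma*(1 - gamma) = 0.76
```

The run-length bound holds by construction, since a zero at time k forces \theta_{k+m} = 1.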
Moreover, since the variables \gamma^i_k and \gamma^i_s are independent for k \ne s, the variables \theta^i_k and \theta^i_s are also independent for |k - s| \ne 0, m. The common mean of these variables is \bar{\theta}^i = 1 - \gamma^i(1 - \gamma^i) and their covariance function is given by

K^{\theta}_{k,s} = E[(\theta^i_k - \bar{\theta}^i)(\theta^i_s - \bar{\theta}^i)] =
  0,  if |k - s| \ne 0, m,
  -(1 - \bar{\theta}^i)^2,  if |k - s| = m,
  \bar{\theta}^i (1 - \bar{\theta}^i),  if |k - s| = 0.
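These moments can be confirmed by Monte Carlo. In the Python sketch below (sample size, seed and \gamma = 0.3 are our choices), the empirical mean and the covariances at lags 0, 1 and m match the formulas above:

```python
import numpy as np

rng = np.random.default_rng(1)
m, gamma, n = 3, 0.3, 200000
g = rng.random(n + m + 1) < gamma
# theta_k = 1 - gamma_{k+m} * (1 - gamma_k), k = 1..n
theta = 1 - g[1 + m:n + m + 1].astype(float) * (1 - g[1:n + 1].astype(float))

tb = 1 - gamma * (1 - gamma)                      # common mean
c = theta - tb
print(theta.mean())                               # ~ tb = 0.79
print(np.mean(c * c))                             # ~ tb*(1 - tb)     (lag 0)
print(np.mean(c[:-1] * c[1:]))                    # ~ 0               (lag 1)
print(np.mean(c[:-m] * c[m:]))                    # ~ -(1 - tb)**2    (lag m)
```

The negative lag-m covariance is the feature the filter exploits through the gain matrix \Psi_{k,k-m}.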
To illustrate the effectiveness of the respective estimators, two hundred iterations of the proposed algorithms have been performed, and the results obtained for different values of the uncertainty probability and several values of m have been analyzed.
Let us observe that the mean of the variables \theta^i_k, i = 1, 2, is the same if 1 - \gamma^i is used instead of \gamma^i; for this reason, only the case \gamma^i \le 0.5 will be considered here. Note that, in such a case, the false alarm probability at the i-th sensor, 1 - \bar{\theta}^i, is an increasing function of \gamma^i.
Firstly, the values of the first component of a simulated state, together with the filtering and fixed-point smoothing estimates for N = 2, obtained from simulated observations of the state for m = 3 and \gamma^1 = \gamma^2 = 0.5, are displayed in Fig. 1. This graph shows that the fixed-point smoothing estimates follow the state evolution better than the filtering ones.
Figure 1. First component of the simulated state, and filtering and fixed-point smoothing estimates for N = 2, when m = 3 and \gamma^1 = \gamma^2 = 0.5.
Next, assuming again that the Bernoulli variables of the observations are correlated at sampling times that differ by three units of time (m = 3), we compare the effectiveness of the proposed filtering and fixed-point smoothing estimators for different values of the probabilities \gamma^1 and \gamma^2, which provide different values of the false alarm probabilities 1 - \bar{\theta}^i, i = 1, 2; specifically, \gamma^1 = 0.2, \gamma^2 = 0.4 and \gamma^1 = 0.1, \gamma^2 = 0.3. For these values, Fig. 2 shows the filtering and fixed-point smoothing error variances, for N = 2 and N = 5, for the first state component. From this figure it is observed that:

i) As both \gamma^1 and \gamma^2 decrease (which means that the false alarm probability decreases), the error variances are smaller and, consequently, better estimations are obtained.

ii) The error variances corresponding to the fixed-point smoothers are smaller than those of the filters and, consequently, in agreement with the comments on the previous figure, the fixed-point smoothing estimates are more accurate.

iii) The accuracy of the smoothers at each fixed point k improves as the number of available observations increases.
Figure 2. Filtering and smoothing error variances for the first state component for \gamma^1 = 0.2, \gamma^2 = 0.4 and \gamma^1 = 0.1, \gamma^2 = 0.3, when m = 3.
On the other hand, in order to show more precisely the dependence of the error variances on the values \gamma^1 and \gamma^2, Fig. 3 displays the filtering and fixed-point smoothing error variances of the first state component, at a fixed iteration (namely, k = 200) for m = 3, when both \gamma^1 and \gamma^2 are varied from 0.1 to 0.5, which provides different values of the probabilities \bar{\theta}^1 and \bar{\theta}^2. More specifically, we have considered the values \gamma^i = 0.1, 0.2, 0.3, 0.4, 0.5, which lead to the false alarm probabilities 1 - \bar{\theta}^i = 0.09, 0.16, 0.21, 0.24, 0.25, respectively.
In this figure, both graphs (corresponding to the filtering and fixed-point smoothing error variances) corroborate the previous results, showing again that, as the false alarm probability increases, the filtering and fixed-point smoothing error variances (N = 2) become greater and, consequently, worse estimations are obtained. It is also concluded that the smoothing error variances are smaller than the filtering ones.

Analogous results to those of Figs. 1-3 are obtained for the second component of the state. As an example, Fig. 4 shows the filtering and fixed-point smoothing error variances of the second state component, at k = 200, versus \gamma^1 for constant values of \gamma^2, when m = 3; comments similar to those made on Fig. 3 can be deduced.

Finally, for \gamma^1 = 0.2, \gamma^2 = 0.4, the performance of the estimators is compared for different values of m; specifically, for m = 1, 3, 6, the filtering error variances of the first state component are displayed in Fig. 5. From this figure it is gathered that the estimators are more accurate for lower values of m. In other words, a greater distance between the instants at which the variables are correlated (which means that more consecutive observations may not contain state information) yields worse estimations.
Figure 3. Filtering error variances and smoothing error variances for N = 2 of the first state component at k = 200 versus \gamma^1, with \gamma^2 varying from 0.1 to 0.5, when m = 3.
Figure 4. Filtering error variances and smoothing error variances for N = 2 of the second state component at k = 200 versus \gamma^1, with \gamma^2 varying from 0.1 to 0.5, when m = 3.
Figure 5. Filtering error variances for \gamma^1 = 0.2, \gamma^2 = 0.4 and m = 1, 3, 6.
6. Conclusions and future research
In this chapter, the least-squares linear filtering and fixed-point smoothing problems have
been addressed for linear discrete-time stochastic systems with uncertain observations coming
from multiple sensors. The uncertainty in the observations is modeled by a binary variable
taking the values one or zero (Bernoulli variable), depending on whether the signal is present
or absent in the corresponding observation, and it has been supposed that the uncertainty
at any sampling time k depends only on the uncertainty at the previous time k − m. This
situation covers, in particular, those signal transmission models in which any failure in the
transmission is detected and the old sensor is replaced after m instants of time, thus avoiding
the possibility of missing signal in m + 1 consecutive observations.
By applying an innovation technique, recursive algorithms for the linear filtering and
fixed-point smoothing estimators have been obtained. This technique consists of obtaining
the estimators as a linear combination of the innovations, simplifying the derivation of these
estimators, due to the fact that the innovations constitute a white process.
Finally, the feasibility of the theoretical results has been illustrated by the estimation of a
two-dimensional signal from uncertain observations coming from two sensors, for different
uncertainty probabilities and different values of m. The results obtained confirm the greater
effectiveness of the fixed-point smoothing estimators in contrast to the filtering ones and
conclude that more accurate estimations are obtained as the values of m are lower.
In recent years, several problems of signal processing, such as signal prediction, detection and control, as well as image restoration problems, have been treated using quadratic estimators and, more generally, polynomial estimators of arbitrary degree. Hence, the current chapter can be extended by considering least-squares polynomial estimation problems of arbitrary degree for such linear systems with uncertain observations correlated at instants that differ by m units of time. On the other hand, in practical engineering, some recent progress on filtering and control problems for nonlinear stochastic systems with uncertain observations is being achieved. Nonlinearity and stochasticity are two important sources of difficulty that are receiving special attention in research; therefore, filtering and smoothing problems for nonlinear systems with uncertain observations would be relevant topics for further investigation.
Acknowledgements

This research is supported by Ministerio de Educación y Ciencia (Programa FPU and grant No. MTM2011-24718) and Junta de Andalucía (grant No. P07-FQM-02701).
Author details
J. Linares-Pérez
Dpto. de Estadística e I.O. Universidad de Granada. Avda. Fuentenueva. 18071. Granada, Spain
R. Caballero-Águila
Dpto. de Estadística e I.O. Universidad de Jaén. Paraje Las Lagunillas. 23071. Jaén, Spain
I. García-Garrido
Dpto. de Estadística e I.O. Universidad de Granada. Avda. Fuentenueva. 18071. Granada, Spain
7. References

[1] Caballero-Águila, R., Hermoso-Carazo, A. & Linares-Pérez, J. [2011]. Linear and quadratic estimation using uncertain observations from multiple sensors with correlated uncertainty, Signal Processing 91(No. 2): 330-337.
[2] García-Garrido, I., Linares-Pérez, J., Caballero-Águila, R. & Hermoso-Carazo, A. [2012]. A solution to the filtering problem for stochastic systems with multi-sensor uncertain observations, International Mathematical Forum 7(No. 18): 887-903.
[3] Hermoso-Carazo, A. & Linares-Pérez, J. [1994]. Linear estimation for discrete-time systems in the presence of time-correlated disturbances and uncertain observations, IEEE Transactions on Automatic Control 39(No. 8): 1636-1638.
[4] Hermoso-Carazo, A. & Linares-Pérez, J. [1995]. Linear smoothing for discrete-time systems in the presence of correlated disturbances and uncertain observations, IEEE Transactions on Automatic Control 40(No. 8): 243-251.
[5] Hermoso-Carazo, A., Linares-Pérez, J., Jiménez-López, J., Caballero-Águila, R. & Nakamori, S. [2008]. Recursive fixed-point smoothing algorithm from covariances based on uncertain observations with correlation in the uncertainty, Applied Mathematics and Computation 203(No. 1): 243-251.
[6] Hespanha, J., Naghshtabrizi, P. & Xu, Y. [2007]. A survey of recent results in networked control systems, Proceedings of the IEEE 95(No. 1): 138-172.
[7] Hounkpevi, F. & Yaz, E. [2007]. Robust minimum variance linear state estimators for multiple sensors with different failure rates, Automatica 43(No. 7): 1274-1280.
[8] Huang, R. & Záruba, G. [2007]. Incorporating data from multiple sensors for localizing nodes in mobile ad hoc networks, IEEE Transactions on Mobile Computing 6(No. 9): 1090-1104.
[9] Jackson, R. & Murthy, D. [1976]. Optimal linear estimation with uncertain observations, IEEE Transactions on Information Theory 22(No. 3): 376-378.
[10] Jiménez-López, J., Linares-Pérez, J., Nakamori, S., Caballero-Águila, R. & Hermoso-Carazo, A. [2008]. Signal estimation based on covariance information from observations featuring correlated uncertainty and coming from multiple sensors, Signal Processing 88(No. 12): 2998-3006.
[11] Kailath, T., Sayed, A. & Hassibi, B. [2000]. Linear Estimation, Prentice Hall, New Jersey.
[12] Kalman, R. [1960]. A new approach to linear filtering and prediction problems, Transactions of the ASME. Journal of Basic Engineering (No. 82 (Series D)): 35-45.
[13] Ma, J. & Sun, S. [2011]. Optimal linear estimators for systems with random sensor delays, multiple packet dropouts and uncertain observations, IEEE Transactions on Signal Processing 59(No. 11): 5181-5192.
[14] Malyavej, V., Manchester, I. & Savkin, A. [2006]. Precision missile guidance using radar/multiple-video sensor fusion via communication channels with bit-rate constraints, Automatica 42(No. 5): 763-769.
[15] Moayedi, M., Foo, Y. & Soh, Y. [2010]. Adaptive Kalman filtering in networked systems with random sensor delays, multiple packet dropouts and missing measurements, IEEE Transactions on Signal Processing 58(No. 3): 1577-1578.
[16] Monzingo, R. [1981]. Discrete linear recursive smoothing for systems with uncertain observations, IEEE Transactions on Automatic Control AC-26(No. 3): 754-757.
[17] Nahi, N. [1969]. Optimal recursive estimation with uncertain observation, IEEE Transactions on Information Theory 15(No. 4): 457-462.
[18] Nakamori, S., Caballero-Águila, R., Hermoso-Carazo, A., Jiménez-López, J. & Linares-Pérez, J. [2006]. Least-squares vth-order polynomial estimation of signals from observations affected by non-independent uncertainty, Applied Mathematics and Computation Vol. 176(No. 2): 642-653.
[19] Nakamori, S., Caballero-Águila, R., Hermoso-Carazo, A., Jiménez-López, J. & Linares-Pérez, J. [2007]. Signal polynomial smoothing from correlated interrupted observations based on covariances, Mathematical Methods in the Applied Sciences Vol. 30(No. 14): 1645-1665.
[20] NaNacara, W. & Yaz, E. [1997]. Recursive estimator for linear and nonlinear systems with uncertain observations, Signal Processing Vol. 62(No. 2): 215-228.
[21] Sahebsara, M., Chen, T. & Shah, S. [2007]. Optimal H2 filtering with random sensor delay, multiple packet dropout and uncertain observations, International Journal of Control 80(No. 2): 292-301.
[22] Sinopoli, B., Schenato, L., Franceschetti, M., Poolla, K., Jordan, M. & Sastry, S. [2004]. Kalman filtering with intermittent observations, IEEE Transactions on Automatic Control 49(No. 9): 1453-1464.
[23] Wang, Z., Yang, F., Ho, D. & Liu, X. [2005]. Robust finite-horizon filtering for stochastic systems with missing measurements, IEEE Signal Processing Letters 12(No. 6): 437-440.
[24] Wang, Z., Yang, F., Ho, D. & Liu, X. [2006]. Robust H∞ filtering for stochastic time-delay systems with missing measurements, IEEE Transactions on Signal Processing 54(No. 7): 2579-2587.
Chapter 2
On Guaranteed Parameter Estimation of
Stochastic Delay Differential Equations
by Noisy Observations
Uwe Küchler and Vyacheslav A. Vasiliev
Additional information is available at the end of the chapter
http://dx.doi.org/10.5772/45952
1. Introduction
Assume (\Omega, F, (F(t), t \ge 0), P) is a given filtered probability space and W = (W(t), t \ge 0), V = (V(t), t \ge 0) are real-valued standard Wiener processes on (\Omega, F, (F(t), t \ge 0), P), adapted to (F(t)) and mutually independent. Further assume that X_0 = (X_0(t), t \in [-1, 0]) and Y_0 are a real-valued cadlag process and a real-valued random variable on (\Omega, F, (F(t), t \ge 0), P), respectively, with

E \int_{-1}^{0} X_0^2(s) ds < \infty  and  E Y_0^2 < \infty.

Assume Y_0 and X_0(s) are F_0-measurable, s \in [-1, 0], and the quantities W, V, X_0 and Y_0 are mutually independent.

Consider a two-dimensional random process (X, Y) = (X(t), Y(t), t \ge 0) described by the system of stochastic differential equations

dX(t) = aX(t)dt + bX(t-1)dt + dW(t),  (1)
dY(t) = X(t)dt + dV(t),  t \ge 0,  (2)

with the initial conditions X(t) = X_0(t), t \in [-1, 0], and Y(0) = Y_0. The process X is supposed to be hidden, i.e., unobservable, and the process Y is observed. Such models are used in applied problems connected with control, filtering and prediction of stochastic processes (see, for example, [1, 4, 17-20] among others).

The parameter \vartheta = (a, b)' \in \Theta is assumed to be unknown and shall be estimated based on continuous observation of Y; \Theta is a subset of R^2 ((a, b)' denotes the transpose of (a, b)). Equations (1) and (2), together with the initial values X_0(\cdot) and Y_0 respectively, have unique solutions X(\cdot) and Y(\cdot); for details see [19].
Equation (1) is a very special case of stochastic differential equations with time delay, see [5, 6]
and [20] for example.
To estimate the true parameter ϑ with a prescribed least square accuracy ε we shall construct a
sequential plan (T∗(ε), ϑ∗(ε)) working for all ϑ ∈ Θ. Here T∗(ε) is the duration of observations
which is a special chosen stopping time and ϑ∗(ε) is an estimator of ϑ. The set Θ is defined
to be the intersection of the set Θ with an arbitrary but fixed ball B0,R ⊂ R2. Sequential
estimation problem has been solved for sets Θ of a different structure in [7]-[9], [11, 13, 14, 16]
by observations of the process (1) and in [10, 12, 15] – by noisy observations (2).
In this chapter the set Θ of parameters consists of all (a, b) from R2 which do not belong to
lines L1 or L2 defined in Section 2 below and having Lebesgue measure zero.
This sequential plan is a composition of several different plans which follow the regions to
which the unknown true parameter ϑ = (a, b) may belong to. Each individual plan is based
on a weighted correlation estimator, where the weight matrices are chosen in such a way that
this estimator has an appropriate asymptotic behaviour being typical for the corresponding
region to which ϑ belongs to. Due to the fact that this behaviour is very connected with
the asymptotic properties of the so-called fundamental solution x0(·) of the deterministic
delay differential equation corresponding to (1) (see Section 2 for details), we have to treat
different regions of Θ = R2  L, L = L1 ∪ L2, separately. If the true parameter ϑ belongs
to L, the weighted correlation estimator under consideration converges weakly only, and
thus the assertions of Theorem 3.1 below cannot be derived by means of such estimators. In
general, the exception of the set L does not disturb applications of the results below in adaptive
filtration, control theory and other applications because of its Lebesgue zero measure.
In the papers [10, 12] the problem described above was solved for two special sets of parameters: ΘI (a straight line) and ΘII (where the solution X(·) of (1) is stable or periodic (unstable)), respectively. The general sequential estimation problem for all ϑ = (a, b) from R² except for two lines was solved in [13, 14, 16] for the equation (1) based on observations of X(·). In this chapter the sequential estimation method developed in [10, 12] for the system (1), (2) is extended to the case considered in [13, 14, 16] for the equation (1) (as already mentioned, for all ϑ from R² except for two lines, under noise-free observations).
A related result in this problem setting was first published, for estimators of another structure and without proofs, in [15].
A similar problem for partially observed stochastic dynamic systems without time-delay was
solved in [22, 23].
The organization of this chapter is as follows. Section 2 presents some preliminary facts needed for the subsequent analysis. In Section 3 we present the main result mentioned above. In Section 4 all proofs are given. Section 5 contains conclusions.
2. Preliminaries
To construct sequential plans for estimation of the parameter ϑ we need some preparation. At
first we shall summarize some known facts about the equation (1). For details the reader is
referred to [3]. Together with the mentioned initial condition the equation (1) has a uniquely
determined solution X which can be represented for t ≥ 0 as follows:
24 Stochastic Modeling and Control
On Guaranteed Parameter Estimation of Stochastic Delay Differential Equations by Noisy Observations 3
X(t) = x0(t)X0(0) + b ∫_{−1}^{0} x0(t − s − 1)X0(s) ds + ∫_{0}^{t} x0(t − s) dW(s). (3)
Here x0 = (x0(t), t ≥ −1) denotes the fundamental solution of the deterministic equation
x0(t) = 1 + ∫_{0}^{t} (a x0(s) + b x0(s − 1)) ds, t ≥ 0, (4)
corresponding to (1) with x0(t) = 0, t ∈ [−1, 0), x0(0) = 1.
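As a numerical illustration (our addition, not part of the original text), x0 can be approximated by the method of steps with an explicit Euler scheme that keeps one delay interval of history; the step size dt and all names are our own choices:

```python
def fundamental_solution(a, b, t_max, dt=1e-3):
    """Euler approximation of the fundamental solution of (4):
    x0'(t) = a*x0(t) + b*x0(t - 1), x0(t) = 0 on [-1, 0), x0(0) = 1.
    Returns the approximate value x0(t_max)."""
    n_lag = int(round(1.0 / dt))         # grid points covering one unit delay
    n_steps = int(round(t_max / dt))
    x = [0.0] * n_lag + [1.0]            # zero history on [-1, 0), then x0(0) = 1
    for _ in range(n_steps):
        delayed = x[len(x) - 1 - n_lag]  # stored value x0(t - 1)
        x.append(x[-1] + dt * (a * x[-1] + b * delayed))
    return x[-1]
```

For a = 0, b = 1 the method of steps gives x0(t) = 1 on [0, 1] and x0(t) = t on [1, 2], which the scheme reproduces up to the Euler error.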
The solution X has the property E ∫_{0}^{T} X²(s) ds < ∞ for every T > 0.
From (3) it is clear that the limit behaviour of X as t → ∞ depends essentially on the limit behaviour of x0(·). The asymptotic properties of x0(·) can be studied via the Laplace transform of x0, which equals (λ − a − be^{−λ})^{−1} for complex λ.
Let s = u(r) (r < 1) and s = w(r) (r ∈ R¹) be the functions given by the parametric representation (r(ξ), s(ξ)) in R²:
r(ξ) = ξ cot ξ, s(ξ) = −ξ/ sin ξ
with ξ ∈ (0, π) and ξ ∈ (π, 2π), respectively.
Now we define the parameter set Θ to be the plane R² without the lines L1 = {(a, u(a)) : a ≤ 1} and L2 = {(a, w(a)) : a ∈ R¹}, so that R² = Θ ∪ L1 ∪ L2.
It does not seem possible to construct one simple sequential procedure which has the desired properties under Pϑ for all ϑ ∈ Θ. Therefore we are going to divide the set Θ into appropriate smaller regions where this is possible. This decomposition is closely connected with the structure of the set Λ of all (real or complex) roots of the so-called characteristic equation of (4):
λ − a − be^{−λ} = 0.
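As an aside (our illustration, not from the original text): a root of the characteristic equation solves λ = a + be^{−λ}; when b > 0 the function f(λ) = λ − a − be^{−λ} is strictly increasing (f′(λ) = 1 + be^{−λ} > 0), so there is exactly one real root and Newton's method finds it. In general v0 may be the real part of a complex root pair, so this sketch covers only the real-root case:

```python
import math

def real_characteristic_root(a, b, lam=0.0, tol=1e-12, max_iter=200):
    """Newton iteration for a real root of f(lam) = lam - a - b*exp(-lam).
    For b > 0 the real root is unique; it is then a candidate for v0."""
    for _ in range(max_iter):
        f = lam - a - b * math.exp(-lam)
        fp = 1.0 + b * math.exp(-lam)    # f'(lam), strictly positive for b >= 0
        step = f / fp
        lam -= step
        if abs(step) < tol:
            return lam
    return lam
```

For a = 0, b = 1 the root solves λ = e^{−λ}, i.e. the omega constant ≈ 0.5671.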
Put v0 = v0(ϑ) = max{Re λ | λ ∈ Λ} and v1 = v1(ϑ) = max{Re λ | λ ∈ Λ, Re λ < v0}. Except in the case b = 0 it holds that −∞ < v1 < v0 < ∞. By m(λ) we denote the multiplicity of the root λ ∈ Λ. Note that m(λ) = 1 for all λ ∈ Λ except for (a, b) ∈ R² with b = −e^{a−1}. In this case we have λ = a − 1 ∈ Λ and m(a − 1) = 2. The values v0(ϑ) and v1(ϑ) determine the asymptotic behaviour of x0(t) as t → ∞ (see [3] for details).
Now we are able to divide Θ into regions appropriate for our purposes. Note that this decomposition is closely related to the classification used in [3], where the plane R² was decomposed into eleven subsets. Here we use a different notation.
Definition (Θ). The set Θ of parameters is decomposed as
Θ = Θ1 ∪ Θ2 ∪ Θ3 ∪ Θ4,
where Θ1 = Θ11 ∪ Θ12 ∪ Θ13, Θ2 = Θ21 ∪ Θ22, Θ3 = Θ31, Θ4 = Θ41 ∪ Θ42 with
Θ11 = {ϑ ∈ R² | v0(ϑ) < 0},
Θ12 = {ϑ ∈ R² | v0(ϑ) > 0 and v0(ϑ) ∉ Λ},
Θ13 = {ϑ ∈ R² | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 2},
Θ21 = {ϑ ∈ R² | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 1, v1(ϑ) > 0 and v1(ϑ) ∈ Λ},
Θ22 = {ϑ ∈ R² | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 1, v1(ϑ) > 0 and v1(ϑ) ∉ Λ},
Θ31 = {ϑ ∈ R² | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 1 and v1(ϑ) < 0},
Θ41 = {ϑ ∈ R² | v0(ϑ) = 0, v0(ϑ) ∈ Λ, m(v0) = 1},
Θ42 = {ϑ ∈ R² | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 1, v1(ϑ) = 0 and v1(ϑ) ∈ Λ}.
It should be noted that the cases (Q2 ∪ Q3) and (Q5) considered in [3] correspond to our exceptional lines L1 and L2, respectively.
Here are some comments concerning the subsets of Θ.
The unions Θ1, . . . , Θ4 are singled out because the Fisher information matrix and the related design matrices considered below have similar asymptotic properties for all ϑ throughout each Θi (i = 1, . . . , 4).
Obviously, all sets Θ11, . . . , Θ42 are pairwise disjoint, the closure of Θ equals R², and the exceptional set L1 ∪ L2 has Lebesgue measure zero.
The set Θ11 is the set of parameters ϑ for which there exists a stationary solution of (1).
Note that the one-parametric set Θ4 is part of the boundaries of the regions Θ11, Θ12, Θ21 and Θ3. In this case b = −a holds, and (1) can be written as a differential equation with only one parameter, in which it is linear.
We shall use truncations of all the introduced sets. First choose an arbitrary but fixed positive R. Define the set Θ = {ϑ ∈ Θ : ||ϑ|| ≤ R} and, in a similar way, the subsets Θ11, . . . , Θ42.
Sequential estimators of ϑ with a prescribed mean square accuracy have already been constructed in [10, 12]. But in those articles the set of possible parameters ϑ was restricted to Θ11 ∪ Θ12 ∪ {Θ41 \ {(0, 0)}} ∪ Θ42.
To construct a sequential plan for estimating ϑ based on the observation of Y(·) we follow the
line of [10, 12]. We shall use a single equation for Y of the form:
dY(t) = ϑ′A(t)dt + ξ(t)dt + dV(t), (5)
where A(t) = (Y(t), Y(t − 1))′,
ξ(t) = X(0) − aY(0) − bY(0) + b ∫_{−1}^{0} X0(s) ds − aV(t) − bV(t − 1) + W(t).
The random variables A(t) and ξ(t) are F(t)-measurable for every fixed t ≥ 1 and a short calculation shows that all conditions of type (7) in [12], consisting of
E ∫_{1}^{T} (|Y(t)| + |ξ(t)|) dt < ∞ for all T > 1,
E[Δ̃ξ(t) | F(t − 2)] = 0, E[(Δ̃ξ(t))² | F(t − 2)] ≤ 1 + R²,
hold in our case. Here Δ̃ denotes the difference operator defined by Δ̃ f (t) = f (t) − f (t − 1).
Using this operator and (5) we obtain the following equation:
dΔ̃Y(t) = aΔ̃Y(t)dt + bΔ̃Y(t − 1)dt + Δ̃ξ(t)dt + dΔ̃V(t) (6)
with the initial condition Δ̃Y(1) = Y(1) − Y0.
Thus we have reduced the system (1), (2) to a single differential equation for the observed
process (Δ̃Y(t), t ≥ 2) depending on the unknown parameters a and b.
3. Construction of sequential estimation plans
In this section we construct the sequential estimation procedure for each of the cases Θ1, . . . , Θ4 separately. Then we define, similarly to [11, 13, 14, 16], the final sequential estimation plan, which acts in Θ as the sequential plan with the smallest duration of observations.
We construct the sequential estimation procedure for the parameter ϑ on the basis of the correlation method in the cases Θ1, Θ4 (similarly to [12, 14, 15]) and on the basis of correlation estimators with weights in the cases Θ2 ∪ Θ3. The latter cases, as well as Θ13, are new. It should be noted that the sequential plan constructed, e.g., in [2] does not work for Θ3 here, even if we observe X(·) instead of Y(·).
3.1. Sequential estimation procedure for ϑ ∈ Θ1
Consider the problem of estimating ϑ ∈ Θ1. We use a modification of the estimation procedure from [12], constructed there for Case II. Proposition 3.1 below can be proved for the cases Θ11 ∪ Θ12 similarly to [12]. The modified procedure presented below is oriented, similarly to [16], towards all the parameter sets Θ11, Θ12, Θ13. We therefore prove Proposition 3.1 in detail for the case Θ13 only; the proofs for the cases Θ11 ∪ Θ12 are very similar.
For the construction of the estimation procedure we assume h10 is a real number in (0, 1/5)
and h1 is a random variable with values in [h10, 1/5] only, F(0)-measurable and having a
known continuous distribution function.
Assume (cn)n≥1 is a given unboundedly increasing sequence of positive numbers satisfying
the following condition:
∑_{n≥1} 1/c_n < ∞. (7)
This construction follows principally the line of [14, 16] (see [12] as well), to which the reader is referred for details.
We introduce for every ε > 0 and every s ≥ 0 several quantities:
– the functions
Ψs(t) = (Δ̃Y(t), Δ̃Y(t − s))′ for t ≥ 1 + s, and Ψs(t) = (0, 0)′ for t < 1 + s;
– the sequence of stopping times
τ1(n, ε) = h1 inf{k ≥ 1 : ∫_{0}^{kh1} ||Ψ_{h1}(t − 2 − 5h1)||² dt ≥ ε^{−1}c_n} for n ≥ 1;
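All the stopping times in this chapter are first-passage times of a running integral past a threshold. On a discrete grid this reduces to the following sketch (our own schematic; a left Riemann sum stands in for the integral, and all names are ours):

```python
def first_passage(values, dt, threshold):
    """First grid time k*dt at which the running left-Riemann integral of
    `values` reaches `threshold`; None if it never does on the sample."""
    total = 0.0
    for k, v in enumerate(values, start=1):
        total += v * dt              # accumulate the integral one step at a time
        if total >= threshold:
            return k * dt            # stop at the first crossing
    return None
```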
– the matrices
G1(T, s) = ∫_{0}^{T} Ψs(t − 2 − 5s)Ψ′s(t) dt, Φ1(T, s) = ∫_{0}^{T} Ψs(t − 2 − 5s) dΔ̃Y(t),
G1(n, k, ε) = G1(τ1(n, ε) − kh1, h1), Φ1(n, k, ε) = Φ1(τ1(n, ε) − kh1, h1);
– the times
k1(n) = arg min_{k=1,...,5} ||G1^{−1}(n, k, ε)||, n ≥ 1;
– the estimators
ϑ1(n, ε) = G1^{−1}(n, ε)Φ1(n, ε), n ≥ 1, where G1(n, ε) = G1(n, k1(n), ε), Φ1(n, ε) = Φ1(n, k1(n), ε);
– the stopping time
σ1(ε) = inf{N ≥ 1 : S1(N) > (ρ1δ1^{−1})^{1/2}}, (8)
where S1(N) = ∑_{n=1}^{N} β1²(n, ε),
β1(n, ε) = ||G̃1^{−1}(n, ε)||, G̃1(n, ε) = (ε^{−1}c_n)^{−1} G1(n, k1(n), ε),
δ1 ∈ (0, 1) is some fixed chosen number, and
ρ1 = 15(3 + R²) ∑_{n≥1} 1/c_n.
The deviation of the ’first-step estimators’ ϑ1(n, ε) has the form:
ϑ1(n, ε) − ϑ = (ε^{−1}c_n)^{−1/2} G̃1^{−1}(n, ε) ζ̃1(n, ε), n ≥ 1, (9)
ζ̃1(n, ε) = (ε^{−1}c_n)^{−1/2} ∫_{0}^{τ1(n,ε)−k1(n)h1} Ψ_{h1}(t − 2 − 5h1)(Δ̃ξ(t) dt + dV(t) − dV(t − 1)).
By the definition of the stopping times τ1(n, ε) − k1(n)h1 we can control the noise ζ̃1(n, ε):
Eϑ||ζ̃1(n, ε)||² ≤ 15(3 + R²), n ≥ 1, ε > 0,
and by the definition of the stopping time σ1(ε) we control the first factor G̃1^{−1}(n, ε) in the representation (9) of the deviation.
Define the sequential estimation plan of ϑ by
T1(ε) = τ1(σ1(ε), ε), ϑ1(ε) = (1/S1(σ1(ε))) ∑_{n=1}^{σ1(ε)} β1²(n, ε) ϑ1(n, ε). (10)
We can see that the construction of the sequential estimator ϑ1(ε) is based on the family of estimators ϑ1(T, s) = G1^{−1}(T, s)Φ1(T, s), s ≥ 0. We have taken the discretization step h1 as above because, for ϑ ∈ Θ12, the functions
f(T, s) = e^{2v0T} G1^{−1}(T, s)
for every s ≥ 0 have certain periodic matrix functions as limits in T almost surely. These limit matrix functions may have infinite norm only for four values of the argument T on every interval of periodicity of length Δ > 1 (see the proof of Theorem 3.2 in [10, 12]).
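All the correlation estimators in this chapter share one algebraic pattern: a 2×2 design matrix G accumulated from lagged regressors, a vector Φ accumulated against the observed increments, and the estimate G^{−1}Φ. A discretized sketch of this common step (sums in place of the stochastic integrals; all names are our own and the example is purely illustrative):

```python
def correlation_estimate(psi_lagged, psi, dY, dt):
    """Estimate theta from G = sum psi_lagged * psi^T * dt and
    Phi = sum psi_lagged * dY, returning G^{-1} Phi via a 2x2 solve."""
    g00 = g01 = g10 = g11 = p0 = p1 = 0.0
    for (u0, u1), (w0, w1), dy in zip(psi_lagged, psi, dY):
        g00 += u0 * w0 * dt; g01 += u0 * w1 * dt   # accumulate design matrix G
        g10 += u1 * w0 * dt; g11 += u1 * w1 * dt
        p0 += u0 * dy; p1 += u1 * dy               # accumulate Phi
    det = g00 * g11 - g01 * g10
    return ((g11 * p0 - g01 * p1) / det, (-g10 * p0 + g00 * p1) / det)
```

With noise-free increments dY = (a·ψ0 + b·ψ1)dt the estimate recovers (a, b) exactly, which is a useful sanity check on the sign and index conventions.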
In the sequel limits of the type lim_{n→∞} a(n, ε) or lim_{ε→0} a(n, ε) will be used. To avoid repetition of similar expressions we shall use, as in [12, 14, 16], the unifying notation lim_{n∨ε} a(n, ε) for both of these limits when their meaning is obvious.
We state the results concerning the estimation of the parameter ϑ ∈ Θ1 in the following
proposition.
Proposition 3.1. Assume that the condition (7) on the sequence (cn) holds and let the parameter ϑ = (a, b) in (1) be such that ϑ ∈ Θ1.
Then:
I. For any ε > 0 and every ϑ ∈ Θ1 the sequential plan (T1(ε), ϑ1(ε)) defined by (10) is closed (T1(ε) < ∞ Pϑ-a.s.) and possesses the following properties:
1°. sup_{ϑ∈Θ1} Eϑ||ϑ1(ε) − ϑ||² ≤ δ1ε;
2°. the inequalities below are valid:
– for ϑ ∈ Θ11
0 < lim inf_{ε→0} ε·T1(ε) ≤ lim sup_{ε→0} ε·T1(ε) < ∞ Pϑ-a.s.,
– for ϑ ∈ Θ12
0 < lim inf_{ε→0} [T1(ε) − (1/(2v0)) ln ε^{−1}] ≤ lim sup_{ε→0} [T1(ε) − (1/(2v0)) ln ε^{−1}] < ∞ Pϑ-a.s.,
– for ϑ ∈ Θ13
0 < lim inf_{ε→0} [T1(ε) + (1/v0) ln T1(ε) − Ψ′13(ε)], lim sup_{ε→0} [T1(ε) + (1/v0) ln T1(ε) − Ψ″13(ε)] < ∞ Pϑ-a.s.,
where the functions Ψ′13(ε) and Ψ″13(ε) are defined in (30).
II. For every ϑ ∈ Θ1 the estimator ϑ1(n, ε) is strongly consistent:
lim_{n∨ε} ϑ1(n, ε) = ϑ Pϑ-a.s.
3.2. Sequential estimation procedure for ϑ ∈ Θ2
Assume (cn)n≥1 is an unboundedly increasing sequence of positive numbers satisfying the
condition (7).
We introduce for every ε > 0 several quantities:
– the parameter λ = e^{v0} and its estimator
λt = (∫_{2}^{t} Δ̃Y(s)Δ̃Y(s − 1) ds) / (∫_{2}^{t} (Δ̃Y(s − 1))² ds), t > 2; λt = 0 otherwise; (11)
– the functions
Z(t) = Δ̃Y(t) − λΔ̃Y(t − 1) for t ≥ 2, and Z(t) = 0 for t < 2;
Z̃(t) = Δ̃Y(t) − λtΔ̃Y(t − 1) for t ≥ 2, and Z̃(t) = 0 for t < 2;
Ψ(t) = (Δ̃Y(t), Δ̃Y(t − 1))′ for t ≥ 2, and Ψ(t) = (0, 0)′ for t < 2;
Ψ̃(t) = (Z̃(t), Δ̃Y(t))′ for t ≥ 2, and Ψ̃(t) = (0, 0)′ for t < 2;
– the parameter α = v0/v1 and its estimator
α2(n, ε) = (ln ∫_{4}^{ν2(n,ε)} (Δ̃Y(t − 3))² dt) / (δ ln ε^{−1}c_n), (12)
where
ν2(n, ε) = inf{T > 4 : ∫_{4}^{T} Z̃²(t − 3) dt = (ε^{−1}c_n)^δ}, (13)
and δ ∈ (0, 1) is a given number;
– the sequence of stopping times
τ2(n, ε) = h2 inf{k > h2^{−1}ν2(n, ε) : ∫_{ν2(n,ε)}^{kh2} ||Ψ2^{−1/2}(n, ε)Ψ̃(t − 3)||² dt ≥ 1},
where we suppose h2 = 1/5 and
Ψ2(n, ε) = diag{ε^{−1}c_n, (ε^{−1}c_n)^{α2(n,ε)}};
– the matrices
G2(S, T) = ∫_{S}^{T} Ψ̃(t − 3)Ψ′(t) dt, Φ2(S, T) = ∫_{S}^{T} Ψ̃(t − 3) dΔ̃Y(t),
G2(n, k, ε) = G2(ν2(n, ε), τ2(n, ε) − kh2), Φ2(n, k, ε) = Φ2(ν2(n, ε), τ2(n, ε) − kh2);
– the times
k2(n) = arg min_{k=1,...,5} ||G2^{−1}(n, k, ε)||, n ≥ 1;
– the estimators
ϑ2(n, ε) = G2^{−1}(n, ε)Φ2(n, ε), n ≥ 1,
where
G2(n, ε) = G2(n, k2(n), ε), Φ2(n, ε) = Φ2(n, k2(n), ε);
– the stopping time
σ2(ε) = inf{N ≥ 1 : S2(N) > (ρ2δ2^{−1})^{1/2}}, (14)
where S2(N) = ∑_{n=1}^{N} β2²(n, ε), ρ2 = ρ1, δ2 ∈ (0, 1) is some fixed chosen number,
β2(n, ε) = ||G̃2^{−1}(n, ε)||, G̃2(n, ε) = (ε^{−1}c_n)^{−1/2} Ψ2^{−1/2}(n, ε)G2(n, ε).
In this case we write the deviation of ϑ2(n, ε) in the form
ϑ2(n, ε) − ϑ = (ε^{−1}c_n)^{−1/2} G̃2^{−1}(n, ε)ζ̃2(n, ε), n ≥ 1,
where
ζ̃2(n, ε) = Ψ2^{−1/2}(n, ε) ∫_{ν2(n,ε)}^{τ2(n,ε)−k2(n)h2} Ψ̃(t − 3)(Δ̃ξ(t) dt + dV(t) − dV(t − 1))
and we have
Eϑ||ζ̃2(n, ε)||² ≤ 15(3 + R²), n ≥ 1, ε > 0.
Define the sequential estimation plan of ϑ by
T2(ε) = τ2(σ2(ε), ε), ϑ2(ε) = ϑ2(σ2(ε), ε). (15)
The construction of the sequential estimator ϑ2(ε) is based on the family of estimators
ϑ2(S, T) = G2^{−1}(S, T)Φ2(S, T) = e^{−v1T}G̃2^{−1}(S, T)Φ̃2(S, T), T > S ≥ 0, where
G̃2(S, T) = e^{−v1T} Ψ2^{−1/2}(T)G2(S, T), Φ̃2(S, T) = Ψ2^{−1/2}(T)Φ2(S, T)
and Ψ2(T) = diag{e^{v1T}, e^{v0T}}. We have taken the discretization step h2 as above because, for ϑ ∈ Θ22, similarly to the case ϑ ∈ Θ12, the function
f2(S, T) = G̃2^{−1}(S, T)
has a periodic (with period Δ > 1) matrix function as its limit almost surely (see (35)). This limit matrix function may have infinite norm only for four values of its argument T on every interval of periodicity of length Δ.
We state the results concerning the estimation of the parameter ϑ ∈ Θ2 in the following
proposition.
Proposition 3.2. Assume that the condition (7) on the sequence (cn) holds and let the parameter ϑ = (a, b) in (1) be such that ϑ ∈ Θ2. Then:
I. For any ε > 0 and every ϑ ∈ Θ2 the sequential plan (T2(ε), ϑ2(ε)) defined by (15) is closed and possesses the following properties:
1°. sup_{ϑ∈Θ2} Eϑ||ϑ2(ε) − ϑ||² ≤ δ2ε;
2°. the inequalities below are valid:
0 < lim inf_{ε→0} [T2(ε) − (1/(2v1)) ln ε^{−1}] ≤ lim sup_{ε→0} [T2(ε) − (1/(2v1)) ln ε^{−1}] < ∞ Pϑ-a.s.;
II. For every ϑ ∈ Θ2 the estimator ϑ2(n, ε) is strongly consistent:
lim_{n∨ε} ϑ2(n, ε) = ϑ Pϑ-a.s.
3.3. Sequential estimation procedure for ϑ ∈ Θ3
We shall use the notation introduced in the previous paragraph for the parameter λ = e^{v0} and its estimator λt, as well as for the functions Z(t), Z̃(t), Ψ(t) and Ψ̃(t).
Choose non-random functions ν3(n, ε), n ≥ 1, ε > 0, satisfying the following conditions as ε → 0 or n → ∞:
ν3(n, ε) = o(ε^{−1}c_n), (log^{1/2} ν3(n, ε) / e^{v0ν3(n,ε)}) · ε^{−1}c_n = o(1). (16)
Example: ν3(n, ε) = log² ε^{−1}c_n.
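A quick numerical check (our addition, not part of the original text) that the example ν3 = log² ε^{−1}c_n satisfies the second condition in (16): the ratio below should tend to zero as x = ε^{−1}c_n grows.

```python
import math

def cond16_ratio(v0, x):
    """(log^{1/2} nu3 / exp(v0 * nu3)) * x for nu3 = log^2(x),
    where x stands in for eps^{-1} c_n; vanishes as x grows."""
    nu3 = math.log(x) ** 2
    return math.sqrt(math.log(nu3)) / math.exp(v0 * nu3) * x
```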
We introduce several quantities:
– the parameter α3 = v0 and its estimator
α3(n, ε) = ln |λ_{ν3(n,ε)}|,
where λt is defined in (11);
– the sequences of stopping times
τ31(n, ε) = inf{T > 0 : ∫_{ν3(n,ε)}^{T} Z̃²(t − 3) dt = ε^{−1}c_n}, (17)
τ32(n, ε) = inf{T > 0 : ∫_{ν3(n,ε)}^{T} (Δ̃Y(t − 3))² dt = e^{2α3(n,ε)ε^{−1}c_n}}, (18)
τmin(n, ε) = min{τ31(n, ε), τ32(n, ε)}, τmax(n, ε) = max{τ31(n, ε), τ32(n, ε)};
– the matrices
G3(S, T) = ∫_{S}^{T} Ψ̃(t − 3)Ψ′(t) dt, Φ3(S, T) = ∫_{S}^{T} Ψ̃(t − 3) dΔ̃Y(t),
G3(n, ε) = G3(ν3(n, ε), τmin(n, ε)), Φ3(n, ε) = Φ3(ν3(n, ε), τmin(n, ε));
– the estimators
ϑ3(n, ε) = G3^{−1}(n, ε)Φ3(n, ε), n ≥ 1, ε > 0;
– the stopping time
σ3(ε) = inf{N ≥ 1 : S3(N) > (ρ3δ3^{−1})^{1/2}}, (19)
where S3(N) = ∑_{n=1}^{N} β3²(n, ε), δ3 ∈ (0, 1) is some fixed chosen number,
β3(n, ε) = ||G̃3^{−1}(n, ε)||, ρ3 = 6(3 + R²) ∑_{n≥1} 1/c_n,
G̃3(n, ε) = (ε^{−1}c_n)^{−1/2} Ψ3^{−1/2}(n, ε)G3(n, ε), Ψ3(n, ε) = diag{ε^{−1}c_n, e^{2α3(n,ε)ε^{−1}c_n}}.
In this case we write the deviation of ϑ3(n, ε) in the form
ϑ3(n, ε) − ϑ = (ε^{−1}c_n)^{−1/2} G̃3^{−1}(n, ε)ζ̃3(n, ε), n ≥ 1,
where
ζ̃3(n, ε) = Ψ3^{−1/2}(n, ε) ∫_{ν3(n,ε)}^{τmin(n,ε)} Ψ̃(t − 3)(Δ̃ξ(t) dt + dV(t) − dV(t − 1))
and we have
Eϑ||ζ̃3(n, ε)||² ≤ 6(3 + R²), n ≥ 1, ε > 0.
Define the sequential estimation plan of ϑ by
T3(ε) = τmax(σ3(ε), ε), ϑ3(ε) = ϑ3(σ3(ε), ε). (20)
Proposition 3.3. Assume that the condition (7) on the sequence (cn) holds and let the parameter
ϑ = (a, b) in (1) be such that ϑ ∈ Θ3. Then:
I. For every ϑ ∈ Θ3 the sequential plan (T3(ε), ϑ3(ε)) defined in (20) is closed and possesses the following properties:
1°. for any ε > 0
sup_{ϑ∈Θ3} Eϑ||ϑ3(ε) − ϑ||² ≤ δ3ε;
2°. the following inequalities are valid:
0 < lim inf_{ε→0} εT3(ε) ≤ lim sup_{ε→0} εT3(ε) < ∞ Pϑ-a.s.;
II. For every ϑ ∈ Θ3 the estimator ϑ3(n, ε) is strongly consistent:
lim_{n∨ε} ϑ3(n, ε) = ϑ Pϑ-a.s.
3.4. Sequential estimation procedure for ϑ ∈ Θ4
In this case b = −a and (6) is a first-order differential equation:
dΔ̃Y(t) = aZ*(t)dt + Δ̃ξ(t)dt + dV(t) − dV(t − 1), t ≥ 2,
where
Z*(t) = Δ̃Y(t) − Δ̃Y(t − 1) for t ≥ 2, and Z*(t) = 0 for t < 2.
We shall construct a sequential plan (T4(ε), ϑ4(ε)) for estimating the vector parameter ϑ = a(1, −1)′ with (δ4ε)-accuracy in the sense of the L2-norm for every ε > 0 and a fixed chosen δ4 ∈ (0, 1).
First we define sequential estimation plans for the scalar parameter a on the basis of correlation estimators, which are generalized least squares estimators:
a4(T) = G4^{−1}(T)Φ4(T),
G4(T) = ∫_{0}^{T} Z*(t − 2)Z*(t) dt, Φ4(T) = ∫_{0}^{T} Z*(t − 2) dΔ̃Y(t), T > 0.
Let (cn, n ≥ 1) be an unboundedly increasing sequence of positive numbers, satisfying the
condition (7).
We shall define
– the sequence of stopping times (τ4(n, ε), n ≥ 1) as
τ4(n, ε) = inf{T > 2 : ∫_{0}^{T} (Z*(t − 2))² dt = ε^{−1}c_n}, n ≥ 1;
– the sequence of estimators
a4(n, ε) = a4(τ4(n, ε)) = G4^{−1}(τ4(n, ε))Φ4(τ4(n, ε));
– the stopping time
σ4(ε) = inf{N ≥ 1 : S4(N) > (ρ4δ4^{−1})^{1/2}}, (21)
where S4(N) = ∑_{n=1}^{N} G̃4^{−2}(n, ε), ρ4 = ρ3, G̃4(n, ε) = (ε^{−1}c_n)^{−1}G4(τ4(n, ε)). The deviation of a4(n, ε) has the form
a4(n, ε) − a = (ε^{−1}c_n)^{−1/2} G̃4^{−1}(n, ε)ζ̃4(n, ε), n ≥ 1,
where
ζ̃4(n, ε) = (ε^{−1}c_n)^{−1/2} ∫_{0}^{τ4(n,ε)} Z*(t − 2)(Δ̃ξ(t) dt + dV(t) − dV(t − 1))
and we have
Eϑ||ζ̃4(n, ε)||² ≤ 3(3 + R²), n ≥ 1, ε > 0.
We define the sequential plan (T4(ε), ϑ4(ε)) for the estimation of ϑ as
T4(ε) = τ4(σ4(ε), ε), ϑ4(ε) = a4(σ4(ε), ε)(1, −1)′. (22)
The following proposition presents the conditions under which T4(ε) and ϑ4(ε) are
well-defined and have the desired property of preassigned mean square accuracy.
Proposition 3.4. Assume that the sequence (cn) defined above satisfies the condition (7). Then we obtain the following result:
I. For any ε > 0 and every ϑ ∈ Θ4 the sequential plan (T4(ε), ϑ4(ε)) defined by (22) is closed and has the following properties:
1°. sup_{ϑ∈Θ4} Eϑ||ϑ4(ε) − ϑ||² ≤ δ4ε;
2°. the following relations hold:
– if ϑ ∈ Θ41 then
0 < lim inf_{ε→0} ε·T4(ε) ≤ lim sup_{ε→0} ε·T4(ε) < ∞ Pϑ-a.s.,
– if ϑ ∈ Θ42 then
0 < lim inf_{ε→0} [T4(ε) − (1/(2v0)) ln ε^{−1}] ≤ lim sup_{ε→0} [T4(ε) − (1/(2v0)) ln ε^{−1}] < ∞ Pϑ-a.s.;
II. For every ϑ ∈ Θ4 the estimator ϑ4(n, ε) is strongly consistent:
lim_{n∨ε} ϑ4(n, ε) = ϑ Pϑ-a.s.
3.5. General sequential estimation procedure of the time-delayed process
In this paragraph we construct the sequential estimation procedure for the parameters a and b of the process (1) on the basis of the estimators presented in Subsections 3.1–3.4.
Denote j* = arg min_{j=1,...,4} Tj(ε). We define the sequential plan (T*(ε), ϑ*(ε)) for estimating ϑ ∈ Θ on the basis of all the estimators constructed above by the formulae
SEP*(ε) = (T*(ε), ϑ*(ε)), T*(ε) = T_{j*}(ε), ϑ*(ε) = ϑ_{j*}(ε).
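The composition rule above simply keeps, among the four individual plans, the one that stops first; schematically (names are ours, the plans themselves come from the subsections above):

```python
def combined_plan(plans):
    """plans: list of (duration, estimate) pairs produced by the plans of
    Subsections 3.1-3.4; the final plan keeps the pair whose
    observation duration is smallest."""
    return min(plans, key=lambda p: p[0])
```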
The following theorem is valid.
Theorem 3.1. Assume that the underlying processes (X(t)) and (Y(t)) satisfy the equations (1), (2), that the parameter ϑ to be estimated belongs to the region Θ, and that for the numbers δ1, . . . , δ4 in the definitions (10), (15), (20) and (22) of the sequential plans the condition ∑_{j=1}^{4} δj = 1 is fulfilled.
Then the sequential estimation plan (T*(ε), ϑ*(ε)) possesses the following properties:
1°. for any ε > 0 and for every ϑ ∈ Θ
T*(ε) < ∞ Pϑ-a.s.;
2°. for any ε > 0
sup_{ϑ∈Θ} Eϑ||ϑ*(ε) − ϑ||² ≤ ε;
3°. the following relations hold with Pϑ-probability one:
– for ϑ ∈ Θ11 ∪ Θ3 ∪ Θ41
lim sup_{ε→0} ε·T*(ε) < ∞;
– for ϑ ∈ Θ12 ∪ Θ42
lim sup_{ε→0} [T*(ε) − (1/(2v0)) ln ε^{−1}] < ∞;
– for ϑ ∈ Θ13
lim sup_{ε→0} [T*(ε) + (1/v0) ln T1(ε) − Ψ″13(ε)] < ∞,
where the function Ψ″13(ε) is defined in (30);
– for ϑ ∈ Θ2
lim sup_{ε→0} [T*(ε) − (1/(2v1)) ln ε^{−1}] < ∞.
4. Proofs
Proof of Proposition 3.1. The closedness of the sequential estimation plan, as well as assertions I.2 and II of Proposition 3.1 for the cases Θ11 ∪ Θ12, can be verified similarly to [10, 12, 14, 16]. Now we verify the finiteness of the stopping time T1(ε) in the new case Θ13.
By the definition of Δ̃Y(t) we have
Δ̃Y(t) = X̃(t) + Δ̃V(t), t ≥ 1,
where
X̃(t) = ∫_{t−1}^{t} X(s) ds.
It is easy to show that the process X̃(·) has the representation
X̃(t) = x̃0(t)X0(0) + b ∫_{−1}^{0} x̃0(t − s − 1)X0(s) ds + ∫_{0}^{t} x̃0(t − s) dW(s)
for t ≥ 1, X̃(t) = ∫_{t−1}^{0} X0(s) ds + ∫_{0}^{t} X(s) ds for t ∈ [0, 1), and X̃(t) = 0 for t ∈ [−1, 0). Based on the representation above and the stated properties of x0(t), the function x̃0(t) = ∫_{t−1}^{t} x0(s) ds can easily be shown to satisfy x̃0(t) = 0 for t ∈ [−1, 0] and, as t → ∞,
x̃0(t) =
– o(e^{γt}), γ < 0, for ϑ ∈ Θ11;
– φ̃0(t)e^{v0t} + o(e^{γ0t}), γ0 < v0, for ϑ ∈ Θ12;
– (2/v0)[(1 − e^{−v0})t + e^{−v0} − (1 − e^{−v0})/v0] e^{v0t} + o(e^{γ0t}), γ0 < v0, for ϑ ∈ Θ13;
– ((1 − e^{−v0})/(v0(v0 − a + 1))) e^{v0t} + ((1 − e^{−v1})/(v1(a − v1 − 1))) e^{v1t} + o(e^{γ1t}), γ1 < v1, for ϑ ∈ Θ21;
– ((1 − e^{−v0})/(v0(v0 − a + 1))) e^{v0t} + φ̃1(t)e^{v1t} + o(e^{γ1t}), γ1 < v1, for ϑ ∈ Θ22;
– ((1 − e^{−v0})/(v0(v0 − a + 1))) e^{v0t} + o(e^{γt}), γ < 0, for ϑ ∈ Θ3;
– 1/(1 − a) + o(e^{γt}), γ < 0, for ϑ ∈ Θ41;
– ((1 − e^{−v0})/(v0(v0 − a + 1))) e^{v0t} − 1/(a − 1) + o(e^{γt}), γ < 0, for ϑ ∈ Θ42,
where
φ̃i(t) = Ãi cos ξit + B̃i sin ξit
and Ãi, B̃i, ξi are some constants (see [10, 12]).
The processes X̃(t) and Δ̃V(t) are mutually independent and the process X̃(t) has a representation similar to (3). Then, after some algebra similar to that in [10, 12], we get for the processes X̃(t), Ỹ(t) = X̃(t) − λX̃(t − 1), λ = e^{v0}, Δ̃Y(t) and
Z(t) = Δ̃Y(t) − λΔ̃Y(t − 1) for t ≥ 2 (Z(t) = 0 for t < 2)
in the case Θ13 the following limits:
lim_{t→∞} t^{−1}e^{−v0t}Δ̃Y(t) = lim_{t→∞} t^{−1}e^{−v0t}X̃(t) = C̃X Pϑ-a.s., (23)
lim_{t→∞} e^{−v0t}Ỹ(t) = CY, lim_{t→∞} e^{−v0t}Z(t) = C̃Z Pϑ-a.s.,
and, as follows, for u ≥ 0
lim_{T→∞} |T^{−2}e^{−2v0T} ∫_{1}^{T} Δ̃Y(t − u)Δ̃Y(t) dt − (C̃X²/(2v0))(1 − u/T)e^{−uv0}| = 0 Pϑ-a.s., (24)
lim_{T→∞} |T^{−1}e^{−2v0T} ∫_{1}^{T} Δ̃Y(t − u)Z(t) dt − (C̃XC̃Z/(2v0))(1 − u/T)e^{−uv0}| = 0 Pϑ-a.s.,
where C̃X, CY and C̃Z are some nonzero constants, which can be found from [10, 12]. From (24) we obtain the limits:
lim_{T→∞} T^{−2}e^{−2v0T} G1(T, s) = G13(s), lim_{T→∞} T^{−1}e^{−4v0T} |G1(T, s)| = G13 e^{−(3+11s)v0} Pϑ-a.s.,
where
G13(s) = (C̃X²/(2v0)) [[e^{−(2+5s)v0}, e^{−(1+5s)v0}], [e^{−2(1+3s)v0}, e^{−(1+6s)v0}]], G13 = sC̃X³C̃Z/(4v0²),
and, as follows, we can find
lim_{T→∞} T^{−1}e^{2v0T} G1^{−1}(T, s) = G̃13(s) Pϑ-a.s., where
G̃13(s) = (2v0 e^{(3+11s)v0}/(sC̃XC̃Z)) [[e^{−(1+6s)v0}, −e^{−(1+5s)v0}], [−e^{−2(1+3s)v0}, e^{−(2+5s)v0}]]
is a non-random matrix function.
From (23) and by the definition of the stopping times τ1(n, ε) we have
lim_{n∨ε} τ1²(n, ε)e^{2τ1(n,ε)v0}/(ε^{−1}c_n) = g*13 Pϑ-a.s., (25)
where g*13 = 2v0C̃X^{−2}[e^{−2v0(2+5h1)} + e^{−4v0(1+3h1)}]^{−1} and, as follows,
lim_{n∨ε} [τ1(n, ε) + (1/v0) ln τ1(n, ε) − (1/(2v0)) ln ε^{−1}c_n] = (1/(2v0)) ln g*13 Pϑ-a.s., (26)
lim_{n∨ε} τ1(n, ε)/(ln ε^{−1}c_n) = 1/(2v0) Pϑ-a.s., (27)
lim_{n∨ε} [(ln^{−3} ε^{−1}c_n) G̃1^{−1}(n, ε) − [(2v0)³g*13]^{−1} e^{2v0k1(n)h1} G̃13(h1)] = 0 Pϑ-a.s. (28)
From (8) and (28) the Pϑ-a.s. finiteness of the stopping time σ1(ε) for every ε > 0 follows. The proof of assertion I.1 of Proposition 3.1 for the case Θ13 is similar, e.g., to the proof of the corresponding assertion in [14, 16]:
Eϑ||ϑ1(ε) − ϑ||² = Eϑ (1/S1²(σ1(ε))) ||∑_{n=1}^{σ1(ε)} β1²(n, ε)(ϑ1(n, ε) − ϑ)||² ≤
≤ (εδ1/ρ1) Eϑ ∑_{n=1}^{σ1(ε)} (1/c_n) · β1^{−2}(n, ε) · ||G̃1^{−1}(n, ε)||² · ||ζ̃1(n, ε)||² ≤
≤ (εδ1/ρ1) ∑_{n≥1} (1/c_n) Eϑ||ζ̃1(n, ε)||² ≤ εδ1 · (15(3 + R²)/ρ1) ∑_{n≥1} 1/c_n = εδ1.
Now we prove assertion I.2 for ϑ ∈ Θ13. Denote the number
g̃13 = [(2v0)³g*13]² ρ1^{−1}δ1 ||G̃13(h1)||^{−2}
and the times
σ′13(ε) = inf{N ≥ 1 : ∑_{n=1}^{N} ln⁶ ε^{−1}c_n > g̃13 e^{4v0h1}},
σ″13(ε) = inf{N ≥ 1 : ∑_{n=1}^{N} ln⁶ ε^{−1}c_n > g̃13 e^{20v0h1}}.
From (8) and (28) it follows that, for ε small enough,
σ′13(ε) ≤ σ1(ε) ≤ σ″13(ε) Pϑ-a.s. (29)
Denote
Ψ′13(ε) = (1/(2v0)) ln(ε^{−1}c_{σ′13(ε)}), Ψ″13(ε) = (1/(2v0)) ln(ε^{−1}c_{σ″13(ε)}). (30)
Then, from (8), (26) and (29) we finally obtain assertion I.2 of Proposition 3.1:
lim inf_{ε→0} [T1(ε) + (1/v0) ln T1(ε) − Ψ′13(ε)] ≥ (1/(2v0)) ln g*13 Pϑ-a.s.,
lim sup_{ε→0} [T1(ε) + (1/v0) ln T1(ε) − Ψ″13(ε)] ≤ (1/(2v0)) ln g*13 Pϑ-a.s.
For the proof of assertion II of Proposition 3.1 we use the representation (9) for the deviation:
ϑ1(n, ε) − ϑ = (ln^{−3} ε^{−1}c_n) G̃1^{−1}(n, ε) · (τ1²(n, ε)e^{2τ1(n,ε)v0}/(ε^{−1}c_n)) · (ln ε^{−1}c_n/τ1(n, ε))³ · (τ1^{−1}(n, ε)e^{2τ1(n,ε)v0})^{−1} ζ1(n, ε),
where
ζ1(n, ε) = ζ1(τ1(n, ε) − k1(n)h1, h1),
ζ1(T, s) = ∫_{0}^{T} Ψs(t − 2 − 5s)(Δ̃ξ(t) dt + dV(t) − dV(t − 1)).
According to (25), (27) and (28), the first three factors on the right-hand side of this equality have Pϑ-a.s. positive finite limits. The last factor vanishes in the Pϑ-a.s. sense by the properties of the square integrable martingales ζ1(T, s):
lim_{n∨ε} ζ1(n, ε)/(τ1^{−1}(n, ε)e^{2τ1(n,ε)v0}) = lim_{T→∞} ζ1(T, h1)/(T^{−1}e^{2v0T}) = 0 Pϑ-a.s.
Then the estimators ϑ1(n, ε) are strongly consistent as ε → 0 or n → ∞ and we obtain the
assertion II of Proposition 3.1.
Hence Proposition 3.1 is valid.
Proof of Proposition 3.2.
Similarly to the proof of Proposition 3.1 and [7]–[16], we can get the following asymptotic relations as t → ∞ for the processes Δ̃Y(t), Z(t) and Z̃(t):
– for ϑ ∈ Θ21
Δ̃Y(t) = CY e^{v0t} + CY1 e^{v1t} + o(e^{γt}) Pϑ-a.s.,
Z(t) = CZ e^{v1t} + o(e^{γt}) Pϑ-a.s.,
λt − λ = (2v0e^{v0}/(v0 + v1)) CZ CY^{−1} e^{−(v0−v1)t} + o(e^{−(v0−v1+γ)t}) Pϑ-a.s.,
Z̃(t) = C̃Z e^{v1t} + o(e^{γt}) Pϑ-a.s.;
– for ϑ ∈ Θ22
|Δ̃Y(t) − CY e^{v0t} − CY1(t)e^{v1t}| = o(e^{γt}) Pϑ-a.s.,
|Z(t) − CZ(t)e^{v1t}| = o(e^{γt}) Pϑ-a.s.,
λt − λ = 2v0e^{v0} CY^{−1} UZ(t) e^{−(v0−v1)t} + o(e^{−(v0−v1+γ)t}) Pϑ-a.s.,
|Z̃(t) − C̃Z(t)e^{v1t}| = o(e^{γt}) Pϑ-a.s.,
where CY and CY1 are some non-zero constants, 0 < γ < v1, CZ = CY1(1 − e^{v0−v1}), C̃Z = CZ(v1 − v0)/(v1 + v0); CZ(t), UZ(t) = ∫_{0}^{∞} CZ(t − u)e^{−(v0+v1)u} du and C̃Z(t) = CZ(t) − 2v0UZ(t) are periodic (with period Δ > 1) functions.
Denote
UZ̃(T) = ∫_{0}^{∞} C̃Z(T − u)e^{−(v0+v1)u} du,
UZ̃Z(S, T) = ∫_{0}^{∞} C̃Z(T − u)CZ(S − u)e^{−2v1u} du, ŨZ(T) = UZ̃Z̃(T, T).
It should be noted that the functions CZ(t), UZ(t), C̃Z(t) and UZ̃(T) have at most two roots on each interval of length Δ in [0, ∞), while the function UZ̃Z(S, T) has at most four.
With Pϑ-probability one we have:
– for ϑ ∈ Θ2
lim_{T−S→∞} e^{−2v0T} ∫_{S}^{T} (Δ̃Y(t − 3))² dt = (CY²/(2v0)) e^{−6v0}, (31)
– for ϑ ∈ Θ21
lim_{T−S→∞} e^{−2v1T} ∫_{S}^{T} Z̃²(t − 3) dt = (C̃Z²/(2v1)) e^{−6v1}, (32)
lim_{T−S→∞} G̃2^{−1}(S, T) = G̃21, (33)
where
G̃21 = [[(2v1(v1+v0)²/(CZC̃Z(v1−v0)²)) e^{3v1}, −(4v0v1(v1+v0)/(CZCY(v1−v0)²)) e^{3v0}], [−(2v1(v1+v0)²/(CZC̃Z(v1−v0)²)) e^{v0+3v1}, (4v0v1(v1+v0)/(CZCY(v1−v0)²)) e^{4v0}]],
– for ϑ ∈ Θ22
lim_{T−S→∞} |e^{−2v1T} ∫_{S}^{T} Z̃²(t − 3) dt − e^{−6v1} ŨZ(T − 3)| = 0, (34)
lim_{T−S→∞} ||G̃2^{−1}(S, T) − G̃22(T)|| = 0, (35)
where
G̃22(T) = [(1/(2v0)) UZ̃Z(T, T − 3) − UZ(T − 3)UZ̃(T)]^{−1} · [[e^{3v1}/(2v0), −(e^{3v0}/CY) UZ̃(T)], [−e^{v0+3v1}/(2v0), (e^{4v0}/CY) UZ̃(T)]].
The matrix G̃21 is constant and non-zero, and G̃22(T) is a periodic matrix function with period Δ > 1 (see [3], [10, 12, 14]); it may have infinite norm only for four points on each interval of periodicity.
The next step of the proof is the investigation of the asymptotic behaviour of the stopping
times ν2(n, ε), τ2(n, ε) and the estimators α2(n, ε).
40 Stochastic Modeling and Control
Another Random Scribd Document
with Unrelated Content
temps, pendant que lui se tranquillise au cabaret. Le maître
fioûle sa bouteille, la jument lit la gazette.
GAZOUILLON, s. m. Terme des campagnards. Margouillis. Se dit
surtout du margouillis qui provient d'un mélange de neige
fraîchement tombée et de pluie. Gazouillon et Margouillis sont
des onomatopées.
GÉANE, s. f. Géante. La merveilleuse géane étonna toute
l'assemblée. Français populaire.
GEL, s. m. Gelée. Le mot gel manque dans plusieurs dictionnaires
et en particulier dans celui de l'Académie française. Le
Complément de ce même dictionnaire, et le Dictionnaire
national de Bescherelle [1846], disent que gel, dans le sens de
«Gelée,» a vieilli. Nous pouvons affirmer que le mot gel,
signifiant: «Gelée,» est d'un emploi habituel chez nous et chez
nos proches voisins.
GELÉE AUX GROSEILLES, s. f. Dites: «Gelée DE groseilles.» Dites
aussi: Gelée DE pomme, gelée DE framboise, etc.
GELER DE FROID. Geler. Faites-moi vite un grand feu, je gèle de
froid. Français populaire.
GELER (SE), v. pron. Geler. Je me gèle ici à vous attendre. Faute
très-répandue. «Se geler» n'est français qu'en parlant des
choses. «Le mercure peut se geler. Le nez de Mme
Z*** se gela
au passage du grand Saint-Bernard.»
GEMOTTER, v. n. Signifie: 1o
S'impatienter, pester; 2o
Languir, être
languissant. La pauvre drôlesse, abandonnée de tout le monde,
était là à gemotter dans son lit. Ranimez donc ce feu qui ne fait
que gemotter. Dans le patois vaudois, gemotta veut dire: Gémir,
et dans le patois neuchâtelois, gemiller, s'impatienter. R. Gemo.
GENDRE, s. m. Se faire gendre, signifie, dans son sens le plus
large: Se procurer, par un riche mariage, une position douce,
confortable, oisive, à laquelle on ne serait jamais arrivé d'une
autre manière. Dans un sens plus restreint, se faire gendre se
dit facétieusement et dérisoirement d'un jeune homme du haut,
qui, ayant une fortune exiguë, des habitudes un peu
dispendieuses et un extérieur agréable, choisit pour femme une
riche héritière dans la classe bourgeoise. Cette expression
originale, se faire gendre, a été créée ou mise en circulation par
un charmant article du journal de Mr
Petit-Senn. [Voyez le
Fantasque de 1835, no
81, p. 322, et la Revue suisse de 1850,
livraison du mois de mai, p. 328.]
GENÈVRE, s. m. Des grains de genèvre. Ce terme nous vient du
vieux français. Au commencement du dix-huitième siècle, on
disait encore indifféremment genèvre et genièvre. «Genièvre» a
prévalu.
GENILLÉ, s. m. Nous appelons goût de genillé, un mauvais goût
que contractent les volailles qui ont été nourries dans un
poulailler petit et malpropre. Geniller veut dire «Poulailler» dans
le dialecte du Berry. Djeneuille, dans le patois vaudois, signifie:
Poule. Par métathèse, c'est-à-dire par transposition de lettres,
ces mots viennent du mot latin gallina, poule.
GENOU, s. m. Nous disons d'un couteau qui coupe mal: Il coupe
comme les genoux d'une vieille femme, comme les genoux de
ma grand'mère. Expression triviale, consignée dans le
Dictionnaire du Bas langage, t. II, p. 10.
GERLE, s. f. Corbeille ronde et peu profonde, destinée à recevoir le
légume qu'on porte au marché. En Dauphiné, gerle signifie:
Jarre, grand vase de terre. En languedocien, une gerle est un
baquet, un grand seau. Voyez JARLOT.
GÉROFLÉE, s. f. Géroflée blanche. Bouquet de géroflées. Terme
français populaire. On doit dire: «Giroflée.»
GÉROLE, s. f. Chervis, racine potagère. Dans quelques provinces
de France, on dit: Gyrole.
GESSION, s. f. On vient d'ôter à ce jeune dissipateur la gession de
sa fortune. Terme parisien populaire, etc. On doit écrire
«Gestion» et prononcer gess-tion.
GICLÉE ou JICLÉE, s. f. Signifie: 1° Jaillissement, liquide qui jaillit;
2° Éclaboussure, flaquée. En deux ou trois giclées, on se rendit
maître du feu. Une giclée de mortier suffira contre ce mur. Dans
le Jura, gicle, s. f., se dit d'une petite seringue de sureau, avec
laquelle les polissons s'évertuent à arroser les passants. [Voyez
Monnier, Vocabulaire du Jura.]
GICLER ou JICLER, v. n. et a. Signifie: 1° Jaillir, saillir, sortir
impétueusement; 2° Faire jaillir, jeter de l'eau. Faire gicler de
l'eau; faire gicler de la boue. Finis-donc, André, tu me gicles.
Terme suisse-roman, savoisien, franc-comtois et lyonnais. En
provençal et en languedocien: Jhiscla. Onomatopée
remarquable. Dans le patois bourguignon, chicclai signifie:
«Faire jaillir,» et chiccle se dit d'une «Canonnière» ou seringue
de bois dont s'amusent les enfants pour jeter de l'eau. [Voyez
les Noëls bourguignons de La Monnoye.]
GIFFLARD, DE, s. Joufflu, mouflard, qui a le visage bouffi et
rebondi. Un gros gifflard. On disait en vieux français: Giffard,
giffarde, terme formé de giffe ou giffle, joue.
GIFLÉE, s. f. Giffle, mornifle, taloche.
GIGASSE, s. f. Se dit d'une personne démesurément grande et un
peu dégingandée.
GIGIER, s. m. Gésier, second ventricule de certains oiseaux. Ne
jetez pas ces gigiers, ils serviront pour le bouillon. Terme
généralement usité en Suisse et en France, mais que les
dictionnaires n'ont pas recueilli. Nous disons aussi: Gisier.
GIGNER, v. a. Guigner, regarder du coin de l'œil.
GIGOT DE MOUTON, s. m. Dites simplement: «Gigot,» puisque
«gigot» signifie: Cuisse de mouton séparée du corps de l'animal
pour être mangée. [Acad.]
GIGUE, s. f. Se dit d'une personne dont la taille est grande et
toute d'une venue. Vois-tu là-bas cette grande gigue, cette
perche? En Normandie, une gigue est une jeune fille qui a de
grandes jambes. En français, «Gigue» veut dire: Jambe; et
«Giguer,» aller vite, courir, sauter, danser.
GILLOTIN, s. m. (ll mouillés.) Pantin, jeune garçon qui est toujours
en mouvement, et qui cherche à divertir par ses perpétuelles
pasquinades. Faire le gillotin.
GILLOTINER, v. n. Faire le gillotin.
GINGEOLET, ETTE, adj. Ginguet, court, étriqué. Habit gingeolet.
GINGUER ou JINGUER, v. n. Jouer, rire, sauter, folâtrer. Elle est
toujours à ginguer. Terme limousin, normand et vieux français.
En Picardie on dit: Jingler.
GIRADE, s. f. Girarde ou julienne, fleur.
GIRANIUM, s. m. Écrivez «Géranium» et prononcez géraniome.
Prononcez aussi albome, peinsome et laudanome les mots
Album, Pensum et Laudanum.
GIRAUD, nom propre. Nous disons proverbialement et
facétieusement à une personne qui nous fait une demande
inadmissible, à une personne qui porte très-haut ses prétentions
et dont l'attente sera trompée: As-tu connu Giraud?... Eh bien,
torche Miraud; ou plus laconiquement: As-tu connu Giraud?
c'est-à-dire: Bernicle; à d'autres; adresse ta demande à un
autre. Tu voudrais que je te prêtasse encore cinquante francs?
As-tu connu Giraud? Quoi! ton vilain cousin se flatte d'épouser
cette jeune et jolie Anna!... As-tu connu Giraud?
GISIER, s. m. Voyez GIGIER.
GISPINER, v. a. Expression adoucissante, pour signifier: Filouter,
attraper, enlever habilement et sans scrupule, comme le font
quelquefois des amis entre eux. Ce joli volume était à sa potte:
il me l'a tout bonnement gispiné. En Lorraine on dit: Gaspiner
ou gabsiner, et à Valenciennes, gobsiner.
GIVRÉ, ÉE, part. et adj. Couvert de givre. C'est givré; c'est tout
givré. Il a beaucoup givré cette nuit. Terme des campagnards.
GLACE, s. f. Ne dites pas: «Manger une glace.» Dites: «Prendre
une glace, prendre des glaces.»
GLACE, s. f. Être froid comme la glace; être uni comme la glace.
Retranchez l'article et dites: Être froid comme glace; être uni
comme glace.
GLACER UN PLAFOND. Terme de plâtrier. L'expression française
est: Enduire un plafond.
GLAFFER ou GLLAFFER, v. a. (ll mouillés.) Terme des
campagnards. Manger gloutonnement quelque chose qui croque
sous la dent. On le dit des pourceaux et de ceux qui, de près ou
de loin, leur ressemblent. Ce mot gllafer, quand on le prononce
comme il faut, imite parfaitement la chose qu'il doit peindre.
GLAÎNE ou GLÈNE, s. f. Faire glaîne, terme d'écolier, signifie: Faire
rafle, prendre à l'improviste les jouets, et surtout les mâpis des
joueurs. Ce polisson, ce voleur s'approcha doucement du carré
et nous fit glaîne. Voyez GLENNE, n° 1.
GLAPPE, s. f. Signifie: 1° Terre glaise; 2° Pisé. [P. G.]
GLAIRE, s. m. Le glaire d'un œuf. «Glaire» est féminin.
GLÉNER ou GLAÎNER, v. a. et n. Glaner, ramasser les épis après la
moisson. Terme français populaire et vieux français.
GLÉNEUR, GLÉNEUSE, s. Glaneur, glaneuse.
GLENNE, s. f. Glane, produit du glanage, glanure. Un bandit lui
enleva toutes ses glennes. Terme français populaire et vieux
français.
GLENNE, s. f. Sorte de renoncule des champs.
GLIN-GLIN, s. m. Terme enfantin. Le petit doigt. Il a bobo à son
glin-glin. Cette expression, usitée aussi dans les cantons voisins,
vient probablement des mots allemands klein, klein, qui
signifient: Petit, petit.
GLISSE, s. f. Terme de pâtissier. Cressin, sorte de petit pain long,
qui est fort léger à l'estomac.
GLISSE, s. f. Glissoire, chemin frayé sur la glace pour y glisser par
divertissement. Faire une bonne glisse, faire une longue glisse.
Gare, gare, sur la glisse! Terme suisse-roman et savoisien. On
dit à Lyon: Une glissière; en Lorraine, un glissant; à Paris, une
glissade.
GLISSER, v. neutre. La rue du Perron glisse souvent en hiver.
Dites: La rue du Perron est souvent glissante en hiver; ou dites:
On glisse souvent en hiver dans la rue du Perron.
GLISSER (SE), v. pron. Glisser, s'amuser à glisser. Les fossés sont
gelés: allons nous y glisser tous ensemble. Il faut dire: «Allons y
glisser tous ensemble.»
GLOPET, s. m. Sieste, méridienne. Voyez CLOPET.
GLU, s. masc. Du bon glu. Solécisme répandu aussi dans le reste
de la Suisse romane, en Savoie, en Dauphiné, en Franche-
Comté, en Lorraine et ailleurs.
GNIABLE, s. m. Sobriquet qu'on donne aux cordonniers.
GNIANIOU, s. m. Voyez NIANIOU.
GNIFFE-GNIAFFE, s. m. Ce terme fort expressif signifie: 1° Nigaud,
niais, benêt; 2° Flasque, lâche, mou et sans ressort. En Picardie
on dit: Gniouffe.
GOBE-LA-LUNE, s. m. Gobe-mouche, niais, grand niais qui marche
la tête levée comme s'il regardait la lune. Dans le patois du bas
Limousin, gobo-luno se dit de celui qui s'occupe niaisement de
bagatelles. [Voyez Béronie, Dictionnaire du patois du bas
Limousin.]
GOBERGER (SE), v. pron. Faire grande chère, bâfrer, faire
bombance, se régaler. Nos quatre amis allèrent à une auberge
de Coppet, où ils demandèrent des feras et des volailles, dont ils
se gobergèrent. Voyez donc comme ces enfants se gobergent et
s'empiffrent de raisins et de noix! En français, «Se goberger»
signifie: Prendre ses aises, se dorloter, se divertir.
GODAILLE, s. f. Débauche de bouche, bâfre, grande ribote. Faire
une godaille. Ce fut une godaille complète, une godaille de
mâlevie. Le dictionnaire de l'Académie ne fait pas mention de ce
terme; et, selon les dictionnaires de Boiste, de Landais et de
Bescherelle, godaille signifie: 1° Ivrognerie; 2° Mauvais vin. Ce
n'est point là le sens que nous lui donnons à Genève; ce n'est
pas non plus le sens qu'il a dans le langage français populaire.
[Voyez le Dictionnaire du Bas langage, t. II, p. 17, et le
Dictionnaire rouchi-français, aux mots godaïer et godalier.]
GODAILLER, v. n. Faire une grande ribote, une bâfre, une godaille.
Dans les dictionnaires ce verbe a un autre sens.
GODAILLEUR, s. m. Riboteur, bambocheur, bâfreur. Un tas de
godailleurs. Ce mot et les deux précédents sont probablement
originaires du nord de la France, où le mot godale signifie:
«Bière, petite bière.»
GODICHE, s. et adj. Plaisant, risible. Être godiche, être
plaisamment bête. Tu es godiche, toi! Voilà qui est vraiment
godiche. Terme parisien populaire recueilli par MM. Noël et
Chapsal. Les autres dictionnaires donnent à ce mot le sens de:
«Gauche, emprunté, maladroit.»
GODICHON, s. m. Diminutif de godiche.
GODRON, s. m. Goudron. GODRONNER, v. a. Goudronner. Les
mots godron et godronner appartenaient encore à la langue des
dictionnaires, il y a un siècle.
GOFFETTE, adj. fém. Nous appelons mains goffettes, des mains
grassettes, des mains potelées.
GOGNE, s. f. Courage, cœur, hardiesse, capacité. Avoir la gogne,
oser. Aurais-tu la gogne de sauter ce ruisseau? Non, tu n'en as
pas la gogne; tu n'as point de gogne.
GOGNE, s. f. Rebut, lie, crasse, crapule. Se dit des personnes et
des choses. Quelle gogne de bâton tu as là! Dis donc, Jacques,
et ce bal d'hier! Quel bal! Quelle gogne! Qu'as-tu donc appris
sur le compte de Robillard?—J'ai appris que c'est une gogne.—Et
sa famille?—Sa famille? C'est tout de la gogne. Tomber dans la
gogne, veut dire: Tomber dans la crapule. Terme vaudois. Chez
nos voisins du Jura, gone se dit d'une femme méprisable.
[Voyez C. Monnier, Vocabulaire du Jura.]
GÔGNES, s. f. pl. Compliments, cérémonies. Faire des gôgnes.
GOGNEUX, EUSE, adj. et s. Crasseux, dégoûtant, repoussant,
crapuleux. Se dit des personnes et des choses. Un chapeau
gogneux. Une tournure gogneuse; un air gogneux. Tu te
promenais hier avec deux individus bien gogneux. Dans le bas
limousin, gognou, et en vieux français, gognon, signifient:
Pourceau, cochon, et se disent de toute personne sale et
malpropre. Gognounà, faire grossièrement et salement un
ouvrage.
GOGUINETTE, s. f. Propos gaillard, parole un peu libre. Dire la
goguinette. Dire une goguinette; dire des goguinettes. En
Lorraine, goguenettes signifie: Propos joyeux. En vieux français,
goguer, v. n., veut dire: «Plaisanter.»
GOISE ou GOËZE, s. f. Serpe, grosse serpe. En Franche-Comté on
dit: Goisse et gouisse.
GOISET, GOAZET, ou GOINZET, s. m. Serpette. Se dit aussi d'un
couteau et principalement d'un mauvais couteau.
GOLÉE, s. f. Gorgée. Avales-en une seule golée. J'ai bu deux
petites golées de ton sirop, et j'en ai eu assez. En Picardie on
dit: Goulée. «Goulée» est un mot français; mais il signifie:
«Grande bouchée.»
GOLÉRON ou GOLAIRON, s. m. Ouverture, trou. Le goléron d'une
nasse. Dans l'ancienne langue provençale, golairos signifiait:
«Gosier.»
GOLET, s. m., et GOLETTE, s. f. Goulot, trou, orifice. Le golet d'une
bouteille. Terme jurassien et savoisien. Dans notre patois ces
mots ont une signification plus étendue.
GONFLE, s. f. Signifie: 1° Vessie des quadrupèdes; 2° Petite
ampoule sur la peau, cloche, élevure; 3° Bulle de savon. Sa
brûlure lui a fait lever des gonfles. Percer une gonfle. Se
soutenir sur l'eau avec des gonfles. Terme suisse-roman et
savoisien.
GONFLE, adj. Gonflé. Il a tant marché aujourd'hui, qu'il en a les
pieds gonfles. Terme français populaire. A Lyon on écrit et on
prononce confle.
GONGON, s. des 2 genres. Grognon, celui ou celle qui bougonne,
qui grogne. Cette gongon finira-t-elle une fois de nous ennuyer?
Le mari et la femme sont aussi gongons l'un que l'autre.
GONGONNER, v. a. Bougonner, marmonner, se fâcher, gronder.
Notre vieux raufin ne s'arrête pas de gongonner. Il gongonne
ses enfants, il gongonne sa servante, il gongonne tout le
monde. Terme suisse-roman, savoisien et lyonnais.
GONVÉ, s. m. Une odeur de gonvé, est une odeur de renfermé,
une odeur de linge sale et gras. Votre Baby Chailloux sentait
terriblement le gonvé.
GONVER, v. a. et n. Couver. L'incendie éclata le matin; mais le feu
avait gonvé toute la nuit. Ne crois-tu pas, femme, que notre
Françoise gonve une maladie?—Je crois qu'elle gonve la
rougeole. Ta seille, Madelon, est égrillée: il faut la faire gonver
(c'est-à-dire: Gonfler dans l'eau). Terme connu dans le canton
de Vaud. En Franche-Comté on dit: Gouver.
GONVIÈRE, s. f. Signifie: 1° Fondrière, creux plein de boue; 2° Tas
de neige amoncelé par le vent.
GOTRET, s. m. Terme de boucherie. Ris de veau.
GOTTE, s. f. Mauvais ouvrage, mauvaise marchandise, chose de
nulle valeur, et dont on ne fait aucun cas.
GOUAILLER, v. n. Crier. Voyez COUAILLER.
GOUGNAUD, AUDE, s. et adj. Se dit d'une personne ou d'une
chose de rebut. Quel gougnaud de chapeau tu as là! Notre
nouvelle voisine N** est une gougnaude; elle s'habille comme
une gougnaude.
GOUGNAUDER, v. a. Manier maladroitement, gâter en maniant,
déformer, froisser, chiffonner.
GOUGNAUDS ou GOUGNEAUX, s. m. pl. Vieux chiffons, mauvais
linge, vieilles nippes, et, en général, objets vieux et sans valeur.
GOUILLARD, ARDE, s. et adj. Voyez GOULIARD.
GOUILLE, s. f. Petite mare, endroit où la boue séjourne, flaque.
Marcher dans la gouille; tomber dans la gouille. Terme suisse-
roman, savoisien, dauphinois et franc-comtois. Dans le bas
Limousin on dit: Ga-oullio, et dans le Berry, gouillat.
GOUJATER, v. a. et n. Travailler comme un goujat. Prendre des
manières de goujat. Un ouvrage goujaté est un ouvrage
bousillé, un ouvrage fait vite et sans soin.
GOULIAFE, s. m. Glouton malpropre. A Paris on dit: Gouliafre;
dans le vieux français et en Picardie, goulafre.
GOULIARD, ARDE, s. et adj. Gourmet, friand. Oh! la gouliarde, qui
trempe son doigt dans le sirop! Ces petits gouliards eurent fripé
en un clin d'œil tous les bonbons. Terme vaudois, savoisien et
vieux français. Dans le Limousin, en Normandie et sans doute
ailleurs, goulard signifie: Goulu, gourmand.
GOULIARDISE, s. f. Friandise. Comment, Élisa! du beurre et de la
confiture sur ton pain? quelle gouliardise! Tu n'aimes que les
gouliardises, Georgette, et tu vivrais de gouliardises. En vieux
français on disait: Goulardise et gouillardise. R. gula.
GOURLLE, s. f. (ll mouillés.) Cep de vigne arraché. Dans le canton
de Vaud on dit: Gourgne.
GOURMANDISE (UNE). Un plat de gourmandises. Si vous êtes
sages, vous aurez chacun pour votre goûter une petite
gourmandise. Cette expression, fort usitée en Suisse et en
Savoie, n'est pas inconnue en France, quoique les dictionnaires
ne l'aient pas relevée. «Je t'avais préparé les gourmandises que
tu aimes,» dit feu Mr
De Balzac, dans un de ses romans.
L'expression française consacrée est: «Friandise.» Un plat de
friandises.
GOURMANDS (POIS). Pois goulus, pois dont la cosse est tendre et
se mange.
GOURME, s. m. Jeter son gourme. Ce mot est féminin.
GOÛTER SOUPATOIRE, s. m. Goûter qui tient lieu de souper.
GOUTTE AU NEZ, s. f. Expression méridionale, etc. Les
dictionnaires disent: «Roupie.»
GOUTTIÈRE, s. f. Voie d'eau, fente, trou, ouverture à un toit par
où l'eau de la pluie pénètre et coule en dedans. L'orage souleva
les tuiles et occasionna une gouttière. Le plafond, qui était tout
neuf, fut entièrement taché par les gouttières. Terme suisse-
roman, méridional, etc. On appelle en français Gouttière: 1° Le
chéneau qui reçoit et recueille les eaux de la pluie; 2° Le tuyau
de descente.
GOYARDE, s. f. Serpe. Dans le Berry on dit: Goyard.
GRABEAU, s. m. Mercuriale, censure. Bon grabeau, mauvais
grabeau. Faire le grabeau des étudiants. Être soumis au
grabeau; recevoir son grabeau. On lit dans notre Constitution de
1814: «Les membres du Conseil d'État qui ne sont point sujets
au grabeau, n'y assisteront pas.» Terme vaudois et
neuchâtelois.
GRABELER, v. a. Faire le grabeau. «La Compagnie des Pasteurs
élira chacun de ses membres; elle se grabellera elle-même.»
[Constitution de 1814.] «Tous les Conseillers d'État qui ne sont
ni Syndics, ni Lieutenant, ni Syndics sortant de charge, ni
Trésorier, ni membres du Tribunal civil et de la Cour suprême,
seront grabelés un à un à la balotte.» [Ibid.] Le mot grabeler,
en vieux français, signifiait: Examiner, éplucher, débattre, choisir.
GRABOT, s. m. Voyez GRABEAU.
GRABOTER, v. a. Se dit quelquefois pour grabeler.
GRADUATION, s. f. Dans le langage académique on appelle
Examen de graduation, un examen à la suite duquel l'étudiant
reçoit le grade de bachelier, ou celui de licencié, ou celui de
docteur.
GRAIFION, s. m. Voyez GREFFION.
GRAILET, s. m. Plat d'étain donné pour prix dans les tirs.
GRAILETTE ou GREULETTE, s. f. Sorte de terrine, sorte de
casserole à trois pieds, laquelle sert à réchauffer les ragoûts.
GRAILLON, s. m. Ce mot est français; mais à Genève il se dit,
entre autres: 1° D'un mets quelconque (viande, poisson,
légume, lait, etc.) qui, réchauffé, a contracté une mauvaise
odeur, un mauvais goût. Il se dit: 2° Des tabliers, torchons,
mauvais linges, etc., dont la cuisinière s'est servie. En français:
Goût de graillon, odeur de graillon, signifient: «Goût, odeur de
viande ou de graisse brûlée.» [Acad.]
GRAIN DE SEL, s. m. Quand les jeunes enfants voient voltiger près
d'eux un oiseau, et qu'ils demandent comment il faut s'y
prendre pour l'attraper, on leur répond que l'infaillible moyen est
de lui mettre un grain de sel sur la queue. De là a pris
naissance notre expression figurée: Mettre un grain de sel sur la
queue de quelqu'un; c'est-à-dire: «Faire d'inutiles efforts pour le
capter et pour l'attirer dans le filet.»
GRAIN DE SUCRE, s. m. Morceau de sucre. Fais attention,
Caroline, tu coupes les grains de sucre trop gros.
GRAINGE, adj. Voyez GRINGE.
GRAISSE, s. f. Réprimande, semonce sévère. Donner une graisse;
recevoir une graisse. Tu as eu ta graisse. Terme français
populaire.
GRAISSE DE CHAR, s. f. Vieux oing, cambouis.
GRAISSE-MOLLE, s. f. Saindoux, graisse de porc. En Dauphiné, en
Provence et en Languedoc, on dit: Graisse blanche; à Bordeaux,
graisse douce.
GRAMON, s. m. Gramen, chien-dent, plante dont les racines sont
d'un grand usage pour les tisanes apéritives. Boire sur le
gramon. En Dauphiné, on dit: Grame.
GRAND, adj. Les expressions suivantes: Ce n'est pas grand chose;
j'ai eu grand peine; voici la grand route, etc., sont des
expressions correctes, mais étranges, et qui nous viennent du
vieux français. Au treizième siècle, grand ou grant était un
adjectif des deux genres.
GRAND, s. f. Terme des campagnards. Grand'mère. Dis-moi,
Colette, comment se porte ta grand? Pauvre Monsieur, cette
bonne grand, nous l'avons perdue il y a huit jours.
GRANDE-MAISON (LA). Terme adoucissant, euphémisme pour
dire: L'hôpital, la maison de charité. Jamais, non jamais,
Monsieur le Directeur, je ne consentirai à entrer dans la Grande-
maison.
GRANDET, ETTE, adj. Grandelet. Notre Stéphanie est déjà
grandette. Terme excellent, employé dans tout le Midi et sans
doute ailleurs.
GRAND-LOUIS ou GRAND-SIFFLET, s. m. Courlis ou courlieu
cendré, oiseau aquatique.
GRANGER, s. m. Métayer, fermier partiaire, fermier qui partage le
produit des champs avec le propriétaire. Ce terme, si connu
dans la Suisse romane, en Savoie et en Franche-Comté, n'a été
recueilli ni par le dictionnaire de l'Académie, ni par M. Poitevin,
le plus récent des lexicographes, ni par Gattel, ni par M.
Bescherelle; mais Boiste et N. Landais l'ont mentionné.
GRANGERIE, s. f. Grangeage. Mettre un domaine en grangerie, ou
à grangerie, c'est: En confier l'exploitation à un granger. Voyez
ce mot. Le mot grangerie, très-usité chez nous, n'a été
enregistré que par un seul dictionnaire moderne, le Complément
de l'Académie.
GRATON, s. m. Aspérité sur le papier, sur le terrain, etc. Sa boule
rencontra un graton.
GRATTE-À-CUL, s. m. Gratte-cul, fruit de l'églantier.
GRATTE-BOISSEUSE ou GRATTE-BOESSEUSE, s. f. Polisseuse de
boîtes de montres. Boesse ou gratte-boesse se disent d'une
sorte d'outil de ciseleur.
GRATTE-LOTON, s. m. Sobriquet qu'on donne aux ouvriers
horlogers. Voyez LOTON.
GRATTER, v. a. Gratter la rogne à quelqu'un, signifie: Le flatter
pour en obtenir une faveur, le cajoler, le flagorner dans des vues
intéressées. Il s'aperçut enfin que sa nièce lui grattait la rogne,
et qu'elle en voulait, par-dessus tout, à l'héritage. Expression
triviale. Dans le français populaire, on dit en ce même sens:
«Gratter l'oreille,» ou «gratter l'épaule à quelqu'un.» [Voyez le
Dictionnaire du Bas langage, t. II.]
GRATUISE, s. f. Râpe de fer-blanc, ustensile de cuisine. En
Dauphiné et en Languedoc, on dit: Gratuse; dans le patois
provençal, gratuè. En vieux français, gratuser signifie: Râper.
GRAVANCHE, s. f. Sorte de férâ. Voyez ce mot.
† GRAVATE, s. f. Cravate. Dis voir, femme, fadrait-il pas mettre une
gravate à notre petit, qui a un commencement de rouche?
Terme suisse-roman, savoisien, franc-comtois et méridional.
GRAVE, s. f. Grève, endroit au bord d'une rivière couvert de
gravier. Terme dauphinois et vieux français.
GRAVELAGE, s. m. Action de graveler.
GRAVELER, v. a. Couvrir de gravier. Graveler les allées d'un jardin;
graveler une promenade. Terme indispensable, et qu'on cherche
vainement dans les dictionnaires. En Languedoc on dit: Agraver.
GRAVELLE, s. f. Maladie des moutons, clavelée.
GREBATTER, v. a. Rouler. Se grebatter, se rouler. Expressions très-
familières aux campagnards.
GRÈBE (UNE). Sorte d'oiseau plongeur. Dites au masculin: Un
grèbe. Grèbe cornu, grèbe huppé.
GRÈBION, s. m. Grèbe esclavon, grèbe oreillard.
GREBOLER, v. n. Grelotter, trembler de froid. Je le trouvai tout
greulant, tout grebolant. En Savoie on dit: Grevoler; dans le
patois dauphinois, gromolà.
GREDON ou GREUDON, s. m. Guenilles, vieilleries, objets de
rebut.
GREGNOLU, UE, adj. Qui a beaucoup de nœuds. Bois gregnolu.
Terme des campagnards.
GREIFION, s. m. Gros bigarreau. Une livre de greifions. Terme
suisse-roman, savoisien et jurassien. En provençal, en
piémontais et en vieux français, on dit: Graffion; dans le
Languedoc, agrefion.
GREINGE, adj. Voyez GRINGE.
GRELON, s. m. Écrivez et prononcez «Grêlon.»
GREMILLETTE, s. f. (ll mouillés.) Lézard gris, lézard de murailles.
[P. G.] Dans le patois de Rolle (canton de Vaud) on dit:
Gremeillette.
GREMOLLION ou GREMAILLON, s. m. Grumeau, portion durcie
d'un liquide. La soupe s'était mise en gremollions. Notre pauvre
Estelle vomissait des gremollions de sang. Terme connu aussi
chez nos voisins du canton de Vaud. Dans le Berry et en
Lorraine on dit: Gremillion.
GRENÉ, ÉE, adj. Épi grené. Terme méridional et vieux français.
Dites: «Épi grenu.» Le verbe «grener» est français.
GRENETTE, s. f. Ce mot signifiait jadis: Marché aux grains; et c'est
le nom que porte encore aujourd'hui notre halle au blé. Terme
vaudois, savoisien, etc.
GRENETTE, s. f. Semen contra, poudre contre les vers, barbotine.
GRENIER À LESSIVE, s. m. Séchoir, sécherie, étendage.
GRENOUILLE, s. f. (fig.) Sorte de petit instrument formé d'une tête
de bouteille recouverte d'un morceau de parchemin traversé par
du crin. En le faisant tourner comme une crécelle, il imite assez
bien le cri des grenouilles, quand elles commencent à crier au
printemps. [P. G.]
GRÈSE, adj. fém. Voyez GRÈZE.
GRÉSILLER, v. neutre. Croquer sous la dent, comme le pain
lorsqu'il s'y est mêlé du sable ou du menu gravier. En
Languedoc on dit: Gréziner.
GREUBE, s. f. Tuf, terre sèche et dure qui sert à écurer, à nettoyer
les ustensiles de cuisine, les tablettes de sapin, etc. Patte à
greube. Terme suisse-roman et savoisien. Le vendeur de greube
s'appelle, dans notre patois: Le greubi.
GREUBIÈRE, s. f. Carrière d'où l'on tire la greube.
GREUBONS, s. m. pl. Peau croustillante qui reste quand on vient
de fondre du lard. Un plat de greubons. A Neuchâtel et dans
quelques parties du canton de Vaud, on dit: Grabon; dans
l'allemand-suisse, Grieben.
GREUGER, v. a. Gruger, friper, dissiper en folles dépenses. Il avait
hérité trois mille francs: c'est tout greugé. En vieux français,
gréuge signifie: Perte, dommage.
GREULER, v. actif. Secouer un arbre pour en faire tomber les
fruits. Greuler un cerisier, greuler un pommier. En Savoie on dit:
Creuler; en Franche-Comté, crôler; en vieux français, crosler et
crouller. Figurément et familièrement, creuler s'emploie dans le
sens de: Questionner quelqu'un, lui arracher des nouvelles, le
forcer, de façon ou d'autre, à dire ce qu'il sait et qu'il se soucie
peu ou point de raconter. Nous l'avons tant pressé, nous l'avons
tant greulé, qu'il a fini par nous débiter tout le journal. Voyez le
mot suivant.
GREULER, v. neutre. Grelotter, trembler de froid ou de peur. Ce
pauvre diable, blotti dans un fossé, greulait comme la feuille du
tremble. Terme suisse-roman, qu'on retrouve tel quel dans le
patois lorrain. Dans le Jura on dit: Grouller; en Bourgogne et en
vieux français, gruler. Nous disons à l'actif: Greuler la fièvre,
pour: Trembler la fièvre, avoir le tremblement qui résulte de la
fièvre. Nos campagnards disent en ce même sens ou sens
analogue: Greuler le marmot.
GREULETTE ou GREULAISON, s. f. Frisson, tremblement que
donne la fièvre ou la peur. Avoir la greulette; avoir la greulaison.
Cette dernière expression est surtout familière aux
campagnards.
GREULETTE, s. f. Sorte de terrine appelée aussi: Grailette.
GRÉVÉ, VÉE, adj. et part. Un fonds grévé; un domaine grévé
d'hypothèques. On doit écrire et prononcer «Grever,» sans
accent sur l'e.
GREVURE, s. f. Blessure. Ce terme vieillit.
GRÈZE ou GRÈSE, adj. f. Soie grèze. Soie qui est tirée de dessus le
cocon. Terme lyonnais. Dites: «Soie grége.»
GRIBICHE, s. f. Signifie: 1° Femme ou fille maligne, méchante,
pie-grièche; 2° Et plus souvent, Fille ou femme de mœurs
dissolues. En Normandie, gribiche se dit d'une vieille femme
méchante dont on fait peur aux enfants.
GRIE, s. f. Plâtre gris, gypse. Terme de nos campagnards et de
ceux du canton de Vaud. Il existe à Bernex une ancienne
carrière de grie, qui a fait donner le nom de grisse aux terrains
environnants.
GRIFFÉE, s. f. Griffade, coup de griffe.
GRILLE, s. f. Cheville du pied. S'écorcher la grille. Terme suisse-
roman, savoisien et franc-comtois.
GRILLER, v. a. Rôtir. Griller du café; griller des châtaignes; griller
des glands. Terme savoisien. En français «Griller» signifie: Rôtir
sur le gril. «J'avais couché mes pincettes sur la braise pour faire
griller mon pain.» [Xav. De Maistre, Voyage autour de ma
chambre, ch. VIII.]
GRILLET, s. m. (ll mouillés.) Sorte d'insecte. Le cri des grillets. Un
trou de grillet. Terme suisse-roman, savoisien, lyonnais, franc-
comtois et méridional. En Poitou et dans le Berry on dit: Grelet;
en limousin et en vieux français, gril. Les dictionnaires et le bon
usage veulent qu'on dise: «Grillon.» R. lat. gryllus.
GRILLOIRE, s. f. Sorte de petite casserole à manche, surmontée
d'un couvercle, et qui sert à rôtir le café. Dans le canton de
Vaud on dit: Un grilloir. Le grilloir à café.
GRILLOIRE, s. f. (fig.) Endroit où la chaleur est insupportable;
endroit où l'on grille. Ce cabinet au midi est une grilloire
pendant l'été.
GRILLOTTER, v. actif. Griller, frire.
GRIMPER, v. n. Dans notre langage énergique, faire grimper les
murs à quelqu'un, signifie: L'impatienter outre mesure, le vexer,
le dépiter à l'excès. Les dictionnaires disent en ce même sens:
«Faire sauter quelqu'un au plancher, le faire sauter aux nues.»
On dit en Languedoc: Faire monter quelqu'un au ciel sans
échelle.
GRIMPION, s. m. Grimpereau, oiseau bien connu. Au sens figuré,
nous appelons grimpion, grimpionne, celui ou celle qui cherche
par des politesses, par des avances répétées, par des flatteries,
à s'introduire, à se glisser dans une société plus élevée, plus
haut placée que la sienne. De là ont pris naissance les phrases
suivantes familières: C'est un grimpion; il fait le grimpion; elle
fait la grimpionne. Ces jeunes époux veulent grimper. Les
grimpions doivent éprouver quelquefois de fameux déboires.
GRIMPIONNER, v. n. Faire le grimpion, faire la grimpionne. Tu ne
t'aperçois pas que cette jeune femme veut absolument
grimpionner.
GRINGALET, ETTE, adj. Faible, chétif. Cheval gringalet; veau
gringalet. Ton beau-frère est bien gringalet, etc. Le Complément
du dictionnaire de l'Académie, et le dictionnaire de M.
Bescherelle ne présentent ce mot que comme substantif, et ne
l'emploient qu'en parlant de l'homme. Nous l'employons très-
souvent comme adjectif, et nous lui donnons des sens fort
étendus.
GRINGE, adj. Triste, ennuyé, chagrin, de mauvaise humeur,
maussade, malingre. Rosalie est toute gringe aujourd'hui, et je
crains qu'elle ne soit malade. Qu'avez-vous donc, Monsieur le
notaire? Vous paraissez sombre et préoccupé?—En effet, je suis
gringe. J'attendais mes enfants par le bateau à vapeur, et voilà
le bateau qui arrive sans eux. Terme suisse-roman. Dans le
patois de l'évêché de Bâle, on dit: Graigne; en Franche-Comté,
grigne, et en Bourgogne, greigne; dans le Berry, grignaut; dans
le patois rouchi, engraigné. Tous ces mots, qui sont fort usités,
n'ont point de correspondants exacts en français. Dans le
dialecte normand, grigner signifie: «Être maussade.» En
Picardie, grigneux et grignard veulent dire: Pleurnicheur.
GRINGERIE, s. f. Mauvaise humeur, malingrerie. Après une pareille
mésaventure, un peu de gringerie est bien permis.
GRIOTTE, s. f. En français, ce mot désigne une espèce de cerise
grosse et noirâtre, plus douce que les autres. En Suisse, au
contraire, nous appelons griotte une cerise acide.
GRIPPÉ, ÉE, adj. Atteint de la grippe. Toute la famille est grippée.
Ce mot, si connu en Suisse et en France, n'est dans aucun
dictionnaire usuel.
GRISAILLE, s. f. Ribotte, excès de table, excès de boisson. [P. G.]
GRISE, s. fém. Tour malin, malice, espièglerie. En faire des grises,
en faire voir de grises, signifie: Jouer des tours, faire des
malices, attraper, tourmenter. Voilà un bambin qui en fera voir
de grises à son père et à sa mère. Vous m'en faites des grises,
malins enfants que vous êtes. Locution dauphinoise, limousine,
etc.
GRISPER et GRISPOUILLER, v. a. Crisper, agacer, impatienter. Cela
me grispouille, c'est-à-dire: Cela me tarabuste.
GRISPILLE, s. f. Sorte de jeu ou d'amusement, appelé aussi tire-
poils, et en français: La gribouillette. [P. G.] À la grispille,
locution adverbiale, signifie: Au pillage. Tout était à la grispille
dans cette maison.
GRISPILLER, v. a. Voler, filouter, friponner.
GRISSE ou GRITZE, s. m. Gruau d'avoine ou d'orge. Ce terme,
usité dans toute la Suisse romane, est formé du mot allemand
Grütze, qu'on prononce gritze, et qui a le même sens.
GROGNASSER, v. n. Grogner, se plaindre en grognant. Terme
parisien populaire.
GROGNE, s. f. Mauvaise humeur, disposition à se plaindre. Avoir la
grogne.
GROGNER QUELQU'UN. Le gronder, le réprimander avec humeur.
Il grogne tout son monde; il ne cesse de nous grogner.
«Grogner,» verbe neutre, est français. «Cette femme ne fait que
grogner.»
GROGNONNE, adj. et s. féminin. Sa maladie l'a rendue un peu
grognonne. Dites: «Grogneuse.»
GROLLE, s. f. Vieux soulier fort usé, savate. Mettre des grolles.
Porter des grolles. Comment donc, Madame Bonnard? vous
nous donnez là du pain qui est sec comme de la grolle. Terme
vieux français et français populaire.
GRONDÉE, s. f. Gronderie, réprimande. Faire une grondée.
Recevoir une grondée.
GROS, s. m. Le gros de l'hiver; le gros de l'été. Dites: «Le fort de
l'hiver; le fort de l'été.»
GROS (LES). Les notables, les riches, les principaux de l'endroit.
Nos gros se montrèrent, en toute occasion, humains et
charitables.
GROS, s. m. Terme de calligraphie. Écrire en gros, c'est Écrire en
gros caractères. Il faut dire: «Écrire la grosse.»
GROS, adj. De gros en gros, locution adverbiale. Il consentit à
nous raconter de gros en gros cette singulière aventure. Il faut
dire: «En gros.» Raconter en gros.
GROS-BLÉ, s. m. Nonnette, variété de froment. Le gros-blé
s'appelle aussi en français: «Blé barbu» et «Blé poulard.»
GROS-FORT, s. m. Grande absinthe, plante.
† GROS MAL, s. m. Haut mal, épilepsie, mal caduc. Tomber du
gros mal. Terme vaudois et savoisien. Dans le Limousin on dit:
Le grand mal.
GROS NEIRET ou GROS NOIRET, s. m. Canard garrot.
GROSSET, ETTE, adj. Un peu gros. Un poulet grosset; une perdrix
grossette.
GROUP, s. m. Angine du larynx. Écrivez et prononcez «Croup.»
GRUER, v. a. Faire gruer de l'avoine. Dites: «Monder.» Monder de
l'avoine.
GRUGEUR, s. m. Celui qui gruge.
GRUMEAU, s. m. Terme de boucherie. La pièce du devant de la
poitrine de l'animal entre les deux jambes. Grumeau de bœuf;
grumeau de mouton. Terme méridional.
GRUMEAU, s. m. Cerneau, noix cassée. Terme de la Suisse
romane.
GRUS, s. m. pl. Gruau, orge mondé, avoine mondée. De la soupe
aux grus. Terme suisse-roman, savoisien et franc-comtois. En
Champagne, gru signifie: Son de farine; et en vieux français,
greu, farine d'avoine et de froment.
GRUS (DES). Terme de fromagerie. Du caillé, du séret mêlé de
crême. «La Fanchon nous servit des grus et de la céracée.» [J.-J.
Rousseau, Nouvelle Héloïse.]
GUENAPIN, s. m. Polisson, bandit, chenapan.
GUENICHE, s. f. Femme débraillée, sale et d'un aspect repoussant.
Terme lorrain. En vieux français, «guenuche» ou guenoche
veulent dire: Sorcière, enchanteresse. Dans l'évêché de Bâle,
genache a le même sens.
GUENILLERIE, s. f. Guenille, rebut, objet de rebut. Se dit des
personnes et des choses.
GUERRER, v. n. Terme enfantin, en usage surtout chez les
campagnards. Cette petite folle d'Ernestine veut toujours
guerrer avec nous, c'est-à-dire: Veut toujours être en guerre
avec nous, guerroyer, batailler.
† GUETTE, s. f. Guêtre. De vieilles guettes. Français populaire.
GUETTON, s. m. Petite guêtre, guêtron. Une paire de guettons.
Terme savoisien, rouchi, etc.
GUEULÉE, s. f. Cri éclatant, clameur perçante. Pousser des
gueulées. Faire des gueulées. Ce n'étaient pas des chants,
c'étaient des gueulées d'enfer. Ce mot est français, mais dans
une acception différente.
GUEULER, v. a. Gueuler quelqu'un, l'appeler à voix forte. Tu ne
m'entends donc pas, Colombier: il y a une demi-heure que je te
gueule. «Gueuler,» v. n., est français.
GUEUSER, v. n. Faire une action de gueux, se conduire mal, faire
une gueuserie. Priver cette petite fille de son bal, c'est gueuser,
c'est coquiner, c'est être par trop sévère et méchant. En
français, «Gueuser» signifie: Mendier.
GUEUSERIE, s. f. Tour malin, méchanceté, action coupable. Sevrer
un enfant de quatre mois, c'est une gueuserie.
GUICHE, s. f. Jambe. Tirer la guiche, traîner la guiche. Après
douze heures de marche, le sac au dos, on commence joliment
à tirer la guiche.
GUIDE, s. f. Terme des campagnards. Digue. Élever des guides
contre le torrent. Guide vient-il de «digue» par une transposition
de lettres? Guide est-il au contraire le terme primitif et
véritable? Une digue n'est autre chose, en effet, qu'une barrière
établie pour guider les eaux. Voyez dans Gattel l'étymologie
banale.
GUIGNACHE, s. f. Guignon, guignon achevé.
GUIGNAUCHE, s. f. Guenuche, femme de mauvaise façon, femme
mal mise, fagotée, vêtue salement. Dans le canton de Vaud,
guignauche ou guegnauche signifie: Sorcière.
GUIGNE-EN-L'AIR. Badaud, imbécile.
GUILLAME, s. m. Grand guillame, grand flandrin.
† GUILLE, s. f. Quille. Jouer aux guilles. Terme suisse-roman,
franc-comtois et lorrain.
GUILLE, s. f. Terme des campagnards. Fine pointe, sommet,
sommité. La guille d'un clocher, la guille d'un arbre, la guille
d'une tour. Guillon, dans le canton de Vaud, a le même sens. A
notre fête du Tir fédéral [1851], un Vaudois disait: J'ai vu
planter le drapeau de la Confédération sur le fin guillon de la
Tour de l'Isle. En Franche-Comté, la pointe du jour s'appelle:
L'aube guillerole. [Vocabulaire jurassien de M. Monnier.] De cette
racine guille, viennent indubitablement les mots genevois
déguiller, aguiller, guille (à jouer), etc.
GUILLE, adj. À moitié ivre, gris. R. Guille, pointe. On dit en
français: Avoir une pointe de vin.
GUILLEMETTE (EN), loc. adv. Être en guillemette, signifie: Être en
pile, être l'un sur l'autre. Ces livres sont trop en guillemette, ils
vont tomber.
GUILLERETTE, s. f. Être à la guillerette ou être en guillerette, se
disent d'un objet mis dans une position d'où il risque de tomber.
Guillet, dans notre patois, et guilleret, dans le patois vaudois,
signifient: Sommet d'un arbre, d'un rocher, d'un bâtiment. Voyez
GUILLE, no 2.
GUILLERI, s. m. Courir le guilleri. Terme dauphinois, etc. Les
dictionnaires disent: «Courir le guilledou.»
GUILLETTE, s. f. (Prononcez ghillette.) Signifie: 1o Boulette de
pâte dont on engraisse les dindes; 2o Fusée de poudre. Voyez
GUILLE, no 2.
GUILLON, s. m. (Prononcez ghillon.) Fausset de tonneau, petite
broche de bois servant à boucher le trou qu'on fait à un tonneau
pour donner de l'air ou pour goûter le vin. Mettre un guillon.
Ôter le guillon. Terme vaudois, savoisien et jurassien. A Lyon, on
dit: Une guille; dans les environs de Dôle, une guillotte. Voyez
GUILLE, no 2.
GUILLONNER, v. a. Mettre le guillon, mettre le fausset.
GUINCHE, adj. Louche, qui a la vue de travers. En provençal on
dit: Guèchou. Dans le Berry, faire la guinche, signifie: Baisser la
tête après une mauvaise action.
GUINCHER, v. a. et n. Signifie: 1o Lorgner du coin de l'œil,
guigner; 2o Loucher, regarder de travers. Terme provençal.
GUINGOINE (DE), adv. De guingois, de travers, de biais, en
biaisant. Il marche tout de guingoine. Son habit allait tout de
guingoine. Nous disons aussi: De guingouarne et de
guingouaine. En Picardie, on dit: De guingoin.
GUIZE, s. f. (Prononcez ghize.) Terme de forge. Gueuse, fonte de
fer, fer coulé. «Un tuyau de gueuse.»
GY ou GI, s. m. Un tonneau de gy. Terme suisse, savoisien, franc-
comtois, méridional et vieux français. On doit dire: «Gypse, ou
plâtre.»
GYPER, v. a. Plâtrer, enduire de plâtre.
GYPERIE, s. f. Plâtrage, ouvrages en plâtre. La gyperie de cette
seule chambre avait coûté six cents francs.
GYPIER, s. m. Plâtrier.
GYSSAGE, s. m. Plâtrage.
GYSSER, v. a. Appliquer du plâtre, enduire de plâtre, plâtrer.
Gysser un plafond, gysser une paroi.
GYSSEUR, s. m. Ouvrier qui emploie le gypse, plafonneur. Dans le
Valais on dit: Gypseur.
H
HABILLÉ, ÉE. Participe. Nous disons d'une personne stupide, d'une
personne dépourvue de tout bon sens: C'est une bête habillée.
HABILLÉ EN. Habillé en noir, habillé en blanc. Dites: Habillé DE
noir, habillé DE blanc.
HABITUÉ, ÉE, adj. Place habituée; jeu habitué; lecture habituée;
promenade habituée. Dites: Place habituelle, jeu habituel,
promenade habituelle, lecture habituelle. [Boiste.]
HABITUER, v. a. J'ai habitué cet appartement, et j'y reste. Les
bonnes d'enfants ont habitué la promenade de la Treille. J'aime
mon cercle, je n'y rencontre que des personnes que j'ai
habituées. Toutes ces phrases sont autant de barbarismes.
† HABRE-SAC, s. m. Havre-sac. R. all. Haber, avoine.
† HACHIS, s. m. L'h de ce mot doit s'aspirer; mais dans le langage
populaire on prononce l'hâchis. On prononce aussi l'hareng, les-
z-haricots, les-z-harnais, les-z-hasards, l'hai-ye (la haie), l'hibou,
l'hangar, j'haïs (je hais), c'est-t-hideux, c'est-t-honteux, etc., etc.
HACHON, s. m. Hache, petite hache. L'hachon lui échappa des
mains. Hachon appartient au vieux français, et au patois du
canton de Vaud. On dit à Bordeaux: Hachot.
HAMEÇON, s. m. L'h de ce mot n'est point aspiré. On dit: Prendre
l'hameçon, mordre à l'hameçon. C'est donc par inadvertance,
sans doute, que MM. Ch. Nodier et Ackermann, dans leur
Vocabulaire français [1836], disent qu'il faut prononcer le
hameçon, en aspirant l'h.
HANCHOIS, s. m. (h aspiré.) Une salade de hanchois. Écrivez sans
h, «anchois,» et n'aspirez pas l'a.
HARENG, s. m. (fig.) Banc de sable, banc de gravier, îlot. Les
harengs de l'Arve. Tirer du sable de l'hareng. (sic.) L'Arve a
tellement grossi pendant ces trois jours, qu'elle a emporté
l'hareng. «Nous voyons souvent dans le lit d'une rivière, une
grande pierre retarder la vitesse des eaux, et occasionner un
amas de sable et de gravier: de là naissent des HARENGS qui,
etc.» [De Saussure, Voyage dans les Alpes, t. Ier, p. 245.]
† HASARD, s. m. Terme d'encan. Miser un n-hasard.
HASARD DU POT (LE). Viens manger ma soupe quand tu voudras;
c'est au hasard du pot. On dit en France: La fortune du pot.
HAUT (LE). Les gens du haut, les dames du haut, les bals du haut,
etc. Se frotter contre les gens du haut; imiter, singer les gens du
haut. Ces expressions, d'un usage universel à Genève, ont
besoin d'être expliquées aux étrangers. Notre ville, étant bâtie
sur un coteau, se trouve naturellement divisée en haute et
basse ville. Or, comme les familles aisées demeurent, pour la
plupart, dans les quartiers du haut, on appelle gens du haut, les
riches de ces quartiers, en tant du moins que leurs familles sont
anciennes. Avec cette courte explication on comprendra sans
peine ce passage des Confessions de J.-J. Rousseau [liv. Ier]: «Il
était, lui (Bernard, le cousin de Jean-Jacques), il était, lui, un
garçon du haut; moi, chétif apprenti, je n'étais plus qu'un enfant
de Saint-Gervais.»
HAUT, HAUTE, adj. Nous disons proverbialement d'un homme
orgueilleux et fier: Il est haut comme le temps, c'est-à-dire: Il
est excessivement fier et hautain. Expression languedocienne,
etc.
HAUT-BANC, s. m. Sorte d'échoppe.
HAUT-DE-CORPS (UN). Son cheval ne cessait de faire des hauts-
de-corps. Dites: Des hauts-le-corps.
HAUT GOÛT, s. m. Nous disons d'une sauce salée, poivrée, épicée:
Cette sauce a un haut goût. L'Académie dit: «Cette sauce EST DE
haut goût.»
Stochastic Modeling And Control Edited By Ivan Ganchev Ivanov

  • 6. STOCHASTIC MODELING AND CONTROL Edited by Ivan Ganchev Ivanov
  • 7. Stochastic Modeling and Control
http://dx.doi.org/10.5772/2567
Edited by Ivan Ganchev Ivanov

Contributors: J. Linares-Pérez, R. Caballero-Águila, I. García-Garrido, Uwe Küchler, Vyacheslav A. Vasiliev, Serena Doria, Ricardo López-Ruiz, Jaime Sañudo, David Opeyemi, Tze Leung Lai, Tiong Wee Lim, Jingtao Shi, Ivan Ivanov, Sergey V. Sokolov, Nicholas A. Nechval, Maris Purgailis, Ming Xu, Jiamin Zhu, Tian Tan, Shijie Xu, Vladimir Šimović, Siniša Fajt, Miljenko Krhen, Raúl Fierro, Ying Shen, Hui Zhang

Published by InTech, Janeza Trdine 9, 51000 Rijeka, Croatia
Copyright © 2012 InTech

All chapters are Open Access distributed under the Creative Commons Attribution 3.0 license, which allows users to download, copy and build upon published articles even for commercial purposes, as long as the author and publisher are properly credited, which ensures maximum dissemination and a wider impact of our publications. After this work has been published by InTech, authors have the right to republish it, in whole or part, in any publication of which they are the author, and to make other personal use of the work. Any republication, referencing or personal use of the work must explicitly identify the original source.

Notice: Statements and opinions expressed in the chapters are those of the individual contributors and not necessarily those of the editors or publisher. No responsibility is accepted for the accuracy of information contained in the published chapters. The publisher assumes no responsibility for any damage or injury to persons or property arising out of the use of any materials, instructions, methods or ideas contained in the book.

Publishing Process Manager: Dimitri Jelovcan
Typesetting: InTech Prepress, Novi Sad
Cover: InTech Design Team

First published November, 2012
Printed in Croatia

A free online edition of this book is available at www.intechopen.com
Additional hard copies can be obtained from orders@intechopen.com

Stochastic Modeling and Control, Edited by Ivan Ganchev Ivanov
p. cm. ISBN 978-953-51-0830-6
  • 9. Contents

Preface IX
Chapter 1. Design of Estimation Algorithms from an Innovation Approach in Linear Discrete-Time Stochastic Systems with Uncertain Observations (J. Linares-Pérez, R. Caballero-Águila and I. García-Garrido) 1
Chapter 2. On Guaranteed Parameter Estimation of Stochastic Delay Differential Equations by Noisy Observations (Uwe Küchler and Vyacheslav A. Vasiliev) 23
Chapter 3. Coherent Upper Conditional Previsions Defined by Hausdorff Outer Measures to Forecast in Chaotic Dynamical Systems (Serena Doria) 51
Chapter 4. Geometrical Derivation of Equilibrium Distributions in Some Stochastic Systems (Ricardo López-Ruiz and Jaime Sañudo) 63
Chapter 5. Stochastic Modelling of Structural Elements (David Opeyemi) 81
Chapter 6. Singular Stochastic Control in Option Hedging with Transaction Costs (Tze Leung Lai and Tiong Wee Lim) 103
Chapter 7. Stochastic Control for Jump Diffusions (Jingtao Shi) 119
Chapter 8. Iterations for a General Class of Discrete-Time Riccati-Type Equations: A Survey and Comparison (Ivan Ivanov) 147
Chapter 9. Stochastic Observation Optimization on the Basis of the Generalized Probabilistic Criteria (Sergey V. Sokolov) 171
Chapter 10. Stochastic Control and Improvement of Statistical Decisions in Revenue Optimization Systems (Nicholas A. Nechval and Maris Purgailis) 185
Chapter 11. Application of Stochastic Control into Optimal Correction Maneuvers for Transfer Trajectories (Ming Xu, Jiamin Zhu, Tian Tan and Shijie Xu) 211
Chapter 12. Stochastic Based Simulations and Measurements of Some Objective Parameters of Acoustic Quality: Subjective Evaluation of Room Acoustic Quality with Acoustics Optimization in Multimedia Classroom (Analysis with Application) (Vladimir Šimović, Siniša Fajt and Miljenko Krhen) 233
Chapter 13. Discrete-Time Stochastic Epidemic Models and Their Statistical Inference (Raúl Fierro) 253
Chapter 14. Identifiability of Quantized Linear Systems (Ying Shen and Hui Zhang) 279
  • 12. Preface

Stochastic control plays an important role in many scientific and applied disciplines. The goal of this book is to collect a group of outstanding investigations into various aspects of stochastic systems and their behavior.

Linear discrete-time stochastic systems with uncertain observations (Chapter 1) and stochastic delay differential equations with noisy observations (Chapter 2) are considered at the beginning of the book. The model of coherent upper conditional previsions, analyzed in Chapter 3, is proposed for forecasting in chaotic systems. A geometrical interpretation of different multi-agent systems evolving in phase space under the hypothesis of equiprobability is introduced, and some new results for statistical systems are obtained, in Chapter 4. The process and procedures for the stochastic modelling of structural elements are analyzed in Chapter 5.

The next three chapters are concerned with financial and control-theoretic applications: a singular stochastic control model for European option hedging with transaction costs is studied in Chapter 6, and a stochastic optimal control problem for jump diffusions, in which the controlled system is governed by a nonlinear stochastic differential equation driven by both Brownian motion and a Poisson random measure, is investigated in Chapter 7; the control is allowed to enter both the diffusion and jump terms, and the relationship between the stochastic maximum principle and the dynamic programming principle for this problem is derived. Iterations for computing the maximal and stabilizing solutions of the discrete-time generalized algebraic Riccati equations, together with their numerical properties, are considered in Chapter 8.

The remaining chapters can be considered as a collection of applications of optimal control: a synthesis problem for the optimal control of the observation process based on generalized probabilistic criteria (Chapter 9), and a problem of improving statistical decisions in revenue management systems under parametric uncertainty (Chapter 10). In Chapter 11, an optimal correction maneuver strategy is proposed from the viewpoint of stochastic control to track two typical transfer trajectories, for orbital rendezvous and a Halo orbit, respectively. In Chapter 12, stochastic-based simulations and measurements of some objective parameters of acoustic quality, together with a subjective evaluation of room acoustic quality, are analyzed for acoustics optimization in a multimedia classroom. A discrete-time stochastic epidemic model is introduced and examined from a statistical point of view in Chapter 13. The last chapter studies the parameter identifiability of quantized linear systems with Gauss-Markov parameters from an information-theoretic point of view.

Prof. Ivan Ganchev Ivanov
Head of the Department of Statistics and Econometrics,
Faculty of Economics and Business Administration,
Sofia University "St. Kl. Ohridski", Sofia, Bulgaria
  • 15. Chapter 1

Design of Estimation Algorithms from an Innovation Approach in Linear Discrete-Time Stochastic Systems with Uncertain Observations

J. Linares-Pérez, R. Caballero-Águila and I. García-Garrido

Additional information is available at the end of the chapter.
http://dx.doi.org/10.5772/45777

1. Introduction

The least-squares estimation problem in linear discrete-time stochastic systems in which the signal to be estimated is always present in the observations has been widely treated; as is well known, the Kalman filter [12] provides the least-squares estimator when the additive noises and the initial state are Gaussian and mutually independent. Nevertheless, in many real situations the measurement device or the transmission mechanism can be subject to random failures, generating observations in which the state appears only randomly, or which may consist of noise alone, due, for example, to component or interconnection failures, intermittent failures in the observation mechanism, fading phenomena in propagation channels, accidental loss of some measurements, or data inaccessibility at certain times. In such situations, where information concerning the system state vector may or may not be contained in each observation, there is at each sampling time a positive probability (called the false alarm probability) that only noise is observed and, hence, that the observation does not contain the transmitted signal; moreover, it is generally not known whether the observation used for estimation contains the signal or is only noise.
To describe this interrupted observation mechanism (uncertain observations), the observation equation, with the usual additive measurement noise, is formulated by multiplying the signal function at each sampling time by a binary random variable taking the values one and zero (a Bernoulli random variable); the value one indicates that the measurement at that time contains the signal, whereas the value zero reflects the fact that the signal is missing and, hence, the corresponding observation is only noise. So the observation equation involves both an additive and a multiplicative noise, the latter modeling the uncertainty about the signal being present or missing at each observation. Linear discrete-time systems with uncertain observations have been widely used in estimation problems related to the above practical situations (which commonly appear, for example, in
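As a concrete illustration of this observation model, the following sketch simulates a scalar state and uncertain observations in which an i.i.d. Bernoulli variable multiplies the signal. All parameter values and the function name are illustrative assumptions, not taken from the chapter:

```python
import numpy as np

# Hypothetical scalar instance of the uncertain-observation model:
#   x_k = a * x_{k-1} + w_k        (state equation)
#   y_k = gamma_k * x_k + v_k      (observation equation)
# gamma_k ~ Bernoulli(p): the value 1 means the signal is present,
# the value 0 means the observation is noise only.

def simulate_uncertain_observations(n, a=0.95, q=0.1, r=0.2, p=0.8, seed=0):
    rng = np.random.default_rng(seed)
    gamma = rng.binomial(1, p, size=n)   # presence/absence of the signal
    x = np.empty(n)
    y = np.empty(n)
    x_prev = 0.0
    for k in range(n):
        x[k] = a * x_prev + rng.normal(0.0, np.sqrt(q))
        y[k] = gamma[k] * x[k] + rng.normal(0.0, np.sqrt(r))
        x_prev = x[k]
    return x, y, gamma

x, y, gamma = simulate_uncertain_observations(200)
# Whenever gamma[k] == 0, y[k] carries no information about x[k].
```

Note that the estimator never observes `gamma` directly; only `y` and the probability `p` are available, which is what makes the problem non-Gaussian.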
  • 16. Communication Theory). Due to the multiplicative noise component, even if the additive noises are Gaussian, systems with uncertain observations are always non-Gaussian; hence, as occurs in other kinds of non-Gaussian linear systems, the least-squares estimator is not a linear function of the observations and, generally, is not easily obtainable by a recursive algorithm. For this reason, research on this kind of system has paid special attention to the search for suboptimal (mainly linear) estimators of the signal. In some cases, the variables modeling the uncertainty in the observations can be assumed to be independent; then the distribution of the multiplicative noise is fully determined by the probability that each particular observation contains the signal. As was shown by Nahi [17] (the first to analyze the least-squares linear filtering problem in this kind of system, assuming that the state and observation additive noises are uncorrelated), knowledge of these probabilities allows estimation algorithms to be derived with a recursive structure similar to that of the Kalman filter. Later on, Monzingo [16] completed these results by analyzing the least-squares smoothing problem and, subsequently, [3] and [4] generalized the least-squares linear filtering and smoothing algorithms by considering that the additive noises of the state and the observation are correlated. However, there exist many real situations where this independence assumption on the Bernoulli variables modeling the uncertainty is not satisfied; for example, in signal transmission models with stand-by sensors, any failure in the transmission is detected immediately and the old sensor is then replaced, thus avoiding the possibility of the signal being missing in two successive observations.
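A minimal sketch of such a Kalman-like recursion for the scalar case with independent Bernoulli uncertainty is given below. It is a simplified Nahi-type filter written under the assumptions stated in the comments (the algorithms in the works cited above are more general); the known probability p enters the innovation, its variance, and the gain, and the innovation variance carries an extra term p(1 - p) * S2 involving the second moment of the state:

```python
import numpy as np

# Simplified sketch of a Nahi-type least-squares linear filter for
#   x_k = a x_{k-1} + w_k,   y_k = gamma_k x_k + v_k,
# with gamma_k ~ Bernoulli(p) i.i.d. and p known. Compared with the
# Kalman filter, p multiplies the predictor in the innovation, and the
# innovation variance gains the term p(1-p)*S2, where S2 = E[x_k^2].
# All parameter values are illustrative.

def nahi_filter(y, a=0.95, q=0.1, r=0.2, p=0.8, x0=0.0, P0=1.0):
    xhat, P = x0, P0
    S2 = P0 + x0 * x0                              # E[x_0^2]
    est = np.empty(len(y))
    for k, yk in enumerate(y):
        xpred = a * xhat                           # state prediction
        Ppred = a * a * P + q                      # prediction error variance
        S2 = a * a * S2 + q                        # propagate E[x_k^2]
        innov = yk - p * xpred                     # innovation
        S = p * p * Ppred + p * (1 - p) * S2 + r   # innovation variance
        K = p * Ppred / S                          # filter gain
        xhat = xpred + K * innov
        P = Ppred - K * p * Ppred                  # filtering error variance
        est[k] = xhat
    return est

# Illustrative run on data simulated from the same hypothetical model:
rng = np.random.default_rng(2)
n, a, q, r, p = 300, 0.95, 0.1, 0.2, 0.8
gamma = rng.binomial(1, p, size=n)
x = np.empty(n); y = np.empty(n); xp = 0.0
for k in range(n):
    x[k] = a * xp + rng.normal(0.0, np.sqrt(q))
    y[k] = gamma[k] * x[k] + rng.normal(0.0, np.sqrt(r))
    xp = x[k]
est = nahi_filter(y, a=a, q=q, r=r, p=p)
```

The gain and variance formulas follow from a standard linear minimum-mean-square-error derivation for this scalar model; they are meant to convey the structure of the recursion, not to reproduce the exact algorithms of the cited references.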
This different situation was considered in [9], under the assumption that the variables modeling the uncertainty are correlated at consecutive time instants; the proposed least-squares linear filtering algorithm provides the signal estimator at any time from those at the two previous instants. Later on, the state estimation problem in discrete-time systems with uncertain observations has been widely studied under different hypotheses on the additive noises involved in the state and observation equations and, also, under several hypotheses on the multiplicative noise modeling the uncertainty in the observations (see e.g. [22] - [13], among others). On the other hand, there are many engineering application fields (for example, communication systems) where sensor networks are used to obtain all the available information on the system state, and its estimation must be carried out from the observations provided by all the sensors (see [6] and references therein). Most papers concerning systems with uncertain observations transmitted by multiple sensors assume that all the sensors have the same uncertainty characteristics. In recent years, this situation has been generalized by several authors by considering uncertain observations whose statistical properties are not necessarily the same for all the sensors. This is a realistic assumption in several application fields, for instance, in networked communication systems involving heterogeneous measurement devices (see e.g. [14] and [8], among others). In [7] it is assumed that the uncertainty in each sensor is modeled by a sequence of independent Bernoulli random variables, whose statistical properties are not necessarily the same for all the sensors. Later on, in [10] and [1] the independence restriction is weakened; specifically, different sequences of Bernoulli random variables correlated at consecutive sampling times are considered to model the uncertainty at each sensor.
This form of correlation covers practical situations where the signal cannot be missing in two successive observations. In [2] the least-squares linear and quadratic problems are addressed when the Bernoulli variables describing the uncertainty in
Design of Estimation Algorithms from an Innovation Approach in Linear Discrete-Time Stochastic Systems with Uncertain Observations

the observations are correlated at instants that differ by two units of time. This study covers more general practical situations; for example, sensor networks where sensor failures may happen and a failed sensor is replaced not immediately, but two sampling times after having failed. However, even if it is assumed that any failure in the transmission results from sensor failures, the failed sensor usually may not be replaced immediately but after m instants of time; in such situations, correlation among the random variables modeling the uncertainty in the observations at times k and k + m must be considered and new algorithms must be deduced. The current chapter is concerned with the state estimation problem for linear discrete-time systems with uncertain observations when the uncertainty at any sampling time k depends only on the uncertainty at time k − m; this form of correlation allows us to consider certain models in which the signal cannot be missing in m + 1 consecutive observations. The random interruptions in the observation process are modeled by a sequence of Bernoulli variables (at each time, the value one of the variable indicates that the measurement is the current system output, whereas the value zero reflects that only noise is available), which are correlated only at the sampling times k − m and k. Recursive algorithms for the filtering and fixed-point smoothing problems are proposed by using an innovation approach; this approach, based on the fact that the innovation process can be obtained by a causal and invertible operation on the observation process, consists of obtaining the estimators as a linear combination of the innovations, and it simplifies the derivation of the estimators considerably because the innovations constitute a white process.
The chapter is organized as follows: in Section 2 the system model is described; more specifically, we introduce the linear state transition model perturbed by a white noise, and the observation model affected by an additive white noise and a multiplicative noise describing the uncertainty. Also, the pertinent hypotheses to address the least-squares linear estimation problem are established. In Section 3 this estimation problem is formulated using an innovation approach. Next, in Section 4, recursive algorithms for the filter and fixed-point smoother are derived, including recursive formulas for the estimation error covariance matrices. Finally, the performance of the proposed estimators is illustrated in Section 5 by a numerical simulation example, where a two-dimensional signal is estimated and the estimation accuracy is analyzed for different values of the uncertainty probability and several values of the time period m.

2. Model description

Consider linear discrete-time stochastic systems with uncertain observations coming from multiple sensors, whose mathematical modeling is accomplished by the following equations. The state equation is given by

x_k = F_{k−1} x_{k−1} + w_{k−1},  k ≥ 1,  (1)

where {x_k; k ≥ 0} is an n-dimensional stochastic process representing the system state, {w_k; k ≥ 0} is a white noise process and F_k, for k ≥ 0, are known deterministic matrices.
We consider scalar uncertain observations {y_k^i; k ≥ 1}, i = 1, …, r, coming from r sensors and perturbed by noises whose statistical properties are not necessarily the same for all the sensors. Specifically, we assume that, in each sensor and at any time k, the observation y_k^i, perturbed by an additive noise, can carry no information about the state (thus being only noise) with a known probability. That is,

y_k^i = H_k^i x_k + v_k^i, with probability θ̄_k^i,
y_k^i = v_k^i, with probability 1 − θ̄_k^i,

where, for i = 1, …, r, {v_k^i; k ≥ 1} is the observation additive noise process of the i-th sensor and H_k^i, for k ≥ 1, are known deterministic matrices of compatible dimensions. If we introduce sequences {θ_k^i; k ≥ 1}, i = 1, …, r, of Bernoulli random variables with P[θ_k^i = 1] = θ̄_k^i, the observations of the state can be rewritten as

y_k^i = θ_k^i H_k^i x_k + v_k^i,  k ≥ 1,  i = 1, …, r.  (2)

Remark 1. If θ_k^i = 1, which occurs with known probability θ̄_k^i, the state x_k is present in the observation y_k^i coming from the i-th sensor at time k, whereas if θ_k^i = 0 such observation contains only additive noise, v_k^i, which occurs with probability 1 − θ̄_k^i. This probability is called the false alarm probability and it represents the probability that only noise is observed or, equivalently, that y_k^i does not contain the state.

The aim is to address the state estimation problem considering all the available observations coming from the r sensors. For convenience, denoting y_k = (y_k^1, …, y_k^r)^T, v_k = (v_k^1, …, v_k^r)^T, H_k = (H_k^{1T}, …, H_k^{rT})^T and Θ_k = Diag(θ_k^1, …, θ_k^r), Equation (2) is equivalent to the following stacked observation equation

y_k = Θ_k H_k x_k + v_k,  k ≥ 1.  (3)

2.1. Model hypotheses

In order to analyze the least-squares linear estimation problem of the state x_k from the observations y_1, …, y_L, with L ≥ k, some considerations must be taken into account.
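For concreteness, model (1)-(2) can be simulated directly. The sketch below is illustrative only: it assumes a scalar state, two sensors, constant coefficients, and, for simplicity at this point, independent Bernoulli variables θ_k^i; all numerical values are hypothetical, and the lag-m correlation imposed later in the chapter is not yet modeled here.

```python
import random

def simulate(n_steps, F=0.95, Q=0.1, H=(1.0, 1.0), R=(0.5, 0.9),
             theta_bar=(0.8, 0.7), x0_mean=0.0, seed=0):
    """Simulate the state equation (1), x_k = F x_{k-1} + w_{k-1}, and
    two-sensor uncertain observations (2), y^i_k = theta^i_k H^i x_k + v^i_k.
    The Bernoulli variables theta^i_k are taken i.i.d. here for simplicity."""
    rng = random.Random(seed)
    x = x0_mean
    states, obs, thetas = [], [], []
    for _ in range(n_steps):
        x = F * x + rng.gauss(0.0, Q ** 0.5)                  # state transition (1)
        th = tuple(int(rng.random() < p) for p in theta_bar)  # signal present/absent
        y = tuple(th[i] * H[i] * x + rng.gauss(0.0, R[i] ** 0.5)
                  for i in range(len(H)))                     # uncertain observations (2)
        states.append(x); obs.append(y); thetas.append(th)
    return states, obs, thetas

states, obs, thetas = simulate(200, seed=1)
```

Whenever θ_k^i = 0, the corresponding component of the observation is pure additive noise, exactly as described in Remark 1.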
On the one hand, it is known that the linear estimator of x_k is the orthogonal projection of x_k onto the space of n-dimensional random variables obtained as linear transformations of the observations y_1, …, y_L, which requires the existence of the second-order moments of such observations. On the other hand, we consider that the variables describing the uncertainty in the observations are correlated at instants that differ by m units of time, in order to cover many practical situations where the independence assumption on such variables is not realistic. Specifically, the following hypotheses are assumed:

Hypothesis 1. The initial state x_0 is a random vector with E[x_0] = x̄_0 and Cov[x_0] = P_0.

Hypothesis 2. The state noise {w_k; k ≥ 0} is a zero-mean white sequence with Cov[w_k] = Q_k, ∀k ≥ 0.
Hypothesis 3. The observation additive noise {v_k; k ≥ 1} is a zero-mean white process with Cov[v_k] = R_k, ∀k ≥ 1.

Hypothesis 4. For i = 1, …, r, {θ_k^i; k ≥ 1} is a sequence of Bernoulli random variables with P[θ_k^i = 1] = θ̄_k^i. For i, j = 1, …, r, the variables θ_k^i and θ_s^j are independent for |k − s| ≠ 0, m, and Cov[θ_k^i, θ_s^j] are known for |k − s| = 0, m. Defining θ_k = (θ_k^1, …, θ_k^r)^T, the covariance matrix of θ_k and θ_s will be denoted by K^θ_{k,s}.

Finally, we assume the following hypothesis on the independence of the initial state and noises:

Hypothesis 5. The initial state x_0 and the noise processes {w_k; k ≥ 0}, {v_k; k ≥ 1} and {θ_k; k ≥ 1} are mutually independent.

Remark 2. For the derivation of the estimation algorithms, a matrix product called the Hadamard product, which is simpler than the conventional product, will be used. For A, B ∈ M_{mn}, the Hadamard product (denoted by ◦) of A and B is defined as [A ◦ B]_{ij} = A_{ij} B_{ij}. From this definition the following property, which will be needed later, is easily deduced (see [7]): for any random matrix G_{m×m} independent of {Θ_k; k ≥ 1},

E[Θ_k G_{m×m} Θ_s] = E[θ_k θ_s^T] ◦ E[G_{m×m}].

In particular, denoting Θ̄_k = E[Θ_k], it is immediately clear that

E[(Θ_k − Θ̄_k) G_{m×m} (Θ_s − Θ̄_s)] = K^θ_{k,s} ◦ E[G_{m×m}].  (4)

Remark 3. Several authors assume that the observations available for the estimation come either from multiple sensors with identical uncertainty characteristics or from a single sensor (see [20] for the case when the uncertainty is modeled by independent variables, and [19] for the case when such variables are correlated at consecutive sampling times). Nevertheless, in recent years, this situation has been generalized by some authors by considering multiple sensors featuring different uncertainty characteristics (see e.g.
[7] for the case of independent uncertainty, and [1] for situations where the uncertainty in each sensor is modeled by variables correlated at consecutive sampling times). We analyze the state estimation problem for the class of linear discrete-time systems with uncertain observations (3) which, as established in Hypothesis 4, are characterized by the fact that the uncertainty at any sampling time k depends only on the uncertainty at time k − m; this form of correlation allows us to consider certain models in which the signal cannot be absent in m + 1 consecutive observations.

3. Least-squares linear estimation problem

As mentioned above, our aim in this chapter is to obtain, by recursive formulas, the least-squares linear estimator, x̂_{k/L}, of the signal x_k based on the observations {y_1, …, y_L}, with L ≥ k. Specifically, the problem is to derive recursive algorithms for the least-squares linear filter (L = k) and fixed-point smoother (fixed k and L > k) of the state using the uncertain observations (3). For this purpose, we use an innovation approach as described in [11].
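The identity behind Property (4) in Remark 2 is deterministic: for diagonal matrices, Diag(a) G Diag(b) has entries a_i G_{ij} b_j, that is, Diag(a) G Diag(b) = (a b^T) ◦ G; taking expectations under independence then yields (4). A minimal self-contained check with arbitrary illustrative values:

```python
def hadamard(A, B):
    """Entrywise (Hadamard) product of two equally-sized matrices."""
    return [[x * y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def diag_sandwich(a, G, b):
    """Diag(a) @ G @ Diag(b), computed entrywise: result[i][j] = a[i] G[i][j] b[j]."""
    return [[a[i] * G[i][j] * b[j] for j in range(len(b))] for i in range(len(a))]

def outer(a, b):
    """Outer product a b^T as a nested list."""
    return [[ai * bj for bj in b] for ai in a]

# Bernoulli-like diagonal entries (as in Theta_k) and an arbitrary matrix G
a = [1.0, 0.0, 1.0]
b = [0.0, 1.0, 1.0]
G = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]]
```

The same computation with random Bernoulli draws and an averaged G recovers the expectation form (4).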
Since the observations are generally nonorthogonal vectors, we use the Gram-Schmidt orthogonalization procedure to transform the set of observations {y_1, …, y_L} into an equivalent set of orthogonal vectors {ν_1, …, ν_L}; equivalent in the sense that both generate the same linear subspace, that is, L(y_1, …, y_L) = L(ν_1, …, ν_L) = L_L. Let {ν_1, …, ν_{k−1}} be the set of orthogonal vectors satisfying L(ν_1, …, ν_{k−1}) = L(y_1, …, y_{k−1}); the next orthogonal vector, ν_k, corresponding to the new observation y_k, is obtained by projecting y_k onto L_{k−1}; specifically, ν_k = y_k − Proj{y_k onto L_{k−1}}, and, because of the orthogonality of {ν_1, …, ν_{k−1}}, the above projection can be found by projecting y_k along each of the previously found orthogonal vectors ν_i, for i ≤ k − 1:

Proj{y_k onto L_{k−1}} = ∑_{i=1}^{k−1} Proj{y_k along ν_i} = ∑_{i=1}^{k−1} E[y_k ν_i^T] (E[ν_i ν_i^T])^{−1} ν_i.

Since the projection of y_k onto L_{k−1} is ŷ_{k/k−1}, the one-stage least-squares linear predictor of y_k, we have that

ŷ_{k/k−1} = ∑_{i=1}^{k−1} T_{k,i} Π_i^{−1} ν_i,  k ≥ 2,  (5)

where T_{k,i} = E[y_k ν_i^T] and Π_i = E[ν_i ν_i^T] is the covariance matrix of ν_i. Consequently, starting with ν_1 = y_1 − E[y_1], the orthogonal vectors ν_k are determined by ν_k = y_k − ŷ_{k/k−1}, for k ≥ 2. Hence, ν_k can be considered as the new information, or the innovation, in y_k given {y_1, …, y_{k−1}}. In summary, the observation process {y_k; k ≥ 1} has been transformed into an equivalent white noise {ν_k; k ≥ 1} known as the innovation process. Taking into account that both processes satisfy ν_i ∈ L(y_1, …, y_i) and y_i ∈ L(ν_1, …, ν_i), ∀i ≥ 1, we conclude that they are related by a causal and causally invertible linear transformation, so the innovation process is uniquely determined by the observations.
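The orthogonalization above can be mirrored in a small sketch in which the Euclidean inner product plays the role of E[· ·^T]; this is a finite-sample analogue of the stochastic construction, not the construction itself, and the vectors below are arbitrary illustrative data:

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def innovations(ys):
    """Gram-Schmidt: nu_k = y_k minus its projection onto span(nu_1..nu_{k-1}),
    obtained by projecting y_k along each previously found orthogonal vector."""
    nus = []
    for y in ys:
        nu = list(y)
        for prev in nus:
            c = dot(y, prev) / dot(prev, prev)   # analogue of E[y nu^T] (E[nu nu^T])^{-1}
            nu = [a - c * b for a, b in zip(nu, prev)]
        nus.append(nu)
    return nus

# three linearly independent "observations"
ys = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
nus = innovations(ys)
```

The resulting ν-vectors are pairwise orthogonal (the "whiteness" exploited throughout Section 4), and each y_i can be recovered from ν_1, …, ν_i, mirroring the causal invertibility of the transformation.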
This consideration allows us to state that the least-squares linear estimator of the state based on the observations, x̂_{k/L}, is equal to the least-squares linear estimator of the state based on the innovations {ν_1, …, ν_L}. Thus, projecting x_k separately onto each ν_i, i ≤ L, the following general expression for the estimator x̂_{k/L} is obtained:

x̂_{k/L} = ∑_{i=1}^{L} S_{k,i} Π_i^{−1} ν_i,  k ≥ 1,  (6)

where S_{k,i} = E[x_k ν_i^T]. This expression is the starting point for deriving the recursive filtering and fixed-point smoothing algorithms in the next section.
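As a sanity check on expression (6), the estimator built from the innovations must coincide with the orthogonal projection computed directly from the normal equations on the original observations. A Euclidean-analogue sketch with arbitrary illustrative vectors:

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

ys = [[1.0, 1.0, 0.0, 0.0], [1.0, 0.0, 1.0, 0.0]]   # stand-ins for observations
x = [2.0, 1.0, 0.0, 3.0]                              # stand-in for the signal

# Innovations by Gram-Schmidt, then the estimator as in (6)
nu1 = ys[0]
c = dot(ys[1], nu1) / dot(nu1, nu1)
nu2 = [a - c * b for a, b in zip(ys[1], nu1)]
coef = [dot(x, nu1) / dot(nu1, nu1), dot(x, nu2) / dot(nu2, nu2)]
est_innov = [coef[0] * a + coef[1] * b for a, b in zip(nu1, nu2)]

# Direct orthogonal projection onto span(y1, y2): solve the 2x2 normal equations
g11, g12, g22 = dot(ys[0], ys[0]), dot(ys[0], ys[1]), dot(ys[1], ys[1])
b1, b2 = dot(x, ys[0]), dot(x, ys[1])
det = g11 * g22 - g12 * g12
a1 = (b1 * g22 - b2 * g12) / det
a2 = (b2 * g11 - b1 * g12) / det
est_direct = [a1 * u + a2 * v for u, v in zip(*ys)]
```

The two estimators agree, which is precisely why the innovation form (6) can replace the (coupled) normal equations by a term-by-term sum.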
4. Least-squares linear estimation recursive algorithms

In this section, using an innovation approach, recursive algorithms are proposed for the filter, x̂_{k/k}, and the fixed-point smoother, x̂_{k/L}, for fixed k and L > k.

4.1. Linear filtering algorithm

In view of the general expression (6) for L = k, it is clear that the state filter, x̂_{k/k}, is obtained from the one-stage state predictor, x̂_{k/k−1}, by

x̂_{k/k} = x̂_{k/k−1} + S_{k,k} Π_k^{−1} ν_k,  k ≥ 1;  x̂_{0/0} = x̄_0.  (7)

Hence, an equation for the predictor x̂_{k/k−1} in terms of the filter x̂_{k−1/k−1}, and expressions for the innovation ν_k, its covariance matrix Π_k and the matrix S_{k,k}, are required.

State predictor x̂_{k/k−1}. From Hypotheses 2 and 5, it is immediately clear that the filter of the noise w_{k−1} is ŵ_{k−1/k−1} = E[w_{k−1}] = 0 and hence, taking into account Equation (1), we have

x̂_{k/k−1} = F_{k−1} x̂_{k−1/k−1},  k ≥ 1.  (8)

Innovation ν_k. We now obtain an explicit formula for the innovation, ν_k = y_k − ŷ_{k/k−1}, or equivalently for the one-stage predictor of the observation, ŷ_{k/k−1}. For this purpose, taking into account (5), we start by calculating T_{k,i} = E[y_k ν_i^T], for i ≤ k − 1. From the observation equation (3) and Hypotheses 3 and 5, it is clear that

T_{k,i} = E[Θ_k H_k x_k ν_i^T],  i ≤ k − 1.  (9)

Now, for k ≤ m, or k > m and i < k − m, Hypotheses 4 and 5 guarantee that Θ_k is independent of the innovations ν_i, and then we have that T_{k,i} = Θ̄_k H_k E[x_k ν_i^T] = Θ̄_k H_k S_{k,i}. So, after some manipulations, we obtain:

I. For k ≤ m, it is satisfied that ŷ_{k/k−1} = Θ̄_k H_k ∑_{i=1}^{k−1} S_{k,i} Π_i^{−1} ν_i, and using (6) for L = k − 1 it is obvious that

ŷ_{k/k−1} = Θ̄_k H_k x̂_{k/k−1},  k ≤ m.  (10)

II. For k > m, we have that

ŷ_{k/k−1} = Θ̄_k H_k ∑_{i=1}^{k−(m+1)} S_{k,i} Π_i^{−1} ν_i + ∑_{i=1}^{m} T_{k,k−i} Π_{k−i}^{−1} ν_{k−i}

and, adding and subtracting ∑_{i=1}^{m} Θ̄_k H_k S_{k,k−i} Π_{k−i}^{−1} ν_{k−i}, the following equality holds:

ŷ_{k/k−1} = Θ̄_k H_k ∑_{i=1}^{k−1} S_{k,i} Π_i^{−1} ν_i + ∑_{i=1}^{m} (T_{k,k−i} − Θ̄_k H_k S_{k,k−i}) Π_{k−i}^{−1} ν_{k−i},  k > m.
(11)

Next, we determine an expression for T_{k,k−i} − Θ̄_k H_k S_{k,k−i}, for 1 ≤ i ≤ m.
Taking into account (9), it follows that

T_{k,k−i} − Θ̄_k H_k S_{k,k−i} = E[(Θ_k − Θ̄_k) H_k x_k ν_{k−i}^T],  1 ≤ i ≤ m,  (12)

or, equivalently,

T_{k,k−i} − Θ̄_k H_k S_{k,k−i} = E[(Θ_k − Θ̄_k) H_k x_k y_{k−i}^T] − E[(Θ_k − Θ̄_k) H_k x_k ŷ_{k−i/k−(i+1)}^T].

To calculate the first expectation, we use (3) again for y_{k−i} and, from Hypotheses 3 and 5, we have that

E[(Θ_k − Θ̄_k) H_k x_k y_{k−i}^T] = E[(Θ_k − Θ̄_k) H_k x_k x_{k−i}^T H_{k−i}^T Θ_{k−i}],

which, using Property (4), yields

E[(Θ_k − Θ̄_k) H_k x_k y_{k−i}^T] = K^θ_{k,k−i} ◦ (H_k E[x_k x_{k−i}^T] H_{k−i}^T).

Now, denoting D_k = E[x_k x_k^T] and F_{k,i} = F_{k−1} ··· F_i, from Equation (1) it is clear that E[x_k x_{k−i}^T] = F_{k,k−i} D_{k−i}, and hence

E[(Θ_k − Θ̄_k) H_k x_k y_{k−i}^T] = K^θ_{k,k−i} ◦ (H_k F_{k,k−i} D_{k−i} H_{k−i}^T),

where D_k can be recursively obtained by

D_k = F_{k−1} D_{k−1} F_{k−1}^T + Q_{k−1},  k ≥ 1;  D_0 = P_0 + x̄_0 x̄_0^T.  (13)

Summarizing, we have that

T_{k,k−i} − Θ̄_k H_k S_{k,k−i} = K^θ_{k,k−i} ◦ (H_k F_{k,k−i} D_{k−i} H_{k−i}^T) − E[(Θ_k − Θ̄_k) H_k x_k ŷ_{k−i/k−(i+1)}^T],  1 ≤ i ≤ m.  (14)

Taking into account the correlation hypothesis on the variables describing the uncertainty, the right-hand side of this equation is calculated differently for i = m and for i < m, as shown below.

(a) For i = m, since Θ_k is independent of the innovations ν_i for i < k − m, we have that E[(Θ_k − Θ̄_k) H_k x_k ŷ_{k−m/k−(m+1)}^T] = 0, and from (14)

T_{k,k−m} − Θ̄_k H_k S_{k,k−m} = K^θ_{k,k−m} ◦ (H_k F_{k,k−m} D_{k−m} H_{k−m}^T).  (15)

(b) For i < m, from Hypothesis 4, K^θ_{k,k−i} = 0 and, hence, from (14)

T_{k,k−i} − Θ̄_k H_k S_{k,k−i} = −E[(Θ_k − Θ̄_k) H_k x_k ŷ_{k−i/k−(i+1)}^T].
Now, from expression (5),

ŷ_{k−i/k−(i+1)} = ∑_{j=1}^{k−(i+1)} T_{k−i,j} Π_j^{−1} ν_j,

and using again that Θ_k is independent of ν_j for j ≠ k − m, it is deduced that

T_{k,k−i} − Θ̄_k H_k S_{k,k−i} = −E[(Θ_k − Θ̄_k) H_k x_k ν_{k−m}^T] Π_{k−m}^{−1} T_{k−i,k−m}^T

or, equivalently, from (12) for i = m, (15), and denoting

Ψ_{k,k−m} = (K^θ_{k,k−m} ◦ (H_k F_{k,k−m} D_{k−m} H_{k−m}^T)) Π_{k−m}^{−1},

we have that

T_{k,k−i} − Θ̄_k H_k S_{k,k−i} = −Ψ_{k,k−m} T_{k−i,k−m}^T,  i < m.  (16)

Next, substituting (15) and (16) into (11) and using (6) for x̂_{k/k−1}, it is concluded that

ŷ_{k/k−1} = Θ̄_k H_k x̂_{k/k−1} + Ψ_{k,k−m} (ν_{k−m} − ∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} ν_{k−i}),  k > m.  (17)

Finally, using (3) and (16) and taking into account that, from (1), S_{k,k−i} = F_{k,k−i} S_{k−i,k−i}, the matrices T_{k,k−i} in (17) are obtained by

T_{k,k−i} = Θ̄_k H_k F_{k,k−i} S_{k−i,k−i},  2 ≤ k ≤ m,  1 ≤ i ≤ k − 1,
T_{k,k−i} = Θ̄_k H_k F_{k,k−i} S_{k−i,k−i} − Ψ_{k,k−m} T_{k−i,k−m}^T,  k > m,  1 ≤ i ≤ m − 1.

Matrix S_{k,k}. Since ν_k = y_k − ŷ_{k/k−1}, we have that

S_{k,k} = E[x_k ν_k^T] = E[x_k y_k^T] − E[x_k ŷ_{k/k−1}^T].

Next, we calculate these expectations.

I. From Equation (3) and the independence hypothesis, it is clear that E[x_k y_k^T] = D_k H_k^T Θ̄_k, ∀k ≥ 1, where D_k = E[x_k x_k^T] is given by (13).

II. To calculate E[x_k ŷ_{k/k−1}^T], the correlation hypothesis on the random variables θ_k must be taken into account and two cases must be considered:

(a) For k ≤ m, from (10) we obtain E[x_k ŷ_{k/k−1}^T] = E[x_k x̂_{k/k−1}^T] H_k^T Θ̄_k. By using the orthogonal projection lemma, which assures that E[x_k x̂_{k/k−1}^T] = D_k − P_{k/k−1}, where P_{k/k−1} = E[(x_k − x̂_{k/k−1})(x_k − x̂_{k/k−1})^T] is the prediction error covariance matrix, we get

E[x_k ŷ_{k/k−1}^T] = (D_k − P_{k/k−1}) H_k^T Θ̄_k,  k ≤ m.
(b) For k > m, from (17) it follows that

E[x_k ŷ_{k/k−1}^T] = E[x_k x̂_{k/k−1}^T] H_k^T Θ̄_k + E[x_k ν_{k−m}^T] Ψ_{k,k−m}^T − E[x_k (∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} ν_{k−i})^T] Ψ_{k,k−m}^T;

hence, using again the orthogonal projection lemma and taking into account that S_{k,k−i} = E[x_k ν_{k−i}^T], for 1 ≤ i ≤ m, it follows that

E[x_k ŷ_{k/k−1}^T] = (D_k − P_{k/k−1}) H_k^T Θ̄_k + S_{k,k−m} Ψ_{k,k−m}^T − ∑_{i=1}^{m−1} S_{k,k−i} Π_{k−i}^{−1} T_{k−i,k−m} Ψ_{k,k−m}^T,  k > m.

Then, substituting these expectations into the expression for S_{k,k} and simplifying, it is clear that

S_{k,k} = P_{k/k−1} H_k^T Θ̄_k,  1 ≤ k ≤ m,
S_{k,k} = P_{k/k−1} H_k^T Θ̄_k − (S_{k,k−m} − ∑_{i=1}^{m−1} S_{k,k−i} Π_{k−i}^{−1} T_{k−i,k−m}) Ψ_{k,k−m}^T,  k > m.  (18)

Now, an expression for the prediction error covariance matrix, P_{k/k−1}, is necessary. From Equation (1), it is immediately clear that

P_{k/k−1} = F_{k−1} P_{k−1/k−1} F_{k−1}^T + Q_{k−1},  k ≥ 1,

where P_{k/k} = E[(x_k − x̂_{k/k})(x_k − x̂_{k/k})^T] is the filtering error covariance matrix. From Equation (7), it is concluded that

P_{k/k} = P_{k/k−1} − S_{k,k} Π_k^{−1} S_{k,k}^T,  k ≥ 1;  P_{0/0} = P_0.

Covariance matrix of the innovation, Π_k = E[ν_k ν_k^T]. From the orthogonal projection lemma, the covariance matrix of the innovation is obtained as

Π_k = E[y_k y_k^T] − E[ŷ_{k/k−1} ŷ_{k/k−1}^T].

From (3) and using Property (4), we have that

E[y_k y_k^T] = E[θ_k θ_k^T] ◦ (H_k D_k H_k^T) + R_k,  k ≥ 1.

To obtain E[ŷ_{k/k−1} ŷ_{k/k−1}^T], two cases must again be distinguished, due to the correlation hypothesis on the Bernoulli variables θ_k:

I. For k ≤ m, Equation (10) and Property (4) yield E[ŷ_{k/k−1} ŷ_{k/k−1}^T] = (θ̄_k θ̄_k^T) ◦ (H_k E[x̂_{k/k−1} x̂_{k/k−1}^T] H_k^T) and, in view of the orthogonal projection lemma,

E[ŷ_{k/k−1} ŷ_{k/k−1}^T] = (θ̄_k θ̄_k^T) ◦ (H_k (D_k − P_{k/k−1}) H_k^T),  k ≤ m.
II. For k > m, an analogous reasoning, but now using Equation (17), yields

E[ŷ_{k/k−1} ŷ_{k/k−1}^T] = (θ̄_k θ̄_k^T) ◦ (H_k (D_k − P_{k/k−1}) H_k^T) + Ψ_{k,k−m} Π_{k−m} Ψ_{k,k−m}^T + Ψ_{k,k−m} (∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} Π_{k−i} Π_{k−i}^{−1} T_{k−i,k−m}) Ψ_{k,k−m}^T + Θ̄_k H_k E[x̂_{k/k−1} ν_{k−m}^T] Ψ_{k,k−m}^T + Ψ_{k,k−m} E[ν_{k−m} x̂_{k/k−1}^T] H_k^T Θ̄_k − Θ̄_k H_k E[x̂_{k/k−1} ∑_{i=1}^{m−1} ν_{k−i}^T Π_{k−i}^{−1} T_{k−i,k−m}] Ψ_{k,k−m}^T − Ψ_{k,k−m} ∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} E[ν_{k−i} x̂_{k/k−1}^T] H_k^T Θ̄_k.

Next, again from the orthogonal projection lemma, E[x̂_{k/k−1} ν_{k−i}^T] = E[x_k ν_{k−i}^T] = S_{k,k−i}, for 1 ≤ i ≤ m, and therefore

E[ŷ_{k/k−1} ŷ_{k/k−1}^T] = (θ̄_k θ̄_k^T) ◦ (H_k (D_k − P_{k/k−1}) H_k^T) + Ψ_{k,k−m} (Π_{k−m} + ∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} T_{k−i,k−m}) Ψ_{k,k−m}^T + Θ̄_k H_k (S_{k,k−m} − ∑_{i=1}^{m−1} S_{k,k−i} Π_{k−i}^{−1} T_{k−i,k−m}) Ψ_{k,k−m}^T + Ψ_{k,k−m} (S_{k,k−m}^T − ∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} S_{k,k−i}^T) H_k^T Θ̄_k.

Finally, from Equation (18), we have

(S_{k,k−m} − ∑_{i=1}^{m−1} S_{k,k−i} Π_{k−i}^{−1} T_{k−i,k−m}) Ψ_{k,k−m}^T = −(S_{k,k} − P_{k/k−1} H_k^T Θ̄_k),

and hence

E[ŷ_{k/k−1} ŷ_{k/k−1}^T] = (θ̄_k θ̄_k^T) ◦ (H_k (D_k − P_{k/k−1}) H_k^T) + Ψ_{k,k−m} (Π_{k−m} + ∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} T_{k−i,k−m}) Ψ_{k,k−m}^T − Θ̄_k H_k (S_{k,k} − P_{k/k−1} H_k^T Θ̄_k) − (S_{k,k}^T − Θ̄_k H_k P_{k/k−1}) H_k^T Θ̄_k,  k > m.

Finally, since K^θ_{k,k} = E[θ_k θ_k^T] − θ̄_k θ̄_k^T, the above expectations lead to the following expression for the innovation covariance matrices:

Π_k = K^θ_{k,k} ◦ (H_k D_k H_k^T) + R_k + Θ̄_k H_k S_{k,k},  k ≤ m,
Π_k = K^θ_{k,k} ◦ (H_k D_k H_k^T) + R_k − Ψ_{k,k−m} (Π_{k−m} + ∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} T_{k−i,k−m}) Ψ_{k,k−m}^T + Θ̄_k H_k S_{k,k} + S_{k,k}^T H_k^T Θ̄_k − Θ̄_k H_k P_{k/k−1} H_k^T Θ̄_k,  k > m.

All these results are summarized in the following theorem.

Theorem 1. The linear filter, x̂_{k/k}, of the state x_k is obtained as

x̂_{k/k} = x̂_{k/k−1} + S_{k,k} Π_k^{−1} ν_k,  k ≥ 1;  x̂_{0/0} = x̄_0,

where the state predictor, x̂_{k/k−1}, is given by

x̂_{k/k−1} = F_{k−1} x̂_{k−1/k−1},  k ≥ 1.

The innovation process satisfies

ν_k = y_k − Θ̄_k H_k x̂_{k/k−1},  k ≤ m,
ν_k = y_k − Θ̄_k H_k x̂_{k/k−1} − Ψ_{k,k−m} (ν_{k−m} − ∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} ν_{k−i}),  k > m,

where Θ̄_k = E[Θ_k] and

Ψ_{k,k−m} = (K^θ_{k,k−m} ◦ (H_k F_{k,k−m} D_{k−m} H_{k−m}^T)) Π_{k−m}^{−1},

with ◦ the Hadamard product, F_{k,i} = F_{k−1} ··· F_i, and D_k = E[x_k x_k^T] recursively obtained by

D_k = F_{k−1} D_{k−1} F_{k−1}^T + Q_{k−1},  k ≥ 1;  D_0 = P_0 + x̄_0 x̄_0^T.

The matrices T_{k,k−i} are given by

T_{k,k−i} = Θ̄_k H_k F_{k,k−i} S_{k−i,k−i},  2 ≤ k ≤ m,  1 ≤ i ≤ k − 1,
T_{k,k−i} = Θ̄_k H_k F_{k,k−i} S_{k−i,k−i} − Ψ_{k,k−m} T_{k−i,k−m}^T,  k > m,  1 ≤ i ≤ m − 1.

The covariance matrix of the innovation, Π_k = E[ν_k ν_k^T], satisfies

Π_k = K^θ_{k,k} ◦ (H_k D_k H_k^T) + R_k + Θ̄_k H_k S_{k,k},  k ≤ m,
Π_k = K^θ_{k,k} ◦ (H_k D_k H_k^T) + R_k − Ψ_{k,k−m} (Π_{k−m} + ∑_{i=1}^{m−1} T_{k−i,k−m}^T Π_{k−i}^{−1} T_{k−i,k−m}) Ψ_{k,k−m}^T + Θ̄_k H_k S_{k,k} + S_{k,k}^T H_k^T Θ̄_k − Θ̄_k H_k P_{k/k−1} H_k^T Θ̄_k,  k > m.

The matrix S_{k,k} is determined by the following expression:

S_{k,k} = P_{k/k−1} H_k^T Θ̄_k,  k ≤ m,
S_{k,k} = P_{k/k−1} H_k^T Θ̄_k − (S_{k,k−m} − ∑_{i=1}^{m−1} S_{k,k−i} Π_{k−i}^{−1} T_{k−i,k−m}) Ψ_{k,k−m}^T,  k > m,
where P_{k/k−1}, the prediction error covariance matrix, is obtained by

P_{k/k−1} = F_{k−1} P_{k−1/k−1} F_{k−1}^T + Q_{k−1},  k ≥ 1,

with P_{k/k}, the filtering error covariance matrix, satisfying

P_{k/k} = P_{k/k−1} − S_{k,k} Π_k^{−1} S_{k,k}^T,  k ≥ 1;  P_{0/0} = P_0.  (19)

4.2. Linear fixed-point smoothing algorithm

The following theorem provides a recursive fixed-point smoothing algorithm to obtain the least-squares linear estimator, x̂_{k/k+N}, of the state x_k based on the observations {y_1, …, y_{k+N}}, for k ≥ 1 fixed and N ≥ 1. Moreover, to measure the estimation accuracy, a recursive formula for the error covariance matrices, P_{k/k+N} = E[(x_k − x̂_{k/k+N})(x_k − x̂_{k/k+N})^T], is derived.

Theorem 2. For each fixed k ≥ 1, the fixed-point smoothers, x̂_{k/k+N}, N ≥ 1, are calculated by

x̂_{k/k+N} = x̂_{k/k+N−1} + S_{k,k+N} Π_{k+N}^{−1} ν_{k+N},  N ≥ 1,  (20)

whose initial condition is the filter, x̂_{k/k}, given in (7). The matrices S_{k,k+N} are calculated from

S_{k,k+N} = (D_k F_{k+N,k}^T − M_{k,k+N−1} F_{k+N−1}^T) H_{k+N}^T Θ̄_{k+N},  k ≤ m − N,  N ≥ 1,
S_{k,k+N} = (D_k F_{k+N,k}^T − M_{k,k+N−1} F_{k+N−1}^T) H_{k+N}^T Θ̄_{k+N} − (S_{k,k+N−m} − ∑_{i=1}^{m−1} S_{k,k+N−i} Π_{k+N−i}^{−1} T_{k+N−i,k+N−m}) Ψ_{k+N,k+N−m}^T,  k > m − N,  N ≥ 1,  (21)

where the matrices M_{k,k+N} satisfy the following recursive formula:

M_{k,k+N} = M_{k,k+N−1} F_{k+N−1}^T + S_{k,k+N} Π_{k+N}^{−1} S_{k+N,k+N}^T,  N ≥ 1;  M_{k,k} = D_k − P_{k/k}.  (22)

The innovations ν_{k+N}, their covariance matrices Π_{k+N}, and the matrices T_{k+N,k+N−i}, Ψ_{k+N,k+N−m}, D_k and P_{k/k} are given in Theorem 1. Finally, the fixed-point smoothing error covariance matrix, P_{k/k+N}, verifies

P_{k/k+N} = P_{k/k+N−1} − S_{k,k+N} Π_{k+N}^{−1} S_{k,k+N}^T,  N ≥ 1,  (23)

with initial condition the filtering error covariance matrix, P_{k/k}, given by (19).

Proof. From the general expression (6), for each fixed k ≥ 1, the recursive relation (20) is immediately clear.
Now, we need to prove (21) for S_{k,k+N} = E[x_k ν_{k+N}^T] = E[x_k y_{k+N}^T] − E[x_k ŷ_{k+N/k+N−1}^T], so both expectations must be calculated.

I. From Equation (3), taking into account that E[x_k x_{k+N}^T] = D_k F_{k+N,k}^T and using that Θ_{k+N} and v_{k+N} are independent of x_k, we obtain

E[x_k y_{k+N}^T] = D_k F_{k+N,k}^T H_{k+N}^T Θ̄_{k+N},  N ≥ 1.

II. Based on expressions (10) and (17) for ŷ_{k+N/k+N−1}, which differ depending on whether k + N ≤ m or k + N > m, two cases must be considered:

(a) For k ≤ m − N, using (10) for ŷ_{k+N/k+N−1} together with (8) for x̂_{k+N/k+N−1}, we have that

E[x_k ŷ_{k+N/k+N−1}^T] = M_{k,k+N−1} F_{k+N−1}^T H_{k+N}^T Θ̄_{k+N},

where M_{k,k+N−1} = E[x_k x̂_{k+N−1/k+N−1}^T].

(b) For k > m − N, by following a reasoning similar to the previous one but starting from (17), we get

E[x_k ŷ_{k+N/k+N−1}^T] = M_{k,k+N−1} F_{k+N−1}^T H_{k+N}^T Θ̄_{k+N} + (S_{k,k+N−m} − ∑_{i=1}^{m−1} S_{k,k+N−i} Π_{k+N−i}^{−1} T_{k+N−i,k+N−m}) Ψ_{k+N,k+N−m}^T.

Then, the replacement of the above expectations in S_{k,k+N} leads to expression (21).

The recursive relation (22) for M_{k,k+N} = E[x_k x̂_{k+N/k+N}^T] is immediately clear from (7) for x̂_{k+N/k+N}, and its initial condition M_{k,k} = E[x_k x̂_{k/k}^T] is calculated taking into account that, from the orthogonality, E[x_k x̂_{k/k}^T] = E[x̂_{k/k} x̂_{k/k}^T] = D_k − P_{k/k}.

Finally, since P_{k/k+N} = E[x_k x_k^T] − E[x̂_{k/k+N} x̂_{k/k+N}^T], using (20) and taking into account that x̂_{k/k+N−1} is uncorrelated with ν_{k+N}, we have

P_{k/k+N} = E[x_k x_k^T] − E[x̂_{k/k+N−1} x̂_{k/k+N−1}^T] − S_{k,k+N} Π_{k+N}^{−1} S_{k,k+N}^T,  N ≥ 1,

and, consequently, expression (23) holds.

5. Numerical simulation example

In this section, we present a numerical example to show the performance of the recursive algorithms proposed in this chapter. To illustrate the effectiveness of the proposed estimators, we ran a MATLAB program which, at each iteration, simulates the state and the observed values and provides the filtering and fixed-point smoothing estimates, as well as the corresponding error covariance matrices, which provide a measure of the estimation accuracy.
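When the uncertainty variables are independent, K^θ_{k,k−m} = 0 and hence Ψ_{k,k−m} = 0, so the k > m branches of Theorems 1 and 2 collapse onto the k ≤ m ones. The scalar sketch below implements only that reduced special case, with constant, hypothetical parameters; it is not the general lag-m algorithm. As a consistency check, for θ̄ = 1 (no uncertainty) it reproduces the standard Kalman filter.

```python
def filter_and_smooth(ys, F, H, Q, R, tb, x0_mean, P0, k_fix, N_max):
    """Scalar sketch of Theorems 1-2 with independent uncertainty (Psi = 0).
    tb = E[theta_k]; the variance of theta_k is tb * (1 - tb)."""
    D = P0 + x0_mean ** 2                 # D_k = E[x_k^2], Eq. (13)
    xf, Pf = x0_mean, P0
    hist = []                             # per-step quantities needed by the smoother
    for y in ys:
        xp = F * xf                       # predictor (8)
        Pp = F * F * Pf + Q
        D = F * F * D + Q
        S = Pp * H * tb                   # S_{k,k}
        Pi = tb * (1 - tb) * H * H * D + R + tb * H * S   # innovation variance
        nu = y - tb * H * xp              # innovation
        xf = xp + S / Pi * nu             # filter (7)
        Pf = Pp - S * S / Pi              # filtering error variance (19)
        hist.append((S, Pi, nu, xf, Pf, D))
    # fixed-point smoothing at time k_fix (0-based), Eqs. (20)-(23) with Psi = 0
    _, _, _, xs, Ps, D_k = hist[k_fix]
    M = D_k - Ps                          # M_{k,k} = D_k - P_{k/k}, Eq. (22)
    Ps_hist = [Ps]
    for j in range(k_fix + 1, min(k_fix + N_max + 1, len(ys))):
        S_j, Pi_j, nu_j, _, _, _ = hist[j]
        DFN = D_k * F ** (j - k_fix)      # D_k F_{k+N,k}^T for constant F
        S_kN = (DFN - M * F) * H * tb     # Eq. (21), first branch
        xs = xs + S_kN / Pi_j * nu_j      # Eq. (20)
        Ps = Ps - S_kN ** 2 / Pi_j        # Eq. (23)
        M = M * F + S_kN / Pi_j * S_j     # Eq. (22)
        Ps_hist.append(Ps)
    return xs, Ps_hist, [h[3] for h in hist]

# demo: with tb = 1 the filter must coincide with a textbook Kalman filter
ys = [1.0, 0.4, -0.2, 0.7, 0.1, 0.3]
x_sm, P_sm, x_filt = filter_and_smooth(ys, F=0.9, H=1.0, Q=0.2, R=0.5,
                                       tb=1.0, x0_mean=0.0, P0=1.0,
                                       k_fix=2, N_max=3)
xf, Pf, kalman = 0.0, 1.0, []
for y in ys:
    xp, Pp = 0.9 * xf, 0.81 * Pf + 0.2
    K = Pp / (Pp + 0.5)
    xf, Pf = xp + K * (y - xp), (1 - K) * Pp
    kalman.append(xf)
```

The smoothing error variances in P_sm decrease monotonically with N, mirroring Fig. 2 of the chapter, since each step of (23) subtracts a nonnegative term.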
Consider a two-dimensional state process, {x_k; k ≥ 0}, generated by the following first-order autoregressive model:

x_k = (1 + 0.2 sin((k − 1)π/50)) [0.8 0; 0.9 0.2] x_{k−1} + w_{k−1},  k ≥ 1,

with the following hypotheses:

• The initial state, x_0, is a zero-mean Gaussian vector with covariance matrix P_0 = [0.1 0; 0 0.1].

• The process {w_k; k ≥ 0} is a zero-mean white Gaussian noise with covariance matrices Q_k = [0.36 0.3; 0.3 0.25], ∀k ≥ 0.

Suppose that the scalar observations come from two sensors according to the following observation equations:

y_k^i = θ_k^i x_k + v_k^i,  k ≥ 1,  i = 1, 2,

where {v_k^i; k ≥ 1}, i = 1, 2, are zero-mean independent white Gaussian processes with variances R_k^1 = 0.5 and R_k^2 = 0.9, ∀k ≥ 1, respectively. According to our theoretical model, it is assumed that, for each sensor, the uncertainty at time k depends only on the uncertainty at time k − m. The variables θ_k^i, i = 1, 2, modeling this type of uncertainty correlation in the observation process are built from two independent sequences of independent Bernoulli random variables, {γ_k^i; k ≥ 1}, i = 1, 2, with constant probabilities P[γ_k^i = 1] = γ^i. Specifically, the variables θ_k^i are defined as follows:

θ_k^i = 1 − γ_{k+m}^i (1 − γ_k^i),  i = 1, 2.

So, if θ_k^i = 0, then γ_{k+m}^i = 1 and γ_k^i = 0 and, hence, θ_{k+m}^i = 1; this fact guarantees that, if the state is absent at time k, the observation at time k + m necessarily contains the state. Therefore, there cannot be more than m consecutive observations consisting of noise only. Moreover, since the variables γ_k^i and γ_s^i are independent, θ_k^i and θ_s^i are also independent for |k − s| ≠ 0, m.
The common mean of these variables is θ̄^i = 1 − γ^i(1 − γ^i), and their covariance function is given by

K^θ_{k,s} = E[(θ_k^i − θ̄^i)(θ_s^i − θ̄^i)] = 0, if |k − s| ≠ 0, m;  −(1 − θ̄^i)^2, if |k − s| = m;  θ̄^i(1 − θ̄^i), if k = s.

To illustrate the effectiveness of the respective estimators, two hundred iterations of the proposed algorithms have been performed, and the results obtained for different values of the uncertainty probability and several values of m have been analyzed.
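The construction of the variables θ_k^i and the guarantee that no more than m consecutive observations are noise-only can be checked directly; the sketch below uses 0-based indexing and illustrative values of γ and m:

```python
import random

def theta_sequence(n, gamma, m, seed=7):
    """Build theta_k = 1 - gamma_{k+m} * (1 - gamma_k) from i.i.d.
    Bernoulli(gamma) draws; theta_k = 0 forces theta_{k+m} = 1."""
    rng = random.Random(seed)
    g = [int(rng.random() < gamma) for _ in range(n + m)]
    return [1 - g[k + m] * (1 - g[k]) for k in range(n)]

def longest_zero_run(seq):
    """Length of the longest run of consecutive zeros (missing observations)."""
    best = cur = 0
    for t in seq:
        cur = cur + 1 if t == 0 else 0
        best = max(best, cur)
    return best

m = 3
th = theta_sequence(1000, 0.5, m)
```

By construction, a zero at time k forces a one at time k + m, so no run of zeros can exceed m, exactly as argued in the text; the empirical mean of th also approaches θ̄ = 1 − γ(1 − γ) for long sequences.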
Let us observe that the mean functions of the variables θ_k^i, i = 1, 2, are the same if 1 − γ^i is used instead of γ^i; for this reason, only the case γ^i ≤ 0.5 is considered here. Note that, in such a case, the false alarm probability at the i-th sensor, 1 − θ̄^i, is an increasing function of γ^i.

Firstly, the values of the first component of a simulated state, together with the filtering and fixed-point smoothing estimates for N = 2, obtained from simulated observations of the state for m = 3 and γ^1 = γ^2 = 0.5, are displayed in Fig. 1. This graph shows that the fixed-point smoothing estimates follow the state evolution better than the filtering ones.

Figure 1. First component of the simulated state, and filtering and fixed-point smoothing estimates for N = 2, when m = 3 and γ^1 = γ^2 = 0.5.

Next, assuming again that the Bernoulli variables of the observations are correlated at sampling times that differ by three units of time (m = 3), we compare the effectiveness of the proposed filtering and fixed-point smoothing estimators for different values of the probabilities γ^1 and γ^2, which provide different values of the false alarm probabilities 1 − θ̄^i, i = 1, 2; specifically, γ^1 = 0.2, γ^2 = 0.4 and γ^1 = 0.1, γ^2 = 0.3. For these values, Fig. 2 shows the filtering and fixed-point smoothing error variances, for N = 2 and N = 5, for the first state component. From this figure it is observed that: i) As both γ^1 and γ^2 decrease (which means that the false alarm probability decreases), the error variances become smaller and, consequently, better estimations are obtained. ii) The error variances corresponding to the fixed-point smoothers are smaller than those of the filters and, consequently, in agreement with the comments on the previous figure, the fixed-point smoothing estimates are more accurate.
iii) The accuracy of the smoothers at each fixed point k improves as the number of available observations increases.
Figure 2. Filtering and smoothing error variances for the first state component for γ^1 = 0.2, γ^2 = 0.4 and γ^1 = 0.1, γ^2 = 0.3, when m = 3.

On the other hand, in order to show more precisely the dependence of the error variances on the values γ^1 and γ^2, Fig. 3 displays the filtering and fixed-point smoothing error variances of the first state component, at a fixed iteration (namely, k = 200), for m = 3, when both γ^1 and γ^2 are varied from 0.1 to 0.5, which provides different values of the probabilities θ̄^1 and θ̄^2. More specifically, we have considered the values γ^i = 0.1, 0.2, 0.3, 0.4, 0.5, which lead to the false alarm probabilities 1 − θ̄^i = 0.09, 0.16, 0.21, 0.24, 0.25, respectively. In this figure, both graphs (corresponding to the filtering and fixed-point smoothing error variances) corroborate the previous results, showing again that, as the false alarm probability increases, the filtering and fixed-point smoothing error variances (N = 2) become greater and, consequently, worse estimations are obtained. Also, it is concluded that the smoothing error variances are smaller than the filtering ones. Analogous results to those of Figs. 1-3 are obtained for the second component of the state. As an example, Fig. 4 shows the filtering and fixed-point smoothing error variances of the second state component, at k = 200, versus γ^1 for constant values of γ^2, when m = 3, and comments similar to those made on Fig. 3 can be deduced.
Finally, for γ1 = 0.2, γ2 = 0.4, the performance of the estimators is compared for different values of m; specifically, for m = 1, 3, 6, the filtering error variances of the first state component are displayed in Fig. 5. From this figure it can be seen that the estimators are more accurate for lower values of m. In other words, a greater distance between the instants at which the variables are correlated (which means that more consecutive observations may not contain state information) yields worse estimations.
Figure 3. Filtering error variances and smoothing error variances for N = 2 of the first state component at k = 200 versus γ1, with γ2 varying from 0.1 to 0.5, when m = 3.

Figure 4. Filtering error variances and smoothing error variances for N = 2 of the second state component at k = 200 versus γ1, with γ2 varying from 0.1 to 0.5, when m = 3.
Figure 5. Filtering error variances for γ1 = 0.2, γ2 = 0.4 and m = 1, 3, 6.

6. Conclusions and future research

In this chapter, the least-squares linear filtering and fixed-point smoothing problems have been addressed for linear discrete-time stochastic systems with uncertain observations coming from multiple sensors. The uncertainty in the observations is modeled by a binary variable taking the values one or zero (Bernoulli variable), depending on whether the signal is present or absent in the corresponding observation, and it has been supposed that the uncertainty at any sampling time k depends only on the uncertainty at the previous time k − m. This situation covers, in particular, those signal transmission models in which any failure in the transmission is detected and the old sensor is replaced after m instants of time, thus avoiding the possibility of a missing signal in m + 1 consecutive observations. By applying an innovation technique, recursive algorithms for the linear filtering and fixed-point smoothing estimators have been obtained. This technique consists of obtaining the estimators as a linear combination of the innovations, which simplifies their derivation due to the fact that the innovations constitute a white process. Finally, the feasibility of the theoretical results has been illustrated by the estimation of a two-dimensional signal from uncertain observations coming from two sensors, for different uncertainty probabilities and different values of m.
The results obtained confirm the greater effectiveness of the fixed-point smoothing estimators in contrast to the filtering ones, and show that more accurate estimations are obtained for lower values of m. In recent years, several problems of signal processing, such as signal prediction, detection and control, as well as image restoration problems, have been treated using quadratic estimators and, more generally, polynomial estimators of arbitrary degree. Hence, the current chapter can be extended by considering the least-squares polynomial estimation
problems of arbitrary degree for such linear systems with uncertain observations correlated at instants differing by m units of time. On the other hand, in practical engineering, some recent progress on the filtering and control problems for nonlinear stochastic systems with uncertain observations is being achieved. Nonlinearity and stochasticity are two important features that are receiving special attention in research; therefore, filtering and smoothing problems for nonlinear systems with uncertain observations are relevant topics on which further investigation would be interesting.

Acknowledgements

This research is supported by Ministerio de Educación y Ciencia (Programa FPU and grant No. MTM2011-24718) and Junta de Andalucía (grant No. P07-FQM-02701).

Author details

J. Linares-Pérez. Dpto. de Estadística e I.O., Universidad de Granada. Avda. Fuentenueva. 18071. Granada, Spain
R. Caballero-Águila. Dpto. de Estadística e I.O., Universidad de Jaén. Paraje Las Lagunillas. 23071. Jaén, Spain
I. García-Garrido. Dpto. de Estadística e I.O., Universidad de Granada. Avda. Fuentenueva. 18071. Granada, Spain

7. References

[1] Caballero-Águila, R., Hermoso-Carazo, A. & Linares-Pérez, J. [2011]. Linear and quadratic estimation using uncertain observations from multiple sensors with correlated uncertainty, Signal Processing 91(2): 330–337.
[2] García-Garrido, I., Linares-Pérez, J., Caballero-Águila, R. & Hermoso-Carazo, A. [2012]. A solution to the filtering problem for stochastic systems with multi-sensor uncertain observations, International Mathematical Forum 7(18): 887–903.
[3] Hermoso-Carazo, A. & Linares-Pérez, J. [1994]. Linear estimation for discrete-time systems in the presence of time-correlated disturbances and uncertain observations, IEEE Transactions on Automatic Control 39(8): 1636–1638.
[4] Hermoso-Carazo, A. & Linares-Pérez, J. [1995].
Linear smoothing for discrete-time systems in the presence of correlated disturbances and uncertain observations, IEEE Transactions on Automatic Control 40(8): 243–251.
[5] Hermoso-Carazo, A., Linares-Pérez, J., Jiménez-López, J., Caballero-Águila, R. & Nakamori, S. [2008]. Recursive fixed-point smoothing algorithm from covariances based on uncertain observations with correlation in the uncertainty, Applied Mathematics and Computation 203(1): 243–251.
[6] Hespanha, J., Naghshtabrizi, P. & Xu, Y. [2007]. A survey of recent results in networked control systems, Proceedings of the IEEE 95(1): 138–172.
[7] Hounkpevi, F. & Yaz, E. [2007]. Robust minimum variance linear state estimators for multiple sensors with different failure rates, Automatica 43(7): 1274–1280.
[8] Huang, R. & Záruba, G. [2007]. Incorporating data from multiple sensors for localizing nodes in mobile ad hoc networks, IEEE Transactions on Mobile Computing 6(9): 1090–1104.
[9] Jackson, R. & Murthy, D. [1976]. Optimal linear estimation with uncertain observations, IEEE Transactions on Information Theory 22(3): 376–378.
[10] Jiménez-López, J., Linares-Pérez, J., Nakamori, S., Caballero-Águila, R. & Hermoso-Carazo, A. [2008]. Signal estimation based on covariance information from observations featuring correlated uncertainty and coming from multiple sensors, Signal Processing 88(12): 2998–3006.
[11] Kailath, T., Sayed, A. & Hassibi, B. [2000]. Linear Estimation, Prentice Hall, New Jersey.
[12] Kalman, R. [1960]. A new approach to linear filtering and prediction problems, Transactions of the ASME, Journal of Basic Engineering 82(Series D): 35–45.
[13] Ma, J. & Sun, S. [2011]. Optimal linear estimators for systems with random sensor delays, multiple packet dropouts and uncertain observations, IEEE Transactions on Signal Processing 59(11): 5181–5192.
[14] Malyavej, V., Manchester, I. & Savkin, A. [2006]. Precision missile guidance using radar/multiple-video sensor fusion via communication channels with bit-rate constraints, Automatica 42(5): 763–769.
[15] Moayedi, M., Foo, Y. & Soh, Y. [2010]. Adaptive Kalman filtering in networked systems with random sensor delays, multiple packet dropouts and missing measurements, IEEE Transactions on Signal Processing 58(3): 1577–1578.
[16] Monzingo, R. [1981]. Discrete linear recursive smoothing for systems with uncertain observations, IEEE Transactions on Automatic Control AC-26(3): 754–757.
[17] Nahi, N. [1969]. Optimal recursive estimation with uncertain observation, IEEE Transactions on Information Theory 15(4): 457–462.
[18] Nakamori, S., Caballero-Águila, R., Hermoso-Carazo, A., Jiménez-López, J. & Linares-Pérez, J. [2006]. Least-squares vth-order polynomial estimation of signals from observations affected by non-independent uncertainty, Applied Mathematics and Computation 176(2): 642–653.
[19] Nakamori, S., Caballero-Águila, R., Hermoso-Carazo, A., Jiménez-López, J. & Linares-Pérez, J. [2007]. Signal polynomial smoothing from correlated interrupted observations based on covariances, Mathematical Methods in the Applied Sciences 30(14): 1645–1665.
[20] NaNacara, W. & Yaz, E. [1997]. Recursive estimator for linear and nonlinear systems with uncertain observations, Signal Processing 62(2): 215–228.
[21] Sahebsara, M., Chen, T. & Shah, S. [2007]. Optimal H2 filtering with random sensor delay, multiple packet dropout and uncertain observations, International Journal of Control 80(2): 292–301.
[22] Sinopoli, B., Schenato, L., Franceschetti, M., Poolla, K., Jordan, M. & Sastry, S. [2004]. Kalman filtering with intermittent observations, IEEE Transactions on Automatic Control 49(9): 1453–1464.
[23] Wang, Z., Yang, F., Ho, D. & Liu, X. [2005]. Robust finite-horizon filtering for stochastic systems with missing measurements, IEEE Signal Processing Letters 12(6): 437–440.
[24] Wang, Z., Yang, F., Ho, D. & Liu, X. [2006]. Robust H∞ filtering for stochastic time-delay systems with missing measurements, IEEE Transactions on Signal Processing 54(7): 2579–2587.
Chapter 2

On Guaranteed Parameter Estimation of Stochastic Delay Differential Equations by Noisy Observations

Uwe Küchler and Vyacheslav A. Vasiliev

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/45952

1. Introduction

Assume (Ω, F, (F(t), t ≥ 0), P) is a given filtered probability space and W = (W(t), t ≥ 0), V = (V(t), t ≥ 0) are real-valued standard Wiener processes on (Ω, F, (F(t), t ≥ 0), P), adapted to (F(t)) and mutually independent. Further assume that X0 = (X0(t), t ∈ [−1, 0]) and Y0 are, respectively, a real-valued cadlag process and a real-valued random variable on (Ω, F, (F(t), t ≥ 0), P) with E ∫_{−1}^{0} X0^2(s) ds < ∞ and E Y0^2 < ∞. Assume Y0 and X0(s), s ∈ [−1, 0], are F(0)-measurable and that the quantities W, V, X0 and Y0 are mutually independent. Consider a two-dimensional random process (X, Y) = (X(t), Y(t), t ≥ 0) described by the system of stochastic differential equations

dX(t) = aX(t)dt + bX(t − 1)dt + dW(t), (1)

dY(t) = X(t)dt + dV(t), t ≥ 0, (2)

with the initial conditions X(t) = X0(t), t ∈ [−1, 0], and Y(0) = Y0. The process X is supposed to be hidden, i.e., unobservable, and the process Y is observed. Such models are used in applied problems connected with control, filtering and prediction of stochastic processes (see, for example, [1, 4, 17–20] among others). The parameter ϑ = (a, b)′ ∈ Θ is assumed to be unknown and shall be estimated based on continuous observation of Y; Θ is a subset of R^2 ((a, b)′ denotes the transpose of (a, b)). Equations (1) and (2), together with the initial values X0(·) and Y0 respectively, have unique solutions X(·) and Y(·); for details see [19].
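The system (1)–(2) is straightforward to simulate on a grid, which is useful for experimenting with the estimators developed later in the chapter. The Euler–Maruyama sketch below is not part of the chapter; the function name, the constant initial segment, and the step size are illustrative assumptions.

```python
import numpy as np

def simulate_delay_system(a, b, T, dt, rng, x_init=0.0, y_init=0.0):
    """Euler-Maruyama discretization of the partially observed pair
    dX(t) = (a X(t) + b X(t-1)) dt + dW(t),  dY(t) = X(t) dt + dV(t),
    with the constant initial segment X(t) = x_init on [-1, 0] and
    Y(0) = y_init.  The grid step dt must divide the unit delay."""
    lag = int(round(1.0 / dt))           # number of steps per unit delay
    n = int(round(T / dt))
    X = np.empty(n + lag + 1)
    X[:lag + 1] = x_init                 # X on [-1, 0]
    Y = np.empty(n + 1)
    Y[0] = y_init
    sqdt = np.sqrt(dt)
    for k in range(n):
        i = lag + k                      # grid index of time t_k = k*dt
        X[i + 1] = X[i] + (a * X[i] + b * X[i - lag]) * dt + sqdt * rng.standard_normal()
        Y[k + 1] = Y[k] + X[i] * dt + sqdt * rng.standard_normal()
    return X[lag:], Y                    # X and Y on the grid over [0, T]
```

Only the Y path would be available to the estimators; the returned X path serves as ground truth in simulation studies.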
Equation (1) is a very special case of stochastic differential equations with time delay; see [5, 6] and [20] for example. To estimate the true parameter ϑ with a prescribed least-squares accuracy ε we shall construct a sequential plan (T∗(ε), ϑ∗(ε)) working for all ϑ ∈ Θ. Here T∗(ε) is the duration of observations, which is a specially chosen stopping time, and ϑ∗(ε) is an estimator of ϑ. The set Θ̄ is defined to be the intersection of the set Θ with an arbitrary but fixed ball B_{0,R} ⊂ R^2. The sequential estimation problem has been solved for sets Θ of a different structure in [7]–[9], [11, 13, 14, 16] by observations of the process (1), and in [10, 12, 15] by noisy observations (2). In this chapter the set Θ of parameters consists of all (a, b) from R^2 which do not belong to the lines L1 or L2 defined in Section 2 below, which have Lebesgue measure zero. The proposed sequential plan is a composition of several different plans corresponding to the regions to which the unknown true parameter ϑ = (a, b) may belong. Each individual plan is based on a weighted correlation estimator, where the weight matrices are chosen in such a way that this estimator has an appropriate asymptotic behaviour, typical for the corresponding region to which ϑ belongs. Because this behaviour is closely connected with the asymptotic properties of the so-called fundamental solution x0(·) of the deterministic delay differential equation corresponding to (1) (see Section 2 for details), we have to treat the different regions of Θ = R^2 \ L, L = L1 ∪ L2, separately. If the true parameter ϑ belongs to L, the weighted correlation estimator under consideration converges weakly only, and thus the assertions of Theorem 3.1 below cannot be derived by means of such estimators. In general, the exclusion of the set L does not hinder applications of the results below in adaptive filtering, control theory and other fields, because of its Lebesgue measure zero.
In the papers [10, 12] the problem described above was solved for two special sets of parameters, ΘI (a straight line) and ΘII (where X(·) satisfying (1) is stable or periodic (unstable)), respectively. The general sequential estimation problem for all ϑ = (a, b) from R^2 except two lines was solved in [13, 14, 16] for the equation (1) based on observations of X(·). In this chapter the sequential estimation method developed in [10, 12] for the system (1), (2) is extended to the case considered in [13, 14, 16] for the equation (1) (as already mentioned, for all ϑ from R^2 except two lines, for observations without noise). A related result in this problem setting was first published, for estimators of another structure and without proofs, in [15]. A similar problem for partially observed stochastic dynamic systems without time delay was solved in [22, 23]. The organization of this chapter is as follows. Section 2 presents some preliminary facts needed for the further studies. In Section 3 we present the main result mentioned above. In Section 4 all proofs are given. Section 5 includes conclusions.

2. Preliminaries

To construct sequential plans for the estimation of the parameter ϑ we need some preparation. At first we summarize some known facts about the equation (1). For details the reader is referred to [3]. Together with the mentioned initial condition, the equation (1) has a uniquely determined solution X which can be represented for t ≥ 0 as follows:
X(t) = x0(t)X0(0) + b ∫_{−1}^{0} x0(t − s − 1)X0(s) ds + ∫_{0}^{t} x0(t − s) dW(s). (3)

Here x0 = (x0(t), t ≥ −1) denotes the fundamental solution of the deterministic equation

x0(t) = 1 + ∫_{0}^{t} (a x0(s) + b x0(s − 1)) ds, t ≥ 0, (4)

corresponding to (1), with x0(t) = 0, t ∈ [−1, 0), x0(0) = 1. The solution X has the property E ∫_{0}^{T} X^2(s) ds < ∞ for every T > 0. From (3) it is clear that the limit behaviour of X for t → ∞ depends strongly on the limit behaviour of x0(·). The asymptotic properties of x0(·) can be studied by the Laplace transform of x0, which equals (λ − a − b e^{−λ})^{−1}, λ any complex number. Let s = u(r) (r ≤ 1) and s = w(r) (r ∈ R^1) be the functions given by the following parametric representation (r(ξ), s(ξ)) in R^2: r(ξ) = ξ cot ξ, s(ξ) = −ξ/sin ξ, with ξ ∈ (0, π) and ξ ∈ (π, 2π), respectively. Now we define the parameter set Θ to be the plane R^2 without the lines L1 = {(a, u(a)), a ≤ 1} and L2 = {(a, w(a)), a ∈ R^1}, such that R^2 = Θ ∪ L1 ∪ L2. It does not seem possible to construct a general simple sequential procedure which has the desired properties under Pϑ for all ϑ ∈ Θ. Therefore we are going to divide the set Θ into some appropriate smaller regions where this is possible. This decomposition is closely connected with the structure of the set Λ of all (real or complex) roots of the so-called characteristic equation of (4):

λ − a − b e^{−λ} = 0.

Put v0 = v0(ϑ) = max{Re λ | λ ∈ Λ} and v1 = v1(ϑ) = max{Re λ | λ ∈ Λ, Re λ < v0}. Apart from the case b = 0 it holds that −∞ < v1 < v0 < ∞. By m(λ) we denote the multiplicity of the root λ ∈ Λ. Note that m(λ) = 1 for all λ ∈ Λ except for (a, b) ∈ R^2 with b = −e^{a−1}; in this case we have λ = a − 1 ∈ Λ and m(a − 1) = 2. The values v0(ϑ) and v1(ϑ) determine the asymptotic behaviour of x0(t) as t → ∞ (see [3] for details). Now we are able to divide Θ into regions appropriate for our purposes.
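The fundamental solution x0 defined by (4) is easy to compute numerically, which can help build intuition for the role of v0. The explicit-Euler sketch below is ours, not the chapter's; for b = 0 it must reduce to x0(t) = e^{at}, and for a = 0, b = 1 the method of steps gives x0(t) = 1 on [0, 1] and x0(t) = t on [1, 2].

```python
import numpy as np

def fundamental_solution(a, b, T, dt):
    """Explicit-Euler sketch of the fundamental solution x0 of
    x0'(t) = a x0(t) + b x0(t-1), t >= 0, with x0 = 0 on [-1, 0) and
    x0(0) = 1.  Returns x0 on the grid 0, dt, 2*dt, ..., T."""
    lag = int(round(1.0 / dt))           # steps per unit delay
    n = int(round(T / dt))
    x = np.zeros(n + lag + 1)            # grid covers [-1, T]; zero on [-1, 0)
    x[lag] = 1.0                         # x0(0) = 1
    for k in range(n):
        i = lag + k
        x[i + 1] = x[i] + (a * x[i] + b * x[i - lag]) * dt
    return x[lag:]
```

For large t, ln x0(t)/t approaches v0 when the dominant root is real, which is one way to visualize the growth rates that drive the decomposition of Θ below.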
Note that this decomposition is closely related to the classification used in [3]. There the plane R^2 was decomposed into eleven subsets; here we use another notation.

Definition (Θ). The set Θ of parameters is decomposed as Θ = Θ1 ∪ Θ2 ∪ Θ3 ∪ Θ4, where Θ1 = Θ11 ∪ Θ12 ∪ Θ13, Θ2 = Θ21 ∪ Θ22, Θ3 = Θ31, Θ4 = Θ41 ∪ Θ42, with

Θ11 = {ϑ ∈ R^2 | v0(ϑ) < 0},
Θ12 = {ϑ ∈ R^2 | v0(ϑ) > 0 and v0(ϑ) ∉ Λ},
Θ13 = {ϑ ∈ R^2 | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 2},
Θ21 = {ϑ ∈ R^2 | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 1, v1(ϑ) > 0 and v1(ϑ) ∈ Λ},
Θ22 = {ϑ ∈ R^2 | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 1, v1(ϑ) > 0 and v1(ϑ) ∉ Λ},
Θ31 = {ϑ ∈ R^2 | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 1 and v1(ϑ) < 0},
Θ41 = {ϑ ∈ R^2 | v0(ϑ) = 0, v0(ϑ) ∈ Λ, m(v0) = 1},
Θ42 = {ϑ ∈ R^2 | v0(ϑ) > 0, v0(ϑ) ∈ Λ, m(v0) = 1, v1(ϑ) = 0 and v1(ϑ) ∈ Λ}.

It should be noted that the cases (Q2 ∪ Q3) and (Q5) considered in [3] correspond to our exceptional lines L1 and L2, respectively. Here are some comments concerning the Θ subsets. The unions Θ1, . . . , Θ4 are singled out because the Fisher information matrix and the related design matrices considered below have similar asymptotic properties for all ϑ throughout every Θi (i = 1, . . . , 4). Obviously, all sets Θ11, . . . , Θ42 are pairwise disjoint, the closure of Θ equals R^2, and the exceptional set L1 ∪ L2 has Lebesgue measure zero. The set Θ11 is the set of parameters ϑ for which there exists a stationary solution of (1). Note that the one-parametric set Θ4 is a part of the boundaries of the following regions: Θ11, Θ12, Θ21, Θ3. In this case b = −a holds and (1) can be written as a differential equation with only one parameter, linear in the parameter. We shall use a truncation of all the introduced sets. First choose an arbitrary but fixed positive R. Define the set Θ̄ = {ϑ ∈ Θ : ||ϑ|| ≤ R} and, in a similar way, the subsets Θ̄11, . . . , Θ̄42. Sequential estimators of ϑ with a prescribed least-squares accuracy have already been constructed in [10, 12], but in these articles the set of possible parameters ϑ was restricted to Θ11 ∪ Θ12 ∪ {Θ41 \ {(0, 0)}} ∪ Θ42. To construct a sequential plan for estimating ϑ based on the observation of Y(·) we follow the line of [10, 12].
We shall use a single equation for Y of the form

dY(t) = ϑ′A(t)dt + ξ(t)dt + dV(t), (5)

where A(t) = (Y(t), Y(t − 1))′ and

ξ(t) = X(0) − aY(0) − bY(0) + b ∫_{−1}^{0} X0(s) ds − aV(t) − bV(t − 1) + W(t).

The random variables A(t) and ξ(t) are F(t)-measurable for every fixed t ≥ 1, and a short calculation shows that all conditions of type (7) in [12], consisting of

E ∫_{1}^{T} (|Y(t)| + |ξ(t)|) dt < ∞ for all T > 1,
E[Δ̃ξ(t) | F(t − 2)] = 0,
E[(Δ̃ξ(t))^2 | F(t − 2)] ≤ 1 + R^2,

hold in our case. Here Δ̃ denotes the difference operator defined by Δ̃f(t) = f(t) − f(t − 1).
Using this operator and (5) we obtain the following equation:

dΔ̃Y(t) = aΔ̃Y(t)dt + bΔ̃Y(t − 1)dt + Δ̃ξ(t)dt + dΔ̃V(t) (6)

with the initial condition Δ̃Y(1) = Y(1) − Y0. Thus we have reduced the system (1), (2) to a single differential equation for the observed process (Δ̃Y(t), t ≥ 2), depending on the unknown parameters a and b.

3. Construction of sequential estimation plans

In this section we construct the sequential estimation procedure for each of the cases Θ1, . . . , Θ4 separately. Then we define, similarly to [11, 13, 14, 16], the final sequential estimation plan, which works in Θ as the sequential plan with the smallest duration of observations. We construct the sequential estimation procedure for the parameter ϑ on the basis of the correlation method in the cases Θ1, Θ4 (similarly to [12, 14, 15]) and on the basis of correlation estimators with weights in the cases Θ2 ∪ Θ3. The latter cases and Θ13 are new. It should be noted that the sequential plan constructed, e.g., in [2] does not work for Θ3 here, even if we observe (X(·)) instead of (Y(·)).

3.1. Sequential estimation procedure for ϑ ∈ Θ1

Consider the problem of estimating ϑ ∈ Θ1. We will use a modification of the estimation procedure from [12], constructed for Case II therein. It can easily be shown that Proposition 3.1 below can be proved for the cases Θ11 ∪ Θ12 similarly to [12]. The modified procedure presented below is oriented, similarly to [16], to all the parameter sets Θ11, Θ12, Θ13. Thus we will prove Proposition 3.1 in detail for the case Θ13 only; the proofs for the cases Θ11 ∪ Θ12 are very similar. For the construction of the estimation procedure we assume h10 is a real number in (0, 1/5) and h1 is a random variable with values in [h10, 1/5] only, F(0)-measurable and having a known continuous distribution function.
Assume (cn)n≥1 is a given unboundedly increasing sequence of positive numbers satisfying the following condition:

∑_{n≥1} 1/cn < ∞. (7)

This construction follows principally the lines of [14, 16] (see [12] as well), to which the reader is referred for details. We introduce for every ε > 0 and every s ≥ 0 several quantities:

– the functions Ψs(t) = (Δ̃Y(t), Δ̃Y(t − s))′ for t ≥ 1 + s, and (0, 0)′ for t < 1 + s;

– the sequence of stopping times

τ1(n, ε) = h1 · inf{k ≥ 1 : ∫_{0}^{kh1} ||Ψh1(t − 2 − 5h1)||^2 dt ≥ ε^{−1}cn}, n ≥ 1;
– the matrices

G1(T, s) = ∫_{0}^{T} Ψs(t − 2 − 5s)Ψ′s(t) dt, Φ1(T, s) = ∫_{0}^{T} Ψs(t − 2 − 5s) dΔ̃Y(t),

G1(n, k, ε) = G1(τ1(n, ε) − kh1, h1), Φ1(n, k, ε) = Φ1(τ1(n, ε) − kh1, h1);

– the times k1(n) = arg min_{k=1,...,5} ||G1^{−1}(n, k, ε)||, n ≥ 1;

– the estimators ϑ1(n, ε) = G1^{−1}(n, ε)Φ1(n, ε), n ≥ 1, where G1(n, ε) = G1(n, k1(n), ε), Φ1(n, ε) = Φ1(n, k1(n), ε);

– the stopping time

σ1(ε) = inf{N ≥ 1 : S1(N) > (ρ1δ1^{−1})^{1/2}}, (8)

where S1(N) = ∑_{n=1}^{N} β1^2(n, ε), β1(n, ε) = ||G̃1^{−1}(n, ε)||, G̃1(n, ε) = (ε^{−1}cn)^{−1}G1(n, k1(n), ε), δ1 ∈ (0, 1) is some fixed chosen number and ρ1 = 15(3 + R^2) ∑_{n≥1} 1/cn.

The deviation of the 'first-step estimators' ϑ1(n, ε) has the form

ϑ1(n, ε) − ϑ = (ε^{−1}cn)^{−1/2} G̃1^{−1}(n, ε) ζ̃1(n, ε), n ≥ 1, (9)

ζ̃1(n, ε) = (ε^{−1}cn)^{−1/2} ∫_{0}^{τ1(n,ε)−k1(n)h1} Ψh1(t − 2 − 5h1)(Δ̃ξ(t)dt + dV(t) − dV(t − 1)).

By the definition of the stopping times τ1(n, ε) − k1(n)h1 we can control the noise ζ̃1(n, ε):

Eϑ||ζ̃1(n, ε)||^2 ≤ 15(3 + R^2), n ≥ 1, ε > 0,

and by the definition of the stopping time σ1(ε), the first factor G̃1^{−1}(n, ε) in the representation of the deviation (9). Define the sequential estimation plan of ϑ by

T1(ε) = τ1(σ1(ε), ε), ϑ1(ε) = (1/S1(σ1(ε))) ∑_{n=1}^{σ1(ε)} β1^2(n, ε) ϑ1(n, ε). (10)

We can see that the construction of the sequential estimator ϑ1(ε) is based on the family of estimators ϑ(T, s) = G1^{−1}(T, s)Φ1(T, s), s ≥ 0. We have taken the discretization step h1 as above because, for ϑ ∈ Θ12, the functions

f(T, s) = e^{2v0T} G1^{−1}(T, s)
for every s ≥ 0 have, almost surely, periodic matrix functions as limits in T. These limit matrix functions are finite, and may be infinite in norm only for four values of their argument T on every interval of periodicity of length Δ > 1 (see the proof of Theorem 3.2 in [10, 12]). In the sequel, limits of the type lim_{n→∞} a(n, ε) or lim_{ε→0} a(n, ε) will be used. To avoid repeating similar expressions we shall use, as in [12, 14, 16], the unifying notation lim_{n∨ε} a(n, ε) for both of these limits when their meaning is obvious. We state the results concerning the estimation of the parameter ϑ ∈ Θ1 in the following proposition.

Proposition 3.1. Assume that the condition (7) on the sequence (cn) holds and let the parameter ϑ = (a, b)′ in (1) be such that ϑ ∈ Θ1. Then:

I. For any ε > 0 and every ϑ ∈ Θ1 the sequential plan (T1(ε), ϑ1(ε)) defined by (10) is closed (T1(ε) < ∞ Pϑ-a.s.) and possesses the following properties:

1◦. sup_{ϑ∈Θ1} Eϑ||ϑ1(ε) − ϑ||^2 ≤ δ1ε;

2◦. the inequalities below are valid:
– for ϑ ∈ Θ11: 0 < lim inf_{ε→0} ε·T1(ε) ≤ lim sup_{ε→0} ε·T1(ε) < ∞ Pϑ-a.s.;
– for ϑ ∈ Θ12: 0 < lim inf_{ε→0} [T1(ε) − (1/2v0) ln ε^{−1}] ≤ lim sup_{ε→0} [T1(ε) − (1/2v0) ln ε^{−1}] < ∞ Pϑ-a.s.;
– for ϑ ∈ Θ13: 0 < lim inf_{ε→0} [T1(ε) + (1/v0) ln T1(ε) − Ψ13(ε)] and lim sup_{ε→0} [T1(ε) + (1/v0) ln T1(ε) − Ψ̄13(ε)] < ∞ Pϑ-a.s., where the functions Ψ13(ε) and Ψ̄13(ε) are defined in (30).

II. For every ϑ ∈ Θ1 the estimator ϑ1(n, ε) is strongly consistent:

lim_{n∨ε} ϑ1(n, ε) = ϑ Pϑ-a.s.

3.2. Sequential estimation procedure for ϑ ∈ Θ2

Assume (cn)n≥1 is an unboundedly increasing sequence of positive numbers satisfying the condition (7). We introduce for every ε > 0 several quantities:

– the parameter λ = e^{v0} and its estimator
λt = [∫_{2}^{t} Δ̃Y(s)Δ̃Y(s − 1) ds] / [∫_{2}^{t} (Δ̃Y(s − 1))^2 ds] for t > 2, and λt = 0 otherwise; (11)

– the functions

Z(t) = Δ̃Y(t) − λΔ̃Y(t − 1) for t ≥ 2, and 0 for t < 2;
Z̃(t) = Δ̃Y(t) − λtΔ̃Y(t − 1) for t ≥ 2, and 0 for t < 2;
Ψ(t) = (Δ̃Y(t), Δ̃Y(t − 1))′ for t ≥ 2, and (0, 0)′ for t < 2;
Ψ̃(t) = (Z̃(t), Δ̃Y(t))′ for t ≥ 2, and (0, 0)′ for t < 2;

– the parameter α = v0/v1 and its estimator

α2(n, ε) = [ln ∫_{4}^{ν2(n,ε)} (Δ̃Y(t − 3))^2 dt] / [δ ln ε^{−1}cn], (12)

where

ν2(n, ε) = inf{T > 4 : ∫_{4}^{T} Z̃^2(t − 3) dt = (ε^{−1}cn)^δ} (13)

and δ ∈ (0, 1) is a given number;

– the sequence of stopping times

τ2(n, ε) = h2 · inf{k > h2^{−1}ν2(n, ε) : ∫_{ν2(n,ε)}^{kh2} ||Ψ2^{−1/2}(n, ε)Ψ̃(t − 3)||^2 dt ≥ 1},

where we suppose h2 = 1/5 and Ψ2(n, ε) = diag{ε^{−1}cn, (ε^{−1}cn)^{α2(n,ε)}};

– the matrices

G2(S, T) = ∫_{S}^{T} Ψ̃(t − 3)Ψ′(t) dt, Φ2(S, T) = ∫_{S}^{T} Ψ̃(t − 3) dΔ̃Y(t),

G2(n, k, ε) = G2(ν2(n, ε), τ2(n, ε) − kh2), Φ2(n, k, ε) = Φ2(ν2(n, ε), τ2(n, ε) − kh2);

– the times k2(n) = arg min_{k=1,...,5} ||G2^{−1}(n, k, ε)||, n ≥ 1;

– the estimators ϑ2(n, ε) = G2^{−1}(n, ε)Φ2(n, ε), n ≥ 1,
where G2(n, ε) = G2(n, k2(n), ε), Φ2(n, ε) = Φ2(n, k2(n), ε);

– the stopping time

σ2(ε) = inf{N ≥ 1 : S2(N) > (ρ2δ2^{−1})^{1/2}}, (14)

where S2(N) = ∑_{n=1}^{N} β2^2(n, ε), ρ2 = ρ1, δ2 ∈ (0, 1) is some fixed chosen number, β2(n, ε) = ||G̃2^{−1}(n, ε)|| and G̃2(n, ε) = (ε^{−1}cn)^{−1/2}Ψ2^{−1/2}(n, ε)G2(n, ε).

In this case we write the deviation of ϑ2(n, ε) in the form

ϑ2(n, ε) − ϑ = (ε^{−1}cn)^{−1/2} G̃2^{−1}(n, ε) ζ̃2(n, ε), n ≥ 1,

where

ζ̃2(n, ε) = Ψ2^{−1/2}(n, ε) ∫_{ν2(n,ε)}^{τ2(n,ε)−k2(n)h2} Ψ̃(t − 3)(Δ̃ξ(t)dt + dV(t) − dV(t − 1)),

and we have Eϑ||ζ̃2(n, ε)||^2 ≤ 15(3 + R^2), n ≥ 1, ε > 0. Define the sequential estimation plan of ϑ by

T2(ε) = τ2(σ2(ε), ε), ϑ2(ε) = ϑ2(σ2(ε), ε). (15)

The construction of the sequential estimator ϑ2(ε) is based on the family of estimators

ϑ2(S, T) = G2^{−1}(S, T)Φ2(S, T) = e^{−v1T}G̃2^{−1}(S, T)Φ̃2(S, T), T > S ≥ 0,

where G̃2(S, T) = e^{−v1T}Ψ2^{−1/2}(T)G2(S, T), Φ̃2(S, T) = Ψ2^{−1/2}(T)Φ2(S, T) and Ψ2(T) = diag{e^{v1T}, e^{v0T}}. We have taken the discretization step h2 as above because, for ϑ ∈ Θ22, similarly to the case ϑ ∈ Θ12, the function f2(S, T) = G̃2^{−1}(S, T) has a periodic (with period Δ > 1) matrix function as a limit almost surely (see (35)). This limit matrix function may have an infinite norm only for four values of its argument T on every interval of periodicity of length Δ. We state the results concerning the estimation of the parameter ϑ ∈ Θ2 in the following proposition.

Proposition 3.2. Assume that the condition (7) on the sequence (cn) holds and let the parameter ϑ = (a, b)′ in (1) be such that ϑ ∈ Θ2. Then:

I. For any ε > 0 and every ϑ ∈ Θ2 the sequential plan (T2(ε), ϑ2(ε)) defined by (15) is closed and possesses the following properties:

1◦. sup_{ϑ∈Θ2} Eϑ||ϑ2(ε) − ϑ||^2 ≤ δ2ε;
2◦. the inequalities below are valid:

0 < lim inf_{ε→0} [T2(ε) − (1/2v1) ln ε^{−1}] ≤ lim sup_{ε→0} [T2(ε) − (1/2v1) ln ε^{−1}] < ∞ Pϑ-a.s.

II. For every ϑ ∈ Θ2 the estimator ϑ2(n, ε) is strongly consistent: lim_{n∨ε} ϑ2(n, ε) = ϑ Pϑ-a.s.

3.3. Sequential estimation procedure for ϑ ∈ Θ3

We shall use the notation introduced in the previous paragraph for the parameter λ = e^{v0} and its estimator λt, as well as for the functions Z(t), Z̃(t), Ψ(t) and Ψ̃(t). Choose non-random functions ν3(n, ε), n ≥ 1, ε > 0, satisfying the following conditions as ε → 0 or n → ∞:

ν3(n, ε) = o(ε^{−1}cn), [log^{1/2} ν3(n, ε) · e^{v0ν3(n,ε)}] / (ε^{−1}cn) = o(1). (16)

Example: ν3(n, ε) = log^2 ε^{−1}cn. We introduce several quantities:

– the parameter α3 = v0 and its estimator α3(n, ε) = ln |λν3(n,ε)|, where λt is defined in (11);

– the sequences of stopping times

τ31(n, ε) = inf{T > 0 : ∫_{ν3(n,ε)}^{T} Z̃^2(t − 3) dt = ε^{−1}cn}, (17)

τ32(n, ε) = inf{T > 0 : ∫_{ν3(n,ε)}^{T} (Δ̃Y(t − 3))^2 dt = e^{2α3(n,ε)}ε^{−1}cn}, (18)

τmin(n, ε) = min{τ31(n, ε), τ32(n, ε)}, τmax(n, ε) = max{τ31(n, ε), τ32(n, ε)};

– the matrices

G3(S, T) = ∫_{S}^{T} Ψ̃(t)Ψ′(t) dt, Φ3(S, T) = ∫_{S}^{T} Ψ̃(t) dΔ̃Y(t),

G3(n, ε) = G3(ν3(n, ε), τmin(n, ε)), Φ3(n, ε) = Φ3(ν3(n, ε), τmin(n, ε));

– the estimators ϑ3(n, ε) = G3^{−1}(n, ε)Φ3(n, ε), n ≥ 1, ε > 0;
– the stopping time

σ3(ε) = inf{N ≥ 1 : S3(N) > (ρ3δ3^{−1})^{1/2}}, (19)

where S3(N) = ∑_{n=1}^{N} β3^2(n, ε), δ3 ∈ (0, 1) is some fixed chosen number, β3(n, ε) = ||G̃3^{−1}(n, ε)||, ρ3 = 6(3 + R^2) ∑_{n≥1} 1/cn, G̃3(n, ε) = (ε^{−1}cn)^{−1/2}Ψ3^{−1/2}(n, ε)G3(n, ε) and Ψ3(n, ε) = diag{ε^{−1}cn, e^{2α3(n,ε)}ε^{−1}cn}.

In this case we write the deviation of ϑ3(n, ε) in the form

ϑ3(n, ε) − ϑ = (ε^{−1}cn)^{−1/2} G̃3^{−1}(n, ε) ζ̃3(n, ε), n ≥ 1,

where

ζ̃3(n, ε) = Ψ3^{−1/2}(n, ε) ∫_{ν3(n,ε)}^{τmin(n,ε)} Ψ̃(t − 3)(Δ̃ξ(t)dt + dV(t) − dV(t − 1)),

and we have Eϑ||ζ̃3(n, ε)||^2 ≤ 6(3 + R^2), n ≥ 1, ε > 0. Define the sequential estimation plan of ϑ by

T3(ε) = τmax(σ3(ε), ε), ϑ3(ε) = ϑ3(σ3(ε), ε). (20)

Proposition 3.3. Assume that the condition (7) on the sequence (cn) holds and let the parameter ϑ = (a, b)′ in (1) be such that ϑ ∈ Θ3. Then:

I. For every ϑ ∈ Θ3 the sequential plan (T3(ε), ϑ3(ε)) defined in (20) is closed and possesses the following properties:

1◦. for any ε > 0, sup_{ϑ∈Θ3} Eϑ||ϑ3(ε) − ϑ||^2 ≤ δ3ε;

2◦. the following inequalities are valid: 0 < lim inf_{ε→0} εT3(ε) ≤ lim sup_{ε→0} εT3(ε) < ∞ Pϑ-a.s.

II. For every ϑ ∈ Θ3 the estimator ϑ3(n, ε) is strongly consistent: lim_{n∨ε} ϑ3(n, ε) = ϑ Pϑ-a.s.

3.4. Sequential estimation procedure for ϑ ∈ Θ4

In this case b = −a and (6) is a differential equation of the first order:

dΔ̃Y(t) = aZ∗(t)dt + Δ̃ξ(t)dt + dV(t) − dV(t − 1), t ≥ 2,

where Z∗(t) = Δ̃Y(t) − Δ̃Y(t − 1) for t ≥ 2, and 0 for t < 2.
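All the stopping times introduced so far (τ1(n, ε), ν2(n, ε), τ2(n, ε), τ31, τ32) share one pattern: observation continues until the integral of a nonnegative "information" quantity first reaches a prescribed level such as ε^{−1}cn. On a discrete grid this first-passage rule can be sketched as follows; the function name and the vectorized form are ours, not the chapter's.

```python
import numpy as np

def stopping_indices(increments, levels):
    """First grid index at which the running sum of the nonnegative
    per-step contributions (e.g. ||Psi(t)||^2 * dt) reaches each
    threshold in `levels` (e.g. eps^{-1} c_n for n = 1, 2, ...)."""
    csum = np.cumsum(increments)
    # searchsorted returns, for each level, the first index i with
    # csum[i] >= level, i.e. the discretized first-passage time.
    return np.searchsorted(csum, levels)
```

One call evaluates the whole sequence of thresholds at once, mirroring how the plans above reuse a single observed trajectory for every n.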
We shall construct a sequential plan $(T_4(\varepsilon), \vartheta_4(\varepsilon))$ for the estimation of the vector parameter $\vartheta = a(1,-1)$ with $(\delta_4\varepsilon)$-accuracy in the sense of the $L_2$-norm for every $\varepsilon>0$ and a fixed chosen $\delta_4\in(0,1)$.

First we define the sequential estimation plans for the scalar parameter $a$ on the basis of correlation estimators, which are generalized least squares estimators:

$$a_4(T) = G_4^{-1}(T)\,\Phi_4(T), \qquad G_4(T) = \int_0^T Z^*(t-2)Z^*(t)\,dt, \qquad \Phi_4(T) = \int_0^T Z^*(t-2)\,d\tilde\Delta Y(t), \quad T>0.$$

Let $(c_n,\ n\ge1)$ be an unboundedly increasing sequence of positive numbers satisfying the condition (7). We shall define

– the sequence of stopping times $(\tau_4(n,\varepsilon),\ n\ge1)$ as
$$\tau_4(n,\varepsilon) = \inf\Big\{T>2 : \int_0^T (Z^*(t-2))^2\,dt = \varepsilon^{-1}c_n\Big\}, \quad n\ge1;$$

– the sequence of estimators
$$a_4(n,\varepsilon) = a_4(\tau_4(n,\varepsilon)) = G_4^{-1}(\tau_4(n,\varepsilon))\,\Phi_4(\tau_4(n,\varepsilon));$$

– the stopping time
$$\sigma_4(\varepsilon) = \inf\{N\ge1 : S_4(N) > (\rho_4\delta_4^{-1})^{1/2}\}, \tag{21}$$
where
$$S_4(N) = \sum_{n=1}^N \tilde G_4^{-2}(n,\varepsilon), \qquad \rho_4 = \rho_3, \qquad \tilde G_4(n,\varepsilon) = (\varepsilon^{-1}c_n)^{-1}G_4(\tau_4(n,\varepsilon)).$$

The deviation of $a_4(n,\varepsilon)$ has the form
$$a_4(n,\varepsilon) - a = (\varepsilon^{-1}c_n)^{-1/2}\,\tilde G_4^{-1}(n,\varepsilon)\,\tilde\zeta_4(n,\varepsilon), \quad n\ge1,$$
where
$$\tilde\zeta_4(n,\varepsilon) = (\varepsilon^{-1}c_n)^{-1/2}\int_0^{\tau_4(n,\varepsilon)} Z^*(t-2)\,\big(\tilde\Delta\xi(t)\,dt + dV(t) - dV(t-1)\big)$$
and we have
$$E_\vartheta\|\tilde\zeta_4(n,\varepsilon)\|^2 \le 3(3+R^2), \quad n\ge1,\ \varepsilon>0.$$

We define the sequential plan $(T_4(\varepsilon), \vartheta_4(\varepsilon))$ for the estimation of $\vartheta$ as
$$T_4(\varepsilon) = \tau_4(\sigma_4(\varepsilon),\varepsilon), \qquad \vartheta_4(\varepsilon) = a_4(\sigma_4(\varepsilon),\varepsilon)\,(1,-1). \tag{22}$$

The following proposition presents the conditions under which $T_4(\varepsilon)$ and $\vartheta_4(\varepsilon)$ are well-defined and have the desired property of preassigned mean square accuracy.
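The core of the $\Theta_4$-type plan, observing until the accumulated squared regressor (the observed information) hits a threshold $\varepsilon^{-1}c_n$ and then forming the ratio estimator, can be illustrated by a small Euler-scheme simulation. This is a heavily simplified sketch, not the chapter's procedure: the delay structure of $Z^*(t)$ and the correlated noise $\tilde\Delta\xi(t)\,dt + dV(t)-dV(t-1)$ are replaced by a plain Brownian disturbance and a lag-free regressor, and all names are illustrative.

```python
import math
import random

def sequential_scalar_estimate(a_true, h, dt=1e-3, seed=0, max_steps=10_000_000):
    """Simulate dY = a*Z dt + dW with Z(t) = Y(t) (a lag-free stand-in for Z*),
    stop at tau = inf{T : int_0^T Z^2 dt >= h}, where h plays the role of
    epsilon^{-1} c_n, and return the correlation-type estimate
    a_hat = int_0^tau Z dY / int_0^tau Z^2 dt together with the attained
    information. Illustrative sketch only; the delayed differences and the
    noisy-observation structure of equation (6) are omitted."""
    rng = random.Random(seed)
    y = 0.0
    info = 0.0    # running int_0^T Z^2 dt
    cross = 0.0   # running int_0^T Z dY
    for _ in range(max_steps):
        z = y
        dw = rng.gauss(0.0, math.sqrt(dt))
        dy = a_true * z * dt + dw
        info += z * z * dt
        cross += z * dy
        y += dy
        if info >= h:
            return cross / info, info
    raise RuntimeError("information threshold not reached")
```

The point of the construction is visible in the simulation: because the experiment is stopped at a fixed information level rather than a fixed time, the estimator's conditional variance is controlled by $h$ regardless of the unknown $a$, which is exactly what yields the guaranteed bound in Proposition 3.4.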
Proposition 3.4. Assume that the sequence $(c_n)$ defined above satisfies the condition (7). Then we obtain the following result:

I. For any $\varepsilon>0$ and every $\vartheta\in\Theta_4$ the sequential plan $(T_4(\varepsilon),\vartheta_4(\varepsilon))$ defined by (22) is closed and has the following properties:

1°. $\displaystyle\sup_{\vartheta\in\Theta_4} E_\vartheta\|\vartheta_4(\varepsilon)-\vartheta\|^2 \le \delta_4\varepsilon;$

2°. the following relations hold:
– if $\vartheta\in\Theta_{41}$, then
$$0 < \liminf_{\varepsilon\to0}\varepsilon\, T_4(\varepsilon) \le \limsup_{\varepsilon\to0}\varepsilon\, T_4(\varepsilon) < \infty \quad P_\vartheta\text{-a.s.},$$
– if $\vartheta\in\Theta_{42}$, then
$$0 < \liminf_{\varepsilon\to0}\Big[T_4(\varepsilon)-\frac{1}{2v_0}\ln\varepsilon^{-1}\Big] \le \limsup_{\varepsilon\to0}\Big[T_4(\varepsilon)-\frac{1}{2v_0}\ln\varepsilon^{-1}\Big] < \infty \quad P_\vartheta\text{-a.s.};$$

II. For every $\vartheta\in\Theta_4$ the estimator $\vartheta_4(n,\varepsilon)$ is strongly consistent:
$$\lim_{n\vee\varepsilon}\vartheta_4(n,\varepsilon) = \vartheta \quad P_\vartheta\text{-a.s.}$$

3.5. General sequential estimation procedure of the time-delayed process

In this paragraph we construct the sequential estimation procedure for the parameters $a$ and $b$ of the process (1) on the basis of the estimators presented in subsections 3.1–3.4. Denote
$$j^* = \arg\min_{j=1,\dots,4} T_j(\varepsilon).$$
We define the sequential plan $(T^*(\varepsilon), \vartheta^*(\varepsilon))$ of estimation of $\vartheta\in\Theta$ on the basis of all the estimators constructed above by the formulae
$$SEP(\varepsilon) = (T^*(\varepsilon), \vartheta^*(\varepsilon)), \qquad T^*(\varepsilon) = T_{j^*}(\varepsilon), \qquad \vartheta^*(\varepsilon) = \vartheta_{j^*}(\varepsilon).$$

The following theorem is valid.

Theorem 3.1. Assume that the underlying processes $(X(t))$ and $(Y(t))$ satisfy the equations (1), (2), that the parameter $\vartheta$ to be estimated belongs to the region $\Theta$, and that for the numbers $\delta_1,\dots,\delta_4$ in the definitions (10), (15), (20) and (22) of the sequential plans the condition $\sum_{j=1}^4 \delta_j = 1$ is fulfilled. Then the sequential estimation plan $(T^*(\varepsilon), \vartheta^*(\varepsilon))$ possesses the following properties:

1°. for any $\varepsilon>0$ and for every $\vartheta\in\Theta$
$$T^*(\varepsilon) < \infty \quad P_\vartheta\text{-a.s.};$$

2°. for any $\varepsilon>0$
$$\sup_{\vartheta\in\Theta} E_\vartheta\|\vartheta^*(\varepsilon)-\vartheta\|^2 \le \varepsilon;$$

3°. the following relations hold with $P_\vartheta$-probability one:
– for $\vartheta\in\Theta_{11}\cup\Theta_3\cup\Theta_{41}$
$$\limsup_{\varepsilon\to0}\varepsilon\, T^*(\varepsilon) < \infty;$$
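Operationally, the combined procedure of subsection 3.5 just runs the four regional plans and keeps the one that closes first. A trivial sketch (hypothetical data layout: each plan is assumed reduced to a pair of its duration $T_j(\varepsilon)$ and its estimate $\vartheta_j(\varepsilon)$):

```python
def select_plan(plans):
    """Sketch of the selection rule j* = argmin_j T_j(eps) of subsection 3.5:
    'plans' is a list of (T_j, theta_j) pairs for j = 1..4 (0-indexed here);
    return the winning index and its (duration, estimate) pair."""
    j_star = min(range(len(plans)), key=lambda j: plans[j][0])
    return j_star, plans[j_star]
```

The accuracy budget $\sum_j \delta_j = 1$ in Theorem 3.1 is what lets the selected plan inherit the overall $\varepsilon$-bound even though the true region $\Theta_j$ containing $\vartheta$ is unknown.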
– for $\vartheta\in\Theta_{12}\cup\Theta_{42}$
$$\limsup_{\varepsilon\to0}\Big[T^*(\varepsilon) - \frac{1}{2v_0}\ln\varepsilon^{-1}\Big] < \infty;$$
– for $\vartheta\in\Theta_{13}$
$$\limsup_{\varepsilon\to0}\Big[T^*(\varepsilon) + \frac{1}{v_0}\ln T_1(\varepsilon) - \Psi''_{13}(\varepsilon)\Big] < \infty,$$
where the function $\Psi''_{13}(\varepsilon)$ is defined in (30);
– for $\vartheta\in\Theta_2$
$$\limsup_{\varepsilon\to0}\Big[T^*(\varepsilon) - \frac{1}{2v_1}\ln\varepsilon^{-1}\Big] < \infty.$$

4. Proofs

Proof of Proposition 3.1. The closedness of the sequential estimation plan, as well as assertions I.2 and II of Proposition 3.1 for the cases $\Theta_{11}\cup\Theta_{12}$, can easily be verified similarly to [10, 12, 14, 16]. Now we verify the finiteness of the stopping time $T_1(\varepsilon)$ in the new case $\Theta_{13}$. By the definition of $\tilde\Delta Y(t)$ we have
$$\tilde\Delta Y(t) = \tilde X(t) + \tilde\Delta V(t), \quad t\ge1, \qquad \text{where } \tilde X(t) = \int_{t-1}^{t} X(s)\,ds.$$
It is easy to show that the process $\tilde X(\cdot)$ has the following representation:
$$\tilde X(t) = \tilde x_0(t)X_0(0) + b\int_{-1}^{0} \tilde x_0(t-s-1)X_0(s)\,ds + \int_0^t \tilde x_0(t-s)\,dW(s) \quad \text{for } t\ge1,$$
$$\tilde X(t) = \int_{t-1}^{0} X_0(s)\,ds + \int_0^t X(s)\,ds \quad \text{for } t\in[0,1), \qquad \tilde X(t) = 0 \ \text{for } t\in[-1,0).$$
Based on the representation above for the function $x_0(\cdot)$ and the subsequent properties of $x_0(t)$, the function $\tilde x_0(t) = \int_{t-1}^{t} x_0(s)\,ds$ can easily be shown to fulfil $\tilde x_0(t) = 0$, $t\in[-1,0]$, and, as $t\to\infty$,
$$\tilde x_0(t) = \begin{cases}
o(e^{\gamma t}),\ \gamma<0, & \vartheta\in\Theta_{11},\\[1mm]
\tilde\varphi_0(t)e^{v_0 t} + o(e^{\gamma_0 t}),\ \gamma_0<v_0, & \vartheta\in\Theta_{12},\\[1mm]
\dfrac{2}{v_0}\Big[(1-e^{-v_0})t + e^{-v_0} - \dfrac{1-e^{-v_0}}{v_0}\Big]e^{v_0 t} + o(e^{\gamma_0 t}),\ \gamma_0<v_0, & \vartheta\in\Theta_{13},\\[2mm]
\dfrac{1-e^{-v_0}}{v_0(v_0-a+1)}e^{v_0 t} + \dfrac{1-e^{-v_1}}{v_1(a-v_1-1)}e^{v_1 t} + o(e^{\gamma_1 t}),\ \gamma_1<v_1, & \vartheta\in\Theta_{21},\\[2mm]
\dfrac{1-e^{-v_0}}{v_0(v_0-a+1)}e^{v_0 t} + \tilde\varphi_1(t)e^{v_1 t} + o(e^{\gamma_1 t}),\ \gamma_1<v_1, & \vartheta\in\Theta_{22},\\[2mm]
\dfrac{1-e^{-v_0}}{v_0(v_0-a+1)}e^{v_0 t} + o(e^{\gamma t}),\ \gamma<0, & \vartheta\in\Theta_{3},\\[2mm]
\dfrac{1}{1-a} + o(e^{\gamma t}),\ \gamma<0, & \vartheta\in\Theta_{41},\\[2mm]
\dfrac{1-e^{-v_0}}{v_0(v_0-a+1)}e^{v_0 t} - \dfrac{1}{a-1} + o(e^{\gamma t}),\ \gamma<0, & \vartheta\in\Theta_{42},
\end{cases}$$
where $\tilde\varphi_i(t) = \tilde A_i\cos\xi_i t + \tilde B_i\sin\xi_i t$ and $\tilde A_i$, $\tilde B_i$, $\xi_i$ are some constants (see [10, 12]).

The processes $\tilde X(t)$ and $\tilde\Delta V(t)$ are mutually independent, and the process $\tilde X(t)$ has a representation similar to (3). Then, after some algebra similar to that in [10, 12], we get for the processes $\tilde X(t)$, $\tilde Y(t) = \tilde X(t) - \lambda\tilde X(t-1)$, $\lambda = e^{v_0}$, $\tilde\Delta Y(t)$ and
$$Z(t) = \begin{cases}\tilde\Delta Y(t) - \lambda\tilde\Delta Y(t-1) & \text{for } t\ge2,\\[1mm] 0 & \text{for } t<2\end{cases}$$
in the case $\Theta_{13}$ the following limits:
$$\lim_{t\to\infty} t^{-1}e^{-v_0t}\tilde\Delta Y(t) = \lim_{t\to\infty} t^{-1}e^{-v_0t}\tilde X(t) = \tilde C_X \quad P_\vartheta\text{-a.s.}, \tag{23}$$
$$\lim_{t\to\infty} e^{-v_0t}\tilde Y(t) = C_Y, \qquad \lim_{t\to\infty} e^{-v_0t}Z(t) = \tilde C_Z \quad P_\vartheta\text{-a.s.},$$
and, as follows, for $u\ge0$
$$\lim_{T\to\infty}\Big| T^{-2}e^{-2v_0T}\int_1^T \tilde\Delta Y(t-u)\,\tilde\Delta Y(t)\,dt - \frac{\tilde C_X^2}{2v_0}\Big(1-\frac{u}{T}\Big)e^{-uv_0}\Big| = 0 \quad P_\vartheta\text{-a.s.}, \tag{24}$$
$$\lim_{T\to\infty}\Big| T^{-1}e^{-2v_0T}\int_1^T \tilde\Delta Y(t-u)\,Z(t)\,dt - \frac{\tilde C_X\tilde C_Z}{2v_0}\Big(1-\frac{u}{T}\Big)e^{-uv_0}\Big| = 0 \quad P_\vartheta\text{-a.s.},$$
where $\tilde C_X$, $C_Y$ and $\tilde C_Z$ are some nonzero constants, which can be found in [10, 12]. From (24) we obtain the limits
$$\lim_{T\to\infty}\frac{1}{T^2e^{2v_0T}}\,G_1(T,s) = G_{13}(s), \qquad \lim_{T\to\infty} T^{-1}e^{-4v_0T}\,|G_1(T,s)| = G_{13}\,e^{-(3+11s)v_0} \quad P_\vartheta\text{-a.s.},$$
$$G_{13}(s) = \frac{\tilde C_X^2}{2v_0}\begin{pmatrix} e^{-(2+5s)v_0} & e^{-(1+5s)v_0}\\[1mm] e^{-2(1+3s)v_0} & e^{-(1+6s)v_0}\end{pmatrix}, \qquad G_{13} = \frac{s\,\tilde C_X^3\tilde C_Z}{4v_0^2},$$
and, as follows, we can find
$$\lim_{T\to\infty} T^{-1}e^{2v_0T}\,G_1^{-1}(T,s) = \tilde G_{13}(s) \quad P_\vartheta\text{-a.s.},$$
$$\tilde G_{13}(s) = \frac{2v_0e^{(3+11s)v_0}}{s\,\tilde C_X\tilde C_Z}\begin{pmatrix} e^{-(1+6s)v_0} & -e^{-(1+5s)v_0}\\[1mm] -e^{-2(1+3s)v_0} & e^{-(2+5s)v_0}\end{pmatrix}$$
is a non-random matrix function. From (23) and by the definition of the stopping times $\tau_1(n,\varepsilon)$ we have
$$\lim_{n\vee\varepsilon}\frac{\tau_1^2(n,\varepsilon)\,e^{2\tau_1(n,\varepsilon)v_0}}{\varepsilon^{-1}c_n} = g_{13}^* \quad P_\vartheta\text{-a.s.}, \tag{25}$$
where $g_{13}^* = 2v_0\tilde C_X^{-2}\big(e^{-2v_0(2+5h_1)} + e^{-4v_0(1+3h_1)}\big)^{-1}$ and, as follows,
$$\lim_{n\vee\varepsilon}\Big[\tau_1(n,\varepsilon) + \frac{1}{v_0}\ln\tau_1(n,\varepsilon) - \frac{1}{2v_0}\ln\varepsilon^{-1}c_n\Big] = \frac{1}{2v_0}\ln g_{13}^* \quad P_\vartheta\text{-a.s.}, \tag{26}$$
$$\lim_{n\vee\varepsilon}\frac{\tau_1(n,\varepsilon)}{\ln\varepsilon^{-1}c_n} = \frac{1}{2v_0} \quad P_\vartheta\text{-a.s.}, \tag{27}$$
$$\lim_{n\vee\varepsilon}\Big[\frac{1}{\ln^3\varepsilon^{-1}c_n}\,\tilde G_1^{-1}(n,\varepsilon) - [(2v_0)^3g_{13}^*]^{-1}e^{-2v_0k_1(n)h_1}\,\tilde G_{13}(h_1)\Big] = 0 \quad P_\vartheta\text{-a.s.} \tag{28}$$
From (8) and (28) follows the $P_\vartheta$-a.s. finiteness of the stopping time $\sigma_1(\varepsilon)$ for every $\varepsilon>0$. The proof of assertion I.1 of Proposition 3.1 for the case $\Theta_{13}$ is similar, e.g., to the proof of the corresponding assertion in [14, 16]:
$$E_\vartheta\|\vartheta_1(\varepsilon)-\vartheta\|^2 = E_\vartheta\frac{1}{S^2(\sigma_1(\varepsilon))}\Big\|\sum_{n=1}^{\sigma_1(\varepsilon)}\beta_1^2(n,\varepsilon)\,(\vartheta_1(n,\varepsilon)-\vartheta)\Big\|^2 \le$$
$$\le \frac{\varepsilon\,\delta_1}{\rho_1}\,E_\vartheta\sum_{n=1}^{\sigma_1(\varepsilon)}\frac{1}{c_n}\cdot\beta_1^2(n,\varepsilon)\cdot\|\tilde G_1^{-1}(n,\varepsilon)\|^2\cdot\|\tilde\zeta_1(n,\varepsilon)\|^2 \le$$
$$\le \frac{\varepsilon\,\delta_1}{\rho_1}\sum_{n\ge1}\frac{1}{c_n}\,E_\vartheta\|\tilde\zeta_1(n,\varepsilon)\|^2 \le \varepsilon\,\delta_1\,\frac{15(3+R^2)}{\rho_1}\sum_{n\ge1}\frac{1}{c_n} = \varepsilon\,\delta_1.$$
Now we prove assertion I.2 for $\vartheta\in\Theta_{13}$. Denote the number
$$\tilde g_{13} = [(2v_0)^3g_{13}^*]^2\,\rho_1^{-1}\delta_1\,\|\tilde G_{13}(h_1)\|^{-2}$$
and the times
$$\sigma'_{13}(\varepsilon) = \inf\Big\{N\ge1 : \sum_{n=1}^N \ln^6\varepsilon^{-1}c_n > \tilde g_{13}\,e^{4v_0h_1}\Big\}, \qquad \sigma''_{13}(\varepsilon) = \inf\Big\{N\ge1 : \sum_{n=1}^N \ln^6\varepsilon^{-1}c_n > \tilde g_{13}\,e^{20v_0h_1}\Big\}.$$
From (8) and (28) it follows that, for $\varepsilon$ small enough,
$$\sigma'_{13}(\varepsilon) \le \sigma_1(\varepsilon) \le \sigma''_{13}(\varepsilon) \quad P_\vartheta\text{-a.s.} \tag{29}$$
Denote
$$\Psi'_{13}(\varepsilon) = \frac{1}{2v_0}\ln\big(\varepsilon^{-1}c_{\sigma'_{13}(\varepsilon)}\big), \qquad \Psi''_{13}(\varepsilon) = \frac{1}{2v_0}\ln\big(\varepsilon^{-1}c_{\sigma''_{13}(\varepsilon)}\big). \tag{30}$$
Then, from (8), (26) and (29) we finally obtain assertion I.2 of Proposition 3.1:
$$\liminf_{\varepsilon\to0}\Big[T_1(\varepsilon) + \frac{1}{v_0}\ln T_1(\varepsilon) - \Psi'_{13}(\varepsilon)\Big] \ge \frac{1}{2v_0}\ln g_{13}^* \quad P_\vartheta\text{-a.s.},$$
$$\limsup_{\varepsilon\to0}\Big[T_1(\varepsilon) + \frac{1}{v_0}\ln T_1(\varepsilon) - \Psi''_{13}(\varepsilon)\Big] \le \frac{1}{2v_0}\ln g_{13}^* \quad P_\vartheta\text{-a.s.}$$

For the proof of assertion II of Proposition 3.1 we will use the representation (9) for the deviation:
$$\vartheta_1(n,\varepsilon)-\vartheta = \frac{1}{\ln^3\varepsilon^{-1}c_n}\,\tilde G_1^{-1}(n,\varepsilon)\cdot\frac{\tau_1^2(n,\varepsilon)\,e^{2\tau_1(n,\varepsilon)v_0}}{\varepsilon^{-1}c_n}\cdot\Big(\frac{\ln\varepsilon^{-1}c_n}{\tau_1(n,\varepsilon)}\Big)^3\cdot\frac{1}{\tau_1^{-1}(n,\varepsilon)\,e^{2\tau_1(n,\varepsilon)v_0}}\,\zeta_1(n,\varepsilon),$$
where
$$\zeta_1(n,\varepsilon) = \zeta_1(\tau_1(n,\varepsilon)-k_1(n)h_1,\ h_1), \qquad \zeta_1(T,s) = \int_0^T \Psi_s(t-2-5s)\,\big(\tilde\Delta\xi(t)\,dt + dV(t) - dV(t-1)\big).$$
According to (25), (27) and (28), the first three factors on the right-hand side of this equality have $P_\vartheta$-a.s. positive finite limits. The last factor vanishes in the $P_\vartheta$-a.s. sense by the properties of the square integrable martingales $\zeta_1(T,s)$:
$$\lim_{n\vee\varepsilon}\frac{\zeta_1(n,\varepsilon)}{\tau_1^{-1}(n,\varepsilon)\,e^{2\tau_1(n,\varepsilon)v_0}} = \lim_{T\to\infty}\frac{\zeta_1(T,h_1)}{T^{-1}e^{2v_0T}} = 0 \quad P_\vartheta\text{-a.s.}$$
Then the estimators $\vartheta_1(n,\varepsilon)$ are strongly consistent as $\varepsilon\to0$ or $n\to\infty$, and we obtain assertion II of Proposition 3.1. Hence Proposition 3.1 is valid.

Proof of Proposition 3.2. Similarly to the proof of Proposition 3.1 and [7]–[16], we can get the following asymptotic relations, as $t\to\infty$, for the processes $\tilde\Delta Y(t)$, $Z(t)$ and $\tilde Z(t)$:

– for $\vartheta\in\Theta_{21}$
$$\tilde\Delta Y(t) = C_Ye^{v_0t} + C_{Y1}e^{v_1t} + o(e^{\gamma t}) \quad P_\vartheta\text{-a.s.},$$
$$Z(t) = C_Ze^{v_1t} + o(e^{\gamma t}) \quad P_\vartheta\text{-a.s.},$$
$$\lambda_t - \lambda = \frac{2v_0e^{v_0}}{v_0+v_1}\,C_ZC_Y^{-1}\,e^{-(v_0-v_1)t} + o(e^{-(v_0-v_1+\gamma)t}) \quad P_\vartheta\text{-a.s.},$$
$$\tilde Z(t) = \tilde C_Ze^{v_1t} + o(e^{\gamma t}) \quad P_\vartheta\text{-a.s.};$$

– for $\vartheta\in\Theta_{22}$
$$|\tilde\Delta Y(t) - C_Ye^{v_0t} - C_{Y1}(t)e^{v_1t}| = o(e^{\gamma t}) \quad P_\vartheta\text{-a.s.},$$
$$|Z(t) - C_Z(t)e^{v_1t}| = o(e^{\gamma t}) \quad P_\vartheta\text{-a.s.},$$
$$\lambda_t - \lambda = 2v_0e^{v_0}C_Y^{-1}\,U_Z(t)\,e^{-(v_0-v_1)t} + o(e^{-(v_0-v_1+\gamma)t}) \quad P_\vartheta\text{-a.s.},$$
$$|\tilde Z(t) - \tilde C_Z(t)e^{v_1t}| = o(e^{\gamma t}) \quad P_\vartheta\text{-a.s.},$$
where $C_Y$ and $C_{Y1}$ are some non-zero constants, $0<\gamma<v_1$,
$$C_Z = C_{Y1}(1-e^{v_0-v_1}), \qquad \tilde C_Z = C_Z\,\frac{v_1-v_0}{v_1+v_0};$$
$C_Z(t)$, $U_Z(t) = \int_0^\infty C_Z(t-u)e^{-(v_0+v_1)u}\,du$ and $\tilde C_Z(t) = C_Z(t) - 2v_0U_Z(t)$ are periodic functions (with the period $\Delta<1$).
Denote
$$U_{\tilde Z}(T) = \int_0^\infty \tilde C_Z(T-u)\,e^{-(v_0+v_1)u}\,du, \qquad U_{\tilde ZZ}(S,T) = \int_0^\infty \tilde C_Z(T-u)\,C_Z(S-u)\,e^{-2v_1u}\,du,$$
$$\tilde U_Z(T) = U_{\tilde Z\tilde Z}(T,T).$$
It should be noted that the functions $C_Z(t)$, $U_Z(t)$, $\tilde C_Z(t)$ and $U_{\tilde Z}(T)$ have at most two roots on each interval of length $\Delta$ from $[0,\infty)$; at the same time the function $U_{\tilde ZZ}(S,T)$ has at most four roots. With $P_\vartheta$-probability one we have:

– for $\vartheta\in\Theta_2$
$$\lim_{T-S\to\infty} e^{-2v_0T}\int_S^T (\tilde\Delta Y(t-3))^2\,dt = \frac{C_Y^2}{2v_0}\,e^{-6v_0}, \tag{31}$$

– for $\vartheta\in\Theta_{21}$
$$\lim_{T-S\to\infty} e^{-2v_1T}\int_S^T \tilde Z^2(t-3)\,dt = \frac{\tilde C_Z^2}{2v_1}\,e^{-6v_1}, \tag{32}$$
$$\lim_{T-S\to\infty} \tilde G_2^{-1}(S,T) = \tilde G_{21}, \tag{33}$$
where
$$\tilde G_{21} = \begin{pmatrix} \dfrac{2v_1(v_1+v_0)^2}{C_Z\tilde C_Z(v_1-v_0)^2}\,e^{3v_1} & -\dfrac{4v_0v_1(v_1+v_0)}{C_ZC_Y(v_1-v_0)^2}\,e^{3v_0}\\[3mm] -\dfrac{2v_1(v_1+v_0)^2}{C_Z\tilde C_Z(v_1-v_0)^2}\,e^{v_0+3v_1} & \dfrac{4v_0v_1(v_1+v_0)}{C_ZC_Y(v_1-v_0)^2}\,e^{4v_0} \end{pmatrix},$$

– for $\vartheta\in\Theta_{22}$
$$\lim_{T-S\to\infty}\Big[e^{-2v_1T}\int_S^T \tilde Z^2(t-3)\,dt - e^{-6v_1}\,\tilde U_Z(T-3)\Big] = 0, \tag{34}$$
$$\lim_{T-S\to\infty}\big[\tilde G_2^{-1}(S,T) - \tilde G_{22}(T)\big] = 0, \tag{35}$$
where
$$\tilde G_{22}(T) = \Big[\frac{1}{2v_0}\,U_{\tilde ZZ}(T,T-3) - U_Z(T-3)\,U_{\tilde Z}(T)\Big]^{-1}\cdot\begin{pmatrix} \dfrac{e^{3v_1}}{2v_0} & -\dfrac{e^{3v_0}}{C_Y}\,U_{\tilde Z}(T)\\[3mm] -\dfrac{e^{v_0+3v_1}}{2v_0} & \dfrac{e^{4v_0}}{C_Y}\,U_{\tilde Z}(T) \end{pmatrix}.$$

The matrix $\tilde G_{21}$ is constant and non-zero, and $\tilde G_{22}(T)$ is a periodic matrix function with the period $\Delta<1$ (see [3], [10, 12, 14]), which may have infinite norm at four points on each interval of periodicity only.

The next step of the proof is the investigation of the asymptotic behaviour of the stopping times $\nu_2(n,\varepsilon)$, $\tau_2(n,\varepsilon)$ and the estimators $\alpha_2(n,\varepsilon)$.
  • 56. temps, pendant que lui se tranquillise au cabaret. Le maître fioûle sa bouteille, la jument lit la gazette. GAZOUILLON, s. m. Terme des campagnards. Margouillis. Se dit surtout du margouillis qui provient d'un mélange de neige fraîchement tombée et de pluie. Gazouillon et Margouillis sont des onomatopées. GÉANE, s. f. Géante. La merveilleuse géane étonna toute l'assemblée. Français populaire. GEL, s. m. Gelée. Le mot gel manque dans plusieurs dictionnaires et en particulier dans celui de l'Académie française. Le Complément de ce même dictionnaire, et le Dictionnaire national de Bescherelle [1846], disent que gel, dans le sens de «Gelée,» a vieilli. Nous pouvons affirmer que le mot gel, signifiant: «Gelée,» est d'un emploi habituel chez nous et chez nos proches voisins. GELÉE AUX GROSEILLES, s. f. Dites: «Gelée DE groseilles.» Dites aussi: Gelée DE pomme, gelée DE framboise, etc. GELER DE FROID. Geler. Faites-moi vite un grand feu, je gèle de froid. Français populaire. GELER (SE), v. pron. Geler. Je me gèle ici à vous attendre. Faute très-répandue. «Se geler» n'est français qu'en parlant des choses. «Le mercure peut se geler. Le nez de Mme Z*** se gela au passage du grand Saint-Bernard.» GEMOTTER, v. n. Signifie: 1o S'impatienter, pester; 2o Languir, être languissant. La pauvre drôlesse, abandonnée de tout le monde, était là à gemotter dans son lit. Ranimez donc ce feu qui ne fait que gemotter. Dans le patois vaudois, gemotta veut dire: Gémir, et dans le patois neuchâtelois, gemiller, s'impatienter. R. Gemo. GENDRE, s. m. Se faire gendre, signifie, dans son sens le plus large: Se procurer, par un riche mariage, une position douce,
  • 57. confortable, oisive, à laquelle on ne serait jamais arrivé d'une autre manière. Dans un sens plus restreint, se faire gendre se dit facétieusement et dérisoirement d'un jeune homme du haut, qui, ayant une fortune exiguë, des habitudes un peu dispendieuses et un extérieur agréable, choisit pour femme une riche héritière dans la classe bourgeoise. Cette expression originale, se faire gendre, a été créée ou mise en circulation par un charmant article du journal de Mr Petit-Senn. [Voyez le Fantasque de 1835, no 81, p. 322, et la Revue suisse de 1850, livraison du mois de mai, p. 328.] GENÈVRE, s. m. Des grains de genèvre. Ce terme nous vient du vieux français. Au commencement du dix-huitième siècle, on disait encore indifféremment genèvre et genièvre. «Genièvre» a prévalu. GENILLÉ, s. m. Nous appelons goût de genillé, un mauvais goût que contractent les volailles qui ont été nourries dans un poulailler petit et malpropre. Geniller veut dire «Poulailler» dans le dialecte du Berry. Djeneuille, dans le patois vaudois, signifie: Poule. Par métathèse, c'est-à-dire par transposition de lettres, ces mots viennent du mot latin gallina, poule. GENOU, s. m. Nous disons d'un couteau qui coupe mal: Il coupe comme les genoux d'une vieille femme, comme les genoux de ma grand'mère. Expression triviale, consignée dans le Dictionnaire du Bas langage, t. II, p. 10. GERLE, s. f. Corbeille ronde et peu profonde, destinée à recevoir le légume qu'on porte au marché. En Dauphiné, gerle signifie: Jarre, grand vase de terre. En languedocien, une gerle est un baquet, un grand seau. Voyez JARLOT. GÉROFLÉE, s. f. Géroflée blanche. Bouquet de géroflées. Terme français populaire. On doit dire: «Giroflée.» GÉROLE, s. f. Chervis, racine potagère. Dans quelques provinces de France, on dit: Gyrole.
  • 58. GESSION, s. f. On vient d'ôter à ce jeune dissipateur la gession de sa fortune. Terme parisien populaire, etc. On doit écrire «Gestion» et prononcer gess-tion. GICLÉE ou JICLÉE, s. f. Signifie: 1o Jaillissement, liquide qui jaillit; 2o Éclaboussure, flaquée. En deux ou trois giclées, on se rendit maître du feu. Une giclée de mortier suffira contre ce mur. Dans le Jura, gicle, s. f., se dit d'une petite seringue de sureau, avec laquelle les polissons s'évertuent à arroser les passants. [Voyez Monnier, Vocabulaire du Jura.] GICLER ou JICLER, v. n. et a. Signifie: 1o Jaillir, saillir, sortir impétueusement; 2o Faire jaillir, jeter de l'eau. Faire gicler de l'eau; faire gicler de la boue. Finis-donc, André, tu me gicles. Terme suisse-roman, savoisien, franc-comtois et lyonnais. En provençal et en languedocien: Jhiscla. Onomatopée remarquable. Dans le patois bourguignon, chicclai signifie: «Faire jaillir,» et chiccle se dit d'une «Canonnière» ou seringue de bois dont s'amusent les enfants pour jeter de l'eau. [Voyez les Noëls bourguignons de La Monnoye.] GIFFLARD, DE, s. Joufflu, mouflard, qui a le visage bouffi et rebondi. Un gros gifflard. On disait en vieux français: Giffard, giffarde, terme formé de giffe ou giffle, joue. GIFLÉE, s. f. Giffle, mornifle, taloche. GIGASSE, s. f. Se dit d'une personne démesurément grande et un peu dégingandée. GIGIER, s. m. Gésier, second ventricule de certains oiseaux. Ne jetez pas ces gigiers, ils serviront pour le bouillon. Terme généralement usité en Suisse et en France, mais que les dictionnaires n'ont pas recueilli. Nous disons aussi: Gisier. GIGNER, v. a. Guigner, regarder du coin de l'œil.
  • 59. GIGOT DE MOUTON, s. m. Dites simplement: «Gigot,» puisque «gigot» signifie: Cuisse de mouton séparée du corps de l'animal pour être mangée. [Acad.] GIGUE, s. f. Se dit d'une personne dont la taille est grande et toute d'une venue. Vois-tu là-bas cette grande gigue, cette perche? En Normandie, une gigue est une jeune fille qui a de grandes jambes. En français, «Gigue» veut dire: Jambe; et «Giguer,» aller vite, courir, sauter, danser. GILLOTIN, s. m. (ll mouillés.) Pantin, jeune garçon qui est toujours en mouvement, et qui cherche à divertir par ses perpétuelles pasquinades. Faire le gillotin. GILLOTINER, v. n. Faire le gillotin. GINGEOLET, ETTE, adj. Ginguet, court, étriqué. Habit gingeolet. GINGUER ou JINGUER, v. n. Jouer, rire, sauter, folâtrer. Elle est toujours à ginguer. Terme limousin, normand et vieux français. En Picardie on dit: Jingler. GIRADE, s. f. Girarde ou julienne, fleur. GIRANIUM, s. m. Écrivez «Géranium» et prononcez géraniome. Prononcez aussi albome, peinsome et laudanome les mots Album, Pensum et Laudanum. GIRAUD, nom propre. Nous disons proverbialement et facétieusement à une personne qui nous fait une demande inadmissible, à une personne qui porte très-haut ses prétentions et dont l'attente sera trompée: As-tu connu Giraud?... Eh bien, torche Miraud; ou plus laconiquement: As-tu connu Giraud? c'est-à-dire: Bernicle; à d'autres; adresse ta demande à un autre. Tu voudrais que je te prêtasse encore cinquante francs? As-tu connu Giraud? Quoi! ton vilain cousin se flatte d'épouser cette jeune et jolie Anna!... As-tu connu Giraud? GISIER, s. m. Voyez GIGIER.
  • 60. GISPINER, v. a. Expression adoucissante, pour signifier: Filouter, attraper, enlever habilement et sans scrupule, comme le font quelquefois des amis entre eux. Ce joli volume était à sa potte: il me l'a tout bonnement gispiné. En Lorraine on dit: Gaspiner ou gabsiner, et à Valenciennes, gobsiner. GIVRÉ, ÉE, part. et adj. Couvert de givre. C'est givré; c'est tout givré. Il a beaucoup givré cette nuit. Terme des campagnards. GLACE, s. f. Ne dites pas: «Manger une glace.» Dites: «Prendre une glace, prendre des glaces.» GLACE, s. f. Être froid comme la glace; être uni comme la glace. Retranchez l'article et dites: Être froid comme glace; être uni comme glace. GLACER UN PLAFOND. Terme de plâtrier. L'expression française est: Enduire un plafond. GLAFFER ou GLLAFFER, v. a. (ll mouillés.) Terme des campagnards. Manger gloutonnement quelque chose qui croque sous la dent. On le dit des pourceaux et de ceux qui, de près ou de loin, leur ressemblent. Ce mot gllafer, quand on le prononce comme il faut, imite parfaitement la chose qu'il doit peindre. GLAÎNE ou GLÈNE, s. f. Faire glaîne, terme d'écolier, signifie: Faire rafle, prendre à l'improviste les jouets, et surtout les mâpis des joueurs. Ce polisson, ce voleur s'approcha doucement du carré et nous fit glaîne. Voyez GLENNE, no 1. GLAPPE, s. f. Signifie: 1o Terre glaise; 2o Pisé. [P. G.] GLAIRE, s. m. Le glaire d'un œuf. «Glaire» est féminin. GLÉNER ou GLAÎNER, v. a. et n. Glaner, ramasser les épis après la moisson. Terme français populaire et vieux français. GLÉNEUR, GLÉNEUSE, s. Glaneur, glaneuse.
  • 61. GLENNE, s. f. Glane, produit du glanage, glanure. Un bandit lui enleva toutes ses glennes. Terme français populaire et vieux français. GLENNE, s. f. Sorte de renoncule des champs. GLIN-GLIN, s. m. Terme enfantin. Le petit doigt. Il a bobo à son glin-glin. Cette expression, usitée aussi dans les cantons voisins, vient probablement des mots allemands klein, klein, qui signifient: Petit, petit. GLISSE, s. f. Terme de pâtissier. Cressin, sorte de petit pain long, qui est fort léger à l'estomac. GLISSE, s. f. Glissoire, chemin frayé sur la glace pour y glisser par divertissement. Faire une bonne glisse, faire une longue glisse. Gare, gare, sur la glisse! Terme suisse-roman et savoisien. On dit à Lyon: Une glissière; en Lorraine, un glissant; à Paris, une glissade. GLISSER, v. neutre. La rue du Perron glisse souvent en hiver. Dites: La rue du Perron est souvent glissante en hiver; ou dites: On glisse souvent en hiver dans la rue du Perron. GLISSER (SE), v. pron. Glisser, s'amuser à glisser. Les fossés sont gelés: allons nous y glisser tous ensemble. Il faut dire: «Allons y glisser tous ensemble.» GLOPET, s. m. Sieste, méridienne. Voyez CLOPET. GLU, s. masc. Du bon glu. Solécisme répandu aussi dans le reste de la Suisse romane, en Savoie, en Dauphiné, en Franche- Comté, en Lorraine et ailleurs. GNIABLE, s. m. Sobriquet qu'on donne aux cordonniers. GNIANIOU, s. m. Voyez NIANIOU.
  • 62. GNIFFE-GNIAFFE, s. m. Ce terme fort expressif signifie: 1o Nigaud, niais, benêt; 2o Flasque, lâche, mou et sans ressort. En Picardie on dit: Gniouffe. GOBE-LA LUNE, s. m. Gobe-mouche, niais, grand niais qui marche la tête levée comme s'il regardait la lune. Dans le patois du bas Limousin, gobo-luno se dit de celui qui s'occupe niaisement de bagatelles. [Voyez Béronie, Dictionnaire du patois du bas Limousin.] GOBERGER (SE), v. pron. Faire grande chère, bâfrer, faire bombance, se régaler. Nos quatre amis allèrent à une auberge de Coppet, où ils demandèrent des feras et des volailles, dont ils se gobergèrent. Voyez donc comme ces enfants se gobergent et s'empiffrent de raisins et de noix! En français, «Se goberger» signifie: Prendre ses aises, se dorloter, se divertir. GODAILLE, s. f. Débauche de bouche, bâfre, grande ribote. Faire une godaille. Ce fut une godaille complète, une godaille de mâlevie. Le dictionnaire de l'Académie ne fait pas mention de ce terme; et, selon les dictionnaires de Boiste, de Landais et de Bescherelle, godaille signifie: 1o Ivrognerie; 2o Mauvais vin. Ce n'est point là le sens que nous lui donnons à Genève; ce n'est pas non plus le sens qu'il a dans le langage français populaire. [Voyez le Dictionnaire du Bas langage, t. II, p. 17, et le Dictionnaire rouchi-français, aux mots godaïer et godalier.] GODAILLER, v. n. Faire une grande ribote, une bâfre, une godaille. Dans les dictionnaires ce verbe a un autre sens. GODAILLEUR, s. m. Riboteur, bambocheur, bâfreur. Un tas de godailleurs. Ce mot et les deux précédents sont probablement originaires du nord de la France, où le mot godale signifie: «Bière, petite bière.» GODICHE, s. et adj. Plaisant, risible. Être godiche, être plaisamment bête. Tu es godiche, toi! Voilà qui est vraiment
  • 63. godiche. Terme parisien populaire recueilli par MM. Noël et Chapsal. Les autres dictionnaires donnent à ce mot le sens de: «Gauche, emprunté, maladroit.» GODICHON, s. m. Diminutif de godiche. GODRON, s. m. Goudron. GODRONNER, v. a. Goudronner. Les mots godron et godronner appartenaient encore à la langue des dictionnaires, il y a un siècle. GOFFETTE, adj. fém. Nous appelons mains goffettes, des mains grassettes, des mains potelées. GOGNE, s. f. Courage, cœur, hardiesse, capacité. Avoir la gogne, oser. Aurais-tu la gogne de sauter ce ruisseau? Non, tu n'en as pas la gogne; tu n'as point de gogne. GOGNE, s. f. Rebut, lie, crasse, crapule. Se dit des personnes et des choses. Quelle gogne de bâton tu as là! Dis donc, Jacques, et ce bal d'hier! Quel bal! Quelle gogne! Qu'as tu donc appris sur le compte de Robillard?—J'ai appris que c'est une gogne.—Et sa famille?—Sa famille? C'est tout de la gogne. Tomber dans la gogne, veut dire: Tomber dans la crapule. Terme vaudois. Chez nos voisins du Jura, gone se dit d'une femme méprisable. [Voyez C. Monnier, Vocabulaire du Jura.] GÔGNES, s. f. pl. Compliments, cérémonies. Faire des gôgnes. GOGNEUX, EUSE, adj. et s. Crasseux, dégoûtant, repoussant, crapuleux. Se dit des personnes et des choses. Un chapeau gogneux. Une tournure gogneuse; un air gogneux. Tu te promenais hier avec deux individus bien gogneux. Dans le bas limousin, gognou, et en vieux français, gognon, signifient: Pourceau, cochon, et se disent de toute personne sale et malpropre. Gognounà, faire grossièrement et salement un ouvrage. GOGUINETTE, s. f. Propos gaillard, parole un peu libre. Dire la goguinette. Dire une goguinette; dire des goguinettes. En
  • 64. Lorraine, goguenettes signifie: Propos joyeux. En vieux français, goguer, v. n., veut dire: «Plaisanter.» GOISE ou GOËZE, s. f. Serpe, grosse serpe. En Franche-Comté on dit: Goisse et gouisse. GOISET, GOAZET, ou GOINZET, s. m. Serpette. Se dit aussi d'un couteau et principalement d'un mauvais couteau. GOLÉE, s. f. Gorgée. Avales-en une seule golée. J'ai bu deux petites golées de ton sirop, et j'en ai eu assez. En Picardie on dit: Goulée. «Goulée» est un mot français; mais il signifie: «Grande bouchée.» GOLÉRON ou GOLAIRON, s. m. Ouverture, trou. Le goléron d'une nasse. Dans l'ancienne langue provençale, golairos signifiait: «Gosier.» GOLET, s. m., et GOLETTE, s. f. Goulot, trou, orifice. Le golet d'une bouteille. Terme jurassien et savoisien. Dans notre patois ces mots ont une signification plus étendue. GONFLE, s. f. Signifie: 1o Vessie des quadrupèdes; 2o Petite ampoule sur la peau, cloche, élevure; 3o Bulle de savon. Sa brûlure lui a fait lever des gonfles. Percer une gonfle. Se soutenir sur l'eau avec des gonfles. Terme suisse-roman et savoisien. GONFLE, adj. Gonflé. Il a tant marché aujourd'hui, qu'il en a les pieds gonfles. Terme français populaire. A Lyon on écrit et on prononce confle. GONGON, s. des 2 genres. Grognon, celui ou celle qui bougonne, qui grogne. Cette gongon finira-t-elle une fois de nous ennuyer? Le mari et la femme sont aussi gongons l'un que l'autre. GONGONNER, v. a. Bougonner, marmonner, se fâcher, gronder. Notre vieux raufin ne s'arrête pas de gongonner. Il gongonne
  • 65. ses enfants, il gongonne sa servante, il gongonne tout le monde. Terme suisse-roman, savoisien et lyonnais. GONVÉ, s. m. Une odeur de gonvé, est une odeur de renfermé, une odeur de linge sale et gras. Votre Baby Chailloux sentait terriblement le gonvé. GONVER, v. a. et n. Couver. L'incendie éclata le matin; mais le feu avait gonvé toute la nuit. Ne crois-tu pas, femme, que notre Françoise gonve une maladie?—Je crois qu'elle gonve la rougeole. Ta seille, Madelon, est égrillée: il faut la faire gonver (c'est-à-dire: Gonfler dans l'eau). Terme connu dans le canton de Vaud. En Franche-Comté on dit: Gouver. GONVIÈRE, s. f. Signifie: 1o Fondrière, creux plein de boue; 2o Tas de neige amoncelé par le vent. GOTRET, s. m. Terme de boucherie. Ris de veau. GOTTE, s. f. Mauvais ouvrage, mauvaise marchandise, chose de nulle valeur, et dont on ne fait aucun cas. GOUAILLER, v. n. Crier. Voyez COUAILLER. GOUGNAUD, AUDE, s. et adj. Se dit d'une personne ou d'une chose de rebut. Quel gougnaud de chapeau tu as là! Notre nouvelle voisine N** est une gougnaude; elle s'habille comme une gougnaude. GOUGNAUDER, v. a. Manier maladroitement, gâter en maniant, déformer, froisser, chiffonner. GOUGNAUDS ou GOUGNEAUX, s. m. pl. Vieux chiffons, mauvais linge, vieilles nippes, et, en général, objets vieux et sans valeur. GOUILLARD, ARDE, s. et adj. Voyez GOULIARD. GOUILLE, s. f. Petite mare, endroit où la boue séjourne, flaque. Marcher dans la gouille; tomber dans la gouille. Terme suisse-
  • 66. roman, savoisien, dauphinois et franc-comtois. Dans le bas Limousin on dit: Ga-oullio, et dans le Berry, gouillat. GOUJATER, v. a. et n. Travailler comme un goujat. Prendre des manières de goujat. Un ouvrage goujaté est un ouvrage bousillé, un ouvrage fait vite et sans soin. GOULIAFE, s. m. Glouton malpropre. A Paris on dit: Gouliafre; dans le vieux français et en Picardie, goulafre. GOULIARD, ARDE, s. et adj. Gourmet, friand. Oh! la gouliarde, qui trempe son doigt dans le sirop! Ces petits gouliards eurent fripé en un clin d'œil tous les bonbons. Terme vaudois, savoisien et vieux français. Dans le Limousin, en Normandie et sans doute ailleurs, goulard signifie: Goulu, gourmand. GOULIARDISE, s. f. Friandise. Comment, Élisa! du beurre et de la confiture sur ton pain? quelle gouliardise! Tu n'aimes que les gouliardises, Georgette, et tu vivrais de gouliardises. En vieux français on disait: Goulardise et gouillardise. R. gula. GOURLLE, s. f. (ll mouillés.) Cep de vigne arraché. Dans le canton de Vaud on dit: Gourgne. GOURMANDISE (UNE). Un plat de gourmandises. Si vous êtes sages, vous aurez chacun pour votre goûter une petite gourmandise. Cette expression, fort usitée en Suisse et en Savoie, n'est pas inconnue en France, quoique les dictionnaires ne l'aient pas relevée. «Je t'avais préparé les gourmandises que tu aimes,» dit feu Mr De Balzac, dans un de ses romans. L'expression française consacrée est: «Friandise.» Un plat de friandises. GOURMANDS (POIS). Pois goulus, pois dont la cosse est tendre et se mange. GOURME, s. m. Jeter son gourme. Ce mot est féminin. GOÛTER SOUPATOIRE, s. m. Goûter qui tient lieu de souper.
  • 67. GOUTTE AU NEZ, s. f. Expression méridionale, etc. Les dictionnaires disent: «Roupie.» GOUTTIÈRE, s. f. Voie d'eau, fente, trou, ouverture à un toit par où l'eau de la pluie pénètre et coule en dedans. L'orage souleva les tuiles et occasionna une gouttière. Le plafond, qui était tout neuf, fut entièrement taché par les gouttières. Terme suisse- roman, méridional, etc. On appelle en français Gouttière: 1o Le chéneau qui reçoit et recueille les eaux de la pluie; 2o Le tuyau de descente. GOYARDE, s. f. Serpe. Dans le Berry on dit: Goyard. GRABEAU, s. m. Mercuriale, censure. Bon grabeau, mauvais grabeau. Faire le grabeau des étudiants. Être soumis au grabeau; recevoir son grabeau. On lit dans notre Constitution de 1814: «Les membres du Conseil d'État qui ne sont point sujets au grabeau, n'y assisteront pas.» Terme vaudois et neuchâtelois. GRABELER, v. a. Faire le grabeau. «La Compagnie des Pasteurs élira chacun de ses membres; elle se grabellera elle-même.» [Constitution de 1814.] «Tous les Conseillers d'État qui ne sont ni Syndics, ni Lieutenant, ni Syndics sortant de charge, ni Trésorier, ni membres du Tribunal civil et de la Cour suprême, seront grabelés un à un à la balotte.» [Ibid.] Le mot grabeler, en vieux français, signifiait: Examiner, éplucher, débattre, choisir. GRABOT, s. m. Voyez GRABEAU. GRABOTER, v. a. Se dit quelquefois pour grabeler. GRADUATION, s. f. Dans le langage académique on appelle Examen de graduation, un examen à la suite duquel l'étudiant reçoit le grade de bachelier, ou celui de licencié, ou celui de docteur. GRAIFION, s. m. Voyez GREFFION.
  • 68. GRAILET, s. m. Plat d'étain donné pour prix dans les tirs. GRAILETTE ou GREULETTE, s. f. Sorte de terrine, sorte de casserole à trois pieds, laquelle sert à réchauffer les ragoûts. GRAILLON, s. m. Ce mot est français; mais à Genève il se dit, entre autres: 1o D'un mets quelconque (viande, poisson, légume, lait, etc.) qui, réchauffé, a contracté une mauvaise odeur, un mauvais goût. Il se dit: 2o Des tabliers, torchons, mauvais linges, etc., dont la cuisinière s'est servie. En français: Goût de graillon, odeur de graillon, signifient: «Goût, odeur de viande ou de graisse brûlée.» [Acad.] GRAIN DE SEL, s. m. Quand les jeunes enfants voient voltiger près d'eux un oiseau, et qu'ils demandent comment il faut s'y prendre pour l'attraper, on leur répond que l'infaillible moyen est de leur mettre un grain de sel sur la queue. De là a pris naissance notre expression figurée: Mettre un grain de sel sur la queue de quelqu'un; c'est-à-dire: «Faire d'inutiles efforts pour le capter et pour l'attirer dans le filet.» GRAIN DE SUCRE, s. m. Morceau de sucre. Fais attention, Caroline, tu coupes les grains de sucre trop gros. GRAINGE, adj. Voyez GRINGE. GRAISSE, s. f. Réprimande, semonce sévère. Donner une graisse; recevoir une graisse. Tu as eu ta graisse. Terme français populaire. GRAISSE DE CHAR, s. f. Vieux oing, cambouis. GRAISSE-MOLLE, s. f. Saindoux, graisse de porc. En Dauphiné, en Provence et en Languedoc, on dit: Graisse blanche; à Bordeaux, graisse douce. GRAMON, s. m. Gramen, chien-dent, plante dont les racines sont d'un grand usage pour les tisanes apéritives. Boire sur le gramon. En Dauphiné, on dit: Grame.
  • 69. GRAND, adj. Les expressions suivantes: Ce n'est pas grand chose; j'ai eu grand peine; voici la grand route, etc., sont des expressions correctes, mais étranges, et qui nous viennent du vieux français. Au treizième siècle, grand ou grant était un adjectif des deux genres. GRAND, s. f. Terme des campagnards. Grand'mère. Dis-moi, Colette, comment se porte ta grand? Pauvre Monsieur, cette bonne grand, nous l'avons perdue il y a huit jours. GRANDE-MAISON (LA). Terme adoucissant, euphémisme pour dire: L'hôpital, la maison de charité. Jamais, non jamais, Monsieur le Directeur, je ne consentirai à entrer dans la Grande-maison. GRANDET, ETTE, adj. Grandelet. Notre Stéphanie est déjà grandette. Terme excellent, employé dans tout le Midi et sans doute ailleurs. GRAND-LOUIS ou GRAND-SIFFLET, s. m. Courlis ou courlieu cendré, oiseau aquatique. GRANGER, s. m. Métayer, fermier partiaire, fermier qui partage le produit des champs avec le propriétaire. Ce terme, si connu dans la Suisse romane, en Savoie et en Franche-Comté, n'a été recueilli ni par le dictionnaire de l'Académie, ni par M. Poitevin, le plus récent des lexicographes, ni par Gattel, ni par M. Bescherelle; mais Boiste et N. Landais l'ont mentionné. GRANGERIE, s. f. Grangeage. Mettre un domaine en grangerie, ou à grangerie, c'est: En confier l'exploitation à un granger. Voyez ce mot. Le mot grangerie, très-usité chez nous, n'a été enregistré que par un seul dictionnaire moderne, le Complément de l'Académie. GRATON, s. m. Aspérité sur le papier, sur le terrain, etc. Sa boule rencontra un graton. GRATTE-À-CUL, s. m. Gratte-cul, fruit de l'églantier.
  • 70. GRATTE-BOISSEUSE ou GRATTE-BOESSEUSE, s. f. Polisseuse de boîtes de montres. Boesse ou gratte-boesse se disent d'une sorte d'outil de ciseleur. GRATTE-LOTON, s. m. Sobriquet qu'on donne aux ouvriers horlogers. Voyez LOTON. GRATTER, v. a. Gratter la rogne à quelqu'un, signifie: Le flatter pour en obtenir une faveur, le cajoler, le flagorner dans des vues intéressées. Il s'aperçut enfin que sa nièce lui grattait la rogne, et qu'elle en voulait, par-dessus tout, à l'héritage. Expression triviale. Dans le français populaire, on dit en ce même sens: «Gratter l'oreille,» ou «gratter l'épaule à quelqu'un.» [Voyez le Dictionnaire du Bas langage, t. II.] GRATUISE, s. f. Râpe de fer-blanc, ustensile de cuisine. En Dauphiné et en Languedoc, on dit: Gratuse; dans le patois provençal, gratuè. En vieux français, gratuser signifie: Râper. GRAVANCHE, s. f. Sorte de férâ. Voyez ce mot. † GRAVATE, s. f. Cravate. Dis voir, femme, fadrait-il pas mettre une gravate à notre petit, qui a un commencement de rouche? Terme suisse-roman, savoisien, franc-comtois et méridional. GRAVE, s. f. Grève, endroit au bord d'une rivière couvert de gravier. Terme dauphinois et vieux français. GRAVELAGE, s. m. Action de graveler. GRAVELER, v. a. Couvrir de gravier. Graveler les allées d'un jardin; graveler une promenade. Terme indispensable, et qu'on cherche vainement dans les dictionnaires. En Languedoc on dit: Agraver. GRAVELLE, s. f. Maladie des moutons, clavelée. GREBATTER, v. a. Rouler. Se grebatter, se rouler. Expressions très-familières aux campagnards.
  • 71. GRÈBE (UNE). Sorte d'oiseau plongeur. Dites au masculin: Un grèbe. Grèbe cornu, grèbe huppé. GRÈBION, s. m. Grèbe esclavon, grèbe oreillard. GREBOLER, v. n. Grelotter, trembler de froid. Je le trouvai tout greulant, tout grebolant. En Savoie on dit: Grevoler; dans le patois dauphinois, gromolà. GREDON ou GREUDON, s. m. Guenilles, vieilleries, objets de rebut. GREGNOLU, UE, adj. Qui a beaucoup de nœuds. Bois gregnolu. Terme des campagnards. GREIFION, s. m. Gros bigarreau. Une livre de greifions. Terme suisse-roman, savoisien et jurassien. En provençal, en piémontais et en vieux français, on dit: Graffion; dans le Languedoc, agrefion. GREINGE, adj. Voyez GRINGE. GRELON, s. m. Écrivez et prononcez «Grêlon.» GREMILLETTE, s. f. (ll mouillés.) Lézard gris, lézard de murailles. [P. G.] Dans le patois de Rolle (canton de Vaud) on dit: Gremeillette. GREMOLLION ou GREMAILLON, s. m. Grumeau, portion durcie d'un liquide. La soupe s'était mise en gremollions. Notre pauvre Estelle vomissait des gremollions de sang. Terme connu aussi chez nos voisins du canton de Vaud. Dans le Berry et en Lorraine on dit: Gremillion. GRENÉ, ÉE, adj. Épi grené. Terme méridional et vieux français. Dites: «Épi grenu.» Le verbe «grener» est français. GRENETTE, s. f. Ce mot signifiait jadis: Marché aux grains; et c'est le nom que porte encore aujourd'hui notre halle au blé. Terme vaudois, savoisien, etc.
  • 72. GRENETTE, s. f. Semen contra, poudre contre les vers, barbotine. GRENIER À LESSIVE, s. m. Séchoir, sécherie, étendage. GRENOUILLE, s. f. (fig.) Sorte de petit instrument formé d'une tête de bouteille recouverte d'un morceau de parchemin traversé par du crin. En le faisant tourner comme une crécelle, il imite assez bien le cri des grenouilles, quand elles commencent à crier au printemps. [P. G.] GRÈSE, adj. fém. Voyez GRÈZE. GRÉSILLER, v. neutre. Croquer sous la dent, comme le pain lorsqu'il s'y est mêlé du sable ou du menu gravier. En Languedoc on dit: Gréziner. GREUBE, s. f. Tuf, terre sèche et dure qui sert à écurer, à nettoyer les ustensiles de cuisine, les tablettes de sapin, etc. Patte à greube. Terme suisse-roman et savoisien. Le vendeur de greube s'appelle, dans notre patois: Le greubi. GREUBIÈRE, s. f. Carrière d'où l'on tire la greube. GREUBONS, s. m. pl. Peau croustillante qui reste quand on vient de fondre du lard. Un plat de greubons. A Neuchâtel et dans quelques parties du canton de Vaud, on dit: Grabon; dans l'allemand-suisse, Grieben. GREUGER, v. a. Gruger, friper, dissiper en folles dépenses. Il avait hérité trois mille francs: c'est tout greugé. En vieux français, gréuge signifie: Perte, dommage. GREULER, v. actif. Secouer un arbre pour en faire tomber les fruits. Greuler un cerisier, greuler un pommier. En Savoie on dit: Creuler; en Franche-Comté, crôler; en vieux français, crosler et crouller. Figurément et familièrement, creuler s'emploie dans le sens de: Questionner quelqu'un, lui arracher des nouvelles, le forcer, de façon ou d'autre, à dire ce qu'il sait et qu'il se soucie peu ou point de raconter. Nous l'avons tant pressé, nous l'avons
  • 73. tant greulé, qu'il a fini par nous débiter tout le journal. Voyez le mot suivant. GREULER, v. neutre. Grelotter, trembler de froid ou de peur. Ce pauvre diable, blotti dans un fossé, greulait comme la feuille du tremble. Terme suisse-roman, qu'on retrouve tel quel dans le patois lorrain. Dans le Jura on dit: Grouller; en Bourgogne et en vieux français, gruler. Nous disons à l'actif: Greuler la fièvre, pour: Trembler la fièvre, avoir le tremblement qui résulte de la fièvre. Nos campagnards disent en ce même sens ou sens analogue: Greuler le marmot. GREULETTE ou GREULAISON, s. f. Frisson, tremblement que donne la fièvre ou la peur. Avoir la greulette; avoir la greulaison. Cette dernière expression est surtout familière aux campagnards. GREULETTE, s. f. Sorte de terrine appelée aussi: Grailette. GRÉVÉ, VÉE, adj. et part. Un fonds grévé; un domaine grévé d'hypothèques. On doit écrire et prononcer «Grever,» sans accent sur l'e. GREVURE, s. f. Blessure. Ce terme vieillit. GRÈZE ou GRÈSE, adj. f. Soie grèze. Soie qui est tirée de dessus le cocon. Terme lyonnais. Dites: «Soie grége.» GRIBICHE, s. f. Signifie: 1o Femme ou fille maligne, méchante, pie-grièche; 2o Et plus souvent, Fille ou femme de mœurs dissolues. En Normandie, gribiche se dit d'une vieille femme méchante dont on fait peur aux enfants. GRIE, s. f. Plâtre gris, gypse. Terme de nos campagnards et de ceux du canton de Vaud. Il existe à Bernex une ancienne carrière de grie, qui a fait donner le nom de grisse aux terrains environnants. GRIFFÉE, s. f. Griffade, coup de griffe.
  • 74. GRILLE, s. f. Cheville du pied. S'écorcher la grille. Terme suisse-roman, savoisien et franc-comtois. GRILLER, v. a. Rôtir. Griller du café; griller des châtaignes; griller des glands. Terme savoisien. En français «Griller» signifie: Rôtir sur le gril. «J'avais couché mes pincettes sur la braise pour faire griller mon pain.» [Xav. De Maistre, Voyage autour de ma chambre, ch. VIII.] GRILLET, s. m. (ll mouillés.) Sorte d'insecte. Le cri des grillets. Un trou de grillet. Terme suisse-roman, savoisien, lyonnais, franc-comtois et méridional. En Poitou et dans le Berry on dit: Grelet; en limousin et en vieux français, gril. Les dictionnaires et le bon usage veulent qu'on dise: «Grillon.» R. lat. gryllus. GRILLOIRE, s. f. Sorte de petite casserole à manche, surmontée d'un couvercle, et qui sert à rôtir le café. Dans le canton de Vaud on dit: Un grilloir. Le grilloir à café. GRILLOIRE, s. f. (fig.) Endroit où la chaleur est insupportable; endroit où l'on grille. Ce cabinet au midi est une grilloire pendant l'été. GRILLOTTER, v. actif. Griller, frire. GRIMPER, v. n. Dans notre langage énergique, faire grimper les murs à quelqu'un, signifie: L'impatienter outre mesure, le vexer, le dépiter à l'excès. Les dictionnaires disent en ce même sens: «Faire sauter quelqu'un au plancher, le faire sauter aux nues.» On dit en Languedoc: Faire monter quelqu'un au ciel sans échelle. GRIMPION, s. m. Grimpereau, oiseau bien connu. Au sens figuré, nous appelons grimpion, grimpionne, celui ou celle qui cherche par des politesses, par des avances répétées, par des flatteries, à s'introduire, à se glisser dans une société plus élevée, plus haut placée que la sienne. De là ont pris naissance les phrases suivantes familières: C'est un grimpion; il fait le grimpion; elle
  • 75. fait la grimpionne. Ces jeunes époux veulent grimper. Les grimpions doivent éprouver quelquefois de fameux déboires. GRIMPIONNER, v. n. Faire le grimpion, faire la grimpionne. Tu ne t'aperçois pas que cette jeune femme veut absolument grimpionner. GRINGALET, ETTE, adj. Faible, chétif. Cheval gringalet; veau gringalet. Ton beau-frère est bien gringalet, etc. Le Complément du dictionnaire de l'Académie, et le dictionnaire de M. Bescherelle ne présentent ce mot que comme substantif, et ne l'emploient qu'en parlant de l'homme. Nous l'employons très-souvent comme adjectif, et nous lui donnons des sens fort étendus. GRINGE, adj. Triste, ennuyé, chagrin, de mauvaise humeur, maussade, malingre. Rosalie est toute gringe aujourd'hui, et je crains qu'elle ne soit malade. Qu'avez-vous donc, Monsieur le notaire? Vous paraissez sombre et préoccupé?—En effet, je suis gringe. J'attendais mes enfants par le bateau à vapeur, et voilà le bateau qui arrive sans eux. Terme suisse-roman. Dans le patois de l'évêché de Bâle, on dit: Graigne; en Franche-Comté, grigne, et en Bourgogne, greigne; dans le Berry, grignaut; dans le patois rouchi, engraigné. Tous ces mots, qui sont fort usités, n'ont point de correspondants exacts en français. Dans le dialecte normand, grigner signifie: «Être maussade.» En Picardie, grigneux et grignard veulent dire: Pleurnicheur. GRINGERIE, s. f. Mauvaise humeur, malingrerie. Après une pareille mésaventure, un peu de gringerie est bien permis. GRIOTTE, s. f. En français, ce mot désigne une espèce de cerise grosse et noirâtre, plus douce que les autres. En Suisse, au contraire, nous appelons griotte une cerise acide. GRIPPÉ, ÉE, adj. Atteint de la grippe. Toute la famille est grippée. Ce mot, si connu en Suisse et en France, n'est dans aucun dictionnaire usuel.
  • 76. GRISAILLE, s. f. Ribotte, excès de table, excès de boisson. [P. G.] GRISE, s. fém. Tour malin, malice, espièglerie. En faire des grises, en faire voir de grises, signifie: Jouer des tours, faire des malices, attraper, tourmenter. Voilà un bambin qui en fera voir de grises à son père et à sa mère. Vous m'en faites des grises, malins enfants que vous êtes. Locution dauphinoise, limousine, etc. GRISPER et GRISPOUILLER, v. a. Crisper, agacer, impatienter. Cela me grispouille, c'est-à-dire: Cela me tarabuste. GRISPILLE, s. f. Sorte de jeu ou d'amusement, appelé aussi tire-poils, et en français: La gribouillette. [P. G.] À la grispille, locution adverbiale, signifie: Au pillage. Tout était à la grispille dans cette maison. GRISPILLER, v. a. Voler, filouter, friponner. GRISSE ou GRITZE, s. m. Gruau d'avoine ou d'orge. Ce terme, usité dans toute la Suisse romane, est formé du mot allemand Grütze, qu'on prononce gritze, et qui a le même sens. GROGNASSER, v. n. Grogner, se plaindre en grognant. Terme parisien populaire. GROGNE, s. f. Mauvaise humeur, disposition à se plaindre. Avoir la grogne. GROGNER QUELQU'UN. Le gronder, le réprimander avec humeur. Il grogne tout son monde; il ne cesse de nous grogner. «Grogner,» verbe neutre, est français. «Cette femme ne fait que grogner.» GROGNONNE, adj. et s. féminin. Sa maladie l'a rendue un peu grognonne. Dites: «Grogneuse.» GROLLE, s. f. Vieux soulier fort usé, savate. Mettre des grolles. Porter des grolles. Comment donc, Madame Bonnard? vous
  • 77. nous donnez là du pain qui est sec comme de la grolle. Terme vieux français et français populaire. GRONDÉE, s. f. Gronderie, réprimande. Faire une grondée. Recevoir une grondée. GROS, s. m. Le gros de l'hiver; le gros de l'été. Dites: «Le fort de l'hiver; le fort de l'été.» GROS (LES). Les notables, les riches, les principaux de l'endroit. Nos gros se montrèrent, en toute occasion, humains et charitables. GROS, s. m. Terme de calligraphie. Écrire en gros, c'est Écrire en gros caractères. Il faut dire: «Écrire la grosse.» GROS, adj. De gros en gros, locution adverbiale. Il consentit à nous raconter de gros en gros cette singulière aventure. Il faut dire: «En gros.» Raconter en gros. GROS-BLÉ, s. m. Nonnette, variété de froment. Le gros-blé s'appelle aussi en français: «Blé barbu» et «Blé poulard.» GROS-FORT, s. m. Grande absinthe, plante. † GROS MAL, s. m. Haut mal, épilepsie, mal caduc. Tomber du gros mal. Terme vaudois et savoisien. Dans le Limousin on dit: Le grand mal. GROS NEIRET ou GROS NOIRET, s. m. Canard garrot. GROSSET, ETTE, adj. Un peu gros. Un poulet grosset; une perdrix grossette. GROUP, s. m. Angine du larynx. Écrivez et prononcez «Croup.» GRUER, v. a. Faire gruer de l'avoine. Dites: «Monder.» Monder de l'avoine. GRUGEUR, s. m. Celui qui gruge.
  • 78. GRUMEAU, s. m. Terme de boucherie. La pièce du devant de la poitrine de l'animal entre les deux jambes. Grumeau de bœuf; grumeau de mouton. Terme méridional. GRUMEAU, s. m. Cerneau, noix cassée. Terme de la Suisse romane. GRUS, s. m. pl. Gruau, orge mondé, avoine mondée. De la soupe aux grus. Terme suisse-roman, savoisien et franc-comtois. En Champagne, gru signifie: Son de farine; et en vieux français, greu, farine d'avoine et de froment. GRUS (DES). Terme de fromagerie. Du caillé, du séret mêlé de crême. «La Fanchon nous servit des grus et de la céracée.» [J.-J. Rousseau, Nouvelle Héloïse.] GUENAPIN, s. m. Polisson, bandit, chenapan. GUENICHE, s. f. Femme débraillée, sale et d'un aspect repoussant. Terme lorrain. En vieux français, «guenuche» ou guenoche veulent dire: Sorcière, enchanteresse. Dans l'évêché de Bâle, genache a le même sens. GUENILLERIE, s. f. Guenille, rebut, objet de rebut. Se dit des personnes et des choses. GUERRER, v. n. Terme enfantin, en usage surtout chez les campagnards. Cette petite folle d'Ernestine veut toujours guerrer avec nous, c'est-à-dire: Veut toujours être en guerre avec nous, guerroyer, batailler. † GUETTE, s. f. Guêtre. De vieilles guettes. Français populaire. GUETTON, s. m. Petite guêtre, guêtron. Une paire de guettons. Terme savoisien, rouchi, etc. GUEULÉE, s. f. Cri éclatant, clameur perçante. Pousser des gueulées. Faire des gueulées. Ce n'étaient pas des chants,
  • 79. c'étaient des gueulées d'enfer. Ce mot est français, mais dans une acception différente. GUEULER, v. a. Gueuler quelqu'un, l'appeler à voix forte. Tu ne m'entends donc pas, Colombier: il y a une demi-heure que je te gueule. «Gueuler,» v. n., est français. GUEUSER, v. n. Faire une action de gueux, se conduire mal, faire une gueuserie. Priver cette petite fille de son bal, c'est gueuser, c'est coquiner, c'est être par trop sévère et méchant. En français, «Gueuser» signifie: Mendier. GUEUSERIE, s. f. Tour malin, méchanceté, action coupable. Sevrer un enfant de quatre mois, c'est une gueuserie. GUICHE, s. f. Jambe. Tirer la guiche, traîner la guiche. Après douze heures de marche, le sac au dos, on commence joliment à tirer la guiche. GUIDE, s. f. Terme des campagnards. Digue. Élever des guides contre le torrent. Guide vient-il de «digue» par une transposition de lettres? Guide est-il au contraire le terme primitif et véritable? Une digue n'est autre chose, en effet, qu'une barrière établie pour guider les eaux. Voyez dans Gattel l'étymologie banale. GUIGNACHE, s. f. Guignon, guignon achevé. GUIGNAUCHE, s. f. Guenuche, femme de mauvaise façon, femme mal mise, fagotée, vêtue salement. Dans le canton de Vaud, guignauche ou guegnauche signifie: Sorcière. GUIGNE-EN-L'AIR. Badaud, imbécile. GUILLAME, s. m. Grand guillame, grand flandrin. † GUILLE, s. f. Quille. Jouer aux guilles. Terme suisse-roman, franc-comtois et lorrain.
  • 80. GUILLE, s. f. Terme des campagnards. Fine pointe, sommet, sommité. La guille d'un clocher, la guille d'un arbre, la guille d'une tour. Guillon, dans le canton de Vaud, a le même sens. A notre fête du Tir fédéral [1851], un Vaudois disait: J'ai vu planter le drapeau de la Confédération sur le fin guillon de la Tour de l'Isle. En Franche-Comté, la pointe du jour s'appelle: L'aube guillerole. [Vocabulaire jurassien de M. Monnier.] De cette racine guille, viennent indubitablement les mots genevois déguiller, aguiller, guille (à jouer), etc. GUILLE, adj. À moitié ivre, gris. R. Guille, pointe. On dit en français: Avoir une pointe de vin. GUILLEMETTE (EN), loc. adv. Être en guillemette, signifie: Être en pile, être l'un sur l'autre. Ces livres sont trop en guillemette, ils vont tomber. GUILLERETTE, s. f. Être à la guillerette ou être en guillerette, se disent d'un objet mis dans une position d'où il risque de tomber. Guillet, dans notre patois, et guilleret, dans le patois vaudois, signifient: Sommet d'un arbre, d'un rocher, d'un bâtiment. Voyez GUILLE, no 2. GUILLERI, s. m. Courir le guilleri. Terme dauphinois, etc. Les dictionnaires disent: «Courir le guilledou.» GUILLETTE, s. f. (Prononcez ghillette.) Signifie: 1o Boulette de pâte dont on engraisse les dindes; 2o Fusée de poudre. Voyez GUILLE, no 2. GUILLON, s. m. (Prononcez ghillon.) Fausset de tonneau, petite broche de bois servant à boucher le trou qu'on fait à un tonneau pour donner de l'air ou pour goûter le vin. Mettre un guillon. Ôter le guillon. Terme vaudois, savoisien et jurassien. A Lyon, on dit: Une guille; dans les environs de Dôle, une guillotte. Voyez GUILLE, no 2. GUILLONNER, v. a. Mettre le guillon, mettre le fausset.
  • 81. GUINCHE, adj. Louche, qui a la vue de travers. En provençal on dit: Guèchou. Dans le Berry, faire la guinche, signifie: Baisser la tête après une mauvaise action. GUINCHER, v. a. et n. Signifie: 1o Lorgner du coin de l'œil, guigner; 2o Loucher, regarder de travers. Terme provençal. GUINGOINE (DE), adv. De guingois, de travers, de biais, en biaisant. Il marche tout de guingoine. Son habit allait tout de guingoine. Nous disons aussi: De guingouarne et de guingouaine. En Picardie, on dit: De guingoin. GUIZE, s. f. (Prononcez ghize.) Terme de forge. Gueuse, fonte de fer, fer coulé. «Un tuyau de gueuse.» GY ou GI, s. m. Un tonneau de gy. Terme suisse, savoisien, franc-comtois, méridional et vieux français. On doit dire: «Gypse, ou plâtre.» GYPER, v. a. Plâtrer, enduire de plâtre. GYPERIE, s. f. Plâtrage, ouvrages en plâtre. La gyperie de cette seule chambre avait coûté six cents francs. GYPIER, s. m. Plâtrier. GYSSAGE, s. m. Plâtrage. GYSSER, v. a. Appliquer du plâtre, enduire de plâtre, plâtrer. Gysser un plafond, gysser une paroi. GYSSEUR, s. m. Ouvrier qui emploie le gypse, plafonneur. Dans le Valais on dit: Gypseur.
  • 82. H
  • 83. HABILLÉ, ÉE. Participe. Nous disons d'une personne stupide, d'une personne dépourvue de tout bon sens: C'est une bête habillée. HABILLÉ EN. Habillé en noir, habillé en blanc. Dites: Habillé DE noir, habillé DE blanc. HABITUÉ, ÉE, adj. Place habituée; jeu habitué; lecture habituée; promenade habituée. Dites: Place habituelle, jeu habituel, promenade habituelle, lecture habituelle. [Boiste.] HABITUER, v. a. J'ai habitué cet appartement, et j'y reste. Les bonnes d'enfants ont habitué la promenade de la Treille. J'aime mon cercle, je n'y rencontre que des personnes que j'ai habituées. Toutes ces phrases sont autant de barbarismes. † HABRE-SAC, s. m. Havre-sac. R. all. Haber, avoine. † HACHIS, s. m. L'h de ce mot doit s'aspirer; mais dans le langage populaire on prononce l'hâchis. On prononce aussi l'hareng, les-z-haricots, les-z-harnais, les-z-hasards, l'hai-ye (la haie), l'hibou, l'hangar, j'haïs (je hais), c'est-t-hideux, c'est-t-honteux, etc., etc. HACHON, s. m. Hache, petite hache. L'hachon lui échappa des mains. Hachon appartient au vieux français, et au patois du canton de Vaud. On dit à Bordeaux: Hachot. HAMEÇON, s. m. L'h de ce mot n'est point aspiré. On dit: Prendre l'hameçon, mordre à l'hameçon. C'est donc par inadvertance, sans doute, que MM. Ch. Nodier et Ackermann, dans leur Vocabulaire français [1836], disent qu'il faut prononcer le hameçon, en aspirant l'h. HANCHOIS, s. m. (h aspiré.) Une salade de hanchois. Écrivez sans h, «anchois,» et n'aspirez pas l'a. HARENG, s. m. (fig.) Banc de sable, banc de gravier, îlot. Les harengs de l'Arve. Tirer du sable de l'hareng. (sic.) L'Arve a tellement grossi pendant ces trois jours, qu'elle a emporté
  • 84. l'hareng. «Nous voyons souvent dans le lit d'une rivière, une grande pierre retarder la vitesse des eaux, et occasionner un amas de sable et de gravier: de là naissent des HARENGS qui, etc.» [De Saussure, Voyage dans les Alpes, t. Ier, p. 245.] † HASARD, s. m. Terme d'encan. Miser un n-hasard. HASARD DU POT (LE). Viens manger ma soupe quand tu voudras; c'est au hasard du pot. On dit en France: La fortune du pot. HAUT (LE). Les gens du haut, les dames du haut, les bals du haut, etc. Se frotter contre les gens du haut; imiter, singer les gens du haut. Ces expressions, d'un usage universel à Genève, ont besoin d'être expliquées aux étrangers. Notre ville, étant bâtie sur un coteau, se trouve naturellement divisée en haute et basse ville. Or, comme les familles aisées demeurent, pour la plupart, dans les quartiers du haut, on appelle gens du haut, les riches de ces quartiers, en tant du moins que leurs familles sont anciennes. Avec cette courte explication on comprendra sans peine ce passage des Confessions de J.-J. Rousseau [liv. Ier]: «Il était, lui (Bernard, le cousin de Jean-Jacques), il était, lui, un garçon du haut; moi, chétif apprenti, je n'étais plus qu'un enfant de Saint-Gervais.» HAUT, HAUTE, adj. Nous disons proverbialement d'un homme orgueilleux et fier: Il est haut comme le temps, c'est-à-dire: Il est excessivement fier et hautain. Expression languedocienne, etc. HAUT-BANC, s. m. Sorte d'échoppe. HAUT-DE-CORPS (UN). Son cheval ne cessait de faire des hauts-de-corps. Dites: Des hauts-le-corps. HAUT GOÛT, s. m. Nous disons d'une sauce salée, poivrée, épicée: Cette sauce a un haut goût. L'Académie dit: «Cette sauce EST DE haut goût.»