CHAPTERIII
RANDOM PROCESS
3.1 INTRODUCTION
Random signals are encountered in every practical communication system. Random signals, unlike deterministic signals, are unpredictable. For example, the received signal in the receiver consists of the message signal from the source, plus noise and interference introduced in the channel. All these signals are random in nature, i.e., these signals are uncertain or unpredictable. Such random signals cannot be analyzed using the Fourier transform, as it can deal only with deterministic signals. The branch of mathematics which deals with the statistical characterization of random signals is probability theory.
3.1.1 Basic Terms in Probability Theory
• Experiment: Any activity with an observable result is called an experiment. Examples are rolling a die, tossing a coin or choosing a card.
• Outcome: The result of an experiment is called an outcome.
• Sample space: The set of all possible outcomes of an experiment is known as the sample space of the experiment and is denoted by S. For example, in the coin-tossing experiment, the sample space S = {Head, Tail}.
• Event: Any subset of the sample space is known as an event. For example, in the coin-tossing experiment, E = {Head} is the event that a head appears (or) E = {Tail} is the event that a tail appears.
• Independent events: Two events are said to be independent when the occurrence of one event is not affected by the occurrence of the other event. If A and B are independent events, then
P(A ∩ B) = P(A) × P(B)
• Mutually exclusive events: Two events are said to be mutually exclusive or disjoint events when they cannot occur simultaneously (none of the outcomes are common). If A and B are mutually exclusive events, then
P(A ∩ B) = 0
EC8491: Communication Theory Department of ECE
• Probability: It is the measure of the possibility or chance that an event will occur. It is defined as:
Probability = Number of desirable outcomes (E) / Total number of possible outcomes (S)
For example, in the coin-tossing experiment the probability of getting "Head" is 1/2.
• Axioms of Probability:
(a) The probability of the sample space is 1, i.e., P(S) = 1.
(b) The probability of an event, P(E), is a non-negative real number ranging from 0 to 1:
0 ≤ P(E) ≤ 1
(c) The probability of the union of mutually exclusive events is equal to the sum of their individual probabilities:
P(A ∪ B) = P(A) + P(B)
If A and B are not mutually exclusive events, then
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
• Conditional Probability: The conditional probability of an event B is the probability that the event will occur given that an event A has already occurred. It is defined as,
P(B|A) = P(A ∩ B) / P(A)
Similarly, the conditional probability of event A given that an event B has already occurred is given by,
P(A|B) = P(A ∩ B) / P(B)
If the events A and B are independent, then P(A ∩ B) = P(A) × P(B), and the conditional probability of B given A is simply P(B):
P(B|A) = [P(A) × P(B)] / P(A) = P(B)
Similarly, the conditional probability of A given B is simply P(A):
P(A|B) = [P(A) × P(B)] / P(B) = P(A)
• Law of Total Probability: If the events A₁, A₂, …, Aₖ are mutually exclusive, then the probability of another event B is given by,
P(B) = Σᵢ P(B|Aᵢ) P(Aᵢ), i = 1, 2, …, k
• Bayes' Theorem: Bayes' theorem or Bayes' rule describes the probability of an event based on prior knowledge of a related event. It is defined as:
P(Aᵢ|B) = P(B|Aᵢ)P(Aᵢ) / P(B) = P(B|Aᵢ)P(Aᵢ) / Σᵢ P(B|Aᵢ)P(Aᵢ), i = 1, 2, …, k
where,
P(Aᵢ): prior probability or marginal probability of A, as it does not take into account any information about B.
P(B): prior probability or marginal probability of B, as it does not take into account any information about A.
P(Aᵢ|B): conditional probability of A given B. It is also called the posterior probability, as it depends on the value of B.
P(B|Aᵢ): conditional probability of B given A.
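As a quick numeric sketch (the channel and its probabilities are assumed for illustration, not taken from the text), Bayes' rule and the law of total probability can be applied to a binary channel where A1 = "0 sent" and A2 = "1 sent" are the mutually exclusive causes and B = "1 received" is the observed event:

```python
# Hypothetical binary channel: equal priors, 0.1 probability of a bit flip (assumed values)
priors = {"A1": 0.5, "A2": 0.5}   # P(A1), P(A2)
likelihood = {"A1": 0.1,          # P(B|A1): 0 sent but 1 received (flip)
              "A2": 0.9}          # P(B|A2): 1 sent and 1 received

# Law of total probability: P(B) = sum over i of P(B|Ai) P(Ai)
p_b = sum(likelihood[a] * priors[a] for a in priors)

# Bayes' rule: posterior P(A2|B) = P(B|A2) P(A2) / P(B)
posterior_a2 = likelihood["A2"] * priors["A2"] / p_b

print(round(p_b, 3))           # 0.5
print(round(posterior_a2, 3))  # 0.9
```

Observing B raises the probability that a 1 was sent from the prior 0.5 to the posterior 0.9.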
3.2 RANDOM VARIABLES
The outcomes of a random experiment are not a convenient representation for mathematical analysis. For example, 'Head' and 'Tail' in the coin-tossing experiment. It will be convenient if we assign a number or a range of values to the outcomes of a random experiment. For example, a 'Head' corresponds to 1 and a 'Tail' corresponds to 0.
A random variable is defined as a mapping of the sample space Q to the set of real numbers. In other words, a random variable is an assignment of real numbers to the outcomes of a random experiment, as shown in Figure 3.1. Random variables are denoted by capital letters, i.e., X, Y, and so on, and individual values of the random variable X are X(ω).
Figure 3.1 Random variable: Mapping from Q to R
For example, consider an experiment of tossing three fair coins. Let X denote the number of "Heads" that appear; then X is a random variable taking on one of the values 0, 1, 2, 3 with respective probabilities:
P{X = 0} = P{(T, T, T)} = 1/8
P{X = 1} = P{(T, T, H), (T, H, T), (H, T, T)} = 3/8
P{X = 2} = P{(T, H, H), (H, T, H), (H, H, T)} = 3/8
P{X = 3} = P{(H, H, H)} = 1/8
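These probabilities can be reproduced by enumerating the sample space directly; a minimal sketch:

```python
from itertools import product
from fractions import Fraction

# All 8 equally likely outcomes of tossing three fair coins
outcomes = list(product("HT", repeat=3))

def prob_heads(k):
    """P{X = k}, where X counts the number of heads in an outcome."""
    favourable = sum(1 for o in outcomes if o.count("H") == k)
    return Fraction(favourable, len(outcomes))

print([str(prob_heads(k)) for k in range(4)])  # ['1/8', '3/8', '3/8', '1/8']
```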
3.2.1 Classification of Random Variables
Random variables are classified into continuous and discrete random variables.
• The values of a continuous random variable are continuous in a given continuous sample space. A continuous sample space has an infinite range of values. The discrete value of a continuous random variable is a value at one instant of time. For example, the temperature T at some area is a continuous random variable that always exists in a range, say, from T1 to T2.
Examples of continuous random variables: Uniform, Exponential, Gamma and Normal.
• The values of a discrete random variable are only the discrete values in a given sample space. The sample space for a discrete random variable can be continuous, discrete or even both continuous and discrete points. It may also be finite or infinite. For example, the "Wheel of Chance" has a continuous sample space. If we define a discrete random variable n as the integer numbers from 0 to 12, then the discrete random variable is X = {0, 1, 2, 3, …, 12}.
Examples of discrete random variables: Bernoulli, Binomial, Geometric and Poisson.
3.2.2 Distribution and Density Function
The Cumulative Distribution Function (CDF) of a random variable X is defined as,
F_X(x) = P{ω ∈ Ω : X(ω) ≤ x}
which can be simply written as, F_X(x) = P(X ≤ x)
Properties of CDF:
• F_X(x) is a non-decreasing function of x.
• Since the CDF is a probability, it ranges from 0 to 1, i.e., 0 ≤ F_X(x) ≤ 1.
• lim (x → −∞) F_X(x) = 0 and lim (x → ∞) F_X(x) = 1.
• F_X(x) is continuous from the right.
• P(a < X ≤ b) = F_X(b) − F_X(a)
• P(X = a) = F_X(a) − F_X(a⁻)
The Probability Density Function (PDF) of a continuous random variable X is defined as the derivative of its CDF:
f_X(x) = (d/dx) F_X(x)
Properties of PDF:
• f_X(x) ≥ 0
• ∫₋∞^∞ f_X(x) dx = 1
• ∫ₐᵇ f_X(x) dx = P(a < X ≤ b)
• P(X ∈ A) = ∫_A f_X(x) dx
• F_X(x) = ∫₋∞^x f_X(u) du
The Probability Mass Function (PMF) of a discrete random variable X is defined as p_X(x), where
p_X(x) = P(X = x)
Properties of PMF:
• 0 ≤ p_X(xᵢ) ≤ 1, i = 1, 2, …
• p_X(x) = 0, if x ≠ xᵢ (i = 1, 2, …)
• Σᵢ p_X(xᵢ) = 1
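The PDF–CDF relation above can be checked numerically; a small sketch, using an exponential density with rate λ = 2 (the distribution and rate are assumed for illustration):

```python
import math

lam = 2.0  # rate of an exponential random variable (assumed for illustration)
pdf = lambda x: lam * math.exp(-lam * x)   # f_X(x) for x >= 0
cdf = lambda x: 1.0 - math.exp(-lam * x)   # known closed-form F_X(x)

def integrate(f, a, b, n=20000):
    """Simple trapezoidal rule."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

total = integrate(pdf, 0.0, 20.0)   # integral of f_X over its support -> ~1
approx = integrate(pdf, 0.0, 1.0)   # integral of f_X up to x = 1 -> F_X(1)

print(round(total, 3))                       # 1.0
print(round(approx, 4), round(cdf(1.0), 4))  # 0.8647 0.8647
```

The integral of the PDF over its support is 1, and integrating the PDF up to x reproduces the CDF, matching the last two PDF properties.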
3.2.3 Statistical Averages of Random Variables
Even though the distribution function provides a complete description of the random variable, some other statistical averages, such as mean and variance, are also used to describe the random variable in detail.
Mean or Expectation of the random variable: The mean of the continuous random variable X with a density function f_X(x) is defined as:
E[X] = ∫₋∞^∞ x f_X(x) dx
The mean of the discrete random variable X is defined as the weighted sum of the possible outcomes, given as:
μ_X = E[X] = Σₓ x P(X = x)
For example, if X is considered as a random variable representing the observations of the voltage of a random signal, then the mean value represents the average voltage or dc offset of the signal.
Variance of the Random Variable: It provides an estimate of the spread of the distribution about the mean. The variance of a continuous random variable is defined as:
σ_X² = ∫₋∞^∞ (x − μ_X)² f_X(x) dx
The variance of a discrete random variable is given by the expectation of the squared distance of each outcome from the mean value. It is defined as:
Var(X) = σ_X² = E[(X − μ_X)²]
We know that the expectation of a variable is
E[X] = Σₓ x P[X = x]
Therefore, the variance is given by,
σ_X² = Σₓ (x − μ_X)² P[X = x]
For example, if X is considered as a random variable representing observations of the voltage of a random signal, then the variance represents the AC power of the signal. The second moment of X, E[X²], is also called the mean-square value of the random signal and it represents the total power of the signal.
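As a small sketch of these formulas (the fair die is an assumed example, not from the text), the mean, mean-square value and variance of a discrete random variable can be computed directly from its PMF:

```python
from fractions import Fraction

# PMF of a fair six-sided die (illustrative example)
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())                  # E[X]
mean_square = sum(x**2 * p for x, p in pmf.items())        # E[X^2], "total power"
variance = sum((x - mean)**2 * p for x, p in pmf.items())  # sigma^2, "AC power"

print(mean)         # 7/2
print(mean_square)  # 91/6
print(variance)     # 35/12
```

Note that E[X²] = Var(X) + μ² holds exactly: 91/6 = 35/12 + (7/2)².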
Covariance of the Random Variable: In communication systems, it is very important to compare two signals in order to extract the information. For example, a RADAR system compares the signal transmitted to the target with the (reflected) signal received from the target to measure parameters like range, angle and speed of the object. Correlation and covariance are mainly used for this comparison purpose.
The covariance of two random variables X and Y is defined as the expectation of the product of the two mean-removed random variables, given by:
Cov(X, Y) = E[(X − m_X)(Y − m_Y)]
Expanding and simplifying the above equation we get:
Cov(X, Y) = E[XY] − μ_X μ_Y
If Cov(X, Y) = 0, then X and Y are uncorrelated, i.e., E[XY] = μ_X μ_Y = E[X] E[Y]
If X and Y are independent, then Cov(X, Y) = 0, i.e., X and Y are uncorrelated; but the converse is not true.
The correlation coefficient of two random variables is defined as,
ρ_XY = σ_XY / (σ_X σ_Y)
where the correlation coefficient ranges over [−1, 1].
Some of the important random variables used in communication systems are:
(a) Binomial Random Variable:
This is a discrete random variable. Suppose there are n independent trials, each of which results in a success or a failure with probability p and 1 − p respectively. If X represents the number of successes that occur in the n trials, then X is said to be a binomial random variable with parameters (n, p).
The probability mass function of a binomial random variable with parameters n and p is given by,
P(X = i) = C(n, i) pⁱ (1 − p)ⁿ⁻ⁱ, 0 ≤ i ≤ n
where C(n, i) = n! / (i!(n − i)!).
The mean and variance of a binomial random variable are given by,
μ = E(X) = np
σ² = np(1 − p)
Application:
A binomial random variable can be used to model the total number of bits received in error when a sequence of n bits is transmitted over a channel with a bit-error probability of p, as shown in Figure 3.2.
Figure 3.2 PMF for Binomial Random Variable
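The bit-error model can be sketched directly from the PMF formula; the values n = 10 and p = 0.1 below are assumed for illustration:

```python
from math import comb

def binomial_pmf(i, n, p):
    """P(X = i) for a Binomial(n, p) random variable."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

# Illustrative values: 10 transmitted bits, bit-error probability 0.1 (assumed)
n, p = 10, 0.1
pmf = [binomial_pmf(i, n, p) for i in range(n + 1)]

print(round(sum(pmf), 6))                               # PMF sums to 1.0
print(round(sum(i * q for i, q in enumerate(pmf)), 6))  # mean = n*p = 1.0
```

On average one of the ten bits is received in error, matching μ = np.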
(b) Uniform Random Variable:
This is a continuous random variable that takes values between a and b with equal probabilities for intervals of equal length. The probability density function, shown in Figure 3.3, is defined as:
f_X(x) = 1/(b − a), a < x < b
f_X(x) = 0, otherwise
The cumulative distribution function, shown in Figure 3.4, is defined as:
F_X(x) = 0, x < a
F_X(x) = (x − a)/(b − a), a ≤ x ≤ b
F_X(x) = 1, x > b
Figure 3.3 PDF for uniform random variable
Figure 3.4 CDF for uniform random variable
Application:
A uniform random variable is used to model continuous random variables whose range is known, but for which other information, like the likelihood of the values the random variable can assume, is unknown. For example, the phase of a received sinusoid carrier is usually modelled as a uniform random variable between 0 and 2π. Quantization errors are also modelled as uniform random variables.
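The carrier-phase model can be checked by simulation; a minimal sketch (the sample size and seed are arbitrary choices) comparing sample statistics against the theoretical mean (a + b)/2 and variance (b − a)²/12 of a uniform random variable:

```python
import math
import random

random.seed(0)

# Carrier phase modelled as Uniform(0, 2*pi)
a, b = 0.0, 2 * math.pi
samples = [random.uniform(a, b) for _ in range(200_000)]

sample_mean = sum(samples) / len(samples)
sample_var = sum((s - sample_mean) ** 2 for s in samples) / len(samples)

print(abs(sample_mean - (a + b) / 2) < 0.05)        # True: mean ~ pi
print(abs(sample_var - (b - a) ** 2 / 12) < 0.1)    # True: variance ~ 4*pi^2/12
```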
(c) Gaussian or Normal Random Variable:
The Gaussian or Normal random variable is a continuous random variable described by the probability density function:
f_X(x) = (1 / (√(2π) σ)) exp(−(x − m)² / (2σ²))
The PDF of a Gaussian random variable is bell-shaped, symmetric about the mean m, and attains its maximum value of 1/(√(2π) σ) at x = m, as shown in Figure 3.5.
Figure 3.5 PDF for Gaussian Random Variable
The parameter m is called the Mean and can assume any finite value. The parameter σ is called the Standard Deviation and can assume any finite, positive value. The square of the standard deviation, σ², is the Variance. A Gaussian random variable with mean m and variance σ² is denoted by N(m, σ²). The random variable N(0, 1) is usually called standard normal.
Properties of Gaussian Random Variable:
• It is completely characterized by its mean and variance.
• The sum of two independent Gaussian random variables is also a Gaussian random variable.
• The weighted sum of N independent Gaussian random variables is a Gaussian random variable.
• If two jointly Gaussian random variables have zero covariance (uncorrelated), they are also independent.
• A Gaussian random variable plus a constant is another Gaussian random variable with the mean adjusted by the constant.
• A Gaussian random variable multiplied by a constant is another Gaussian random variable where both the mean and variance are affected by the constant.
Applications:
• The Gaussian random variable is the most important and frequently encountered random variable in communication systems. The reason is that thermal noise, which is the major source of noise in communication, has a Gaussian distribution.
• In robotics, the Gaussian PDF is used to statistically characterize sensor measurements, robot locations and map representations.
3.3 CENTRAL LIMIT THEOREM
An important result in probability theory that is closely related to the Gaussian distribution is the Central Limit Theorem. Let X₁, X₂, X₃, …, Xₙ be a set of random variables with the following properties:
• The Xₖ with k = 1, 2, …, n are statistically independent.
• The Xₖ all have the same probability density function.
• Both the mean and the variance exist for each Xₖ.
We do not assume that the density function of the Xₖ is Gaussian. Let Y be a new random variable defined as:
Y = X₁ + X₂ + … + Xₙ
Then, according to the central limit theorem, the normalized random variable,
Z = (Y − E[Y]) / σ_Y
approaches a Gaussian random variable with zero mean and unit variance as the number of random variables X₁, X₂, X₃, …, Xₙ increases without limit. That is, as n becomes large, the distribution of Z approaches that of a zero-mean Gaussian random variable with unit variance, as shown by:
F_Z(z) = (1/√(2π)) ∫₋∞^z exp(−s²/2) ds
This is a mathematical statement of the central limit theorem. In words, the normalized distribution of the sum of independent, identically distributed random variables approaches a Gaussian distribution as the number of random variables increases, regardless of the individual distributions. Thus, Gaussian random variables are common because they characterize the asymptotic properties of many other types of random variables.
When n is finite, the Gaussian approximation is most accurate in the central portion of the density function (hence the name central limit) and less accurate in the "tails" of the density function.
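The theorem can be illustrated by simulation; a sketch (n = 30 uniform summands and the sample size are arbitrary choices) normalizing a sum of Uniform(0, 1) variables and checking one value of the limiting distribution, F_Z(0) = 0.5:

```python
import math
import random

random.seed(1)

def normalized_sum(n):
    """Z = (Y - E[Y]) / sigma_Y, where Y is the sum of n Uniform(0,1) variables."""
    y = sum(random.random() for _ in range(n))
    mean_y = n * 0.5                 # E[Y] = n * 1/2
    std_y = math.sqrt(n / 12.0)      # Var of Uniform(0,1) is 1/12
    return (y - mean_y) / std_y

# Empirical check: the fraction of Z values below 0 should approach F_Z(0) = 0.5
zs = [normalized_sum(30) for _ in range(50_000)]
frac_below_zero = sum(z < 0 for z in zs) / len(zs)
print(abs(frac_below_zero - 0.5) < 0.02)   # True
```

None of the summands is Gaussian, yet the normalized sum already behaves like N(0, 1) for moderate n.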
Applications of Central Limit Theorem:
• Channel Modelling
• Finance
• Population Statistics
• Hypothesis Testing
• Engineering Research
3.4 RANDOM PROCESS
A random process or stochastic process is the natural extension of random variables when dealing with signals. In analyzing communication systems, we basically deal with time-varying signals. So far, the assumption has been that all the signals are deterministic. In many situations, it is more appropriate to model signals as random rather than deterministic functions.
One such example is the case of thermal noise in electronic circuits. This type of noise is due to the random movement of electrons as a result of thermal agitation; therefore, the resulting current and voltage can only be described statistically. Another situation where modeling by random processes proves useful is in the characterization of information sources. An information source, such as a speech source, generates time-varying signals whose contents are not known in advance. Otherwise there would be no need to transmit them. Therefore, random processes also provide a natural way to model information sources.
A random process is a collection (or ensemble) of random variables {X(t, ω)} that are functions of a real variable, namely time t, where ω ∈ S (sample space) and t ∈ T (parameter set or index set). The set of possible values of any individual member of the random process is called the state space. Any individual member itself is called a sample function or a realization of the process. A random process can be viewed as a mapping of the sample space S to a set of signal waveforms, as shown in Figure 3.6.
Figure 3.6 Random Process – Mapping of Sample Space to Signal Waveform
The realization of one from the set of possible signals is governed by some probabilistic law. This is similar to the definition of random variables, where one from a set of possible values is realized according to some probabilistic law. The difference is that in random processes we have signals (functions of time) instead of values (numbers).
3.4.1 Classification of Random Process
Based on the continuous or discrete nature of the state space S and parameter set T, a random process can be classified into four types:
• If both T and S are discrete, the random process is called a discrete random sequence. For example, if Xₙ represents the outcome of the nth toss of a fair die, then {Xₙ, n ≥ 1} is a discrete random sequence, since T = {1, 2, 3, …} and S = {1, 2, 3, 4, 5, 6}.
• If T is discrete and S is continuous, the random process is called a continuous random sequence. For example, if Xₙ represents the temperature at the end of the nth hour of a day, then {Xₙ, 1 ≤ n ≤ 24} is a continuous random sequence, since temperature can take any value in an interval and is hence continuous.
• If T is continuous and S is discrete, the random process is called a discrete random process. For example, if X(t) represents the number of telephone calls received in the interval (0, t), then {X(t)} is a discrete random process, since S = {0, 1, 2, 3, …}.
• If both T and S are continuous, the random process is called a continuous random process. For example, if X(t) represents the maximum temperature at a place in the interval (0, t), {X(t)} is a continuous random process.
Based on stationarity, a random process can be classified into stationary and non-stationary random processes, as shown in Figure 3.7.
Figure 3.7 Hierarchical Classification of Random Process
• A random process whose statistical characteristics do not change with time is called a Stationary Random Process or Stationary Process. Example: a noise process, as its statistics do not change with time.
• A random process whose statistical characteristics change with time is called a non-stationary process. Example: the temperature of a city, as temperature statistics depend on the time of the day.
3.5 STATIONARY PROCESS
The random process X(t) is said to be Stationary in the Strict Sense (SSS) or strictly stationary if the following condition holds:
F_{X(t₁+τ), …, X(tₖ+τ)}(x₁, …, xₖ) = F_{X(t₁), …, X(tₖ)}(x₁, …, xₖ)
for all time shifts τ, all k and all possible choices of observation times t₁, …, tₖ. In other words, the joint distribution of any set of random variables obtained by observing the random process X(t) is invariant with respect to the location of the time origin t = 0.
3.5.1 Mean of the Random Process
The mean of the process X(t) is defined as the expectation of the random variable obtained by observing the process at some time t, given by:
m_X(t) = E[X(t)] = ∫₋∞^∞ x f_{X(t)}(x) dx
where f_{X(t)}(x) is the first-order probability density function of the process.
The mean of a strict-sense stationary process is always constant, given by:
m_X(t) = m_X for all values of t
3.5.2 Correlation of the Random Process
The autocorrelation of the process X(t) is given by the expectation of the product of the two random variables X(t₁) and X(t₂) obtained by observing the process X(t) at times t₁ and t₂ respectively; it is a measure of the similarity of the random process with a time-shifted version of itself. It is defined as:
R_XX(t₁, t₂) = E[X(t₁)X(t₂)] = ∫₋∞^∞ ∫₋∞^∞ x₁ x₂ f_{X(t₁),X(t₂)}(x₁, x₂) dx₁ dx₂
For a stationary random process, f_{X(t₁),X(t₂)}(x₁, x₂) depends only on the difference between the observation times t₁ and t₂. This implies that the autocorrelation function of a strictly stationary process depends only on the time difference t₂ − t₁, given by:
R_XX(t₁, t₂) = R_XX(t₂ − t₁) for all values of t₁ and t₂
3.5.2.1 Importance of Autocorrelation
The autocorrelation function provides the spectral information of the random process. The frequency content of a process depends on the rapidity of the amplitude change with time. This can be measured by correlating the amplitudes at t₁ and t₁ + τ. The autocorrelation function can therefore be used to provide information about the rapidity of amplitude variation with time, which in turn gives information about the spectral content.
For example, consider two random processes x(t) and y(t), whose autocorrelation functions R_X(τ) and R_Y(τ) are shown in Figure 3.8. We can observe from their autocorrelation functions that the random process x(t) is a slowly varying process compared to the process y(t). In fact, the power spectral density of a random process is obtained from the Fourier transform of its autocorrelation function.
Figure 3.8 Autocorrelation Functions of Two Random Processes x(t) and y(t)
3.5.2.2 Properties of Autocorrelation Function
• The mean-square value of a random process is equal to the value of the autocorrelation at τ = 0:
R_XX(0) = E[X²(t)]
• The autocorrelation function is an even function of τ:
R_XX(τ) = R_XX(−τ)
• The autocorrelation function is maximum at τ = 0:
|R_XX(τ)| ≤ R_XX(0)
• If E[X(t)] ≠ 0 and X(t) is ergodic with no periodic components, then
lim (|τ| → ∞) R_XX(τ) = X̄²
• If X(t) has a periodic component, then R_XX(τ) will have a periodic component with the same period.
• If X(t) is ergodic, zero mean, and has no periodic components, then
lim (|τ| → ∞) R_XX(τ) = 0
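The first three properties can be observed on a simulated sample function; a sketch (the first-order recursive process below, x[n] = 0.9·x[n−1] + w[n], is an assumed example) using a time-average estimate of the autocorrelation:

```python
import random

random.seed(2)

# One sample function of a zero-mean process: x[n] = 0.9*x[n-1] + w[n] (illustrative)
N = 100_000
x = [0.0]
for _ in range(N - 1):
    x.append(0.9 * x[-1] + random.gauss(0.0, 1.0))

def autocorr(x, lag):
    """Time-average estimate of R_XX(lag) for a discrete-time sample function."""
    return sum(x[n] * x[n + lag] for n in range(len(x) - lag)) / (len(x) - lag)

r0, r1 = autocorr(x, 0), autocorr(x, 1)
print(r0 >= abs(r1))                  # True: R_XX(0) is the maximum
print(abs(r1 / r0 - 0.9) < 0.05)      # True: slow decay of a slowly varying process
```

The slow decay of R_XX(τ) reflects the low-frequency content of this slowly varying process, as discussed above.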
3.5.3 Covariance of the Random Process
The autocovariance function of X(t) is defined as:
C_XX(t₁, t₂) = E{[X(t₁) − m_X(t₁)][X(t₂) − m_X(t₂)]} = R_XX(t₁, t₂) − m_X(t₁) m_X(t₂)
For a strictly stationary process:
C_XX(t₁, t₂) = R_XX(t₂ − t₁) − m_X²
Like the autocorrelation function, the autocovariance function of a strictly stationary process depends only on the time difference t₂ − t₁.
3.6 WHITE NOISE PROCESS
One of the most important random processes is the white noise process. Noise in many practical situations is approximated by the white noise process. Most importantly, white noise plays an important role in modeling WSS signals.
A random process X(t) is said to be a white noise process if it is zero mean and its power spectral density is defined as:
S_X(ω) = N₀/2, for all frequencies
where N₀ is a real constant.
The corresponding autocorrelation function is given by:
R_X(τ) = (N₀/2) δ(τ)
where δ(τ) is the Dirac delta function. The PSD and autocorrelation function of white noise are shown in Figure 3.9 (a) and (b) respectively.
Figure 3.9 (a) PSD of White Noise (b) Autocorrelation Function of White Noise
3.6.1 Properties of White Noise Process
• The term white noise is analogous to white light, which contains all visible light frequencies.
• White noise is a mathematical abstraction; it cannot be physically realized, since it has infinite average power.
• A white noise process can have any probability density function.
• The random process X(t) is called a white Gaussian noise process if X(t) is a stationary Gaussian random process with zero mean and flat power spectral density.
• If the system bandwidth (BW) is sufficiently narrower than the noise BW and the noise PSD is flat, we can model the noise as a white noise process. Thermal noise, which is the noise generated in resistors due to the random motion of electrons, is generally modelled as white Gaussian noise, since it has a very flat PSD over a very wide band of frequencies.
• A white noise process is called a strict-sense white noise process if the noise samples at distinct instants of time are independent.
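The delta-shaped autocorrelation can be observed on simulated samples; a discrete-time sketch (unit variance and the sample size are assumed choices) of white Gaussian noise:

```python
import random

random.seed(3)

# Discrete-time white Gaussian noise: zero mean, unit variance (assumed)
N = 50_000
w = [random.gauss(0.0, 1.0) for _ in range(N)]

def autocorr(x, lag):
    """Time-average estimate of the autocorrelation at the given lag."""
    return sum(x[n] * x[n + lag] for n in range(len(x) - lag)) / (len(x) - lag)

# R(0) ~ variance; R(lag) ~ 0 for lag != 0, mimicking the delta-shaped R_X(tau)
print(round(autocorr(w, 0), 1))    # 1.0
print(abs(autocorr(w, 5)) < 0.05)  # True
```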
3.7 WIDE-SENSE STATIONARY PROCESS (WSS)
A process may not be stationary in the strict sense; still, it may have a mean value and an autocorrelation function which are independent of the shift of the time origin:
• m_X(t) = constant
• R_XX(t₁, t₂) = R_XX(t₂ − t₁)
Such a process is known as a Wide-Sense Stationary (WSS) or Weakly Stationary or Covariance Stationary process.
3.8 POWER SPECTRAL DENSITY (PSD)
A random process is a collection of signals, and the spectral characteristics of these signals determine the spectral characteristics of the random process.
If the signals of the random process are:
Slowly varying: the random process will mainly contain low frequencies and its power will be mostly concentrated at low frequencies.
Fast varying: most of the power in the random process will be at the high-frequency components.
A useful function that determines the distribution of the power of the random process over frequency is the Power Spectral Density (PSD) or power spectrum of the random process. The power spectral density of a random process X(t) is denoted by S_X(f) and denotes the strength of the power in the random process as a function of frequency. The unit of power spectral density is watts per hertz (W/Hz).
3.8.1 Expression for Power Spectral Density
The impulse response of a linear time-invariant filter is equal to the inverse Fourier transform of the frequency response of the system. Consider a stationary process X(t) applied to such a filter with impulse response h(t) and output Y(t), whose mean-square value is
E[Y²(t)] = ∫₋∞^∞ ∫₋∞^∞ h(τ₁) h(τ₂) R_XX(τ₂ − τ₁) dτ₁ dτ₂
Let H(f) denote the frequency response of the system; thus
h(τ₁) = ∫₋∞^∞ H(f) exp(j2πf τ₁) df
Substituting h(τ₁) in E[Y²(t)], we get
E[Y²(t)] = ∫₋∞^∞ df H(f) ∫₋∞^∞ dτ₂ h(τ₂) ∫₋∞^∞ R_XX(τ₂ − τ₁) exp(j2πf τ₁) dτ₁
In the last integral on the right-hand side of the above equation, a new variable is defined:
τ = τ₂ − τ₁
Then the above equation can be rewritten as:
E[Y²(t)] = ∫₋∞^∞ df H(f) ∫₋∞^∞ dτ₂ h(τ₂) exp(j2πf τ₂) ∫₋∞^∞ R_XX(τ) exp(−j2πf τ) dτ
The middle integral corresponds to H*(f), the complex conjugate of the frequency response of the filter, and so we may simplify the equation as:
E[Y²(t)] = ∫₋∞^∞ |H(f)|² [∫₋∞^∞ R_XX(τ) exp(−j2πf τ) dτ] df
where |H(f)| is the magnitude response of the filter. In the above equation the inner integral is the Fourier transform of the autocorrelation function of the input random process X(t). This provides the definition of a new parameter,
S_XX(f) = ∫₋∞^∞ R_XX(τ) exp(−j2πf τ) dτ
This function S_XX(f) is called the power spectral density or power spectrum of the stationary process X(t).
Finally,
E[Y²(t)] = ∫₋∞^∞ |H(f)|² S_XX(f) df
The mean square of the output of a stable linear time-invariant filter in response to a stationary process is equal to the integral over all frequencies of the power spectral density of the input process multiplied by the squared magnitude response of the filter.
3.8.2 Relationship between Power Spectral Density and Autocorrelation
The power spectral density and the autocorrelation function of a stationary process form a Fourier-transform pair with τ and f as the variables of interest:
S_XX(f) = ∫₋∞^∞ R_XX(τ) exp(−j2πf τ) dτ
R_XX(τ) = ∫₋∞^∞ S_XX(f) exp(j2πf τ) df
The above relations provide insight into the spectral analysis of random processes, and together they are called the Einstein-Wiener-Khintchine relations.
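The transform pair can be illustrated on a discrete lag grid with a DFT standing in for the Fourier transform; a sketch (the sampled autocorrelation R[k] = 0.8^|k| and the grid size are assumed for illustration):

```python
import cmath

# Sampled autocorrelation of an illustrative process, wrapped onto a DFT grid of lags
N = 64
R = [0.8 ** abs(k if k <= N // 2 else k - N) for k in range(N)]

# S[m]: DFT of R, the discrete analogue of S_XX(f) = FT of R_XX(tau)
S = [sum(R[k] * cmath.exp(-2j * cmath.pi * m * k / N) for k in range(N))
     for m in range(N)]

# The PSD of a real process is real, nonnegative and even
print(all(abs(s.imag) < 1e-9 for s in S))   # True
print(all(s.real >= 0 for s in S))          # True

# The inverse DFT recovers R, mirroring the Einstein-Wiener-Khintchine pair
R_back = [sum(S[m] * cmath.exp(2j * cmath.pi * m * k / N) for m in range(N)).real / N
          for k in range(N)]
print(abs(R_back[1] - 0.8) < 1e-9)          # True
```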
3.8.3 Properties of Power Spectral Density
1. The zero-frequency value of the power spectral density of a stationary process equals the total area under the graph of the autocorrelation function (substituting f = 0 in S_XX(f)):
S_XX(0) = ∫₋∞^∞ R_XX(τ) dτ
2. The mean-square value of a stationary process equals the total area under the graph of the power spectral density (substituting τ = 0 in R_XX(τ)):
E[X²(t)] = ∫₋∞^∞ S_XX(f) df
3. The power spectral density of a stationary process is always nonnegative:
S_XX(f) ≥ 0 for all f
4. The power spectral density of a real-valued random process is an even function of frequency, that is,
S_XX(−f) = S_XX(f)
5. The power spectral density, appropriately normalized, has the properties usually associated with a probability density function:
p_X(f) = S_XX(f) / ∫₋∞^∞ S_XX(f) df
3.9 ERGODIC PROCESS
The expectations or ensemble averages of a random process are averages across the process that describe all possible values of the sample functions of the process observed at time tₖ. Time averages are defined as long-term sample averages that are averages along the process.
Time averages are the practical means for the estimation of ensemble averages of the random process. The dc value of x(t) is defined by the time average
m_x(T) = (1/2T) ∫₋T^T x(t) dt
The time average is a random variable, as its value depends on the observation interval and the sample function. The mean of the time average is given by,
E[m_x(T)] = (1/2T) ∫₋T^T E[x(t)] dt = m_x
The process x(t) is ergodic in the mean if:
1. The time average m_x(T) approaches the ensemble average m_x in the limit as the observation interval T approaches infinity, that is,
lim (T → ∞) m_x(T) = m_x
2. The variance of m_x(T) approaches zero in the limit as the observation interval T approaches infinity:
lim (T → ∞) var[m_x(T)] = 0
The time-averaged autocorrelation function of a sample function x(t) is
R_XX(τ, T) = (1/2T) ∫₋T^T x(t + τ) x(t) dt
The process x(t) is ergodic in the autocorrelation function if:
lim (T → ∞) R_XX(τ, T) = R_XX(τ)
lim (T → ∞) Var[R_XX(τ, T)] = 0
For a random process to be ergodic, it has to be stationary; however, the converse is not true.
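Ergodicity in the mean can be illustrated by simulation; a sketch (the process, its dc value of 1.5, and the sample sizes are assumed for illustration) comparing a time average along one realization with an ensemble average across realizations:

```python
import random

random.seed(4)

DC = 1.5  # assumed dc value of the process (illustrative)

def sample_function(n):
    """One realization of x(t) = DC + white Gaussian noise, observed at n instants."""
    return [DC + random.gauss(0.0, 1.0) for _ in range(n)]

# Time average along a single long realization
time_avg = sum(sample_function(100_000)) / 100_000

# Ensemble average across many realizations at a single time instant
ensemble_avg = sum(sample_function(1)[0] for _ in range(100_000)) / 100_000

# For this ergodic-in-the-mean process, both estimates approach DC
print(round(time_avg, 1), round(ensemble_avg, 1))   # 1.5 1.5
```

In practice only one realization is available, which is why the time average is the practical estimator of the ensemble mean.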
3.10 GAUSSIAN PROCESS
Gaussian processes play an important role in communication systems. The fundamental reason for their importance is that thermal noise in electronic devices, which is produced by the random movement of electrons due to thermal agitation, can be closely modeled by a Gaussian process.
In a resistor, free electrons move as a result of thermal agitation. The movement of these electrons is random, but their velocity is a function of the ambient temperature. The higher the temperature, the higher the velocity of the electrons. The movement of these electrons generates a current with a random value. We can consider each electron in motion as a tiny current source, whose current is a random variable that can be positive or negative, depending on the direction of the movement of the electron. The total current generated by all electrons, which is the generated thermal noise, is the sum of the currents of all these current sources. We can assume that at least a majority of these sources behave independently and, therefore, the total current is the sum of a large number of independent and identically distributed random variables. Now, by applying the central limit theorem, we conclude that this total current has a Gaussian distribution. For this reason, thermal noise can be very well modeled by a Gaussian random process.
Gaussian processes provide rather good models for some information sources as well. Some properties of Gaussian processes make these processes mathematically tractable and easy to use.
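The central-limit argument above can be illustrated numerically: summing many tiny independent "electron currents" produces a total current whose distribution is close to Gaussian. In the sketch below, the uniform micro-currents and all sizes (50 sources, ±1 range, 20000 samples) are invented purely for the demonstration:

```python
import random

# Thermal-noise sketch: total current = sum of many tiny independent
# "electron" currents. All sizes here are demo choices, not physical values.
random.seed(1)

K = 50           # independent micro current sources
TRIALS = 20_000  # noise samples

samples = [sum(random.uniform(-1.0, 1.0) for _ in range(K)) for _ in range(TRIALS)]

n = len(samples)
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n       # theory: K/3 ~ 16.67
m4 = sum((s - mean) ** 4 for s in samples) / n
excess_kurtosis = m4 / var ** 2 - 3.0                 # ~0 for a Gaussian
```

Each uniform term alone has excess kurtosis −1.2; the sum of 50 of them is already very nearly Gaussian (excess kurtosis near 0), which is the content of the central limit theorem.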
Let Y be a random variable obtained by integrating the product of a random process X(t) and some function g(t) over the time period t = 0 to t = T:

    Y = ∫₀ᵀ X(t) g(t) dt

If the weighting function g(t) in the above equation is such that the mean-square value of the random variable Y is finite, and if the random variable Y is a Gaussian distributed random variable for every g(t), then the process X(t) is said to be a Gaussian process.
The random variable Y has a Gaussian distribution if its probability density function has the form

    f_Y(y) = (1/√(2πσ_Y²)) exp(−(y − m_Y)² / (2σ_Y²))

where m_Y is the mean and σ_Y² is the variance of the random variable Y.
A plot of this probability density function is shown in Figure 3.10 for the special case when the Gaussian random variable Y is normalized to have a mean of zero and a variance σ_Y² of 1:

    f_Y(y) = (1/√(2π)) exp(−y²/2)

Figure 3.10 Normalized Gaussian Distribution
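As a quick check of the density above, a small helper (the function name and defaults are our own, not from the text) evaluates f_Y(y); for the normalized case it peaks at f_Y(0) = 1/√(2π) ≈ 0.399, matching the peak in Figure 3.10:

```python
import math

# Gaussian density f_Y(y) with given mean and variance (helper for the demo).
def gaussian_pdf(y, mean=0.0, var=1.0):
    return math.exp(-((y - mean) ** 2) / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

peak = gaussian_pdf(0.0)   # normalized case: 1/sqrt(2*pi) ~ 0.399
tail = gaussian_pdf(3.0)   # already small three standard deviations out
```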
3.10.1 Advantages of Gaussian Process
1. A Gaussian process has many properties that make analytic results possible.
2. Random processes produced by physical phenomena are often such that a Gaussian model is appropriate. Further, the use of the Gaussian model is confirmed by experiments.
3.10.2 Properties of Gaussian Process
1. If the set of random variables X(t₁), X(t₂), X(t₃), ..., X(tₙ) is obtained by observing a random process X(t) at times t₁, t₂, ..., tₙ, and the process X(t) is Gaussian, then this set of random variables is jointly Gaussian for any n, with their PDF completely specified by the set of means

    m_X(tᵢ) = E[X(tᵢ)],  where i = 1, 2, 3, ..., n

and the set of covariance functions

    C_X(t_k, tᵢ) = E[(X(t_k) − m_X(t_k))(X(tᵢ) − m_X(tᵢ))],  k, i = 1, 2, 3, ..., n

2. If the set of random variables X(tᵢ) is uncorrelated, that is,

    C_ij = 0,  i ≠ j

then the X(tᵢ) are independent.
3. If a Gaussian process is stationary, then the process is also strictly stationary.
4. If a Gaussian process X(t) is passed through an LTI filter, then the random process Y(t) at the output of the filter is also a Gaussian process.
3.11 TRANSMISSION OF A RANDOM PROCESS THROUGH A LTI FILTER
Suppose a random process X(t) is applied as input to a linear time-invariant filter of impulse response h(t), producing a new random process Y(t) at the filter output as shown in Figure 3.11. In general it is difficult to describe the probability distribution of Y(t), even when the probability distribution of X(t) is completely specified.

Figure 3.11 Transmission of a random process through a LTI filter: X(t) → [h(t)] → Y(t)

Here we derive the time-domain input-output relations of the filter that define the mean and autocorrelation functions of the output random process Y(t) in terms of the input X(t), assuming X(t) is a stationary process. The transmission of a process through an LTI filter is governed by the convolution integral, where the output random process Y(t) is expressed in terms of the input random process X(t) as

    Y(t) = ∫_{−∞}^{∞} h(τ₁) X(t − τ₁) dτ₁

where τ₁ is the integration variable.
Mean of Y(t):

    m_Y(t) = E[Y(t)] = E[ ∫_{−∞}^{∞} h(τ₁) X(t − τ₁) dτ₁ ]

provided E[X(t)] is finite for all t and the system is stable. Interchanging the order of expectation and integration, we get

    m_Y(t) = ∫_{−∞}^{∞} h(τ₁) E[X(t − τ₁)] dτ₁ = ∫_{−∞}^{∞} h(τ₁) m_X(t − τ₁) dτ₁

When the input random process X(t) is stationary, the mean m_X(t) is a constant m_X, so

    m_Y = m_X ∫_{−∞}^{∞} h(τ₁) dτ₁

    m_Y = m_X H(0)

where H(0) is the zero-frequency (DC) response of the system.
The mean of the output random process Y(t) produced at the output of a LTI system in response to input process X(t) is equal to the mean of X(t) multiplied by the DC response of the system.
Autocorrelation of Y(t):

    R_YY(t₁, t₂) = E[Y(t₁)Y(t₂)]

Using the convolution integral, we get

    R_YY(t₁, t₂) = E[ ∫_{−∞}^{∞} h(τ₁) X(t₁ − τ₁) dτ₁ ∫_{−∞}^{∞} h(τ₂) X(t₂ − τ₂) dτ₂ ]

Provided E[X²(t)] is finite for all t and the system is stable,

    R_YY(t₁, t₂) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(τ₁) h(τ₂) E[X(t₁ − τ₁) X(t₂ − τ₂)] dτ₁ dτ₂
                 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(τ₁) h(τ₂) R_XX(t₁ − τ₁, t₂ − τ₂) dτ₁ dτ₂

When the input X(t) is a stationary process, the autocorrelation function of X(t) is only a function of the difference between the observation times t₁ − τ₁ and t₂ − τ₂. Thus, putting τ = t₁ − t₂ in the above equation, we get

    R_YY(τ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(τ₁) h(τ₂) R_XX(τ − τ₁ + τ₂) dτ₁ dτ₂

On combining this result with the mean m_Y, we see that if the input to a stable linear time-invariant filter is a stationary process, then the output of the filter is also a stationary process.
When τ = 0, R_YY(0) = E[Y²(t)], so

    E[Y²(t)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(τ₁) h(τ₂) R_XX(τ₂ − τ₁) dτ₁ dτ₂

which is a constant.
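The double-integral expression for E[Y²(t)] can be sanity-checked in discrete time. For a white input, R_XX collapses to σ²δ, so the double sum reduces to σ² Σ h²[k]; the FIR taps below are arbitrary demo values, not from the text:

```python
import random

# Discrete-time check of E[Y^2] = sum_k sum_i h[k] h[i] R_XX(i - k).
# For white input R_XX(m) = sigma2 * delta(m), so this collapses to
# sigma2 * sum(h^2). Taps and sample size are demo choices.
random.seed(7)

h = [0.5, 0.3, 0.2]       # assumed impulse response
sigma2 = 1.0              # input variance
N = 100_000

x = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(N)]
# Discrete convolution y[n] = sum_k h[k] x[n - k]
y = [sum(h[k] * x[n - k] for k in range(len(h))) for n in range(len(h), N)]

measured = sum(v * v for v in y) / len(y)
predicted = sigma2 * sum(hk * hk for hk in h)   # 0.38 for these taps
```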
FORMULAE TO REMEMBER
 If A and B are independent events, then
    P(A ∩ B) = P(A) · P(B)
 If A and B are mutually exclusive events, then
    P(A ∩ B) = 0
 Conditional probability of event A given B:
    P(A | B) = P(A ∩ B) / P(B)
 Law of total probability:
    P(B) = Σᵢ₌₁ᵏ P(B | Aᵢ) P(Aᵢ)
 Bayes' theorem:
    P(Aᵢ | B) = P(B | Aᵢ) P(Aᵢ) / P(B)
 Cumulative Distribution Function (CDF) of a random variable X:
    F_X(x) = P(X ≤ x)
 Probability Density Function (PDF) of a continuous random variable X:
    f_X(x) = (d/dx) F_X(x)
 Probability Mass Function (PMF) of a discrete random variable X:
    p_X(x) = P(X = x)
 Mean of a random variable:
    E[X] = ∫_{−∞}^{∞} x f_X(x) dx   (Continuous)
    E[X] = Σₓ x P[X = x]   (Discrete)
 Variance of a random variable:
    σ_X² = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx   (Continuous)
    σ_X² = E[(X − μ_X)²] = Σₓ (x − μ_X)² P[X = x]   (Discrete)
 Covariance of two random variables:
    Cov(X, Y) = E[XY] − μ_X μ_Y
 If Cov(X, Y) = 0, then X and Y are uncorrelated.
 If X and Y are independent, then Cov(X, Y) = 0.
 PMF of a binomial random variable:
    P(X = i) = C(n, i) pⁱ (1 − p)ⁿ⁻ⁱ,  0 ≤ i ≤ n
 PDF of a uniform random variable:
    f_X(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise
 PDF of a Gaussian or normal random variable:
    f_X(x) = (1/√(2πσ²)) exp(−(x − m)²/(2σ²))
 The random process X(t) is said to be Stationary in the Strict Sense (SSS) or Strictly Stationary if the following condition holds:
    F_{X(t₁),...,X(t_k)}(x₁, ..., x_k) = F_{X(t₁+τ),...,X(t_k+τ)}(x₁, ..., x_k)
 Mean of the random process X(t):
    m_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx
 Autocorrelation of the random process X(t):
    R_XX(t₁, t₂) = E[X(t₁)X(t₂)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x₁ x₂ f_{X(t₁),X(t₂)}(x₁, x₂) dx₁ dx₂
 Autocovariance function of the random process X(t):
    C_XX(t₁, t₂) = R_XX(t₂ − t₁) − m_X²
 Wide-Sense Stationary or Weakly Stationary Process (WSS) or Covariance Stationary:
    m_X(t) = constant and R_XX(t₁, t₂) = R_XX(t₂ − t₁)
 Power spectral density or power spectrum of the stationary process X(t):
    S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) exp(−j2πfτ) dτ
 Einstein-Wiener-Khintchine relations:
    S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) exp(−j2πfτ) dτ
    R_XX(τ) = ∫_{−∞}^{∞} S_XX(f) exp(j2πfτ) df
 x(t) is ergodic in the mean if:
    lim_{T→∞} m_x(T) = m_x  and  lim_{T→∞} var[m_x(T)] = 0
 x(t) is ergodic in the autocorrelation function if:
    lim_{T→∞} R_XX(τ, T) = R_XX(τ)  and  lim_{T→∞} Var[R_XX(τ, T)] = 0
 The mean of the output random process Y(t) produced at the output of a LTI system in response to input process X(t) is equal to the mean of X(t) multiplied by the DC response of the system:
    m_Y = m_X H(0)
SOLVED EXAMPLES
1. A telegraph source generates two symbols: Dot and Dash. The dots are twice as likely to occur as dashes. What is the probability of occurrence of dot and dash?
Solution: Given that
    P(Dot) = 2 P(Dash)
We know that the probability of the sample space is 1, so
    P(Dot) + P(Dash) = 1
Substituting P(Dot) = 2 P(Dash) in the above equation, we get
    3 P(Dash) = 1
Thus, P(Dash) = 1/3 and P(Dot) = 2/3.
2. Binary data are transmitted over a noisy communication channel in blocks of 16 binary digits. The probability that a received digit is in error due to channel noise is 0.01. Assume that the errors occurring in various digit positions within a block are independent. Find (a) the mean number of errors per block, (b) the variance of the number of errors per block and (c) the probability that the number of errors per block is greater than or equal to 4.
Solution: Let X denote the random variable of the number of errors per block. Then X has a binomial distribution.
(a) Mean = np
Given, n = 16 and p = 0.01
Mean errors per block = 16 × 0.01 = 0.16
(b) Variance = np(1 − p)
Variance of the number of errors per block = (16)(0.01)(0.99) ≈ 0.158
(c) The probability that the number of errors per block is greater than or equal to 4 is
    P(X ≥ 4) = 1 − P(X ≤ 3)
Using the binomial distribution, we get
    P(X ≤ 3) = Σᵢ₌₀³ C(16, i) (0.01)ⁱ (0.99)¹⁶⁻ⁱ ≈ 0.999984
Hence, P(X ≥ 4) = 1 − 0.999984 ≈ 1.7 × 10⁻⁵
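Part (c) can be verified directly with `math.comb`. Note that with p = 0.01 the exact tail probability P(X ≥ 4) works out to be on the order of 10⁻⁵:

```python
from math import comb

n, p = 16, 0.01
mean = n * p                      # 0.16
variance = n * p * (1 - p)        # 0.1584

# P(X >= 4) = 1 - P(X <= 3) via the binomial PMF
p_le3 = sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(4))
p_ge4 = 1.0 - p_le3               # on the order of 1e-5
```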
EC8491: Communication Theory Department of ECE
Random Process 3.31
3. The PDF of a random variable X is given by f_X(x) = k for a ≤ x ≤ b and zero elsewhere, where k is a constant. Find:
(i) the value of k
(ii) if a = −1 and b = 2, calculate P(|X| ≤ c) for c = 1/2.
Solution:
(i) The given PDF f_X(x) corresponds to a uniform random variable, so the value of k is
    k = 1/(b − a)
(ii) Using the value k = 1/(2 − (−1)) = 1/3, the PDF is f_X(x) = 1/3 for −1 ≤ x ≤ 2 and zero elsewhere, so
    P(|X| ≤ 1/2) = P(−1/2 ≤ X ≤ 1/2) = ∫_{−1/2}^{1/2} (1/3) dx
We get P(|X| ≤ 1/2) = 1/3.
4. Find the mean and variance of a random variable X that takes the values 0 and 1 with probabilities 0.4 and 0.6 respectively.
Solution:
    Mean = E[X] = Σₓ x P[X = x] = 0(0.4) + 1(0.6) = 0.6
    Variance = σ_X² = Σₓ (x − μ_X)² P[X = x] = (0 − 0.6)²(0.4) + (1 − 0.6)²(0.6) = 0.24
5. Find the covariance of X and Y if (a) X and Y are independent, (b) Y = aX + b.
Solution:
(a) Cov(X, Y) = E[XY] − E[X]E[Y]
Since X and Y are independent, E[XY] = E[X]E[Y], so Cov(X, Y) = 0.
(b) Y = aX + b
    E[XY] = E[X(aX + b)] = a E[X²] + b E[X]
    E[Y] = E[aX + b] = a E[X] + b
    Cov(X, Y) = a E[X²] + b E[X] − E[X](a E[X] + b)
              = a E[X²] + b E[X] − a E²[X] − b E[X]
    Cov(X, Y) = a (E[X²] − E²[X])
EC8491: Communication Theory Department of ECE
3.32 Communication Theory
6. Show that the random process X(t) = A cos(ω_c t + θ) is a wide-sense stationary process, where θ is a random variable uniformly distributed in the range (0, 2π) and A and ω_c are constants.
Solution: For a random process to be WSS, it is necessary to show that
 its mean is constant, and
 its autocorrelation function depends only on the time difference.
The PDF of the uniform distribution is given by
    f(θ) = 1/(2π),  0 ≤ θ ≤ 2π
(a) Mean
    E[X(t)] = ∫₀^{2π} X(t) (1/2π) dθ
            = ∫₀^{2π} A cos(ω_c t + θ) (1/2π) dθ
            = (A/2π) [sin(ω_c t + θ)]₀^{2π} = 0
Therefore, the mean is a constant.
(b) Autocorrelation
    R_XX(t + τ, t) = E[A cos(ω_c t + ω_c τ + θ) A cos(ω_c t + θ)]
                   = (A²/2) E[cos(ω_c τ) + cos(2ω_c t + ω_c τ + 2θ)]
                   = (A²/2) cos(ω_c τ) + (A²/2)(0)
                   = (A²/2) cos(ω_c τ)
So the autocorrelation function depends only on the time difference τ.
As the mean is constant and the autocorrelation function depends only on τ, X(t) is a wide-sense stationary process.
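The result of Example 6 can be sanity-checked by Monte Carlo. In the sketch below the amplitude, frequency, observation time, lag and trial count are all arbitrary demo values; the ensemble mean should come out near 0 and the correlation near (A²/2) cos(ωτ):

```python
import math, random

# Monte Carlo estimate of the ensemble mean and autocorrelation of
# X(t) = A cos(w*t + theta), with theta ~ Uniform(0, 2*pi).
random.seed(3)

A, w = 2.0, 1.0        # demo values
t, tau = 0.7, 0.5      # arbitrary observation time and lag
TRIALS = 20_000

mean_acc = corr_acc = 0.0
for _ in range(TRIALS):
    theta = random.uniform(0.0, 2.0 * math.pi)   # random phase
    x_t = A * math.cos(w * t + theta)
    x_t_tau = A * math.cos(w * (t + tau) + theta)
    mean_acc += x_t
    corr_acc += x_t * x_t_tau

ens_mean = mean_acc / TRIALS                 # theory: 0
ens_corr = corr_acc / TRIALS                 # theory: (A^2/2) cos(w*tau)
theory = (A * A / 2.0) * math.cos(w * tau)
```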
7. Check whether the random process X(t) = A cos(ω_c t + θ) is a wide-sense stationary process, where ω_c and θ are constants and A is a random variable.
Solution: For a random process to be WSS, it is necessary to show that
 its mean is constant, and
 its autocorrelation function depends only on the time difference.
(a) Mean
    E[X(t)] = E[A cos(ω_c t + θ)] = cos(ω_c t + θ) E[A]
which is a function of time t.
Therefore, the mean is not a constant.
(b) Autocorrelation
    R_XX(t + τ, t) = E[A cos(ω_c t + ω_c τ + θ) A cos(ω_c t + θ)]
                   = (1/2)[cos(ω_c τ) + cos(2ω_c t + ω_c τ + 2θ)] E[A²]
Thus, the autocorrelation of X(t) is not a function of the time difference τ only. So the given random process X(t) is not WSS.
8. Let X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt − A sin ωt, where A and B are independent random variables both having zero mean and variance σ², and ω is constant. Find the cross-correlation of X(t) and Y(t).
Solution: The cross-correlation of X(t) and Y(t) is
    R_XY(t₁, t₂) = E[X(t₁)Y(t₂)]
                 = E[(A cos ωt₁ + B sin ωt₁)(B cos ωt₂ − A sin ωt₂)]
                 = E[AB](cos ωt₁ cos ωt₂ − sin ωt₁ sin ωt₂) − E[A²] cos ωt₁ sin ωt₂ + E[B²] sin ωt₁ cos ωt₂
Since E[AB] = E[A]E[B] = 0 and E[A²] = E[B²] = σ²,
    R_XY(t₁, t₂) = σ² (sin ωt₁ cos ωt₂ − cos ωt₁ sin ωt₂)
                 = σ² sin ω(t₁ − t₂)
    R_XY(τ) = −σ² sin ωτ
where τ = t₂ − t₁.
EC8491: Communication Theory Department of ECE
3.34 Communication Theory
9. The input X(t) to a diode with transfer characteristic Y = X² is a zero-mean stationary Gaussian random process with autocorrelation function R_XX(τ) = exp(−|τ|). Find the mean μ_Y(t) and R_YY(t₁, t₂).
Solution:
    μ_Y = E[Y(t)] = E[X²(t)] = R_XX(0) = 1
    R_YY(t₁, t₂) = E[Y(t₁)Y(t₂)] = E[X²(t₁) X²(t₂)]
For zero-mean Gaussian random variables,
    E[X²(t₁) X²(t₂)] = E[X²(t₁)] E[X²(t₂)] + 2(E[X(t₁)X(t₂)])²
where E[X²(t₁)] = E[X²(t₂)] = R_XX(0) and, since X(t) is stationary, E[X(t₁)X(t₂)] = R_XX(|t₁ − t₂|). Therefore,
    R_YY(t₁, t₂) = [R_XX(0)]² + 2 [R_XX(|t₁ − t₂|)]²
or
    R_YY(τ) = [R_XX(0)]² + 2 [R_XX(τ)]² = 1 + 2 exp(−2|τ|)
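Example 9 rests on the Gaussian moment-factoring identity, which at τ = 0 gives R_YY(0) = 3[R_XX(0)]² = 3, i.e. E[X⁴] = 3 for a standard normal X. A quick Monte Carlo sketch (the sample size is an arbitrary demo choice) checks this fourth-moment identity:

```python
import random

# For a zero-mean Gaussian X with E[X^2] = 1, moment factoring gives
# E[X^4] = E[X^2]E[X^2] + 2(E[X^2])^2 = 3, i.e. R_YY(0) = 3.
random.seed(11)
N = 200_000

fourth_moment = sum(random.gauss(0.0, 1.0) ** 4 for _ in range(N)) / N
```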
10. A wide-sense stationary random process X(t) is applied to the input of an LTI system with impulse response h(t) = 3e⁻²ᵗ u(t). Calculate the mean of the output Y(t) of the system if E[X(t)] = 2.
Solution: The frequency response of the system can be obtained by taking the Fourier transform of the impulse response:
    H(ω) = F[h(t)] = 3/(2 + jω)
The mean value of the output Y(t) can be obtained as
    m_Y = m_X H(0)
    H(0) = 3 × (1/2) = 3/2
Therefore, the mean of the output Y(t) = 2 × (3/2) = 3.
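H(0) in Example 10 is simply the area under h(t). A small numerical sketch (the step size and truncation limit are arbitrary demo choices) reproduces H(0) = 3/2 and m_Y = 3:

```python
import math

# H(0) = integral of h(t) = 3 exp(-2t) u(t) over t >= 0; analytic value 3/2.
dt, T = 1e-4, 10.0               # demo step size and truncation limit
steps = int(T / dt)
H0 = sum(3.0 * math.exp(-2.0 * (k + 0.5) * dt) * dt for k in range(steps))

m_x = 2.0
m_y = m_x * H0                   # mean of the output process, theory: 3
```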
11. Let X and Y be real random variables with finite second moments. Prove the Cauchy-Schwarz inequality (E[XY])² ≤ E[X²]E[Y²]. (April/May 2015)
Solution: The mean-square value of a random variable can never be negative, so
    E[(X − aY)²] ≥ 0, for any value of a
Expanding the above equation, we get
    E[X²] − 2a E[XY] + a² E[Y²] ≥ 0
Substituting a = E[XY]/E[Y²], which makes the left-hand side of this inequality a minimum, we get
    E[X²] − (E[XY])²/E[Y²] ≥ 0
or
    (E[XY])² ≤ E[X²]E[Y²]
12. Let X(t) and Y(t) be both zero-mean and WSS random processes. Consider the random process Z(t) = X(t) + Y(t). Determine the autocorrelation and power spectrum of Z(t) if X(t) and Y(t) are jointly WSS. (Apr/May 2015)
Solution: The autocorrelation of Z(t) is given by
    R_ZZ(t₁, t₂) = E[Z(t₁)Z(t₂)]
                 = E[(X(t₁) + Y(t₁))(X(t₂) + Y(t₂))]
                 = E[X(t₁)X(t₂)] + E[X(t₁)Y(t₂)] + E[Y(t₁)X(t₂)] + E[Y(t₁)Y(t₂)]
                 = R_XX(t₁, t₂) + R_XY(t₁, t₂) + R_YX(t₁, t₂) + R_YY(t₁, t₂)
As X(t) and Y(t) are jointly WSS,
    R_ZZ(τ) = R_XX(τ) + R_XY(τ) + R_YX(τ) + R_YY(τ),  where τ = t₂ − t₁
We know that the Fourier transform of the autocorrelation function gives the power spectrum; taking the Fourier transform on both sides, we get
    S_ZZ(ω) = S_XX(ω) + S_XY(ω) + S_YX(ω) + S_YY(ω)
13. Let X(t) = A cos(ωt + θ) and Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform random variable over [0, 2π]. Find the cross-correlation of X(t) and Y(t). (Apr/May 2015)(May/June 2016)
Solution: The cross-correlation of X(t) and Y(t) is
    R_XY(t, t + τ) = E[X(t)Y(t + τ)]
                   = E[A² cos(ωt + θ) sin(ωt + ωτ + θ)]
                   = (A²/2) E[sin(2ωt + ωτ + 2θ) + sin(ωτ)]
                   = (A²/2) {E[sin(2ωt + ωτ + 2θ)] + E[sin(ωτ)]}
                   = (A²/2) {0 + sin(ωτ)}
    R_XY(t, t + τ) = R_XY(τ) = (A²/2) sin(ωτ)
14. In a binary communication system, let the probability of sending a 0 and a 1 be 0.3 and 0.7 respectively. Assume that when a 0 is transmitted, the probability of it being received as 1 is 0.01, and the probability of error for a transmission of 1 is 0.1.
(i) What is the probability that the output of this channel is 1?
(ii) If a 1 is received, then what is the probability that the input to the channel was 1? (Nov/Dec 2015)
Solution: Let X and Y denote the input and output of the channel. Given that
    P(X = 0) = 0.3, P(X = 1) = 0.7
    P(Y = 0 | X = 0) = 0.99, P(Y = 1 | X = 0) = 0.01
    P(Y = 0 | X = 1) = 0.1, P(Y = 1 | X = 1) = 0.9
(i) From the total probability theorem,
    P(Y = 1) = P(Y = 1 | X = 0) P(X = 0) + P(Y = 1 | X = 1) P(X = 1)
             = 0.01 × 0.3 + 0.9 × 0.7
    P(Y = 1) = 0.633
The probability that the output of this channel is 1 is 0.633.
(ii) Using Bayes' rule,
    P(X = 1 | Y = 1) = P(X = 1) P(Y = 1 | X = 1) / [P(X = 0) P(Y = 1 | X = 0) + P(X = 1) P(Y = 1 | X = 1)]
                     = (0.7 × 0.9) / (0.3 × 0.01 + 0.7 × 0.9)
    P(X = 1 | Y = 1) ≈ 0.9953
If a 1 is received, the probability that the input to the channel was 1 is 0.9953.
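The two answers in Example 14 follow directly from the total probability theorem and Bayes' rule; the snippet below simply re-traces the arithmetic:

```python
# Channel probabilities from the problem statement
p_x0, p_x1 = 0.3, 0.7
p_y1_given_x0 = 0.01   # a transmitted 0 is flipped to 1
p_y1_given_x1 = 0.9    # a transmitted 1 survives (error probability 0.1)

# (i) total probability theorem
p_y1 = p_y1_given_x0 * p_x0 + p_y1_given_x1 * p_x1
# (ii) Bayes' rule
p_x1_given_y1 = p_y1_given_x1 * p_x1 / p_y1
```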
15. Given a random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is a uniformly distributed random variable, show that X(t) is ergodic in both mean and autocorrelation. (May/June 2016)
Solution: For X(t) to be ergodic in mean and autocorrelation,
    ⟨X(t)⟩ = E[X(t)] = m_X
    ⟨X(t + τ)X(t)⟩ = E[X(t + τ)X(t)] = R_XX(τ)
Ensemble Average: Assume θ is uniformly distributed over (−π, π), so f(θ) = 1/(2π).
    E[X(t)] = ∫_{−π}^{π} A cos(ωt + θ) f(θ) dθ = (A/2π) ∫_{−π}^{π} cos(ωt + θ) dθ = 0
    R_XX(τ) = E[X(t + τ)X(t)] = (A²/2π) ∫_{−π}^{π} cos(ωt + ωτ + θ) cos(ωt + θ) dθ
    R_XX(τ) = (A²/2) cos(ωτ)
Time Average:
    ⟨X(t)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A cos(ωt + θ) dt = 0 = m_X
    ⟨X(t + τ)X(t)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A² cos(ωt + ωτ + θ) cos(ωt + θ) dt
                   = lim_{T→∞} (A²/T) ∫_{−T/2}^{T/2} (1/2)[cos(ωτ) + cos(2ωt + ωτ + 2θ)] dt
                   = (A²/2) cos(ωτ) = R_XX(τ)
Thus the time-averaged mean and autocorrelation are equal to the ensemble-averaged mean and autocorrelation. So the given process X(t) is ergodic in both mean and autocorrelation.
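The time averages in Example 15 can also be approximated numerically from a single sample function (one fixed phase θ); all numeric values below are arbitrary demo choices. The averages converge to 0 and (A²/2) cos(ωτ), matching the ensemble results:

```python
import math

# Time averages from ONE sample function of X(t) = A cos(w*t + theta).
A, w = 2.0, 1.0
theta = 0.8                  # one fixed phase picks out a single realization
tau = 0.5
dt = 0.01
T = 200.0 * math.pi          # about 100 carrier periods

n = int(T / dt)
time_mean = sum(A * math.cos(w * k * dt + theta) for k in range(n)) * dt / T
time_corr = sum(A * math.cos(w * (k * dt + tau) + theta) *
                A * math.cos(w * k * dt + theta) for k in range(n)) * dt / T

theory = (A * A / 2.0) * math.cos(w * tau)   # ensemble value from Example 15
```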
16. Consider two linear filters connected in cascade as shown in Fig. 1. Let X(t) be a stationary process with autocorrelation function R_X(τ); the random process appearing at the output of the first filter is V(t) and the second filter output is Y(t).
(a) Find the autocorrelation function of Y(t).
(b) Find the cross-correlation function R_VY(τ) of V(t) and Y(t). (Apr/May 2017)

Fig. 1 Cascade connection of two filters: X(t) → [h₁(t)] → V(t) → [h₂(t)] → Y(t)

Solution:
(a) The cascade connection of two filters is equivalent to a single filter with impulse response
    h(t) = ∫_{−∞}^{∞} h₁(τ) h₂(t − τ) dτ
The autocorrelation function of Y(t) is then given by
    R_Y(τ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(τ₁) h(τ₂) R_X(τ − τ₁ + τ₂) dτ₁ dτ₂
(b) The cross-correlation of V(t) and Y(t) is given by
    R_VY(τ) = E[V(t) Y(t + τ)]
The output Y(t + τ) is given as
    Y(t + τ) = ∫_{−∞}^{∞} h₂(τ₁) V(t + τ − τ₁) dτ₁
So,
    R_VY(τ) = E[ V(t) ∫_{−∞}^{∞} h₂(τ₁) V(t + τ − τ₁) dτ₁ ]
    R_VY(τ) = ∫_{−∞}^{∞} h₂(τ₁) R_V(τ − τ₁) dτ₁
17. The amplitude-modulated signal is defined as X_AM(t) = A m(t) cos(ω_c t + θ), where m(t) is the baseband signal and A cos(ω_c t + θ) is the carrier. The baseband signal m(t) is modeled as a zero-mean stationary random process with autocorrelation function R_xx(τ) and PSD G_x(f). The carrier amplitude A and frequency ω_c are assumed to be constant, and the initial carrier phase θ is assumed to be random, uniformly distributed in the interval (−π, π). Furthermore, m(t) and θ are assumed to be independent.
(i) Show that X_AM(t) is Wide Sense Stationary.
(ii) Find the PSD of X_AM(t). (Apr/May 2017)
Solution:
(i) For X_AM(t) to be WSS:
 its mean E[X_AM(t)] must be constant, and
 its autocorrelation E[X_AM(t) X_AM(t + τ)] must depend only on τ.
    E[X_AM(t)] = E[A m(t) cos(ω_c t + θ)] = A E[m(t)] E[cos(ω_c t + θ)]
Given that m(t) is a zero-mean stationary random process,
    E[X_AM(t)] = 0 (constant)
    R_XAM(τ) = E[X_AM(t) X_AM(t + τ)]
             = E[A m(t) cos(ω_c t + θ) A m(t + τ) cos(ω_c t + ω_c τ + θ)]
             = A² E[m(t) m(t + τ)] E[cos(ω_c t + θ) cos(ω_c t + ω_c τ + θ)]
             = (A²/2) R_xx(τ) E[cos(ω_c τ) + cos(2ω_c t + ω_c τ + 2θ)]
    R_XAM(τ) = (A²/2) R_xx(τ) cos(ω_c τ)
Since the mean of X_AM(t) is constant and the autocorrelation of X_AM(t) depends only on τ, X_AM(t) is Wide Sense Stationary.
(ii) We know that the Fourier transform of the autocorrelation function gives the power spectrum:
    F[R_XAM(τ)] = F[(A²/2) R_xx(τ) cos(ω_c τ)]
    G_XAM(f) = (A²/2) F[R_xx(τ)] * F[cos(ω_c τ)]
Given that the PSD of m(t) is G_x(f), and
    F[cos(ω_c τ)] = (1/2)[δ(f − f_c) + δ(f + f_c)]
using the frequency convolution theorem we get
    G_XAM(f) = (A²/4) G_x(f) * [δ(f − f_c) + δ(f + f_c)]
    G_XAM(f) = (A²/4) [G_x(f − f_c) + G_x(f + f_c)]
REVIEW QUESTIONS AND ANSWERS
PART-A
1. Define Random Variable. (Nov/Dec 2015)
A random variable is a function that assigns a real number X(s) to every element s ∈ S, where S is the sample space corresponding to a random experiment E.
Example: Tossing an unbiased coin twice. The outcomes of the experiment are HH, HT, TH, TT. Let X denote the number of heads turning up. Then X has the values 2, 1, 1, 0. Here, X is a random variable which assigns a real number to every outcome of a random experiment.
2. State Bayes' rule. (Nov/Dec 2015)
Bayes' rule or Bayes' theorem relates the conditional and marginal probabilities of stochastic events A and B:
    P(A | B) = P(B | A) P(A) / P(B)
where P(A|B) is the conditional probability of A given B, P(B|A) is the conditional probability of B given A, P(A) is the marginal probability of A and P(B) is the marginal probability of B.
3. Define discrete random variable.
If X is a random variable which can take a finite or countably infinite number of values, X is called a discrete RV. E.g., let X represent the sum of the numbers on two dice, when two dice are thrown.
4. Define continuous random variable.
If X is a random variable which can take all values (i.e., an infinite number of values) in an interval, then X is called a continuous RV. E.g., the time taken by a person who speaks over a telephone.
5. Define cumulative distribution function of a random variable.
The Cumulative Distribution Function (CDF) or distribution function of a random variable X is defined as F_X(x) = P{s : X(s) ≤ x}, which can be simply written as
    F_X(x) = P(X ≤ x)
6. List the properties of CDF.
1. 0 ≤ F_X(x) ≤ 1
2. F_X(x) is non-decreasing
3. lim_{x→−∞} F_X(x) = 0 and lim_{x→∞} F_X(x) = 1
4. F_X(x) is continuous from the right
5. P(a < X ≤ b) = F_X(b) − F_X(a)
6. P(X = a) = F_X(a) − F_X(a⁻)
7. Define probability density function of a random variable.
The Probability Density Function or PDF of a continuous random variable X is defined as the derivative of its CDF:
    f_X(x) = (d/dx) F_X(x)
8. List the properties of PDF.
1. f_X(x) ≥ 0
2. ∫_{−∞}^{∞} f_X(x) dx = 1
3. ∫_a^b f_X(x) dx = P(a < X ≤ b)
4. In general, P(X ∈ A) = ∫_A f_X(x) dx
5. F_X(x) = ∫_{−∞}^{x} f_X(u) du
9. Define mean of a random variable.
The mean of the random variable X is defined as:
    E[X] = ∫_{−∞}^{∞} x f_X(x) dx   (Continuous)
    E[X] = Σₓ x P[X = x]   (Discrete)
10. Define variance of a random variable.
The variance of the random variable X is defined as:
    σ_X² = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx   (Continuous)
    σ_X² = Σₓ (x − μ_X)² P[X = x]   (Discrete)
11. Define covariance of a random variable.
The covariance of two random variables X and Y is defined in terms of the expectation of the product of the two random variables:
    Cov(X, Y) = E[XY] − μ_X μ_Y
12. Define correlation coefficient.
The correlation coefficient of two random variables is defined as:
    ρ_XY = Cov(X, Y) / (σ_X σ_Y)
where the value of the correlation coefficient ranges from −1 to 1.
13. When are two random variables said to be uncorrelated?
Two random variables X and Y are said to be uncorrelated if their covariance is zero:
    Cov(X, Y) = 0
14. Give the PMF of a binomial random variable.
    P(X = i) = C(n, i) pⁱ (1 − p)ⁿ⁻ⁱ,  0 ≤ i ≤ n
15. Give the mean and variance of a binomial random variable.
The mean of a binomial random variable is given by μ = E(X) = np.
The variance of a binomial random variable is given by σ² = np(1 − p).
16. What is the importance of the binomial random variable in communication?
A binomial random variable can be used to model the total number of bits received in error when a sequence of n bits is transmitted over a channel with a bit-error probability of p.
17. Give the PDF of a uniform random variable.
This is a continuous random variable taking values between a and b with equal probabilities for intervals of equal length. The probability density function is given by:
    f_X(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise
18. What is the importance of the uniform random variable in communication?
The phase of a received sinusoidal carrier and quantization errors are usually modeled as uniform random variables.
19. Give the PDF of a Gaussian random variable.
The Gaussian or normal random variable is a continuous random variable described by the density function:
    f_X(x) = (1/√(2πσ²)) exp(−(x − m)²/(2σ²))
20. What is the significance of the Gaussian random variable?
 The Gaussian random variable is the most important and frequently encountered random variable in communication systems. The reason is that thermal noise, which is the major source of noise in communication, has a Gaussian distribution.
 In robotics, the Gaussian PDF is used to statistically characterize sensor measurements, robot locations and map representations.
21. List the properties of the Gaussian random variable.
 The sum of two independent Gaussian random variables is also a Gaussian random variable.
 The weighted sum of N independent Gaussian random variables is a Gaussian random variable.
 If two jointly Gaussian random variables have zero covariance (are uncorrelated), they are also independent.
22. State the Central Limit Theorem. (May/June 2016)(Nov/Dec 2016)
The central limit theorem states that the normalized distribution of the sum of independent, identically distributed random variables approaches a Gaussian distribution as the number of random variables increases, regardless of the individual distributions.
23. List the applications of the Central Limit Theorem.
 Signal processing
 Channel modelling
 Finance
 Population statistics
 Hypothesis testing
 Engineering research
24. Define Random Process.
A random process is defined as a rule which assigns a function of time to each outcome 's' of a random experiment.
25. Differentiate a random process from a random variable.
A random variable is a mapping of event outcomes to real numbers, whereas a random process is a mapping of event outcomes to signal waveforms. A random process is a function of time, but a random variable is not a function of time.
26. What are the types of random process?
Based on the continuous or discrete nature of the state space S and parameter set T, a random process can be classified into discrete random sequence, continuous random sequence, discrete random process and continuous random process.
Based on stationarity, a random process can be classified into stationary and non-stationary random processes.
27. Define stationary random process.
A random process whose statistical characteristics do not change with time is classified as a stationary random process or stationary process.
28. What is strict-sense stationarity?
The random process X(t) is said to be Stationary in the Strict Sense (SSS) or strictly stationary if its statistics are invariant to a shift of origin:
    F_{X(t₁),...,X(t_k)}(x₁, ..., x_k) = F_{X(t₁+τ),...,X(t_k+τ)}(x₁, ..., x_k)
29. Define mean of the random process.
The mean of a process X(t) is defined as the expectation of the random variable obtained by observing the process at some time t:
    m_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx
30. Define autocorrelation of the random process. (May/June 2016)
The autocorrelation of the process X(t) is given by the expectation of the product of two random variables X(t₁) and X(t₂) obtained by observing the process X(t) at times t₁ and t₂ respectively:
    R_XX(t₁, t₂) = E[X(t₁)X(t₂)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x₁ x₂ f_{X(t₁),X(t₂)}(x₁, x₂) dx₁ dx₂
31. List the properties of the autocorrelation function.
 The mean-square value of a process is equal to the value of the autocorrelation at τ = 0:
    R_XX(0) = E[X²(t)]
 The autocorrelation function is an even function of τ:
    R_XX(τ) = R_XX(−τ)
 The autocorrelation function is maximum at τ = 0:
    |R_XX(τ)| ≤ R_XX(0)
32. Define autocovariance of the random process.
The autocovariance function of X(t) is defined as:
    C_XX(t₁, t₂) = R_XX(t₂ − t₁) − m_X²
33. Define white noise process.
A random process X(t) is said to be a white noise process if it is zero mean and its power spectral density is
    S_X(f) = N₀/2, for all frequencies
34. Draw the PSD and autocorrelation function of white noise.
(The PSD is a constant N₀/2 for all frequencies, and the autocorrelation function is the impulse (N₀/2) δ(τ).)
35. When is a random process said to be a white Gaussian noise process?
The random process X(t) is said to be a white Gaussian noise process if X(t) is a stationary Gaussian random process with zero mean and flat power spectral density.
36. What is Wide-Sense Stationary? (Apr/May 2017)
A random process is called wide-sense stationary (WSS) if:
 its mean is constant, and
 its autocorrelation depends only on the time difference.
37. Define Power Spectral Density of the random process.
The distribution of the power of the random process over the different frequencies is the Power Spectral Density or Power Spectrum of the random process.
38. Give the power spectral density equation of a random process X.
    S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) exp(−j2πfτ) dτ
39. List the properties of power spectral density.
 The zero-frequency value of the power spectral density of a stationary process equals the total area under the graph of the autocorrelation function.
 The mean-square value of a stationary process equals the total area under the graph of the power spectral density.
40. What is ergodicity?
A random process is said to be ergodic if time averages are the same for all sample functions and equal to the corresponding ensemble averages.
41. When a process is ergodic in mean?
A stationaryprocessis called ergodic inthe meanif:
T
lim (T)
x x
m m


T
lim var[ (T)] 0
x
m
 

42. When a process is ergodic in autocorrelation?
Astationaryprocessis calledergodic intheautocorrelationfunctionif,
43. Write Einstein-Wiener-Khintchine relations. (Nov/Dec 2016) (Apr/May 2017)
XX XX
S ( ) R ( )exp( 2 )
f j f d


     

X X XX
R ( ) S ( )exp( 2 )
f j f df


   

44. Give the importance ofWiener-Khintchine relations.
For a stationary process, the power spectral density can be obtained from the Fourier
transformofthe autocorrelationfunction.
45. What are the advantages of Gaussian process?
 Gaussian process has manyproperties that make analytic results possible.
 Randomprocesses produced byphysicalphenomena are oftensuch that a
Gaussian modelis appropriate. Further the use ofGaussian modelis confirmed
byexperiments.
46. List the properties of Gaussian process.
 If a Gaussian process is wide-sense stationary, then the process is also strictly stationary.
 If a Gaussian process X(t) is passed through an LTI filter, then the random process Y(t) at the output of the filter is also a Gaussian process.
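The LTI-filtering property can be checked numerically (a sketch under illustrative assumptions: white Gaussian input and an arbitrary FIR filter chosen here for demonstration). For white Gaussian input of variance σ², the output is Gaussian with variance σ² Σ h[k]²; a near-zero excess kurtosis serves as a rough indicator that the output is still Gaussian:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 1.5
x = rng.normal(0.0, sigma, 2_000_000)      # white Gaussian input samples
h = np.array([0.5, 1.0, 0.25, -0.3])       # arbitrary FIR filter (illustrative)
y = np.convolve(x, h, mode="valid")        # LTI filtering

# Output variance should equal sigma^2 * sum(h^2) for white Gaussian input.
var_expected = sigma**2 * np.sum(h**2)
assert abs(y.var() - var_expected) / var_expected < 0.01

# Rough Gaussianity check: the excess kurtosis of a Gaussian is 0.
kurt = np.mean((y - y.mean())**4) / y.var()**2 - 3.0
assert abs(kurt) < 0.1
print("filtered process behaves like a Gaussian process")
```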
PART – B
1. Let X and Y be real random variables with finite second moments. Prove the Cauchy-Schwarz inequality (E[XY])² ≤ E[X²] E[Y²]. (Apr/May 2015)
2. Differentiate an SSS process from a WSS process. (Apr/May 2015)
3. Let X(t) and Y(t) both be zero-mean WSS random processes. Consider the random process Z(t) = X(t) + Y(t). Determine the autocorrelation and power spectrum of Z(t) if X(t) and Y(t) are jointly WSS. (Apr/May 2015)
4. Let X(t) Acos( )
t
   andY(t) Asin( )
t
   whereAand are constants and  is
a uniformrandomvariable [0,2]. Find the cross correlation ofX(t) andY(t).
(Apr/May 2015)(May/June 2016)
5. In a binary communication system, let the probabilities of sending a 0 and a 1 be 0.3 and 0.7 respectively. Assume that when a 0 is transmitted, the probability of it being received as a 1 is 0.01, and the probability of error for a transmission of a 1 is 0.1.
(i) What is the probability that the output of this channel is 1?
(ii) If a 1 is received, what is the probability that the input to the channel was 1?
(Nov/Dec 2015)
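One way to sanity-check the numbers in problem 5 (a worked sketch, not part of the original text) is to apply the law of total probability and Bayes' rule directly:

```python
# Sanity check of problem 5 using total probability and Bayes' rule.
p0, p1 = 0.3, 0.7          # priors on sending 0 and 1
p_1_given_0 = 0.01         # 0 sent, received as 1
p_err_given_1 = 0.1        # 1 sent, received as 0

# (i) P(output = 1) = P(1|0)P(0) + P(1|1)P(1)
p_out1 = p_1_given_0 * p0 + (1 - p_err_given_1) * p1
assert abs(p_out1 - 0.633) < 1e-9

# (ii) P(input = 1 | output = 1) by Bayes' rule
p_in1_given_out1 = (1 - p_err_given_1) * p1 / p_out1
assert abs(p_in1_given_out1 - 0.63 / 0.633) < 1e-9
print(round(p_out1, 3), round(p_in1_given_out1, 4))
```

So the channel output is 1 with probability 0.633, and given a received 1 the input was 1 with probability 0.63/0.633 ≈ 0.9953.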
6. What are CDF and PDF? State their properties. Also discuss them in detail by giving examples of CDF and PDF for different types of random variables. (Nov/Dec 2015)
7. Explain in detail the transmission of a random process through a linear time-invariant filter. (May/June 2016)(Nov/Dec 2016)
8. When is a random process said to be strict-sense stationary (SSS), wide-sense stationary (WSS) and ergodic? (May/June 2016)(Nov/Dec 2016)
9. Given a random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is a uniformly distributed random variable, show that X(t) is ergodic in both mean and autocorrelation. (May/June 2016)
10. Define the following: Mean, Correlation, Covariance and Ergodicity. (Nov/Dec 2016)
11. What is a Gaussian random process? Mention its properties. (Nov/Dec 2016)
12. Consider two linear filters connected in cascade as shown in Fig. 1. Let X(t) be a stationary process with autocorrelation function R_X(τ); the random process appearing at the first filter output is V(t) and the second filter output is Y(t).
(a) Find the autocorrelation function of Y(t).
(b) Find the cross-correlation function R_VY(τ) of V(t) and Y(t). (Apr/May 2017)
Fig. 1
13. The amplitude-modulated signal is defined as X_AM(t) = A m(t) cos(ω_c t + θ), where m(t) is the baseband signal and A cos(ω_c t + θ) is the carrier. The baseband signal m(t) is modeled as a zero-mean stationary random process with autocorrelation function R_xx(τ) and PSD G_x(f). The carrier amplitude A and frequency ω_c are assumed to be constant, and the initial carrier phase θ is assumed to be random, uniformly distributed in the interval (−π, π). Furthermore, m(t) and θ are assumed to be independent.
(i) Show that X_AM(t) is wide-sense stationary.
(ii) Find the PSD of X_AM(t). (Apr/May 2017)
EC8491: Communication Theory Department of ECE

More Related Content

PPT
Multivibrators
PPTX
EC6651 COMMUNICATION ENGINEERING UNIT 1
PPTX
Pass Transistor Logic
PPT
OPERATIONS ON SIGNALS
PDF
PYTHON PROGRAMMING NOTES RKREDDY.pdf
PPTX
Multivibrator
PPTX
Diabetes Mellitus
PPTX
Hypertension
Multivibrators
EC6651 COMMUNICATION ENGINEERING UNIT 1
Pass Transistor Logic
OPERATIONS ON SIGNALS
PYTHON PROGRAMMING NOTES RKREDDY.pdf
Multivibrator
Diabetes Mellitus
Hypertension

What's hot (20)

PPT
Digital Communication: Information Theory
PDF
Frequency Modulation and Demodulation
PPT
NOISE IN Analog Communication Part-1.ppt
PDF
Antenna &amp; wave lab manual
PPTX
signal and channel bandwidth
PPTX
Isolator & Circulator -FT.pptx
PPT
Digital Communication Pulse code modulation.ppt
PPTX
Radio receiver characteristics
PPT
Matched filter
PPTX
Digital modulation technique
PDF
Nyquist criterion for zero ISI
PPTX
Frequency division multiplexing (FDM).pptx
PDF
1.Basics of Signals
PPTX
Information Theory and coding - Lecture 2
PPTX
M ary psk modulation
PDF
Capítulo II - Microondas - Análisis de redes de microondas
PDF
Digital communication unit 1
PPT
Fourier transform
PPT
Chapter 5
PDF
Cellular Network -Ground Reflectio (Two Ray) Model.pdf
Digital Communication: Information Theory
Frequency Modulation and Demodulation
NOISE IN Analog Communication Part-1.ppt
Antenna &amp; wave lab manual
signal and channel bandwidth
Isolator & Circulator -FT.pptx
Digital Communication Pulse code modulation.ppt
Radio receiver characteristics
Matched filter
Digital modulation technique
Nyquist criterion for zero ISI
Frequency division multiplexing (FDM).pptx
1.Basics of Signals
Information Theory and coding - Lecture 2
M ary psk modulation
Capítulo II - Microondas - Análisis de redes de microondas
Digital communication unit 1
Fourier transform
Chapter 5
Cellular Network -Ground Reflectio (Two Ray) Model.pdf
Ad

Similar to Communication Theory - Random Process.pdf (20)

PPTX
Econometrics 2.pptx
PPTX
this materials is useful for the students who studying masters level in elect...
PPTX
Chapter-4 combined.pptx
PPTX
Discussion about random variable ad its characterization
PPTX
Chapter_09_ParameterEstimation.pptx
PPTX
Random Variables and Probability Distributions
PPT
Probability and random process Chapter two-Random_Variables.ppt
PDF
U unit7 ssb
PDF
Random variables
PPTX
Lesson5 chapterfive Random Variable.pptx
PPT
Doe02 statistics
DOCX
stochastic notes
PDF
Lec9_Estimation for engineering student.pdf
PDF
Welcome to International Journal of Engineering Research and Development (IJERD)
PPTX
CHAPTER I- Part 1.pptx
PDF
Estimation theory 1
PPTX
ISM_Session_5 _ 23rd and 24th December.pptx
PPT
Marketing management planning on it is a
PDF
Probability Formula sheet
Econometrics 2.pptx
this materials is useful for the students who studying masters level in elect...
Chapter-4 combined.pptx
Discussion about random variable ad its characterization
Chapter_09_ParameterEstimation.pptx
Random Variables and Probability Distributions
Probability and random process Chapter two-Random_Variables.ppt
U unit7 ssb
Random variables
Lesson5 chapterfive Random Variable.pptx
Doe02 statistics
stochastic notes
Lec9_Estimation for engineering student.pdf
Welcome to International Journal of Engineering Research and Development (IJERD)
CHAPTER I- Part 1.pptx
Estimation theory 1
ISM_Session_5 _ 23rd and 24th December.pptx
Marketing management planning on it is a
Probability Formula sheet
Ad

More from RajaSekaran923497 (6)

PDF
Communication Theory - Noise Characterization.pdf
PDF
Communication Theory - Angle Modulation .pdf
PDF
Communication Theory - Amplitude Modulation.pdf
PDF
EC8351-Electronic Circuits-I study material.pdf
PDF
EC8252-Electronic Devices Question Bank.pdf
PDF
Mechatronics study material-Question Bank.pdf
Communication Theory - Noise Characterization.pdf
Communication Theory - Angle Modulation .pdf
Communication Theory - Amplitude Modulation.pdf
EC8351-Electronic Circuits-I study material.pdf
EC8252-Electronic Devices Question Bank.pdf
Mechatronics study material-Question Bank.pdf

Recently uploaded (20)

PDF
July 2025 - Top 10 Read Articles in International Journal of Software Enginee...
PDF
Embodied AI: Ushering in the Next Era of Intelligent Systems
PDF
Well-logging-methods_new................
PDF
Model Code of Practice - Construction Work - 21102022 .pdf
PPTX
UNIT-1 - COAL BASED THERMAL POWER PLANTS
DOCX
573137875-Attendance-Management-System-original
PPT
Introduction, IoT Design Methodology, Case Study on IoT System for Weather Mo...
PDF
R24 SURVEYING LAB MANUAL for civil enggi
PDF
SM_6th-Sem__Cse_Internet-of-Things.pdf IOT
PDF
Enhancing Cyber Defense Against Zero-Day Attacks using Ensemble Neural Networks
PPTX
web development for engineering and engineering
PPTX
CH1 Production IntroductoryConcepts.pptx
PPTX
CARTOGRAPHY AND GEOINFORMATION VISUALIZATION chapter1 NPTE (2).pptx
PPTX
Infosys Presentation by1.Riyan Bagwan 2.Samadhan Naiknavare 3.Gaurav Shinde 4...
PPTX
Artificial Intelligence
PDF
Evaluating the Democratization of the Turkish Armed Forces from a Normative P...
PPTX
M Tech Sem 1 Civil Engineering Environmental Sciences.pptx
PPTX
Foundation to blockchain - A guide to Blockchain Tech
PDF
Digital Logic Computer Design lecture notes
PPTX
additive manufacturing of ss316l using mig welding
July 2025 - Top 10 Read Articles in International Journal of Software Enginee...
Embodied AI: Ushering in the Next Era of Intelligent Systems
Well-logging-methods_new................
Model Code of Practice - Construction Work - 21102022 .pdf
UNIT-1 - COAL BASED THERMAL POWER PLANTS
573137875-Attendance-Management-System-original
Introduction, IoT Design Methodology, Case Study on IoT System for Weather Mo...
R24 SURVEYING LAB MANUAL for civil enggi
SM_6th-Sem__Cse_Internet-of-Things.pdf IOT
Enhancing Cyber Defense Against Zero-Day Attacks using Ensemble Neural Networks
web development for engineering and engineering
CH1 Production IntroductoryConcepts.pptx
CARTOGRAPHY AND GEOINFORMATION VISUALIZATION chapter1 NPTE (2).pptx
Infosys Presentation by1.Riyan Bagwan 2.Samadhan Naiknavare 3.Gaurav Shinde 4...
Artificial Intelligence
Evaluating the Democratization of the Turkish Armed Forces from a Normative P...
M Tech Sem 1 Civil Engineering Environmental Sciences.pptx
Foundation to blockchain - A guide to Blockchain Tech
Digital Logic Computer Design lecture notes
additive manufacturing of ss316l using mig welding

Communication Theory - Random Process.pdf

  • 1. 3.2 Communication Theory CHAPTERIII RANDOM PROCESS 3.1 INTRODUCTION Randomsignalsareencountered ineverypracticalcommunicationsystem. Randomsignals, unlike deterministic signals are unpredictable. For example, the received signalin the receiver consists ofmessagesignalfromthe source, noise and interference introduced inthe channel.All these signalsare randominnature, i.e. these signals areuncertainor unpredictable. Thesekind of randomsignals cannot be analyzed using Fourier transform, as theycandealonlydeterministic signals. The branch ofmathematics whichdeals withthe statisticalcharacterization ofrandom signals isprobabilitytheory. 3.1.1 Basic Terms in Probability Theory  Experiment:Anyactivitywith an observable result is called as experiment. Examples are rolling a die, tossing a coin or choosing a card.  Outcome: The result ofan experiment iscalled as outcome.  Sample space: The set ofall possible outcomes ofan experiment is knownas the sample space ofthe experiment and is denoted byS. For example, in tossing a coin experiment, the sample space S  {Head, Tail}  Event:Anysubset ofthe sample space is knownas anevent. For example, intossing a coin experiment, E  {Head} is the event that a head appears (or) E  {Tail} is the event that a tailappears.  Independent event: Two events are said to be independent, when the occurrence ofone event is not affected bythe occurrence ofother event. IfAand B are independent events, then P(A B)  P(A) * P(B)  Mutually exclusive events:Two events aresaid to bemutuallyexclusive ordisjoint events, whentheycannot occursimultaneously(Noneofthe outcomes arecommon). IfAandB are mutuallyexclusiveevents, then P(A B)=0  EC8491: Communication Theory Department of ECE
  • 2. Random Process 3.3  Probability: It is the measure of possibility or chance that an event will occur. It is defined as: Numberof desirableoutcome E Probability= = Totalnumberof possibleoutcomes S For example, in cointossing experiment the probabilityofgetting “Head” is 1/2.  Axioms ofProbability: (a) Probability ofsample space is 1, i.e. P(S)=1 (b) Probabilityofanevent P(E) is nonnegative realnumber ranges from0 to 1. 0 P(E) 1   (c) Probabilityofthemutuallyexclusiveeventsisequalto sumoftheir individualprobability. P(A B)=P(A)+P(B)  IfAand B are non-mutuallyexclusive events, then  P(A B)= P(A)+ P(B) P(A B)    Conditional Probability: Theconditionalprobabilityofanevent B is the probabilitythat the event willoccur given that an eventAhasalreadyoccurred. It isdefined as, P(A B) P(B/ A)= P(A)  Similarly, the conditionalprobabilityofeventAgiventhat anevent Bhas alreadyoccurred is given by, P(A B) P(A / B)= P(B)  If the event A and B are independent, then P(A B)=P(A)*P(B)  , then the Conditionalprobability of B givenA is simplyP(B) as, P(B/A)  [ P(A) * P(B)] / P(A)  P(B) Similarly, conditionalprobabilityofAgiven B issimplyP(A) as, P(A/B)  [ P(A) * P(B)] / P(B)  P(A)  Law of Total Probability: If the events A1 ,A2 ,….Ak be mutually exclusive, then the probabilityofother event B is givenby,  =1 P(B)= P(B/A )P(A ) k i i i EC8491: Communication Theory Department of ECE
  • 3. 3.4 Communication Theory  Baye’s Theorem: Baye’s theorem or Baye’s rule describes the probability of an event based onthe prior knowledge ofthe related event. It is defined as:  P(B/A )P(A ) P(B/A )P(A ) P(A / B)= = P(B) P(B/A )P(A ) i i i i i k i i i =1 where, P(Ai )  Prior probabilityor MarginalprobabilityofA, as it does not takes into account anyinformationabout B. P(B)  Prior probabilityor marginalprobabilityofB,as it does not takes into account anyinformation aboutA. P(Ai /B)  ConditionalprobabilityofAgiven B. It is also called as posterior probability, as it depends on the value ofB P(B/Ai )  ConditionalprobabilityofB givenA 3.2 RANDOM VARIABLES Theoutcomesofarandomexperimentarenottheconvenientrepresentationformathematical analysis. Forexample, ‘Head’and ‘Tail’intossing the coinexperiment.It willbeconvenient, ifwe assign a number or a range ofvalues to the outcomes ofa randomexperiment. For example, a ‘Head’corresponds to 1 and a ‘Tail’corresponds to 0. A random variable is defined as a process of mapping the sample space Q to the set of real numbers. In other words, a random variable is an assignment of real numbers to the outcomes of a random experiment as shown in Figure 3.1.Randomvariables are denoted bycapitalletters, i.e., X, Y, andso on, and individualvaluesofthe randomvariableX are X(w). Figure 3.1 Random variable  Mapping from Q to R Random Experiment Sample Space Q 2 1 3 4 R X( ) 2 X( ) 1 X( ) 3 X( ) 4 Real Numbers EC8491: Communication Theory Department of ECE
  • 4. Random Process 3.5 For example, consideran experiment oftossingthree fair coins. Let, Xdenote the number of “Heads” that appear, then X is a random variable taking on one ofthe values 0,1,2,3 with respective probabilities: P{X  0}  P{(T, T, T)}  1 / 8 P{X  1}  P{(T, T, H), (T, H, T), (H, T, T)}  3/8 P{X  2}  P{(T, H, H), (H, T, H), (H, H, T)}  3/8 P{X  3}  P{(H, H, H)}  1/8 3.2.1 Classification of Random Variables Randomvariables are classified into continuous anddiscrete randomvariables.  The values ofcontinuous random variable are continuous in a givencontinuous sample space. A continuous sample space has infinite range of values. The discrete value of a continuous randomvariableis a value at one instant oftime. For example theTemperature, T at some area is a continuous randomvariable that always exists in the rangesay, fromT1 and T2 . Examples of continuous random variable  Uniform, Exponential, Gamma and Normal.  The values of a discrete random variable are only the discrete values in a given sample space. The samplespace for a discreterandomvariable canbe continuous, discrete or even both continuous and discrete points. They maybe also finite or infinite. For example the “Wheelofchance”has the continuous samplespace. Ifwedefine adiscrete randomvariable n asinteger numbers from0to 12, thenthediscrete randomvariableisX = {0,1,3,4…..12}. Examples of discrete random variable – Bernoulli, Binomial, Geometric and Poisson. 3.2.2 Distribution and Density Function The Cumulative DistributionFunction (CDF) ofa randomvariable X is defined as, F ( )=P{ω Ω:X( ) } x x w x   which can be simplywritten as, F P X x (x)= ( x)  EC8491: Communication Theory Department of ECE
  • 5. 3.6 Communication Theory Properties of CDF:  X F ( ) x is a non-decreasing function of x.  Since CDFis a probability, it ranges from0 to 1,i.e., X 0 F ( ) 1 x    X lim F ( ) 0 x x   and X lim F ( ) 1 x x    X F ( ) x is continuousfromthe right  X X P( X ) F ( ) F ( ) a b b a      X X P(X ) F ( ) F ( ) a a a    The Probability DensityFunction (PDF), ofa continuous randomvariableX is defined as the derivative ofits CDF. It is denoted by: i.e., X X ( )= F ( ). d f x x dx Properties of PDF:  X( ) 0 f x   X ( ) 1 f x dx       X ( ) P( X ) b a f x dx a b      X P(X A) ( ) A f x dx     X X F ( ) ( ) . x x f u du    The Probability Mass Function (PMF), ofa discrete randomvariable X is defined as px (x), where ( ) P(X ) x p x = = x Properties of PMF:  0 ( ) 1, 1,2,..... x i p x i     ( ) 0, if ( 1,2,.....) x i p x x x i     ( ) 1. i x i p x   EC8491: Communication Theory Department of ECE
  • 6. Random Process 3.7 3.2.3 Statistical Averages of Random Variables Eventhoughthedistributionfunctionprovidesacompletedescriptionoftherandomvariable, some other statisticalaverages suchasmean and variance are also used to describe the random variable indetail. Mean or Expectation of the random variable:The mean ofthe continuous randomvariable X with a densityfunction offX (x) is defined as:     X E X = ( ) x f x dx   The meanofthe discrete randomvariable X is definedas the weighted sumofthe possible outcomes givenas:     X μ =E X = P X= x x x  For example, ifX is considered as a randomvariable representing the observations ofthe voltage of a randomsignal, then the mean value represents the average voltage or dc offset of the signal. Variance of the RandomVariable: It provides an estimate of the spread ofthe distribution about the mean.The variance ofthe continuous randomvariablesis defined as:    2 2 X σ ( ) x x = x μ f (x)dx   The variance ofthe discrete randomvariables is given by the expectationof the squared distance ofeachoutcome fromthe meanvalue. It isdefined as: 2 2 Var(X) E[(X ) ] x x     We know that, expectation of a variable is   X E X P[X ] x x   Therefore, the varianceis givenby,  2 2 X σ = ( μ ) P[X= ] x x x x  For example, if X is considered as a random variable representing observations of the voltage of a random signal, then the variance represents the AC power of the signal. The second moment of X, E[X2 ] is also called the mean-square value of the random signal and it represents the total power of the signal. EC8491: Communication Theory Department of ECE
  • 7. 3.8 Communication Theory Covarianceofthe RandomVariable: Incommunicationsystem, itisveryimportantto compare two signals in order to extract the information. For example, RADAR system compares the transmitted signalto the target withthe received(reflected) signalfromthe target to measure the parameters like range, angle and speed of the object. Correlation and covariance are mainly used for this comparison purpose. The covariance of two random variables X and Y is defined as the expectation of the product of the two randomvariables given by:   X Y Cov(X,Y)=E[(X )(Y )] m m Expanding and simplifyingabove equationwe get:  X Y Cov(X,Y)=E[XY] μ μ If Cov(X,Y) = 0, then X and Yare uncorrelated, i.e.,   X Y E XY =μ μ =E[X]E[Y] If X and Y are independent, then Cov(X, Y) = 0, i.e., X and Y are uncorrelated but the converse is not true. The correlationcoefficient oftwo randomvariable is defined as, XY XY X Y σ ρ = σ σ where, the correlationcoefficient ranges from[1, 1]. Some ofthe important randomvariables used incommunication systems are, (a) Binomial RandomVariable: This is a discrete randomvariable. Suppose there are ‘n’independent trials, eachofwhich results in a success and failure with probability p and 1  p respectively. If X represents the number ofsuccesses that occur in thentrials, thenX issaid to be binomialrandomvariable with parameters (n, p). The probability mass function of a binomialrandomvariable with parameters n and p is givenby,         P(X= )= (1 ) , 0 i n i n i p p i n i   The meanand variance ofbinomialrandomvariable is givenby, E(X) μ= =np  2 (1 ) σ = np p EC8491: Communication Theory Department of ECE
  • 8. Random Process 3.9 Application: BinomialRandomVariable canbe used to modelthe totalnumber ofbits received in error when sequence ofn bits is transmitted over a channelwitha bit-error probabilityofpas shown in Figure 3.2. Figure 3.2 PMF for Binomial Random Variable (b) UniformRandomVariable: This is a continuous random variable that takes values between a and b with equal probabilities for intervalsofequallength. Theprobabilitydensityfunctionis showninFigure 3.3 defined as:       X 1 0 Otherwise , a< x<b b a f (x)= , The cumulative distributionfunctionis showninFigure3.4 defined as:          X 0 F , x<a x a (x)= , a x b b a 1, x>b   0.3 0.25 0.2 0.15 0.1 0.05 0 0 2 4 6 8 10 x EC8491: Communication Theory Department of ECE
  • 9. 3.10 Communication Theory Figure 3.3 PDF for uniform random variable Figure 3.4 CDF for uniform random variable Application: Uniformrandomvariable is used to modelcontinuous random variables, whose range is known, but other informationlike likelihoodofthe values that the randomvariable canassume is unknown. For example, the phase of a received sinusoid carrier is usually modelled as a uniformrandomvariablebetween0 and 2 .Quantization Errorsare also modelled as uniform randomvariable. (c) Gaussian orNormal Random Variable: The Gaussianor NormalRandomVariable is a continuous randomvariable described by the probabilitydensityfunctionas:   X 1 2 2 2 (x m) 2σ f (x)= e π σ f (x) X 0 a b x FX(x) 1.0 0 a b x 1 b a  EC8491: Communication Theory Department of ECE
  • 10. Random Process 3.11 The PDFofGaussianrandomvariableis a bellshaped that is symmetric about the meanm and attains the maximum value of 1/ 2π σ at x = m, as shownin Figure 3.5. Figure 3.5 PDF for Gaussian Random Variable Theparameter m is called the Mean and can assume anyfinite value. The parameter is calledthe Standard Deviation and can assume anyfinite and positive value. The square ofthe standard deviation, i.e., 2  is the Variance. A Gaussian random variable with mean m and variance 2 is denoted by 2 N( , ) m  . The random variable N(0, 1) is usuallycalled standard normal. Properties of Gaussian Random Variable:  It iscompletelycharacterized byits meanand variance.  The sumof two independent Gaussian random variables is also a Gaussian random variable.  TheweightedsumofNindependentGaussianrandomvariables isaGaussianrandom variable.  Iftwo Gaussian randomvariables havezero covariance (uncorrelated), theyare also independent.  AGaussianrandomvariable plus aconstant isanotherGaussianrandomvariablewith the mean adjusted bythe constant.  A Gaussian random variable multiplied bya constant is another Gaussian random variable where both the mean and variance are affected bythe constant. 0 m x fX(x) 2 1 2 EC8491: Communication Theory Department of ECE
  • 11. 3.12 Communication Theory Applications:  TheGaussianrandomvariableisthemostimportantandfrequentlyencounteredrandom variable in communication systems. The reasonis that thermal noise, which is the major source of noise in communication, has a Gaussian distribution.  InRobotics, GaussianPDFis used to statisticallycharacterize sensor measurements, robot locations and map representations. 3.3 CENTRAL LIMIT THEOREM Animportant resultinprobabilitytheorythatiscloselyrelatedto theGaussiandistribution is the Central Limit Theorem (or) Law of large numbers. Let X1 ,X2 ,X3 ,…..,Xn be a set of randomvariableswiththefollowingproperties:  The Xk withk = 1,2,…nare statisticallyindependent.  The Xk allhave the same probabilitydensityfunction.  Both the meanand the variance exist for each Xk . We do not assume that the densityfunctionofthe Xk is Gaussian. Let Ybe a new random variable defined as:  =1 Y= X n k k Then, according to the centrallimit theorem, thenormalized randomvariable, Y Y E[Y] Z= σ  approaches a Gaussian randomvariable withzero mean and unit variance as the number ofthe random variables X1 , X2 , X3 ,….., Xn increases without limit. That is, as n becomes large,the distributionofZapproaches that ofazero-meanGaussianrandomvariable withunit variance, as shownby: 2 Z 1 F ( ) exp 2 2 z s z ds                This is a mathematicalstatement ofthe centrallimit theorem. In words, the normalized distribution of the sum of independent, identically distributed random variables EC8491: Communication Theory Department of ECE
  • 12. Random Process 3.13 approaches a Gaussian distribution as the number of random variables increases, regardless of the individual distributions. Thus, Gaussian random variables are common because theycharacterize the asymptotic propertiesofmanyother types ofrandomvariables. When n is finite, the Gaussianapproximation ismost accurate in the centralportion ofthe densityfunction(hencethenamecentrallimit)andlessaccurateinthe“tails”ofthedensityfunction. Applications of Central Limit Theorem:  ChannelModelling  Finance  PopulationStatistics  HypothesisTesting  EngineeringResearch 3.4 Random Process A random process or stochastic process is the natural extension of random variables when dealing with signals. Inanalyzing communication systems, we basicallydealwithtime- varying signals.So far, theassumptionis that, allthesignals are deterministic. Inmanysituations, it is more appropriate to modelsignals as random rather than deterministic functions. One such example is the case ofthermalnoise inelectronic circuits. This type ofnoise is due to the randommovement ofelectrons as a result ofthermalagitation, therefore, theresulting current and voltage can onlybe described statistically. Another situation where modeling by randomprocessesprovesusefulisinthecharacterizationofinformationsources.Aninformation source, such as a speechsource, generatestime-varying signals whose contents are not known in advance. Otherwise there would be no need to transmit them. Therefore, randomprocesses also providea naturalwayto modelinformationsources. A random process is a collection (or ensemble) of random variables   X( ,ω) t that are functions of a real variable, namely time t where ω S  (Sample space) and T t (Parameter set or Index set).The set ofpossible values ofanyindividualmember ofthe random processiscalledstatespace.Anyindividualmemberitselfiscalledasamplefunctionorarealization of the process. 
Arandom process can be viewed as a mapping ofsample space S to the set of signalwaveforms asshown inFigure 3.6. EC8491: Communication Theory Department of ECE
  • 13. 3.14 Communication Theory Figure 3.6 Random Process – Mapping of Sample Space to Signal Waveform Therealizationofone fromtheset ofpossible signalsis governed bysomeprobabilisticlaw. This is similar to the definition ofrandomvariables where one from a set ofpossible values is realized according to some probabilistic law. The difference is that in random processes, we have signals (function of time) instead of values (numbers). 3.4.1 Classification of Random Process Based on the continuous or discrete nature of the state space S and parameter set T, a randomprocess can be classified into four types:  If bothT and S arediscrete, the random process is called a Discrete random sequence. For example, if Xn represents the outcome ofthe nth toss ofa fair dice, then {Xn , 1 n } is a discrete random sequence, since T{1, 2, 3, · · ·} and S  {1, 2, 3, 4, 5, 6}.  If T is discrete and S is continuous, the random process is called a continuous random sequence. For example, if Xn represents the temperature at the end of the nth hourof a day, then   X , 1 24 n n   isa continuous randomsequence, since temperature cantake anyvalue in anintervaland hence continuous. EC8491: Communication Theory Department of ECE
  • 14. Random Process 3.15  If T is continuous and S is discrete, the random process is called a Discrete random process. For example, ifX(t) represents the number oftelephone calls received in the interval(0, t) then {X(t)} is a discrete random process, since S = {0, 1, 2, 3, · · ·}.  If both T and S are continuous, the random process is called a continuous random process. For example, ifX(t) represents the maximumtemperature at a place in the interval(0, t), {X(t)} is a continuous randomprocess. Basedonthestationarity, arandomprocesscanbeclassifiedintostationaryandnon-stationary randomprocess as shown in Figure 3.7. Figure 3.7 Hierarchical Classification of Random Process  A random process whose statistical characteristics do not change with time is called as Stationary Random Process or Stationary Process.Example  Noise process as its statistics do not change withtime.  A random process whose statistical characteristics changes with time is called as non-stationary process. Example  Temperature ofacityas temperature statisticsdepend on the time ofthe day. 3.5 STATIONARY PROCESS The random process X(t) is said to be Stationary in the Strict Sense (SSS) or strictly stationary ifthefollowing conditionholds: X x X F F 1 k 1 k (t +τ) ,... ...., (t +τ) 1 k (t ),... ...,x(t ) 1 k (x ,... ...,x )= (x ,... ...,x ) Random Process WSS Process SSS Process Ergodic Process EC8491: Communication Theory Department of ECE
  • 15. 3.16 Communication Theory For alltime shifts , allk and allpossible choices ofobservationtimes t1 ,……..tk . In other words, ifthe joint distribution ofanyset ofrandomvariables obtained byobserving the random process X(t) is invariant with respect to the location at the time origin t = 0. 3.5.1 Mean of the Random Process The meanofprocess X(t) isdefined as the expectationofthe randomvariable obtained by observing the process at some time t givenby:  X X E[X ] (t) m (t)= (t) = x f (x)dx   where, X( )( ) t f x is the first order probabilitydensityfunctionofthe process. The mean of the strict stationary process is always constant given by: x x m (t)= m for allvalues oft 3.5.2 Correlation of the Random Process Autocorrelation of the process X(t) is given by the expectation ofthe product oftwo random variables X(t1 ) and X(t2 ) obtained by observing the process X(t) at times t1 and t2 respectively. It is defined as the measure of similarity of random processses. XX 1 2 1 2 R ( , ) E[X( )X( )] t t t t  1 2 1 2 X( ),X( ) 1 2 1 2 X X ( , ) t t f x x dx dx        For a stationary random process 1 2 X( ),X( ) 1 2 ( , ) t t f x x depends only on the difference between the observation time t1 and t2 . This implies the autocorrelation function of a strictly stationary process depends only on the time difference t2  t1 given by:  XX 1 2 XX 2 1 R ( , )=R ( ) t t t t for allvalues of t1 and t2 3.5.2.1 Importance of Autocorrelation Autocorrelation function provides the spectral information ofthe random process. The frequencycontent ofa process depends on the rapidityofthe amplitude change withtime. This canbe measured bycorrelating amplitudes at t1 and t1  .Autocorrelationfunctioncanbe used to provide information about the rapidity ofamplitude variation with time whichin turn gives informationabout their spectralcontent. EC8491: Communication Theory Department of ECE
  • 16. Random Process 3.17 For example, consider two randomprocess x(t) and y(t), whose autocorrelation function RX ( ) and RY ()as shownin Figure 3.8. We can observe fromtheir autocorrelation function that the random process x(t) is a slowly varying process compared to the process y(t). In fact the power spectraldensityofrandom process is obtained from the Fourier Transformoftheir autocorrelationfunction. Figure 3.8 Autocorrelation Function of Two Random Process x(t) and y(t) 3.5.2.2 Properties of Autocorrelation Function  The meansquarevalue ofa randomprocessis equalto the value ofautocorrelationat  =0. 2 XX R (0)=E[X (T)]  The autocorrelationfunctionis anevenfunctionof .  XX XX R (τ)=R ( τ)  The autocorrelationfunctionis maximumat  =0. XX XX R (τ) R (0)   IfE[X(t)] 0 and X(t) is ergodic with no periodic components, then  2 XX τ lim R (τ)= X   If X(t) has a periodic component, then willhave a periodic component with the same period.  IfX(t) is ergodic, zero mean and has no periodic components, then  XX τ lim R (τ)=0  RX( )  R  Y( ) 0  EC8491: Communication Theory Department of ECE
3.5.3 Covariance of the Random Process
The autocovariance function of X(t) is defined as:

C_XX(t₁, t₂) = E{[X(t₁) − m_X(t₁)][X(t₂) − m_X(t₂)]}
            = R_XX(t₁, t₂) − m_X(t₁)m_X(t₂)

For a strictly stationary process,

C_XX(t₁, t₂) = R_XX(t₂ − t₁) − m_X²

Like the autocorrelation function, the autocovariance function of a strictly stationary process depends only on the time difference t₂ − t₁.

3.6 WHITE NOISE PROCESS
One very important random process is the white noise process. Noise in many practical situations is approximated by the white noise process. Most importantly, white noise plays an important role in modelling WSS signals.
A zero-mean random process X(t) is said to be a white noise process if its power spectral density is defined as:

S_X(ω) = N₀/2, for all frequencies

where N₀ is a real constant. The corresponding autocorrelation function is given by:

R_X(τ) = (N₀/2) δ(τ)

where δ(τ) is the Dirac delta function. The PSD and autocorrelation function of white noise are shown in Figure 3.9 (a) and (b) respectively.

Figure 3.9 (a) PSD of white noise, S_X(ω) = N₀/2 for all ω; (b) Autocorrelation function of white noise, R_X(τ) = (N₀/2)δ(τ)
3.6.1 Properties of White Noise Process
 The term white noise is analogous to white light, which contains all visible light frequencies.
 White noise is a mathematical abstraction; it cannot be physically realized, since it has infinite average power.
 A white noise process can have any probability density function.
 The random process X(t) is called a white Gaussian noise process if X(t) is a stationary Gaussian random process with zero mean and flat power spectral density.
 If the system bandwidth (BW) is sufficiently narrower than the noise BW and the noise PSD is flat over that band, the noise can be modelled as a white noise process. Thermal noise, which is the noise generated in resistors due to the random motion of electrons, is generally modelled as white Gaussian noise, since it has a very flat PSD over a very wide band of frequencies.
 A white noise process is called a strict-sense white noise process if the noise samples at distinct instants of time are independent.

3.7 WIDE-SENSE STATIONARY PROCESS (WSS)
A process may not be stationary in the strict sense, yet it may still have a mean value and an autocorrelation function which are independent of a shift of the time origin:
 m_x(t) = constant
 R_XX(t₁, t₂) = R_XX(t₂ − t₁)
Such a process is known as a Wide-Sense Stationary (WSS), weakly stationary, or covariance-stationary process.

3.8 POWER SPECTRAL DENSITY (PSD)
A random process is a collection of signals, and the spectral characteristics of these signals determine the spectral characteristics of the random process. If the signals of the random process are:
Slowly varying: the random process will mainly contain low frequencies, and its power will be mostly concentrated at low frequencies.
Fast varying: most of the power in the random process will be at the high-frequency components.
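Discrete-time white Gaussian noise gives a concrete, finite-power stand-in for the idealized process above. In this numerical sketch (NumPy assumed; the variance value is an arbitrary illustration), the sample autocorrelation is significant essentially only at lag 0, and by Parseval's theorem the average periodogram value equals R(0), i.e. the spectrum is flat on average:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
sigma2 = 2.0                                  # plays the role of N0/2
x = rng.normal(0.0, np.sqrt(sigma2), n)       # white Gaussian noise samples

xc = x - x.mean()
r0 = np.dot(xc, xc) / n                       # autocorrelation at lag 0, ~ sigma2
r1 = np.dot(xc[:-1], xc[1:]) / (n - 1)        # autocorrelation at lag 1, ~ 0

# Average periodogram equals r0 exactly (Parseval), i.e. flat PSD on average.
psd = np.abs(np.fft.fft(xc)) ** 2 / n
```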
A useful function that determines the distribution of the power of the random process over frequency is the Power Spectral Density, or power spectrum, of the random process. The power spectral density of a random process X(t) is denoted by S_X(f), and gives the strength of the power in the random process as a function of frequency. The unit of power spectral density is watts per hertz (W/Hz).

3.8.1 Expression for Power Spectral Density
The impulse response of a linear time-invariant filter is equal to the inverse Fourier transform of the frequency response of the system. Let H(f) denote the frequency response of the system; then

h(τ₁) = ∫ H(f) exp(j2πfτ₁) df

Substituting h(τ₁) into the mean-square value of the filter output,

E[Y²(t)] = ∫∫ h(τ₁)h(τ₂) R_XX(τ₂ − τ₁) dτ₁ dτ₂

we get

E[Y²(t)] = ∫ df H(f) ∫ dτ₂ h(τ₂) ∫ R_XX(τ₂ − τ₁) exp(j2πfτ₁) dτ₁

In the last integral on the right-hand side of the above equation, a new variable is defined: τ = τ₂ − τ₁. The above equation can then be rewritten as:

E[Y²(t)] = ∫ df H(f) ∫ dτ₂ h(τ₂) exp(j2πfτ₂) ∫ R_XX(τ) exp(−j2πfτ) dτ

The middle integral corresponds to H*(f), the complex conjugate of the frequency response of the filter, so we may simplify the equation as:

E[Y²(t)] = ∫ |H(f)|² [∫ R_XX(τ) exp(−j2πfτ) dτ] df

where |H(f)| is the magnitude response of the filter. In the above equation, the last integral term is the Fourier transform of the autocorrelation function of the input random process X(t). This provides the definition of a new parameter:

S_XX(f) = ∫ R_XX(τ) exp(−j2πfτ) dτ
The function S_XX(f) is called the power spectral density, or power spectrum, of the stationary process X(t). Finally,

E[Y²(t)] = ∫ |H(f)|² S_XX(f) df

The mean square of the output of a stable linear time-invariant filter in response to a stationary process is equal to the integral over all frequencies of the power spectral density of the input process multiplied by the squared magnitude response of the filter.

3.8.2 Relationship between Power Spectral Density and Autocorrelation
The power spectral density and the autocorrelation function of a stationary process form a Fourier-transform pair with τ and f as the variables of interest:

S_XX(f) = ∫ R_XX(τ) exp(−j2πfτ) dτ
R_XX(τ) = ∫ S_XX(f) exp(j2πfτ) df

The above relations provide the basis for the spectral analysis of random processes, and together they are called the Einstein-Wiener-Khintchine relations.

3.8.3 Properties of Power Spectral Density
1. The zero-frequency value of the power spectral density of a stationary process equals the total area under the graph of the autocorrelation function (substituting f = 0 in S_XX(f)):
S_XX(0) = ∫ R_XX(τ) dτ
2. The mean-square value of a stationary process equals the total area under the graph of the power spectral density (substituting τ = 0 in R_XX(τ)):
E[X²(t)] = ∫ S_XX(f) df
3. The power spectral density of a stationary process is always nonnegative:
S_XX(f) ≥ 0 for all f
4. The power spectral density of a real-valued random process is an even function of frequency, that is:
S_XX(−f) = S_XX(f)
5. The power spectral density, appropriately normalized, has the properties usually associated with a probability density function:
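The transform pair can be checked numerically for a model autocorrelation. This sketch (NumPy assumed; R_XX(τ) = exp(−|τ|) is a hypothetical example whose exact transform is S_XX(f) = 2/(1 + (2πf)²)) also verifies property 1, S_XX(0) = area under R_XX(τ):

```python
import numpy as np

dt = 0.01
tau = np.arange(-50.0, 50.0, dt)
r = np.exp(-np.abs(tau))               # assumed model: R_XX(tau) = e^{-|tau|}

def s_of(f):
    """S_XX(f) = integral of R_XX(tau) exp(-j 2 pi f tau) dtau, trapezoidal rule."""
    y = r * np.exp(-2j * np.pi * f * tau)
    return (np.sum(y[:-1] + y[1:]) * dt / 2).real

s0 = s_of(0.0)                          # property 1: area under R_XX, exactly 2 here
s1 = s_of(0.1)                          # compare with the closed form below
s1_exact = 2.0 / (1.0 + (2 * np.pi * 0.1) ** 2)
```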
p_X(f) = S_XX(f) / ∫ S_XX(f) df

3.9 ERGODIC PROCESS
The expectations or ensemble averages of a random process are averages "across the process", taken over all possible values of the sample functions of the process observed at a time t_k. Time averages are long-term sample averages taken "along the process". Time averages are the practical means for estimating the ensemble averages of a random process.
The dc value of x(t) is defined by the time average

m_x(T) = (1/2T) ∫ from −T to T of x(t) dt

The time average is a random variable, as its value depends on the observation interval and on the sample function. The mean of the time average is given by

E[m_x(T)] = (1/2T) ∫ from −T to T of E[x(t)] dt = m_x

The process x(t) is ergodic in the mean if:
1. The time average m_x(T) approaches the ensemble average m_x in the limit as the observation interval T approaches infinity, that is
lim_{T→∞} m_x(T) = m_x
2. The variance of m_x(T) approaches zero in the limit as the observation interval T approaches infinity:
lim_{T→∞} var[m_x(T)] = 0

The time-averaged autocorrelation function of a sample function x(t) is

R_XX(τ, T) = (1/2T) ∫ from −T to T of x(t + τ)x(t) dt

The process x(t) is ergodic in the autocorrelation function if:
lim_{T→∞} R_XX(τ, T) = R_XX(τ)
lim_{T→∞} var[R_XX(τ, T)] = 0

For a random process to be ergodic, it has to be stationary; the converse, however, is not true.
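Ergodicity can be illustrated with a sinusoid-with-random-phase process of the kind used in the solved examples later in this chapter. In this sketch (NumPy assumed; amplitude, frequency, and lag values are arbitrary illustrations), a long time average over a single sample function reproduces the ensemble mean 0 and the ensemble autocorrelation (A²/2)cos ωτ:

```python
import numpy as np

rng = np.random.default_rng(2)
A = 2.0
w = 2 * np.pi * 5.0                     # 5 Hz sinusoid
dt = 0.001
t = np.arange(0.0, 200.0, dt)           # 1000 full periods
theta = rng.uniform(0, 2 * np.pi)       # one realization of the random phase
x = A * np.cos(w * t + theta)           # a single sample function

m_time = x.mean()                       # time-averaged mean -> ensemble mean 0

def r_time(tau):
    """Time-averaged autocorrelation <x(t + tau) x(t)> at lag tau."""
    k = int(round(tau / dt))
    return np.dot(x[:len(x) - k], x[k:]) / (len(x) - k)

# Ensemble value is (A**2 / 2) * cos(w * tau); at tau = 0.1 that is 2*cos(pi) = -2
r_at_0p1 = r_time(0.1)
```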
3.10 GAUSSIAN PROCESS
Gaussian processes play an important role in communication systems. The fundamental reason for their importance is that thermal noise in electronic devices, which is produced by the random movement of electrons due to thermal agitation, can be closely modelled by a Gaussian process.
In a resistor, free electrons move as a result of thermal agitation. The movement of these electrons is random, but their velocity is a function of the ambient temperature: the higher the temperature, the higher the velocity of the electrons. The movement of these electrons generates a current with a random value. We can consider each electron in motion as a tiny current source, whose current is a random variable that can be positive or negative, depending on the direction of the movement of the electron. The total current generated by all electrons, which is the generated thermal noise, is the sum of the currents of all these current sources. We can assume that at least a majority of these sources behave independently and, therefore, the total current is the sum of a large number of independent and identically distributed random variables. Now, by applying the central limit theorem, we conclude that this total current has a Gaussian distribution. For this reason, thermal noise can be very well modelled by a Gaussian random process.
Gaussian processes provide rather good models for some information sources as well. Some properties of Gaussian processes make them mathematically tractable and easy to use.
Let Y be a random variable obtained by integrating the product of a random process X(t) and some function g(t) over the time period t = 0 to t = T:

Y = ∫ from 0 to T of X(t) g(t) dt

The weighting function g(t) in the above equation is such that the mean-square value of the random variable Y is finite. If the random variable Y is a Gaussian-distributed random variable for every such g(t), then the process X(t) is said to be a Gaussian process.
The random variable Y has a Gaussian distribution if its probability density function has the form

f_Y(y) = (1/√(2πσ_Y²)) exp(−(y − m_Y)²/(2σ_Y²))

where m_Y is the mean and σ_Y² is the variance of the random variable Y.
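The form of f_Y(y) can be checked numerically: it should integrate to 1 and reproduce m_Y and σ_Y². A minimal sketch (NumPy assumed; m_Y = 1.5 and σ_Y = 2 are arbitrary illustrative values):

```python
import numpy as np

m_y, s_y = 1.5, 2.0                     # illustrative mean and standard deviation
dy = 0.001
y = np.arange(m_y - 10 * s_y, m_y + 10 * s_y, dy)

# Gaussian PDF as defined in the text
f = np.exp(-((y - m_y) ** 2) / (2 * s_y ** 2)) / np.sqrt(2 * np.pi * s_y ** 2)

area = f.sum() * dy                     # should be 1
mean = (y * f).sum() * dy               # should be m_y
var = ((y - m_y) ** 2 * f).sum() * dy   # should be s_y**2
```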
A plot of this probability density function is shown in Figure 3.10 for the special case where the Gaussian random variable Y is normalized to have a mean of zero and a variance σ_Y² of one:

f_Y(y) = (1/√(2π)) exp(−y²/2)

Figure 3.10 Normalized Gaussian distribution

3.10.1 Advantages of Gaussian Process
1. The Gaussian process has many properties that make analytic results possible.
2. Random processes produced by physical phenomena are often such that a Gaussian model is appropriate. Further, the use of the Gaussian model is confirmed by experiments.

3.10.2 Properties of Gaussian Process
1. If the process X(t) is Gaussian, then the set of random variables X(t₁), X(t₂), X(t₃), …, X(t_n) obtained by observing it at times t₁, t₂, …, t_n is jointly Gaussian for any n, with their joint PDF completely specified by the set of means

m_X(t_i) = E[X(t_i)], where i = 1, 2, 3, …, n

and the set of covariance functions

C_X(t_k, t_i) = E[(X(t_k) − m_X(t_k))(X(t_i) − m_X(t_i))], k, i = 1, 2, 3, …, n

2. If the set of random variables X(t_i) is uncorrelated, that is,

C_ij = 0 for i ≠ j

then the X(t_i) are independent.
3. If a Gaussian process is stationary, then the process is also strictly stationary.
4. If a Gaussian process X(t) is passed through an LTI filter, then the random process Y(t) at the output of the filter is also a Gaussian process.

3.11 TRANSMISSION OF A RANDOM PROCESS THROUGH A LTI FILTER
Suppose a random process X(t) is applied as input to a linear time-invariant filter of impulse response h(t), producing a new random process Y(t) at the filter output, as shown in Figure 3.11. In general it is difficult to describe the probability distribution of Y(t), even when the probability distribution of X(t) is completely specified.

Figure 3.11 Transmission of a random process X(t) through an LTI filter with impulse response h(t), producing output Y(t)

Here we use the time-domain input-output relations of the filter to define the mean and autocorrelation functions of the output random process Y(t) in terms of the input X(t), assuming X(t) is a stationary process. The transmission of a process through an LTI filter is governed by the convolution integral, where the output random process Y(t) is expressed in terms of the input random process X(t) as:

Y(t) = ∫ h(τ₁) X(t − τ₁) dτ₁

where τ₁ is the integration variable.

Mean of Y(t):
m_Y(t) = E[Y(t)] = E[∫ h(τ₁) X(t − τ₁) dτ₁]

provided that E[X(t)] is finite for all t and the system is stable.
Interchanging the order of expectation and integration, we get

m_Y(t) = ∫ h(τ₁) E[X(t − τ₁)] dτ₁ = ∫ h(τ₁) m_X(t − τ₁) dτ₁

When the input random process X(t) is stationary, the mean m_X(t) is a constant m_X, so

m_Y = m_X ∫ h(τ₁) dτ₁ = m_X H(0)

where H(0) is the zero-frequency (DC) response of the system.

The mean of the output random process Y(t) produced at the output of a LTI system in response to input process X(t) is equal to the mean of X(t) multiplied by the DC response of the system.

Autocorrelation of Y(t):
R_YY(t₁, t₂) = E[Y(t₁)Y(t₂)]

Using the convolution integral, we get

R_YY(t₁, t₂) = E[∫ h(τ₁)X(t₁ − τ₁) dτ₁ ∫ h(τ₂)X(t₂ − τ₂) dτ₂]

Provided that E[X²(t)] is finite for all t and the system is stable,

R_YY(t₁, t₂) = ∫∫ h(τ₁)h(τ₂) E[X(t₁ − τ₁)X(t₂ − τ₂)] dτ₁ dτ₂
             = ∫∫ h(τ₁)h(τ₂) R_XX(t₁ − τ₁, t₂ − τ₂) dτ₁ dτ₂

When the input X(t) is a stationary process, the autocorrelation function of X(t) is only a function of the difference between the observation times t₁ − τ₁ and t₂ − τ₂. Thus, putting τ = t₂ − t₁ in the above equation, we get
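The relation m_Y = m_X H(0) can be sketched numerically with the filter used in solved example 10 below, h(t) = 3e^(−2t)u(t), for which H(0) = 3/2 (NumPy assumed; the input is a hypothetical stationary process with mean 2, and the discretization introduces a small bias):

```python
import numpy as np

rng = np.random.default_rng(3)
dt = 0.01
t = np.arange(0.0, 5.0, dt)
h = 3.0 * np.exp(-2.0 * t)              # impulse response h(t) = 3 e^{-2t} u(t)
H0 = h.sum() * dt                       # zero-frequency response, about 3/2

m_x = 2.0
x = m_x + rng.standard_normal(100_000)  # stationary input with mean m_x = 2
y = np.convolve(x, h * dt, mode="valid")
m_y = y.mean()                          # about m_x * H(0) = 3
```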
R_YY(τ) = ∫∫ h(τ₁)h(τ₂) R_XX(τ − τ₁ + τ₂) dτ₁ dτ₂

On combining this result with the mean m_Y, we see that if the input to a stable linear time-invariant filter is a stationary process, then the output of the filter is also a stationary process. When τ = 0, R_YY(0) = E[Y²(t)], so

E[Y²(t)] = ∫∫ h(τ₁)h(τ₂) R_XX(τ₂ − τ₁) dτ₁ dτ₂

which is a constant.

FORMULAE TO REMEMBER
 If A and B are independent events, then P(A∩B) = P(A) · P(B)
 If A and B are mutually exclusive events, then P(A∩B) = 0
 Conditional probability of event A given B: P(A|B) = P(A∩B)/P(B)
 Law of total probability: P(B) = Σ from i = 1 to k of P(B|A_i)P(A_i)
 Bayes' theorem: P(A_i|B) = P(B|A_i)P(A_i)/P(B)
 Cumulative distribution function (CDF) of a random variable X: F_X(x) = P(X ≤ x)
 Probability density function (PDF) of a continuous random variable X: f_X(x) = (d/dx) F_X(x)
 Probability mass function (PMF) of a discrete random variable X: p_X(x) = P(X = x)
 Mean of a random variable:
E[X] = ∫ x f_X(x) dx (continuous)
E[X] = Σ_x x P[X = x] (discrete)
 Variance of a random variable:
σ_X² = ∫ (x − μ_x)² f_X(x) dx (continuous)
σ_X² = E[(X − μ_x)²] = Σ_x (x − μ_x)² P[X = x] (discrete)
 Covariance of two random variables: Cov(X, Y) = E[XY] − μ_x μ_y
 If Cov(X, Y) = 0, then X and Y are uncorrelated.
 If X and Y are independent, then Cov(X, Y) = 0.
 PMF of a binomial random variable: P(X = i) = C(n, i) pⁱ (1 − p)^(n−i), 0 ≤ i ≤ n
 PDF of a uniform random variable:
f(x) = 1/(b − a), for a ≤ x ≤ b; 0 otherwise
 PDF of a Gaussian or normal random variable:
f_X(x) = (1/√(2πσ²)) exp(−(x − m)²/(2σ²))
 The random process X(t) is said to be Stationary in the Strict Sense (SSS), or strictly stationary, if the following condition holds:
F_X(t₁),…,X(t_k)(x₁, …, x_k) = F_X(t₁+τ),…,X(t_k+τ)(x₁, …, x_k)
 Mean of the random process X(t): m_x(t) = E[X(t)] = ∫ x f_X(t)(x) dx
 Autocorrelation of the random process X(t):
R_XX(t₁, t₂) = E[X(t₁)X(t₂)] = ∫∫ x₁x₂ f_X(t₁),X(t₂)(x₁, x₂) dx₁ dx₂
 Autocovariance function of the random process X(t):
C_XX(t₁, t₂) = R_XX(t₂ − t₁) − m_X²
 Wide-Sense Stationary (WSS), weakly stationary, or covariance-stationary process:
m_X(t) = constant and R_XX(t₁, t₂) = R_XX(t₂ − t₁)
 Power spectral density, or power spectrum, of the stationary process X(t):
S_XX(f) = ∫ R_XX(τ) exp(−j2πfτ) dτ
 Einstein-Wiener-Khintchine relations:
S_XX(f) = ∫ R_XX(τ) exp(−j2πfτ) dτ
R_XX(τ) = ∫ S_XX(f) exp(j2πfτ) df
 x(t) is ergodic in the mean if lim_{T→∞} m_x(T) = m_x and lim_{T→∞} var[m_x(T)] = 0
 x(t) is ergodic in the autocorrelation function if lim_{T→∞} R_XX(τ, T) = R_XX(τ) and lim_{T→∞} var[R_XX(τ, T)] = 0
 The mean of the output random process Y(t) produced at the output of a LTI system in response to input process X(t) is equal to the mean of X(t) multiplied by the DC response of the system: m_Y = m_X H(0)
SOLVED EXAMPLES
1. A telegraph source generates two symbols: dot and dash. The dots are twice as likely to occur as dashes. What is the probability of occurrence of dot and dash?
Solution: Given that P(Dot) = 2P(Dash). We know that the probability of the sample space is 1, so
P(Dot) + P(Dash) = 1
Substituting P(Dot) = 2P(Dash) in the above equation, we get 3P(Dash) = 1.
Thus, P(Dash) = 1/3 and P(Dot) = 2/3.

2. Binary data are transmitted over a noisy communication channel in blocks of 16 binary digits. The probability that a received digit is in error due to channel noise is 0.01. Assume that the errors occurring in the various digit positions within a block are independent. Find: (a) the mean number of errors per block, (b) the variance of the number of errors per block, and (c) the probability that the number of errors per block is greater than or equal to 4.
Solution: Let X denote the random variable of the number of errors per block. Then X has a binomial distribution.
(a) Mean = np. Given n = 16 and p = 0.01:
Mean errors per block = 16 × 0.01 = 0.16
(b) Variance = np(1 − p):
Variance of the number of errors per block = (16)(0.01)(0.99) ≈ 0.158
(c) The probability that the number of errors per block is greater than or equal to 4 is
P(X ≥ 4) = 1 − P(X ≤ 3)
Using the binomial distribution, we get
P(X ≤ 3) = Σ from i = 0 to 3 of C(16, i)(0.01)ⁱ(0.99)^(16−i) ≈ 0.99998
Hence, P(X ≥ 4) = 1 − 0.99998 ≈ 1.7 × 10⁻⁵
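The arithmetic in example 2 can be checked with a few lines of Python (standard library only):

```python
from math import comb

n, p = 16, 0.01
mean = n * p                          # (a) mean errors per block
var = n * p * (1 - p)                 # (b) variance of errors per block

# (c) P(X >= 4) = 1 - P(X <= 3) via the binomial PMF
p_le3 = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(4))
p_ge4 = 1 - p_le3
```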
3. The PDF of a random variable X is given by f_X(x) = k for a ≤ x ≤ b and zero elsewhere, where k is a constant. Find: (i) the value of k; (ii) for a = −1 and b = 2, P(|X| < c) for c = 1/2.
Solution:
(i) The given PDF f_X(x) corresponds to a uniform random variable, so the value of k is
k = 1/(b − a)
(ii) Using the value k = 1/(2 + 1) = 1/3, the PDF is f_X(x) = 1/3 for −1 ≤ x ≤ 2 and zero elsewhere.
P(|X| < 1/2) = P(−1/2 < X < 1/2) = ∫ from −1/2 to 1/2 of (1/3) dx
We get P(|X| < 1/2) = 1/3.

4. Find the mean and variance of a random variable X that takes the values 0 and 1 with probabilities 0.4 and 0.6 respectively.
Solution:
Mean = E[X] = Σ_x x P[X = x] = 0(0.4) + 1(0.6) = 0.6
Variance = Σ_x (x − μ_X)² P[X = x] = (0 − 0.6)²(0.4) + (1 − 0.6)²(0.6) = 0.24

5. Find the covariance of X and Y if (a) X and Y are independent, (b) Y = aX + b.
Solution:
(a) Cov(X, Y) = E[XY] − E[X]E[Y]
Since X and Y are independent, E[XY] = E[X]E[Y], so Cov(X, Y) = 0.
(b) With Y = aX + b:
E[XY] = E[X(aX + b)] = aE[X²] + bE[X]
E[Y] = E[aX + b] = aE[X] + b
Cov(X, Y) = aE[X²] + bE[X] − E[X](aE[X] + b)
          = aE[X²] + bE[X] − aE²[X] − bE[X]
Cov(X, Y) = a(E[X²] − E²[X]) = a Var(X)
6. Show that the random process X(t) = A cos(ω_c t + θ) is a wide-sense stationary process, where θ is a random variable uniformly distributed in the range (0, 2π) and A and ω_c are constants.
Solution: For a random process to be WSS, it is necessary to show that:
 the mean is constant, and
 the autocorrelation function depends only on the time difference.
The PDF of the uniform distribution is given by f(θ) = 1/2π, 0 ≤ θ ≤ 2π.
(a) Mean:
E[X(t)] = ∫ from 0 to 2π of X(t)(1/2π) dθ
        = (1/2π) ∫ from 0 to 2π of A cos(ω_c t + θ) dθ
        = (A/2π)[sin(ω_c t + θ)] from 0 to 2π = 0
Therefore, the mean is a constant.
(b) Autocorrelation:
R_XX(t, t + τ) = E[A cos(ω_c t + θ) A cos(ω_c t + ω_c τ + θ)]
              = (A²/2) E[cos(ω_c τ) + cos(2ω_c t + ω_c τ + 2θ)]
              = (A²/2) cos(ω_c τ)
So the autocorrelation function depends only on the time difference τ. As the mean is constant and the autocorrelation function depends only on τ, X(t) is a wide-sense stationary process.
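The two WSS conditions in example 6 can also be verified by Monte Carlo averaging over the random phase (NumPy assumed; the observation times and the A, ω_c values are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(4)
A = 1.0
wc = 2 * np.pi                               # illustrative carrier frequency
theta = rng.uniform(0, 2 * np.pi, 500_000)   # ensemble of random phases

def x(t):
    return A * np.cos(wc * t + theta)        # one value per ensemble member

m1 = x(0.3).mean()                 # ensemble mean at t = 0.3, theory 0
r_a = (x(0.2) * x(0.7)).mean()     # R(t, t + 0.5) at t = 0.2
r_b = (x(1.2) * x(1.7)).mean()     # R(t, t + 0.5) at t = 1.2 (same tau)
r_theory = (A**2 / 2) * np.cos(wc * 0.5)     # (A^2/2) cos(wc*tau) = -0.5
```

The two estimates at the same lag but different absolute times agree, as WSS requires.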
7. Determine whether the random process X(t) = A cos ω_c t is a wide-sense stationary process, where ω_c is a constant and A is a random variable.
Solution: For a random process to be WSS, it is necessary to show that:
 the mean is constant, and
 the autocorrelation function depends only on the time difference.
(a) Mean:
E[X(t)] = E[A cos(ω_c t)] = cos(ω_c t) E[A]
which depends on t; therefore, the mean is not a constant.
(b) Autocorrelation:
R_XX(t, t + τ) = E[A cos(ω_c t) A cos(ω_c t + ω_c τ)]
              = (1/2)[cos(ω_c τ) + cos(2ω_c t + ω_c τ)] E[A²]
Thus, the autocorrelation of X(t) is not a function of the time difference only. So the given random process X(t) is not WSS.

8. Let X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt − A sin ωt, where A and B are independent random variables both having zero mean and variance σ², and ω is constant. Find the cross-correlation of X(t) and Y(t).
Solution: The cross-correlation of X(t) and Y(t) is:
R_XY(t₁, t₂) = E[X(t₁)Y(t₂)]
= E[(A cos ωt₁ + B sin ωt₁)(B cos ωt₂ − A sin ωt₂)]
= E[AB](cos ωt₁ cos ωt₂ − sin ωt₁ sin ωt₂) − E[A²] cos ωt₁ sin ωt₂ + E[B²] sin ωt₁ cos ωt₂
Since E[AB] = E[A]E[B] = 0 and E[A²] = E[B²] = σ²,
R_XY(t₁, t₂) = σ²(sin ωt₁ cos ωt₂ − cos ωt₁ sin ωt₂) = σ² sin ω(t₁ − t₂)
R_XY(τ) = −σ² sin ωτ, where τ = t₂ − t₁.
9. The input X(t) to a diode with transfer characteristic Y = X² is a zero-mean stationary Gaussian random process with autocorrelation function R_XX(τ) = exp(−|τ|). Find the mean μ_Y(t) and R_YY(t₁, t₂).
Solution:
μ_Y = E[Y(t)] = E[X²(t)] = R_XX(0) = 1
R_YY(t₁, t₂) = E[Y(t₁)Y(t₂)] = E[X²(t₁)X²(t₂)]
For a zero-mean Gaussian random process,
E[X²(t₁)X²(t₂)] = E[X²(t₁)]E[X²(t₂)] + 2(E[X(t₁)X(t₂)])²
where E[X²(t₁)] = E[X²(t₂)] = R_XX(0) and, since X(t) is stationary, E[X(t₁)X(t₂)] = R_XX(|t₁ − t₂|). So
R_YY(t₁, t₂) = [R_XX(0)]² + 2[R_XX(|t₁ − t₂|)]²
or
R_YY(τ) = [R_XX(0)]² + 2[R_XX(τ)]² = 1 + 2 exp(−2|τ|)

10. A wide-sense stationary random process X(t) is applied to the input of an LTI system with impulse response h(t) = 3e^(−2t)u(t). Calculate the mean of the output Y(t) of the system if E[X(t)] = 2.
Solution: The frequency response of the system can be obtained by taking the Fourier transform of the impulse response:
H(ω) = F[h(t)] = 3/(2 + jω)
The mean value of the output Y(t) can be obtained as:
m_Y = m_X H(0), where H(0) = 3 × (1/2) = 3/2
Therefore, the mean of the output Y(t) = 2 × (3/2) = 3.
11. Let X and Y be real random variables with finite second moments. Prove the Cauchy-Schwarz inequality (E[XY])² ≤ E[X²]E[Y²]. (April/May 2015)
Solution: The mean-square value of a random variable can never be negative, so
E[(X − aY)²] ≥ 0, for any value of a
Expanding the above equation, we get
E[X²] − 2aE[XY] + a²E[Y²] ≥ 0
Substituting a = E[XY]/E[Y²], which makes the left-hand side of this inequality a minimum, we get
E[X²] − (E[XY])²/E[Y²] ≥ 0
or
(E[XY])² ≤ E[X²]E[Y²]

12. Let X(t) and Y(t) be zero-mean WSS random processes. Consider the random process Z(t) = X(t) + Y(t). Determine the autocorrelation and power spectrum of Z(t) if X(t) and Y(t) are jointly WSS. (Apr/May 2015)
Solution: The autocorrelation of Z(t) is given by:
R_ZZ(t₁, t₂) = E[Z(t₁)Z(t₂)]
= E[(X(t₁) + Y(t₁))(X(t₂) + Y(t₂))]
= E[X(t₁)X(t₂)] + E[X(t₁)Y(t₂)] + E[Y(t₁)X(t₂)] + E[Y(t₁)Y(t₂)]
= R_XX(t₁, t₂) + R_XY(t₁, t₂) + R_YX(t₁, t₂) + R_YY(t₁, t₂)
As X(t) and Y(t) are jointly WSS, with τ = t₂ − t₁:
R_ZZ(τ) = R_XX(τ) + R_XY(τ) + R_YX(τ) + R_YY(τ)
We know that the Fourier transform of the autocorrelation function gives the power spectrum; taking the Fourier transform on both sides, we get
S_ZZ(ω) = S_XX(ω) + S_XY(ω) + S_YX(ω) + S_YY(ω)
13. Let X(t) = A cos(ωt + θ) and Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform random variable over [0, 2π]. Find the cross-correlation of X(t) and Y(t). (Apr/May 2015)(May/June 2016)
Solution: The cross-correlation of X(t) and Y(t) is
R_XY(t, t + τ) = E[X(t)Y(t + τ)]
= E[A² cos(ωt + θ) sin(ωt + ωτ + θ)]
= (A²/2) E[sin(2ωt + ωτ + 2θ) + sin(ωτ)]
= (A²/2){E[sin(2ωt + ωτ + 2θ)] + E[sin(ωτ)]}
= (A²/2){0 + sin(ωτ)}
R_XY(t, t + τ) = R_XY(τ) = (A²/2) sin(ωτ)

14. In a binary communication system, let the probability of sending a 0 and a 1 be 0.3 and 0.7 respectively. Given that a 0 is transmitted, the probability of it being received as a 1 is 0.01, and the probability of error for a transmission of a 1 is 0.1.
(i) What is the probability that the output of this channel is 1?
(ii) If a 1 is received, what is the probability that the input to the channel was 1? (Nov/Dec 2015)
Solution: Let X and Y denote the input and output of the channel. Given that:
P(X = 0) = 0.3, P(X = 1) = 0.7
P(Y = 0 | X = 0) = 0.99, P(Y = 1 | X = 0) = 0.01
P(Y = 0 | X = 1) = 0.1, P(Y = 1 | X = 1) = 0.9
(i) From the total probability theorem:
P(Y = 1) = P(Y = 1 | X = 0)P(X = 0) + P(Y = 1 | X = 1)P(X = 1)
= 0.01 × 0.3 + 0.9 × 0.7 = 0.633
The probability that the output of this channel is 1 is 0.633.
(ii) Using Bayes' rule:
P(X = 1 | Y = 1) = P(X = 1)P(Y = 1 | X = 1) / [P(X = 0)P(Y = 1 | X = 0) + P(X = 1)P(Y = 1 | X = 1)]
= (0.7 × 0.9) / (0.3 × 0.01 + 0.7 × 0.9) ≈ 0.9953
If a 1 is received, then the probability that the input to the channel was 1 is 0.9953.
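Example 14 is a direct application of the total-probability and Bayes formulas, and can be reproduced in a few lines of plain Python:

```python
# Channel priors and transition probabilities from example 14
p_x0, p_x1 = 0.3, 0.7
p_y1_given_x0 = 0.01        # 0 sent, received as 1 (error)
p_y1_given_x1 = 0.9         # 1 sent, received correctly

# (i) Total probability of receiving a 1
p_y1 = p_y1_given_x0 * p_x0 + p_y1_given_x1 * p_x1

# (ii) Bayes' rule: probability that a 1 was sent given that a 1 was received
p_x1_given_y1 = p_y1_given_x1 * p_x1 / p_y1
```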
15. Given a random process X(t) = A cos(ωt + μ), where A and ω are constants and μ is a uniformly distributed random variable, show that X(t) is ergodic in both mean and autocorrelation. (May/June 2016)
Solution: For X(t) to be ergodic in mean and autocorrelation, the time averages must equal the ensemble averages:
⟨x(t)⟩ = E[X(t)] and R_XX(τ) = ⟨X(t)X(t + τ)⟩ = E[X(t)X(t + τ)]
Ensemble averages: Assume μ is uniformly distributed over −π to π.
E[X(t)] = ∫ A cos(ωt + μ) f(μ) dμ = (A/2π) ∫ from −π to π of cos(ωt + μ) dμ = 0
R_XX(τ) = E[X(t)X(t + τ)] = (A²/2π) ∫ from −π to π of cos(ωt + μ) cos(ωt + ωτ + μ) dμ
R_XX(τ) = (A²/2) cos ωτ
Time averages:
⟨x(t)⟩ = lim_{T→∞} (1/T) ∫ from −T/2 to T/2 of A cos(ωt + μ) dt = 0
⟨X(t)X(t + τ)⟩ = lim_{T→∞} (1/T) ∫ from −T/2 to T/2 of A² cos(ωt + μ) cos(ωt + ωτ + μ) dt
= lim_{T→∞} (A²/T) ∫ from −T/2 to T/2 of (1/2)[cos ωτ + cos(2ωt + ωτ + 2μ)] dt
= (A²/2) cos ωτ
Thus the time-averaged mean and autocorrelation are equal to the ensemble-averaged mean and autocorrelation. So the given process X(t) is ergodic in both mean and autocorrelation.

16. Consider two linear filters connected in cascade as shown in Fig. 1. Let X(t) be a stationary process with autocorrelation function R_X(τ); the random process appearing at the first filter output is V(t), and the second filter output is Y(t).
(a) Find the autocorrelation function of Y(t).
(b) Find the cross-correlation function R_VY(τ) of V(t) and Y(t). (Apr/May 2017)

Fig. 1 Cascade of two filters: X(t) → h₁(t) → V(t) → h₂(t) → Y(t)

Solution:
(a) The cascade connection of the two filters is equivalent to a single filter with the impulse response
h(t) = ∫ h₁(τ) h₂(t − τ) dτ
The autocorrelation function of Y(t) is then given by:
R_Y(τ) = ∫∫ h(τ₁) h(τ₂) R_X(τ − τ₁ + τ₂) dτ₁ dτ₂
(b) The cross-correlation of V(t) and Y(t) is given by:
R_VY(τ) = E[V(t)Y(t + τ)]
The output Y(t) is given as
Y(t) = ∫ h₂(α) V(t − α) dα
So,
R_VY(τ) = E[V(t) ∫ h₂(α) V(t + τ − α) dα]
R_VY(τ) = ∫ h₂(α) R_V(τ − α) dα
17. The amplitude-modulated signal is defined as X_AM(t) = A m(t) cos(ω_c t + θ), where m(t) is the baseband signal and A cos(ω_c t + θ) is the carrier. The baseband signal m(t) is modelled as a zero-mean stationary random process with autocorrelation function R_xx(τ) and PSD G_x(f). The carrier amplitude A and frequency ω_c are assumed to be constant, and the initial carrier phase θ is assumed to be random and uniformly distributed in the interval (−π, π). Furthermore, m(t) and θ are assumed to be independent.
(i) Show that X_AM(t) is wide-sense stationary.
(ii) Find the PSD of X_AM(t). (Apr/May 2017)
Solution:
(i) For X_AM(t) to be WSS:
 its mean E[X_AM(t)] must be constant, and
 its autocorrelation E[X_AM(t) X_AM(t + τ)] must depend only on τ.
E[X_AM(t)] = E[A m(t) cos(ω_c t + θ)] = A E[m(t)] E[cos(ω_c t + θ)]
Given that m(t) is a zero-mean stationary random process:
E[X_AM(t)] = 0 (constant)
R_XAM(τ) = E[X_AM(t) X_AM(t + τ)]
= E[A m(t) cos(ω_c t + θ) · A m(t + τ) cos(ω_c t + ω_c τ + θ)]
= A² E[m(t)m(t + τ)] E[cos(ω_c t + θ) cos(ω_c t + ω_c τ + θ)]
= (A²/2) R_xx(τ) E[cos ω_c τ + cos(2ω_c t + ω_c τ + 2θ)]
R_XAM(τ) = (A²/2) R_xx(τ) cos ω_c τ
Since the mean of X_AM(t) is constant and the autocorrelation of X_AM(t) depends only on τ, X_AM(t) is wide-sense stationary.
(ii) We know that the Fourier transform of the autocorrelation function gives the power spectrum:
G_XAM(f) = F[R_XAM(τ)] = F[(A²/2) R_xx(τ) cos ω_c τ]
Given that the PSD of m(t) is G_x(f), and that
F[cos 2πf_c τ] = (1/2)[δ(f − f_c) + δ(f + f_c)]
using the frequency-convolution theorem, we get
G_XAM(f) = (A²/4) G_x(f) * [δ(f − f_c) + δ(f + f_c)]
G_XAM(f) = (A²/4)[G_x(f − f_c) + G_x(f + f_c)]
REVIEW QUESTIONS AND ANSWERS
PART-A
1. Define random variable. (Nov/Dec 2015)
A random variable is a function that assigns a real number X(s) to every element s ∈ S, where S is the sample space corresponding to a random experiment E.
Example: Tossing an unbiased coin twice. The outcomes of the experiment are HH, HT, TH, TT. Let X denote the number of heads turning up. Then X has the values 2, 1, 1, 0. Here, X is a random variable which assigns a real number to every outcome of a random experiment.
2. State Bayes' rule. (Nov/Dec 2015)
Bayes' rule, or Bayes' theorem, relates the conditional and marginal probabilities of stochastic events A and B:
P(A|B) = P(B|A)P(A)/P(B)
where P(A|B) is the conditional probability of A given B, P(B|A) is the conditional probability of B given A, P(A) is the marginal probability of A, and P(B) is the marginal probability of B.
3. Define discrete random variable.
If X is a random variable which can take a finite number or a countably infinite number of values, X is called a discrete RV. Example: let X represent the sum of the numbers shown when two dice are thrown.
4. Define continuous random variable.
If X is a random variable which can take all values (i.e., an infinite number of values) in an interval, then X is called a continuous RV. Example: the time taken by a person who speaks over a telephone.
5. Define cumulative distribution function of a random variable.
The cumulative distribution function (CDF), or distribution function, of a random variable X is defined as
F_X(x) = P{s : X(s) ≤ x}
which can be written simply as F_X(x) = P(X ≤ x)
6. List the properties of CDF.
1. 0 ≤ F_X(x) ≤ 1
2. F_X(x) is non-decreasing
3. lim_{x→−∞} F_X(x) = 0 and lim_{x→∞} F_X(x) = 1
4. F_X(x) is continuous from the right
5. P(a < X ≤ b) = F_X(b) − F_X(a)
6. P(X = a) = F_X(a) − F_X(a⁻).
7. Define probability density function of a random variable.
The Probability Density Function or PDF of a continuous random variable X is defined as the derivative of its CDF:

f_X(x) = (d/dx) F_X(x).

8. List the properties of PDF.
1. f_X(x) ≥ 0
2. ∫_{−∞}^{∞} f_X(x) dx = 1
3. ∫_a^b f_X(x) dx = P(a < X ≤ b)
4. In general, P(X ∈ A) = ∫_A f_X(x) dx
5. F_X(x) = ∫_{−∞}^{x} f_X(u) du.
9. Define mean of a random variable.
The mean of the random variable X is defined as:

E[X] = ∫_{−∞}^{∞} x f_X(x) dx   (continuous)
E[X] = Σ_x x P[X = x]   (discrete)

10. Define variance of a random variable.
The variance of the random variable X is defined as:
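A small sketch tying questions 6–10 together: the PMF of X = number of heads in two fair coin tosses (the random variable from question 1), its CDF, mean and variance, all computed exactly:

```python
from fractions import Fraction

# PMF of X = number of heads in two fair tosses
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# Build the CDF and check it is non-decreasing and ends at 1
cdf, running = {}, Fraction(0)
for x in sorted(pmf):
    running += pmf[x]
    cdf[x] = running
assert list(cdf.values()) == sorted(cdf.values()) and cdf[2] == 1

# E[X] = sum of x P[X = x];  Var[X] = sum of (x - mu)^2 P[X = x]
mean = sum(x * p for x, p in pmf.items())
var = sum((x - mean) ** 2 * p for x, p in pmf.items())
# mean = 1, var = 1/2
```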
σ_X² = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx   (continuous)
σ_X² = Σ_x (x − μ_X)² P[X = x]   (discrete)

11. Define covariance of a random variable.
The covariance of two random variables X and Y is defined as the expectation of their product minus the product of their means:

Cov(X, Y) = E[XY] − μ_X μ_Y

12. Define correlation coefficient.
The correlation coefficient of two random variables is defined as:

ρ_XY = Cov(X, Y) / (σ_X σ_Y)

where the value of the correlation coefficient ranges from −1 to 1.
13. When are two random variables said to be uncorrelated?
Two random variables X and Y are said to be uncorrelated if their covariance is zero:

Cov(X, Y) = 0

14. Give the PMF of a binomial random variable.

P(X = i) = C(n, i) p^i (1 − p)^{n−i},  0 ≤ i ≤ n

15. Give the mean and variance of a binomial random variable.
Mean of a binomial random variable: μ = E(X) = np.
Variance of a binomial random variable: σ² = np(1 − p).
16. What is the importance of the binomial random variable in communication?
A binomial random variable can be used to model the total number of bits received in error when a sequence of n bits is transmitted over a channel with a bit-error probability of p.
17. Give the PDF of a uniform random variable.
This is a continuous random variable taking values between a and b with equal probabilities for intervals of equal length. The probability density function is given by:
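The formulas in questions 14–15 can be checked against each other: summing the binomial PMF directly should reproduce mean np and variance np(1 − p). A sketch with example values of n and p (exact arithmetic, so the identities hold with equality):

```python
from fractions import Fraction
from math import comb

n, p = 8, Fraction(1, 4)   # example parameters

# Binomial PMF: P(X = i) = C(n, i) p^i (1 - p)^(n - i)
pmf = {i: comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)}
assert sum(pmf.values()) == 1          # total probability is 1

mean = sum(i * pi for i, pi in pmf.items())
var = sum((i - mean) ** 2 * pi for i, pi in pmf.items())

assert mean == n * p                   # np = 2
assert var == n * p * (1 - p)          # np(1 - p) = 3/2
```

In the channel-error model of question 16, `mean` is the expected number of bit errors in a block of n transmitted bits.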
f_X(x) = 1/(b − a),  a ≤ x ≤ b
f_X(x) = 0,  otherwise

18. What is the importance of the uniform random variable in communication?
The phase of a received sinusoidal carrier and quantization errors are usually modelled as uniform random variables.
19. Give the PDF of a Gaussian random variable.
The Gaussian or normal random variable is a continuous random variable described by the density function:

f_X(x) = (1 / (√(2π) σ)) e^{−(x − m)² / (2σ²)}

20. What is the significance of the Gaussian random variable?
• The Gaussian random variable is the most important and most frequently encountered random variable in communication systems. The reason is that thermal noise, which is the major source of noise in communication, has a Gaussian distribution.
• In robotics, the Gaussian PDF is used to statistically characterize sensor measurements, robot locations and map representations.
21. List the properties of the Gaussian random variable.
• The sum of two independent Gaussian random variables is also a Gaussian random variable.
• The weighted sum of N independent Gaussian random variables is a Gaussian random variable.
• If two jointly Gaussian random variables have zero covariance (uncorrelated), they are also independent.
22. State the Central Limit Theorem. (May/June 2016)(Nov/Dec 2016)
The central limit theorem states that the normalized distribution of the sum of independent, identically distributed random variables approaches a Gaussian distribution as the number of random variables increases, regardless of the individual distributions.
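The central limit theorem of question 22 is easy to see numerically: the standardized sum of n i.i.d. uniform variables already behaves like a standard Gaussian for moderate n. A sketch (n, the seed and the trial count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 30, 200_000

# Sum n i.i.d. uniform(0,1) variables per trial
u = rng.uniform(0.0, 1.0, size=(trials, n))
s = u.sum(axis=1)

# Standardize: uniform(0,1) has mean 1/2 and variance 1/12
z = (s - n * 0.5) / np.sqrt(n / 12.0)

# For a standard Gaussian, about 68.3% of samples fall within one sigma
frac_within_1sigma = np.mean(np.abs(z) < 1.0)
```

Even though each summand is flat (uniform), the histogram of `z` is bell-shaped, and the one-sigma mass matches the Gaussian value closely.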
23. List the applications of the Central Limit Theorem.
• Signal processing
• Channel modelling
• Finance
• Population statistics
• Hypothesis testing
• Engineering research
24. Define Random Process.
A random process is defined as a rule which assigns a function of time to each outcome 's' of a random experiment.
25. Differentiate a random process from a random variable.
A random variable is a mapping of event outcomes to real numbers, whereas a random process is a mapping of event outcomes to signal waveforms. A random process is a function of time, but a random variable is not.
26. What are the types of random process?
Based on the continuous or discrete nature of the state space S and parameter set T, a random process can be classified into discrete random sequence, continuous random sequence, discrete random process and continuous random process. Based on stationarity, a random process can be classified into stationary and non-stationary random processes.
27. Define stationary random process.
A random process whose statistical characteristics do not change with time is classified as a stationary random process or stationary process.
28. What is strict-sense stationary?
The random process X(t) is said to be Stationary in the Strict Sense (SSS), or strictly stationary, if its statistics are invariant to a shift of the time origin:

F_{X(t1),…,X(tk)}(x1, …, xk) = F_{X(t1+τ),…,X(tk+τ)}(x1, …, xk)

29. Define the mean of a random process.
The mean of a process X(t) is defined as the expectation of the random variable obtained by observing the process at some time t:
μ_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx.

30. Define the autocorrelation of a random process. (May/June 2016)
The autocorrelation of the process X(t) is given by the expectation of the product of the two random variables X(t1) and X(t2) obtained by observing the process X(t) at times t1 and t2 respectively:

R_XX(t1, t2) = E[X(t1)X(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 f_{X(t1),X(t2)}(x1, x2) dx1 dx2

31. List the properties of the autocorrelation function.
• The mean square value of a process equals the value of the autocorrelation at τ = 0:
R_XX(0) = E[X²(t)]
• The autocorrelation function is an even function of τ:
R_XX(−τ) = R_XX(τ)
• The autocorrelation function is maximum at τ = 0:
|R_XX(τ)| ≤ R_XX(0)
32. Define the autocovariance of a random process.
The autocovariance function of a stationary process X(t) is defined as:

C_XX(t1, t2) = R_XX(t2 − t1) − m_X²

33. Define white noise process.
A random process X(t) is said to be a white noise process if it is zero mean and its power spectral density is:

S_X(f) = N0/2  for all frequencies

34. Draw the PSD and autocorrelation function of white noise.
[Figure: the PSD S_X(f) is a constant N0/2 over all frequencies, and the corresponding autocorrelation function is the impulse R_XX(τ) = (N0/2) δ(τ).]
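The properties listed in question 31 can be checked on an estimated autocorrelation. A sketch for a zero-mean stationary process (unit-variance white noise passed through a length-5 moving average; the filter is an illustrative choice, for which theory gives R(0) = 1/5 and R(τ) = (5 − |τ|)/25 for |τ| < 5):

```python
import numpy as np

rng = np.random.default_rng(1)
# Zero-mean stationary process: white noise through a 5-tap moving average
x = np.convolve(rng.standard_normal(400_000), np.ones(5) / 5, mode="valid")

def R(lag):
    # Time-average estimate of R_XX(lag); it is even in the lag by construction
    lag = abs(lag)
    return float(np.mean(x[: len(x) - lag] * x[lag:]))

R0 = R(0)                                    # mean square value, theory: 1/5
peak_is_max = all(abs(R(L)) <= R0 for L in range(-10, 11))
```

`R0` estimates the mean square value E[X²(t)], and `peak_is_max` confirms |R_XX(τ)| ≤ R_XX(0) over the lags tested.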
35. When is a random process said to be a white Gaussian noise process?
The random process X(t) is said to be a white Gaussian noise process if X(t) is a stationary Gaussian random process with zero mean and flat power spectral density.
36. What is Wide-Sense Stationary? (Apr/May 2017)
A random process is called wide-sense stationary (WSS) if its
• Mean is constant.
• Autocorrelation depends only on the time difference.
37. Define the Power Spectral Density of a random process.
The distribution of the power of the random process over the different frequencies is the Power Spectral Density or Power Spectrum of the random process.
38. Give the power spectral density equation of a random process X.

S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) exp(−j2πfτ) dτ

39. List the properties of power spectral density.
• The zero-frequency value of the power spectral density of a stationary process equals the total area under the graph of the autocorrelation function.
• The mean square value of a stationary process equals the total area under the graph of the power spectral density.
40. What is ergodicity?
A random process is said to be ergodic if time averages are the same for all sample functions and equal to the corresponding ensemble averages.
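The second property in question 39 (mean square value = total area under the PSD) is a Parseval-type identity and can be checked with a periodogram estimate. A minimal discrete-time sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(4096)          # one realization of a noise process

X = np.fft.fft(x)
psd = np.abs(X) ** 2 / len(x)          # periodogram: power per DFT bin

mean_square = np.mean(x ** 2)          # time-average mean square value
area_under_psd = np.sum(psd) / len(x)  # discrete "area" over the frequency bins

# Parseval's relation makes these two agree to machine precision
```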
41. When is a process ergodic in the mean?
A stationary process is called ergodic in the mean if:

lim_{T→∞} m_x(T) = m_x
lim_{T→∞} var[m_x(T)] = 0

42. When is a process ergodic in autocorrelation?
A stationary process is called ergodic in the autocorrelation function if:

lim_{T→∞} R_x(τ, T) = R_XX(τ)
lim_{T→∞} var[R_x(τ, T)] = 0

43. Write the Einstein-Wiener-Khintchine relations. (Nov/Dec 2016) (Apr/May 2017)

S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) exp(−j2πfτ) dτ
R_XX(τ) = ∫_{−∞}^{∞} S_XX(f) exp(j2πfτ) df

44. Give the importance of the Wiener-Khintchine relations.
For a stationary process, the power spectral density can be obtained from the Fourier transform of the autocorrelation function.
45. What are the advantages of a Gaussian process?
• A Gaussian process has many properties that make analytic results possible.
• Random processes produced by physical phenomena are often such that a Gaussian model is appropriate. Furthermore, the use of the Gaussian model is confirmed by experiments.
46. List the properties of a Gaussian process.
• If a Gaussian process is wide-sense stationary, then the process is also strictly stationary.
• If a Gaussian process X(t) is passed through an LTI filter, then the random process Y(t) at the output of the filter is also a Gaussian process.
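Ergodicity (questions 40–42) can be illustrated with the random-phase sinusoid X(t) = A cos(ωt + θ), θ uniform on [0, 2π): the time averages over a single sample function match the ensemble mean (0) and the ensemble autocorrelation (A²/2) cos(ωτ). A sketch with example constants A and f0 (the window is chosen to span an integer number of periods, so a circular shift is exact):

```python
import numpy as np

rng = np.random.default_rng(3)
A, f0 = 2.0, 5.0                                      # example constants
t = np.linspace(0.0, 10.0, 100_000, endpoint=False)   # exactly 50 periods

theta = rng.uniform(0.0, 2 * np.pi)                   # a single realization
x = A * np.cos(2 * np.pi * f0 * t + theta)

time_mean = x.mean()                                  # ensemble mean is 0

shift = 200                                           # lag in samples
tau = shift * (t[1] - t[0])                           # tau = 0.02 s on this grid
time_autocorr = np.mean(x * np.roll(x, -shift))       # circular: window is periodic
expected = (A**2 / 2) * np.cos(2 * np.pi * f0 * tau)  # ensemble autocorrelation
```

Both time averages agree with the ensemble values regardless of the particular θ drawn, which is the content of Part-B question 9.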
PART – B
1. Let X and Y be real random variables with finite second moments. Prove the Cauchy-Schwarz inequality (E[XY])² ≤ E[X²]E[Y²]. (Apr/May 2015)
2. Differentiate an SSS process from a WSS process. (Apr/May 2015)
3. Let X(t) and Y(t) both be zero-mean WSS random processes. Consider the random process Z(t) = X(t) + Y(t). Determine the autocorrelation and power spectrum of Z(t) if X(t) and Y(t) are jointly WSS. (Apr/May 2015)
4. Let X(t) = A cos(ωt + θ) and Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform random variable on [0, 2π]. Find the cross-correlation of X(t) and Y(t). (Apr/May 2015)(May/June 2016)
5. In a binary communication system, let the probabilities of sending a 0 and a 1 be 0.3 and 0.7 respectively. Assume that when a 0 is transmitted, the probability of it being received as a 1 is 0.01, and the probability of error for a transmission of a 1 is 0.1.
(i) What is the probability that the output of this channel is 1?
(ii) If a 1 is received, what is the probability that the input to the channel was 1? (Nov/Dec 2015)
6. What are the CDF and PDF? State their properties. Also discuss them in detail by giving examples of the CDF and PDF for different types of random variables. (Nov/Dec 2015)
7. Explain in detail the transmission of a random process through a linear time-invariant filter. (May/June 2016)(Nov/Dec 2016)
8. When is a random process said to be strict-sense stationary (SSS), wide-sense stationary (WSS) and ergodic? (May/June 2016)(Nov/Dec 2016)
9. Given a random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is a uniformly distributed random variable, show that X(t) is ergodic in both mean and autocorrelation. (May/June 2016)
10. Define the following: Mean, Correlation, Covariance and Ergodicity. (Nov/Dec 2016)
11. What is a Gaussian random process? Mention its properties. (Nov/Dec 2016)
12. Consider two linear filters connected in cascade as shown in Fig. 1. Let X(t) be a stationary
process with autocorrelation function R_x(τ); the random process appearing at the first filter input is V(t) and the second filter output is Y(t).
(a) Find the autocorrelation function of Y(t).
(b) Find the cross-correlation function R_vy(τ) of V(t) and Y(t). (Apr/May 2017)
Fig. 1
13. The amplitude-modulated signal is defined as X_AM(t) = A m(t) cos(ω_c t + θ), where m(t) is the baseband signal and A cos(ω_c t + θ) is the carrier. The baseband signal m(t) is modeled as a zero-mean stationary random process with autocorrelation function R_xx(τ) and PSD G_x(f). The carrier amplitude A and frequency ω_c are assumed to be constant, and the initial carrier phase θ is assumed to be random, uniformly distributed in the interval (−π, π). Furthermore, m(t) and θ are assumed to be independent.
(i) Show that X_AM(t) is Wide Sense Stationary.
(ii) Find the PSD of X_AM(t). (Apr/May 2017)
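Part-B question 5 can be checked with a short exact computation: part (i) is the total probability theorem and part (ii) is Bayes' rule with the stated channel numbers. A sketch using exact fractions:

```python
from fractions import Fraction

P0, P1 = Fraction(3, 10), Fraction(7, 10)   # P(send 0), P(send 1)
P_r1_given_s0 = Fraction(1, 100)            # 0 sent, received as 1 (error)
P_r1_given_s1 = 1 - Fraction(1, 10)         # 1 sent, received correctly

# (i) Total probability that the channel output is 1
P_r1 = P0 * P_r1_given_s0 + P1 * P_r1_given_s1          # 633/1000

# (ii) Bayes' rule: P(sent 1 | received 1)
P_s1_given_r1 = P1 * P_r1_given_s1 / P_r1               # 210/211
```

So the channel outputs a 1 with probability 0.633, and a received 1 was almost certainly (210/211 ≈ 0.995) a transmitted 1.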