DC Digital Communication PART4 (8/7/2019)
Chapter 5 Random Processes
random process : a collection of time functions
and an associated probability description
marginal or joint pdf
ensemble
{x(t)}
X(t) : a sample function
X(t1) = X1 : a random variable
5.2 Continuous & Discrete Random Processes
Dependent on the possible values of the random variables
continuous random process :
the random variables X(t1), X(t2), ... can assume any value
within a specified range of possible values
pdf is continuous (contains NO delta functions)
[Figure: a sample function X(t), its value X1 at time t1, and the corresponding continuous pdf f(x)]
discrete random process :
random variables can assume only certain isolated values
pdf contains only delta functions
mixed process : has both continuous and discrete components
[Figure: a sample function of a discrete random process taking the values 0 and 100, with impulses in the pdf at 0 and 100]
5.3 Deterministic & Nondeterministic Random Processes
nondeterministic random process :
future values of each sample ftn cannot be
exactly predicted from the observed past values
(almost all random processes are nondeterministic)
deterministic random process :
future values of each sample ftn ~ can be exactly predicted ~
(still a random ftn of time: the randomness is over the ensemble)
(Example)
X(t) = A cos(ωt + Θ), A, ω known constants
Θ : constant for all t,
but different for each sample function
→ random variation over the ensemble, not wrt time
still possible to define r.v. X(t1), X(t2),
and to determine pdf for r.v.
Remark: a convenient way to obtain a probability model
for signals that are known except for one or two
parameters (e.g. a carrier between Tx and Rx)
5.4 Stationary & Nonstationary Random Processes
dependency of pdf on the value of time
stationary random process:
If all marginal and joint density functions of the process do not depend
on the choice of time origin, the process is said to be stationary.
(All means and moments are then constant in time.)
[Figure: ensemble of sample functions versus time; the r.v. X(t1) at time t1 has pdf f(x)]
nonstationary random process:
If any of the pdfs changes with the choice of time origin,
the process is nonstationary.
All marginal & joint density ftns independent of the time origin!
→ too stringent
relaxed condition:
the mean value of any random variable X(t1) is independent of t1,
& the correlation of two r.v.'s depends only on the time difference t2 − t1
→ stationary in the wide sense:
mean, mean-square, variance, and the correlation coefficient of
any pair of r.v.'s are constant.
For system analysis (random input → response), strict stationarity
is usually not required: stationarity in the wide sense suffices!
5.5 Ergodic & Nonergodic Random Process
If almost every member of the ensemble shows the
same statistical behavior as the whole ensemble,
then it is possible to determine the statistical behavior
by examining only one typical sample function.
→ Ergodic process
For an ergodic process, the mean values and moments can be determined
by time averages as well as by ensemble averages, that is,

X̄ⁿ = E[Xⁿ] = ∫ xⁿ f(x) dx = lim_{T→∞} (1/2T) ∫_{−T}^{T} Xⁿ(t) dt

(Note) This condition cannot hold unless the process is stationary:
ergodic implies stationary (but not vice versa).
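As an illustrative sketch (in Python, not from the original slides), the equality of time and ensemble averages can be checked numerically for the random-phase sinusoid used later in this chapter; the signal, sample sizes, and seed below are assumptions for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ergodic example: X(t) = cos(2*pi*t + Theta), Theta ~ Uniform[0, 2*pi).
t = np.linspace(0.0, 100.0, 100_000)

# Time average over one sample function (one fixed random phase).
theta = rng.uniform(0.0, 2.0 * np.pi)
time_avg = np.mean(np.cos(2.0 * np.pi * t + theta))

# Ensemble average at a fixed time (t = 0) over many sample functions.
thetas = rng.uniform(0.0, 2.0 * np.pi, size=100_000)
ensemble_avg = np.mean(np.cos(thetas))

print(time_avg, ensemble_avg)  # both near the true mean, 0
```

Both averages converge to the same value (the true mean 0), which is exactly what ergodicity asserts.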
How to estimate the process parameters from
the observations of a single sample function?
We cannot form an ensemble average to obtain the parameters!
if ergodic → make a time average;
but we cannot have a sample function over an infinite time interval
→ make a time average over a finite time interval
→ an approximation to the true value
Will determine:
How good is this approximation?
Upon what aspects of the measurement does the goodness of the
approximation depend?

Estimation of the mean value of an ergodic random process {X(t)}:

X̂ = (1/T) ∫₀^T X(t) dt

X̂ is a random variable; X̄ is the true mean value.
→ examine the mean and variance of X̂!
Mean of the estimate:

E[X̂] = E[(1/T) ∫₀^T X(t) dt] = (1/T) ∫₀^T E[X(t)] dt = X̄

so X̂ is an unbiased estimate.
var(X̂) ∝ 1/T (see Ch.6) → large T gives a good estimate!
(Remark) In practice X(t) is often measured at discrete instants.
If we measure X(t) at equally spaced time intervals Δt, that is,

X₁ = X(Δt), X₂ = X(2Δt), ...

then the estimate of X̄ can be expressed as

X̂ = (1/N) Σ_{i=1}^{N} X_i

mean: E[X̂] = (1/N) Σ_{i=1}^{N} E[X_i] = X̄

mean-square: E[X̂²] = (1/N²) Σ_i Σ_j E[X_i X_j]
If the X_i are statistically independent, that is,

E[X_i X_j] = X̄² for i ≠ j, and E[X_i X_j] = E[X²] for i = j,

then

E[X̂²] = (1/N²)[ N·E[X²] + N(N−1)·X̄² ] = (1/N)E[X²] + (1 − 1/N)X̄²

mean of estimate = true mean: E[X̂] = X̄

var(X̂) = E[X̂²] − (E[X̂])² = (1/N)(E[X²] − X̄²) = σ_X²/N

→ var(X̂) ∝ 1/N
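A quick empirical sketch of the 1/N behavior of the estimator variance (Python for illustration; the Gaussian samples, trial counts, and seed are assumptions, not from the text):

```python
import numpy as np

# Check: var(Xhat) = sigma_X^2 / N for the sample mean of N i.i.d. samples.
rng = np.random.default_rng(1)
sigma2 = 4.0
results = {}
for N in (10, 100):
    # 20000 independent estimates, each formed from N samples
    estimates = rng.normal(0.0, np.sqrt(sigma2), size=(20_000, N)).mean(axis=1)
    results[N] = estimates.var()
    print(N, results[N], sigma2 / N)
```

Increasing N by a factor of 10 shrinks the estimator variance by the same factor, matching σ_X²/N.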
See the example in pp.201~202: a zero-mean Gaussian random process, Y ~ N(0, σ_Y²).
5.7 Smoothing Data with a Moving Window Average

Y_i = (1/(2n+1)) Σ_{k=i−n}^{i+n} X_k

A kind of LPF
Sep 20, 2005 CS477:Analog and Digital Communications 20
Random variables, Random processes
Analog and Digital Communications
Autumn 2005-2006
Sep 20, 2005 CS477:Analog and Digital Communications 21
Random Variables
Outcomes and sample space
Random variables: mapping outcomes to
  discrete numbers → discrete RVs
  real line → continuous RVs
Cumulative distribution function (one variable); joint cdf
Random Variables
Probability mass function (discrete RV)
Probability density function (continuous RV); joint pdf of independent RVs
Mean, variance
Characteristic function (inverse FT of the pdf):

Ψ_X(f) = E[e^{j2πfX}] = ∫ p_X(x) e^{j2πfx} dx
Random Processes
Mapping of an outcome s (of an experiment) to a range set R, where R is
a set of continuous functions; denoted X(t; s) or simply X(t).
For a particular outcome s = s0, X(t; s0) is a deterministic function of t.
For t = t0, X(t0; s), or simply X(t0), is a random variable.
Random Processes
Mean: m_X(t) = E[X(t; s)] = E[X(t)]
Autocorrelation: R_X(t1, t2) = E[X(t1)X(t2)]
Autocovariance:
C_X(t1, t2) = E[X(t1)X(t2)] − m_X(t1)m_X(t2) = R_X(t1, t2) − m_X(t1)m_X(t2)
Random Processes
Cross-correlation: R_{X,Y}(t1, t2) = E[X(t1)Y(t2)]
(Processes are orthogonal if R_{X,Y}(t1, t2) = 0)
Cross-covariance:
C_{X,Y}(t1, t2) = E[X(t1)Y(t2)] − m_X(t1)m_Y(t2) = R_{X,Y}(t1, t2) − m_X(t1)m_Y(t2)
Example: X(t) = A cos 2πt, where A is a random variable
m_X(t) = E[A] cos 2πt
R_X(t1, t2) = E[A²] cos 2πt1 cos 2πt2
C_X(t1, t2) = Var(A) cos 2πt1 cos 2πt2
Example: X(t) = cos(2πf_c t + Θ), Θ ~ U(0, 2π)
m_X(t) = 0
R_X(t1, t2) = (1/2) cos(2πf_c(t1 − t2))
C_X(t1, t2) = R_X(t1, t2)
Mean is constant and autocorrelation depends only on t1 − t2.
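These moments can be verified by Monte Carlo simulation over the phase ensemble (a Python sketch; the carrier frequency, sample times, sample size, and seed are illustrative assumptions):

```python
import numpy as np

# X(t) = cos(2*pi*fc*t + Theta), Theta ~ U(0, 2*pi):
# ensemble mean 0, and E[X(t1)X(t2)] = 0.5*cos(2*pi*fc*(t1 - t2)).
fc = 2.0
rng = np.random.default_rng(3)
theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)
t1, t2 = 0.3, 0.1

x1 = np.cos(2.0 * np.pi * fc * t1 + theta)
x2 = np.cos(2.0 * np.pi * fc * t2 + theta)

print(x1.mean())                                   # ~ 0
print((x1 * x2).mean())                            # empirical autocorrelation
print(0.5 * np.cos(2.0 * np.pi * fc * (t1 - t2)))  # theoretical value
```

The empirical mean and correlation match the formulas above to within sampling error.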
Example: Y(t) = X(t) + N(t), with X(t) and N(t) independent

R_{X,Y}(t1, t2) = R_X(t1, t2) + m_X(t1) m_N(t2)
Stationary and WSS RP
Stationary random process (RP): p_{X(t)}(x) = p_{X(t+T)}(x) for all T
Wide-sense stationary (WSS) RP:
  mean constant in time
  autocorrelation depends only on τ: R_X(t1, t1 + τ) = R_X(τ)
Stationary ⇒ WSS (converse not true!)
Power Spectral Density (PSD)
Defined for WSS processes; provides the power distribution as a
function of frequency.
Wiener-Khinchine theorem: the PSD is the Fourier transform of the ACF:

S_X(f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2πfτ} dτ
Random processes - basic concepts
Wind loading and structural response
Lecture 5 Dr. J.D. Holmes
Random processes - basic concepts
Topics :
Concepts of deterministic and random processes
stationarity, ergodicity
Basic properties of a single random process
mean, standard deviation, auto-correlation, spectral density
Joint properties of two or more random processes
correlation, covariance, cross spectral density, simple input-output relations
Refs.: J.S. Bendat and A.G. Piersol, Random Data: Analysis and Measurement Procedures, J. Wiley, 3rd ed., 2000.
D.E. Newland, An Introduction to Random Vibrations, Spectral and Wavelet Analysis, Addison-Wesley, 3rd ed., 1996.
Random processes - basic concepts
Deterministic and random processes :
deterministic processes :
the physical process is represented by an explicit mathematical relation.
Example: response of a single mass-spring-damper in free vibration in the laboratory.
random processes :
the result of a large number of separate causes; described in probabilistic
terms and by properties which are averages.
Both are (usually) continuous functions of time, and both are mathematical concepts.
Random processes - basic concepts
random processes :
The probability density function f_X(x) describes the general distribution of the magnitude of the random process x(t), but it gives no information on the time or frequency content of the process.
Random processes - basic concepts
Averaging and stationarity :
Sample records which are individual representations of the
underlying process
Ensemble averaging :
properties of the process are obtained by averaging over a collection or ensemble of sample records, using values at corresponding times.
Time averaging : properties are obtained by averaging over a single record in time.
Random processes - basic concepts
Stationary random process :
ensemble averages do not vary with time.
Ergodic process :
a stationary process in which averages from a single record are the same as those obtained from averaging over the ensemble.
Most stationary random processes can be treated as ergodic.
Wind loading from extra-tropical synoptic gales can be treated as stationary random processes; wind loading from hurricanes is stationary only over shorter periods.
Random processes - basic concepts
Mean value :
The mean value,Dx , is theheight of the rectangular area having the same area as that under the function x(t)
time, t
x(t)
Dx
T
!T
0Tx(t)dt
T
1Limx
Can also be defined as the first moment of thep.d.f. (ref. Lecture 3
)
Random processes - basic concepts
Mean square value, variance, standard deviation :
mean square value:

x̄² = lim_{T→∞} (1/T) ∫₀^T x²(t) dt

variance (the average of the square of the deviation of x(t) from the mean value, x̄):

σ_x² = lim_{T→∞} (1/T) ∫₀^T [x(t) − x̄]² dt

standard deviation, σ_x, is the square root of the variance.
Random processes - basic concepts
Autocorrelation :
The autocorrelation, or autocovariance, describes the general dependency of x(t) on its value at a short time later, x(t + τ):

ρ_x(τ) = lim_{T→∞} (1/T) ∫₀^T [x(t) − x̄]·[x(t + τ) − x̄] dt

The value of ρ_x(τ) at τ = 0 is the variance, σ_x².
Normalized autocorrelation: R(τ) = ρ_x(τ)/σ_x², with R(0) = 1.
8/7/2019 DC Digital Communication PART4
40/150
Random processes - basic concepts
Autocorrelation :
The autocorrelation for a random process eventually decays to zero at large τ.
The autocorrelation for a sinusoidal (deterministic) process is a cosine function, which does not decay to zero.
[Figure: R(τ) versus time lag τ, decaying from 1 toward 0]
Random processes - basic concepts
Autocorrelation :
The area under the normalized autocorrelation function of the fluctuating wind velocity measured at a point is a measure of the average time scale of the eddies being carried past the measurement point, say T1:

T1 = ∫₀^∞ R(τ) dτ

If we assume that the eddies are being swept past at the mean velocity Ū, then Ū·T1 is a measure of the average length scale of the eddies; this is known as the integral length scale, denoted ℓu.
Random processes - basic concepts
Spectral density :
The spectral density (auto-spectral density, power spectral density, spectrum) describes the average frequency content of a random process, x(t).
Basic relationship (1):

σ_x² = ∫₀^∞ S_x(n) dn

The quantity S_x(n)·δn represents the contribution to σ_x² from the frequency increment δn.
Units of S_x(n): [units of x]² · sec
Random processes - basic concepts
Spectral density :
Basic relationship (2):

S_x(n) = lim_{T→∞} (2/T) E[ |X_T(n)|² ]

where X_T(n) is the Fourier transform of the process x(t) taken over the time interval −T/2 < t < +T/2.
Random processes - basic concepts
Spectral density :
Basic relationship (3): the spectral density is twice the Fourier transform of the autocorrelation function:

S_x(n) = 2 ∫_{−∞}^{∞} ρ_x(τ) e^{−i2πnτ} dτ

Inverse relationship:

ρ_x(τ) = Re[ ∫₀^∞ S_x(n) e^{i2πnτ} dn ] = ∫₀^∞ S_x(n) cos(2πnτ) dn

Thus the spectral density and the autocorrelation are closely linked: they basically provide the same information about the process x(t).
Random processes - basic concepts
Cross-correlation :
The cross-correlation function describes the general dependency of x(t) on another random process y(t + τ), delayed by a time delay τ:

c_xy(τ) = lim_{T→∞} (1/T) ∫₀^T [x(t) − x̄]·[y(t + τ) − ȳ] dt
Random processes - basic concepts
Covariance :
The covariance is the cross-correlation function with the time delay, τ, set to zero:

c_xy(0) = x'(t)·y'(t) (overbar) = lim_{T→∞} (1/T) ∫₀^T [x(t) − x̄]·[y(t) − ȳ] dt

(Section 3.3.5 in "Wind Loading of Structures")
Note that here x'(t) and y'(t) denote the fluctuating parts of x(t) and y(t) (mean parts subtracted).
Random processes - basic concepts
Correlation coefficient :
The correlation coefficient, ρ, is the covariance normalized by the standard deviations of x and y:

ρ = x'(t)·y'(t) (overbar) / (σ_x σ_y)

When x and y are identical to each other, ρ = +1 (full correlation).
When y(t) = −x(t), ρ = −1.
In general, −1 ≤ ρ ≤ +1.
Random processes - basic concepts
Correlation - application :
The fluctuating wind loading of a tower depends on the correlation coefficient between wind velocities, and hence wind loads, at various heights.
For heights z1 and z2:

ρ(z1, z2) = u'(z1)·u'(z2) (overbar) / (σ_u(z1)·σ_u(z2))
Random processes - basic concepts
Cross spectral density :
By analogy with the spectral density, the cross spectral density is twice the Fourier transform of the cross-correlation function for the processes x(t) and y(t):

S_xy(n) = 2 ∫_{−∞}^{∞} c_xy(τ) e^{−i2πnτ} dτ

The cross-spectral density (cross-spectrum) is a complex number:

S_xy(n) = C_xy(n) − i·Q_xy(n)

C_xy(n) is the co(-incident) spectral density (in phase); Q_xy(n) is the quad(-rature) spectral density (out of phase).
Random processes - basic concepts
Normalized co-spectral density :

ρ_xy(n) = C_xy(n) / √( S_x(n)·S_y(n) )

It is effectively a correlation coefficient for fluctuations at frequency n.
Application : excitation of resonant vibration of structures by fluctuating wind forces. If x(t) and y(t) are local fluctuating forces acting at different parts of the structure, ρ_xy(n1) describes how well the forces are correlated (synchronized) at the structural natural frequency, n1.
Random processes - basic concepts
Input - output relationships :
There are many cases in which it is of interest to know how an input random process x(t) is modified by a system to give a random output process y(t).
Application : the input is wind force; the output is structural response (e.g. displacement, acceleration, stress); the system is the dynamic characteristics of the structure.
Linear system : 1) the output resulting from a sum of inputs is equal to the sum of the outputs produced by each input individually (additive property); 2) the output produced by a constant times the input is equal to the constant times the output produced by the input alone (homogeneous property).
Random processes - basic concepts
Input - output relationships :
Relation between the spectral density of the output and the spectral density of the input:

S_y(n) = A·|H(n)|²·S_x(n)

|H(n)|² is a transfer function, frequency response function, or admittance; A is a constant.
Proof: Bendat & Piersol; Newland.
End of Lecture 5
John Holmes, 225-405-3789, [email protected]
Wireless Communication Research Laboratory (WiCoRe)
Review of Probability and Random Processes
Importance of Random Processes
Random variables and processes describe quantities and
signals which are unknown in advance:
The data sent through a communication system is modeled
as a random variable.
The noise, interference, and fading introduced by the
channel can all be modeled as random processes.
Even the measure of performance (probability of bit error)
is expressed in terms of a probability.
Random Events
When we conduct a random experiment, we can use set notation to describe possible outcomes.
Example: roll a six-sided die. Possible outcomes: S = {1, 2, 3, 4, 5, 6}
An event is any subset of possible outcomes, e.g. A = {1, 2}.
Random Events (continued)
The complementary event: Ā = S − A = {3, 4, 5, 6}
The set of all outcomes in the certain event: S
The null event: ∅
Transmitting a data bit is also an experiment.
Probability
The probability P(A) is a number which measures the likelihood of the event A.
Axioms of probability:
  No event has probability less than zero: P(A) ≥ 0
  P(A) ≤ 1, and P(S) = 1
  Let A and B be two events such that A ∩ B = ∅. Then P(A ∪ B) = P(A) + P(B).
All other laws of probability follow from these axioms.
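The die-roll example and the additivity axiom can be sketched in a few lines of Python (the particular events A and B are assumptions chosen for the demo):

```python
from fractions import Fraction

# Fair die: S = {1..6}; A and B are disjoint events,
# so the axiom gives P(A | B-union) = P(A) + P(B).
S = {1, 2, 3, 4, 5, 6}

def P(event):
    """Classical probability: favorable outcomes over total outcomes."""
    return Fraction(len(event & S), len(S))

A, B = {1, 2}, {3, 4}
print(P(A), P(B), P(A | B))  # 1/3, 1/3, 2/3
```

Since A ∩ B = ∅, P(A ∪ B) = 2/3 = P(A) + P(B), and P(S) = 1 as required.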
Relationships Between Random Events
Joint probability: P(A ∩ B) — the probability that both A and B occur.
Conditional probability: P(A | B) = P(A ∩ B) / P(B) — the probability that A will occur given that B has occurred.
Random Variables
A random variable X(s) is a real-valued function of the underlying event space: s ∈ S.
A random variable may be:
  Discrete-valued: range is finite (e.g. {0,1}) or countably infinite (e.g. {1,2,3,...})
  Continuous-valued: range is uncountably infinite (e.g. the real line)
A random variable may be described by:
  a name: X
  its range
  a description of its distribution
Cumulative Distribution Function
Definition: F_X(x) = F(x) = P(X ≤ x)
Properties:
  F_X(x) is monotonically nondecreasing
  F(−∞) = 0, F(∞) = 1
  P(a < X ≤ b) = F(b) − F(a)
While the CDF defines the distribution of a random variable, we will usually work with the pdf or pmf. In some texts, the CDF is called PDF (Probability Distribution Function).
Probability Density Function
Definition: p_X(x) = dF_X(x)/dx, or p(x) = dF(x)/dx
Interpretation: the pdf measures how fast the CDF is increasing, or how likely a random variable is to lie around a particular value.
Properties:
  p(x) ≥ 0
  ∫_{−∞}^{∞} p(x) dx = 1
  P(a < X ≤ b) = ∫_a^b p(x) dx
Expected Values
Expected values are a shorthand way of describing a random variable.
The most important examples are:
  Mean: E(X) = m_x = ∫_{−∞}^{∞} x p(x) dx
  Variance: E[(X − m_x)²] = ∫_{−∞}^{∞} (x − m_x)² p(x) dx
Some Useful Probability Distributions
Binary distribution:
  p(x) = 1 − p for x = 0; p(x) = p for x = 1
This is frequently used for binary data.
Mean: E(X) = p
Variance: σ_X² = p(1 − p)
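A simulation sketch of the binary distribution's moments (Python for illustration; the value of p, sample size, and seed are assumptions):

```python
import numpy as np

# Bernoulli(p): E[X] = p and Var[X] = p(1 - p), checked empirically.
p = 0.3
rng = np.random.default_rng(5)
x = (rng.random(200_000) < p).astype(float)
print(x.mean(), x.var())  # close to 0.3 and 0.21
```

The sample mean and variance converge to p and p(1 − p) as the number of trials grows.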
Some Useful Probability Distributions (continued)
Let Y = Σ_{i=1}^{n} X_i, where the X_i, i = 1, ..., n, are independent binary random variables with p(x) = 1 − p for x = 0 and p(x) = p for x = 1.
Then Y is binomial:

p_Y(y) = (n choose y) p^y (1 − p)^{n−y},  y = 0, 1, ..., n

Mean: E(Y) = np
Variance: σ_Y² = np(1 − p)
Some Useful Probability Distributions (continued)
Uniform pdf (a continuous random variable):
  p(x) = 1/(b − a) for a ≤ x ≤ b; 0 otherwise
Mean: E(X) = (a + b)/2
Variance: σ_X² = (b − a)²/12
Some Useful Probability Distributions (continued)
Gaussian pdf:

p(x) = (1/√(2πσ²)) e^{−(x − m_x)²/(2σ²)}

A Gaussian random variable is completely determined by its mean and variance.
The Q-function
The function frequently used for the area under the tail of the Gaussian pdf is denoted by Q(x):

Q(x) = (1/√(2π)) ∫_x^∞ e^{−t²/2} dt,  x ≥ 0

The Q-function is a standard form for expressing error probabilities without a closed form.
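Although Q(x) has no closed form, it can be evaluated through the complementary error function, a standard identity (a Python sketch; the test points are illustrative):

```python
import math

def Q(x):
    """Gaussian tail probability: Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

print(Q(0.0))  # 0.5 (half the Gaussian lies above the mean)
print(Q(3.0))  # about 1.35e-3
```

This identity follows from a change of variables in the tail integral, and `math.erfc` is numerically stable for large arguments.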
A Communication System with Gaussian Noise

Transmitter → Receiver: R = S + N, with S ∈ {±a} and N ~ N(0, σ_n²);
the receiver decides by testing R > 0.

The probability that the receiver will make an error (given S = a) is

P(R < 0 | S = a) = ∫_{−∞}^{0} (1/√(2πσ_n²)) e^{−(x−a)²/(2σ_n²)} dx = Q(a/σ_n)
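A Monte Carlo sketch of this error probability (Python for illustration; the amplitude, noise level, sample count, and seed are assumptions):

```python
import math
import numpy as np

# For R = S + N with S = a sent and N ~ N(0, sigma_n^2),
# the error probability P(R < 0 | S = a) should match Q(a / sigma_n).
a, sigma_n = 1.0, 0.8
rng = np.random.default_rng(6)
R = a + sigma_n * rng.standard_normal(500_000)
p_err = np.mean(R < 0.0)
q = 0.5 * math.erfc(a / sigma_n / math.sqrt(2.0))  # Q(a / sigma_n)
print(p_err, q)
```

The empirical error rate and Q(a/σ_n) agree to within sampling error, tying the threshold-detector picture to the Q-function.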
Random Processes
A random variable has a single value. However, actual signals change with time.
Random variables model unknown events; random processes model unknown signals.
A random process is just a collection of random variables: if X(t) is a random process, then X(1), X(1.5), and X(37.5) are all random variables.
Terminology Describing Random Processes
A stationary random process has statistical properties which do not change with time.
A wide-sense stationary (WSS) process has a mean and autocorrelation function which do not change with time.
A random process is ergodic if the time average always converges to the statistical average.
Unless specified, we will assume that all random processes are WSS and ergodic.
Description of Random Processes
Knowing the pdf of individual samples of the random process is not sufficient; we also need to know how individual samples are related to each other. Two tools are available to describe this relationship:
  Autocorrelation function
  Power spectral density function
Autocorrelation
Autocorrelation measures how a random process changes with time. Intuitively, X(1) and X(1.1) will be more strongly related than X(1) and X(100000); the autocorrelation function quantifies this. For a WSS random process,

R_X(τ) = E[X(t) X(t + τ)]

Note that Power = R_X(0).
Power Spectral Density
Φ_X(f) tells us how much power is at each frequency.
Wiener-Khinchine theorem: power spectral density and autocorrelation are a Fourier transform pair:

Φ_X(f) = F{ R_X(τ) }

Properties of the power spectral density:
  Φ_X(f) ≥ 0
  Φ_X(f) = Φ_X(−f)
  Power = ∫_{−∞}^{∞} Φ_X(f) df
Gaussian Random Processes
Gaussian random processes have some special properties:
  If a Gaussian random process is wide-sense stationary, then it is also stationary.
  If the input to a linear system is a Gaussian random process, then the output is also a Gaussian random process.
Linear systems
Input: x(t)
Impulse response: h(t)
Output: y(t) = h(t) * x(t) (convolution)
[Diagram: x(t) → h(t) → y(t)]
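A minimal discrete-time sketch of the input/output relation (Python for illustration; the input and the two-tap impulse response are assumptions chosen to keep the output checkable by hand):

```python
import numpy as np

# Discrete-time LTI system: the output is the convolution
# of the input with the impulse response.
x = np.array([1.0, 2.0, 3.0])
h = np.array([1.0, 1.0])   # two-tap "moving sum" impulse response
y = np.convolve(x, h)
print(y)  # [1. 3. 5. 3.]
```

Each output sample is the sum of input samples weighted by the (time-reversed, shifted) impulse response, which is the defining property of a linear time-invariant system.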
Random Signals
Sinusoid of Random Phase

X(t) = cos(2π·4t + Θ), Θ uniform on [−π, π]

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=cos(2*pi*4*t+random('unif',-pi,pi,1,1));
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of cont. random phase')
Sinusoid of Random Frequency

X(t) = cos(2πFt), F uniform on {1, 2, 3, 4}

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=cos(2*pi*random('unid',4,1,1)*t);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of discrete random frequency (FSK)')
Sinusoid of Random Amp, Freq, Phase

X(t) = A cos(2πFt + Θ), A uniform on {1,...,4}, F uniform on {1,...,4}, Θ uniform on [−π, π]

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=random('unid',4,1,1)*cos(2*pi*random('unid',4,1,1)*t+random('unif',-pi,pi,1,1));
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of Sinusoid of cont. random amp, freq, phase')
White Gaussian Random Process

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=randn(1,SIMULATION_LENGTH);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of WGN process')
Noisy Random Sinusoid

X(t) = A cos(2πFt + Θ) + N

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=random('unid',4,1,1)*cos(2*pi*random('unid',4,1,1)*t+random('unif',-pi,pi,1,1))+0.1*randn(1,SIMULATION_LENGTH);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
title('Realizations of noisy random sinusoid')
Poisson Arrival Process

P[Q(t2) − Q(t1) = k] = e^{−λ(t2−t1)} [λ(t2−t1)]^k / k!,  k = 0, 1, 2, ...

NUM_REAL=4;
SIMULATION_LENGTH=1024;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
lambda=0.01;
figure(1);
clf;
for n=1:NUM_REAL
    arrivals=random('poiss',lambda,1,SIMULATION_LENGTH);
    realizations(n,:)=cumsum(arrivals);
    subplot(NUM_REAL,1,n);
    plot(t,realizations(n,:));
end
subplot(NUM_REAL,1,1);
Picking a RV from a Random Process

Sampling each realization at one time index gives a Gaussian RV of mean 0 and std 1.

NUM_REAL=10000;
SIMULATION_LENGTH=8;
t=0:(1/SIMULATION_LENGTH):(1-1/SIMULATION_LENGTH);
realizations=zeros(NUM_REAL,SIMULATION_LENGTH);
figure(1);
clf;
for n=1:NUM_REAL
    realizations(n,:)=randn(1,SIMULATION_LENGTH);
end
x=realizations(:,3);
hist(x,30);
Autocorrelation

R_x(m) = (1/(N − m)) Σ_{n=1}^{N−m} X_n X_{n+m},  m = 0, 1, ..., M

function [Rxall]=Rx_est(X,M)
N=length(X);
Rx=zeros(1,M+1);
for m=1:M+1,
    for n=1:N-m+1,
        Rx(m)=Rx(m)+X(n)*X(n+m-1);
    end;
    Rx(m)=Rx(m)/(N-m+1);
end;
for i=1:M,
    Rxall(i)=Rx(M+2-i);
end
Rxall(M+1:2*M+1)=Rx(1:M+1);
Autocorrelation of Gaussian Random Process
N=1000;
X=randn(1,N);
M=50;
Rx=Rx_est(X,M);
plot(X)
title('Gaussian Random Process')
pause
plot([-M:M],Rx)
title('Autocorrelation function')
Autocorrelation of Gauss-Markov Random Process
X[n] = 0.95·X[n−1] + w[n],  w[n] ~ N(0,1),  X[0] = 0

rho=0.95;
X0=0;
N=1000;
Ws=randn(1,N);
X(1)=rho*X0+Ws(1);
for i=2:N,
    X(i)=rho*X(i-1)+Ws(i);
end;
M=50;
Rx=Rx_est(X,M);
plot(X)
title('Gauss-Markov Random Process')
pause
plot([-M:M],Rx)
title('Autocorrelation function')
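A Python sketch of the same Gauss-Markov (AR(1)) recursion, checking that the lag-1 normalized autocorrelation comes out near ρ (the record length and seed are assumptions; the geometric-decay property itself is standard for AR(1) processes):

```python
import numpy as np

# Gauss-Markov process X[n] = rho*X[n-1] + w[n], w[n] ~ N(0,1):
# the normalized autocorrelation decays geometrically as rho**m,
# so the lag-1 estimate should be close to rho.
rho = 0.95
rng = np.random.default_rng(7)
w = rng.standard_normal(200_000)
x = np.zeros_like(w)
for i in range(1, len(w)):
    x[i] = rho * x[i - 1] + w[i]

x -= x.mean()
r1 = float(np.mean(x[:-1] * x[1:]) / np.mean(x * x))
print(r1)  # close to 0.95
```

Strong sample-to-sample memory (ρ near 1) shows up directly as a slowly decaying autocorrelation, in contrast to the white Gaussian process above.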
Random Processes: Introduction
Professor Ke-Sheng Cheng
Department of Bioenvironmental Systems Engineering
E-mail: [email protected]
Introduction
A random process is a process (i.e., variation in time or one-dimensional space) whose behavior is not completely predictable and can be characterized by statistical laws.
Examples of random processes:
  Daily stream flow
  Hourly rainfall of storm events
  Stock index
Random Variable
A random variable is a mapping function which assigns outcomes of a random experiment to real numbers. Occurrence of the outcome follows a certain probability distribution; therefore, a random variable is completely characterized by its probability density function (PDF).
The term stochastic process appears mostly in statistical textbooks, while the term random process is frequently used in books on engineering applications.
Characterizations of a Stochastic Process
First-order densities of a random process
A stochastic process is defined to be completely or totally characterized if the joint densities for the random variables X(t1), X(t2), ..., X(tn) are known for all times t1, t2, ..., tn and all n.
In general, a complete characterization is practically impossible, except in rare cases. As a result, it is desirable to define and work with various partial characterizations. Depending on the objectives of applications, a partial characterization often suffices to ensure the desired outputs.
For a specific t, X(t) is a random variable with distribution
F(x, t) = P[X(t) ≤ x].
The function F(x, t) is defined as the first-order distribution of the random variable X(t). Its derivative with respect to x,
f(x, t) = ∂F(x, t)/∂x,
is the first-order density of X(t).
If the first-order densities defined for all time t, i.e., f(x, t), are all the same, then f(x, t) does not depend on t and we call the resulting density the first-order density of the random process {X(t)}; otherwise, we have a family of first-order densities.
The first-order densities (or distributions) are only a partial characterization of the random process, as they do not contain information that specifies the joint densities of the random variables defined at two or more different times.
Mean and variance of a random process
The first-order density of a random process, f(x, t), gives the probability density of the random variables X(t) defined for all time t. The mean of a random process, m_X(t), is thus a function of time specified by
m_X(t) = E[X(t)] = E[X_t] = ∫_{-∞}^{∞} x_t f(x_t, t) dx_t
For the case where the mean of X(t) does not depend on t, we have
m_X(t) = E[X(t)] = m_X (a constant).
The variance of a random process, also a function of time, is defined by
σ_X²(t) = E[(X(t) - m_X(t))²] = E[X_t²] - [m_X(t)]²
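As a numerical illustration (an assumed toy process, not from the lecture), the sketch below estimates m_X(t) and σ_X²(t) from a finite ensemble of X(t) = 2t + W, with W ~ N(0, 1) drawn independently at each observation time, so the mean depends on t while the variance does not.

```python
import random

random.seed(1)

T = [0.1 * k for k in range(20)]   # observation times t
N = 20000                          # ensemble size

def realization():
    # X(t) = 2t + W, W ~ N(0, 1): time-varying mean, constant variance
    return [2.0 * t + random.gauss(0.0, 1.0) for t in T]

ensemble = [realization() for _ in range(N)]

# ensemble estimates of m_X(t) and sigma_X^2(t) at each time index
m = [sum(x[k] for x in ensemble) / N for k in range(len(T))]
var = [sum((x[k] - m[k]) ** 2 for x in ensemble) / N for k in range(len(T))]
```

The estimated mean tracks 2t while the estimated variance stays near 1, showing that both are in general functions of time.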
Second-order densities of a random process
For any pair of two random variables X(t1) and X(t2), we define the second-order densities of a random process as f(x1, x2; t1, t2) or f(x1, x2).
Nth-order densities of a random process
The nth-order density functions for {X(t)} at times t1, t2, ..., tn are given by f(x1, x2, ..., xn; t1, t2, ..., tn) or f(x1, x2, ..., xn).
Autocorrelation and autocovariance functions of random processes
Given two random variables X(t1) and X(t2), a measure of the linear relationship between them is specified by E[X(t1)X(t2)]. For a random process, t1 and t2 go through all possible values, and therefore E[X(t1)X(t2)] can change and is a function of t1 and t2. The autocorrelation function of a random process is thus defined by
R(t1, t2) = E[X(t1)X(t2)] = R(t2, t1)
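The definition can be checked numerically. The sketch below uses an assumed toy process X(t) = A·cos(t) with A ~ N(0, 1), chosen only for illustration; for it, R(t1, t2) = E[A²]cos(t1)cos(t2) = cos(t1)cos(t2), and the symmetry R(t1, t2) = R(t2, t1) is immediate.

```python
import math, random

random.seed(2)

N = 50000
t1, t2 = 0.3, 1.1

# toy process X(t) = A cos(t), A ~ N(0,1):
# R(t1, t2) = E[A^2] cos(t1) cos(t2) = cos(t1) cos(t2)
amps = [random.gauss(0.0, 1.0) for _ in range(N)]
R12 = sum((a * math.cos(t1)) * (a * math.cos(t2)) for a in amps) / N
R21 = sum((a * math.cos(t2)) * (a * math.cos(t1)) for a in amps) / N
exact = math.cos(t1) * math.cos(t2)
```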
Stationarity of random processes
A process is strict-sense stationary if, for any time shift ε, all k and all times,
f(x1, x2, ..., xn; t1, t2, ..., tn) = f(x1, x2, ..., xn; t1+ε, t2+ε, ..., tn+ε).
Strict-sense stationarity seldom holds for random processes, except for some Gaussian processes. Therefore, weaker forms of stationarity are needed.
A process is wide-sense stationary if E[X(t)] = constant for all t, and R(t1, t2) = R(t2 - t1) = R(τ) for all t1 and t2.
Equality and continuity of random processes
Equality
Note that x(t, ωi) = y(t, ωi) for every ωi is not the same as x(t, ωi) = y(t, ωi) with probability 1.
Mean square equality: X(t) and Y(t) are equal in the mean-square sense if E[(X(t) - Y(t))²] = 0 for all t.
Lecture on Communication Theory
CNU Dept. of Electronics
Chapter 1. Random Processes - Preliminary
P.1 Introduction
1. Deterministic signals: the class of signals that may be modeled as completely specified functions of time.
2. Random signals: it is not possible to predict their precise value in advance. ex) thermal noise
3. Random variable: a function whose domain is a sample space and whose range is some set of real numbers; obtained by observing a random process at a fixed instant of time.
4. Random process: ensemble (family) of sample functions; equivalently, an ensemble of random variables.
P.2 Probability Theory
1. Random experiment
1) Repeatable under identical conditions
2) Outcome is unpredictable
3) For a large number of trials of the experiment, the outcomes exhibit statistical regularity, i.e., a definite average pattern of outcomes is observed for a large number of trials.
2. Relative-Frequency Approach
1) Relative frequency: 0 ≤ N_n(A)/n ≤ 1, where N_n(A) is the number of occurrences of event A in n trials.
2) Statistical regularity: P(A) = lim_{n→∞} N_n(A)/n, the probability of event A.
3. Axioms of Probability
1) a) Sample points s_k: the kth outcome of the experiment
b) Sample space S: totality of sample points
c) Sure event: the entire sample space S
d) ∅: the null or impossible event
e) Elementary event: a single sample point
2) Definition of probability
a) A sample space S of elementary events
b) A class ℱ of events that are subsets of S
c) A probability measure P(·) assigned to each event A in the class ℱ, which has the following properties (axioms of probability):
(i) P(S) = 1
(ii) 0 ≤ P(A) ≤ 1
(iii) If A ∪ B is the union of two mutually exclusive events in the class ℱ, then P(A ∪ B) = P(A) + P(B)
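Statistical regularity can be observed directly. A minimal sketch (the fair-die event is an assumed example) shows the relative frequency N_n(A)/n approaching P(A) as n grows:

```python
import random

random.seed(3)

# event A: a fair die shows an even face; P(A) = 1/2
def rel_freq(n):
    hits = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)
    return hits / n

freqs = {n: rel_freq(n) for n in (100, 10000, 200000)}
```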
3) Property 1. P(Ā) = 1 - P(A)
4) Property 2. If M mutually exclusive events A_1, A_2, ..., A_M have the exhaustive property
A_1 ∪ A_2 ∪ ... ∪ A_M = S,
then P(A_1) + P(A_2) + ... + P(A_M) = 1
5) Property 3. P(A ∪ B) = P(A) + P(B) - P(AB)
4. Conditional Probability
1) Conditional probability of B given A (given A means that event A has occurred):
P(B|A) = P(AB)/P(A), where P(AB) is the joint probability of A and B
P(AB) = P(B|A)P(A) = P(A|B)P(B)
∴ P(A|B) = P(B|A)P(A)/P(B) ; Bayes' rule
2) Statistically independent: P(AB) = P(A)P(B)
ex1) BSC (Binary Symmetric Channel)
Discrete memoryless channel with transition probabilities:
A0 [0] → B0 [0]: 1 - p,  A0 [0] → B1 [1]: p
A1 [1] → B1 [1]: 1 - p,  A1 [1] → B0 [0]: p
Priori prob.: P(A0) = p0, P(A1) = p1, p0 + p1 = 1
Conditional prob. or likelihood:
P(B1|A0) = P(B0|A1) = p
P(B0|A0) = P(B1|A1) = 1 - p
Output prob.:
P(B0) = p0(1 - p) + p1·p
P(B1) = p0·p + p1(1 - p)
Posteriori prob.:
P(A0|B0) = P(B0|A0)P(A0)/P(B0) = p0(1 - p)/[p0(1 - p) + p1·p]
P(A1|B1) = P(B1|A1)P(A1)/P(B1) = p1(1 - p)/[p0·p + p1(1 - p)]

P.3 Random Variables
1) Random variable: a function whose domain is a sample space and whose range is some set of real numbers.
2) Discrete r.v.: X(k), k a sample point. ex) range {1, ..., 6}
Continuous r.v.: X. ex) range (-∞, ∞)
3) Cumulative distribution function (cdf) or distribution fct.:
F_X(x) = P(X ≤ x)
a) 0 ≤ F_X(x) ≤ 1
b) if x1 < x2, F_X(x1) ≤ F_X(x2): monotone-nondecreasing fct.
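A quick numeric check of the BSC formulas above; the priors p0 = 0.6, p1 = 0.4 and crossover probability p = 0.1 are assumed values for illustration:

```python
# BSC: priors and crossover probability (assumed example values)
p0, p1, p = 0.6, 0.4, 0.1

# output probabilities
PB0 = p0 * (1 - p) + p1 * p
PB1 = p0 * p + p1 * (1 - p)

# posterior probabilities via Bayes' rule
post_A0_given_B0 = (1 - p) * p0 / PB0
post_A1_given_B1 = (1 - p) * p1 / PB1
```

Receiving a 0 raises the probability of A0 from the prior 0.6 to about 0.93.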
4) pdf (probability density fct.)
f_X(x) = dF_X(x)/dx
pdf: nonnegative fct., total area = 1:
f_X(ξ) ≥ 0, ∫_{-∞}^{∞} f_X(ξ) dξ = 1
F_X(x) = ∫_{-∞}^{x} f_X(ξ) dξ
P(x1 < X ≤ x2) = ∫_{x1}^{x2} f_X(x) dx
ex2)
2. Several random variables (2 random variables)
1) Joint distribution fct.: F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y)
2) Joint pdf: f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y
3) Total area: ∫_{-∞}^{∞} ∫_{-∞}^{∞} f_{X,Y}(ξ, η) dξ dη = 1
F_X(x) = ∫_{-∞}^{x} ∫_{-∞}^{∞} f_{X,Y}(ξ, η) dη dξ
marginal density: f_X(x) = ∫_{-∞}^{∞} f_{X,Y}(x, η) dη
4) Conditional prob. density fct. (given that X = fixed x):
If f_X(x) ≠ 0: f_Y(y|x) = f_{X,Y}(x, y)/f_X(x) ≥ 0, ∫_{-∞}^{∞} f_Y(y|x) dy = 1
If X, Y are statistically independent: f_Y(y|x) = f_Y(y)
Statistically independent ⇔ f_{X,Y}(x, y) = f_X(x)f_Y(y)

P.4 Statistical Average
1. Mean or expected value
1) Continuous: E[X] = ∫_{-∞}^{∞} x f_X(x) dx
ex) f_X(x) = 1/10 for 0 ≤ x ≤ 10:
E[X] = ∫_0^{10} x·(1/10) dx = x²/20 |_0^{10} = 5
2) Discrete: E[X] = lim_{n→∞} Σ_k x_k N_k(n)/n = Σ_k x_k p(k)
ex) fair die: E[X] = (1/6)(1 + 2 + 3 + 4 + 5 + 6) = 3.5
2. Function of r.v.: Y = g(X), X, Y: r.v.
E[Y] = E[g(X)] = ∫_{-∞}^{∞} g(x) f_X(x) dx
ex) Y = g(X) = cos(X), where f_X(x) = 1/(2π) for -π ≤ x < π, 0 otherwise
E[Y] = ∫_{-π}^{π} cos x · (1/2π) dx = (1/2π) sin x |_{-π}^{π} = 0
3. Moments
1) n-th moments: E[X^n] = ∫_{-∞}^{∞} x^n f_X(x) dx
n = 1: E[X] = mean
n = 2: E[X²] = mean square value of X
2) Central moments: E[(X - m_X)^n] = ∫_{-∞}^{∞} (x - m_X)^n f_X(x) dx
n = 2: var[X] = E[(X - m_X)²] = σ_X², where σ_X is the standard deviation
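The worked examples above can be replayed numerically. The sketch below checks the die mean, the mean E[cos X] = 0 for a uniform phase, and the second moment E[X²] = π²/3 of that phase by Monte Carlo (sample sizes are arbitrary choices):

```python
import math, random

random.seed(4)

# discrete mean of a fair die: (1/6)(1+2+...+6) = 3.5
die_mean = sum(range(1, 7)) / 6

# X uniform on [-pi, pi): E[cos X] = 0, E[X^2] = pi^2 / 3
N = 200000
xs = [random.uniform(-math.pi, math.pi) for _ in range(N)]
mean_cos = sum(math.cos(x) for x in xs) / N
second_moment = sum(x * x for x in xs) / N
```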
σ_X² meaning: randomness, effective width of f_X(x)
Chebyshev inequality: P(|X - m_X| ≥ ε) ≤ σ_X²/ε²
σ_X² = E[(X - m_X)²] = E[X²] - 2m_X·E[X] + m_X² = E[X²] - m_X²,
where E[X²] is the mean square value of X.
4. Characteristic function
φ_X(v) = E[exp(jvX)] = ∫_{-∞}^{∞} f_X(x) exp(jvx) dx
f_X(x) = (1/2π) ∫_{-∞}^{∞} φ_X(v) exp(-jvx) dv
ex4) Gaussian random variable
f_X(x) = (1/(√(2π)·σ_X)) exp(-(x - m_X)²/(2σ_X²))
φ_X(v) = exp(jv·m_X - v²σ_X²/2)
If m_X = 0:
f_X(x) = (1/(√(2π)·σ_X)) exp(-x²/(2σ_X²)), φ_X(v) = exp(-v²σ_X²/2)
central moments: E[(X - m_X)^n] = 1·3·5···(n-1)·σ_X^n for n even, 0 for n odd
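The Chebyshev inequality can be sanity-checked by sampling; the sketch below uses standard Gaussian samples (an assumed test case) and verifies that the observed tail probability never exceeds σ²/ε²:

```python
import random

random.seed(5)

N = 100000
xs = [random.gauss(0.0, 1.0) for _ in range(N)]  # m_X = 0, sigma_X = 1

# empirical P(|X - m_X| >= eps) for a few thresholds
tails = {}
for eps in (1.5, 2.0, 3.0):
    tails[eps] = sum(1 for x in xs if abs(x) >= eps) / N
```

For the Gaussian the bound is loose: at ε = 2 the true tail is about 0.046 versus the bound 0.25.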
5. Joint moments
Joint moments: E[X^i Y^j] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} x^i y^j f_{X,Y}(x, y) dx dy
Correlation: E[XY] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} xy f_{X,Y}(x, y) dx dy
Covariance: cov[XY] = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]
Correlation coefficient: ρ = cov[XY]/(σ_X·σ_Y)
X and Y are uncorrelated ⇔ cov[XY] = 0
X and Y are orthogonal ⇔ E[XY] = 0
If E[X] = 0 or E[Y] = 0: uncorrelated ⇒ orthogonal
X, Y statistically independent ⇒ uncorrelated (the converse does not hold in general)
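Uncorrelated does not imply independent. A standard counterexample (assumed here for illustration: X ~ N(0, 1) and Y = X², which are clearly dependent yet have zero covariance since E[X³] = 0) can be checked numerically:

```python
import random

random.seed(6)

N = 200000
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [x * x for x in xs]           # Y = X^2: dependent on X

mx = sum(xs) / N
my = sum(ys) / N
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / N
```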
P.5 Transformations of Random Variables: Y = g(X)
1. Monotone transformations (one-to-one):
f_Y(y) = f_X(x)/|dy/dx| = f_X(x)/|dg/dx|, evaluated at x = g^{-1}(y)
2. Many-to-one transformations:
f_Y(y) = Σ_k f_X(x_k)/|dg/dx|_{x=x_k}, where the x_k = g^{-1}(y) are all roots of y = g(x)
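The monotone-transformation rule can be verified by Monte Carlo; the sketch below uses the assumed example Y = exp(X) with X ~ N(0, 1), so f_Y(y) = f_X(ln y)/y, and compares an empirical probability with the integral of the derived density:

```python
import math, random

random.seed(7)

def f_X(x):  # standard Gaussian density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def f_Y(y):  # density of Y = exp(X): f_X(ln y) / |dy/dx| = f_X(ln y) / y
    return f_X(math.log(y)) / y

# P(1 < Y < 2): empirical estimate vs numerical integral of f_Y
N = 200000
hits = sum(1 for _ in range(N) if 1.0 < math.exp(random.gauss(0, 1)) < 2.0) / N

h = 1e-4
integral = sum(f_Y(1.0 + (k + 0.5) * h) * h for k in range(int(1.0 / h)))
```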
Chapter 1. Random Processes
1.1 Introduction
1. Mathematical modeling of a physical phenomenon
(1) Deterministic: if there is no uncertainty about its time-dependent behavior at any instant of time. ex) f(t) = cos(2πf_c t)
(2) Random or stochastic: if it is not possible to predict its precise value in advance. ---> average power, power spectral density, statistical parameters
2. Random signals in communications
(1) Information-bearing signal: a voice signal consists of randomly spaced bursts of energy of random duration.
(2) Digital communication: pseudo-random sequence.
(3) Interference component: spurious electromagnetic waves.
(4) Thermal noise: caused by the random motion of the electrons in conductors and devices at the front end of the receiver.
1.2 Mathematical Definition of a Random Process
1. Properties of a random process
(1) A random process is a function of time.
(2) They are random, i.e., it is not possible to exactly define the waveforms in the future, before conducting an experiment.
2. r.v. {X}: outcomes of a random experiment are mapped into a number.
3. r.p. {X(t)} or {X(t,s)}: outcomes of a random experiment are mapped into a waveform that is a fct. of time.
X(t,s): t is time, s is the sample; an indexed ensemble (family) of r.v.
Sample space or ensemble S; s1, s2, ..., sn are sample points.
Sample function: x_j(t) = X(t, s_j); {x1(t), x2(t), ..., xn(t)} is the ensemble.
{x1(t_k), x2(t_k), ..., xn(t_k)} = {X(t_k, s1), X(t_k, s2), ..., X(t_k, sn)} constitutes a random variable.
ex) r.p.: X(t) = A cos(2πf_c t + Θ), random binary wave, Gaussian noise
FIGURE 1.1 An ensemble of sample functions.
1.3 Stationary Processes
1. Stationary: the statistical characterization of a process is independent of time.
2. r.p. X(t) is stationary in the strict sense, or strictly stationary, if
F_{X(t1+τ),...,X(tk+τ)}(x1, ..., xk) = F_{X(t1),...,X(tk)}(x1, ..., xk)
for all time shifts τ, all k, and all possible choices of observation times t1, ..., tk.
< Observation >
1) k = 1: F_{X(t)}(x) = F_{X(t+τ)}(x) = F_X(x) for all t and τ. The 1st-order distribution fct. of a stationary r.p. is independent of time.
2) k = 2 and τ = -t1: F_{X(t1),X(t2)}(x1, x2) = F_{X(0),X(t2-t1)}(x1, x2) for all t1 and t2. The 2nd-order distribution fct. of a stationary r.p. depends only on the difference between the observation times.
3. Two r.p. X(t), Y(t) are jointly stationary if the joint distribution functions of the r.v. X(t1), ..., X(tk) and Y(t1'), ..., Y(tj') are invariant with respect to the location of the origin t = 0 for all k and j, and all choices of observation times t1, ..., tk and t1', ..., tj'.
Ex 1.1)
FIGURE 1.2 Illustrating the probability of a joint event: a possible sample function passing through amplitude intervals (a1, b1), (a2, b2), (a3, b3) at times t1, t2, t3.
3. Autocovariance fct. of stationary r. p. X(t)
C_X(t1, t2) = E[(X(t1) - μ_X)(X(t2) - μ_X)]
= R_X(t2 - t1) - μ_X²
4. Wide-sense stationary (second-order stationary, weakly stationary):
μ_X(t) = constant for all t, and
R_X(t1, t2) = R_X(t2 - t1) for all t1 and t2.
∴ strict-sense stationary ⇒ wide-sense stationary.
5. Properties of the Autocorrelation Function
(1) Autocorrelation fct. of stationary process X(t): R_X(τ) = E[X(t+τ)X(t)] for all t
(2) Properties
a) Mean-square value: setting τ = 0, R_X(0) = E[X²(t)]
b) R_X(τ) is an even fct.: R_X(τ) = R_X(-τ)
c) R_X(τ) has its maximum at τ = 0: |R_X(τ)| ≤ R_X(0)
pf. of c)
E[(X(t+τ) ± X(t))²] ≥ 0
⇒ E[X²(t+τ)] ± 2E[X(t+τ)X(t)] + E[X²(t)] ≥ 0
⇒ 2R_X(0) ± 2R_X(τ) ≥ 0
∴ -R_X(0) ≤ R_X(τ) ≤ R_X(0)
(3) Physical meaning of R_X(τ): the interdependence of X(t) and X(t+τ).
Decorrelation time τ_0: for τ > τ_0, R_X(τ) < 0.01·R_X(0).
Ex 1.2) Sinusoidal wave with random phase
X(t) = A cos(2πf_c t + Θ), where f_Θ(θ) = 1/(2π) for -π ≤ θ ≤ π, 0 otherwise
R_X(τ) = E[X(t+τ)X(t)]
= E[A² cos(2πf_c t + 2πf_c τ + Θ) cos(2πf_c t + Θ)]
= (A²/2) cos(2πf_c τ)
FIGURE 1.4 Illustrating the autocorrelation functions of slowly and rapidly fluctuating random processes.
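The result R_X(τ) = (A²/2)cos(2πf_cτ) can be verified by averaging over the random phase; the sketch below uses assumed values A = 2, f_c = 5 and an arbitrary absolute time t:

```python
import math, random

random.seed(8)

A, fc, t = 2.0, 5.0, 0.123
thetas = [random.uniform(-math.pi, math.pi) for _ in range(20000)]

def R_hat(tau):
    # ensemble average of X(t+tau) X(t) over the random phase
    return sum(
        A * math.cos(2 * math.pi * fc * (t + tau) + th)
        * A * math.cos(2 * math.pi * fc * t + th)
        for th in thetas
    ) / len(thetas)

def R_exact(tau):
    return (A * A / 2) * math.cos(2 * math.pi * fc * tau)
```

Note that the estimate does not depend on the chosen t, consistent with stationarity.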
FIGURE 1.5 Autocorrelation function of a sine wave with random phase.
Ex 1.3) Random binary wave
Amplitudes +A and -A, symbol duration T, random delay t_d uniform over [0, T]:
f_{T_d}(t_d) = 1/T for 0 ≤ t_d ≤ T, 0 otherwise
P(+A) = P(-A) = 1/2, ∴ E[X(t)] = 0
R_X(0) = E[X(t)X(t)] = A²
R_X(τ) = E[X(t)X(t+τ)] = 0 for |τ| ≥ T
FIGURE 1.6 Sample function of random binary wave.
FIGURE 1.7 Autocorrelation function of random binary wave.
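The triangular autocorrelation of the random binary wave (stated in Ex 1.6 as R_X(τ) = A²(1 - |τ|/T) for |τ| < T) can be reproduced by simulating the random delay directly; parameter values below are assumed:

```python
import random

random.seed(9)

A, T, t = 1.0, 1.0, 2.345   # amplitude, bit duration, observation time
N = 50000

def R_hat(tau):
    acc = 0.0
    for _ in range(N):
        td = random.uniform(0.0, T)          # random delay of the bit grid
        k1 = int((t - td) // T)              # bit index at time t
        k2 = int((t + tau - td) // T)        # bit index at time t + tau
        if k1 == k2:
            acc += A * A                     # same bit -> same sign
        else:
            # different bits: independent equiprobable +-A levels
            acc += random.choice((-A, A)) * random.choice((-A, A))
    return acc / N
```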
6. Cross-correlation Functions
r.p. X(t) with autocorrelation R_X(t, u); r.p. Y(t) with autocorrelation R_Y(t, u).
Cross-correlation fct. of X(t) and Y(t):
R_XY(t, u) = E[X(t)Y(u)]
R_YX(t, u) = E[Y(t)X(u)]
Correlation matrix of r.p. X(t) and Y(t):
R(t, u) = | R_X(t, u)   R_XY(t, u) |
          | R_YX(t, u)  R_Y(t, u)  |
If X(t) and Y(t) are each w.s.s. and jointly w.s.s.:
R(τ) = | R_X(τ)   R_XY(τ) |
       | R_YX(τ)  R_Y(τ)  |, where τ = t - u
R_XY(τ) ≠ R_XY(-τ), i.e., not an even fct.; R_XY(0) is not necessarily a maximum; but
R_XY(τ) = R_YX(-τ)
Ex 1.4) Quadrature-Modulated Processes
X1(t) and X2(t) obtained from a w.s.s. r.p. X(t):
X1(t) = X(t) cos(2πf_c t + Θ)
X2(t) = X(t) sin(2πf_c t + Θ), where Θ is uniform over [0, 2π) and independent of X(t)
Cross-correlation fct.:
R_12(τ) = E[X1(t)X2(t-τ)]
= E[X(t)X(t-τ)]·E[cos(2πf_c t + Θ) sin(2πf_c t - 2πf_c τ + Θ)]
= -(1/2) R_X(τ) sin(2πf_c τ)
R_12(0) = E[X1(t)X2(t)] = 0 → orthogonal

1.5 Ergodic Processes
1. Ensemble average vs. time average
(1) Expectation or ensemble average of r.p. X(t): average across the process.
(2) Time average or long-term sample average: average along the process.
(3) For a sample function x(t) of a w.s.s. r.p. X(t) with -T ≤ t ≤ T:
(a) Time average (dc value): μ_X(T) = (1/2T) ∫_{-T}^{T} x(t) dt
(b) Mean of the time average μ_X(T):
E[μ_X(T)] = (1/2T) ∫_{-T}^{T} E[x(t)] dt = μ_X
1. A w.s.s. r.p. X(t) is ergodic in the mean if
lim_{T→∞} μ_X(T) = μ_X and lim_{T→∞} var[μ_X(T)] = 0
2. A w.s.s. r.p. X(t) is ergodic in the autocorrelation fct. if
lim_{T→∞} R_X(τ, T) = R_X(τ) and lim_{T→∞} var[R_X(τ, T)] = 0,
where R_X(τ, T) = (1/2T) ∫_{-T}^{T} x(t+τ)x(t) dt
= the time-averaged autocorrelation fct. of sample fct. x(t) from w.s.s. r.p. X(t)

1.6 Transmission of a r.p. through a linear filter
w.s.s. r.p. X(t) → h(t) → w.s.s. r.p. Y(t)
Thus the distribution functions map as
F_{X(t1),...,X(tk)}(x1, ..., xk) → F_{Y(t1),...,Y(tk)}(y1, ..., yk)
FIGURE 1.8 Transmission of a random process through a linear time-invariant filter.
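A discrete-time analogue makes the input-output relations easy to check. In the sketch below (assumed FIR filter taps and input statistics, chosen only for illustration), the output mean of y[n] = Σ_k h[k]x[n-k] comes out as μ_X·H(0), with H(0) = Σ_k h[k]:

```python
import random

random.seed(10)

h = [0.5, 0.3, 0.2]     # stable FIR filter; H(0) = sum of the taps
mu_X = 2.0
N = 100000

x = [mu_X + random.gauss(0.0, 1.0) for _ in range(N)]
y = [sum(h[k] * x[n - k] for k in range(len(h)))
     for n in range(len(h) - 1, N)]

mu_Y = sum(y) / len(y)
H0 = sum(h)
```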
1. Mean of Y(t):
μ_Y(t) = E[Y(t)] = E[∫_{-∞}^{∞} h(τ1) X(t - τ1) dτ1]
= ∫_{-∞}^{∞} h(τ1) E[X(t - τ1)] dτ1
= ∫_{-∞}^{∞} h(τ1) μ_X(t - τ1) dτ1
= μ_X ∫_{-∞}^{∞} h(τ1) dτ1 (X(t) w.s.s.)
∴ μ_Y = μ_X·H(0), where H(0) is the zero-frequency (dc) response of the filter.
2. Autocorrelation fct.:
R_Y(t, u) = E[Y(t)Y(u)]
= E[∫_{-∞}^{∞} h(τ1)X(t - τ1) dτ1 ∫_{-∞}^{∞} h(τ2)X(u - τ2) dτ2]
= ∫_{-∞}^{∞} dτ1 h(τ1) ∫_{-∞}^{∞} dτ2 h(τ2) R_X(t - τ1, u - τ2)
= ∫_{-∞}^{∞} dτ1 h(τ1) ∫_{-∞}^{∞} dτ2 h(τ2) R_X(τ - τ1 + τ2), with τ = t - u (X(t) w.s.s.)
∴ Y(t) is also w.s.s.
Mean square value:
E[Y²(t)] = R_Y(0) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} h(τ1) h(τ2) R_X(τ2 - τ1) dτ1 dτ2 = constant
1.7 Power Spectral Density
1. Mean square value of Y(t) and the p.s.d.
h1(τ1) ↔ H(f)
Power spectral density or power spectrum of w.s.s. r.p. X(t):
S_X(f) = ∫_{-∞}^{∞} R_X(τ) exp(-j2πfτ) dτ [watts/Hz]
Mean square value of Y(t): substituting h(τ1) = ∫_{-∞}^{∞} H(f) exp(j2πfτ1) df into the double integral for R_Y(0) gives
E[Y²(t)] = ∫_{-∞}^{∞} df H(f) ∫_{-∞}^{∞} dτ2 h(τ2) exp(j2πfτ2) ∫_{-∞}^{∞} dτ R_X(τ) exp(-j2πfτ), with τ = τ2 - τ1
= ∫_{-∞}^{∞} df H(f) H*(f) S_X(f)
= ∫_{-∞}^{∞} |H(f)|² S_X(f) df
For an ideal narrowband filter of bandwidth Δf centered at f_c:
E[Y²(t)] ≈ 2Δf·S_X(f_c)
FIGURE 1.9 Magnitude response of ideal narrowband filter.
2. Properties of the Power Spectral Density
1) Einstein-Wiener-Khintchine relations (X(t): w.s.s. r.p.):
S_X(f) = ∫_{-∞}^{∞} R_X(τ) exp(-j2πfτ) dτ
R_X(τ) = ∫_{-∞}^{∞} S_X(f) exp(j2πfτ) df
2) Property 1. For w.s.s. r.p., S_X(0) = ∫_{-∞}^{∞} R_X(τ) dτ
3) Property 2. Mean square value of w.s.s. r.p.: E[X²(t)] = R_X(0) = ∫_{-∞}^{∞} S_X(f) df
4) Property 3. For w.s.s. r.p., S_X(f) ≥ 0 for all f.
5) Property 4. S_X(-f) = S_X(f): even fct. ⇐ R_X(-τ) = R_X(τ)
6) Property 5. The p.s.d., appropriately normalized, has the properties usually associated with a probability density fct.:
p_X(f) = S_X(f) / ∫_{-∞}^{∞} S_X(f) df
W_rms = [∫_{-∞}^{∞} f² p_X(f) df]^{1/2} (rms bandwidth)
Ex 1.5) Sinusoidal wave with random phase
r.p. X(t) = A cos(2πf_C t + Θ), where Θ is a uniform r.v. over [-π, π]
R_X(τ) = (A²/2) cos(2πf_C τ)
∴ S_X(f) = (A²/4)[δ(f - f_C) + δ(f + f_C)]
FIGURE 1.10 Power spectral density of sine wave with random phase; δ(f) denotes the delta function at f = 0.
Ex 1.6) Random binary wave with amplitudes A and -A
R_X(τ) = A²(1 - |τ|/T) for |τ| < T, 0 for |τ| ≥ T
S_X(f) = ∫_{-T}^{T} A²(1 - |τ|/T) exp(-j2πfτ) dτ = A²T sinc²(fT)
Energy spectral density of a rectangular pulse g(t): E_g(f) = A²T² sinc²(fT)
∴ S_X(f) = E_g(f)/T
FIGURE 1.11 Power spectral density of random binary wave.
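A numeric cross-check of Ex 1.6 (with assumed A = T = 1): integrating S_X(f) = A²T sinc²(fT) over frequency should recover the total power E[X²] = R_X(0) = A², per Property 2.

```python
import math

A, T = 1.0, 1.0

def S_X(f):
    if f == 0.0:
        return A * A * T
    s = math.sin(math.pi * f * T) / (math.pi * f * T)   # sinc(fT)
    return A * A * T * s * s

# total power: integrate S_X over [-F, F]; the tail beyond F is negligible
F, h = 200.0, 0.01
total = sum(S_X(-F + (k + 0.5) * h) * h for k in range(int(2 * F / h)))
```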
Ex 1.7) Mixing of a r.p. with a sinusoidal process
Y(t) = X(t) cos(2πf_C t + Θ),
where X(t) is a w.s.s. r.p. and Θ is a uniform r.v. independent of X(t).
R_Y(τ) = (1/2) R_X(τ) cos(2πf_C τ)
∴ S_Y(f) = (1/4)[S_X(f - f_C) + S_X(f + f_C)]
3. Relation between the Power Spectral Densities of the Input and Output Random Processes
S_Y(f) = ∫_{-∞}^{∞} R_Y(τ) exp(-j2πfτ) dτ
= ∫_{-∞}^{∞} ∫_{-∞}^{∞} h(τ1) h(τ2) ∫_{-∞}^{∞} R_X(τ - τ1 + τ2) exp(-j2πfτ) dτ dτ1 dτ2
(let τ0 = τ - τ1 + τ2, i.e., τ = τ0 + τ1 - τ2)
= H(f) H*(f) S_X(f)
∴ S_Y(f) = |H(f)|² S_X(f)
4. Relation between the Power Spectral Density and the Amplitude Spectrum of a Sample Function
Sample fct. x(t) of a w.s.s. and ergodic r.p. X(t) with p.s.d. S_X(f).
X(f, T): Fourier transform of the truncated sample fct. x(t):
X(f, T) = ∫_{-T}^{T} x(t) exp(-j2πft) dt
Obtain R_X(τ) using the time-average formula:
R_X(τ) = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t+τ) x(t) dt
∴ S_X(f) = lim_{T→∞} (1/2T) E[|X(f, T)|²]
= lim_{T→∞} (1/2T) E[|∫_{-T}^{T} x(t) exp(-j2πft) dt|²]
Conclusion) The p.s.d. of an ergodic process can be estimated from a single sample function.
5. Cross-Spectral Density
A measure of the frequency interrelationship between 2 random processes:
R_XY(τ) ↔ S_XY(f) = ∫_{-∞}^{∞} R_XY(τ) exp(-j2πfτ) dτ
R_YX(τ) ↔ S_YX(f) = ∫_{-∞}^{∞} R_YX(τ) exp(-j2πfτ) dτ
R_XY(τ) = R_YX(-τ) ⇒ S_XY(f) = S_YX(-f) = S_YX*(f)
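A discrete-time version of S_X(f) = lim (1/2T)E[|X(f,T)|²] is the averaged periodogram. The sketch below uses an assumed example, white Gaussian noise of unit variance, whose p.s.d. is flat and equal to σ² = 1, and a direct DFT for transparency (an FFT would normally be used):

```python
import cmath, math, random

random.seed(11)

n, trials = 64, 50
avg = [0.0] * n

for _ in range(trials):
    x = [random.gauss(0.0, 1.0) for _ in range(n)]   # white noise, sigma^2 = 1
    for m in range(n):
        # direct DFT bin m of this record
        Xm = sum(x[k] * cmath.exp(-2j * math.pi * m * k / n) for k in range(n))
        avg[m] += abs(Xm) ** 2 / n                   # periodogram of one record

avg = [a / trials for a in avg]     # averaged periodogram, approx. S_X = 1
mean_S = sum(avg) / n
```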
Table 1.2 Graphical Summary of Autocorrelation Functions and Power Spectral Densities of Random Processes of Zero Mean and Unit Variance
1.8 Gaussian Process
1. Definition
Process X(t) is a Gaussian process if every linear functional of X(t) is a Gaussian r.v.:
Y = ∫_0^T g(t) X(t) dt, g(t): some fct., Y: r.v.
If the r.v. Y is a Gaussian-distributed r.v. for every g(t), then X(t) is a Gaussian process.
f_Y(y) = (1/(√(2π)·σ_Y)) exp(-(y - μ_Y)²/(2σ_Y²))
Normalized (μ_Y = 0, σ_Y = 1), i.e., the N(0,1) Gaussian distribution:
f_Y(y) = (1/√(2π)) exp(-y²/2)
FIGURE 1.13 Normalized Gaussian distribution.
2. Virtues of Gaussian process
1) Gaussian process has many properties that make analytic results possible.
2) Random processes produced by physical phenomena are often such that a Gaussian model is appropriate.
3. Central Limit Theorem
1) Let X_i, i = 1, 2, ..., N be a set of r.v. that satisfies:
a) The X_i are statistically independent.
b) The X_i have the same p.d.f. with mean μ_X and variance σ_X².
∴ X_i: a set of independently and identically distributed (i.i.d.) r.v.s.
Now define the normalized r.v.
Y_i = (X_i - μ_X)/σ_X, i = 1, 2, ..., N,
so that E[Y_i] = 0 and var[Y_i] = 1, and set
V_N = (1/√N) Σ_{i=1}^{N} Y_i
< Central limit theorem >
The probability distribution of V_N approaches a normalized Gaussian distribution N(0, 1) in the limit as N approaches infinity.
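The theorem is easy to see numerically; the sketch below uses an assumed example, X_i uniform on (0, 1) with μ = 1/2 and σ² = 1/12, forms the normalized sum V_N for N = 50, and checks that it looks N(0, 1):

```python
import math, random

random.seed(12)

mu, sigma = 0.5, math.sqrt(1.0 / 12.0)   # moments of uniform(0, 1)
N, trials = 50, 20000

vals = []
for _ in range(trials):
    s = sum((random.random() - mu) / sigma for _ in range(N))
    vals.append(s / math.sqrt(N))        # normalized sum V_N

m = sum(vals) / trials
v = sum((x - m) ** 2 for x in vals) / trials
# for N(0,1), P(|V| < 1) is about 0.6827
frac_within_1 = sum(1 for x in vals if abs(x) < 1.0) / trials
```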
4. Properties of Gaussian Process
1) Property 1.
Gaussian P. X(t) → h(t) (stable, linear) → Gaussian P. Y(t)
If a Gaussian process X(t) is applied to a stable linear filter, then the random process Y(t) developed at the output of the filter is also Gaussian.
2) Property 2.
Consider the set of r.v. or samples X(t1), X(t2), ..., X(tn) obtained by observing a r.p. X(t) at times t1, t2, ..., tn.
If the process X(t) is Gaussian, then this set of r.v.s are jointly Gaussian for any n, with their n-fold joint p.d.f. being completely determined by specifying the set of means
μ_{X(ti)} = E[X(ti)], i = 1, 2, ..., n,
and the set of autocovariance functions
C_X(tk, ti) = E[(X(tk) - μ_{X(tk)})(X(ti) - μ_{X(ti)})].
3) Property 3.
If a Gaussian process is stationary, then the process is also strictly stationary.
4) Property 4.
If the random variables X(t1), X(t2), ..., X(tn), obtained by sampling a Gaussian process X(t) at times t1, t2, ..., tn, are uncorrelated, i.e., C_X(tk, ti) = 0 for i ≠ k, then they are statistically independent.