Probability Theory and Random Processes
Communication Systems, 5ed., S. Haykin and M. Moher, John Wiley & Sons, Inc., 2006.
Probability
• Probability theory is based on phenomena that can be modeled by an experiment whose outcome is subject to chance.
• Definition: A random experiment is repeated n times (n trials) and the event A is observed m times (m occurrences). The probability of A is the relative frequency of occurrence m/n (in the limit of a large number of trials).
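The relative-frequency definition lends itself to a quick simulation. The sketch below (the die event, trial count, and seed are illustrative choices, not from the text) estimates P[A] for A = "a fair die shows an even number":

```python
import random

random.seed(1)

# Repeat the experiment n times and count the m occurrences of event A.
n = 100_000
m = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)
p_estimate = m / n  # relative frequency m/n; should approach 1/2
```

With a large n the estimate settles near the true probability 1/2, illustrating the frequency interpretation.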
Probability Based on Set Theory
• Definition: An experiment has K possible outcomes, where the kth outcome is represented as the sample point s_k. The set of all outcomes forms the sample space S. The probability measure P satisfies the axioms:
– 0 ≤ P[A] ≤ 1
– P[S] = 1
– If A and B are mutually exclusive events (the two events cannot occur in the same trial), P[A∪B] = P[A] + P[B]; otherwise P[A∪B] = P[A] + P[B] − P[A∩B]
– The complement satisfies P[Ā] = 1 − P[A]
– If A1, A2, …, Am are mutually exclusive events that together cover the sample space, then P[A1] + P[A2] + … + P[Am] = 1
Venn Diagrams
[Figure: two Venn diagrams of events A and B in the sample space S]
• Events A and B mutually exclusive in S (disjoint regions): a sample s_k can come only from A, from B, or from neither.
• Events A and B not mutually exclusive in S (overlapping regions): a sample s_k can also come from both A and B.
Conditional Probability
• Definition: An experiment involves a pair of events A and B where the probability of one is conditioned on the occurrence of the other. Example: P[A|B] is the probability of event A given the occurrence of event B
• In terms of sets and subsets:
– P[A|B] = P[A∩B] / P[B]
– P[A∩B] = P[A|B]P[B] = P[B|A]P[A]
• Definition: If events A and B are independent, then the conditional probability is simply the elementary probability, e.g. P[A|B] = P[A], P[B|A] = P[B].
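A small enumeration (the two-dice events here are illustrative, not from the text) verifies P[A|B] = P[A∩B]/P[B], and happens to show a case where A and B are independent:

```python
from itertools import product

# Two fair dice: 36 equally likely outcome pairs.
outcomes = list(product(range(1, 7), repeat=2))
A = {o for o in outcomes if o[0] + o[1] == 7}   # "sum is seven"
B = {o for o in outcomes if o[0] == 3}          # "first die shows three"

def P(event):
    return len(event) / len(outcomes)

p_A_given_B = P(A & B) / P(B)   # conditional probability by the definition
# Here p_A_given_B equals P(A) = 1/6, so A and B are independent.
```

Conditioning on B restricts attention to the six outcomes with first die 3, of which exactly one, (3, 4), sums to seven.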
Random Variables
• Definition: A random variable is a function that assigns a numerical value to each outcome of a random experiment. X(s) denotes the numerical value for the outcome s.
• When the sample space is the real number line, x = s.
• Definition: The cumulative distribution function (cdf) assigns a probability to the occurrence of X at or below a specified value: FX(x) = P[X ≤ x].
• Properties:– 0 ≤ FX(x) ≤ 1
– FX(x1) ≤ FX(x2), if x1 ≤ x2
Random Variables
• Definition: The probability density function (pdf) is an alternative description of the probability of the random variable X: fX(x) = d/dx FX(x)
• P[x1 ≤ X ≤ x2] = P[X ≤ x2] − P[X ≤ x1] = FX(x2) − FX(x1) = $\displaystyle\int_{x_1}^{x_2} f_X(x)\,dx$
Example Distributions
• Uniform distribution on [a, b]:

$$
f_X(x) = \begin{cases} 0, & x < a \\ \dfrac{1}{b-a}, & a \le x \le b \\ 0, & x > b \end{cases}
\qquad
F_X(x) = \begin{cases} 0, & x < a \\ \dfrac{x-a}{b-a}, & a \le x \le b \\ 1, & x > b \end{cases}
$$
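A minimal numeric check (a, b, the interval, and the step size are arbitrary choices) that probabilities computed from the uniform cdf match the integral of the pdf:

```python
# Uniform distribution on [a, b]: pdf f and cdf F.
a, b = 2.0, 5.0

def f(x):
    return 1.0 / (b - a) if a <= x <= b else 0.0

def F(x):
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

# P[x1 <= X <= x2] = F(x2) - F(x1) = integral of f over [x1, x2]
x1, x2 = 2.5, 4.0
dx = 1e-4
steps = round((x2 - x1) / dx)
riemann = sum(f(x1 + k * dx) * dx for k in range(steps))
```

The Riemann sum of the pdf agrees with the cdf difference F(x2) − F(x1) = 0.5 to within the step size.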
Several Random Variables
• CDF: $F_{X,Y}(x, y) = \mathbf{P}[X \le x,\ Y \le y]$
• PDF: $f_{X,Y}(x, y) = \dfrac{\partial^2}{\partial x\,\partial y} F_{X,Y}(x, y)$, with $\displaystyle\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f_{X,Y}(u, v)\,du\,dv = 1$
• Marginal cdf: $F_X(x) = \displaystyle\int_{-\infty}^{x}\!\int_{-\infty}^{\infty} f_{X,Y}(u, v)\,dv\,du$, $\quad F_Y(y) = \displaystyle\int_{-\infty}^{y}\!\int_{-\infty}^{\infty} f_{X,Y}(u, v)\,du\,dv$
• Marginal pdf: $f_X(x) = \displaystyle\int_{-\infty}^{\infty} f_{X,Y}(x, v)\,dv$, $\quad f_Y(y) = \displaystyle\int_{-\infty}^{\infty} f_{X,Y}(u, y)\,du$
• Conditional pdf: $f_Y(y \mid x) = \dfrac{f_{X,Y}(x, y)}{f_X(x)}$
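The discrete analogue of marginalization and conditioning can be sketched on a joint pmf; the pair of fair dice below is an illustrative choice:

```python
# Joint pmf of two independent fair dice on a 6x6 grid.
joint = {(i, j): 1 / 36 for i in range(1, 7) for j in range(1, 7)}

# Marginal pmf: sum over the other variable
# (discrete analogue of f_X(x) = integral of f_{X,Y}(x, v) dv).
fx = {i: sum(p for (a, b), p in joint.items() if a == i) for i in range(1, 7)}

# Conditional pmf f_Y(y | x) = f_{X,Y}(x, y) / f_X(x), here for x = 3.
cond = {j: joint[(3, j)] / fx[3] for j in range(1, 7)}
```

Each marginal value comes out to 1/6 and the conditional pmf sums to 1, as the continuous formulas require of their discrete counterparts.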
Statistical Averages
• Expected value: $\mu_X = \mathbf{E}[X] = \displaystyle\int_{-\infty}^{\infty} x\, f_X(x)\,dx$
• Function of a random variable: for $Y = g(X)$,

$$
\mathbf{E}[Y] = \int_{-\infty}^{\infty} y\, f_Y(y)\,dy = \mathbf{E}[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx
$$

• Text Example 5.4
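The identity E[g(X)] = ∫ g(x) f_X(x) dx can be sanity-checked by Monte Carlo. Below, X is uniform on [0, 1] and g(x) = x² (illustrative choices), for which the exact value is ∫₀¹ x² dx = 1/3:

```python
import random

random.seed(2)

# Monte Carlo estimate of E[g(X)] for X ~ Uniform[0, 1], g(x) = x**2.
n = 200_000
samples = [random.random() for _ in range(n)]
e_g = sum(x * x for x in samples) / n  # sample mean of g(X); exact is 1/3
```

Averaging g over samples of X sidesteps ever computing the density of Y = g(X), which is the practical point of the identity.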
Statistical Averages
• nth moment: $\mathbf{E}[X^n] = \displaystyle\int_{-\infty}^{\infty} x^n f_X(x)\,dx$
• Mean-square value of X: $\mathbf{E}[X^2] = \displaystyle\int_{-\infty}^{\infty} x^2 f_X(x)\,dx$
• nth central moment: $\mathbf{E}[(X - \mu_X)^n] = \displaystyle\int_{-\infty}^{\infty} (x - \mu_X)^n f_X(x)\,dx$
• Variance of X: $\sigma_X^2 = \mathbf{E}[(X - \mu_X)^2] = \displaystyle\int_{-\infty}^{\infty} (x - \mu_X)^2 f_X(x)\,dx = \mathbf{E}[X^2] - \mu_X^2$
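Sample moments of a uniform variable (mean 1/2, mean-square 1/3, variance 1/12 — standard values, used here as a check) give a quick empirical illustration of these definitions:

```python
import random

random.seed(3)

# Sample moments of X ~ Uniform[0, 1].
n = 200_000
xs = [random.random() for _ in range(n)]
mean = sum(xs) / n                          # first moment, ~1/2
mean_square = sum(x * x for x in xs) / n    # second moment, ~1/3
variance = mean_square - mean ** 2          # E[X^2] - mu^2, ~1/12
```

The variance is computed via the shortcut E[X²] − μ², the last identity on the slide.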
Joint Moments
• Correlation: $\mathbf{E}[X^i Y^k] = \displaystyle\iint x^i y^k f_{X,Y}(x, y)\,dx\,dy$ — the expected value of the product; also seen as a weighted inner product
• Covariance — the correlation of the central moments:

$$
\operatorname{cov}[XY] = \mathbf{E}\big[(X - \mathbf{E}[X])(Y - \mathbf{E}[Y])\big] = \mathbf{E}[XY] - \mu_X \mu_Y
$$

• Correlation coefficient: $\rho = \dfrac{\operatorname{cov}[XY]}{\sigma_X \sigma_Y}$, with $\rho = 0$ for uncorrelated variables and $|\rho| = 1$ for strongly correlated variables
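A sketch of the correlation coefficient for a linearly related pair. The model Y = 2X + noise is an illustrative choice; its theoretical ρ is 2/√5 ≈ 0.894:

```python
import random

random.seed(4)

# Sample covariance and correlation coefficient for Y = 2X + noise.
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [2 * x + random.gauss(0, 1) for x in xs]

mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
rho = cov / (sx * sy)  # near 2/sqrt(5) ~ 0.894
```

Dividing the covariance by σ_X σ_Y normalizes it into [−1, 1], which is what makes ρ comparable across variable scales.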
Random Processes
• Definition: a random process X(t) is described as a time-varying random variable
• Mean of the random process: $\mu_X(t) = \mathbf{E}[X(t)] = \displaystyle\int_{-\infty}^{\infty} x\, f_{X(t)}(x)\,dx$
• Definition: a random process is first-order stationary if its pdf is constant over time, $f_{X(t_1)}(x) = f_{X(t_2)}(x)$, which gives a constant mean $\mu_X(t) = \mu_X$ and a constant variance $\sigma_X^2(t) = \sigma_X^2$
• Definition: the autocorrelation is the expected value of the product of the process sampled at two different times, $R_X(t_1, t_2) = \mathbf{E}[X(t_1)\,X(t_2)]$; the process is stationary to second order if $R_X(t_1, t_2) = R_X(t_2 - t_1)$
Random Processes
• Definition: the autocorrelation is the expected value of the product of the process sampled at two different times, $R_X(t_1, t_2) = \mathbf{E}[X(t_1)\,X(t_2)]$, which depends only on $t_2 - t_1$ for a process stationary to second order
• Definition: the autocovariance of a stationary random process is

$$
C_X(t_1, t_2) = \mathbf{E}\big[(X(t_1) - \mu_X)(X(t_2) - \mu_X)\big] = R_X(t_2 - t_1) - \mu_X^2
$$
Properties of Autocorrelation
• Definition: the autocorrelation of a stationary process depends only on the time difference: $R_X(\tau) = \mathbf{E}[X(t + \tau)\,X(t)]$
• Mean-square value: $R_X(0) = \mathbf{E}[X^2(t)]$
• Autocorrelation is an even function: $R_X(-\tau) = R_X(\tau)$
• Autocorrelation has its maximum at zero: $|R_X(\tau)| \le R_X(0)$
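These properties can be observed empirically. The sketch below uses a simple first-order recursive process and a time-average autocorrelation estimate (the process, its coefficient, and the seed are illustrative choices, not from the text):

```python
import random

random.seed(7)

# Stationary first-order process x[n] = 0.8 x[n-1] + w[n], w white Gaussian.
n = 100_000
x = [0.0] * n
for k in range(1, n):
    x[k] = 0.8 * x[k - 1] + random.gauss(0, 1)

def R(lag):
    """Time-average estimate of the autocorrelation at an integer lag."""
    pairs = [(k + lag, k) for k in range(max(0, -lag), n - max(0, lag))]
    return sum(x[i] * x[j] for i, j in pairs) / len(pairs)
```

The estimate satisfies R(−τ) = R(τ) (the same products appear in both sums) and peaks at lag zero, where it equals the mean-square value.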
Example
• Sinusoidal signal with random phase:

$$
X(t) = A \cos(2\pi f_c t + \Theta), \qquad
f_\Theta(\theta) = \begin{cases} \dfrac{1}{2\pi}, & -\pi \le \theta \le \pi \\ 0, & \text{otherwise} \end{cases}
$$

• Autocorrelation:

$$
R_X(\tau) = \mathbf{E}[X(t + \tau)\,X(t)] = \frac{A^2}{2} \cos(2\pi f_c \tau)
$$

As X(t) is compared to itself at another time, we see there is a periodic behavior in its correlation.
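The sketch below estimates R_X(τ) for this example by averaging over random phase draws (the amplitude, frequency, lag, and trial count are illustrative choices) and compares it with (A²/2)cos(2πf_c τ):

```python
import math
import random

random.seed(5)

# X(t) = A cos(2*pi*fc*t + Theta), Theta uniform on [-pi, pi].
A, fc = 2.0, 1.0
tau = 0.1
t = 0.3          # any fixed time; the result should not depend on it
n_trials = 20_000

acc = 0.0
for _ in range(n_trials):
    theta = random.uniform(-math.pi, math.pi)
    acc += (A * math.cos(2 * math.pi * fc * (t + tau) + theta)
            * A * math.cos(2 * math.pi * fc * t + theta))
r_est = acc / n_trials
r_theory = (A ** 2 / 2) * math.cos(2 * math.pi * fc * tau)
```

The ensemble average matches the closed form, and repeating with a different t leaves the estimate unchanged, consistent with stationarity.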
Cross-correlation
• Two random processes X(t) and Y(t) have the cross-correlation $R_{XY}(t, u) = \mathbf{E}[X(t)\,Y(u)]$
• If the processes are jointly wide-sense stationary, the correlations depend only on the time difference $\tau = t - u$: $R_X(t, u) = R_X(\tau)$, $R_Y(t, u) = R_Y(\tau)$, and $R_{XY}(t, u) = R_{XY}(\tau)$
Example
• Output of an LTI system when the input is a random process
• Text Example 5.7
Power Spectral Density
• Definition: the Fourier transform of the autocorrelation function is called the power spectral density

$$
S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j2\pi f\tau}\,d\tau, \qquad
R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{\,j2\pi f\tau}\,df
$$

• Consider the units of X(t): Volts or Amperes
• Autocorrelation is the projection of X(t) onto itself
• The resulting units are Watts (normalized to 1 Ohm)
Properties of PSD
• Zero-frequency value of the PSD: $S_X(0) = \displaystyle\int_{-\infty}^{\infty} R_X(\tau)\,d\tau$
• Mean-square value: $\mathbf{E}[X^2(t)] = \displaystyle\int_{-\infty}^{\infty} S_X(f)\,df$ — which theorem does this property resemble?
• PSD is non-negative: $S_X(f) \ge 0$
• PSD of a real-valued RP is even: $S_X(-f) = S_X(f)$
Example
• Text Example 5.12 — mixing of a random process with a sinusoidal process:

$$
Y(t) = X(t) \cos(2\pi f_c t + \Theta)
$$

where X(t) is a wide-sense stationary RP (to make it easier) and Θ is uniformly distributed, but not time-varying
– Autocorrelation:

$$
R_Y(\tau) = \mathbf{E}[Y(t + \tau)\,Y(t)] = \frac{1}{2} R_X(\tau) \cos(2\pi f_c \tau)
$$

– PSD:

$$
S_Y(f) = \frac{1}{4}\big[S_X(f - f_c) + S_X(f + f_c)\big]
$$
PSD of LTI System
• Start with what you know and work the math. The LTI system output is the convolution $Y(t) = h(t) * X(t)$, so

$$
\begin{aligned}
S_Y(f) &= \int_{-\infty}^{\infty} R_Y(\tau)\, e^{-j2\pi f\tau}\,d\tau
        = \int_{-\infty}^{\infty} \mathbf{E}[Y(t+\tau)\,Y(t)]\, e^{-j2\pi f\tau}\,d\tau \\
&= \int_{-\infty}^{\infty} \mathbf{E}\!\left[\int_{-\infty}^{\infty} h(\tau_1)\,X(t+\tau-\tau_1)\,d\tau_1
   \int_{-\infty}^{\infty} h(\tau_2)\,X(t-\tau_2)\,d\tau_2\right] e^{-j2\pi f\tau}\,d\tau \\
&= \iiint h(\tau_1)\,h(\tau_2)\, \mathbf{E}[X(t+\tau-\tau_1)\,X(t-\tau_2)]\, e^{-j2\pi f\tau}\,d\tau_1\,d\tau_2\,d\tau
\end{aligned}
$$

(for a real impulse response, $h^* = h$)
PSD of LTI System
• The PSD reduces, using $\mathbf{E}[X(t+\tau-\tau_1)\,X(t-\tau_2)] = R_X(\tau - \tau_1 + \tau_2)$ and the change of variables $\tau_0 = \tau - \tau_1 + \tau_2$:

$$
\begin{aligned}
S_Y(f) &= \iiint h(\tau_1)\,h(\tau_2)\,R_X(\tau - \tau_1 + \tau_2)\, e^{-j2\pi f\tau}\,d\tau_1\,d\tau_2\,d\tau \\
&= \iiint h(\tau_1)\,h(\tau_2)\,R_X(\tau_0)\, e^{-j2\pi f(\tau_0 + \tau_1 - \tau_2)}\,d\tau_1\,d\tau_2\,d\tau_0 \\
&= \int h(\tau_1)\, e^{-j2\pi f\tau_1}\,d\tau_1 \int h(\tau_2)\, e^{\,j2\pi f\tau_2}\,d\tau_2 \int R_X(\tau_0)\, e^{-j2\pi f\tau_0}\,d\tau_0 \\
&= H(f)\,H^*(f)\,S_X(f) = |H(f)|^2\, S_X(f)
\end{aligned}
$$

• The system shapes the power spectrum of the input, as expected from a filtering-like operation.
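A discrete-time sketch of the result: for white input with variance σ², the output power E[Y²] = ∫|H(f)|² S_X(f) df reduces to σ² Σ h[k]² by Parseval's theorem. The two-tap moving-average filter, sample count, and seed below are illustrative choices:

```python
import random

random.seed(6)

# White Gaussian input of variance sigma2 through a two-tap FIR filter.
sigma2 = 1.0
h = [0.5, 0.5]   # simple moving-average impulse response
n = 200_000
x = [random.gauss(0, sigma2 ** 0.5) for _ in range(n)]
y = [h[0] * x[k] + h[1] * x[k - 1] for k in range(1, n)]

power_est = sum(v * v for v in y) / len(y)          # empirical E[Y^2]
power_theory = sigma2 * sum(c * c for c in h)       # sigma^2 * sum h[k]^2 = 0.5
```

The measured output power matches the prediction from |H(f)|² S_X(f), i.e. the filter has shaped (here, reduced) the input power as the formula says.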
Gaussian Process
• The Gaussian probability density function for a single variable is

$$
f_Y(y) = \frac{1}{\sqrt{2\pi}\,\sigma_Y} \exp\!\left(-\frac{(y - \mu_Y)^2}{2\sigma_Y^2}\right)
$$

• When the distribution has zero mean and unit variance,

$$
f_Y(y) = \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{y^2}{2}\right)
$$

• The random variable Y is then said to be normally distributed as N(0, 1)
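A quick numeric check of the N(0, 1) density (the integration range and step are arbitrary choices): it should integrate to 1 and peak at 1/√(2π) ≈ 0.3989:

```python
import math

# Standard normal pdf f_Y(y) = exp(-y^2/2) / sqrt(2*pi).
def f(y):
    return math.exp(-y * y / 2) / math.sqrt(2 * math.pi)

dy = 1e-3
steps = round(12 / dy)                               # cover [-6, 6]
total = sum(f(-6 + k * dy) * dy for k in range(steps))  # ~1 (normalization)
peak = f(0.0)                                        # 1/sqrt(2*pi)
```

Truncating the integral at ±6σ loses only about 2×10⁻⁹ of the probability mass, so the Riemann sum lands essentially on 1.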
Properties of a Gaussian Process
• The output of an LTI system is Gaussian if the input is Gaussian
• The joint pdf is completely determined by the set of means and the autocovariance functions of the samples of the Gaussian process
• If a Gaussian process is wide-sense stationary, then it is also strictly stationary
• If the samples of a Gaussian process are uncorrelated, then they are statistically independent
Noise
• Shot noise
• Thermal noise
• White noise
• Narrowband noise