RVSP Short Answers
Random Variables — Short Answer Questions (Transcript)
Random Variable 119
$F_X(\infty) = 1$
$F_X(\infty) = \frac{a}{b}(2 - e^{-b\cdot\infty}) = \frac{2a}{b}$
so $\frac{2a}{b} = 1$, i.e. $a = \frac{b}{2}$
(iii) $P[1 < X < 2] = F_X(2) - F_X(1) = \frac{a}{b}(2 - e^{-2b}) - \frac{a}{b}(2 - e^{-b})$
$= \frac{1}{2}[2 - e^{-2b} - 2 + e^{-b}]$, so $P[1 < X < 2] = \frac{1}{2}[e^{-b} - e^{-2b}]$
Example 2.19
If the probability density of a random variable is given by
$f(x) = x$ for $0 < x < 1$, and $f(x) = 2 - x$ for $1 < x < 2$,
find the probabilities that the random variable will take on a value (i) between 0.2 and 0.8, (ii) between 0.6 and 1.2.
(Aug/Sep 2006) Solution
Given $f_X(x) = x$ for $0 < x < 1$ and $f_X(x) = 2 - x$ for $1 < x < 2$.
(i) $P(0.2 < X < 0.8) = \int_{0.2}^{0.8} x\,dx = \frac{x^2}{2}\Big|_{0.2}^{0.8} = \frac{0.64 - 0.04}{2} = 0.3$
120 Probability Theory and Stochastic Processes
(ii) $P(0.6 < X < 1.2) = \int_{0.6}^{1} x\,dx + \int_{1}^{1.2} (2 - x)\,dx = \frac{x^2}{2}\Big|_{0.6}^{1} + \Big[2x - \frac{x^2}{2}\Big]_{1}^{1.2}$
$= \frac{1 - 0.36}{2} + (2.4 - 0.72) - (2 - 0.5) = 0.32 + 1.68 - 1.5$
$P\{0.6 < X < 1.2\} = 0.5$
Example 2.20
The probability density function of a random variable X is $f_X(x) = 3x^2$, $0 < x < 1$. Find (i) the value $a$ such that $P\{X = a\} = P\{X > a\}$, and (ii) the value $b$ such that $P\{X > b\} = 0.05$.
(Nov 2006) Solution
Given the probability density function $f_X(x) = 3x^2$, $0 < x < 1$.
(i) $P\{X > a\} = 1 - P\{X \le a\} = 1 - F_X(a)$
$= 1 - \int_0^a 3x^2\,dx = 1 - a^3$
$P\{X = a\} = f_X(a) = 3a^2$
Given that $P\{X = a\} = P\{X > a\}$:
$3a^2 = 1 - a^3$, i.e. $a^3 + 3a^2 = 1$, or $a^2(a + 3) = 1$ (numerically $a \approx 0.532$)
(ii) $P\{X > b\} = 1 - P\{X \le b\} = 0.05$, so $1 - F_X(b) = 0.05$
$F_X(b) = \int_0^b 3x^2\,dx = b^3$, so $1 - b^3 = 0.05$
$b^3 = 0.95$, $b = (0.95)^{1/3} \approx 0.9830$
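Both constants can be cross-checked numerically (a quick sketch; the bisection tolerance is my choice). Since $F_X(b) = b^3$, the condition $P\{X > b\} = 0.05$ means $b^3 = 0.95$.

```python
# Check Example 2.20: f_X(x) = 3x^2 on (0, 1), so F_X(x) = x^3.
# (i) Solve 3a^2 = 1 - a^3, i.e. a^3 + 3a^2 - 1 = 0, by bisection on (0, 1).
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if mid**3 + 3 * mid**2 - 1 < 0:
        lo = mid
    else:
        hi = mid
a = (lo + hi) / 2

# (ii) 1 - b^3 = 0.05 gives b^3 = 0.95.
b = 0.95 ** (1 / 3)
print(round(a, 4), round(b, 4))
```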
Example 2.21
An analog signal received at the detector (measured in microvolts) may be modelled as a Gaussian random variable N(200, 256) at a fixed point in time. What is the probability that the signal is larger than 240 μV, given that it is larger than 210 μV?
(May 2005) Solution
The analog signal is modelled as a Gaussian random variable N(200, 256): mean value $m_X = 200$ μV and variance $\sigma_X^2 = 256$, i.e. $\sigma_X = 16$ μV.
$P(X > 240) = 1 - P(X \le 240) = 1 - F\left(\frac{240 - 200}{16}\right) = 1 - F(2.5) = Q(2.5) \approx 0.0062$
$P(X > 210) = 1 - F\left(\frac{210 - 200}{16}\right) = 1 - F(0.625) = Q(0.625) \approx 0.266$
Since the event $\{X > 240\}$ is contained in $\{X > 210\}$, the probability that the signal is larger than 240 μV given that it is larger than 210 μV is
$P(X > 240 \mid X > 210) = \frac{P(X > 240)}{P(X > 210)} = \frac{Q(2.5)}{Q(0.625)} \approx \frac{0.0062}{0.266} \approx 0.023$
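The same conditional probability can be obtained with the exact Gaussian tail $Q(x) = \tfrac{1}{2}\operatorname{erfc}(x/\sqrt{2})$ instead of the rational approximation (a sketch):

```python
from math import erfc, sqrt

def Q(x):
    # exact Gaussian tail probability
    return 0.5 * erfc(x / sqrt(2.0))

mu, sigma = 200.0, 16.0           # N(200, 256) => sigma = 16
p240 = Q((240 - mu) / sigma)      # Q(2.5)
p210 = Q((210 - mu) / sigma)      # Q(0.625)
p_cond = p240 / p210              # P(X > 240 | X > 210)
print(round(p240, 5), round(p210, 4), round(p_cond, 4))
```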
Example 2.22
The lifetime of IC chips manufactured by a semiconductor manufacturer is approximately normally distributed with mean $5 \times 10^6$ hours and standard deviation $5 \times 10^6$ hours. A
mainframe manufacturer requires that at least 95% of a batch should have a lifetime greater than $4 \times 10^6$ hours. Will the deal be made? (May 2005) Solution
The lifetimes of the ICs are normally distributed (Gaussian), with
mean value $\mu_X = 5 \times 10^6$ hours and standard deviation $\sigma_X = 5 \times 10^6$ hours.
The probability of a lifetime greater than $4 \times 10^6$ hours is
$P\{X > 4 \times 10^6\} = 1 - P\{X \le 4 \times 10^6\} = 1 - F_X(4 \times 10^6)$
$= 1 - F\left(\frac{4 - 5}{5}\right) = 1 - F(-0.2) = 1 - (1 - F(0.2)) = F(0.2) = 1 - Q(0.2)$
Using the Q-function approximation,
$Q(0.2) = \frac{e^{-0.2^2/2}}{\left[(0.66)(0.2) + 0.34\sqrt{0.2^2 + 5.51}\right]\sqrt{2\pi}} \approx 0.419$
$\therefore P\{X > 4 \times 10^6\} \approx 0.581$
Since only about 58% of the batch exceeds $4 \times 10^6$ hours — well below the required 95% — the deal will not be made.
Example 2.23
For a Gaussian random variable with $a = 0$ and $\sigma = 1$, what are $P(|X| > 2)$ and $P(X > 2)$? (May 2011)
Solution
Given a Gaussian random variable X with mean value $\mu_X = 0$ and spread $\sigma = 1$.
(i) $P(|X| > 2) = P(X > 2) + P(X < -2)$
Now $P(X < -2) = F(-2) = 1 - F(2)$ and $P(X > 2) = 1 - F(2)$
$P(|X| > 2) = (1 - F(2)) + (1 - F(2)) = 2(1 - F(2)) = 2Q(2)$
Using the Q-function approximation,
$Q(2) \approx \frac{e^{-2^2/2}}{\left[0.66 \times 2 + 0.34\sqrt{2^2 + 5.51}\right]\sqrt{2\pi}} = 0.0228$
$\therefore P(|X| > 2) = 2 \times 0.0228 = 0.0456$
(ii) $P(X > 2) = 1 - P(X \le 2) = Q(2) = 0.0228$
Example 2.24
A Rayleigh density function is given by
$f(x) = x\,e^{-x^2/2}$ for $x \ge 0$, and $f(x) = 0$ for $x < 0$.
(a) Show that f(x) satisfies the properties of a pdf. (b) Find the distribution function. (c) Find $P(0.5 \le X \le 2)$. (d) Find $P(0.5 < X < 2)$.
Solution
(a) $f(x) \ge 0$ for all x, and $\int_0^{\infty} x\,e^{-x^2/2}\,dx = \left[-e^{-x^2/2}\right]_0^{\infty} = 1$
Hence the given function f(x) satisfies the properties of the pdf.
(b) The distribution function is
$F_X(x) = \int_0^x t\,e^{-t^2/2}\,dt = 1 - e^{-x^2/2}$, $x \ge 0$
(c) $P(0.5 \le X \le 2) = F_X(2) - F_X(0.5) = (1 - e^{-2}) - (1 - e^{-1/8}) = e^{-1/8} - e^{-2} \approx 0.7472$
(d) Since X is a continuous random variable, $P(X = 0.5) = P(X = 2) = 0$, so
$P(0.5 < X < 2) = P(0.5 \le X \le 2) = e^{-1/8} - e^{-2} \approx 0.7472$
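A numeric sanity check of the Rayleigh results (my own midpoint-rule integration; the grid size is arbitrary):

```python
from math import exp

def f(x):                      # Rayleigh pdf: x * exp(-x^2/2), x >= 0
    return x * exp(-x * x / 2)

def F(x):                      # closed-form CDF: 1 - exp(-x^2/2)
    return 1 - exp(-x * x / 2)

n, hi = 200000, 20.0           # integrate the pdf over [0, 20]
h = hi / n
total = sum(f((i + 0.5) * h) for i in range(n)) * h

p = F(2) - F(0.5)              # P(0.5 <= X <= 2)
print(round(total, 6), round(p, 4))
```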
Solution
Given that the number of automobiles arriving at a station is Poisson distributed, with average rate $\lambda = 50$ per hour and time duration $T = 1$ minute.
A waiting line occurs if two or more cars arrive in any one-minute interval. With $b = \lambda T = 50/60 = 5/6$, the probability of a waiting line is
$P\{k \ge 2\} = 1 - P\{k \le 1\} = 1 - F_X(1)$
But $F_X(1) = e^{-b}\left[\frac{b^0}{0!} + \frac{b^1}{1!}\right] = e^{-b}(1 + b)$
$\therefore P\{k \ge 2\} = 1 - e^{-5/6}\left(1 + \frac{5}{6}\right) \approx 0.2032$
Waiting line occurs at about 20.32% of the time.
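The waiting-line probability can be confirmed with an exact Poisson tail sum (a sketch):

```python
from math import exp, factorial

b = 50.0 / 60.0                                   # lambda*T, arrivals per minute
p_wait = 1 - sum(exp(-b) * b**k / factorial(k) for k in range(2))  # 1 - P{k <= 1}
print(round(p_wait, 4))
```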
Example 2.26
If a mass function is given by
$P(x) = Ax$ for $x = 1, 2, \ldots, 50$; $P(x) = A(100 - x)$ for $x = 51, 52, \ldots, 100$; and $P(x) = 0$ otherwise.
(i) Find A that makes the function a probability mass function and sketch its graph. (ii) Find P(X > 50), P(X < 50), P(X = 50), P(25 < X < 75), and P(X: odd numbers). (iii) If the events indicated in (ii) are A, B, C, D, E respectively, find P(A|B), P(A|C),
P(A|D), P(C|D). Are the pairs A, B; A, C; A, D; C, D independent events? (Dec 2002)
Solution
Given the mass function of a random variable X:
$f_X(x) = Ax$ for $x = 1, 2, \ldots, 50$; $A(100 - x)$ for $x = 51, 52, \ldots, 100$; 0 otherwise.
(i) If the function is a probability mass function, we know that $\sum_x f_X(x) = 1$:
$\sum_{x=1}^{50} Ax + \sum_{x=51}^{100} A(100 - x) = A(1275) + A(1225) = 2500A = 1$, so $A = \frac{1}{2500}$
(a) $P(X > 50) = \sum_{x=51}^{100} \frac{100 - x}{2500} = \frac{1225}{2500} = 0.49$
(b) $P(X < 50) = \sum_{x=1}^{49} \frac{x}{2500} = \frac{1}{2500}[1 + 2 + 3 + \cdots + 49] = \frac{1}{2500}\times\frac{49 \times 50}{2} = \frac{1225}{2500} = 0.49$
(c) $P(X = 50) = \frac{50}{2500} = 0.02$
(d) $P(25 < X < 75) = \sum_{x=26}^{50} \frac{x}{2500} + \sum_{x=51}^{74} \frac{100 - x}{2500} = \frac{950 + 900}{2500} = \frac{1850}{2500} = 0.74$
(e) For the odd values, $[1 + 3 + \cdots + 49] = \left(\frac{49 + 1}{2}\right)^2 = 25 \times 25$ in each half of the pmf,
$\therefore P[\text{odd number}] = \frac{2}{2500}\times 25 \times 25 = \frac{1250}{2500} = 0.5$
(iii) The events have probabilities
$P(A) = P[X > 50] = 0.49$
$P(B) = P(X < 50) = 0.49$
$P(C) = P(X = 50) = \frac{50}{2500} = 0.02$
$P(D) = P(25 < X < 75) = \frac{74}{100} = 0.74$
$P(E) = P[\text{odd number}] = \frac{1}{2} = 0.5$
Now
(a) $P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{P(X > 50 \cap X < 50)}{P(B)} = 0$
Since $P(A \cap B) = 0$ while $P(A) \ne 0$ and $P(B) \ne 0$, A and B are mutually exclusive; and because $P(A \cap B) = 0 \ne P(A)P(B)$, they are not independent.
(b) $P(A \mid C) = \frac{P(A \cap C)}{P(C)} = \frac{P(X > 50 \cap X = 50)}{P(C)} = 0$
Similarly, A and C are mutually exclusive and not independent.
(c) $P(A \mid D) = \frac{P(A \cap D)}{P(D)}$, where $P(A \cap D) = P(50 < X < 75) = \sum_{x=51}^{74}\frac{100 - x}{2500} = \frac{900}{2500} = 0.36$
$P(A \cap D) = \frac{36}{100}$, $P(A) = \frac{49}{100}$, $P(D) = \frac{74}{100}$, so $P(A \mid D) = \frac{0.36}{0.74} \approx 0.486$
Since $P(A \cap D) \ne P(A)P(D)$, the events A and D are not independent.
(d) $P(C \mid D) = \frac{P(C \cap D)}{P(D)} = \frac{P[X = 50 \cap 25 < X < 75]}{P(D)} = \frac{1/50}{74/100} = \frac{2}{74} \approx 0.027$
Since $C \subset D$, $P(C \cap D) = P(C) \ne P(C)P(D)$
$\therefore$ The events C and D are not independent.
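All of the probabilities above follow from direct enumeration of the pmf (a quick sketch):

```python
A = 1 / 2500.0
P = {x: A * x for x in range(1, 51)}
P.update({x: A * (100 - x) for x in range(51, 101)})

total = sum(P.values())                                       # must be 1
pA = sum(p for x, p in P.items() if x > 50)                   # 0.49
pB = sum(p for x, p in P.items() if x < 50)                   # 0.49
pC = P[50]                                                    # 0.02
pD = sum(p for x, p in P.items() if 25 < x < 75)              # 0.74
pE = sum(p for x, p in P.items() if x % 2 == 1)               # 0.5
pAD = sum(p for x, p in P.items() if x > 50 and 25 < x < 75)  # 0.36
print(total, pA, pB, pC, pD, pE, pAD)
```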
Example 2.27
If the probability density of a random variable is given by
$f_X(x) = c\,\exp(-x/4)$, $0 \le x < \infty$
Given $f_X(x) = c\,e^{-x/4}$ for $0 \le x < \infty$. Normalization requires $\int_0^{\infty} c\,e^{-x/4}\,dx = 4c = 1$, so $c = \frac{1}{4}$.
(c) Probability of P{6
If the function is a valid probability density function, then
$\int_{-\infty}^{\infty} f_X(x)\,dx = 1$
Now $\int_{-1}^{1} A(1 - x^2)\cos\frac{\pi x}{2}\,dx = 1$, i.e.
$A\left[\int_{-1}^{1}\cos\frac{\pi x}{2}\,dx - \int_{-1}^{1} x^2\cos\frac{\pi x}{2}\,dx\right] = 1$
The first integral is $\left[\frac{2}{\pi}\cdot 2\sin\frac{\pi x}{2}\right]$ evaluated over $[-1, 1]$, giving $\frac{4}{\pi}$.
Using integration by parts twice on the second integral,
$\int_{-1}^{1} x^2\cos\frac{\pi x}{2}\,dx = \left[\frac{2}{\pi}x^2\sin\frac{\pi x}{2} + \frac{8}{\pi^2}x\cos\frac{\pi x}{2} - \frac{16}{\pi^3}\sin\frac{\pi x}{2}\right]_{-1}^{1} = \frac{4}{\pi} - \frac{32}{\pi^3}$
Therefore
$A\left[\frac{4}{\pi} - \frac{4}{\pi} + \frac{32}{\pi^3}\right] = \frac{32A}{\pi^3} = 1$
$A = \frac{\pi^3}{32}$
Example 2.6
Find a constant $b > 0$ so that the function
$f_X(x) = \frac{1}{10}e^{3x}$ for $0 \le x \le b$, and 0 otherwise,
is a valid pdf.
Solution
Given the function $f_X(x) = \frac{1}{10}e^{3x}$ for $0 \le x \le b$, 0 otherwise.
If it is a valid density function, then $f_X(x) \ge 0$ holds, and
$\int_0^b \frac{1}{10}e^{3x}\,dx = 1$, or $\frac{1}{30}\left[e^{3x}\right]_0^b = 1$
$\frac{1}{30}\left[e^{3b} - 1\right] = 1$, so $e^{3b} = 31$
$3b = \ln(31)$
$b = \frac{1}{3}\ln(31) = 1.1446$
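Checking the normalization constant (a sketch):

```python
from math import exp, log

b = log(31) / 3                      # from e^{3b} = 31
integral = (exp(3 * b) - 1) / 30     # closed form of the integral of e^{3x}/10 on [0, b]
print(round(b, 4), round(integral, 10))
```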
Example 2.7
A Gaussian random variable X with $\mu_X = 4$ and $\sigma_X = 3$ is generated. Find the probability of $X \le 7.75$. Write down the density function and draw its graph.
Solution
Given a Gaussian random variable with $\mu_X = 4$, $\sigma_X = 3$. For the event $\{X \le 7.75\}$, $P\{X \le 7.75\} = F_X(7.75)$
We know that $F_X(x) = F\left(\frac{x - \mu_X}{\sigma_X}\right)$, so $F_X(7.75) = F\left(\frac{7.75 - 4}{3}\right) = F\left(\frac{3.75}{3}\right) = F(1.25)$
Using the Q-function approximation, $F(1.25) = 1 - Q(1.25)$
$Q(1.25) = \frac{e^{-1.25^2/2}}{\left[(0.66)(1.25) + 0.34\sqrt{1.25^2 + 5.51}\right]\sqrt{2\pi}} \approx 0.1056$
$F(1.25) = 0.8944$, so $P\{X \le 7.75\} = 0.8944$
The Gaussian density function is
$f_X(x) = \frac{1}{\sqrt{2\pi\sigma_X^2}}e^{-(x - \mu_X)^2/2\sigma_X^2} = \frac{1}{\sqrt{2\pi(9)}}e^{-(x-4)^2/18} = 0.133\,e^{-(x-4)^2/18}$
Figure 2.16 shows this density: a bell curve centred at $x = 4$ with peak value 0.133.
Fig. 2.16 Gaussian density function for $\mu_X = 4$, $\sigma_X = 3$
Example 2.8
Assume that the height of clouds above the ground at some location is a Gaussian random variable X with mean value 2 km and $\sigma_X = 0.25$ km. Find the probability of clouds higher than 2.5 km.
Solution
Given a Gaussian random variable X
Mean value $\mu_X = 2$ km
Spread $\sigma_X = 0.25$ km. The favourable event is $\{X > 2.5\ \text{km}\}$.
$P\{X > 2.5\ \text{km}\} = 1 - P\{X \le 2.5\ \text{km}\} = 1 - F_X(2.5)$
$= 1 - F\left(\frac{2.5 - 2}{0.25}\right) = 1 - F(2)$
$= 1 - (1 - Q(2)) = Q(2)$
Using the Q-function approximation,
$P\{X > 2.5\ \text{km}\} = Q(2) = \frac{e^{-2^2/2}}{\left[0.66 \times 2 + 0.34\sqrt{2^2 + 5.51}\right]\sqrt{2\pi}} = 0.0228$
The probability that clouds are higher than 2500 m is therefore about 2.28%.
Example 2.9
A production line manufactures 1 kΩ resistors that must satisfy a 10% tolerance.
(a) If a resistor is described by the Gaussian random variable X with $\mu_X = 1000\ \Omega$, $\sigma_X = 40\ \Omega$, what fraction of the resistors is expected to be rejected?
(b) If a machine is not properly adjusted, the product resistances change to a new value where $\mu_X = 1050\ \Omega$. What fraction is now rejected?
Solution
Given that the resistor value is described by the Gaussian distribution. The accepted range is 1 kΩ ± 10%, that is, 900 Ω to 1100 Ω. A resistor is rejected if $X < 900\ \Omega$ or $X > 1100\ \Omega$.
(a) The probability of rejection is the fraction of the resistors rejected:
$P\{\text{resistor rejected}\} = P\{X < 900\} + P\{X > 1100\}$
With mean value $\mu_X = 1000\ \Omega$ and $\sigma_X = 40\ \Omega$,
$P\{X > 1100\} = 1 - F_X(1100) = 1 - F\left(\frac{1100 - 1000}{40}\right)$
$= 1 - F(2.5) = Q(2.5)$
By symmetry, $P\{X < 900\} = Q(2.5)$ as well, so
$P\{\text{resistors rejected}\} = Q(2.5) + Q(2.5) = 2Q(2.5)$
Using the Q-function approximation, $Q(2.5) \approx 0.0062$, giving
$P\{\text{resistors rejected}\} \approx 0.0124$
i.e. about 1.24% of the resistors are rejected.
(b) If the machine drifts so that $\mu_X = 1050\ \Omega$ with $\sigma_X = 40\ \Omega$ unchanged, then
$P\{X > 1100\} = Q\left(\frac{1100 - 1050}{40}\right) = Q(1.25) \approx 0.1056$ and $P\{X < 900\} = Q\left(\frac{1050 - 900}{40}\right) = Q(3.75) \approx 10^{-4}$
so about 10.6% of the resistors are now rejected.
Example 2.10
The power reflected from a target and received by a radar system can be modelled as an exponential random variable X. The average power is 10 W. What is the probability that the received power is greater than the average power?
Solution
Given that the reflected power has an exponential distribution. The pdf is
$f_X(x) = \frac{1}{10}e^{-x/10}$, $x > 0$
$F_X(x) = \int_0^x \frac{1}{10}e^{-t/10}\,dt = \left[-e^{-t/10}\right]_0^x = 1 - e^{-x/10}$, $x > 0$
The required probability is
$P\{X > 10\} = 1 - P\{X \le 10\} = 1 - F_X(10)$
$P\{X > 10\} = e^{-1} \approx 0.368$
Example 2.11
The amplitude of the output signal of a radar system receiving only noise is a Rayleigh random variable with $a = 0$, $b = 4$. The system shows a false target detection if the signal exceeds V volts. What is the value of V if the probability of false detection is 0.001?
Solution
The noise amplitude follows Rayleigh's distribution, whose distribution function is
$F_X(x) = 1 - e^{-(x - a)^2/b}$ for $x \ge a$, and 0 for $x < a$
With $a = 0$ and $b = 4$, the false-detection condition $P\{X > V\} = e^{-V^2/4} = 0.001$ gives
$V^2 = 4\ln(1000)$, so $V = \sqrt{4\ln(1000)} \approx 5.26$ V
Example 2.12
The lifetime of a system expressed in weeks is a Rayleigh random variable X for which
$f_X(x) = \frac{x}{200}e^{-x^2/400}$ for $x \ge 0$, and 0 for $x < 0$.
What is the probability that the lifetime exceeds one year (52 weeks)?
Solution
$P\{X > 52\} = 1 - P\{X \le 52\} = 1 - F_X(52)$
But $F_X(52) = \int_0^{52} \frac{x}{200}e^{-x^2/400}\,dx = \left[-e^{-x^2/400}\right]_0^{52} = 1 - e^{-52^2/400}$
$P\{X > 52\} = 1 - (1 - e^{-52^2/400}) = e^{-52^2/400} = e^{-6.76} \approx 0.00116$
Example 2.13
A certain large city experiences, on an average, three murders per week. Their occurrence follows a Poisson distribution.
(a) What is the probability that there are 5 or more murders in a given week? (b) On an average, how many weeks in a year can this city expect to have no murders? ( c) How many weeks per year ( average) can the city expect the number of murders per
week to equal or exceed the average number per week?
Solution
In this city the occurrences of murders per week follow a Poisson distribution with average number of murders $b = \lambda T = 3$.
(a) The probability that there are five or more murders in a given week is
$P\{k \ge 5\} = 1 - P\{k \le 4\} = 1 - e^{-b}\sum_{k=0}^{4}\frac{b^k}{k!}$
$= 1 - e^{-3}\left[1 + 3 + \frac{3^2}{2!} + \frac{3^3}{3!} + \frac{3^4}{4!}\right] = 1 - e^{-3}\left[1 + 3 + \frac{9}{2} + \frac{9}{2} + \frac{27}{8}\right] = 1 - 16.375\,e^{-3} \approx 0.1847$
(b) The probability of zero (no) murders is
$P\{k = 0\} = \frac{e^{-b}b^0}{0!} = e^{-3} = 0.0498$
The average number of weeks a year the city has no murders is
$52 \times 0.0498 \approx 2.59$ weeks
(c) The probability that the number of murders per week is equal to or greater than the average number per week is
$P\{k \ge 3\} = 1 - P\{k \le 2\} = 1 - [P(0) + P(1) + P(2)] = 1 - e^{-3}\left[1 + 3 + \frac{9}{2}\right] = 1 - \frac{17}{2}e^{-3} = 0.5768$
$\therefore$ The average number of weeks in a year when the number of murders equals or exceeds the average value is $52 \times 0.5768 \approx 30$ weeks.
Solution
We know that for a given random variable, the binomial density function is given by
$f_X(x) = \sum_{K=0}^{N}\binom{N}{K}p^K(1 - p)^{N-K}\,\delta(x - K)$
For the random variable $\{X \mid x = K\}$, the probability mass function is
$P(x) = \binom{N}{x}p^x q^{N-x}$ for $x = 0, 1, 2, \ldots, N$
where $q = 1 - p$.
The mean value of X is
$E[X] = \sum_{x=0}^{N} x\,P(x) = \sum_{x=0}^{N} x\binom{N}{x}p^x q^{N-x}$
We know that
$\binom{N}{x} = \frac{N}{x}\binom{N-1}{x-1}$
Since the $x = 0$ term vanishes,
$E[X] = Np\sum_{x=1}^{N}\binom{N-1}{x-1}p^{x-1}q^{N-1-(x-1)}$
Since $\sum_{x=1}^{N}\binom{N-1}{x-1}p^{x-1}q^{N-1-(x-1)} = (p + q)^{N-1} = 1$,
$E[X] = Np(1) = Np$
$E[X] = Np$
The mean square value of X is
$E[X^2] = \sum_{x=0}^{N} x^2 P(x) = \sum_{x=0}^{N} x^2\binom{N}{x}p^x q^{N-x}$
Example 3.22
The pdf of a random variable X is given by
$f_X(x) = \frac{x}{20}$ for $2 \le x \le 5$; 0 otherwise. Find the pdf of $Y = 3X - 5$.
Solution
Given $f_X(x) = \frac{x}{20}$ for $2 \le x \le 5$, 0 otherwise,
and the transformation $Y = 3X - 5$, or $X = \frac{Y + 5}{3}$, so $\frac{dx}{dy} = \frac{1}{3}$
We know that $f_Y(y) = f_X(x)\left|\frac{dx}{dy}\right|$. Now,
$f_Y(y) = \frac{1}{3}f_X\left(\frac{y + 5}{3}\right) = \frac{1}{3}\cdot\frac{(y + 5)/3}{20} = \frac{y + 5}{180}$
The limits follow from $x = 2 \Rightarrow y = 1$ and $x = 5 \Rightarrow y = 10$:
$f_Y(y) = \frac{y + 5}{180}$ for $1 \le y \le 10$; 0 otherwise.
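The Jacobian step can be checked pointwise (assuming the book's $f_X(x) = x/20$ on $[2, 5]$ as given):

```python
def fX(x):
    return x / 20.0 if 2 <= x <= 5 else 0.0

def fY(y):                       # derived pdf: (y + 5)/180 on [1, 10]
    return (y + 5) / 180.0 if 1 <= y <= 10 else 0.0

# f_Y(y) must equal f_X((y + 5)/3) * |dx/dy| with |dx/dy| = 1/3
diffs = [abs(fY(y) - fX((y + 5) / 3.0) / 3.0) for y in (1.0, 2.5, 5.0, 7.5, 10.0)]
print(max(diffs))
```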
Example 3.23
A Gaussian random variable with variance 10 and mean 5 is transformed to $Y = e^X$. Find the pdf of Y.
Solution
Given $\sigma_X^2 = 10$, $m_X = 5$.
We know that the pdf of a Gaussian random variable is
$f_X(x) = \frac{1}{\sqrt{2\pi(10)}}\exp\left(\frac{-(x - 5)^2}{20}\right)$
The transformation $Y = e^X$ gives $X = \ln(Y)$, i.e. $x = \ln(y)$ for $y > 0$, with $\frac{dx}{dy} = \frac{1}{y}$.
We know that
$f_Y(y) = f_X(x)\left|\frac{dx}{dy}\right| = \frac{1}{y}f_X(\ln y)$
$f_Y(y) = \frac{1}{y\sqrt{20\pi}}\exp\left(\frac{-(\ln(y) - 5)^2}{20}\right)$, $y > 0$
More Solved Examples
Example 3.24
Let X and Y be random variables such that $Y \le X$; show that $E[Y] \le E[X]$, provided that the expectations exist.
Solution
Given two random variables X and Y with $Y \le X$, i.e. $Y - X \le 0$. Taking expectations on both sides,
$E[Y - X] \le 0$
From the addition theorem, $E[Y] - E[X] \le 0$
or $E[Y] \le E[X]$
Example 3.25
What is the mathematical expectation of winning Rs. 10 on a balanced coin coming up heads, or losing Rs. 10 on its coming up tails?
Solution
Let the winning of Rs. 10 be $x_1 = +10$ and the losing of Rs. 10 be $x_2 = -10$. Since the coin is balanced, $P(x_1) = \frac{1}{2}$ and $P(x_2) = \frac{1}{2}$. We know that the expectation is
$E[X] = \sum_i x_i P(x_i) = x_1 P(x_1) + x_2 P(x_2) = \frac{1}{2}(+10) + \frac{1}{2}(-10) = 0$
$\therefore$ the mathematical expectation is zero.
Example 3.26
A fair coin is tossed until a tail appears. Find the mathematical expectation of the number of tosses.
Solution
Let X be the random variable counting the number of tosses. Since the coin is fair, the probability of getting a head is $\frac{1}{2}$ and the probability of getting a tail is $\frac{1}{2}$.
The possible outcomes of tossing until a tail appears are T, HT, HHT, HHHT, ... and so on.
The probabilities of the outcomes are
$P(T) = \frac{1}{2}$
$P(HT) = \frac{1}{2}\cdot\frac{1}{2} = \frac{1}{4}$
$P(HHT) = \frac{1}{2}\cdot\frac{1}{2}\cdot\frac{1}{2} = \frac{1}{8}$
$P(HHHT) = \frac{1}{16}$, and in general
$P(\underbrace{H\cdots H}_{n-1}T) = \frac{1}{2^n}$
The expectation of X is
$E[X] = \sum_n n\,P(x_n) = 1\times\frac{1}{2} + 2\times\frac{1}{4} + 3\times\frac{1}{8} + 4\times\frac{1}{16} + \cdots + n\times\frac{1}{2^n} + \cdots$
Let $S = \sum_{n=1}^{\infty}\frac{n}{2^n}$; then $\frac{S}{2} = \sum_{n=1}^{\infty}\frac{n}{2^{n+1}}$.
Subtracting,
$S - \frac{S}{2} = \sum_{n=1}^{\infty}\left(\frac{n}{2^n} - \frac{n}{2^{n+1}}\right)$
Expanding the terms,
$\frac{S}{2} = \frac{1}{2} + \left(\frac{2}{4} - \frac{1}{4}\right) + \left(\frac{3}{8} - \frac{2}{8}\right) + \left(\frac{4}{16} - \frac{3}{16}\right) + \cdots$
$\frac{S}{2} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \frac{1}{32} + \cdots = \frac{1/2}{1 - 1/2} = 1$
$S = 2$
$\therefore$ The required expectation is 2.
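The series converges quickly, as a partial sum shows:

```python
# partial sum of E[X] = sum n / 2^n; the tail beyond n = 59 is negligible
s = sum(n / 2.0**n for n in range(1, 60))
print(round(s, 10))
```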
Example 3.27
A box contains 4 red and 2 green balls. Two balls are drawn together. Find the expected value of the number of red balls drawn. Solution
Let X be the random variable for the number of red balls drawn. When two balls are drawn together, the possible events are
No red ball and two green balls: $x_1 = 0$
One red ball and one green ball: $x_2 = 1$
Two red balls and no green ball: $x_3 = 2$
The probabilities of the events are
$P(x_1) = \frac{\binom{2}{2}}{\binom{6}{2}} = \frac{1}{15}$
$P(x_2) = \frac{\binom{4}{1}\binom{2}{1}}{\binom{6}{2}} = \frac{8}{15}$
$P(x_3) = \frac{\binom{4}{2}}{\binom{6}{2}} = \frac{6}{15}$
The expected value is
$E[X] = \sum_i x_i P(x_i) = 0\times\frac{1}{15} + 1\times\frac{8}{15} + 2\times\frac{6}{15} = \frac{8 + 12}{15} = \frac{20}{15} = \frac{4}{3} \approx 1.33$
$E[X] = 1.33$
Example 3.28
When two unbiased dice are thrown, find the expected value of the product of the numbers shown on the dice.
When two unbiased dice are thrown, find the expected value of the product of the numbers shown on the dice.
Solution
When two dice are thrown, the sample space is
$S = \{(1,1), (1,2), \ldots, (1,6), (2,1), \ldots, (2,6), \ldots, (6,1), \ldots, (6,6)\}$ — 36 equally likely outcomes.
Let X be the random variable for the product of the numbers shown on the dice. Counting how often each product occurs among the 36 outcomes: products 1, 9, 16, 25, 36 appear once; products 2, 3, 5, 8, 10, 15, 18, 20, 24, 30 appear twice; product 4 appears three times; and products 6 and 12 appear four times.
Then
$E[X] = \sum_i x_i P(x_i) = \frac{1}{36}\sum_{i=1}^{6}\sum_{j=1}^{6} ij = \frac{21\times 21}{36} = \frac{441}{36} = 12.25$
Equivalently, since the two dice are independent, $E[X] = E[D_1]E[D_2] = (3.5)^2 = 12.25$.
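Brute-force enumeration of the 36 outcomes confirms the value:

```python
# all products of two fair dice, each outcome with probability 1/36
products = [i * j for i in range(1, 7) for j in range(1, 7)]
E = sum(products) / 36.0
print(E)   # 12.25, i.e. (3.5)^2 by independence
```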
Example 3.29
When two unbiased dice are thrown, find the expected value of the sum of the numbers shown on the dice.
Solution
When two dice are thrown, the sample space is
$S = \{(1,1), (1,2), \ldots, (6,6)\}$ — 36 equally likely outcomes.
Let X be the random variable for the sum of the numbers shown on the dice; it takes the values 2 through 12. The probabilities are shown in Table 3.6.
Table 3.6 Values of X and P(x)
X = x: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
P(x): 1/36, 2/36, 3/36, 4/36, 5/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36
$E[X] = 2\times\frac{1}{36} + 3\times\frac{2}{36} + 4\times\frac{3}{36} + 5\times\frac{4}{36} + 6\times\frac{5}{36} + 7\times\frac{6}{36} + 8\times\frac{5}{36} + 9\times\frac{4}{36} + 10\times\frac{3}{36} + 11\times\frac{2}{36} + 12\times\frac{1}{36}$
$E[X] = 7$
Example 3.30
Let X be a random variable with probabilities as shown in Table 3.7.
Table 3.7 Values of X and P(x)
$x_i$: −1, 1, 2
$P(x_i)$: 1/6, 1/3, 1/2
Find (a) $E[X]$, (b) $E[X^2]$, (c) $E[(2X + 1)^2]$ and (d) $\sigma_X^2$.
Solution
Given a discrete random variable with the probabilities shown in Table 3.7.
(a) $E[X] = \sum_i x_i P(x_i) = (-1)\times\frac{1}{6} + 1\times\frac{1}{3} + 2\times\frac{1}{2} = \frac{-1 + 2 + 6}{6} = \frac{7}{6} \approx 1.167$
(b) $E[X^2] = \sum_i x_i^2 P(x_i) = (-1)^2\times\frac{1}{6} + (1)^2\times\frac{1}{3} + 2^2\times\frac{1}{2} = \frac{1}{6} + \frac{1}{3} + 2 = 2.5$
(c) $E[(2X + 1)^2] = \sum_i (2x_i + 1)^2 P(x_i) = (2(-1) + 1)^2\times\frac{1}{6} + (2\times1 + 1)^2\times\frac{1}{3} + (2\times2 + 1)^2\times\frac{1}{2}$
$= \frac{1}{6} + \frac{9}{3} + \frac{25}{2} = \frac{47}{3}$
or $E[(2X + 1)^2] = E[4X^2 + 4X + 1] = 4E[X^2] + 4E[X] + 1 = 4\times2.5 + 4\times\frac{7}{6} + 1 = 10 + \frac{14}{3} + 1 = \frac{47}{3}$
(d) $\sigma_X^2 = E[X^2] - E[X]^2 = 2.5 - \left(\frac{7}{6}\right)^2 = \frac{90 - 49}{36} = \frac{41}{36} \approx 1.139$
Example 3.31
Consider a random variable X with $E[X] = 5$ and $\sigma_X = 3$. Another random variable is given as $Y = -8X + 10$. Find (a) $E[X^2]$, (b) $E[XY]$, (c) $E[Y^2]$ and (d) $\sigma_Y^2$. Solution
Given $E[X] = 5$, $\sigma_X = 3$ (so $\sigma_X^2 = 9$) and $Y = -8X + 10$.
(a) $E[X^2] = \sigma_X^2 + E[X]^2 = 9 + 5^2 = 9 + 25 = 34$
(b) $E[XY] = E[X(-8X + 10)] = E[-8X^2 + 10X] = -8E[X^2] + 10E[X]$
$= -8(34) + 10(5) = -272 + 50 = -222$
(c) $E[Y^2] = E[(-8X + 10)^2] = E[64X^2 - 160X + 100] = 64E[X^2] - 160E[X] + 100 = 64\times34 - 160\times5 + 100 = 1476$
(d) $\sigma_Y^2 = E[Y^2] - E[Y]^2$. Now $E[Y] = E[-8X + 10] = -8E[X] + 10 = -8\times5 + 10 = -30$
and $\sigma_Y^2 = 1476 - (-30)^2 = 1476 - 900 = 576$, which agrees with $\sigma_Y^2 = (-8)^2\sigma_X^2 = 64\times9 = 576$.
Example 3.32
Consider the pdf of a random variable X:
$f_X(x) = \frac{1}{K}$ for $-2 \le x \le 3$; 0 otherwise,
and another random variable $Y = 2X$. Then find (a) the value K, (b) $E[X]$, (c) $E[Y]$ and (d) $E[XY]$. Solution
(a) Given $f_X(x) = \frac{1}{K}$ for $-2 \le x \le 3$, 0 otherwise.
Since $f_X(x)$ is a valid density function, $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$, as shown in Fig. 3.7.
Fig. 3.7 Uniform density function (height 1/K on [−2, 3])
$\int_{-2}^{3}\frac{1}{K}\,dx = 1$, i.e. $\frac{x}{K}\Big|_{-2}^{3} = 1$
or $\frac{3 - (-2)}{K} = 1$
or $K = 5$
(b) $E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx = \int_{-2}^{3}\frac{x}{5}\,dx = \frac{x^2}{10}\Big|_{-2}^{3} = \frac{9 - 4}{10} = 0.5$
(c) $E[Y] = E[2X] = 2\times\frac{1}{2} = 1$
(d) $E[XY] = E[X\cdot 2X] = 2E[X^2] = 2\int_{-2}^{3}\frac{x^2}{5}\,dx = \frac{2}{15}[27 + 8] = \frac{70}{15} \approx 4.67$
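A midpoint-rule check of these uniform-density results (tolerances are mine):

```python
K = 5.0
n = 100000
h = 5.0 / n                                 # width of [-2, 3] divided by n
xs = [-2 + (i + 0.5) * h for i in range(n)]

area = sum(1.0 / K for _ in xs) * h         # total probability
EX = sum(x / K for x in xs) * h             # E[X]
EXY = 2 * sum(x * x / K for x in xs) * h    # E[XY] = 2 E[X^2]
print(round(area, 6), round(EX, 6), round(EXY, 4))
```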
Example 3.33
The pdf of a random variable X is $f_X(x) = 0.3507\sqrt{x}$ over $0 < x < 3$. Find (i) the mean, (ii) the mean square value, and (iii) the variance.
Solution
(i) Mean value
$E[X] = \int_0^3 x(0.3507\sqrt{x})\,dx = 0.3507\times\frac{2}{5}x^{5/2}\Big|_0^3 = 0.14\times3^{5/2} = 2.18674$
(ii) Mean square value
$E[X^2] = 0.3507\int_0^3 x^{5/2}\,dx = 0.3507\times\frac{2}{7}x^{7/2}\Big|_0^3 = 4.68589$
(iii) Variance
$\sigma_X^2 = E[X^2] - E[X]^2 = 4.68589 - (2.18674)^2$
Example 3.34
A random variable X has a pdf
$f_X(x) = \frac{1}{2}\cos x$ for $-\frac{\pi}{2} < x < \frac{\pi}{2}$; 0 otherwise. Find the mean value of the function $g(X) = 4X^2$.
This can be written as
$E[X^2] = \sum_{x=0}^{N}[x(x - 1) + x]\binom{N}{x}p^x q^{N-x}$
since $x^2 = x(x - 1) + x$.
So, $E[X^2] = \sum_{x=0}^{N}x(x - 1)\binom{N}{x}p^x q^{N-x} + \sum_{x=0}^{N}x\binom{N}{x}p^x q^{N-x}$
Now $\binom{N}{x}$ can be written as
$\binom{N}{x} = \frac{N}{x}\binom{N-1}{x-1} = \frac{N(N - 1)}{x(x - 1)}\binom{N-2}{x-2}$
Since the $x = 0$ and $x = 1$ terms vanish,
$E[X^2] = N(N - 1)\sum_{x=2}^{N}\binom{N-2}{x-2}p^{x-2}p^2 q^{N-2-(x-2)} + Np$
$= N(N - 1)p^2\sum_{x=2}^{N}\binom{N-2}{x-2}p^{x-2}q^{N-2-(x-2)} + Np$
Since $\sum_{x=2}^{N}\binom{N-2}{x-2}p^{x-2}q^{N-2-(x-2)} = (p + q)^{N-2} = 1$,
$E[X^2] = N(N - 1)p^2 + Np$
$\therefore$ The variance of X is
$\sigma_X^2 = E[X^2] - E[X]^2 = N(N - 1)p^2 + Np - (Np)^2$
$\sigma_X^2 = N^2p^2 - Np^2 + Np - N^2p^2 = Np - Np^2 = Np(1 - p)$
$\sigma_X^2 = Npq$
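The closed forms $E[X] = Np$ and $\sigma_X^2 = Npq$ can be confirmed by direct summation of the binomial pmf (parameter values are my choice):

```python
from math import comb

N, p = 10, 0.3
q = 1 - p
pmf = [comb(N, x) * p**x * q**(N - x) for x in range(N + 1)]

mean = sum(x * pmf[x] for x in range(N + 1))
msq = sum(x * x * pmf[x] for x in range(N + 1))
var = msq - mean**2
print(round(mean, 6), round(var, 6))
```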
The mean value of the function $g(X) = 4X^2$ is
$E[g(X)] = \int_{-\infty}^{\infty}g(x)f_X(x)\,dx = \int_{-\pi/2}^{\pi/2}(4x^2)\frac{1}{2}\cos x\,dx = 2\int_{-\pi/2}^{\pi/2}x^2\cos x\,dx$
To evaluate the integration, integrate by parts twice:
$\int x^2\cos x\,dx = x^2\sin x - \int 2x\sin x\,dx = x^2\sin x - 2\left[x(-\cos x) - \int(-\cos x)\,dx\right]$
or $\int x^2\cos x\,dx = x^2\sin x + 2x\cos x - 2\sin x$
$E[g(X)] = 2\left[x^2\sin x + 2x\cos x - 2\sin x\right]_{-\pi/2}^{\pi/2} = 2\left[\left(\frac{\pi^2}{4} - 2\right) - \left(-\frac{\pi^2}{4} + 2\right)\right] = 2\left[\frac{\pi^2}{2} - 4\right]$
$E[g(X)] = \pi^2 - 8$
Example 3.35
Given the Rayleigh random variable with density function
$f(x) = \frac{2}{b}(x - a)e^{-(x - a)^2/b}\,u(x - a)$
show that the mean and variance are
$E[X] = a + \sqrt{\frac{\pi b}{4}}$ and $\sigma_X^2 = \left(1 - \frac{\pi}{4}\right)b$ (May 2011)
Solution
Given Rayleigh's density function
$f(x) = \frac{2}{b}(x - a)e^{-(x - a)^2/b}\,u(x - a)$
The mean value is
$E[X] = \int_{-\infty}^{\infty}x f(x)\,dx = \int_a^{\infty}\frac{2x}{b}(x - a)e^{-(x - a)^2/b}\,dx$
To evaluate the integration, write
$x(x - a) = x^2 - xa = (x - a)^2 + a(x - a)$
$E[X] = \frac{2}{b}\int_a^{\infty}(x - a)^2 e^{-(x - a)^2/b}\,dx + a\int_a^{\infty}\frac{2(x - a)}{b}e^{-(x - a)^2/b}\,dx$
The second integral is the total area under $f(x)$:
$\int_a^{\infty}\frac{2(x - a)}{b}e^{-(x - a)^2/b}\,dx = 1$
For the first, let $t = \frac{x - a}{\sqrt{b}}$, $dx = \sqrt{b}\,dt$ (the limit $x = a$ gives $t = 0$), so that
$\frac{2}{b}\int_0^{\infty}bt^2 e^{-t^2}\sqrt{b}\,dt = 2\sqrt{b}\int_0^{\infty}t^2 e^{-t^2}\,dt = 2\sqrt{b}\cdot\frac{\sqrt{\pi}}{4} = \frac{\sqrt{\pi b}}{2}$ (from Appendix A)
$E[X] = a + \frac{\sqrt{\pi b}}{2} = a + \sqrt{\frac{\pi b}{4}}$
(ii) Variance of X is
First compute
$E[X^2] = \int_a^{\infty}\frac{2x^2}{b}(x - a)e^{-(x - a)^2/b}\,dx$
To evaluate the integration, express $x^2(x - a)$ in terms of $(x - a)$:
$x^2(x - a) = (x - a + a)^2(x - a) = [(x - a)^2 + 2a(x - a) + a^2](x - a)$
$x^2(x - a) = (x - a)^3 + 2a(x - a)^2 + a^2(x - a)$
With $u = x - a$,
$E[X^2] = \frac{2}{b}\int_0^{\infty}u^3 e^{-u^2/b}\,du + \frac{4a}{b}\int_0^{\infty}u^2 e^{-u^2/b}\,du + \frac{2a^2}{b}\int_0^{\infty}u\,e^{-u^2/b}\,du$
The integrals are $\int_0^{\infty}u^3 e^{-u^2/b}\,du = \frac{b^2}{2}$, $\int_0^{\infty}u^2 e^{-u^2/b}\,du = \frac{\sqrt{\pi}}{4}b^{3/2}$ and $\int_0^{\infty}u\,e^{-u^2/b}\,du = \frac{b}{2}$, so
$E[X^2] = b + a\sqrt{\pi b} + a^2$
-
202 Probability Theory and Stochastic Processes
The variance of X is
$\sigma_X^2 = E[X^2] - \bar{X}^2 = a^2 + a\sqrt{\pi b} + b - \left(a + \sqrt{\frac{\pi b}{4}}\right)^2$
$= a^2 + a\sqrt{\pi b} + b - a^2 - a\sqrt{\pi b} - \frac{\pi b}{4}$
$\sigma_X^2 = b - \frac{\pi b}{4} = b\left(1 - \frac{\pi}{4}\right)$
Proved.
Example 3.36
Find the characteristic function for a random variable with density function $f_X(x) = x$ for $0 \le x \le 1$.
Solution
The characteristic function of X is given by
$\Phi_X(\omega) = E[e^{j\omega X}] = \int_{-\infty}^{\infty}e^{j\omega x}f_X(x)\,dx = \int_0^1 x\,e^{j\omega x}\,dx$
Integrating by parts,
$\Phi_X(\omega) = \left[\frac{x\,e^{j\omega x}}{j\omega}\right]_0^1 - \frac{1}{j\omega}\int_0^1 e^{j\omega x}\,dx = \frac{e^{j\omega}}{j\omega} + \frac{e^{j\omega} - 1}{\omega^2}$
Example 3.37
The density function of a random variable is given as $f_X(x) = a\,e^{-bx}$, $x \ge 0$.
Find the characteristic function and the first two moments.
Solution
Given $f_X(x) = a\,e^{-bx}$, $x \ge 0$. The characteristic function is
$\Phi_X(\omega) = E[e^{j\omega x}] = \int_{-\infty}^{\infty}e^{j\omega x}f_X(x)\,dx = \int_0^{\infty}a\,e^{-bx}e^{j\omega x}\,dx = a\int_0^{\infty}e^{-(b - j\omega)x}\,dx$
$= a\left(\frac{-1}{b - j\omega}\right)\left[e^{-(b - j\omega)x}\right]_0^{\infty} = \left(\frac{-a}{b - j\omega}\right)(0 - 1)$
$\Phi_X(\omega) = \frac{a}{b - j\omega}$
The moments are
$m_1 = -j\,\frac{d}{d\omega}\Phi_X(\omega)\Big|_{\omega=0} = -j\cdot\frac{ja}{(b - j\omega)^2}\Big|_{\omega=0} = \frac{a}{b^2}$
$m_2 = -\frac{d^2}{d\omega^2}\Phi_X(\omega)\Big|_{\omega=0} = \frac{2a}{(b - j\omega)^3}\Big|_{\omega=0} = \frac{2a}{b^3}$
Example 3.38
Find the characteristic function for $f_X(x) = e^{-|x|}$.
Solution
Given $f_X(x) = e^{-|x|}$. The characteristic function is
$\Phi_X(\omega) = \int_{-\infty}^{0}e^{(1 + j\omega)x}\,dx + \int_0^{\infty}e^{-(1 - j\omega)x}\,dx$
$= \left[\frac{e^{(1 + j\omega)x}}{1 + j\omega}\right]_{-\infty}^{0} + \left[\frac{-e^{-(1 - j\omega)x}}{1 - j\omega}\right]_0^{\infty}$
$= \frac{1}{1 + j\omega}[1 - 0] - \frac{1}{1 - j\omega}[0 - 1]$
$\Phi_X(\omega) = \frac{1}{1 + j\omega} + \frac{1}{1 - j\omega} = \frac{2}{1 + \omega^2}$
Example 3.39
Show that the characteristic function of a Gaussian random variable with zero mean and variance $\sigma^2$ is $\Phi_X(\omega) = e^{-\sigma^2\omega^2/2}$.
Solution
We know that the probability density function of a Gaussian random variable is
$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}e^{-x^2/2\sigma^2}$
The characteristic function is
$\Phi_X(\omega) = E[e^{j\omega x}] = \int_{-\infty}^{\infty}f_X(x)e^{j\omega x}\,dx$
Now take the exponent
$-\frac{x^2}{2\sigma^2} + j\omega x$
To convert it into a square form, complete the square:
$-\frac{x^2}{2\sigma^2} + j\omega x = -\left(\frac{x}{\sqrt{2}\sigma} - \frac{j\omega\sigma}{\sqrt{2}}\right)^2 - \frac{\omega^2\sigma^2}{2}$
Then the characteristic function is
$\Phi_X(\omega) = e^{-\omega^2\sigma^2/2}\,\frac{1}{\sqrt{2\pi}\sigma}\int_{-\infty}^{\infty}e^{-(x - j\omega\sigma^2)^2/2\sigma^2}\,dx$
Let $y = \frac{x}{\sigma} - j\omega\sigma$, so that $dx = \sigma\,dy$.
Substituting the values,
$\Phi_X(\omega) = e^{-\omega^2\sigma^2/2}\,\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-y^2/2}\,dy = e^{-\omega^2\sigma^2/2}$
since the area under the Gaussian pdf is $\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-y^2/2}\,dy = 1$. Proved.
Example 3.40
Find the density function of the random variable X if the characteristic function is
$\Phi_X(\omega) = 1 - |\omega|$ for $|\omega| \le 1$; 0 otherwise. (Feb 2008)
Solution
Given the characteristic function
$\Phi_X(\omega) = 1 - |\omega|$ for $|\omega| \le 1$, 0 otherwise.
The density function is the inverse Fourier transform of the characteristic function (with $e^{-j\omega x}$):
$f_X(x) = \frac{1}{2\pi}\int_{-1}^{1}(1 - |\omega|)e^{-j\omega x}\,d\omega = \frac{1}{\pi}\int_0^1(1 - \omega)\cos(\omega x)\,d\omega$
Integrating by parts,
$\int_0^1(1 - \omega)\cos(\omega x)\,d\omega = \left[(1 - \omega)\frac{\sin(\omega x)}{x}\right]_0^1 + \frac{1}{x}\int_0^1\sin(\omega x)\,d\omega = \frac{1 - \cos x}{x^2}$
so
$f_X(x) = \frac{1 - \cos x}{\pi x^2} = \frac{\sin^2(x/2)}{2\pi(x/2)^2}$
Example 3.41
Show that the distribution for which the characteristic function is $e^{-|\omega|}$ has the density function
$f_X(x) = \frac{1}{\pi(1 + x^2)}$, $-\infty < x < \infty$
Solution
$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty}e^{-|\omega|}e^{-j\omega x}\,d\omega = \frac{1}{2\pi}\left[\int_{-\infty}^{0}e^{(1 - jx)\omega}\,d\omega + \int_0^{\infty}e^{-(1 + jx)\omega}\,d\omega\right]$
$= \frac{1}{2\pi}\left[\frac{1}{1 - jx} + \frac{1}{1 + jx}\right]$
$= \frac{1}{2\pi}\times\frac{2}{1 + x^2} = \frac{1}{\pi(1 + x^2)}$
Proved.
Example 3.42
The characteristic function for a Gaussian random variable X, having a mean value of zero,
is $\Phi_X(\omega) = e^{-\sigma^2\omega^2/2}$. Find the first few moments using $\Phi_X(\omega)$.
Example 3.15
Find the mean and variance of the Poisson distribution. (May 2011) Solution
We know that for a given random variable, the Poisson density function is given by
$f_X(x) = \sum_{K=0}^{\infty}e^{-\lambda}\frac{\lambda^K}{K!}\delta(x - K)$
For the random variable $\{X \mid x = K\}$, the probability mass function is
$P(x) = \frac{e^{-\lambda}\lambda^x}{x!}$ for $x = 0, 1, 2, \ldots$
where $\lambda = Np$.
The mean value of X is
$E[X] = \sum_{x=0}^{\infty}x\,P(x)$
Since the $x = 0$ term vanishes,
$E[X] = \sum_{x=1}^{\infty}\frac{e^{-\lambda}\lambda\,\lambda^{x-1}}{(x - 1)!} = \lambda\sum_{x=1}^{\infty}\frac{e^{-\lambda}\lambda^{x-1}}{(x - 1)!}$
Since $\sum_{x=1}^{\infty}\frac{e^{-\lambda}\lambda^{x-1}}{(x - 1)!} = 1$ (sum of all probabilities),
$E[X] = \lambda = Np$
The mean square value of X is
$E[X^2] = \sum_{x=0}^{\infty}x^2 P(x)$
$m_1 = -j\,\frac{d}{d\omega}\Phi_X(\omega)\Big|_{\omega=0} = -j\left(-\sigma^2\omega\,e^{-\sigma^2\omega^2/2}\right)\Big|_{\omega=0} = 0$
$m_2 = -\frac{d^2}{d\omega^2}\Phi_X(\omega)\Big|_{\omega=0} = -\left[(\sigma^4\omega^2 - \sigma^2)e^{-\sigma^2\omega^2/2}\right]_{\omega=0} = \sigma^2$
$m_3 = j\,\frac{d^3}{d\omega^3}\Phi_X(\omega)\Big|_{\omega=0} = 0$
since every odd-order derivative of $e^{-\sigma^2\omega^2/2}$ vanishes at $\omega = 0$.
Similarly, we can obtain all other moments.
Example 3.43
Show that the characteristic function of a random variable having a binomial density is
$\Phi_X(\omega) = \left[1 - p + p\,e^{j\omega}\right]^N$ (Nov 2010)
Solution
The binomial density function is
$P(x) = \binom{N}{x}p^x q^{N-x}$ for $x = 0, 1, 2, \ldots, N$, with $q = 1 - p$.
The characteristic function is
$\Phi_X(\omega) = E[e^{j\omega x}] = \sum_{x=0}^{N}e^{j\omega x}P(x) = \sum_{x=0}^{N}\binom{N}{x}(p\,e^{j\omega})^x(1 - p)^{N-x}$
Since $(a + b)^N = \sum_{x=0}^{N}\binom{N}{x}a^x b^{N-x}$,
$\Phi_X(\omega) = \left[1 - p + p\,e^{j\omega}\right]^N$ Proved.
Example 3.44
Find the moment generating function and the characteristic function of a Poisson distribution.
Solution
Given that the Poisson density function of X is
$P(x) = \frac{e^{-\lambda}\lambda^x}{x!}$, $x \ge 0$
The moment generating function of X is
$M_X(v) = E[e^{vX}] = \sum_{x=0}^{\infty}e^{vx}P(x) = e^{-\lambda}\sum_{x=0}^{\infty}\frac{(\lambda e^v)^x}{x!}$
'" XII We know that eX = L
:;;=0 n!
:. The moment generating function is Mx (v) := eA(e" -1). Similarly, the characteristic function ofX is given by
.,
Since $x^2 = x(x - 1) + x$,
$E[X^2] = \sum_{x=0}^{\infty}[x(x - 1) + x]\frac{e^{-\lambda}\lambda^x}{x!} = \lambda^2\sum_{x=2}^{\infty}\frac{e^{-\lambda}\lambda^{x-2}}{(x - 2)!} + \lambda = \lambda^2 + \lambda$
$\therefore$ The variance of the Poisson distribution is $\sigma_X^2 = E[X^2] - E[X]^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$
Example 3.17
The mean and variance of a binomial distribution are given by 6 and 2.4 respectively. Find P(X> 2). Solution
Given that it is a binomial distribution with mean $Np = 6$ and variance $Npq = 2.4$.
Then $6q = 2.4$, so $q = 0.4$
and $p = 1 - q = 1 - 0.4 = 0.6$
$N = \frac{6}{p} = \frac{6}{0.6} = 10$
The probability is
$P(X > 2) = 1 - P[X \le 2] = 1 - \sum_{x=0}^{2}\binom{10}{x}p^x q^{10 - x}$
$= 1 - \left[\binom{10}{0}(0.4)^{10} + \binom{10}{1}(0.6)(0.4)^9 + \binom{10}{2}(0.6)^2(0.4)^8\right]$
$= 1 - \left[1.049\times10^{-4} + 1.573\times10^{-3} + 0.0106\right]$
$P[X > 2] = 1 - 0.0123 = 0.9877$
Example 3.18
The probability mass function of a discrete random variable is given as
$x_i$: $-\frac{1}{2}$, $\frac{1}{2}$
$P(x_i)$: $\frac{1}{4}$, $\frac{3}{4}$
(a) Find the moment generating function and the moments. (b) Find the mgf for equiprobable events.
Solution
(a) The moment generating function for a discrete random variable is
$M_X(v) = E[e^{vX}] = \sum_i e^{vx_i}P(x_i)$
From the given table,
$M_X(v) = e^{-v/2}\left(\frac{1}{4}\right) + e^{v/2}\left(\frac{3}{4}\right) = \frac{3}{4}e^{v/2} + \frac{1}{4}e^{-v/2}$
Using the infinite series expansion $e^{\pm v/2} = \sum_n \frac{(\pm v/2)^n}{n!}$,
$M_X(v) = \sum_n \frac{v^n}{n!}\cdot\frac{1}{2^n}\left[\frac{3}{4} + \frac{(-1)^n}{4}\right]$
But we know that
$M_X(v) = 1 + v\,m_1 + \frac{v^2}{2!}m_2 + \cdots + \frac{v^n}{n!}m_n + \cdots$
Equating the $n$th terms of $M_X(v)$,
$m_n = \frac{1}{2^n}\left[\frac{3}{4} + \frac{(-1)^n}{4}\right]$
The moments are
$m_1 = \frac{1}{2}\left(\frac{3}{4} - \frac{1}{4}\right) = \frac{1}{4}$, $m_2 = \frac{1}{4}\left(\frac{3}{4} + \frac{1}{4}\right) = \frac{1}{4}$
$m_3 = \frac{1}{8}\left(\frac{3}{4} - \frac{1}{4}\right) = \frac{1}{16}$, $m_4 = \frac{1}{16}\left(\frac{3}{4} + \frac{1}{4}\right) = \frac{1}{16}$, and so on.
(b) From the $n$th moment it is observed that $m_n = (x_1)^n P(x_1) + (x_2)^n P(x_2)$.
If $P(x_1) = P(x_2) = \frac{1}{2}$ (equiprobable), then
$m_1 = 0$, $m_2 = \frac{1}{4}$, $m_3 = 0$, $m_4 = \frac{1}{16}$, ...
Hence $m_n = \frac{1}{2^n}$ for even $n$, and $m_n = 0$ for odd $n$.
Example 3.19
If the pdf of a random variable X is given by $f_X(x) = b\,e^{-a|x|}$, where a and b are real constants, find the moment generating function, mean and variance.
Solution
Given $f_X(x) = b\,e^{-a|x|}$.
(i) The moment generating function is
$M_X(v) = E[e^{vX}] = b\left[\int_{-\infty}^{0}e^{vx}e^{ax}\,dx + \int_0^{\infty}e^{vx}e^{-ax}\,dx\right] = b\left[\int_{-\infty}^{0}e^{(a + v)x}\,dx + \int_0^{\infty}e^{-(a - v)x}\,dx\right]$
$= b\left[\frac{1}{a + v}e^{(a + v)x}\Big|_{-\infty}^{0} - \frac{1}{a - v}e^{-(a - v)x}\Big|_0^{\infty}\right] = b\left[\frac{1}{a + v}(1 - 0) - \frac{1}{a - v}(0 - 1)\right]$
$= b\left[\frac{1}{a + v} + \frac{1}{a - v}\right] = \frac{2ab}{a^2 - v^2}$
(ii) The mean value is
$m_1 = \frac{d}{dv}M_X(v)\Big|_{v=0} = 2ab\,\frac{2v}{(a^2 - v^2)^2}\Big|_{v=0} = 0$
(iii) Since the mean is zero, the variance is
$\sigma_X^2 = m_2 = \frac{d^2}{dv^2}M_X(v)\Big|_{v=0} = \frac{4ab}{a^4} = \frac{4b}{a^3}$
Example 3.20
Find the mgf for a discrete random variable $X \in \{x_1, x_2, \ldots, x_n\}$ with pmf $P(x_i)$, $i = 1, 2, \ldots, n$, and hence show that the $n$th moment is $m_n = x_1^n P(x_1) + x_2^n P(x_2) + x_3^n P(x_3) + \cdots$ Solution
The moment generating function for a discrete random variable is
$M_X(v) = E[e^{vX}] = \sum_i e^{vx_i}P(x_i)$
Using the infinite series expansion,
$M_X(v) = e^{vx_1}P(x_1) + e^{vx_2}P(x_2) + e^{vx_3}P(x_3) + \cdots$
$= [P(x_1) + P(x_2) + \cdots] + v[x_1P(x_1) + x_2P(x_2) + \cdots] + \frac{v^2}{2!}[x_1^2P(x_1) + x_2^2P(x_2) + \cdots] + \cdots + \frac{v^n}{n!}[x_1^nP(x_1) + x_2^nP(x_2) + \cdots] + \cdots$
Since $P(x_1) + P(x_2) + \cdots = 1$,
$M_X(v) = 1 + v[x_1P(x_1) + x_2P(x_2) + \cdots] + \frac{v^2}{2!}[x_1^2P(x_1) + x_2^2P(x_2) + \cdots] + \cdots$
We know that
$M_X(v) = 1 + v\,m_1 + \frac{v^2}{2!}m_2 + \frac{v^3}{3!}m_3 + \cdots + \frac{v^n}{n!}m_n + \cdots$
Comparing the two expansions of $M_X(v)$, the $n$th order moment is
$m_n = x_1^nP(x_1) + x_2^nP(x_2) + x_3^nP(x_3) + \cdots + x_n^nP(x_n)$ Proved
Example 3.21
If $f_X(x) = \frac{1}{\sqrt{2\pi}}e^{-x^2/2}$, find the density function for $Y = \frac{X^2}{9}$.
Solution
Given the transformation $Y = \frac{X^2}{9}$, so $X = \pm3\sqrt{Y}$. The roots are $x_1 = -3\sqrt{y}$ and $x_2 = 3\sqrt{y}$, with
$\left|\frac{dx_1}{dy}\right| = \left|\frac{dx_2}{dy}\right| = \frac{3}{2\sqrt{y}}$
We know that the density function of the transformed random variable Y is
$f_Y(y) = f_X(x_1)\left|\frac{dx_1}{dy}\right| + f_X(x_2)\left|\frac{dx_2}{dy}\right|$
$f_Y(y) = \frac{3}{2\sqrt{y}}\cdot\frac{1}{\sqrt{2\pi}}e^{-9y/2} + \frac{3}{2\sqrt{y}}\cdot\frac{1}{\sqrt{2\pi}}e^{-9y/2} = \frac{3}{\sqrt{2\pi y}}e^{-9y/2}$, $y > 0$
More Solved Examples
Example 5.24
Random variables X and Y have the joint density
$f_{X,Y}(x, y) = \frac{1}{24}$, $0 < x < 6$, $0 < y < 4$.
What is the expected value of the function $g(x, y) = (XY)^2$? (Nov 2008, May 2010, May 2011)
Solution
Given the joint density function $f_{X,Y}(x, y) = \frac{1}{24}$ for $0 < x < 6$, $0 < y < 4$, and 0 otherwise,
$E[(XY)^2] = \int_0^4\int_0^6 x^2y^2\,\frac{1}{24}\,dx\,dy = \frac{1}{24}\left[\frac{x^3}{3}\right]_0^6\left[\frac{y^3}{3}\right]_0^4 = \frac{1}{24}(72)\left(\frac{64}{3}\right) = 64$
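The double integral separates over x and y, which a two-line check confirms:

```python
Ex2 = 6**3 / 3.0          # integral of x^2 over [0, 6] = 72
Ey2 = 4**3 / 3.0          # integral of y^2 over [0, 4] = 64/3
E = Ex2 * Ey2 / 24.0      # E[(XY)^2]
print(E)
```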
Example 5.25
The joint characteristic function of two random variables X and Y is $\Phi_{XY}(\omega_1, \omega_2) = \exp(-2\omega_1^2 - 8\omega_2^2)$. Find (i) the mean values of X and Y and (ii) the correlation $R_{XY}$.
Solution
Given the characteristic function
$\Phi_{XY}(\omega_1, \omega_2) = \exp(-2\omega_1^2 - 8\omega_2^2)$
The joint moments of order $n + k$ are
$m_{nk} = (-j)^{n+k}\frac{\partial^{n+k}}{\partial\omega_1^n\,\partial\omega_2^k}\Phi_{XY}(\omega_1, \omega_2)\Big|_{\omega_1 = \omega_2 = 0}$
(i) The first order moments are
$m_{10} = -j\,\frac{\partial}{\partial\omega_1}\exp(-2\omega_1^2 - 8\omega_2^2)\Big|_{\omega_1 = \omega_2 = 0} = -j(-4\omega_1)\exp(-2\omega_1^2 - 8\omega_2^2)\Big|_{0} = 0$
$\therefore$ The mean value of X is zero.
$m_{01} = -j\,\frac{\partial}{\partial\omega_2}\Phi_{XY}(\omega_1, \omega_2)\Big|_{\omega_1 = \omega_2 = 0} = 0$
$\therefore$ The mean value of Y is zero.
(ii) The correlation of X and Y is $R_{XY} = E[XY] = m_{11}$:
$m_{11} = (-j)^2\frac{\partial^2\Phi_{XY}(\omega_1, \omega_2)}{\partial\omega_1\,\partial\omega_2}\Big|_{\omega_1 = \omega_2 = 0} = -\left[(-4\omega_1)(-16\omega_2)\exp(-2\omega_1^2 - 8\omega_2^2)\right]_{\omega_1 = \omega_2 = 0} = 0$
Since $R_{XY} = 0 = E[X]E[Y]$, X and Y are uncorrelated.
The two random variables are statistically independent. For orthogonality, the condition is $E[X_1X_2] = 0$.
Now $E[X_1X_2] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}x_1x_2\,f_{X_1X_2}(x_1, x_2)\,dx_1\,dx_2 = 0$
$E[X_1X_2] = 0$, $\therefore$ $X_1$ and $X_2$ are orthogonal. Hence the random variables $X_1$ and $X_2$ are independent
and orthogonal. Example 5.27
If X and Y are two independent random variables such that $E[X] = \lambda_1$, variance of X $= \sigma_1^2$, $E[Y] = \lambda_2$ and variance of Y $= \sigma_2^2$, prove that
$\text{var}[XY] = \sigma_1^2\sigma_2^2 + \lambda_1^2\sigma_2^2 + \lambda_2^2\sigma_1^2$ (May 2010)
Solution
Given EfXl = A" E[Y] = ~
Variance of X =0'12 , Variance nfY a;
Now variance of X; E[(xy)2] E[XYf
var[XY] = E[X2 y2] - E[XY]E[XY]
Since X and Yare independent,
E[XY] =E[X]E[Y]
-
--- --------
348 Probability Theory and Stochastic Processes
We know that Variance of X =a)2 =E[X2 ] - E[X]2
.. E[X2] =a12 +E[X] =a12 +~2 and Variance of Y =a; =E[y2] - E[y]2
E[y2] = a; +E[y]2 = a; +A; So, var[XY] =(a12 +~2)(a; +A;) - ~2A;
=a;a; + ~2a; +A;a12 + ~2A; - ~2A; var[XY] =a)2a; + ~2a; + A;a; Proved.
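The identity just proved can be spot-checked by simulation. The means and variances below are illustrative values (not from the example), and independent Gaussians are used only because they are easy to sample.

```python
import random

random.seed(1)

# Monte Carlo check of var[XY] = s1^2*s2^2 + l1^2*s2^2 + l2^2*s1^2 for
# independent X, Y (illustrative Gaussian choices for the distributions).
l1, s1 = 2.0, 1.5   # E[X] and std of X
l2, s2 = -1.0, 0.5  # E[Y] and std of Y
theory = (s1 * s2) ** 2 + l1**2 * s2**2 + l2**2 * s1**2

n = 300_000
prods = [random.gauss(l1, s1) * random.gauss(l2, s2) for _ in range(n)]
mean = sum(prods) / n
var = sum((p - mean) ** 2 for p in prods) / n
print(round(theory, 3), round(var, 3))
```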
Example 5.28
Three statistically independent random variables X1, X2 and X3 have mean values X̄1 = 3, X̄2 = 6 and X̄3 = -2. Find the mean values of the following functions:
(a) g(X1,X2,X3) = X1 + 3X2 + 4X3  (b) g(X1,X2,X3) = X1X2X3
(c) g(X1,X2,X3) = -2X1X2 - 3X1X3 + 4X2X3  (d) g(X1,X2,X3) = X1 + X2 + X3   (Nov 2007, May 2009)
Solution
Given X1, X2 and X3 are statistically independent, with
X̄1 = 3, X̄2 = 6, X̄3 = -2
(a) The mean value of g(X1,X2,X3) = X1 + 3X2 + 4X3 is
E[g(X1,X2,X3)] = E[X1] + 3E[X2] + 4E[X3] = 3 + 3×6 + 4×(-2) = 13
(b) The mean value of g(X1,X2,X3) = X1X2X3:
E[g(X1,X2,X3)] = E[X1X2X3]
Since X1, X2 and X3 are independent,
E[X1X2X3] = E[X1]E[X2]E[X3] = 3 × 6 × (-2) = -36
(c) The mean of the function g(X1,X2,X3) = -2X1X2 - 3X1X3 + 4X2X3 is
E[g(X1,X2,X3)] = E[-2X1X2 - 3X1X3 + 4X2X3]
               = -2E[X1X2] - 3E[X1X3] + 4E[X2X3]
               = -2E[X1]E[X2] - 3E[X1]E[X3] + 4E[X2]E[X3]
               = -2×3×6 - 3×3×(-2) + 4×6×(-2) = -66
(d) The mean value of the function g(X1,X2,X3) = X1 + X2 + X3 is
E[g(X1,X2,X3)] = E[X1] + E[X2] + E[X3] = 3 + 6 - 2 = 7
Example 5.29
A joint density is given as
fX,Y(x,y) = x(y + 1.5) for 0 < x < 1, 0 < y < 1, and 0 elsewhere.
Find all the joint moments m_nk, n and k = 0, 1, 2, ...
Solution
The joint moments are
m_nk = E[X^n Y^k] = ∫₀¹ ∫₀¹ x^n y^k x(y + 1.5) dx dy = ∫₀¹ x^(n+1) dx ∫₀¹ (y^(k+1) + 1.5y^k) dy
     = [x^(n+2)/(n+2)]₀¹ [y^(k+2)/(k+2) + 1.5 y^(k+1)/(k+1)]₀¹
     = (1/(n+2)) (1/(k+2) + 3/(2(k+1)))
m_nk = (5k + 8)/(2(n+2)(k+1)(k+2)),  n and k = 0, 1, 2, ...
Example 5.30
Statistically independent random variables X and Y have moments m10 = 2, m20 = 14, m02 = 12 and m11 = -6. Find the moment μ22.   (Feb 2007, Feb 2008, May 2011)
Solution
Given the moments m10 = 2, m20 = 14, m02 = 12 and m11 = -6.
The moment μ22 is a central moment, and for independent X and Y it factors as
μ22 = E[(X - X̄)²(Y - Ȳ)²] = [E[X²] - E[X]²][E[Y²] - E[Y]²]
    = (m20 - m10²)(m02 - m01²)
Since X and Y are independent, m11 = m10 m01, so
m01 = m11/m10 = -6/2 = -3
μ22 = (14 - 2²)(12 - (-3)²) = (10)(3) = 30
Example 5.31
For two random variables X and Y,
fX,Y(x,y) = 0.3δ(x+1)δ(y) + 0.1δ(x)δ(y) + 0.1δ(x)δ(y-2) + 0.15δ(x-1)δ(y+2) + 0.2δ(x-1)δ(y-1) + 0.15δ(x-1)δ(y-3)
Find (a) The correlation b) The covariance (c) The correlation coefficient of X and Y (d) Are X and Yeither uncorre1ated or orthogonal?
(Nov 2006, May 2009) Solution
Given X and Yare discrete random variables.
The density function / xr(x,y) is a probability mass function
Pxr(x,y) =/xr(x,y)
The values can be put in the Table 5A.
Table 5.4 Probability PXY(x,y)
(X,Y):     (-1,0)  (0,0)  (0,2)  (1,-2)  (1,1)  (1,3)
PXY(x,y):   0.3     0.1    0.1    0.15    0.2    0.15
(a) The correlation of X and Y is
RXY = E[XY] = ΣΣ xy PXY(x,y) = (1)(-2)(0.15) + (1)(1)(0.2) + (1)(3)(0.15)
    = -0.3 + 0.2 + 0.45 = 0.35
∴ RXY = 0.35
(b) The covariance ofX and Y is CXy=Rxr -E[X]E[Y]
Now E[X] = Σ x PXY(x, ∞), where
PXY(x, ∞) = 0.3δ(x+1) + 0.1δ(x) + 0.1δ(x) + 0.15δ(x-1) + 0.2δ(x-1) + 0.15δ(x-1)
          = 0.3δ(x+1) + 0.2δ(x) + 0.5δ(x-1)
E[X] = (0.3)(-1) + (0.2)(0) + (0.5)(1) = -0.3 + 0 + 0.5 = 0.2
Also E[Y] = Σ y PXY(∞, y)
where PXY(∞, y) = 0.3δ(y) + 0.1δ(y) + 0.1δ(y-2) + 0.15δ(y+2) + 0.2δ(y-1) + 0.15δ(y-3)
               = 0.4δ(y) + 0.1δ(y-2) + 0.15δ(y+2) + 0.2δ(y-1) + 0.15δ(y-3)
E[Y] = 0.4(0) + 0.1(2) + 0.15(-2) + 0.2(1) + 0.15(3) = 0 + 0.2 - 0.3 + 0.2 + 0.45 = 0.55
CXY = RXY - E[X]E[Y] = 0.35 - (0.2)(0.55) = 0.24
CXY = 0.24
(c) The correlation coefficient of X and Y is
ρXY = CXY/(σX σY)
Now σX² = E[X²] - E[X]² and σY² = E[Y²] - E[Y]²
Then E[X²] = Σ x² PXY(x, ∞) = (0.3)(-1)² + (0.2)(0) + (0.5)(1)² = 0.3 + 0.5 = 0.8
and E[Y²] = Σ y² PXY(∞, y) = (0.4)(0) + (0.1)(2)² + (0.15)(-2)² + (0.2)(1)² + (0.15)(3)²
          = 0 + 0.4 + 0.6 + 0.2 + 1.35 = 2.55
σX² = 0.8 - 0.2² = 0.76
σY² = 2.55 - 0.55² = 2.2475
ρXY = 0.24/(√0.76 √2.2475) = 0.1836
(d) Since CXY = 0.24 ≠ 0 and RXY = 0.35 ≠ 0, the given random variables are neither orthogonal nor uncorrelated.
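All of the statistics in this example follow mechanically from the six point masses, so they can be recomputed directly:

```python
import math

# Recompute the statistics of Example 5.31 from the point masses.
pmf = {(-1, 0): 0.3, (0, 0): 0.1, (0, 2): 0.1,
       (1, -2): 0.15, (1, 1): 0.2, (1, 3): 0.15}

Ex  = sum(p * x for (x, y), p in pmf.items())
Ey  = sum(p * y for (x, y), p in pmf.items())
Rxy = sum(p * x * y for (x, y), p in pmf.items())
Ex2 = sum(p * x * x for (x, y), p in pmf.items())
Ey2 = sum(p * y * y for (x, y), p in pmf.items())

Cxy = Rxy - Ex * Ey
rho = Cxy / math.sqrt((Ex2 - Ex**2) * (Ey2 - Ey**2))
print(Ex, Ey, Rxy, round(Cxy, 4), round(rho, 4))
```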
Example 5.32
Two random variables X and Y have means X̄ = 1 and Ȳ = 2, variances σX² = 4 and σY² = 1, and correlation coefficient ρXY = 0.4. New random variables V and W are defined by
V = -X + 2Y and W = X + 3Y
Find (a) the means, (b) the variances, (c) the correlations and (d) the correlation coefficient ρVW of V and W.
Solution
Given X̄ = 1, Ȳ = 2
σX² = 4, σY² = 1, ρXY = 0.4
V = -X + 2Y and W = X + 3Y
(a) Mean values
E[V] = E[-X + 2Y] = -E[X] + 2E[Y] = -1 + 2×2 = 3
E[W] = E[X + 3Y] = E[X] + 3E[Y] = 1 + 3×2 = 7
(b) Variances
σV² = E[V²] - E[V]² = E[(-X + 2Y)²] - 3²
    = E[X² + 4Y² - 4XY] - 9
    = E[X²] + 4E[Y²] - 4E[XY] - 9
Now we know that
E[X²] = σX² + E[X]² = 4 + 1 = 5 and E[Y²] = σY² + E[Y]² = 1 + 4 = 5
Also ρXY = CXY/(σX σY)
or CXY = E[XY] - E[X]E[Y] = ρXY σX σY
E[XY] = ρXY σX σY + E[X]E[Y]
= 0.4 × 2 × 1 + 1 × 2 = 2.8
Substituting the values,
σV² = 5 + 4×5 - 4×2.8 - 9 = 4.8
and σW² = E[W²] - E[W]²
        = E[(X + 3Y)²] - 7²
        = E[X² + 9Y² + 6XY] - 49 = 5 + 9×5 + 6×2.8 - 49 = 17.8
(c) Correlation
E[VW] = E[(-X + 2Y)(X + 3Y)]
      = E[-X² - XY + 6Y²]
      = -E[X²] - E[XY] + 6E[Y²]
      = -5 - 2.8 + 6×5 = 22.2
(d) Correlation coefficient
ρVW = CVW/(σV σW) = (E[VW] - E[V]E[W])/(σV σW)
    = (22.2 - 3×7)/(√4.8 √17.8) = 1.2/9.243 = 0.13
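The same numbers can be reproduced from the standard moment identities for linear combinations of correlated variables:

```python
import math

# Check of Example 5.32 via moment identities:
# V = -X + 2Y, W = X + 3Y with E[X]=1, E[Y]=2, var X=4, var Y=1, rho=0.4.
Ex, Ey, vx, vy, rho = 1.0, 2.0, 4.0, 1.0, 0.4
Cxy = rho * math.sqrt(vx * vy)      # covariance of X and Y
Exy = Cxy + Ex * Ey                 # E[XY] = 2.8

Ev, Ew = -Ex + 2 * Ey, Ex + 3 * Ey
# var(aX + bY) = a^2 var X + b^2 var Y + 2ab cov(X,Y)
vV = vx + 4 * vy - 4 * Cxy
vW = vx + 9 * vy + 6 * Cxy
# E[VW] = E[-X^2 - XY + 6Y^2]
Evw = -(vx + Ex**2) - Exy + 6 * (vy + Ey**2)
rho_vw = (Evw - Ev * Ew) / math.sqrt(vV * vW)
print(Ev, Ew, vV, vW, Evw, round(rho_vw, 2))
```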
Example 5.33
Two random variables X and Y have the density function
fX,Y(x,y) = (2/43)(x + 0.5y)² for 0 < x < 2 and 0 < y < 3, and 0 elsewhere.
Find (i) the second-order moments, (ii) the covariance and (iii) whether X and Y are uncorrelated.
Solution
Given fX,Y(x,y) = (2/43)(x + 0.5y)², 0 < x < 2 and 0 < y < 3.
(i) The first-order moments are X̄ = m10 = (2/43)(28.5) = 1.326 and Ȳ = m01 = (2/43)(40.125) = 1.866. The second-order moments are:
m20 = (2/43)[96/5 + 6 + 18] = 2.009
m02 = (2/43)[24 + 243/10 + 81/2] = 4.13
(ii) Covariance CXY = μ11 = E[(X - X̄)(Y - Ȳ)] = m11 - X̄ Ȳ
Now m11 = (2/43)[18 + 81/8 + 24] = 2.424
CXY = 2.424 - (1.326)(1.866) = -0.05
(iii) Since the covariance CXY ≠ 0, X and Y are not uncorrelated.
Example 5.34
If X, Y and Z are uncorrelated and independent variables with the same variance 0'2 and zero mean. fmd the correlation coefficient between (X + Y) and (Y +Z). Solution
Given E[X] E[Y] E[Z]=O 2 2 2 2O'x =O'y =O'z 0'
So, E[X2] :::: E[y2] =E[z2] =0'2
Since X, Y, Z are independent,
E[XY] =E[YZ] E[XZ]=O
Also E[X +Y] =E(Y +Z] 0
Now the variance of (X + Y) is
σ²(X+Y) = E[(X + Y)²] - E[X + Y]² = E[X²] + E[Y²] + 2E[XY] = 2σ²
Also σ²(Y+Z) = E[(Y + Z)²] - E[Y + Z]² = E[Y²] + E[Z²] = 2σ²
and cov[(X + Y), (Y + Z)] = E[(X + Y)(Y + Z)] - E[X + Y]E[Y + Z]
                          = E[XY] + E[Y²] + E[XZ] + E[YZ] = σ²
∴ The correlation coefficient between (X + Y) and (Y + Z) is
ρ(X+Y)(Y+Z) = cov[(X + Y), (Y + Z)]/(σ(X+Y) σ(Y+Z))
            = σ²/(√(2σ²) √(2σ²)) = σ²/(2σ²) = 1/2
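A quick Monte Carlo run confirms the 1/2 correlation; standard normal variates are an illustrative choice for the common zero-mean distribution.

```python
import random, math

random.seed(3)

# Monte Carlo check that corr(X+Y, Y+Z) = 1/2 for i.i.d. zero-mean X, Y, Z.
n = 200_000
u = [random.gauss(0, 1) for _ in range(n)]
v = [random.gauss(0, 1) for _ in range(n)]
w = [random.gauss(0, 1) for _ in range(n)]
a = [x + y for x, y in zip(u, v)]   # X + Y
b = [y + z for y, z in zip(v, w)]   # Y + Z

ma, mb = sum(a) / n, sum(b) / n
cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / n
sa = math.sqrt(sum((ai - ma) ** 2 for ai in a) / n)
sb = math.sqrt(sum((bi - mb) ** 2 for bi in b) / n)
corr = cov / (sa * sb)
print(round(corr, 2))
```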
Example 5.35
Joint density function of two random variables is given by
fX,Y(x,y) = 18y²/x³ for x > 2, 0 < y < 1, and 0 elsewhere.
Find the marginal density functions.
Solution
fX(x) = ∫₀¹ (18y²/x³) dy = [6y³/x³]₀¹ = 6/x³,  x > 2
fY(y) = ∫₂^∞ (18y²/x³) dx = 18y² [-1/(2x²)]₂^∞ = 18y² (1/8)
∴ fY(y) = (9/4) y²,  0 < y < 1
Now fX(x) = ∫₀^∞ e^(-x) e^(-y) dy u(x) = e^(-x) u(x)
Similarly, fY(y) = e^(-y) u(y)
fX,Y(x,y) = fX(x) fY(y)
:. X and Yare statistically independent.
Example 4.7
If the joint pdf of X and Y is
fX,Y(x,y) = (1/(2πσ²)) e^(-(x²+y²)/2σ²)
find the probability that (X, Y) lies in the circular region x² + y² ≤ a².
Solution
Given fX,Y(x,y) = (1/(2πσ²)) e^(-(x²+y²)/2σ²) and the circular region x² + y² ≤ a²,
as shown in Fig 4.5. The mass in the circle (probability) is
Fig. 4.5 Circle region x² + y² = a²
P_region = ∫∫_region fX,Y(x,y) dx dy
Converting x and y into polar coordinates, r ≤ a and 0 ≤ θ ≤ 2π:
x = r cos θ, y = r sin θ
r² = x² + y², dx dy = r dr dθ
-
Multiple Random Variables 251
P = (1/(2πσ²)) ∫₀^2π ∫₀^a r e^(-r²/2σ²) dr dθ
  = (2π/(2πσ²)) ∫₀^a r e^(-r²/2σ²) dr
  = [-e^(-r²/2σ²)]₀^a = 1 - e^(-a²/2σ²)
∴ The mass in the given region is 1 - e^(-a²/2σ²).
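The closed form 1 - e^(-a²/2σ²) can be checked by sampling the two independent Gaussians directly; σ and a below are arbitrary illustrative values.

```python
import random, math

random.seed(5)

# Monte Carlo check of P{X^2 + Y^2 <= a^2} = 1 - exp(-a^2/(2*sigma^2))
# for independent zero-mean Gaussians X, Y with common variance sigma^2.
sigma, a = 2.0, 3.0
theory = 1.0 - math.exp(-a**2 / (2.0 * sigma**2))

n = 200_000
hits = sum(random.gauss(0, sigma) ** 2 + random.gauss(0, sigma) ** 2 <= a**2
           for _ in range(n))
emp = hits / n
print(round(theory, 3), round(emp, 3))
```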
Additional Problems
Example 4.8
(a) Find a constant b (in terms of a) so that the function
fX,Y(x,y) = b e^(-(x+y)) for 0 < x < a, 0 < y < ∞, and 0 elsewhere
is a valid joint density function.
The limits are: If x = 1, y = 2 - 3 = -1
If x = 5, y = 10 - 3 = 7
_I(y+3) -I
-
Operations on One Random Variable 217
Now dy = sin θ dθ
or |dθ| = dy/sin θ
Example 3.50
A random variable X is unifonnly distributed in (0, 6). IfX is transformed to a new random variable Y = 2(X _3)2 - 4, fmd (i) the density of Y, (ii) Y and (iii) a;. (Nov 2010) Solution
Given fX(x) = 1/6 for 0 < x < 6, and 0 otherwise.
The transformed variable is
Y = 2(X - 3)² - 4 = 2(X² + 9 - 6X) - 4 = 2X² - 12X + 14
or X² - 6X + (14 - Y)/2 = 0
The roots are
X = (6 ± √(36 - 28 + 2Y))/2 = 3 ± (1/2)√(8 + 2Y)
or x = 3 ± 0.707√(y + 4)
x1 = 3 + 0.707√(y + 4), x2 = 3 - 0.707√(y + 4)
dx1/dy = 0.707/(2√(y + 4)),  dx2/dy = -0.707/(2√(y + 4))
(i) We know that the density function of Y is
fY(y) = (0.707/(2√(y + 4)))(1/6 + 1/6) = (1/6) × 0.707/√(y + 4)
fY(y) = 0.1178/√(y + 4)
The limits for Y: the minimum y = -4 occurs at x = 3, and y = 14 at x = 0 and x = 6, so
fY(y) = 0.1178/√(y + 4),  -4 < y < 14
(ii) The mean of Y is
Ȳ = ∫ y fY(y) dy = ∫ from -4 to 14 of 0.1178 y/√(y + 4) dy
Substituting u = y + 4,
  = 0.1178 ∫₀^18 (u - 4) u^(-1/2) du = 0.1178 [(2/3)u^(3/2) - 8u^(1/2)]₀^18
  = 0.1178[50.91 - 33.94] = 2
Ȳ = 2
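The mean Ȳ = 2 (which also follows directly from E[(X-3)²] = var(X) = 6²/12 = 3, so E[Y] = 2×3 - 4 = 2) can be confirmed by simulation:

```python
import random

random.seed(9)

# Monte Carlo check that Y = 2(X-3)^2 - 4 with X ~ Uniform(0,6) has mean 2.
n = 400_000
ys = [2.0 * (random.uniform(0.0, 6.0) - 3.0) ** 2 - 4.0 for _ in range(n)]
mean_y = sum(ys) / n
print(round(mean_y, 2))
```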
Example 3.51
Let X be a continuous random variable with pdf
X 1
e^(-a) = 1 - (1/b)
or a = ln(b/(b - 1))
Example 4.9
Discrete random variables X and Y have a joint distribution function
FX,Y(x,y) = 0.1u(x+4)u(y-1) + 0.15u(x+3)u(y-1) + 0.22u(x)u(y-3)
          + 0.18u(x-2)u(y+2) + 0.23u(x-4)u(y+2) + 0.12u(x-4)u(y+3)
Find (a) the marginal distribution functions FX(x) and FY(y); (b) P{-1 < X ≤ 4, -3 < Y ≤ 3}.
Solution
Given the joint distribution function above.
(a) The marginal distribution functions are
FX(x) = FX,Y(x, ∞) = 0.1u(x+4) + 0.15u(x+3) + 0.22u(x) + 0.18u(x-2) + 0.23u(x-4) + 0.12u(x-4)
FX(x) = 0.1u(x+4) + 0.15u(x+3) + 0.22u(x) + 0.18u(x-2) + 0.35u(x-4)
and FY(y) = FX,Y(∞, y) = 0.1u(y-1) + 0.15u(y-1) + 0.22u(y-3) + 0.18u(y+2) + 0.23u(y+2) + 0.12u(y+3)
FY(y) = 0.25u(y-1) + 0.22u(y-3) + 0.41u(y+2) + 0.12u(y+3)
Figure 4.6 shows the marginal distribution functions.
Fig. 4.6 Marginal distribution functions (a) FX(x), (b) FY(y)
(b) P{-1 < X ≤ 4, -3 < Y ≤ 3} = 0.22 + 0.18 + 0.23 = 0.63
Example 4.10
Show that the function
X
(a) Find and sketch FX,y(x,y). (b) If a
Figure 4.7 shows the joint distribution function FX,Y(x,y).
Fig. 4.7 Distribution function FX,Y(x,y)
(b) If a < b, find P{X + Y ≤ 3a/4}.
Now take the limit x + y = 3a/4, or x = 3a/4 - y.
∴ The probability is
P{X + Y ≤ 3a/4} = ∫ from y=0 to 3a/4 ∫ from x=0 to 3a/4-y of (1/ab) dx dy
               = (1/ab) ∫₀^(3a/4) (3a/4 - y) dy
               = (1/ab) [(3a/4)y - y²/2]₀^(3a/4)
               = (1/ab)(9a²/16 - 9a²/32)
P{X + Y ≤ 3a/4} = 9a/(32b)
Example 4.12
Determine a constant b such that the given function is a valid joint density function. 2 +4y2 )f ( ) {b(X O~lxl
Solution
Given o:::;1 x 1
Fig. 4.8 Area of the density function
In polar coordinates r = √(x² + y²), θ = tan⁻¹(y/x), 0 ≤ θ ≤ 2π, and dx dy = r dr dθ.
(a) Since the function is a valid density function,
∫∫ over the circular plane of ((x² + y²)/8π) dx dy = 1
∫₀^2π ∫₀^√b (r²/8π) r dr dθ = 1
(r⁴/(32π))|₀^√b × 2π = 1
b²/16 = 1, b² = 16, b = 4
(b) The probability P{0.2b < X² + Y² ≤ 0.6b} is, after converting into polar coordinates,
P{0.2b < r² ≤ 0.6b} or P{√(0.2b) < r ≤ √(0.6b)}
= ∫₀^2π ∫ from √(0.2b) to √(0.6b) of (r³/8π) dr dθ
= (r⁴/(32π))| from √(0.2b) to √(0.6b) × 2π
= (1/16)[(0.6b)² - (0.2b)²]
= (1/16)(0.36 - 0.04)b² = (0.32/16)(16) = 0.32
∴ P{0.2b < X² + Y² ≤ 0.6b} = 0.32
Example 4.14
The joint density function ofX and Y is given by ax2y O
The marginal density functions are
fx(x) == L=xfx,y(x,y)dy 0
The marginal distribution functions are
x
K/8 = 1
K = 8
(b) The marginal pdfs are
fAx) = Lxfx,y(x,y)~ O
Solution
Given the joint density function,
.!.(X+ y) O
Since E[X1²] = E[X2²] = 1 and E[X1X2] = 0,
RY1Y2(τ) = cos ωt sin ω(t+τ) + sin ωt cos ω(t+τ)
         = sin(ωt + ωτ + ωt)
RY1Y2(τ) = sin(2ωt + ωτ) = sin ω(2t + τ)
The cross correlation is a function of time t. Therefore, Y1(t) and Y2(t) are not jointly WSS.
Example 6.6
For a given random process X(t), the mean value is X̄ = 6 and the autocorrelation is
RXX(τ) = 36 + 25e^(-|τ|)
Find (a) the average power of the process X(t) and (b) the variance of X(t).
Solution
Given RXX(τ) = 36 + 25e^(-|τ|).
(a) The average power is
E[X²(t)] = RXX(0) = 36 + 25e^0 = 36 + 25 = 61
∴ The average power is 61 watts.
(b) The variance of X(t) is
σX² = RXX(0) - X̄² = 61 - 6² = 61 - 36 = 25
Example 6.7
The autocorrelation function of a stationary random process X(t) is given by
RXX(τ) = 36 + 16/(1 + 8τ²)
Find the mean, mean square and variance of the process.
Solution
Given the autocorrelation function
RXX(τ) = 36 + 16/(1 + 8τ²)
(a) Since the process is stationary, we know that
-
Random Processes 401
X̄² = lim as τ → ∞ of RXX(τ) = 36, so X̄ = 6
The mean square value is E[X²] = RXX(0) = 36 + 16 = 52, and the variance is σX² = 52 - 36 = 16.
-8.71]"-6.83
7
       [16 - 9          16e^(-2) - 9    16e^(-4) - 9]
[CX] = [16e^(-2) - 9    16 - 9          16e^(-2) - 9]
       [16e^(-4) - 9    16e^(-2) - 9    16 - 9      ]
More Solved Examples
Example 6.9
A random process is described by X(t) = A, where A is a continuous random variable uniformly distributed on (0,1). Show that X(t) is a stationary process.   (Feb 2007)
Solution
Given X(t) = A, where A is a continuous random variable with uniform distribution on (0,1):
fA(a) = 1 for 0 ≤ a ≤ 1, and 0 otherwise.
(i) The mean value of X(t) is
E[X(t)] = ∫₀¹ a fA(a) da = ∫₀¹ a da = 1/2 (constant)
(ii) The autocorrelation of X(t) is
RXX(τ) = E[X(t)X(t+τ)] = E[A²] = ∫₀¹ a² da = [a³/3]₀¹ = 1/3
RXX(τ) is independent of time.
∴ The given process X(t) is stationary.
Example 6.10
Consider random processes X(t) = A cos(ω1t + θ) and Y(t) = B cos(ω2t + φ), where A, B, ω1 and ω2 are constants, while θ and φ are statistically independent random variables uniformly distributed on (0, 2π).
(i) Show that X(t) and Y(t) are jointly WSS. (ii) If θ = φ, show that X(t) and Y(t) are not jointly WSS unless ω1 = ω2.   (Feb 2008)
Solution
Given the random processes X(t) = A cos(ω1t + θ) and Y(t) = B cos(ω2t + φ).
Also θ and φ are uniformly distributed over (0, 2π):
fθ(θ) = 1/2π for 0 ≤ θ ≤ 2π, 0 otherwise
fφ(φ) = 1/2π for 0 ≤ φ ≤ 2π, 0 otherwise
(i) For the processes to be jointly WSS, first
E[X(t)] = E[A cos(ω1t + θ)] = A ∫₀^2π cos(ω1t + θ) fθ(θ) dθ = (A/2π) ∫₀^2π cos(ω1t + θ) dθ
        = (A/2π) sin(ω1t + θ)|₀^2π = (A/2π)[sin(ω1t + 2π) - sin(ω1t)] = 0
∴ E[X(t)] = 0, and the autocorrelation of X(t) is
E[X(t)X(t+τ)] = A² ∫₀^2π cos(ω1t + θ) cos(ω1(t+τ) + θ) fθ(θ) dθ
             = (A²/2π) ∫₀^2π (1/2)[cos ω1τ + cos(2ω1t + ω1τ + 2θ)] dθ
RXX(τ) = (A²/2) cos ω1τ
Similarly,
E[Y(t)] = 0
and RYY(τ) = (B²/2) cos ω2τ
Now the cross correlation function between X(t) and Y(t) is
RXY(τ) = E[X(t)Y(t+τ)] = ∫∫ X(t)Y(t+τ) fθ,φ(θ,φ) dθ dφ
Since θ and φ are statistically independent,
fθ,φ(θ,φ) = fθ(θ) fφ(φ)
RXY(τ) = AB ∫₀^2π cos(ω1t + θ) fθ(θ) dθ ∫₀^2π cos(ω2(t+τ) + φ) fφ(φ) dφ
       = (AB/4π²)(0)(0) = 0
RXY(τ) = 0. Therefore, the mean values of X(t) and Y(t) are constant, the autocorrelations of both
X(t) and Y(t) are independent of t, and the cross correlation is also independent of t. Hence the random processes are jointly WSS.
(ii) If θ = φ, the expression for the cross correlation between X(t) and Y(t) is
RXY(τ) = E[X(t)Y(t+τ)] = E[A cos(ω1t + θ) B cos(ω2(t+τ) + θ)]
       = (AB/2) E[cos((ω2 - ω1)t + ω2τ) + cos((ω2 + ω1)t + ω2τ + 2θ)]
       = (AB/4π) ∫₀^2π cos((ω2 - ω1)t + ω2τ) dθ + (AB/4π) ∫₀^2π cos((ω2 + ω1)t + ω2τ + 2θ) dθ
The second term is equal to zero.
RXY(τ) = (AB/2) cos((ω2 - ω1)t + ω2τ)
The cross correlation function is a function of both t and τ. Therefore the given X(t) and Y(t) are not jointly WSS.
But if ω2 = ω1, then
RXY(τ) = (AB/2) cos ω2τ
The cross correlation function is independent of time t. Therefore the given processes are jointly WSS when θ = φ and ω1 = ω2.
Example 6.11
A random process Y(t) = X(t) - X(t+τ) is defined in terms of a process X(t) that is at least WSS.
(a) Show that the mean value of Y(t) is zero even if X(t) has a non-zero mean value.
(b) Show that σY² = 2[RXX(0) - RXX(τ)].
(c) If Y(t) = X(t) + X(t+τ), find E[Y(t)] and σY².   (Nov 2007)
Solution
Given the random process Y(t) = X(t) - X(t+τ)
where X(t) is a WSS random process. (a) The mean value of Yet) is
E[Y(t)] = E[X(t) - X(t+τ)] = E[X(t)] - E[X(t+τ)]
For a stationary process, we know that E[X(t)] = E[X(t+τ)].
So even if E[X(t)] ≠ 0,
E[Y(t)] = 0
(b) The variance of Y(t) is σY² = E[Y²(t)] - E[Y(t)]² = E[Y²(t)] - 0 = E[[X(t) - X(t+τ)]²]
= E[X²(t)] + E[X²(t+τ)] - 2E[X(t)X(t+τ)]
σY² = RXX(0) + RXX(0) - 2RXX(τ)
σY² = 2[RXX(0) - RXX(τ)]
(c) If the random process is given by Y(t) =X(t)+X(/+'l')
then the mean value is
E[Y(t)] = E[X(t) + X(t+τ)] = E[X(t)] + E[X(t+τ)] = 2E[X(t)]
Ȳ = 2X̄
The variance of Y(t) is
σY² = E[Y²(t)] - E[Y(t)]²
    = E[(X(t) + X(t+τ))²] - (2X̄)²
    = E[X²(t)] + E[X²(t+τ)] + 2E[X(t)X(t+τ)] - 4X̄²
    = RXX(0) + RXX(0) + 2RXX(τ) - 4X̄²
σY² = 2[RXX(0) + RXX(τ)] - 4X̄²
Example 6.12
Determine if the random process X(t) = A, where A is a random variable with mean Ā and variance σA², is mean ergodic.   (Feb 2007)
Solution
Given X(t) = A, with E[A] = Ā and variance σA².
The ensemble mean is E[X(t)] = Ā.
The time average of X(t) is
A[X(t)] = lim as T → ∞ of (1/2T) ∫ from -T to T of X(t) dt = lim as T → ∞ of (1/2T) ∫ from -T to T of A dt = A
So the time average A[X(t)] = A is itself a random variable, and it equals the ensemble mean Ā only when σA² = 0. Since σA² ≠ 0, the process is not mean ergodic.
Example 6.13
Statistically independent zero-mean random processes X(t) and Y(t) have autocorrelation functions
RXX(τ) = e^(-|τ|) and RYY(τ) = cos(2πτ) respectively.
(a) Find the autocorrelation function of the sum W1(t) = X(t) + Y(t).
(b) Find the autocorrelation function of the difference W2(t) = X(t) - Y(t).
(c) Find the cross correlation function of W1(t) and W2(t).   (Nov 2007)
Solution
Given X(t) and Y(t) are two statistically independent zero-mean random processes with
RXX(τ) = e^(-|τ|), RYY(τ) = cos(2πτ)
(a) Given W1(t) = X(t) + Y(t). The autocorrelation function is
RW1W1(τ) = E[W1(t)W1(t+τ)] = E[(X(t) + Y(t))(X(t+τ) + Y(t+τ))]
         = E[X(t)X(t+τ)] + E[X(t)Y(t+τ)] + E[Y(t)X(t+τ)] + E[Y(t)Y(t+τ)]
RW1W1(τ) = RXX(τ) + RXY(τ) + RYX(τ) + RYY(τ)
Since X(t) and Y(t) are statistically independent and zero mean,
RXY(τ) = RYX(τ) = 0
RW1W1(τ) = RXX(τ) + RYY(τ) = e^(-|τ|) + cos(2πτ)
(b) RW2W2(τ) = E[(X(t) - Y(t))(X(t+τ) - Y(t+τ))]
            = RXX(τ) - RXY(τ) - RYX(τ) + RYY(τ)
            = RXX(τ) + RYY(τ) = e^(-|τ|) + cos(2πτ)
(c) The cross correlation of W1(t) and W2(t) is
RW1W2(τ) = E[W1(t)W2(t+τ)] = E[(X(t) + Y(t))(X(t+τ) - Y(t+τ))]
         = RXX(τ) - RXY(τ) + RYX(τ) - RYY(τ)
         = RXX(τ) - RYY(τ) = e^(-|τ|) - cos(2πτ)
Example 6.14
A random process is described by X(t) = A² cos²(ωct + θ), where A and ωc are constants and θ is a random variable uniformly distributed between ±π. Is X(t) wide sense stationary?
(May 2010) Solution
Given X(t) = A² cos²(ωct + θ), where θ is uniformly distributed between -π and π. The density function is
fθ(θ) = 1/2π for -π ≤ θ ≤ π, 0 otherwise
The mean value of X(t) is
E[X(t)] = ∫ from -π to π of X(t) fθ(θ) dθ
= (A²/4π)[2π + (1/2) sin(2ωct + 2θ)| from -π to π]
= (A²/4π)[2π + 0] = A²/2
E[X(t)] = A²/2   (constant)
The autocorrelation function is
RXX(τ) = E[X(t)X(t+τ)] = E[A² cos²(ωct + θ) A² cos²(ωc(t+τ) + θ)]
       = A⁴ E[cos²(ωct + θ) cos²(ωct + ωcτ + θ)]
Using cos²a cos²b = (1/4)[1 + cos 2a + cos 2b + (1/2)cos(2a - 2b) + (1/2)cos(2a + 2b)], the θ-dependent cosines integrate to zero over a full period, so
       = (A⁴/8π)[2π + 0 + 0 + π cos 2ωcτ + 0]
RXX(τ) = (A⁴/8)(2 + cos 2ωcτ)
Since the mean is constant and the autocorrelation depends only on τ, X(t) is wide sense stationary.
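The closed form above can be verified by averaging over θ numerically; A, ωc, t and τ below are arbitrary illustrative values, and the result should be independent of t.

```python
import math

# Numerical check of R_XX(tau) = (A^4/8)(2 + cos(2*wc*tau)) for
# X(t) = A^2 cos^2(wc*t + theta), theta ~ Uniform(-pi, pi), averaging over theta.
A, wc, t, tau = 1.3, 2.0, 0.7, 0.4
steps = 4000
h = 2.0 * math.pi / steps
acc = 0.0
for i in range(steps):
    th = -math.pi + (i + 0.5) * h
    acc += (A**2 * math.cos(wc*t + th)**2) * (A**2 * math.cos(wc*(t + tau) + th)**2)
r_num = acc * h / (2.0 * math.pi)      # expectation over theta
r_theory = (A**4 / 8.0) * (2.0 + math.cos(2.0 * wc * tau))
print(round(r_num, 6), round(r_theory, 6))
```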
Example 6.15
Prove that the random process X(t) = A cos(ωct + θ) is wide sense stationary if it is assumed that ωc is a constant and θ is a uniformly distributed random variable in the interval (0, 2π).
Solution
Given X(t) = A cos(ωct + θ), where θ is uniformly distributed over (0, 2π). The density function is
fθ(θ) = 1/2π, 0 < θ < 2π
The mean value is E[X(t)] = ∫₀^2π A cos(ωct + θ)(1/2π) dθ = 0
The autocorrelation is
RXX(τ) = E[X(t)X(t+τ)] = ∫₀^2π A cos(ωct + θ) A cos(ωc(t+τ) + θ)(1/2π) dθ
       = (A²/4π)[∫₀^2π cos ωcτ dθ + ∫₀^2π cos(2ωct + 2θ + ωcτ) dθ]
       = (A²/4π)(cos ωcτ)(2π) = (A²/2) cos ωcτ
Since the mean value is constant and the autocorrelation is independent of time, X(t) is wide sense stationary.
Example 6.16
Let X(t) be a stationary continuous random process that is differentiable. Denote its time
derivative by X(t).
(a) Show that E[Ẋ(t)] = 0. (b) Find RẊẊ(τ) in terms of RXX(τ).   (Feb 2007, Feb 2008)
Solution
(a) Given X(t) is a stationary random process.
The time derivative of X(t) is Ẋ(t) = lim as Δ → 0 of [X(t+Δ) - X(t)]/Δ.
The mean value is
E[Ẋ(t)] = lim as Δ → 0 of (E[X(t+Δ)] - E[X(t)])/Δ
We know that if X(t) is stationary, then E[X(t+Δ)] = E[X(t)]
∴ E[Ẋ(t)] = 0
(b) We know that RXX(τ) = E[X(t)X(t+τ)]. With the derivative of X(t),
RẊẊ(τ) = E[Ẋ(t)Ẋ(t+τ)]
Writing the expectation as an integral over the joint density of the samples of X(t) and interchanging integration and differentiation,
RẊẊ(τ) = ∂²E[X(t)X(t+τ)]/(∂t ∂(t+τ))
For a stationary process RXX depends only on τ, and carrying out the two derivatives gives
RẊẊ(τ) = -∂²RXX(τ)/∂τ²
xx ot Example 6.17
Given X̄ = 6 and RXX(t, t+τ) = 36 + 25exp(-|τ|) for a random process X(t). Indicate which of the following statements are true based on what is known, with certainty: X(t) (a) is first order stationary (b) has total average power of 61 W (c) is ergodic (d) is wide sense stationary (e) has a periodic component (f) has an AC power of 36 W   (Feb 2007, May 2009)
Solution
Given the random process X(t): mean value X̄ = 6, autocorrelation RXX(t, t+τ) = 36 + 25exp(-|τ|)
(a) First-order stationary: the mean value X̄ = 6 is constant, which is necessary but not sufficient for first-order stationarity, so this cannot be asserted with certainty from the given information.
(b) We know that the average power is Pavg = Rxx(O)
Pavg 36+ 25exp(-0) =36+25 =61 W :. The average power is 61 W.
(c) Ergodic: ergodicity requires the time averages A[X(t)] and A[RXX(t, t+τ)] to equal the ensemble averages, which cannot be verified from the mean and autocorrelation alone; so ergodicity is not known with certainty.
(d) Wide sense stationary (WSS): the conditions for the process to be WSS are
E[X] = 6 = constant and RXX(t, t+τ) = RXX(τ) = 36 + 25e^(-|τ|)
independent of the absolute time t. ∴ The process is WSS.
(e) The process has no periodic component, since
lim as τ → ∞ of RXX(t, t+τ) = lim as τ → ∞ of (36 + 25e^(-|τ|)) = 36 = E[X]², a constant.
(f) The AC power of the process is the variance of X(t):
σX² = E[X(t)²] - E[X(t)]² = RXX(0) - X̄² = 61 - 36 = 25
The AC power is 25 watts, so the statement that it is 36 W is false.
Example 6.18
A random process is defined by X(t) = X0 + Vt, where X0 and V are statistically independent random variables, uniformly distributed on intervals [X01, X02] and [V1, V2] respectively. Find (a) the mean, (b) the autocorrelation and (c) the autocovariance function. (d) Is X(t) stationary in any sense? If so, state the type.   (Feb 2008)
Solution
The given random process is
X(t) = X0 + Vt
where Xo and V are statistically independent.
fX0(x0) = 1/(X02 - X01), X01 ≤ x0 ≤ X02, 0 elsewhere
fV(v) = 1/(V2 - V1), V1 ≤ v ≤ V2, 0 elsewhere
(a) The mean value of X(t) is
E[X(t)] = E[X0 + Vt] = E[X0] + tE[V] = (X01 + X02)/2 + t(V1 + V2)/2
(b) The autocorrelation is RXX(t, t+τ) = E[X(t)X(t+τ)] = E[(X0 + Vt)(X0 + V(t+τ))], where
E[X0²] = (X01² + X02² + X01X02)/3 and E[V²] = (1/(V2 - V1))(V2³ - V1³)/3 = (V1² + V2² + V1V2)/3
Since X0 and V are independent, E[X0V] = E[X0]E[V].
∴ RXX(t, t+τ) = (X01² + X02² + X01X02)/3 + ((2t+τ)/4)(X02 + X01)(V1 + V2) + (t(t+τ)/3)(V1² + V2² + V1V2)
(c) The autocovariance function of X(t) is
CXX(t, t+τ) = RXX(t, t+τ) - E[X(t)]E[X(t+τ)]
Now E[X(t+τ)] = (1/2)[(X02 + X01) + (t+τ)(V1 + V2)]
and E[X(t)]E[X(t+τ)] = (1/4)[X02 + X01 + t(V1 + V2)][X02 + X01 + (t+τ)(V1 + V2)]
Subtracting,
CXX(t, t+τ) = (X02² + X01² - 2X02X01)/12 + t(t+τ)(V2² + V1² - 2V1V2)/12
CXX(t, t+τ) = (X02 - X01)²/12 + t(t+τ)(V2 - V1)²/12
(d) (1) The mean of X(t) is not constant.
(2) The autocorrelation function depends on time t.
(3) The autocovariance function depends on time t.
Hence the given random process is not stationary.
Example 6.19
Let X(t) be a WSS process with autocorrelation function RXX(τ) = e^(-a|τ|), where a > 0 is a constant. Assume X(t) amplitude modulates a carrier cos(ω0t + θ) as shown in Fig. 6.4, where ω0 is a constant and θ is a random variable uniformly distributed on (-π, π). If X(t) and θ are statistically independent, determine the autocorrelation function of Y(t).
Fig. 6.4 Modulation system: Y(t) = X(t)cos(ω0t + θ)
(Feb 2008) Solution
Given a WSS random process X(t) with autocorrelation RXX(τ) = e^(-a|τ|), a > 0, where θ is a uniformly distributed random variable on (-π, π):
fθ(θ) = 1/2π for -π ≤ θ ≤ π, 0 elsewhere
and X(t) and θ are statistically independent. From Fig. 6.4, Y(t) = X(t)cos(ω0t + θ). The autocorrelation of Y(t) is
RYY(t, t+τ) = E[Y(t)Y(t+τ)] = E[X(t)cos(ω0t + θ) X(t+τ)cos(ω0(t+τ) + θ)]
= E[X(t)X(t+τ)] E[cos(ω0t + θ)cos(ω0(t+τ) + θ)]
= RXX(τ)(1/2) E[cos ω0τ + cos(2ω0t + 2θ + ω0τ)]
RYY(τ) = (1/2) RXX(τ) cos ω0τ = (1/2) e^(-a|τ|) cos ω0τ
Example 6.20
Consider a random process X(t) = A cos ωt, where ω is a constant and A is a random variable uniformly distributed over (0,1). Find the autocorrelation and autocovariance of X(t).
(May 2010) Solution
Given a random process X(t) = A cos ωt, with A uniform on (0,1):
fA(a) = 1, 0 < a < 1
The autocorrelation is
RXX(t, t+τ) = E[A² cos ωt cos ω(t+τ)] = cos ωt cos ω(t+τ) E[A²] = (1/3) cos ωt cos ω(t+τ)
The mean values are
E[X(t)] = cos ωt ∫₀¹ a da = (cos ωt)/2 and E[X(t+τ)] = (cos ω(t+τ))/2
The autocovariance is
CXX(t, t+τ) = RXX(t, t+τ) - E[X(t)]E[X(t+τ)]
            = (cos ωt cos ω(t+τ))/3 - (cos ωt cos ω(t+τ))/4
CXX(t, t+τ) = (cos ωt cos ω(t+τ))/12
Example 6.21
A stationary random process X(t) with mean 2 has the autocorrelation function RXX(τ) = 4 + exp(-|τ|/10). Find the mean and variance of Y = ∫₀¹ X(t) dt.
Solution
Given E[X(t)] = 2, RXX(τ) = 4 + e^(-|τ|/10) and Y = ∫₀¹ X(t) dt.
(a) The mean value of Y is
E[Y] = E[∫₀¹ X(t) dt] = ∫₀¹ E[X(t)] dt = ∫₀¹ 2 dt = 2
(b) The variance of Y is σY² = E[Y²] - E[Y]².
Now for E[Y²] we know that, if Y = ∫₀^T X(t) dt, then
E[Y²] = ∫ from -T to T of (T - |τ|) RXX(τ) dτ
Here T = 1, so
E[Y²] = ∫ from -1 to 1 of (1 - |τ|)(4 + e^(-|τ|/10)) dτ
      = ∫ from -1 to 0 of (1 + τ)(4 + e^(τ/10)) dτ + ∫₀¹ (1 - τ)(4 + e^(-τ/10)) dτ
      = 4 + 2 ∫₀¹ (1 - τ) e^(-τ/10) dτ
      = 4 + 2[10(1 - e^(-0.1)) + 10e^(-0.1) - 100(1 - e^(-0.1))]
      = 4 + 200e^(-0.1) - 180
E[Y²] = 200e^(-0.1) - 176 = 4.967
σY² = E[Y²] - E[Y]² = 4.967 - 4
The variance of Y is 0.967.
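The integral and the closed form 200e^(-0.1) - 176 can be cross-checked numerically:

```python
import math

# Numerical check of E[Y^2] = int_{-1}^{1} (1 - |tau|)(4 + e^{-|tau|/10}) dtau
# = 200*e^{-0.1} - 176, for Y = int_0^1 X(t) dt with R_XX(tau) = 4 + e^{-|tau|/10}.
steps = 200_000
h = 2.0 / steps
acc = 0.0
for i in range(steps):
    tau = -1.0 + (i + 0.5) * h
    acc += (1.0 - abs(tau)) * (4.0 + math.exp(-abs(tau) / 10.0))
ey2 = acc * h
closed = 200.0 * math.exp(-0.1) - 176.0
variance = closed - 4.0          # subtract E[Y]^2 = 2^2
print(round(ey2, 4), round(closed, 4), round(variance, 4))
```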
Example 6.22
A stationary process has an autocorrelation function given by
R(τ) = (25τ² + 36)/(6.25τ² + 4)
Find the mean value, mean square value and variance of the process
(May 2011)
Solution
Given R(τ) = (25τ² + 36)/(6.25τ² + 4)
(i) The mean square value is
E[X²] = R(0) = 36/4 = 9
(ii) The mean value: X̄² = lim as τ → ∞ of R(τ) = 25/6.25 = 4, so X̄ = 2
(iii) Variance: σX² = E[X²] - X̄² = 9 - 4 = 5
Example 6.23
Assume that an ergodic random process X(t) has an autocorrelation function
RXX(τ) = 18 + (2/(6 + τ²))(1 + 4cos(2τ))
(i) Find |X̄|. (ii) Does this process have a periodic component? (iii) What is the average power in X(t)?   (May 2011)
Solution
Given RXX(τ) = 18 + (2/(6 + τ²))(1 + 4cos(2τ))
(i) Mean value:
X̄² = lim as τ → ∞ of RXX(τ) = 18, since the term (2/(6 + τ²))(1 + 4cos(2τ)) decays to zero
|X̄| = √18 = 4.2426
(ii) Because that oscillatory term dies out as τ → ∞ and RXX(τ) settles to the constant 18, the process has no periodic component.
(iii) The average power of X(t) is
E[X²] = RXX(0) = 18 + (2/6)(1 + 4cos(0)) = 18 + 10/6 = 19.667
Example 6.24
A stationary ergodic random process has the autocorrelation function
RXX(τ) = 25 + 4/(1 + 6τ²)
Find the mean and variance of X(t).   (May 2011)
Solution
Given RXX(τ) = 25 + 4/(1 + 6τ²)
The mean value: X̄² = lim as τ → ∞ of RXX(τ) = 25
X̄ = 5
The mean square value is E[X²] = RXX(0) = 25 + 4 = 29
∴ Variance σX² = E[X²] - X̄² = 29 - 25 = 4
Example 6.25
A Gaussian random process is known to be WSS with a mean ,of X 4' an:d autocorrelation R xx (r) =25 e-3trIl2. Find the covariance function necessary to ~pecify t1ae joint density of the random process and give the covariance matrix for Xi =xttj)' where i = 1,2, .... ,N. Solution
Given a WS~ Gaussian random process X(t), wit~ X(t) 4 and Rxx(r) =25 e-3trl12 N random variables are given by X(t), i =1, 2, .... N with tk - ti =(k i) 1:', i and k =1, 2, 3,.....,N. The autocorrelation function for N random variables is
RXX(ti,tk ) Rxx~tk-tJ';'Rxx(1:') Rxx (ti't
k ) =25 e-311, -Ii1/2 .
The covariance function is defined as .. '
Cxx (I;, tk ) Rxi(tp1k)":" E[X(ti)]E[X(tk)' But given E[X(t)] =E[X(tk)] E[x(t)] =4
Cxx (tp tk ) 25 e-311, -li V2 - 4 x 4 Cxx (Ii ,t
k ) 25e-31I,-lilh -16
.'. The covariance matrix is
25e-N1225-16 25e:3/2 -16 25e-3 16 16 25e-3/2 25e-3/2 25e-(N/l)/2 )616 25 -16 16
25e-3/2 -16 25~'-(NI2)/2 -16[Cx] = 25e-3 16 25 16
25e-N12 -16 25e-(Nll)h -16 25e-{NI2)h 16 ... 25 16
For example if N =2
9 -10.42] [-10.42 9
Example 6.26
Telephone calls are initiated through an exchange at the average rate of 75 per minute and are described by a Poisson process. Find the probability that more than 3 calls are initiated in any 5-second period.   (Feb 2007)
Solution
Given λ = 75 per minute = 75/60 per second
time period t = 5 seconds
λt = (75/60) × 5 = 6.25
Since the process is a Poisson process, the probability that the number of calls initiated is ≤ N is
P[X(t) ≤ N] = Σ from k=0 to N of (λt)^k e^(-λt)/k!,  k = 0, 1, 2, 3, ...
Thus, the probability that more than 3 calls are initiated is
P[X(t) > 3] = 1 - P[X(t) ≤ 3]
            = 1 - e^(-6.25)[1 + 6.25 + (6.25)²/2! + (6.25)³/3!]
            = 1 - 0.1302
P[X(t) > 3] = 0.869
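The same tail probability can be computed directly from the Poisson pmf:

```python
import math

# P[X(t) > 3] for a Poisson process with rate 75/min over a 5-second window.
lam_t = (75.0 / 60.0) * 5.0        # = 6.25
p_le_3 = sum(lam_t**k * math.exp(-lam_t) / math.factorial(k) for k in range(4))
p_gt_3 = 1.0 - p_le_3
print(round(p_gt_3, 4))  # -> 0.8697
```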
Questions
(1) Explain the concept of a random process. (Nov 2010, May 2004)
(2) Explain how random processes are classified, with neat sketches. (Nov 2003, Nov 2006)
(3) Briefly explain the distribution and density functions in the context of stationary and independent random processes.
(4) Distinguish between stationary and non-stationary random processes. (Nov 2010, May 2009, May 2004)
PXX = RXX(0)   Proved.
The average power PXX is the autocorrelation of X(t) at the origin.
7.3 Properties of the Power Density Spectrum
The properties of the power density spectrum SXX(ω) for a wide sense stationary random process X(t) are given as
(1) SXX(ω) ≥ 0   (7.14)
Proof
From the definition, the expected value of a non-negative function E[|XT(ω)|²] is always non-negative.
Hence SXX(ω) ≥ 0.
(2) The power spectral density at zero frequency is equal to the area under the curve of the autocorrelation RXX(τ).
That is, SXX(0) = ∫ from -∞ to ∞ of RXX(τ) dτ   (7.15)
Proof
From the definition, we know that
SXX(ω) = ∫ from -∞ to ∞ of RXX(τ) e^(-jωτ) dτ
At ω = 0,
SXX(0) = ∫ from -∞ to ∞ of RXX(τ) dτ
(3) The power density spectrum of a real process X(t) is an even function
i.e., SXX(-ω) = SXX(ω), X(t) real   (7.16)
Proof
Consider a WSS real process X(t). Then
SXX(ω) = ∫ from -∞ to ∞ of RXX(τ) e^(-jωτ) dτ
Substitute τ = -τ; then
-
Random Processes: Spectral Characteristics 437
SXX(-ω) = ∫ from -∞ to ∞ of RXX(-τ) e^(-jωτ) dτ
Since X(t) is real, from the properties of autocorrelation in (6.47) we know that
RXX(-τ) = RXX(τ)
∴ SXX(-ω) = ∫ from -∞ to ∞ of RXX(τ) e^(-jωτ) dτ = SXX(ω)
∴ SXX(ω) is an even function.   (7.17)
(4) SXX(ω) is always a real function.
Proof
From (7.3) we know that SXX(ω) = lim as T → ∞ of E[|XT(ω)|²]/2T.
Since the function |XT(ω)|² is a real function, SXX(ω) is always real.
(5) If SXX(ω) is the psd of the WSS random process X(t), then
A{E[X²(t)]} = (1/2π) ∫ from -∞ to ∞ of SXX(ω) dω   (7.18)
(or) the time average of the mean square value of a WSS random process equals the area under the curve of the power spectral density.
Proof
From (6.34) we know that
RXX(τ) = A{E[X(t+τ)X(t)]} = (1/2π) ∫ from -∞ to ∞ of SXX(ω) e^(jωτ) dω
Now at τ = 0,
RXX(0) = A{E[X²(t)]} = (1/2π) ∫ from -∞ to ∞ of SXX(ω) dω = area under the curve of SXX(ω)
(6) The power density spectrum of the derivative process Ẋ(t) = dX(t)/dt is ω² times the power density spectrum of X(t). That is,
S_ẊẊ(ω) = ω² S_XX(ω) (7.19)
Proof
From (7.3), we know that
S_XX(ω) = lim_{T→∞} E[|X_T(ω)|²] / 2T
and X_T(ω) = ∫_{−T}^{T} X(t) e^{−jωt} dt
The truncated transform of the derivative process is, integrating by parts,
Ẋ_T(ω) = ∫_{−T}^{T} Ẋ(t) e^{−jωt} dt = [X(t) e^{−jωt}]_{−T}^{T} + jω ∫_{−T}^{T} X(t) e^{−jωt} dt
Neglecting the boundary term as T → ∞,
Ẋ_T(ω) = (jω) X_T(ω)
Hence S_ẊẊ(ω) = lim_{T→∞} E[|Ẋ_T(ω)|²] / 2T = ω² lim_{T→∞} E[|X_T(ω)|²] / 2T
S_ẊẊ(ω) = ω² S_XX(ω)  Proved.
(7) The power density spectrum and the time average of the autocorrelation function form a Fourier transform pair.
S_XX(ω) = ∫_{−∞}^{∞} A[R_XX(t, t+τ)] e^{−jωτ} dτ (7.20)
and A[R_XX(t, t+τ)] = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω (7.21)
Proof
Consider a WSS random process X(t). We know that the time average of the autocorrelation function is
A[R_XX(t, t+τ)] = A{E[X(t) X(t+τ)]}
For a WSS random process,
A[R_XX(t, t+τ)] = R_XX(τ)
Now the power spectral density is
S_XX(ω) = lim_{T→∞} E[|X_T(ω)|²] / 2T
But we know that
|X_T(ω)|² = X_T(ω) X_T*(ω)
Let X_T(ω) = ∫_{−T}^{T} X(t₁) e^{−jωt₁} dt₁ and X_T*(ω) = ∫_{−T}^{T} X(t) e^{jωt} dt
S_XX(ω) = lim_{T→∞} (1/2T) E[ ∫_{−T}^{T} ∫_{−T}^{T} X(t₁) X(t) e^{−jω(t₁−t)} dt₁ dt ]
= lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} E[X(t₁) X(t)] e^{−jω(t₁−t)} dt₁ dt
We know that E[X(t₁)X(t)] = R_XX(t, t₁)
S_XX(ω) = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t, t₁) e^{−jω(t₁−t)} dt₁ dt (7.22)
By taking the inverse Fourier transform,
(1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t, t₁) (1/2π) ∫_{−∞}^{∞} e^{jω(τ−t₁+t)} dω dt₁ dt
= lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t, t₁) δ(t₁ − t − τ) dt₁ dt (7.23)
Since ∫_{−T}^{T} δ(t₁ − t − τ) dt₁ = 1, let t₁ = t + τ:
(1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω = lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XX(t, t + τ) dt = A[R_XX(t, t + τ)] (7.24)
Taking the Fourier transform,
S_XX(ω) = ∫_{−∞}^{∞} A[R_XX(t, t + τ)] e^{−jωτ} dτ
For a WSS process, A[R_XX(t, t + τ)] = R_XX(τ).
∴ R_XX(τ) and S_XX(ω) are a Fourier transform pair.
Note 1 For a WSS random process X(t), the Fourier transform pairs
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−jωτ} dτ
and R_XX(τ) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω
are called the Wiener–Khintchine relations.
Note 2 Knowledge of the power spectral density of a process X(t) gives the complete autocorrelation function when X(t) is at least wide sense stationary. For a non-stationary random process, we can obtain only the time average of the autocorrelation function.
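As a sanity check of the Wiener–Khintchine relations, the pair R_XX(τ) = e^{−|τ|} ↔ S_XX(ω) = 2/(1 + ω²) (a special case of Example 7.7 below with a = b = 1) can be verified by direct numerical integration in both directions; the grids, test points, and tolerances below are arbitrary choices for this sketch.

```python
import numpy as np

# Forward: S(w) = integral of R(tau) e^{-jw tau} dtau, with R(tau) = exp(-|tau|)
tau = np.linspace(-50, 50, 200_001)
dtau = tau[1] - tau[0]
R = np.exp(-np.abs(tau))

w0 = 1.5
S_num = np.sum(R * np.cos(w0 * tau)) * dtau   # imaginary part vanishes: R is even
print(abs(S_num - 2 / (1 + w0**2)) < 1e-5)    # True

# Inverse: (1/2pi) * integral of S(w) e^{jw tau} dw recovers R(tau)
w = np.linspace(-500, 500, 1_000_001)
S = 2.0 / (1.0 + w**2)
R1 = np.sum(S * np.cos(w * 1.0)) * (w[1] - w[0]) / (2 * np.pi)
print(abs(R1 - np.exp(-1.0)) < 1e-2)          # True
```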
Example 7.1
Determine which of the following functions are valid power density spectrums, and why:
(a) cos⁸(ω) / (2 + ω⁴)  (b) e^{−(ω−1)²}  (c) ω² / (ω⁶ + 3ω² + 3)
Solution
(a) Given S_XX(ω) = cos⁸(ω) / (2 + ω⁴)
From the properties of psd,
(i) The given function S_XX(ω) is real and non-negative.
(ii) S_XX(−ω) = cos⁸(−ω) / (2 + (−ω)⁴) = cos⁸(ω) / (2 + ω⁴) = S_XX(ω)
The function is even. Hence the given function is a valid psd.
(b) Given S_XX(ω) = e^{−(ω−1)²}. From the properties of psd,
S_XX(−ω) = e^{−(−ω−1)²} = e^{−(ω+1)²}
∴ S_XX(−ω) ≠ S_XX(ω)
∴ The given function is not a valid psd.
(c) Given S_XX(ω) = ω² / (ω⁶ + 3ω² + 3)
From the properties of psd,
(i) The given function is real and non-negative.
(ii) S_XX(−ω) = S_XX(ω) is an even function.
∴ The given function is a valid psd.
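The three property checks (real, non-negative, even) can also be scripted; the grid below is an arbitrary symmetric sampling of ω.

```python
import numpy as np

w = np.linspace(-10, 10, 4001)  # symmetric grid, so S(-w) is S reversed

candidates = {
    "a": np.cos(w) ** 8 / (2 + w**4),
    "b": np.exp(-((w - 1) ** 2)),
    "c": w**2 / (w**6 + 3 * w**2 + 3),
}

valid = {}
for name, S in candidates.items():
    nonneg = bool(np.all(S >= 0))
    even = bool(np.allclose(S, S[::-1]))  # S(-w) == S(w) on a symmetric grid
    valid[name] = nonneg and even

print(valid)  # {'a': True, 'b': False, 'c': True}
```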
7.4 Bandwidth of the Power Density Spectrum
Baseband process Assume that X(t) is a lowpass process, that is, its spectral components are clustered near the origin, ω = 0. As frequency increases, the spectrum magnitude decreases, as shown in Fig. 7.1. A normalised measure of the spread of a power spectrum is called the rms bandwidth.
The rms bandwidth W_rms is obtained from
W_rms² = ∫_{−∞}^{∞} ω² S_XX(ω) dω / ∫_{−∞}^{∞} S_XX(ω) dω (7.25)
Fig. 7.1 Baseband process
Bandpass process
In a bandpass process, the spectral components are clustered near some frequencies ω₀ and −ω₀, as shown in Fig. 7.2. The mean frequency ω₀ can be expressed as
ω₀ = ∫₀^{∞} ω S_XX(ω) dω / ∫₀^{∞} S_XX(ω) dω (7.26)
Fig. 7.2 Spectrum of a bandpass process
and the rms bandwidth is given by
W_rms² = 4 ∫₀^{∞} (ω − ω₀)² S_XX(ω) dω / ∫₀^{∞} S_XX(ω) dω (7.27)
Example 7.2
The power density spectrum of a baseband random process X(t) is
Find the rms bandwidth.
Solution
Given the power density spectrum,
Now the rms bandwidth is obtained from (7.25), evaluating the integrals with the standard result
∫ dω / (4 + ω²)² = ω / (2·2²(4 + ω²)) + (1/(2·2³)) tan⁻¹(ω/2)
W_rms = √(4π) = 2√π = 3.55 rad/s
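For PSDs whose moments are awkward to integrate by hand, (7.25) can be evaluated numerically. The sketch below uses a hypothetical psd S(ω) = 1/(1 + ω²)² (not the one from Example 7.2), for which both integrals equal π/2, so W_rms = 1 rad/s exactly.

```python
import numpy as np

def rms_bandwidth(S, wmax=5000.0, n=2_000_001):
    """Baseband rms bandwidth per (7.25), by direct numerical integration."""
    w = np.linspace(-wmax, wmax, n)
    Sw = S(w)
    dw = w[1] - w[0]
    num = np.sum(w**2 * Sw) * dw
    den = np.sum(Sw) * dw
    return np.sqrt(num / den)

# Hypothetical example psd: S(w) = 1/(1+w^2)^2, known answer W_rms = 1 rad/s
W = rms_bandwidth(lambda w: 1.0 / (1.0 + w**2) ** 2)
print(abs(W - 1.0) < 1e-2)  # True
```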
7.5 Cross Power Density Spectrum
Definition 1 Consider two real random processes X(t) and Y(t). If X(t) and Y(t) are jointly wide sense stationary random processes, then the cross power density spectrum is defined as the Fourier transform of the cross correlation function of X(t) and Y(t). It is expressed as
S_XY(ω) = ∫_{−∞}^{∞} R_XY(τ) e^{−jωτ} dτ (7.28)
and S_YX(ω) = ∫_{−∞}^{∞} R_YX(τ) e^{−jωτ} dτ (7.29)
By inverse Fourier transformation, we can obtain the cross correlation functions, i.e.,
R_XY(τ) = (1/2π) ∫_{−∞}^{∞} S_XY(ω) e^{jωτ} dω (7.30)
and R_YX(τ) = (1/2π) ∫_{−∞}^{∞} S_YX(ω) e^{jωτ} dω (7.31)
Therefore, the cross psd and cross correlation functions are a Fourier transform pair.
Definition 2
If X_T(ω) and Y_T(ω) are the Fourier transforms of X(t) and Y(t) respectively in the interval [−T, T], then the cross power density spectrum is defined as
S_XY(ω) = lim_{T→∞} E[X_T*(ω) Y_T(ω)] / 2T (7.32)
and S_YX(ω) = lim_{T→∞} E[Y_T*(ω) X_T(ω)] / 2T (7.33)
7.5.1 Average Cross Power
The average cross power P_XY of the WSS random processes X(t) and Y(t) is defined as the cross correlation function at τ = 0.
That is, P_XY = R_XY(0) = (1/2π) ∫_{−∞}^{∞} S_XY(ω) dω
The cross correlation function is
R_XY(τ) = inverse Fourier transform of S_XY(ω) = F⁻¹[1 / (a + jω)²]
We know from the Fourier transform that
1 / (a + jω)² ↔ t e^{−at} u(t)
∴ R_XY(τ) = τ e^{−aτ} u(τ)
Additional Problems
Example 7.4
Consider the random process
X(t) = A cos(ωt + θ)
where A and ω are real constants and θ is a random variable uniformly distributed over [0, 2π]. Find the average power P_XX.
Solution
We know that
P_XX = A{E[X²(t)]}
Now X(t) = A cos(ωt + θ), so
X²(t) = A² cos²(ωt + θ) = (A²/2)[1 + cos(2ωt + 2θ)]
E[X²(t)] = (A²/4π) [ ∫₀^{2π} dθ + ∫₀^{2π} cos(2ωt + 2θ) dθ ] = (A²/4π) [ 2π + (1/2) sin(2ωt + 2θ) |₀^{2π} ] = A²/2
∴ P_XX = A{A²/2} = A²/2
Now take I₁ = ∫_{−1}^{1} ω e^{jωτ} dω and I₂ = ∫_{−1}^{1} ω² e^{jωτ} dω = [ω² e^{jωτ}/(jτ)]_{−1}^{1} − (1/jτ) ∫_{−1}^{1} 2ω e^{jωτ} dω
Also, combining the terms,
R_XX(τ) = (1/2π) [ (e^{jτ} − e^{−jτ})/(jτ) − (e^{jτ} − e^{−jτ})/(jτ) − (2/τ²)(e^{jτ} + e^{−jτ}) + (2/(jτ³))(e^{jτ} − e^{−jτ}) ]
Example 7.6
The autocorrelation of a WSS random process X(t) is given by
R_XX(τ) = A cos(ω₀τ)
where A and ω₀ are constants. Find the psd.
Solution
We know that the power spectral density is
S_XX(ω) = Fourier transform of R_XX(τ)
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−jωτ} dτ
= (A/2) [2πδ(ω − ω₀) + 2πδ(ω + ω₀)]
S_XX(ω) = Aπ [δ(ω − ω₀) + δ(ω + ω₀)]
The power density spectrum is shown in Fig. 7.3.
Fig. 7.3 Power density spectrum of A cos ω₀τ
Example 7.7
Find the psd of a WSS random process X(t) whose autocorrelation function is R_XX(τ) = a e^{−b|τ|}.
Solution
We know that the power spectral density is
S_XX(ω) = Fourier transform of R_XX(τ) = ∫_{−∞}^{∞} R_XX(τ) e^{−jωτ} dτ
= a ∫_{−∞}^{0} e^{bτ} e^{−jωτ} dτ + a ∫₀^{∞} e^{−bτ} e^{−jωτ} dτ
= a [ e^{(b−jω)τ} / (b − jω) ]_{−∞}^{0} + a [ e^{−(b+jω)τ} / (−(b + jω)) ]₀^{∞}
= (a/(b − jω))[1 − 0] − (a/(b + jω))[0 − 1]
= a/(b − jω) + a/(b + jω) = a(b + jω + b − jω)/(b² + ω²)
S_XX(ω) = 2ab / (b² + ω²)
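The closed form 2ab/(b² + ω²) can be spot-checked numerically for sample values of a and b (chosen arbitrarily here):

```python
import numpy as np

a, b = 3.0, 2.0
tau = np.linspace(-40, 40, 800_001)
dtau = tau[1] - tau[0]
R = a * np.exp(-b * np.abs(tau))

for w in (0.0, 1.0, 5.0):
    S_num = np.sum(R * np.cos(w * tau)) * dtau   # R is even, so the psd is real
    assert abs(S_num - 2 * a * b / (b**2 + w**2)) < 1e-4
print("ok")
```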
Example 7.8
A random process has the autocorrelation function
R_XX(τ) = 1 − |τ| for |τ| ≤ 1, and 0 otherwise
Find the psd and draw the plots. (Nov 2010)
Solution
The autocorrelation function is given as
R_XX(τ) = 1 − |τ|, |τ| ≤ 1; 0 otherwise
The power spectral density is
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−jωτ} dτ = ∫_{−1}^{1} (1 − |τ|) e^{−jωτ} dτ
and, integrating by parts,
∫ τ e^{−jωτ} dτ = τ e^{−jωτ} / (−jω) − ∫ e^{−jωτ} / (−jω) dτ
Evaluating the integral term by term and combining,
S_XX(ω) = (2 − e^{jω} − e^{−jω}) / ω² = (2 − 2 cos ω) / ω² = 4 sin²(ω/2) / ω²
S_XX(ω) = [sin(ω/2) / (ω/2)]² = Sa²(ω/2)
Plots of the autocorrelation function R_XX(τ) and the power density spectrum are shown in Fig. 7.4.
Fig. 7.4 (a) Autocorrelation function  Fig. 7.4 (b) Power spectral density
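The triangle-to-Sa² pair can be confirmed numerically at a few arbitrary test frequencies:

```python
import numpy as np

tau = np.linspace(-1, 1, 200_001)
dtau = tau[1] - tau[0]
R = 1 - np.abs(tau)                              # triangular autocorrelation, |tau| <= 1

for w in (0.5, 2.0, 2 * np.pi):
    S_num = np.sum(R * np.cos(w * tau)) * dtau
    S_exact = (np.sin(w / 2) / (w / 2)) ** 2     # Sa^2(w/2)
    assert abs(S_num - S_exact) < 1e-6
print("ok")
```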
Example 7.9
The cross power spectrum of X(t) and Y(t) is defined as
S_XY(ω) = K + jω/W for −W < ω < W, and 0 otherwise
where K and W are real constants. Find the cross correlation function.
Solution
The cross correlation function is
R_XY(τ) = (1/2π) ∫_{−W}^{W} (K + jω/W) e^{jωτ} dω
= (K/2πτ)[2 sin Wτ] + (1/2πτ)[2 cos Wτ] − (1/2πWτ²)[2 sin Wτ]
R_XY(τ) = (K/(πτ)) sin Wτ + (cos Wτ)/(πτ) − (sin Wτ)/(πWτ²)
More Solved Examples
Example 7.10
Find the autocorrelation function and power spectral density of the random process X(t) = A cos(ω₀t + θ), where θ is a random variable over the ensemble, uniformly distributed over the range (0, 2π). (June 2003)
Solution
Given the random process X(t) = A cos(ω₀t + θ)
and f_Θ(θ) = 1/2π, 0 ≤ θ ≤ 2π
(i) The autocorrelation function is
R_XX(τ) = E[X(t)X(t + τ)] = E[A cos(ω₀t + θ) A cos(ω₀(t + τ) + θ)]
= A² E[cos(ω₀t + θ) cos(ω₀t + ω₀τ + θ)]
= (A²/2) E[cos(2ω₀t + ω₀τ + 2θ) + cos ω₀τ]
= (A²/2) cos ω₀τ ∫₀^{2π} (1/2π) dθ + (A²/2) ∫₀^{2π} (1/2π) cos(2ω₀t + ω₀τ + 2θ) dθ
∴ R_XX(τ) = (A²/2) cos ω₀τ
(ii) The power spectral density is
S_XX(ω) = Fourier transform of R_XX(τ)
= (A²/2) ∫_{−∞}^{∞} cos(ω₀τ) e^{−jωτ} dτ = (A²/4) ∫_{−∞}^{∞} (e^{jω₀τ} + e^{−jω₀τ}) e^{−jωτ} dτ
= (A²/4) [ ∫_{−∞}^{∞} e^{−j(ω−ω₀)τ} dτ + ∫_{−∞}^{∞} e^{−j(ω+ω₀)τ} dτ ]
= (A²/4) · 2π [δ(ω − ω₀) + δ(ω + ω₀)]
S_XX(ω) = (A²π/2) [δ(ω − ω₀) + δ(ω + ω₀)]
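The autocorrelation result can be checked by Monte Carlo over the ensemble of θ (the delta functions in the psd are hard to sample directly, so the check targets R_XX; the values of A, ω₀, the seed, and the sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A, w0 = 2.0, 3.0
theta = rng.uniform(0.0, 2 * np.pi, size=200_000)

def R_est(t, tau):
    """Ensemble-average estimate of E[X(t) X(t + tau)]."""
    x1 = A * np.cos(w0 * t + theta)
    x2 = A * np.cos(w0 * (t + tau) + theta)
    return float(np.mean(x1 * x2))

# R_XX should be (A^2/2) cos(w0 tau), independent of t
for t, tau in [(0.0, 0.0), (1.3, 0.5), (7.0, 2.0)]:
    assert abs(R_est(t, tau) - (A**2 / 2) * np.cos(w0 * tau)) < 0.05
print("ok")
```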
Example 7.11
The autocorrelation function of a WSS random process is R_XX(τ) = a exp(−(τ/b)²). Find the power spectral density and the normalised average power of the signal. (Nov 2002, May 2011)
Solution
Given the autocorrelation function
R_XX(τ) = a exp(−(τ/b)²)
Then the power spectral density is
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−jωτ} dτ
= a ∫_{−∞}^{∞} exp(−(τ/b)² − jωτ) dτ
S_XX(ω) = ab√π e^{−ω²b²/4}
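The Gaussian transform pair and the average-power result P_XX = a can both be checked numerically (the values of a and b are arbitrary):

```python
import numpy as np

a, b = 2.0, 0.5
tau = np.linspace(-30, 30, 600_001)
dtau = tau[1] - tau[0]
R = a * np.exp(-((tau / b) ** 2))

# psd check: S(w) should equal a*b*sqrt(pi)*exp(-w^2 b^2 / 4)
for w in (0.0, 1.0, 4.0):
    S_num = np.sum(R * np.cos(w * tau)) * dtau
    S_exact = a * b * np.sqrt(np.pi) * np.exp(-(w**2) * b**2 / 4)
    assert abs(S_num - S_exact) < 1e-8

# average power: (1/2pi) * integral of S(w) dw should equal R(0) = a
w = np.linspace(-100, 100, 400_001)
S = a * b * np.sqrt(np.pi) * np.exp(-(w**2) * b**2 / 4)
P = np.sum(S) * (w[1] - w[0]) / (2 * np.pi)
assert abs(P - a) < 1e-8
print("ok")
```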
(or) we can also find it from S_XX(ω):
Average power
P_XX = (1/2π) ∫_{−∞}^{∞} ab√π e^{−ω²b²/4} dω
Let z²/2 = ω²b²/4, or z = ωb/√2
ω = √2 z/b
dω = (√2/b) dz
P_XX = (ab√π/2π)(√2/b) ∫_{−∞}^{∞} e^{−z²/2} dz = a (1/√2π) ∫_{−∞}^{∞} e^{−z²/2} dz
P_XX = a(1) = a watts
Example 7.12
Two independent stationary random processes X(t) and Y(t) have power spectral densities
S_XX(ω) = 16/(ω² + 16) and S_YY(ω) = ω²/(ω² + 16)
respectively, with zero means. Let another random process U(t) = X(t) + Y(t). Then find (i) the psd of U(t), (ii) S_XY(ω), and (iii) S_XU(ω). (June 1999)
Solution
Given that X(t) and Y(t) are independent random processes with
X̄ = Ȳ = 0
(i) The power density of U(t) is
S_UU(ω) = S_XX(ω) + S_YY(ω) = 16/(ω² + 16) + ω²/(ω² + 16) = 1
(ii) Since X(t) and Y(t) are independent and uncorrelated with zero means,
S_XY(ω) = 2π X̄ Ȳ δ(ω) = 0
(iii) S_XU(ω) = Fourier transform of R_XU(τ)
Now R_XU(τ) = E[X(t) U(t + τ)]
= E[X(t)(X(t + τ) + Y(t + τ))]
= E[X(t) X(t + τ)] + E[X(t) Y(t + τ)]
= R_XX(τ) + R_XY(τ)
S_XU(ω) = Fourier transform of [R_XX(τ) + R_XY(τ)] = S_XX(ω) + S_XY(ω)
∴ S_XU(ω) = 16/(ω² + 16)
Example 7.13
Find the cross correlation function for the psd S_XY(ω) = 1/(25 + ω²).
Solution
Given the psd
S_XY(ω) = 1/(25 + ω²)
The cross correlation function is
R_XY(τ) = F⁻¹[S_XY(ω)] = F⁻¹[1/(25 + ω²)]
We know from the Fourier transform that
2a/(a² + ω²) ↔ e^{−a|t|}, a > 0, a constant
1/(25 + ω²) = (1/(2 × 5)) · (2 × 5)/(5² + ω²)
∴ R_XY(τ) = (1/10) e^{−5|τ|}
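The forward direction of the pair used above — F[(1/10) e^{−5|τ|}] = 1/(25 + ω²) — is easy to confirm numerically at a few arbitrary frequencies:

```python
import numpy as np

tau = np.linspace(-20, 20, 400_001)
dtau = tau[1] - tau[0]
R = np.exp(-5 * np.abs(tau)) / 10

for w in (0.0, 2.0, 10.0):
    S_num = np.sum(R * np.cos(w * tau)) * dtau
    assert abs(S_num - 1.0 / (25 + w**2)) < 1e-6
print("ok")
```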
Example 7.14
Find the cross power spectral density if (a) R_XY(τ) = (A²/2) sin(ω₀τ) and (b) R_XY(τ) = (A²/2) cos(ω₀τ).
Solution
(a) Given R_XY(τ) = (A²/2) sin(ω₀τ)
The cross power spectral density is
S_XY(ω) = F[(A²/2) sin ω₀τ] = (jA²π/2)[δ(ω + ω₀) − δ(ω − ω₀)]
(b) Given R_XY(τ) = (A²/2) cos(ω₀τ)
S_XY(ω) = F[(A²/2) cos ω₀τ] = (A²π/2)[δ(ω − ω₀) + δ(ω + ω₀)]
Both power density spectrums are shown in Fig. 7.5.
Fig. 7.5 (a) psd for (A²/2) sin ω₀τ  Fig. 7.5 (b) psd for (A²/2) cos ω₀τ
Example 7.15
X(t) is a stationary random process with spectral density S_XX(ω). Y(t) is another independent random process, Y(t) = A cos(ω_c t + θ), where θ is a random variable uniformly distributed over (−π, π). Find the spectral density function of Z(t) = X(t)Y(t).
Solution
Given two independent random processes X(t) and Y(t) = A cos(ω_c t + θ). The power spectral density of X(t) is S_XX(ω), and θ is uniformly distributed over (−π, π):
f_Θ(θ) = 1/2π, −π ≤ θ ≤ π; 0 otherwise
The autocorrelation of Y(t) is
R_YY(τ) = E[Y(t)Y(t + τ)] = ∫_{−π}^{π} y(t) y(t + τ) f_Θ(θ) dθ
R_YY(τ) = (A²/2) cos ω_c τ
Now Z(t) = X(t)Y(t), so
R_ZZ(τ) = E[X(t)Y(t) X(t + τ)Y(t + τ)]
Since X(t) and Y(t) are independent,
R_ZZ(τ) = E[X(t)X(t + τ)] E[Y(t)Y(t + τ)] = R_XX(τ) (A²/2) cos ω_c τ
The power density spectrum is
S_ZZ(ω) = F[R_ZZ(τ)] = (A²/2) F[R_XX(τ) cos ω_c τ]
S_ZZ(ω) = (A²/4)[S_XX(ω + ω_c) + S_XX(ω − ω_c)]
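The modulation result can be verified numerically with a concrete stand-in for S_XX; here R_XX(τ) = e^{−|τ|}, so S_XX(ω) = 2/(1 + ω²). This choice of X(t), and the values of A and ω_c, are arbitrary assumptions for the sketch.

```python
import numpy as np

A, wc = 2.0, 10.0
tau = np.linspace(-40, 40, 800_001)
dtau = tau[1] - tau[0]

Rxx = np.exp(-np.abs(tau))                 # stand-in autocorrelation of X(t)
Rzz = (A**2 / 2) * Rxx * np.cos(wc * tau)  # R_ZZ(tau) = R_XX(tau) (A^2/2) cos(wc tau)

Sxx = lambda w: 2.0 / (1.0 + w**2)
for w in (0.0, 8.0, 10.0, 12.0):
    S_num = np.sum(Rzz * np.cos(w * tau)) * dtau
    S_exact = (A**2 / 4) * (Sxx(w + wc) + Sxx(w - wc))
    assert abs(S_num - S_exact) < 1e-5
print("ok")
```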
Example 7.16
If the autocorrelation of a WSS process is R_XX(τ) = K e^{−K|τ|}, show that