
Page 1

TELE3113 Analogue & Digital Communications

Review of Probability Theory

Page 2

Probability and Random Variables

Concept of Probability:

When the outcome of an event is not always the same, probability is the measure of the chance of obtaining a particular possible outcome

$$P(A) = \frac{\text{number of possible favourable outcomes}}{\text{total number of equally likely possible outcomes}}$$

$$P(A) = \lim_{N \to \infty} \frac{N_A}{N}$$

where N is the total number of event occurrences and N_A is the number of occurrences of outcome A.

e.g. dice tossing: P(2) = 1/6; P(2 or 4 or 6) = 1/2
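As a quick check of the relative-frequency definition above, the sketch below (an added illustration, not part of the slides; the seed and toss count are arbitrary) simulates fair-die tosses in Python and compares N_A/N with the theoretical values P(2) = 1/6 and P(2 or 4 or 6) = 1/2.

```python
import numpy as np

# Estimate probabilities by relative frequency N_A / N, as in the
# limiting definition P(A) = lim N_A / N on this slide.
rng = np.random.default_rng(seed=0)
N = 1_000_000
tosses = rng.integers(1, 7, size=N)            # fair die: values 1..6

p_two = np.mean(tosses == 2)                   # estimate of P(2), theory 1/6
p_even = np.mean(np.isin(tosses, [2, 4, 6]))   # estimate of P(2 or 4 or 6), theory 1/2

print(f"P(2)           ~ {p_two:.4f} (theory {1/6:.4f})")
print(f"P(2 or 4 or 6) ~ {p_even:.4f} (theory 0.5000)")
```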

Page 3

Common Properties of Probability

• 0 ≤ P(A) ≤ 1

• If there are N possible outcomes A1, A2, …, AN, then

$$\sum_{i=1}^{N} P(A_i) = 1$$

• Conditional probability: the probability of the outcome of an event is conditional on the outcome of another event:

$$P(B|A) = \frac{P(A \text{ and } B)}{P(A)}; \qquad P(A|B) = \frac{P(A \text{ and } B)}{P(B)}$$

Bayes' theorem: $P(A|B)\,P(B) = P(B|A)\,P(A)$

Page 4

Common Properties of Probability

• Mutual exclusiveness:

P(A or B) = P(A) + P(B); P(A and B) = 0

Thus, A and B are mutually exclusive.

• Statistical independence:

P(B|A) = P(B); P(A|B) = P(A) ⇒ P(A and B) = P(A) · P(B)

(since P(B|A) = P(A and B)/P(A) and P(A|B) = P(A and B)/P(B))

Thus, A and B are statistically independent.
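To make the two notions concrete, here is a small added sketch (not from the slides) using exact fractions on a fair die: A = "even" and B = "face 1 or 2" turn out to be statistically independent, while A = "even" and C = "odd" are mutually exclusive.

```python
from fractions import Fraction

# Exact check of independence vs. mutual exclusiveness on a fair die.
omega = range(1, 7)
P = lambda event: Fraction(sum(1 for x in omega if event(x)), 6)

A = lambda x: x % 2 == 0      # even
B = lambda x: x <= 2          # face 1 or 2
C = lambda x: x % 2 == 1      # odd

print(P(lambda x: A(x) and B(x)) == P(A) * P(B))    # True: A, B independent
print(P(lambda x: A(x) and C(x)) == 0)              # True: A, C mutually exclusive
print(P(lambda x: A(x) or C(x)) == P(A) + P(C))     # True: P(A or C) = P(A) + P(C)
```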

Page 5

Communication Example

In a communication channel, the signal may be corrupted by noise.

[Binary channel diagram: messages m0 and m1 are received as r0 or r1 with transition probabilities P(r0|m0), P(r1|m0), P(r0|m1), P(r1|m1).]

If r0 is received, m0 should be chosen if P(m0|r0)P(r0) > P(m1|r0)P(r0).

By Bayes' theorem, P(r0|m0)P(m0) > P(r0|m1)P(m1).

Similarly, if r1 is received, m1 should be chosen if P(m1|r1)P(r1) > P(m0|r1)P(r1) ⇒ P(r1|m1)P(m1) > P(r1|m0)P(m0).

Probability of correct reception: P(c) = P(r0|m0)P(m0) + P(r1|m1)P(m1)

Probability of error: P(ε) = 1 − P(c)
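The sketch below works the binary-channel example numerically; the priors and transition probabilities are illustrative assumptions (they are not given on the slide), and the decision rule is the comparison of P(r|m0)P(m0) against P(r|m1)P(m1) stated above.

```python
# Illustrative binary channel; the numbers below are assumptions only.
P_m = {"m0": 0.6, "m1": 0.4}                      # prior probabilities P(m)
P_r_given_m = {                                   # transition probabilities P(r|m)
    ("r0", "m0"): 0.9, ("r1", "m0"): 0.1,
    ("r0", "m1"): 0.2, ("r1", "m1"): 0.8,
}

def decide(r):
    """Choose the message m that maximises P(r|m) * P(m), as on the slide."""
    return max(P_m, key=lambda m: P_r_given_m[(r, m)] * P_m[m])

# With these numbers decide("r0") = "m0" and decide("r1") = "m1", so
# P(c) = P(r0|m0)P(m0) + P(r1|m1)P(m1) and P(error) = 1 - P(c).
P_c = sum(P_r_given_m[(r, decide(r))] * P_m[decide(r)] for r in ("r0", "r1"))
print(f"P(c) = {P_c:.2f}, P(error) = {1 - P_c:.2f}")   # 0.86 and 0.14
```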

Page 6

Random Variables

A random variable X(·) is the rule or functional relationship which assigns a real number X(λ_i) to each possible outcome λ_i in an experiment.

For example, in coin tossing, we can assign X(head) = 1, X(tail) = -1

If X(λ) assumes a finite number of distinct values ⇒ discrete random variable

If X(λ) can assume any value within an interval ⇒ continuous random variable

Page 7

Cumulative Distribution Function

The cumulative distribution function, F_X(x), associated with a random variable X is:

F_X(x) = P(X ≤ x)

Properties:

• 0 ≤ F_X(x) ≤ 1

• F_X(−∞) = 0; F_X(∞) = 1

• F_X(x1) ≤ F_X(x2) if x1 ≤ x2 (non-decreasing)

• P(x1 < X ≤ x2) = F_X(x2) − F_X(x1)

[Sketches: two examples of F_X(x) increasing from 0 to 1.]

Page 8

Probability Density Function

The probability density function, f_X(x), associated with a random variable X is:

$$f_X(x) = \frac{dF_X(x)}{dx}$$

Properties:

• $P(X \le x) = F_X(x) = \int_{-\infty}^{x} f_X(\beta)\, d\beta$

• $\int_{-\infty}^{\infty} f_X(x)\, dx = 1$

• $P(x_1 < X \le x_2) = F_X(x_2) - F_X(x_1) = \int_{x_1}^{x_2} f_X(x)\, dx$

• $f_X(x) \ge 0$ for all x

[Sketches: F_X(x) rising from 0 to 1 and the corresponding density f_X(x).]
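As a numerical sanity check of these properties (an added sketch, not from the slides), the code below uses a Gaussian density from scipy.stats as the example f_X and verifies that it integrates to one and that P(x1 < X ≤ x2) matches F_X(x2) − F_X(x1); the mean, standard deviation, grid and interval are arbitrary choices.

```python
import numpy as np
from scipy.stats import norm

# Example density: Gaussian with mean 1 and standard deviation 2.
mu, sigma = 1.0, 2.0
x = np.linspace(mu - 8 * sigma, mu + 8 * sigma, 20001)
dx = x[1] - x[0]
f = norm.pdf(x, loc=mu, scale=sigma)            # f_X(x)

total = np.sum(f) * dx                          # area under the pdf, ~ 1
x1, x2 = 0.0, 3.0
mask = (x > x1) & (x <= x2)
interval = np.sum(f[mask]) * dx                 # integral of f_X from x1 to x2
by_cdf = norm.cdf(x2, mu, sigma) - norm.cdf(x1, mu, sigma)

print(f"integral of f_X over all x ~ {total:.6f}")
print(f"P(x1 < X <= x2): {interval:.6f} vs F_X(x2) - F_X(x1) = {by_cdf:.6f}")
```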

Page 9

Statistical Averages of Random Variables

The statistical average or expected value of a random variable X is defined as

$$E[X] = \sum_i x_i\, P(x_i) = m_X \qquad \text{or} \qquad E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\, dx = m_X$$

E[X] is called the first moment of X, and m_X is the average or mean value of X.

Similarly, the second moment E[X²] is

$$E[X^2] = \sum_i x_i^2\, P(x_i) \qquad \text{or} \qquad E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\, dx$$

Its square root is called the root-mean-square (rms) value of X.

The variance of the random variable X is defined as

$$\sigma_X^2 = E[(X - m_X)^2] = \int_{-\infty}^{\infty} (x - m_X)^2 f_X(x)\, dx = E[X^2] - m_X^2$$

The square root of the variance is called the standard deviation, σX, of the random variable X.
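For a concrete discrete case (an added example, not from the slides), the sketch below evaluates the mean, second moment, rms value and variance of a fair die directly from the definitions above, using exact fractions.

```python
from fractions import Fraction

# First moment, second moment, rms value and variance of a fair die.
values = range(1, 7)
P = Fraction(1, 6)                        # each face equally likely

m_X = sum(x * P for x in values)          # E[X]   = 7/2
EX2 = sum(x * x * P for x in values)      # E[X^2] = 91/6
var = EX2 - m_X ** 2                      # sigma_X^2 = E[X^2] - m_X^2 = 35/12

print("mean     =", m_X)                  # 7/2
print("E[X^2]   =", EX2)                  # 91/6
print("rms      =", float(EX2) ** 0.5)    # ~ 3.894
print("variance =", var)                  # 35/12
```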

Page 10

Statistical Averages of Random Variables

The expected value of a linear combination of N random variables equals the same linear combination of the expected values of the individual random variables:

$$E\!\left[\sum_{i=1}^{N} a_i X_i\right] = \sum_{i=1}^{N} a_i\, E[X_i]$$

For N statistically independent random variables X1, X2, …, XN:

$$\mathrm{Var}\!\left[\sum_{i=1}^{N} a_i X_i\right] = \sum_{i=1}^{N} a_i^2\, \mathrm{Var}[X_i]$$

Covariance of a pair of random variables X, Y:

$$\mu_{XY} = E[(X - m_X)(Y - m_Y)] = E[XY] - m_X m_Y$$

If X and Y are statistically independent, µ_XY = 0.
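A quick Monte Carlo check of these three identities (an added sketch; the particular distributions and weights are arbitrary choices): X1 uniform on (0, 1) and X2 standard Gaussian are generated independently, so the variance of a linear combination should be the weighted sum of variances and the covariance should be close to zero.

```python
import numpy as np

# Monte Carlo check of linearity of E[.], the variance rule for
# independent variables, and mu_XY ~ 0 for independent X, Y.
rng = np.random.default_rng(1)
n = 1_000_000
X1 = rng.uniform(0.0, 1.0, n)
X2 = rng.normal(0.0, 1.0, n)
a1, a2 = 2.0, -3.0
Y = a1 * X1 + a2 * X2

print(np.mean(Y), a1 * np.mean(X1) + a2 * np.mean(X2))      # E[Y] vs a1 E[X1] + a2 E[X2]
print(np.var(Y), a1**2 * np.var(X1) + a2**2 * np.var(X2))   # Var[Y] vs a1^2 Var[X1] + a2^2 Var[X2]
print(np.mean(X1 * X2) - np.mean(X1) * np.mean(X2))         # covariance, theory 0
```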

Page 11

Uniform Distribution

A random variable that is equally likely to take on any value within a given range is said to be uniformly distributed.
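The slide gives only the verbal definition; for reference (standard results, not shown on the slide), a random variable uniformly distributed on an interval [a, b] has

$$f_X(x) = \begin{cases} \dfrac{1}{b-a}, & a \le x \le b \\[4pt] 0, & \text{otherwise,} \end{cases} \qquad m_X = \frac{a+b}{2}, \qquad \sigma_X^2 = \frac{(b-a)^2}{12}.$$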

Page 12

Binomial Distribution

Consider an experiment having only two possible outcomes, A and B, which are mutually exclusive.

Let the probabilities be P(A) = p and P(B) = 1 − p = q.

The experiment is repeated n times, and the probability of A occurring i times is

$$P(A \text{ occurs } i \text{ times}) = \binom{n}{i} p^{i} q^{\,n-i}, \qquad \text{where } \binom{n}{i} = \frac{n!}{i!\,(n-i)!} \text{ (binomial coefficient)}.$$

The mean value of the binomial distribution is np and the variance is (npq).
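As a short numerical illustration (added; the values of n and p are arbitrary), the sketch below builds the binomial probabilities from the formula above and confirms that they sum to one with mean np and variance npq.

```python
from math import comb

# Binomial probabilities P(A occurs i times) = C(n, i) p^i q^(n-i).
n, p = 10, 0.3
q = 1 - p
pmf = [comb(n, i) * p**i * q**(n - i) for i in range(n + 1)]

mean = sum(i * pmf[i] for i in range(n + 1))
var = sum(i * i * pmf[i] for i in range(n + 1)) - mean**2

print(f"sum of pmf = {sum(pmf):.6f}")                 # ~ 1
print(f"mean = {mean:.4f}  (np  = {n * p:.4f})")      # 3.0
print(f"var  = {var:.4f}  (npq = {n * p * q:.4f})")   # 2.1
```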

Page 13

Gaussian Distribution

Central-limit theorem: the sum of N independent, identically distributed random variables approaches a Gaussian distribution when N is very large.

The Gaussian pdf is continuous and is defined by

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}$$

where µ is the mean and σ² is the variance.

Cumulative distribution function:

$$F_X(x) = P(X \le x) = \int_{-\infty}^{x} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(y-\mu)^2/(2\sigma^2)}\, dy$$
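A small Monte Carlo illustration of the central-limit theorem (an added sketch; the choice of Uniform(0,1) summands and N = 30 is arbitrary): the sum of N i.i.d. uniforms has mean N/2 and variance N/12, and its tail behaviour is close to the Gaussian prediction.

```python
import numpy as np

# Sum of N i.i.d. Uniform(0,1) variables, compared with a Gaussian of
# mean N/2 and variance N/12 (central-limit theorem).
rng = np.random.default_rng(2)
N, trials = 30, 200_000
S = rng.uniform(0.0, 1.0, size=(trials, N)).sum(axis=1)

mu, var = N / 2, N / 12
print(f"sample mean {S.mean():.3f} vs {mu}")
print(f"sample var  {S.var():.3f} vs {var:.3f}")

# Fraction of sums more than one standard deviation above the mean;
# for a Gaussian this is Q(1) ~ 0.1587.
print(f"P(S > mu + sigma) ~ {np.mean(S > mu + np.sqrt(var)):.4f} (Gaussian: 0.1587)")
```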

Page 14

Gaussian Distribution

Zero-mean, unit-variance Gaussian random variable:

$$g(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$$

⇒ Probability distribution function:

$$\Omega(x) = \int_{-\infty}^{x} g(y)\, dy = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy$$

Define the Q-function (monotonically decreasing):

$$Q(x) = 1 - \Omega(x) = \int_{x}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy$$

In general, for a random variable X with pdf

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}$$

$$\Rightarrow\quad P(X \le x) = \Omega\!\left(\frac{x-\mu}{\sigma}\right); \qquad P(X > x) = Q\!\left(\frac{x-\mu}{\sigma}\right)$$

Define the error function (erf) and complementary error function (erfc):

$$\mathrm{erf}(x) = \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{-y^2}\, dy; \qquad \mathrm{erfc}(x) = 1 - \mathrm{erf}(x) = \frac{2}{\sqrt{\pi}} \int_{x}^{\infty} e^{-y^2}\, dy$$

Thus,

$$\Omega(x) = \frac{1}{2} + \frac{1}{2}\,\mathrm{erf}\!\left(\frac{x}{\sqrt{2}}\right); \qquad Q(x) = \frac{1}{2}\,\mathrm{erfc}\!\left(\frac{x}{\sqrt{2}}\right)$$
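The relation Q(x) = ½ erfc(x/√2) maps directly onto scipy.special.erfc; the added sketch below (scipy assumed available) also cross-checks against 1 − Ω(x) computed from the standard Gaussian CDF.

```python
import numpy as np
from scipy.special import erfc
from scipy.stats import norm

def Q(x):
    """Gaussian Q-function via Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / np.sqrt(2.0))

for x in (0.0, 1.0, 2.0, 3.0):
    # 1 - norm.cdf(x) is 1 - Omega(x) for the zero-mean, unit-variance Gaussian.
    print(f"Q({x}) = {Q(x):.6e}   1 - Omega({x}) = {1.0 - norm.cdf(x):.6e}")
```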

Page 15

Q-function

$$Q(x) = \int_{x}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy \;\approx\; \frac{e^{-x^2/2}}{x\sqrt{2\pi}} \qquad \text{for } x \gg 1$$
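The added sketch below compares the exact Q-function with this large-x approximation; the approximation is loose near x = 1 but tightens quickly as x grows.

```python
import numpy as np
from scipy.special import erfc

Q = lambda x: 0.5 * erfc(x / np.sqrt(2.0))                            # exact
approx = lambda x: np.exp(-x**2 / 2.0) / (x * np.sqrt(2.0 * np.pi))   # valid for x >> 1

for x in (1.0, 2.0, 4.0, 6.0):
    print(f"x = {x}: Q(x) = {Q(x):.3e},  approximation = {approx(x):.3e}")
```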

Page 16

Random Processes

A random process is a set of indexed random variables (sample functions) defined in the same probability space.

In communications, the index is usually in terms of time.

x_i(t) is called a sample function of the sample space.

The set of all possible sample functions x_i(t) is called the ensemble and defines the random process X(t).

For a specific i, x_i(t) is a time function. For a specific t_i, X(t_i) denotes a random variable.

Page 17

Random Processes: Properties

Consider a random process X(t), and let X(t_k) denote the random variable obtained by observing the process X(t) at time t_k.

Mean: m_X(t_k) = E[X(t_k)]

Variance: σ_X²(t_k) = E[X²(t_k)] − [m_X(t_k)]²

Autocorrelation: R_X(t_k, t_j) = E[X(t_k) X(t_j)] for any t_k and t_j

Autocovariance: C_X(t_k, t_j) = E{[X(t_k) − m_X(t_k)][X(t_j) − m_X(t_j)]}
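To make these ensemble averages concrete, the added sketch below uses a random-phase sinusoid as one convenient example process (an assumption, not taken from the slides) and estimates the mean, variance and autocorrelation at two time instants from a large ensemble of sample functions; for this process the theoretical autocorrelation is R_X(t_k, t_j) = ½ cos(2πf0(t_k − t_j)).

```python
import numpy as np

# Example process: X(t) = cos(2*pi*f0*t + Theta), Theta uniform on [0, 2*pi).
rng = np.random.default_rng(3)
f0, realisations = 1.0, 200_000
theta = rng.uniform(0.0, 2.0 * np.pi, realisations)

def X(t):
    """One value of X(t) per sample function in the ensemble."""
    return np.cos(2.0 * np.pi * f0 * t + theta)

tk, tj = 0.2, 0.4
R_theory = 0.5 * np.cos(2.0 * np.pi * f0 * (tk - tj))
print(f"mean at t_k     ~ {X(tk).mean():.4f} (theory 0)")
print(f"variance at t_k ~ {X(tk).var():.4f} (theory 0.5)")
print(f"R_X(t_k, t_j)   ~ {np.mean(X(tk) * X(tj)):.4f} (theory {R_theory:.4f})")
```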