
Post on 19-Jul-2020


Lecture 05: Independence
Lisa Yan
July 6, 2018

Announcements

◦ PS1 due today
  ◦ Pain poll
  ◦ Last day to hand in + solutions released next Friday 7/13
◦ PS2 out today! Due next Friday 7/13

Goals for today

◦ Properties of probability
◦ Independence
◦ Mutual exclusivity, independence, and DeMorgan's
◦ Conditional independence

Summary from last time

xkcd by Randall Munroe

Summary (conditional probability)

Definition of Conditional Probability:
P(F|E) = P(EF) / P(E)

Chain Rule:
P(EF) = P(E) P(F|E) = P(F) P(E|F)

Law of Total Probability:
P(E) = P(F) P(E|F) + P(Fc) P(E|Fc) = Σi P(Fi) P(E|Fi)

Bayes' Theorem:
P(F|E) = P(E|F) P(F) / P(E)
       = P(E|F) P(F) / [P(E|F) P(F) + P(E|Fc) P(Fc)]
       = P(E|F) P(F) / Σi P(Fi) P(E|Fi)
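These identities can be checked mechanically. Here is a short Python sketch (an illustration added to this transcript, not from the original slides; `prob` and `cond_prob` are our own helper names) that verifies them exactly on a two-dice sample space, using exact fractions:

```python
from fractions import Fraction

# Sample space: two fair six-sided dice, 36 equally likely outcomes.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(pred):
    """P(event) under the uniform distribution on the 36 outcomes."""
    return Fraction(sum(1 for o in outcomes if pred(o)), len(outcomes))

def cond_prob(pred, given):
    """P(pred | given), computed by restricting the sample space."""
    sub = [o for o in outcomes if given(o)]
    return Fraction(sum(1 for o in sub if pred(o)), len(sub))

E = lambda o: o[0] == 1             # event: D1 = 1
F = lambda o: o[0] + o[1] == 5      # event: D1 + D2 = 5
notF = lambda o: not F(o)

# Chain Rule: P(EF) = P(F) P(E|F)
assert prob(lambda o: E(o) and F(o)) == prob(F) * cond_prob(E, F)

# Law of Total Probability: P(E) = P(F) P(E|F) + P(Fc) P(E|Fc)
assert prob(E) == prob(F) * cond_prob(E, F) + prob(notF) * cond_prob(E, notF)

# Bayes' Theorem: P(F|E) = P(E|F) P(F) / P(E)
assert cond_prob(F, E) == cond_prob(E, F) * prob(F) / prob(E)
```

Because `cond_prob` computes conditional probabilities by restricting the sample space, the assertions genuinely cross-check two independent computations rather than restating definitions.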

Bayesian Interpretations

Define:
E: the effect (observation, symptom, test result)
H: the hypothesis

Prior: Probability of H before you observe E
Likelihood: Probability of E given that H holds
Posterior: Probability of H after you observe E

P(H | E) = P(E | H) P(H) / P(E)
(posterior = likelihood × prior / normalizing constant, where P(E) is the normalizing constant)

DNA Paternity testing

Child is born with (A, a) gene pair (event BA,a)

◦ Mother has (A, A) gene pair
◦ Two possible fathers: M1: (a, a), M2: (a, A)
◦ P(M1) = p, P(M2) = 1 – p

What is P(M1 | BA,a)?

Solution:
P(M1 | BA,a) = P(M1 BA,a) / P(BA,a)
= P(BA,a | M1) P(M1) / [P(BA,a | M1) P(M1) + P(BA,a | M2) P(M2)]
= (1)(p) / [(1)(p) + (1/2)(1 – p)]
= 2p / (1 + p) > p

M1 is more likely to be the father than he was before, since P(M1 | BA,a) > P(M1).
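The arithmetic can be reproduced exactly. A small Python sketch (our own illustration; `posterior_M1` is a made-up name) using exact fractions:

```python
from fractions import Fraction

def posterior_M1(p):
    """Bayes' Theorem with P(B | M1) = 1 and P(B | M2) = 1/2,
    where B is the event that the child has the (A, a) gene pair."""
    prior_m1, prior_m2 = p, 1 - p
    num = Fraction(1, 1) * prior_m1              # likelihood 1 under M1
    den = num + Fraction(1, 2) * prior_m2        # total probability of B
    return num / den

# The posterior simplifies to 2p / (1 + p), which exceeds the prior p.
for p in (Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)):
    assert posterior_M1(p) == 2 * p / (1 + p)
    assert posterior_M1(p) > p
```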

Properties of probability

For an event A:
P(A) = 1 – P(Ac)   (Complement)

For any events A and B:
P(AB) = P(BA)   (Commutativity)
P(A Bc) = P(A) – P(AB)   (Intersection)
P(A) = P(AB) + P(A Bc) = P(A|B) P(B) + P(A|Bc) P(Bc)   (Law of Total Probability)
P(AB) = P(B) P(A|B)   (Chain Rule)
P(AB) ≥ P(A) + P(B) – 1   (Bonferroni)
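Each of these identities can be verified exhaustively on a small finite sample space. A Python sketch (ours, not from the slides) that checks every pair of events over a 6-element uniform space:

```python
from fractions import Fraction
from itertools import combinations

S = set(range(6))  # a small sample space, all outcomes equally likely

def P(event):
    return Fraction(len(event), len(S))

# All 2^6 = 64 events over S.
events = [set(c) for r in range(len(S) + 1) for c in combinations(S, r)]

for A in events:
    assert P(A) == 1 - P(S - A)                  # Complement
    for B in events:
        assert P(A & B) == P(B & A)              # Commutativity
        assert P(A - B) == P(A) - P(A & B)       # Intersection (A Bc)
        assert P(A) == P(A & B) + P(A - B)       # Law of Total Probability
        assert P(A & B) >= P(A) + P(B) - 1       # Bonferroni
```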

P(•|G) is also a probability space!
Formally, P(•|G) satisfies the 3 axioms of probability. Conditioning on G means that G is always given. For example:

Axiom 1 of probability: 0 ≤ P(E|G) ≤ 1
Corollary 1 (Complement): P(Ec|G) = 1 – P(E|G)
Chain Rule: P(EF|G) = P(F|G) P(E|FG)
Bayes' Theorem: P(F|EG) = P(E|FG) P(F|G) / P(E|G)
Law of Total Probability: P(E|G) = P(F|G) P(E|FG) + P(Fc|G) P(E|FcG)
More generally: P(E|G) = Σi P(Fi|G) P(E|FiG)

And now…


Two Dice
Roll two 6-sided dice, yielding values D1 and D2.

Let E be the event: D1 = 1
Let F be the event: D2 = 1

1. What is P(E), P(F), and P(EF)?
   P(E) = 1/6, P(F) = 1/6, P(EF) = 1/36
   P(EF) = P(E) P(F) → E and F independent

2. Let G be the event: D1 + D2 = 5. What is P(E), P(G), and P(EG)?
   G = {(1,4), (2,3), (3,2), (4,1)}
   P(E) = 1/6, P(G) = 4/36 = 1/9, P(EG) = 1/36
   P(EG) ≠ P(E) P(G) → E and G dependent

Independence
Two events E and F are defined as independent if:
P(EF) = P(E) P(F)
Or, equivalently: P(E|F) = P(E)
(otherwise E and F are called dependent events).

Three events E, F, and G are independent if:
P(EFG) = P(E) P(F) P(G), and
P(EF) = P(E) P(F), and
P(EG) = P(E) P(G), and
P(FG) = P(F) P(G)

Pairwise independence of 3 events is not enough to prove independence!
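A concrete witness for that warning (our own illustration, not from the slides): with two fair coin flips, let E = first flip heads, F = second flip heads, G = the two flips match. Every pair is independent, yet the triple is not:

```python
from fractions import Fraction

outcomes = [(a, b) for a in (0, 1) for b in (0, 1)]  # two fair coin flips

def prob(pred):
    return Fraction(sum(1 for o in outcomes if pred(o)), len(outcomes))

E = lambda o: o[0] == 1        # first flip is heads
F = lambda o: o[1] == 1        # second flip is heads
G = lambda o: o[0] == o[1]     # the two flips match

# Pairwise independent:
assert prob(lambda o: E(o) and F(o)) == prob(E) * prob(F)
assert prob(lambda o: E(o) and G(o)) == prob(E) * prob(G)
assert prob(lambda o: F(o) and G(o)) == prob(F) * prob(G)

# ...but not mutually independent: P(EFG) = 1/4 while P(E)P(F)P(G) = 1/8.
assert prob(lambda o: E(o) and F(o) and G(o)) == Fraction(1, 4)
assert prob(E) * prob(F) * prob(G) == Fraction(1, 8)
```

Knowing any one of E, F, G tells you nothing about any other one alone, but any two together determine the third.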

Independence?

[Figure: two Venn diagrams in sample space S. Left: disjoint events A and B; right: overlapping events A and B with intersection AB. Independence requires P(AB) = P(A) P(B).]

When E and F are independent…
Given independent events E and F, prove: P(E|F) = P(E|Fc)

Proof:
P(E Fc) = P(E) – P(EF)       (Intersection)
        = P(E) – P(E) P(F)   (Independence)
        = P(E) [1 – P(F)]    (Factoring)
        = P(E) P(Fc)         (Complement)

So E and Fc are independent, implying that:
P(E|Fc) = P(E) = P(E|F)

Intuitively, if E and F are independent, knowing whether F holds gives us no information about E.

Generalized Independence

Events E1, E2, ..., En are independent if for every subset E1′, E2′, ..., Er′ (where r ≤ n) it holds that:

P(E1′ E2′ … Er′) = P(E1′) P(E2′) … P(Er′)

Example: The outcomes of n separate flips of a coin are all independent of one another. Each flip in this case is a trial of the experiment.

Two!!!! Dice!!!!
Roll two 6-sided dice, yielding values D1 and D2.

Let E be the event: D1 = 1
Let F be the event: D2 = 6
Let G be the event: D1 + D2 = 7
G = { (1,6), (2,5), (3,4), (4,3), (5,2), (6,1) }

1. Are E and F independent? Yes!
   P(E) = 1/6, P(F) = 1/6, P(EF) = 1/36
2. Are E and G independent? Yes!
   P(E) = 1/6, P(G) = 1/6, P(EG) = 1/36, EG = { (1,6) }
3. Are F and G independent? Yes!
   P(F) = 1/6, P(G) = 1/6, P(FG) = 1/36, FG = { (1,6) }
4. Are E, F, and G independent? No!
   P(EFG) = 1/36 ≠ 1/216 = (1/6)(1/6)(1/6), EFG = { (1,6) }

Generating Random Bits
A computer produces a series of random bits:
• each bit is a 1 with probability p
• each bit generated is an independent trial
• E = first n bits are 1's, followed by a single 0
What is P(E)?

Solution (independent trials):
P(first n 1's) = P(1st bit = 1) P(2nd bit = 1) … P(nth bit = 1) = p^n
P((n+1)th bit = 0) = 1 – p
P(E) = P(first n 1's) P((n+1)th bit = 0) = p^n (1 – p)
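A quick Monte Carlo sanity check of p^n (1 – p), as our own sketch (`simulate` is a made-up helper, not part of the lecture):

```python
import random

def simulate(n, p, trials=200_000, seed=0):
    """Estimate P(first n bits are 1, bit n+1 is 0) by simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Generate n+1 independent bits, each 1 with probability p.
        bits = [1 if rng.random() < p else 0 for _ in range(n + 1)]
        if all(bits[:n]) and bits[n] == 0:
            hits += 1
    return hits / trials

n, p = 3, 0.6
# Estimate should be close to the exact answer p^n (1 - p) = 0.0864.
assert abs(simulate(n, p) - p**n * (1 - p)) < 0.005
```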

(Biased) Coin Flips

Suppose we flip a coin n times:
• the coin comes up heads with probability p
• each coin flip is an independent trial

1. P(n heads on n coin flips) = p^n
2. P(n tails on n coin flips) = (1 – p)^n
3. P(first k heads, then n – k tails) = p^k (1 – p)^(n–k)
4. P(exactly k heads on n coin flips) = (n choose k) p^k (1 – p)^(n–k)
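Formula 4 can be cross-checked against brute-force enumeration of all 2^n flip sequences. A sketch with our own helper names, using exact fractions:

```python
from fractions import Fraction
from itertools import product
from math import comb

def p_exactly_k(n, k, p):
    """Binomial probability: (n choose k) p^k (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def p_exactly_k_brute(n, k, p):
    """Sum the probability of every length-n flip sequence with exactly k heads."""
    return sum(p**sum(s) * (1 - p)**(n - sum(s))
               for s in product((0, 1), repeat=n) if sum(s) == k)

p = Fraction(2, 5)  # a biased coin
for n in range(1, 7):
    for k in range(n + 1):
        assert p_exactly_k(n, k, p) == p_exactly_k_brute(n, k, p)
```

The brute-force version is just formula 3 summed over the (n choose k) orderings, which is exactly why the binomial coefficient appears in formula 4.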

Break
Attendance: tinyurl.com/cs109summer2018

Probability in a nutshell

[Flowchart: for OR, P(E ∪ F): if mutually exclusive, just add; otherwise use inclusion-exclusion (or DeMorgan's). For AND, P(EF): if independent, just multiply; otherwise use the chain rule.]

Our goal today: from Level 6 to Level 15 (of 30).

Hash Tables
m strings are hashed (equally randomly) into a hash table with n buckets.
• Each string hashed is an independent trial.
• Let event E = at least one string hashed into the first bucket.
What is P(E)?

Solution: find P(Ec)!
Define: Fi = string i not hashed into the first bucket (where 1 ≤ i ≤ m)
F1F2…Fm = no strings hashed into the first bucket
Want to find (WTF): P(E) = 1 – P(F1F2…Fm)
P(Fi) = 1 – 1/n = (n – 1)/n   (for all 1 ≤ i ≤ m)
P(F1F2…Fm) = P(F1) P(F2) … P(Fm) = ((n – 1)/n)^m
P(E) = 1 – ((n – 1)/n)^m
Similar to the probability that ≥ 1 of m people has the same birthday as you.
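A simulation sanity check of 1 – ((n – 1)/n)^m (our own sketch; both helper names are made up):

```python
import random

def p_first_bucket_hit(m, n):
    """Exact answer: P(at least one of m strings lands in the first bucket)."""
    return 1 - ((n - 1) / n) ** m

def simulate(m, n, trials=100_000, seed=0):
    """Hash m strings uniformly into n buckets; count trials that hit bucket 0."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials)
               if any(rng.randrange(n) == 0 for _ in range(m)))
    return hits / trials

m, n = 10, 7
assert abs(simulate(m, n) - p_first_bucket_hit(m, n)) < 0.01
```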

More Hash Table Fun
m strings are hashed (unequally) into a hash table with n buckets.
• Each string hashed is an independent trial, with probability pi of getting hashed to bucket i.
• E = at least 1 of buckets 1 to k has ≥ 1 string hashed to it.

Solution:
Define: Fi = at least one string hashed into the i-th bucket (where 1 ≤ i ≤ n)
Want to find (WTF): P(E) = P(F1 ∪ F2 ∪ … ∪ Fk) = 1 – P((F1 ∪ F2 ∪ … ∪ Fk)c)
= 1 – P(F1c F2c … Fkc)   (DeMorgan's Law)
P(F1c F2c … Fkc) = P(no strings hashed to buckets 1 to k) = (1 – p1 – p2 – … – pk)^m
P(E) = 1 – (1 – p1 – p2 – … – pk)^m

No seriously, it's more Hash Table Fun
m strings are hashed (unequally) into a hash table with n buckets.
• Each string hashed is an independent trial, with probability pi of getting hashed to bucket i.
• E = each of buckets 1 to k has ≥ 1 string hashed to it.

Solution (General Inclusion-Exclusion Principle of Probability):
Define: Fi = at least one string hashed into the i-th bucket
P(E) = P(F1 F2 … Fk) = 1 – P((F1 F2 … Fk)c)
= 1 – P(F1c ∪ F2c ∪ … ∪ Fkc)   (DeMorgan's Law)
= 1 – Σ_{r=1}^{k} (–1)^(r+1) Σ_{i1 < … < ir} P(Fi1c Fi2c … Firc)
where P(Fi1c Fi2c … Firc) = (1 – pi1 – pi2 – … – pir)^m
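The inclusion-exclusion sum can be folded into a single sum over subsets J ⊆ {1..k} with sign (–1)^|J|. A Python sketch (our own helpers, not lecture code) that checks it against brute-force enumeration of all n^m hash assignments:

```python
from fractions import Fraction
from itertools import combinations, product

def p_all_buckets_hit(m, probs, k):
    """P(each of buckets 1..k gets >= 1 of m strings), by inclusion-exclusion:
    sum over subsets J of {1..k} of (-1)^|J| (1 - sum_{j in J} p_j)^m."""
    total = Fraction(0)
    for r in range(k + 1):
        for J in combinations(range(k), r):
            total += (-1) ** r * (1 - sum(probs[j] for j in J)) ** m
    return total

def p_all_buckets_hit_brute(m, probs, k):
    """Enumerate all n^m assignments of strings to buckets."""
    n = len(probs)
    total = Fraction(0)
    for assign in product(range(n), repeat=m):
        if all(j in assign for j in range(k)):  # buckets 0..k-1 all hit
            pr = Fraction(1)
            for a in assign:
                pr *= probs[a]
            total += pr
    return total

probs = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]  # bucket probabilities
assert p_all_buckets_hit(4, probs, 2) == p_all_buckets_hit_brute(4, probs, 2)
```

The r = 0 term of the subset sum contributes the leading 1, matching the "1 – Σ …" form on the slide.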

Network reliability

Consider the following parallel network:
• n independent routers, each with probability pi of functioning (where 1 ≤ i ≤ n)
• E = a functional path from A to B exists.
What is P(E)?

[Figure: routers p1, p2, …, pn connected in parallel between A and B.]

Solution: find P(Ec)!
P(E) = 1 – P(all routers fail)
= 1 – (1 – p1)(1 – p2) … (1 – pn)
= 1 – ∏_{i=1}^{n} (1 – pi)
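The closed form can be sanity-checked by simulation (our own sketch; helper names are made up):

```python
import random

def p_path_exists(probs):
    """Parallel network: P(E) = 1 - prod_i (1 - p_i)."""
    fail_all = 1.0
    for p in probs:
        fail_all *= 1 - p          # all routers fail independently
    return 1 - fail_all

def simulate(probs, trials=100_000, seed=0):
    """Each router works independently; a path exists if any router works."""
    rng = random.Random(seed)
    ok = sum(1 for _ in range(trials)
             if any(rng.random() < p for p in probs))
    return ok / trials

probs = [0.9, 0.5, 0.3]
assert abs(simulate(probs) - p_path_exists(probs)) < 0.01
```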

Digging deeper on independence

Two events E and F are defined as independent if:P(EF) = P(E) P(F)

If E and F are independent, does that tell us whether the following is true?

P(EF | G) = P(E | G) P(F | G),where G is an arbitrary event?

In general, No!


Conditional Independence

Two events E and F are defined as conditionally independent given G if:
P(EF | G) = P(E | G) P(F | G)
Or, equivalently: P(E|FG) = P(E | G)
(otherwise E and F are called conditionally dependent given G).

Warning: After conditioning on additional information,
1. Independent events can become conditionally dependent.
2. Dependent events can become conditionally independent.

Two…Dice…
Roll two 6-sided dice, yielding values D1 and D2.

Let E be the event: D1 = 1
Let F be the event: D2 = 6
Let G be the event: D1 + D2 = 7

1. Are E and F independent? Yes!
   P(E) = 1/6, P(F) = 1/6, P(EF) = 1/36
2. Are E and F independent given G? No!
   P(E|G) = 1/6, P(F|G) = 1/6, P(EF|G) = 1/6
   P(EF|G) ≠ P(E|G) P(F|G) → E and F are dependent given G

Independent events can become conditionally dependent.
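This example can be checked by enumeration (an illustration we added; `cond_prob` is our own helper):

```python
from fractions import Fraction

outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def cond_prob(pred, given):
    """P(pred | given) on the two-dice sample space."""
    sub = [o for o in outcomes if given(o)]
    return Fraction(sum(1 for o in sub if pred(o)), len(sub))

E = lambda o: o[0] == 1             # D1 = 1
F = lambda o: o[1] == 6             # D2 = 6
G = lambda o: o[0] + o[1] == 7      # D1 + D2 = 7
everything = lambda o: True

# Independent unconditionally: P(EF) = P(E) P(F)
assert (cond_prob(lambda o: E(o) and F(o), everything)
        == cond_prob(E, everything) * cond_prob(F, everything))

# Dependent given G: P(EF|G) = 1/6, but P(E|G) P(F|G) = 1/36.
assert cond_prob(lambda o: E(o) and F(o), G) == Fraction(1, 6)
assert cond_prob(E, G) * cond_prob(F, G) == Fraction(1, 36)
```

Intuitively, once you know the sum is 7, D1 = 1 forces D2 = 6, so the two events become perfectly coupled.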

Faculty Night
In a dorm with 100 students:
30 students have straight A's: P(A) = 0.30
20 students are in CS: P(CS) = 0.20
6 students are in CS with straight A's: P(A, CS) = 0.06

At faculty night F, only CS students and straight-A students show up (44 total).

1. Are A and CS independent? Yes!
   P(A|CS) = 6/20 = 0.30 = P(A)
2. Are A and CS independent given F? No!
   P(A|CS, F) = 0.30 ≠ 30/44 = P(A|F)

Which is correct?
• Being a CS major lowers your probability of getting A's.
• Knowing you went to faculty night because you are a CS major makes it less likely that you went because you have straight A's.
(The second: A and CS are independent overall, so the dependence only appears after conditioning on F.)
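The counts behind both answers, as a quick check (our own variable names):

```python
from fractions import Fraction

total = 100
a_students, cs_students, both = 30, 20, 6

P_A = Fraction(a_students, total)
P_CS = Fraction(cs_students, total)
P_A_and_CS = Fraction(both, total)

# 1. Independent overall: P(A, CS) = P(A) P(CS) = 0.06
assert P_A_and_CS == P_A * P_CS

# Attendees at faculty night: the union of CS and straight-A students.
attendees = a_students + cs_students - both          # 30 + 20 - 6 = 44
P_A_given_F = Fraction(a_students, attendees)        # 30/44
P_A_given_CS_and_F = Fraction(both, cs_students)     # all CS students attend: 6/20

# 2. Not independent given F: 0.30 != 30/44.
assert P_A_given_CS_and_F == Fraction(3, 10)
assert P_A_given_CS_and_F != P_A_given_F
```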

Summary

Two events E and F are defined as independent if:
P(EF) = P(E) P(F)
Or, equivalently: P(E|F) = P(E)

If E and F are independent, then E and Fc are also independent, and
P(E|F) = P(E) = P(E|Fc)

Two events E and F are defined as conditionally independent given G if:
P(EF | G) = P(E | G) P(F | G)
Or, equivalently: P(E|FG) = P(E | G)

Important notes:
1. Independence does not imply causality; it's just math.
2. Conditioning on an event can break independence, or it can make dependent events conditionally independent.

Summary

[Flowchart repeated: for OR, P(E ∪ F): if mutually exclusive, just add; otherwise use inclusion-exclusion (or DeMorgan's). For AND, P(EF): if independent, just multiply; otherwise use the chain rule.]
