Combining Probabilities and Conditional Probability

Stat 700 Lecture 4, 9/11/2001-9/13/2001

Page 1: Combining Probabilities and Conditional Probability

Combining Probabilities and Conditional Probability

Stat 700 Lecture 4, 9/11/2001-9/13/2001

Page 2: Combining Probabilities and Conditional Probability

04/22/23 Lecture 4: More Probabilities and Conditioning

2

Overview of Lecture

– Algebra of Events and Probabilities
– Conditional Probability and its Importance
– Probability Updating: Bayes Rule
– Independent Events
– Discrete Random Variables
– Introduction to Probability Distributions
– Parameters of Distributions

Page 3: Combining Probabilities and Conditional Probability


Algebra of Events and Probabilities

Given an event A, its complement, denoted by Ac, is the event whose elements are those that are not in A. Thus, the event Ac is the opposite of event A.

Note that (Ac)c = A. (Rule of double complementation)

Rule 1 (Complementation Rule): For any event A, P(A) = 1 - P(Ac).

From this rule it follows immediately that P(∅) = 0, since ∅ = Sc and P(S) = 1.

Page 4: Combining Probabilities and Conditional Probability


A Simple Application

Example: Consider the experiment of rolling two fair dice simultaneously, and let A be the event that the sum of the outcomes is at most 11. We seek P(A).

Solution: Since there are (6)(6) = 36 simple events, and we can assume that these events are equally likely, then each has probability of 1/36. Out of these 36 events, only one of them, the simple event {(6,6)}, has a sum that exceeds 11, so Ac = {(6,6)}. Therefore, by the complementation rule,

P(A) = 1 - P(Ac) = 1 - 1/36 = 35/36.
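The counting argument can be double-checked by brute-force enumeration; a minimal Python sketch (the variable names are illustrative):

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

# Direct count of A = {sum of the two dice is at most 11}.
p_a_direct = Fraction(sum(1 for d1, d2 in outcomes if d1 + d2 <= 11),
                      len(outcomes))

# Complementation rule: Ac = {(6,6)}, so P(A) = 1 - 1/36.
p_a_complement = 1 - Fraction(1, 36)

print(p_a_direct, p_a_complement)  # 35/36 35/36
```

Using exact fractions avoids floating-point rounding in checks like this.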

Page 5: Combining Probabilities and Conditional Probability


Algebra … continued

Given two events, A and B, the union of A and B, denoted by (A ∪ B), is the event whose elements are those that belong to either A or B or both. It represents the occurrence of at least one of the events A or B.

We also write (A or B) for (A ∪ B).

We could generalize this to the union of several events, e.g., (A ∪ B ∪ C ∪ D), which would then represent the occurrence of at least one of these 4 events.

Page 6: Combining Probabilities and Conditional Probability


Algebra … continued

Given two events, A and B, the intersection of A and B, denoted by (A ∩ B), is the event whose elements are those that belong to both A and B. It represents the simultaneous occurrence of A and B.

We also write (A and B) and (AB) for (A ∩ B).

We could generalize this to the intersection of several events, e.g., (A ∩ B ∩ C ∩ D), which would then represent the simultaneous occurrence of all 4 events.

If A ∩ B = ∅ = {empty event}, we say that A and B are disjoint or mutually exclusive.

Generalization: notion of pairwise disjoint events.

Page 7: Combining Probabilities and Conditional Probability


Finite Additivity Properties

Rule 2: Given disjoint events A and B, then P(A ∪ B) = P(A) + P(B).

Extended Rule 2: Given pairwise disjoint events A1, A2, …, Ak, then

P(A1 ∪ A2 ∪ … ∪ Ak) = P(A1) + P(A2) + … + P(Ak).

Page 8: Combining Probabilities and Conditional Probability


Addition and Generalized Addition Rules

Rule 3 (Addition Rule): Given events A and B (which are not necessarily disjoint), then

P(A ∪ B) = P(A) + P(B) - P(A ∩ B).

Extended Rule 3 (Inclusion-Exclusion Principle): Given three events A, B, and C (which are not necessarily pairwise disjoint), then

P(A ∪ B ∪ C) = {P(A) + P(B) + P(C)} - {P(AB) + P(AC) + P(BC)} + P(ABC).

Page 9: Combining Probabilities and Conditional Probability


Some Concrete Applications of the Probability Rules

Example 1: A study is to be performed to examine the association between the occurrence of lung cancer and smoking. Suppose that one person is to be randomly chosen and classified as either a smoker or a nonsmoker, and according to whether he/she has lung cancer or not. For this experiment, the sample space is:

S = {(Nonsmoker, No lung cancer), (Nonsmoker, With lung cancer), (Smoker, No lung cancer), (Smoker, With lung cancer)}.

Assume that the proportion in the population who are smokers is 0.15, while the proportion who have lung cancer is 0.05. Furthermore, assume that the proportion who are smokers with lung cancer is 0.009.

We seek the probability that the person chosen will either be a smoker or have lung cancer.

Page 10: Combining Probabilities and Conditional Probability


Information for Example

The table below shows the information provided in the description of the problem.

Such a table could also have been used to compute the desired probabilities … illustrated in class.

                      Smokers   Nonsmokers   Marginal Proportions
With lung cancer       0.009                  0.05
Without lung cancer
Marginal Proportions   0.15

Page 11: Combining Probabilities and Conditional Probability


Applications … continued

Solution of Example 1: Let A be the event that the person is a smoker, and B be the event that the person has lung cancer. From the given information we have that P(A) = 0.15, P(B) = 0.05, and P(AB) = 0.009. Therefore, by the addition rule, the desired probability is

P(A or B) = P(A) + P(B) - P(AB) = 0.15 + 0.05 - 0.009 = 0.191.

Page 12: Combining Probabilities and Conditional Probability


DeMorgan’s Rules

DeMorgan’s rules state that:

– the complement of the union of events equals the intersection of their complements;

– the complement of the intersection of events equals the union of their complements.

Formally, for two events A and B, we have:

(A ∪ B)c = Ac ∩ Bc;

(A ∩ B)c = Ac ∪ Bc.

Page 13: Combining Probabilities and Conditional Probability


Example … continued

Suppose that in the preceding example we were instead interested in the probability that the person is neither a smoker nor has lung cancer. Then, in formal notation, we want:

P(Ac ∩ Bc) = P{(not a smoker) and (is free of lung cancer)}.

By virtue of DeMorgan’s rule, we have that Ac ∩ Bc = (A ∪ B)c. Applying the complementation rule, we therefore obtain

P(Ac ∩ Bc) = P((A ∪ B)c) = 1 - P(A ∪ B) = 1 - 0.191 = 0.809.

Page 14: Combining Probabilities and Conditional Probability


Another Approach: By Completing the Table of Probabilities

The answers obtained in the preceding slides could also be derived by simply completing the table of probabilities:

                      Smokers   Nonsmokers   Marginal Proportions
With lung cancer       0.009     0.041        0.05
Without lung cancer    0.141     0.809        0.95
Marginal Proportions   0.15      0.850        1.00
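Filling in the table is just repeated subtraction using the complementation and additivity rules; a small Python sketch of that bookkeeping (all variable names are illustrative):

```python
# Given marginal and joint proportions from the example.
p_smoker = 0.15
p_cancer = 0.05
p_smoker_and_cancer = 0.009

# The remaining cells follow by subtraction (finite additivity).
p_nonsmoker_and_cancer = p_cancer - p_smoker_and_cancer   # 0.041
p_smoker_no_cancer = p_smoker - p_smoker_and_cancer       # 0.141
p_nonsmoker_no_cancer = 1 - (p_smoker_and_cancer + p_nonsmoker_and_cancer
                             + p_smoker_no_cancer)        # 0.809

# Addition rule, and complementation for the "neither" event.
p_smoker_or_cancer = p_smoker + p_cancer - p_smoker_and_cancer  # 0.191
p_neither = 1 - p_smoker_or_cancer                              # 0.809
```

Note that `p_neither` coincides with the (nonsmoker, no lung cancer) cell, exactly as DeMorgan's rule predicts.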

Page 15: Combining Probabilities and Conditional Probability


A Matching Problem

Example 2: Four people, Peter, Paul, Mary, and Magdalene, write their names on identically-sized chips of paper. These chips are then placed in a box and thoroughly shuffled. Each of them (in the order given above) then randomly draws a chip from the box, the drawing being without replacement. We seek the probabilities of the following events:

– a) probability that Paul draws his name;

– b) probability that either Paul or Magdalene (or both) draw their respective names; and

– c) probability that at least one of them draws his/her name.

Page 16: Combining Probabilities and Conditional Probability


Matching … continued

Solution: We shall let A = {Peter draws his name}; B = {Paul draws his name}; C = {Mary draws her name}; and D = {Magdalene draws her name}. Observe that the number of outcomes {e.g., (Paul, Mary, Magdalene, Peter)} in the sample space S is (4)(3)(2)(1) = 4! = 24, and these outcomes are equally likely since the draws are done at random.

Now, for problem (a), we want P(B) = N(B)/N(S). But N(B) = (3)(1)(2)(1) = 6, since we need “Paul” to be the second chip drawn. Therefore, P(B) = 6/24 = 1/4 = 0.25.

Page 17: Combining Probabilities and Conditional Probability


Continued ...

• For problem (b) we want P(B or D). By the addition rule, we have P(B or D) = P(B) + P(D) - P(B and D). Analogously to problem (a), we have P(D) = (3)(2)(1)(1)/24 = 1/4.

• On the other hand, P(B and D) = (2)(1)(1)(1)/24 = 1/12, since both Paul and Magdalene must draw their own names.

• Therefore, P(B or D) = 1/4 + 1/4 - 1/12 = 5/12 = 0.4167.

Page 18: Combining Probabilities and Conditional Probability


Solution … continued

For problem (c), where we are seeking the probability of having at least one of them draw their own name, we want:

P{A ∪ B ∪ C ∪ D} = P{at least one of A, B, C, or D occurs}.

But by the generalized addition rule (inclusion-exclusion principle) for 4 events, this is:

P{A ∪ B ∪ C ∪ D} = {P(A)+P(B)+P(C)+P(D)} - {P(AB) + P(AC) + P(AD) + P(BC) + P(BD) + P(CD)} + {P(ABC) + P(ABD) + P(ACD) + P(BCD)} - P(ABCD).

Page 19: Combining Probabilities and Conditional Probability


Continued ...

Similar calculations as in the previous two problems now yield (the numbers in parentheses below count the summands of each type):

P{A B C D} = (4)(1/4) - (6)(1/12) + (4)(1/24) - (1)(1/24) = 1 - 1/2 + 1/6 - 1/24 = 1 - 1/2! + 1/3! - 1/4! = 0.625.

Food for Thought!! What do you think will happen to this probability if there were 1000 people, instead of 4 people??
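Both the enumeration and the food-for-thought question can be explored with a short Python sketch (restricted to small n, since n! grows quickly). The inclusion-exclusion series 1 - 1/2! + 1/3! - … converges to 1 - 1/e ≈ 0.6321, so with 1000 people the probability would barely differ from 0.632:

```python
from fractions import Fraction
from itertools import permutations
from math import factorial

def p_at_least_one_match(n):
    """P(at least one of n people draws his/her own name), by enumeration."""
    hits = sum(1 for p in permutations(range(n))
               if any(p[i] == i for i in range(n)))
    return Fraction(hits, factorial(n))

print(p_at_least_one_match(4))         # 5/8, matching the slide's 0.625
# The series 1 - 1/2! + 1/3! - ... converges rapidly to 1 - 1/e ~ 0.6321,
# so for 1000 people the answer is essentially 0.632.
print(float(p_at_least_one_match(7)))  # already close to 0.6321
```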

Page 20: Combining Probabilities and Conditional Probability


Another Solution: The “Brute Force” Approach

Sample Space: Possible Outcomes of the Draws
(Legend: 1=Peter, 2=Paul, 3=Mary, 4=Magdalene)

(1,2,3,4) (2,1,3,4) (3,1,2,4) (4,1,2,3)
(1,2,4,3) (2,1,4,3) (3,1,4,2) (4,1,3,2)
(1,3,2,4) (2,3,1,4) (3,2,1,4) (4,2,1,3)
(1,3,4,2) (2,3,4,1) (3,2,4,1) (4,2,3,1)
(1,4,2,3) (2,4,1,3) (3,4,1,2) (4,3,1,2)
(1,4,3,2) (2,4,3,1) (3,4,2,1) (4,3,2,1)

Page 21: Combining Probabilities and Conditional Probability


Event B: “Paul” is Matched

Sample Space: Possible Outcomes of the Draws
(Legend: 1=Peter, 2=Paul, 3=Mary, 4=Magdalene; * marks the outcomes in B, i.e., those with 2 in the second position)

(1,2,3,4)* (2,1,3,4)  (3,1,2,4)  (4,1,2,3)
(1,2,4,3)* (2,1,4,3)  (3,1,4,2)  (4,1,3,2)
(1,3,2,4)  (2,3,1,4)  (3,2,1,4)* (4,2,1,3)*
(1,3,4,2)  (2,3,4,1)  (3,2,4,1)* (4,2,3,1)*
(1,4,2,3)  (2,4,1,3)  (3,4,1,2)  (4,3,1,2)
(1,4,3,2)  (2,4,3,1)  (3,4,2,1)  (4,3,2,1)

Therefore, P(B) = 6/24 = 1/4 = 0.25.

Page 22: Combining Probabilities and Conditional Probability


Event (B or D): Either “Paul” or “Magdalene” or Both Are Matched

Sample Space: Possible Outcomes of the Draws
(Legend: 1=Peter, 2=Paul, 3=Mary, 4=Magdalene; * marks the outcomes in (B or D): 2 in the second position or 4 in the fourth)

(1,2,3,4)* (2,1,3,4)* (3,1,2,4)* (4,1,2,3)
(1,2,4,3)* (2,1,4,3)  (3,1,4,2)  (4,1,3,2)
(1,3,2,4)* (2,3,1,4)* (3,2,1,4)* (4,2,1,3)*
(1,3,4,2)  (2,3,4,1)  (3,2,4,1)* (4,2,3,1)*
(1,4,2,3)  (2,4,1,3)  (3,4,1,2)  (4,3,1,2)
(1,4,3,2)  (2,4,3,1)  (3,4,2,1)  (4,3,2,1)

Therefore, P(B or D) = 10/24 = 5/12 = 0.4167.

Page 23: Combining Probabilities and Conditional Probability


Event (A or B or C or D): At Least One of Them Gets a Match

Sample Space: Possible Outcomes of the Draws
(Legend: 1=Peter, 2=Paul, 3=Mary, 4=Magdalene; * marks the outcomes in which at least one person draws his/her own name)

(1,2,3,4)* (2,1,3,4)* (3,1,2,4)* (4,1,2,3)
(1,2,4,3)* (2,1,4,3)  (3,1,4,2)  (4,1,3,2)*
(1,3,2,4)* (2,3,1,4)* (3,2,1,4)* (4,2,1,3)*
(1,3,4,2)* (2,3,4,1)  (3,2,4,1)* (4,2,3,1)*
(1,4,2,3)* (2,4,1,3)  (3,4,1,2)  (4,3,1,2)
(1,4,3,2)* (2,4,3,1)* (3,4,2,1)  (4,3,2,1)

Thus, P(A or B or C or D) = 15/24 = 5/8 = 0.625.

Page 24: Combining Probabilities and Conditional Probability


Conditional Probability: Motivation

In many situations occurring in the sciences, both natural and social, one is interested in the probability of an event given other information.

For example, one may not be interested in the probability of a person getting lung cancer, but rather might be interested in the probability that the person will get lung cancer given the information that this person is a cigarette smoker.

Or, one may be interested in knowing the probability that someone is HIV-infected given a positive result from an ELISA test (a test for HIV-infection).

Page 25: Combining Probabilities and Conditional Probability


Definition of Conditional Probability

In such situations, we are interested in the conditional probability of an event.

Given events A and B (for some experiment), the conditional probability of B given A is:

P(B|A) = P(A ∩ B)/P(A), whenever P(A) > 0;

P(B|A) is not defined whenever P(A) = 0.

Page 26: Combining Probabilities and Conditional Probability


Justification of the Definition

If we are given the information that event A has occurred, then we know that the outcome of the experiment is in event A.

Therefore, given this information, event B could have occurred only if the outcome is in the intersection of A and B. Dividing P(A and B) by P(A) serves to standardize P(A and B) since, given that A has occurred, the effective sample space becomes A.

If P(A) = 0, then conditioning on an event that never occurs is clearly not of any interest!

Page 27: Combining Probabilities and Conditional Probability


A Simple Example

Consider the experiment of tossing three fair coins simultaneously.

Thus, S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}, and each outcome has probability of 1/8.

Define A = identical outcomes = {HHH, TTT}, and B = at least two heads = {HHT, HTH, THH, HHH}. Thus, P(A) = 2/8 and P(B) = 4/8. Note that (A ∩ B) = {HHH}, so P(A ∩ B) = 1/8.

Page 28: Combining Probabilities and Conditional Probability


Continued ...

P(B|A) = P(A ∩ B)/P(A) = (1/8)/(2/8) = 1/2.

This conditional probability is clearly intuitive since, if we know that A has occurred [so the outcome was either (HHH) or (TTT)], then the only way that B could have occurred is if the outcome was (HHH).

Page 29: Combining Probabilities and Conditional Probability


Example … continued

On the other hand, if we are given the information that B has occurred, then P(A|B) = P(A ∩ B)/P(B) = (1/8)/(4/8) = 1/4.

Again, this is intuitive because the information that B occurred tells us that the outcome is either (HHT), (HTH), (THH), or (HHH). In order for A to have occurred, (HHH) must be the outcome; hence the conditional probability of A given B is 1/4.

From these two examples, note that P(B|A) and P(A|B) need not be identical.
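The two conditional probabilities can be confirmed by enumerating the sample space directly; a minimal Python sketch:

```python
from fractions import Fraction
from itertools import product

# The 8 equally likely outcomes of tossing three fair coins.
S = [''.join(t) for t in product('HT', repeat=3)]

A = {s for s in S if s in ('HHH', 'TTT')}    # identical outcomes
B = {s for s in S if s.count('H') >= 2}      # at least two heads

def p(event):
    return Fraction(len(event), len(S))

p_b_given_a = p(A & B) / p(A)   # (1/8)/(2/8) = 1/2
p_a_given_b = p(A & B) / p(B)   # (1/8)/(4/8) = 1/4
```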

Page 30: Combining Probabilities and Conditional Probability


Another Example

Consider the hypothetical population proportions regarding smoking and the presence of lung cancer which were used in previous examples. The information is reproduced in the table below. Recall that the experiment is to randomly choose one person from this population.

                      Smokers   Nonsmokers   Marginal Proportions
With lung cancer       0.009     0.041        0.05
Without lung cancer    0.141     0.809        0.95
Marginal Proportions   0.15      0.850        1.00

Page 31: Combining Probabilities and Conditional Probability


Example … continued

Let us denote by A the event that the person chosen is a smoker, and by B the event that the person chosen has lung cancer.

Suppose that we are given the information that the person chosen is a smoker. Then

P(has lung cancer | smoker) = P(B|A) = P(AB)/P(A) = 0.009/0.15 = 0.06.

On the other hand, if we know that the person is a nonsmoker, then

P(has lung cancer | nonsmoker) = P(B|Ac) = P(AcB)/P(Ac) = 0.041/0.85 = 0.048.

Page 32: Combining Probabilities and Conditional Probability


Example … continued

By comparing the conditional probabilities P(B|A) = 0.060 and P(B|Ac) = 0.048 [and we could do such a comparison since the process of conditioning has standardized the probabilities for the two groups], one can conclude (in this hypothetical situation) that the prevalence of lung cancer among smokers is slightly higher than the prevalence of lung cancer among nonsmokers.

One may also look at the “inverse” probabilities: P(A|B) = P(A and B)/P(B) = 0.009/0.05 = 0.18; and P(A|Bc) = P(A and Bc)/P(Bc) = 0.141/0.95 = 0.148.

Page 33: Combining Probabilities and Conditional Probability


Multiplication Rule

Given events A and B, if we are given P(A) > 0 and the conditional probability P(B|A), then the probability P(A ∩ B) can be obtained by inverting the conditional probability formula to get the multiplication rule:

P(A ∩ B) = P(A)P(B|A).

Page 34: Combining Probabilities and Conditional Probability


Utility of Multiplication Rule

The multiplication rule is what enables us to multiply the (conditional) probabilities in the branches of a tree diagram to get the (joint) probabilities of the outcomes of the experiment.

On the other hand, the multiplication rule is only usable if we are able to determine the conditional probability of B given A without recourse to the conditional probability rule [since this latter rule requires P(AB)].

Obtaining P(B|A) this way is usually done by examining the situation at hand.

Page 35: Combining Probabilities and Conditional Probability


An Illustration

Example: Suppose you have two boxes, with Box I containing 4 red and 7 blue balls, and Box II containing 6 red and 2 blue balls.

The two-step experiment is to pick a ball at random from Box I which is then transferred to Box II. A ball is then drawn from Box II.

Let A = event that a red ball is transferred from I to II; and let B = event that a red ball is drawn from II.

We seek P(A and B) = P({both balls are red}).

Page 36: Combining Probabilities and Conditional Probability


Example … continued

Clearly, P(A) = 4/11, while without using the conditional probability formula we obtain P(B|A) = 7/9, since after the transfer of a red ball there will be 7 reds and 2 blues in Box II.

Consequently, by the multiplication rule: P(A and B) = P(A)P(B|A) = (4/11)(7/9) = 28/99 = 0.2828.

In class we illustrate this computation via a tree diagram.

Page 37: Combining Probabilities and Conditional Probability


Example … continued

Similarly, we obtain the joint probability of getting a blue ball from Box I and a red ball from Box II, which is symbolically represented by (Ac ∩ B), as follows:

P(Ac) = P(blue from Box I) = 7/11;

P(B|Ac) = P(red from Box II | blue ball transferred) = 6/9; so

P(Ac ∩ B) = P(Ac)P(B|Ac) = (7/11)(6/9) = 42/99 = 0.4242.
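The two branch probabilities of the tree diagram can be reproduced exactly with rational arithmetic; a minimal Python sketch of the multiplication rule (variable names are illustrative):

```python
from fractions import Fraction

# Box I: 4 red, 7 blue. One ball is moved at random to Box II (6 red, 2 blue).
p_red_transfer = Fraction(4, 11)          # P(A)
p_blue_transfer = Fraction(7, 11)         # P(Ac)

# After the transfer Box II holds 9 balls; condition on the transferred color.
p_red_draw_given_red = Fraction(7, 9)     # P(B|A)
p_red_draw_given_blue = Fraction(6, 9)    # P(B|Ac)

# Multiplication rule: joint probability for each branch of the tree.
p_both_red = p_red_transfer * p_red_draw_given_red         # 28/99
p_blue_then_red = p_blue_transfer * p_red_draw_given_blue  # 42/99
```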

Page 38: Combining Probabilities and Conditional Probability


Example … continued

Let us now consider the problem of determining the (marginal) probability of getting a red ball from Box II; that is, we want P(B).

Notice that we are no longer directly interested in what happens in step 1 (getting a ball from Box I), but at the same time we see that the occurrence of event B depends on what happens in step 1.

We are therefore faced with the problem of combining the probabilities arising from whether we transfer a red ball to Box II or we transfer a blue ball to Box II.

The question is: how do we combine?

Page 39: Combining Probabilities and Conditional Probability


Analyzing the Situation

First we note that the event B could arise in two ways:

– (draw a red from I, then draw a red from II); or
– (draw a blue from I, then draw a red from II).

Symbolically, this could be represented via: B = (A ∩ B) ∪ (Ac ∩ B).

Furthermore, note that since A and Ac are disjoint (they could not occur simultaneously), (A ∩ B) and (Ac ∩ B) are also disjoint events.

We could therefore apply the addition rule.

Page 40: Combining Probabilities and Conditional Probability


The Combined Probability

By the addition rule (or the finite additivity property), we obtain:

P(B) = P{(A ∩ B) ∪ (Ac ∩ B)} = P(A ∩ B) + P(Ac ∩ B) = P(A)P(B|A) + P(Ac)P(B|Ac).

From our earlier calculations, where we used the multiplication rule, we obtained P(A ∩ B) = 28/99 and P(Ac ∩ B) = 42/99. Therefore, P(B) = 28/99 + 42/99 = 70/99 = .7070.

This is the probability of getting a red ball from Box II (taking into proper account what could happen with the draw from Box I).

Page 41: Combining Probabilities and Conditional Probability


Theorem of Total Probabilities

The calculation of P(B), where B is an event pertaining to the result of the second step of the experiment, is a special case of what is referred to as the “Theorem of Total Probabilities,” a method for combining probabilities from the different possibilities arising from the first step of the experiment.

In its simplest form, when there are only two possibilities, A and Ac, from the first step of the experiment, the theorem states that:

P(B) = P(A)P(B|A) + P(Ac)P(B|Ac).

Page 42: Combining Probabilities and Conditional Probability


Updating of Probabilities

Let us go back to the example we were considering. Evidently, it is immediate that P(A) = P(red from Box I) = 4/11 = .3636, and P(Ac) = P(blue from Box I) = 7/11 = .6363. These are our prior probabilities for A and Ac.

Suppose that when we performed the experiment, we did not look at the color of the ball that we transferred from Box I to Box II. Furthermore, suppose that when we looked at the ball drawn from Box II, it was red.

Given this information, how do we update our knowledge of what we transferred from Box I to II?

Page 43: Combining Probabilities and Conditional Probability


Updating … continued

Since we are interested in determining the probability of event A (and also of Ac) given event B, the desired probabilities are just the conditional probabilities. Therefore, applying the conditional probability rule, we have:

P(A|B) = P(A ∩ B)/P(B) = P(A)P(B|A)/P(B); and

P(Ac|B) = P(Ac ∩ B)/P(B) = P(Ac)P(B|Ac)/P(B).

Page 44: Combining Probabilities and Conditional Probability


The Updating … continued

From our earlier calculations, we have obtained: P(A ∩ B) = 28/99, P(Ac ∩ B) = 42/99, and P(B) = 70/99.

Substituting these values in the formulas of the preceding slide, we obtain our updated probabilities, also called the posterior probabilities, to be:

P(A|B) = (28/99)/(70/99) = 28/70 = 0.40;

P(Ac|B) = (42/99)/(70/99) = 42/70 = 0.60.
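The whole update, from priors through the Theorem of Total Probabilities to the posteriors, can be reproduced with exact fractions; a minimal Python sketch:

```python
from fractions import Fraction

# Priors and conditional probabilities from the two-box example.
p_a, p_ac = Fraction(4, 11), Fraction(7, 11)        # transferred red / blue
p_b_given_a, p_b_given_ac = Fraction(7, 9), Fraction(6, 9)

# Theorem of Total Probabilities.
p_b = p_a * p_b_given_a + p_ac * p_b_given_ac       # 70/99

# Bayes rule: posterior probabilities given a red draw from Box II.
posterior_a = p_a * p_b_given_a / p_b               # 28/70 = 2/5 = 0.40
posterior_ac = p_ac * p_b_given_ac / p_b            # 42/70 = 3/5 = 0.60
```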

Page 45: Combining Probabilities and Conditional Probability


Comparison: Priors and Posteriors

We may compare the posterior probabilities P(A|B) = 0.40 and P(Ac|B) = 0.60 with the prior probabilities P(A) = 0.3636 and P(Ac) = 0.6363.

These values indicate that the information that we got a red ball from Box II increased the probability that we transferred a red ball from Box I to Box II from 0.3636 to 0.40, and decreased the probability that we transferred a blue ball from 0.6363 to 0.60.

The directions of change in the values are clearly intuitive, but the exact magnitudes of the changes cannot be obtained without recourse to the formulas and reasoning we had employed.

Page 46: Combining Probabilities and Conditional Probability


(Reverend) Bayes Theorem

The procedure we have just employed to update our prior probabilities of A and Ac, given the occurrence of event B, is a special case of Bayes Theorem.

For the situation where in the first step of the experiment we only have two possibilities, A and Ac, and B is an event pertaining to the outcome of the second step of the experiment, Bayes Theorem states that:

P(A|B) = P(A)P(B|A)/P(B) = P(A)P(B|A) / [P(A)P(B|A) + P(Ac)P(B|Ac)].

Page 47: Combining Probabilities and Conditional Probability


Another Example

Situation: A medical test for HIV-infection has the following characteristics:

– if the person is HIV-infected, the (conditional) probability that the test will be positive is 0.98, so the (conditional) probability that it will turn up negative is 0.02; while

– if the person is not HIV-infected, the (conditional) probability that the test will be negative is 0.99, so the (conditional) probability that it will turn up positive is 0.01.

Assume that the prevalence of HIV-infection in the population is 0.005.

Page 48: Combining Probabilities and Conditional Probability


Characteristics of the Medical Test for Infection

Below we present in tabular form the characteristics of the test for HIV-infection. Note that the entries are conditional probabilities, given the true status of the person being tested.

                            IF the person being tested is:
                            HIV-infected   Not HIV-infected
Test result: Positive (+)       0.98            0.01
Test result: Negative (-)       0.02            0.99

Page 49: Combining Probabilities and Conditional Probability


Example … continued

The Experiment: Suppose that a person is randomly chosen from the population, and this person is subjected to the test for HIV-infection. We are interested in:

– a) The probability that the person is HIV-infected (prior to the test).

– b) The probability that the test for HIV-infection will show a positive result.

– c) Given that the test showed a positive result, the (updated) probability that the person is HIV-infected. [QUESTION: Without computing the probability, what is your best estimate??]

Page 50: Combining Probabilities and Conditional Probability


Example … continued

Solution: We let A be the event that the person chosen is HIV-infected, and B the event that the test for HIV-infection shows a positive result. From the given information, we have:

P(A) = 0.005, so P(Ac) = 1 - P(A) = 1 - 0.005 = 0.995.

P(B|A) = 0.98 and P(B|Ac) = 0.01.

By the Theorem of Total Probabilities: P(B) = P(A)P(B|A) + P(Ac)P(B|Ac) = (.005)(.98) + (.995)(.01) = .0049 + .00995 = .01485.

By Bayes Rule: P(A|B) = P(A)P(B|A)/P(B) = (.005)(.98)/.01485 = 0.33.
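The same computation can be packaged as a small hypothetical helper function (the function and parameter names are ours, chosen for readability; "sensitivity" is P(positive | infected) and "false positive rate" is P(positive | not infected)):

```python
def posterior_given_positive(prior, sensitivity, false_positive_rate):
    """Bayes rule: P(infected | positive), via the Theorem of Total Probabilities."""
    p_positive = prior * sensitivity + (1 - prior) * false_positive_rate
    return prior * sensitivity / p_positive

p_infected_given_pos = posterior_given_positive(0.005, 0.98, 0.01)
print(round(p_infected_given_pos, 2))  # 0.33
```

Even with an accurate test, the posterior is only about 1/3, because most positive results come from the large uninfected majority.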

Page 51: Combining Probabilities and Conditional Probability


Generalizations of the Combining and Updating Rules

For completeness of our discussion, we present generalizations of the following:

– Multiplication Rule
– Theorem of Total Probabilities
– Bayes Rule

The application of these rules is analogous to the preceding examples, except for the possibility that there might be more than 2 outcomes in the “first step” of the experiment.

Page 52: Combining Probabilities and Conditional Probability


Independent Events

Let A and B be events of an experiment. We say that A and B are independent if and only if:

P(B|A) = P(B) or P(A|B) = P(A).

That is, knowledge of the occurrence of one event does not provide information about whether the other event has also occurred.

By the multiplication rule and the above definition, if A and B are independent, then

P(A ∩ B) = P(A)P(B).

That is, joint probability = product of marginal probabilities for independent events.

Page 53: Combining Probabilities and Conditional Probability


Examples

Suppose that two fair coins are tossed, so S = {HH, HT, TH, TT} and each sample outcome has probability 1/4.

Let A = {outcome of first toss is H} = {HH, HT}, so P(A) = 2/4 = 1/2.

Let B = {outcome of second toss is T} = {HT, TT}, so P(B) = 2/4 = 1/2.

Note that since A ∩ B = {HT}, then P(A ∩ B) = 1/4.

Therefore, P(B|A) = (1/4)/(1/2) = 1/2 = P(B), so A and B are independent events.

Note also that P(A ∩ B) = 1/4 = (1/2)(1/2) = P(A)P(B), another way to check independence.
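The product-of-marginals check generalizes to a one-line test over any finite equally-likely sample space; a minimal Python sketch for the two-coin example:

```python
from fractions import Fraction
from itertools import product

S = [''.join(t) for t in product('HT', repeat=2)]   # HH, HT, TH, TT
A = {s for s in S if s[0] == 'H'}                   # first toss is H
B = {s for s in S if s[1] == 'T'}                   # second toss is T

def p(event):
    return Fraction(len(event), len(S))

# Independence: the joint probability equals the product of the marginals.
independent = p(A & B) == p(A) * p(B)               # 1/4 == (1/2)(1/2) -> True
```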

Page 54: Combining Probabilities and Conditional Probability


Examples … continued

The preceding example is the reason why we say that the outcomes of the tosses of a coin (or die) are independent. In a practical sense, the outcome of the first toss does not have any bearing on the outcome of the second toss.

Page 55: Combining Probabilities and Conditional Probability


Other Comments About Independence

The main utility of the property of independence among events is that it enables us to calculate joint probabilities by simply multiplying the marginal probabilities.

Thus, if A, B, C, D are independent, then P(ABCD) = P(A)P(B)P(C)P(D).

Many events, however, are not independent! When independence is assumed, you should have a good reason (obtained by examining the situation) why the events are independent.

A saying to keep in mind: The weather in Kenya tomorrow will be affected by the flutter of the wings of a monarch butterfly in Mexico today!

Page 56: Combining Probabilities and Conditional Probability


A Final Example

Consider the experiment of observing whether two people, Paul and Paula, who do not know each other and are in different places, get the flu in the next 6 months.

Let A be the event that Paul gets the flu, and B be the event that Paula gets the flu. Assume that A and B are independent, and furthermore, P(A) = 0.20 and P(B) = 0.05.

Want: P(at least one of them gets the flu) = P(A or B).

By the addition rule and independence: P(A or B) = P(A) + P(B) - P(AB) = P(A) + P(B) - P(A)P(B) = 0.20 + 0.05 - (0.20)(0.05) = 0.24.
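A quick numeric check of the final calculation (a sketch; the variable names are illustrative):

```python
# P(A): Paul gets the flu; P(B): Paula gets the flu (assumed independent).
p_paul = 0.20
p_paula = 0.05

# Addition rule combined with independence: P(AB) = P(A)P(B).
p_at_least_one = p_paul + p_paula - p_paul * p_paula
print(round(p_at_least_one, 2))  # 0.24
```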