Slide 1: Chapter 4 Introduction to Probability
© 2005 Thomson/South-Western

■ Experiments, Counting Rules, and Assigning Probabilities
■ Events and Their Probability
■ Some Basic Relationships of Probability
■ Conditional Probability
■ Bayes’ Theorem

Slide 2: Probability as a Numerical Measure of the Likelihood of Occurrence

Probability is measured on a scale from 0 to 1, with increasing likelihood of occurrence as the value moves from 0 toward 1.
■ A probability near 0: the event is very unlikely to occur.
■ A probability of .5: the occurrence of the event is just as likely as it is unlikely.
■ A probability near 1: the event is almost certain to occur.

Slide 3: An Experiment and Its Sample Space

An experiment is any process that generates well-defined outcomes.
The sample space for an experiment is the set of all experimental outcomes.
An experimental outcome is also called a sample point.

Slide 4: Example: Bradley Investments

Bradley has invested in two stocks, Markley Oil and Collins Mining. Bradley has determined that the possible outcomes of these investments three months from now are as follows.

Investment Gain or Loss in 3 Months (in $000)
  Markley Oil:     10, 5, 0, −20
  Collins Mining:  8, −2

Slide 5: A Counting Rule for Multiple-Step Experiments

■ If an experiment consists of a sequence of k steps in which there are n1 possible results for the first step, n2 possible results for the second step, and so on, then the total number of experimental outcomes is given by (n1)(n2) . . . (nk).
■ A helpful graphical representation of a multiple-step experiment is a tree diagram.

Slide 6: A Counting Rule for Multiple-Step Experiments

Bradley Investments can be viewed as a two-step experiment. It involves two stocks, each with a set of experimental outcomes.
  Markley Oil:     n1 = 4
  Collins Mining:  n2 = 2
  Total Number of Experimental Outcomes: n1n2 = (4)(2) = 8
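
As a quick cross-check, the counting rule and the tree diagram on the next slide can be reproduced in a few lines. A minimal Python sketch; the outcome values are the Markley Oil and Collins Mining gains/losses (in $000) from the example above.

```python
from itertools import product

markley_oil = [10, 5, 0, -20]   # n1 = 4 possible results (in $000)
collins_mining = [8, -2]        # n2 = 2 possible results (in $000)

# Counting rule: (n1)(n2) = (4)(2) = 8 experimental outcomes
outcomes = list(product(markley_oil, collins_mining))
print(len(outcomes))            # 8

# Each outcome (m, c) corresponds to a total gain/loss of m + c thousand dollars
for m, c in outcomes:
    print((m, c), m + c)
```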

Slide 7: Tree Diagram

[Tree diagram: Markley Oil (Stage 1) branches to Gain 10, Gain 5, Even, and Lose 20; each branch then splits into the Collins Mining (Stage 2) outcomes Gain 8 and Lose 2.]

Experimental Outcomes:
  (10, 8)    Gain $18,000
  (10, −2)   Gain $8,000
  (5, 8)     Gain $13,000
  (5, −2)    Gain $3,000
  (0, 8)     Gain $8,000
  (0, −2)    Lose $2,000
  (−20, 8)   Lose $12,000
  (−20, −2)  Lose $22,000

Slide 8: Counting Rule for Combinations

A second useful counting rule enables us to count the number of experimental outcomes when n objects are to be selected from a set of N objects.

Number of Combinations of N Objects Taken n at a Time:
  C(N, n) = N! / [n!(N − n)!]
where: N! = N(N − 1)(N − 2) . . . (2)(1)
       n! = n(n − 1)(n − 2) . . . (2)(1)
       0! = 1
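
The combination count can be evaluated straight from the factorial definition or with Python's built-in math.comb. A short sketch with illustrative values (N = 5, n = 2 are not from the slides):

```python
from math import comb, factorial

N, n = 5, 2

# From the definition: N! / (n! (N - n)!)
by_definition = factorial(N) // (factorial(n) * factorial(N - n))

print(by_definition)   # 10
print(comb(N, n))      # 10, the same result from the standard library
```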

Slide 9: Counting Rule for Permutations

A third useful counting rule enables us to count the number of experimental outcomes when n objects are to be selected from a set of N objects, where the order of selection is important.

Number of Permutations of N Objects Taken n at a Time:
  P(N, n) = n! C(N, n) = N! / (N − n)!
where: N! = N(N − 1)(N − 2) . . . (2)(1)
       n! = n(n − 1)(n − 2) . . . (2)(1)
       0! = 1
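
A companion sketch for permutations, again with illustrative values: because order matters, selecting n = 2 of N = 5 objects yields 20 arrangements, twice the number of combinations.

```python
from math import perm, factorial

N, n = 5, 2

# From the definition: N! / (N - n)!
by_definition = factorial(N) // factorial(N - n)

print(by_definition)  # 20
print(perm(N, n))     # 20, the same result from the standard library
```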

Slide 10: Assigning Probabilities

■ Classical Method: assigning probabilities based on the assumption of equally likely outcomes.
■ Relative Frequency Method: assigning probabilities based on experimentation or historical data.
■ Subjective Method: assigning probabilities based on judgment.

Slide 11: Classical Method

If an experiment has n possible outcomes, this method would assign a probability of 1/n to each outcome.

Example
  Experiment: rolling a die
  Sample Space: S = {1, 2, 3, 4, 5, 6}
  Probabilities: each sample point has a 1/6 chance of occurring

Slide 12: Relative Frequency Method

■ Example: Lucas Tool Rental
Lucas Tool Rental would like to assign probabilities to the number of car polishers it rents each day. Office records show the following frequencies of daily rentals for the last 40 days.

  Number of Polishers Rented   Number of Days
  0                             4
  1                             6
  2                            18
  3                            10
  4                             2

Slide 13: Relative Frequency Method

Each probability assignment is given by dividing the frequency (number of days) by the total frequency (total number of days).

  Number of Polishers Rented   Number of Days   Probability
  0                             4               .10  (= 4/40)
  1                             6               .15
  2                            18               .45
  3                            10               .25
  4                             2               .05
  Total                        40              1.00
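
The relative frequency assignment is simply frequency divided by total frequency; a short Python sketch using the Lucas Tool Rental data from the table above:

```python
# Daily rental frequencies for the last 40 days (polishers rented -> number of days)
frequencies = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}

total_days = sum(frequencies.values())          # 40
probabilities = {x: days / total_days for x, days in frequencies.items()}

print(probabilities)                            # {0: 0.1, 1: 0.15, 2: 0.45, 3: 0.25, 4: 0.05}
print(round(sum(probabilities.values()), 2))    # 1.0
```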

Slide 14: Subjective Method

■ When economic conditions and a company’s circumstances change rapidly, it might be inappropriate to assign probabilities based solely on historical data.
■ We can use any data available as well as our experience and intuition, but ultimately a probability value should express our degree of belief that the experimental outcome will occur.
■ The best probability estimates often are obtained by combining the estimates from the classical or relative frequency approach with the subjective estimate.

Slide 15: Subjective Method

Applying the subjective method, an analyst made the following probability assignments.

  Exper. Outcome   Net Gain or Loss   Probability
  (10, 8)          $18,000 Gain        .20
  (10, −2)         $8,000 Gain         .08
  (5, 8)           $13,000 Gain        .16
  (5, −2)          $3,000 Gain         .26
  (0, 8)           $8,000 Gain         .10
  (0, −2)          $2,000 Loss         .12
  (−20, 8)         $12,000 Loss        .02
  (−20, −2)        $22,000 Loss        .06

Slide 16: Events and Their Probabilities

An event is a collection of sample points.
The probability of any event is equal to the sum of the probabilities of the sample points in the event.
If we can identify all the sample points of an experiment and assign a probability to each, we can compute the probability of an event.

Slide 17: Events and Their Probabilities

Event M = Markley Oil Profitable
M = {(10, 8), (10, −2), (5, 8), (5, −2)}
P(M) = P(10, 8) + P(10, −2) + P(5, 8) + P(5, −2)
     = .20 + .08 + .16 + .26
     = .70

Slide 18: Events and Their Probabilities

Event C = Collins Mining Profitable
C = {(10, 8), (5, 8), (0, 8), (−20, 8)}
P(C) = P(10, 8) + P(5, 8) + P(0, 8) + P(−20, 8)
     = .20 + .16 + .10 + .02
     = .48
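
Both event probabilities follow the same rule: sum the probabilities of the sample points in the event. A small Python sketch using the analyst's subjective assignments from Slide 15:

```python
# Subjective probabilities for the eight sample points (Markley Oil, Collins Mining)
p = {
    (10, 8): .20, (10, -2): .08, (5, 8): .16, (5, -2): .26,
    (0, 8): .10,  (0, -2): .12,  (-20, 8): .02, (-20, -2): .06,
}

def prob(event):
    """Probability of an event = sum of its sample-point probabilities."""
    return sum(p[point] for point in event)

M = [pt for pt in p if pt[0] > 0]   # Markley Oil profitable
C = [pt for pt in p if pt[1] > 0]   # Collins Mining profitable

print(round(prob(M), 2))  # 0.7
print(round(prob(C), 2))  # 0.48
```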

Slide 19: Some Basic Relationships of Probability

There are some basic probability relationships that can be used to compute the probability of an event without knowledge of all the sample point probabilities.
■ Complement of an Event
■ Union of Two Events
■ Intersection of Two Events
■ Mutually Exclusive Events

Slide 20: Complement of an Event

The complement of event A is defined to be the event consisting of all sample points that are not in A.
The complement of A is denoted by Ac.

[Venn diagram: Sample Space S containing Event A and its complement Ac.]

Slide 21: Union of Two Events

The union of events A and B is the event containing all sample points that are in A or B or both.
The union of events A and B is denoted by A ∪ B.

[Venn diagram: Sample Space S with overlapping Events A and B.]

Slide 22: Union of Two Events

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∪ C = Markley Oil Profitable or Collins Mining Profitable
M ∪ C = {(10, 8), (10, −2), (5, 8), (5, −2), (0, 8), (−20, 8)}
P(M ∪ C) = P(10, 8) + P(10, −2) + P(5, 8) + P(5, −2) + P(0, 8) + P(−20, 8)
         = .20 + .08 + .16 + .26 + .10 + .02
         = .82

Slide 23: Intersection of Two Events

The intersection of events A and B is the set of all sample points that are in both A and B.
The intersection of events A and B is denoted by A ∩ B.

[Venn diagram: Sample Space S with Events A and B; the shaded overlap is the intersection of A and B.]

Slide 24: Intersection of Two Events

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∩ C = Markley Oil Profitable and Collins Mining Profitable
M ∩ C = {(10, 8), (5, 8)}
P(M ∩ C) = P(10, 8) + P(5, 8)
         = .20 + .16
         = .36

Slide 25: Addition Law

The addition law provides a way to compute the probability of event A, or B, or both A and B occurring.
The law is written as:
  P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Slide 26: Addition Law

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∪ C = Markley Oil Profitable or Collins Mining Profitable
We know: P(M) = .70, P(C) = .48, P(M ∩ C) = .36
Thus: P(M ∪ C) = P(M) + P(C) − P(M ∩ C)
               = .70 + .48 − .36
               = .82
(This result is the same as that obtained earlier using the definition of the probability of an event.)
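
The addition law can be verified against the direct sample-point calculation; a self-contained Python sketch reusing the subjective assignments from Slide 15:

```python
# Sample-point probabilities as on Slide 15 (Markley Oil, Collins Mining)
p = {
    (10, 8): .20, (10, -2): .08, (5, 8): .16, (5, -2): .26,
    (0, 8): .10,  (0, -2): .12,  (-20, 8): .02, (-20, -2): .06,
}
M = {pt for pt in p if pt[0] > 0}   # Markley Oil profitable
C = {pt for pt in p if pt[1] > 0}   # Collins Mining profitable

def prob(event):
    return sum(p[pt] for pt in event)

# Direct calculation over the union versus the addition law
direct = prob(M | C)
addition_law = prob(M) + prob(C) - prob(M & C)   # P(M) + P(C) - P(M ∩ C)

print(round(direct, 2), round(addition_law, 2))  # 0.82 0.82
```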

Slide 27: Mutually Exclusive Events

Two events are said to be mutually exclusive if the events have no sample points in common.
Two events are mutually exclusive if, when one event occurs, the other cannot occur.

[Venn diagram: Sample Space S with non-overlapping Events A and B.]

Slide 28: Mutually Exclusive Events

If events A and B are mutually exclusive, P(A ∩ B) = 0.
The addition law for mutually exclusive events is:
  P(A ∪ B) = P(A) + P(B)
(There is no need to include the "− P(A ∩ B)" term.)

Slide 29: Conditional Probability

The probability of an event given that another event has occurred is called a conditional probability.
The conditional probability of A given B is denoted by P(A|B).
A conditional probability is computed as follows:
  P(A|B) = P(A ∩ B) / P(B)

Slide 30: Conditional Probability

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
P(C|M) = Collins Mining Profitable given Markley Oil Profitable
We know: P(M ∩ C) = .36, P(M) = .70
Thus: P(C|M) = P(C ∩ M) / P(M) = .36 / .70 = .5143
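
The same number drops straight out of the definition; a brief Python sketch using the values from the slides above:

```python
p_M_and_C = .36   # P(M ∩ C)
p_M = .70         # P(M)

p_C_given_M = p_M_and_C / p_M   # P(C|M) = P(M ∩ C) / P(M)
print(round(p_C_given_M, 4))    # 0.5143
```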

Slide 31: Multiplication Law

The multiplication law provides a way to compute the probability of the intersection of two events.
The law is written as:
  P(A ∩ B) = P(B)P(A|B)

Slide 32: Multiplication Law

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
M ∩ C = Markley Oil Profitable and Collins Mining Profitable
We know: P(M) = .70, P(C|M) = .5143
Thus: P(M ∩ C) = P(M)P(C|M)
               = (.70)(.5143)
               = .36
(This result is the same as that obtained earlier using the definition of the probability of an event.)

Slide 33: Independent Events

If the probability of event A is not changed by the existence of event B, we would say that events A and B are independent.
Two events A and B are independent if:
  P(A|B) = P(A)  or  P(B|A) = P(B)

Slide 34: Multiplication Law for Independent Events

The multiplication law also can be used as a test to see if two events are independent.
The law is written as:
  P(A ∩ B) = P(A)P(B)

Slide 35: Multiplication Law for Independent Events

Event M = Markley Oil Profitable
Event C = Collins Mining Profitable
Are events M and C independent?
Does P(M ∩ C) = P(M)P(C)?
We know: P(M ∩ C) = .36, P(M) = .70, P(C) = .48
But: P(M)P(C) = (.70)(.48) = .336, not .36
Hence: M and C are not independent.
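
The independence test on this slide is a one-line comparison; a small Python sketch using the probabilities above:

```python
import math

p_M, p_C, p_M_and_C = .70, .48, .36

# M and C are independent only if P(M ∩ C) = P(M)P(C)
independent = math.isclose(p_M_and_C, p_M * p_C)

print(round(p_M * p_C, 3))   # 0.336, not 0.36
print(independent)           # False -> M and C are not independent
```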

Slide 36: Bayes’ Theorem

■ Often we begin probability analysis with initial or prior probabilities.
■ Then, from a sample, special report, or a product test we obtain some additional information.
■ Given this information, we calculate revised or posterior probabilities.
■ Bayes’ theorem provides the means for revising the prior probabilities.

[Flow chart: Prior Probabilities → New Information → Application of Bayes’ Theorem → Posterior Probabilities]

Slide 37: Bayes’ Theorem

■ Example: L. S. Clothiers
A proposed shopping center will provide strong competition for downtown businesses like L. S. Clothiers. If the shopping center is built, the owner of L. S. Clothiers feels it would be best to relocate to the center.
The shopping center cannot be built unless a zoning change is approved by the town council. The planning board must first make a recommendation, for or against the zoning change, to the council.

Slide 38: Bayes’ Theorem

■ Prior Probabilities
Let: A1 = town council approves the zoning change
     A2 = town council disapproves the change
Using subjective judgment:
  P(A1) = .7, P(A2) = .3

Slide 39: Bayes’ Theorem

■ New Information
The planning board has recommended against the zoning change. Let B denote the event of a negative recommendation by the planning board.
Given that B has occurred, should L. S. Clothiers revise the probabilities that the town council will approve or disapprove the zoning change?

Slide 40: Bayes’ Theorem

■ Conditional Probabilities
Past history with the planning board and the town council indicates the following:
  P(B|A1) = .2    P(B|A2) = .9
Hence:
  P(Bc|A1) = .8   P(Bc|A2) = .1

Slide 41: Bayes’ Theorem

Tree Diagram (Town Council, then Planning Board, then Experimental Outcomes):

  P(A1) = .7   P(B|A1)  = .2   →   P(A1 ∩ B)  = .14
               P(Bc|A1) = .8   →   P(A1 ∩ Bc) = .56
  P(A2) = .3   P(B|A2)  = .9   →   P(A2 ∩ B)  = .27
               P(Bc|A2) = .1   →   P(A2 ∩ Bc) = .03

Slide 42: Bayes’ Theorem

■ To find the posterior probability that event Ai will occur given that event B has occurred, we apply Bayes’ theorem:

  P(Ai|B) = P(Ai)P(B|Ai) / [P(A1)P(B|A1) + P(A2)P(B|A2) + . . . + P(An)P(B|An)]

■ Bayes’ theorem is applicable when the events for which we want to compute posterior probabilities are mutually exclusive and their union is the entire sample space.

Slide 43: Bayes’ Theorem

■ Posterior Probabilities
Given the planning board’s recommendation not to approve the zoning change, we revise the prior probabilities as follows:

  P(A1|B) = P(A1)P(B|A1) / [P(A1)P(B|A1) + P(A2)P(B|A2)]
          = (.7)(.2) / [(.7)(.2) + (.3)(.9)]
          = .34
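
The posterior calculation generalizes to any number of mutually exclusive, exhaustive events; a Python sketch of Bayes’ theorem applied to the L. S. Clothiers priors and conditionals (function name is illustrative):

```python
def bayes_posterior(priors, likelihoods):
    """Posterior P(Ai|B) for mutually exclusive, exhaustive events Ai.

    priors[i]      = P(Ai)
    likelihoods[i] = P(B|Ai)
    """
    joints = [p * l for p, l in zip(priors, likelihoods)]   # P(Ai)P(B|Ai)
    p_b = sum(joints)                                        # P(B)
    return [j / p_b for j in joints]

# L. S. Clothiers: A1 = approve, A2 = disapprove; B = negative recommendation
posteriors = bayes_posterior(priors=[.7, .3], likelihoods=[.2, .9])
print([round(p, 4) for p in posteriors])   # [0.3415, 0.6585]
```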

Slide 44: Bayes’ Theorem

■ Conclusion
The planning board’s recommendation is good news for L. S. Clothiers. The posterior probability of the town council approving the zoning change is .34, compared to a prior probability of .70.

Slide 45: Tabular Approach

■ Step 1
Prepare the following three columns:
  Column 1 − The mutually exclusive events for which posterior probabilities are desired.
  Column 2 − The prior probabilities for the events.
  Column 3 − The conditional probabilities of the new information given each event.

Slide 46: Tabular Approach

  (1)      (2)             (3)             (4)   (5)
  Events   Prior           Conditional
  Ai       Probabilities   Probabilities
           P(Ai)           P(B|Ai)
  A1        .7              .2
  A2        .3              .9
           1.0

Slide 47: Tabular Approach

■ Step 2
Column 4
Compute the joint probabilities for each event and the new information B by using the multiplication law.
Multiply the prior probabilities in column 2 by the corresponding conditional probabilities in column 3. That is, P(Ai ∩ B) = P(Ai)P(B|Ai).

Slide 48: Tabular Approach

  (1)      (2)             (3)             (4)                (5)
  Events   Prior           Conditional     Joint
  Ai       Probabilities   Probabilities   Probabilities
           P(Ai)           P(B|Ai)         P(Ai ∩ B)
  A1        .7              .2              .14  (= .7 × .2)
  A2        .3              .9              .27
           1.0

Slide 49: Tabular Approach

■ Step 2 (continued)
We see that there is a .14 probability of the town council approving the zoning change and a negative recommendation by the planning board.
There is a .27 probability of the town council disapproving the zoning change and a negative recommendation by the planning board.

Slide 50: Tabular Approach

■ Step 3
Column 4
Sum the joint probabilities. The sum is the probability of the new information, P(B). The sum .14 + .27 shows an overall probability of .41 of a negative recommendation by the planning board.

Slide 51: Tabular Approach

  (1)      (2)             (3)             (4)                (5)
  Events   Prior           Conditional     Joint
  Ai       Probabilities   Probabilities   Probabilities
           P(Ai)           P(B|Ai)         P(Ai ∩ B)
  A1        .7              .2              .14
  A2        .3              .9              .27
           1.0                             P(B) = .41

Slide 52: Tabular Approach

■ Step 4
Column 5
Compute the posterior probabilities using the basic relationship of conditional probability:
  P(Ai|B) = P(Ai ∩ B) / P(B)
The joint probabilities P(Ai ∩ B) are in column 4 and the probability P(B) is the sum of column 4.

Slide 53: Tabular Approach

  (1)      (2)             (3)             (4)             (5)
  Events   Prior           Conditional     Joint           Posterior
  Ai       Probabilities   Probabilities   Probabilities   Probabilities
           P(Ai)           P(B|Ai)         P(Ai ∩ B)       P(Ai|B)
  A1        .7              .2              .14             .3415  (= .14/.41)
  A2        .3              .9              .27             .6585
           1.0                             P(B) = .41      1.0000
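
The four-step tabular procedure maps directly onto a short loop; a sketch in Python, with column names following the table above:

```python
# Column 1: events; Column 2: priors P(Ai); Column 3: conditionals P(B|Ai)
events = ["A1", "A2"]
prior = {"A1": .7, "A2": .3}
conditional = {"A1": .2, "A2": .9}

# Column 4: joint probabilities P(Ai ∩ B) = P(Ai)P(B|Ai), and their sum P(B)
joint = {a: prior[a] * conditional[a] for a in events}
p_b = sum(joint.values())                      # P(B) = .41

# Column 5: posterior probabilities P(Ai|B) = P(Ai ∩ B) / P(B)
posterior = {a: joint[a] / p_b for a in events}

for a in events:
    print(a, prior[a], conditional[a], round(joint[a], 2), round(posterior[a], 4))
# A1 0.7 0.2 0.14 0.3415
# A2 0.3 0.9 0.27 0.6585
```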

Slide 54: End of Chapter 4