Source: mimoza.marmara.edu.tr/~cem/se/st3.pdf (posted 13-Oct-2019)

Random Variables and Probability Distributions

• A capital letter, X, denotes a random variable.

• The corresponding small letter, x, denotes one of its values.

• Statistics is concerned with making inferences about populations and population characteristics.

• The assignment of 1 or 0 is arbitrary, though quite convenient.

• The random variable for which 0 and 1 are chosen to describe the two possible values is called a Bernoulli random variable.

• A statistical experiment is any process by which several chance observations are generated.

• S = {NNN, NND, NDN, DNN, NDD, DND, DDN, DDD}, where N denotes nondefective and D defective.

• Suppose we are concerned with the number of defectives.

• A random variable is a function that associates a real number with each element in the sample space.

• The random variable X assumes the value 2 for all elements in the subset E = {DDN, DND, NDD} of the sample space S.
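As a sketch, the sample space above and the event {X = 2} can be enumerated in Python (the names S, X, and E follow the slide's notation):

```python
from itertools import product

# Sample space for inspecting three computers: N = nondefective, D = defective.
S = ["".join(outcome) for outcome in product("ND", repeat=3)]

# The random variable X maps each outcome to its number of defectives.
X = {outcome: outcome.count("D") for outcome in S}

# The event {X = 2} is exactly the subset E = {DDN, DND, NDD}.
E = sorted(outcome for outcome, x in X.items() if x == 2)
print(E)  # ['DDN', 'DND', 'NDD']
```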

• If a sample space contains a finite number of possibilities, or an unending sequence with as many elements as there are whole numbers, it is called a discrete sample space.

• If a sample space contains an infinite number of possibilities, equal to the number of points on a line segment, it is called a continuous sample space.

• A random variable is called a discrete random variable if its set of possible outcomes is countable.

• But a random variable whose set of possible values is an entire interval of numbers is not discrete.

• When a random variable can take on values on a continuous scale, it is called a continuous random variable.

Let W be a random variable giving the number of heads minus the number of tails in three tosses of a coin. List the elements of the sample space S and, to each sample point, assign a value w of W.
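A sketch of this exercise in Python, enumerating S and tabulating the distribution of W under the assumption of a fair coin:

```python
from itertools import product
from fractions import Fraction

# Three tosses of a fair coin; W = (number of heads) - (number of tails).
S = ["".join(t) for t in product("HT", repeat=3)]
W = {s: s.count("H") - s.count("T") for s in S}

# Probability distribution of W: each of the 8 outcomes has probability 1/8.
pmf = {}
for s, w in W.items():
    pmf[w] = pmf.get(w, Fraction(0)) + Fraction(1, 8)

print({w: str(p) for w, p in sorted(pmf.items())})
# {-3: '1/8', -1: '3/8', 1: '3/8', 3: '1/8'}
```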

What is a cumulative probability distribution (CD)?

• A table of the probabilities cumulated over the events.

• The CD is a monotonically increasing set of numbers.

• The CD always ends at the highest value of 1.
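A small sketch of these properties, cumulating a hypothetical pmf (values chosen for illustration, not from the slides):

```python
from itertools import accumulate

# A small pmf over ordered values; cumulating it gives the CD.
values = [0, 1, 2]
probs = [0.25, 0.5, 0.25]
cd = list(accumulate(probs))
print(cd)  # [0.25, 0.75, 1.0]

# The CD is monotonically increasing and ends at 1.
assert all(a <= b for a, b in zip(cd, cd[1:]))
assert abs(cd[-1] - 1.0) < 1e-12
```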

Examples of a probability distribution (PD): Bernoulli Random Variables (RV)

• Bernoulli distribution: the outcome is either a “failure” (0) or a “success” (1).

• X is a Bernoulli RV when

• Pr(X = 0) = p “failure”

• Pr(X = 1) = (1 − p) “success”

• The Bernoulli distribution has one parameter, p.

The cumulative probability distribution (CD) of a Bernoulli Random Variable (RV)

• We need to know what p of the RV is. Say p = 0.35.

• The CD of this Bernoulli RV is:
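As a quick sketch, the CD of this Bernoulli RV can be tabulated in Python, following the slide's convention that Pr(X = 0) = p (failure) and Pr(X = 1) = 1 − p (success):

```python
# Bernoulli RV with the slide's convention: Pr(X = 0) = p, Pr(X = 1) = 1 - p.
p = 0.35

def bernoulli_cd(x, p):
    """CD F(x) = Pr(X <= x) for a Bernoulli RV."""
    if x < 0:
        return 0.0
    if x < 1:
        return p      # only the value 0 is <= x
    return 1.0        # both 0 and 1 are <= x

print(bernoulli_cd(0, p), bernoulli_cd(1, p))  # 0.35 1.0
```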

• X is a random variable whose values x are the possible numbers of defective computers purchased by the school.

• Then x can only take the numbers 0, 1, 2.

It is often helpful to look at a probability distribution in graphic form.

Continuous Probability Distributions

• A probability density function (PDF) is constructed so that the area under its curve is 1.

The CDF, F(x), is the area function of the PDF, obtained by integrating the PDF from negative infinity to an arbitrary value x.
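A minimal numerical sketch of this relationship, using an exponential PDF with rate lam = 2 as an illustrative choice (not from the slides):

```python
import math

# PDF of an exponential distribution with rate lam = 2.0; the PDF is 0 for t < 0.
lam = 2.0

def pdf(t):
    return lam * math.exp(-lam * t) if t >= 0 else 0.0

def cdf(x, n=100_000):
    """F(x) = integral of the PDF from -infinity to x (here: from 0 to x)."""
    if x <= 0:
        return 0.0
    h = x / n
    # Midpoint Riemann sum as a simple numerical integral.
    return sum(pdf((i + 0.5) * h) for i in range(n)) * h

# For this PDF the closed form is F(x) = 1 - exp(-lam * x); the two should agree.
print(cdf(1.0), 1 - math.exp(-lam * 1.0))
```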

Joint Probability Distributions

• X and Y are two discrete random variables.

• The probability distribution for their simultaneous occurrence is given by a function f(x, y) for any pair of values (x, y) within the range of the random variables X and Y.

• We refer to this function as the joint probability distribution of X and Y.

• In the discrete case, f(x, y) = P(X = x, Y = y); f(x, y) gives the probability that the outcomes x and y occur at the same time.
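As an illustration, a hypothetical joint pmf f(x, y) (values chosen for the example, not from the slides) can be stored as a table keyed by the pair (x, y):

```python
from fractions import Fraction

# A hypothetical joint pmf f(x, y) for two discrete RVs X and Y.
f = {
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8),
}

# f(x, y) = P(X = x, Y = y); the probabilities over all pairs must sum to 1.
assert sum(f.values()) == 1
print(f[(1, 1)])  # 3/8
```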

Joint Probability Distributions

• Sometimes we’re simultaneously interested in two or more variables in a random experiment.

• We’re looking for a relationship between the two variables.

• Examples for discrete r.v.’s:

• Year in college vs. number of credits taken

• Number of cigarettes smoked per day vs. Day of the week

• Examples for continuous r.v.’s:

• Time when bus driver picks you up vs. quantity of caffeine in bus driver’s system

• Dosage of a drug (ml) vs. Blood compound measure (percentage)

• The probability distribution g(x) of X alone is obtained by summing f(x, y) over the values of Y.

• The probability distribution h(y) of Y alone is obtained by summing f(x, y) over the values of X.

• We call g(x) and h(y) the marginal distributions of X and Y.

Example: roll two dice and let X be the value on the first die and T the total on both dice. Compute the marginal pmf of X and of T.
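The marginal pmfs asked for above can be computed by enumerating the 36 equally likely rolls; a sketch:

```python
from fractions import Fraction
from collections import defaultdict

# Joint pmf of X (first die) and T (total of both dice): 36 equally likely rolls.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(d1, d1 + d2)] += Fraction(1, 36)

# Marginal pmfs: sum the joint pmf over the other variable.
g = defaultdict(Fraction)  # marginal of X
h = defaultdict(Fraction)  # marginal of T
for (x, t), p in joint.items():
    g[x] += p
    h[t] += p

print(g[1], h[7])  # 1/6 1/6 -- each die value is equally likely; 7 is the modal total
```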
