
Page 1: Functions  of  Random Variables

Functions of Random Variables

Chapter 3DeGroot & Schervish

Page 2: Functions  of  Random Variables

Functions of a Random Variable

The distribution of some function of X: suppose X is the rate at which customers are served in a queue; then 1/X is the average waiting time. If we have the distribution of X, we should be able to determine the distribution of 1/X, or of any other function of X.

Page 3: Functions  of  Random Variables

Random Variable with a Discrete Distribution

Distance from the Middle example: Let X have the uniform distribution on the integers 1, 2, . . . , 9. Suppose that we are interested in how far X is from the middle of the distribution, namely, 5. We could define Y = |X − 5| and compute probabilities such as Pr(Y = 1) = Pr(X ∈ {4, 6}) = 2/9.

Page 4: Functions  of  Random Variables

Function of a Discrete Random Variable

Let X have a discrete distribution with p.f. f, and let Y = r(X) for some function r defined on the set of possible values of X. For each possible value y of Y, the p.f. g of Y is

g(y) = Pr(Y = y) = Pr(r(X) = y) = Σ_{x: r(x)=y} f(x).

Page 5: Functions  of  Random Variables

Distance from the Middle

The possible values of Y in the previous example are 0, 1, 2, 3, and 4. We see that Y = 0 if and only if X = 5, so g(0) = f(5) = 1/9. For all other values of Y, there are two values of X that give that value of Y. For example, {Y = 4} = {X = 1} ∪ {X = 9}. So, g(y) = 2/9 for y = 1, 2, 3, 4.
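A minimal Python sketch of this calculation, applying the general formula g(y) = Σ_{x: r(x)=y} f(x) to the distance-from-the-middle example (the variable names are illustrative, not from the slides):

```python
from collections import defaultdict

# p.f. of X: uniform on the integers 1, ..., 9
f = {x: 1 / 9 for x in range(1, 10)}

def r(x):
    # the function of X from the example: distance from the middle
    return abs(x - 5)

# p.f. of Y = r(X): sum f(x) over all x with r(x) = y
g = defaultdict(float)
for x, px in f.items():
    g[r(x)] += px

print(dict(g))  # g(0) = 1/9 and g(y) = 2/9 for y = 1, 2, 3, 4
```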

Page 6: Functions  of  Random Variables

Random Variable with a Continuous Distribution

If a random variable X has a continuous distribution, then the procedure for deriving the probability distribution of a function of X differs from that given for a discrete distribution. One way to proceed is by direct calculation of the c.d.f., as in the next example.

Page 7: Functions  of  Random Variables

Average Waiting Time

Let Z be the rate at which customers are served in a queue, and suppose that Z has a continuous c.d.f. F. The average waiting time is Y = 1/Z. If we want to find the c.d.f. G of Y, we can write, for y > 0,

G(y) = Pr(Y ≤ y) = Pr(1/Z ≤ y) = Pr(Z ≥ 1/y) = 1 − F(1/y).
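A quick numerical check of this relation. The exponential choice for Z below is an assumption for illustration only; the slides leave F general:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption: Z ~ Exponential(1), so F(z) = 1 - exp(-z) for z > 0.
F = lambda z: 1.0 - np.exp(-z)

z = rng.exponential(scale=1.0, size=100_000)
y = 1.0 / z                        # average waiting time Y = 1/Z

# c.d.f. of Y derived on the slide: G(y) = 1 - F(1/y) for y > 0
G = lambda t: 1.0 - F(1.0 / t)

for t in (0.5, 1.0, 2.0):
    print(t, G(t), np.mean(y <= t))   # derived vs. empirical c.d.f.
```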

Page 8: Functions  of  Random Variables

Random Variable with a Continuous Distribution

In general, suppose that the p.d.f. of X is f and that another random variable is defined as Y = r(X). For each real number y, the c.d.f. G(y) of Y can be derived as follows:

G(y) = Pr(Y ≤ y) = Pr(r(X) ≤ y) = ∫_{ {x: r(x) ≤ y} } f(x) dx.

If the random variable Y also has a continuous distribution, its p.d.f. g can be obtained from the relation g(y) = dG(y)/dy wherever the derivative exists.

Page 9: Functions  of  Random Variables

Direct Derivation of the p.d.f.

Let r be a differentiable one-to-one function on the open interval (a, b). Then r is either strictly increasing or strictly decreasing. Because r is also continuous, it will map the interval (a, b) to another open interval (α, β), called the image of (a, b) under r. That is, for each x ∈ (a, b), r(x) ∈ (α, β), and for each y ∈ (α, β) there is an x ∈ (a, b) such that y = r(x), and this x is unique because r is one-to-one. So the inverse s of r will exist on the interval (α, β), meaning that for x ∈ (a, b) and y ∈ (α, β) we have r(x) = y if and only if s(y) = x.

Page 10: Functions  of  Random Variables

Theorem. Let X be a random variable for which the p.d.f. is f and for which Pr(a < X < b) = 1. Here, a and/or b can be either finite or infinite. Let Y = r(X), and suppose that r(x) is differentiable and one-to-one for a < x < b. Let (α, β) be the image of the interval (a, b) under the function r. Let s(y) be the inverse function of r(x) for α < y < β. Then the p.d.f. g of Y is

g(y) = f(s(y)) |ds(y)/dy|  for α < y < β,

and g(y) = 0 otherwise.
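A small sketch of the theorem in use. The uniform p.d.f. and the choice r(x) = exp(x) are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption: X ~ Uniform(0, 1) and Y = r(X) = exp(X).
# The inverse is s(y) = log(y), so ds/dy = 1/y, and the theorem gives
#   g(y) = f(s(y)) * |ds/dy| = 1/y   for 1 < y < e   (f is 1 on (0, 1)).
def g(y):
    return 1.0 / y

x = rng.uniform(size=200_000)
y = np.exp(x)

# sanity check: Pr(1.5 < Y < 2) should equal the integral of g over (1.5, 2)
exact = np.log(2.0) - np.log(1.5)          # integral of 1/y from 1.5 to 2
empirical = np.mean((y > 1.5) & (y < 2.0))
print(exact, empirical)                    # both are approximately 0.288
```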

Page 11: Functions  of  Random Variables

Proof. If r is increasing, then s is increasing, and for each y ∈ (α, β),

G(y) = Pr(Y ≤ y) = Pr(r(X) ≤ y) = Pr(X ≤ s(y)) = F(s(y)),

so g(y) = dG(y)/dy = f(s(y)) ds(y)/dy. Because s is increasing, ds(y)/dy is positive; hence, it equals |ds(y)/dy|, and this equation implies the theorem.

Similarly, if r is decreasing, then s is decreasing, and for each y ∈ (α, β),

G(y) = Pr(Y ≤ y) = Pr(r(X) ≤ y) = Pr(X ≥ s(y)) = 1 − F(s(y)),

so g(y) = dG(y)/dy = −f(s(y)) ds(y)/dy. Since s is strictly decreasing, ds(y)/dy is negative, so that −ds(y)/dy equals |ds(y)/dy|. It follows that the equation implies the theorem.

Page 12: Functions  of  Random Variables

The Probability Integral Transformation

Let X be a continuous random variable with p.d.f. f(x) = exp(−x) for x > 0 and 0 otherwise. The c.d.f. of X is F(x) = 1 − exp(−x) for x > 0 and 0 otherwise. If we let F be the function r, we can find the distribution of Y = F(X). The c.d.f. of Y is, for 0 < y < 1,

G(y) = Pr(Y ≤ y) = Pr(1 − exp(−X) ≤ y) = Pr(X ≤ −log(1 − y)) = F(−log(1 − y)) = y,

which is the c.d.f. of the uniform distribution on the interval [0, 1]. It follows that Y has the uniform distribution on the interval [0, 1].

Page 13: Functions  of  Random Variables

Theorem. Let X have a continuous c.d.f. F, and let Y = F(X). This transformation from X to Y is called the probability integral transformation. The distribution of Y is the uniform distribution on the interval [0, 1].

Page 14: Functions  of  Random Variables

Proof. First, because F is the c.d.f. of a random variable, 0 ≤ F(x) ≤ 1 for −∞ < x < ∞. Therefore, Pr(Y < 0) = Pr(Y > 1) = 0. Since F is continuous, the set of x such that F(x) = y is a nonempty closed and bounded interval [x0, x1] for each y in the interval (0, 1). Let F⁻¹(y) denote the lower endpoint x0 of this interval, which was called the y quantile of F. In this way, Y ≤ y if and only if X ≤ x1. Let G denote the c.d.f. of Y. Then

G(y) = Pr(Y ≤ y) = Pr(X ≤ x1) = F(x1) = y.

Hence, G(y) = y for 0 < y < 1. Because this function is the c.d.f. of the uniform distribution on the interval [0, 1], this uniform distribution is the distribution of Y.
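A simulation check of the probability integral transformation, using the exponential example from the earlier slide:

```python
import numpy as np

rng = np.random.default_rng(0)

# Example from the slides: f(x) = exp(-x) for x > 0, so F(x) = 1 - exp(-x).
x = rng.exponential(scale=1.0, size=100_000)
y = 1.0 - np.exp(-x)             # Y = F(X), the probability integral transformation

# If the theorem holds, Y is uniform on [0, 1]: Pr(Y <= t) should be about t.
for t in (0.1, 0.25, 0.5, 0.9):
    print(t, np.mean(y <= t))
```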

Page 15: Functions  of  Random Variables

Functions of Two or More Random Variables

When we observe data consisting of the values of several random variables, we need to summarize the observed values in order to be able to focus on the information in the data. Summarizing consists of constructing one or a few functions of the random variables. We now describe the techniques needed to determine the distribution of a function of two or more random variables.

Page 16: Functions  of  Random Variables

Random Variables with a Discrete Joint Distribution

Suppose that n random variables X1, . . . , Xn have a discrete joint distribution for which the joint p.f. is f, and that m functions Y1, . . . , Ym of these n random variables are defined as follows:

Y1 = r1(X1, . . . , Xn),
Y2 = r2(X1, . . . , Xn),
...
Ym = rm(X1, . . . , Xn).

Page 17: Functions  of  Random Variables

Random Variables with a Discrete Joint Distribution

For given values y1, . . . , ym of the m random variables Y1, . . . , Ym, let A denote the set of all points (x1, . . . , xn) such that

r1(x1, . . . , xn) = y1,
r2(x1, . . . , xn) = y2,
...
rm(x1, . . . , xn) = ym.

Then the value of the joint p.f. g of Y1, . . . , Ym is specified at the point (y1, . . . , ym) by the relation

g(y1, . . . , ym) = Σ_{(x1, . . . , xn) ∈ A} f(x1, . . . , xn).
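A short sketch of this summation. The two-dice joint distribution and the particular functions below are assumptions for illustration, not from the slides:

```python
from collections import defaultdict
from itertools import product

# Assumption: X1, X2 are two independent fair dice, so f(x1, x2) = 1/36.
f = {(x1, x2): 1 / 36 for x1, x2 in product(range(1, 7), repeat=2)}

# Two functions of (X1, X2): Y1 = X1 + X2 and Y2 = max(X1, X2).
g = defaultdict(float)
for (x1, x2), p in f.items():
    g[(x1 + x2, max(x1, x2))] += p     # sum f over the set A for each (y1, y2)

print(g[(7, 4)])    # Pr(Y1 = 7, Y2 = 4) = Pr({(3, 4), (4, 3)}) = 2/36
```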

Page 18: Functions  of  Random Variables

Random Variables with a Continuous Joint Distribution

Suppose that the joint p.d.f. of X = (X1, . . . , Xn) is f(x) and that Y = r(X) = r(X1, . . . , Xn). The c.d.f. of Y can be calculated as follows:

G(y) = Pr(Y ≤ y) = ∫ · · · ∫_{ {x: r(x) ≤ y} } f(x1, . . . , xn) dx1 · · · dxn.

If Y has a continuous distribution, then the derivative of G(y) gives the p.d.f. of Y: g(y) = dG(y)/dy.
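A quick illustration of this multivariate c.d.f. calculation. The uniform inputs and the sum r(x1, x2) = x1 + x2 are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption: X1, X2 independent Uniform(0, 1) and Y = r(X1, X2) = X1 + X2.
# For 0 <= y <= 1, the region {x: x1 + x2 <= y} inside the unit square is a
# triangle of area y**2 / 2, so G(y) = y**2 / 2 there.
def G(y):
    return y ** 2 / 2.0

x1, x2 = rng.uniform(size=(2, 200_000))
y = x1 + x2

for t in (0.25, 0.5, 0.75, 1.0):
    print(t, G(t), np.mean(y <= t))    # derived vs. empirical c.d.f.
```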

Page 19: Functions  of  Random Variables

Direct Transformation of a Multivariate p.d.f.

Let X1, . . . , Xn have a continuous joint distribution for which the joint p.d.f. is f. Assume that there is a subset S of R^n such that Pr[(X1, . . . , Xn) ∈ S] = 1. Define n new random variables Y1, . . . , Yn as follows:

Y1 = r1(X1, . . . , Xn),
Y2 = r2(X1, . . . , Xn),
...
Yn = rn(X1, . . . , Xn),

where we assume that the n functions r1, . . . , rn define a one-to-one differentiable transformation of S onto a subset T of R^n.

Page 20: Functions  of  Random Variables

Direct Transformation of a Multivariate p.d.f.

Let the inverse of this transformation be given as follows:

x1 = s1(y1, . . . , yn),
x2 = s2(y1, . . . , yn),
...
xn = sn(y1, . . . , yn).

Page 21: Functions  of  Random Variables

Direct Transformation of a Multivariate p.d.f.

Then the joint p.d.f. g of Y1, . . . , Yn is

g(y1, . . . , yn) = f(s1(y1, . . . , yn), . . . , sn(y1, . . . , yn)) |J|  for (y1, . . . , yn) ∈ T,

and g(y1, . . . , yn) = 0 otherwise, where J is the determinant of the matrix of partial derivatives ∂si/∂yj (i, j = 1, . . . , n) and |J| denotes the absolute value of that determinant. This determinant J is called the Jacobian of the transformation specified by the equations above.
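A worked sketch of the Jacobian formula. The uniform inputs and the sum/difference transformation below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption: X1, X2 independent Uniform(0, 1), Y1 = X1 + X2, Y2 = X1 - X2.
# The inverse transformation is
#   x1 = s1(y1, y2) = (y1 + y2) / 2,   x2 = s2(y1, y2) = (y1 - y2) / 2,
# so J = det [[1/2, 1/2], [1/2, -1/2]] = -1/2 and |J| = 1/2.  The theorem
# gives g(y1, y2) = f(s1, s2) * |J| = 1/2 on the image T of the unit square.
x1, x2 = rng.uniform(size=(2, 200_000))
y1, y2 = x1 + x2, x1 - x2

# Check on a small rectangle inside T, e.g. [0.9, 1.1] x [0.0, 0.2]:
# its probability should be (area) * 1/2 = 0.2 * 0.2 * 0.5 = 0.02.
inside = (y1 > 0.9) & (y1 < 1.1) & (y2 > 0.0) & (y2 < 0.2)
print(np.mean(inside))    # approximately 0.02
```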

Page 22: Functions  of  Random Variables

Linear Transformations

Let X = (X1, . . . , Xn) have a continuous joint distribution for which the joint p.d.f. is f. Define Y = (Y1, . . . , Yn) by

Y = AX,

where A is a nonsingular n × n matrix. Then Y has a continuous joint distribution with p.d.f.

g(y) = f(A⁻¹ y) / |det A|  for y ∈ R^n,

where A⁻¹ is the inverse of A.
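A simulation check of the linear-transformation formula. The standard normal inputs and the particular matrix A are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption: X = (X1, X2) has independent standard normal coordinates,
# and A is a fixed nonsingular 2 x 2 matrix (here with det A = 1).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

def f(x):                  # joint p.d.f. of X: product of two N(0, 1) densities
    return np.exp(-0.5 * np.sum(x ** 2)) / (2.0 * np.pi)

def g(y):                  # p.d.f. of Y = AX from the slide's formula
    return f(A_inv @ y) / abs(np.linalg.det(A))

# crude Monte-Carlo check: Pr(Y in a small box around y0) / (box area) ≈ g(y0)
x = rng.standard_normal((200_000, 2))
y = x @ A.T                                    # each row is A times the row of x
y0, h = np.array([1.0, 0.5]), 0.1
in_box = np.all(np.abs(y - y0) < h / 2, axis=1)
print(np.mean(in_box) / h**2, g(y0))           # the two numbers should be close
```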