Sums of independent random variables



This lecture discusses how to derive the distribution of the sum of two independent random variables. We explain first how to derive the distribution function of the sum and then how to derive its probability mass function (if the summands are discrete) or its probability density function (if the summands are continuous).

    Distribution function of a sum

The following proposition characterizes the distribution function of the sum in terms of the distribution functions of the two summands:

Proposition Let $X$ and $Y$ be two independent random variables and denote by $F_X$ and $F_Y$ their distribution functions. Let:
$$Z = X + Y$$
and denote the distribution function of $Z$ by $F_Z$. The following holds:
$$F_Z(z) = \operatorname{E}\!\left[F_X(z - Y)\right]$$
or:
$$F_Z(z) = \operatorname{E}\!\left[F_Y(z - X)\right]$$

Proof. By the law of total probability and the independence of $X$ and $Y$,
$$F_Z(z) = \operatorname{P}(X + Y \le z) = \operatorname{E}\!\left[\operatorname{P}(X \le z - Y \mid Y)\right] = \operatorname{E}\!\left[F_X(z - Y)\right].$$
The second formula is obtained by exchanging the roles of $X$ and $Y$.
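As a quick numerical illustration (not part of the original lecture), the formula $F_Z(z) = \operatorname{E}[F_X(z - Y)]$ can be checked by Monte Carlo simulation. The sketch below assumes NumPy and SciPy and, purely for illustration, takes $X$ and $Y$ to be standard normal, so that $Z$ is normal with variance 2 and the exact value is available for comparison.

```python
# Monte Carlo check of F_Z(z) = E[F_X(z - Y)].
# Illustrative assumption: X and Y are independent standard normals,
# so Z = X + Y is N(0, 2) and the exact CDF is known.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
z = 1.0
y_samples = rng.standard_normal(1_000_000)   # draws of Y
estimate = norm.cdf(z - y_samples).mean()     # sample mean of F_X(z - Y)
exact = norm.cdf(z, scale=np.sqrt(2))         # exact F_Z(z) for N(0, 2)
print(estimate, exact)                        # the two agree to about 3 decimals
```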

Example Let $X$ be a uniform random variable with support $R_X = [0, 1]$ and probability density function:
$$f_X(x) = \begin{cases} 1 & \text{if } x \in [0, 1] \\ 0 & \text{otherwise} \end{cases}$$
and $Y$ another uniform random variable, independent of $X$, with support $R_Y = [0, 1]$ and probability density function:
$$f_Y(y) = \begin{cases} 1 & \text{if } y \in [0, 1] \\ 0 & \text{otherwise} \end{cases}$$
The distribution function of $X$ is:
$$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ x & \text{if } 0 \le x \le 1 \\ 1 & \text{if } x > 1 \end{cases}$$


The distribution function of $Z = X + Y$ is:
$$F_Z(z) = \operatorname{E}\!\left[F_X(z - Y)\right] = \int_0^1 F_X(z - y)\, f_Y(y)\, dy = \int_0^1 F_X(z - y)\, dy$$
There are four cases to consider:

1. If $z < 0$, then $z - y < 0$ for all $y \in [0, 1]$ and:
$$F_Z(z) = \int_0^1 0\, dy = 0$$

2. If $0 \le z < 1$, then $F_X(z - y) = z - y$ for $y \le z$ and $F_X(z - y) = 0$ for $y > z$, so:
$$F_Z(z) = \int_0^z (z - y)\, dy = \frac{z^2}{2}$$

3. If $1 \le z < 2$, then $F_X(z - y) = 1$ for $y < z - 1$ and $F_X(z - y) = z - y$ for $y \ge z - 1$, so:
$$F_Z(z) = \int_0^{z-1} 1\, dy + \int_{z-1}^1 (z - y)\, dy = 2z - \frac{z^2}{2} - 1$$

4. If $z \ge 2$, then $z - y \ge 1$ for all $y \in [0, 1]$ and:
$$F_Z(z) = \int_0^1 1\, dy = 1$$


Combining these four possible cases, we obtain:
$$F_Z(z) = \begin{cases} 0 & \text{if } z < 0 \\ \dfrac{z^2}{2} & \text{if } 0 \le z < 1 \\ 2z - \dfrac{z^2}{2} - 1 & \text{if } 1 \le z < 2 \\ 1 & \text{if } z \ge 2 \end{cases}$$
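The piecewise expression above can be compared against an empirical distribution function obtained by simulation. The following sketch is an addition to the lecture and assumes NumPy.

```python
# Compare the piecewise CDF derived above with the empirical CDF of
# simulated draws of Z = X + Y, with X and Y independent uniform on [0, 1].
import numpy as np

def F_Z(z):
    if z < 0:
        return 0.0
    if z < 1:
        return z**2 / 2
    if z < 2:
        return 2*z - z**2/2 - 1
    return 1.0

rng = np.random.default_rng(1)
samples = rng.uniform(size=(500_000, 2)).sum(axis=1)   # draws of Z
for z in (0.3, 0.8, 1.2, 1.9):
    print(z, F_Z(z), (samples <= z).mean())            # formula vs. empirical CDF
```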

    Probability mass function of a sum

When the two summands are discrete random variables, the probability mass function of their sum can be derived as follows:

Proposition Let $X$ and $Y$ be two independent discrete random variables and denote by $p_X$ and $p_Y$ their respective probability mass functions and by $R_X$ and $R_Y$ their supports. Let:
$$Z = X + Y$$
and denote the probability mass function of $Z$ by $p_Z$. The following holds:
$$p_Z(z) = \sum_{y \in R_Y} p_X(z - y)\, p_Y(y)$$
or:
$$p_Z(z) = \sum_{x \in R_X} p_Y(z - x)\, p_X(x)$$

Proof. By the law of total probability and the independence of $X$ and $Y$,
$$p_Z(z) = \operatorname{P}(X + Y = z) = \sum_{y \in R_Y} \operatorname{P}(X = z - y,\, Y = y) = \sum_{y \in R_Y} p_X(z - y)\, p_Y(y).$$
The second formula is obtained by exchanging the roles of $X$ and $Y$.

    The two summations above are called convolutions (of two probability mass functions).
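A convolution of probability mass functions translates directly into code. The sketch below is an addition, not part of the lecture; it represents a pmf as a Python dictionary mapping each support point to its probability (a convention chosen here purely for convenience).

```python
# Convolution of two probability mass functions:
# p_Z(z) = sum over y in R_Y of p_X(z - y) * p_Y(y).
from collections import defaultdict

def pmf_convolve(p_X, p_Y):
    """Return the pmf of Z = X + Y for independent discrete X and Y,
    where each pmf is a dict {support point: probability}."""
    p_Z = defaultdict(float)
    for x, px in p_X.items():
        for y, py in p_Y.items():
            p_Z[x + y] += px * py   # independence: P(X = x, Y = y) = p_X(x) p_Y(y)
    return dict(p_Z)
```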

Example Let $X$ be a discrete random variable with support $R_X = \{0, 1\}$ and probability mass function:
$$p_X(x) = \begin{cases} 1/2 & \text{if } x = 0 \\ 1/2 & \text{if } x = 1 \\ 0 & \text{otherwise} \end{cases}$$
and $Y$ another discrete random variable, independent of $X$, with support $R_Y = \{0, 1\}$ and probability mass function:
$$p_Y(y) = \begin{cases} 1/2 & \text{if } y = 0 \\ 1/2 & \text{if } y = 1 \\ 0 & \text{otherwise} \end{cases}$$


Define
$$Z = X + Y$$
Its support is:
$$R_Z = \{0, 1, 2\}$$
The probability mass function of $Z$, evaluated at $z = 0$, is:
$$p_Z(0) = p_X(0)\, p_Y(0) + p_X(-1)\, p_Y(1) = \frac{1}{2} \cdot \frac{1}{2} + 0 = \frac{1}{4}$$
Evaluated at $z = 1$, it is:
$$p_Z(1) = p_X(1)\, p_Y(0) + p_X(0)\, p_Y(1) = \frac{1}{2} \cdot \frac{1}{2} + \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{2}$$
Evaluated at $z = 2$, it is:
$$p_Z(2) = p_X(2)\, p_Y(0) + p_X(1)\, p_Y(1) = 0 + \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}$$
Therefore, the probability mass function of $Z$ is:
$$p_Z(z) = \begin{cases} 1/4 & \text{if } z = 0 \\ 1/2 & \text{if } z = 1 \\ 1/4 & \text{if } z = 2 \\ 0 & \text{otherwise} \end{cases}$$
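The same result can be reproduced with the pmf_convolve sketch given earlier (again, an illustrative addition rather than part of the lecture):

```python
# Reproducing the example: X and Y independent, each uniform on {0, 1}.
p_X = {0: 0.5, 1: 0.5}
p_Y = {0: 0.5, 1: 0.5}
print(pmf_convolve(p_X, p_Y))   # {0: 0.25, 1: 0.5, 2: 0.25}
```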

    Probability density function of a sum

When the two summands are absolutely continuous random variables, the probability density function of their sum can be derived as follows:

Proposition Let $X$ and $Y$ be two independent absolutely continuous random variables and denote by $f_X$ and $f_Y$ their respective probability density functions. Let:
$$Z = X + Y$$
and denote the probability density function of $Z$ by $f_Z$. The following holds:
$$f_Z(z) = \int_{-\infty}^{+\infty} f_X(z - y)\, f_Y(y)\, dy$$


or:
$$f_Z(z) = \int_{-\infty}^{+\infty} f_Y(z - x)\, f_X(x)\, dx$$

Proof. Conditional on $Y = y$, independence implies that $Z = X + y$ has density $f_X(z - y)$; integrating this conditional density over the distribution of $Y$ gives
$$f_Z(z) = \int_{-\infty}^{+\infty} f_X(z - y)\, f_Y(y)\, dy.$$
The second formula is obtained by exchanging the roles of $X$ and $Y$.

    The two integrals above are called convolutions (of two probability density functions).
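The convolution integral can also be evaluated numerically when no closed form is convenient. The following sketch is an addition, assuming SciPy's quad routine; it computes $f_Z(z) = \int f_X(z - y) f_Y(y)\, dy$ for density functions passed in as Python callables.

```python
# Numerical convolution of two probability density functions:
# f_Z(z) = integral of f_X(z - y) * f_Y(y) dy.
import numpy as np
from scipy.integrate import quad

def pdf_convolve(f_X, f_Y, z, lower=-np.inf, upper=np.inf):
    """Approximate f_Z(z) by numerical integration; lower/upper can be
    tightened when the supports are known (e.g. 0 and z for nonnegative X, Y)."""
    value, _ = quad(lambda y: f_X(z - y) * f_Y(y), lower, upper)
    return value
```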

Example Let $X$ be an exponential random variable with support $R_X = [0, \infty)$ and probability density function:
$$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
and $Y$ another exponential random variable, independent of $X$, with support $R_Y = [0, \infty)$ and probability density function:
$$f_Y(y) = \begin{cases} \lambda e^{-\lambda y} & \text{if } y \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
Define:
$$Z = X + Y$$
The support of $Z$ is:
$$R_Z = [0, \infty)$$
When $z \in R_Z$, the probability density function of $Z$ is:
$$f_Z(z) = \int_{-\infty}^{+\infty} f_X(z - y)\, f_Y(y)\, dy = \int_0^z \lambda e^{-\lambda (z - y)}\, \lambda e^{-\lambda y}\, dy = \lambda^2 e^{-\lambda z} \int_0^z dy = \lambda^2 z\, e^{-\lambda z}$$
Therefore, the probability density function of $Z$ is:
$$f_Z(z) = \begin{cases} \lambda^2 z\, e^{-\lambda z} & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
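As a numerical check (an addition to the lecture), the pdf_convolve sketch above reproduces the density $\lambda^2 z e^{-\lambda z}$ just derived, here with the illustrative choice $\lambda = 2$:

```python
# Check the exponential example: the convolution of two exponential(lambda)
# densities should equal lambda**2 * z * exp(-lambda * z).
import numpy as np

lam = 2.0
f_exp = lambda t: lam * np.exp(-lam * t) if t >= 0 else 0.0
for z in (0.5, 1.0, 3.0):
    numeric = pdf_convolve(f_exp, f_exp, z, 0.0, z)   # integrand vanishes outside [0, z]
    exact = lam**2 * z * np.exp(-lam * z)
    print(z, numeric, exact)
```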



    More details

    Sum of n independent random variables

We have discussed above how to derive the distribution of the sum of two independent random variables. How do we derive the distribution of the sum of more than two mutually independent random variables? Suppose $X_1$, $X_2$, ..., $X_n$ are mutually independent random variables and let $Z$ be their sum:
$$Z = \sum_{i=1}^{n} X_i$$
The distribution of $Z$ can be derived recursively, using the results for sums of two random variables given above (a computational sketch follows the list):

1. first, define:
$$Y_2 = X_1 + X_2$$
and compute the distribution of $Y_2$;

2. then, define:
$$Y_3 = Y_2 + X_3$$
and compute the distribution of $Y_3$;

3. and so on, until the distribution of $Z$ can be computed from:
$$Z = Y_n = Y_{n-1} + X_n$$
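For discrete summands, the recursion in steps 1-3 is easy to carry out in code. The sketch below is an addition, reusing the pmf_convolve function introduced in the section on probability mass functions; it folds the pairwise convolution over a list of pmfs.

```python
# Recursive computation of the pmf of Z = X_1 + ... + X_n:
# convolve the pmfs pairwise, exactly as in steps 1-3 above.
from functools import reduce

def pmf_sum(pmfs):
    """pmfs: list of dicts, one pmf per summand X_1, ..., X_n."""
    return reduce(pmf_convolve, pmfs)

# Example: three independent fair coin indicators give a Binomial(3, 1/2) pmf.
print(pmf_sum([{0: 0.5, 1: 0.5}] * 3))
# {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```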

    Solved exercises

    Below you can find some exercises with explained solutions:

    1. Exercise set 1
