ECE 6540, Lecture 01: Introduction and Review of Probability
Posted on 28-Apr-2022

TRANSCRIPT
Estimation Theory: Some Definitions
Definitions Question: What is a statistic? How do we define it?
Answer: A statistic is any function of sampled data.
The function must be independent of the data's underlying probability distribution.
Definitions Examples:
x ~ N(0, 1), y ~ N(0, 2)
Are these statistics?
x                        Yes!
x + y                    Yes!
x^2                      Yes!
xy − y ln(x + 2) + 3y    Yes!
E[x]                     No!
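As a rough illustration of the distinction (a sketch assuming Python with NumPy; the seed, sample sizes, and variable names are illustrative, not from the slides): every statistic below is computable from the sampled data alone, while E[x] needs the underlying distribution itself.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0)            # one draw of x ~ N(0, 1)
y = rng.normal(0.0, np.sqrt(2.0))   # one draw of y ~ N(0, 2); std = sqrt(variance)

# Each of these is a statistic: a function of the sampled data alone.
s1 = x
s2 = x + y
s3 = x**2

# E[x] is NOT a statistic: it requires the probability distribution,
# not just the data. From data we can only approximate it with a
# statistic such as the sample mean over many draws:
samples = rng.normal(0.0, 1.0, size=100_000)
sample_mean = samples.mean()        # close to (but not equal to) E[x] = 0
```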
Definitions Question: What is an estimator? How do we define it?
Answer: An estimator is a statistic that estimates a specific value.
Definitions Examples:
x ~ N(0, 1), y ~ N(0, 1)
A familiar statistic, (1/2)(x + y), is an estimator of what?
Is it a good estimator? Why or why not?
A less familiar statistic, (2/3)x + (1/3)y, is an estimator of what?
Is it a good estimator? Why or why not?
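A quick simulation makes the comparison concrete (a sketch assuming Python with NumPy; the seed and trial count are arbitrary): both statistics are unbiased estimators of the common mean, but (1/2)(x + y) has the smaller variance.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200_000

# x and y are independent draws from N(0, 1); both statistics below
# estimate the common mean (here 0).
x = rng.normal(0.0, 1.0, size=n_trials)
y = rng.normal(0.0, 1.0, size=n_trials)

est1 = 0.5 * x + 0.5 * y          # (1/2)(x + y)
est2 = (2/3) * x + (1/3) * y      # (2/3)x + (1/3)y

# Both are unbiased (sample mean ≈ 0), but their variances differ:
# var(est1) = 1/4 + 1/4 = 1/2,  var(est2) = 4/9 + 1/9 = 5/9.
print(est1.mean(), est1.var())    # ≈ 0, ≈ 0.5
print(est2.mean(), est2.var())    # ≈ 0, ≈ 0.556
```

Lower variance at the same bias is one common criterion for calling one estimator "better" than another.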
Definitions Question: What is estimation theory? How do we define it?
Answer: Estimation theory is the study of estimators and their properties.
Definitions Question: What is a [statistical] detector (not to be confused with a communications detector)?
Answer: (Warning: the definition is a bit fuzzy.) A detector is a statistic or process that determines the presence of a signal within noise.
A [hypothesis] test is a method for determining which distribution a detector (statistic) belongs to.
Definitions Example:
n ~ N(0, 1), x is any signal
Hypothesis Test
Null Hypothesis:        y = n
Alternative Hypothesis: y = x + n
What is a good detector?
Optimal detector: s = y^2
Optimal test: s > γ (the threshold γ is determined from a chi-squared distribution)
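The test can be sketched numerically (assuming Python with NumPy; the false-alarm rate alpha, the signal level x = 3.0, the seed, and the sample sizes are illustrative assumptions, not from the slides). Under the null hypothesis, s = y^2 is chi-squared with 1 degree of freedom, so the threshold is its (1 − alpha) quantile, here estimated by Monte Carlo rather than read from tables:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha = 0.05                       # desired false-alarm rate (assumed)

# Under the null hypothesis y = n ~ N(0, 1), the detector s = y^2
# follows a chi-squared distribution with 1 degree of freedom.
# Estimate the threshold gamma from Monte Carlo draws of the null.
null_draws = rng.normal(0.0, 1.0, size=500_000) ** 2
gamma = np.quantile(null_draws, 1 - alpha)   # ≈ 3.84 (chi-squared 95% point)

# Apply the test s > gamma to data generated under each hypothesis.
n = rng.normal(0.0, 1.0, size=100_000)
x = 3.0                                      # an assumed example signal level
false_alarms = np.mean(n**2 > gamma)         # ≈ alpha under the null
detections = np.mean((x + n)**2 > gamma)     # much higher than alpha
```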
Definitions Question: What is detection theory? How do we define it?
Answer: Detection theory is the study of detectors and their properties.
Definitions Question: In engineering, how do we define the "best" or "optimal" of something (e.g., the best estimator or the best detector)?
Answer: Trick question. "Best" and "optimal" are always based on some criteria that WE define.
Applications
Applications Question: What are some applications of estimation theory?
Applications: RADAR (Radio Detection And Ranging) / Sonar
Detection: Is there a reflection from an aircraft? (detection theory)
Estimation: How far away is the aircraft / what is its precise location? (estimation theory)
Related (Waveform Design): Can I design waveforms to make the above easier/harder?
Credit: https://en.wikipedia.org/wiki/Radar
Applications: Communications
Detection: Did I receive a message?
Estimation: What is the message?
Related (Coding): Can I design codes to make the above easier/harder?
Credit: http://www.ohlone.edu/instr/speech/longdesc-diagramcommunication.html
Applications: Estimation theory is pervasive in signal processing (estimation theory = applied statistics).
Most modern signal processing tools involve statistics:
• Array processing
• Compressive sensing (most proofs are probability based)
• Network science (probabilistic graphical models)
• Optimal filter design
• De-noising
• Tracking (e.g., the Kalman filter)
• Statistical modelling / analysis
Applications: Two examples
Gaussian random variable: What is an optimal estimate for the expected value?
Laplace random variable: What is an optimal estimate for the expected value?
Schedule
Schedule
Part 1: Classical Estimation Theory
• Minimum Mean Square Error Estimators
• Minimum Variance Unbiased Estimators
• Maximum Likelihood Estimators
Part 2: Bayesian Estimation Theory
• Maximum a Posteriori Estimators
• Minimax Estimators
Part 3: Detection Theory
• Neyman-Pearson Tests
• Generalized Maximum Likelihood Tests
Schedule Question: What is the difference between classical and Bayesian statistics? Why are these differences important?
Schedule: Quick warning! In the first few classes, I am going to throw a lot of information at you (some of which you may know, some of which you may not).
I do not expect you to retain everything 100%. My goal is to expose you to these concepts and make you more comfortable with them.
Probability Review (with some things you may not have seen)
Probability Review Quick Note: Notation differs everywhere.
We will try to stick with Kay's notation.
Probability Review Probability events:
• Probability of "event" A: Pr(A)
• Probability of "event" A AND B: Pr(A ∩ B) = Pr(A, B)
• Probability of "event" A OR B: Pr(A ∪ B)
• Probability of "event" A, given "event" B: Pr(A|B)

(Figure: a Venn diagram with universe Ω containing events A and B. For the conditional probability Pr(A|B), the universe shrinks to the new universe Ω = B.)
Probability Review EXAMPLE: Probability events (fair coin flips):
Event A = Coin 1 is heads; Event B = Coin 2 is heads
• Pr(A) = 1/2
• Pr(A ∩ B) = Pr(A, B) = 1/4
• Pr(A ∪ B) = 3/4
• Pr(A|B) = 1/2
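These four numbers are easy to verify by simulation (a sketch assuming Python with NumPy; the seed and trial count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Two independent fair coins; True = heads.
coin1 = rng.integers(0, 2, size=n).astype(bool)     # event A: coin 1 is heads
coin2 = rng.integers(0, 2, size=n).astype(bool)     # event B: coin 2 is heads

pr_a = coin1.mean()                                 # Pr(A)      ≈ 1/2
pr_a_and_b = (coin1 & coin2).mean()                 # Pr(A ∩ B)  ≈ 1/4
pr_a_or_b = (coin1 | coin2).mean()                  # Pr(A ∪ B)  ≈ 3/4
pr_a_given_b = (coin1 & coin2).sum() / coin2.sum()  # Pr(A|B)    ≈ 1/2
```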
Probability Review Relationships between probability events:
• Chain Rule: Pr(A ∩ B) = Pr(A) Pr(B|A) = Pr(B) Pr(A|B)
  (the conditional probability is weighted by the probability of being in the conditioning event)
• Bayes' Theorem: Pr(A|B) = Pr(A) Pr(B|A) / Pr(B)
  (derived from the chain rule)
• "OR" Rule: Pr(A ∪ B) = Pr(A) + Pr(B) − Pr(A ∩ B)
  (the subtracted term removes the double-counted overlap of A and B)
• Independent Events: Pr(A ∩ B) = Pr(A) Pr(B)  (equivalently, Pr(B) = Pr(B|A))
• Disjoint Events: Pr(A ∩ B) = 0
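A small numeric check of the chain rule and Bayes' theorem (a sketch assuming Python with NumPy; the die experiment and the two events are an assumed toy example, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Assumed toy experiment: roll a fair six-sided die.
# A = "roll is even", B = "roll is >= 4".
roll = rng.integers(1, 7, size=n)
a = roll % 2 == 0
b = roll >= 4

pr_a, pr_b = a.mean(), b.mean()                 # each ≈ 1/2
pr_ab = (a & b).mean()                          # rolls {4, 6}: ≈ 1/3
pr_b_given_a = (a & b).sum() / a.sum()

# Bayes' theorem built from the chain rule:
# Pr(A|B) = Pr(A) Pr(B|A) / Pr(B); true value here is 2/3.
pr_a_given_b = pr_a * pr_b_given_a / pr_b
```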
Probability Review Random Variables:
Continuous-valued random variable X
• Capital case (X) means random (this is the notation we will use)
• Lower case (x) means a fixed value
Probability density functions (PDF) with parameter θ — all three notations mean the same thing:
  p_θ(x),  p_{X,θ}(x),  p(x; θ)   (the last is Kay's notation)
Probability Density Functions and Cumulative Distribution Functions
Probability Review Probability Density Functions (PDF)
Definition: A valid PDF is any function p(x) that is both
• Non-negative: p(x) ≥ 0
• Unit area: ∫_−∞^∞ p(x) dx = 1
Credit: https://commons.wikimedia.org/wiki/File:Normal_Distribution_PDF.svg
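Both conditions can be checked numerically for a concrete density (a sketch assuming Python with NumPy; the standard Gaussian and the grid limits are assumed for illustration):

```python
import numpy as np

# Check the two PDF conditions numerically for an assumed example
# density: the standard Gaussian p(x) = exp(-x^2 / 2) / sqrt(2*pi).
x = np.linspace(-10.0, 10.0, 200_001)
dx = x[1] - x[0]
p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

non_negative = bool(np.all(p >= 0))   # condition 1: p(x) >= 0 everywhere
area = float(np.sum(p) * dx)          # condition 2: total area ≈ 1
print(non_negative, area)
```

The tails beyond ±10 are negligible for the Gaussian, so truncating the infinite integral there is harmless.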
Probability Review Cumulative Distribution Functions (CDF)
Definition: A valid CDF is any function F(x) that is both
• Monotonically increasing (non-decreasing)
• Normalized: F(−∞) = 0, F(∞) = 1
A CDF is obtained from a PDF as
  F(x; θ) = Pr(X ≤ x) = ∫_−∞^x p(t; θ) dt
Credit: https://en.wikipedia.org/wiki/Normal_distribution#/media/File:Normal_Distribution_CDF.svg
Probability Review Gaussian (or Normal) Random Variable X ~ N(μ, σ^2)
The PDF is also known as the "bell curve" (μ is the mean, σ^2 is the variance):
  p(x; μ, σ^2) = (1/√(2πσ^2)) exp(−(x − μ)^2 / (2σ^2))
Credit: https://commons.wikimedia.org/wiki/File:Normal_Distribution_PDF.svg
Probability Review Cumulative Distribution Function (CDF):
  F(x; θ) = Pr(X ≤ x) = ∫_−∞^x p(t; θ) dt
Credit: https://en.wikipedia.org/wiki/Normal_distribution#/media/File:Normal_Distribution_CDF.svg
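The CDF-from-PDF relation can be verified numerically (a sketch assuming Python; the grid resolution and the −10 stand-in for −∞ are assumptions). For the standard Gaussian the closed form is F(x) = (1 + erf(x/√2))/2:

```python
import math
import numpy as np

# Numerically integrate the standard Gaussian PDF from "minus infinity"
# (approximated here by -10) up to x, and compare with the closed-form
# CDF, F(x) = (1 + erf(x / sqrt(2))) / 2.
def gaussian_cdf_numeric(x, lo=-10.0, n=200_000):
    t = np.linspace(lo, x, n)
    p = np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)
    return np.sum(p) * (t[1] - t[0])   # simple Riemann sum

for x in (-1.0, 0.0, 1.96):
    closed_form = 0.5 * (1 + math.erf(x / math.sqrt(2)))
    print(x, gaussian_cdf_numeric(x), closed_form)   # the two agree closely
```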
Probability Review Example: Uniform Random Variable X ~ uniform(a, b)
  p(x; a, b) = 1/(b − a)  for a ≤ x ≤ b
             = 0          otherwise
Used in many applications.
Credit: http://www.epixanalytics.com/modelassist/CrystalBall/Model_Assist.htm#Distributions/Continuous_distributions/Uniform.htm
Probability Review Example: Beta Random Variable X ~ beta(α, β)
  p(x; α, β) = x^(α−1) (1 − x)^(β−1) / B(α, β)
where B(α, β) is the Beta function.
Used in control systems, population genetics, Bayesian inference.
Credit: https://en.wikipedia.org/wiki/Beta_distribution#/media/File:Beta_distribution_pdf.svg
Probability Review Example: Chi-squared Random Variable X ~ χ^2_N
  p(x; N) = x^(N/2 − 1) e^(−x/2) / (2^(N/2) Γ(N/2))  for x > 0
          = 0                                        otherwise
Used in detection theory.
Credit: https://en.wikipedia.org/wiki/Chi-squared_distribution#/media/File:Chi-square_pdf.svg
Probability Review Transformation of Random Variables (Common Transformations)
Let x and y be independent such that x ~ N(0, 1) and y ~ N(0, 1).
  x + y                 ~ Gaussian
  x^2                   ~ chi-squared
  (x + y)/√(x^2 + y^2)  ~ Student's-t
  x^2 / y^2             ~ F
  x / y                 ~ Cauchy
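A few of these transformations are easy to confirm by simulation (a sketch assuming Python with NumPy; seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
x = rng.normal(0.0, 1.0, size=n)
y = rng.normal(0.0, 1.0, size=n)

# x + y is Gaussian with mean 0 and variance 1 + 1 = 2.
s = x + y
print(s.mean(), s.var())     # mean ≈ 0, variance ≈ 2

# x^2 is chi-squared with 1 degree of freedom: mean 1, variance 2.
c = x**2
print(c.mean(), c.var())     # mean ≈ 1, variance ≈ 2

# x / y is Cauchy: heavy-tailed, so the sample median ≈ 0 is stable
# while the sample mean does not converge.
r = x / y
print(np.median(r))          # ≈ 0
```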
Expectations and Moments
Probability Review Random Variables
X <- usually denoted by a capital letter
Two types: discrete, continuous
Defined by a probability density function (PDF) and a cumulative distribution function (CDF)
For discrete-valued random variables, the PDF is replaced by a probability mass function (PMF)
Probability Review Expectation
  E[X] = ∫_−∞^∞ x p(x) dx
Expectation of a function:
  E[g(X)] = ∫_−∞^∞ g(x) p(x) dx
Expectation with an unknown parameter:
  E[X; θ] = ∫_−∞^∞ x p(x; θ) dx
Useful!! It can be easy to find the expectation of a function without finding the PDF of that function.
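The "useful" point can be demonstrated directly (a sketch assuming Python with NumPy; X ~ N(0, 1) and g(x) = x^2 are assumed examples): E[g(X)] is estimated from samples of X without ever deriving the PDF of g(X).

```python
import numpy as np

rng = np.random.default_rng(6)

# E[g(X)] can be estimated directly from samples of X; the PDF of
# g(X) is never needed. Assumed example: X ~ N(0, 1), g(x) = x^2.
x = rng.normal(0.0, 1.0, size=1_000_000)
e_g = np.mean(x**2)        # ≈ E[X^2] = 1 for a standard normal
```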
Probability Review Moments
Let X ~ N(μ, σ^2).
• Mean:       E[X] = ∫_−∞^∞ x p(x) dx
• 2nd moment: E[X^2] = ∫_−∞^∞ x^2 p(x) dx
• Variance:   E[(X − μ)^2] = ∫_−∞^∞ (x − μ)^2 p(x) dx
Note: In general, X^2, (X − E[X])^2, ... do not have the same PDF as X.
Probability Review Expectation Examples
Let
• X ~ N(μ, σ^2)
• g = g(X) = (X − μ)^2
Compute
• E[X; μ, σ^2] = μ
• E[g(X); μ, σ^2] = σ^2
• E[X + 10; μ, σ^2] = E[X; μ, σ^2] + 10 = μ + 10
• E[g(2X); μ, σ^2] = 4σ^2  (g taken about the mean of 2X: E[(2X − 2μ)^2] = 4 E[(X − μ)^2] = 4σ^2)
Probability Review Expectation Examples
Let
• X ~ exponential with parameter λ
• p(x; λ) = λ e^(−λx)  for x ≥ 0
          = 0          for x < 0
Compute E[X; λ]:
  E[X; λ] = ∫_0^∞ x p(x) dx = ∫_0^∞ x λ e^(−λx) dx
Integrating by parts:
  E[X; λ] = [−e^(−λx) (x + 1/λ)] evaluated from 0 to ∞
The limit as x → ∞ is 0 (L'Hospital's Rule), leaving the value at x = 0:
  E[X; λ] = 1/λ
Probability Review Expectation Examples
Let
• X ~ exponential with parameter λ
• p(x; λ) = λ e^(−λx)  for x ≥ 0
          = 0          for x < 0
Compute E[X^2; λ]:
  E[X^2; λ] = ∫_0^∞ x^2 p(x) dx = ∫_0^∞ x^2 λ e^(−λx) dx
Integrating by parts (twice):
  E[X^2; λ] = [−e^(−λx) (λ^2 x^2 + 2λx + 2)/λ^2] evaluated from 0 to ∞
The limit as x → ∞ is 0 (L'Hospital's Rule), leaving the value at x = 0:
  E[X^2; λ] = 2/λ^2
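Both exponential moments derived above check out numerically (a sketch assuming Python with NumPy; the rate λ = 2 and the seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(7)
lam = 2.0                                  # an assumed example rate

# Monte Carlo check of the exponential moments derived above:
# E[X] = 1/lam and E[X^2] = 2/lam^2.
x = rng.exponential(scale=1.0 / lam, size=2_000_000)
print(x.mean())           # ≈ 1/lam = 0.5
print((x**2).mean())      # ≈ 2/lam^2 = 0.5
```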
Two Random Variables (and their relationships)
Several Random Variables Joint PDFs of Random Variables
  p(x, y) — joint PDF of 2 random variables
Vector form: define the random vector W = [X, Y]^T and w = [x, y]^T, so that
  p(w) = p(x, y) — joint PDF of 2 random variables (short version)
Credit: https://en.wikipedia.org/wiki/Multivariate_normal_distribution#/media/File:MultivariateNormal.png
Several Random Variables Chain Rule for 2 Random Variables
  p(x, y) = p(x) p(y|x) = p(y) p(x|y)
If the random variables are independent:
  p(x, y) = p(x) p(y)
Several Random Variables Moments of two random variables X, Y
• Mean: E[X], E[Y]
• 2nd moment: E[X^2], E[Y^2]
• Variance: E[(X − μ_x)^2], E[(Y − μ_y)^2]
• Cross-correlation: E[XY]
• Covariance: E[(X − μ_x)(Y − μ_y)] = E[XY] − E[X] E[Y]
If X, Y are uncorrelated:
• Cross-correlation: E[XY] = E[X] E[Y]
• Covariance: E[(X − E[X])(Y − E[Y])] = 0
• Variance sum: var(X + Y) = var(X) + var(Y)
Important Note: independent implies uncorrelated, but uncorrelated does not imply independent.
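The final note deserves a concrete example (a sketch assuming Python with NumPy; the choice Y = X^2 is a classic textbook construction, not from the slides): X ~ N(0, 1) and Y = X^2 are uncorrelated yet clearly dependent.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 2_000_000

# Classic example: X ~ N(0, 1) and Y = X^2 are UNCORRELATED
# (E[XY] = E[X^3] = 0 = E[X] E[Y]) but clearly DEPENDENT,
# since Y is a deterministic function of X.
x = rng.normal(0.0, 1.0, size=n)
y = x**2

covariance = np.mean(x * y) - x.mean() * y.mean()   # ≈ 0
```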
Several Random Variables Example: Two Normal Distributions
Consider the two random variables X ~ N(0, 1) and Y ~ N(x, 1) (i.e., given X = x, Y is normal with mean x).
Compute the joint PDF p(x, y):
  p(x, y) = p(x) p(y|x)   (Chain Rule)
  p(x)    = (1/√(2π(1)^2)) exp(−x^2 / (2(1)^2))
  p(y|x)  = (1/√(2π(1)^2)) exp(−(y − x)^2 / (2(1)^2))
  p(x, y) = (1/2π) exp(−(x^2 + (y − x)^2)/2) = (1/2π) exp(−(2x^2 + y^2 − 2xy)/2)
In matrix form:
  p(x, y) = (1/2π) exp(−(1/2) [x y] [[2, −1], [−1, 1]] [x; y])
so
  [X; Y] ~ N([0; 0], Σ),  where  Σ = [[2, −1], [−1, 1]]^(−1) = [[1, 1], [1, 2]]
Several Random Variables Example: Two Normal Distributions (continued)
  [X; Y] ~ N([0; 0], [[1, 1], [1, 2]])
Question: How do we interpret this distribution?
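One way to interpret (and sanity-check) the result is by simulation (a sketch assuming Python with NumPy; seed and sample size arbitrary): drawing X first and then Y given X reproduces the covariance matrix [[1, 1], [1, 2]].

```python
import numpy as np

rng = np.random.default_rng(9)
n = 1_000_000

# Draw X ~ N(0, 1), then Y given X = x as N(x, 1), and check that the
# joint covariance matrix matches [[1, 1], [1, 2]].
x = rng.normal(0.0, 1.0, size=n)
y = x + rng.normal(0.0, 1.0, size=n)

cov = np.cov(np.vstack([x, y]))
print(cov)    # ≈ [[1, 1], [1, 2]]
```

Interpretation: Y inherits X's variability plus its own unit noise (var(Y) = 2), and the shared X term makes the two positively correlated (cov = 1).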