ECE 6540, Lecture 01
TRANSCRIPT

ECE 6540, Lecture 01: Introduction and Review of Probability
Estimation Theory: Some Definitions
Definitions Question: What is a statistic? How do we define it?
Answer: A statistic is any function of sampled data.
The function must be independent of the data's underlying probability distribution.
Definitions Examples:
x ~ N(0,1), y ~ N(0,2)
Are these statistics?
- x                         Yes!
- x + y                     Yes!
- x²                        Yes!
- xy − y·ln(x + 2) + 3y     Yes!
- E[x]                      No! (an expectation depends on the underlying distribution)
Definitions Question: What is an estimator? How do we define it?
Answer: An estimator is a statistic that estimates a specific value.
Definitions Examples:
x ~ N(0,1), y ~ N(0,1)
A familiar statistic: (1/2)(x + y). It is an estimator of what?
Is it a good estimator? Why or why not?
Definitions Examples:
x ~ N(0,1), y ~ N(0,1)
A less familiar statistic: (2/3)x + (1/3)y. It is an estimator of what?
Is it a good estimator? Why or why not?
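The two statistics above can be compared numerically. This is a quick sketch (not part of the lecture, and assuming only plain Python): both weighted sums are unbiased estimators of the common mean, but their variances differ, since a weighted sum w₁x + w₂y of independent unit-variance variables has variance w₁² + w₂².

```python
import random

# Compare the two estimators of the common mean by Monte Carlo.
# For weights (w1, w2) with w1 + w2 = 1, the estimator is unbiased and its
# variance is w1^2 + w2^2 when var(x) = var(y) = 1.
def estimator_variance(w1, w2, n_trials=200_000, seed=0):
    rng = random.Random(seed)
    vals = [w1 * rng.gauss(0, 1) + w2 * rng.gauss(0, 1) for _ in range(n_trials)]
    mean = sum(vals) / n_trials
    return sum((v - mean) ** 2 for v in vals) / n_trials

var_familiar = estimator_variance(0.5, 0.5)      # analytic: 1/4 + 1/4 = 1/2
var_less = estimator_variance(2 / 3, 1 / 3)      # analytic: 4/9 + 1/9 = 5/9

print(var_familiar, var_less)
```

The equal-weight estimator has the smaller variance, which is one sense in which it is the "better" of the two.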
Definitions Question: What is estimation theory? How do we define it?
Answer: Estimation theory is the study of estimators and their properties.
Definitions Question: What is a [statistical] detector (not to be confused with a communications detector)?
Answer: (Warning: the definition is a bit fuzzy.) A detector is a statistic or process that determines the presence of a signal within noise.
A [hypothesis] test is a method for determining which distribution a detector (statistic) belongs to.
Definitions Example:
n ~ N(0,1), x is any signal
Hypothesis Test
  Null Hypothesis:        y = n
  Alternative Hypothesis: y = x + n
What is a good detector?
Optimal detector: s = y²
Optimal test: s > γ (the threshold γ is determined from a chi-squared distribution)
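The threshold step can be sketched numerically (not from the slides; plain Python, no SciPy assumed). Under the null hypothesis, s = y² with y ~ N(0,1) is chi-squared with 1 degree of freedom, so Pr(s > γ | H0) = Pr(|y| > √γ) = erfc(√(γ/2)), and the threshold for a desired false-alarm rate can be found by bisection:

```python
import math
import random

# Under H0, y ~ N(0,1) and s = y^2 ~ chi-squared(1), so the false-alarm
# probability of the test s > gamma is erfc(sqrt(gamma / 2)).
def false_alarm_prob(gamma):
    return math.erfc(math.sqrt(gamma / 2.0))

def threshold(alpha, lo=0.0, hi=100.0):
    # Bisect for the gamma whose false-alarm probability equals alpha.
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if false_alarm_prob(mid) > alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

gamma = threshold(0.05)   # about 3.84, the 95% point of chi-squared(1)

# Monte Carlo check of the false-alarm rate under H0.
rng = random.Random(1)
trials = 100_000
false_alarms = sum(1 for _ in range(trials) if rng.gauss(0, 1) ** 2 > gamma)
print(gamma, false_alarms / trials)
```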
Definitions Question: What is detection theory? How do we define it?
Answer: Detection theory is the study of detectors and their properties.
Definitions Question: In engineering, how do we define the "best" or "optimal" of something (e.g., the best estimator or the best detector)?
Answer: Trick question: "best" and "optimal" are always based on some criteria that WE define.
Applications
Question: What are some applications of estimation theory?
Applications RADAR (Radio Detection And Ranging) / Sonar
  Detection: Is there a reflection from an aircraft? (Detection Theory)
  Estimation: How far is the aircraft / what is its precise location? (Estimation Theory)
  Related (Waveform Design): Can I design waveforms to make the above easier/harder?
Credit: https://en.wikipedia.org/wiki/Radar
Applications Communications
  Detection: Did I receive a message?
  Estimation: What is the message?
  Related (Coding): Can I design codes to make the above easier/harder?
Credit: http://www.ohlone.edu/instr/speech/longdesc-diagramcommunication.html
Applications Estimation theory is pervasive in signal processing (estimation theory = applied statistics).
Most modern signal processing tools involve statistics:
- Array processing
- Compressive sensing (most proofs are probability based)
- Network science (probabilistic graphical models)
- Optimal filter design
- De-noising
- Tracking (e.g., Kalman filter)
- Statistical modelling / analysis
Applications Two examples:
- Gaussian random variable: What is an optimal estimate for the expected value?
- Laplace random variable: What is an optimal estimate for the expected value?
Schedule
Schedule Part 1: Classical Estimation Theory
  Minimum Mean Square Error Estimators
  Minimum Variance Unbiased Estimators
  Maximum Likelihood Estimators
Part 2: Bayesian Estimation Theory
  Maximum a Posteriori Estimators
  Minimax Estimators
Part 3: Detection Theory
  Neyman-Pearson Tests
  Generalized Maximum Likelihood Tests
Schedule Question: What is the difference between classical and Bayesian statistics? Why are these differences important?
Schedule Quick warning! In the first few classes, I am going to throw a lot of information at you (some of which you may know, some of which you may not).
I do not expect you to retain everything 100%. My goal is to expose you to these concepts and make you more comfortable with them.
Probability Review (with some things you may not have seen)
Probability Review Quick Note: Notation is different everywhere; we will try to stick with Kay's notation.
Probability Review Probability events:
- Probability of "event" A:                  Pr(A)
- Probability of "event" A AND B:            Pr(A ∩ B) = Pr(A, B)
- Probability of "event" A OR B:             Pr(A ∪ B)
- Probability of "event" A, given "event" B: Pr(A|B)
(Figure: Venn diagram of the universe Ω containing events A and B.)
Probability Review Conditioning: for Pr(A|B), the new universe becomes Ω = B.
Probability Review EXAMPLE: Probability events (two fair coin flips):
Event A: Coin 1 is heads; Event B: Coin 2 is heads
- Pr(A) = 1/2
- Pr(A ∩ B) = Pr(A, B) = 1/4
- Pr(A ∪ B) = 3/4
- Pr(A|B) = 1/2
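The four values above can be checked by enumerating the sample space. A minimal sketch (not from the slides), using exact fractions:

```python
from fractions import Fraction
from itertools import product

# Sample space of two fair coin flips: {HH, HT, TH, TT}, all equally likely.
space = list(product(["H", "T"], repeat=2))

def pr(event):
    # Probability of an event = (# outcomes satisfying it) / (# outcomes).
    return Fraction(sum(1 for s in space if event(s)), len(space))

pr_A = pr(lambda s: s[0] == "H")                          # Coin 1 is heads
pr_B = pr(lambda s: s[1] == "H")                          # Coin 2 is heads
pr_A_and_B = pr(lambda s: s[0] == "H" and s[1] == "H")
pr_A_or_B = pr(lambda s: s[0] == "H" or s[1] == "H")
pr_A_given_B = pr_A_and_B / pr_B                          # conditioning on B

print(pr_A, pr_A_and_B, pr_A_or_B, pr_A_given_B)
```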
Probability Review Relationships between probability events:
- Chain Rule:     Pr(A ∩ B) = Pr(A) Pr(B|A) = Pr(B) Pr(A|B)
- Bayes' Theorem: Pr(A|B) = Pr(A) Pr(B|A) / Pr(B)
- "OR" Rule:      Pr(A ∪ B) = Pr(A) + Pr(B) − Pr(A ∩ B)
- Independent Events: Pr(A ∩ B) = Pr(A) Pr(B)
- Disjoint Events:    Pr(A ∩ B) = 0
(Slide annotations on the rules above: in the Chain Rule, Pr(A|B) is weighted by the probability of being in the new universe Ω = B; Bayes' Theorem is derived from the Chain Rule; the "OR" Rule removes the double-counted overlap Pr(A ∩ B); for independent events, Pr(B) = Pr(B|A).)
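These identities can be verified on any small discrete joint distribution. The sketch below (not from the slides) uses an arbitrary hypothetical distribution over the four combinations of A and B, chosen so the events are neither independent nor disjoint:

```python
from fractions import Fraction

# An arbitrary joint distribution over (A, B):
# Pr(A∩B), Pr(A∩B'), Pr(A'∩B), Pr(A'∩B') -- these sum to 1.
p_ab, p_anb, p_nab, p_nanb = map(Fraction, ("1/6", "1/3", "1/4", "1/4"))

pr_A = p_ab + p_anb            # marginalize over B
pr_B = p_ab + p_nab            # marginalize over A
pr_B_given_A = p_ab / pr_A
pr_A_given_B = p_ab / pr_B

# Chain Rule: Pr(A ∩ B) = Pr(A) Pr(B|A) = Pr(B) Pr(A|B)
assert p_ab == pr_A * pr_B_given_A == pr_B * pr_A_given_B
# Bayes' Theorem: Pr(A|B) = Pr(A) Pr(B|A) / Pr(B)
assert pr_A_given_B == pr_A * pr_B_given_A / pr_B
# "OR" Rule: Pr(A ∪ B) = Pr(A) + Pr(B) − Pr(A ∩ B)
assert p_ab + p_anb + p_nab == pr_A + pr_B - p_ab

print(pr_A, pr_B, pr_A_given_B)
```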
Probability Review Random Variables:
Continuous-valued random variable X
- Upper-case (X) means random (this is the notation we will use)
- Lower-case (x) means a fixed value
Probability Density Functions (PDF) with parameter θ:
- p_θ(x)
- p_{X,θ}(x)
- p(x; θ)   <- Kay's notation
All three notations mean the same thing!
Probability Density Functions and Cumulative Distribution Functions
Probability Review Probability Density Function (PDF) Definition: A valid PDF is any function p(x) that is both
- Non-negative: p(x) ≥ 0
- Unit area: ∫_{−∞}^{∞} p(x) dx = 1
Credit: https://commons.wikimedia.org/wiki/File:Normal_Distribution_PDF.svg
Probability Review Cumulative Distribution Function (CDF) Definition: A valid CDF is any function F(x) that is both
- Monotonically increasing (non-decreasing)
- Normalized: F(−∞) = 0, F(∞) = 1
A CDF is obtained from a PDF as
  F(x; θ) = Pr(X ≤ x) = ∫_{−∞}^{x} p(t; θ) dt
Credit: https://en.wikipedia.org/wiki/Normal_distribution#/media/File:Normal_Distribution_CDF.svg
Probability Review Gaussian (or Normal) Random Variable X ~ N(μ, σ²)
The PDF is also known as the "bell curve":
  p(x; μ, σ²) = (1/√(2πσ²)) exp( −(x − μ)² / (2σ²) )
(μ is the mean, σ² is the variance)
Credit: https://commons.wikimedia.org/wiki/File:Normal_Distribution_PDF.svg
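The bell-curve formula can be coded directly and checked against the two PDF requirements above. A small sketch (not from the slides; plain Python):

```python
import math

# Gaussian PDF p(x; mu, sigma^2), written directly from the formula.
def gauss_pdf(x, mu, sigma2):
    return math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

# Peak height of the standard normal is 1/sqrt(2*pi) ~ 0.3989.
peak = gauss_pdf(0.0, 0.0, 1.0)

# Unit-area check by a Riemann sum over [-8, 8] (tails beyond are negligible).
steps = 10_000
dx = 16 / steps
area = sum(gauss_pdf(-8 + k * dx, 0.0, 1.0) for k in range(steps)) * dx

print(peak, area)
```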
Probability Review Cumulative Distribution Function (CDF):
  F(x; θ) = Pr(X ≤ x) = ∫_{−∞}^{x} p(t; θ) dt
Credit: https://en.wikipedia.org/wiki/Normal_distribution#/media/File:Normal_Distribution_CDF.svg
Probability Review Example: Uniform Random Variable X ~ uniform(a, b)
  p(x; a, b) = 1/(b − a)  for a ≤ x ≤ b
             = 0          otherwise
Used for many applications.
Credit: http://www.epixanalytics.com/modelassist/CrystalBall/Model_Assist.htm#Distributions/Continuous_distributions/Uniform.htm
Probability Review Example: Beta Random Variable X ~ beta(α, β)
  p(x; α, β) = (1/B(α, β)) x^(α−1) (1 − x)^(β−1)
(B(α, β) is the Beta function)
Used in control systems, population genetics, and Bayesian inference.
Credit: https://en.wikipedia.org/wiki/Beta_distribution#/media/File:Beta_distribution_pdf.svg
Probability Review Example: Chi-squared Random Variable X ~ χ²_N
  p(x; N) = x^(N/2 − 1) e^(−x/2) / (2^(N/2) Γ(N/2))  for x > 0
          = 0                                         otherwise
Used in detection theory.
Credit: https://en.wikipedia.org/wiki/Chi-squared_distribution#/media/File:Chi-square_pdf.svg
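The chi-squared PDF is easy to code from the formula above (a sketch, not from the slides). Two sanity checks: for N = 2 it reduces to p(x; 2) = e^(−x/2)/2, and the mean of a chi-squared(N) variable is N:

```python
import math

# Chi-squared PDF p(x; N), using math.gamma for the Gamma function.
def chi2_pdf(x, n):
    if x <= 0:
        return 0.0
    return x ** (n / 2 - 1) * math.exp(-x / 2) / (2 ** (n / 2) * math.gamma(n / 2))

# Check 1: for N = 2 the PDF reduces to e^(-x/2) / 2.
p1 = chi2_pdf(1.0, 2)

# Check 2: the mean of chi-squared(4) is 4 (midpoint-rule integral over [0, 40]).
steps = 100_000
dx = 40 / steps
mean4 = sum((k * dx + dx / 2) * chi2_pdf(k * dx + dx / 2, 4) for k in range(steps)) * dx

print(p1, mean4)
```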
Probability Review Transformation of Random Variables (Common Transformations)
Let x and y be independent such that x ~ N(0,1) and y ~ N(0,1). Then:
- x + y               ~ Normal
- x²                  ~ Chi-squared
- (x + y)/√(x² + y²)  ~ Student's t
- x²/y²               ~ F
- x/y                 ~ Cauchy
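Two of these transformations can be sanity-checked by Monte Carlo (a sketch, not from the slides): x + y should be Normal with variance 2, and x² should be chi-squared(1) with mean 1.

```python
import random

# Draw independent standard normal samples.
rng = random.Random(2)
n = 200_000
xs = [rng.gauss(0, 1) for _ in range(n)]
ys = [rng.gauss(0, 1) for _ in range(n)]

# x + y ~ Normal(0, 2): the sample second moment should be near 2.
var_sum = sum((x + y) ** 2 for x, y in zip(xs, ys)) / n

# x^2 ~ chi-squared(1): the sample mean should be near 1.
mean_sq = sum(x * x for x in xs) / n

print(var_sum, mean_sq)
```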
Expectations and Moments
Probability Review Random Variables X <- denoted by an upper-case letter, usually
Two types: discrete and continuous
Defined by a probability density function (PDF) and cumulative distribution function (CDF)
For discrete-valued random variables, the PDF is replaced by a probability mass function (PMF)
Probability Review Expectation
  E[X] = ∫_{−∞}^{∞} x p(x) dx
Expectation of a function
  E[g(X)] = ∫_{−∞}^{∞} g(x) p(x) dx
Expectation with an unknown parameter
  E[g(X); θ] = ∫_{−∞}^{∞} g(x) p(x; θ) dx
Useful!! It can be easy to find the expectation of a function without finding the PDF of g(X).
Probability Review Moments
Example: X ~ N(μ, σ²), g = g(X) = (X − μ)²
- Mean:       E[X] = ∫_{−∞}^{∞} x p(x) dx
- 2nd Moment: E[X²] = ∫_{−∞}^{∞} x² p(x) dx
- Variance:   E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² p(x) dx
Note: In general, X², (X − E[X])², … do not have the same PDF as X.
Probability Review Expectation Examples
Let
- X ~ N(μ, σ²)
- g = g(X) = (X − μ)²
Compute
- E[X; μ, σ²] = μ
- E[g(X); μ, σ²] = σ²
- E[X + 10; μ, σ²] = E[X; μ, σ²] + 10 = μ + 10
- E[g(2X); μ, σ²] = 4σ²   (interpreting g(2X) as (2X − E[2X])²)
Probability Review Expectation Examples
Let X ~ exponential(λ):
  p(x; λ) = λ e^(−λx)  for x ≥ 0
          = 0          for x < 0
Compute E[X; λ]:
  E[X; λ] = ∫_0^∞ x p(x) dx = ∫_0^∞ x λ e^(−λx) dx
          = [ −e^(−λx) (λx + 1)/λ ]_0^∞
          = −e^(−∞)(λ·∞ + 1)/λ + e^(−0)·(1/λ)    (the first term → 0 by L'Hospital's Rule)
  E[X; λ] = 1/λ
Probability Review Expectation Examples
Let X ~ exponential(λ):
  p(x; λ) = λ e^(−λx)  for x ≥ 0
          = 0          for x < 0
Compute E[X²; λ]:
  E[X²; λ] = ∫_0^∞ x² p(x) dx = ∫_0^∞ x² λ e^(−λx) dx
           = [ −e^(−λx) (λ²x² + 2λx + 2)/λ² ]_0^∞
           = 0 + e^(−0)·(2/λ²)    (the first term → 0 by L'Hospital's Rule)
  E[X²; λ] = 2/λ²
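Both exponential-distribution results, E[X; λ] = 1/λ and E[X²; λ] = 2/λ², can be confirmed by numerical integration. A sketch (not from the slides), using λ = 2 as an arbitrary choice:

```python
import math

lam = 2.0  # arbitrary rate parameter for the check

def pdf(x):
    # Exponential PDF p(x; lambda) = lambda * exp(-lambda * x) for x >= 0.
    return lam * math.exp(-lam * x)

# Riemann sums over [0, 20]; the tail beyond 20 is negligible for lam = 2.
steps = 200_000
dx = 20 / steps
xs = [k * dx for k in range(steps)]
mean = sum(x * pdf(x) for x in xs) * dx            # should be ~ 1/lam = 0.5
second = sum(x * x * pdf(x) for x in xs) * dx      # should be ~ 2/lam^2 = 0.5

print(mean, second)
```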
Two Random Variables (and their relationships)
Several Random Variables Joint PDFs of Random Variables
  p(x, y)   <- joint PDF of 2 random variables
Vector form: define W = [X Y]ᵀ and w = [x y]ᵀ, so that
  p(w) = p(x, y)   <- joint PDF of 2 random variables (short version)
Credit: https://en.wikipedia.org/wiki/Multivariate_normal_distribution#/media/File:MultivariateNormal.png
Several Random Variables Chain Rule for 2 Random Variables
  p(x, y) = p(x) p(y|x) = p(y) p(x|y)
If the random variables are independent:
  p(x, y) = p(x) p(y)
Several Random Variables Moments of two random variables X, Y
- Mean:        E[X], E[Y]
- 2nd Moment:  E[X²], E[Y²]
- Variance:    E[(X − μ_x)²], E[(Y − μ_y)²]
- Cross-Correlation: E[XY]
- Covariance:  E[(X − μ_x)(Y − μ_y)] = E[XY] − E[X] E[Y]
If X, Y are uncorrelated:
- Cross-Correlation: E[XY] = E[X] E[Y]
- Covariance:  E[(X − E[X])(Y − E[Y])] = 0
- Variance sum: var(X + Y) = var(X) + var(Y)
Important Note: Independence implies uncorrelated, but uncorrelated does not imply independence.
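The Important Note can be illustrated with the classic example X ~ N(0,1), Y = X² (not from the slides): Y is a deterministic function of X, so they are clearly dependent, yet cov(X, Y) = E[X³] − E[X] E[X²] = 0 by symmetry.

```python
import random

# X ~ N(0,1) and Y = X^2 are dependent but uncorrelated.
rng = random.Random(3)
n = 200_000
xs = [rng.gauss(0, 1) for _ in range(n)]
ys = [x * x for x in xs]

mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

print(cov)   # near 0 even though Y is a deterministic function of X
```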
Several Random Variables Example: Two Normal Distributions
Consider the two random variables X ~ N(0,1), Y|X = x ~ N(x, 1).
Compute the joint PDF p(x, y):
  p(x, y) = p(x) p(y|x)    (Chain Rule)
  p(x)   = (1/√(2π(1)²)) exp( −x² / (2(1)²) )
  p(y|x) = (1/√(2π(1)²)) exp( −(y − x)² / (2(1)²) )
  p(x, y) = (1/(2π)) exp( −(x² + (y − x)²)/2 )
          = (1/(2π)) exp( −(2x² + y² − 2xy)/2 )
          = (1/(2π)) exp( −(1/2) [x y] [ 2 −1; −1 1 ] [x y]ᵀ )
So
  [X Y]ᵀ ~ N( [0 0]ᵀ, [ 2 −1; −1 1 ]⁻¹ ),  with [ 2 −1; −1 1 ]⁻¹ = [ 1 1; 1 2 ]
Several Random Variables Example: Two Normal Distributions
Consider the two random variables X ~ N(0,1), Y|X = x ~ N(x, 1).
The joint PDF gives
  [X Y]ᵀ ~ N( [0 0]ᵀ, [ 1 1; 1 2 ] )
Question: How do we interpret this distribution?
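One way to build intuition is to sample the pair exactly as the chain rule describes (draw X, then draw Y around X) and check that the sample covariance matches the matrix [[1, 1], [1, 2]]. A sketch, not from the slides:

```python
import random

# Sample (X, Y) via the chain rule: X ~ N(0,1), then Y | X = x ~ N(x, 1).
rng = random.Random(4)
n = 200_000
pairs = []
for _ in range(n):
    x = rng.gauss(0, 1)
    y = rng.gauss(x, 1)          # conditional draw centered at x
    pairs.append((x, y))

def cov(i, j):
    # Sample covariance between coordinates i and j of the pairs.
    mi = sum(p[i] for p in pairs) / n
    mj = sum(p[j] for p in pairs) / n
    return sum((p[i] - mi) * (p[j] - mj) for p in pairs) / n

# Expect approximately var(X) = 1, cov(X, Y) = 1, var(Y) = 2.
print(cov(0, 0), cov(0, 1), cov(1, 1))
```

The off-diagonal entry near 1 captures the interpretation: X and Y are positively correlated because Y is built from X plus independent noise.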