Assignment 1 - Answer
ASSIGNMENT QMT 732 (ECONOMETRICS) 2011
Question 1
Data on a three-variable problem yield the following results:

a) What is the sample size?
n = 33

b) Compute the regression equation.
b = (X′X)⁻¹X′Y
(X′ is 3×33 and X is 33×3, so X′X is 3×3; X′Y is 3×1.)
Therefore Ŷ = 4 – 0.2X₂ + 1.6X₃
c) Estimate the standard error of b₂ and test the hypothesis that β₂ is zero.

As we know, TSS = ESS + RSS.
Given TSS = 150 and ESS = b′X′Y = 142.40:
RSS = TSS – ESS = 150 – 142.40 = 7.6 (i.e., e′e)
s² = e′e/(n – k) = 7.6/30 = 0.2533
s = √(e′e/(n – k)) = √(7.6/30) = 0.503322
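As a quick numerical check of the arithmetic above (a sketch using only the figures reported in this answer):

```python
import math

# Figures reported in the answer above
TSS = 150.0
ESS = 142.40
n, k = 33, 3

RSS = TSS - ESS            # residual sum of squares, e'e
s2 = RSS / (n - k)         # s^2 = e'e/(n - k)
s = math.sqrt(s2)          # standard error of the regression

print(RSS, round(s2, 4), round(s, 6))
```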
Variances and standard errors of the coefficient estimates:
v(b₁) = 0.01    s(b₁) = 0.09
v(b₂) = 0.01    s(b₂) = 0.09
v(b₃) = 0.01    s(b₃) = 0.07

Test the hypothesis H₀: β₂ = 0:
t_cal = –2.294157
t(α/2, n–k) = t(0.025, 30) = 2.042
Since |t_cal| > t_table, we reject H₀: β₂ = 0.
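The rejection decision can be reproduced with SciPy, taking the t-value of –2.294157 and the 30 degrees of freedom as reported above:

```python
from scipy import stats

t_cal = -2.294157               # reported test statistic for H0: beta2 = 0
df = 30                         # n - k = 33 - 3
t_crit = stats.t.ppf(0.975, df) # two-sided 5% critical value

reject = abs(t_cal) > t_crit    # decision rule: reject if |t| exceeds the critical value
print(round(t_crit, 3), reject)
```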
d) Test the same hypothesis by running the appropriate restricted regression and examining the difference in the residual sums of squares.

The unrestricted model is Ŷ = 4 – 0.2X₂ + 1.6X₃, with ESS = 142.4 and RSS = 7.6.
Re-running the regression with the restriction β₂ = 0 imposed gives ESS = 98.4, so
RSS(restricted) = TSS – ESS = 150 – 98.4 = 51.6
Difference in RSS = 51.6 – 7.6 = 44 (equivalently, ESS falls by 44: 98.4 – 142.4 = –44)
F = (44/1) / (7.6/30) ≈ 173.7, which exceeds F(0.05; 1, 30) ≈ 4.17, so we again reject H₀: β₂ = 0.
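The F-test on the difference in residual sums of squares can be checked numerically (again using only the sums of squares reported in this answer):

```python
from scipy import stats

TSS = 150.0
RSS_u = 7.6                     # unrestricted residual sum of squares
ESS_r = 98.4                    # restricted explained sum of squares (as reported)
RSS_r = TSS - ESS_r             # restricted residual sum of squares
n, k, r = 33, 3, 1              # r = number of restrictions

# F = (change in RSS / r) / (RSS_u / (n - k))
F = ((RSS_r - RSS_u) / r) / (RSS_u / (n - k))
F_crit = stats.f.ppf(0.95, r, n - k)
print(round(F, 2), round(F_crit, 2), F > F_crit)
```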
e) Compute a conditional prediction for Y_f given X₂f = –4 and X₃f = 2. Obtain also a 95 percent interval for this prediction. If the actual value of Y_f turned out to be 12, would you think it came from the relationship underlying the sample data?

Using Ŷ = 4 – 0.2X₂ + 1.6X₃:
Ŷ_f = c′b = 4 – 0.2(–4) + 1.6(2) = 8

95% confidence interval for Y_f:
s = 0.503322
The estimated variance of the prediction is 0.75, so its standard error is √0.75 = 0.8660.
The confidence interval is ŷ_f ± t(0.025, 30) × s.e.(ŷ_f)
= 8 ± 2.042(0.8660)
= 8 ± 1.7684
= (6.2316, 9.7684)
Since 12 falls outside this interval, we reject the hypothesis H₀: Y_f = 12 given X₂f = –4, X₃f = 2; a value of 12 is unlikely to have come from the relationship underlying the sample data.

95% C.I.: Lower = 6.2316, Upper = 9.7684
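The point prediction and interval above can be reproduced directly (using the prediction variance of 0.75 as reported in this answer):

```python
import math
from scipy import stats

b = [4.0, -0.2, 1.6]            # estimated coefficients
c = [1.0, -4.0, 2.0]            # [1, X2f, X3f]
y_hat = sum(bi * ci for bi, ci in zip(b, c))  # conditional prediction c'b

var_pred = 0.75                 # estimated prediction variance (as reported)
t_crit = stats.t.ppf(0.975, 30)
half = t_crit * math.sqrt(var_pred)

lower, upper = y_hat - half, y_hat + half
print(y_hat, round(lower, 4), round(upper, 4), lower <= 12 <= upper)
```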
Question 2
Consider the multiple regression model:
You would like to test the null hypothesis
a) Let denote the OLS estimators of . Find var( ) in terms of the
variances of , and the covariance between them.
b) Write the t-statistic for testing . What is the standard error of ?
c) Define and . Write a regression equation involving
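The specific symbols in this question are not reproduced in the transcript. Assuming the null has the common form H₀: β₁ = β₂ (so that the quantity of interest is a difference of two estimators), part (a) follows from the variance rule for a linear combination, sketched here with hypothetical numbers:

```python
import math

# Hypothetical estimates and (co)variances, for illustration only
b1, b2 = 0.42, 0.29
var_b1, var_b2, cov_b12 = 0.0025, 0.0016, 0.0008

# var(b1 - b2) = var(b1) + var(b2) - 2*cov(b1, b2)
var_diff = var_b1 + var_b2 - 2 * cov_b12
se_diff = math.sqrt(var_diff)

# t-statistic for H0: beta1 = beta2
t = (b1 - b2) / se_diff
print(round(var_diff, 4), round(se_diff, 4), round(t, 3))
```

For part (c), the usual device is to define θ = β₁ – β₂, substitute β₁ = θ + β₂ into the model, and re-run the regression so that the standard error of θ̂ is reported directly.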
Question 3
Explain the differences between the concepts of simple correlation, partial correlation and multiple correlation. Why is each useful?
Simple correlation
Measures the degree to which two variables vary together, i.e., the intensity of the association between the two variables.
- Shows whether an independent variable and a dependent variable move together (association alone does not establish causation).
- Useful, for example, for testing the hypothesis that an association between X and Y exists.
- Determines whether an association between two variables exists at all.

Partial correlation
Measures the degree of association between two random variables with the effect of a set of controlling random variables removed.
- Works through correlations among residuals (errors of prediction): if we regress variable X on variable Z and subtract the fitted X′ from X, we obtain a residual e. This e is uncorrelated with Z, so any correlation it shares with another variable Y cannot be due to Z.
- Lets us hold a third variable constant statistically while examining the relation between X and Y, when this cannot be done by design.

Multiple correlation
A linear relationship between one variable and a set of two or more other variables. It is measured by the coefficient of multiple determination, denoted R², which is a measure of the fit of a linear regression.
- Captures the effects of all the independent variables simultaneously on a dependent variable.
- For example, the correlation coefficient between the yield of paddy (X₁) and the other variables, viz. type of seedlings (X₂), manure (X₃), rainfall (X₄), and humidity (X₅), is the multiple correlation coefficient R₁.₂₃₄₅. This coefficient takes values between 0 and +1.
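The residual-based construction of partial correlation described above can be sketched in Python; the data here are simulated toy values (X and Y are both driven by Z but not by each other), so simple correlation finds an association that partial correlation correctly removes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=n)
x = 2.0 * z + rng.normal(size=n)   # X depends on Z
y = 1.5 * z + rng.normal(size=n)   # Y depends on Z, not directly on X

# Simple correlation: X and Y look related because both depend on Z
r_xy = np.corrcoef(x, y)[0, 1]

# Partial correlation: regress X on Z and Y on Z, then correlate the residuals
def residuals(v, z):
    Z = np.column_stack([np.ones_like(z), z])
    beta, *_ = np.linalg.lstsq(Z, v, rcond=None)
    return v - Z @ beta

r_xy_given_z = np.corrcoef(residuals(x, z), residuals(y, z))[0, 1]
print(round(r_xy, 2), round(r_xy_given_z, 2))
```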
Question 4
The following variables are used to determine demand for roses. The quarterly data on these variables are given in Table 1. You are asked to consider the following demand functions:
Table 1

  Y       X2     X3     X4      X5
11484    2.26   3.49   158.11    1
 9348    2.54   2.85   173.36    2
 8429    3.07   4.06   165.26    3
10079    2.91   3.64   172.92    4
 9240    2.73   3.21   178.46    5
 8862    2.77   3.66   198.62    6
 6216    3.59   3.76   186.28    7
 8253    3.23   3.49   188.98    8
 8038    2.60   3.13   180.49    9
 7476    2.89   3.20   183.33   10
 5911    3.77   3.65   181.87   11
 7950    3.64   3.60   185      12
 6134    2.82   2.94   184      13
 5868    2.96   3.12   188.2    14
 3160    4.24   3.58   175.67   15
 5872    3.69   3.53   188      16
a) Estimate the parameters of the linear model and interpret the results
b) Estimate the parameters of the log-linear model and interpret the results
c) In the log-linear model, the slope coefficients give, respectively, the own-price, cross-price and income elasticities of demand. What are their a priori signs? Do the results concur with the a priori expectations?
d) How would you compute the own-price, cross-price and income elasticities for the linear model?
e) On the basis of your analysis which model if either would you choose and why?
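Parts (a) and (b) can be sketched with NumPy. The exact demand functions are not reproduced in this transcript, so the specifications below are assumptions: a linear model, Y = α₁ + α₂X₂ + α₃X₃ + α₄X₄ + α₅X₅ + u, and a log-linear model with Y, X₂, X₃ and X₄ in logs and the trend X₅ kept in levels:

```python
import numpy as np

# Quarterly rose-demand data from Table 1: columns Y, X2, X3, X4, X5
data = np.array([
    [11484, 2.26, 3.49, 158.11,  1], [ 9348, 2.54, 2.85, 173.36,  2],
    [ 8429, 3.07, 4.06, 165.26,  3], [10079, 2.91, 3.64, 172.92,  4],
    [ 9240, 2.73, 3.21, 178.46,  5], [ 8862, 2.77, 3.66, 198.62,  6],
    [ 6216, 3.59, 3.76, 186.28,  7], [ 8253, 3.23, 3.49, 188.98,  8],
    [ 8038, 2.60, 3.13, 180.49,  9], [ 7476, 2.89, 3.20, 183.33, 10],
    [ 5911, 3.77, 3.65, 181.87, 11], [ 7950, 3.64, 3.60, 185.00, 12],
    [ 6134, 2.82, 2.94, 184.00, 13], [ 5868, 2.96, 3.12, 188.20, 14],
    [ 3160, 4.24, 3.58, 175.67, 15], [ 5872, 3.69, 3.53, 188.00, 16],
])
y, X = data[:, 0], data[:, 1:]

def ols(y, X):
    """OLS with an intercept; returns coefficients and R-squared."""
    Z = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ b
    r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    return b, r2

# (a) linear model in levels
b_lin, r2_lin = ols(y, X)
# (b) log-linear model: logs of Y, X2, X3, X4; trend X5 in levels
b_log, r2_log = ols(np.log(y), np.column_stack([np.log(X[:, :3]), X[:, 3]]))
print(np.round(b_lin, 3), round(r2_lin, 3))
print(np.round(b_log, 3), round(r2_log, 3))
```

In the log-linear model the slope on ln X₂ can be read directly as the own-price elasticity, which is what part (c) asks about.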
Question 5
To answer Question 5, refer to the log-linear model in Question 4.
a) What is the estimated own-price elasticity of demand (elasticity with respect to the price of roses)?
b) Is it statistically significant?
c) If so, is it significantly different from unity?
d) A priori, what are the expected signs of (price of carnations) and (income)? Are the empirical results in accord with these expectations?
e) If the coefficients of and are statistically insignificant, what may be the reasons?
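For part (c), "significantly different from unity" means testing H₀: β₂ = –1 (unit elastic demand) rather than H₀: β₂ = 0. A sketch with hypothetical estimates (the actual regression output is not reproduced in this transcript):

```python
from scipy import stats

# Hypothetical log-linear estimates, for illustration only
b2, se_b2 = -1.27, 0.16    # estimated own-price elasticity and its s.e.
df = 11                    # n - k = 16 - 5 for the log-linear model

# t-statistic for H0: beta2 = -1
t = (b2 - (-1.0)) / se_b2
p = 2 * stats.t.sf(abs(t), df)   # two-sided p-value
print(round(t, 3), round(p, 3))
```

With these illustrative numbers the elasticity would differ significantly from zero but not from –1; insignificant coefficients (part e) may reflect multicollinearity among the regressors or the small sample of 16 quarters.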