
Statistics for Business and Economics

7th Edition

Chapter 12

Multiple Regression

Copyright © 2010 Pearson Education, Inc. Publishing as Prentice Hall


Chapter Goals

After completing this chapter, you should be able to:
- Apply multiple regression analysis to business decision-making situations
- Analyze and interpret the computer output for a multiple regression model
- Perform a hypothesis test for all regression coefficients or for a subset of coefficients
- Fit and interpret nonlinear regression models
- Incorporate qualitative variables into the regression model by using dummy variables
- Discuss model specification and analyze residuals


The Multiple Regression Model (Section 12.1)

Idea: examine the linear relationship between one dependent variable (Y) and two or more independent variables (Xi).

Multiple regression model with K independent variables:

Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_K X_K + \varepsilon

where β0 is the Y-intercept, β1, …, βK are the population slopes, and ε is the random error.


Multiple Regression Equation

The coefficients of the multiple regression model are estimated using sample data.

Multiple regression equation with K independent variables:

\hat{y}_i = b_0 + b_1 x_{1i} + b_2 x_{2i} + \cdots + b_K x_{Ki}

where ŷi is the estimated (or predicted) value of y, b0 is the estimated intercept, and b1, …, bK are the estimated slope coefficients.

In this chapter we will always use a computer to obtain the regression slope coefficients and other regression summary measures.
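Although the text relies on Excel output, the least-squares estimates themselves are straightforward to compute. Below is a minimal sketch in Python using NumPy; the five observations are simply the first five weeks of the pie-sales data introduced later in the chapter, used here only to illustrate the mechanics.

```python
import numpy as np

# First five weeks of the pie-sales example (Sales; Price, Advertising)
y = np.array([350.0, 460.0, 350.0, 430.0, 350.0])
X = np.array([[5.50, 3.3],
              [7.50, 3.3],
              [8.00, 3.0],
              [8.00, 4.5],
              [6.80, 3.0]])

# Prepend a column of ones so the intercept b0 is estimated along with the slopes
X_design = np.column_stack([np.ones(len(y)), X])

# Least-squares coefficients b = (X'X)^(-1) X'y, via a numerically stable solver
b, residuals, rank, sv = np.linalg.lstsq(X_design, y, rcond=None)
print(b)  # [b0, b1, b2]
```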


Multiple Regression Equation

Two-variable model:

\hat{y} = b_0 + b_1 x_1 + b_2 x_2

[Figure: the estimated regression plane over the (x1, x2) plane; b1 is the slope for variable x1 and b2 is the slope for variable x2]


Multiple Regression Model

Two-variable model:

\hat{y} = b_0 + b_1 x_1 + b_2 x_2

For a sample observation (x_{1i}, x_{2i}, y_i), the residual is e_i = y_i - \hat{y}_i. The best-fit equation, \hat{y}, is found by minimizing the sum of squared errors, \sum e_i^2.

[Figure: the fitted regression plane with one sample observation and its residual ei]


Standard Multiple Regression Assumptions

- The values x_i and the error terms ε_i are independent.
- The error terms are random variables with mean 0 and a constant variance, σ². (The constant-variance property is called homoscedasticity.)

E[\varepsilon_i] = 0 \quad \text{and} \quad E[\varepsilon_i^2] = \sigma^2 \quad \text{for } i = 1, \ldots, n


Standard Multiple Regression Assumptions (continued)

- The random error terms, ε_i, are not correlated with one another, so that

  E[\varepsilon_i \varepsilon_j] = 0 \quad \text{for all } i \neq j

- It is not possible to find a set of numbers, c_0, c_1, \ldots, c_K, such that

  c_0 + c_1 x_{1i} + c_2 x_{2i} + \cdots + c_K x_{Ki} = 0

  (This is the property of no linear relation for the Xj's.)


Example: 2 Independent Variables

A distributor of frozen dessert pies wants to evaluate factors thought to influence demand.
- Dependent variable: Pie sales (units per week)
- Independent variables: Price (in $) and Advertising (in $100s)

Data are collected for 15 weeks


Pie Sales Example

Multiple regression equation: Sales = b_0 + b_1(Price) + b_2(Advertising)

Week   Pie Sales   Price ($)   Advertising ($100s)
  1       350        5.50            3.3
  2       460        7.50            3.3
  3       350        8.00            3.0
  4       430        8.00            4.5
  5       350        6.80            3.0
  6       380        7.50            4.0
  7       430        4.50            3.0
  8       470        6.40            3.7
  9       450        7.00            3.5
 10       490        5.00            4.0
 11       340        7.20            3.5
 12       300        7.90            3.2
 13       440        5.90            4.0
 14       450        5.00            3.5
 15       300        7.00            2.7


Estimating a Multiple Linear Regression Equation (Section 12.2)

Excel will be used to generate the coefficients and measures of goodness of fit for multiple regression

Data / Data Analysis / Regression
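The same regression can also be run outside Excel. Below is a minimal sketch using Python's statsmodels with the 15 weeks of pie-sales data from the example; it should reproduce, up to rounding, the coefficients and summary measures shown in the output on the next slide.

```python
import pandas as pd
import statsmodels.api as sm

# Pie-sales data from the example (15 weeks)
data = pd.DataFrame({
    "Sales":       [350, 460, 350, 430, 350, 380, 430, 470, 450, 490, 340, 300, 440, 450, 300],
    "Price":       [5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40, 7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00],
    "Advertising": [3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7, 3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7],
})

X = sm.add_constant(data[["Price", "Advertising"]])   # adds the intercept column
model = sm.OLS(data["Sales"], X).fit()
print(model.summary())   # coefficients, R-square, F test, standard errors, etc.
```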


Multiple Regression Output

Regression Statistics
  Multiple R          0.72213
  R Square            0.52148
  Adjusted R Square   0.44172
  Standard Error      47.46341
  Observations        15

ANOVA           df        SS          MS          F        Significance F
  Regression     2     29460.027   14730.013   6.53861        0.01201
  Residual      12     27033.306    2252.776
  Total         14     56493.333

               Coefficients   Standard Error    t Stat     P-value    Lower 95%    Upper 95%
  Intercept      306.52619      114.25389       2.68285    0.01993     57.58835    555.46404
  Price          -24.97509       10.83213      -2.30565    0.03979    -48.57626     -1.37392
  Advertising     74.13096       25.96732       2.85478    0.01449     17.55303    130.70888

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)


The Multiple Regression Equation

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of changes due to advertising

b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of changes due to price

where Sales is the number of pies per week, Price is in $, and Advertising is in $100s.


Coefficient of Determination, R² (Section 12.3)

Reports the proportion of total variation in y explained by all x variables taken together

This is the ratio of the explained variability to total sample variability

R^2 = \frac{SSR}{SST} = \frac{\text{regression sum of squares}}{\text{total sum of squares}}


Coefficient of Determination, R² (continued)

(Excel regression output as shown above)

R^2 = \frac{SSR}{SST} = \frac{29460.027}{56493.333} = 0.52148

52.1% of the variation in pie sales is explained by the variation in price and advertising


Estimation of Error Variance

Consider the population regression model

Y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_K x_{Ki} + \varepsilon_i

The unbiased estimate of the variance of the errors is

s_e^2 = \frac{\sum_{i=1}^{n} e_i^2}{n - K - 1} = \frac{SSE}{n - K - 1}

where e_i = y_i - \hat{y}_i.

The square root of the variance, s_e, is called the standard error of the estimate.
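For the pie-sales regression (ANOVA values shown in the output above), the arithmetic is:

s_e^2 = \frac{SSE}{n - K - 1} = \frac{27033.306}{15 - 2 - 1} = 2252.776, \qquad s_e = \sqrt{2252.776} \approx 47.463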


Standard Error, se

(Excel regression output as shown above)

s_e = 47.463

The magnitude of this value can be compared to the average y value


Adjusted Coefficient of Determination, \bar{R}^2

- R² never decreases when a new X variable is added to the model, even if the new variable is not an important predictor variable. This can be a disadvantage when comparing models.
- What is the net effect of adding a new variable?
  - We lose a degree of freedom when a new X variable is added.
  - Did the new X variable add enough explanatory power to offset the loss of one degree of freedom?


Adjusted Coefficient of Determination, \bar{R}^2 (continued)

Used to correct for the fact that adding non-relevant independent variables will still reduce the error sum of squares:

\bar{R}^2 = 1 - \frac{SSE / (n - K - 1)}{SST / (n - 1)}

(where n = sample size, K = number of independent variables)

- Adjusted R² provides a better comparison between multiple regression models with different numbers of independent variables
- Penalizes excessive use of unimportant independent variables
- Smaller than R²
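Applied to the pie-sales regression (SSE = 27033.306, SST = 56493.333, n = 15, K = 2):

\bar{R}^2 = 1 - \frac{27033.306 / 12}{56493.333 / 14} = 1 - \frac{2252.776}{4035.238} \approx 0.4417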


(Excel regression output as shown above)

\bar{R}^2 = 0.44172: 44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the sample size and the number of independent variables.


Coefficient of Multiple Correlation

The coefficient of multiple correlation is the correlation between the predicted value and the observed value of the dependent variable

Is the square root of the multiple coefficient of determination

Used as another measure of the strength of the linear relationship between the dependent variable and the independent variables

Comparable to the correlation between Y and X in simple regression

R = r(\hat{y}, y) = \sqrt{R^2}
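For the pie-sales regression, R = \sqrt{0.52148} \approx 0.722, which matches the Multiple R value reported in the Excel output.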


Evaluating Individual Regression Coefficients (Section 12.4)

- Use t tests for individual coefficients
- Shows if a specific independent variable is conditionally important
- Hypotheses:
  H0: βj = 0 (no linear relationship)
  H1: βj ≠ 0 (a linear relationship does exist between xj and y)


Evaluating Individual Regression Coefficients (continued)

H0: βj = 0 (no linear relationship)
H1: βj ≠ 0 (a linear relationship does exist between xj and y)

Test statistic (df = n - K - 1):

t = \frac{b_j - 0}{s_{b_j}}
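For example, for the Price coefficient in the pie-sales output, t = \frac{-24.97509 - 0}{10.83213} \approx -2.306, matching the t Stat column.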


Evaluating Individual Regression Coefficients

(Excel regression output as shown above)

t-value for Price is t = -2.306, with p-value .0398

t-value for Advertising is t = 2.855, with p-value .0145


Example: Evaluating Individual Regression Coefficients

H0: βj = 0
H1: βj ≠ 0

d.f. = 15 - 2 - 1 = 12
α = .05
t12, .025 = 2.1788

From the Excel output:

               Coefficients   Standard Error    t Stat     P-value
  Price          -24.97509       10.83213      -2.30565    0.03979
  Advertising     74.13096       25.96732       2.85478    0.01449

Decision: Reject H0 for each variable; the test statistic for each variable falls in the rejection region (both p-values < .05).

Conclusion: There is evidence that both Price and Advertising affect pie sales at α = .05.

[Figure: two-tailed t distribution with α/2 = .025 in each tail and rejection regions beyond ±2.1788]


Confidence Interval Estimate for the Slope

Confidence interval limits for the population slope βj:

b_j \pm t_{n-K-1,\,\alpha/2} \, s_{b_j}

where t has (n - K - 1) degrees of freedom; here, t has (15 - 2 - 1) = 12 d.f.

               Coefficients   Standard Error
  Intercept      306.52619      114.25389
  Price          -24.97509       10.83213
  Advertising     74.13096       25.96732

Example: Form a 95% confidence interval for the effect of changes in price (x1) on pie sales:

-24.975 ± (2.1788)(10.832)

So the interval is -48.576 < β1 < -1.374
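A small sketch in Python (using scipy for the t critical value) reproduces this interval; the coefficient and standard error are taken from the Excel output above.

```python
from scipy import stats

b1, s_b1 = -24.97509, 10.83213              # Price coefficient and its standard error
n, K = 15, 2
t_crit = stats.t.ppf(0.975, df=n - K - 1)   # two-tailed 95%, 12 d.f. -> about 2.1788

lower = b1 - t_crit * s_b1
upper = b1 + t_crit * s_b1
print(round(t_crit, 4), round(lower, 3), round(upper, 3))   # ~ 2.1788, -48.576, -1.374
```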


Confidence Interval Estimate for the Slope

Confidence interval for the population slope βj (continued)

Example: Excel output also reports these interval endpoints:

               Coefficients   Standard Error   …   Lower 95%    Upper 95%
  Intercept      306.52619      114.25389      …    57.58835    555.46404
  Price          -24.97509       10.83213      …   -48.57626     -1.37392
  Advertising     74.13096       25.96732      …    17.55303    130.70888

Weekly sales are estimated to be reduced by between 1.37 and 48.58 pies for each $1 increase in the selling price.


Test on All Coefficients (Section 12.5)

F-Test for Overall Significance of the Model
- Shows if there is a linear relationship between all of the X variables considered together and Y
- Use the F test statistic
- Hypotheses:
  H0: β1 = β2 = … = βK = 0 (no linear relationship)
  H1: at least one βi ≠ 0 (at least one independent variable affects Y)


F-Test for Overall Significance

Test statistic:

F = \frac{MSR}{s_e^2} = \frac{SSR / K}{SSE / (n - K - 1)}

where F has K (numerator) and (n - K - 1) (denominator) degrees of freedom.

The decision rule is: reject H0 if F > F_{K,\, n-K-1,\, \alpha}
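A minimal sketch of the same computation in Python, using the MSR and MSE values from the pie-sales ANOVA table; scipy supplies the p-value and the critical value used on the next slides.

```python
from scipy import stats

msr, mse = 14730.013, 2252.776               # from the ANOVA table of the pie-sales output
F = msr / mse                                # about 6.539

p_value = stats.f.sf(F, dfn=2, dfd=12)       # upper-tail probability, about 0.012
F_crit = stats.f.ppf(0.95, dfn=2, dfd=12)    # critical value at alpha = .05, about 3.885
print(round(F, 4), round(p_value, 5), round(F_crit, 3))
```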


F-Test for Overall Significance

F = \frac{MSR}{MSE} = \frac{14730.0}{2252.8} = 6.5386

(Excel regression output as shown above)

The F statistic has 2 and 12 degrees of freedom; its p-value is the Significance F value in the output, 0.01201.


F-Test for Overall Significance

H0: β1 = β2 = 0
H1: β1 and β2 not both zero
α = .05
df1 = 2, df2 = 12

Critical value: F.05 = 3.885

Test statistic: F = \frac{MSR}{MSE} = 6.5386

Decision: Since the F test statistic is in the rejection region (p-value < .05), reject H0.

Conclusion: There is evidence that at least one independent variable affects Y.

[Figure: F distribution with the α = .05 rejection region beyond F.05 = 3.885]


Tests on a Subset of Regression Coefficients

Consider a multiple regression model involving variables xj and zj , and the null hypothesis that the z variable coefficients are all zero:

y_i = \beta_0 + \beta_1 x_{1i} + \cdots + \beta_K x_{Ki} + \alpha_1 z_{1i} + \cdots + \alpha_r z_{ri} + \varepsilon_i

H_0: \alpha_1 = \alpha_2 = \cdots = \alpha_r = 0
H_1: \text{at least one } \alpha_j \neq 0 \quad (j = 1, \ldots, r)


Tests on a Subset of Regression Coefficients

Goal: compare the error sum of squares for the complete model with the error sum of squares for the restricted model
- First run a regression for the complete model and obtain SSE
- Next run a restricted regression that excludes the z variables (the number of variables excluded is r) and obtain the restricted error sum of squares SSE(r)
- Compute the F statistic and apply the decision rule for a significance level α:

\text{Reject } H_0 \text{ if } F = \frac{(SSE(r) - SSE)/r}{s_e^2} > F_{r,\, n-K-r-1,\, \alpha}
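A sketch of this restricted-versus-complete comparison in Python with statsmodels; the data here are synthetic and the variable names (x1, x2, z1, z2) are purely illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

# Synthetic data: two x variables kept in both models, two z variables under test
rng = np.random.default_rng(0)
n, r = 40, 2
data = pd.DataFrame(rng.normal(size=(n, 4)), columns=["x1", "x2", "z1", "z2"])
data["y"] = 3 + 2 * data["x1"] - data["x2"] + rng.normal(size=n)

full = sm.OLS(data["y"], sm.add_constant(data[["x1", "x2", "z1", "z2"]])).fit()
restricted = sm.OLS(data["y"], sm.add_constant(data[["x1", "x2"]])).fit()

# F = [(SSE(r) - SSE) / r] / s_e^2, with s_e^2 taken from the complete model
# (statsmodels stores the residual sum of squares in the attribute `ssr`)
F = ((restricted.ssr - full.ssr) / r) / (full.ssr / full.df_resid)
p_value = stats.f.sf(F, dfn=r, dfd=full.df_resid)
print(F, p_value)
```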


Prediction (Section 12.6)

Given a population regression model

y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_K x_{Ki} + \varepsilon_i \quad (i = 1, 2, \ldots, n)

then, given a new observation of a data point (x_{1,n+1}, x_{2,n+1}, \ldots, x_{K,n+1}), the best linear unbiased forecast of y_{n+1} is

\hat{y}_{n+1} = b_0 + b_1 x_{1,n+1} + b_2 x_{2,n+1} + \cdots + b_K x_{K,n+1}

It is risky to forecast for new X values outside the range of the data used to estimate the model coefficients, because we do not have data to support that the linear model extends beyond the observed range.


Using The Equation to Make Predictions

Predict sales for a week in which the selling price is $5.50 and advertising is $350:

Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
      = 306.526 - 24.975(5.50) + 74.131(3.5)
      = 428.62

Predicted sales is 428.62 pies.

Note that Advertising is in $100’s, so $350 means that X2 = 3.5
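The same prediction as a couple of lines of Python, using the rounded coefficients quoted above:

```python
b0, b1, b2 = 306.526, -24.975, 74.131   # estimated intercept and slopes
price, advertising = 5.50, 3.5          # advertising is in $100s, so $350 -> 3.5

predicted_sales = b0 + b1 * price + b2 * advertising
print(round(predicted_sales, 2))        # about 428.62 pies
```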


Nonlinear Regression Models (Section 12.7)

The relationship between the dependent variable and an independent variable may not be linear

Can review the scatter diagram to check for non-linear relationships

Example: Quadratic model

The second independent variable is the square of the first variable

Y = \beta_0 + \beta_1 X_1 + \beta_2 X_1^2 + \varepsilon


Quadratic Regression Model

Model form:

y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{1i}^2 + \varepsilon_i

where:
β0 = Y intercept
β1 = regression coefficient for the linear effect of X on Y
β2 = regression coefficient for the quadratic effect on Y
εi = random error in Y for observation i


Linear vs. Nonlinear Fit

Linear fit does not give random residuals

Nonlinear fit gives random residuals

[Figures: for each case, a scatter plot of Y against X with the fitted line or curve, and the corresponding plot of residuals against X]


Quadratic Regression Model

Quadratic models may be considered when the scatter diagram takes on one of the following shapes:

[Figure: four scatter shapes of Y against X1, left to right: (β1 < 0, β2 > 0), (β1 > 0, β2 > 0), (β1 < 0, β2 < 0), (β1 > 0, β2 < 0)]

β1 = the coefficient of the linear term
β2 = the coefficient of the squared term

Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{1i}^2 + \varepsilon_i


Testing for Significance: Quadratic Effect

Testing the Quadratic Effect
- Compare the linear regression estimate

  \hat{y} = b_0 + b_1 x_1

  with the quadratic regression estimate

  \hat{y} = b_0 + b_1 x_1 + b_2 x_1^2

- Hypotheses:
  H0: β2 = 0 (the quadratic term does not improve the model)
  H1: β2 ≠ 0 (the quadratic term improves the model)


Testing for Significance: Quadratic Effect

(continued)

Testing the Quadratic Effect
- Hypotheses:
  H0: β2 = 0 (the quadratic term does not improve the model)
  H1: β2 ≠ 0 (the quadratic term improves the model)
- The test statistic is

  t = \frac{b_2 - \beta_2}{s_{b_2}}, \qquad \text{d.f.} = n - 3

  where:
  b2 = squared-term slope coefficient
  β2 = hypothesized slope (zero)
  s_{b_2} = standard error of the slope


Testing for Significance: Quadratic Effect

(continued)

Testing the Quadratic Effect
- Compare R² from the simple regression to R² from the quadratic model
- If R² from the quadratic model is larger than R² from the simple model, then the quadratic model is a better model


Example: Quadratic Model

Purity increases as filter time increases:

Purity   Filter Time
   3          1
   7          2
   8          3
  15          5
  22          7
  33          8
  40         10
  54         12
  67         13
  70         14
  78         15
  85         15
  87         16
  99         17

[Figure: scatter plot of Purity against Time]


Example: Quadratic Model

(continued)

Simple regression results:

\hat{y} = -11.283 + 5.985 \,\text{Time}

Regression Statistics
  R Square            0.96888
  Adjusted R Square   0.96628
  Standard Error      6.15997

              Coefficients   Standard Error    t Stat      P-value
  Intercept    -11.28267        3.46805       -3.25332     0.00691
  Time           5.98520        0.30966       19.32819     2.078E-10

  F = 373.57904,  Significance F = 2.0778E-10

The t statistic, F statistic, and R² are all high, but the residuals are not random:

[Figure: Time residual plot for the linear fit; the residuals follow a clear non-random pattern]


Example: Quadratic Model

(continued)

Quadratic regression results:

\hat{y} = 1.539 + 1.565 \,\text{Time} + 0.245 \,\text{Time}^2

Regression Statistics
  R Square            0.99494
  Adjusted R Square   0.99402
  Standard Error      2.59513

                Coefficients   Standard Error    t Stat     P-value
  Intercept        1.53870        2.24465        0.68550    0.50722
  Time             1.56496        0.60179        2.60052    0.02467
  Time-squared     0.24516        0.03258        7.52406    1.165E-05

  F = 1080.7330,  Significance F = 2.368E-13

[Figures: residual plots against Time and against Time-squared for the quadratic fit]

The quadratic term is significant and improves the model: R² is higher, se is lower, and the residuals are now random.
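Both fits can be reproduced from the purity data given above. A minimal sketch with statsmodels; it should recover, up to rounding, the R² values and the t test on the squared term reported in this example.

```python
import numpy as np
import statsmodels.api as sm

# Purity / filter-time data from the example (14 observations)
time = np.array([1, 2, 3, 5, 7, 8, 10, 12, 13, 14, 15, 15, 16, 17], dtype=float)
purity = np.array([3, 7, 8, 15, 22, 33, 40, 54, 67, 70, 78, 85, 87, 99], dtype=float)

linear = sm.OLS(purity, sm.add_constant(time)).fit()
quadratic = sm.OLS(purity, sm.add_constant(np.column_stack([time, time ** 2]))).fit()

print(linear.rsquared, quadratic.rsquared)           # quadratic fit explains more variation
print(quadratic.tvalues[-1], quadratic.pvalues[-1])  # t test on the squared term
```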


The Log Transformation

The Multiplicative Model:

Original multiplicative model:

Y = \beta_0 X_1^{\beta_1} X_2^{\beta_2} \varepsilon

Transformed multiplicative model:

\log(Y) = \log(\beta_0) + \beta_1 \log(X_1) + \beta_2 \log(X_2) + \log(\varepsilon)


Interpretation of coefficients

For the multiplicative model

\log Y_i = \beta_0 + \beta_1 \log X_{1i} + \varepsilon_i

when both the dependent and independent variables are logged:
- The coefficient of the independent variable Xk can be interpreted as: a 1 percent change in Xk leads to an estimated bk percent change in the average value of Y
- bk is the elasticity of Y with respect to a change in Xk
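A log-log model is fit with the same OLS machinery after transforming the variables. A minimal sketch with synthetic data (the variable names and the true elasticities of 0.8 and -0.3 are made up purely for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic multiplicative data: Y = 2 * X1^0.8 * X2^(-0.3) * error
rng = np.random.default_rng(1)
df = pd.DataFrame({"X1": rng.uniform(1, 10, 200), "X2": rng.uniform(1, 10, 200)})
df["Y"] = 2.0 * df["X1"] ** 0.8 * df["X2"] ** (-0.3) * np.exp(rng.normal(0, 0.1, 200))

# Regress log(Y) on log(X1) and log(X2); each slope estimates an elasticity
fit = sm.OLS(np.log(df["Y"]), sm.add_constant(np.log(df[["X1", "X2"]]))).fit()
print(fit.params)   # slopes should come out close to 0.8 and -0.3
```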


Dummy Variables (Section 12.8)

- A dummy variable is a categorical independent variable with two levels (yes or no, on or off, male or female), recorded as 0 or 1
- Regression intercepts are different if the variable is significant
- Assumes equal slopes for the other variables
- If there are more than two levels, the number of dummy variables needed is (number of levels - 1)


Dummy Variable Example

\hat{y} = b_0 + b_1 x_1 + b_2 x_2

Let:
y = Pie Sales
x1 = Price
x2 = Holiday (x2 = 1 if a holiday occurred during the week; x2 = 0 if there was no holiday that week)


Dummy Variable Example

(continued)

Holiday (x2 = 1):    \hat{y} = b_0 + b_1 x_1 + b_2 (1) = (b_0 + b_2) + b_1 x_1
No Holiday (x2 = 0): \hat{y} = b_0 + b_1 x_1 + b_2 (0) = b_0 + b_1 x_1

The two lines have the same slope, b1, but different intercepts: b0 + b2 for a holiday week and b0 otherwise.

[Figure: y (sales) against x1 (Price); two parallel lines with intercepts b0 + b2 (Holiday) and b0 (No Holiday)]

If H0: β2 = 0 is rejected, then "Holiday" has a significant effect on pie sales.


Interpreting the Dummy Variable Coefficient

Example:

Sales = 300 - 30(Price) + 15(Holiday)

where:
Sales = number of pies sold per week
Price = pie price in $
Holiday = 1 if a holiday occurred during the week, 0 if no holiday occurred

b2 = 15: on average, sales were 15 pies greater in weeks with a holiday than in weeks without a holiday, given the same price.
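Plugging a hypothetical price into this fitted equation shows that the dummy simply shifts the intercept:

```python
b0, b_price, b_holiday = 300, -30, 15   # fitted coefficients quoted in the example
price = 6.00                            # hypothetical selling price

no_holiday = b0 + b_price * price + b_holiday * 0
holiday = b0 + b_price * price + b_holiday * 1
print(no_holiday, holiday, holiday - no_holiday)   # predictions differ by b2 = 15 pies
```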


Interaction Between Explanatory Variables

- Hypothesizes interaction between pairs of x variables: the response to one x variable may vary at different levels of another x variable
- Contains two-way cross-product terms:

\hat{y} = b_0 + b_1 x_1 + b_2 x_2 + b_3 x_3 = b_0 + b_1 x_1 + b_2 x_2 + b_3 (x_1 x_2)


Effect of Interaction

Given:

Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_1 X_2 = \beta_0 + \beta_2 X_2 + (\beta_1 + \beta_3 X_2) X_1

- Without the interaction term, the effect of X1 on Y is measured by β1
- With the interaction term, the effect of X1 on Y is measured by β1 + β3 X2
- The effect changes as X2 changes


Interaction Example

Suppose x2 is a dummy variable and the estimated regression equation is

\hat{y} = 1 + 2 x_1 + 3 x_2 + 4 x_1 x_2

x2 = 1: \hat{y} = 1 + 2x1 + 3(1) + 4x1(1) = 4 + 6x1
x2 = 0: \hat{y} = 1 + 2x1 + 3(0) + 4x1(0) = 1 + 2x1

Slopes are different if the effect of x1 on y depends on the value of x2.

[Figure: the two fitted lines plotted for x1 from 0 to 1.5]
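A sketch of how such an interaction term is built and estimated, using synthetic data whose true coefficients echo the 1, 2, 3, 4 of this example (everything here is illustrative):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic data: x2 is a 0/1 dummy and the effect of x1 depends on x2
rng = np.random.default_rng(2)
df = pd.DataFrame({"x1": rng.uniform(0, 1.5, 200),
                   "x2": rng.integers(0, 2, 200)})
df["y"] = 1 + 2 * df["x1"] + 3 * df["x2"] + 4 * df["x1"] * df["x2"] + rng.normal(0, 0.2, 200)

df["x1_x2"] = df["x1"] * df["x2"]   # two-way cross-product term
fit = sm.OLS(df["y"], sm.add_constant(df[["x1", "x2", "x1_x2"]])).fit()
print(fit.params)   # slope of x1 when x2 = 1 is (coef of x1) + (coef of x1_x2)
```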


Significance of Interaction Term

The coefficient b3 is an estimate of the difference in the coefficient of x1 when x2 = 1 compared to when x2 = 0

The t statistic for b3 can be used to test the hypothesis

If we reject the null hypothesis we conclude that there is a difference in the slope coefficient for the two subgroups

H_0: \beta_3 = 0 \mid \beta_1 \neq 0, \beta_2 \neq 0
H_1: \beta_3 \neq 0 \mid \beta_1 \neq 0, \beta_2 \neq 0


Multiple Regression Assumptions (Section 12.9)

Errors (residuals) from the regression model:

e_i = y_i - \hat{y}_i

Assumptions:
- The errors are normally distributed
- Errors have a constant variance
- The model errors are independent


Analysis of Residuals in Multiple Regression

These residual plots are used in multiple regression:

- Residuals vs. \hat{y}_i
- Residuals vs. x_{1i}
- Residuals vs. x_{2i}
- Residuals vs. time (if time-series data)

Use the residual plots to check for violations of the regression assumptions.


Chapter Summary

- Developed the multiple regression model
- Tested the significance of the multiple regression model
- Discussed adjusted R²
- Tested individual regression coefficients
- Tested portions of the regression model
- Used quadratic terms and log transformations in regression models
- Explained dummy variables
- Evaluated interaction effects
- Discussed using residual plots to check model assumptions
