Pertemuan V ANAREG (Regression Analysis, Meeting 5)


  • Slide 1/26: Analisis Regresi (Regression Analysis)

    Matrix Approach to Simple Linear Regression

    Pertemuan 5 (Meeting 5), Robert Kurniawan

  • Slide 2/26: Outline

    • Matrix review
    • Simple linear regression (RLS) in matrix form
    • Least squares estimation of the regression parameters
    • Inference in regression analysis

  • Slide 3/26: Matrices

    • Definition: a matrix is a rectangular array of numbers or symbolic elements.
    • In many applications, the rows of a matrix represent individual cases (people, items, plants, animals, ...) and the columns represent attributes or characteristics.
    • The dimension of a matrix is its number of rows and columns, often denoted r x c (r rows by c columns).
    • A matrix can be written in full form or in abbreviated form:

    $$\mathbf{A} = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1j} & \cdots & a_{1c} \\ a_{21} & a_{22} & \cdots & a_{2j} & \cdots & a_{2c} \\ \vdots & \vdots & & \vdots & & \vdots \\ a_{i1} & a_{i2} & \cdots & a_{ij} & \cdots & a_{ic} \\ \vdots & \vdots & & \vdots & & \vdots \\ a_{r1} & a_{r2} & \cdots & a_{rj} & \cdots & a_{rc} \end{bmatrix} = \left[ a_{ij} \right], \quad i = 1,\dots,r;\; j = 1,\dots,c$$

  • Slide 4/26: Special Types of Matrices

    Square matrix: number of rows = number of columns, e.g.

    $$\mathbf{A} = \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} \qquad \mathbf{B} = \begin{bmatrix} 20 & 32 & 50 \\ 12 & 28 & 42 \\ 28 & 46 & 60 \end{bmatrix}$$

    Vector: a matrix with one column (column vector) or one row (row vector), e.g. a column vector $\mathbf{C} = \begin{bmatrix} 57 \\ 24 \end{bmatrix}$ and a row vector $\mathbf{F}' = \begin{bmatrix} f_1 & f_2 & f_3 \end{bmatrix}$.

    Transpose: the matrix formed by interchanging the rows and columns of a matrix (a "prime" denotes the transpose), e.g. a 2 x 3 matrix G and its 3 x 2 transpose G':

    $$\mathbf{G} = \begin{bmatrix} 6 & 15 & 22 \\ 8 & 13 & 25 \end{bmatrix} \qquad \mathbf{G}' = \begin{bmatrix} 6 & 8 \\ 15 & 13 \\ 22 & 25 \end{bmatrix}$$

    In general, for an r x c matrix H the transpose H' is c x r with elements

    $$\mathbf{H} = \left[ h_{ij} \right],\; i = 1,\dots,r;\; j = 1,\dots,c \quad\Rightarrow\quad \mathbf{H}' = \left[ h_{ji} \right],\; j = 1,\dots,c;\; i = 1,\dots,r$$

    Matrix equality: two matrices are equal if they have the same dimension and all corresponding elements are equal:

    $$\mathbf{A} = \mathbf{B} \iff a_{ij} = b_{ij} \text{ for all } i, j, \qquad \text{e.g. } \begin{bmatrix} 4 & 6 \\ 12 & 10 \end{bmatrix} = \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix} \Rightarrow b_{11} = 4,\; b_{12} = 6,\; b_{21} = 12,\; b_{22} = 10$$
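
    As a quick sanity check of the transpose example above, here is a minimal NumPy sketch (NumPy is my assumption; the slides do not prescribe any software):

        import numpy as np

        # The 2 x 3 matrix G from the slide and its transpose
        G = np.array([[6, 15, 22],
                      [8, 13, 25]])
        G_t = G.T                        # interchange rows and columns

        print(G.shape, G_t.shape)        # (2, 3) (3, 2)
        # Matrix equality: same dimension and all corresponding elements equal
        print(np.array_equal(G_t.T, G))  # True, since (G')' = G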

  • Slide 5/26: Regression Example, Toluca Data

    Response vector and design matrix for the n = 25 Toluca observations (the design matrix carries a leading column of 1's):

    $$\mathbf{Y} = \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix} \qquad \mathbf{X} = \begin{bmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{bmatrix}$$

    Data (each column below is one observation pair X_i, Y_i):

    X:  80  30  50  90  70  60 120  80 100  50  40  70  90  20 110 100  30  50  90 110  30  90  40  80  70
    Y: 399 121 221 376 361 224 546 352 353 157 160 252 389 113 435 420 212 268 377 421 273 468 244 342 323
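
    A minimal NumPy sketch of how the response vector and design matrix above could be assembled (the variable names are mine, not from the slides):

        import numpy as np

        # Toluca data from the slide: lot size X_i and work hours Y_i
        X_raw = np.array([80, 30, 50, 90, 70, 60, 120, 80, 100, 50, 40, 70, 90,
                          20, 110, 100, 30, 50, 90, 110, 30, 90, 40, 80, 70])
        Y = np.array([399, 121, 221, 376, 361, 224, 546, 352, 353, 157, 160, 252, 389,
                      113, 435, 420, 212, 268, 377, 421, 273, 468, 244, 342, 323], dtype=float)

        n = len(Y)
        # Design matrix: a column of 1's followed by the predictor column
        X = np.column_stack([np.ones(n), X_raw.astype(float)])
        print(X.shape, Y.shape)   # (25, 2) (25,)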

  • Slide 6/26: Matrix Addition and Subtraction

    Addition and subtraction are defined for two matrices of common dimension and are element-wise:

    $$\mathbf{A} \pm \mathbf{B} = \left[ a_{ij} \pm b_{ij} \right], \quad i = 1,\dots,r;\; j = 1,\dots,c$$

    Example:

    $$\mathbf{C} = \begin{bmatrix} 4 & 7 \\ 10 & 12 \end{bmatrix},\; \mathbf{D} = \begin{bmatrix} 2 & 0 \\ 14 & 6 \end{bmatrix} \;\Rightarrow\; \mathbf{C} + \mathbf{D} = \begin{bmatrix} 6 & 7 \\ 24 & 18 \end{bmatrix}, \quad \mathbf{C} - \mathbf{D} = \begin{bmatrix} 2 & 7 \\ -4 & 6 \end{bmatrix}$$

    Regression example: since Y_i = E(Y_i) + ε_i,

    $$\mathbf{Y} - E(\mathbf{Y}) = \begin{bmatrix} Y_1 - E(Y_1) \\ Y_2 - E(Y_2) \\ \vdots \\ Y_n - E(Y_n) \end{bmatrix} = \begin{bmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{bmatrix} = \boldsymbol{\varepsilon}, \qquad \text{i.e. } \mathbf{Y} = E(\mathbf{Y}) + \boldsymbol{\varepsilon}$$

  • Slide 7/26: Matrix Multiplication

    Multiplication of a matrix by a scalar (a single number) multiplies every element, e.g.

    $$3\mathbf{A} = 3\begin{bmatrix} 2 & 1 \\ -2 & 7 \end{bmatrix} = \begin{bmatrix} 3(2) & 3(1) \\ 3(-2) & 3(7) \end{bmatrix} = \begin{bmatrix} 6 & 3 \\ -6 & 21 \end{bmatrix}$$

    Multiplication of a matrix by a matrix requires conformability: the number of columns of A must equal the number of rows of B. If A is r_A x c_A and B is r_B x c_B with c_A = r_B, then AB is r_A x c_B, and each element is the sum of the products of the elements of the i-th row of A and the j-th column of B:

    $$(\mathbf{AB})_{ij} = \sum_{k=1}^{c_A} a_{ik} b_{kj}, \quad i = 1,\dots,r_A;\; j = 1,\dots,c_B$$

    Example (3 x 2 times 2 x 2 gives 3 x 2):

    $$\mathbf{A}\mathbf{B} = \begin{bmatrix} 2 & 5 \\ 3 & -1 \\ 0 & 7 \end{bmatrix} \begin{bmatrix} 3 & -1 \\ 2 & 4 \end{bmatrix} = \begin{bmatrix} 2(3)+5(2) & 2(-1)+5(4) \\ 3(3)+(-1)(2) & 3(-1)+(-1)(4) \\ 0(3)+7(2) & 0(-1)+7(4) \end{bmatrix} = \begin{bmatrix} 16 & 18 \\ 7 & -7 \\ 14 & 28 \end{bmatrix}$$
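
    A short NumPy check of the 3 x 2 by 2 x 2 product above (just a sketch; the numbers are the slide's example):

        import numpy as np

        A = np.array([[2, 5], [3, -1], [0, 7]])
        B = np.array([[3, -1], [2, 4]])

        # Conformability: columns of A (2) must equal rows of B (2)
        print(A @ B)
        # [[16 18]
        #  [ 7 -7]
        #  [14 28]]
        print(3 * np.array([[2, 1], [-2, 7]]))   # scalar multiplication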

  • Slide 8/26: Matrix Multiplication Examples I

    Simultaneous equations (2 equations in the unknowns x_1, x_2) can be written AX = Y:

    $$\begin{aligned} a_{11}x_1 + a_{12}x_2 &= y_1 \\ a_{21}x_1 + a_{22}x_2 &= y_2 \end{aligned} \qquad \Longleftrightarrow \qquad \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \end{bmatrix}$$

    Sum of squares as a row vector times a column vector:

    $$\begin{bmatrix} 4 & 2 & 3 \end{bmatrix} \begin{bmatrix} 4 \\ 2 \\ 3 \end{bmatrix} = 4^2 + 2^2 + 3^2 = 29$$

    Regression equation (expected values):

    $$E(\mathbf{Y}) = \mathbf{X}\boldsymbol{\beta} = \begin{bmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix} = \begin{bmatrix} \beta_0 + \beta_1 X_1 \\ \beta_0 + \beta_1 X_2 \\ \vdots \\ \beta_0 + \beta_1 X_n \end{bmatrix}$$

  • Slide 9/26: Matrix Multiplication Examples II

    Matrices used in simple linear regression (they generalize directly to multiple regression):

    $$\mathbf{Y}'\mathbf{Y} = \begin{bmatrix} Y_1 & Y_2 & \cdots & Y_n \end{bmatrix} \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix} = \sum_{i=1}^{n} Y_i^2$$

    $$\mathbf{X}'\mathbf{X} = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ X_1 & X_2 & \cdots & X_n \end{bmatrix} \begin{bmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{bmatrix} = \begin{bmatrix} n & \sum X_i \\ \sum X_i & \sum X_i^2 \end{bmatrix}$$

    $$\mathbf{X}'\mathbf{Y} = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ X_1 & X_2 & \cdots & X_n \end{bmatrix} \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix} = \begin{bmatrix} \sum Y_i \\ \sum X_i Y_i \end{bmatrix}$$
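
    Continuing the Toluca sketch from slide 5 (same assumed variables X and Y), these products can be checked numerically:

        # Assumes X (25 x 2 design matrix) and Y (length-25 response) from the slide-5 sketch
        XtX = X.T @ X          # [[n, sum X_i], [sum X_i, sum X_i^2]]
        XtY = X.T @ Y          # [sum Y_i, sum X_i Y_i]
        YtY = Y @ Y            # sum Y_i^2

        print(XtX)
        print(XtY)
        print(YtY)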

  • Slide 10/26: Special Matrix Types

    Symmetric matrix: a square matrix equal to its transpose, A = A', e.g.

    $$\mathbf{A} = \begin{bmatrix} 6 & 19 & 8 \\ 19 & 14 & 3 \\ 8 & 3 & 1 \end{bmatrix} = \mathbf{A}'$$

    Diagonal matrix: a square matrix with all off-diagonal elements equal to 0, e.g.

    $$\mathbf{B} = \begin{bmatrix} b_1 & 0 & 0 \\ 0 & b_2 & 0 \\ 0 & 0 & b_3 \end{bmatrix} \qquad \begin{bmatrix} 4 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{bmatrix}$$

    Note: diagonal matrices are symmetric, but not vice versa.

    Identity matrix I: a diagonal matrix with all diagonal elements equal to 1; it acts like multiplying a scalar by 1:

    $$\mathbf{I}_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}, \qquad \mathbf{IA} = \mathbf{AI} = \mathbf{A}$$

    Scalar matrix: a diagonal matrix with all diagonal elements equal to the same single number, i.e. kI.

    1-vector, J matrix, and zero vector:

    $$\mathbf{1}_r = \begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix}, \qquad \mathbf{J}_{r \times r} = \begin{bmatrix} 1 & \cdots & 1 \\ \vdots & & \vdots \\ 1 & \cdots & 1 \end{bmatrix}, \qquad \mathbf{0}_r = \begin{bmatrix} 0 \\ \vdots \\ 0 \end{bmatrix}$$

    Note: $\mathbf{1}'\mathbf{1} = r$ and $\mathbf{1}\mathbf{1}' = \mathbf{J}_{r \times r}$.

  • Slide 11/26: Linear Dependence and Rank of a Matrix

    • Linear dependence: the columns (rows) of a matrix are linearly dependent when some nontrivial linear combination of them produces the zero vector, i.e. one or more columns (rows) can be written as a linear function of the other columns (rows).
    • Rank of a matrix: the number of linearly independent columns (rows) of the matrix. The rank cannot exceed the minimum of the number of rows and columns: rank(A) ≤ min(r_A, c_A).
    • A matrix is full rank if rank(A) = min(r_A, c_A).

    Example:

    $$\mathbf{A} = \begin{bmatrix} 1 & 3 \\ 4 & 12 \end{bmatrix}: \quad 3\begin{bmatrix} 1 \\ 4 \end{bmatrix} - \begin{bmatrix} 3 \\ 12 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$

    so the columns of A are linearly dependent and rank(A) = 1. If no nontrivial linear combination of the columns of a matrix B gives the zero vector, its columns are linearly independent and B is full rank (rank(B) = 2 for a 2 x 2 matrix).

  • Slide 12/26: Matrix Inverse

    • Note: for scalars (except 0), multiplying a number by its reciprocal gives 1: 2(1/2) = 1, x(1/x) = x(x^{-1}) = 1.
    • In matrix form, if A is a square, full-rank matrix (all rows and columns linearly independent), then its inverse A^{-1} satisfies

    $$\mathbf{A}^{-1}\mathbf{A} = \mathbf{A}\mathbf{A}^{-1} = \mathbf{I}$$

    Example:

    $$\mathbf{A} = \begin{bmatrix} 2 & 8 \\ 4 & -2 \end{bmatrix}, \quad \mathbf{A}^{-1} = \begin{bmatrix} \tfrac{2}{36} & \tfrac{8}{36} \\ \tfrac{4}{36} & -\tfrac{2}{36} \end{bmatrix} \;\Rightarrow\; \mathbf{A}^{-1}\mathbf{A} = \begin{bmatrix} \tfrac{4+32}{36} & \tfrac{16-16}{36} \\ \tfrac{8-8}{36} & \tfrac{32+4}{36} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$$

    For a diagonal matrix the inverse is the diagonal matrix of reciprocals:

    $$\mathbf{B} = \begin{bmatrix} 4 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 6 \end{bmatrix}, \quad \mathbf{B}^{-1} = \begin{bmatrix} 1/4 & 0 & 0 \\ 0 & 1/2 & 0 \\ 0 & 0 & 1/6 \end{bmatrix}, \quad \mathbf{B}\mathbf{B}^{-1} = \mathbf{I}_3$$
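
    A quick NumPy verification of the inverse examples above (a sketch; np.linalg.inv is one of several ways to do this):

        import numpy as np

        A = np.array([[2.0, 8.0], [4.0, -2.0]])
        A_inv = np.linalg.inv(A)

        print(A_inv * 36)                          # [[2, 8], [4, -2]] as on the slide
        print(np.allclose(A_inv @ A, np.eye(2)))   # True: A^{-1} A = I

        B = np.diag([4.0, 2.0, 6.0])
        print(np.allclose(np.linalg.inv(B), np.diag([1/4, 1/2, 1/6])))   # True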

  • Slide 13/26: Computing the Inverse of a 2 x 2 Matrix

    For a full-rank 2 x 2 matrix (columns/rows linearly independent):

    $$\mathbf{A} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}, \qquad |\mathbf{A}| = a_{11}a_{22} - a_{12}a_{21} \;\;(\text{the determinant of } \mathbf{A}), \qquad \mathbf{A}^{-1} = \frac{1}{|\mathbf{A}|} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}$$

    Note: if A is not full rank, its second row is a multiple of the first for some value k (a_{21} = k a_{11}, a_{22} = k a_{12}), so |A| = a_{11}(k a_{12}) - a_{12}(k a_{11}) = 0 and A^{-1} does not exist. While there are rules for inverting general matrices, we will use computers to solve them.

    Regression example:

    $$\mathbf{X}'\mathbf{X} = \begin{bmatrix} n & \sum X_i \\ \sum X_i & \sum X_i^2 \end{bmatrix}, \qquad |\mathbf{X}'\mathbf{X}| = n\sum X_i^2 - \left(\sum X_i\right)^2 = n\left(\sum X_i^2 - n\bar{X}^2\right) = n\sum (X_i - \bar{X})^2$$

    $$(\mathbf{X}'\mathbf{X})^{-1} = \frac{1}{n\sum (X_i - \bar{X})^2} \begin{bmatrix} \sum X_i^2 & -\sum X_i \\ -\sum X_i & n \end{bmatrix} = \begin{bmatrix} \dfrac{1}{n} + \dfrac{\bar{X}^2}{\sum (X_i - \bar{X})^2} & \dfrac{-\bar{X}}{\sum (X_i - \bar{X})^2} \\ \dfrac{-\bar{X}}{\sum (X_i - \bar{X})^2} & \dfrac{1}{\sum (X_i - \bar{X})^2} \end{bmatrix}$$

    Note: $\dfrac{\sum X_i^2}{n \sum (X_i - \bar{X})^2} = \dfrac{\sum (X_i - \bar{X})^2 + n\bar{X}^2}{n \sum (X_i - \bar{X})^2} = \dfrac{1}{n} + \dfrac{\bar{X}^2}{\sum (X_i - \bar{X})^2}$.
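
    Sticking with the assumed Toluca variables from the earlier sketches, the closed-form (X'X)^{-1} above can be compared against a direct numerical inverse:

        # Assumes numpy as np, plus X_raw, X, n from the slide-5 sketch
        Xbar = X_raw.mean()
        Sxx = ((X_raw - Xbar) ** 2).sum()        # sum of (X_i - Xbar)^2

        closed_form = np.array([[1/n + Xbar**2 / Sxx, -Xbar / Sxx],
                                [-Xbar / Sxx,          1 / Sxx]])
        print(np.allclose(np.linalg.inv(X.T @ X), closed_form))   # True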

  • Slide 14/26: Use of the Inverse Matrix, Solving Simultaneous Equations

    Write the system as AY = C, where A and C are matrices of known constants and Y is the matrix of unknowns. Then, assuming A is square and full rank,

    $$\mathbf{A}\mathbf{Y} = \mathbf{C} \;\Rightarrow\; \mathbf{A}^{-1}\mathbf{A}\mathbf{Y} = \mathbf{A}^{-1}\mathbf{C} \;\Rightarrow\; \mathbf{Y} = \mathbf{A}^{-1}\mathbf{C}$$

    Example. Equation 1: 12y_1 + 6y_2 = 48. Equation 2: 10y_1 - 2y_2 = 12.

    $$\mathbf{A} = \begin{bmatrix} 12 & 6 \\ 10 & -2 \end{bmatrix}, \quad \mathbf{C} = \begin{bmatrix} 48 \\ 12 \end{bmatrix}, \quad |\mathbf{A}| = 12(-2) - 6(10) = -84, \quad \mathbf{A}^{-1} = \frac{1}{-84}\begin{bmatrix} -2 & -6 \\ -10 & 12 \end{bmatrix} = \frac{1}{84}\begin{bmatrix} 2 & 6 \\ 10 & -12 \end{bmatrix}$$

    $$\mathbf{Y} = \mathbf{A}^{-1}\mathbf{C} = \frac{1}{84}\begin{bmatrix} 2 & 6 \\ 10 & -12 \end{bmatrix}\begin{bmatrix} 48 \\ 12 \end{bmatrix} = \frac{1}{84}\begin{bmatrix} 96 + 72 \\ 480 - 144 \end{bmatrix} = \frac{1}{84}\begin{bmatrix} 168 \\ 336 \end{bmatrix} = \begin{bmatrix} 2 \\ 4 \end{bmatrix}$$

    Note the wisdom of waiting to divide by |A| until the end of the calculation!
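
    The same system, solved numerically (np.linalg.solve is generally preferred in practice over forming the inverse explicitly):

        import numpy as np

        A = np.array([[12.0, 6.0], [10.0, -2.0]])
        C = np.array([48.0, 12.0])

        print(np.linalg.solve(A, C))       # [2. 4.]
        print(np.linalg.inv(A) @ C)        # same answer via the explicit inverse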

  • Slide 15/26: Useful Matrix Results

    All rules assume the matrices are conformable for the operations involved.

    Addition rules:        A + B = B + A,        (A + B) + C = A + (B + C)
    Multiplication rules:  (AB)C = A(BC),        C(A + B) = CA + CB,        k(A + B) = kA + kB (k a scalar)
    Transpose rules:       (A')' = A,            (A + B)' = A' + B',        (AB)' = B'A',        (ABC)' = C'B'A'
    Inverse rules (full-rank, square matrices):
                           (AB)^{-1} = B^{-1}A^{-1},    (ABC)^{-1} = C^{-1}B^{-1}A^{-1},    (A^{-1})^{-1} = A,    (A')^{-1} = (A^{-1})'

  • Slide 16/26: Random Vectors and Matrices

    Shown for the case n = 3; this generalizes to any n. Random variables Y_1, Y_2, Y_3:

    $$\mathbf{Y} = \begin{bmatrix} Y_1 \\ Y_2 \\ Y_3 \end{bmatrix}, \qquad E(\mathbf{Y}) = \begin{bmatrix} E(Y_1) \\ E(Y_2) \\ E(Y_3) \end{bmatrix}$$

    Expectation in general, for an n x p random matrix: $E(\mathbf{Y}) = \left[ E(Y_{ij}) \right], \; i = 1,\dots,n; \; j = 1,\dots,p$.

    Variance-covariance matrix of a random vector:

    $$\boldsymbol{\sigma}^2(\mathbf{Y}) = E\left[ (\mathbf{Y} - E(\mathbf{Y}))(\mathbf{Y} - E(\mathbf{Y}))' \right] = E\begin{bmatrix} (Y_1 - E(Y_1))^2 & (Y_1 - E(Y_1))(Y_2 - E(Y_2)) & (Y_1 - E(Y_1))(Y_3 - E(Y_3)) \\ (Y_2 - E(Y_2))(Y_1 - E(Y_1)) & (Y_2 - E(Y_2))^2 & (Y_2 - E(Y_2))(Y_3 - E(Y_3)) \\ (Y_3 - E(Y_3))(Y_1 - E(Y_1)) & (Y_3 - E(Y_3))(Y_2 - E(Y_2)) & (Y_3 - E(Y_3))^2 \end{bmatrix} = \begin{bmatrix} \sigma_1^2 & \sigma_{12} & \sigma_{13} \\ \sigma_{21} & \sigma_2^2 & \sigma_{23} \\ \sigma_{31} & \sigma_{32} & \sigma_3^2 \end{bmatrix}$$

  • Slide 17/26: Linear Regression Example (n = 3)

    The error terms are assumed independent with mean 0 and constant variance σ²:

    $$E(\varepsilon_i) = 0, \qquad \sigma^2(\varepsilon_i) = \sigma^2, \qquad \sigma(\varepsilon_i, \varepsilon_j) = 0 \;\; (i \ne j)$$

    $$E(\boldsymbol{\varepsilon}) = \mathbf{0}, \qquad \boldsymbol{\sigma}^2(\boldsymbol{\varepsilon}) = \begin{bmatrix} \sigma^2 & 0 & 0 \\ 0 & \sigma^2 & 0 \\ 0 & 0 & \sigma^2 \end{bmatrix} = \sigma^2 \mathbf{I}$$

    For the model in matrix form:

    $$\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}, \qquad E(\mathbf{Y}) = E(\mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}) = \mathbf{X}\boldsymbol{\beta} + E(\boldsymbol{\varepsilon}) = \mathbf{X}\boldsymbol{\beta}, \qquad \boldsymbol{\sigma}^2(\mathbf{Y}) = \boldsymbol{\sigma}^2(\mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}) = \boldsymbol{\sigma}^2(\boldsymbol{\varepsilon}) = \sigma^2 \mathbf{I}$$

  • Slide 18/26: Mean and Variance of Linear Functions of Y

    Let A be a k x n matrix of fixed constants and Y an n x 1 random vector. Then W = AY is a k x 1 random vector with elements

    $$W_i = a_{i1}Y_1 + \cdots + a_{in}Y_n, \quad i = 1,\dots,k$$

    Mean:

    $$E(\mathbf{W}) = E(\mathbf{AY}) = \begin{bmatrix} a_{11}E(Y_1) + \cdots + a_{1n}E(Y_n) \\ \vdots \\ a_{k1}E(Y_1) + \cdots + a_{kn}E(Y_n) \end{bmatrix} = \mathbf{A}\,E(\mathbf{Y})$$

    Variance-covariance matrix:

    $$\boldsymbol{\sigma}^2(\mathbf{W}) = E\left[ (\mathbf{AY} - \mathbf{A}E(\mathbf{Y}))(\mathbf{AY} - \mathbf{A}E(\mathbf{Y}))' \right] = E\left[ \mathbf{A}(\mathbf{Y} - E(\mathbf{Y}))(\mathbf{Y} - E(\mathbf{Y}))'\mathbf{A}' \right] = \mathbf{A}\,E\left[ (\mathbf{Y} - E(\mathbf{Y}))(\mathbf{Y} - E(\mathbf{Y}))' \right]\mathbf{A}' = \mathbf{A}\,\boldsymbol{\sigma}^2(\mathbf{Y})\,\mathbf{A}'$$

  • Slide 19/26: Multivariate Normal Distribution

    Let

    $$\mathbf{Y} = \begin{bmatrix} Y_1 \\ \vdots \\ Y_n \end{bmatrix}, \qquad \boldsymbol{\mu} = E(\mathbf{Y}) = \begin{bmatrix} \mu_1 \\ \vdots \\ \mu_n \end{bmatrix}, \qquad \boldsymbol{\Sigma} = \boldsymbol{\sigma}^2(\mathbf{Y}) = \begin{bmatrix} \sigma_1^2 & \sigma_{12} & \cdots & \sigma_{1n} \\ \sigma_{21} & \sigma_2^2 & \cdots & \sigma_{2n} \\ \vdots & & \ddots & \vdots \\ \sigma_{n1} & \sigma_{n2} & \cdots & \sigma_n^2 \end{bmatrix}$$

    Multivariate normal density function:

    $$f(\mathbf{Y}) = (2\pi)^{-n/2}\,|\boldsymbol{\Sigma}|^{-1/2}\exp\!\left[ -\tfrac{1}{2}(\mathbf{Y} - \boldsymbol{\mu})'\boldsymbol{\Sigma}^{-1}(\mathbf{Y} - \boldsymbol{\mu}) \right] \qquad\Longleftrightarrow\qquad \mathbf{Y} \sim N(\boldsymbol{\mu}, \boldsymbol{\Sigma})$$

    Marginally, $Y_i \sim N(\mu_i, \sigma_i^2)$ for $i = 1,\dots,n$, and $\sigma(Y_i, Y_j) = \sigma_{ij}$ for $i \ne j$.

    Note: if A is a (full-rank) matrix of fixed constants, then

    $$\mathbf{W} = \mathbf{AY} \sim N(\mathbf{A}\boldsymbol{\mu},\, \mathbf{A}\boldsymbol{\Sigma}\mathbf{A}')$$

  • Slide 20/26: Simple Linear Regression in Matrix Form

    Simple linear regression model: $Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \; i = 1,\dots,n$. Defining

    $$\mathbf{Y} = \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix}, \quad \mathbf{X} = \begin{bmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{bmatrix}, \quad \boldsymbol{\beta} = \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix}, \quad \boldsymbol{\varepsilon} = \begin{bmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{bmatrix}$$

    the model is $\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}$, and $E(\mathbf{Y}) = \mathbf{X}\boldsymbol{\beta}$ since

    $$\mathbf{X}\boldsymbol{\beta} = \begin{bmatrix} \beta_0 + \beta_1 X_1 \\ \beta_0 + \beta_1 X_2 \\ \vdots \\ \beta_0 + \beta_1 X_n \end{bmatrix}$$

    Assuming constant variance and independence of the error terms ε_i:

    $$\boldsymbol{\sigma}^2(\mathbf{Y}) = \boldsymbol{\sigma}^2(\boldsymbol{\varepsilon}) = \begin{bmatrix} \sigma^2 & 0 & \cdots & 0 \\ 0 & \sigma^2 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma^2 \end{bmatrix} = \sigma^2 \mathbf{I}$$

    Further assuming a normal distribution for the error terms ε_i:

    $$\mathbf{Y} \sim N(\mathbf{X}\boldsymbol{\beta},\, \sigma^2 \mathbf{I})$$

  • Slide 21/26: Estimating Parameters by Least Squares

    The normal equations are obtained from ∂Q/∂β_0 and ∂Q/∂β_1, setting each equal to 0:

    $$\begin{aligned} n b_0 + b_1 \sum X_i &= \sum Y_i \\ b_0 \sum X_i + b_1 \sum X_i^2 &= \sum X_i Y_i \end{aligned}$$

    Note: in matrix form,

    $$\mathbf{X}'\mathbf{X}\mathbf{b} = \mathbf{X}'\mathbf{Y}, \qquad \begin{bmatrix} n & \sum X_i \\ \sum X_i & \sum X_i^2 \end{bmatrix} \begin{bmatrix} b_0 \\ b_1 \end{bmatrix} = \begin{bmatrix} \sum Y_i \\ \sum X_i Y_i \end{bmatrix}, \qquad \text{defining } \mathbf{b} = \begin{bmatrix} b_0 \\ b_1 \end{bmatrix}$$

    Based on the matrix form of the least squares criterion:

    $$Q = (\mathbf{Y} - \mathbf{X}\boldsymbol{\beta})'(\mathbf{Y} - \mathbf{X}\boldsymbol{\beta}) = \mathbf{Y}'\mathbf{Y} - \mathbf{Y}'\mathbf{X}\boldsymbol{\beta} - \boldsymbol{\beta}'\mathbf{X}'\mathbf{Y} + \boldsymbol{\beta}'\mathbf{X}'\mathbf{X}\boldsymbol{\beta} = \mathbf{Y}'\mathbf{Y} - 2\boldsymbol{\beta}'\mathbf{X}'\mathbf{Y} + \boldsymbol{\beta}'\mathbf{X}'\mathbf{X}\boldsymbol{\beta}$$

    $$\frac{\partial Q}{\partial \boldsymbol{\beta}} = -2\mathbf{X}'\mathbf{Y} + 2\mathbf{X}'\mathbf{X}\boldsymbol{\beta}$$

    Setting this equal to zero and replacing β with b:

    $$\mathbf{X}'\mathbf{X}\mathbf{b} = \mathbf{X}'\mathbf{Y} \quad\Rightarrow\quad \mathbf{b} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y}$$
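
    With the assumed Toluca arrays from the earlier sketches, the least squares estimate can be computed either from the normal equations or with a least squares solver:

        # Assumes numpy as np, plus X and Y from the slide-5 sketch
        b = np.linalg.solve(X.T @ X, X.T @ Y)         # solves X'X b = X'Y
        print(b)                                      # [b0, b1]

        # Equivalent, numerically more stable route
        b_ls, *_ = np.linalg.lstsq(X, Y, rcond=None)
        print(np.allclose(b, b_ls))                   # True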

  • Slide 22/26: Fitted Values and Residuals

    Fitted values and residuals, element-wise: $\hat{Y}_i = b_0 + b_1 X_i$ and $e_i = Y_i - \hat{Y}_i$.

    In matrix form:

    $$\hat{\mathbf{Y}} = \mathbf{X}\mathbf{b} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y} = \mathbf{H}\mathbf{Y}, \qquad \mathbf{H} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'$$

    H is called the "hat" or "projection" matrix; note that H is idempotent (HH = H) and symmetric (H = H').

    $$\mathbf{e} = \mathbf{Y} - \hat{\mathbf{Y}} = \mathbf{Y} - \mathbf{X}\mathbf{b} = \mathbf{Y} - \mathbf{H}\mathbf{Y} = (\mathbf{I} - \mathbf{H})\mathbf{Y}$$

    Note:

    $$E(\hat{\mathbf{Y}}) = E(\mathbf{H}\mathbf{Y}) = \mathbf{H}E(\mathbf{Y}) = \mathbf{H}\mathbf{X}\boldsymbol{\beta} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{X}\boldsymbol{\beta} = \mathbf{X}\boldsymbol{\beta}$$
    $$\boldsymbol{\sigma}^2(\hat{\mathbf{Y}}) = \mathbf{H}\,\sigma^2\mathbf{I}\,\mathbf{H}' = \sigma^2\mathbf{H}, \qquad \mathbf{s}^2(\hat{\mathbf{Y}}) = MSE\cdot\mathbf{H}$$
    $$E(\mathbf{e}) = (\mathbf{I} - \mathbf{H})E(\mathbf{Y}) = (\mathbf{I} - \mathbf{H})\mathbf{X}\boldsymbol{\beta} = \mathbf{X}\boldsymbol{\beta} - \mathbf{X}\boldsymbol{\beta} = \mathbf{0}$$
    $$\boldsymbol{\sigma}^2(\mathbf{e}) = (\mathbf{I} - \mathbf{H})\,\sigma^2\mathbf{I}\,(\mathbf{I} - \mathbf{H})' = \sigma^2(\mathbf{I} - \mathbf{H}), \qquad \mathbf{s}^2(\mathbf{e}) = MSE\,(\mathbf{I} - \mathbf{H})$$
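
    A sketch of the same quantities for the assumed Toluca arrays, including checks of the idempotence and symmetry claims:

        # Assumes numpy as np, plus X, Y, and b from the earlier sketches
        H = X @ np.linalg.inv(X.T @ X) @ X.T     # hat / projection matrix
        Y_hat = H @ Y                            # fitted values
        e = Y - Y_hat                            # residuals, (I - H) Y

        print(np.allclose(H @ H, H), np.allclose(H, H.T))   # True True (idempotent, symmetric)
        print(np.allclose(Y_hat, X @ b))                    # True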

  • Slide 23/26: Analysis of Variance

    Total (corrected) sum of squares:

    $$SSTO = \sum_{i=1}^{n} (Y_i - \bar{Y})^2 = \sum Y_i^2 - \frac{\left(\sum Y_i\right)^2}{n} = \mathbf{Y}'\mathbf{Y} - \frac{1}{n}\mathbf{Y}'\mathbf{J}\mathbf{Y} = \mathbf{Y}'\!\left[\mathbf{I} - \tfrac{1}{n}\mathbf{J}\right]\!\mathbf{Y}$$

    Note: $\bar{Y} = \frac{1}{n}\mathbf{1}'\mathbf{Y}$ and $\frac{(\sum Y_i)^2}{n} = \frac{1}{n}\mathbf{Y}'\mathbf{1}\mathbf{1}'\mathbf{Y} = \frac{1}{n}\mathbf{Y}'\mathbf{J}\mathbf{Y}$, where $\mathbf{J} = \mathbf{1}\mathbf{1}'$.

    Error sum of squares:

    $$SSE = \mathbf{e}'\mathbf{e} = (\mathbf{Y} - \mathbf{X}\mathbf{b})'(\mathbf{Y} - \mathbf{X}\mathbf{b}) = \mathbf{Y}'\mathbf{Y} - \mathbf{Y}'\mathbf{X}\mathbf{b} - \mathbf{b}'\mathbf{X}'\mathbf{Y} + \mathbf{b}'\mathbf{X}'\mathbf{X}\mathbf{b} = \mathbf{Y}'\mathbf{Y} - \mathbf{b}'\mathbf{X}'\mathbf{Y} = \mathbf{Y}'(\mathbf{I} - \mathbf{H})\mathbf{Y}$$

    since $\mathbf{X}'\mathbf{X}\mathbf{b} = \mathbf{X}'\mathbf{Y}$ and $\mathbf{b}'\mathbf{X}'\mathbf{Y} = \mathbf{Y}'\mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y} = \mathbf{Y}'\mathbf{H}\mathbf{Y}$.

    Regression sum of squares:

    $$SSR = SSTO - SSE = \mathbf{b}'\mathbf{X}'\mathbf{Y} - \frac{1}{n}\mathbf{Y}'\mathbf{J}\mathbf{Y} = \mathbf{Y}'\!\left[\mathbf{H} - \tfrac{1}{n}\mathbf{J}\right]\!\mathbf{Y}$$

    Note that SSTO, SSR, and SSE are all QUADRATIC FORMS: Y'AY for symmetric matrices A.
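
    These quadratic forms can be evaluated directly for the assumed Toluca arrays and checked against the scalar formulas:

        # Assumes numpy as np, plus X, Y, n, b, and H from the earlier sketches
        J = np.ones((n, n))
        SSTO = Y @ (np.eye(n) - J / n) @ Y
        SSE  = Y @ (np.eye(n) - H) @ Y
        SSR  = Y @ (H - J / n) @ Y

        print(np.allclose(SSTO, ((Y - Y.mean()) ** 2).sum()))   # True
        print(np.allclose(SSTO, SSE + SSR))                     # True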

  • Slide 24/26: Inferences in Linear Regression

    Sampling properties of b:

    $$\mathbf{b} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y} \;\Rightarrow\; E(\mathbf{b}) = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'E(\mathbf{Y}) = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{X}\boldsymbol{\beta} = \boldsymbol{\beta}$$

    $$\boldsymbol{\sigma}^2(\mathbf{b}) = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\,\boldsymbol{\sigma}^2(\mathbf{Y})\,\mathbf{X}(\mathbf{X}'\mathbf{X})^{-1} = \sigma^2(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{I}\mathbf{X}(\mathbf{X}'\mathbf{X})^{-1} = \sigma^2(\mathbf{X}'\mathbf{X})^{-1}, \qquad \mathbf{s}^2(\mathbf{b}) = MSE\,(\mathbf{X}'\mathbf{X})^{-1}$$

    Recalling the form of (X'X)^{-1} from slide 13:

    $$\mathbf{s}^2(\mathbf{b}) = MSE \begin{bmatrix} \dfrac{1}{n} + \dfrac{\bar{X}^2}{\sum (X_i - \bar{X})^2} & \dfrac{-\bar{X}}{\sum (X_i - \bar{X})^2} \\ \dfrac{-\bar{X}}{\sum (X_i - \bar{X})^2} & \dfrac{1}{\sum (X_i - \bar{X})^2} \end{bmatrix}$$

    Estimated mean response at X_h, with $\mathbf{X}_h = \begin{bmatrix} 1 \\ X_h \end{bmatrix}$:

    $$\hat{Y}_h = \mathbf{X}_h'\mathbf{b} = b_0 + b_1 X_h, \qquad s^2(\hat{Y}_h) = \mathbf{X}_h'\,\mathbf{s}^2(\mathbf{b})\,\mathbf{X}_h = MSE\;\mathbf{X}_h'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}_h = MSE\left[\frac{1}{n} + \frac{(X_h - \bar{X})^2}{\sum (X_i - \bar{X})^2}\right]$$

    Predicted new response at X_h:

    $$\hat{Y}_h = \mathbf{X}_h'\mathbf{b}, \qquad s^2(\text{pred}) = MSE\left[1 + \mathbf{X}_h'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}_h\right]$$
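
    A final sketch tying the inference formulas to the assumed Toluca arrays (degrees of freedom n - 2 for simple linear regression; X_h = 100 is an illustrative value of my own, not from the slides):

        # Assumes numpy as np, plus X, Y, n, b, and e from the earlier sketches
        MSE = (e @ e) / (n - 2)                      # error mean square
        XtX_inv = np.linalg.inv(X.T @ X)

        s2_b = MSE * XtX_inv                         # estimated var-cov matrix of b
        print(np.sqrt(np.diag(s2_b)))                # s(b0), s(b1)

        X_h = np.array([1.0, 100.0])                 # hypothetical level of X
        Y_h_hat = X_h @ b                            # estimated mean response at X_h
        s2_mean = MSE * (X_h @ XtX_inv @ X_h)        # s^2(Y_h hat)
        s2_pred = MSE * (1 + X_h @ XtX_inv @ X_h)    # s^2(pred) for a new observation
        print(Y_h_hat, np.sqrt(s2_mean), np.sqrt(s2_pred))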

  • Slide 25/26: Soal Latihan (Practice Problems)

    From John Neter's book:

    1. Page 157, No. 4.24
    2. Page 56, No. 2.17

  • Slide 26/26