08 Biometrics, Lecture 8, Part 3 (2009-11-09)



Face Recognition

    Face detection

    Face tracking

    Face recognition (appearance-based)

        Local features
            DCT-based methods

        Global features (holistic approach)
            Principal Component Analysis (PCA)
            Linear Discriminant Analysis (LDA)

    Performance evaluation

    Advantages and disadvantages


Principal Component Analysis (PCA)

Principal component analysis (PCA), or the Karhunen-Loeve transformation, is a data-representation method that finds an alternative set of parameters for a set of raw data (or features) such that most of the variability in the data is compressed down to the first few parameters.

The transformed PCA parameters are orthogonal. PCA diagonalizes the covariance matrix, and the resulting diagonal elements are the variances of the transformed PCA parameters.
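As a concrete illustration of "most of the variability compressed into the first few parameters", here is a minimal numpy sketch (the data and all variable names are my own, not from the slides):

```python
# Minimal PCA sketch via eigen-decomposition of the covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
# 200 correlated 2-D samples: most variance lies along one direction
x = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [1.5, 0.5]])

x0 = x - x.mean(axis=0)            # mean-center the data
S = x0.T @ x0 / len(x0)            # covariance (scatter) matrix
v, W = np.linalg.eigh(S)           # eigenvalues v, eigenvectors as columns of W
order = np.argsort(v)[::-1]        # sort by decreasing variance
v, W = v[order], W[:, order]

x_pca = x0 @ W                     # transformed (orthogonal) parameters
print(v[0] / v.sum())              # share of variance in the first parameter
print(np.round(W.T @ S @ W, 6))   # transformed covariance is (near-)diagonal
```

For this data the first transformed parameter carries nearly all of the variance, which is exactly the compression property the slide describes.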


PCA

The covariance (scatter) matrix of the data, which encodes the variance and covariance of the data, is used in PCA to find the optimal rotation of the parameter space.

PCA finds the eigenvectors and eigenvalues of the covariance matrix. These have the property that

    W^T S W = V

where:

    S - covariance (scatter) matrix
    W = [w1, w2, ..., wD] - matrix of eigenvectors
    V = diag(v) = diag(v1, v2, ..., vD) - transformed covariance matrix (diagonal scatter matrix of eigenvalues)
    v1, ..., vD - eigenvalues

Example:

    [ 0.26  0.96 ] [ 14492.28  20760.14 ] [ 0.26  0.96 ]   [ 302.84     0   ]
    [ 0.96 -0.26 ] [ 20760.14  14492.28 ] [ 0.96 -0.26 ] = [    0     94.40 ]
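The diagonalization property W^T S W = V can be checked numerically; the sketch below uses a small made-up 2x2 covariance matrix rather than the slide's numbers:

```python
# Check that the eigenvector matrix W diagonalizes a covariance matrix S.
import numpy as np

S = np.array([[4.0, 2.0],
              [2.0, 3.0]])           # an illustrative symmetric covariance matrix
v, W = np.linalg.eigh(S)             # eigenvalues v, eigenvectors as columns of W

V = W.T @ S @ W                      # should equal diag(v)
print(np.round(V, 10))
print(np.allclose(V, np.diag(v)))    # True
```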


PCA

Having found the eigenvectors and eigenvalues, the principal components are found by the following transformation:

    x_PCA = W^T x

Example:

    x_PCA,1 = 0.26 x1 + 0.96 x2
    x_PCA,2 = 0.96 x1 - 0.26 x2

The eigenvectors give an idea of the importance of each of the original parameters in accounting for the variance in the data.
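The projection step can be sketched in numpy with the slide's 2-D example, assuming the second eigenvector is (0.96, -0.26) so that the columns of W are (approximately) orthonormal:

```python
# Sketch of projecting a data point onto the principal components.
import numpy as np

W = np.array([[0.26,  0.96],
              [0.96, -0.26]])        # columns are the eigenvectors w1, w2

x = np.array([3.0, 5.0])             # an illustrative data point
x_pca = W.T @ x                      # its principal components

print(x_pca)  # [0.26*3 + 0.96*5, 0.96*3 - 0.26*5]
```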


PCA

A face image defines a point in the high-dimensional image space.

Different face images share a number of similarities with each other, so they can be described by a relatively low-dimensional subspace.

They can be projected into an appropriately chosen subspace of eigenfaces, and classification can be performed by similarity computation (distance).

(Figure: data points in the (x1, x2) plane together with the rotated PCA axes x_PCA,1 and x_PCA,2.)


2D DCT and PCA

Graphs from: C. Sanderson, On Local Features for Face Verification, IDIAP-RR 04-36

Feature vector V = [C0, C1, C2, ..., Cuv]

Feature vector with the first few local PCA basis functions


PCA

Suppose the data consists of M faces with D feature values each.

1) Place the data in a D x M matrix x
2) Mean-center the data: compute the D-dimensional mean m, and set x0 = x - m
3) Compute the D x D covariance matrix C = x0 x0^T
4) Compute the eigenvectors and eigenvalues of the covariance matrix
5) Choose the K largest eigenvalues (K << D) and keep the corresponding eigenvectors
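The five steps can be sketched with numpy (the sizes D, M, K and the random data are illustrative only):

```python
# The PCA steps above, sketched with numpy.
import numpy as np

rng = np.random.default_rng(1)
D, M, K = 64, 8, 3                  # feature dimension, faces, kept components

x = rng.normal(size=(D, M))         # 1) data in a D x M matrix
m = x.mean(axis=1, keepdims=True)   # 2) D-dimensional mean ...
x0 = x - m                          #    ... and mean-centered data
C = x0 @ x0.T                       # 3) D x D covariance (scatter) matrix
vals, vecs = np.linalg.eigh(C)      # 4) eigenvalues and eigenvectors
order = np.argsort(vals)[::-1][:K]  # 5) keep the K largest eigenvalues
W = vecs[:, order]                  #    and their eigenvectors

print(W.shape)  # (64, 3)
```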


PCA

PCA seeks directions that are efficient for representing the data.

(Figure: data from classes A and B projected onto two candidate directions; one direction is efficient for representing the data, the other is not.)


Eigenfaces, the algorithm

The database consists of M = 8 training face images a, b, c, d, e, f, g, h, each stacked into an N^2-dimensional column vector:

    a = [a1, a2, ..., a_N^2]^T,  b = [b1, b2, ..., b_N^2]^T,  ...,  h = [h1, h2, ..., h_N^2]^T


Eigenfaces, the algorithm

We compute the average face:

    m = (1/M) (a + b + ... + h),  where M = 8

that is, componentwise:

    m = [ (a1 + b1 + ... + h1)/M,  (a2 + b2 + ... + h2)/M,  ...,  (a_N^2 + b_N^2 + ... + h_N^2)/M ]^T


Eigenfaces, the algorithm

Then subtract it from the training faces:

    a_m = a - m,  b_m = b - m,  c_m = c - m,  d_m = d - m,
    e_m = e - m,  f_m = f - m,  g_m = g - m,  h_m = h - m

where, for example,

    a_m = [a1 - m1, a2 - m2, ..., a_N^2 - m_N^2]^T


Eigenfaces, the algorithm

Now we build the matrix x, which is N^2 by M:

    x = [ a_m  b_m  c_m  d_m  e_m  f_m  g_m  h_m ]

The covariance matrix, which is N^2 by N^2:

    C = x x^T

Finding the eigenvalues of this covariance matrix directly is impractical: the matrix is very large and the computational effort is very big. Since we are interested in at most M eigenvalues, we can reduce the dimension by working with the M by M scatter matrix

    S = x^T x

Find the M eigenvalues and eigenvectors of S. The eigenvectors of C and S are equivalent in the sense that if S v = lambda v, then x v is an eigenvector of C with the same eigenvalue. Build the transform matrix W from the eigenvectors of S.
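A minimal numpy sketch of this dimensionality trick, using random data in place of real face images (all sizes and names are illustrative):

```python
# The M x M matrix S = x^T x shares its nonzero eigenvalues with the
# huge N^2 x N^2 matrix C = x x^T; if S v = lam v, then x v is an
# eigenvector of C with the same eigenvalue.
import numpy as np

rng = np.random.default_rng(2)
n2, M = 1024, 8                     # N^2 pixels, M training faces
x = rng.normal(size=(n2, M))        # mean-centered faces as columns

S = x.T @ x                         # small M x M scatter matrix
lam, V = np.linalg.eigh(S)          # its eigenvalues / eigenvectors
U = x @ V                           # columns: eigenvectors of C = x x^T
U /= np.linalg.norm(U, axis=0)      # normalize to unit length

# Verify C u = lam u for the largest eigenvalue, without forming C:
u, l = U[:, -1], lam[-1]
print(np.allclose(x @ (x.T @ u), l * u))  # True
```

Only the small M x M eigenproblem is ever solved, which is what makes eigenfaces tractable for high-resolution images.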


Eigenfaces, the algorithm

Compute for each face its projection onto the face space:

    x_PCA,1 = W^T a_m,  x_PCA,2 = W^T b_m,  x_PCA,3 = W^T c_m,  x_PCA,4 = W^T d_m,
    x_PCA,5 = W^T e_m,  x_PCA,6 = W^T f_m,  x_PCA,7 = W^T g_m,  x_PCA,8 = W^T h_m

Compute the threshold:

    theta = (1/2) max || x_PCA,i - x_PCA,j ||,  for i, j = 1, 2, ..., M
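The threshold computation can be sketched with numpy broadcasting; random projections stand in for the training faces here:

```python
# theta is half the largest distance between any two projected faces.
import numpy as np

rng = np.random.default_rng(3)
x_pca = rng.normal(size=(8, 3))     # M = 8 projections, K = 3 components each

# All pairwise distances ||x_pca_i - x_pca_j|| via broadcasting:
diff = x_pca[:, None, :] - x_pca[None, :, :]
dist = np.linalg.norm(diff, axis=-1)
theta = 0.5 * dist.max()

print(theta)
```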


Eigenfaces, the algorithm

To recognize a face

    r = [r1, r2, ..., r_N^2]^T

subtract the average face from it:

    r_m = r - m = [r1 - m1, r2 - m2, ..., r_N^2 - m_N^2]^T


Eigenfaces, the algorithm

Compute its projection onto the face space:

    x_PCA = W^T r_m

Compute the distance in the face space between the face and all known faces:

    eps_i^2 = || x_PCA - x_PCA,i ||^2,  for i = 1, 2, ..., M


Eigenfaces, the algorithm

Reconstruct the face from the eigenfaces:

    r_PCA = W x_PCA

Compute the distance between the face and its reconstruction:

    eps^2 = || r_m - r_PCA ||^2

Distinguish between three cases:

    If eps >= theta, then it is not a face
    If eps < theta and eps_i >= theta for all i = 1, 2, ..., M, then it is a new face
    If eps < theta and min_i eps_i < theta, then it is a known face
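Putting the recognition steps together, here is a hedged sketch; the function name classify and all parameter names are illustrative, and a trained mean face m, eigenface matrix W, stored projections x_pca_db, and threshold theta are assumed to come from the previous steps:

```python
# Sketch of the eigenfaces decision rule described above.
import numpy as np

def classify(r, m, W, x_pca_db, theta):
    """Return 'not a face', 'new face', or the index of the known face."""
    r_m = r - m                          # subtract the average face
    x_pca = W.T @ r_m                    # project onto the face space
    r_rec = W @ x_pca                    # reconstruct from the eigenfaces
    eps = np.linalg.norm(r_m - r_rec)    # distance to the face space
    eps_i = np.linalg.norm(x_pca_db - x_pca, axis=1)  # distances in face space

    if eps >= theta:
        return "not a face"
    if eps_i.min() >= theta:
        return "new face"
    return int(eps_i.argmin())           # index of the closest known face
```

A vector far from the face space is rejected outright; one inside the face space is matched against the stored projections and either recognized or flagged as new.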