face recognition using ica presentation

Upload: sadia-khan

Post on 04-Jun-2018


TRANSCRIPT

  • 8/13/2019 face recognition using ica presentation

    1/31

PRESENTED BY: SADIA KHAN

HINA SALEEM

SIDRA KHAN

MADIHA BIBI


    FACES

    Faces are integral to human interaction

Manual facial recognition is already used in everyday authentication applications:

ID card systems (passports, health cards, and driver's licenses)

    Booking stations

    Surveillance operations


Facial Recognition

Facial recognition technology automates the recognition of faces using one of two modeling approaches:

Face appearance

2D Eigenface: Principal Component Analysis (PCA)

3D Morphable Model

Face geometry

3D Face Recognition: 3D Expression Invariant Recognition


ICA finds the directions of maximum independence.


    Facial Recognition: Eigenface

Decompose face images into a small set of characteristic feature images.

A new face is compared to these stored images.

A match is found if the new face is close to one of these images.


    Create training set of faces and calculate the

    eigenfaces

Project the new image onto the eigenfaces. Check if the image is close to face space.

    Check closeness to one of the known faces.

Add unknown faces to the training set and re-calculate the eigenfaces.

    Facial Recognition: PCA - Overview


    Facial Recognition: PCA Training Set


    Facial Recognition: PCA Training

Find the average of the training images. Subtract the average face from each image.

Create the covariance matrix.

Generate the eigenfaces.

Each original image can be expressed as a linear combination of the eigenfaces (the face space).
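The training steps above can be sketched in Python. This is a minimal illustration with random data standing in for face images; the function and variable names are hypothetical, and the SVD shortcut replaces the explicit covariance eigendecomposition:

```python
import numpy as np

def train_eigenfaces(images, k=5):
    """Compute the average face and the top-k eigenfaces.

    images: (n_images, n_pixels) array, one flattened face per row.
    """
    mean_face = images.mean(axis=0)        # average of training images
    A = images - mean_face                 # subtract average face from each image
    # The eigenvectors of the covariance matrix A^T A are the right singular
    # vectors of A, so an SVD avoids forming the large covariance matrix.
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    return mean_face, Vt[:k]               # the eigenfaces span the face space

# toy stand-in data: 20 "faces" of 64 pixels each
rng = np.random.default_rng(0)
faces = rng.normal(size=(20, 64))
mean_face, eigenfaces = train_eigenfaces(faces, k=5)

# each centered image is expressed as a linear combination of the eigenfaces
weights = (faces - mean_face) @ eigenfaces.T
```

The SVD route is standard practice when images have far more pixels than there are training images.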


A new image is projected into the face space.

Create a vector of weights that describes this image.

The distance from the original image to this eigenface representation is compared.

If within certain thresholds, then it is a recognized face.

    Facial Recognition: PCA Recognition
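The recognition test can be sketched as follows. The two thresholds and all names are illustrative assumptions, not values from the slides:

```python
import numpy as np

def recognize(new_image, mean_face, eigenfaces, known_weights,
              face_thresh=0.5, id_thresh=0.5):
    """Project a new image into face space and compare it to known faces.

    Returns (matched_index_or_None, is_a_face). Both thresholds are
    hypothetical and would be tuned on real data.
    """
    centered = new_image - mean_face
    w = eigenfaces @ centered                   # weight vector describing the image
    reconstruction = eigenfaces.T @ w           # back-projection into image space
    if np.linalg.norm(centered - reconstruction) > face_thresh:
        return None, False                      # not close to face space at all
    dists = np.linalg.norm(known_weights - w, axis=1)
    best = int(np.argmin(dists))                # closeness to known faces
    return (best, True) if dists[best] < id_thresh else (None, True)

# tiny worked example: a 4-pixel "image" space with 2 orthonormal eigenfaces
eigenfaces = np.eye(4)[:2]
mean_face = np.zeros(4)
known_weights = np.array([[1.0, 0.0], [0.0, 1.0]])
match, is_face = recognize(np.array([1.0, 0.0, 0.0, 0.0]),
                           mean_face, eigenfaces, known_weights)
```

The distance to the reconstruction implements the "close to face space" check; the distance between weight vectors implements the "close to a known face" check.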


Independent component analysis (ICA) is a method for finding underlying factors or components from multivariate (multi-dimensional) statistical data. What distinguishes ICA from other methods is that it looks for components that are both statistically independent and non-Gaussian.

A. Hyvärinen, J. Karhunen, E. Oja, Independent Component Analysis

    What is ICA?


Blind Signal Separation (BSS) or Independent Component Analysis (ICA) is the identification and separation of mixtures of sources with little prior information.

    Applications include:

Audio processing

Medical data

    Finance

    Array processing (beamforming)

    Coding

and most applications where Factor Analysis and PCA are currently used. While PCA seeks directions that represent the data best in a least-squares (|x₀ − x|²) sense, ICA seeks directions that are most independent from each other.

Often used on time series, e.g. separation of multiple targets.

    ICA


Principle 1: Nonlinear decorrelation. Find the matrix W so that for any i ≠ j, the components yᵢ and yⱼ are uncorrelated, and the transformed components g(yᵢ) and h(yⱼ) are uncorrelated, where g and h are some suitable nonlinear functions.

Principle 2: Maximum non-gaussianity. Find the local maxima of non-gaussianity of a linear combination y = Wx under the constraint that the variance of y is constant.

Each local maximum gives one independent component.

    ICA estimation principles


Given a set of observations of random variables x₁(t), x₂(t), ..., xₙ(t), where t is the time or sample index, assume that they are generated as a linear mixture of independent components: y = Wx, where W is some unknown matrix. Independent component analysis now consists of estimating both the matrix W and the yᵢ(t), when we only observe the xᵢ(t).

    ICA mathematical approach
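A small numerical illustration of the mixing model: mixing makes independent sources correlated, which is exactly what estimating W must undo. The mixing matrix here is chosen arbitrarily for the demo; in real ICA it is unknown:

```python
import numpy as np

rng = np.random.default_rng(1)
# two independent, non-Gaussian sources s1(t), s2(t)
s = rng.uniform(-1, 1, size=(2, 5000))
A = np.array([[1.0, 0.5],       # mixing matrix (unknown in real ICA)
              [0.5, 1.0]])
x = A @ s                       # the observed mixtures x_i(t)

corr_sources = np.corrcoef(s)[0, 1]    # near zero: sources are independent
corr_mixtures = np.corrcoef(x)[0, 1]   # clearly nonzero: mixtures are correlated
```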


ICA Principle (Non-Gaussian is Independent)

The key to estimating the mixing matrix A is non-gaussianity.

The distribution of a sum of independent random variables tends toward a Gaussian distribution (by the Central Limit Theorem). The density of x₁ = s₁ + s₂ is the convolution of the source densities: f(x₁) = f(s₁) ∗ f(s₂).

Consider y = wᵀx = wᵀAs = zᵀs, where w is one of the rows of matrix W and z = Aᵀw.

y is a linear combination of the sᵢ, with weights given by the zᵢ.

Since a sum of two independent random variables is more Gaussian than the individual variables, zᵀs is more Gaussian than either of the sᵢ, and becomes least Gaussian when it equals one of the sᵢ.

So we can take w as a vector that maximizes the non-gaussianity of wᵀx. Such a w corresponds to a z with only one non-zero component, so we get back one of the sᵢ.
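The CLT argument can be checked numerically: standardised excess kurtosis (zero for a Gaussian) shrinks toward zero when independent sources are summed. A minimal sketch, with illustrative names:

```python
import numpy as np

def excess_kurtosis(y):
    """kurt(y) = E{y^4} - 3 for y standardised to zero mean, unit variance."""
    y = (y - y.mean()) / y.std()
    return np.mean(y**4) - 3.0

rng = np.random.default_rng(2)
s = rng.uniform(-1, 1, size=(2, 100_000))   # two independent sub-Gaussian sources

single = excess_kurtosis(s[0])              # about -1.2: clearly non-Gaussian
summed = excess_kurtosis(s[0] + s[1])       # about -0.6: closer to Gaussian
```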


We need a quantitative measure of non-gaussianity for ICA estimation.

Kurtosis: zero for a Gaussian (but sensitive to outliers):

kurt(y) = E{y⁴} − 3 (E{y²})²

Entropy: largest for a Gaussian (among variables of equal variance):

H(y) = −∫ f(y) log f(y) dy

Negentropy: zero for a Gaussian (but difficult to estimate):

J(y) = H(y_gauss) − H(y)

Approximations:

J(y) ≈ (1/12) E{y³}² + (1/48) kurt(y)²

J(y) ≈ [E{G(y)} − E{G(v)}]²

where v is a standard Gaussian random variable and, for example:

G(u) = (1/a) log cosh(a·u)   or   G(u) = −exp(−u²/2)
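The second negentropy approximation can be computed directly, here with the log-cosh choice of G. A sketch with illustrative names and sample sizes:

```python
import numpy as np

def negentropy_approx(y, a=1.0, seed=0):
    """J(y) ~ (E{G(y)} - E{G(v)})^2 with G(u) = (1/a) log cosh(a*u),
    where v is a standard Gaussian. Zero for Gaussian y, positive otherwise."""
    y = (y - y.mean()) / y.std()               # compare at zero mean, unit variance
    G = lambda u: np.log(np.cosh(a * u)) / a
    v = np.random.default_rng(seed).standard_normal(1_000_000)
    return (G(y).mean() - G(v).mean()) ** 2

rng = np.random.default_rng(3)
gauss = rng.standard_normal(100_000)           # Gaussian: negentropy near zero
laplace = rng.laplace(size=100_000)            # super-Gaussian: clearly non-Gaussian
```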


Centering

x ← x − E{x}

This doesn't mean that ICA cannot estimate the mean; it just simplifies the algorithm.

The ICs are also zero mean, because E{s} = W E{x}.

After ICA, add W E{x} to the zero-mean ICs.

Whitening

We transform the x's linearly so that the x̃ are white (uncorrelated, unit variance). It is done by eigenvalue decomposition (EVD):

x̃ = (E D^(−1/2) Eᵀ) x = E D^(−1/2) Eᵀ A s = Ã s

where E{x xᵀ} = E D Eᵀ.

So we only have to estimate the orthonormal matrix Ã.

An orthonormal matrix has n(n−1)/2 degrees of freedom, so for large-dimensional A we only have to estimate about half as many parameters. This greatly simplifies ICA.

Reducing the dimension of the data (keeping the dominant eigenvalues) while whitening also helps.

    Data Centering & Whitening


0) Centring = make the signals centred on zero

xᵢ ← xᵢ − E[xᵢ] for each i

1) Sphering = make the signals uncorrelated, i.e. apply a transform V to x such that Cov(Vx) = I // where Cov(y) = E[y yᵀ] denotes the covariance matrix

V = E[x xᵀ]^(−1/2) // can be done using the sqrtm function in MATLAB

x ← Vx // for all t (indexes t dropped here)

// bold lowercase refers to a column vector; bold uppercase to a matrix

Scope: to make the remaining computations simpler. It is known that independent variables must be uncorrelated, so this can be enforced before proceeding to the full ICA.

    Computing the pre-processing steps for ICA
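The two pre-processing steps can be sketched as follows, using the EVD-based whitening from the previous slide (names are illustrative):

```python
import numpy as np

def center_and_whiten(x):
    """Center each signal and whiten via eigenvalue decomposition (EVD).

    x: (n_signals, n_samples). Returns z with sample covariance Cov(z) = I.
    """
    x = x - x.mean(axis=1, keepdims=True)      # 0) centring: x_i <- x_i - E[x_i]
    cov = x @ x.T / x.shape[1]                 # E[x x^T] = E D E^T
    d, E = np.linalg.eigh(cov)
    V = E @ np.diag(d ** -0.5) @ E.T           # V = E D^(-1/2) E^T = E[x x^T]^(-1/2)
    return V @ x                               # 1) sphering: x <- V x

rng = np.random.default_rng(4)
s = rng.uniform(-1, 1, size=(2, 10_000))
x = np.array([[2.0, 1.0], [1.0, 1.0]]) @ s     # correlated mixtures
z = center_and_whiten(x)                       # now uncorrelated, unit variance
```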


Computing the rotation step

Fixed-Point Algorithm

Input: X

Random init of W

Iterate until convergence:

  S = Wᵀ X

  W = (1/t) X g(S)ᵀ

  W = W (WᵀW)^(−1/2)

Output: W, S

This is based on the maximisation of an objective function G(.) which contains an approximate non-Gaussianity measure:

  Obj = (1/t) Σₜ G(Wᵀxₜ) + Λ(WᵀW − I)

  ∂Obj/∂W = (1/t) X g(WᵀX)ᵀ + ΛW = 0

where g(.) is the derivative of G(.), W is the rotation transform sought, and Λ is a Lagrange multiplier to enforce that W is an orthogonal transform, i.e. a rotation. Solve by fixed-point iterations. The effect of Λ is an orthogonal de-correlation.

The overall transform to take X back to S is then (WᵀV).

There are several g(.) options; each will work best in special cases. See the FastICA sw / tut for details.
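A minimal sketch of the fixed-point iteration above, using g(u) = tanh(u) (the derivative of the log-cosh G). The implementation details are a plausible reading of the slide's symmetric update, not the exact FastICA code:

```python
import numpy as np

def orthogonalize(W):
    """Symmetric orthogonal de-correlation: W <- W (W^T W)^(-1/2)."""
    d, E = np.linalg.eigh(W.T @ W)
    return W @ E @ np.diag(d ** -0.5) @ E.T

def fixed_point_ica(X, n_iter=200, seed=0):
    """Estimate the rotation W on whitened data X of shape (n_signals, n_samples)."""
    n, t = X.shape
    W = np.random.default_rng(seed).standard_normal((n, n))  # random init of W
    W = orthogonalize(W)
    for _ in range(n_iter):
        S = W.T @ X                            # current source estimate S = W^T X
        gS = np.tanh(S)                        # g(.) = tanh, derivative of log cosh
        # per-column fixed-point update: w <- E{x g(w^T x)} - E{g'(w^T x)} w
        W = X @ gS.T / t - W * (1 - gS**2).mean(axis=1)
        W = orthogonalize(W)
    return W, W.T @ X

# demo: whiten two mixed uniform sources, then recover them
rng = np.random.default_rng(5)
s = rng.uniform(-1, 1, size=(2, 10_000))
x = np.array([[2.0, 1.0], [1.0, 1.0]]) @ s
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(x @ x.T / x.shape[1])
X = E @ np.diag(d ** -0.5) @ E.T @ x           # whitened data
W, S = fixed_point_ica(X)                      # S recovers the sources up to
                                               # permutation and sign
```

The extra −E{g′(wᵀx)}·w term is the standard Newton-style correction in the FastICA fixed point; dropping it still works but converges more slowly.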


Two architectures for performing ICA on images.

(a) Architecture I, for finding statistically independent basis images. Performing source separation on the face images produced IC images in the rows of U.

(b) The gray values at pixel location i are plotted for each face image. ICA in Architecture I finds weight vectors in the directions of statistical dependencies among the pixel locations.

(c) Architecture II, for finding a factorial code. Performing source separation on the pixels produced a factorial code in the columns of the output matrix, U.

(d) Each face image is plotted according to the gray values taken on at each pixel location. ICA in Architecture II finds weight vectors in the directions of statistical dependencies among the face images.
