20111029 Digital Instrumentation 6 v1



Sensor Fusion
EE561

Are you ready for a 2nd opinion?

Janaka Wijayakulasooriya
PhD, MIEEE
Senior Lecturer
Department of Electrical and Electronic Engineering
University of Peradeniya
[email protected]


Can you see anything?


    With Thermal Sensor Fusion


A new dimension to vision


You are no longer safe behind bushes


Sensor Fusion: Definition

Sensor fusion is the combining of sensory data, or data derived from sensory data, such that the resulting information is in some sense (e.g. accuracy, robustness) better than would be possible if these sources were used individually.


Sensor Fusion Models: Complementary Type

Sensors do not depend on each other directly.

They can be combined to establish a more complete picture of the phenomenon being observed, and hence the combined sensor data set is more complete.

Examples: the use of multiple cameras, each observing a different part of a room; four radars around a geographical region, which together provide a complete picture of the area surrounding the region.

The fusion algorithm can be as simple as appending the individual data sets, as in the sketch below.
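A minimal sketch of complementary fusion, assuming each sensor reports detections from its own, non-overlapping region; the names (camera_a, camera_b, fuse_complementary) and the numbers are illustrative, not from the slides.

# Complementary fusion: each sensor covers a different part of the scene,
# so the fused view is simply the union (concatenation) of the reports.
from typing import Dict, List, Tuple

# Hypothetical detections per camera: (x, y) positions inside each camera's own region.
camera_a: List[Tuple[float, float]] = [(0.5, 1.2), (1.8, 0.3)]   # covers the left half of the room
camera_b: List[Tuple[float, float]] = [(4.1, 2.0)]               # covers the right half of the room

def fuse_complementary(views: Dict[str, List[Tuple[float, float]]]) -> List[Tuple[float, float]]:
    """Append the non-overlapping views to obtain a complete picture."""
    fused: List[Tuple[float, float]] = []
    for detections in views.values():
        fused.extend(detections)
    return fused

print(fuse_complementary({"cam_a": camera_a, "cam_b": camera_b}))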


Sensor Fusion Models: Competitive Type

Each sensor delivers an independent measurement of the same attribute or feature.

Fusion of the same type of data from different sensors, or fusion of measurements taken by a single sensor at different instants, is possible.

Provides robustness and fault tolerance, because comparison with another competitive sensor can be used to detect faults.

Can provide a degraded level of service in the presence of faults.

Competing sensors in this system do not necessarily have to be identical.

A simple competitive scheme is sketched below.
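A minimal sketch of competitive fusion, assuming three redundant sensors measuring the same quantity; the median-plus-threshold scheme and the numbers are illustrative, not a method prescribed on the slides.

# Competitive fusion: redundant sensors measure the same quantity, the median gives
# a robust fused value, and large deviations from it flag suspected faulty sensors.
from statistics import median
from typing import List, Tuple

def fuse_competitive(readings: List[float], fault_threshold: float) -> Tuple[float, List[int]]:
    """Return a robust fused value and the indices of suspected faulty sensors."""
    fused = median(readings)
    faulty = [i for i, r in enumerate(readings) if abs(r - fused) > fault_threshold]
    return fused, faulty

# Three redundant temperature sensors; the third has drifted badly (illustrative numbers).
fused, faulty = fuse_competitive([24.8, 25.1, 31.9], fault_threshold=2.0)
print(f"fused = {fused:.1f} C, suspected faulty sensors = {faulty}")

Even with the faulty third sensor, the fused value stays close to the healthy readings, which is the degraded but still usable service mentioned above.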


Sensor Fusion Models: Cooperative Type

Data provided by two independent sensors are used to derive information that would not be available from a single sensor.

Example: a stereoscopic vision system. By combining the two-dimensional (2D) images from two cameras located at slightly different angles of incidence (viewed from two image planes), a three-dimensional (3D) image of the observed scene can be determined.

Cooperative sensor fusion is difficult to design, and the resulting data are sensitive to the inaccuracies of all the individual sensors. A minimal depth-from-disparity sketch follows.
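A minimal sketch of the cooperative (stereo) case, using the standard rectified-stereo relation Z = f·B/d for a pinhole camera pair; the relation is textbook material rather than something worked out on the slides, and the numbers are illustrative.

# Cooperative fusion: two 2D camera views jointly yield depth (a 3D quantity)
# via the rectified-stereo relation  Z = f * B / d.
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (metres) of a point seen by both cameras of a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 12 cm baseline, 35 px disparity.
print(f"estimated depth = {depth_from_disparity(700.0, 0.12, 35.0):.2f} m")

Note how an error in either camera's measured disparity propagates directly into the depth estimate, which illustrates the sensitivity mentioned above.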


    Stereoscopic Camera


Levels of Fusion

Raw-level fusion: the sensor outputs are combined directly.
Example: averaging the temperature readings in a room.

Feature-level fusion: information (features) is extracted from the sensor data before combining.
Example: in the fish classifier, the features length and lightness were extracted from the vision inputs.

Decision/action-level fusion: fusion after decisions have been taken based on the individual sensors. The sketch below contrasts raw-level and decision-level fusion.
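A small sketch contrasting raw-level and decision-level fusion for a simple threshold alarm; the scenario, threshold and readings are illustrative assumptions, not from the slides.

# Raw level fuses the readings first and decides once;
# decision level lets each sensor decide, then takes a majority vote.
from statistics import mean

readings = [36.0, 37.0, 20.0]   # hypothetical temperatures; the third sensor reads low
THRESHOLD = 35.0                # alarm threshold, chosen for illustration

raw_level_alarm = mean(readings) > THRESHOLD          # fuse the raw data, then decide
votes = [r > THRESHOLD for r in readings]             # each sensor decides on its own
decision_level_alarm = sum(votes) > len(votes) / 2    # fuse the decisions by majority vote

print(f"raw-level alarm: {raw_level_alarm}, decision-level alarm: {decision_level_alarm}")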


Fusion Methods

Probabilistic and statistical models:
Bayesian reasoning
Evidence theory
Robust statistics
Recursive operators

Least-squares (LS) and mean-square methods:
Kalman filter
Optimization
Regularization
Uncertainty ellipsoids

Heuristic methods:
ANNs
Fuzzy logic
Approximate reasoning
Computer vision techniques


Raw data level fusion: Case Study

Two sensors measure the same temperature T and report readings T1 and T2. How should the readings be combined into a single estimate of T?


Sensor Models

(Figure: the sensor model shown as a conditional PDF P(x|z).)


Consider another sensor

(Figure: the second sensor's conditional PDF P(x|z).)


How can we combine them?

Let Xhat = (1 - W) z1 + W z2

E(Xhat) = ?

E[(Xhat - x)^2] = ?
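A worked answer to the two questions above, following the standard minimum-variance derivation; it assumes both sensors are unbiased, with independent noise of variances sigma_1^2 and sigma_2^2.

% Assuming E[z_1] = E[z_2] = x (unbiased sensors) with independent noise
% of variances \sigma_1^2 and \sigma_2^2:
\begin{align*}
\hat{X} &= (1-W)\,z_1 + W\,z_2 \\
E[\hat{X}] &= (1-W)\,x + W\,x = x \quad \text{(unbiased for any } W\text{)} \\
\operatorname{Var}(\hat{X}) &= (1-W)^2 \sigma_1^2 + W^2 \sigma_2^2 \\
\frac{d}{dW}\operatorname{Var}(\hat{X}) &= -2(1-W)\sigma_1^2 + 2W\sigma_2^2 = 0
  \;\Rightarrow\; W^* = \frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2} \\
\operatorname{Var}(\hat{X})\big|_{W^*} &= \frac{\sigma_1^2\,\sigma_2^2}{\sigma_1^2 + \sigma_2^2}
  \;\le\; \min(\sigma_1^2,\, \sigma_2^2)
\end{align*}

So the fused estimate weights each reading by the other sensor's variance, and its variance is never worse than that of the better sensor.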


Combined Measurement

(Figure: the fused conditional PDF P(x|z1, z2).)
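A minimal numeric sketch of the combined measurement, assuming Gaussian sensor models and using the optimal weight derived above; the readings and variances are illustrative.

# Fusing two Gaussian measurements of the same quantity by inverse-variance weighting.
# The fused variance is smaller than either individual sensor's variance.
def fuse_gaussian(z1: float, var1: float, z2: float, var2: float):
    """Return the fused mean and variance of two independent, unbiased Gaussian readings."""
    w = var1 / (var1 + var2)                 # optimal weight on the second reading
    fused_mean = (1.0 - w) * z1 + w * z2
    fused_var = (var1 * var2) / (var1 + var2)
    return fused_mean, fused_var

# Illustrative temperature readings: sensor 1 is noisier than sensor 2.
mean, var = fuse_gaussian(z1=26.0, var1=4.0, z2=24.5, var2=1.0)
print(f"fused estimate = {mean:.2f}, fused variance = {var:.2f}")   # 24.80, 0.80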


Example: the sea bass vs. salmon fish-classification problem


Which feature(s)?

Using length as a feature


Which feature(s)?

Using lightness as a feature


    More features

    Linearly separable decision boundary


Overly complex decision boundaries

Over-tuned decision boundary


    More complex decision boundaries

    Optimized decision boundary


Bayesian Decision Theory

Let's revisit the example of sea bass and salmon. Let w represent the class:

w = w1: sea bass
w = w2: salmon

Prior probabilities P(w):

P(w1): the probability that the next fish is a sea bass
P(w2): the probability that the next fish is a salmon

The priors reflect our prior knowledge of how likely we are to get a sea bass or a salmon before the fish actually appears.

Can we make a decision based only on the priors?


Improving the simple classifier

Can we use P(w1) > P(w2) or P(w1) < P(w2) to decide whether the next fish is a sea bass or a salmon? What if we have to predict many fish?

Suppose, in addition to the priors, we have:

A measurement (feature) vector x = {x1, ..., xN} for the subject (example: x1 = lightness, x2 = length).

The class-conditional probability density functions p(x|w). Example: p(x2|w1) is the probability distribution of the length of the fish, given that it is a sea bass.

Bayes' theorem can then be used to calculate the posterior probabilities P(w|x).


Bayes Formula

P(wj|x) = p(x|wj) P(wj) / p(x)

How do we find p(x)? It is the sum over the classes: p(x) = p(x|w1) P(w1) + p(x|w2) P(w2).

Informally, this can be expressed in English as:

posterior = (likelihood × prior) / evidence


Bayes Classifier

Bayes' formula shows that, by observing the value of x, we can convert the prior probability P(wj) into the a posteriori (posterior) probability P(wj|x).

For the purpose of classification, what matters is:

The likelihood p(x|wj)
The prior P(wj)

The evidence p(x) can be considered just a scaling factor, which makes sure that the individual posterior probabilities sum to 1.


Class-conditional PDFs of the two classes

(Figure: the two class-conditional PDFs, normalized so that the area under each curve is 1.)


Posteriors

Let the priors be P(w1) = 1/3 and P(w2) = 2/3. (A small numeric sketch of the resulting posteriors follows.)
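A sketch of the posterior computation for the two-class fish example. Only the priors P(w1) = 1/3 and P(w2) = 2/3 come from the slide; the Gaussian class-conditional densities stand in for the curves in the missing figure and are purely illustrative.

# Posterior computation: P(wj|x) = p(x|wj) P(wj) / p(x), with p(x) as the evidence.
import math

def gaussian_pdf(x: float, mu: float, sigma: float) -> float:
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

PRIORS = {"w1": 1.0 / 3.0, "w2": 2.0 / 3.0}                  # from the slide
LIKELIHOODS = {                                              # assumed class-conditional models
    "w1": lambda x: gaussian_pdf(x, mu=11.0, sigma=1.5),     # e.g. p(lightness | sea bass)
    "w2": lambda x: gaussian_pdf(x, mu=9.0, sigma=1.0),      # e.g. p(lightness | salmon)
}

def posteriors(x: float) -> dict:
    """Weight each likelihood by its prior, then normalize by the evidence p(x)."""
    unnormalized = {w: LIKELIHOODS[w](x) * PRIORS[w] for w in PRIORS}
    evidence = sum(unnormalized.values())
    return {w: v / evidence for w, v in unnormalized.items()}

print(posteriors(10.0))   # posteriors for an observed feature value x = 10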


Classification based on Posteriors

Now it is sensible to set the decision rule as:

If P(w1|x) > P(w2|x), select class w1.
If P(w1|x) < P(w2|x), select class w2.

In order to justify this decision, we have to calculate the probability of error whenever we make a decision based on the measurement x and the priors.


Minimizing probability of error

The average probability of error is:

P(error) = ∫ P(error|x) p(x) dx

So, if P(error|x) can be minimized for each individual decision, P(error) is minimized.

If we apply the Bayes decision rule, then P(error|x) = min[ P(w1|x), P(w2|x) ].


Bayes Decision Rule: elimination of p(x)

Since p(x) (the evidence) in Bayes' formula just ensures that P(w1|x) + P(w2|x) = 1, we can eliminate it from the decision rule.

Hence, the decision can be made based only on:

p(x|w1) P(w1) > p(x|w2) P(w2): decide w1
p(x|w1) P(w1) < p(x|w2) P(w2): decide w2

Special cases:

When p(x|w1) = p(x|w2), the decision is based only on the priors.
When P(w1) = P(w2), the decision is based only on the likelihoods.

A minimal implementation of this rule is sketched below.
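A minimal sketch of the decision rule above; the helper name bayes_decide and the numeric likelihood values are illustrative, and only the priors 1/3 and 2/3 come from the earlier slides.

# Bayes decision rule without the evidence term: compare likelihood x prior
# for each class and pick the class with the larger product.
from typing import Dict

def bayes_decide(likelihoods: Dict[str, float], priors: Dict[str, float]) -> str:
    """Return the class w that maximizes p(x|w) * P(w) for the observed x."""
    return max(priors, key=lambda w: likelihoods[w] * priors[w])

# Suppose that, for the observed feature value x, the class-conditional densities are
# p(x|w1) = 0.30 and p(x|w2) = 0.12 (hypothetical), with priors P(w1) = 1/3, P(w2) = 2/3.
decision = bayes_decide({"w1": 0.30, "w2": 0.12}, {"w1": 1 / 3, "w2": 2 / 3})
print(f"decide {decision}")   # 0.30 * 1/3 = 0.10  >  0.12 * 2/3 = 0.08, so w1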


Generalization of Bayes Classifier

Consider what happens if we allow:

More than one feature: a feature vector in a feature space
More than two states of nature: many classes
Other actions, e.g. rejection
A loss function more general than the probability of error


    Fusion at Decision Level: Bayes Reasoning


Example

Consider an air surveillance detector; the quantity of interest x can be in one of 3 states.

Suppose a single sensor observes x and returns one of 3 values z.


Sensor Model: Likelihood Matrix

The sensor can be modeled in the form of a likelihood matrix, which gives P(z|x).

Consider P(z|x) for two sensors. (A sketch with hypothetical numbers follows.)
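A sketch of a likelihood matrix and the single-sensor posterior; the matrix entries are hypothetical stand-ins for the numbers in the missing figure, and a uniform prior is assumed.

# Hypothetical likelihood matrix P(z|x) for one sensor.
# Rows: true state x1..x3; columns: sensor reading z1..z3. Each row sums to 1.
P_Z_GIVEN_X = [
    [0.7, 0.2, 0.1],   # P(z | x = x1)
    [0.2, 0.6, 0.2],   # P(z | x = x2)
    [0.1, 0.2, 0.7],   # P(z | x = x3)
]

def posterior_from_reading(likelihood_matrix, prior, z_index):
    """P(x|z) = C * P(z|x) * P(x): take the reading's column, weight by the prior, normalize."""
    unnormalized = [row[z_index] * p for row, p in zip(likelihood_matrix, prior)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]

uniform_prior = [1 / 3, 1 / 3, 1 / 3]
print(posterior_from_reading(P_Z_GIVEN_X, uniform_prior, z_index=0))   # P(x | z = z1)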



    Posterior Probability


Combining Sensors

Assuming that the sensor observations are conditionally independent given x:

P(x | z1, z2, ..., zN) = C · P(z1|x) · P(z2|x) · ... · P(zN|x) · P(x)

Therefore, for two sensors:

P(x | z1, z2) = C · P(z1|x) · P(z2|x) · P(x)

where C is a normalizing constant. (A numeric sketch continuing the hypothetical likelihood matrices follows.)
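A numeric sketch of the product fusion rule above, using two hypothetical likelihood matrices (the slide's actual matrices are in the missing figures) and a uniform prior.

# Fusing two sensor readings: P(x|z1,z2) = C * P(z1|x) * P(z2|x) * P(x).
P_Z_GIVEN_X_S1 = [[0.7, 0.2, 0.1],
                  [0.2, 0.6, 0.2],
                  [0.1, 0.2, 0.7]]   # sensor 1: P(z|x), rows are states x1..x3
P_Z_GIVEN_X_S2 = [[0.6, 0.3, 0.1],
                  [0.3, 0.5, 0.2],
                  [0.1, 0.3, 0.6]]   # sensor 2: P(z|x)

def fuse_two_sensors(m1, m2, prior, z1_index, z2_index):
    """Normalize P(z1|x) * P(z2|x) * P(x) over the states x."""
    unnormalized = [row1[z1_index] * row2[z2_index] * p
                    for row1, row2, p in zip(m1, m2, prior)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]

# Both sensors report reading z1 (index 0), with a uniform prior over the three states.
print(fuse_two_sensors(P_Z_GIVEN_X_S1, P_Z_GIVEN_X_S2, [1 / 3, 1 / 3, 1 / 3], 0, 0))

Because both sensors independently point to state x1, the fused posterior is much more confident in x1 than either sensor alone.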


    [email protected]