
1

Collaborative Processing in Sensor Networks

Lecture 4 - Distributed In-network Processing

Hairong Qi, Associate Professor
Electrical Engineering and Computer Science
University of Tennessee, Knoxville
http://www.eecs.utk.edu/faculty/qi
Email: [email protected]

Lecture Series at ZheJiang University, Summer 2008

2

Research Focus - Recap

• Develop energy-efficient collaborative processing algorithms with fault tolerance in sensor networks
  – Where to perform collaboration?
    – Computing paradigms
  – Who should participate in the collaboration?
    – Reactive clustering protocols
    – Sensor selection protocols
  – How to conduct collaboration?
    – In-network processing
    – Self deployment

3

A Signal Processing and Fusion Hierarchy for Target Classification

[Figure: the signal processing and fusion hierarchy. Each node (node 1 … node i … node N) performs local processing on the time samples x_m(t_1) … x_m(t_k) of each sensing modality m = 1 … M; temporal fusion combines the per-interval results within each modality; multi-modality fusion combines the M modalities within a node; multi-sensor fusion combines results across nodes. A mobile-agent middleware underlies the whole hierarchy.]

4

Signal - Acoustic/Seismic/IR

5

Local Signal Processing

[Flowchart: local signal processing pipeline]

Feature extraction: the time-series signal feeds three branches – amplitude statistics; power spectral density (PSD) with peak selection, yielding shape statistics; and wavelet analysis, yielding wavelet coefficients. Together these form a 26-element feature vector.

Classifier design: feature normalization and principal component analysis (PCA), followed by target classification with a kNN classifier.
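A minimal sketch of this pipeline; the slide does not spell out the 26 features, so the amplitude, PSD-peak, and band-energy features below are illustrative stand-ins:

```python
import numpy as np
from scipy.signal import welch
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

def extract_features(x, fs=1024):
    """Illustrative stand-in for the slide's 26-element feature vector."""
    amp = [x.mean(), x.std(), x.max(), x.min()]        # amplitude statistics
    f, pxx = welch(x, fs=fs, nperseg=256)              # power spectral density
    top = np.argsort(pxx)[-5:]                         # peak selection: 5 strongest bins
    shape = list(f[top]) + list(pxx[top])              # shape statistics
    # stand-in for wavelet analysis: log-energy in octave bands
    bands = [np.log(pxx[(f >= lo) & (f < 2 * lo)].sum() + 1e-12)
             for lo in (8, 16, 32, 64)]
    return np.array(amp + shape + bands)

# classifier design: normalization -> PCA -> kNN
clf = make_pipeline(StandardScaler(), PCA(n_components=10),
                    KNeighborsClassifier(n_neighbors=5))
```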

6

A Signal Processing and Fusion Hierarchy for Target Classification

[Figure: the fusion hierarchy of slide 3, repeated here to introduce the temporal-fusion stage.]

7

Temporal Fusion

• Fuse all the 1-second sub-interval local processing results that correspond to the same event (an event usually lasts about 10 seconds)

• Majority voting

$$\varphi_i = \arg\max_{c \in [1, C]} \omega_c$$

where $C$ is the number of possible local processing results and $\omega_c$ is the number of occurrences of local output $c$.
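A minimal sketch of this voting rule, with one class label per 1-second sub-interval:

```python
from collections import Counter

def temporal_fusion(labels):
    """Majority voting over the per-second local results of one event."""
    counts = Counter(labels)              # omega_c: occurrences of each local output c
    return counts.most_common(1)[0][0]    # argmax_c omega_c

print(temporal_fusion([1, 1, 2, 1, 3, 1, 1, 2, 1, 1]))  # -> 1
```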

8

A Signal Processing and Fusion Hierarchy for Target Classification

[Figure: the fusion hierarchy of slide 3, repeated here to introduce the multi-modality-fusion stage.]

9

Multi-modality Fusion

• Different sensing modalities can compensate for each other's sensing capability and provide a comprehensive view of the event

• Treat results from different modalities as independent classifiers – classifier fusion

• Majority voting won't work: with only two modalities, a disagreement cannot be broken by a majority

• Behavior-Knowledge Space (BKS) algorithm (Huang & Suen)

Assumption: 2 modalities, 3 kinds of targets, 100 samples in the training set. Then there are 9 possible classification combinations:

(c1, c2)   samples from each class   fused result
1,1        10/3/3                    1
1,2        3/0/6                     3
1,3        5/4/5                     1,3
…          …                         …
3,3        0/0/6                     3
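A minimal sketch of the BKS table, populated with the slide's example cell (acoustic says 1, seismic says 2; training counts 3/0/6):

```python
# Behavior-Knowledge Space: for each combination of classifier outputs,
# remember which true class occurred most often in training.
from collections import defaultdict

bks = defaultdict(lambda: defaultdict(int))

def train(bks, decisions, true_class):
    bks[decisions][true_class] += 1      # decisions: tuple of per-modality labels

def fuse(bks, decisions):
    cell = bks[decisions]
    return max(cell, key=cell.get) if cell else None

# slide's example cell: (c1, c2) = (1, 2) with counts 3/0/6 -> fused class 3
for c, n in ((1, 3), (3, 6)):
    for _ in range(n):
        train(bks, (1, 2), c)
print(fuse(bks, (1, 2)))  # -> 3
```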

10

A Signal Processing and Fusion Hierarchy for Target Classification

[Figure: the fusion hierarchy of slide 3, repeated here to introduce the multi-sensor-fusion stage.]

11

Value-based vs. Interval-based Fusion

• Interval-based fusion can provide fault tolerance
• Interval integration – overlap function

  – Assume each sensor in a cluster measures the same parameter. The integration algorithm constructs a simple function (the overlap function) from the outputs of the sensors in the cluster and can resolve it at different resolutions as required.

[Figure: seven sensor output intervals w1s1 … w7s7 and the resulting overlap function O(x), which counts how many intervals cover each point x (values 0 to 3).]

Crest: the highest, widest peak of the overlap function
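A minimal sketch of building O(x) from the intervals and locating the crest (a segment-based sweep; a sketch, not the paper's implementation):

```python
def overlap_function(intervals):
    """Piecewise-constant overlap function O(x): list of (x_start, x_end, count) segments."""
    points = sorted({p for iv in intervals for p in iv})
    segs = []
    for a, b in zip(points, points[1:]):
        count = sum(1 for lo, hi in intervals if lo <= a and b <= hi)
        segs.append((a, b, count))
    return segs

def crest(segs):
    """Highest, then widest, segment of the overlap function (the fused estimate)."""
    peak = max(c for _, _, c in segs)
    widest = max((s for s in segs if s[2] == peak), key=lambda s: s[1] - s[0])
    return peak, (widest[0], widest[1])
```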

12

Fault Tolerance Comparison of Different Overlap Functions

n: number of sensors
f: number of faulty sensors
M: finds the smallest interval that contains all intersections of n−f of the intervals
S: returns the interval [a, b] where a is the (f+1)-th left endpoint and b is the (n−f)-th right endpoint
Ω: overlap function
N: interval over which the overlap function lies in the range [n−f, n]
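A minimal sketch of the S function from the list above:

```python
def fuse_S(intervals, f):
    """Fault-tolerant interval fusion: a = (f+1)-th smallest left endpoint,
    b = (n-f)-th smallest right endpoint, tolerating up to f faulty sensors."""
    lefts = sorted(lo for lo, _ in intervals)
    rights = sorted(hi for _, hi in intervals)
    return lefts[f], rights[len(intervals) - f - 1]

print(fuse_S([(1, 4), (2, 5), (3, 6), (0, 10), (9, 12)], f=1))  # -> (1, 10)
```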

13

Multi-sensor Fusion for Target Classification

• Generation of local confidence ranges (for example, at each node i, run kNN for each k = 5, …, 15)

• Apply the integration algorithm to the confidence ranges generated at each node to construct an overlap function

The confidence range of each class is [smallest, largest] confidence level in that column:

             Class 1        Class 2        …    Class n
k=5          3/5            2/5            …    0
k=6          2/6            3/6            …    1/6
…            …              …              …    …
k=15         10/15          4/15           …    1/15
conf. range  [2/6, 10/15]   [4/15, 3/6]    …    [0, 1/6]
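A minimal sketch of generating one node's confidence ranges, assuming scikit-learn's kNN vote fractions stand in for the slide's per-k confidence levels:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def confidence_ranges(X_train, y_train, x, ks=range(5, 16)):
    """Per class, the [min, max] kNN vote fraction over k = 5..15 (one table column)."""
    votes = []
    for k in ks:
        knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
        votes.append(knn.predict_proba([x])[0])   # vote fraction per class
    votes = np.array(votes)
    return {c: (votes[:, j].min(), votes[:, j].max())
            for j, c in enumerate(knn.classes_)}
```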

14

Distributed Integration Scheme

• Integration techniques must be robust and fault-tolerant to handle uncertainty and faulty sensor readouts
• Provide progressive accuracy
• A 1-D array serves to represent the partially-integrated overlap function
• A mobile-agent-based framework is employed

15

Mobile-agent-based Multi-sensor Fusion for Target Classification

Node 1 – confidence range CR1: Class 1 [2/6, 10/15], Class 2 [3/15, 3/6], …
Node 2 – confidence range CR2: Class 1 [3/6, 4/5], Class 2 [0, 5/15], …
Node 3 – confidence range CR3: Class 1 [3/5, 10/12], Class 2 [1/15, 2/8], …

The mobile agent carries CR1 to node 2. At node 2, the fusion code generates the partially integrated confidence range CR12: Class 1 [3/6, 10/15], Class 2 [3/15, 5/15], …. Fusion result at node 2: class 1 (58%).

The mobile agent then carries CR12 to node 3. At node 3, the fusion code generates the partially integrated confidence range CR123: Class 1 [3/5, 10/15], Class 2 [3/15, 2/8], …. Fusion result at node 3: class 1 (63%).
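A minimal sketch of the progressive step, assuming each partial integration intersects the carried range with the local one; under that assumption the numbers above are reproduced exactly:

```python
from fractions import Fraction as F

def integrate(cr_a, cr_b):
    """Progressive integration: intersect the carried confidence range with the local one."""
    return {c: (max(cr_a[c][0], cr_b[c][0]), min(cr_a[c][1], cr_b[c][1]))
            for c in cr_a}

cr1 = {1: (F(2, 6), F(10, 15)), 2: (F(3, 15), F(3, 6))}
cr2 = {1: (F(3, 6), F(4, 5)),   2: (F(0),     F(5, 15))}
cr3 = {1: (F(3, 5), F(10, 12)), 2: (F(1, 15), F(2, 8))}

cr12 = integrate(cr1, cr2)    # {1: (1/2, 2/3), 2: (1/5, 1/3)} = CR12 above
cr123 = integrate(cr12, cr3)  # {1: (3/5, 2/3), 2: (1/5, 1/4)} = CR123 above
```

The reported confidences (58%, 63%) appear to be the midpoints of the winning class's fused intervals: (1/2 + 2/3)/2 ≈ 0.58 and (3/5 + 2/3)/2 ≈ 0.63.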

16

An Example of Multi-sensor Fusion

[Figure: confidence ranges (three classes) generated at four nodes]
node 1: [0.10, 0.29], [0.46, 0.65], [0.10, 0.21]
node 2: [0.05, 0.14], [0.05, 0.41], [0.22, 0.58]
node 3: [0.05, 0.15], [0.05, 0.15], [0.49, 0.59]
node 4: [0.08, 0.16], [0.08, 0.16], [0.51, 0.60]

The crest of the overlap function is scored as

$$c = h \times w \times acc$$

where h is the height of the highest peak, w is the width of the peak, and acc is the confidence at the center of the peak.

17

Example of Multi-sensor Integration Result at Each Stop

          stop 1          stop 2          stop 3          stop 4
          c      acc      c      acc      c      acc      c      acc
class 1   1      0.2      0.5    0.125    0.75   0.125    1      0.125
class 2   2.3    0.575    4.55   0.35     0.6    0.1      0.75   0.125
class 3   0.7    0.175    0.5    0.25     3.3    0.55     3.45   0.575

18

Performance Evaluation of Target Classification Hierarchy

[Figure: evaluation field layout – North-South, East-West, Center.]

19

SITEX02 Scenario Setup

• Acoustic sampling rate: 1024 Hz; seismic sampling rate: 512 Hz
• Target types: AAV, DW, and HMMWV
• Collaborative work with two other universities (Penn State, Wisconsin)
• Data set is divided into 3 partitions: q1, q2, q3
• Cross validation:
  – M1: Training (q2+q3); Testing (q1)
  – M2: Training (q1+q3); Testing (q2)
  – M3: Training (q1+q2); Testing (q3)

20

Confusion Matrices of Classification on SITEX02

Acoustic (75.47%, 81.78%)
Seismic (85.37%, 89.44%)
Multi-modality fusion (84.34%)
Multi-sensor fusion (96.44%)

        AAV   DW   HMV
AAV     29    2    1
DW      0     18   8
HMV     0     2    23

21

Result from BAE-Austin Demo

Compact Cluster Laydown

[Figure: sensor positions plotted in feet, 0 to 140 ft on each axis.]

Targets: Suzuki Vitara, Ford 350, Ford 250, Harley motorcycle

22

Confusion Matrices

[Confusion matrices (images): Acoustic, Seismic, Multi-modal.]

23

24

Demo

• Mobile-agent-based multi-modal multi-sensor classification

– Mobile agent migrates with a fixed itinerary (Tennessee)

– Mobile agent migrates with a dynamic itinerary realized by distributed services (Auburn)

25

run 1: car, walker
run 2: car

26

Target Detection in Sensor Networks

• Single target detection
  – Energy threshold
  – Energy decay model: $E_{obs} = \dfrac{E_{source}}{d^{\alpha}}$

• Multiple target detection
  – Why is multiple target detection necessary?
  – Requirements: energy-efficient and bandwidth-efficient
  – Multiple target detection vs. source number estimation

Source separation model:

$$S = WX$$

where S is the source matrix, X the observation matrix, and W the unmixing matrix. Assumption: size(S) = size(X). (Examples: target separation, speaker separation.)
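Where the slides pose the separation model S = WX, here is a minimal sketch using FastICA (a standard ICA routine; the lecture's own scheme is the Bayesian formulation on the following slides), under the slide's assumption that size(S) = size(X):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
S_true = np.c_[np.sin(40 * t), np.sign(np.sin(25 * t))]  # two sources
A = rng.normal(size=(2, 2))                              # unknown mixing matrix
X = S_true @ A.T                                         # observations, X = A S

# recover the sources: S = W X, with W learned by ICA
S_est = FastICA(n_components=2, random_state=0).fit_transform(X)
```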

27

Source Number Estimation (SNE)

• Task: estimate the number of sources m from the observations,

$$\hat m = \arg\max_{m} P(H_m \mid X)$$

• Algorithms for source number estimation
  – Heuristic techniques
  – Principled approaches: Bayesian estimation, Markov chain Monte Carlo methods, etc.

• Limitations
  – Centralized structure
  – Large amount of raw data transmission
  – Computation burden on the central processing unit

[Figure: centralized architecture – sensors 1 … n each send raw data to a central processing unit, which outputs the decision.]

28

Distributed Source Number Estimation Scheme

[Figure: system architecture – sensors are grouped into clusters 1 … L, each with a clusterhead; clusterheads send local decisions to a fusion center.]

Algorithm structure: with m targets present in the sensor field, each cluster l computes its local estimate, the log-likelihood L_l(m), progressively rather than in the centralized fashion; the cluster outputs L_1(m) … L_L(m) are then combined by posterior probability fusion.

29

Centralized Bayesian Estimation Approach

Bayes theorem:

$$P(H_m \mid X) = \frac{p(X \mid H_m)\,P(H_m)}{\sum_{\text{all } H} p(X \mid H)\,P(H)}$$

Evidence, integrating out the latent variable:

$$p(X \mid A, R_n, \varphi) = \prod_{t} p(x(t) \mid A, R_n, \varphi), \qquad p(x(t) \mid A, R_n, \varphi) = \int p(x(t) \mid a, A, R_n, \varphi)\,\pi(a)\,da$$

Notation:
X: sensor observation matrix
H_m: hypothesis on the number of sources
S: source matrix
A: mixing matrix, X = AS
W: unmixing matrix, S = WX, with W = (AᵀA)⁻¹Aᵀ
a: latent variable, a = WX and S = φ(a)
φ: non-linear transformation function
R_n: noise, with variance 1/β
π(·): marginal distribution of a

Log-likelihood of hypothesis H_m (Laplace approximation):

$$L(m) = \log p(x(t) \mid H_m) = \log \pi(\hat a(t)) + \frac{n-m}{2}\log\frac{\beta}{2\pi} - \frac{1}{2}\log\left|\hat A^{T}\hat A\right| - \frac{\beta}{2}\left(x(t) - \hat A\,\hat a(t)\right)^{2} + C(n, m, \beta, \gamma)$$

where C collects the remaining Occam-factor terms of the approximation, including $\sum_{j=1}^{m}\log\gamma_j$ (see the Roberts reference below).

Source: S. J. Roberts, “Independent component analysis: source assessment & separation, a Bayesian approach,” IEE Proc. on Vision, Image, and Signal Processing, 145(3), pp.149-154, 1998.
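A minimal sketch of the model-order selection $\hat m = \arg\max_m P(H_m \mid X)$, assuming a flat prior over hypotheses and a hypothetical routine log_evidence(X, m) that evaluates L(m) via the approximation above:

```python
import numpy as np

def estimate_source_number(X, log_evidence, max_m=5):
    """m_hat = argmax_m P(H_m | X); with a flat prior P(H_m),
    the posterior is proportional to the evidence p(X | H_m)."""
    ms = np.arange(1, max_m + 1)
    logL = np.array([log_evidence(X, m) for m in ms])
    post = np.exp(logL - logL.max())      # normalize in log space for stability
    post /= post.sum()
    return ms[post.argmax()], dict(zip(ms, post))
```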

30

Progressive Bayesian Estimation Approach

• Motivation
  – Reduce data transmission
  – Conserve energy

[Figure: the algorithm structure of slide 28, with the within-cluster estimation now progressive instead of centralized; the cluster outputs L_1(m) … L_L(m) still feed the posterior probability fusion.]

31

Progressive Bayesian Estimation Approach

[Figure: progressive estimation chain – sensor 1 (x_1) passes information I_1 to sensor 2 (x_2), which passes I_2 onward, …, until sensor n (x_n); the local log-likelihoods L_2(m), …, L_i(m), …, L_n(m) improve at each hop.]

Estimation accuracy is progressively improved. The transmitted information I_i includes the current mixing-matrix estimate [A]_i, the latent variable [â(t)]_i, and the partial terms Term[1]_i, Term[2]_i, …, Term[7]_i of the log-likelihood L(m) = log p(x(t) | H_m) defined for the centralized approach.

32

Progressive Update at Local Sensors

Progressive update of the log-likelihood function:

$$L_i(m) = L_{i-1}(m) + f_i(x(t))$$

Sensor i adds to the running log-likelihood a correction f_i that depends only on its own observation. The partial terms of L(m) are updated in the same recursive fashion: Term[1]_i and Term[2]_i are computed from Term[1]_{i-1} and Term[2]_{i-1} plus expressions in the local observation x_i(t), the unmixing weights w_{ik}, and the latent-variable estimate â(t); Term[4]_i adds the local reconstruction error $\sum_{k=1}^{m}\big(x_{k,i}(t)-[\hat A\,\hat a(t)]_{k,i}\big)^{2}$ to the accumulated error ε_{i-1}; Term[3]_i depends only on the updating rule of the mixing matrix A.

33

Progressive Update at Local Sensors

Similarly, Term[5]_i and Term[6]_i are updated recursively from Term[5]_{i-1} and Term[6]_{i-1} using only the local observation x_i(t), the unmixing weights w_{ik}, and the latent-variable estimate â(t); Term[7]_i depends only on the updating rule of the mixing matrix A. (L(m) is as defined for the centralized approach.)

Progressive update of the mixing matrix A: BFGS method (a quasi-Newton method).

Progressive update of the error:

$$\varepsilon_i = \varepsilon_{i-1} + \sum_{k=1}^{m}\big(x_{k,i}(t) - [\hat A\,\hat a(t)]_{k,i}\big)^{2}$$
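A minimal sketch of this accumulation pattern; local_correction below is a hypothetical stand-in for the slide's f_i and term recursions:

```python
def progressive_estimate(sensors, m, local_correction):
    """Each sensor adds its own correction f_i(x_i) to the running log-likelihood,
    so only the small state (L, partial terms) travels between sensors, never raw data."""
    state = {"L": 0.0, "terms": [0.0] * 7}       # initialized at the first sensor
    for x_i in sensors:                          # itinerary over the cluster
        f_i, state["terms"] = local_correction(x_i, state["terms"], m)
        state["L"] += f_i                        # L_i(m) = L_{i-1}(m) + f_i(x(t))
    return state["L"]
```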

34

35

An Example

36

Mobile-agent-based Implementation of Progressive Estimation

[Flowchart: mobile-agent itinerary over sensors 1, 2, 3 with observations x_1(t), x_2(t), x_3(t)]

1. At sensor 1: initialization.
2. At each subsequent sensor: update the mixing matrix A; update the terms in the log-likelihood function L(m); compute the estimated latent variable â(t); update the estimation error ε.
3. At the last sensor: output the estimate m.
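A minimal sketch of this itinerary as a plain loop (class and method names are illustrative; an actual deployment would run inside a mobile-agent middleware):

```python
class Agent:
    """Carries the partially integrated state from sensor to sensor."""
    def __init__(self, m):
        self.m, self.A, self.L, self.eps = m, None, 0.0, 0.0

    def visit(self, sensor):
        x = sensor.observe()
        self.A = sensor.update_mixing_matrix(self.A, x)   # quasi-Newton step on A
        a_hat = sensor.estimate_latent(self.A, x)         # a(t) = W x(t)
        self.L += sensor.loglik_terms(self.A, a_hat, x)   # update L(m) terms
        self.eps += sensor.residual(self.A, a_hat, x)     # update estimation error

def run(agent, itinerary):
    for sensor in itinerary:    # the agent migrates with its state, not raw data
        agent.visit(sensor)
    return agent.m, agent.L
```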

37

Distributed Fusion Scheme

• Bayes theorem
• Dempster's rule of combination
  – Utilizes probability intervals and uncertainty intervals to determine the likelihood of hypotheses based on multiple pieces of evidence

[Figure: the algorithm structure of slide 28 – the cluster log-likelihoods L_1(m) … L_L(m) feed the posterior probability fusion.]

Bayesian fusion across clusters (the clusters' observations $x_l^{(t)}$ are treated as independent given the hypothesis):

$$P(H_m \mid x^{(t)}) = \frac{p(x^{(t)} \mid H_m)\,P(H_m)}{p(x^{(t)})} = \frac{\prod_{l=1}^{L} p(x_l^{(t)} \mid H_m)\,P(H_m)}{\prod_{l=1}^{L} p(x_l^{(t)})}$$

$$\log P(H_m \mid x^{(t)}) = \sum_{l=1}^{L} c_l \log p(x_l^{(t)} \mid H_m)$$

where the weight of cluster l is its average received signal energy over its K_l sensors, $c_l = \frac{1}{K_l}\sum_{k=1}^{K_l} E_k$, with $E_k \propto 1/d_k^{2}$.

Dempster's rule of combination:

$$P(H_m \mid X) = \frac{\displaystyle\sum_{H_f \cap H_g = H_m,\, f \neq g} P(H_f \mid X)\,P(H_g \mid X)}{1 - \displaystyle\sum_{H_f \cap H_g = \emptyset,\, f \neq g} P(H_f \mid X)\,P(H_g \mid X)}$$
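A minimal sketch of Dempster's rule for two clusters' posteriors, assuming singleton hypotheses (so the only non-conflicting pairs are f = g = m):

```python
def dempster_combine(p1, p2):
    """Dempster's rule over singleton hypotheses: agreement mass renormalized
    by 1 - conflict, where conflict sums the mass of contradictory pairs."""
    conflict = sum(p1[f] * p2[g] for f in p1 for g in p2 if f != g)
    return {m: p1[m] * p2[m] / (1.0 - conflict) for m in p1}

# two clusters' posteriors over the number of targets m = 1, 2, 3
c1 = {1: 0.2, 2: 0.7, 3: 0.1}
c2 = {1: 0.1, 2: 0.6, 3: 0.3}
print(dempster_combine(c1, c2))   # sharpened consensus around m = 2
```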

38

Performance Evaluation

[Figure: target types; sensor node clustering.]

Signals: 1-second acoustic segments, 500 samples each.

Six experiments combine the source number estimation mode within clusters (centralized vs. progressive) with the inter-cluster fusion rule:

Experiment 1: centralized estimation, no fusion
Experiment 2: centralized estimation + Bayesian fusion
Experiment 3: centralized estimation + Dempster's rule
Experiment 4: progressive estimation, no fusion
Experiment 5: progressive estimation + Bayesian fusion
Experiment 6: progressive estimation + Dempster's rule

39

Average Log-likelihood Comparison

[Figure: average log-likelihood curves for Experiment 1 (centralized), Experiment 2 (centralized + Bayesian), Experiment 3 (centralized + Dempster's rule), Experiment 4 (progressive), and Experiments 5 & 6 (progressive vs. fusion), the latter shown separately for clusters 1 and 2.]

40

Output Histogram Comparison

[Figure: output histograms for Experiment 1 (centralized), Experiment 2 (centralized + Bayesian fusion), Experiment 3 (centralized + Dempster's rule), Experiment 4 (progressive), Experiment 5 (progressive + Bayesian fusion), and Experiment 6 (progressive + Dempster's rule).]

41

Performance Comparison

[Figure: comparison of kurtosis, data transmission, detection probability, and energy consumption across the six experiments.]

42

Discussion

• Developed a novel distributed multiple-target detection scheme with
  – a progressive source number estimation approach (first in the literature)
  – mobile-agent implementation
  – cluster-based probability fusion

• The approach combining progressive intra-cluster estimation with Bayesian inter-cluster fusion has the best overall performance:

                                Kurtosis   Detection     Data transmission   Energy consumption
                                           probability   (bits)              (uW*sec)
Centralized                     -1.2497    0.25          144,000             486,095
Centralized + Bayesian fusion    5.6905    0.65          128,160             433,424
Progressive + Bayesian fusion    4.8866    0.55          24,160 (16.78%)     89,242 (18.36%)

(Percentages give the cost as a fraction of the centralized scheme's.)

43

Reference

• X. Wang, H. Qi, S. Beck, H. Du, "Progressive approach to distributed multiple-target detection in sensor networks," pp. 486-503, Sensor Network Operations. Editors: Shashi Phoha, Thomas F. La Porta, Christopher Griffin. Wiley-IEEE Press, March 2006.

• H. Qi, Y. Xu, X. Wang, “Mobile-agent-based collaborative signal and information processing in sensor networks,” Proceedings of the IEEE, 91(8):1172-1183, August 2003.

• H. Qi, X. Wang, S. S. Iyengar, K. Chakrabarty, “High performance sensor integration in distributed sensor networks using mobile agents,” International Journal of High Performance Computing Applications, 16(3):325-335, Fall, 2002.