SCALABLE INFORMATION-DRIVEN SENSOR QUERYING AND ROUTING FOR AD HOC HETEROGENEOUS SENSOR NETWORKS

Posted on 23-Jan-2016

Paper by: Maurice Chu, Horst Haussecker, Feng Zhao

Presented By: D.M. Rasanjalee Himali

INTRODUCTION

Problem addressed: how to dynamically query sensors and route data in a network so that information gain is maximized while latency and bandwidth consumption are minimized.

Approach: information-driven sensor querying (IDSQ) to optimize sensor selection, and constrained anisotropic diffusion routing (CADR) to direct data routing and incrementally combine sensor measurements, so as to minimize an overall cost function.

INTRODUCTION

Use information utility measures to optimize sensor selection.

Use incremental belief updates.

Each node can evaluate an information/cost objective, make a decision, update its belief state, and route data based on the local information/cost gradient and end-user requirements.

SENSING MODEL AND MEASURE OF UNCERTAINTY

Uses standard estimation theory.

zi(t): the time-dependent measurement of sensor i
λi(t): the characteristics of sensor i
x(t): the unknown target position
h: a possibly non-linear function depending on x(t) and parameterized by λi(t)

The sensing model is zi(t) = h(x(t), λi(t)).

Characteristics λi(t) of sensor i:
Sensing modality (type of sensor i)
Sensor position xi
Noise model of sensor i
Node power reserve of sensor i
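As a concrete instance of this sensing model, an amplitude sensor whose reading decays with distance from the target can be sketched as below. The inverse-square attenuation, parameter values, and dictionary layout are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def h(x, sensor):
    """Hypothetical amplitude model: the reading decays with the squared
    distance between the target position x and the sensor position x_i."""
    a = sensor["amplitude"]                     # source amplitude (assumed known)
    xi = sensor["position"]                     # sensor position x_i
    return a / max(float(np.sum((x - xi) ** 2)), 1e-9)

def measure(x, sensor, rng):
    """z_i = h(x, lambda_i) + w_i, with additive Gaussian noise w_i whose
    standard deviation is part of the sensor characteristics lambda_i."""
    return h(x, sensor) + rng.normal(0.0, sensor["noise_std"])

rng = np.random.default_rng(0)
sensor = {"position": np.array([0.0, 0.0]),     # x_i
          "amplitude": 10.0,                    # sensing-modality parameter
          "noise_std": 0.1}                     # noise model
z = measure(np.array([3.0, 4.0]), sensor, rng)  # target at distance 5, h = 0.4
```

The noise model and node power reserve would likewise live inside the per-sensor characteristics dictionary.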

SENSING MODEL AND MEASURE OF UNCERTAINTY

BELIEF: a representation of the current a posteriori distribution of x given measurements z1, ..., zN:

p(x | z1, ..., zN)

Typically, the expected value of this distribution is taken as the estimate:

x̂ = ∫ x p(x | z1, ..., zN) dx

Residual uncertainty is approximated by the covariance:

Σ = ∫ (x − x̂)(x − x̂)ᵀ p(x | z1, ..., zN) dx
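A minimal sketch of these two statistics: represent the belief on a discrete 2-D grid (a nonparametric choice made here purely for illustration) and compute the estimate x̂ and the covariance Σ from it.

```python
import numpy as np

xs = np.linspace(-5, 5, 101)
X, Y = np.meshgrid(xs, xs)
grid = np.stack([X.ravel(), Y.ravel()], axis=1)   # candidate target positions

def mean_and_cov(belief, grid):
    """x_hat = E[x] and Sigma = E[(x - x_hat)(x - x_hat)^T] of a gridded belief."""
    p = belief / belief.sum()                     # normalize to a distribution
    x_hat = p @ grid                              # expected value
    d = grid - x_hat
    sigma = (p[:, None] * d).T @ d                # sum_k p_k d_k d_k^T
    return x_hat, sigma

# Example: an isotropic Gaussian-shaped belief centered at (1, 2)
belief = np.exp(-0.5 * np.sum((grid - np.array([1.0, 2.0])) ** 2, axis=1))
x_hat, sigma = mean_and_cov(belief, grid)
```

For this example the recovered mean is close to (1, 2) and Σ is close to the identity, up to grid discretization.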

SENSING MODEL AND MEASURE OF UNCERTAINTY

Knowledge of the measurement value zi and the sensor characteristics λi normally resides only in sensor i.

To compute the belief based on measurements from several sensors, we must pay a cost for communicating that information.

SENSING MODEL AND MEASURE OF UNCERTAINTY

Incorporating a measurement into the belief now carries a cost.

Therefore, we should intelligently choose a subset of sensor measurements that:

provide “good” information for constructing a belief state, and
minimize the communication cost of the sensor measurements.

Information content of sensor i: a measure of the information a sensor measurement can provide to a belief state.

SENSOR SELECTION

Given the current belief state, we need to incrementally update the belief by incorporating measurements from previously unconsidered sensors.

However, among all available sensors in the network, not all provide useful information that improves the estimate.

Furthermore, some information might be useful, but redundant.

The task is to select an optimal subset and an optimal order in which to incorporate these measurements into the belief update; a good order provides a faster reduction in estimation uncertainty.

SENSOR SELECTION

Assume there are N sensors labeled 1 to N, with corresponding measured values {zi}, 1 ≤ i ≤ N.

Let U ⊂ {1, ...,N} be the set of sensors whose measurements have been incorporated into the belief.

The current belief is p(x | {zi} i∈U).

SENSOR SELECTION

Information utility function

The sensor selection task is to choose the sensor that has not yet been incorporated into the belief and that provides the most information.

Def (Information utility): a function ψ: P(Rd) → R that acts on the class P(Rd) of all probability distributions on Rd and returns a real number, with d being the dimension of x.

SENSOR SELECTION

ψ assigns a value to each element p ∈ P(Rd) that indicates how spread out or uncertain the distribution p is.

Smaller values represent a more spread out distribution while larger values represent a tighter distribution.

SENSOR SELECTION

Incorporating a measurement zj, where j∉U, into the current belief state p(x | {zi} i∈U) is accomplished by further conditioning the belief on the new measurement.

Hence, the new belief state is p(x | {zi} i∈U ∪ {zj}).

SENSOR SELECTION

Incorporating a measurement zj has the effect of mapping an element of P(Rd) to another element of P(Rd).

Since ψ gives a measure of how “tight” a distribution in P(Rd) is, the best sensor j ∈ A = {1, ..., N} − U to choose is

ĵ = arg max j∈A ψ( p(x | {zi} i∈U ∪ {zj}) )

SENSOR SELECTION

However, in practice, we only have knowledge of h and λi to determine which sensor to choose.

We do not know the measurement value zj before it is sent.

Nevertheless, we wish to select the “most likely” best sensor.

Hence, it is necessary to marginalize out the particular value of zj.

SENSOR SELECTION

For any given value of zj for sensor j, we get a particular value of ψ acting on the new belief state p(x | {zi} i∈U ∪ {zj}).

For each sensor j, consider the set of all values of ψ(·) over choices of zj. Three criteria for picking from this set:

Best average case: ĵ = arg max j∈A E zj [ ψ( p(x | {zi} i∈U ∪ {zj}) ) ]

Maximizing worst case: ĵ = arg max j∈A min zj ψ( p(x | {zi} i∈U ∪ {zj}) )

Maximizing best case: ĵ = arg max j∈A max zj ψ( p(x | {zi} i∈U ∪ {zj}) )
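Since zj is unknown before querying, the best-average-case criterion can be sketched by simulating candidate measurements from the current belief and averaging the resulting utility. Everything below (grid belief, Gaussian likelihood, amplitude model, −det(Σ) utility, sample count) is an illustrative assumption, not the paper's exact implementation.

```python
import numpy as np

def predicted(grid, sensor):
    """Noise-free amplitude reading at every candidate target position."""
    d2 = np.maximum(np.sum((grid - sensor["position"]) ** 2, axis=1), 1e-9)
    return sensor["amplitude"] / d2

def likelihood(z, grid, sensor):
    """p(z | x) for every grid point x, assuming Gaussian measurement noise."""
    return np.exp(-0.5 * ((z - predicted(grid, sensor)) / sensor["noise_std"]) ** 2)

def neg_det_cov(belief, grid):
    """psi(p) = -det(Sigma): a tighter belief gives a larger utility."""
    p = belief / belief.sum()
    x_hat = p @ grid
    d = grid - x_hat
    return -np.linalg.det((p[:, None] * d).T @ d)

def expected_utility(belief, grid, sensor, n_samples=50):
    """Best average case: E_zj[ psi(p(x | {zi}, zj)) ], zj simulated from p."""
    rng = np.random.default_rng(0)
    p = belief / belief.sum()
    total = 0.0
    for _ in range(n_samples):
        x = grid[rng.choice(len(grid), p=p)]          # plausible target position
        z = (sensor["amplitude"] / max(float(np.sum((x - sensor["position"]) ** 2)), 1e-9)
             + rng.normal(0.0, sensor["noise_std"]))  # its simulated measurement
        total += neg_det_cov(belief * likelihood(z, grid, sensor), grid)
    return total / n_samples

def select_sensor(belief, grid, sensors, unused):
    """Pick j in A = {1..N} - U maximizing the expected utility."""
    return max(unused, key=lambda j: expected_utility(belief, grid, sensors[j]))

xs = np.linspace(-5, 5, 41)
X, Y = np.meshgrid(xs, xs)
grid = np.stack([X.ravel(), Y.ravel()], axis=1)
belief = np.ones(len(grid))                           # uninformative prior
sensors = {1: {"position": np.array([-4.0, 0.0]), "amplitude": 10.0, "noise_std": 0.1},
           2: {"position": np.array([4.0, 0.0]), "amplitude": 10.0, "noise_std": 0.1}}
best = select_sensor(belief, grid, sensors, unused={1, 2})
```

The worst-case and best-case criteria would replace the average in expected_utility with a min or max over the simulated zj values.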

INFORMATION UTILITY MEASURES

To quantify the information gain provided by a sensor measurement, it is necessary to define a measure of information utility.

The intuition: information content is inversely related to the “size” of the high-probability uncertainty region of the estimate of x.

Examples:
Covariance-based
Fisher information matrix
Entropy of estimation uncertainty
Volume of high-probability region

INFORMATION UTILITY MEASURES

Covariance-based

Used in the simplest case of a uni-modal posterior distribution that can be approximated by a Gaussian.

Derive utility measures based on the covariance Σ of the distribution pX(x).

The determinant det(Σ) is proportional to the volume of the rectangular region enclosing the covariance ellipsoid.

Hence, the information utility function for this approximation can be chosen as

ψ(pX) = −det(Σ)

INFORMATION UTILITY MEASURES

Entropy of Estimation Uncertainty

If the distribution of the estimate is highly non-Gaussian, then the covariance Σ is a poor statistic of the uncertainty.

One possible utility measure is the information-theoretic notion of information:

the entropy of a random variable.

The Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable.

INFORMATION UTILITY MEASURES

For a discrete random variable X taking values in a finite set S, the Shannon entropy H(X) is defined to be:

H(X) = −Σx∈S P(X = x) log P(X = x)

Entropy is a measure of uncertainty, which is inversely related to our notion of information utility.

Thus we can define the information utility as ψ(pX) = −H(X).
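A small sketch of this entropy-based utility on a discretized belief: a tighter (lower-entropy) belief gets a larger utility value, consistent with the convention that larger ψ means a tighter distribution.

```python
import numpy as np

def entropy_utility(belief):
    """psi(p) = -H(p) = sum_x p(x) log p(x), with 0 log 0 taken as 0."""
    p = belief / belief.sum()
    p = p[p > 0]
    return float(np.sum(p * np.log(p)))

uniform = np.ones(8)                       # maximally uncertain: H = log 8
peaked = np.array([0.0] * 7 + [1.0])       # fully certain: H = 0
```

Here entropy_utility(uniform) = −log 8 and entropy_utility(peaked) = 0, so the peaked belief correctly scores higher.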

COMPOSITE OBJECTIVE FUNCTION

Up to now, we have ignored:
the communication cost of transmitting information across the network, and
which sensor actually holds the current belief.

Leader node: the sensor l that holds the current belief.

COMPOSITE OBJECTIVE FUNCTION

In one scenario, the leader node acts as a relay station to the user: the belief resides at this node for an extended time interval, and all information has to travel to this leader.

Another scenario:the belief itself travels through the network, and nodes are dynamically assigned as leaders.

Depending on the network architecture and the measurement task, both or a mixture of both cases can be implemented.

COMPOSITE OBJECTIVE FUNCTION

Assume that :

The leader node temporarily holds the belief state and

Information has to travel a certain distance through the network to be incorporated into the belief state.

COMPOSITE OBJECTIVE FUNCTION

The objective function for sensor querying and routing is a function of both:

information utility and

cost of bandwidth and latency.

COMPOSITE OBJECTIVE FUNCTION

This can be expressed by a composite objective function, Mc, of the form:

Mc(λl, λj, p) = γ Mu(p, λj) − (1 − γ) Ma(λl, λj)

where Mu(p(x | {zi} i∈U), λj) = ψ is the information utility of querying sensor j, Ma(λl, λj) is the cost in bandwidth and latency of communicating information between sensor j and the leader sensor l, and the tradeoff parameter γ ∈ [0,1] balances the contribution of the two terms.

COMPOSITE OBJECTIVE FUNCTION

The objective is to maximize Mc by selecting a sensor from the remaining sensors A = {1, ..., N} − U:

ĵ = arg max j∈A Mc(λl, λj, p)
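The tradeoff can be sketched as below, assuming the cost term is subtracted from the utility term. The utility values and the Euclidean-distance cost proxy are illustrative placeholders, not the paper's measures.

```python
import numpy as np

def composite_objective(utility, comm_cost, gamma):
    """M_c = gamma * M_u - (1 - gamma) * M_a, with gamma in [0, 1]."""
    return gamma * utility - (1.0 - gamma) * comm_cost

def choose_sensor(leader_pos, positions, utilities, gamma=0.5):
    """Pick j in A maximizing M_c; comm cost ~ distance to the leader l."""
    def score(j):
        cost = float(np.linalg.norm(positions[j] - leader_pos))
        return composite_objective(utilities[j], cost, gamma)
    return max(positions, key=score)

leader = np.array([0.0, 0.0])
positions = {1: np.array([1.0, 0.0]), 2: np.array([10.0, 0.0])}
utilities = {1: 2.0, 2: 3.0}       # sensor 2 is more informative but far away
```

With γ = 0.5 the nearby sensor 1 wins (0.5·2 − 0.5·1 = 0.5 versus 0.5·3 − 0.5·10 = −3.5); with γ = 1.0 cost is ignored and sensor 2 wins.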

INFORMATION-DRIVEN SENSOR QUERY

A sensor selection algorithm based on the cluster-leader type of distributed processing protocol.

Assume we have a cluster of N sensors, each labeled by a unique integer in {1, ..., N}. A priori, each sensor i only has knowledge of its own position xi ∈ R2.

Select cluster leader

Activate if a target is present in sensor cluster

Test Results

Apply the sensor selection algorithm to the problem of spatial localization of a stationary target based on amplitude measurements from a network of sensors.
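A toy end-to-end version of this localization experiment can be sketched as a sequential Bayes update over a grid. All numbers here (sensor layout, amplitudes, noise level, target position) are invented for illustration; the paper's experiments use real sensor data.

```python
import numpy as np

rng = np.random.default_rng(1)
xs = np.linspace(-5, 5, 41)
X, Y = np.meshgrid(xs, xs)
grid = np.stack([X.ravel(), Y.ravel()], axis=1)        # candidate positions

target = np.array([1.5, -2.0])                         # unknown to the network
sensors = [{"position": np.array(p, dtype=float), "amplitude": 10.0, "noise_std": 0.05}
           for p in [(-4, -4), (4, -4), (-4, 4), (4, 4), (0, 0)]]

belief = np.ones(len(grid))                            # uniform prior
for s in sensors:
    # Predicted amplitude at every grid point, and a simulated noisy reading
    d2_grid = np.maximum(np.sum((grid - s["position"]) ** 2, axis=1), 1e-9)
    d2_true = max(float(np.sum((target - s["position"]) ** 2)), 1e-9)
    z = s["amplitude"] / d2_true + rng.normal(0.0, s["noise_std"])
    # Incremental belief update: multiply by this sensor's likelihood
    belief = belief * np.exp(-0.5 * ((z - s["amplitude"] / d2_grid) / s["noise_std"]) ** 2)
    belief = belief / belief.sum()

x_hat = belief @ grid                                  # posterior mean estimate
```

Each loop iteration plays the role of querying one more sensor and folding its measurement into the belief; the posterior mean converges toward the target as sensors are incorporated.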

Sensor selection criteria compared:
Nearest-neighbor data diffusion
Mahalanobis distance
Maximum likelihood
Best feasible region

Test Results
