A journey through data rate theorems for communication and control


Towards a theory of large scale networks

A journey through data rate theorems for communication and control

Massimo Franceschetti

Motivation

[Images: January 2007, January 2012]

In the last decade, many authors have looked at the stabilization of dynamical systems over rate-limited feedback channels.

In general, all these works consider a scenario in which one or more dynamical systems need to be controlled or stabilized using multiple sensors and actuators that communicate over digital links with limited rate, or on which packets may get lost.

The goal of this work is to characterize the trade-off between system instabilities and channel properties in the most general setup.


Abstraction

[Figure: channel quality varying over time]

Problem formulation

Linear dynamical system

A composed of unstable modes

Slotted time-varying channel evolving at the same time scale as the system

Objective: identify the trade-off between the system's unstable modes and the channel's rate needed to guarantee stability:
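Collected in LaTeX for reference, the model and the stability notion used throughout the talk (as they appear in the slide equations) are

\[
x_{k+1} = A x_k + B u_k + v_k, \qquad y_k = C x_k + w_k,
\]
\[
|\lambda_1| \ge 1, \ \ldots, \ |\lambda_n| \ge 1 \ \text{(unstable modes of } A\text{)}, \qquad \text{stability: } \sup_k E\big[\|x_k\|^2\big] < \infty .
\]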

In the literature

Two main approaches to channel modeling:

Information-theoretic approach (bit-rate)

Network-theoretic approach (packets)

Information-theoretic approach

A rate-based approach: transmit a finite number of bits per time slot over the digital channel

Derive data-rate theorems quantifying how much rate is needed to construct a stabilizing quantizer/controller pair

Network-theoretic approach

A packet-based approach (a packet models a real number)

Determine the critical packet loss probability above which the system cannot be stabilized by any control scheme

Tatikonda-Mitter (IEEE-TAC 2002)

Information-theoretic approach

Rate process: constant, known at the transmitter (R_k = R for all k)

Disturbances and initial state support: bounded

Data rate theorem: R > R_c = \log|\lambda| (a.s. stability)

Generalizes to vector case as: R > \sum_u m_u \log|\lambda_u|

Nair-Evans (SIAM-JCO 2004, best paper award)

Information-theoretic approach

Rate process: known at the transmitter

Disturbances and initial state support: unbounded, with bounded higher moment (e.g. Gaussian distribution)

Data rate theorem: R > \log|\lambda| (second moment stability)

Generalizes to vector case as: R > \sum_u m_u \log|\lambda_u|

Intuition

Want to compensate for the expansion of the state during the communication process.

At each time step, the uncertainty volume of the state expands by a factor |\lambda|^2 due to the dynamics and contracts by a factor 2^{-2R} once the R-bit message is received.

Keep the product |\lambda|^2 \, 2^{-2R} less than one for second moment stability.
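Spelled out, this intuition gives back the data rate theorem in one rearrangement (base-2 logarithms, as in the slides):

\[
|\lambda|^2 \, 2^{-2R} < 1 \;\Longleftrightarrow\; 2R > 2\log|\lambda| \;\Longleftrightarrow\; R > \log|\lambda| ,
\]

and applying the same argument mode by mode yields the vector condition R > \sum_u m_u \log|\lambda_u|.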

Martins-Dahleh-Elia (IEEE-TAC 2006)

Information-theoretic approach (scalar case only)

Rate process: i.i.d. process distributed as R

Disturbances and initial state support: bounded

Causal channel knowledge: coder and decoder know \{R_i\}_{i=0}^{k}

Data rate theorem: |\lambda|^2 \, E[2^{-2R}] < 1 (second moment stability)

Intuition

At each time step, the uncertainty volume of the state expands by |\lambda|^2 and contracts by 2^{-2R_i}, where R_i is the rate available at that step.

Keep the average of the product less than one for second moment stability: |\lambda|^2 \, E[2^{-2R}] < 1.
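A quick numeric illustration of why the average of 2^{-2R}, and not the average rate, is what matters; the numbers below are chosen for illustration only, not taken from the talk. Take |\lambda| = 2, so |\lambda|^2 = 4. If R equals 3 bits or 1 bit, each with probability 1/2, then

\[
E[2^{-2R}] = \tfrac{1}{2}\big(2^{-6} + 2^{-2}\big) \approx 0.133, \qquad |\lambda|^2 \, E[2^{-2R}] \approx 0.53 < 1,
\]

so second moment stabilization is possible. If instead R equals 4 bits or 0 bits, each with probability 1/2 (the same average rate of 2 bits), then E[2^{-2R}] = \tfrac{1}{2}(2^{-8} + 1) \approx 0.502 and |\lambda|^2 E[2^{-2R}] \approx 2.01 > 1, so it is not.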

Minero-F-Dey-Nair (IEEE-TAC 2009)

Information-theoretic approach

Vector case; necessary and sufficient conditions almost tight

Rate process: i.i.d. process distributed as R

Disturbances and initial state support: unbounded, with bounded higher moment (e.g. Gaussian distribution)

Causal channel knowledge: coder and decoder know \{R_i\}_{i=0}^{k}

Data rate theorem: |\lambda|^2 \, E[2^{-2R}] < 1 (second moment stability)

Proof sketches

Necessity: using the entropy power inequality, find a recursion

E[x_k^2] \ge |\lambda|^2 \, E[2^{-2R}] \, E[x_{k-1}^2] + \text{const}.

Thus, \sup_k E[x_k^2] < \infty requires |\lambda|^2 E[2^{-2R}] < 1.

Sufficiency: the difficulty is in the unbounded support; the uncertainty about the state cannot be confined to any bounded interval. Design an adaptive quantizer, avoid saturation, and achieve high resolution through successive refinements.

Divide time into cycles of fixed length (of our choice)

Observe the system at the beginning of each cycle and send an initial estimate of the state

During the remaining part of the cycle, refine the initial estimate

The number of bits per cycle is a random variable dependent on the rate process

Use the refined state at the end of the cycle for control

Proof of sufficiency

Adaptive quantizer

Constructed recursively

Successive refinements: example

Suppose we need to quantize a positive real value x. At time k = 1, suppose R_1 = 1. With one bit of information the decoder knows that x \in [1/2, 1).

At time k = 2, suppose R_2 = 2. After receiving 01 the decoder knows which sub-interval of [1/2, 1) contains x; the initial estimate has been refined.

Partition the real axis according to the adaptive 3-bit quantizer (R_1 + R_2 = 3). Label only the partitions on the positive real line (2 bits suffice). The scheme works as if the total rate of 3 bits had been known ahead of time.

Proof of sufficiency

Find a recursion for the second moment of the estimation error at the beginning of each cycle:

E[x_{k\tau}^2] \le \text{const} \cdot \big( |\lambda|^2 \, E[2^{-2R}] \big)^{\tau} \, E[x_{(k-1)\tau}^2] + \text{const}.

Thus, for \tau large enough, \text{const} \cdot \big( |\lambda|^2 E[2^{-2R}] \big)^{\tau} < 1 whenever |\lambda|^2 E[2^{-2R}] < 1, and the second moment remains bounded.
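To make the successive-refinement idea concrete, here is a small Python sketch. It is not the construction from the paper: it ignores the adaptive handling of unbounded support and simply halves a known interval with each received bit, which is the refinement step the example above illustrates.

# Toy successive-refinement quantizer: each received bit halves the
# interval known to contain x. The real construction is adaptive and
# handles unbounded support; this sketch assumes x lies in [lo, hi).

def encode(x, lo, hi, num_bits):
    """Produce the bits the coder would send for x during one cycle."""
    bits = []
    for _ in range(num_bits):
        mid = (lo + hi) / 2.0
        bit = 1 if x >= mid else 0
        bits.append(bit)
        lo, hi = (mid, hi) if bit else (lo, mid)
    return bits

def refine(lo, hi, bits):
    """Shrink the interval [lo, hi) according to the received bits."""
    for b in bits:
        mid = (lo + hi) / 2.0
        if b == 1:
            lo = mid          # x is in the upper half
        else:
            hi = mid          # x is in the lower half
    return lo, hi

# Example mirroring the slides: one bit at k = 1, two bits at k = 2.
x = 0.71
lo, hi = 0.0, 1.0
bits_k1 = encode(x, lo, hi, 1)      # first cycle: R_1 = 1 bit -> [1]
lo, hi = refine(lo, hi, bits_k1)    # decoder now knows x in [1/2, 1)
bits_k2 = encode(x, lo, hi, 2)      # second cycle: R_2 = 2 bits -> [0, 1]
lo, hi = refine(lo, hi, bits_k2)
print(bits_k1, bits_k2, (lo, hi))   # interval of width 2^-(R_1+R_2) containing x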

Network-theoretic approach

A packet-based approach (a packet models a real number)

Critical dropout probability

Sinopoli-Schenato-F-Sastry-Poolla-Jordan (IEEE-TAC 2004)

Gupta-Murray-Hassibi (Systems & Control Letters 2007)

Critical dropout probability: p < p_c = 1/|\lambda|^2

Generalizes to vector case as: p < p_c = 1/\max_i |\lambda_i|^2

Critical dropout probability

Can be viewed as a special case of the information-theoretic approach: a Gaussian disturbance requires the unbounded-support data rate theorem of Minero, F, Dey, Nair (2009) to recover the result.
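The reduction works by modeling a packet as an arbitrarily long string of bits that is either delivered or lost: the rate process takes the value r with probability 1 - p and 0 with probability p, so the data rate theorem gives

\[
|\lambda|^2 \, E[2^{-2R}] \;=\; |\lambda|^2 \big( p + (1-p)\, 2^{-2r} \big) \;<\; 1 ,
\]

and letting r \to \infty recovers the critical dropout probability p < 1/|\lambda|^2.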

Stabilization over channels with memory

Gupta-Martins-Baras (IEEE-TAC 2009)

Critical recovery probability: q > q_c = 1 - 1/|\lambda|^2

Network-theoretic approach

Two-state Markov chain

Stabilization over channels with memory

You-Xie (IEEE-TAC 2010)

For R \to \infty, recover the critical probability q_c

Data-rate theorem: R > R_c = \tfrac{1}{2}\log E\big[|\lambda|^{2T}\big]

Information-theoretic approach

Two-state Markov chain, fixed rate R or zero rate

Disturbances and initial state support: unbounded

Let T be the excursion time of state R

Intuition

Send R bits after T time steps

In T time steps the uncertainty volume of the state expands by |\lambda|^{2T} and contracts by 2^{-2R} once the R-bit message arrives

Keep the average of the product less than one for second moment stability: E\big[|\lambda|^{2T}\big] \, 2^{-2R} < 1
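Rearranging the averaged condition gives the data rate theorem stated above (base-2 logarithms again):

\[
E\big[|\lambda|^{2T}\big]\, 2^{-2R} < 1 \;\Longleftrightarrow\; R > \tfrac{1}{2}\log E\big[|\lambda|^{2T}\big] = R_c .
\]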

Stabilization over channels with memory

Coviello-Minero-F (IEEE-TAC to appear)

Obtain a general data rate theorem that recovers all previous results using the theory of Jump Linear Systems

Information-theoretic approach

Disturbances and initial state support: unbounded

Time-varying rate R_k \in \{r_1, \ldots, r_n\}

Arbitrary positively recurrent, time-invariant Markov chain of n states, with transition probabilities p_{ij} = P\{R_{k+1} = r_j \mid R_k = r_i\}

Markov Jump Linear System

Define an auxiliary dynamical system (MJLS) \{z_k\} driven by the rate process

Markov Jump Linear System

Let H be the matrix defined by the transition probabilities and the rates, and let \rho(H) be its spectral radius.

The MJLS is mean square stable iff |\lambda|^2 \rho(H) < 1.

Relate the stability of the MJLS to the stabilizability of our system.

Data rate theorem

Stabilization in the mean square sense over Markov time-varying channels is possible if and only if the corresponding MJLS is mean square stable, that is:

|\lambda|^2 \, \rho(H) < 1 .
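A small numerical sketch of how the condition can be checked. The transcript does not spell out the entries of H; the code below assumes the natural construction H[i, j] = 2^{-2 r_i} p_{ij}, which is consistent with the i.i.d. reduction \rho(H) = \sum_i p_i 2^{-2 r_i} shown on the "special cases" slides (the spectral radius is unaffected by transposition).

import numpy as np

def mjls_stabilizable(lam, rates, P):
    """Check |lambda|^2 * rho(H) < 1 for a Markov rate process.

    lam   : magnitude of the unstable eigenvalue |lambda|
    rates : rates r_1, ..., r_n (bits per slot), one per chain state
    P     : n x n transition matrix, P[i][j] = Prob(R_{k+1} = r_j | R_k = r_i)
    """
    rates = np.asarray(rates, dtype=float)
    P = np.asarray(P, dtype=float)
    # Assumed construction: row i of P weighted by the contraction 2^(-2 r_i).
    H = (2.0 ** (-2.0 * rates))[:, None] * P
    rho = max(abs(np.linalg.eigvals(H)))
    return lam ** 2 * rho < 1, rho

# Two-state example: rate r in the "good" state, zero rate in the "bad" state.
lam, r = 1.2, 2
P = [[0.9, 0.1],   # good -> good, good -> bad
     [0.5, 0.5]]   # bad  -> good, bad  -> bad
print(mjls_stabilizable(lam, [r, 0], P))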

Proof sketch

The second moment of the system state is lower bounded and upper bounded by two MJLSs with the same dynamics; |\lambda|^2 \rho(H) < 1 is a necessary condition.

Lower bound: using the entropy power inequality.

Upper bound: using an adaptive quantizer, the estimation error at the beginning of each cycle is upper bounded by a quantity that evolves, at times k\tau, as a MJLS.

Assuming |\lambda|^2 \rho(H) < 1, one can choose \tau large enough so that this MJLS is stable; the second moment of the estimation error at the beginning of each cycle is then bounded, and the state remains second moment bounded.

Previous results as special cases

Previous results as special cases: i.i.d. bit-rate

Data rate theorem reduces to |\lambda|^2 \, E[2^{-2R}] < 1, since \rho(H) = \sum_i p_i \, 2^{-2 r_i} = E[2^{-2R}], recovering Minero, F, Dey, Nair (2009).

Previous results as special cases: two-state Markov channel

Data rate theorem reduces to

|\lambda|^2 \left( \tfrac{\mathrm{Tr}(H)}{2} + \sqrt{\tfrac{\mathrm{Tr}(H)^2}{4} - \det(H)} \right) < 1 .

Previous results as special cases: two-state Markov channel with r_1 = 0, r_2 = r

Data rate theorem further reduces to R > \tfrac{1}{2}\log E\big[|\lambda|^{2T}\big], recovering You-Xie (2010).

From which it follows, letting r \to \infty, that p < 1/|\lambda|^2, recovering the critical probability.
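For a 2 x 2 nonnegative matrix the spectral radius is simply the larger root of the characteristic polynomial, which is where the trace/determinant expression above comes from:

\[
\rho(H) = \tfrac{\mathrm{Tr}(H)}{2} + \sqrt{\tfrac{\mathrm{Tr}(H)^2}{4} - \det(H)} ,
\]

so the general MJLS condition |\lambda|^2 \rho(H) < 1 specializes to the displayed inequality once H is written out for the two-state chain.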

What next

Is this the end of the journey?

No! The journey is still wide open once noisy channels are introduced.

Insufficiency of Shannon capacity

Example: i.i.d. erasure channel, R_k = r with probability 1 - p and 0 with probability p

Data rate theorem: |\lambda|^2 \big( p + (1-p)\, 2^{-2r} \big) < 1

Shannon capacity: C = (1-p)\, r
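The contrast is the point of the example: as r \to \infty the Shannon capacity C = (1-p) r grows without bound, while the stabilizability condition |\lambda|^2 ( p + (1-p) 2^{-2r} ) < 1 tends to p < 1/|\lambda|^2. So for p \ge 1/|\lambda|^2 the system cannot be stabilized in the second moment sense no matter how large the capacity is, and Shannon capacity alone cannot characterize stabilizability.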

Capacity with stronger reliability constraints

Shannon capacity C: soft reliability constraint (P_{err} \to 0)

Zero-error capacity C_0: hard reliability constraint (P_{err} = 0)

Anytime capacity C_A: medium reliability constraint (the probability of error on past messages decays exponentially with the decoding delay)

C_0 \le C_A \le C

Alternative formulations

Undisturbed systems: Tatikonda-Mitter (IEEE-TAC 2004), Matveev-Savkin (SIAM-JCO 2007); a.s. stability for C > \log|\lambda|

Anytime reliable codes: Shulman (1996); Ostrovsky, Rabani, Schulman (2009); Como, Fagnani, Zampieri (2010); Sukhavasi, Hassibi (2011); a.s. stability

Disturbed systems (bounded): Matveev-Savkin (IJC 2007); a.s. stability via zero-error capacity, C_0 > \log|\lambda|

Moment stability: Sahai-Mitter (IEEE-IT 2006); via anytime capacity, C_A > \log|\lambda|

The Bode-Shannon connection

Connection with the capacity of channels with feedback

Elia (IEEE-TAC 2004)

Ardestanizadeh-F (IEEE-TAC to appear)

Ardestanizadeh-Minero-F (IEEE-IT to appear)

Control over a Gaussian channel

Instability U, power constraint P, complementary sensitivity function T(z), stationary (colored) Gaussian noise.

The largest instability U over all LTI systems that can be stabilized by unit feedback over the stationary Gaussian channel with power constraint P corresponds to the Shannon capacity C_F of the stationary Gaussian channel with feedback [Kim (2010)] with the same power constraint P.

Control over a Gaussian channel

Communication using control

This duality between control and feedback communication for Gaussian channels can be exploited to design communication schemes using control tools: MAC and broadcast channels with feedback.

Elia (IEEE-TAC 2004)

Ardestanizadeh-Minero-F (IEEE-IT to appear)
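A compact restatement of the quantities in this duality, as they appear in the slide equations (L(z) denotes the loop transfer function and S_Z(\omega) the power spectral density of the stationary Gaussian noise):

\[
T(z) = \frac{L(z)}{1 + L(z)}, \qquad
U = \sum_i \log|\lambda_i|, \qquad
\frac{1}{2\pi} \int_{-\pi}^{\pi} |T(e^{j\omega})|^2 \, S_Z(\omega)\, d\omega \le P ,
\]

and the result states that \sup_L U = C_F, the feedback capacity of the stationary Gaussian channel under the same power constraint P.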

Conclusion

Data-rate theorems for stabilization over time-varying rate channels, after a beautiful journey of about a decade, are by now fairly well understood.

The journey (quest) for noisy channels is still going on.

The terrible thing about the quest for truth is that you may find it.

For papers: www.circuit.ucsd.edu/~massimo/papers.html

G.B. Tiepolo: Time Unveiling the Truth

[Block diagram: dynamical system with sensors and actuators connected to a controller over digital channels, with A/D and D/A conversion, disturbance, and measurement noise.]

CF