
Page 1: Distributed Averaging  via Gossiping

Distributed Averaging via Gossiping

A. S. Morse

Yale University

Gif-sur-Yvette, May 22, 2012

Supelec

EECI Graduate School in Control

Page 2: Distributed Averaging  via Gossiping

ROADMAP

Consensus and averaging

Linear iterations

Gossiping

Periodic gossiping

Multi-gossip sequences

Request-based gossiping

Double linear iterations

Page 3: Distributed Averaging  via Gossiping

Consider a group of n agents labeled 1 to n

Each agent i controls a real, scalar-valued, time-dependent quantity x_i called an agreement variable.

The group's neighbor graph N is an undirected, connected graph with vertices labeled 1, 2, ..., n.

The neighbors of agent i are those agents whose vertices are adjacent to vertex i in N.

[Figure: the example neighbor graph N with vertices 1–7]

The goal of a consensus process is for all n agents to ultimately reach a consensus by adjusting their individual agreement variables to a common value.

This is to be accomplished over time by sharing information among neighbors in a distributed manner.

Consensus Process

Page 4: Distributed Averaging  via Gossiping

A consensus process is a recursive process which evolves with respect to a discrete time scale.

In a standard consensus process, agent i sets the value of its agreement variable at time t +1 equal to the average of the current value of its own agreement variable and the current values of its neighbors’ agreement variables.

x_i(t+1) = the average at time t of the values of the agreement variables of agent i and the neighbors of agent i.

N_i = set of indices of agent i's neighbors.

d_i = number of indices in N_i
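Written out with the notation just introduced, this standard consensus update is

$$x_i(t+1) = \frac{1}{1+d_i}\Big(x_i(t) + \sum_{j\in N_i} x_j(t)\Big),$$

i.e., the average of the 1 + d_i values available to agent i at time t.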

Consensus Process

Page 5: Distributed Averaging  via Gossiping

An averaging process is a consensus process in which the common value to which each agreement variable is supposed to converge is the average of the initial values of all agreement variables:

Averaging Process

$$x_{\text{avg}} = \frac{1}{n}\sum_{i=1}^{n} x_i(0)$$

Application: distributed temperature calculation

Generalizations: Time-varying case - N depends on time. Integer-valued case - x_i(t) is to be integer-valued. Asynchronous case - each agent has its own clock.

Implementation Issues: How much network information does each agent need? To what extent is the protocol robust?

General Approach: Probabilistic or Deterministic

Standing Assumption: N is a connected graph

Performance metrics: Convergence rate; number of transmissions needed

Page 6: Distributed Averaging  via Gossiping

ROADMAP

Consensus and averaging

Linear iterations

Gossiping

Periodic gossiping

Multi-gossip sequences

Request-based gossiping

Double linear iterations

Page 7: Distributed Averaging  via Gossiping

N_i = set of indices of agent i's neighbors.

w_ij = suitably defined weights

Linear Iteration

$$x(t) = \begin{bmatrix} x_1(t)\\ x_2(t)\\ \vdots\\ x_n(t)\end{bmatrix}, \qquad x(t+1) = Wx(t), \qquad W = [w_{ij}]_{n\times n}$$

Want x(t) → x_avg 1, where 1 is the n × 1 vector of all ones.

If A is a real n × n matrix, then A^t converges as t → ∞ to a rank-one matrix of the form qp' if and only if A has exactly one eigenvalue at value 1 and all remaining n - 1 eigenvalues are strictly smaller than 1 in magnitude.

If A so converges, then Aq = q, A'p = p and p'q = 1.

Thus if W is such a matrix with q = 1 and p = (1/n)1, then

$$W^t \to \tfrac{1}{n}\mathbf{1}\mathbf{1}' \qquad\text{and}\qquad x(t) \to \tfrac{1}{n}\mathbf{1}\mathbf{1}'x(0) = x_{\text{avg}}\mathbf{1}.$$
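A quick numerical illustration of this fact (an illustrative sketch; the particular doubly stochastic W below is my own example, not from the slides):

```python
import numpy as np

# Illustrative check: a small doubly stochastic W whose only eigenvalue
# of unit magnitude is the simple eigenvalue at 1.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

n = W.shape[0]
x = np.array([3.0, -1.0, 7.0])           # initial agreement variables x(0)
x_avg = x.mean()

Wt = np.linalg.matrix_power(W, 100)       # W^t for large t
print(np.allclose(Wt, np.ones((n, n)) / n))   # True: W^t -> (1/n) 11'
print(np.allclose(Wt @ x, x_avg))             # True: x(t) -> x_avg 1
```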

Page 8: Distributed Averaging  via Gossiping

Linear Iteration with Nonnegative Weights

x(t) → x_avg 1

iff W1 = 1, W'1 = 1, and all n - 1 eigenvalues of W, other than W's single eigenvalue at value 1, have magnitudes less than 1.

A square matrix S is stochastic if it has only nonnegative entries and its row sums all equal 1: S1 = 1.

A square matrix S is doubly stochastic if it has only nonnegative entries and its row and column sums all equal 1: S1 = 1 and S'1 = 1.

For the nonnegative-weight case, x(t) converges to x_avg 1 if and only if W is doubly stochastic and its single eigenvalue at 1 has multiplicity 1.

How does one choose the w_ij ≥ 0 so that W has these properties?

(||S||_1 = 1, so the spectrum of S is contained in the closed unit circle; what must be checked is that the eigenvalue at value 1 has multiplicity 1.)

Page 9: Distributed Averaging  via Gossiping

Choose g > max{d_1, d_2, ..., d_n} and set

$$x(t+1) = \Big(I - \tfrac{1}{g}L\Big)x(t), \qquad L = D - A, \qquad D = \begin{bmatrix} d_1 & 0 & \cdots & 0\\ 0 & d_2 & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \cdots & d_n \end{bmatrix},$$

where A is the adjacency matrix of N: the matrix of ones and zeros with a_ij = 1 if N has an edge between vertices i and j.

Since g > max{d_1, ..., d_n}, L1 = 0, and L = L', the matrix W = I - (1/g)L is doubly stochastic; and because the eigenvalue of L at 0 has multiplicity 1 (N is connected), W's single eigenvalue at 1 has multiplicity 1.

Agent by agent, the iteration reads

$$x_i(t+1) = \Big(1 - \frac{d_i}{g}\Big)x_i(t) + \frac{1}{g}\sum_{j\in N_i} x_j(t).$$

Each agent needs to know max{d_1, d_2, ..., d_n} to implement this.
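A minimal sketch of this construction (illustrative; the 7-vertex edge list below is inferred from the example graph and the gossip sequences used elsewhere in these slides):

```python
import numpy as np

# Sketch of the Laplacian-based iteration x(t+1) = (I - L/g) x(t).
edges = [(1, 2), (2, 3), (3, 4), (3, 5), (5, 6), (5, 7)]   # assumed example graph
n = 7

A = np.zeros((n, n))                      # adjacency matrix of N
for i, j in edges:
    A[i - 1, j - 1] = A[j - 1, i - 1] = 1

d = A.sum(axis=1)                         # vertex degrees d_i
L = np.diag(d) - A                        # graph Laplacian L = D - A
g = d.max() + 1                           # any g > max{d_1, ..., d_n}
W = np.eye(n) - L / g                     # doubly stochastic weight matrix

x = np.random.randn(n)                    # initial agreement variables
x_avg = x.mean()
for _ in range(2000):
    x = W @ x                             # agent form: x_i <- (1 - d_i/g) x_i + (1/g) sum_{j in N_i} x_j
print(np.allclose(x, x_avg))              # True: every x_i -> x_avg
```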

Page 10: Distributed Averaging  via Gossiping

Each agent needs to know the number of neighbors of each of its neighbors.

Metropolis Algorithm

L = QQ', where Q is a {-1, 0, 1} matrix with rows indexed by the vertex labels of N and columns indexed by the edge labels, such that each column (edge) j has exactly one entry equal to 1 and one entry equal to -1, placed in the rows of the two vertices that edge j joins (the choice of signs is arbitrary), and zeros elsewhere.

$$W = I - Q\Lambda Q', \qquad \Lambda = \mathrm{diag}\{\lambda_1, \lambda_2, \ldots\}, \qquad \lambda_e = \frac{1}{1+\max\{d_i, d_j\}}\ \text{ for each edge } e \text{ joining vertices } i \text{ and } j.$$

$$x_i(t+1) = \Big(1 - \sum_{j\in N_i}\frac{1}{1+\max\{d_i,\,d_j\}}\Big)x_i(t) + \sum_{j\in N_i}\frac{1}{1+\max\{d_i,\,d_j\}}\,x_j(t)$$

A Better Solution

Total number of transmissions per iteration:

$$\sum_{i=1}^{n} d_i = n\,d_{\text{avg}}, \qquad d_{\text{avg}} = \frac{1}{n}\sum_{i=1}^{n} d_i$$

{Boyd et al}

Why is I - QΛQ' doubly stochastic?
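A sketch of the Metropolis weight construction (illustrative code; the function name is my own):

```python
import numpy as np

def metropolis_weights(A):
    """Metropolis weight matrix W from the adjacency matrix A of N.

    w_ij = 1/(1 + max{d_i, d_j}) for each edge (i, j), zero for non-edges,
    and the diagonal absorbs whatever is left so each row sums to 1.
    """
    n = A.shape[0]
    d = A.sum(axis=1)                     # degrees d_i
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                W[i, j] = 1.0 / (1.0 + max(d[i], d[j]))
        W[i, i] = 1.0 - W[i].sum()        # 1 - sum_{j in N_i} 1/(1 + max{d_i, d_j})
    return W                              # symmetric with row sums 1, hence doubly stochastic
```

Each agent i only needs its own degree d_i and the degrees d_j of its neighbors to compute its own row.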

Page 11: Distributed Averaging  via Gossiping

Agent i's queue is a list q_i(t) of agent i's neighbor labels.

Agent i's preferred neighbor at time t is the agent whose label is at the front of q_i(t).

Between times t and t + 1 the following steps are carried out in order. Agent i transmits its label i and x_i(t) to its current preferred neighbor.

At the same time, agent i receives the labels and agreement variable values of those agents for whom agent i is their current preferred neighbor.

M_i(t) = the set consisting of the label of agent i's preferred neighbor together with the labels of all neighbors who send agent i their agreement variables at time t; m_i(t) = the number of labels in M_i(t).

Agent i transmits m_i(t) and its current agreement variable value to each neighbor with a label in M_i(t).

Agent i then moves the labels in M_i(t) to the end of its queue, maintaining their relative order, and updates as follows:

$$x_i(t+1) = \Big(1 - \sum_{j\in M_i(t)}\frac{1}{1+\max\{m_i(t),\,m_j(t)\}}\Big)x_i(t) + \sum_{j\in M_i(t)}\frac{1}{1+\max\{m_i(t),\,m_j(t)\}}\,x_j(t)$$

Modification: n transmissions in the first step plus at most 2n transmissions in the second, so at most 3n transmissions per iteration.

For comparison, the Metropolis update

$$x_i(t+1) = \Big(1 - \sum_{j\in N_i}\frac{1}{1+\max\{d_i,\,d_j\}}\Big)x_i(t) + \sum_{j\in N_i}\frac{1}{1+\max\{d_i,\,d_j\}}\,x_j(t)$$

requires n d_avg transmissions per iteration, so the modification uses fewer transmissions whenever d_avg > 3.
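A centralized sketch of one round of this modified protocol (illustrative; the function, data structures, and 0-based agent labels are my own, following the steps described above):

```python
def modified_metropolis_round(x, queues):
    """One round of the queue-based modified Metropolis update.

    x      : list of agreement variables x_i(t)
    queues : list of lists; queues[i] holds agent i's neighbor labels,
             with the preferred neighbor at the front
    Returns the updated x and queues.
    """
    n = len(x)
    pref = [q[0] for q in queues]                    # preferred neighbor of each agent
    # M_i(t): i's preferred neighbor plus everyone whose preferred neighbor is i
    M = [{pref[i]} for i in range(n)]
    for j in range(n):
        M[pref[j]].add(j)
    m = [len(M[i]) for i in range(n)]                # m_i(t)

    x_new = x.copy()
    for i in range(n):
        for j in M[i]:
            x_new[i] += (x[j] - x[i]) / (1 + max(m[i], m[j]))

    # Move the labels in M_i(t) to the end of the queue, keeping relative order.
    new_queues = [[lbl for lbl in queues[i] if lbl not in M[i]] +
                  [lbl for lbl in queues[i] if lbl in M[i]] for i in range(n)]
    return x_new, new_queues
```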

Page 12: Distributed Averaging  via Gossiping

Randomly chosen graphs with 200 vertices. 10 random graphs for each average degree.

Metropolis Algorithm vs Modified Metropolis Algorithm

Page 13: Distributed Averaging  via Gossiping

Acceleration

Consider A real symmetric with row and column sums equal to 1, i.e. A1 = 1 and 1'A = 1'.

Original system: x(t+1) = Ax(t)

Augmented system: x(t+1) = (γ+1)Ax(t) - γy(t), y(t+1) = x(t), with y(0) = x(0)

Suppose σ_2 < 1, where σ_2 is the second largest singular value of A. Then x(t) converges to x_avg 1 iff γ lies in the range indicated below, and the augmented system is fastest if

$$\gamma = \frac{1-\sqrt{1-\sigma_2^2}}{1+\sqrt{1-\sigma_2^2}}$$

{Muthukrishnan, Ghosh, Schultz - 1998}

[Figure: values of γ between -1 and 1 for which the augmented system converges, and where it is slower or faster than the original system; σ_2 marked.]
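A sketch of the accelerated (augmented) iteration (illustrative; the particular symmetric doubly stochastic matrix A below is my own example, and σ_2 is computed directly rather than estimated):

```python
import numpy as np

# Illustrative run of the augmented iteration with the optimal gamma.
A = np.array([[0.6, 0.4, 0.0],
              [0.4, 0.2, 0.4],
              [0.0, 0.4, 0.6]])           # symmetric, row/column sums = 1

sigma2 = np.linalg.svd(A, compute_uv=False)[1]      # second largest singular value
gamma = (1 - np.sqrt(1 - sigma2**2)) / (1 + np.sqrt(1 - sigma2**2))

x0 = np.array([1.0, 5.0, -3.0])
x_avg = x0.mean()

x, y = x0.copy(), x0.copy()               # augmented system, y(0) = x(0)
for _ in range(60):
    # x(t+1) = (gamma+1) A x(t) - gamma y(t),  y(t+1) = x(t)
    x, y = (gamma + 1) * (A @ x) - gamma * y, x
print(np.abs(x - x_avg).max())            # error of the accelerated iterate
```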

Page 14: Distributed Averaging  via Gossiping

ROADMAP

Consensus and averaging

Linear iterations

Gossiping

Periodic gossiping

Multi-gossip sequences

Request-based gossiping

Double linear iterations

Page 15: Distributed Averaging  via Gossiping

Gossip Process

A gossip process is a consensus process in which, at each clock time, each agent is allowed to average its agreement variable with the agreement variable of at most one of its neighbors.

If agent i gossips with neighbor j at time t, the event is called a gossip and is denoted by (i, j).

If agent i gossips with neighbor j at time t, then agent j must gossip with agent i at time t.

In the most commonly studied version of gossiping, the specific sequence of gossips which occurs during a gossiping process is determined probabilistically.

In a deterministic gossiping process, the sequence of gossips which occurs is determined by a pre-specified protocol.

Page 16: Distributed Averaging  via Gossiping

Gossip Process

A gossip process is a consensus process in which, at each clock time, each agent is allowed to average its agreement variable with the agreement variable of at most one of its neighbors.

1. The sum total of all agreement variables remains constant at all clock steps.

2. Thus if a consensus is reached in that all agreement variables reach the same value, then this value must be the average of the initial values of all gossip variables.

3. This is not the case for a standard consensus process.

Page 17: Distributed Averaging  via Gossiping

The "gossip with only one neighbor at a time" rule does not preclude more than one distinct pair of agents gossiping at the same time. If more than one pair of agents gossips at a given time, the event is called a multi-gossip.

A gossip (i, j) is allowable if (i, j) is the label of an edge of N.

A finite sequence of multi-gossips induces a spanning subgraph M of N, where (i, j) is an edge of M iff (i, j) is a gossip in the sequence.

For purposes of analysis, a multi-gossip can be thought of as a sequence of single gossips occurring in zero time. The arrangement of gossips in the sequence is of no importance because individual gossip pairs do not interact.

Page 18: Distributed Averaging  via Gossiping

State Space Model

$$x(t) = \begin{bmatrix} x_1(t)\\ x_2(t)\\ \vdots\\ x_n(t)\end{bmatrix}, \qquad x(t+1) = M(t)x(t)$$

A gossip (i, j) corresponds to a primitive gossip matrix P_ij; for each t, M(t) is a primitive gossip matrix.

Each primitive gossip matrix is a doubly stochastic matrix with positive diagonals, and doubly stochastic matrices with positive diagonals are closed under multiplication.

Example: the doubly stochastic matrix P_ij for n = 7, i = 2, j = 5.
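Under the standard gossip update (agents 2 and 5 both replace their values by their pairwise average), the primitive gossip matrix for n = 7, i = 2, j = 5 takes the form

$$P_{25} = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 & 0\\ 0 & \tfrac12 & 0 & 0 & \tfrac12 & 0 & 0\\ 0 & 0 & 1 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 1 & 0 & 0 & 0\\ 0 & \tfrac12 & 0 & 0 & \tfrac12 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix},$$

which is indeed doubly stochastic with positive diagonals.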

Page 19: Distributed Averaging  via Gossiping

State Space Model

x(t+1) = M(t)x(t)

Doubly stochastic matrices with positive diagonals are closed under multiplication.

For each t, M(t) is a primitive gossip matrix.

Each primitive gossip matrix is a doubly stochastic matrix with positive diagonals.

[Figure: the example neighbor graph N with vertices 1–7]

$$x(2) = P_{57}x(1), \quad x(3) = P_{56}P_{57}x(1), \quad x(4) = P_{32}P_{56}P_{57}x(1), \quad x(5) = P_{35}P_{32}P_{56}P_{57}x(1)$$

Each gossip matrix is a product of primitive gossip matrices, and each gossip matrix is a doubly stochastic matrix with positive diagonals.

Gossip sequence: (5,7) (5,6) (3,2) (3,5) (3,4) (1,2)

$$x(6) = P_{34}P_{35}P_{32}P_{56}P_{57}x(1), \qquad x(7) = P_{12}P_{34}P_{35}P_{32}P_{56}P_{57}x(1)$$

This is a {minimally} complete gossip sequence; the corresponding product is a {minimally} complete gossip matrix.

Page 20: Distributed Averaging  via Gossiping

Claim: If A is a complete gossip matrix, then

$$A^i \to \tfrac{1}{n}\mathbf{1}\mathbf{1}'$$

as fast as λ^i → 0, where λ is the second largest eigenvalue {in magnitude} of A.

Page 21: Distributed Averaging  via Gossiping

Facts about nonnegative matrices

A square nonnegative matrix A is irreducible if there does not exist a permutation matrix P such that

$$P'AP = \begin{bmatrix} B & C\\ 0 & D \end{bmatrix},$$

where B and D are square matrices.

A square nonnegative matrix M is primitive if M^q is a positive matrix for q sufficiently large.

Fact 1: A is irreducible iff its graph is strongly connected.

Fact 2: A is irreducible iff I + A is primitive

Consequences if A has all diagonal elements positive:

A is irreducible iff A is primitive.

A is primitive iff its graph is strongly connected.

Thanks to Luminita!


Page 22: Distributed Averaging  via Gossiping

Perron Frobenius Theorem

Theorem: Suppose A is primitive. Then A has a single eigenvalue λ of maximal magnitude and λ is real. There is a corresponding eigenvector which is positive.

Consequence: Suppose A is a stochastic matrix with positive diagonals and a strongly connected graph. Then A has exactly one eigenvalue at 1 and all remaining n - 1 eigenvalues have magnitudes strictly less than 1.

So to show that completeness of A implies that A^i → (1/n)11' as fast as λ^i → 0, where λ is the second largest eigenvalue {in magnitude} of A, it is enough to show that the graph of A is strongly connected.

Page 23: Distributed Averaging  via Gossiping

If A is a complete gossip matrix, then the graph of A is strongly connected: A complete implies γ(A) is strongly connected.

Thus γ(A'A) is strongly connected, because γ(A'A) = γ(A') ∘ γ(A) and arcs are conserved under composition, since A and A' have positive diagonals.

But A'A is stochastic because A is doubly stochastic.

Thus the claim is true because of the Perron-Frobenius theorem:

If A is a complete gossip matrix, its second largest singular value is less than 1. {Blackboard proof}

If A is doubly stochastic, then |A|_2 = second largest singular value of A.

If A is a complete gossip matrix, then A is a semi-contraction with respect to |·|_2. {Blackboard proof}

Page 24: Distributed Averaging  via Gossiping

ROADMAP

Consensus and averaging

Linear iterations

Gossiping

Periodic gossiping

Multi-gossip sequences

Request-based gossiping

Double linear iterations

Page 25: Distributed Averaging  via Gossiping

Periodic Gossiping

[Figure: the example neighbor graph N with vertices 1–7]

The gossip sequence (5,7) (5,6) (3,2) (3,5) (3,4) (1,2) is repeated with period T = 6, giving

$$A = P_{12}P_{34}P_{35}P_{32}P_{56}P_{57}$$

$$x(iT+1) = A\,x((i-1)T+1),\ i \ge 1 \qquad\Longrightarrow\qquad x(iT+1) = A^i x(1),\ i \ge 1$$

We have seen that, because A is complete, A^i → (1/n)11' as fast as λ^i → 0, where λ is the second largest eigenvalue {in magnitude} of A.

Convergence rate $= \sqrt[T]{|\lambda|}$

Page 26: Distributed Averaging  via Gossiping

[Figure: the example neighbor graph N with vertices 1–7]

Periodic gossip sequence (5,7) (5,6) (3,2) (3,5) (3,4) (1,2), repeated with period T = 6:

$$A = P_{12}P_{34}P_{35}P_{32}P_{56}P_{57}$$

Periodic gossip sequence (5,7) (3,4) (1,2) (3,2) (5,6) (3,5), repeated with period T = 6:

$$B = P_{35}P_{56}P_{32}P_{12}P_{34}P_{57}$$

How are the second largest eigenvalues {in magnitude} of A and B related?

If the neighbor graph N is a tree, then the spectra of all possible minimally complete gossip matrices determined by N are the same!

Page 27: Distributed Averaging  via Gossiping

[Figure: the example neighbor graph N with vertices 1–7]

Minimizing T

Multi-gossips: {(3,4), (7,5)}, {(3,5), (1,2)}, {(2,3), (5,6)}

Period T = number of multi-gossips; here T = 3.

The minimal number of multi-gossips for a given neighbor graph N is the same as the minimal number of colors needed to color the edges of N so that no two edges incident on any vertex have the same color.

The minimal number of multi-gossips {and thus the minimal value of T} is the chromatic index of N which, if N is a tree, equals the degree (maximum vertex degree) of N. For the example graph, degree = 3.

Convergence rate $= \sqrt[T]{|\lambda|}$

Page 28: Distributed Averaging  via Gossiping

Modified Gossip Rule

Suppose agents i and j are to gossip at time t.

Standard update rule:

$$x_i(t+1) = \tfrac{1}{2}x_i(t) + \tfrac{1}{2}x_j(t), \qquad x_j(t+1) = \tfrac{1}{2}x_i(t) + \tfrac{1}{2}x_j(t)$$

Modified gossip rule, with 0 < α < 1:

$$x_i(t+1) = \alpha x_i(t) + (1-\alpha)x_j(t), \qquad x_j(t+1) = (1-\alpha)x_i(t) + \alpha x_j(t)$$
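A minimal sketch of a single gossip step under either rule (illustrative; α = 1/2 recovers the standard update):

```python
def gossip(x, i, j, alpha=0.5):
    """Apply the gossip (i, j) to the list of agreement variables x.

    alpha = 0.5 gives the standard update; any 0 < alpha < 1 gives the
    modified rule. The sum of the agreement variables is preserved.
    """
    xi, xj = x[i], x[j]
    x[i] = alpha * xi + (1 - alpha) * xj
    x[j] = (1 - alpha) * xi + alpha * xj
    return x
```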

Page 29: Distributed Averaging  via Gossiping

[Figure: |λ_2| vs α]

Page 30: Distributed Averaging  via Gossiping

ROADMAP

Consensus and averaging

Linear iterations

Gossiping

Periodic gossiping

Multi-gossip sequences

Request-based gossiping

Double linear iterations

Page 31: Distributed Averaging  via Gossiping

Repetitively Complete Multi-Gossip Sequences

An infinite multi-gossip sequence is repetitively complete with period T if the finite sequence of gossips which occurs in any given period of length T is complete.

For example, a periodic gossiping sequence with complete gossip matrices over each period is repetitively complete.

Suppose that the infinite sequence of multi-gossips under consideration is repetitively complete with period T. Then there exists a nonnegative constant λ < 1 such that, for each initial value of x(0), all n gossip variables converge to the average value x_avg as fast as λ^t converges to zero as t → ∞.

Page 32: Distributed Averaging  via Gossiping

[Figure: the example neighbor graph N with vertices 1–7]

Repetitively Complete Multi-Gossip Sequences

First period (T = 6): (5,7) (5,6) (3,2) (3,5) (3,4) (1,2)

$$A = P_{12}P_{34}P_{35}P_{32}P_{56}P_{57}, \qquad x(T+1) = A\,x(1)$$

Second period (T = 6): (5,6) (3,4) (1,2) (2,6) (5,7) (3,5)

$$B = P_{35}P_{57}P_{26}P_{12}P_{34}P_{56}, \qquad x(2T+1) = BA\,x(1)$$

A and B are complete gossip matrices.

Page 33: Distributed Averaging  via Gossiping

Thus we are interested in studying the convergence, as k → ∞, of products of the form

$$\prod_{i=1}^{k} G_i = G_k G_{k-1}\cdots G_2 G_1,$$

where each G_i is a complete gossip matrix.

Repetitively Complete Multi-Gossip Sequences

C = set of all complete multi-gossip sequences of length at most T

λ_σ = the second largest singular value of the complete gossip matrix determined by multi-gossip sequence σ ∈ C

The convergence rate of a repetitively complete gossip sequence of period T is no slower than

$$\sqrt[T]{\lambda}, \qquad \text{where } \lambda = \max_{\sigma\in C}\lambda_\sigma.$$

Page 34: Distributed Averaging  via Gossiping

ROADMAP

Consensus and averaging

Linear iterations

Gossiping

Periodic gossiping

Multi-gossip sequences

Request-based gossiping

Double linear iterations

Page 35: Distributed Averaging  via Gossiping

Request-Based Gossiping

A process in which a gossip takes place between two agents whenever one of the agents accepts a request placed by the other.

An event time of agent i is a time at which agent i places a request to gossip with a neighbor.

Assumption: The time between any two successive event times of agent i is bounded above by a positive number T_i.

Deadlocks can arise if a requesting agent is asked to gossip by another agent at the same time.

Distinct Neighbor Event Time Assumption: All of the event times of each agent differ from all of the event times of each of its neighbors.

A challenging problem to deal with!

Page 36: Distributed Averaging  via Gossiping

[Figure: the example neighbor graph N with vertices 1–7; 7 event-time sequences assigned, one per agent]

One way to satisfy the distinct neighbor event time assumption is to assign each agent a set of event times which is disjoint from the sets of event times of all other agents.

But it is possible to do a more efficient assignment.

Distinct Neighbor Event Time Assumption: All of the event times of each agent differ from all of the event times of each of its neighbors.

Page 37: Distributed Averaging  via Gossiping

[Figure: the example neighbor graph N with vertices 1–7]

Distinct Neighbor Event Time Assumption: All of the event times of each agent differ from all of the event times of each of its neighbors.

One way to satisfy the distinct neighbor event time assumption is to assign each agent a set of event times which is disjoint from the sets of event times of all other agents. But it is possible to do a more efficient assignment; here only 2 event-time sequences are assigned.

The minimal number of event-time sequences needed to satisfy the distinct neighbor event time assumption is the same as the minimal number of colors needed to color the vertices of N so that no two adjacent vertices have the same color.

The minimum number of different event-time sequences needed thus equals the chromatic number of N, which does not exceed 1 + the maximum vertex degree of N.

Page 38: Distributed Averaging  via Gossiping

Request-Based Protocol

Agent i's queue is a list q_i(t) of agent i's neighbor labels.

Protocol:

1. If t is an event time of agent i, agent i places a request to gossip with the neighbor whose label is at the front of the queue q_i(t).

2. If t is not an event time of agent i, agent i does not place a request to gossip.

3. If agent i receives one or more requests to gossip, it gossips with the requesting neighbor whose label is closest to the front of the queue q_i(t) and then moves the label of that neighbor to the end of q_i(t).

4. If t is not an event time of agent i and agent i does not receive a request to gossip, then agent i does not gossip.

Page 39: Distributed Averaging  via Gossiping

Let E denote the set of edges of the neighbor graph N.

Suppose that the distinct neighbor event time assumption holds. Then the infinite sequence of gossips generated by the request-based protocol will be repetitively complete with period

$$T = \max_{(i,k)\in E}\ \min\Big\{\, T_i\sum_{j\in N_i} d_j,\ \ T_k\sum_{j\in N_k} d_j \Big\},$$

where, for all i, T_i is an upper bound on the time between two successive event times of agent i, d_j is the number of neighbors of agent j, and N_i is the set of labels of agent i's neighbors.

Page 40: Distributed Averaging  via Gossiping

ROADMAP

Consensus and averaging

Linear iterations

Gossiping

Periodic gossiping

Multi-gossip sequences

Request-based gossiping

Double linear iterations

Page 41: Distributed Averaging  via Gossiping

Double Linear Iteration

y_i = unscaled agreement variable, z_i = scaling variable:

$$y(t) = \begin{bmatrix} y_1(t)\\ \vdots\\ y_n(t)\end{bmatrix}, \qquad z(t) = \begin{bmatrix} z_1(t)\\ \vdots\\ z_n(t)\end{bmatrix}, \qquad x_i(t) = \frac{y_i(t)}{z_i(t)}$$

$$y(t+1) = S(t)y(t),\quad y(0) = x(0); \qquad z(t+1) = S(t)z(t),\quad z(0) = \mathbf{1}$$

Suppose each S(t) is left stochastic (i.e. S'(t) is stochastic) with positive diagonals, and suppose

$$\lim_{t\to\infty} S(t)S(t-1)\cdots S(1) = q\mathbf{1}' \quad\text{with } q > 0.$$

Then

$$\lim_{t\to\infty} y(t) = q\mathbf{1}'x(0) = qn\,x_{\text{avg}}, \qquad \lim_{t\to\infty} z(t) = q\mathbf{1}'\mathbf{1} = qn.$$

Moreover z(t) > 0 for all t < ∞ because each S(t) has positive diagonals, and z(t) > 0 for all t ≤ ∞ because the limit of z(t) is nq with q > 0. Therefore

$$\lim_{t\to\infty} x_i(t) = \lim_{t\to\infty}\frac{y_i(t)}{z_i(t)} = \frac{q_i\,n\,x_{\text{avg}}}{q_i\,n} = x_{\text{avg}}, \qquad i \in \{1, 2, \ldots, n\}.$$

{Benezit, Blondel, Thiran, Tsitsiklis, Vetterli - 2010}

Page 42: Distributed Averaging  via Gossiping

Broadcast-Based

Double Linear Iteration

Initialization: y_i(0) = x_i(0), z_i(0) = 1

Transmission: Agent i broadcasts the pair {y_i(t), z_i(t)} to each of its neighbors.

Update: x_i(t) = y_i(t)/z_i(t)

Agents require the same network information as the Metropolis iteration; n transmissions per iteration; works even if N depends on t.

Why does it work?

y(0) = x(0)

z(0) = 1

S = (I+A)(I+D)^{-1}, where A = adjacency matrix of N and D = degree matrix of N; S is left stochastic with positive diagonals (the transpose of a flocking matrix).
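A sketch of the broadcast-based double linear iteration in matrix form (illustrative; it simply iterates y(t+1) = S y(t), z(t+1) = S z(t) with S = (I+A)(I+D)^{-1} as defined above and reads out x_i = y_i/z_i; the small example graph is my own):

```python
import numpy as np

def double_linear_iteration(A, x0, steps=2000):
    """Broadcast-based double linear iteration with S = (I+A)(I+D)^{-1}.

    A  : adjacency matrix of the connected neighbor graph N
    x0 : initial agreement variables x(0)
    Returns the ratios x_i = y_i / z_i after the given number of steps.
    """
    n = A.shape[0]
    D = np.diag(A.sum(axis=1))                            # degree matrix of N
    S = (np.eye(n) + A) @ np.linalg.inv(np.eye(n) + D)    # left stochastic, positive diagonals
    y = np.array(x0, dtype=float)                         # y(0) = x(0)
    z = np.ones(n)                                        # z(0) = 1
    for _ in range(steps):
        y, z = S @ y, S @ z
    return y / z                                          # x_i(t) = y_i(t)/z_i(t) -> x_avg

# Example on a small path graph (an assumption for illustration):
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
x0 = [2.0, -4.0, 8.0]
print(double_linear_iteration(A, x0))                     # each entry close to x_avg = 2.0
```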

Page 43: Distributed Averaging  via Gossiping

Connectivity of N implies strong connectivity of the graph of I + A.

Therefore S' is irreducible because N is connected.

Therefore S' has a strongly connected graph.

Therefore S' has a rooted graph.

Page 44: Distributed Averaging  via Gossiping

y(0) = x(0), z(0) = 1

S = (I+A)(I+D)^{-1} is left stochastic with positive diagonals, because N is connected.

Sq = q, q'1 = 1

z(t) > 0 for all t < ∞ because S has positive diagonals.

z(t) > 0 for all t ≤ ∞ because z(∞) = nq and q > 0.

So x_i(t) = y_i(t)/z_i(t) is well-defined.

Page 45: Distributed Averaging  via Gossiping

Metropolis Iteration vs Double Linear Iteration

30 vertex random graphs

Metropolis

Double Linear

Page 46: Distributed Averaging  via Gossiping

Round Robin-Based Double Linear Iteration

Initialization: y_i(0) = x_i(0), z_i(0) = 1

Transmission: Agent i transmits the pair {y_i(t), z_i(t)} to its preferred neighbor. At the same time, agent i receives the values from the agents j_1, j_2, ..., j_k who have chosen agent i as their current preferred neighbor.

Update: Agent i then moves the label of its current preferred neighbor to the end of its queue and sets x_i(t) = y_i(t)/z_i(t).

No required network information; n transmissions per iteration.

Why does it work?

Page 47: Distributed Averaging  via Gossiping

S(0), S(1), ... is periodic with period T = lcm{d_1, d_2, ..., d_n}.

S(t) = left stochastic with positive diagonals.

P_τ^k > 0 for some k > 0, so P_τ is primitive.

Perron-Frobenius: P_τ has a single eigenvalue at 1 and it has multiplicity 1; q_τ > 0.

z(t) > 0 for all t < ∞ because each S(t) has positive diagonals.

z(t) > 0 for all t ≤ ∞ because z(t) → {nq_1, nq_2, ..., nq_T} and q_τ > 0.

$$\lim_{t\to\infty} x_i(t) = \lim_{t\to\infty}\frac{y_i(t)}{z_i(t)} = x_{\text{avg}}, \qquad i\in\{1,2,\ldots,n\}$$