TCOM 501: Networking Theory & Fundamentals
Lectures 9 & 10
M/G/1 Queue
Prof. Yannis A. Korilis
10-2 Topics
- M/G/1 Queue
- Pollaczek-Khinchin (P-K) Formula
- Embedded Markov Chain Observed at Departure Epochs
- Pollaczek-Khinchin Transform Equation
- Queues with Vacations
- Priority Queueing
10-3 M/G/1 Queue
Arrival process: Poisson with rate λ
Single server, infinite waiting room
Service times: independent, identically distributed, following a general distribution; independent of the arrival process
Main results:
- Determine the average time a customer spends in the queue waiting for service (Pollaczek-Khinchin formula)
- Calculation of the stationary distribution, for special cases only
10-4 M/G/1 Queue – Notation
W_i: waiting time of customer i
X_i: service time of customer i
Q_i: number of customers waiting in queue (excluding the one in service) upon arrival of customer i
R_i: residual service time seen by customer i = time until the customer found in service by customer i completes service
A_i: number of arrivals during the service time X_i of customer i

Service Times
X_1, X_2, ...: independent, identically distributed RVs
Independent of the inter-arrival times
Follow a general distribution characterized by its pdf f_X(x) or cdf F_X(x)
Common mean E[X] = 1/μ
Common second moment E[X²]
10-5 M/G/1 Queue
State representation:
{N(t) : t ≥ 0} is not a Markov process – the time spent at each state is not exponential
R(t): the time until the customer that is in service at time t completes service
{(N(t), R(t)) : t ≥ 0} is a continuous-time Markov process, but its state space is not a countable set
Finding the stationary distribution can be a rather challenging task
Goals:
Calculate the average number of customers and the average time delay without first calculating the stationary distribution
Pollaczek-Khinchin (P-K) formula:

    E[W] = λE[X²] / (2(1 − λE[X]))

To find the stationary distribution, we use the embedded Markov chain, defined by observing N(t) at departure epochs only, together with transform methods
10-6 A Result from Probability Theory
Proposition: Sum of a Random Number of Random Variables
N: random variable taking values 0, 1, 2, ..., with mean E[N]
X_1, X_2, ..., X_N: iid random variables with common mean E[X], independent of N
Then: E[X_1 + ... + X_N] = E[X] E[N]
Proof: Given that N = n, the expected value of the sum is

    E[Σ_{j=1}^N X_j | N = n] = E[Σ_{j=1}^n X_j] = n E[X]

Then:

    E[Σ_{j=1}^N X_j] = Σ_n E[Σ_{j=1}^N X_j | N = n] P{N = n} = Σ_n n E[X] P{N = n}
                     = E[X] Σ_n n P{N = n} = E[X] E[N]
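This identity (a form of Wald's identity) is easy to check numerically. The sketch below is illustration only: the choice of a geometric N and exponential X_i, and the function name, are assumptions, not part of the lecture.

```python
import random

def sum_of_random_count(trials=200_000, seed=1):
    """Monte Carlo check of E[X_1 + ... + X_N] = E[X] * E[N].

    Assumed demo distributions: N geometric on {0, 1, 2, ...} with
    parameter p (mean (1-p)/p), X_i ~ Exponential(mu) (mean 1/mu).
    """
    rng = random.Random(seed)
    p, mu = 0.4, 2.0          # assumed parameters: E[N] = 1.5, E[X] = 0.5
    total = 0.0
    for _ in range(trials):
        n = 0
        while rng.random() > p:   # geometric count of terms
            n += 1
        total += sum(rng.expovariate(mu) for _ in range(n))
    return total / trials
```

With these parameters the proposition predicts E[X]·E[N] = 0.5 · 1.5 = 0.75, which the sample mean should approach.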
10-7 Pollaczek-Khinchin Formula
Assume the FCFS discipline.
Waiting time for customer i is:

    W_i = R_i + X_{i−1} + X_{i−2} + ... + X_{i−Q_i} = R_i + Σ_{j=i−Q_i}^{i−1} X_j

Take the expectation on both sides and let i → ∞, assuming the limits exist:

    E[W_i] = E[R_i] + E[ Σ_{j=i−Q_i}^{i−1} X_j ] = E[R_i] + E[X] E[Q_i]
    E[W] = E[R] + E[X] E[Q]

The averages E[Q], E[R] in the above equation are those seen by an arriving customer.
Poisson arrivals and lack of anticipation: averages seen by an arriving customer are equal to averages seen by an outside observer – PASTA property
Little's theorem for the waiting area only: E[Q] = λE[W]

    E[W] = E[R] + E[X] λE[W] = E[R] + ρE[W]  ⇒  E[W] = E[R] / (1 − ρ)

ρ = λE[X] = λ/μ: utilization factor = proportion of time the server is busy; ρ = 1 − p_0
Calculate the average residual time: E[R] = lim_{i→∞} E[R_i]
10-8 Average Residual Time
[Figure: sawtooth plot of the residual service time R(t) versus t; each service time X_i generates a triangle of height X_i, and D(t) counts departures in [0, t]]

Graphical calculation of the long-term average of the residual time
Time-average residual time over [0, t]: (1/t) ∫_0^t R(s) ds
Consider time t such that R(t) = 0. Let D(t) be the number of departures in [0, t] and assume that R(0) = 0. From the figure we see that:

    (1/t) ∫_0^t R(s) ds = (1/t) Σ_{i=1}^{D(t)} X_i²/2 = (1/2) · (D(t)/t) · ( Σ_{i=1}^{D(t)} X_i² / D(t) )

    lim_{t→∞} (1/t) ∫_0^t R(s) ds = (1/2) lim_{t→∞} (D(t)/t) · lim_{t→∞} ( Σ_{i=1}^{D(t)} X_i² / D(t) )

Ergodicity: long-term time averages = steady-state averages (with probability 1)

    E[R] = lim_{i→∞} E[R_i] = lim_{t→∞} (1/t) ∫_0^t R(s) ds
10-9 Average Residual Time (cont.)
    lim_{t→∞} (1/t) ∫_0^t R(s) ds = (1/2) lim_{t→∞} (D(t)/t) · lim_{t→∞} ( Σ_{i=1}^{D(t)} X_i² / D(t) )

lim_{t→∞} D(t)/t: long-term average departure rate. It should be equal to the long-term average arrival rate. Long-term averages = steady-state averages (with probability 1):

    lim_{t→∞} D(t)/t = λ

Law of large numbers:

    lim_{t→∞} Σ_{i=1}^{D(t)} X_i² / D(t) = lim_{n→∞} Σ_{i=1}^{n} X_i² / n = E[X²]

Average residual time:

    E[R] = (1/2) λE[X²]

P-K formula:

    E[W] = E[R] / (1 − ρ) = λE[X²] / (2(1 − ρ))
10-10 P-K Formula
P-K formula:

    E[W] = E[R] / (1 − ρ) = λE[X²] / (2(1 − ρ))

Average time a customer spends in the system:

    E[T] = E[X] + E[W] = 1/μ + λE[X²] / (2(1 − ρ))

Average number of customers waiting for service:

    E[Q] = λE[W] = λ²E[X²] / (2(1 − ρ))

Average number of customers in the system (waiting or in service):

    E[N] = λE[T] = ρ + λ²E[X²] / (2(1 − ρ))

The averages E[W], E[T], E[Q], E[N] depend only on the first two moments of the service time.
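The four averages above follow mechanically from λ, E[X], and E[X²], so they can be packaged into a small helper. This is a minimal sketch; the function name `pk_metrics` is an assumption, and the code simply evaluates the P-K formula together with Little's theorem:

```python
def pk_metrics(lam, EX, EX2):
    """Pollaczek-Khinchin averages for an M/G/1 queue.

    lam: Poisson arrival rate; EX: mean service time E[X];
    EX2: second moment of the service time E[X^2].
    Requires rho = lam * EX < 1 for stability.
    """
    rho = lam * EX
    if rho >= 1:
        raise ValueError("unstable queue: rho >= 1")
    EW = lam * EX2 / (2 * (1 - rho))   # P-K formula: mean wait in queue
    ET = EX + EW                       # mean time in system
    EQ = lam * EW                      # Little's theorem, waiting area
    EN = lam * ET                      # Little's theorem, whole system
    return {"rho": rho, "E[W]": EW, "E[T]": ET, "E[Q]": EQ, "E[N]": EN}
```

For example, with λ = 0.5 and exponential service of mean 1 (E[X²] = 2), this reproduces the familiar M/M/1 values E[T] = 1/(μ−λ) = 2 and E[N] = ρ/(1−ρ) = 1.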
10-11 P-K Formula: Examples
M/D/1 queue: deterministic service times, all equal to 1/μ

    E[X] = 1/μ,   E[X²] = 1/μ²

    E[W] = λE[X²] / (2(1 − ρ)) = ρ / (2μ(1 − ρ)),   E[Q] = λ²E[X²] / (2(1 − ρ)) = ρ² / (2(1 − ρ))

    E[T] = 1/μ + ρ / (2μ(1 − ρ)),   E[N] = λE[T] = ρ(2 − ρ) / (2(1 − ρ))

M/M/1 queue: exponential service times with mean 1/μ

    E[X] = 1/μ,   E[X²] = 2/μ²

    E[W] = λE[X²] / (2(1 − ρ)) = ρ / (μ(1 − ρ)),   E[Q] = λ²E[X²] / (2(1 − ρ)) = ρ² / (1 − ρ)

    E[T] = 1/μ + ρ / (μ(1 − ρ)) = 1 / (μ − λ),   E[N] = λE[T] = ρ / (1 − ρ)
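Comparing the two examples numerically makes the role of service-time variability concrete: for the same λ and μ, the M/D/1 wait is exactly half the M/M/1 wait, since E[X²] is halved. A minimal sketch (the parameter values are arbitrary illustrations):

```python
def mg1_wait(lam, mu, EX2):
    """Mean queueing wait E[W] from the P-K formula."""
    rho = lam / mu
    return lam * EX2 / (2 * (1 - rho))

lam, mu = 0.8, 1.0                      # assumed example load, rho = 0.8
w_md1 = mg1_wait(lam, mu, 1 / mu**2)    # deterministic: E[X^2] = 1/mu^2
w_mm1 = mg1_wait(lam, mu, 2 / mu**2)    # exponential:   E[X^2] = 2/mu^2
```

Here w_md1 = 2.0 while w_mm1 = 4.0: randomness in the service times alone doubles the average wait.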
10-12 Distribution Upon Arrival or Departure
Theorem 1: For an M/G/1 queue at steady-state, the distribution of customers seen by an arriving customer is the same as that left behind by a departing customer.
Proof: Customers arrive one at a time and depart one at a time.
A(t), D(t): number of arrivals and departures (respectively) in (0, t)
U_n(t): number of (n, n+1) transitions in (0, t) = number of arrivals that find the system at state n
V_n(t): number of (n+1, n) transitions in (0, t) = number of departures that leave the system at state n
U_n(t) and V_n(t) differ by at most 1 [when an (n, n+1) transition occurs, another (n, n+1) transition can occur only if the state has moved back to n, i.e., after an (n+1, n) transition has occurred]
Stationary probability that an arriving customer finds the system at state n:

    α_n = lim_{t→∞} P{N(t) = n | arrival at t}

α_n is the proportion of arrivals that find the system at state n:

    α_n = lim_{t→∞} U_n(t) / A(t)

Similarly, the stationary probability that a departing customer leaves the system at state n:

    β_n = lim_{t→∞} V_n(t) / D(t)

Noting that lim_{t→∞} A(t)/t = lim_{t→∞} D(t)/t, and that U_n(t), V_n(t) differ by at most 1, we have:

    α_n = lim_{t→∞} (U_n(t)/t) · (t/A(t)) = lim_{t→∞} (V_n(t)/t) · (t/D(t)) = β_n
10-13 Distribution Upon Arrival or Departure (cont.)
Theorem 2: For an M/G/1 queue at steady-state, the probability that an arriving customer finds n customers in the system is equal to the proportion of time that there are n customers in the system. Therefore, the distribution seen by an arriving customer is identical to the stationary distribution.
Proof: Identical to the PASTA theorem, due to Poisson arrivals and lack of anticipation (future arrivals are independent of the current state N(t)).
Theorem 3: For an M/G/1 queue at steady-state, the system appears statistically identical to an arriving and a departing customer. Both an arriving and a departing customer, at steady-state, see a system that is statistically identical to the one seen by an observer looking at the system at an arbitrary time.
Analysis of the M/G/1 queue:
- Consider the embedded Markov chain resulting from observing the system at departure epochs
- At steady-state, the embedded Markov chain and {N(t)} are statistically identical
- The stationary distribution p_n is equal to the stationary distribution of the embedded Markov chain
10-14 Embedded Markov Chain
s_j: time of the jth departure
L_j = N(s_j): number of customers left behind by the jth departing customer
Show that {L_j : j ≥ 1} is a Markov chain.
If L_{j−1} ≥ 1: customer j enters service immediately at time s_{j−1}. Then:

    L_j = L_{j−1} − 1 + A_j,   if L_{j−1} ≥ 1

If L_{j−1} = 0: customer j arrives after time s_{j−1} and departs at time s_j. Then:

    L_j = A_j,   if L_{j−1} = 0

Combining the above:

    L_j = L_{j−1} − 1{L_{j−1} > 0} + A_j

A_j: number of arrivals during the service time X_j:

    P{A_j = k} = ∫_0^∞ P{A_j = k | X_j = t} f_X(t) dt = ∫_0^∞ (1/k!) e^{−λt} (λt)^k f_X(t) dt

A_1, A_2, ...: independent – arrivals in disjoint intervals
L_j depends on the past only through L_{j−1}. Thus, {L_j : j ≥ 1} is a Markov chain.
10-15 Number of Arrivals During a Service Time
A_1, A_2, ...: iid. Drop the index j – equivalent to considering the system at steady state.

    a_k = P{A = k} = ∫_0^∞ P{A = k | X = t} f_X(t) dt = ∫_0^∞ (1/k!) e^{−λt} (λt)^k f_X(t) dt,   k = 0, 1, ...

Find the first two moments of A.
Proposition: For the number of arrivals A during service time X, we have:

    E[A] = λE[X]
    E[A²] = λ²E[X²] + λE[X]

Proof: Given X = t, the number of arrivals A follows the Poisson distribution with parameter λt.

    E[A] = ∫_0^∞ E[A | X = t] f_X(t) dt = ∫_0^∞ λt f_X(t) dt = λ ∫_0^∞ t f_X(t) dt = λE[X]

    E[A²] = ∫_0^∞ E[A² | X = t] f_X(t) dt = ∫_0^∞ (λ²t² + λt) f_X(t) dt
          = λ² ∫_0^∞ t² f_X(t) dt + λ ∫_0^∞ t f_X(t) dt = λ²E[X²] + λE[X]

Lemma: Let Y be a RV following the Poisson distribution with parameter α > 0. Then:

    E[Y] = Σ_{k=1}^∞ k (α^k/k!) e^{−α} = α,   E[Y²] = Σ_{k=1}^∞ k² (α^k/k!) e^{−α} = α² + α
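For exponential service (the M/M/1 special case), the integral defining a_k evaluates in closed form to a geometric pmf, which gives a quick numerical check of the proposition's moment formulas. A sketch, with the parameter values chosen arbitrarily:

```python
def arrival_pmf_exp_service(k, lam, mu):
    """P{A = k} for Poisson(lam) arrivals during an Exponential(mu) service.

    The integral of (lam t)^k e^{-lam t}/k! * mu e^{-mu t} dt works out to
    mu * lam^k / (lam + mu)^(k+1), a geometric pmf with p = lam/(lam + mu).
    """
    p = lam / (lam + mu)
    return (1 - p) * p**k

lam, mu = 1.0, 2.0                                   # assumed rates
# truncate the sums; the geometric tail beyond k = 200 is negligible
EA  = sum(k * arrival_pmf_exp_service(k, lam, mu) for k in range(200))
EA2 = sum(k * k * arrival_pmf_exp_service(k, lam, mu) for k in range(200))
```

The proposition predicts E[A] = λE[X] = λ/μ = 0.5 and E[A²] = λ²E[X²] + λE[X] = λ²·(2/μ²) + λ/μ = 1.0, matching the truncated sums.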
10-16 Embedded Markov Chain
[Figure: state-transition diagram of the embedded chain on states 0, 1, 2, 3, ..., with transitions labeled by the probabilities a_k]

Transition probabilities: P_ij = P{L_{n+1} = j | L_n = i}

    P_0j = a_j,          j = 0, 1, ...
    P_ij = a_{j−i+1},    i ≥ 1, j ≥ i − 1
    P_ij = 0,            j < i − 1

Stationary distribution: π_j = lim_{n→∞} P{L_n = j}

    π_j = π_0 a_j + Σ_{i=1}^{j+1} π_i a_{j−i+1},   j ≥ 0,   with Σ_j π_j = 1

Unique solution: π_j is the fraction of departing customers that leave j customers behind.
From Theorem 3: π_j is also the proportion of time that there are j customers in the system.
10-17 Calculating the Stationary Distribution
Applying Little's theorem for the server, the proportion of time that the server is busy is:

    1 − π_0 = λE[X] = ρ  ⇒  π_0 = 1 − ρ

The stationary distribution can be calculated iteratively from the balance equations:

    π_0 = (π_0 + π_1) a_0  ⇒  π_1 = π_0 (1 − a_0) / a_0
    π_1 = π_0 a_1 + π_1 a_1 + π_2 a_0  ⇒  π_2 = [π_1 (1 − a_1) − π_0 a_1] / a_0

The iterative calculation might be prohibitively involved.
Often, we want to find only the first few moments of the distribution, e.g., E[N] and E[N²].
We will present a general methodology based on z-transforms that can be used to:
1. Find the moments of the stationary distribution without calculating the distribution itself
2. Find the stationary distribution, in special cases
3. Derive approximations of the stationary distribution
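The iterative calculation can be sketched in a few lines. The function name is an assumption, and exponential service is chosen because a_k is then geometric, so the result can be compared against the known M/M/1 distribution p_n = (1−ρ)ρⁿ:

```python
def mg1_stationary(a, rho, n_max):
    """Iteratively solve pi_j = pi_0*a(j) + sum_{i=1}^{j+1} pi_i*a(j-i+1)
    for the embedded-chain stationary distribution, starting from
    pi_0 = 1 - rho and solving each equation for pi_{j+1}."""
    pi = [1 - rho]
    for j in range(n_max):
        s = pi[0] * a(j) + sum(pi[i] * a(j - i + 1) for i in range(1, j + 1))
        pi.append((pi[j] - s) / a(0))
    return pi

# Example: exponential service with rate mu -> a_k geometric (see slide 10-15)
lam, mu = 0.5, 1.0          # assumed rates, rho = 0.5
rho = lam / mu
p = lam / (lam + mu)
pi = mg1_stationary(lambda k: (1 - p) * p**k, rho, 10)
```

The computed π_n should agree with (1−ρ)ρⁿ = 0.5 · 0.5ⁿ, confirming that the embedded-chain distribution coincides with the M/M/1 stationary distribution.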
10-18 Moment Generating Functions
Definition: Moment generating function of random variable X, for any t ∈ R:

    M_X(t) = E[e^{tX}] = ∫ e^{tx} f_X(x) dx,           X continuous
    M_X(t) = E[e^{tX}] = Σ_j e^{t x_j} P{X = x_j},     X discrete

Theorem 1: If the moment generating function M_X(t) exists and is finite in some neighborhood of t = 0, it determines the distribution (pdf or pmf) of X uniquely.
Theorem 2: For any positive integer n:

    1.  (d^n/dt^n) M_X(t) = E[X^n e^{tX}]
    2.  (d^n/dt^n) M_X(0) = E[X^n]

Theorem 3: If X and Y are independent random variables:

    M_{X+Y}(t) = M_X(t) M_Y(t)
10-19 Z-Transforms of Discrete Random Variables
For a discrete random variable, the moment generating function is a polynomial in e^t.
It is more convenient to set z = e^t and define the z-transform (or characteristic function):

    G_X(z) = E[z^X] = Σ_j z^{x_j} P{X = x_j}

Let X be a discrete random variable taking values 0, 1, 2, ..., and let p_n = P{X = n}. The z-transform is well-defined for |z| ≤ 1:

    G_X(z) = Σ_{n=0}^∞ p_n z^n = p_0 + p_1 z + p_2 z² + p_3 z³ + ...

The z-transform uniquely determines the distribution of X.
If X and Y are independent random variables: G_{X+Y}(z) = G_X(z) G_Y(z)
Calculating factorial moments:

    lim_{z→1⁻} G′_X(z) = lim_{z→1⁻} Σ_{n=1}^∞ n p_n z^{n−1} = Σ_{n=1}^∞ n p_n = E[X]

    lim_{z→1⁻} G″_X(z) = lim_{z→1⁻} Σ_{n=2}^∞ n(n−1) p_n z^{n−2} = Σ_{n=2}^∞ n(n−1) p_n = E[X(X−1)]

Higher factorial moments can be calculated similarly.
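The factorial-moment property can be checked numerically by differentiating a known z-transform. A sketch using the Poisson pgf G(z) = e^{λ(z−1)} (for which E[X] = λ and E[X(X−1)] = λ²) and plain finite differences; the helper names are illustrative:

```python
import math

def pgf_poisson(z, lam):
    """z-transform (pgf) of a Poisson(lam) variable: E[z^X] = e^{lam(z-1)}."""
    return math.exp(lam * (z - 1))

def d1(f, z, h=1e-5):
    """Central-difference estimate of f'(z)."""
    return (f(z + h) - f(z - h)) / (2 * h)

def d2(f, z, h=1e-4):
    """Central-difference estimate of f''(z)."""
    return (f(z + h) - 2 * f(z) + f(z - h)) / h**2

lam = 3.0
EX = d1(lambda z: pgf_poisson(z, lam), 1.0)      # G'(1)  = E[X] = lam
fact2 = d2(lambda z: pgf_poisson(z, lam), 1.0)   # G''(1) = E[X(X-1)] = lam^2
```

This reproduces E[X] = 3 and E[X(X−1)] = 9, from which E[X²] = 9 + 3 = 12 follows.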
10-20 Continuous Random Variables
Distribution (parameters) | pdf f_X(x) | MGF M_X(t) | Mean E[X] | Variance Var(X)
Uniform (a, b):  f_X(x) = 1/(b − a), a < x < b;  M_X(t) = (e^{tb} − e^{ta}) / (t(b − a));  mean (a + b)/2;  variance (b − a)²/12
Exponential (λ):  f_X(x) = λe^{−λx}, x ≥ 0;  M_X(t) = λ/(λ − t);  mean 1/λ;  variance 1/λ²
Normal (μ, σ²):  f_X(x) = (1/(√(2π) σ)) e^{−(x−μ)²/(2σ²)}, −∞ < x < ∞;  M_X(t) = e^{μt + σ²t²/2};  mean μ;  variance σ²
10-21 Discrete Random Variables
Distribution (parameters) | pmf P{X = k} | MGF M_X(t) | Mean E[X] | Variance Var(X)
Binomial (n, p):  P{X = k} = C(n, k) p^k (1 − p)^{n−k}, k = 0, 1, ..., n;  M_X(t) = (pe^t + 1 − p)^n;  mean np;  variance np(1 − p)
Geometric (p):  P{X = k} = (1 − p)^{k−1} p, k = 1, 2, ...;  M_X(t) = pe^t / (1 − (1 − p)e^t);  mean 1/p;  variance (1 − p)/p²
Negative Binomial (r, p):  P{X = k} = C(k−1, r−1) p^r (1 − p)^{k−r}, k = r, r + 1, ...;  M_X(t) = [pe^t / (1 − (1 − p)e^t)]^r;  mean r/p;  variance r(1 − p)/p²
Poisson (λ):  P{X = k} = e^{−λ} λ^k / k!, k = 0, 1, ...;  M_X(t) = e^{λ(e^t − 1)};  mean λ;  variance λ
10-22 P-K Transform Equation
We have established:

    L_j = L_{j−1} − 1{L_{j−1} > 0} + A_j = (L_{j−1} − 1)⁺ + A_j

Let π_n = lim_{j→∞} P{L_j = n} be the stationary distribution and G_L(z) = Σ_{n=0}^∞ π_n z^n its z-transform.
Noting that (L_{j−1} − 1)⁺ and A_j are independent, we have:

    E[z^{L_j}] = E[z^{(L_{j−1} − 1)⁺}] E[z^{A_j}]

At steady-state, L_j and L_{j−1} are statistically identical, with pmf π_n. Therefore:

    E[z^{L_j}] = E[z^{L_{j−1}}] = G_L(z)

Moreover:

    E[z^{A_j}] = G_A(z) = Σ_{n=0}^∞ a_n z^n

Let X be a discrete random variable taking values 0, 1, 2, ..., and let p_n = P{X = n}. Then:

    E[z^{(X−1)⁺}] = p_0 + p_1 + p_2 z + p_3 z² + ... = p_0 + z^{−1}(E[z^X] − p_0)

Therefore:

    E[z^{(L_{j−1} − 1)⁺}] = π_0 + z^{−1}(G_L(z) − π_0)

Then:

    G_L(z) = [π_0 + z^{−1}(G_L(z) − π_0)] G_A(z)  ⇒  G_L(z) = π_0 (z − 1) G_A(z) / (z − G_A(z))
10-23 P-K Transform Equation
The probability π_0 can be calculated by requiring

    lim_{z→1⁻} G_L(z) = Σ_n π_n = 1

Using L'Hôpital's rule:

    1 = π_0 lim_{z→1} [G_A(z) + (z − 1)G′_A(z)] / [1 − G′_A(z)] = π_0 / (1 − E[A])

Recall that E[A] = λE[X] = ρ. For π_0 > 0, we must have E[A] = ρ < 1. Finally: π_0 = 1 − ρ.

    G_A(z) = Σ_{k=0}^∞ a_k z^k = Σ_{k=0}^∞ z^k ∫_0^∞ ((λx)^k / k!) e^{−λx} f_X(x) dx
           = ∫_0^∞ e^{−λx} Σ_{k=0}^∞ ((λxz)^k / k!) f_X(x) dx = ∫_0^∞ e^{−λx(1−z)} f_X(x) dx
           = M_X(λ(z − 1))

where M_X(t) = E[e^{tX}] = ∫_0^∞ e^{tx} f_X(x) dx is the moment generating function of the service time X.
At steady-state, the number of customers left behind by a departing customer and the number of customers in the system are statistically identical, i.e., {L_j} and {N(t)} have the same pmf.
Concluding:

    G_N(z) = G_L(z) = (1 − ρ)(z − 1) G_A(z) / (z − G_A(z)) = (1 − ρ)(z − 1) M_X(λ(z − 1)) / (z − M_X(λ(z − 1)))
10-24 P-K Transform Equation
Example 1: M/M/1 queue. X is exponentially distributed with mean 1/μ. Then:

    M_X(t) = E[e^{tX}] = μ / (μ − t)

The z-transform of the number of arrivals during a service time is:

    G_A(z) = M_X(λ(z − 1)) = μ / (μ − λ(z − 1))

The P-K transform equation then gives:

    G_N(z) = (1 − ρ)(z − 1) G_A(z) / (z − G_A(z)) = (1 − ρ) / (1 − ρz)

For |z| < 1/ρ:

    1 / (1 − ρz) = 1 + ρz + ρ²z² + ...

Then:

    G_N(z) = (1 − ρ) Σ_{n=0}^∞ ρ^n z^n

Therefore: p_n = (1 − ρ) ρ^n,  n ≥ 0
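The algebraic simplification of the transform equation to (1−ρ)/(1−ρz) can be sanity-checked by evaluating both forms at an arbitrary point. A sketch, with the rates chosen only for illustration:

```python
def gn_transform(z, lam, mu):
    """G_N(z) from the P-K transform equation with exponential service:
    M_X(t) = mu/(mu - t), so G_A(z) = M_X(lam*(z - 1))."""
    rho = lam / mu
    GA = mu / (mu - lam * (z - 1))
    return (1 - rho) * (z - 1) * GA / (z - GA)

lam, mu = 0.6, 1.0            # assumed rates, rho = 0.6
rho = lam / mu
```

Evaluating at, say, z = 0.3 and z = 0.9, the raw transform agrees with the simplified geometric-series form (1 − ρ)/(1 − ρz) to machine precision.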
10-25 Expansion in Partial Fractions
Assume that the z-transform is of the form:

    G(z) = U(z) / V(z)

with U(z) and V(z) polynomials without common roots. Let z_1, ..., z_m be the (distinct) roots of V(z). Then:

    V(z) = (z_1 − z)(z_2 − z) ··· (z_m − z)

Expansion of G(z) in partial fractions:

    G(z) = α_1/(z_1 − z) + α_2/(z_2 − z) + ... + α_m/(z_m − z)

Given such an expansion, for |z| < |z_k|:

    1/(z_k − z) = (1/z_k) · 1/(1 − z/z_k) = (1/z_k)(1 + z/z_k + z²/z_k² + ...)

Then:

    G(z) = Σ_{k=1}^m α_k/(z_k − z) = Σ_{k=1}^m (α_k/z_k) Σ_{n=0}^∞ (z/z_k)^n = Σ_{n=0}^∞ ( Σ_{k=1}^m α_k / z_k^{n+1} ) z^n

Therefore:

    p_n = Σ_{k=1}^m α_k / z_k^{n+1}
10-26 Expansion in Partial Fractions (cont.)
    G(z) = U(z) / V(z),   V(z) = (z_1 − z)(z_2 − z) ··· (z_m − z)

Expansion of G(z) in partial fractions:

    G(z) = α_1/(z_1 − z) + α_2/(z_2 − z) + ... + α_m/(z_m − z)

Determining the coefficients of the partial fractions:

    α_1 = lim_{z→z_1} (z_1 − z) G(z)

Note that:

    lim_{z→z_1} (z_1 − z) G(z) = lim_{z→z_1} U(z) / [(z_2 − z) ··· (z_m − z)] = −U(z_1) / V′(z_1)

Therefore, the coefficients can be determined as:

    α_k = lim_{z→z_k} (z_k − z) G(z) = −U(z_k) / V′(z_k)
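The coefficient formula translates directly into code. A minimal sketch for distinct real roots; the function name and the example transform G(z) = 1/((2 − z)(3 − z)) are illustrative choices, not from the lecture:

```python
def partial_fraction_coeffs(U, roots):
    """Coefficients alpha_k in G(z) = sum_k alpha_k/(z_k - z), where
    V(z) = (z_1 - z)...(z_m - z) has distinct roots. Equivalent to
    alpha_k = U(z_k) / prod_{j != k} (z_j - z_k)."""
    alphas = []
    for k, zk in enumerate(roots):
        denom = 1.0
        for j, zj in enumerate(roots):
            if j != k:
                denom *= (zj - zk)
        alphas.append(U(zk) / denom)
    return alphas

# Example: G(z) = 1/((2 - z)(3 - z)); then p_n = sum_k alpha_k / z_k^(n+1)
roots = [2.0, 3.0]
alphas = partial_fraction_coeffs(lambda z: 1.0, roots)
p = [sum(a / zk**(n + 1) for a, zk in zip(alphas, roots)) for n in range(5)]
```

Here α_1 = 1, α_2 = −1, so p_n = 1/2^{n+1} − 1/3^{n+1}; e.g., p_0 = 1/6, which indeed equals G(0).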
10-27 M/G/1 Queue with Priority Classes
M/G/1 system with arriving customers divided into c priority classes
Class 1 has the highest priority, class 2 the second highest, and so on up to class c, which has the lowest priority
Class k customers arrive according to a Poisson process with rate λ_k
Service times of class k customers are iid, following a general distribution with mean E[X_k] = 1/μ_k and second moment E[X_k²]
The arrival processes are independent of each other and independent of the service times
ρ_k = λ_k/μ_k = λ_k E[X_k]: utilization factor for class k
W_k: average queueing time (at steady-state) of class k customers
Preemptive or non-preemptive priority discipline
Develop a formula that gives the average queueing time for each priority class
10-28 Non-Preemptive Priority
Non-preemptive: service of a customer completes uninterrupted, even if customers of higher priority arrive in the meantime
A separate queue is maintained for each class; each time the server becomes free, the first customer in the highest priority queue (that is not empty) enters service
Non-preemptive policy: the mean residual service time R seen by an arriving customer is the same for all priority classes
Priority class 1:
Queueing time of a class 1 customer = residual service time + time required to serve all class 1 customers found in the queue upon arrival
Similarly to the derivation of the P-K formula, this implies:

    W_1 = R + (1/μ_1) Q_1

Little's formula: Q_1 = λ_1 W_1
Combining the two:

    W_1 = R / (1 − ρ_1)
10-29 Non-Preemptive Priority
Priority class 2:
The queueing time of a class 2 customer is the sum of the following:
1. The residual service time
2. The time to serve all class 1 customers found in the queue upon arrival
3. The time to serve all class 2 customers found in the queue upon arrival
4. The time to serve all class 1 customers that arrive while the customer waits in the queue
Taking averages of these times:

    W_2 = R + (1/μ_1) Q_1 + (1/μ_2) Q_2 + (1/μ_1) λ_1 W_2
        = R + ρ_1 W_1 + ρ_2 W_2 + ρ_1 W_2

Solving for W_2 and using the expression for W_1:

    W_2 = R / [(1 − ρ_1)(1 − ρ_1 − ρ_2)]
10-30 Non-Preemptive Priority
Priority class k: using induction,

    W_k = R / [(1 − ρ_1 − ... − ρ_{k−1})(1 − ρ_1 − ... − ρ_k)]

Mean residual service time: using the graphical method developed in the proof of the P-K formula, one can show:

    R = (1/2) Σ_{k=1}^c λ_k E[X_k²]

Average time a class k customer spends in the system:

    T_k = 1/μ_k + W_k

Using Little's formula we obtain the number of customers in the system for each class and, averaging over all customers, the average time delay per customer:

    T = (λ_1 T_1 + ... + λ_c T_c) / λ,   λ = λ_1 + ... + λ_c
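The non-preemptive formulas above chain naturally into a small routine that computes every W_k in one pass. A sketch; the function name and the two-class exponential example are assumptions for illustration:

```python
def nonpreemptive_waits(lams, EXs, EX2s):
    """Average queueing time W_k per class under non-preemptive priority
    (class 1 = index 0 has highest priority):
        W_k = R / ((1 - s_{k-1}) * (1 - s_k)),
    where s_k = rho_1 + ... + rho_k and R = 0.5 * sum_k lam_k * E[X_k^2]."""
    R = 0.5 * sum(l * m2 for l, m2 in zip(lams, EX2s))
    waits, s = [], 0.0
    for lam, EX in zip(lams, EXs):
        s_prev, s = s, s + lam * EX     # running utilization sums
        waits.append(R / ((1 - s_prev) * (1 - s)))
    return waits

# two exponential classes with unit service rate (assumed parameters)
W = nonpreemptive_waits([0.2, 0.3], [1.0, 1.0], [2.0, 2.0])
```

With a single class, the expression collapses to the plain P-K result W = R/(1 − ρ), which is a useful sanity check.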
10-31 Preemptive Priority
Non-preemptive policy: the average queueing time of a customer depends on the arrival rate of lower priority customers
Preemptive/resume policy: service of a customer is interrupted when a higher priority customer arrives; it resumes from the point of interruption when all higher priority customers have been served
Priority k customers are not affected by the presence of lower priority customers
Calculate T_k = average time a class k customer spends in the system. This consists of:
1. The average service time of the customer, 1/μ_k
2. The average time required to serve customers of priority 1 through k found in the system upon arrival. This is equal to the average waiting time in an M/G/1 system where customers of priority lower than k are neglected, that is:

    R_k / (1 − ρ_1 − ... − ρ_k),   where R_k = (1/2) Σ_{i=1}^k λ_i E[X_i²]

3. The average time required to serve customers of priority higher than k that arrive while the customer is in the system:

    Σ_{i=1}^{k−1} (λ_i/μ_i) T_k = T_k Σ_{i=1}^{k−1} ρ_i,   k > 1
10-32 Preemptive Priority
Combining these terms:

    T_k = 1/μ_k + R_k / (1 − ρ_1 − ... − ρ_k) + T_k Σ_{i=1}^{k−1} ρ_i

Final solution:

    T_k = [ (1/μ_k)(1 − ρ_1 − ... − ρ_k) + R_k ] / [ (1 − ρ_1 − ... − ρ_{k−1})(1 − ρ_1 − ... − ρ_k) ]

where:

    R_k = (1/2) Σ_{i=1}^k λ_i E[X_i²]
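The final preemptive/resume solution also computes in one pass over the classes. A sketch, with the function name and test parameters chosen for illustration:

```python
def preemptive_resume_times(lams, EXs, EX2s):
    """Average system time T_k per class under preemptive/resume priority:
        T_k = ((1/mu_k)(1 - s_k) + R_k) / ((1 - s_{k-1})(1 - s_k)),
    with s_k = rho_1 + ... + rho_k and R_k = 0.5 * sum_{i<=k} lam_i E[X_i^2].
    Class 1 (index 0) has the highest priority."""
    times, s, Rk = [], 0.0, 0.0
    for lam, EX, EX2 in zip(lams, EXs, EX2s):
        Rk += 0.5 * lam * EX2           # residual term over classes 1..k
        s_prev, s = s, s + lam * EX     # running utilization sums
        times.append((EX * (1 - s) + Rk) / ((1 - s_prev) * (1 - s)))
    return times

# two exponential classes with unit service rate (assumed parameters)
T = preemptive_resume_times([0.2, 0.3], [1.0, 1.0], [2.0, 2.0])
```

Two checks follow from the theory: with a single class the formula reduces to the M/M/1 value T = 1/(μ − λ), and the top class sees exactly the system it would see if lower classes did not exist.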