Distributing Layered Encoded Video through Caches
Authors:
Jussi Kangasharju Felix Hartanto Martin Reisslein Keith W. Ross
Proceedings of IEEE Infocom 2001, April 22-26, 2001, Alaska, USA.
Layout
Introduction Model of layered video streaming Optimal Caching Negotiation About Stream Quality Queuing of Requests Is Partial Caching Useful?
Introduction

Layered encoded video is well suited to heterogeneous environments like the Internet.
Placing cache servers between clients and origin servers is beneficial.
Question: given a limited cache size and limited bandwidth, which videos, and which layers of those videos, should be cached?
Methodology: based on a two-resource stochastic knapsack problem.
Model of layered video streaming with proxy

Video streams are stored on origin servers. Popular streams are cached at the proxy. Clients direct their requests to the appropriate proxy. If the requested stream is cached, it is delivered from the proxy to the client over the LAN. Otherwise, the origin server delivers the stream over the WAN to the proxy, which in turn delivers it to the client.
Layered Video

Pre-encoded using layered encoding techniques:

J. Lee, T. Kim, and S. Ko, “Motion prediction based on temporal layering for layered video coding,” in Proc. of ITC–CSCC, Vol. 1, July 1998.
S. McCanne and M. Vetterli, “Joint source/channel coding for multicast packet video,” in Proc. of IEEE International Conference on Image Processing, Oct. 1995.
M. Vishwanath and P. Chou, “An efficient algorithm for hierarchical compression of video,” in Proc. of IEEE International Conference on Image Processing, Nov. 1994.
A video consists of a base layer (basic quality information) and enhancement layers (quality enhancements).
Benefits: flexible streaming services; flexible pricing structures.
Layered Video Model

There are M video objects (CBR encoded). Each video has L layers.
r_l(m): rate (bits/sec) of layer l, l = 1,…,L, of video object m, m = 1,…,M.
j-quality stream: a stream consisting of layers 1, 2,…, j.
T(m), m = 1,…,M: length in seconds of video m.
R(j,m): revenue accrued from providing a j-quality stream of video m.
Proxy server model

Bandwidth for streaming media from origin servers to the proxy is fixed at C (bits/sec). The proxy has a finite storage capacity of G (bytes).
Caching strategy:
cache contents are updated periodically based on estimates of the clients’ request pattern.
cache complete layers of video objects so as to maximize the revenue accrued from the streaming service.
give layers of popular objects priority over those of less popular objects, and the base layer priority over enhancement layers.
Proxy server model (2)

Request arrivals: Poisson process with rate λ (req/sec).
p(j,m): the popularity of the j-quality stream of video m.
λ·p(j,m): arrival rate of requests for the j-quality stream of object m.
c = (c_1, c_2,…, c_M), with 0 ≤ c_m ≤ L for m = 1,…,M: cache indicator. c_m = i if layers 1 through i of video m are cached.
The space occupied by configuration c is

Σ_{m=1}^{M} T(m) · Σ_{l=1}^{c_m} r_l(m) ≤ G    (1)
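The space constraint can be sketched in code; the function name and data layout are illustrative, not taken from the paper, and the byte conversion assumes G is given in bytes while the layer rates are in bits/sec.

```python
def cache_space(c, rates, T):
    """Bytes occupied by cache configuration c, where c[m] is the number of
    layers of movie m cached, rates[m][l] is the rate of layer l+1 in bits/s,
    and T[m] is the length of movie m in seconds."""
    total_bits = sum(T[m] * sum(rates[m][:c[m]]) for m in range(len(c)))
    return total_bits / 8  # bits -> bytes

# One movie, two 1 Mbit/s layers, one hour long, both layers cached:
print(cache_space([2], [[1e6, 1e6]], [3600]))  # 900000000.0 bytes
```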
Stream delivery model

A client sends a request for a j-quality stream of video m to the proxy:
If all requested layers are cached (c_m ≥ j), the proxy delivers the video.
If some layers are missing (c_m < j), the server tries to stream the missing layers c_m+1,…, j to the client at rate Σ_{l=c_m+1}^{j} r_l(m).
If there is sufficient bandwidth, the request is served and a bandwidth of Σ_{l=c_m+1}^{j} r_l(m) is occupied for T(m) seconds.
Otherwise, the request is considered BLOCKED.
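The delivery rules amount to a simple admission test. A minimal sketch, with illustrative names and link occupancy tracked as a single number:

```python
def wan_bandwidth_needed(j, m, c, rates):
    """Rate needed to fetch the missing layers c[m]+1,...,j of movie m
    from the origin server (0 if all j requested layers are cached)."""
    return sum(rates[m][c[m]:j])

def handle_request(j, m, c, rates, C, in_use):
    """Return (admitted, new_in_use). The reserved bandwidth would be held
    for T(m) seconds; a blocked request leaves the link unchanged."""
    b = wan_bandwidth_needed(j, m, c, rates)
    if in_use + b <= C:
        return True, in_use + b
    return False, in_use
```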
Stream delivery model (2)

B_c(j,m): blocking probability of a request for a j-quality stream of video m under cache configuration c.
B_c(j,m) = 0 for c_m ≥ j.
B_c(j,m) can be calculated with the Kaufman–Roberts algorithm in O(CML) time.
The expected blocking probability is

B(c) = Σ_{m=1}^{M} Σ_{j=1}^{L} p(j,m) · B_c(j,m)
Blocking probability formula

B_c(j,m) = 1 − Σ_{n ∈ S_c(j,m)} π(n)

where S_c(j,m) = { n ∈ S_c : b_c · n ≤ C − b_c(j,m) } is the set of states with enough spare link capacity to admit the request, and π(n) is the equilibrium distribution of the stochastic knapsack over the state space S_c.

Reference for the loss model used in calculating the blocking probability: K. W. Ross, Multiservice Loss Models for Broadband Telecommunication Networks, Springer–Verlag, 1995.
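The blocking probabilities can be computed without enumerating states via the Kaufman–Roberts recursion. A generic single-link sketch (an assumption-laden illustration: bandwidth is discretized into integer units, and each request class is a (load, bandwidth) pair; in the paper’s setting a class would be a (j, m) pair with c_m < j whose bandwidth is the sum of its missing-layer rates):

```python
def kaufman_roberts(C, classes):
    """classes: list of (offered_load_erlangs, bandwidth_units) pairs.
    Returns the blocking probability of each class on a link of C units."""
    q = [1.0] + [0.0] * C          # unnormalized occupancy distribution
    for n in range(1, C + 1):
        q[n] = sum(a * b * q[n - b] for a, b in classes if b <= n) / n
    total = sum(q)
    q = [x / total for x in q]
    # class k is blocked when fewer than b_k units are free
    return [sum(q[C - b + 1:]) for _, b in classes]

# Single class of load 1 Erlang needing the whole 1-unit link:
# reduces to Erlang-B with one server -> blocking 0.5
print(kaufman_roberts(1, [(1.0, 1)]))  # [0.5]
```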
Stream delivery model (3)

The throughput of requests for j-quality streams of object m is λ·p(j,m)·(1 − B_c(j,m)).
The total revenue rate of the streaming service is

R(c) = Σ_{m=1}^{M} Σ_{j=1}^{L} λ·p(j,m)·(1 − B_c(j,m))·R(j,m)

The goal is to cache object layers so as to maximize the total revenue rate R(c).
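The revenue rate follows directly from the throughput expression. A small sketch with illustrative names (popularities, blocking probabilities, and revenues passed as nested lists indexed by movie and quality):

```python
def revenue_rate(lam, p, B, R):
    """R(c) = sum over movies m and qualities j of
    lam * p[m][j] * (1 - B[m][j]) * R[m][j]."""
    return sum(
        lam * p[m][j] * (1.0 - B[m][j]) * R[m][j]
        for m in range(len(p))
        for j in range(len(p[m]))
    )
```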
Optimal caching

Maximizing the revenue rate R(c) is analytically intractable, and exhaustive search over cache configurations is prohibitive for realistic problem sizes.
Solution: use heuristics.
Utility heuristics

Assign each of the M·L video layers a cache utility u_{l,m}, l = 1,…,L, m = 1,…,M.
Movie layers are cached in decreasing order of utility.
If the movie layer with the next highest utility does not fit into the remaining cache space, skip it and try to cache the movie layer with the next highest utility.
When a layer of a movie is skipped, all other layers of that movie are skipped too.
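The greedy packing rule can be sketched as follows. This is an illustrative implementation, not the paper’s code; the constraint that layer l is only cached on top of layers 1,…,l−1 is an assumption consistent with the cumulative cache-indicator model:

```python
def greedy_cache(layers, G):
    """layers: list of (utility, movie, layer_index, size_bytes) with layer
    indices starting at 1. Returns {movie: number of layers cached}."""
    cached, skipped, free = {}, set(), G
    for utility, m, l, size in sorted(layers, reverse=True):
        if m in skipped:
            continue  # once a layer of a movie is skipped, skip the rest
        if size > free or cached.get(m, 0) != l - 1:
            skipped.add(m)
            continue
        cached[m] = l
        free -= size
    return cached

print(greedy_cache([(10, 'A', 1, 5), (8, 'B', 1, 4), (6, 'A', 2, 3)], 9))
# {'A': 1, 'B': 1}  -- A's layer 2 no longer fits and is skipped
```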
Utility definitions
Evaluation of heuristics

Test the performance of the heuristics on small problems in order to compare them against exhaustive search.
Parameters:
M = 10, L = 2.
C is varied from 3–15 Mbit/s.
Cache capacity G varies from 3–7 Gbytes (can store 23.1–41.7% of the total movie data).
Movies have an average length of 1 hour.
The rate of each layer is chosen from a uniform distribution between 0.1 and 3 Mbit/s.
The request rate is 142 requests/sec.
The request type and the movie requested are drawn from a Zipf distribution with parameter 1.0.
The revenue of each movie layer is uniformly distributed between 1 and 10.
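The Zipf popularities used in these experiments can be generated as below, assuming the standard form p(m) ∝ 1/m^s over popularity ranks (the slide names only the distribution and its parameter):

```python
def zipf_popularities(M, s=1.0):
    """Probability of requesting the m-th most popular item, m = 1..M."""
    weights = [1.0 / m ** s for m in range(1, M + 1)]
    total = sum(weights)
    return [w / total for w in weights]

print(zipf_popularities(3))  # most popular item draws ~0.545 of requests
```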
Average error obtained with each heuristic compared to exhaustive search. Small link: 3 Mbit/s. Large link: 15 Mbit/s. Small cache: 3 Gbytes. Large cache: 7 Gbytes.
Conclusion: the heuristics achieve performance very close to the optimum in most cases.
Evaluation of heuristics (2)

Parameters:
M = 1000, L = 2.
C is varied from 10–150 Mbit/s (1–15% of the total bandwidth required to stream all requested movies).
Cache capacity G varies from 12–560 Gbytes (can store 0.9–41.7% of the total movie data).
Movies have an average length of 1 hour.
The rate of each layer is chosen from a uniform distribution between 0.1 and 3 Mbit/s.
The request rate is 142 requests/sec.
The request type and the movie requested are drawn from a Zipf distribution with parameter 1.0.
The revenue of each movie layer is uniformly distributed between 1 and 10.
Some conclusions from the evaluation of heuristics

The revenue density heuristic has the best performance of the three heuristics, especially when one resource (link bandwidth or cache size) is scarce.
If both resources are scarce, it is better to increase the cache size before increasing the link bandwidth.
When requests are not very skewed, a significant increase in link capacity and cache size is needed to keep the revenue at the same level. When requests are very skewed, the same revenue can be obtained with fewer resources.
The request rate has much less effect on the revenue than the Zipf parameter.
Stream quality negotiation

If a client’s request is blocked, the service provider tries to offer a lower-quality stream of the requested object.
Question: how much additional revenue does this “negotiation” bring?
Answer: not much. The case L = 2 is studied, and the revenue gained from successful negotiations is evaluated.
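A hedged sketch of the negotiation step (illustrative names; the paper analyzes L = 2, but the loop below is written for general L): on a blocked request, fall back to the highest quality that the cached layers plus the free link bandwidth can support.

```python
def negotiate(j, m, c, rates, C, in_use):
    """Return the highest quality j' <= j that can currently be served,
    or 0 if even the base layer cannot be delivered."""
    for q in range(j, 0, -1):
        needed = sum(rates[m][c[m]:q])  # missing-layer rate; 0 if cached
        if in_use + needed <= C:
            return q
    return 0
```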
Queuing of Requests

If a request is blocked, the server puts it in a queue and serves it later, when resources become available.
Question: how much additional revenue does this bring?
Answer: not much.
Simulation: requests time out after 5 minutes; the queue has finite size. Queue priority is based on arrival time, required resources, and potential revenue.
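The queueing policy can be sketched with a bounded priority queue. The lexicographic ordering below (earliest arrival, then smallest bandwidth, then highest revenue) is an assumption, since the slide lists the criteria without weights:

```python
import heapq

class RequestQueue:
    """Bounded queue for blocked requests: served oldest-first,
    ties broken by smallest bandwidth, then highest revenue."""
    def __init__(self, maxsize):
        self.maxsize, self.heap = maxsize, []

    def push(self, arrival, bandwidth, revenue, request):
        if len(self.heap) >= self.maxsize:
            return False  # queue full: the request is lost
        heapq.heappush(self.heap, (arrival, bandwidth, -revenue, request))
        return True

    def pop(self):
        """Next request to retry when resources free up (None if empty)."""
        return heapq.heappop(self.heap)[3] if self.heap else None
```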
Is Partial Caching Useful?

Consider a system in which clients are only interested in complete streams (they always request all layers) and no revenue accrues for partial streams.
Question: is caching partial streams beneficial?
Answer: no.