comp slides4 b
TRANSCRIPT
-
8/8/2019 Comp Slides4 B
1/25
GRAY CODE
GC = BC XOR (BC >> 1)
The logical operation XOR is Exclusive OR:
0 XOR 0 = 0    0 XOR 1 = 1
1 XOR 1 = 0    1 XOR 0 = 1
Examples
1) Value = 127:  Binary code BC:  0111 1111
                 BC >> 1:         0011 1111
                                  ---------
                 Gray code GC:    0100 0000
2) Value = 128:  Binary code BC:  1000 0000
                 BC >> 1:         0100 0000
                                  ---------
                 Gray code GC:    1100 0000
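The XOR construction, and its inverse, can be sketched in a few lines of Python (`binary_to_gray` and `gray_to_binary` are illustrative names, not a standard API):

```python
def binary_to_gray(bc: int) -> int:
    """GC = BC XOR (BC >> 1)."""
    return bc ^ (bc >> 1)

def gray_to_binary(gc: int) -> int:
    """Invert by XOR-folding successively shifted copies back in."""
    bc = gc
    while gc:
        gc >>= 1
        bc ^= gc
    return bc
```

For the slide's examples, `binary_to_gray(127)` gives `0b01000000` and `binary_to_gray(128)` gives `0b11000000`: the two neighboring values differ in a single bit.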
BINARY AND GRAY CODES (4 BITS):

Value  Binary code  Gray code
 0     0000         0000
 1     0001         0001
 2     0010         0011
 3     0011         0010
 4     0100         0110
 5     0101         0111
 6     0110         0101
 7     0111         0100
 8     1000         1100
 9     1001         1101
10     1010         1111
11     1011         1110
12     1100         1010
13     1101         1011
14     1110         1001
15     1111         1000
Binary and Gray code bit planes:
- Encode each bit plane with JBIG.
- Typically, lossless JPEG is more efficient than JBIG for images with more than 6 bits.
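Bit-plane decomposition (with an optional Gray-code step first) can be sketched in plain Python; `bit_planes` is an illustrative helper, not tied to any JBIG implementation:

```python
def bit_planes(pixels, bits=8, gray=False):
    """Split a list of pixel values into `bits` binary planes
    (plane 0 = least significant bit). Optionally Gray-code first."""
    if gray:
        pixels = [p ^ (p >> 1) for p in pixels]
    return [[(p >> k) & 1 for p in pixels] for k in range(bits)]
```

With Gray code, adjacent values such as 127 and 128 differ in only one plane; in plain binary they differ in all eight, which makes the lower planes noisy and harder for JBIG to compress.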
[Figure: bit planes 7 (MSB) down to 0 (LSB) of a test image, binary code vs. Gray code.]
JPEG: Joint Photographic Experts Group
JPEG Lossless mode
Predictive coding in lossless JPEG
ALTERNATIVE PREDICTORS:
Mode:  Predictor:             Mode:  Predictor:
0      Null (no prediction)   4      N + W - NW
1      W                      5      W + (N - NW)/2
2      N                      6      N + (W - NW)/2
3      NW                     7      (N + W)/2

Neighborhood of the current pixel x:
NW  N
W   x
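The eight predictor modes can be sketched as follows (integer division for the /2 modes is assumed; `predict` is an illustrative name):

```python
def predict(mode, n, w, nw):
    """Lossless-JPEG prediction modes 0-7 for neighbors
    N (above), W (left), NW (above-left) of the current pixel."""
    table = {
        0: 0,                  # null prediction
        1: w,
        2: n,
        3: nw,
        4: n + w - nw,         # plane fit
        5: w + (n - nw) // 2,
        6: n + (w - nw) // 2,
        7: (n + w) // 2,       # average of the two nearest neighbors
    }
    return table[mode]
```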
RESIDUAL CODING:
e = x - x̂    (x = pixel value, x̂ = predicted value, e = residual)
DECODER:
x = x̂ + e
ENTROPY CODING:
HUFFMAN CODING
ARITHMETIC CODING (QM-CODER)
Huffman coding
Category  Codeword  Difference             Difference bits
0         00        0                      -
1         010       -1, 1                  0, 1
2         011       -3, -2, 2, 3           00, 01, 10, 11
3         100       -7..-4, 4..7           000..011, 100..111
4         101       -15..-8, 8..15         0000..0111, 1000..1111
5         110       -31..-16, 16..31       :
6         1110      -63..-32, 32..63       :
7         11110     -127..-64, 64..127     :
8         111110    -255..-128, 128..255   :
Binary arithmetic coding
[Figure: binary decision tree over the difference categories 0, 1, 2, 3, ..., 7, 8;
the leaves are the category value sets
{0}, {1}, {2, 3}, {4, ..., 7}, ..., {64, ..., 127}, {128, ..., 255}.]
Lossless JPEG: example
Pixel sequence: 10, 12, 10, 7, 8, 8, 12.
Prediction mode: 1 (previous pixel value). Static Huffman coding.

Pixel:      10    12    10     7     8     8    12
Residual:  +10    +2    -2    -3    +1     0    +4
Category:    4     2     2     2     1     0     3
Bits:  101 1010 | 011 10 | 011 01 | 011 00 | 010 1 | 00 | 100 100
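The bit strings in the example can be reproduced with a small sketch; the offset rule for negative residuals (they take the low half of the k-bit range) is inferred from the worked example, and the codeword table covers only the categories used here:

```python
# Category codewords from the Huffman table (categories 0-4 shown;
# extend for larger residuals).
CODEWORD = {0: '00', 1: '010', 2: '011', 3: '100', 4: '101'}

def encode_residual(e):
    """Codeword for the category of residual e, followed by the
    offset bits selecting e within the category."""
    if e == 0:
        return CODEWORD[0]
    cat = abs(e).bit_length()        # category = number of magnitude bits
    # Negative residuals map to the low half of the k-bit range.
    offset = e if e > 0 else e + (1 << cat) - 1
    return CODEWORD[cat] + format(offset, f'0{cat}b')
```

Applying it to the residuals +10, +2, -2, -3, +1, 0, +4 reproduces the bit sequence shown above.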
FELICS
Fast and Efficient Lossless Image Compression System [P. Howard and J. Vitter]

PREDICTION MODEL:
L = min{a, b}, H = max{a, b}
L = the smaller of the two neighbor pixel values
H = the larger of the two neighbor pixel values

Neighbors of the current pixel x:
   b
a  x
ASSUMPTION OF DISTRIBUTION:
[Figure: assumed intensity distribution — roughly flat probability in range [L, H],
with decaying tails below L ("below range") and above H ("above range").]

CODING THE PIXEL VALUE:
IF P in [L, H] THEN
    Output(0);
    Encode Δ = P - L by adjusted binary code;
ELSE IF P < L THEN
    Output(1); Output(0);
    Encode Δ = L - P - 1 by Rice code;
ELSE (P > H)
    Output(1); Output(1);
    Encode Δ = P - H - 1 by Rice code;
ADJUSTED BINARY CODING:
Let Δ = H - L. There are Δ+1 possible values in the range.
If Δ+1 is a power of two, a binary code with log2(Δ+1) bits is used.
Otherwise, ⌊log2(Δ+1)⌋ bits are used for the middle values and ⌈log2(Δ+1)⌉ bits for the outer values.
For example, if Δ = 4 there are five values (0, 1, 2, 3, 4) and their corresponding adjusted binary codewords are (111, 10, 00, 01, 110).
RICE CODING:
For determining the Rice coding parameter k, Δ is used as a context.
For each context Δ, a cumulative total bit count is maintained for each reasonable Rice parameter value k: the code length that would have resulted if the parameter k had been used to encode all values encountered so far in that context.
The parameter with the smallest cumulative code length is used to encode the next value encountered in the context.
The allowed parameter values are k = 0, 1, 2, and 3.
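The Rice code and the cumulative-cost parameter selection can be sketched as follows (`RiceContext` is an illustrative name; one such object per context Δ is assumed):

```python
def rice_encode(value, k):
    """Rice code: unary quotient (value >> k), then k remainder bits."""
    out = '1' * (value >> k) + '0'
    if k > 0:
        out += format(value & ((1 << k) - 1), f'0{k}b')
    return out

class RiceContext:
    """Pick k with the smallest cumulative code length so far."""
    def __init__(self, ks=(0, 1, 2, 3)):
        self.total = {k: 0 for k in ks}

    def encode(self, value):
        k = min(self.total, key=self.total.get)   # best k so far
        for kk in self.total:                     # update all totals
            self.total[kk] += len(rice_encode(value, kk))
        return rice_encode(value, k)
```

For example, `rice_encode(9, 2)` emits the quotient 2 in unary (`110`) followed by the remainder 1 in two bits (`01`), i.e. `11001`.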
CALIC:
Context-based Adaptive Lossless Image Codec
CALIC [X. Wu, 1995] was ranked top among the schemes evaluated by the JPEG committee prior to the development of the JPEG lossless standard, but it is a rather elaborate scheme.
LOCO-I:
LOw-COmplexity
LOssless COmpression for Images
Features:
- Context modeling (365 contexts)
- Run-length coder: run lengths are coded with the Golomb-Rice method
- Nonlinear predictor
- Golomb-Rice entropy coder: the parameter k depends on the context and is adapted to the average absolute value of the residuals
Model structure of JPEG-LS
[Figure: JPEG-LS model structure. Image samples feed a gradient computation,
which drives a flat-region test selecting the mode (regular vs. run).
Regular mode: a fixed predictor with adaptive correction produces predicted
values; prediction errors (sample minus prediction) go to a context modeler
that maintains prediction-error statistics.
Run mode: a run counter maintains run-length statistics.]
COMPONENTS OF THE MODEL:
1. Prediction.
2. Determination of context.
3. Probability model for the prediction errors.
PREDICTOR:
       min(a, b),   if c ≥ max(a, b)
x̂  =  max(a, b),   if c ≤ min(a, b)
       a + b - c,   otherwise

(Neighbors: a = left, b = above, c = above-left.)

Examples:
1)  c=20  b=12      c = 20 ≥ max(10, 12)  ⇒  predictor = min(10, 12) = 10
    a=10  x

2)  c=10  b=20      c = 10 ≤ min(18, 20)  ⇒  predictor = max(18, 20) = 20
    a=18  x

3)  c=12  b=20      otherwise             ⇒  predictor = 10 + 20 - 12 = 18
    a=10  x
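This median-edge-detector predictor is a three-line function (`med_predict` is an illustrative name):

```python
def med_predict(a, b, c):
    """JPEG-LS median-edge-detector predictor.
    a = left, b = above, c = above-left neighbor."""
    if c >= max(a, b):      # vertical edge suspected: predict from a or b
        return min(a, b)
    if c <= min(a, b):
        return max(a, b)
    return a + b - c        # smooth region: planar prediction
```

It reproduces the three worked examples: (a=10, b=12, c=20) → 10, (a=18, b=20, c=10) → 20, (a=10, b=20, c=12) → 18.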
RESIDUAL CODING:
e = x - x̂
DECODER:
x = x̂ + e
JPEG-LS context model
The distribution of prediction residuals in continuous-tone images can be approximated by a Laplacian distribution, i.e. a two-sided exponential decay centered at zero.
In a one-pass scheme, the encoder cannot tune an optimal Huffman table to each possible distribution of prediction residuals. Adaptive construction of optimal tables is ruled out by the complexity constraints.
Solution: for each context, the encoder adaptively chooses the best among a limited set of Huffman codes, matched to exponentially decaying distributions, based on past performance.
GRADIENT DETERMINATION:
g1 = d - b
g2 = b - c
g3 = c - a

CONTEXT QUANTIZATION:
Each gradient gi is quantized to qi using the magnitude regions
{0}, {1, 2}, {3, 4, 5, 6}, {7, 8, ..., 20}, {21, ...}
and their negative counterparts, i.e. qi ∈ {-T, ..., T} with T = 4.

NUMBER OF DIFFERENT CONTEXTS:
|C| = ((2T + 1)³ + 1) / 2
T = 4  ⇒  |C| = 365: balances storage requirements with high-order conditioning.
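The context bookkeeping can be sketched as follows; the region boundaries come from the slide, while the sign-folding rule (merging (q1,q2,q3) with its negation, which yields the (9³+1)/2 = 365 count) is an assumption consistent with the |C| formula:

```python
def quantize_gradient(g, thresholds=(1, 3, 7, 21)):
    """Map a gradient to q in {-4,...,4} using the magnitude
    regions {0}, {1,2}, {3..6}, {7..20}, {>=21}."""
    sign = -1 if g < 0 else 1
    q = sum(abs(g) >= t for t in thresholds)   # 0..4
    return sign * q

def context_key(g1, g2, g3):
    """Merge each (q1,q2,q3) with its negation: 365 distinct keys."""
    q = [quantize_gradient(g) for g in (g1, g2, g3)]
    # Canonical form: first nonzero component must be positive.
    if q[0] < 0 or (q[0] == 0 and (q[1] < 0 or (q[1] == 0 and q[2] < 0))):
        q = [-v for v in q]
    return tuple(q)
```

Enumerating all 9³ quantized triples and folding them gives exactly 365 distinct keys.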
JPEG-LS coding
VALUE MAPPING FROM [-127, 127] TO [0, 255]:
M(e) = 2e,         if e ≥ 0
M(e) = 2|e| - 1,   if e < 0
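The mapping interleaves positive and negative residuals, so small magnitudes of either sign get small mapped values:

```python
def map_residual(e):
    """M(e): interleave residuals 0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ..."""
    return 2 * e if e >= 0 else -2 * e - 1
```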
JPEG-LS: Part 2
Differences from Part 1:
1. No run mode
2. 4 → 5 reference points
3. Modified prediction for images with sparse histograms
4. Binarization and its arithmetic coding

Part 1:  Context modeling → Prediction or Run mode → Golomb coding
Part 2:  Context modeling → Prediction → Binarization → Arithmetic coding
Comparison of lossless methods
Source: http://www.hpl.hp.com/research/itc/csl/vcd/infotheory/loco.html