Basic Message Coding 《 Digital Watermarking: Principles & Practice 》 Chapter 3
Multimedia Security
Multi-Message Watermarking
• Mapping messages into watermark vectors
  – Analogous to mapping between messages and transmitted signals
• Source coding
  – Mapping messages into sequences of symbols
• Modulation
  – Mapping sequences of symbols into physical signals
Direct Message Coding
• Assign a unique, predefined message mark to represent each message
• Let the set of messages be denoted M, and the number of messages in the set |M|
  – We design a set of |M| message marks, W, each of which is associated with one message
  – Encoding
    • To watermark a Work with the message m, the embedder simply embeds the message mark W[m]
  – Decoding
    • Detection values are computed for each of the |M| message marks. The most likely message is the one corresponding to the message mark with the highest detection value.
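As an illustrative sketch (not from the book): direct coding with randomly chosen unit-norm message marks, linear correlation as the detection value, and a deliberately exaggerated embedding strength `alpha` so the demonstration decodes cleanly. The dimensionality N = 256, |M| = 8, and the seed are all arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

N = 256           # dimensionality of marking space (assumed)
num_messages = 8  # |M| (assumed)

# One predefined, unit-norm message mark per message (randomly chosen here).
W = rng.standard_normal((num_messages, N))
W /= np.linalg.norm(W, axis=1, keepdims=True)

def embed(cover, m, alpha=8.0):
    """Embed message m by adding its message mark to the cover Work.
    alpha is exaggerated here so the sketch decodes reliably."""
    return cover + alpha * W[m]

def decode(received):
    """Compute a detection value (linear correlation) for every message
    mark and return the message with the highest detection value."""
    detection_values = W @ received
    return int(np.argmax(detection_values))

cover = rng.standard_normal(N)
watermarked = embed(cover, m=5)
print(decode(watermarked))  # recovers the embedded message index
```

Note that `decode` performs |M| correlations; this is exactly the cost that the later slides on multi-symbol coding set out to reduce.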
Code Separation
• How should the message marks be designed?
  – The message marks should be chosen for good behavior with respect to false-alarm rate, fidelity, robustness, and so on
  – One message mark must not be confused with another
    • Marks should be far apart from one another in marking space
• If the corruption is severe enough, the watermark may be erroneously decoded as a different message
[Figure: a message mark passes through a channel, emerging as a corrupted message mark that may decode to the wrong message]
Message Coding as an Optimization Problem
• Designing an optimal set of |M| N-dimensional message marks is equivalent to placing |M| points on the surface of an N-dimensional sphere such that the distance between the two closest points is maximized
Randomly Generated Codes (1/2)
• If the number of messages is large compared to the dimensionality of marking space, randomly generated codes usually result in good code separation
[Figure: distribution of average angles between three message vectors in randomly generated, three-dimensional codes]
Randomly Generated Codes (2/2)
• If the dimensionality of marking space is large, randomly generated message vectors are likely to be nearly orthogonal to one another
[Figure: distribution of average angles between three message vectors in randomly generated, 256-dimensional codes]
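The concentration effect in the two figures can be reproduced numerically. The following sketch (my own; seed, trial count, and dimensions are arbitrary) draws triples of random unit vectors and reports the spread of their pairwise angles in 3 versus 256 dimensions:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def angle_stats(dim, num_vectors=3, trials=1000):
    """Mean and standard deviation (in degrees) of the pairwise angles
    between `num_vectors` randomly generated unit vectors in `dim` dims."""
    angles = []
    for _ in range(trials):
        v = rng.standard_normal((num_vectors, dim))
        v /= np.linalg.norm(v, axis=1, keepdims=True)
        for i in range(num_vectors):
            for j in range(i + 1, num_vectors):
                cos = np.clip(v[i] @ v[j], -1.0, 1.0)
                angles.append(np.degrees(np.arccos(cos)))
    return np.mean(angles), np.std(angles)

mean3, std3 = angle_stats(3)        # angles spread widely around 90 degrees
mean256, std256 = angle_stats(256)  # angles cluster tightly around 90 degrees
print(f"3-D:   mean={mean3:.1f}, std={std3:.1f}")
print(f"256-D: mean={mean256:.1f}, std={std256:.1f}")
```

In both cases the mean angle is near 90 degrees; what changes with dimensionality is the spread, which is why high-dimensional random codes give reliable separation.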
Problems of Direct Message Coding
• Direct message coding does not scale well
  – The detector must compute the detection value for each of the |M| reference marks
  – e.g.
    • If 16 bits of information (65,536 possibilities) are embedded, the detector must compute a detection value for each of 65,536 reference marks
• The problem can be dramatically reduced by first representing each message with a sequence of symbols, drawn from an alphabet
Multi-Symbol Message Coding
• If each message is represented with a sequence of L symbols, drawn from an alphabet of size |A|, we can represent up to |A|^L different messages
  – E.g. |A| = 4, message length L = 8 gives 4^8 = 65,536 messages
    • Only 32 comparisons (4 symbols x 8 locations) are required, rather than 65,536
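A quick check of the arithmetic behind this example (message capacity |A|^L versus the L x |A| detection values the decoder must compute):

```python
# |A| = alphabet size, L = message length (values from the slide's example)
A, L = 4, 8

num_messages = A ** L      # distinct messages representable: 4^8 = 65,536
num_correlations = A * L   # detection values per decode: 4 * 8 = 32

print(num_messages)        # same capacity as 16 bits
print(num_correlations)    # versus 65,536 for direct coding
```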
Time and Space Division Multiplexing
• The Work is divided into disjoint regions, in either space or time, and a reference mark for one symbol is embedded into each region
  – The message mark is constructed by concatenating several reference marks
  – e.g.
    • To embed 4 symbols into an image of w x h pixels, we would use reference marks of dimensions w/2 x h/2, one per quadrant
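A minimal sketch of the quadrant layout described above, assuming a hypothetical 64 x 64 image and randomly generated reference marks (image size, alphabet size, and seed are all my assumptions):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

h, w = 64, 64  # assumed image dimensions
A = 4          # alphabet size
# One (h/2 x w/2) reference mark per symbol of the alphabet.
marks = rng.standard_normal((A, h // 2, w // 2))

def message_mark(symbols):
    """Concatenate one quadrant-sized reference mark per symbol into a
    full-size message mark (space-division multiplexing).  Quadrant order:
    top-left, top-right, bottom-left, bottom-right."""
    top = np.hstack([marks[symbols[0]], marks[symbols[1]]])
    bottom = np.hstack([marks[symbols[2]], marks[symbols[3]]])
    return np.vstack([top, bottom])

wm = message_mark([3, 0, 2, 1])
print(wm.shape)  # (64, 64)
```

Because the reference marks occupy disjoint pixel regions, each symbol can be detected independently by correlating only against its own quadrant.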
Frequency Division Multiplexing
• The Work is divided into disjoint bands in the frequency domain, and a reference mark for one symbol is embedded in each band
  – The message mark is constructed by adding together several reference marks occupying different frequency bands
Code Division Multiplexing
• Several uncorrelated reference marks embedded into the same Work have no effect on one another in a linear correlation system
  – Analogous to code division multiplexing in spread spectrum communication
Code Division Multiplexing (cont.)
• If we represent a message with a sequence of L symbols drawn from an alphabet of size |A|, we define a set of L x |A| reference marks, WAL[i, a]
• All reference marks added into Wm should be as orthogonal as possible
  – WAL[i,a] · WAL[j,b] = 0 for i ≠ j
  – WAL[i,a] and WAL[i,b], with a ≠ b, must be maximally distinguishable
  – We may add WAL[1,3] and WAL[2,1] together, but we would never embed both WAL[1,3] and WAL[1,1] in a given location
• Example:
  – |A| = 4, L = 5
  – To represent the message sequence 3, 1, 4, 2, 3:
    Wm = WAL[1,3] + WAL[2,1] + WAL[3,4] + WAL[4,2] + WAL[5,3]
[Figure: the reference marks WAL laid out as a grid, with index i = 1 to 5 on one axis and symbol number a = 1 to 4 on the other; the five marks selected in the example sum to Wm]
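A sketch of this construction, using randomly generated high-dimensional vectors as stand-ins for the WAL[i, a] (random vectors are only approximately orthogonal, so this approximates rather than reproduces the book's code; N and the seed are arbitrary). The code uses 0-based indices, so the slide's sequence 3, 1, 4, 2, 3 appears as [2, 0, 3, 1, 2]:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

N = 1024   # dimensionality of marking space (assumed)
A, L = 4, 5

# L x |A| reference marks; random high-dimensional unit vectors are
# nearly orthogonal, approximating the orthogonality required of WAL.
WAL = rng.standard_normal((L, A, N))
WAL /= np.linalg.norm(WAL, axis=2, keepdims=True)

def encode(symbols):
    """Message mark = sum of one reference mark per symbol position."""
    return sum(WAL[i, a] for i, a in enumerate(symbols))

def decode(received):
    """For each position i, pick the symbol whose reference mark
    correlates most strongly with the received vector."""
    return [int(np.argmax(WAL[i] @ received)) for i in range(L)]

wm = encode([2, 0, 3, 1, 2])   # the slide's 3, 1, 4, 2, 3 in 0-based form
print(decode(wm))
```

Because the cross-correlations are near zero, each position's correlation is dominated by its own embedded mark, which is exactly the property the slide's orthogonality conditions are designed to guarantee.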
Equivalence of Code Division Multiplexing to Other Approaches
• Time division multiplexing can be viewed as a special case of code division multiplexing, in which each reference mark is zero everywhere outside its own region and is therefore trivially orthogonal to the others
[Figure: a time-division-multiplexed message mark redrawn as a sum of code-division reference marks]
Example: Simple Eight-Bit Watermarks (1/2)
• Map a message into a bit stream
• A single reference pattern wri, i = 1 to 8, is used for each bit location i
• Each wri is generated pseudo-randomly according to a given seed and normalized to have zero mean
• The embedded watermark pattern is built as follows:
  – wmi = wri if m[i] = 1; wmi = -wri if m[i] = 0
  – wtmp = Σi wmi
  – wm = wtmp / s_wtmp, where s_wtmp denotes the sample standard deviation of wtmp
Example: Simple Eight-Bit Watermarks (2/2)
• Embedding
  – cw = c0 + a·wm, where a is the embedding strength
• Detection
  – The detector correlates the received image c against each of the eight reference patterns, and uses the sign of each correlation to determine the most likely value of the corresponding bit
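A runnable sketch of this eight-bit system, with the pattern size, seed, and embedding strength a = 2 all chosen arbitrarily, and reading s_wtmp as the sample standard deviation of wtmp:

```python
import numpy as np

rng = np.random.default_rng(seed=3)

N = 4096   # number of pixels, flattened (assumed)

# Eight pseudo-random, zero-mean reference patterns, one per bit location.
wr = rng.standard_normal((8, N))
wr -= wr.mean(axis=1, keepdims=True)

def make_message_pattern(bits):
    """wm_i = +wr_i if the bit is 1, -wr_i if it is 0; the sum is
    normalized by its sample standard deviation."""
    w_tmp = sum(wr[i] if b else -wr[i] for i, b in enumerate(bits))
    return w_tmp / w_tmp.std()

def embed(cover, bits, alpha=2.0):
    """cw = c0 + a * wm"""
    return cover + alpha * make_message_pattern(bits)

def detect(received):
    """The sign of the correlation with each reference pattern gives
    the most likely value of the corresponding bit."""
    return [1 if wr[i] @ received > 0 else 0 for i in range(8)]

cover = rng.standard_normal(N)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
print(detect(embed(cover, bits)))
```

This is code division multiplexing with |A| = 2 and antipodal reference marks (+wr versus -wr), which is why one pattern per bit location suffices.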
Problem with Simple Multi-Symbol Messages
• Example (L = 3, dimensionality of marking space = N):
  – W312 = WAL[1,3] + WAL[2,1] + WAL[3,2]
  – W314 = WAL[1,3] + WAL[2,1] + WAL[3,4]
  – W312 · W314 = … = N
• The smallest possible inner product between two message marks that differ in h symbols is N(L - 2h)
  – The L - h positions holding the same symbol each contribute +N
  – The h positions that differ each contribute -N at best
• As L increases, this inner product grows, the angle between the two vectors decreases, and the message marks of the closest pair become progressively more similar
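Dividing the worst-case inner product N(L - 2h) by the squared norm of a message mark, L·N (which follows from the cross-position orthogonality assumed earlier), gives a normalized correlation of (L - 2h)/L. A tiny sketch of how this grows with L:

```python
# Normalized correlation between two message marks differing in h symbols:
# inner product N(L - 2h), squared norm L*N, so correlation = (L - 2h)/L.
def pair_correlation(L, h=1):
    """Worst-case normalized correlation for marks differing in h symbols."""
    return (L - 2 * h) / L

# The closest pair differs in a single symbol (h = 1); its correlation
# approaches 1 as the message length L grows.
for L in (3, 5, 10, 100):
    print(L, pair_correlation(L))
```

For L = 100 the closest pair is 98% correlated, which makes long simple multi-symbol messages fragile under noise.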