Theory and Applications of GF(2p) Cellular Automata
P. Pal Chaudhuri
Department of CST
Bengal Engineering College (DU)
Shibpur, Howrah India
An Application of LOGIC ON MEMORY
Logic on Memory
• Basic Concept
• Classical Examples
– Content Addressable Memory
– Content Addressable Processor

[Figure: content addressable memory cell with a comparator between the bit line and the word line]
Logic on Memory
• The sub-micron era
• Search
• Storage of large tables and efficient search
• Memory + CA
• Efficient storage and search of data with a CA based classifier
Logic-on-memory
• Problem Definition
• CA Based Solution
[Figure: memory element coupled with XOR / XNOR CA logic: logic on memory implementing a specific function]
GF(2p) CA as a Classifier
Input      Output / Attribute
I1 (C11)   A1
I2 (C21)   A2
I3 (C31)   A1
…          …
Ik (C22)   A2

( I1 , I3 ) ; (C11 , C31) → A1
( I2 , Ik ) ; (C21 , C22) → A2
• Classification: a universal problem
• Given the Input, fast search for the attribute of an input element
• Uses a Special Class of CA
• Non Group Multiple Attractor CA (MACA)
Classifier
• Design of a CA Based Classifier
• Input is an element Cij; the classifier outputs Ai, that is, Cij belongs to class Ai
• Implicit memory
• Fast search
LOGIC ON MEMORY = Memory (conventional & CA) + XOR logic (CA)
Special class of CA: non-group Multiple Attractor CA (MACA)
[Figure: state-transition diagrams of an example MACA and a D1 MACA, showing the attractors and the inverted-tree basins hanging from them]
Problem Definition
• Given sets {P1}, {P2}, …, {Pn}, where each set {Pi} = {Xi1, Xi2, Xi3, …, Xim}
• Given a randomly selected value Xkj
• To answer the question:
Which class does Xkj belong to?
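Before the CA solution, it helps to see the baseline it replaces: a linear scan over the sets. A minimal sketch (the names `classify_by_search` and `P` are ours, not the slides'):

```python
# Baseline (non-CA) classification by linear search over the class sets,
# for contrast with the constant-depth CA lookup described later.

def classify_by_search(P, x):
    """Return the index i with x in P[i], or None if x is in no set."""
    for i, Pi in enumerate(P):
        if x in Pi:
            return i
    return None

P = [{0, 2, 12, 14}, {3, 1, 13, 15}]   # two example classes
print(classify_by_search(P, 13))       # -> 1
```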
[Figure: 4-cell MACA over states 0–15, with attractors 0, 3, 5, 6 and basins {0, 2, 12, 14}, {4, 6, 8, 10}, {5, 7, 9, 11}, {1, 3, 13, 15}]
Classifier
• An n-bit CA with m attractors is a natural classifier
• {0, 3, 5, 6} are the attractors
• The inverted trees are the attractor basins
Classifier
• Suppose we want to identify which class X = 7 lies in
• The CA is loaded with X
• The CA is run in autonomous mode for k (= 2) cycles, where k is the depth of the CA
• The pseudo-exhaustive (PE) bits (10) of the attractor give the class of the pattern
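The lookup step can be sketched in software: hold the rule matrix T as one bitmask per row, iterate next-state = T·x over GF(2) for `depth` cycles, and read off the attractor. The 4-cell T below is our own depth-1 illustration, not the CA drawn on the slide:

```python
# Autonomous run of a linear CA: load X, apply T for `depth` cycles,
# and the state reached is the attractor of X's basin.

def parity(v):
    return bin(v).count("1") & 1

def step(T, x):
    """One autonomous cycle: bit i of the next state is parity(T[i] & x)."""
    return sum(parity(row & x) << i for i, row in enumerate(T))

def attractor(T, x, depth):
    for _ in range(depth):
        x = step(T, x)
    return x

T = [0b0001, 0, 0, 0]        # illustrative depth-1 rule: keep only bit 0
print(attractor(T, 7, 1))    # 7 falls into the basin of attractor 1
```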
Two Class D1 Classifier
• We use a depth-1 CA (D1 CA)
• We construct a CA satisfying the following:
1. R1: for every x ∈ P1 and y ∈ P2,
T · (x ⊕ y) ≠ 0
2. R2: T^2 = T, equivalently
T · (T ⊕ I) = 0
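A candidate T can be checked against R1 and R2 mechanically. A sketch with T held as per-row int bitmasks (`apply_T`, `satisfies_R1`, `satisfies_R2` are our names):

```python
# R2 (T^2 = T) is verified on the basis vectors e_j; R1 is verified
# pairwise on P1 x P2 as T.(x xor y) != 0 over GF(2).

def parity(v):
    return bin(v).count("1") & 1

def apply_T(T, x):
    """y = T.x over GF(2); T is a list of per-row int bitmasks."""
    return sum(parity(row & x) << i for i, row in enumerate(T))

def satisfies_R2(T):
    """T^2 = T, checked on the basis vectors e_j."""
    return all(apply_T(T, apply_T(T, 1 << j)) == apply_T(T, 1 << j)
               for j in range(len(T)))

def satisfies_R1(T, P1, P2):
    """T.(x xor y) != 0 for every x in P1, y in P2."""
    return all(apply_T(T, x ^ y) != 0 for x in P1 for y in P2)
```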
[Figure: depth-1 MACA with attractor basins {0, 2, 12, 14} and {1, 3, 13, 15}]
Depth 1 CA (D1 MACA)
Algorithm
• Any CA satisfying R1 & R2 is a classifier for P = { {P1}, {P2} }
• P1 = {0, 2, 12, 14} and P2 = {3, 1, 13, 15}
• Each basin of the CA will contain patterns from either P1 or P2
• 2 attractors
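For these particular sets the odd/even split does the work: P2 contains exactly the odd patterns, so a T that keeps only bit 0 of the state satisfies R1 and R2. This T is our illustration; the slide's CA may use a different matrix with the same two-basin structure:

```python
# Enumerate the basins of a depth-1 MACA: every state x maps straight
# to its attractor T.x, so grouping states by T.x gives the basins.

def parity(v):
    return bin(v).count("1") & 1

def apply_T(T, x):
    return sum(parity(row & x) << i for i, row in enumerate(T))

T = [0b0001, 0, 0, 0]                   # row i = mask giving bit i of T.x
P1, P2 = {0, 2, 12, 14}, {3, 1, 13, 15}

basins = {}
for x in range(16):
    basins.setdefault(apply_T(T, x), set()).add(x)

print(sorted(basins))                   # -> [0, 1]: the two attractors
```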
Algorithm
• In general there will be 2^(n−r) attractors, where n = size of the CA and r = rank of the matrix (T ⊕ I); the attractors are the fixed points of T, i.e. the null space of (T ⊕ I)
• The attractors are distinguished by pseudo-exhaustive (PE) bits at certain (n−r) positions
• The class of each attractor can be stored in a 2^(n−r) × 1 bit memory or generated by a simple logic circuit
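Since the attractors are the fixed points of T (the null space of T ⊕ I), the count 2^(n−r) can be verified with a small GF(2) rank computation; the depth-1 matrix below is our illustrative example, not the slide's CA:

```python
# Rank over GF(2) by highest-bit elimination, then the attractor count
# 2**(n - r) with r = rank(T xor I).

def gf2_rank(rows):
    """Rank of a GF(2) matrix whose rows are int bitmasks."""
    pivots = {}                         # highest set bit -> pivot row
    for row in rows:
        while row:
            h = row.bit_length() - 1
            if h not in pivots:
                pivots[h] = row
                break
            row ^= pivots[h]
    return len(pivots)

n = 4
T = [0b0001, 0, 0, 0]                   # illustrative depth-1 rule matrix
T_xor_I = [T[i] ^ (1 << i) for i in range(n)]
r = gf2_rank(T_xor_I)
print(2 ** (n - r))                     # -> 2 attractors
```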
Multiclass Classifier
• But what about a multiclass classifier?
• A general CA based solution does not exist
• However, we can use a hierarchy of two-class classifiers to build a solution
Multiclass Classifier
• Hierarchical two-class classifier
• Built by partitioning the pattern set P
• P = {P1, P2, P3, …, Pn} is split as {{P1, P2, P3, …, Pk}, {Pk+1, …, Pn}} and a two-class classifier is found for this partition
• This is repeated for each subset
• The number of CAs required is log2 n, where n is the number of classes
Multiclass Classifier
Classes are
• P1 = {0, 2, 12, 14}
• P2 = {3, 1, 13, 15}
• P3 = {5, 7, 9, 11}
• P4 = {6, 4, 8, 10}
Multiclass Classifier
• Initially we build a two-class classifier to separate these two groups:
• Temp0 = {P1, P2}
• Temp1 = {P3, P4}
• Then two more classifiers distinguish P1 from P2 and P3 from P4
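For the four classes P1–P4 above, two linear one-bit tests happen to separate everything: bit3 ⊕ bit2 distinguishes Temp0 = {P1, P2} from Temp1 = {P3, P4}, and bit 0 splits each pair. These discriminators are our illustration; the slides' actual D1 MACA matrices may differ:

```python
# Worked hierarchical example on the four classes from the slide.
P = {1: {0, 2, 12, 14}, 2: {3, 1, 13, 15},
     3: {5, 7, 9, 11},  4: {6, 4, 8, 10}}

def classify(x):
    in_temp1 = ((x >> 3) ^ (x >> 2)) & 1   # 0 -> {P1,P2}, 1 -> {P3,P4}
    odd = x & 1                            # second-level split
    if in_temp1:
        return 3 if odd else 4
    return 2 if odd else 1

# every pattern lands in its own class
assert all(classify(x) == c for c, s in P.items() for x in s)
```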
[Figure: the four basins grouped into Temp0 = {P1, P2} and Temp1 = {P3, P4}]
General Multiclass Classifier
[Figure: tree of two-class classifiers: the first level splits {P1 … Pk} (Temp0) from {Pk+1 … Pn} (Temp1), then Temp00, Temp01, … down to the leaves, using log2 n CAs]
Multiclass Classifier in GF (2p)
• Handles class elements that are symbol strings rather than bit strings
• A T matrix satisfying R1 and R2 is efficiently obtained using BDDs in GF(2)
• In GF(2p) we have introduced certain heuristics to obtain a solution T matrix reasonably fast
Application Areas
• Fast encoding in vector quantization of images
• Fault diagnosis
Image Compression
• Target pictures: portraits and similar images
• Image size: 352 x 240 (CCIR size)
• Target compression ratio: 97.5 % to 99 %
• Target PSNR value: 25 to 30 dB
• Target application: low bit rate coding for video telephony
Algorithm
• Used a training set of 12 pictures of a similar nature
• The images were partitioned into 8 x 8 blocks
• These 8 x 8 blocks are clustered around 8192 pivot points using the standard LBG algorithm
[Figure: training images partitioned into blocks B1, B2, …, Bi, …, Bm, …, Bn]
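On the clustering step: LBG is a generalized Lloyd (k-means style) iteration with codebook splitting. The minimal Lloyd loop below runs on toy 2-D points rather than 64-dimensional 8 x 8 blocks, and omits the splitting step:

```python
# One Lloyd refinement loop: assign every point to its nearest pivot,
# then move each pivot to the centroid of the points assigned to it.

def lloyd(points, pivots, iters=10):
    for _ in range(iters):
        groups = {i: [] for i in range(len(pivots))}
        for p in points:
            nearest = min(range(len(pivots)),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, pivots[i])))
            groups[nearest].append(p)
        for i, g in groups.items():
            if g:                       # leave empty cells where they are
                pivots[i] = [sum(c) / len(g) for c in zip(*g)]
    return pivots

blocks = [(0, 0), (0, 1), (10, 10), (10, 11)]     # toy 2-D "blocks"
print(lloyd(blocks, [[0.0, 0.0], [10.0, 10.0]]))  # -> [[0.0, 0.5], [10.0, 10.5]]
```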
Algorithm
• Elements are GF(2p) symbol strings of length 64, i.e. 8 x 8 pixel blocks
• Therefore we have 8192 clusters
• These can be addressed using 13 bits (8192 = 2^13)
• A multiclass classifier is designed for these 8192 classes
• The depth of this classifier is 13
[Figure: clusters C1, C2, …, Cn around pivot points; the pivots C1 … C8192 form the codebook]
Algorithm
• The target image to be coded is divided into 8 x 8 blocks
• Each of these blocks is input to the multiclass classifier
• The multiclass classifier outputs the class id of the block
• This is done in effectively 13 clock cycles plus some memory access time
• Encoding time is thus drastically reduced
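A back-of-envelope check of the target ratio, assuming an 8-bit greyscale 352 x 240 source coded as one 13-bit class id per 8 x 8 block (our arithmetic, matching the slides' 97.5 % figure):

```python
# Compression ratio: raw pixel bits versus 13-bit class ids per block.
raw_bits   = 352 * 240 * 8                    # 8-bit pixels (assumed)
coded_bits = (352 // 8) * (240 // 8) * 13     # one 13-bit id per 8x8 block
ratio = 100 * (1 - coded_bits / raw_bits)
print(round(ratio, 1))                        # -> 97.5
```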
[Figure: encoding pipeline: image block → classifier → class id]
Sample Results
Image       PSNR (dB)
Julie       33.53
Girl1256    34.58
Michelle    32.69
Girl256     27.84
Claire256   29.91
Ash         27.84
Sample Images
• PSNR 27.8 dB
• Compression ratio 97.5 %
Sample Images
PSNR 25.1 dB; compression ratio 97.5 %
PSNR 28.5 dB; compression ratio 97.5 %
Schematic of a CA Based Vector Quantizer
[Figure: CA fed from a configuration memory under a controller; the PE bits are shifted out through a shift register to form the output]
Hardware Design for
CA Based Vector Quantizer
Improvements Over the Basic Scheme
• A hierarchical encoder has been implemented
• The image is first encoded using 16 x 16 blocks ….
• If a match cannot be obtained with any of the classes in the training set, a match with 8 x 8 blocks is tried
• This pushes the compression ratio up to 99 %
Dynamic Classification
• Static database
• The solution assumes the target pattern is present in the cluster set
• If a new pattern outside this range is input, the classifier indicates "No entry in the database"
• So a linked queue of these new blocks is maintained
• At periodic intervals a new multiclass classifier is obtained, using these updated data members after incorporating them in the appropriate classes
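The bookkeeping can be sketched as a wrapper that queues unknown blocks and periodically rebuilds the classifier; the `classify` / `rebuild` callables here are placeholders for the MACA classifier and its re-synthesis, not the slides' implementation:

```python
# Unknown blocks go into a queue; once enough accumulate, a rebuild hook
# folds them into the classes and replaces the classifier.
from collections import deque

class DynamicClassifier:
    def __init__(self, classify, rebuild, retrain_every=1000):
        self.classify, self.rebuild = classify, rebuild
        self.unknown = deque()            # linked queue of new blocks
        self.retrain_every = retrain_every

    def lookup(self, block):
        cid = self.classify(block)
        if cid is None:                   # "No entry in the database"
            self.unknown.append(block)
            if len(self.unknown) >= self.retrain_every:
                self.classify = self.rebuild(list(self.unknown))
                self.unknown.clear()
        return cid

# toy usage: blocks below 10 are known, everything else is "new"
dc = DynamicClassifier(classify=lambda b: 0 if b < 10 else None,
                       rebuild=lambda blocks: (lambda b: 0),
                       retrain_every=2)
print(dc.lookup(5), dc.lookup(20), dc.lookup(30), dc.lookup(20))  # -> 0 None None 0
```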
Thank You