Optics & Laser Technology 44 (2012) 1129–1135
doi:10.1016/j.optlastec.2011.10.002
Displacement measurement for color images by a phase-encoded joint transform correlator
Peng Ge, Qi Li*, Huajun Feng, Zhihai Xu
State Key Lab of Modern Optical Instrumentation, Zhejiang University, Hangzhou 310027, China
Article info
Article history:
Received 2 May 2011
Received in revised form 10 October 2011
Accepted 10 October 2011
Available online 17 November 2011
Keywords:
Displacement measurement
Phase-encoding
Color images
Abstract
Displacement measurement for color images using a phase-encoded joint transform correlator (PJTC) is proposed. The color reference and target images are decomposed into red, green and blue components, and new monochromatic target and reference images are formed by spatially arranging the three channels. The displacement of the color images is then obtained by applying the PJTC to the new monochromatic target and reference. Our method generates only one sharp peak and detects the displacement of color images accurately. An appropriate hybrid setup is proposed for practical implementation.
& 2011 Elsevier Ltd. All rights reserved.
1. Introduction
Weaver and Goodman proposed the joint transform correlator (JTC) in 1966 [1]. It is a 4f system whose optical setup uses two Fourier lenses to correlate the target and reference images, and it has been widely used for pattern recognition and motion tracking [2–5]. Janschek et al. proposed using the JTC to measure the spatial shift between imaging devices and the principal axis in order to recover blurred images, navigate satellites, and perform real-time monitoring [6–10]. They used the JTC to find the correlation peaks related to the displacement vector. But that literature deals only with monochromatic images; even when the measured images are colored, they are simply converted to gray images. In fact, most visual signals are chromatic. Not only the shapes but also the colors of images are indispensable characteristics for pattern recognition and image registration. Similarities between the shape distributions of each color should be recognized; this is color pattern recognition. Positive recognition means full shape similarity between the target and the reference images in each color distribution [11]. The introduction of color information in pattern recognition and image registration is especially useful when the contours and the intensity distribution do not provide enough information to permit the recognition of images. Even when the shape and gray intensity of the target image are the same as those of the reference image, their RGB information is likely different. If we simply translate a color image into a gray image, it may cause false estimation: the color information may be lost or not fully used. Thus extending the JTC to measure displacement for color images is a crucial task.
The classical JTC has been widely investigated for color pattern recognition by several research groups [11–21]. The multichannel single-output color JTC for color pattern recognition is particularly attractive [11,12]. That method achieved color pattern recognition by applying the classical JTC in multiple channels; the correlation peaks, however, were not sharp enough. Alam therefore introduced a fringe-adjusted JTC [13] to improve the performance of color pattern recognition: a fringe-adjusted filter was applied to each channel to achieve excellent correlation discrimination. The multichannel single-output color JTC produced 15 peaks in the correlation plane, however, and it is not easy to locate the cross-correlation peaks that represent the coherence level between the reference and the target image for all color channels. A new system, a multichannel single-output joint fractional Fourier transform correlator for color pattern recognition, was proposed by Jin [14]; it produced only three correlation peaks at the output plane. Yu et al. used a color LCTV as the input SLM and three lasers to illuminate the input SLM [15], and used color LCTVs to perform real-time multichannel JTC capable of discriminating between different colors and different shapes [16]. Hsieh then presented a JTC for color image recognition using three liquid–crystal spatial light modulators, experimentally obtaining the correlation peaks of red, green and blue [17]. Some researchers adopted color encoding to implement color pattern recognition [18–20]. Nicolas further considered the color distribution as a third dimension and proposed encoding the 3D images into 2D images in order to perform their 3D correlation in a convergent optical correlator [19].
Alam then proposed a new 3D color pattern recognition technique that exploits the concept of a fringe-adjusted JTC and the CIELab color space; it yields better discrimination and a sharper, stronger correlation peak intensity than a classical JTC with conventional red–green–blue (RGB) components [20]. A wavelength-compensated three-channel (RGB) JTC for color pattern recognition, using a ferroelectric liquid–crystal spatial light modulator (SLM) operating in binary pure phase modulation, was reported by Martínez [21]. But to the best of our knowledge no one has attempted to apply the PJTC to displacement detection for color images. The methods used in color image registration are mostly based on vector phase correlation, also called the quaternion Fourier transform [22–26]. These are digital methods whose calculation speed may be limited by the digital hardware, so we extend the JTC to color image registration in order to improve the calculation speed. Phase-encoding is introduced to remove the interference of the other correlation peaks.
In this paper, we propose a displacement measurement for color images based on the phase-encoded JTC (PJTC). In this method, the color reference image is separated into the RGB color channels, displayed on a monochromatic plane and phase-encoded by a random phase mask. The color target image is separated into the RGB channels and overlaid with the phase-encoded monochromatic reference image to form the input image, resulting in a single sharp correlation peak. The input plane is gray scale, so real-time operation is possible with the use of SLMs. The displacement between the color reference and the color target image is obtained by measuring the position of the cross-correlation peak. Section 2 describes the principles of displacement measurement by the PJTC, digital results are presented in Section 3, Section 4 proposes a possible hybrid setup, and the conclusion is given in Section 5.
2. Color image displacement detection based on the phase-encoded JTC (PJTC)
First, we decompose the color images into three channels consisting of the red, green and blue parts. A color reference image is decomposed into r_R, r_G and r_B, which represent the red, the green and the blue channel, respectively. The new monochromatic reference is formed by spatially arranging the three components, that is
r(x,y) = r_R(x, y−a) + r_G(x, y) + r_B(x, y+a), (1)
where a in Eq. (1) is a constant. The color target image has a displacement (x_i, y_i) relative to the reference image, so the new monochromatic target is expressed as
t(x,y) = t_R(x+x_i, y−a+y_i) + t_G(x+x_i, y+y_i) + t_B(x+x_i, y+a+y_i), (2)
where t_R, t_G and t_B are the red, green and blue components of the color target image, respectively.
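The channel arrangement of Eqs. (1) and (2) is easy to mirror digitally. The sketch below stacks the three channels into vertical bands of a single monochromatic canvas; the function name and the specific choice a = H (one band height) are our own, since the paper only requires a to be a constant:

```python
import numpy as np

def stack_channels(img_rgb):
    """Arrange the R, G, B channels of an H x W x 3 image vertically,
    i.e. r(x, y) = rR(x, y - a) + rG(x, y) + rB(x, y + a) with a = H."""
    h, w, _ = img_rgb.shape
    canvas = np.zeros((3 * h, w))
    for k in range(3):  # 0: red band, 1: green band, 2: blue band
        canvas[k * h:(k + 1) * h, :] = img_rgb[:, :, k]
    return canvas
```

The same function applied to the target image yields the monochromatic target of Eq. (2), shifted band by band.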
Refregier and Javidi proposed phase encoding and applied it to optical image encryption and decryption [27]. Cherri and Alam proposed a reference phase-encoded joint transform correlation technique for multiple target detection [28]. We modified this technique and applied it to displacement measurement for gray images [29]. Consider a random phase function Φ(u,v) in the Fourier domain, and define a phase mask Θ(u,v)
Θ(u,v) = exp[−jΦ(u,v)], (3)
and
φ(x,y) = ℱ⁻¹[Θ(u,v)]. (4)
In Eq. (3), Φ(u,v) is a random phase function uniformly distributed between 0 and 2π in the Fourier domain, and j denotes the imaginary unit. In Eq. (4), ℱ⁻¹ is the 2D inverse Fourier transform. We use φ(x,y) to encode the monochromatic reference image r(x,y) as follows:
r″(x,y) = r(x,y) ⊛ φ(x,y), (5)
where ⊛ denotes 2D convolution.
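Digitally, the mask of Eqs. (3)–(5) can be generated from a uniform random phase. Note that convolving r(x,y) with φ(x,y) is equivalent to multiplying the reference spectrum by Θ(u,v), which is how the sketch below (names our own) performs the encoding:

```python
import numpy as np

def phase_encode(ref, seed=None):
    """Phase-encode a monochromatic reference image.
    Returns the encoded image r'' = r (*) phi and the mask Theta."""
    rng = np.random.default_rng(seed)
    phi_uv = rng.uniform(0.0, 2.0 * np.pi, ref.shape)  # Phi(u,v)
    theta = np.exp(-1j * phi_uv)                       # Theta(u,v), Eq. (3)
    # Convolution with phi = IFFT(Theta), done in the Fourier domain:
    r_enc = np.fft.ifft2(np.fft.fft2(ref) * theta)     # Eqs. (4)-(5)
    return r_enc, theta
```

Because Θ has unit magnitude everywhere, the encoding leaves the spectrum magnitude of the reference untouched and only scrambles its phase.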
We overlap the phase-encoded monochromatic reference with the monochromatic target image to form the monochromatic input image, and the joint input image is given by
f(x,y) = r″(x,y) + [t_R(x+x_i, y−a+y_i) + t_G(x+x_i, y+y_i) + t_B(x+x_i, y+a+y_i)]
= [r_R(x, y−a) + r_G(x, y) + r_B(x, y+a)] ⊛ φ(x,y)
+ [t_R(x+x_i, y−a+y_i) + t_G(x+x_i, y+y_i) + t_B(x+x_i, y+a+y_i)]. (6)
Its Fourier spectrum is expressed as follows:
F(u,v) = ℱ{f(x,y)}
= ℱ{[r_R(x, y−a) + r_G(x, y) + r_B(x, y+a)] ⊛ φ(x,y)}
+ ℱ{t_R(x+x_i, y−a+y_i) + t_G(x+x_i, y+y_i) + t_B(x+x_i, y+a+y_i)}
= R_R(u,v)Θ(u,v)exp(iva) + R_G(u,v)Θ(u,v) + R_B(u,v)Θ(u,v)exp(−iva)
+ T_R(u,v)exp(−iux_i − ivy_i + iva) + T_G(u,v)exp(−iux_i − ivy_i)
+ T_B(u,v)exp(−iux_i − ivy_i − iva), (7)
where ℱ denotes the Fourier transform. The corresponding joint power spectrum (JPS) is
JPS(u,v) = |F(u,v)|²
= |R_R|² + |R_G|² + |R_B|² + |T_R|² + |T_G|² + |T_B|²
+ R_R R_G* exp(iva) + R_G R_R* exp(−iva)
+ R_R R_B* exp(i2va) + R_B R_R* exp(−i2va)
+ R_R Θ T_R* exp(iux_i + ivy_i) + T_R R_R* Θ* exp(−iux_i − ivy_i)
+ R_R Θ T_G* exp(iux_i + ivy_i + iva) + T_G R_R* Θ* exp(−iux_i − ivy_i − iva)
+ R_R Θ T_B* exp(iux_i + ivy_i + i2va) + T_B R_R* Θ* exp(−iux_i − ivy_i − i2va)
+ R_G R_B* exp(iva) + R_B R_G* exp(−iva)
+ R_G Θ T_R* exp(iux_i + ivy_i − iva) + T_R R_G* Θ* exp(−iux_i − ivy_i + iva)
+ R_G Θ T_G* exp(iux_i + ivy_i) + T_G R_G* Θ* exp(−iux_i − ivy_i)
+ R_G Θ T_B* exp(iux_i + ivy_i + iva) + T_B R_G* Θ* exp(−iux_i − ivy_i − iva)
+ R_B Θ T_R* exp(iux_i + ivy_i − i2va) + T_R R_B* Θ* exp(−iux_i − ivy_i + i2va)
+ R_B Θ T_G* exp(iux_i + ivy_i − iva) + T_G R_B* Θ* exp(−iux_i − ivy_i + iva)
+ R_B Θ T_B* exp(iux_i + ivy_i) + T_B R_B* Θ* exp(−iux_i − ivy_i)
+ T_R T_G* exp(iva) + T_G T_R* exp(−iva)
+ T_R T_B* exp(i2va) + T_B T_R* exp(−i2va)
+ T_G T_B* exp(iva) + T_B T_G* exp(−iva), (8)
where all functions on the right-hand side are evaluated at (u,v); the arguments are omitted for brevity,
and * denotes the complex conjugate. Multiplying Eq. (8) by the phase mask Θ(u,v) of Eq. (3) gives the phase-encoded JPS (PJPS):
PJPS(u,v) = JPS(u,v)Θ(u,v)
= |R_R|²Θ + |R_G|²Θ + |R_B|²Θ + |T_R|²Θ + |T_G|²Θ + |T_B|²Θ
+ R_R R_G* Θ exp(iva) + R_G R_R* Θ exp(−iva)
+ R_R R_B* Θ exp(i2va) + R_B R_R* Θ exp(−i2va)
+ R_R T_R* Θ² exp(iux_i + ivy_i) + T_R R_R* exp(−iux_i − ivy_i)
+ R_R T_G* Θ² exp(iux_i + ivy_i + iva) + T_G R_R* exp(−iux_i − ivy_i − iva)
+ R_R T_B* Θ² exp(iux_i + ivy_i + i2va) + T_B R_R* exp(−iux_i − ivy_i − i2va)
+ R_G R_B* Θ exp(iva) + R_B R_G* Θ exp(−iva)
+ R_G T_R* Θ² exp(iux_i + ivy_i − iva) + T_R R_G* exp(−iux_i − ivy_i + iva)
+ R_G T_G* Θ² exp(iux_i + ivy_i) + T_G R_G* exp(−iux_i − ivy_i)
+ R_G T_B* Θ² exp(iux_i + ivy_i + iva) + T_B R_G* exp(−iux_i − ivy_i − iva)
+ R_B T_R* Θ² exp(iux_i + ivy_i − i2va) + T_R R_B* exp(−iux_i − ivy_i + i2va)
+ R_B T_G* Θ² exp(iux_i + ivy_i − iva) + T_G R_B* exp(−iux_i − ivy_i + iva)
+ R_B T_B* Θ² exp(iux_i + ivy_i) + T_B R_B* exp(−iux_i − ivy_i)
+ T_R T_G* Θ exp(iva) + T_G T_R* Θ exp(−iva)
+ T_R T_B* Θ exp(i2va) + T_B T_R* Θ exp(−i2va)
+ T_G T_B* Θ exp(iva) + T_B T_G* Θ exp(−iva), (9)
where Θ*Θ = 1 has been used and the (u,v) arguments are again omitted for brevity.
If we apply the inverse Fourier transform to Eq. (9), it produces a strong cross-correlation as well as many small noise terms. Terms that still include the phase mask Θ(u,v) become system noise randomly distributed over the output plane, and the amplitude of this system noise is much smaller than the amplitude of the cross-correlation between the reference and the target image. To show the mathematical expression of the cross-correlation's position, we consider only the terms that generate the cross-correlation, given by
S(u,v) = T_R exp(−iux_i − ivy_i) R_R* + T_G exp(−iux_i − ivy_i) R_G* + T_B exp(−iux_i − ivy_i) R_B*
= [T_R R_R* + T_G R_G* + T_B R_B*] exp(−iux_i − ivy_i). (10)
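The claim that mask-bearing terms turn into low-level noise can be checked numerically: multiplying a spectrum term by a random unit-magnitude phase mask scatters its inverse transform over the whole plane. A small demonstration (our own, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[20:44, 20:44] = rng.standard_normal((24, 24))  # zero-mean reference content
R = np.fft.fft2(img)
theta = np.exp(-1j * rng.uniform(0, 2 * np.pi, R.shape))  # random phase mask

# |R|^2 alone inverse-transforms to a sharp autocorrelation peak at the origin;
# |R|^2 * Theta inverse-transforms to low-amplitude noise spread over the plane.
plain = np.abs(np.fft.ifft2(R * np.conj(R)))
masked = np.abs(np.fft.ifft2(R * np.conj(R) * theta))
ratio = masked.max() / plain.max()  # well below 1
```

This is exactly why only the mask-free terms of Eq. (10) survive as a detectable peak after the inverse transform.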
Fig. 1. The color (a) reference image and (b) target image.
The correlation output is obtained after the inverse Fourier transform is applied to Eq. (10). Sometimes the phase mask is not large enough relative to the input image, or the content of the input image is too dense, so the noise generated by the phase mask influences the cross-correlation; the side-lobe of the cross-correlation peak can also be too wide. Alam and Karim proposed a fringe-adjusted filter (FAF) to solve this problem in the classical JTC [30–32]. Alam then defined a generalized FAF, the fractional power FAF (FPFAF) [33], which is given by
H_fpfaf(u,v) = B(u,v) / {A(u,v) + |R_R(u,v) + R_G(u,v) + R_B(u,v)|^m}, (11)
where B(u,v) and A(u,v) are either constants or functions of u and v. Because the target image is not exactly the same as the reference image due to the displacement between them, and for simple implementation, we let B(u,v) = 1, A(u,v) = 0 and m = 1. The correlator then becomes an amplitude-modulated phase-only JTC, and the filter is expressed as
H(u,v) = 1 / |R_R(u,v) + R_G(u,v) + R_B(u,v)|, (12)
where R_R(u,v), R_G(u,v) and R_B(u,v) denote the Fourier transforms of r_R(x,y), r_G(x,y) and r_B(x,y), respectively. Eq. (10) is multiplied by Eq. (12), and the filtered phase-encoded spectrum (FS) is obtained as
FS(u,v) = S(u,v) · H(u,v)
= [T_R R_R* + T_G R_G* + T_B R_B*] exp(−iux_i − ivy_i) |R_R + R_G + R_B|⁻¹
≈ [T_R + T_G + T_B] exp(−iux_i − ivy_i). (13)
Applying the inverse Fourier transform to Eq. (13), the cross-correlation that corresponds to the displacement between the color reference image and the color target image is obtained as
c(x,y) = t_R(x+x_i, y+y_i) + t_G(x+x_i, y+y_i) + t_B(x+x_i, y+y_i)
= t_{R+G+B}(x+x_i, y+y_i). (14)
The displacement ðxi,yiÞ is determined by the brightest peak inthe entire output plane.
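In an FFT-based digital simulation the output plane is periodic, so the index of the brightest pixel has to be unwrapped to a signed displacement. A sketch (names our own; the sub-pixel interpolation implied by the paper's fractional results is not shown):

```python
import numpy as np

def peak_to_shift(corr):
    """Locate the brightest correlation peak and unwrap it to a
    signed (dx, dy) displacement on an FFT-periodic output plane."""
    h, w = corr.shape
    py, px = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    dy = py - h if py > h // 2 else py  # wrap rows beyond h/2 to negative shifts
    dx = px - w if px > w // 2 else px  # same for columns
    return dx, dy
```

For example, a peak at row 3, column 13 of a 16 by 16 plane corresponds to a shift of (dx, dy) = (−3, 3).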
3. Digital results
The color reference and target images are shown in Fig. 1, and their resolutions are both 256 by 256 pixels. The real displacement between them is (8.13, 7.59) pixels. First, we decompose the color reference and target image into three
Fig. 3. The new monochromatic (a) reference image and (b) target image. (c) The amplitude of the phase-encoded reference image and (d) the total input image. (e) The corresponding 3D plot of the correlation output. (f) The corresponding 3D plot of the correlation output when the color images are translated into gray images.
Fig. 2. Using the classical JTC: (a) the monochromatic input image; (b) the 2D plot of the correlation output; and (c) the 3D plot of the color image displacement measurement.
independent channels, and arrange them as a single scene for the input images according to Refs. [11,12]. Then we use the classical joint transform correlator to determine the displacement between the reference and the target image. The joint input image, including three channels of the reference and the target image, is shown in Fig. 2(a); the 2D plot of the correlation output is shown in Fig. 2(b) and the 3D plot in Fig. 2(c). Using the cross-correlation's position, the displacement is determined as (7.92, 7.40) pixels; the errors are (0.21, 0.19) pixels. But 15 peaks exist in the output plane, so distinguishing the correct cross-correlation peaks to determine the displacement between the reference and the target image is relatively difficult.
Next we use our method to measure the displacement between the color reference and the color target image. The color reference image is separated into three channels, and these are
Fig. 4. Overlapping zone for mutually shifted reference and target images.
Table 1. Errors due to the displacement range.
Displacement range Mean on x RMSE on x Mean on y RMSE on y
[0,10] 0.1876 0.1029 0.1470 0.0726
[10,20] 0.1256 0.1026 0.0577 0.0478
[20,30] 0.1111 0.0977 0.0753 0.0678
[30,40] 0.1627 0.1403 0.0917 0.0994
[40,50] 0.1673 0.1450 0.1316 0.1048
[50,60] 1.8385 1.3074 2.0893 1.5317
Fig. 5. Errors of displacement (a) along horizontal axis and (b) along vertical axis.
placed on a black background to form a new monochromatic reference image, shown in Fig. 3(a), with a resolution of 512 by 1024 pixels. The monochromatic target image is generated in the same way as the monochromatic reference and is shown in Fig. 3(b). We use the MATLAB built-in function rand to digitally generate a two-dimensional random function Φ(u,v) uniformly distributed between 0 and 2π, and the random phase mask Θ(u,v) is then generated using Eq. (3). The amplitude of the phase-encoded reference image is shown in Fig. 3(c), and the amplitude of the total input image, including the phase-encoded monochromatic reference and the monochromatic target image, is shown in Fig. 3(d). The 3D plot of the correlation output is shown in Fig. 3(e). Only one sharp peak exists in the output plane, which is very convenient for displacement detection. By finding the position of the sharp peak, we obtain the displacement as (8.06, 7.72) pixels; the errors are (0.07, −0.13) pixels. The detection accuracy of the PJTC is much improved compared with the classical JTC, and obtaining the displacement by locating the cross-correlation peak is much simpler. We also translated the color target and reference images into gray images for comparison; the correlation output is shown in Fig. 3(f). The intensity of the cross-correlation peak generated by our approach, which uses all three channels of the color image (Fig. 3(e)), is nearly four times that obtained by translating the color images into gray images (Fig. 3(f)). Thus, in a real optical experiment where light intensity is limited, it is easier to detect the cross-correlation when all three channels of the color image are used.
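The digital experiment can be reproduced in outline with NumPy. The following self-contained sketch strings Eqs. (1)–(14) together under our own naming; it recovers integer shifts only, whereas the paper reports sub-pixel values:

```python
import numpy as np

def pjtc_shift(ref_rgb, tgt_rgb, seed=0):
    """Estimate the integer (dx, dy) shift between two H x W x 3 color
    images with a digitally simulated phase-encoded JTC."""
    h, w, _ = ref_rgb.shape

    def stack(img):  # Eqs. (1)-(2): vertical R/G/B bands on one canvas
        return np.concatenate([img[:, :, k] for k in range(3)], axis=0)

    r, t = stack(ref_rgb), stack(tgt_rgb)
    rng = np.random.default_rng(seed)
    theta = np.exp(-1j * rng.uniform(0, 2 * np.pi, r.shape))  # mask, Eq. (3)

    R = np.fft.fft2(r) * theta            # phase-encoded reference spectrum, Eqs. (5)-(7)
    T = np.fft.fft2(t)
    jps = np.abs(R + T) ** 2              # joint power spectrum, Eq. (8)
    pjps = jps * theta                    # Eq. (9): the target-reference term sheds the mask
    H = 1.0 / (np.abs(np.fft.fft2(r)) + 1e-12)  # amplitude filter in the spirit of Eq. (12)
    out = np.abs(np.fft.ifft2(pjps * H))  # correlation plane, Eqs. (13)-(14)

    py, px = np.unravel_index(np.argmax(out), out.shape)
    dy = py - 3 * h if py > 3 * h // 2 else py  # unwrap FFT periodicity
    dx = px - w if px > w // 2 else px
    return dx, dy
```

For example, shifting a test frame by (dx, dy) = (−3, 5) and running pjtc_shift on the pair recovers that shift, provided the shifted content stays inside the frame.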
Then, we test the detection errors for the displacement ranges shown in Fig. 4. The resolution of the color reference is 200 by 200 pixels. Each time, we move the reference image by a random displacement within the range given in column 1 of Table 1 to form the color target image, repeating 20 times to form 20 target images in each displacement range. The mean absolute errors are summarized in Table 1. When the displacement exceeds 50 pixels, the detection error becomes larger than one pixel, which does not satisfy our need for high precision in image
Table 2. Performance of the detection errors over a video stream.

Method MSE horizontal MSE vertical Mean horizontal Mean vertical
PJTC 0.002 0.0011 0.0623 0.0617
Classical JTC 0.0027 0.0082 0.2114 0.1699
Fig. 6. The proposed optical setup.
registration. When the displacement is 50 by 50 pixels, the overlapping zone is ((200−50)×(200−50))/(200×200) = 56.25% of the reference image. If the overlapping zone is more than 56.25%, the mean detection errors remain within 0.2 pixels and the RMSE remains within 0.15 pixels.
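The overlap arithmetic above generalizes to any square image and equal shift along both axes; a trivial helper (our own) makes the check explicit:

```python
def overlap_fraction(size, shift):
    """Fraction of a size x size reference covered by a target
    shifted by |shift| pixels along both axes."""
    return (size - abs(shift)) ** 2 / size ** 2
```

Here overlap_fraction(200, 50) gives 0.5625, i.e. the 56.25% quoted above.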
Finally, we test the performance of the PJTC and the classical JTC in measuring displacement over a color video of 101 color frames. The errors of displacement along the horizontal and vertical axes are shown in Fig. 5(a) and (b), respectively, and the detection errors over the video stream are summarized in Table 2. The mean square errors (MSE) along the horizontal and vertical axes are 0.002 and 0.0011 pixels for the PJTC, compared with 0.0027 and 0.0082 pixels for the classical JTC. The mean errors along the horizontal and vertical axes are 0.0623 and 0.0617 pixels for the PJTC, compared with 0.2114 and 0.1699 pixels for the classical JTC. This shows that the PJTC has better displacement measurement accuracy than the classical JTC.
4. Hybrid setup
The proposed hybrid setup is shown in Fig. 6. The two-dimensional random function Φ(u,v), uniformly distributed between 0 and 2π, is generated by computer 1 (COM1), and the random phase mask Θ(u,v) is then generated using Eq. (3). SLM1 is a combined phase and amplitude modulator, SLM2 is an amplitude-only modulator, and SLM3 is a phase-only modulator; phase-only modulation is realized by displaying a uint8 gray-level image on SLM3. SLM1 and SLM2 are at the front focal planes of Fourier transform lens 1 (FTL1) and FTL2, respectively, and CCD1 and CCD2 are at the back focal planes of FTL1 and FTL2, respectively. The color reference image and the target image are separated into three channels to form the new monochromatic reference and target images. The phase-encoded monochromatic reference image is overlaid with the monochromatic target image, and the result is displayed on SLM1 through COM1. The light generated by laser 1 (L1) is collimated by collimator lens 1 (CL1), reflected by reflective mirror 1 (RM1), and passes perpendicularly through polarized beam splitter 1 (PBS1) to SLM1. After modulation and reflection off SLM1, the reflected light is Fourier transformed by FTL1 and the joint power spectrum (JPS) is captured by CCD1. The JPS is sent to COM2 to obtain the filtered phase-encoded joint power spectrum (HJPS), which is displayed on SLM2 through COM2. The light generated by L2 is collimated by CL2 and phase-only modulated by SLM3; the resulting light passes through PBS2 to SLM2. The reflected light modulated by SLM2 at the front focal plane of FTL2 is inverse Fourier transformed by FTL2, and the correlation output at the back focal plane of FTL2 is captured by CCD2. Finally, the computation is performed to determine the number of pixels of the displacement.
5. Conclusion
We have proposed a displacement measurement technique for color images based on the phase-encoded JTC. The technique decomposes a color image into three channels and positions the three independent components so as to form a new monochromatic image. The PJTC scatters the other terms into system noise distributed over the whole output plane and leaves only one cross-correlation, improving the detection ability and the precision of the cross-correlation, which accurately corresponds to the displacement between the color reference and the color target image. Compared with displacement measurement using the classical joint transform correlator, our method showed improved detection capability and detection precision.
Acknowledgment
This research was supported by the National Natural Science Foundation of China (Grant no. 61178064) and the National Basic Research Program (973) of China (Grant no. 2009CB724002). We also thank the reviewers for their helpful suggestions.
References
[1] Weaver CS, Goodman JW. A technique for optically convolving two functions. Appl Opt 1966;5(7):1248–9.
[2] Mendlovic D, Marom E, Konforti N. Complex reference-invariant joint-transform correlator. Opt Lett 1990;15(21):1224–6.
[3] Alam MS, Karim MA. Multiple target detection using a modified fringe-adjusted joint transform correlator. Opt Eng 1994;33(5):1610–6.
[4] Mallik B, Datta AK. Defect detection in fabrics with a joint transform correlation technique: theoretical basis and simulation. Text Res J 1999;69(11):829–35.
[5] Loo CH, Alam MS. Invariant object tracking using fringe-adjusted joint transform correlator. Opt Eng 2004;43(9):2175–83.
[6] Janschek K, Dyblenko S, Boge T. Image based attitude determination using an optical correlator. In: Schurmann B, editor. Spacecraft guidance, navigation and control systems, proceedings of the fourth ESA international conference. ESTEC, Noordwijk, the Netherlands; 1999. p. 487–92.
[7] Janschek K, Tchernykh V, Dyblenko S, et al. Compensation of the attitude instability effect on the imaging payload performance with optical correlators. Acta Astronaut 2003;52(9–12):965–74.
[8] Janschek K, Tchernykh V, Dyblenko S, et al. Compensation of focal plane image motion perturbations with optical correlator in feedback loop. In: Meynart R, Neeck SP, Shimoda H, editors. Proceedings of SPIE: Sensors, Systems, and Next-Generation Satellites VIII. Bellingham; 2004. p. 280–8.
[9] Janschek K, Tchernykh V, Dyblenko S. Performance analysis of opto-mechatronic image stabilization for a compact space camera. Control Eng Pract 2007;15:333–47.
[10] Janschek K, Tchernykh V, Dyblenko S. Integrated camera motion compensation by real-time image motion tracking and image deconvolution. In: Lee KM, chairman. Proceedings of the 2005 IEEE/ASME international conference on advanced intelligent mechatronics. Monterey; 2005. p. 1437–44.
[11] Mendlovic D, Deutsch M, Ferreira C, García J. Single-channel polychromatic pattern recognition by the use of a joint-transform correlator. Appl Opt 1996;35:6382–9.
[12] Deutsch M, Garcia J, Mendlovic D. Multichannel single-output color pattern recognition by use of a joint transform correlator. Appl Opt 1996;35:6976–82.
[13] Alam MS, Wai CN. Color pattern recognition using fringe-adjusted joint transform correlation. Opt Eng 2001;40:2407–13.
[14] Jin WM, Zhang YP. Color pattern recognition based on the joint fractional Fourier transform correlator. Chin Opt Lett 2007;5(11):628–31.
[15] Yu FTS, Jutamulia S, Lin TW. Real-time polychromatic signal detection using a color liquid crystal television. Opt Eng 1987;26:453–60.
[16] Yu FTS, Jutamulia S, Yelamarty RV, Gregory DA. Adaptive joint transform correlator for real-time color pattern recognition. Opt Laser Technol 1989;21(3):189–92.
[17] Hsieh ML, Hsu KY, Zhai HC. Color image recognition by use of a joint transform correlator of three liquid–crystal televisions. Appl Opt 2002;41(8):1500–4.
[18] Corbalan M, Millan MS, Yzuel MJ. Color pattern recognition with CIELAB coordinates. Opt Eng 2002;41(130). 075006(1)–9.
[19] Nicolas J, Iemmi C, Campos J, Yzuel MJ. Optical encoding of color three-dimensional correlation. Opt Commun 2002;209(1–3):35–43.
[20] Alam MS, Goh SF, Dacharaju S. Three-dimensional color pattern recognition using fringe-adjusted joint transform correlation with CIELab coordinates. IEEE Trans Instrum Meas 2010;59(8):2176–84.
[21] Martínez PG, Martínez JL, Sanchez-Lopez MM, Moreno I. Wavelength-compensated time-sequential multiplexed color joint transform correlator. Appl Opt 2010;49:4866–73.
[22] Sangwine SJ, Ell TA, Moxey CE. Vector phase correlation. Electron Lett 2001;37(25):1513–5.
[23] Moxey CE, Sangwine SJ, Ell TA. Hypercomplex correlation techniques for vector images. IEEE Trans Signal Process 2003;51(7):1941–53.
[24] Feng W, Hu B, Yang C. A subpixel color image registration algorithm using quaternion phase-only correlation. In: ICALIP: International conference on audio, language and image processing. Shanghai; 2008. p. 1045–9.
[25] Alexiadis DS, Sergiadis GD. Estimation of motions in color image sequences using hypercomplex Fourier transforms. IEEE Trans Image Process 2009;18(1):168–87.
[26] Moxey CE, Sangwine SJ, Ell TA. Color-grayscale image registration using hypercomplex phase correlation. In: International conference on image processing, vol. 2; 2002. p. 385–8.
[27] Refregier P, Javidi B. Optical image encryption based on input plane and Fourier plane random encoding. Opt Lett 1995;20:767–9.
[28] Cherri AK, Alam MS. Reference phase-encoded fringe-adjusted joint transform correlator. Appl Opt 2001;40:1216–25.
[29] Ge P, Li Q, Feng HJ, Xu ZH. Displacement measurement using phase-encoded target joint transform correlator. Opt Eng 2011;50(3):038201–11.
[30] Alam MS, Karim MA. Multiple target detection using a modified fringe-adjusted joint transform correlator. Opt Eng 1994;33:1610–7.
[31] Alam MS, Khoury JS. Fringe-adjusted incoherent erasure joint transform correlator. Opt Eng 1998;37:75–82.
[32] Alam MS, Karim MA. Fringe-adjusted joint transform correlation. Appl Opt 1993;32:4344–50.
[33] Alam MS. Fractional power fringe-adjusted joint transform correlation. Opt Eng 1995;34:3208–16.