Performance of Mutual Information Similarity Measure for Registration of Multi-Temporal Remote Sensing Images
Hua-mei Chen
Department of Computer Science and Engineering University of Texas at Arlington, Arlington, TX 76019, USA
Pramod K. Varshney1 and Manoj K. Arora
Department of Electrical Engineering and Computer Science Syracuse University, Syracuse, NY 13244
Abstract Accurate registration of multi-temporal remote sensing images is essential for various
change detection applications. Mutual information (MI) has recently been used as a similarity
measure for registration of medical images because of its generality and high accuracy. Its
application in remote sensing is relatively new. There are a number of algorithms for the
estimation of joint histogram to compute mutual information, but they may suffer from
interpolation-induced artifacts under certain conditions. In this paper, we investigate the use of a
new joint histogram estimation algorithm called generalized partial volume estimation (GPVE)
for computing mutual information to register multi-temporal remote sensing images. The
experimental results show that higher order GPVE algorithms have the ability to significantly
reduce interpolation-induced artifacts. In addition, mutual information based image registration
performed using the GPVE algorithm produces better registration consistency than registration
using two other popular similarity measures, namely mean squared difference (MSD) and
normalized cross correlation (NCC), for multi-temporal remote sensing images.
Index Terms- multi-temporal images, image registration, mutual information, joint histogram
estimation, registration consistency.
1 Corresponding author: [email protected]
I INTRODUCTION
Temporal change detection is based on the processing of data collected at different
times (i.e., multi-temporal data) and is important in many applications, including land use/land
cover change detection, deforestation assessment and environmental monitoring. A variety of change
detection algorithms based on tools such as image differencing [1], principal component analysis
[2], change vector analysis [3], Markov Random Fields [4] and neural networks [5] may be used.
The basis of all these algorithms is, however, accurate registration of images taken at different
times. For example, registration accuracy of less than one-fifth of a pixel is required to achieve a
change detection error of less than 10% [6].
The necessity of accurate image registration and geometric rectification arises due to the
presence of a number of distortions (errors) in remote sensing images that occur as a result of
variations in platform position, the rotation of the Earth, relief displacements, etc. The behavior of
most of these distortions is systematic and, thus, can be easily removed at data acquisition
centers. It is also generally expedient to procure systematic-corrected images as these are already
geometrically rectified to a map projection system such as universal transverse mercator (UTM).
However, systematic corrections are normally performed on the basis of the platform ephemeris
data obtained from the header information, which may be relatively inaccurate. Therefore, some
random distortions may still be present in the systematic-corrected data. This can be illustrated
with the help of agency supplied Landsat thematic mapper (TM) images taken at two different
times as shown in Fig. 1. The images were systematically corrected and geometrically rectified
to UTM at the data acquisition center. It can, however, be seen that the two points (marked by
"+") having the same UTM coordinates in both the images are located at different positions
indicating the presence of non-systematic registration errors. To perform change detection
studies, these images have to be registered with each other to correct for non-systematic errors.
This is typically done by selecting a few features, also known as ground control points (GCP), in
both images (floating image taken at time I and reference image taken at time II). The GCP are
matched in pairs and used to compute the transformation parameters that register the images. A
resampling procedure is then followed to estimate the intensity values at the new locations of the
floating image. Feature based registration via manual selection of GCPs is, however, a
laborious, time-intensive and complex task. Some algorithms have been developed to automate
the selection of GCPs and improve efficiency [7, 8]. However, automatic GCP extraction may
still suffer from the selection of too few points, and the extracted points may be inaccurate and
unevenly distributed over the image. This may lead to large registration errors. Hence, automatic
intensity based registration techniques
may be more appropriate than the feature based techniques.
In this paper, we adopt mutual information (MI) as the similarity measure for automatic
registration of multi-temporal remote sensing images. The major requirement to compute the MI
between two images is the accurate estimation of the joint histogram. A number of interpolation
algorithms such as the linear and partial volume interpolation (PVI) may be used to estimate the
joint histogram. However, we have found that the performance of the existing joint histogram
estimation algorithms may be limited due to a phenomenon called interpolation-induced artifacts,
which not only hampers the global optimization process but also limits the accuracy [9, 10].
Typical artifact patterns resulting from linear [11] and partial volume interpolation [12] are
shown in Fig. 2(a) and 2(b). The multiple peaks in these figures are the result of artifacts. To
overcome the problem of interpolation-induced artifacts, we have developed a new joint
histogram estimation algorithm called generalized partial volume estimation (GPVE) [10]. Here,
we investigate its application for multi-temporal remote sensing image registration. We also
compare the performance of MI based registration that employs GPVE, with other intensity-
based registration algorithms using mean square difference (MSD) and normalized cross-
correlation (NCC) as similarity measures. In Section II, we review the general intensity-based
image registration approach and point out one common problem encountered when MSD and
NCC are used as similarity measures. In Section III, we briefly review the MI based registration
approach and describe the artifacts phenomenon. The 2D implementation of the GPVE algorithm
is described in Section IV. In the absence of ground data (i.e., GCP), registration consistency
[11] has been used as a measure to evaluate the performances of different registration algorithms,
and is reviewed in Section V. Experimental results, their discussion and conclusions are provided
in Sections VI and VII respectively.
II REVIEW OF INTENSITY BASED IMAGE REGISTRATION APPROACH
The main principle behind any intensity based registration technique is to find a set of
transformation parameters that globally optimizes a similarity measure. This can be expressed
mathematically as,
α* = arg opt_α S(F, Tα(R))    (1)

where F is the floating image, R is the reference image, Tα is the transformation model, α is the
set of parameters involved in Tα, and S represents the chosen similarity measure. From (1), we
can observe that to register images F and R using a transformation model Tα, we need to find the
associated transformation parameters α* that optimize the selected similarity measure S.
Two commonly used similarity measures are MSD and NCC [13]. To use these as valid
similarity measures, images are assumed radiometrically corrected. However, in some cases,
even though accurate radiometric correction has been applied, the registration results using MSD
or NCC as similarity measures may still not be reliable. This occurs, for example, when the
scene undergoes a significant amount of change between two times. This can be illustrated by
registering small portions ([128×128] pixels and [228×228] pixels) of Landsat TM band 1
images taken in years 1995 and 1998 (see Fig. 3). The images are systematically corrected, and
thus contain no radiometric errors. From Fig. 3, a significant change between the intensity values
of the two images can be observed which is probably due to the change in crop types in the
region. The 2D registration functions obtained by using MSD and NCC as similarity measures,
to register images in Fig. 3, are shown in Fig. 4(a)-(b). If both MSD and NCC were suitable
similarity measures in this case, we would expect a global minimum in Fig. 4(a) and a global
maximum in Fig. 4(b) both at the position (50,50). Instead, it occurs at (30,20) for MSD and at
(70,30) for NCC. Thus, these two similarity measures fail to accurately register the two images.
In order to overcome this potential difficulty pertaining to MSD and NCC, a more robust
similarity measure is required.
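For concreteness, the two measures can be written down directly; the following is a minimal NumPy sketch (function names are illustrative, not from the paper), with MSD to be minimized and NCC to be maximized:

```python
import numpy as np

def msd(f, r):
    """Mean squared difference between floating image f and reference r (minimize)."""
    f = np.asarray(f, dtype=float)
    r = np.asarray(r, dtype=float)
    return float(np.mean((f - r) ** 2))

def ncc(f, r):
    """Normalized cross correlation between f and r (maximize)."""
    f = np.asarray(f, dtype=float) - np.mean(f)
    r = np.asarray(r, dtype=float) - np.mean(r)
    return float((f * r).sum() / np.sqrt((f ** 2).sum() * (r ** 2).sum()))
```

Both measures compare intensities directly, which is why a radiometric change in the scene, as in Fig. 3, can move their optimum away from the true alignment.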
Mutual information (MI) has been proposed as a suitable similarity measure for many
multi-modal image registration problems [12,14]. Analogous to NCC, the objective now is to
maximize the MI between the two images. Since its introduction, MI has been used widely in
many medical image registration applications because of its high accuracy and generality [15,
16]. Only recently, some work has been initiated on MI based registration of remote sensing
images [17-19]. Fig. 5 illustrates the 2D registration function using MI as the similarity measure
for the registration of images shown in Fig. 3. It is clear from Fig. 5 that the maximum occurs at
the position (50,50) thereby demonstrating that MI is able to successfully register the two
images, which was not the case with either NCC or MSD.
III COMPUTATION OF MUTUAL INFORMATION BETWEEN TWO IMAGES
MI of two random variables A and B can be obtained as [20],

I(A,B) = H(A) + H(B) − H(A,B)    (2)

where H(A) and H(B) are the entropies of A and B, and H(A,B) is their joint entropy. Considering
A and B as two images, the MI based registration criterion states that the images shall be
registered when I(A,B) is maximal. The entropies and joint entropy can be computed from,

H(A) = −∑_a P_A(a) log P_A(a)    (3)

H(B) = −∑_b P_B(b) log P_B(b)    (4)

H(A,B) = −∑_{a,b} P_{A,B}(a,b) log P_{A,B}(a,b)    (5)

where P_A(a) and P_B(b) are the marginal probability mass functions, and P_{A,B}(a,b) is the joint
probability mass function. MI measures the degree of dependence of A and B by measuring the
distance between the joint distribution P_{A,B}(a,b) and the distribution associated with the case of
complete independence, P_A(a)·P_B(b). The probability mass functions can be obtained from,
P_{A,B}(a,b) = h(a,b) / ∑_{a,b} h(a,b)    (6)

P_A(a) = ∑_b P_{A,B}(a,b)    (7)

P_B(b) = ∑_a P_{A,B}(a,b)    (8)

where h is the joint histogram of the image pair. It is a 2D matrix of the following form:

    h = [ h(0,0)      h(0,1)      ...   h(0,N−1)
          h(1,0)      h(1,1)      ...   h(1,N−1)
          ...         ...         ...   ...
          h(M−1,0)    h(M−1,1)    ...   h(M−1,N−1) ]
The value h(a,b), a∈[0, M-1], b∈[0,N-1], is the number of corresponding pairs having intensity
value a in the first image and intensity value b in the second image. It can thus be seen from (2)
to (8) that the joint histogram estimate is sufficient to determine the MI between two images.
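The chain (2)-(8) can be implemented in a few lines once the joint histogram is available; the following is a minimal NumPy sketch (the function name is ours):

```python
import numpy as np

def mutual_information(h):
    """Compute I(A,B) = H(A) + H(B) - H(A,B) from a joint histogram h (2D array)."""
    p_ab = h / h.sum()                   # joint pmf, eq. (6)
    p_a = p_ab.sum(axis=1)               # marginal pmf of A, eq. (7)
    p_b = p_ab.sum(axis=0)               # marginal pmf of B, eq. (8)
    nz = p_ab > 0                        # skip zero bins to avoid log(0)
    h_ab = -np.sum(p_ab[nz] * np.log(p_ab[nz]))          # joint entropy, eq. (5)
    h_a = -np.sum(p_a[p_a > 0] * np.log(p_a[p_a > 0]))   # eq. (3)
    h_b = -np.sum(p_b[p_b > 0] * np.log(p_b[p_b > 0]))   # eq. (4)
    return float(h_a + h_b - h_ab)       # eq. (2)
```

Registration then amounts to maximizing this quantity over the transformation parameters; the quality of the joint histogram estimate h is the critical ingredient.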
Two commonly used algorithms to estimate the joint histogram of two images are linear
interpolation [11] and PVI [12]. However, when the two images have the same spatial resolution
along at least one direction, both algorithms may suffer from interpolation-induced artifacts
[9]. The reason is that, under this condition, the number of grid-aligned pixels may change
abruptly when the displacement involved in the geometrical transformation along that direction
changes. However, if a rotational difference is involved in the two images to be registered,
artifacts will not appear even when displacements change. This is due to the fact that when a
rotational difference is present (except for multiples of 90 degrees), the number of grid-aligned
pixels is always small, no matter how the displacements change. An extensive discussion on the
mechanisms causing the artifacts and the underlying theory may be found in [9, 10]. The artifact
patterns may, however, be present while registering multi-temporal remote sensing images (see
Fig. 2). To combat this problem, we use a new joint histogram estimation algorithm called
GPVE, which we have successfully employed earlier for the registration of medical brain CT and
MR images [10]. Its 2D implementation is described in the next section.
IV GENERALIZED PARTIAL VOLUME ESTIMATION (GPVE) ALGORITHM
FOR JOINT HISTOGRAM ESTIMATION
Let Tα be the transformation characterized by the parameter set α that will be applied to
image A (see equation (1)). Assume that Tα maps a grid point with coordinates (x, y) of a pixel in
image A onto the corresponding point (i+∆i, j+∆j) in image B, where (i, j) are the coordinates
of a pixel in B, and ∆i and ∆j are small displacements with 0 ≤ ∆i, ∆j < 1. The unit here is the inter-
pixel distance. See Fig. 6 for a geometrical illustration. For each grid point the joint histogram h
is updated as:

h(A(x,y), B(i+p, j+q)) += f(p − ∆i) · f(q − ∆j),  ∀ p, q ∈ Z    (9)

where Z is the set of all integers and f is a real valued kernel function that satisfies the following
two conditions,

i) f(x) ≥ 0, where x is a real number.    (10)

ii) ∑_{n=−∞}^{+∞} f(∆ + n) = 1, where n is an integer and 0 ≤ ∆ < 1.    (11)
The first condition ensures that the joint histogram values are non-negative while the
second condition makes the sum of the updated values equal to one for each corresponding pair
of points in image A and image B. In this paper, we employ B-spline functions as the kernel
functions. The details on B-spline functions can be found in [21, 22]. It may be noted that the
PVI algorithm is a special case when the first order B-spline function is used as the kernel
function. Fig. 7 shows the 2D MI registration function using the second order GPVE algorithm
(second order B-spline function is used) for the same data as was used to produce Fig. 2. Clearly
the artifacts have been reduced significantly.
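A minimal 2D sketch of the update rule (9) is given below, restricted to a pure translation (dx, dy) for illustration (the general Tα of the paper is not reproduced here); the quadratic B-spline kernel corresponds to the second order GPVE, and the function names are ours:

```python
import numpy as np

def bspline2(x):
    """Second order (quadratic) B-spline kernel; satisfies conditions (10)-(11)."""
    ax = abs(x)
    if ax <= 0.5:
        return 0.75 - ax * ax
    if ax <= 1.5:
        return 0.5 * (1.5 - ax) ** 2
    return 0.0

def gpve_joint_histogram(img_a, img_b, dx, dy, levels=256, kernel=bspline2, support=2):
    """Estimate the joint histogram h of img_a and a translated img_b via eq. (9).

    Intensities are assumed to be integers in [0, levels). The translation
    (dx, dy) is a simplified stand-in for the general transformation Tα.
    """
    h = np.zeros((levels, levels))
    rows, cols = img_a.shape
    for x in range(rows):
        for y in range(cols):
            tx, ty = x + dx, y + dy              # Tα maps (x, y) onto (i+∆i, j+∆j)
            i, j = int(np.floor(tx)), int(np.floor(ty))
            di, dj = tx - i, ty - j              # fractional displacements in [0, 1)
            for p in range(-support, support + 1):
                for q in range(-support, support + 1):
                    ii, jj = i + p, j + q
                    if 0 <= ii < rows and 0 <= jj < cols:
                        # distribute one unit of mass over neighboring bins
                        h[img_a[x, y], img_b[ii, jj]] += kernel(p - di) * kernel(q - dj)
    return h
```

The partition-of-unity condition (11) guarantees that each pixel pair contributes a total weight of one, so the histogram mass varies smoothly with the displacements; this is what suppresses the artifact patterns.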
V REGISTRATION CONSISTENCY
In the absence of ground data, registration consistency [11] may be used as a measure to
evaluate the performance of different intensity-based image registration algorithms. Defining
T_{A,B} as the transformation obtained using image A as the floating image and image B as the
reference image, the registration consistency (dp) of T_{A,B} and T_{B,A} can be formulated as,

dp_{A,B} = (1/N_A) · ∑_{(x,y) ∈ I_A ∩ I_{A,B}} ‖(x,y) − T_{B,A} ∘ T_{A,B}(x,y)‖    (12a)

dp_{B,A} = (1/N_B) · ∑_{(x,y) ∈ I_B ∩ I_{A,B}} ‖(x,y) − T_{A,B} ∘ T_{B,A}(x,y)‖    (12b)

where the composition T_{B,A} ∘ T_{A,B} represents the transformation that applies T_{A,B} first and then
T_{B,A}. The overlap region of images A and B is defined as I_{A,B}. The discrete domains of images A
and B are I_A and I_B respectively, and N_A and N_B are the numbers of pixels of image A and image B
within the overlap region. The registration consistency defined in (12a) specifies the mean
distance between the two mapped points T_{A,B}(p) and T_{B,A}^{−1}(p), where p is a pixel in image A.
Similarly, (12b) represents the mean distance between the two mapped points T_{B,A}(p) and T_{A,B}^{−1}(p),
where p is a pixel in image B. In general, it is expected that the two values from (12a) and (12b)
will be practically the same. Similarly, a three-date registration consistency can be defined as,
dp3 = (1/(2N_A)) · ∑_{(x,y) ∈ I_A ∩ I_{A,B,C}} [ ‖(x,y) − T_{C,A} ∘ T_{B,C} ∘ T_{A,B}(x,y)‖ + ‖(x,y) − T_{B,A} ∘ T_{C,B} ∘ T_{A,C}(x,y)‖ ]    (13)

where I_{A,B,C} is the overlap region of images A, B and C.
In cases where the transformation required to bring two images into alignment is just a
displacement, the transformation T_{A,B} can be represented as a 2-D vector v_{A,B} = (∆x, ∆y), where
∆x is the vertical displacement and ∆y is the horizontal displacement. For this case, a simplified
two-date registration consistency can be defined as,

dp2 = ‖v_{A,B} + v_{B,A}‖    (14)

Similarly, the three-date registration consistency can be defined as,

dp3 = ( ‖v_{A,B} + v_{B,C} + v_{C,A}‖ + ‖v_{A,C} + v_{C,B} + v_{B,A}‖ ) / 2    (15)
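For the displacement-only case, (14) and (15) reduce to a few vector norms; a minimal NumPy sketch (function names are ours):

```python
import numpy as np

def two_date_consistency(v_ab, v_ba):
    """dp2 = ||v_AB + v_BA||, eq. (14): zero for perfectly consistent registrations."""
    return float(np.linalg.norm(np.asarray(v_ab) + np.asarray(v_ba)))

def three_date_consistency(v_ab, v_bc, v_ca, v_ac, v_cb, v_ba):
    """dp3, eq. (15): mean norm of the two closed loops A->B->C->A and A->C->B->A."""
    loop1 = np.asarray(v_ab) + np.asarray(v_bc) + np.asarray(v_ca)
    loop2 = np.asarray(v_ac) + np.asarray(v_cb) + np.asarray(v_ba)
    return float((np.linalg.norm(loop1) + np.linalg.norm(loop2)) / 2.0)
```

If the pairwise registrations were exact, each displacement loop would sum to the zero vector, so any residual norm directly quantifies the inconsistency.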
VI. EXPERIMENTAL RESULTS AND DISCUSSION
Three multi-temporal image data sets are used in our experiments. These are Landsat TM
band 1 images of three different dates (1995, 1997 and 1998), IRS PAN images of two different
dates (1997 and 1998), and Radarsat SAR images of two different dates (1997 and 1998). As an
example, only the Landsat TM images are shown in Fig. 8. The area covered by all the images of
sizes [512×512] pixels belongs to a portion of the San Francisco Bay in California. The images
obtained from the source were already systematically corrected. The header files associated with
each image indicate that there is no rotational difference in Landsat TM images, a slight
rotational difference of 0.02 degrees in the IRS PAN images and a significant rotational
difference (i.e., 18.5686 degrees) in the SAR images. Therefore, it is anticipated that
interpolation-induced artifacts may be more pronounced while registering multi-temporal
Landsat TM and IRS PAN images than while registering multi-temporal Radarsat SAR images
(see Section III).
We evaluate the performance of four algorithms to estimate the joint histogram to
compute mutual information. These are linear interpolation, the first order GPVE algorithm,
which is actually the PVI algorithm, and the second and third order GPVE algorithms. Simplex
search procedure [23,24] is used to find the global optimum in all cases. Since the simplex search
procedure is just a local optimizer, there are instances when it may fail to find the position of the
global optimum. Under those circumstances, we change the initial search points until the correct
optimum is obtained. Tables 1-6 show all the experimental results of registering different pairs of
multi-temporal remote sensing images. The first date in each pair serves as the floating image
and the second date as the reference image. Since Landsat TM images are available for three
dates, three sets of registration are performed. Tables 1 to 3 show these registration results in the
form of transformation parameters (i.e., displacements in x and y directions) along with the
corresponding two-date registration consistency. Table 4 shows the corresponding three-date
registration consistencies of each algorithm to register these images.
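The simplex search with restarts described above can be sketched with an off-the-shelf Nelder-Mead optimizer; the following assumes SciPy and illustrates only the restart strategy, not the paper's actual implementation or stopping criteria:

```python
import numpy as np
from scipy.optimize import minimize

def multistart_simplex(neg_similarity, starts):
    """Run Nelder-Mead from several initial points and keep the best result.

    neg_similarity should return the negated similarity measure (e.g., -MI),
    so that minimization corresponds to maximizing the similarity.
    """
    best = None
    for x0 in starts:
        res = minimize(neg_similarity, np.asarray(x0, dtype=float), method='Nelder-Mead')
        if best is None or res.fun < best.fun:
            best = res
    return best
```

Because the simplex method is only a local optimizer, a start placed in the basin of a spurious peak converges to that peak; restarting from a different initial point recovers the global optimum.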
A glance at the first four rows of Tables 1-4 shows no significant difference among the
performances of the four algorithms: all of them perform fairly well, with linear interpolation
yielding the worst registration consistency. Further, on close inspection of the transformation
parameters for the four joint histogram estimation algorithms in Tables 1-3, we notice that the
PVI algorithm results in almost perfect registration consistency. However, if we plot the
registration function of each algorithm for a pair of images to be registered, we can observe the
differences in these algorithms because of interpolation-induced artifacts. One such illustration is
provided in Fig. 9 (a to c) for the registration of Landsat TM images of 1995 and 1997. The
artifact patterns can be clearly seen in these plots. In fact, the PVI algorithm, which produced
perfect registration consistency as observed from the transformation parameters, also shows the
presence of artifacts (Fig. 10(a) and (b)). In contrast, the second and higher order GPVE algorithms do not
result in any artifact patterns (see Fig. 10(c) and (d) for the second order as an example). Thus,
higher order GPVE implementation clearly has an advantage over linear and PVI algorithms
since the resultant registration function is very smooth.
Similar conclusions can be drawn from the registration result of IRS PAN images
reported in Table 5. Linear and PVI algorithms result in artifact patterns, which are significantly
reduced on implementing higher order GPVE algorithms, as can be seen from plots of 1D
registration function in the y direction in Fig. 11(a) to (c). In the case of the Radarsat SAR
images (Table 6), we did not observe any artifact patterns when the linear and PVI algorithms were
used. This is because of the rotational difference between the two images, as described in Section
III. In this case, the linear interpolation has again resulted in relatively poor registration
consistency while PVI and higher order GPVEs have similar performance. Thus, joint histogram
estimation using higher order GPVE algorithms produces more reliable registration results than
using linear or PVI algorithms.
Further, in order to evaluate the performance of our MI based registration algorithm, the
registration results of 3rd order GPVE algorithm are compared with those obtained from image
registration using MSD and NCC as similarity measures. Linear image interpolation is used
when implementing the algorithms based upon these two measures. The registration results
obtained via MSD and NCC are shown in the last two rows of Tables 1-6. Although it is hard to
determine from our experiments which similarity measure results in better registration accuracy,
due to a lack of ground data, on the basis of registration consistency it can clearly be
seen that MI based registration implemented through higher order GPVE algorithms outperforms
the registration obtained with MSD and NCC as similarity measures.
VII. CONCLUSIONS
The MI based image registration approach is investigated in this paper for three multi-
temporal remote sensing data sets. In cases where the images to be registered have the same
spatial resolution and orientation, the application of MI based registration through conventional
implementations like linear interpolation and partial volume interpolation may create problems
due to the presence of artifacts, as shown by the results in this paper. We have introduced a new
joint histogram estimation algorithm called GPVE to calculate the mutual information. By
choosing the 2nd order or the 3rd order B-splines as the kernel functions involved in GPVE, we
have shown that the artifacts can be reduced significantly thereby improving registration
accuracy. Although a precise evaluation of registration accuracy is not possible without the
availability of accurate ground data, we have shown that MI based registration implemented
through the higher order GPVE algorithm results in better registration consistency than the
registration performed using MSD and NCC as the similarity measures.
ACKNOWLEDGEMENTS
This research was supported by National Aeronautics and Space Administration (NASA)
under grant number NAG5-11227. The remote sensing data was provided by the Eastman Kodak
Company under the multi modality image fusion study program.
REFERENCES
[1] V. Castelli, C. D. Elvidge, C. S. Li, and J. J. Turek, "Classification-based change detection: Theory and applications to the NALC data set," in Remote Sensing Change Detection: Environmental Monitoring Methods and Applications, R. S. Lunetta and C. D. Elvidge, Eds. Michigan: Ann Arbor Press, 1998, pp. 53-74.
[2] J. F. Mas, "Monitoring land-cover changes: a comparison of change detection techniques," Int. J. Remote Sensing, vol. 20, pp. 139-152, 1999.
[3] E. F. Lambin and A. H. Strahler, "Indicators of land-cover change for change-vector analysis in multitemporal space at coarse spatial scales," Int. J. Remote Sensing, vol. 15, pp. 2099-2119, 1994.
[4] T. Kasetkasem and P. K. Varshney, "An image change detection algorithm based on Markov random field models," IEEE Trans. Geosci. Remote Sensing, vol. 40, pp. 1815-1823, Aug. 2002.
[5] X. Liu and R. G. Lathrop, "Urban change detection based on an artificial neural network," Int. J. Remote Sensing, vol. 23, pp. 2513-2518, 2002.
[6] X. Dai and S. Khorram, "The effects of image misregistration on the accuracy of remotely sensed change detection," IEEE Trans. Geosci. Remote Sensing, vol. 36, pp. 1566-1577, Sept. 1998.
[7] X. Dai and S. Khorram, "A feature-based image registration algorithm using improved chain-code representation combined with invariant moments," IEEE Trans. Geosci. Remote Sensing, vol. 37, pp. 2351-2362, Sept. 1999.
[8] C. K. Toth and T. Schenk, "Feature-based matching for automatic image registration," ITC Journal, vol. 1, pp. 40-46, 1992.
[9] J. P. W. Pluim, J. B. A. Maintz, and M. A. Viergever, "Interpolation artifacts in mutual information-based image registration," Computer Vision and Image Understanding, vol. 77, pp. 211-232, 2000.
[10] H. Chen and P. K. Varshney, "Mutual information based CT-MR brain image registration using generalized partial volume joint histogram estimation," to appear in IEEE Trans. Medical Imaging.
[11] M. Holden, D. L. G. Hill, E. R. E. Denton, J. M. Jarosz, T. C. S. Cox, T. Rohlfing, J. Goodey, and D. J. Hawkes, "Voxel similarity measures for 3-D serial MR brain image registration," IEEE Trans. Medical Imaging, vol. 19, pp. 94-102, Feb. 2000.
[12] F. Maes, A. Collignon, D. Vandermeulen, G. Marchal, and P. Suetens, "Multimodality image registration by maximization of mutual information," IEEE Trans. Medical Imaging, vol. 16, pp. 187-198, 1997.
[13] L. G. Brown, "A survey of image registration techniques," ACM Computing Surveys, vol. 24, pp. 325-376, 1992.
[14] P. Viola and W. Wells, "Alignment by maximization of mutual information," in Proc. 5th International Conference on Computer Vision, pp. 16-23, 1995.
[15] A. K. Erdi, Y. Hu, and C. Chui, "Using mutual information (MI) for automated 3D registration in the pelvis and thorax region for radiotherapy treatment planning," in Proc. SPIE, vol. 3979, pp. 416-425, 2000.
[16] V. Zagrodsky, R. Shekhar, and J. F. Cornhill, "Mutual information-based registration of cardiac ultrasound volumes," in Proc. SPIE, vol. 3979, pp. 1605-1614, 2000.
[17] H. Chen and P. K. Varshney, "A pyramid approach for multimodality image registration based on mutual information," in Proc. Fusion 2000, pp. MoD3-9 - MoD3-15, 2000.
[18] K. Johnson, A. C. Rhodes, I. Zavorin, and J. LeMoigne, "Mutual information as a similarity measure for remote sensing image registration," in Proc. SPIE, vol. 4383, pp. 51-61, 2001.
[19] J. Inglada, "Similarity measures for multisensor remote sensing images," in Proc. IGARSS 2002, CD-ROM.
[20] T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley and Sons, 1991.
[21] M. Unser, A. Aldroubi, and M. Eden, "B-spline signal processing: Part I - theory," IEEE Trans. Signal Processing, vol. 41, pp. 821-833, 1993.
[22] M. Unser, A. Aldroubi, and M. Eden, "B-spline signal processing: Part II - efficient design," IEEE Trans. Signal Processing, vol. 41, pp. 834-848, 1993.
[23] J. A. Nelder and R. Mead, "A simplex method for function minimization," The Computer Journal, vol. 7, pp. 308-313, 1965.
[24] H. Chen, "Mutual Information Based Image Registration with Applications," Ph.D. dissertation, Syracuse University, Syracuse, NY, 2002.
Table 1. Registration results for Landsat TM 1995 and 1997 images. T denotes the transformation parameters (vertical and horizontal displacements).

Algorithm    T95,97 [pixel, pixel]    T97,95 [pixel, pixel]     2-date consistency
Linear       [2.9109, 62.4256]        [-3.0757, -62.4665]       0.1698
1st order    [3.0000, 62.0000]        [-3.0000, -62.0000]       0.0000
2nd order    [2.9812, 62.3717]        [-2.9819, -62.3739]       0.0024
3rd order    [2.9837, 62.3907]        [-2.9838, -62.3930]       0.0023
MSD          [3.2801, 62.4177]        [-3.0000, -62.3163]       0.2979
NCC          [3.1412, 62.3719]        [-3.1039, -62.3604]       0.0390
Table 2. Registration results for Landsat TM 1995 and 1998 images. T denotes the transformation parameters (vertical and horizontal displacements).

Algorithm    T95,98 [pixel, pixel]    T98,95 [pixel, pixel]     2-date consistency
Linear       [33.3183, -13.5464]      [-33.3238, 13.5576]       0.0125
1st order    [33.0001, -13.9990]      [-33.0000, 14.0000]       0.0010
2nd order    [33.2304, -13.6165]      [-33.2307, 13.6169]       0.0005
3rd order    [33.2504, -13.5969]      [-33.2522, 13.5975]       0.0020
MSD          [33.3777, -13.6033]      [-33.3438, 13.6517]       0.0592
NCC          [33.3507, -13.6290]      [-33.3489, 13.6486]       0.0197
Table 3. Registration results for Landsat TM 1997 and 1998 images. T denotes the transformation parameters (vertical and horizontal displacements).

Algorithm    T97,98 [pixel, pixel]    T98,97 [pixel, pixel]     2-date consistency
Linear       [30.3784, -76.0752]      [-30.3745, 76.0577]       0.0179
1st order    [30.0000, -76.0000]      [-30.0010, 76.0003]       0.0011
2nd order    [30.2734, -75.9989]      [-30.2723, 76.0115]       0.0127
3rd order    [30.2953, -75.9996]      [-30.2938, 76.0000]       0.0015
MSD          [30.3637, -75.9647]      [-30.4245, 75.7822]       0.1923
NCC          [30.3818, -75.8891]      [-30.3867, 76.1127]       0.2237
Table 4. 3-date registration consistency for registration of Landsat TM 1995, 1997 and 1998 images.

Algorithm    3-date registration consistency
Linear       0.1187
1st order    0.0008
2nd order    0.0289
3rd order    0.0291
MSD          0.2314
NCC          0.2023
Table 5. Registration results for IRS PAN 1997 and 1998 images. T denotes the transformation parameters (rotation angle, vertical and horizontal displacements).

Algorithm    T97,98 [degree, pixel, pixel]    T98,97 [degree, pixel, pixel]    2-date consistency
Linear       [-0.1158, 27.9863, -8.8709]      [0.1290, -27.9837, 8.9155]       0.0470
1st order    [-0.0005, 28.0001, -8.9993]      [0.0008, -28.0023, 8.9990]       0.0024
2nd order    [-0.1017, 28.0313, -8.8341]      [0.0997, -28.0241, 8.8878]       0.0104
3rd order    [-0.1025, 28.0575, -8.8460]      [0.1015, -28.0360, 8.9002]       0.0071
MSD          [-0.1206, 28.0089, -8.8800]      [0.1281, -27.9905, 8.9639]       0.0309
NCC          [-0.1222, 27.9752, -8.9126]      [0.1270, -27.9441, 8.9919]       0.0241
Table 6. Registration results for Radarsat SAR 1997 and 1998 images. T denotes the transformation parameters (rotation angle, vertical and horizontal displacements).

Algorithm    T97,98 [degree, pixel, pixel]    T98,97 [degree, pixel, pixel]    2-date consistency
Linear       [-18.5656, 76.4484, -56.0337]    [18.5284, -54.6218, 77.4446]     0.1150
1st order    [-18.5345, 76.3701, -56.1286]    [18.5186, -54.5811, 77.5116]     0.0533
2nd order    [-18.5163, 76.3780, -56.0975]    [18.5172, -54.5888, 77.4628]     0.0237
3rd order    [-18.5208, 76.3863, -56.0444]    [18.5210, -54.5977, 77.4538]     0.0565
MSD          [-18.5228, 76.2855, -55.6907]    [18.5264, -54.6726, 77.1890]     0.1504
NCC          [-18.5257, 76.2932, -55.7384]    [18.5200, -54.6581, 77.1451]     0.0641
Fig. 1 (a) Landsat TM band 1 image taken in 1997. (b) Landsat TM band 1 image taken in 1998. The two crosses mark the points of the same UTM coordinates in each frame. The noticeable shift demonstrates the presence of registration error in systematic-corrected data.
Fig. 2. Artifact patterns in the MI registration function using (a) linear interpolation and (b) partial volume interpolation.
Fig. 3. Two small portions of Landsat TM band 1 images taken in 1995 and 1998 respectively. The white box in image (b) indicates the corresponding sub-set of image (a).

Fig. 4. Two dimensional registration functions while registering images shown in Fig. 3 using (a) MSD and (b) NCC as similarity measures.
Fig. 5. 2D registration function while registering images shown in Fig. 3 using MI as the similarity measure.
Fig. 6. Geometrical illustration of the transformation Tα, which maps the grid point (x, y) of image A onto the point (i+∆i, j+∆j) in image B.
Fig. 7. MI registration function using 2nd order GPVE. Compare this with Fig. 5 and notice the reduction in artifacts.
Fig. 8. Landsat TM images used in the experiments. Images were acquired in (a) 1995, (b) 1997 and (c) 1998.
Fig. 9. Registration functions obtained through linear interpolation for registration of Landsat images of 1995 and 1997. (a) Landscape of the 2D MI registration function. (b) 1D registration function in the x direction. (c) 1D registration function in the y direction.
Fig. 10. 1D registration functions resulting from PVI (a and b) and 2nd order GPVE (c and d) in registering Landsat TM images of 1995 and 1997.
Fig. 11. 1D registration functions in the y direction resulting from different algorithms in registering IRS PAN images of 1997 and 1998. (a) Linear interpolation. (b) PVI. (c) 2nd order GPVE.