Issues in Image Registration and Image Similarity Based on Mutual Information

Post on 07-Jul-2015


DESCRIPTION

This is my 2nd Doctoral Progress Committee presentation on image registration, which explains how to measure image similarity based on entropy and mutual information.

TRANSCRIPT

Darshana Mistry

Supervisor: Dr. Asim Banerjee, DA-IICT

Co-Supervisor: Dr. Shishir Shah, University of Houston

DPC: Dr. Aditya Tatu, DA-IICT; Dr. Tanish Zaveri, NIT, NU

Issues in Image Registration – multimodal image acquisition & image super-resolution

Outline

Introduction to Image Registration (IR)

Issues of Image Registration

Steps for Image Registration

Review Comments on DPC-2

Affine Transformation

Entropy

Mutual Information

Introduction

Image Registration: the process of overlaying two or more images of the same scene taken at different times, from different viewpoints, and/or by different sensors.

Fig. 1. Image Registration

Issues of Image Registration

Multimodal Registration

Registration of images of the same scene acquired from different sensors.

Viewpoint Registration

Registration of images taken from different viewpoints.

Temporal Registration

Registration of images of the same scene taken at different times or under different conditions.

Steps of Image Registration

B. Zitova and J. Flusser [1] introduced the four basic steps of image registration.

Fig. 2. Four steps of image registration (top row: feature detection; middle row: feature matching, model estimation; bottom right: image resampling and transformation)

Previous Review Comments on DPC-2 (21-Dec. 12)

Implement the affine transformation, use mutual information, and plot its variations.

Identify standard image databases that will be used for future work.

Identify recent registration approaches reported in literature for future exploration during the next review period.

Affine Transformation[1/4]

The affine transformation parameters can be calculated from the coordinates of control points, and the geometric transformation can then be applied to the image to be registered.
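As a sketch of this step, the six affine parameters can be recovered from three (or more) control-point correspondences with a least-squares solve. The control-point coordinates below are illustrative values, not data from the presentation:

```python
import numpy as np

# Hypothetical control points: (x, y) in the reference image and their
# matched locations (x', y') in the sensed image (illustrative values).
src = np.array([[10.0, 10.0], [200.0, 20.0], [50.0, 180.0]])
dst = np.array([[14.0, 32.0], [198.0, 55.0], [40.0, 202.0]])

# Each correspondence gives two linear equations in the six affine
# parameters a11, a12, tx, a21, a22, ty:
#   x' = a11*x + a12*y + tx
#   y' = a21*x + a22*y + ty
n = len(src)
A = np.zeros((2 * n, 6))
b = np.zeros(2 * n)
for i, ((x, y), (xp, yp)) in enumerate(zip(src, dst)):
    A[2 * i] = [x, y, 1, 0, 0, 0]
    A[2 * i + 1] = [0, 0, 0, x, y, 1]
    b[2 * i], b[2 * i + 1] = xp, yp

params, *_ = np.linalg.lstsq(A, b, rcond=None)
M = params.reshape(2, 3)  # 2x3 affine matrix [A | t]

# Map an arbitrary point with the recovered transformation.
pt = np.array([10.0, 10.0, 1.0])
print(M @ pt)  # ~ [14, 32], the matched control point
```

With exactly three non-collinear control points the solve is exact; with more points, least squares averages out localization errors in the matched features.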

Affine Transformation[2/4]

Fig. 3. (a) Translation of an image, (b) Rotation of an image, (c) Scaling of an image


Affine Transformation[3/4]

Fig. 4. (a) Translation of an image, (b) Rotation of an image, (c) Scaling of an image


Affine Transformation[4/4]

Fig. 5. Affine Transformation of an image

Entropy[1/7]

A key measure of information is known as entropy [2], which is usually expressed as the average amount of information received when the value of a random variable X is observed (the average number of bits needed to store or communicate one symbol in a message).

Shannon introduced such a measure in 1948, which weights the information per outcome by the probability of that outcome occurring. Given events e_1, ..., e_n occurring with probabilities p_1, ..., p_n, the Shannon entropy is defined as

H(X) = -∑_i p_i log(p_i)

Entropy[2/7]

The units for entropy are "nats" when the natural logarithm is used and "bits" for base-2 logarithms.

When all messages are equally likely to occur, the entropy is maximal, because you are completely uncertain which message you will receive.

When one of the messages has a much higher chance of being sent than the other messages, the uncertainty decreases.

The amount of information for the individual messages that have a small chance of occurring is high, but on average, the information (entropy/uncertainty) is lower.

Entropy[3/7]

Example:-

1. Mummy: 0.35, daddy: 0.2, cat: 0.2, cow: 0.25

The entropy of the child's language:

-0.35 log2(0.35) - 0.2 log2(0.2) - 0.2 log2(0.2) - 0.25 log2(0.25) = 1.96

2. Mummy: 0.05, daddy: 0.05, cat: 0.02, train: 0.02, car: 0.02, cookie: 0.02, telly: 0.02, no: 0.8

The entropy of the child's language: 1.25
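The two numbers above can be checked with a few lines of Python (using the base-2 logarithm, matching the "bits" unit):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# First child's vocabulary distribution from the slide.
print(round(shannon_entropy([0.35, 0.2, 0.2, 0.25]), 2))  # 1.96

# Second child: "no" dominates, so the average uncertainty drops.
print(round(shannon_entropy([0.05, 0.05, 0.02, 0.02,
                             0.02, 0.02, 0.02, 0.8]), 2))  # 1.25
```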


Entropy[4/7]

Based on the distribution of the grey values of the image [4], a probability distribution of grey values can be estimated by counting the number of times each grey value occurs in the image and dividing those numbers by the total number of occurrences.

An image of a single intensity will have a low entropy; it contains very little information.

A high entropy value will be yielded by an image with more or less equal quantities of many different intensities, which is an image containing a lot of information.
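A minimal sketch of this grey-value entropy, using NumPy histograms on synthetic images (the arrays are illustrative placeholders, not the CT/MRI data set):

```python
import numpy as np

def image_entropy(img):
    """Entropy (bits) of an 8-bit image's grey-value distribution."""
    # Estimate p(g) by counting occurrences of each grey value.
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

flat = np.full((64, 64), 128, dtype=np.uint8)           # single intensity
rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # many intensities

print(image_entropy(flat))   # 0.0 bits: no uncertainty
print(image_entropy(noisy))  # close to 8 bits: near-uniform grey values
```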


Entropy[5/7]

Entropy[6/7]

Entropy[7/7]

Entropy has three interpretations:

1. The amount of information an event(message, grey value of a point) gives when it takes place,

2. The uncertainty about the outcome of an event, and

3. The dispersion of the probabilities with which the events take place.


Mutual Information[1/3]

Mutual information [1] measures the amount of information that can be obtained about one random variable by observing another.

It is important in communication where it can beused to maximize the amount of information sharedbetween sent and received signals.

A basic property of mutual information is that I(X;Y) = H(X) - H(X|Y).

Mutual information is the amount by which the uncertainty about Y decreases when X is given: the amount of information X contains about Y.

Mutual Information[2/3]

The second form of mutual information is most closely related to joint entropy.

I(X;Y) = H(X) + H(Y) - H(X,Y)

The -H(X,Y) term means that maximizing mutual information is related to minimizing joint entropy.

The advantage of mutual information over joint entropy per se is that it includes the entropies of the separate images.
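This relation can be sketched by estimating the marginal and joint grey-value histograms of two images and combining their entropies; the synthetic images below are placeholders for the registered/misaligned image pairs discussed here:

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability vector (zeros are skipped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(a, b, bins=32):
    # Joint grey-value histogram -> joint distribution p(x, y).
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

rng = np.random.default_rng(1)
img = rng.integers(0, 256, (64, 64))
shifted = np.roll(img, 5, axis=1)  # a misaligned copy of the same image

print(mutual_information(img, img))      # maximal: image with itself
print(mutual_information(img, shifted))  # lower for the misaligned pair
```

In registration, the transformation that maximizes this measure over the overlap region is taken as the best alignment.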


Mutual Information[3/3]

The final form of mutual information is based on the Kullback-Leibler distance (KLD) [2]:

I(X;Y) = ∑_{x,y} p(x,y) log( p(x,y) / (p(x) p(y)) ) = ∑_{x,y} p(x,y) SI(x,y)

where SI (specific mutual information) is the pointwise mutual information.

The interpretation of this form is that it measures the distance between the joint distribution of the images' grey values, p(x,y), and the joint distribution in case of independence of the images, p(x)p(y).

It is a measure of dependence between two images. If the test images are well registered, then the value of the KLD becomes small or equal to zero [17].
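As a small numerical check, the KLD form gives the same value as I(X;Y) = H(X) + H(Y) - H(X,Y); the 2×2 joint distribution below is a hypothetical example, not taken from the slides:

```python
import numpy as np

# Hypothetical joint distribution of two binary "images" (illustrative).
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px = pxy.sum(axis=1)   # marginal p(x)
py = pxy.sum(axis=0)   # marginal p(y)

# KLD form: sum over grey-value pairs of p(x,y) * log2(p(x,y) / (p(x)p(y)))
kld = sum(pxy[i, j] * np.log2(pxy[i, j] / (px[i] * py[j]))
          for i in range(2) for j in range(2))

# Entropy form: H(X) + H(Y) - H(X,Y)
H = lambda p: -np.sum(p * np.log2(p))
ent_form = H(px) + H(py) - H(pxy.ravel())

print(abs(kld - ent_form) < 1e-9)  # True: the two forms agree
```

For independent images, p(x,y) = p(x)p(y) and every log term vanishes, so the KLD (and hence the mutual information) is zero.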


Tools:

OpenCV (Open Source Computer Vision) 2.3/2.4.2

Microsoft Visual Studio 2008/2010 Express

CMake (cross-platform make) 2.8

Data Set

512 × 512 CT and MRI images


Conclusion

Image similarity is found based on entropy and mutual information.

The entropy of an image is a measure of the amount of uncertainty in its grey values.

Mutual information measures the amount of information that one image provides about another.

When entropy is high, image similarity is low; when mutual information is maximal, image similarity is high.

References

1. B. Zitova, J. Flusser, "Image Registration Methods: A Survey", Image and Vision Computing, vol. 21, no. 11, pp. 977-1000, 2003.

2. J. P. W. Pluim, J. B. A. Maintz, M. A. Viergever, "Mutual Information Based Registration of Medical Images: A Survey", IEEE Transactions on Medical Imaging, vol. XX, issue Y, 2003.

3. R. M. Gray, "Entropy and Information Theory", Springer-Verlag, New York, 2009.

4. F. P. M. Oliveira, J. M. R. S. Tavares, "Medical Image Registration: A Review", Computer Methods in Biomechanics and Biomedical Engineering, 2012.

5. A. Sotiras, C. Davatzikos, N. Paragios, "Deformable Medical Image Registration: A Survey", Research Report no. 7919, September 2012.

6. M. V. Sruthi, K. Soundararajan, V. Usha Sree, "Accurate Multimodality Registration of Medical Images", vol. 1, issue 3, pp. 33-36, June 2012.

7. Q. Xie, S. Kurtek, G. Christensen, Z. Ding, E. Klassen, A. Srivastava, "A Novel Framework for Metric-Based Image Registration", WBIR 2012.

8. M. V. Wyavahare, P. M. Patil, H. K. Abhyankar, "Image Registration Techniques: An Overview", International Journal of Signal Processing, Image Processing and Pattern Recognition, vol. 2, no. 3, 2009.

9. J. B. A. Maintz, M. A. Viergever, "A Survey of Medical Image Registration", Medical Image Analysis, vol. 2, no. 1, pp. 1-37, 1998.

10. G. Junbin, G. Xiaosong, W. Wei, P. Pengcheng, "Geometric Global Registration of Parallax Images Based on Wavelet Transform", International Conference on Electronic Measurement and Instruments, pp. 2862-2865, 2007.

11. G. Hong, Y. Zhang, "Combination of Feature Based and Area Based Image Registration Technique for High Resolution Remote Sensing Image", International Geoscience and Remote Sensing Symposium, pp. 377-380, 2007.

12. A. Malviya, S. G. Bhirud, "Wavelet Based Image Registration Using Mutual Information", International Conference on Emerging Trends in Electronic and Photonic Devices and Systems (ELECTRO '09), pp. 245-244, 2009.

13. G. Hong, Y. Zhang, "Wavelet Based Image Registration Technique for High Resolution Remote Sensing Image", Journal of Computer Science and Geosciences, 2006.

14. H. Lin, P. Du, W. Zhao, L. Zhang, H. Sun, "Image Registration Based on Corner Detection and Affine Transformation", International Conference on Image and Signal Processing (CISP), pp. 2184-2188, 2010.

15. Y. Yamamura, H. Kim, J. Tan, S. Yamamura, Yamamoto, "A Method for Reducing Computational Time on Image Registration Employing Wavelet Transformation", International Conference on Control, Automation and Systems (ICCAS), pp. 1286-1291, 2007.

16. Y. Pei, H. Wu, J. Yu, G. Cai, "Effective Image Registration Based on Improved Harris Corner Detection", International Conference on Information Networking and Automation (ICINA), pp. V193-V196, 2010.

17. R. Gan, J. Wu, A. C. S. Chung, S. C. H. Yu, W. M. Wells, "Multiresolution Image Registration Based on Kullback-Leibler Distance", 7th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI '04), 2004.
