
Vector Nonlocal Mean Filter

by

Xin Qi

A thesis submitted in conformity with the requirementsfor the degree of Master of Science in Mathematics

Graduate Department of MathematicsUniversity of Toronto

© Copyright 2015 by Xin Qi


Vector Nonlocal Mean Filter

Xin Qi

Master of Science in Mathematics

Graduate Department of Mathematics

University of Toronto

2015

Abstract.

We give a new algorithm for image denoising that combines ideas from vector diffusion maps and the nonlocal mean filter. We also demonstrate its performance in comparison to the standard nonlocal mean and median filters.



Acknowledgements

I would like to thank Prof. Hau-tieng Wu for introducing me to the topic of image denoising

and for patiently guiding me through this thesis. I am also grateful to the departmental staff

and fellow students, especially Jemima Merisca and Khoa Pham, who made my stay here at the

University of Toronto as welcoming and comfortable as it was.



Contents

List of Tables

List of Figures

1 Introduction
1.1 Background
1.2 Error/noise measurements
1.3 Graph Laplacian and graph connection Laplacian

2 Nonlocal and vector nonlocal filters
2.1 Local mean and local median filters
2.2 Nonlocal filters
2.2.1 Nonlocal mean filter
2.2.2 Nonlocal median filter
2.3 Vector Nonlocal Mean (VNLM) filter and algorithm
2.4 Variants of NLM and VNLM algorithms
2.5 Computation difficulties

3 Results
3.1 The setup
3.2 Performance test over 100 images
3.3 Detailed analysis of two examples
3.4 Brief explanation on the choice of parameters
3.5 Two more examples

4 Discussion and future directions
4.1 Future directions

Appendices
A Algorithm variants
B Exact image names
C Tables of individual results
C.1 SNR of 100 images
C.2 PSNR of 100 images
C.3 Earth Mover's Distance of 100 images
C.4 Comparison between norm search method and search window method
D Performance time over 100 images

Bibliography



List of Tables

3.1 Summary of SNR performance of various algorithms over 100 images
3.2 Summary of PSNR performance of various algorithms over 100 images
3.3 Paired t-test results on the SNR and PSNR samples
3.4 Summary of EMD performance of various algorithms over 100 images
3.5 Summary of performance time of various algorithms over 100 images
3.6 Summary of SNR/PSNR comparison between norm search and search window methods
3.7 Summary of EMD comparison between norm search and search window methods
3.8 Signal-to-Noise Ratio of two images
3.9 Peak Signal-to-Noise Ratio of two images
3.10 Earth Mover's Distance between U − Ū and the noise
3.11 Performance time of the various algorithms for two images
B.1 File names of the 100 images used
C.1 SNR of various algorithms over 100 images
C.2 PSNR of various algorithms over 100 images
C.3 Earth Mover's Distance of various algorithms over 100 images
C.4 SNR comparison between norm search and search window methods
C.5 PSNR comparison between norm search and search window methods
C.6 EMD comparison between norm search and search window methods
D.1 Performance time of the various algorithms over 100 images



List of Figures

2.1 Comparison of local mean and median filters
2.2 Comparison of nonlocal mean and median filters
2.3 Motivation for rotational invariant distance
3.1 Comparison of vector nonlocal mean and standard nonlocal mean and median filters - Cameraman image
3.2 Comparison of vector nonlocal mean and standard nonlocal mean and median filters - Lena image
3.3 Histograms of residue - Cameraman Image
3.4 Histograms of residue - Lena Image
3.5 Histograms of method noise - Cameraman Image
3.6 Histograms of method noise - Lena Image
3.7 A stronger performance of VNLM - Image 1
3.8 A stronger performance of NLM - Image 3
4.1 Example of a dilation



Chapter 1

Introduction

Image denoising has always been an important part of signal processing, especially in today's digitized world. Local filters are among the earliest denoising methods; they use only information from neighbouring pixels, on the premise that spatial locality implies similarity.

However, what local filters fail to take into account is the nonlinear structure inside the image, where pixels spatially far apart can share similarities. With this idea in mind, Buades et al. proposed the nonlocal mean filter [BC05]. The original nonlocal mean filter focused on using single pixels to denoise, but in [SSN09] the algorithm moved on to cutting the image into overlapping patches of much smaller size and using those to build the filter. If two patches are similar, then the center point of one patch is used to denoise the other. Furthermore, the work in [CS12] has shown that using the Euclidean median instead of the mean results in a filter that is more robust to noise and thus produces better results.

In general, a large data set contains nontrivial relationships between its data points. For example, consider any three-dimensional shape: by fixing a point of reference, two rotations of the shape can look drastically different, yet to the human eye they are clearly the same object. This means that the dimensionality of the features is high, so there is a need for a metric that takes relationships like these into account. We tackle this problem by using a distance that is rotationally invariant.

We will also need some tools from spectral graph theory, namely vector diffusion maps (VDM) and the graph connection Laplacian (GCL). VDM and GCL are especially useful for feature extraction and dimension reduction; for example, they allow the extraction


of the true textures from images [SW12].

In this thesis, we attempt to push the power of the nonlocal mean filter further by incorporating the ideas of VDM/GCL and a rotation-invariant distance to better identify similar patches and thus improve the denoised results.

1.1 Background

A grayscale image I_{m×n}, n pixels wide and m pixels long, can be viewed as an m × n matrix whose entries are the pixel values at the respective spatial locations. We model a noisy image U as follows:

U(i, j) = I_{m×n}(i, j) + N(i, j),    (1.1)

where N is the noise matrix satisfying E[N(i, j)] = 0 and Cov(N(i, j), N(h, l)) = δ_{ih} δ_{jl} σ², σ ≥ 0, and I_{m×n} is the underlying clean image.

Within this thesis, we focus on the additive Gaussian white noise (AGWN) model, which models the effects of many random processes observed in nature. AGWN is a collection of independent and identically distributed (i.i.d.) normal random variables with zero mean, added to the original signal.

1.2 Error/noise measurements

Let U be as described above. Our goal is to find a denoising operator A so that when A acts on U, AU is a good estimator of I_{m×n}. The idea is that A should not change the original image I_{m×n}, and thus the method noise∗ U − AU should be the noise that was applied to I_{m×n}. Thus, if the denoising operator performs well, this difference should look like noise and have little to no discernible structure.

To better see the results of our algorithm, we take a clean image, which we pre-process by

standardizing the pixel values such that the image has zero mean and unit variance, and add

AGWN with variance σ². One important notion to quantify the relationship between the clean

∗A term used by Buades et al. in [BC05]


image and the noise is the signal-to-noise ratio (SNR), given in decibels. SNR is computed as follows:

SNR = 20 log₁₀(σ_signal / MSE),

where

σ_signal = √( (1/(mn)) Σ_{i=1}^{mn} (I_{m×n}(i) − E(I_{m×n}))² ),    MSE = (1/(mn)) Σ_{i=1}^{mn} (U(i) − I_{m×n}(i))²,

U is the denoised (or noisy) image, σ_signal is the standard deviation of the clean image, and MSE is the mean squared error between the denoised and the original image.

We also have the notion of the peak signal-to-noise ratio (PSNR), also given in decibels. PSNR is computed as follows:

PSNR = 20 log₁₀(MAX²_signal / MSE),

where MAX_signal := max_{i,j} |I_{m×n}(i, j)|, i.e. the absolute maximum signal value.

We use both SNR and PSNR because they describe two slightly different aspects of the relationship between the result and the original image. The SNR gives us a sense of how strong the signal and the noise are, but if the image is rather homogeneous, the SNR is not very informative. The PSNR is much more content dependent: it gives us a sense of how well the high-intensity regions of the image come through the noise, i.e. the contrast. Since a denoising filter can adjust the contrast of the image, the PSNR can be rather helpful in demonstrating the performance of the various denoising filters. In either case, the strict numerical value of the SNR/PSNR does not necessarily tell us how well a denoising method performed, but both are nevertheless helpful indicators of performance.
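As a concrete reference for how these two scores are computed, here is a minimal NumPy sketch following the formulas above (including this section's convention of placing the MSE, rather than its square root, in the denominator); the function names are illustrative.

```python
import numpy as np

def snr_db(clean, denoised):
    """SNR in decibels, following the definition above."""
    sigma_signal = np.std(clean)               # standard deviation of the clean image
    mse = np.mean((denoised - clean) ** 2)     # mean squared error
    return 20 * np.log10(sigma_signal / mse)

def psnr_db(clean, denoised):
    """PSNR in decibels, following the definition above."""
    max_signal = np.max(np.abs(clean))         # absolute maximum signal value
    mse = np.mean((denoised - clean) ** 2)
    return 20 * np.log10(max_signal ** 2 / mse)
```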

We will also use the Earth Mover's Distance (EMD) to measure performance. EMD (also known as the optimal transport distance) computes the minimal amount of work required to transform one distribution into another [RTG98a, RTG98b, Vil03, LB01]. We will apply the one-dimensional case of this metric to compare how close our denoised results are to the original image.
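For the one-dimensional case used here, the EMD between two empirical distributions can be computed directly with SciPy's 1-D Wasserstein distance; the sketch below is illustrative and assumes the two inputs are arrays of pixel values.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def emd_1d(a, b):
    """1-D Earth Mover's Distance between the empirical distributions
    of the pixel values in a and b (e.g. method noise vs. added noise)."""
    return wasserstein_distance(np.ravel(a), np.ravel(b))
```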

Aside from measuring the error of our results, SNR/PSNR are also very useful in helping us measure the noise. Even though our experiments in later sections will assume prior knowledge of the noise structure and statistics, in practice we will not always have this information. Thus, by applying our filters and using the SNR/PSNR of our denoised result, we can obtain an estimate of the statistics of the noise (i.e. its mean, standard deviation, etc.) and adjust our filter parameters accordingly. This is a simple yet effective method of improving our filters.

1.3 Graph Laplacian and graph connection Laplacian

R. R. Coifman and S. Lafon introduced the idea of diffusion maps as a tool for dimension reduction and feature extraction in 2006 [CL06]. Diffusion maps form a general framework in which several algorithms can be understood; for example, if we allow the diffusion time to be zero, the resulting algorithm is a special case known as an eigenmap [BN01, BN03]. Since the definition of diffusion maps relies on ideas from graph theory, we first give some relevant definitions (for a more in-depth discussion concerning graph theory see [BM08]).

Definition 1.3.1. An undirected, simple graph G is a pair (V, E), where V is the set of vertices and E is the set of edges connecting the vertices. Specifically, E ⊆ {{u, v} : u, v ∈ V, u ≠ v}, and two vertices u, v ∈ V are connected if {u, v} ∈ E.

Definition 1.3.2. An undirected, simple graph G is complete if E = {{u, v} : u, v ∈ V, u ≠ v}.

Definition 1.3.3. An undirected, simple affinity graph Ga is a triple (V, E, w), where V, E are as defined above and w is a function from E to R≥0; that is, w assigns an affinity (or weight) to each edge.

Definition 1.3.4. An undirected, simple connection graph Gc is a quadruple (V, E, w, r), where (V, E, w) is an affinity graph and r is a function from E to G, where G is a matrix group.

The construction of w and r depends on the situation we are trying to model. For example, we will use the Gaussian kernel to define the affinity function. Specifically, in our example the dataset is a finite sample from the patch space (a formal definition will be provided when we present the algorithm) and

w_ij = exp(−‖x_i − x_j‖²/ε²),

where ε is a parameter chosen by the user.


There are several matrices we can associate to a graph Ga or Gc.

Definition 1.3.5. • The affinity matrix of a graph G with |V| = n is an n × n matrix W such that

W_ij = w({v_i, v_j}) if {v_i, v_j} ∈ E, and W_ij = 0 otherwise.

We will shorten w({v_i, v_j}) to w(i, j) for ease of notation.

• The degree function deg : V → R≥0 of a graph G with |V| = n is defined by

deg(v_i) = Σ_{j=1}^{n} W_ij

for each v_i ∈ V (this notion of degree is slightly different from the traditional one in graph theory). The degree matrix D is an n × n diagonal matrix such that D_ii = deg(v_i).

• The transition matrix of a graph G is A = D⁻¹W.

• Let Id be the identity matrix. The normalized graph Laplacian (GL) of a graph G is L = A − Id, while the unnormalized GL is L = D − W.

• For a connection graph Gc = (V, E, w, r) with |V| = n and matrix group G of d × d matrices, define the n × n block matrix S with d × d blocks such that

S[i, j] = w_ij r_ij if {i, j} ∈ E, and S[i, j] = 0 otherwise,

and define the n × n diagonal block matrix D with d × d blocks such that

D[i, i] = deg(v_i) Id_d,

where Id_d is the d × d identity matrix. The unnormalized graph connection Laplacian (GCL) is defined as

C′ := D − S,


and the normalized GCL is defined as

C := Id − D⁻¹S.
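The matrices of Definition 1.3.5 are straightforward to assemble once W (and, for a connection graph, the rotations r_ij) are given. The following NumPy sketch is illustrative, assuming a complete graph with strictly positive degrees; it is not the implementation used later in the thesis.

```python
import numpy as np

def graph_matrices(W):
    """Degree matrix D, transition matrix A = D^{-1}W, and the normalized
    (A - Id) and unnormalized (D - W) graph Laplacians of Definition 1.3.5."""
    deg = W.sum(axis=1)
    D = np.diag(deg)
    A = W / deg[:, None]
    return D, A, A - np.eye(len(W)), D - W

def graph_connection_laplacian(W, R, d):
    """Unnormalized (D - S) and normalized (Id - D^{-1}S) graph connection
    Laplacians. W is n x n, R is an (n, n, d, d) array with R[i, j] = r_ij."""
    n = W.shape[0]
    deg = W.sum(axis=1)
    S = np.zeros((n * d, n * d))
    Dblk = np.zeros((n * d, n * d))
    for i in range(n):
        Dblk[i*d:(i+1)*d, i*d:(i+1)*d] = deg[i] * np.eye(d)
        for j in range(n):
            S[i*d:(i+1)*d, j*d:(j+1)*d] = W[i, j] * R[i, j]
    return Dblk - S, np.eye(n * d) - np.linalg.solve(Dblk, S)
```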

Before we proceed further, we give some definitions concerning terms found in statistics so that we can interpret the transition matrix in a different light.

Definition 1.3.6. A finite, discrete-time Markov process of order m is a stochastic process

{Xt} which satisfies the Markov property of order m given below:

P(Xn+1 = xn+1|Xn = xn, . . . , X0 = x0) = P(Xn+1 = xn+1|Xn = xn, . . . , Xn−m+1 = xn−m+1).

That is, the probability of the (n + 1)-th step depends only on the last m steps, m < n + 1.

When m = 1, that is if the stochastic process satisfies the following

P(Xn+1 = xn+1|Xn = xn, . . . , X0 = x0) = P(Xn+1 = xn+1|Xn = xn),

we simply call it a Markov process.

For more intuition about Markov processes, consider the following example of a Markov

process given in [LPW09, p. 3]:

A certain frog lives in a pond with two lily pads, east and west. A long time ago,

he found two coins at the bottom of the pond and brought one up to each lily pad.

Every morning, the frog decides whether to jump by tossing the current lily pad’s

coin. If the coin lands heads up, the frog jumps to the other lily pad. If the coin

lands tails up, he remains where he is.

We see that there are two states the frog can be in, the east lily pad or the west, and, assuming the coin is fair, the only information relevant to determining where the frog will go is the lily pad he is currently on.

Thus, a probabilistic interpretation of the transition matrix A is to imagine a random walk on the vertices of G: the (i, j)-th entry of A can be viewed as the probability of going from v_i to v_j in one step. Then A is the forward transition matrix of a finite, discrete-time Markov process on G, i.e. (A^k)_ij is the probability of going to state j from state i in k steps. Notice that there can be several possible trajectories from state i to state j. Specifically, summing over all trajectories of the form (i, i₁, i₂, . . . , i_{k−1}, j),

(A^k)_ij = Σ_{i₁, i₂, . . . , i_{k−1}} A_{i i₁} A_{i₁ i₂} · · · A_{i_{k−1} j}.

Let us examine some properties of A = D⁻¹W. We will assume our affinity graph to be undirected, simple and complete (being complete means that the degree of any vertex is greater than zero), with w(i, j) = w(j, i). A is not symmetric, but it is similar to D^{−1/2}WD^{−1/2}, which is symmetric and can therefore be diagonalized by the spectral theorem. Thus there exist an orthogonal matrix U and a diagonal matrix Λ such that D^{−1/2}WD^{−1/2} = UΛUᵀ. Denoting D^{−1/2}U by Φ and D^{1/2}U by Ψ, we have that

A = ΦΛΨᵀ.

Specifically, Λ is the diagonal matrix of eigenvalues λ₁, . . . , λₙ of D^{−1/2}WD^{−1/2}, ordered such that |λ₁| ≥ · · · ≥ |λₙ|.

Definition 1.3.7. Let A = ΦΛΨᵀ be defined as above. Denote the i-th column of Φ by φ_i and take the diffusion time t > 0. Then the diffusion map (DM) is the function Φ_t : V → R^{n−1} defined by

Φ_t(i) = (λ₂ᵗ φ₂(i), λ₃ᵗ φ₃(i), . . . , λₙᵗ φₙ(i))ᵀ,

where i is a vertex in V. We are often interested in the truncated diffusion map (tDM) with threshold δ > 0, which is the projection of the diffusion map Φ_t onto its first m coordinates, where m is chosen to be the largest m such that

(λ_{m+1}/λ₂)^{2t} > δ  and  (λ_{m+2}/λ₂)^{2t} ≤ δ.

Notice that we start with φ₂; this is because one property of A is that the top eigenvector is always [1/n, . . . , 1/n]ᵀ and thus provides no useful information. There is much more to diffusion maps than what is described here, such as the diffusion distance and the role of the diffusion time; more information can be found in [CL06, CKL+08].
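For concreteness, the following NumPy sketch computes the truncated diffusion map of Definition 1.3.7 from a symmetric affinity matrix; it assumes nonnegative eigenvalues (as with the Gaussian kernel affinities used here) so that λᵗ is well defined, and the function name and defaults are illustrative.

```python
import numpy as np

def truncated_diffusion_map(W, t=1.0, delta=0.05):
    """Truncated diffusion map: coordinates lambda_k^t * phi_k(i), k = 2, ..., m+1,
    keeping those with (lambda_k / lambda_2)^{2t} > delta."""
    deg = W.sum(axis=1)
    Dmh = np.diag(1.0 / np.sqrt(deg))           # D^{-1/2}
    lam, U = np.linalg.eigh(Dmh @ W @ Dmh)      # spectral theorem on D^{-1/2} W D^{-1/2}
    order = np.argsort(-np.abs(lam))            # |lambda_1| >= ... >= |lambda_n|
    lam, U = lam[order], U[:, order]
    Phi = Dmh @ U                               # right eigenvectors of A = D^{-1} W
    ratios = (np.abs(lam[1:]) / np.abs(lam[1])) ** (2 * t)
    m = int(np.sum(ratios > delta))             # number of coordinates retained
    return (lam[1:1 + m] ** t) * Phi[:, 1:1 + m]
```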

There is also a manifold setup and interpretation of diffusion maps. For example, consider a set X of n i.i.d. data points on a d-dimensional manifold M^d which is smoothly embedded into R^p with p > d. We can view X as a complete affinity graph and define the affinity function by the Gaussian kernel. The diffusion map Φ_t embeds X into a Euclidean space where the Euclidean distance between any two embedded points is equal to their diffusion distance D_t [CL06]. More information can be found in [CL06, CLL+05, CKL+08, NLCK05, NLCK08].


Chapter 2

Nonlocal and vector nonlocal filters

In this chapter, we will start with a brief summary of existing methods before introducing our

proposed method of vector nonlocal mean filters.

2.1 Local mean and local median filters

Consider an image corrupted by AGWN modelled by (1.1). One of the earliest approaches

in image denoising is to recover the original image by taking advantage of the information of

neighboring pixels in determining the value of each individual pixel.

The local mean filter algorithm does exactly this. For the (i, j)-th pixel p, it sets up a neighbourhood window of radius r centered at p and computes the weighted mean of all pixels within this window. The mean is then assigned as the denoised value p′ at the (i, j)-th location of the image. Mathematically, given U(i, j) = I_{m×n}(i, j) + N(i, j),

Ū_ij = (1/|N_ij|) Σ_{(l,k)∈N_ij} w_{l,k} U_{l,k},    (2.1)

where N_ij := {U_{l,k} | i − r ≤ l ≤ i + r, j − r ≤ k ≤ j + r} for some odd r ∈ N_{>0} is the neighbourhood window, |N_ij| is its cardinality, w_{l,k} is the chosen weight, and Ū is the denoised image [BT88, p. 32-34] [Dav04, Chapter 3] [Ver91, Chapter 4]. In practice, the

weight is determined by some chosen kernel such as the Gaussian. By the approximation of the

identity and the estimation of the variance, we can determine an optimal kernel based on the


circumstances. This idea is similar to the kernel regression model and more details can be found

in [SW13].

Similarly, the local median filter computes the median within this window and assigns it as the denoised value. That is,

Ū_ij = Median{U_{l,k} | U_{l,k} ∈ N_ij},

where again Ū is the denoised image and N_ij the neighbourhood window [BT88, p. 32-34] [Dav04, Chapter 3] [Ver91, Chapter 4] [Mar91, p. 274].

The following figure shows the effectiveness of local mean and median filters on an image

corrupted with AGWN of mean 0 and variance 0.1. We see that the median filter was able

to give sharper results along the boundaries/edges as can be seen in the feather region of the

denoised images.

Figure 2.1: Comparison of local mean and median filters. (a) Original image; (b) noisy image, SNR: 10.0816; (c) local mean denoised, SNR: 13.3318; (d) local median denoised, SNR: 13.5946.
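For reference, a minimal sketch of the two local filters using SciPy; it uses equal weights for the local mean (a special case of (2.1)) and an illustrative window radius, not the exact settings behind Figure 2.1.

```python
import numpy as np
from scipy.ndimage import uniform_filter, median_filter

def local_mean_denoise(noisy, r=2):
    """Local mean filter: each pixel becomes the (unweighted) mean of its
    (2r+1) x (2r+1) neighbourhood window."""
    return uniform_filter(noisy, size=2 * r + 1, mode='reflect')

def local_median_denoise(noisy, r=2):
    """Local median filter: each pixel becomes the median of its window."""
    return median_filter(noisy, size=2 * r + 1, mode='reflect')
```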


2.2 Nonlocal filters

Before we begin talking about nonlocal filters, we will first define the ideas of patch and patch

space so that we have some framework to build on.

Definition 2.2.1 (Continuous patch space, patch). Let f ∈ L²(R²). Define the continuous patch space

P_I := {P_x = f|_{B_I(x)} : x ∈ R²} ⊂ L²(I²),

where B_I(x) is the box centered at x with radius I > 0, i.e.

B_I(x) := {y ∈ R² : |y₁ − x₁| ≤ I, |y₂ − x₂| ≤ I}.

A patch P is an element of P_I.

In practice, every image we deal with is discrete, so we have the following definition.

Definition 2.2.2 (Discrete patch space). For an image U ∈ R^{n×n}, the discrete patch space is

P_{(m−1)/2} := {P_ij | i, j = 1, . . . , n} ⊂ R^{m×m},

where

P_ij := {U(l, h) : |l − i| ≤ (m − 1)/2, |h − j| ≤ (m − 1)/2},

m is odd, and (m − 1)/2 serves as the discrete counterpart to I in Def. 2.2.1.

2.2.1 Nonlocal mean filter

The nonlocal mean algorithm (NLM) proposed by Buades et al. is rather simple and works in the following way. For a noisy image U = [u_i]_{i=1}^{n},∗ to denoise pixel u_i we use a weighted average of all the pixels in the image. Denoting by NL(u_i) the denoised value of u_i under NLM,

NL(u_i) = Σ_{k=1}^{n} w(u_i, u_k) u_k,    (2.2)

∗To ease notation, we will use a single index to represent the 2-dimensional image.


where w is the Gaussian kernel applied to the Euclidean distance between the points u_i and u_k. Notice the difference between (2.2) and (2.1): in (2.1) we consider the geometric relationship on the grid between two pixels, while in (2.2) we consider the relationship between the values of the function defined on the grid.

This relationship between diffusion maps and the nonlocal mean algorithm is developed further in [SSN09]. To link NLM to DM, here is a summary. For a noisy image U = [u_i]_{i=1}^{n}, define

K_ε(u_i, u_k) := K(d(P_i, P_k)/√ε),

where ε > 0 is the bandwidth, K is the chosen kernel, d(P_i, P_k) is the chosen distance in the patch space, and D(u_i) is the normalization factor

D(u_i) = Σ_{k=1}^{n} K_ε(u_i, u_k).

The denoised pixel is given by

NL(u_i) = (1/D(u_i)) Σ_{k=1}^{n} K_ε(u_i, u_k) u_k,    (2.3)

where K_ε is the Gaussian kernel as mentioned earlier. Thus, in the case of (2.3), w(u_i, u_k) = K_ε(u_i, u_k)/D(u_i).

We see that if u_j is similar to u_i, then K_ε(u_i, u_j) is closer to 1 and the value of u_j contributes more to the averaging. Notice also that in (2.3) we do not care about the spatial proximity of u_j to u_i, hence the name of the algorithm. Furthermore, compared to (2.2), (2.3) uses patches as opposed to just pixels through how K_ε is defined.

(2.3) can be rewritten in matrix form. Let W be an n × n matrix with W_ij = K_ε(u_i, u_j), and let D be an n × n diagonal matrix with D_ii = Σ_{k=1}^{n} W_ik. Then the denoising operator is A = D⁻¹W, and the denoised image is Ū = AU (where we vectorize U in the case of 2D images; see the vectorization step below).

A more detailed analysis in [Chu97] shows that A has a complete set of right eigenvectors with eigenvalues 1 = λ₀ ≥ λ₁ ≥ · · · ≥ λ_{N−1} > 0, where the eigenvector of λ₀ is the vector of all ones. In fact, if the graph is connected, then we have the strict inequality λ₀ > λ₁. A graph G is connected if for any pair of vertices u, v there exists a sequence of edges {{u, u₁}, {u₁, u₂}, . . . , {u_{k−1}, u_k}, {u_k, v}}.


Consequently, the transition matrix A of G is irreducible. Thus, if we were to apply an infinite iteration of A to U, the limiting behaviour would be a constant signal of value b₀, where b₀ is the coefficient of the first eigenvector in the expansion of U in the eigenbasis. In [SSN09], Singer emphasizes the importance of iterating the denoising operator: the idea is that, given a sufficient number of iterations, we reduce the noise but not so much as to blur the image. Precisely, we could use the filter 2A − A², as suggested by Ronald Coifman. In view of diffusion maps, this filter shares the same eigenvectors as A, but its eigenvalues 2λ − λ² suppress the larger eigenvalues of A much less while suppressing the smaller ones much more.

Below, we give a detailed breakdown of the algorithm in pseudo-code for implementation in programs such as MATLAB®.

Algorithm 1 Standard NLM

Input: Noisy image U = (u_i), parameters p, ε.
Output: Denoised image Ū = (ū_i)
  Pad the image array with a border of ceiling(p/2) pixels.
  Create a patch P_i of size p × p centered at each u_i.
  for every u_i do
    For each pixel u_k, set W_{i,k} = exp(−‖P_i − P_k‖²/ε²).
  end for
  Define the diagonal matrix D such that D_{i,i} is the sum of the i-th row of W.
  Ū = D⁻¹W U

Now we make the algorithm precise. Let U_{m×n} be the noisy image of size m × n, p the patch size, P_p the patch space, and ε the Gaussian kernel width. First, we vectorize the 2-dimensional image U_{m×n} in the following way:

1. The original image is the m × n matrix

   U_{m×n} = ( p_{1,1} · · · p_{1,n} ; . . . ; p_{m,1} · · · p_{m,n} ),

   with rows separated by semicolons.

2. The image reshaped into a vector is

   ~U_{mn×1} = ( p′₁, p′₂, . . . , p′_n, p′_{n+1}, . . . , p′_{mn} )ᵀ,

   where ~U_{mn×1}((k − 1)n + l) = U_{m×n}(k, l). We will follow this indexing convention.

We pad the array U such that every pixel of U can be the center of a p × p patch. We then build the affinity matrix W such that

W_ij = exp(−‖P_i − P_j‖²/ε²),

where ε = 10σ_noisy is chosen according to [BC05, p. 64] and i, j = 1, . . . , mn, where mn is the number of patches, i.e. mn = |P_p|.

We normalize the affinity matrix W using the diagonal matrix D, where

D_ij = Σ_{k=1}^{mn} W_ik if i = j, and D_ij = 0 otherwise,

and mn is the number of rows (or, equivalently, columns) of W. Thus D has the sums of the rows of W down its diagonal.

We build the denoising operator A := D⁻¹W. We apply A to the vectorized ~U to obtain the denoised image Ū, which we reshape back into a matrix of the original dimensions. In practice, it is very expensive to compare every pair of patches, so we implement a threshold such that we only use the N patches most similar to the patch we are working with for the denoising (for more details see Alg. 3 in Appendix A).
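The steps above translate into the following brute-force NumPy sketch (padding, patch extraction, Gaussian affinities, row normalization, optional 2A − A² iteration). It is quadratic in the number of pixels in both memory and time and therefore only suitable for small images; the parameter defaults echo the setup of Chapter 3, but the code itself is illustrative, not the MATLAB implementation used for the experiments.

```python
import numpy as np

def nlm_denoise(U, p=9, eps=None, use_2A_minus_A2=False):
    """Brute-force NLM: build W_ij = exp(-||P_i - P_j||^2 / eps^2) over all
    p x p patches, normalize by row sums, and average the pixel values."""
    m, n = U.shape
    r = p // 2
    Upad = np.pad(U, r, mode='reflect')
    patches = np.array([Upad[i:i + p, j:j + p].ravel()
                        for i in range(m) for j in range(n)])
    if eps is None:
        eps = 10 * U.std()                      # epsilon = 10 * sigma, as in Chapter 3
    sq = (patches ** 2).sum(axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2 * patches @ patches.T, 0.0)
    W = np.exp(-d2 / eps ** 2)
    A = W / W.sum(axis=1, keepdims=True)        # A = D^{-1} W
    u = U.ravel()
    u_hat = 2 * (A @ u) - A @ (A @ u) if use_2A_minus_A2 else A @ u
    return u_hat.reshape(m, n)
```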


2.2.2 Nonlocal median filter

Similar to the local filter case, we can also apply the idea of using the median instead of the mean with the nonlocal filter, giving the nonlocal Euclidean median filter (NLEM). In [CS12], Chaudhury and Singer demonstrated that by using the Euclidean median instead of the Euclidean mean, the results are improved, as the median is more robust to noise. Specifically, for a noisy image U = [u_i]_{i=1}^{n}, consider its associated patch space P_p and kernel width ε. We first find the patch P_i around each pixel u_i and construct W as before for every patch. Then for each u_i we find the patch P_k which minimizes

Σ_{j=1}^{n} W_ij ‖P_k − P_j‖

and assign as the denoised value of u_i the center pixel of P_k. Here W_ij = exp(−‖P_i − P_j‖²/h²), with h the kernel width and ‖P_k − P_j‖ the unsquared distance; note that minimizing this ℓ1-type sum of distances is what yields the Euclidean median.
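A minimal sketch of this per-pixel median step, restricted to a precomputed set of candidate patches (e.g. a search window); the array layout and names are assumptions made for illustration, not Chaudhury's original code.

```python
import numpy as np

def nlem_pixel(i, patches, centers, h):
    """Restricted nonlocal Euclidean median for pixel i: among the candidate
    patches, return the center value of the patch P_k minimizing
    sum_j W_ij * ||P_k - P_j|| (unsquared distances)."""
    d2 = ((patches - patches[i]) ** 2).sum(axis=1)
    W = np.exp(-d2 / h ** 2)                    # weights relative to P_i
    costs = [np.sum(W * np.linalg.norm(patches - Pk, axis=1)) for Pk in patches]
    return centers[int(np.argmin(costs))]
```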

The subsequent figure illustrates the various denoising filters we have discussed so far. We standardize the image and then apply to it AGWN with mean 0 and variance 0.1.

Figure 2.2: Comparison of nonlocal mean and median filters. (a) Original image; (b) noisy image, SNR: 10.0394; (c) nonlocal mean denoised, SNR: 14.8145; (d) method noise of nonlocal mean denoised; (e) nonlocal mean denoised with 2A−A², SNR: 14.7554; (f) method noise of nonlocal mean denoised with 2A−A²; (g) nonlocal median denoised, SNR: 15.9026; (h) method noise of nonlocal median denoised.

2.3 Vector Nonlocal Mean (VNLM) filter and algorithm

In this section, we introduce our novel algorithm in the hope of improving the results we get from the NLM and NLEM algorithms. The general idea is based on taking a rotation-invariant distance (RID) into account. For example, consider the following two images:

(a) Original image (b) After 90 degree clockwise rotation

Figure 2.3: Motivation for rotational invariant distance

Despite the rotation, these two images depict the same object, and we would like to identify patches that fit this criterion and collect them to improve our denoising filters.


Algorithm 2 VNLM

Input: Noisy image U = (u_i), 1 ≤ i ≤ n, parameters p, N, ε, r.
Output: Denoised image Ū = (ū_i)
  Pad the image array with a border of ceiling(p/2) pixels.
  Create a patch P_i of size p × p centered at each u_i.
  for every u_i do
    for every patch P_k, 1 ≤ k ≤ n do
      for 1 ≤ j ≤ r do
        Rotate P_k by 360j/r degrees into P_{k,j} and compute ‖P_i − P_{k,j}‖².
      end for
      Let d_{i,k} = min_j ‖P_i − P_{k,j}‖².
      Set W_{i,k} = exp(−d_{i,k}/ε²).
    end for
  end for
  Define the diagonal matrix D such that D_{i,i} is the sum of the i-th row of W.
  Ū = D⁻¹W U

The algorithm for vector NLM (VNLM) differs from the NLM algorithm in exactly one step: building W. To improve the weight W_ij, we use the RID instead of the standard ℓ2 distance. The RID of patches P_i, P_j ∈ P_p is defined as

d_RID(P_i, P_j) = min_{O∈SO(2)} ‖P_i − O ∘ P_j‖,    (2.4)

where O is a rotation in the circle group SO(2) and O ∘ P_j denotes the action of O on P_j. Numerically, we take 360/θ rotations at uniform intervals of θ degrees and compute the Euclidean difference for each rotation. We then pick the smallest difference generated this way and use it in our kernel, i.e. d_RID is as defined in (2.4). Notice that, at worst, we produce the same W as the standard NLM algorithm. This should better identify similar patches by picking out patches that, when rotated, are much more similar.

One important consideration is the number of rotations we take, i.e. the method for choosing θ. Our experiments have shown that there is no large difference between rotating every degree and rotating every nine degrees, although the computational cost is much greater if we rotate more. Intuitively, we estimate that the rotation step should depend on the patch size. Since the images we are dealing with are discrete, we can compute the angle required to rotate one pixel one space over.
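A small sketch of the discretized rotation-invariant distance (2.4) using SciPy's image rotation; the interpolation and boundary handling here are illustrative choices and differ from the faster rotation code [Hag07] used in the experiments.

```python
import numpy as np
from scipy.ndimage import rotate

def rid(P_i, P_j, theta=9.0):
    """Discretized d_RID: rotate P_j in steps of theta degrees and keep the
    smallest Euclidean distance to P_i (at worst, the unrotated distance)."""
    return min(np.linalg.norm(P_i - rotate(P_j, angle, reshape=False, mode='nearest'))
               for angle in np.arange(0.0, 360.0, theta))
```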


2.4 Variants of NLM and VNLM algorithms

There are various ways we can vary either algorithm. First, since we will not know the standard deviation of the noise when working with actual data, the following kernel,

W_ij = exp(−‖P_i − P_j‖² / (d_{i,7} d_{j,7})),

where d_{k,7} is the Euclidean distance to the 7-th closest neighbour of patch k, performs much better, as it adapts to the individual image [ZP04].
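A sketch of this adaptive bandwidth, with the patch matrix layout assumed as in the earlier sketches; k = 7 follows the kernel above.

```python
import numpy as np

def adaptive_affinity(patches, k=7):
    """W_ij = exp(-||P_i - P_j||^2 / (d_{i,k} d_{j,k})), where d_{i,k} is the
    distance from patch i to its k-th closest neighbour (excluding itself)."""
    sq = (patches ** 2).sum(axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2 * patches @ patches.T, 0.0)
    d = np.sqrt(d2)
    d_k = np.sort(d, axis=1)[:, k]              # column 0 is the distance to itself
    return np.exp(-d2 / (d_k[:, None] * d_k[None, :]))
```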

Another option is to use the above kernel and perform NLM; from the denoised image we can then estimate the variance of the noise σ²_noisy, and subsequently apply NLM and VNLM with kernel width ε = 10σ_noisy.

Finally, Chaudhury and Singer's work on NLEM has demonstrated that using the Euclidean median is often better than using the Euclidean mean, since the median is more robust to noise [CS12]. We can also adapt NLEM with rotation in mind, and we give the algorithms for both NLEM and Vector NLEM at the end of this section. Again, we stress that the difference between standard NLEM and Vector NLEM is that Vector NLEM uses the rotation-invariant distance instead of the ℓ2 distance in assessing the weights between patches.

2.5 Computation difficulties

As expected, rotation is expensive in terms of computation and speed, and by accounting for rotations we significantly increase the number of computations we perform. There has been work towards fast rotation, such as the work of Zhao et al. on Fast Steerable Principal Component Analysis (FsPCA) in [ZS13, ZSS14]. Zhao's work is specifically tailored to images used in cryo-electron microscopy (cryo-EM) [ZS14]. Their algorithm takes the image into Fourier space, performs the rotation there (which is much simpler computationally), and then transforms it back into real space. We cannot necessarily use this method, as the images that Zhao works with all have compact support, which we cannot assume for the patches we are working with.

Even though we cannot use the methods proposed by Zhao, we do have a few modifications


to the algorithm that can cut down the computation times. What we are currently considering in an effort to reduce computation time is to impose a search window around the pixel (i.e. we only consider patches within a specific distance of the pixel we are denoising) and/or to impose a threshold such that we only use patches that are sufficiently similar when performing the averaging (for example, the nearest 150 patches). We implement these ideas in our proposed algorithm and compare their individual performances while discussing their advantages and disadvantages.

The first method for reducing the amount of computation is, instead of computing the Euclidean distance between a patch and all possible patches, to limit the algorithm to computing the distance to every patch within some search window, i.e. given P_i centered at u_i, define the search window centered at u_i to be S_i := {P_j | u_j ∈ B_r(u_i), r ∈ N_{>0}, with respect to the ℓ∞ norm} ⊂ P_p, as given in [BC05, CS12]. However, notice that by doing this we are no longer able to use patches outside the search window, and so we inevitably lose certain patches that are similar but spatially far from the patch we are currently computing. Thus we are no longer considering the global picture and lose out on some of the advantages of using NLM. We call this the search window method.

To regain the advantage of NLM that is lost using search windows, we suggest the following method, which uses characteristics of norms. We evaluate the ℓ2 norm of all patches and, for each patch, identify potentially similar patches based on their ℓ2 norm. Since rotation preserves norms, two rotationally similar patches have similar norms (of course, we concede that due to numerical limitations this will not hold exactly when performing the algorithm in the real world, but the approach is still valid); by identifying potential patches in this way, we save computational time and still take advantage of the nonlocal nature of repeating patterns in images. We call this method the norm search method. We note that although we would like W to be symmetric, it might not be in this case, as the number of patches is (in general) much larger than the number of similar patches we are looking for.
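The norm search step itself is cheap; a sketch of the candidate selection (the 150-candidate default mirrors the neighbour count used in Chapter 3, and the helper name is illustrative):

```python
import numpy as np

def norm_search_candidates(patches, i, n_candidates=150):
    """Since rotation preserves the l2 norm, only patches whose norms are close
    to ||P_i|| are worth the expensive rotation-invariant comparison."""
    norms = np.linalg.norm(patches, axis=1)
    return np.argsort(np.abs(norms - norms[i]))[:n_candidates]
```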


Chapter 3

Results

3.1 The setup

We set up our experiment with the following parameters: the patch size is 9 × 9, the variance of the noise is σ²_noisy = 0.1, the Gaussian kernel parameter is ε = 10σ_noisy, the number of neighbours (if needed) is 150, and a patch size of 7 × 7 (9 × 9) with a search window radius of 10 (20) is used for the traditional NLEM (VNLEM) method.

We implemented all algorithms in MATLAB® on two Linux machines: Sphere, with an Intel® Xeon® CPU with twelve threads at 2.20 GHz and 188 GB of RAM, and Ganita, with twelve threads at 2.40 GHz and 62 GB of RAM. We also use the original MATLAB® code for NLEM [Cha13] provided by Chaudhury based on his work in [CS12], keeping his original parameters exactly as provided.

We import an image into MATLAB®, converting it to grayscale if necessary. We then standardize the image so that it has zero mean and unit variance. We add AGWN with σ² = 0.1 to the image and use ε = 10σ as the Gaussian kernel width. Note that here we are assuming knowledge of σ. However, we would again point out that, in practice, we will not have this information, and a good method is to use SNR/PSNR to estimate the noise statistics as detailed at the end of Sec. 1.2.

We perform the algorithms proposed in the previous chapter, paying attention to which distance we use for neighbours (ℓ2 distance vs. rotation-invariant distance), whether we use the norm search method or the search window method, and how the regression is performed (kernel regression vs. quantile regression). We are currently implementing a rotation angle of 9 degrees. We use


vs. quantile regression). We are currently implementing a rotation angle of 9 degrees. We use

the code provided by Hagi available online from [Hag07] to rotate our patches more efficiently.

3.2 Performance test over 100 images

We perform the NLM, NLM with 2A−A², VNLM, VNLM with 2A−A², and NLEM algorithms on 100 images taken from [TGR+11] and present the summary results of the SNR and PSNR below. We keep all parameters the same as mentioned in the setup. Additionally, we used bilinear interpolation to resize each image to 200 × 300 pixels before cropping it to 200 × 200 by removing the excess on the right side. For tables of the individual results as well as the exact names of the images experimented on, please see the appendix.

Summary results of SNR of proposed algorithms over 100 images

                      Noisy Image   NLM      VNLM      NLM with 2A−A²   VNLM with 2A−A²   NLEM
Mean                  10.0719       9.8355   11.5159   11.3867          12.0549           3.6394
Standard Deviation    0.0000        2.8432   2.5489    1.7800           2.4223            4.0454

Table 3.1: Summary of SNR performance of various algorithms over 100 images

Summary results of PSNR of proposed algorithms over 100 images

                      Noisy Image   NLM       VNLM      NLM with 2A−A²   VNLM with 2A−A²   NLEM
Mean                  18.8093       18.5944   20.2681   20.1583          20.8165           12.4548
Standard Deviation    2.3682        2.6531    2.6459    2.1649           2.5028            3.3512

Table 3.2: Summary of PSNR performance of various algorithms over 100 images

Paired t-test comparison of methods

         NLM / VNLM    NLM with 2A−A² / VNLM with 2A−A²    NLEM / VNLM    NLEM / VNLM with 2A−A²
SNR      1.2242e-30    4.4524e-8                           5.8670e-27     1.3791e-41
PSNR     3.9001e-30    7.5460e-8                           1.5873e-32     2.0904e-47

Table 3.3: Paired t-test results on the SNR and PSNR samples

We perform paired t-tests on NLM/VNLM, NLM with 2A−A²/VNLM with 2A−A², NLEM/VNLM, and NLEM/VNLM with 2A−A² at the 5% significance level for both the SNR and PSNR results. We see that in both cases the t-test concludes that there is a significant difference between the methods.
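The paired t-tests themselves can be reproduced with SciPy given the per-image score arrays; a minimal sketch (the variable names are placeholders for the 100-image SNR or PSNR samples):

```python
from scipy.stats import ttest_rel

def paired_test(scores_a, scores_b, alpha=0.05):
    """Paired t-test on per-image scores over the same 100 images; returns the
    p-value and whether the difference is significant at level alpha."""
    t_stat, p_value = ttest_rel(scores_a, scores_b)
    return p_value, p_value < alpha
```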

Summary results of Earth Mover's Distance of proposed algorithms over 100 images

                      NLM         VNLM        NLM with 2A−A²   VNLM with 2A−A²   NLEM
Mean                  9.7126e-3   7.5127e-3   6.5182e-3        1.4732e-2         4.1521e-2
Standard Deviation    4.8460e-3   6.4739e-3   4.6098e-3        9.3723e-3         2.7529e-2

Table 3.4: Summary of EMD performance of various algorithms over 100 images

We see that, in general, VNLM and its variants tend to perform better than NLM and its variants. However, they take much longer to run than the traditional methods. Of course, the current implementation of VNLM/VNLEM can still be improved in terms of speed, but it will still take longer than the traditional methods due to the extra number of computations we must do.

Below we give a summary of the performance time of the various algorithms, run with nearest neighbour search for NLM/NLM with 2A−A², the norm search method for VNLM/VNLM with 2A−A², and the search window method for NLEM, on the two machines Ganita and Sphere.

Performance time in minutes

                      Ganita                         Sphere
                      NLM      VNLM       NLEM      NLM      VNLM       NLEM
Mean                  2.7526   83.9904    23.0525   4.0540   101.3900   25.1720
Standard Deviation    0.9710   0.3237     4.2646    1.3375   0.5391     1.5940

Table 3.5: Summary of performance time of various algorithms over 100 images

Finally we give a comparison of the first 10 images using the search window method (which

takes much longer than the norm search method). We apply the exact same noise to the images

and compare the SNR, PSNR, and EMD of the results obtained from both algorithms over the

10 images and present them below.

We see that the norm search method tends to perform better than the search window method.


SNR/PSNR comparison of norm search (NS) and search window (SW) methods

                                 VNLM                   VNLM with 2A−A²
SNR                   Noisy      NS         SW          NS          SW
Mean                  10.0719    11.417     9.5443      12.2136     10.9935
Standard Deviation    0          1.7565     2.6049      1.8942      2.5539

PSNR                  Noisy      NS         SW          NS          SW
Mean                  18.1942    19.6959    17.9743     20.5368     19.4054
Standard Deviation    2.6419     2.1386     2.6901      2.3103      2.6889

Table 3.6: Summary of SNR/PSNR comparison between norm search and search window methods

EMD comparison of norm search (NS) and search window (SW) methods

                      VNLM                      VNLM with 2A−A²
                      NS          SW            NS          SW
Mean                  5.6940e-3   1.0773e-2     1.1262e-2   5.7449e-3
Standard Deviation    5.4517e-3   5.4436e-3     9.2185e-3   3.1594e-3

Table 3.7: Summary of EMD comparison between norm search and search window methods

3.3 Detailed analysis of two examples

We give a detailed analysis of two examples. We first consider an image of size 256× 256 pixels.

We use the norm search method for the vector nonlocal mean algorithm to speed up the process.

Figure 3.1: Comparison of vector nonlocal mean and standard nonlocal mean and median filters - Cameraman image. (a) Original image; (b) noisy image; (c) NLM denoised; (d) method noise of NLM denoised; (e) NLM denoised with 2A−A²; (f) method noise of NLM denoised with 2A−A²; (g) NLEM denoised; (h) method noise of NLEM denoised; (i) VNLM denoised; (j) method noise of VNLM denoised; (k) VNLM denoised with 2A−A²; (l) method noise of VNLM denoised with 2A−A²; (m) VNLEM denoised; (n) method noise of VNLEM denoised.


We next consider an image of size 200 × 200 pixels. We still use the norm search method for the vector nonlocal mean algorithm to speed up the process. For the (vector) nonlocal Euclidean median algorithm we use a search window of radius (20) 10 pixels and a patch size of (9 × 9) 7 × 7.

Figure 3.2: Comparison of vector nonlocal mean and standard nonlocal mean and median filters - Lena image. (a) Original image; (b) noisy image; (c) NLM denoised; (d) method noise of NLM denoised; (e) NLM denoised with 2A−A²; (f) method noise of NLM denoised with 2A−A²; (g) NLEM denoised; (h) method noise of NLEM denoised; (i) VNLM denoised; (j) method noise of VNLM denoised; (k) VNLM denoised with 2A−A²; (l) method noise of VNLM denoised with 2A−A²; (m) VNLEM denoised; (n) method noise of VNLEM denoised.


To compare the performance of the various algorithms, we look at their SNR, PSNR, the Earth Mover's Distance (EMD) between U − Ū and N (recall that U = I + N and Ū = AU; that is, U is the noisy image, I the original image, N the noise, and Ū the denoised image), the histogram of the residue∗ I − Ū, and the histogram of the method noise U − Ū.

Signal-to-Noise Ratio (SNR)

             Noisy     NLM       VNLM      NLM with 2A−A²   VNLM with 2A−A²   NLEM      VNLEM
Cameraman    10.0394   14.8145   15.2672   14.7554          15.0827           15.9026   14.6582
Lena         10.0719   13.3098   13.4933   14.3851          14.0455           6.7954    11.4702

Table 3.8: Signal-to-Noise Ratio of two images

Peak Signal-to-Noise Ratio (PSNR)

             Noisy     NLM       VNLM      NLM with 2A−A²   VNLM with 2A−A²   NLEM      VNLEM
Cameraman    16.7038   21.4788   21.9312   21.4186          21.7471           22.5666   21.3185
Lena         19.0631   22.3009   22.3969   23.3763          23.0328           15.7866   20.4564

Table 3.9: Peak Signal-to-Noise Ratio of two images

Earth Mover's Distance

             NLM         VNLM        NLM with 2A−A²   VNLM with 2A−A²   NLEM        VNLEM
Cameraman    5.9645e-3   5.4634e-3   1.4860e-2        9.2896e-3         3.3063e-3   5.7510e-3
Lena         1.8852e-3   5.0255e-3   9.0846e-2        1.0148e-2         2.1767e-2   6.2062e-3

Table 3.10: Earth Mover's Distance between U − Ū and the noise

We see that in both examples, using the 2A−A² filter, we obtain a greater SNR/PSNR with VNLM than with NLM. For the other algorithms, however, the results are more inconclusive as to which performs better. We see from the images themselves that VNLM and its variants seem to perform better on background areas and boundaries, and at preserving repetitive textures such as the feathers in the Lena image.

∗A term used in [SSN09]


Figure 3.3: µ_{I−U} = 1.5678e-3, σ_{I−U} = 0.3148. From left to right, top to bottom: histogram of I − Ū for NLM (µ = 7.5536e-4, σ = 0.1816), NLM with 2A−A² (µ = 3.0966e-3, σ = 0.1829), NLEM (µ = 1.5518e-3, σ = 0.1603), VNLM (µ = -1.6053e-3, σ = 0.1724), VNLM with 2A−A² (µ = -2.5781e-4, σ = 0.1761), VNLEM (µ = 5.6549e-3, σ = 0.1850) - Cameraman image

Figure 3.4: µ_{I−U} = -6.0754e-4, σ_{I−U} = 0.3136. From left to right, top to bottom: histogram of I − Ū for NLM (µ = -7.4224e-4, σ = 0.2160), NLM with 2A−A² (µ = -4.7068e-4, σ = 0.1909), NLEM (µ = -1.4804e-3, σ = 0.4573), VNLM (µ = 3.0196e-2, σ = 0.2115), VNLM with 2A−A² (µ = 5.8939e-3, σ = 0.1985), VNLEM (µ = 9.0328e-3, σ = 0.2670) - Lena image


Figure 3.5: µ_{U−I} = -1.5678e-3, σ_{U−I} = 0.3148. From left to right, top to bottom: histogram of the method noise for NLM (µ = -8.1243e-4, σ = 0.2803), NLM with 2A−A² (µ = 1.5288e-3, σ = 0.2109), NLEM (µ = -1.6011e-5, σ = 0.1603), VNLM (µ = -3.1731e-3, σ = 0.2888), VNLM with 2A−A² (µ = -1.8256e-3, σ = 0.2661), VNLEM (µ = 4.0871e-3, σ = 0.1850) - Cameraman image

Figure 3.6: µ_{U−I} = -6.0754e-4, σ_{U−I} = 0.3136. From left to right, top to bottom: histogram of the method noise for NLM (µ = -1.3470e-4, σ = 0.3217), NLM with 2A−A² (µ = 1.3686e-4, σ = 0.2538), NLEM (µ = -8.7288e-4, σ = 0.5527), VNLM (µ = 3.0804e-2, σ = 0.3030), VNLM with 2A−A² (µ = 6.5014e-3, σ = 0.2568), VNLEM (µ = 9.6403e-3, σ = 0.3483) - Lena image

Finally, we present the performance time of the various algorithms for the Cameraman and

the Lena images.

Notice that in both examples, even though some of the results seem inconclusive in terms


Performance time in minutes in Ganita

             NLM & NLM with 2A−A²   VNLM & VNLM with 2A−A²   NLEM   VNLEM
Cameraman    4.5                    82.8                     9.9    1479.3
Lena         2.2                    145.3                    21.4   1404.6

Table 3.11: Performance time of the various algorithms for two images

of SNR/PSNR (e.g. the Lena image with the 2A−A² filters), over the 100 images the vector methods have been demonstrated to perform consistently better than NLM, NLM with 2A−A², and NLEM.

3.4 Brief explanation on the choice of parameters

The various parameters for the traditional NLM/NLEM that we implemented are those suggested in [SSN09, CS12]. From our experiments, we found that if we used the smaller patch size of 7 × 7, rotation results in too much distortion and features are lost; thus bigger patch sizes like 9 × 9 and 11 × 11 perform better. Since our patch sizes are larger, we needed to increase our search window radius to compensate for the decrease in the number of comparisons made for each patch.

3.5 Two more examples

We present our results for two of the 100 images, one performed better using vector nonlocal

mean filters, while the other performed better with traditional nonlocal mean filters.


First we present the image where the rotation-invariant methods performed better. There are two challenges in this image: one is to successfully eliminate the noise over the large sky region, the other is to maintain crisp definition of the branches after the denoising process. We see that the rotation-invariant filters were very effective in eliminating the noise in the sky portion of the image while not over-smoothing the branches of the tree.

(a) Original image
(b) Noisy image, SNR: 10.0719
(c) Nonlocal mean denoised, SNR: 11.878
(d) Vector nonlocal mean denoised, SNR: 14.1463


(e) Nonlocal mean denoised with 2A−A2, SNR: 12.8008
(f) Vector nonlocal mean denoised with 2A−A2, SNR: 14.2973
(g) Nonlocal median denoised, SNR: 2.989
(h) Vector nonlocal median denoised, SNR: 13.8186

Figure 3.7: A stronger performance of VNLM - Image 1

Page 42: by Xin Qi - University of Toronto T-Space...theory (for a more in-depth discussion concerning graph theory see [BM08]). De nition 1.3.1. An undirected, simple graph G, is a pair (V;E),

Chapter 3. Results 35

Next we present the image where the traditional nonlocal mean methods performed better. This is a particularly challenging image, since many of the details in the clouds disappear with the addition of noise. We see that both the NLM and the NLM with 2A−A2 algorithms were able to preserve the details of the clouds much better. Among the median filters, however, the rotation-invariant one preserved the edges of the clouds better, especially in the lower right-hand portion of the image.

(a) Original image
(b) Noisy image, SNR: 10.0719
(c) Nonlocal mean denoised, SNR: 15.8368
(d) Vector nonlocal mean denoised, SNR: 14.775


(e) Nonlocal mean denoised with 2A−A2, SNR: 15.4606
(f) Vector nonlocal mean denoised with 2A−A2, SNR: 15.3541
(g) Nonlocal median denoised, SNR: 12.4642
(h) Vector nonlocal median denoised, SNR: 14.3980

Figure 3.8: A stronger performance of NLM - Image 3


Chapter 4

Discussion and future directions

The nonlocal mean filter is quite good at image denoising since it takes nonlocal information into account. The image is divided into overlapping patches, and the algorithm identifies patches that are similar and uses them to denoise each pixel. The nonlocal mean filter can also be viewed in terms of diffusion maps, and we incorporated the idea of rotation-invariant distances into the nonlocal mean filter to better identify patches that are truly close to one another. We adapted the rotation-invariant distance not only to the traditional nonlocal mean filter but also to the nonlocal median filter, which is more robust to noise, and we implemented suggestions from the current literature concerning diffusion maps alongside the rotation-invariant distance to see whether the nonlocal mean filter can be improved further.
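As a concrete illustration of the rotation-invariant distance just described, the following minimal Python sketch computes the distance between two patches as the minimum Euclidean distance over r discrete rotations of one of them. Here scipy.ndimage.rotate stands in for the faster C++ rotation routine used in our MATLAB implementation, and the patch size and number of rotations are illustrative only.

```python
import numpy as np
from scipy.ndimage import rotate

def rotation_invariant_distance(Pi, Pk, r=40):
    """min over j = 1..r of || Pi - rotate(Pk, 360*j/r degrees) ||."""
    best = np.inf
    for j in range(1, r + 1):
        Pk_j = rotate(Pk, angle=360.0 * j / r, reshape=False, mode="nearest")
        best = min(best, np.linalg.norm(Pi - Pk_j))
    return best

rng = np.random.default_rng(0)
Pi = rng.random((9, 9))
Pk = rotate(Pi, angle=90, reshape=False, mode="nearest")  # Pk is a rotated copy of Pi
print(rotation_invariant_distance(Pi, Pk))  # small: the 90-degree rotation is recovered
print(np.linalg.norm(Pi - Pk))              # plain distance, typically much larger
```

The inner loop over r rotations is exactly the source of the roughly 40-fold increase in computation discussed below.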

We see that by using the rotation-invariant distance we obtain better denoising along edges and within clusters of repeated patterns in the image, giving us sharper features and cleaner backgrounds. However, a major hurdle in the implementation of our algorithm is the processing time. Because of the rotation-invariant distance, we significantly increase the number of computations needed to find the distance between two patches (approximately 40-fold under the current setup), which leads to much longer processing times. In fact, our experiments have shown that, without any computation-reducing methods, a 200 × 200 image takes over 4 days to process. This is why the work of Zhao et al. in [ZS13, ZSS14, ZS14] is of particular interest to us, as they are able to speed up the filtering of cryo-electron microscopy images, which also involves large numbers of rotations. However, we cannot simply apply their ideas to our work, as our images lack the one feature their algorithm was designed around, namely compact support. Also, since we used MATLAB as our programming language, one potential way to speed up the algorithm is to reimplement it in a more efficient language such as C++; we note, however, that our MATLAB implementation already uses a function programmed in C++ for the rotations.

4.1 Future directions

Despite these challenges, the results of our algorithm are quite promising and there are several

future directions we would like to pursue which we detail below:

1. We would like to take other interesting features aside from rotation into consideration.

For example, we can consider the effects of dilation as shown below

(a) Original image (b) After vertical dilation

Figure 4.1: Example of a dilation

We see that, like rotation, we can consider dilation-invariant distances to improve the NLM algorithm; a minimal sketch of such a distance is given after this list.

2. We are interested in applying VNLM and its variants to colour images, such as using

different combinations of the RGB channels to denoise.

3. It would be interesting to see if rotational-invariant distance can be incorporated in other

denoising methods or can be applied to other problems.

4. We would like to report on the theoretical framework and analysis of VNLM in the coming

publications.
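As a rough illustration of the first item above, the following minimal Python sketch defines a dilation-invariant distance between two patches as the minimum Euclidean distance over a small set of vertical rescalings of one of them. The set of scales, the interpolation order and the crop/pad strategy are all illustrative assumptions, not part of the thesis.

```python
import numpy as np
from scipy.ndimage import zoom

def dilation_invariant_distance(Pi, Pk, scales=(0.8, 0.9, 1.0, 1.1, 1.25)):
    """min over a few vertical rescalings of Pk of || Pi - rescaled Pk ||."""
    p = Pi.shape[0]
    best = np.inf
    for s in scales:
        Pk_s = zoom(Pk, (s, 1.0), order=1)        # rescale rows only (vertical dilation)
        q = Pk_s.shape[0]
        if q >= p:                                # crop back to p rows around the centre
            start = (q - p) // 2
            Pk_s = Pk_s[start:start + p, :]
        else:                                     # or pad back to p rows
            pad = p - q
            Pk_s = np.pad(Pk_s, ((pad // 2, pad - pad // 2), (0, 0)), mode="edge")
        best = min(best, np.linalg.norm(Pi - Pk_s))
    return best

rng = np.random.default_rng(0)
Pi = rng.random((9, 9))
print(dilation_invariant_distance(Pi, Pi))        # 0, attained at scale 1.0
```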


Appendices


Appendix A

Algorithm variants

Algorithm 3 Standard NLM with N nearest neighbours

Input: Noisy image U = (ui), parameters p, N, ε.
Output: Denoised image U = (ui).
Pad the image array with a border of ceiling(p/2) pixels.
Create a patch Pi of size p × p centered at each ui.
for every ui do
    Find the nearest N patches Pk to Pi based on the ℓ2 norm.
    For each neighbour Pk, set Wi,k = exp(−‖Pi − Pk‖2/ε2).
    Define the diagonal matrix D such that Di,i is the sum of the i-th row of W.
    U = D−1WU
end for
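A minimal Python sketch of Algorithm 3 is given below. It follows the steps above directly and is written for clarity rather than speed: the nearest N patches are found by brute force over all patches, which is only practical for small images, and the parameter values are illustrative.

```python
import numpy as np

def nlm_nearest_neighbours(U, p=7, N=10, eps=5.0):
    """Brute-force sketch of standard NLM with N nearest neighbours."""
    h, w = U.shape
    half = p // 2
    padded = np.pad(U, half, mode="symmetric")
    # one flattened p x p patch per pixel
    patches = np.array([padded[i:i + p, j:j + p].ravel()
                        for i in range(h) for j in range(w)])
    flat = U.ravel()
    out = np.empty_like(flat)
    for i, Pi in enumerate(patches):
        d2 = np.sum((patches - Pi) ** 2, axis=1)         # squared distances to every patch
        nbrs = np.argsort(d2)[:N]                        # indices of the N nearest patches
        wts = np.exp(-d2[nbrs] / eps ** 2)               # W_{i,k} = exp(-||Pi - Pk||^2 / eps^2)
        out[i] = np.sum(wts * flat[nbrs]) / np.sum(wts)  # row-normalised average, (D^-1 W U)_i
    return out.reshape(h, w)

noisy = np.random.rand(32, 32)
denoised = nlm_nearest_neighbours(noisy, p=7, N=10)
print(denoised.shape)   # (32, 32)
```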

Algorithm 4 Standard NLEM

Input: Noisy image U = (ui), parameters p, N, ε.
Output: Denoised image U = (ui).
Pad the image array with a border of ceiling(p/2) pixels.
Create a patch Pi of size p × p centered at each ui.
for every ui do
    Find the nearest N neighbours of ui.
    For each neighbour uk, set wi,k = exp(−‖Pi − Pk‖2/ε2).
    Find the patch P which minimizes Σ1≤k≤N wi,k‖P − Pk‖.
    Assign ui the value of the center pixel of patch P.
end for
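The only step of Algorithm 4 not spelled out above is how the minimiser of Σ wi,k‖P − Pk‖, i.e. the weighted Euclidean median of the neighbouring patches, is computed. The thesis does not fix a particular solver; the sketch below uses a standard Weiszfeld-type fixed-point iteration as one possible choice, with illustrative names and parameters.

```python
import numpy as np

def weighted_euclidean_median(patches, weights, n_iter=50, tol=1e-8):
    """patches: (N, p*p) array of flattened patches; weights: (N,) nonnegative array."""
    P = np.average(patches, axis=0, weights=weights)   # start at the weighted mean
    for _ in range(n_iter):
        dist = np.linalg.norm(patches - P, axis=1)
        dist = np.maximum(dist, tol)                   # avoid division by zero
        coef = weights / dist
        P_new = coef @ patches / coef.sum()            # Weiszfeld update
        if np.linalg.norm(P_new - P) < tol:
            break
        P = P_new
    return P

rng = np.random.default_rng(1)
nbr_patches = rng.random((20, 81))         # 20 neighbouring 9 x 9 patches, flattened
w = rng.random(20)                         # weights w_{i,k}
P_star = weighted_euclidean_median(nbr_patches, w)
print(P_star.shape)                        # (81,)  -- the centre pixel is P_star[40]
```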


Algorithm 5 Vector NLEM

Input: Noisy image U = (ui), parameters p, N, ε, r.
Output: Denoised image U = (ui).
Pad the image array with a border of ceiling(p/2) pixels.
Create a patch Pi of size p × p centered at each ui.
for every ui do
    Find the nearest N neighbours of ui.
    for every neighbour uk do
        for 1 ≤ j ≤ r do
            Rotate Pk by 360j/r degrees into Pk,j and compute ‖Pi − Pk,j‖2.
        end for
        Let di,k = minj ‖Pi − Pk,j‖2.
        Set wi,k = exp(−di,k/ε2).
    end for
    Find the patch P which minimizes Σ1≤k≤N wi,k‖P − Pk‖.
    Assign ui the value of the center pixel of patch P.
end for

Algorithm 6 VNLM with norm search method

Input: Noisy image U = (ui), parameters p, N, ε, r.
Output: Denoised image U = (ui).
Pad the image array with a border of ceiling(p/2) pixels.
Create a patch Pi of size p × p centered at each ui.
for every ui do
    Find the nearest N neighbours of ui by comparing each patch norm ‖Pk‖ℓ2 with ‖Pi‖ℓ2.
    for every neighbour uk do
        for 1 ≤ j ≤ r do
            Rotate Pk by 360j/r degrees into Pk,j and compute ‖Pi − Pk,j‖2.
        end for
        Let di,k = minj ‖Pi − Pk,j‖2.
        Set Wi,k = exp(−di,k/ε2).
    end for
    Define the diagonal matrix D such that Di,i is the sum of the i-th row of W.
    U = D−1WU
end for
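The following minimal Python sketch illustrates the norm-search step of Algorithm 6: rather than scanning a spatial search window, the candidate neighbours of a patch Pi are taken to be the N patches whose ℓ2 norms are closest to ‖Pi‖, and only these candidates are then passed to the expensive rotation-invariant distance. The function and variable names are illustrative, not the exact implementation.

```python
import numpy as np

def norm_search_candidates(patches, i, N):
    """patches: (n, p*p) array of flattened patches; returns N candidate indices for patch i."""
    norms = np.linalg.norm(patches, axis=1)
    gap = np.abs(norms - norms[i])          # | ||P_k|| - ||P_i|| | for every patch
    return np.argsort(gap)[:N]              # the N patches with the closest norm

rng = np.random.default_rng(2)
patches = rng.random((500, 81))             # 500 flattened 9 x 9 patches
cands = norm_search_candidates(patches, i=0, N=20)
print(cands[:5])                            # indices of the closest-norm candidates
```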


Appendix B

Exact image names

The following table lists the exact image name and the gallery folder in [TGR+11] from which we obtained each of the 100 images we used.

Folder/File name of the 100 images used

Name Folder File Name Name Folder File Name

Image 1 cd03A DSC 0019.JPG Image 15 cd07A DSC 0074.JPG

Image 2 cd03A DSC 0036.JPG Image 16 cd07A DSC 0077.JPG

Image 3 cd03A DSC 0063.JPG Image 17 cd07A DSC 0079.JPG

Image 4 cd03A DSC 0069.JPG Image 18 cd10B closeup figs fresh shade sun DSC 0072.JPG

Image 5 cd07A DSC 0002.JPG Image 19 cd13A DSC 0033.JPG

Image 6 cd07A DSC 0003.JPG Image 20 cd13A DSC 0034.JPG

Image 7 cd07A DSC 0004.JPG Image 21 cd13A DSC 0051.JPG

Image 8 cd07A DSC 0018.JPG Image 22 cd13A DSC 0078.JPG

Image 9 cd07A DSC 0044.JPG Image 23 cd16A DSC 0002.JPG

Image 10 cd07A DSC 0068.JPG Image 24 cd13A DSC 0004.JPG

Image 11 cd07A DSC 0069.JPG Image 25 cd13A DSC 0008.JPG

Image 12 cd07A DSC 0071.JPG Image 26 cd13A DSC 0012.JPG

Image 13 cd07A DSC 0072.JPG Image 27 cd13A DSC 0020.JPG

Image 14 cd07A DSC 0073.JPG Image 28 cd13A DSC 0023.JPG


Image 29 cd13A DSC 0025.JPG Image 57 cd58A DSC 0021.JPG

Image 30 cd13A DSC 0035.JPG Image 58 cd58A DSC 0022.JPG

Image 31 cd13A DSC 0052.JPG Image 59 cd58A DSC 0032.JPG

Image 32 cd13A DSC 0076.JPG Image 60 cd59A DSC 0004.JPG

Image 33 cd56A DSC 0001.JPG Image 61 cd60A DSC 0001.JPG

Image 34 cd56A DSC 0037.JPG Image 62 cd60A DSC 0002.JPG

Image 35 cd56A DSC 0041.JPG Image 63 cd60A DSC 0003.JPG

Image 36 cd56A DSC 0048.JPG Image 64 cd60A DSC 0004.JPG

Image 37 cd56A DSC 0051.JPG Image 65 cd60A DSC 0020.JPG

Image 38 cd57A DSC 0001.JPG Image 66 cd60A DSC 0025.JPG

Image 39 cd57A DSC 0011.JPG Image 67 cd60A DSC 0047.JPG

Image 40 cd57A DSC 0037.JPG Image 68 cd60A DSC 0071.JPG

Image 41 cd58A DSC 0001.JPG Image 69 cd01A DSC 0001.JPG

Image 42 cd58A DSC 0002.JPG Image 70 cd01A DSC 0002.JPG

Image 43 cd58A DSC 0003.JPG Image 71 cd01A DSC 0003.JPG

Image 44 cd58A DSC 0004.JPG Image 72 cd01A DSC 0004.JPG

Image 45 cd58A DSC 0005.JPG Image 73 cd01A DSC 0005.JPG

Image 46 cd58A DSC 0006.JPG Image 74 cd01A DSC 0006.JPG

Image 47 cd58A DSC 0007.JPG Image 75 cd01A DSC 0007.JPG

Image 48 cd58A DSC 0008.JPG Image 76 cd01A DSC 0008.JPG

Image 49 cd58A DSC 0011.JPG Image 77 cd01A DSC 0009.JPG

Image 50 cd58A DSC 0013.JPG Image 78 cd01A DSC 0010.JPG

Image 51 cd58A DSC 0014.JPG Image 79 cd01A DSC 0011.JPG

Image 52 cd58A DSC 0015.JPG Image 80 cd01A DSC 0012.JPG

Image 53 cd58A DSC 0016.JPG Image 81 cd01A DSC 0013.JPG

Image 54 cd58A DSC 0017.JPG Image 82 cd01A DSC 0014.JPG

Image 55 cd58A DSC 0018.JPG Image 83 cd01A DSC 0015.JPG

Image 56 cd58A DSC 0020.JPG Image 84 cd01A DSC 0016.JPG


Image 85 cd01A DSC 0017.JPG Image 93 cd01A DSC 0025.JPG

Image 86 cd01A DSC 0018.JPG Image 94 cd01A DSC 0026.JPG

Image 87 cd01A DSC 0019.JPG Image 95 cd01A DSC 0027.JPG

Image 88 cd01A DSC 0020.JPG Image 96 cd01A DSC 0028.JPG

Image 89 cd01A DSC 0021.JPG Image 97 cd01A DSC 0029.JPG

Image 90 cd01A DSC 0022.JPG Image 98 cd01A DSC 0030.JPG

Table B.1: File names of the 100 images used


Appendix C

Tables of individual results

C.1 SNR of 100 images

SNR of proposed algorithms over 100 images

Noisy image   NLM   VNLM   NLM with 2A−A2   VNLM with 2A−A2   NLEM

Image 1 10.0719 11.878 14.1463 12.8008 14.2973 2.989

Image 2 10.0719 12.2723 11.7146 13.114 12.3562 4.9209

Image 3 10.0719 15.8368 14.775 15.4606 15.3541 12.4642

Image 4 10.0719 9.4103 9.8523 11.0529 10.5272 -0.17

Image 5 10.0719 7.7167 10.3074 9.9852 10.7057 0.8112

Image 6 10.0719 7.7946 10.2138 10.1759 10.623 0.7041

Image 7 10.0719 8.0464 10.0698 10.0757 10.9524 2.7532

Image 8 10.0719 12.793 13.3226 13.4166 14.1413 8.302

Image 9 10.0719 13.7353 13.9479 14.2084 14.5888 9.342

Image 10 10.0719 8.3247 10.0399 10.356 10.6106 0.3049

Image 11 10.0719 8.5551 10.4354 10.5628 11.1134 2.9777

Image 12 10.0719 9.0047 10.9964 10.7214 11.546 2.6731

Image 13 10.0719 9.006 10.2786 10.9211 10.9876 3.1646

Image 14 10.0719 8.1414 11.0051 10.4368 10.9895 0.2675


Image 15 10.0719 10.1679 10.9076 11.5676 11.5779 2.6611

Image 16 10.0719 7.8509 10.1304 10.1134 10.7921 1.6358

Image 17 10.0719 8.9344 10.367 10.6788 11.032 0.8724

Image 18 10.0719 10.923 11.8071 12.6828 12.3146 5.1875

Image 19 10.0719 8.1762 10.4229 10.5461 10.638 0.2627

Image 20 10.0719 12.5326 12.3382 13.8356 12.7261 4.6928

Image 21 10.0719 9.8052 11.1511 11.6188 11.9941 4.7577

Image 22 10.0719 13.3194 13.5194 13.8759 14.3076 9.8602

Image 23 10.0719 10.1081 10.702 11.8942 11.2227 3.3699

Image 24 10.0719 15.8818 15.5737 15.6159 15.4507 10.6619

Image 25 10.0719 12.4063 12.0499 13.613 13.0218 7.6685

Image 26 10.0719 13.5216 14.6425 13.7718 15.2854 8.8105

Image 27 10.0719 9.4362 11.0952 11.4924 11.6877 3.429

Image 28 10.0719 12.8871 13.2926 13.5966 13.8502 6.8086

Image 29 10.0719 12.6963 12.9098 13.795 13.4517 4.9192

Image 30 10.0719 9.2463 11.0017 11.1711 11.1935 1.5896

Image 31 10.0719 13.0316 13.217 14.0767 13.8464 6.2815

Image 32 10.0719 13.6977 13.0326 14.7133 13.7409 8.8093

Image 33 10.0719 8.9855 10.7582 10.8456 11.5385 3.9965

Image 34 10.0719 7.8483 9.8464 10.0618 10.69 2.0263

Image 35 10.0719 7.8975 9.9245 10.026 10.6681 1.8203

Image 36 10.0719 7.9933 9.4343 9.8516 10.522 3.3524

Image 37 10.0719 7.8562 10.2006 9.9373 10.7508 1.2492

Image 38 10.0719 8.3182 10.4507 10.3245 11.2172 3.3829

Image 39 10.0719 8.4173 10.3244 10.3209 11.1823 3.9249

Image 40 10.0719 7.0631 10.3187 9.5489 10.3021 -1.0283

Image 41 10.0719 7.6761 10.3924 10.0394 10.6606 0.0011


Image 42 10.0719 7.6497 10.3992 9.9836 10.5587 -0.0502

Image 43 10.0719 8.1081 10.0245 10.2477 10.8668 2.1263

Image 44 10.0719 8.9225 11.0422 11.0535 11.3544 1.4415

Image 45 10.0719 7.5238 10.0277 9.8572 10.3183 -0.9589

Image 46 10.0719 7.6303 10.5297 9.9442 10.7793 0.0906

Image 47 10.0719 7.5162 10.2303 9.8684 10.4503 -0.2681

Image 48 10.0719 8.361 10.799 10.6065 11.2532 1.2851

Image 49 10.0719 10.4314 11.4312 11.717 12.3633 6.4082

Image 50 10.0719 9.7622 10.5911 11.2805 11.3537 2.4318

Image 51 10.0719 7.9388 9.7265 10.165 10.4405 0.54

Image 52 10.0719 9.0775 11.6039 10.978 11.977 2.5574

Image 53 10.0719 7.2196 10.0394 9.6595 10.3602 -0.4875

Image 54 10.0719 9.8327 10.5481 11.371 11.6311 6.0838

Image 55 10.0719 8.7393 10.291 10.5768 11.1361 2.5692

Image 56 10.0719 7.4604 9.8345 9.8902 10.3287 -0.6189

Image 57 10.0719 9.0013 10.8424 11.0828 11.1511 0.3789

Image 58 10.0719 7.9732 9.8056 10.1357 10.7292 2.7515

Image 59 10.0719 8.064 9.6843 10.2215 10.7098 2.9901

Image 60 10.0719 7.5219 10.0175 9.8045 10.541 0.8971

Image 61 10.0719 17.225 20.6253 14.8934 20.6893 11.872

Image 62 10.0719 17.114 19.4535 14.9472 19.6111 11.8716

Image 63 10.0719 16.4558 18.8328 15.0051 18.9749 13.295

Image 64 10.0719 16.5078 18.5284 14.9747 18.631 13.6338

Image 65 10.0719 16.4218 18.5042 15.0382 18.5432 12.4557

Image 66 10.0719 16.6129 18.6855 15.1808 18.7001 13.1022

Image 67 10.0719 16.6099 18.0805 15.2015 18.2538 14.2418

Image 68 10.0719 17.2881 19.4529 15.4 19.3492 15.7056


Image 69 10.0719 10.78 11.9785 12.4405 12.5348 5.1903

Image 70 10.0719 7.7167 10.3074 9.9852 10.7057 0.8112

Image 71 10.0719 7.7946 10.2138 10.1759 10.623 0.7041

Image 72 10.0719 8.0464 10.0698 10.0757 10.9524 2.7532

Image 73 10.0719 7.3673 10.2971 9.753 10.3693 -0.7478

Image 74 10.0719 7.2227 10.5596 9.4306 10.8307 0.5756

Image 75 10.0719 8.6018 10.5745 10.6734 11.2651 2.3067

Image 76 10.0719 7.9603 10.0093 10.0688 10.6212 0.9393

Image 77 10.0719 8.0222 9.3905 10.1584 10.5129 2.707

Image 78 10.0719 8.7178 10.1504 10.7667 11.1638 3.232

Image 79 10.0719 9.5608 10.965 11.3653 11.7689 3.4716

Image 80 10.0719 8.1955 10.1838 10.3917 10.5398 -0.3482

Image 81 10.0719 7.6405 10.0307 9.968 10.5426 0.3462

Image 82 10.0719 7.9991 9.9274 10.002 10.5711 0.1324

Image 83 10.0719 7.9854 9.7395 10.1161 10.5081 1.1346

Image 84 10.0719 7.144 10.3046 9.555 10.5419 0.253

Image 85 10.0719 7.6063 10.2343 10.0472 10.4543 -0.0343

Image 86 10.0719 8.4546 10.2045 10.4388 11.2315 3.7516

Image 87 10.0719 10.4686 10.7617 11.857 11.6026 3.8037

Image 88 10.0719 8.0973 10.091 10.2693 10.7314 1.1141

Image 89 10.0719 7.465 10.5835 9.7828 10.5692 -0.4821

Image 90 10.0719 7.6344 10.1315 10.0004 10.3559 -0.5882

Image 91 10.0719 12.3601 11.7811 13.1385 12.5224 9.3746

Image 92 10.0719 8.7225 10.1226 10.785 11.0065 2.7515

Image 93 10.0719 8.9046 9.2868 10.8135 10.3579 3.5055

Image 94 10.0719 8.4552 10.3687 10.2716 11.132 3.3663

Image 95 10.0719 10.7983 10.2753 12.0077 11.2479 7.636


Image 96 10.0719 7.5046 10.0609 9.8814 10.4134 -0.3951

Image 97 10.0719 10.1764 11.3546 11.7417 12.0318 5.1248

Image 98 10.0719 8.7628 10.53 10.842 11.0834 1.942

Image 99 10.0719 7.7132 10.3424 10.1699 10.5165 0.0361

Image 100 10.0719 7.5113 10.2078 9.7117 10.6643 0.7232

Table C.1: SNR of various algorithms over 100 images

C.2 PSNR of 100 images

PSNR of proposed algorithms over 100 images

Noisy image   NLM   VNLM   NLM with 2A−A2   VNLM with 2A−A2   NLEM

Image 1 19.6969 21.5016 23.7638 22.4250 23.9201 12.6139

Image 2 19.6774 21.8778 21.3052 22.7191 21.9576 14.5265

Image 3 19.8180 25.5673 24.5126 25.2066 25.0995 22.1965

Image 4 20.2522 19.5410 20.0056 21.2330 20.6998 10.0083

Image 5 19.4825 17.1172 19.7175 19.3956 20.1163 10.2213

Image 6 18.5749 16.2878 18.7157 18.6777 19.1258 9.2054

Image 7 18.5393 16.4886 18.5106 18.5420 19.4157 11.2193

Image 8 13.7860 16.4891 17.0366 17.1304 17.8553 12.0159

Image 9 13.3049 16.9662 17.1805 17.4412 17.8218 12.5716

Image 10 20.7648 18.9876 20.7234 21.0488 21.3027 10.9971

Image 11 16.9964 15.4748 17.3434 17.4872 18.0359 9.9016

Image 12 17.0414 15.9717 17.9649 17.6904 18.5155 9.6419

Image 13 18.5460 17.4711 18.7357 19.3951 19.4585 11.6367

Image 14 18.5175 16.5820 19.4502 18.8817 19.4350 8.7131

Image 15 16.6516 16.7463 17.4859 18.1473 18.1575 9.2406


Image 16 19.4524 17.2199 19.5027 19.4939 20.1719 11.0163

Image 17 18.2420 17.0993 18.5323 18.8487 19.2015 9.0425

Image 18 20.0525 20.9027 21.7857 22.6634 22.2951 15.1679

Image 19 21.3919 19.4892 21.7428 21.8651 21.9580 11.5827

Image 20 23.6626 26.1159 25.9054 27.4250 26.3151 18.2820

Image 21 21.8861 21.6183 22.9652 23.4328 23.8083 16.5717

Image 22 16.7511 19.9985 20.1954 20.5551 20.9865 16.5208

Image 23 20.1600 20.1898 20.7869 21.9823 21.3100 18.4696

Image 24 14.0990 19.9079 19.5966 19.6424 19.4772 14.6872

Image 25 16.9718 19.3051 18.9234 20.5129 19.9171 14.5676

Image 26 15.7780 19.2274 19.4775 19.4775 20.9914 14.5156

Image 27 17.7835 17.1472 18.8002 19.2030 19.3991 11.1405

Image 28 18.2822 21.0962 21.4962 21.8067 22.0583 15.0178

Image 29 17.4836 20.1073 20.3214 21.2066 20.8632 12.3308

Image 30 20.8549 20.0280 21.7841 21.9537 21.9765 12.3721

Image 31 15.6512 18.6045 18.7927 19.6560 19.4244 11.8608

Image 32 18.9808 22.6065 21.9340 23.6222 22.6489 17.7174

Image 33 18.5246 17.4289 19.1945 19.2982 19.9894 12.4483

Image 34 19.9282 17.7045 19.7004 19.9176 20.5462 11.8825

Image 35 19.6936 17.5189 19.5438 19.6471 20.2896 11.4419

Image 36 20.7148 18.6341 20.0705 20.4944 21.1639 13.9952

Image 37 21.6103 19.3943 21.7388 21.4756 22.2891 12.7873

Image 38 18.7325 16.9651 19.1096 18.9840 19.8777 12.0396

Image 39 18.4380 16.778 18.6896 18.6859 19.5484 12.2878

Image 40 19.5908 16.5820 19.8373 19.0677 19.8211 8.4905

Image 41 21.1118 18.7038 21.4274 21.0788 21.7003 11.0405

Image 42 20.7778 18.3526 21.1030 20.6891 21.2646 10.6553


Image 43 20.5846 18.6011 20.5255 20.7603 21.3780 12.6338

Image 44 22.8959 21.7362 23.8508 23.8775 24.1768 14.2647

Image 45 20.3051 17.7470 20.2604 20.5514 20.0902 9.2742

Image 46 21.6270 19.1779 22.0808 21.4986 22.3342 11.6451

Image 47 21.6066 19.0358 21.7637 21.4021 21.9850 11.2663

Image 48 16.6023 14.8895 17.3275 17.1357 17.7835 7.8136

Image 49 17.9996 18.3585 19.3485 19.6444 20.2893 14.3331

Image 50 20.5961 20.2522 21.0970 21.8047 21.8746 12.9476

Image 51 22.3725 20.2068 22.0151 22.4654 22.7399 12.8373

Image 52 19.5470 18.5517 21.0786 20.4523 21.4520 12.0307

Image 53 20.2181 17.3626 20.1855 19.8045 20.5064 9.6587

Image 54 17.6274 17.3848 18.0999 18.9262 19.1861 13.6294

Image 55 19.0125 17.6768 19.2295 19.5172 20.0766 11.5083

Image 56 20.6752 18.0567 20.4372 20.4935 20.9319 9.9841

Image 57 25.0561 23.9845 25.8262 26.0668 26.1353 15.3628

Image 58 20.7844 18.6827 20.5181 20.8481 21.4416 13.4611

Image 59 20.5757 18.5589 20.1881 20.7252 21.2136 13.4896

Image 60 23.1740 20.6220 23.1185 22.9061 23.6431 13.9992

Image 61 18.3165 25.4696 28.8695 23.1380 28.9338 20.1161

Image 62 17.0367 24.0787 26.4177 21.9115 26.5753 18.8348

Image 63 21.1895 27.5653 29.9401 26.1226 30.0913 24.4126

Image 64 13.0409 19.4747 21.4901 17.9437 21.5983 16.6027

Image 65 19.0859 25.4290 27.5118 24.0520 27.5563 21.4696

Image 66 15.4415 21.9770 24.0499 20.5503 24.0688 18.4717

Image 67 13.0385 19.5733 21.0464 18.1680 21.2203 17.1996

Image 68 14.0676 21.2817 23.4485 19.3957 23.3448 19.6755

Image 69 18.9203 19.6268 20.8262 21.2889 21.3832 14.0386


Image 70 19.4825 17.1172 19.7175 19.3956 20.1163 10.2213

Image 71 18.5749 16.2878 18.7157 18.6777 19.1258 9.2054

Image 72 15.5393 16.4886 18.5106 18.5420 19.4157 11.2193

Image 73 18.7577 16.0490 18.9829 18.4388 19.0551 7.9379

Image 74 18.8369 15.9846 19.3244 18.1955 19.5957 9.3404

Image 75 20.9331 19.4474 21.4215 21.5346 22.1245 13.1622

Image 76 19.7732 17.6410 19.6957 19.7695 20.3208 10.6401

Image 77 18.2350 16.1780 17.5447 18.3209 18.6745 10.8677

Image 78 16.6363 15.2738 16.7048 17.3296 17.7272 9.7868

Image 79 20.9548 20.4098 21.7930 22.2476 22.6379 14.3351

Image 80 19.2135 17.2797 19.3198 19.5302 19.6810 8.7931

Image 81 20.2338 17.7857 20.1912 20.1298 20.7045 10.5076

Image 82 21.0914 19.0068 20.9345 21.0215 21.5893 11.1497

Image 83 18.3997 16.2875 18.0495 18.4440 18.8336 9.4583

Image 84 18.1076 15.1785 18.3398 17.5906 18.5776 8.2880

Image 85 19.7288 17.2489 19.8903 19.7033 20.1111 9.6224

Image 86 16.0744 14.4469 16.2052 16.4412 17.2339 9.7456

Image 87 17.8866 18.2668 18.5348 19.6695 19.4082 11.6153

Image 88 19.4821 17.4995 19.5000 19.6796 20.1416 10.5242

Image 89 18.9496 16.3413 19.4609 18.6605 19.4468 8.3956

Image 90 20.3414 17.8934 20.4004 20.2692 20.6254 9.6813

Image 91 14.8859 17.1687 16.5831 17.9517 17.3316 14.1885

Image 92 17.2702 15.9123 17.3137 17.9832 18.2042 9.9464

Image 93 14.4840 13.3147 13.6896 15.2236 14.7663 7.9176

Image 94 19.9862 18.3672 20.2826 20.1856 21.0463 13.2805

Image 95 13.8238 14.5491 14.0172 15.7594 14.9947 11.3879

Image 96 19.3916 16.8004 19.3781 19.2005 19.7329 8.9242


Image 97 17.1522 17.2530 18.4337 18.8220 19.1120 12.2039

Image 98 19.6356 18.3205 20.0937 20.4055 20.6470 11.5056

Image 99 19.6144 17.2449 19.8835 19.7120 20.0590 9.5773

Image 100 17.8012 15.2402 17.9367 17.4401 18.3936 8.4524

Table C.2: PSNR of various algorithms over 100 images

C.3 Earth Mover’s Distance of 100 images

Earth Mover’s Distance of proposed algorithms over 100 images

NLM   VNLM   NLM with 2A−A2   VNLM with 2A−A2   NLEM

Image 1 7.5748e-3 9.0785e-3 1.6405e-2 1.2859e-2 3.3774e-2

Image 2 4.0217e-3 4.8706e-3 1.3000e-2 4.9887e-3 2.7276e-2

Image 3 5.3602e-3 1.4372e-3 1.2930e-2 4.3230e-3 8.8201e-3

Image 4 1.0577e-2 1.3774e-2 1.6058e-2 1.7132e-2 6.6924e-2

Image 5 1.6279e-2 1.4467e-2 3.7063e-3 2.6004e-2 7.0930e-2

Image 6 1.5064e-2 1.4397e-2 6.9673e-3 2.7242e-2 7.8063e-2

Image 7 1.5299e-2 7.6965e-3 3.3917e-3 1.4800e-2 5.1807e-2

Image 8 3.6369e-3 2.1886e-3 1.2623e-2 4.5470e-3 1.7243e-2

Image 9 3.3321e-3 1.4928e-3 1.1401e-2 6.0410e-3 1.4499e-2

Image 10 8.6144e-3 1.5759e-2 1.0936e-2 2.3995e-2 7.4440e-2

Image 11 1.3372e-2 7.1389e-3 3.7202e-3 1.5704e-2 5.0578e-2

Image 12 9.8976e-3 6.6847e-3 4.6535e-3 1.5268e-2 4.8273e-2

Image 13 1.0847e-2 6.8460e-3 4.6428e-3 1.2466e-2 4.6655e-2

Image 14 1.0720e-2 2.0871e-2 1.0474e-2 3.0770e-2 7.8605e-2

Image 15 4.4105e-3 6.7914e-3 8.9994e-3 1.2317e-2 4.4101e-2


Image 16 1.6480e-2 1.0853e-2 3.6032e-3 2.1100e-2 6.4839e-2

Image 17 7.3608e-3 1.1411e-2 9.2795e-3 1.9187e-2 6.3187e-2

Image 18 5.8873e-3 5.2418e-3 5.0336e-3 1.5026e-2 3.2998e-2

Image 19 1.3788e-2 1.6912e-2 6.4251e-3 2.8652e-2 8.3143e-2

Image 20 2.1237e-3 6.9131e-3 1.0232e-2 1.3661e-2 3.2080e-2

Image 21 8.6569e-3 3.5421e-3 3.8245e-3 1.4368e-2 3.6310e-2

Image 22 2.8204e-3 1.6989e-3 1.0607e-2 5.2421e-3 1.4338e-2

Image 23 6.3166e-3 6.8621e-3 7.6484e-3 1.5459e-2 4.7102e-2

Image 24 5.6195e-3 3.8007e-3 1.3280e-2 8.4837e-3 1.0469e-2

Image 25 1.7601e-3 4.2153e-3 9.1738e-3 4.9789e-3 2.0002e-2

Image 26 5.4829e-3 2.4581e-3 1.3271e-2 7.0385e-3 1.4027e-2

Image 27 1.1234e-2 7.1358e-3 2.9804e-3 1.6445e-2 4.4538e-2

Image 28 3.5928e-3 3.2448e-3 1.2131e-2 7.4259e-3 2.0801e-2

Image 29 2.1611e-3 4.9180e-3 8.4556e-3 1.3067e-2 3.1285e-2

Image 30 1.2456e-2 1.2292e-2 1.7458e-3 2.4013e-2 6.5011e-2

Image 31 2.4508e-3 2.4719e-3 1.0605e-2 8.1654e-3 2.6290e-2

Image 32 1.5499e-3 2.0829e-3 9.4951e-3 5.9676e-3 1.6643e-2

Image 33 1.1848e-2 4.6847e-3 3.0190e-3 1.3297e-2 4.2934e-2

Image 34 1.7382e-2 6.3634e-3 2.4255e-3 1.9670e-2 6.2717e-2

Image 35 1.6510e-2 7.7822e-3 2.9896e-3 2.0557e-2 6.3401e-2

Image 36 1.8092e-2 5.3849e-3 4.0291e-3 1.3134e-2 5.0141e-2

Image 37 1.5563e-2 1.1319e-2 3.2998e-3 2.3167e-2 6.5601e-2

Image 38 1.5642e-2 4.3569e-3 1.7185e-3 1.6587e-2 4.9247e-2

Image 39 1.4969e-2 3.4482e-3 1.8107e-3 1.5404e-2 4.5106e-2

Image 40 1.7978e-2 2.7083e-2 7.8629e-3 3.6515e-2 1.0587e-1

Image 41 1.5044e-2 1.8844e-2 6.8475e-3 2.9581e-2 8.6162e-2

Image 42 1.5682e-2 2.0311e-2 6.2124e-3 3.0127e-2 8.5818e-2


Image 43 1.6394e-2 7.7408e-3 1.5966e-3 1.8043e-2 6.0292e-2

Image 44 9.2147e-3 1.4170e-2 7.2582e-3 2.2249e-2 5.5695e-2

Image 45 1.2115e-2 2.2013e-2 1.1280e-2 3.2429e-2 1.0033e-1

Image 46 1.6163e-2 1.8522e-2 5.2544e-3 2.7533e-2 8.2285e-2

Image 47 1.5521e-2 2.0359e-2 7.5857e-3 3.1293e-2 9.1429e-2

Image 48 1.1510e-2 1.2473e-2 8.2890e-3 2.4512e-2 6.7851e-2

Image 49 4.7505e-3 3.3618e-3 7.0746e-3 6.7949e-3 2.6451e-2

Image 50 6.4539e-3 7.9803e-3 1.0424e-2 1.3577e-2 4.3104e-2

Image 51 1.2036e-2 1.2298e-2 8.0106e-3 2.3153e-2 7.4713e-2

Image 52 7.4407e-3 1.0817e-2 9.6169e-3 2.1433e-2 5.1857e-2

Image 53 1.7284e-2 1.9863e-2 6.5077e-3 3.0815e-2 9.4260e-2

Image 54 8.7388e-3 4.5336e-3 4.2138e-3 7.0342e-3 3.0808e-2

Image 55 1.1427e-2 5.7917e-3 4.5946e-3 1.6236e-2 5.2077e-2

Image 56 1.4098e-2 1.7508e-2 9.6905e-3 3.0895e-2 9.7413e-2

Image 57 9.5780e-3 1.5211e-2 7.6841e-3 2.5066e-2 6.7196e-2

Image 58 1.6180e-2 5.4762e-3 2.6495e-3 1.7720e-2 5.5943e-2

Image 59 1.5638e-2 5.3636e-3 2.8076e-3 1.6658e-2 5.4162e-2

Image 60 1.7905e-2 1.3133e-2 3.0036e-3 2.6762e-2 7.5501e-2

Image 61 1.2246e-2 2.4408e-3 2.0629e-2 4.2342e-3 5.5485e-3

Image 62 1.1533e-2 1.7888e-3 1.9904e-2 3.5124e-3 6.0823e-3

Image 63 9.7113e-3 2.6901e-3 1.7850e-2 4.9401e-3 4.8638e-3

Image 64 9.6172e-3 2.3215e-3 1.7811e-2 4.6068e-3 4.7277e-3

Image 65 9.5416e-3 2.7306e-3 1.7861e-2 4.8785e-3 5.4027e-3

Image 66 9.4067e-3 2.4100e-3 1.7541e-2 4.5170e-3 4.8420e-3

Image 67 8.8213e-3 2.1015e-3 1.6790e-2 4.4270e-3 4.7234e-3

Image 68 9.5479e-3 2.1250e-3 1.7577e-2 4.2343e-3 3.8275e-3

Image 69 5.7828e-3 4.3320e-3 5.8554e-3 1.2093e-2 3.1376e-2


Image 70 1.6279e-2 1.4467e-2 3.7063e-3 2.6004e-2 7.0930e-2

Image 71 1.5064e-2 1.4397e-2 6.9673e-3 2.7242e-2 7.8063e-2

Image 72 1.5299e-2 7.6965e-3 3.3917e-3 1.4800e-2 5.1807e-2

Image 73 1.7076e-2 2.4702e-2 6.8286e-3 3.4540e-2 1.0050e-1

Image 74 1.9763e-2 1.6430e-2 3.3257e-3 2.7611e-2 7.7394e-2

Image 75 1.0841e-2 1.0010e-2 6.8136e-3 1.8010e-2 5.3544e-2

Image 76 1.3582e-2 1.2361e-2 5.3907e-3 2.1715e-2 6.9750e-2

Image 77 1.6397e-2 5.5405e-3 2.6349e-3 1.4940e-2 5.7182e-2

Image 78 1.3078e-2 4.4026e-3 3.0816e-3 1.4365e-2 4.9568e-2

Image 79 6.0280e-3 8.3106e-3 9.1216e-3 1.3732e-2 4.1105e-2

Image 80 9.3250e-3 1.9609e-2 1.2233e-2 2.9389e-2 8.9200e-2

Image 81 1.5742e-2 1.4202e-2 5.8072e-3 2.7812e-2 8.2201e-2

Image 82 1.1596e-2 1.3712e-2 8.0120e-3 2.2230e-2 7.0400e-2

Image 83 1.3188e-2 1.1590e-2 6.7171e-3 2.0618e-2 6.8480e-2

Image 84 2.0388e-2 1.7330e-2 3.0816e-3 3.0226e-2 8.4343e-2

Image 85 1.5112e-2 2.0239e-2 7.9370e-3 3.2573e-2 8.8542e-2

Image 86 1.4379e-2 3.2203e-3 2.1531e-3 1.4183e-2 4.6044e-2

Image 87 3.7451e-3 6.9756e-3 1.0634e-2 9.3145e-3 3.7940e-2

Image 88 1.2773e-2 1.2033e-2 6.1686e-3 2.4537e-2 6.9630e-2

Image 89 1.4571e-2 2.4429e-2 9.3510e-3 3.3763e-2 9.3431e-2

Image 90 1.3379e-2 2.1873e-2 1.1121e-2 3.3196e-2 9.7102e-2

Image 91 3.1524e-3 5.3749e-3 1.1256e-2 2.8321e-3 1.5337e-2

Image 92 1.3951e-2 5.3145e-3 2.8364e-3 1.6809e-2 5.5000e-2

Image 93 1.0364e-2 7.9355e-3 6.1156e-3 8.1957e-3 4.6445e-2

Image 94 1.4052e-2 5.5780e-3 2.7251e-3 1.5563e-2 4.6617e-2

Image 95 4.3537e-3 8.9880e-3 7.8035e-3 3.7238e-3 2.3570e-2

Image 96 1.5379e-2 1.8525e-2 8.2966e-3 3.1042e-2 9.3155e-2


Image 97 6.2348e-3 3.8756e-3 6.4940e-3 8.9313e-3 3.1990e-2

Image 98 1.2386e-2 9.8288e-3 3.9507e-3 2.2141e-2 5.8753e-2

Image 99 1.4591e-2 1.8838e-2 8.6153e-3 3.2389e-2 8.6732e-2

Image 100 1.8003e-2 1.3904e-2 3.0086e-3 2.7081e-2 7.6399e-2

Table C.3: Earth Mover’s Distance of various algorithms over 100 images

C.4 Comparison between norm search method and search window method

SNR comparison of norm search (NS) and search window (SW) methods

            Noisy     VNLM (NS)   VNLM (SW)   VNLM with 2A−A2 (NS)   VNLM with 2A−A2 (SW)
Image 1     10.0719   13.8717     12.3639     14.1723                13.6481
Image 2     10.0719   11.2520     10.6439     12.2640                11.6268
Image 3     10.0719   12.9385     13.1629     14.9378                14.8849
Image 4     10.0719   9.8399      8.0700      10.5269                9.1012
Image 5     10.0719   10.2863     7.5313      10.7045                9.2065
Image 6     10.0719   10.1934     6.7380      10.6266                8.5814
Image 7     10.0719   9.7985      7.5159      10.8848                8.9235
Image 8     10.0719   13.2484     12.0841     14.1147                13.1987
Image 9     10.0719   13.9218     12.8852     14.5792                14.3192
Image 10    10.0719   9.9933      7.5556      10.6073                9.0024

Table C.4: SNR comparison between norm search and search window methods


PSNR comparison of norm search (NS) and search window (SW) methods

            Noisy     VNLM (NS)   VNLM (SW)   VNLM with 2A−A2 (NS)   VNLM with 2A−A2 (SW)
Image 1     19.6969   23.4707     21.9797     23.7891                23.2713
Image 2     19.6774   20.7113     20.2439     21.8441                21.2316
Image 3     19.8180   22.4682     22.7901     24.6503                24.6174
Image 4     20.2522   19.9906     18.2186     20.6989                19.2732
Image 5     19.4825   19.6945     16.9300     20.1149                18.6151
Image 6     18.5749   18.6911     15.2266     19.1294                17.0826
Image 7     18.5393   18.1880     15.9583     19.3398                17.3812
Image 8     13.7860   16.9595     15.7980     17.8285                16.9127
Image 9     13.3049   17.1462     16.1071     17.8119                17.5500
Image 10    20.7648   20.6670     18.2006     21.2988                19.6896

Table C.5: PSNR comparison between norm search and search window methods

EMD comparison of norm search (NS) and search window (SW) methods

            VNLM (NS)   VNLM (SW)   VNLM with 2A−A2 (NS)   VNLM with 2A−A2 (SW)
Image 1     9.0785e-3   4.7088e-3   1.2859e-2              6.6813e-3
Image 2     4.8706e-3   9.0443e-3   4.9887e-3              5.7581e-3
Image 3     1.4372e-3   8.0199e-3   4.3230e-3              2.7537e-3
Image 4     1.3774e-2   1.2399e-2   1.7132e-2              1.1296e-2
Image 5     1.4467e-2   1.6087e-2   2.6004e-2              6.8429e-3
Image 6     7.6965e-3   2.2659e-2   2.7242e-2              7.2674e-3
Image 7     2.1886e-3   1.9991e-2   1.4800e-2              9.6728e-3
Image 8     1.4928e-3   7.4693e-3   4.5470e-3              3.1260e-3
Image 9     1.5759e-2   6.7266e-3   6.0410e-3              2.2116e-3
Image 10    7.1389e-3   1.3576e-2   2.3995e-2              9.8403e-3

Table C.6: EMD comparison between norm search and search window methods


Appendix D

Performance time over 100 images

Our experiments were performed on two machines (for details see Sec. 3.1). In the following table, NLM is short for NLM/NLM with 2A−A2; similarly, VNLM is short for VNLM/VNLM with 2A−A2.

Performance time in minutes

Ganita: Image   NLM   VNLM   NLEM          Sphere: Image   NLM   VNLM   NLEM

Image 1 2.9451 84.1318 22.8645 Image 3 3.0864 101.7909 30.8906

Image 2 3.1903 83.5754 24.3083 Image 6 5.2027 101.1287 32.5851

Image 4 5.7863 84.4877 40.0348 Image 9 2.6064 101.4373 24.2356

Image 5 4.1955 84.3262 44.5964 Image 10 4.1558 100.5414 25.3718

Image 7 2.8474 83.9971 22.7255 Image 13 4.4131 101.4443 24.2727

Image 8 1.7015 83.9198 20.2592 Image 14 3.9866 101.2842 24.8865

Image 11 2.1415 83.8499 21.0095 Image 17 4.7464 100.6421 24.3178

Image 12 2.5120 83.2450 23.9381 Image 18 2.9386 101.2758 24.6799

Image 15 3.1721 83.4869 22.5657 Image 21 3.2847 101.4251 24.1822

Image 16 2.6826 83.6168 24.2402 Image 22 2.0307 101.0343 23.4641

Image 19 3.7842 84.0066 23.2428 Image 25 3.8552 100.9794 24.2610

Image 20 2.0662 83.7244 22.5022 Image 26 2.6193 101.1464 24.3322

Image 23 3.2274 84.2666 21.8077 Image 29 2.4834 101.1399 23.9919


Image 24 1.5217 84.1574 19.1086 Image 30 3.8589 101.1127 25.5389

Image 27 2.8892 83.8876 20.5782 Image 33 3.8882 100.7611 24.6960

Image 28 2.0916 84.0104 19.7413 Image 34 3.8566 101.4122 25.3714

Image 31 2.9034 83.6185 21.7927 Image 39 3.5705 102.0796 24.3868

Image 32 1.4601 83.9792 19.8289 Image 40 6.9277 101.6352 24.8716

Image 35 3.9836 83.9715 24.0033 Image 41 6.1969 100.9013 24.8375

Image 36 2.9566 84.5838 23.2942 Image 42 6.4420 102.0776 24.4951

Image 37 2.8137 84.1190 20.9777 Image 47 6.6754 101.1190 25.3006

Image 38 2.3505 83.9217 22.8695 Image 48 4.8326 101.1576 25.6888

Image 43 3.1338 83.8465 23.9542 Image 49 3.5172 100.8999 24.0451

Image 44 2.3627 83.7188 23.9764 Image 50 5.2176 101.3840 24.4117

Image 45 3.2667 84.2969 23.8043 Image 53 6.7381 102.5106 26.0028

Image 46 2.6096 83.6205 23.8877 Image 54 2.4896 100.9546 25.0044

Image 51 3.6580 83.7970 24.6266 Image 55 3.1730 101.4463 25.6629

Image 52 2.5179 84.4180 22.6700 Image 56 5.0539 101.4570 26.8236

Image 57 3.8922 83.6928 20.4403 Image 61 7.4122 100.9993 23.6071

Image 58 2.3474 84.2593 24.3248 Image 62 3.9753 101.0171 23.6707

Image 59 2.3175 84.0885 20.9157 Image 63 2.6754 100.8882 23.8222

Image 60 2.7316 84.3972 23.4227 Image 64 2.5830 100.8805 23.7043

Image 65 3.1346 84.2290 23.7443 Image 69 2.9334 102.7041 24.2912

Image 66 1.8304 83.9594 20.1079 Image 70 4.0194 101.2526 25.3497

Image 67 1.6660 83.9359 20.0895 Image 71 4.3210 100.7674 26.0837

Image 68 1.7281 84.5875 20.0503 Image 72 3.3624 103.1106 24.9938

Image 73 6.1802 84.1052 22.3114 Image 77 3.9963 101.4254 25.2180

Image 74 2.5929 84.3310 24.5774 Image 78 3.9365 101.4389 25.7344

Image 75 2.2791 83.8219 22.1733 Image 79 4.8173 102.2699 25.1487

Image 76 2.5834 83.9890 22.9987 Image 80 6.6417 101.1399 25.5892


Image 81 3.9127 83.9055 24.1820 Image 85 5.9896 100.9766 25.3943

Image 82 3.1793 84.6445 22.6416 Image 86 4.5370 101.3249 24.9197

Image 83 2.7210 83.4608 23.8547 Image 87 4.0746 102.0992 23.8648

Image 84 2.7571 84.0984 24.8287 Image 88 5.7453 101.4550 25.6252

Image 89 4.9352 83.3791 23.3775 Image 95 3.9217 101.7256 24.6773

Image 90 3.1195 83.8198 24.6268 Image 96 5.2215 100.8067 26.7424

Image 91 1.9429 83.7071 19.0656 Image 97 2.7275 101.9020 24.9487

Image 92 2.2055 84.3117 24.1493 Image 98 3.7730 101.6615 25.8647

Image 93 2.3243 83.8928 22.4800 Image 99 4.3549 101.4237 26.8678

Image 94 2.0409 84.3516 22.9691 Image 100 3.7244 102.1214 26.0983

Table D.1: Performance time of the various algorithms over 100 images


Bibliography

[BC05] A. Buades and B. Coll. A non-local algorithm for image denoising. In CVPR, pages 60–65, 2005.

[BM08] J. A. Bondy and U. S. R. Murty. Graph theory, volume 244. Springer-Verlag London,

2008.

[BN01] M. Belkin and P. Niyogi. Laplacian eigenmaps and spectral techniques for embedding

and clustering. In Advances in Neural Information Processing Systems 14, pages

585–591. MIT Press, 2001.

[BN03] M. Belkin and P. Niyogi. Laplacian eigenmaps for dimensionality reduction and data

representation. Neural Comput., 15(6):1373–1396, June 2003.

[BT88] R. Boyle and R.C. Thomas. Computer vision: a first course. Blackwell Scientific

Publications, 1988.

[Cha13] K. N. Chaudhury. Non-local Euclidean medians. http://www.mathworks.com/matlabcentral/fileexchange/40204-non-local-euclidean-medians, February 2013. Retrieved Jun. 26, 2015.

[Chu97] F. R. K. Chung. Spectral graph theory, volume 92. American Mathematical Soc.,

1997.

[CKL+08] R. R. Coifman, I. G. Kevrekidis, S. Lafon, M. Maggioni, and B. Nadler. Diffusion

maps, reduction coordinates, and low dimensional representation of stochastic systems.

Multiscale Modeling & Simulation, 7(2):842–864, 2008.


[CL06] R. R. Coifman and S. Lafon. Diffusion maps. Appl. Comput. Harmon. Anal.,

21(1):5–30, 2006.

[CLL+05] R. R. Coifman, S. Lafon, A. B. Lee, M. Maggioni, B. Nadler, F. Warner, and S. W.

Zucker. Geometric diffusions as a tool for harmonic analysis and structure definition

of data: Diffusion maps. Proceedings of the National Academy of Sciences of the

United States of America, 102(21):7426–7431, 2005.

[CS12] K. N. Chaudhury and A. Singer. Non-local Euclidean medians. IEEE Signal Processing Letters, 19(11):745–748, 2012.

[Dav04] E. R. Davies. Machine Vision: Theory, Algorithms, Practicalities. Morgan Kaufmann

Publishers Inc., San Francisco, CA, USA, 2004.

[Hag07] I. B. Hagi. Fast imrotate. http://www.mathworks.com/matlabcentral/fileexchange/17788-fast-imrotate, November 2007. Retrieved Mar. 27, 2015.

[LB01] E. Levina and P. Bickel. The Earth Mover’s distance is the Mallows distance:

some insights from statistics. Proceedings Eighth IEEE International Conference on

Computer Vision. ICCV 2001, 2, 2001.

[LPW09] D. A. Levin, Y. Peres, and E. L. Wilmer. Markov chains and mixing times. American

Mathematical Soc., 2009.

[Mar91] A. Marion. An Introduction to Image Processing. Chapman and Hall, 1991.

[NLCK05] B. Nadler, S. Lafon, R. R. Coifman, and I. G. Kevrekidis. Diffusion maps, spectral clustering and eigenfunctions of Fokker-Planck operators. arXiv preprint math/0506090, 2005.

[NLCK08] B. Nadler, S. Lafon, R. R. Coifman, and I. G. Kevrekidis. Diffusion maps - a probabilistic interpretation for spectral embedding and clustering algorithms. In Principal manifolds for data visualization and dimension reduction, pages 238–260. Springer, 2008.


[RTG98a] Y. Rubner, C. Tomasi, and L. Guibas. The Earth Mover's Distance as a metric for image retrieval. Technical Report STAN-CS-TN-98-86, Computer Science Department, Stanford University, September 1998.

[RTG98b] Y. Rubner, C. Tomasi, and L. Guibas. A metric for distributions with applications to

image databases. In Proceedings of the IEEE International Conference on Computer

Vision, pages 59–66, Bombay, India, January 1998.

[SSN09] A. Singer, Y. Shkolnisky, and B. Nadler. Diffusion interpretation of nonlocal

neighborhood filters for signal denoising. SIAM J. Imaging Sciences, 2(1):118–139,

2009.

[SW12] A. Singer and H-T. Wu. Vector diffusion maps and the connection Laplacian. Communications on Pure and Applied Mathematics, 65(8), 2012.

[SW13] A. Singer and H-T. Wu. Spectral convergence of the connection Laplacian from random samples. arXiv preprint arXiv:1306.1587, 2013.

[TGR+11] G. Tkacik, P. Garrigan, C. Ratliff, G. Milinski, J. Klein, L. Seyfarth, P. Sterling,

D. Brainard, and V. Balasubramanian. Natural images from the birthplace of the

human eye. PLoS ONE, 6(6):e20409, 06 2011.

[Ver91] D. Vernon. Machine Vision: Automated Visual Inspection and Robot Vision. Prentice

Hall, 1991.

[Vil03] C. Villani. Topics in Optimal Transportation. Graduate Studies in Mathematics, American Mathematical Society, 2003.

[ZP04] L. Zelnik-manor and P. Perona. Self-tuning spectral clustering. In Advances in

Neural Information Processing Systems 17, pages 1601–1608. MIT Press, 2004.

[ZS13] Z. Zhao and A. Singer. Fourier-Bessel rotational invariant eigenimages. Journal of the Optical Society of America A, Optics, image science, and vision, 30(5):871–877, 2013.


[ZS14] Z. Zhao and A. Singer. Rotationally invariant image representation for viewing

direction classification in cryo-EM. Journal of structural biology, 186(1):153–166,

2014.

[ZSS14] Z. Zhao, Y. Shkolnisky, and A. Singer. Fast steerable principal component analysis.

arXiv preprint arXiv:1412.0781, 2014.