SIViP. DOI 10.1007/s11760-014-0613-z

ORIGINAL PAPER

Wavelet-based blind blur reduction

Fatma Kerouh · Amina Serir

Received: 20 December 2012 / Revised: 2 January 2014 / Accepted: 2 January 2014
© Springer-Verlag London 2014

Abstract This paper presents a blind iterative algorithm for blurred image restoration. Herein, an optical blur kernel is considered. The proposed deblurring approach operates in the wavelet domain. It relies on analyzing and correcting the wavelet details across resolutions. A quality-metric-based stopping criterion is proposed to control the iterative correction process. The main advantage of the proposed technique lies in the fact that it is fully automatic and that both the iterative correction process and the stopping criterion are based on the same transform. Tests using blurred images from the LIVE 2008 database show the effectiveness of the proposed adaptive algorithm.

Keywords Blur kernel · Wavelet transform · Blur quality metric · Blind iterative deblurring algorithms · Stopping criterion

1 Introduction

Blur is a common problem in many applications. There are two main kinds of blur: motion blur [1] and optical blur [2]. Although these two kinds of blur often exist simultaneously, they are usually treated separately. This work focuses on optical blur, which may be caused by the lens system. In such situations, deblurring is fundamental to making pictures sharper and more useful for analysis and interpretation. Deblurring is an ill-posed inverse problem [3], usually kept at bay by means of regularization following the classical works of Tikhonov [4].

F. Kerouh (B) · A. Serir
Laboratory of Image Processing and Radiation, Faculty of Electronics and Computing, U.S.T.H.B., 16111 Algiers, Algeria
e-mail: [email protected]

A. Serir
e-mail: [email protected]

Herein, an iterative approach maximizing the image quality is proposed. In image deblurring, we seek to recover an original sharp image using a mathematical model of the blurring process. We assume that the image formation can be adequately described by a linear, spatially invariant relation and that the noise is additive. A common model for the image formation corresponds to filtering the image f through a two-dimensional (2D) linear system whose impulse response h is a Gaussian function. At each pixel (k, l), the blurred version f_b is then defined as follows:

f_b(k, l) = (h ∗ f)(k, l) + n(k, l)          (1)

where n is additive random noise.
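For illustration, the degradation model of Eq. (1) can be simulated as in the minimal sketch below; using SciPy's gaussian_filter as the Gaussian PSF h, and the chosen noise level, are illustrative assumptions rather than settings taken from the paper.

import numpy as np
from scipy.ndimage import gaussian_filter

def blur_image(f, sigma=2.0, noise_std=1.0, seed=0):
    """Simulate Eq. (1): f_b(k, l) = (h * f)(k, l) + n(k, l), with a Gaussian PSF h."""
    rng = np.random.default_rng(seed)
    blurred = gaussian_filter(f.astype(float), sigma=sigma)   # (h * f)(k, l)
    noise = rng.normal(0.0, noise_std, size=f.shape)          # additive random noise n(k, l)
    return blurred + noise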

Blurring especially affects edges [5], which represent discontinuities and high-frequency components in images. It was shown in [6] that the wavelet transform is well suited to analyzing the high-frequency components of an image. The aim of this work is to propose a blind deblurring method for optically blurred images. The proposed approach is iterative and essentially based on correcting wavelet coefficients. The iterative deblurring process is controlled by a recently proposed stopping criterion that uses a blind objective blurred-image quality metric [7,8].

The rest of the paper is organized as follows. Related works on deblurring techniques are presented in Sect. 2. Section 3 details the proposed algorithm for blur-effect reduction. Experimental settings, methodology, results and discussions are presented in Sect. 4. We then conclude the paper with some perspectives for future work.

2 Related works

Two main kinds of deblurring methods exist in the literature: parametric and iterative.


The main drawback of parametric methods is that they need key parameters (i.e., threshold values), which are often tuned empirically. Parametric methods developed in the wavelet domain have been extensively studied [9–13]. On the other hand, iterative methods do not require any key parameter, but they may consume much running time. In [14], for instance, Rudin et al. propose an iterative deblurring method called shock filters (SF), based on creating discontinuities at edges using partial differential equations (PDE). This iterative method provides enhanced images; however, it does not preserve textures and fine image details, and it has no explicit stopping criterion. These two kinds of methods can be either blind or non-blind. The non-blind ones, such as Wiener filtering [15], Lucy–Richardson [16,17], least-squares filtering [4] and recursive Kalman filtering [18], assume a known point spread function (PSF), which is not always available in many real image processing situations. In the blind case [19], two approaches can be adopted. The first one consists of a blur kernel estimation approach followed by a non-blind deblurring algorithm [20–23]. The second one relies on coupling a blur kernel estimation method with a deconvolution process [21]. The authors in [20–23] propose a blur kernel estimation method and then apply the non-blind Lucy–Richardson (LR) deconvolution algorithm for deblurring. Applying the LR algorithm, however, generates unpleasant ringing artifacts near strong edges. Moreover, the convergence of the LR algorithm is not guaranteed [21].

The main contribution of this work is an iterative blind deblurring algorithm for optically blurred images. In the proposed approach, a blur kernel is estimated for each blurred edge pixel, and an iterative correction process is then applied. Moreover, a stopping criterion is proposed to control the iterative deblurring process.

3 Proposed blur-reduction algorithm

This section details the proposed algorithm for blur reduction. An adequate blur kernel is estimated for each blurred edge pixel, and an iterative correction process is then applied in the wavelet domain. As illustrated in Fig. 1, the suggested iterative process involves different steps; each step is detailed in the following.

3.1 The wavelet transform

The wavelet transform decomposes an image into a set of 'frequency bands' by projecting the image onto elements of a set of basis functions. It is equivalent to band-pass filtering with a bank of constant-Q filters. The basis functions are called wavelets. A multiresolution analysis decomposes an image into a smoothed version and a set of detail information at different scales and directions.

Fig. 1 Flowchart of the proposed deblurring approach: the input image is wavelet transformed (WT), its quality is assessed (IQA), the deblurring step is applied, and the test on IQA(t+1) − IQA(t) decides whether to iterate (Yes) or to apply the inverse WT and output the deblurred image (No)

Once an image has been decomposed in this way, the behavior of the detail information can be analyzed across scales. In addition to the ability of the wavelet transform to localize in time and to support a multiresolution analysis, many application areas have been identified, including edge characterization, noise reduction, data compression and sub-band coding [6]. As blurring essentially affects edges, which represent the high-frequency components of an image, the wavelet transform is well suited to blur characterization. Herein, the orthogonal Daubechies wavelet (Db2) is applied at three resolution levels to achieve a trade-off between run time and the analysis of edge persistence across resolutions.
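As a minimal sketch of this decomposition (assuming the PyWavelets package and a grayscale input; the paper itself reports a MATLAB implementation), the Db2 wavelet applied at three levels yields, for each resolution i, the horizontal, vertical and diagonal detail sub-bands used below.

import pywt

def wavelet_details(image, levels=3, wavelet="db2"):
    """Return {i: (Dh_i, Dv_i, Dd_i)} for i = 1..levels (i = 1 is the finest level),
    together with the full coefficient list needed by the inverse transform."""
    coeffs = pywt.wavedec2(image.astype(float), wavelet=wavelet, level=levels)
    # wavedec2 orders coefficients from coarsest to finest; re-index so that i = 1 is the finest.
    details = {i: coeffs[-i] for i in range(1, levels + 1)}
    return details, coeffs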

3.2 Image quality assessment

Three kinds of quality assessment metrics exist in the literature: full reference (FR), reduced reference (RR) and no reference (NR) [24–27]. FR methods, such as the mean square error (MSE) and the peak signal-to-noise ratio (PSNR), are not applicable in practical situations where the original image is not available. Similarly, RR metrics require some characteristics of the original image, for instance its statistics. The NR case is the most difficult, since no information about the original image is available and the quality must be assessed from the degraded image content alone. In [7], an effective and fast no-reference blurred-image quality assessment metric (IQA) is proposed. A comparative study with some existing no-reference metrics, reported in [7], shows the effectiveness of this method in terms of accuracy and running time.


More details about the quality metric used are given in the following.

3.2.1 The no-reference blurred image quality metric used

The no-reference blurred-image quality metric presented in [7] exploits the step-edge width through a multiresolution analysis using the wavelet transform. It involves the following steps:

1. Apply the wavelet transform at three resolution levels.

2. Construct a contour map at each resolution level i, for each pixel (k, l), as follows:

   Cont_i(k, l) = E_i(k, l)   if E_i(k, l) > Th_i
                = 0           otherwise                              (2)

   with

   E_i(k, l) = √( D_hi(k, l)² + D_vi(k, l)² )                        (3)

   where D_hi and D_vi represent, respectively, the horizontal and vertical wavelet details at the i-th resolution level. The set of thresholds Th_i is fixed according to the following consideration: when the blur level is evaluated at the different resolutions of the wavelet transform, it can be noticed that, for a fixed threshold, edge detection becomes less efficient when going down in resolution (Fig. 2). This is due to the smoothing effect introduced by the wavelet transform filters. Hence, for better edge detection and noise reduction, we propose a threshold value depending on the contour-map mean at each resolution level i:

   Th_i = (2^(i−1) / (N × M)) × Σ_{k=1..N} Σ_{l=1..M} E_i(k, l)      (4)

   where (N × M) is the test image size. Cont_i(k, l) represents the significant edge pixel magnitude; if its value is zero, the pixel is not considered an edge point.

3. Consider an edge pixel (k, l) at a given resolution level i. The pixel with Cont_i(k, l) ≠ 0 is labeled as blurred when the difference between the edge pixel and the average of its neighbors is less than a threshold ξ_i. In fact, according to Fig. 2, as the resolution index increases, the distance D_i between the central pixel and the mean of its neighbors increases. Let A_hi(k, l) and A_vi(k, l) denote the neighborhood averages estimated from the two neighboring edge pixels in the horizontal and vertical directions, respectively:

   A_hi(k, l) = (Cont_i(k, l + 1) + Cont_i(k, l − 1)) / 2            (5)

   A_vi(k, l) = (Cont_i(k + 1, l) + Cont_i(k − 1, l)) / 2            (6)

   Let A_ci(k, l) denote either of them, with c = 'h' or 'v'. One can then define the relative variation of the edge pixel with respect to the average as:

   BR_ci(k, l) = |Cont_i(k, l) − A_ci(k, l)| / A_ci(k, l)            (7)

   The larger of BR_hi(k, l) and BR_vi(k, l) is used for the final decision:

   B_i(k, l) = 1   if max(BR_hi, BR_vi) < ξ_i
             = 0   otherwise                                         (8)

As the edge pixel intensity depends on the resolution level i, it is judicious to consider a set of thresholds depending on the resolution level. Experimentally, ξ_i is defined as 0.5 × 2^(i−1). Edge pixels are labeled '1' in B_i(k, l) if they are considered blurred and '0' otherwise.

For a good characterization of the blur effect, we exploit the persistence of edge pixels across resolutions. Indeed, if one schematizes the local maxima found in the image details provided by the wavelet transform, as represented in Fig. 2, one notices that the spread of the edge intensity reduces with the resolution. Hence, one can estimate the number of edge pixels NE_i and of blurred pixels NB_i at each resolution level i. We define the ratio of blurred edge pixels to edge pixels as the quality ratio Q_i at the i-th resolution:

Fig. 2 Example of local maxima provided by the wavelet coefficients at three resolutions (spreads D1, D2 and D3 at the 1st, 2nd and 3rd resolutions)


Q_i = NB_i / NE_i,   (i = 1, 2, 3)          (9)

As mentioned above, the relative number of edge pixels decreases at higher resolution levels, which is why the Q_i ratio may be considered more relevant at those levels. Weighting the ratios accordingly, the proposed blurred-image quality metric (IQA) is defined as:

IQA = 1 − ( Σ_{i=1..3} 2^(3−i) × Q_i ) / ( Σ_{i=1..3} 2^(3−i) )          (10)

The proposed blurred-image quality metric scores lie between zero and one. As the image blurriness increases, the metric is expected to decrease from 1 towards 0.

3.3 The proposed blur-reduction approach

Blind image deblurring algorithms (BIDA) constitute a subset of image restoration used to solve ill-posed inverse problems, which can be challenging in low-SNR situations. The aim is to restore blurred images with respect to two main points: fine-detail preservation and running time.

The proposed approach focuses on preserving texture and fine details for good analysis and interpretation. To restore images, an iterative method based on maximizing the quality metric defined in Eq. (10) is proposed. As the proposed quality metric is based on a counting operation, it is discontinuous and cannot be defined as a distribution; that is why linear optimization approaches cannot be used in this case. Different iterative algorithms have already been proposed [14,17,28]; for most of them, however, very little is said explicitly about the stopping criterion used. Terminating an iterative restoration scheme prior to convergence is another way to explicitly incorporate knowledge about the presence of noise in the degraded image (i.e., stopping the iterative deblurring process when the residual error reaches the noise variance level). In fact, finding a robust stopping criterion for iterative deblurring algorithms is still an open research topic [29–31]. Therefore, a stopping criterion used to control iterative deblurring algorithms is presented in the following.

3.3.1 Proposed stopping criterion for iterative deblurring algorithms

The rationale behind the proposed stopping criterion is to control the deconvolution process by estimating the blur quantity remaining in the restored image. Deblurring algorithms aim to reduce the amount of blur in the image, so assessing the blur amount at each iteration of the deblurring algorithm informs us about the remaining blur quantity.

In other words, the deblurring process should improve the image quality (IQA), i.e., IQA(t) − IQA(t−1) > ξ; otherwise, artifacts may be introduced by the deblurring process and the iterations should stop. Many existing criteria could be used [28,32]. The idea here is to use the blur quality metric defined in Eq. (10) to control the remaining blur amount at each iteration, as follows:

1. Initialization: at iteration t = 0, estimate the quality of the original blurred image, IQA(0).

2. At iteration t ∈ N*:

• Apply the deblurring algorithm.
• Estimate the restored image quality IQA(t).

3. Test on the restored image quality: if IQA(t) − IQA(t−1) > ξ, set IQA(t−1) = IQA(t) and repeat step 2; otherwise, stop the iterative process. The threshold ξ is set experimentally to 10^−3.
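As a hedged sketch, the criterion can wrap any iterative deblurring routine; deblur_step below is a hypothetical single-iteration deblurring function and iqa_metric is the metric of Eq. (10) sketched earlier.

def deblur_with_iqa_stop(blurred, deblur_step, iqa_metric, xi=1e-3, max_iter=100):
    """Iterate deblur_step until the per-iteration IQA gain drops below xi."""
    current = blurred
    iqa_prev = iqa_metric(current)                  # IQA(0) of the input blurred image
    for t in range(1, max_iter + 1):
        candidate = deblur_step(current)            # one iteration of the deblurring algorithm
        iqa_new = iqa_metric(candidate)             # IQA(t) of the restored image
        if iqa_new - iqa_prev <= xi:                # no significant improvement: stop
            return current, iqa_prev, t - 1         # keep the last accepted image
        current, iqa_prev = candidate, iqa_new      # accept the iteration and continue
    return current, iqa_prev, max_iter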

The deblurring algorithm used is detailed in the next section.

3.3.2 Proposed blur-reduction method

We recall that the proposed deblurring process aims to maximize the quality metric defined in Eq. (10). The idea is to enhance the extracted blurred edge pixels, which reduces the total number of blurred pixels (NB_i) and the blur factor Q_i, and therefore improves the quality metric score. Figure 2 illustrates that the local intensity variation around edges can be modeled as a Gaussian function having a different width (standard deviation) at each resolution level. Hence, a blurred edge pixel can be enhanced by narrowing the local variation of the wavelet coefficients centered on this pixel.

The developed iterative algorithm is applied in the wavelet domain as follows:

Step 0. Apply the wavelet transform to the test image at three resolution levels.
Step 1. At the first iteration, iter = 0, estimate the quality IQA(0) of the input blurred image using the quality metric defined in Eq. (10).
Step 2. Detect the blurred edge maxima and their orientations.
Step 3. Project the edge-pixel profile centered on the previously detected blurred pixel onto the simulated Gaussian basis and identify the standard deviation of the projected Gaussian kernel that best fits the considered neighborhood of wavelet coefficients.
Step 4. Correct all the detected blurred edge pixels by replacing the wavelet coefficients of the considered neighborhood with an adequate projected Gaussian function having a reduced standard deviation. Label the corrected blurred edge points as non-blurred.


Step 5. Estimate the restored image quality IQA(t): if IQA(t) − IQA(t−1) > ξ, where ξ = 10^−3, set iter = iter + 1 and repeat steps 2 to 4; otherwise, stop the iterative process.
Step 6. Apply the inverse wavelet transform to recover the deblurred image.

Steps 2, 3 and 4 require additional explanation. To perform step 2, which consists of extracting the blurred edge pixels and their associated orientations, the same approach detailed in Sect. 3.2.1 is used. Moreover, for better edge detection, we propose to replace the threshold Th_i defined in Eq. (4) by an adaptive threshold depending on the quality score (IQA), as follows:

Th_i = (IQA × 2^(i−1) / (N × M)) × Σ_{k=1..N} Σ_{l=1..M} E_i(k, l)          (11)

Accordingly, if the image is heavily blurred, the threshold value is reduced and more wavelet coefficients exceed it, so that more edge pixels are considered, and vice versa. In addition, each blurred edge pixel is characterized by its orientation θ_i(k, l), defined as follows:

θ_i(k, l) = arctan( D_hi(k, l) / D_vi(k, l) )          (12)

In the third step, we estimate the wavelet-coefficient profile of a local neighborhood centered on a blurred pixel. The local profile is assumed to be Gaussian. In other words, the variation of the wavelet coefficient magnitude in a neighborhood centered on a blurred pixel (k_m, l_n) is modeled by a Gaussian kernel g_σn(x) centered on (k_m, l_n), along the direction θ_i, with standard deviation σ_n. To start the iterative deblurring process, the most significant blurred pixel in terms of wavelet coefficients is selected first. Hence, let D′_ci(k, l) be the thresholded version of the wavelet detail coefficients D_ci(k, l) obtained at each resolution level i in the horizontal and vertical directions, defined as:

D′_ci(k, l) = D_ci(k, l)   if D_ci(k, l) > Th_i
            = 0            otherwise                    (13)

One can then define a matrix R_ci containing only the detected blurred pixels at each resolution level i:

R_ci(k, l) = D′_ci(k, l) × B_i(k, l)          (14)

B_i(k, l) is defined by Eq. (8), and the subscript 'c' stands for 'h' or 'v', the horizontal and vertical orientations.

Hence, consider the blurred edge pixel (k_m, l_n) and M_i(k_m, l_n), which maximizes R_ci(k, l):

M_i(k_m, l_n) = max_{c} |R_ci(k_m, l_n)|          (15)

When selecting the one-dimensional blurred-pixel neighborhood τ_s(k_m, l_n), defined by the coordinates of its center (k_m, l_n) and its spread 2s + 1, a problem occurs: some pixels are missed when extracting the edge profile, because the edge orientation may take any value in [0, 2π]. To avoid this problem, an interpolation is proposed for each direction θ_i(k_m, l_n), defined by Eqs. (16)–(18):

∀ (k, l) ∈ τ_s(k_m, l_n):

Pr_hi(k, l) = D′_hi(k, l) × sin(θ_i(k_m, l_n))          (16)

Pr_vi(k, l) = D′_vi(k, l) × cos(θ_i(k_m, l_n))          (17)

Pr_di(k, l) = D′_di(k, l) × cos(θ_i(k_m, l_n) − π/4)    (18)

where Pr_hi, Pr_vi and Pr_di are the interpolations obtained over the wavelet detail sub-bands in the horizontal, vertical and diagonal orientations, respectively. To simplify the notation, we denote the interpolated details Pr_ci, with the subscript 'c' standing for 'h' (horizontal), 'v' (vertical) or 'd' (diagonal). We recall that the projections Pr_ci(k, l) can be assimilated to a Gaussian function: since the two-dimensional kernel is Gaussian, any of its projections at an arbitrary angle θ is also Gaussian. A dictionary of Gaussian functions is therefore constructed to best match the blurred transition profiles. Then Pr_hi, Pr_vi and Pr_di are considered independently, and for each one we select the Gaussian function that best approximates it. For this purpose, let Ω be the ordered set of standard deviations σ_n:

Ω = {σ_1, σ_2, σ_3, …, σ_T}, with σ_{n+1} > σ_n for all σ_n ∈ Ω, n = 1, …, T; σ_n ranges from 0.5 to 10 with a step of 0.5.

A dictionary D{g_σn(x)} of T Gaussian kernels is then constructed as follows:

g_σn(x) = (1 / (σ_n √(2π))) × exp( −x² / (2σ_n²) ),   −s/2 ≤ x ≤ s/2          (19)
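A minimal sketch of the dictionary of Eq. (19); the standard deviations run from 0.5 to 10 in steps of 0.5 as stated above, while the discrete support (2s + 1 samples) is an assumption made for illustration.

import numpy as np

def gaussian_dictionary(s=4, sigma_min=0.5, sigma_max=10.0, step=0.5):
    """Dictionary D{g_sigma_n(x)} of 1-D Gaussian kernels (Eq. 19) sampled on 2s + 1 points."""
    x = np.arange(-s, s + 1, dtype=float)            # discrete support; half-width s is illustrative
    sigmas = np.arange(sigma_min, sigma_max + step, step)
    kernels = np.array([np.exp(-x ** 2 / (2.0 * sig ** 2)) / (sig * np.sqrt(2.0 * np.pi))
                        for sig in sigmas])
    return sigmas, x, kernels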

Fig. 3 Samples of the simulated Gaussian function basis (magnitude versus spatial position)


Some of these Gaussian functions are shown in Fig. 3. As the blur kernel size changes, the Gaussian standard deviation (i.e., the blur) changes too.

At each blurred edge pixel (k_m, l_n), characterized by its orientation θ_i, one considers the neighborhood centered on (k_m, l_n) and oriented along θ_i. Each point of this neighborhood can then be represented by the polar coordinate x = √(k² + l²). The goal is to select the dictionary element maximizing the projection of Pr_ci(x) onto the constructed dictionary D{g_σn(x)}; the inner product is maximal when the kernel faithfully matches the projection Pr_ci(k, l):

∀ (k, l) ∈ τ_s(k_m, l_n), x = √(k² + l²),

γ_mc = max_{σ_n ∈ Ω} ⟨ Pr_ci(x), g_σn(x) × U_c(θ_i) ⟩          (20)

The Gaussian g_σn(x) is always positive, whereas the wavelet coefficients may be negative depending on the sign of the neighboring coefficients. Hence, it is important to introduce the parameter U_c(θ_i), defined as:

U_c(θ_i) = sign(sin(θ_i))   if c = 'h'
         = sign(cos(θ_i))   if c = 'v'
         = sign(tan(θ_i))   if c = 'd'                    (21)

We denote by n_c the dictionary index that maximizes the projection in the horizontal, vertical or diagonal direction. The profile P̂r_ci(x) of the different details can then be approximated as:

∀ x ∈ τ_s(k_m, l_n),

P̂r_ci(x) = γ_mc × g_nc(x)          (22)

where P̂r_ci(x) is the one-dimensional approximation of the projection in the horizontal, vertical or diagonal direction along the orientation angle θ_i. The corresponding standard deviation σ_nc is then stored.

In the fourth step, to reduce the blur effect, we replace the Pr_ci(x) details by a narrower Gaussian function with standard deviation σ_{nc−1} (σ_{nc−1} < σ_{nc}), as follows:

∀ (k, l) ∈ τ_s(k_m, l_n), k = x cos(θ_i), l = x sin(θ_i),

P′r_ci(x) = γ_mc × g_{nc−1}(x) × U_c(θ_i)          (23)

where g_{nc−1} is the Gaussian function with dictionary index n_c − 1, and γ_mc and U_c(θ_i) are defined in Eqs. (20) and (21), respectively.
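A hedged sketch of steps 3 and 4 for a single blurred edge pixel: the extracted one-dimensional detail profile Pr_c(x) is matched against the dictionary by an inner product (Eq. (20)) and then rewritten with the next narrower kernel (Eqs. (22)–(23)). The profile extraction along θ_i is assumed to have been done beforehand, u_c stands for the sign factor U_c(θ_i) of Eq. (21), and gaussian_dictionary is the sketch given above.

import numpy as np

def sharpen_profile(profile, kernels, u_c=1.0):
    """Match a detail profile Pr_c(x) to the Gaussian dictionary and narrow it by one index.

    profile : 1-D array of wavelet detail coefficients around a blurred edge pixel
    kernels : rows g_sigma_n(x) from gaussian_dictionary, same length as profile
    u_c     : sign factor U_c(theta_i) of Eq. (21), +1 or -1
    """
    scores = kernels @ (profile * u_c)      # Eq. (20): inner products with every dictionary kernel
    n_c = int(np.argmax(scores))            # index of the best-matching standard deviation
    gamma = float(scores[n_c])              # gamma_mc, the maximal projection value
    if n_c == 0:
        return profile                      # already the narrowest kernel: nothing to correct
    # Eqs. (22)-(23): rewrite the profile with the narrower kernel sigma_{n_c - 1}.
    return gamma * kernels[n_c - 1] * u_c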

The corrected blurred edge point is labeled as non-blurred, and the same process is applied to the other blurred edge pixels. The image quality is then estimated in the wavelet domain.

Fig. 4 LIVE 2008 dataset original images


Fig. 5 Proposed image quality scores (IQA) versus the standard deviation (SD) of the Gaussian kernel used in the blurring process

Fig. 6 Proposed image quality scores (IQA) versus difference mean opinion scores (DMOS)

While the quality score keeps improving, steps 3 and 4 are iterated until no blurred edge pixel remains. Finally, the inverse wavelet transform is applied to recover the restored image.

4 Experiments, results and discussions

Experiments are conducted on the MATLAB platform using images from the LIVE database [33,34]. This database comprises twenty-nine original high-resolution (24 bits/pixel) RGB images. It attempts to cover sufficient diversity in image content, including pictures of faces, people, animals, close-up shots, wide-angle shots, natural scenes, man-made objects, images with distinct foreground/background configurations and images without any specific object of interest (see Fig. 4).

Table 1 Proposed quality metric evaluation compared to some existing methods

        Freq. Th [27]   IQM [28]   Kurtosis [29]   NIS [30]   IQA
SROCC   0.7305          0.706      0.75            0.754      0.8822

Table 2 LR and SF evaluated using a fixed IN

Fixed iteration number (100)

Mean-IN Mean-PSNR Mean-IQA

LR 100 33.5557 0.9480

SF 100 26.6164 0.9403

Table 3 Evaluation of LR and SF using a PSNR-based stopping criterion

PSNR as stopping criterion

Mean-IN Mean-PSNR Mean-IQA

LR 36.3448 34.8010 0.8181

SF 25.5172 28.8409 0.8090

Among possible distortions, we focus here on Gaussian blur. Gaussian-blurred images are generated using a circularly symmetric two-dimensional Gaussian kernel of standard deviation ranging from 0.42 to 15 pixels. Note that the proposed quality metric was already evaluated in [7]: tests using the LIVE database blurred images provided a high correlation between objective and subjective scores. In the present paper, the aim is to evaluate the performance of the quality metric used as a stopping criterion associated with the newly proposed iterative deblurring algorithm. This evaluation is carried out in three main steps. First, some scores of the proposed quality metric are recalled. Then, the quality metric is tested as a stopping criterion associated with two existing iterative deblurring algorithms (LR and SF). Finally, the performance of the proposed adaptive iterative deblurring algorithm is evaluated against the two considered methods.

4.1 The IQA quality metric evaluation

We recall the main results illustrating the effectiveness of the quality metric used. The proposed IQA is evaluated against the standard deviation values (SD) of the Gaussian filters introducing blur in the images, and against subjective judgments via the difference mean opinion scores (DMOS) corresponding to each blurred image. A logistic function is used to map the objective metric outputs to the subjective scores and to model the relation by a curve using nonlinear regression. This regression is done by minimizing the mean square error between the proposed image quality metric values and the subjective (objective) scores.
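For reproducibility, such an evaluation can be sketched with SciPy as below: a four-parameter logistic (a common choice in IQA studies, assumed here since the exact function of [7] is not restated) is fitted by nonlinear least squares, and the Spearman rank-order correlation (SROCC) quantifies the monotonic agreement.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import spearmanr

def logistic(x, a, b, c, d):
    """Four-parameter logistic mapping objective scores to subjective scores."""
    return a / (1.0 + np.exp(-c * (x - d))) + b

def evaluate_metric(iqa_scores, dmos):
    """Fit the logistic by least squares (MSE) and report the SROCC between IQA and DMOS."""
    p0 = [dmos.max() - dmos.min(), dmos.min(), 1.0, float(np.median(iqa_scores))]
    params, _ = curve_fit(logistic, iqa_scores, dmos, p0=p0, maxfev=10000)
    srocc, _ = spearmanr(iqa_scores, dmos)
    return params, srocc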


Table 4 Evaluation of LR and SF using the proposed IQA as a stopping criterion

The proposed stopping criterion

Mean-IN Mean-PSNR Mean-IQA

LR 49.184 32.6429 0.9678

SF 46.3379 27.9340 0.9397

Fig. 7 Evaluation of the proposed stopping criterion. a Original image. b Blurred image (IQA = 0.3867). c, d Deblurred using LR and SF with a fixed iteration number (100). e, f Deblurred using LR and SF associated with the PSNR as stopping criterion. g, h Deblurred using LR and SF associated with the proposed criterion (IQA)

Figures 5 and 6 illustrate these results as scatter plots with respect to the objective and subjective data, respectively. They clearly show that the proposed image quality metric is consistent with both the subjective and the objective evaluation. The goodness of fit is evaluated with the Spearman rank-order correlation coefficient (SROCC). For validation, the proposed IQA is compared with common metrics; the obtained scores are presented in Table 1. The proposed metric provides a high correlation value and outperforms the considered metrics. We conclude that tests using the LIVE database images show that the proposed image quality metric correlates well with subjective human judgment and with the source of distortion, namely the standard deviation of the Gaussian kernels used in the blurring process.

4.2 Tests of the proposed IQA as a stopping criterion

To evaluate the effectiveness of the proposed stopping criterion, a set of experiments is conducted using two existing iterative deblurring algorithms: the non-blind LR deblurring method, based on Bayesian reasoning, and the blind SF deblurring method, based on partial differential equations (PDE).

Tests are performed using three different stopping criteria:

• A fixed number of iterations, set to 100.
• A classical quality metric, the PSNR.
• The no-reference blurred-image quality metric (IQA).

To evaluate the performance, the mean iteration number (mean-IN), the mean peak signal-to-noise ratio (mean-PSNR) and the mean image quality metric (mean-IQA) are computed over the whole Gblur LIVE dataset.

4.2.1 Results and discussion

Tables 2, 3 and 4 summarize the scores obtained when applying the three stopping criteria to the two existing methods, LR and SF, on all the Gblur LIVE database blurred images. Tables 2 and 3 show that using the PSNR as a stopping criterion is more efficient than arbitrarily fixing an iteration number. Knowledge of the reference image allows the number of iterations to be reduced while maintaining roughly the same restored image quality in terms of mean-PSNR. However, the PSNR criterion is not always usable in real applications, as reference images are not available.

Table 5 Evaluation of the deblurred images shown in Fig. 7

LR 100 iter SF 100 iter LR PSNR SF PSNR LR New SF New

PSNR 26.8714 24.3357 28.1874 24.282 27.60 24.352

IQA 0.9515 0.9267 0.8635 0.8473 0.9584 0.9249

IN 100 100 4 36 68 77


Fig. 8 Deblurred images using the LR algorithm: a IQA versus iteration number, b PSNR versus iteration number

We could also observe that the obtained mean quality values are somewhat better when using a fixed number of iterations than when using the PSNR measure as a stopping criterion. Indeed, it is considered by some in the literature that the PSNR is not a faithful tool for image quality evaluation [35]. In any case, the IQA-based stopping criterion is more suitable when no reference image is available. According to Table 4, the IQA-based stopping criterion is able to stop the algorithm with a high quality value and a reasonable number of iterations, clearly much fewer than the 100 iterations used to generate the results in Table 2. Figure 7 shows an example of an original image from the LIVE dataset, its blurred version and the images restored using the LR and SF deblurring methods associated with the three types of stopping criteria. The quantitative evaluation of the obtained images is summarized in Table 5. Interesting results are obtained using the IQA-based stopping criterion, in particular in the no-reference case: a good image quality is obtained with a reasonable number of iterations. Therefore, deblurring methods associated with the IQA as a stopping criterion converge with a reduced number of iterations.

Fig. 9 Deblurred image using the SF method: a IQA versus iteration number, b PSNR versus iteration number

Fig. 10 A set of original images from the LIVE database

A PSNR-based stopping criterion is faster; however, it is not applicable in real applications where no reference image is available.


Fig. 11 A set of blurred images from the LIVE database

Table 6 Evaluation of a set of blurred images

Blurred images Image (a) Image (b) Image (c) Image (d)

SD (σ) 7.66650 2.166638 4.916644 2.624973

RMSE 202.4887 233.3290 131.8404 291.5568

PSNR 25.1047 24.4535 26.9303 23.4875

IQA 0.2976 0.7051 0.3566 0.5497

Comparing the PSNR values shown in Tables 2 and 3, one can notice that the values obtained after 100 iterations with both methods (LR and SF) are lower than those obtained with fewer iterations. This can be explained by plotting the evolution of the IQA quality scores and of the PSNR values versus the number of iterations (from 1 to 100) in the two cases. Figures 8 and 9 show this evolution for the LR and SF algorithms, respectively, when the image represented in Fig. 7b is used. The quality metric value (IQA) increases with the iterations and converges to a fixed value close to 1. Figures 8a and 9a show that the IQA value increases until about the fiftieth iteration and then varies very little; a variation IQA(t+1) − IQA(t) smaller than ξ = 0.001 is considered negligible. However, Figs. 8b and 9b show that the PSNR reaches a maximum after a certain number of iterations and then decreases: in terms of PSNR, LR converges after four iterations and SF after about thirty iterations.

To conclude, the IQA-based stopping criterion allows determining when to stop the iterative deblurring process. The IQA is quite general and does not require any knowledge about the original image or the PSF, which makes the IQA-based stopping criterion useful for both blind and non-blind iterative deconvolution methods.

Fig. 12 The deblurred images obtained using the proposed deblurring method

Table 7 Evaluation of the proposed deblurring algorithm

Deblurred images Image (a) Image (b) Image (c) Image (d)

RMSE 43.2242 34.0397 35.9743 30.8794

PSNR 31.8752 32.8109 32.5709 33.2341

IQA 0.7202 0.8798 0.8098 0.9064

Fig. 13 An example of a non-uniformly blurred image. a Original and b blurred (σ = 1.25, IQA = 0.7002)

4.3 Test of the proposed adaptive deblurring algorithm

We now focus on the evaluation of the proposed deblurring algorithm. Let us first consider four different blurred images to assess the performance of the proposed deblurring method visually; the full evaluation over all the LIVE dataset blurred images is considered afterwards. The four original images are presented in Fig. 10 and their blurred versions in Fig. 11. The blurred images are characterized by the standard deviation (SD), the root mean square error (RMSE), the PSNR and the IQA (see Table 6).


Fig. 14 The deblurred images obtained using: a the proposed method, b the LR method and c the SF method

Table 8 Comparative study on the test image shown in Fig. 13b

LR SF Proposed

IQA 0.8998 0.9162 0.9549

IN 13 15 7

4.3.1 Results and discussion

Figure 12 shows the deblurred images obtained when the proposed method is applied to the blurred images of Fig. 11. The blur effect is considerably reduced compared with the blurred images in Fig. 11; edges and fine image details are sharper and clearer. Table 7 summarizes the quantitative evaluation of the restored images in terms of RMSE, PSNR and IQA. The quality measures improve with respect to those of the blurred versions summarized in Table 6.

From the obtained measures, the proposed deblurring algorithm associated with the IQA-based stopping criterion is promising, in particular considering that it is fully automatic and does not require any reference image. Restored images without noticeable degradation of the visual quality are obtained; the images are clearer and sharper, and thus more suitable for analysis and interpretation.

Let us also test the proposed algorithm on a blurred image from the Gblur LIVE database with different blur levels in the foreground and the background (see Fig. 13b).

The deblurred images obtained using the two existing methods and the proposed one are shown in Fig. 14, and the quantitative evaluation is given in Table 8. From these results, the proposed IQA-based algorithm outperforms the others.

4.3.2 Comparative evaluation of the three studied deblurring algorithms

Let us now compare the performance of the proposed method, shown in the previous section, against the LR and SF methods. These two methods are first applied to the blurred images shown in Fig. 11. The restored images are shown in Fig. 15, and the quantitative evaluation in terms of PSNR and IQA is presented in Table 9.

Fig. 15 Deblurred images. a, c, e, g The LR outputs. b, d, f, h The SF outputs

From these experiments, one can conclude, both visually from the obtained deblurred images and from the quantitative evaluation given in Tables 7, 8 and 9, that the IQA-based approach outperforms the existing ones, including the blind SF method.

The mean-PSNR and mean-IQA obtained when applying the methods to all the LIVE dataset images are given in Table 10. The proposed method outperforms the others in terms of mean-PSNR and mean-IQA.


Table 9 Evaluation of the proposed stopping criterion using the classical deblurring methods LR and SF

Images LR SF

PSNR IQA PSNR IQA

(a) 24.6881 0.5324 25.111 0.588

(b) 24.7260 0.4517 25.358 0.642

(c) 26.9917 0.5076 28.156 0.644

(d) 23.6052 0.6451 23.9656 0.7464

Table 10 Evaluation of LR, SF and the proposed method on all Gblur LIVE database blurred images

LR SF Proposed

Mean-PSNR 32.6429 27.934 36.3379

Mean-IQA 0.9478 0.9397 0.9603

From all the experiments carried out, we conclude that the proposed adaptive deblurring algorithm restores images while preserving the image structure. Moreover, it is totally blind and fully automatic.

5 Conclusion

This paper addresses the blind iterative deconvolution problem in image restoration. An approach relying on the analysis of the wavelet coefficients across resolutions is proposed. To reduce the blur effect, an iterative restoration algorithm controlled by a proposed stopping criterion in the wavelet domain is designed. The resulting algorithm is compared with two common algorithms, LR and SF, using the LIVE data collection and three different performance measures. The obtained results show that it outperforms these two algorithms according to the three criteria.

References

1. Walk, M., Raudaschl, P., Schwarzbauer, T., Erler, M., Läuter, M.: Fast and robust linear motion deblurring. Technical Report arXiv:1212.2245 [cs.CV] (2012)

2. Coren, S., Ward, L.M., Porac, C., Fraser, R.: The effect of optical blur on visual-geometric illusions. Bull. Psychon. Soc. 11(6), 390–392 (1978)

3. Hansen, P.C., Nagy, J.G., O'Leary, D.P.: Deblurring Images: Matrices, Spectra, and Filtering. Society for Industrial and Applied Mathematics, Ch. 1, pp. 1–7 (2006)

4. Tikhonov, A.N., Arsenin, V.Y.: Solutions of Ill-Posed Problems. Winston and Sons, Washington (1977)

5. Hang, H.Z., Li, M., Zhang, C.: Blur detection for images using wavelet transform. In: Proceedings of the 4th International Conference on Multimedia and Expo, pp. 17–20, Beijing, China, June 27–30 (2004)

6. Mallat, S.: A Wavelet Tour of Signal Processing. Academic Press, London (1999)

7. Kerouh, F., Serir, A.: A no reference quality metric for measuring image blur in wavelet. Int. J. Digital Inf. Process. Wirel. Commun. (IJDIPWC) 1(4), 767–776 (2012)

8. Kerouh, F., Serir, A.: A quality measure based stopping criterion for iterative deblurring algorithms. In: IEEE International Workshop on Systems, Signal Processing and their Applications, pp. 372–377 (2013)

9. Charles, C., Leclerc, G., Louette, P., Rasson, J.-P., Pireaux, J.-J.: Noise filtering and deconvolution of XPS data by wavelets and Fourier transform. Surf. Interface Anal. 36, 71–80 (2004)

10. Donoho, D.L.: Nonlinear solution of linear inverse problems by wavelet-vaguelette decomposition. Appl. Comput. Harmon. Anal. 2, 101–126 (1995)

11. Fan, J., Koo, J.-Y.: Wavelet deconvolution. IEEE Trans. Inf. Theory 48(3), 734–747 (2003)

12. Yang, P., Hou, Z.X., Wang, C.Y.: Image deblurring based on wavelet transform and neural network. In: Proc. IEEE, vol. 2, pp. 647–651

13. Chai, A., Shen, Z.: Deconvolution: a wavelet frame approach. Numerische Mathematik 106(4), 529–587 (2007)

14. Osher, S.J., Rudin, L.I.: Feature-oriented image enhancement using shock filters. SIAM J. Numer. Anal. 27, 919–940 (1990)

15. Ozkan, M.K., Erdem, A.T., Sezan, M.I., Tekalp, A.M.: Efficient multiframe Wiener restoration of blurred and noisy image sequences. IEEE Trans. Image Process. 1, 453–476 (1992)

16. Richardson, W.H.: Bayesian-based iterative method of image restoration. J. Opt. Soc. Am. 62(1), 55–59 (1972)

17. Lucy, L.B.: An iterative technique for the rectification of observed distributions. Astron. J. 79, 745–754 (1974)

18. Biemond, J.: Stochastic linear image restoration. In: Huang, T.S. (ed.) Advances in Computer Vision and Image Processing, Ch. 5, pp. 502–528. JAI Press Inc, London (1986)

19. Kundur, D., Hatzinakos, D.: Blind image deconvolution. IEEE Signal Process. Mag. 13(3), 43–64 (1996)

20. Fergus, R., Singh, B., Hertzmann, A., Roweis, S.T., Freeman, W.T.: Removing camera shake from a single photograph. In: Proceedings of ACM SIGGRAPH, pp. 787–794, New York, USA (2006)

21. Yuan, L., Sun, J., Quan, L., Shum, H.Y.: Image deblurring with blurred/noisy image pairs. ACM Trans. Graph. 26(3) (2007)

22. Xu, S., Liang, H., Li, G.: A deblurring technique for large scale motion images using a hybrid camera. In: Proceedings of the 3rd International Congress on Image and Signal Processing, vol. 2, pp. 806–810, Yantai, China (2010)

23. Levin, A.: Blind motion deblurring using image statistics. In: Advances in Neural Information Processing Systems (NIPS) (2006)

24. Firestone, L., Cook, K., Talsania, N., Preston, K.: Comparison of autofocus methods for automated microscopy. Cytometry 12(3), 195–206 (1991)

25. Nill, N.B., Bouzas, B.H.: Objective image quality measure derived from digital image power spectra. Opt. Eng. 31(4), 813–825 (1992)

26. Caviedes, J., Gurbuz, S.: No reference sharpness metric based on local edge kurtosis. Proc. IEEE 3, 53–56 (2002)

27. Cohen, E., Yitzhaky, Y.: No-reference assessment of blur and noise impacts on image quality. SIViP 4(3), 289–302 (2010)

28. Narvekar, N.D., Karam, L.J.: An iterative deblurring algorithm based on the concept of just noticeable blur. In: International Workshop on Video Processing and Quality Metrics for Consumer Electronics (VPQM), Scottsdale, Arizona, USA, Jan. 13–15 (2009)

29. Oliveira, J.P., Bioucas-Dias, J.M., Figueiredo, M.A.T.: Adaptive total variation image deblurring: a majorization-minimization approach. IEEE Trans. Signal Process. 57(9), 1683–1693 (2009)

30. Chantas, G.K., Galatsanos, N.P., Molina, R., Katsaggelos, A.K.: Variational Bayesian image restoration with a product of spatially weighted total variation image priors. IEEE Trans. Image Process. 19(2), 351–362 (2010)

31. Giryes, R., Elad, M., Eldar, Y.C.: The projected GSURE for automatic parameter tuning in iterative shrinkage methods. Appl. Comput. Harmon. Anal. 30(3), 407–422 (2011)

32. Almeida, M.S.C., Figueiredo, M.A.T.: New stopping criteria for iterative blind image deblurring based on residual whiteness measures. In: Proceedings of the IEEE Statistical Signal Processing Workshop, pp. 337–340, Nice, June 28–30 (2011)

33. Wang, Z., Bovik, A.C., Sheikh, H.R., Simoncelli, E.P.: Image quality assessment: from error visibility to structural similarity. IEEE Trans. Image Process. 13(4), 600–612 (2004)

34. Sheikh, H.R., Sabir, M.F., Bovik, A.C.: A statistical evaluation of recent full reference image quality assessment algorithms. IEEE Trans. Image Process. 15(11), 3440–3451 (2006)

35. Sheikh, H.R., Bovik, A.C., de Veciana, G.: An information fidelity criterion for image quality assessment using natural scene statistics. IEEE Trans. Image Process. 14(12), 2117–2128 (2005)
