UNIVERSITAT POLITÈCNICA DE CATALUNYA
Escola Tècnica Superior d'Enginyeria de Telecomunicació de Barcelona
Grau en Enginyeria de Sistemes de Telecomunicació

HUMAN IRIS BIOMETRY

Author: Largo Castellà, Juan Carlos
Supervisor: Dr. Villar Santos, Jorge Luis

August 2, 2016



Abstract

This project presents the implementation of an iris-based biometric system, from its theoretical basis to its implementation, by examining different types of methods described in the literature.

The human iris structure remains invariant over time and contains several easily identifiable structures, believed to be unique to each person [1]. This information is extracted using mathematical pattern-recognition techniques to obtain a characteristic iris code which can be used in identification systems.

The recognition principle is the failure of a test of statistical independence on the iris codes, since two different iris codes should not agree in more than half of their bits. The operating principle is as follows: first the system has to localize the inner and outer boundaries of the iris (pupil and limbus) in an image of an eye. Further subroutines detect and exclude eyelids, eyelashes, and specular reflections that often occlude parts of the iris. The set of pixels containing only the iris, normalized by a rubber-sheet model to compensate for pupil dilation or constriction, is then analyzed to extract an iris code encoding the information needed to compare two iris images. The code generated by imaging an iris is compared to stored template(s) in a database. If the Hamming distance is below the decision threshold, a positive identification results, due to the statistical improbability that two different persons could agree by chance in so many bits, given the high entropy of iris templates.

The iris segmentation and normalization process is challenging due to the presence of eyelashes, eyelids, and reflections that may occlude regions of the iris. Furthermore, the dilation of the pupil under different illumination levels, and the fact that the iris and the pupil are not concentric, make this type of biometry quite complex.


Resumen

Este proyecto expone la implementación de un sistema biométrico basado en el iris, desde la base teórica hasta la implementación del mismo mediante el estudio de diferentes métodos descritos en otros documentos.

La estructura del iris humano permanece estable a lo largo del tiempo, conteniendo ésta varias estructuras fáciles de identificar, las cuales son consideradas únicas en cada persona [1]. Dicha información es extraída mediante técnicas matemáticas de reconocimiento de patrones para obtener así un código asociado al iris que puede ser utilizado en sistemas de autenticación biométrica.

El principio de funcionamiento es el siguiente: primero el sistema tiene que localizar los límites interior y exterior del iris (la pupila y el limbo) en una imagen de un ojo. Posteriores subrutinas detectan y excluyen los párpados, las pestañas y las reflexiones especulares que a menudo obstruyen partes del iris. El conjunto de píxeles que contienen únicamente el iris, normalizado por un modelo conocido como "rubber sheet model" que compensa la dilatación o constricción de la pupila, es analizado posteriormente para extraer un código del iris que codifica la información necesaria para comparar dos imágenes del iris. El código generado mediante las imágenes de un iris se compara con una o varias plantillas almacenadas en la base de datos. Si la distancia de Hamming está por debajo del umbral de decisión, se valida la identidad de cierto individuo debido a la improbabilidad estadística de que dos personas diferentes puedan coincidir por casualidad en tantos bits del código, dada la alta entropía de las plantillas del iris.

El proceso de segmentación y normalización del iris supone un reto debido a la presencia de pestañas, párpados y reflexiones que pueden obstruir las regiones del iris. Además, la dilatación de la pupila debido a los diferentes niveles de iluminación y el inconveniente de que el iris y la pupila no son concéntricos conllevan que este tipo de biometría sea bastante compleja.


Resum

Aquest projecte exposa la implementació d'un sistema biomètric basat en l'iris, des de la base teòrica fins a la implementació del mateix per mitjà de l'estudi de diferents mètodes descrits en altres documents.

L'estructura de l'iris humà roman estable al llarg del temps, contenint aquesta diverses estructures fàcils d'identificar, les quals són considerades úniques en cada persona [1]. Tal informació és extreta mitjançant tècniques matemàtiques de reconeixement de patrons per obtenir així un codi associat a l'iris que pot ser emprat en sistemes d'autenticació biomètrica.

El principi de funcionament és el següent: primer el sistema ha de localitzar els límits interior i exterior de l'iris (la pupil·la i el limbe) en una imatge d'un ull. Posteriors subrutines detecten i exclouen les parpelles, les pestanyes i les reflexions especulars que sovint obstrueixen parts de l'iris. El conjunt de píxels que contenen únicament l'iris, normalitzat per un model conegut com a "rubber sheet model" que compensa la dilatació o constricció de la pupil·la, és analitzat posteriorment per extreure un codi de l'iris que codifica la informació necessària per comparar dues imatges de l'iris. El codi generat mitjançant les imatges d'un iris es compara amb una o diverses plantilles emmagatzemades en la base de dades. Si la distància de Hamming es troba per sota del llindar de decisió, es valida la identitat de cert individu a causa de la improbabilitat estadística que dues persones diferents puguin coincidir per casualitat en tants bits del codi, donada l'alta entropia de les plantilles de l'iris.

El procés de segmentació i normalització de l'iris suposa un repte a causa de la presència de pestanyes, parpelles i reflexions que poden obstruir les regions de l'iris. A més, la dilatació de la pupil·la a causa dels diferents nivells d'il·luminació i l'inconvenient que l'iris i la pupil·la no són concèntrics comporten que aquest tipus de biometria sigui bastant complexa.


State of the art

John Daugman laid the foundations of iris recognition systems back in 1994, when he patented and published in a paper his algorithm for image processing, feature extraction, and matching.

The principle of iris recognition is based on the failure of a test of statistical independence on iris phase structure projected under 2D Gabor wavelet filters. The combinatorial complexity of this phase information across different persons provides high enough entropy among samples from different classes (different irises) to yield reliable decisions about personal identity with extremely high confidence.


Statement of the purpose

The purpose of this project is to understand the fundamentals of iris recognition systems: how these methods are implemented, what the recognition relies on, how these methods are used, and what they achieve, along with their limitations.

An implementation of an iris recognition system is carried out along this project, trying to replicate as closely as possible the results achieved by John Daugman. The prototype code has been written under Matlab R2016a, but for a real application scenario C++ or other more efficient platforms should be considered.


Contents

1 Introduction to biometrics  1
  1.1 Anatomy of the human eye  2
    1.1.1 The iris  3
    1.1.2 The pupil  4
  1.2 Iris biometry  5

2 Encoding an iris  6
  2.1 Locating the pupil  7
    2.1.1 Binarization  7
    2.1.2 The Canny edge detector  8
    2.1.3 Circular Hough transform  9
  2.2 Locating the limbus  10
  2.3 Identify non iris artifacts  12
  2.4 Unwrapping the iris  13
    2.4.1 Rubber sheet model  14
    2.4.2 Bio-mechanical model  15
    2.4.3 Discussion  17
  2.5 Feature extraction and encoding  19

3 Matching iris codes  22
  3.1 Achieving orientation invariance  24
  3.2 Performance of the code  25

4 Experimental results  26
  4.1 Database characterization  27
  4.2 Code sensitivity  29
  4.3 Statistic results  32

5 Conclusions and future work  34


List of Figures

1.1 Steps in a biometric system (1)  1
1.2 Steps in a biometric system (2)  1
1.3 Eye anatomy  2
1.4 Iris muscles  3
1.5 Iris texture  4

2.1 ROI extraction process  6
2.2 Encoding process  6
2.3 Pupil location process  7
2.4 Original biometric sample  7
2.5 Binarized biometric sample  7
2.6 Canny input sample  8
2.7 Canny edge response  8
2.8 2D circular Hough transform  10
2.9 3D circular Hough transform  10
2.10 Biometric sample with soft iris transition towards sclera  11
2.11 Biometric sample with double limbic boundary  11
2.12 Segmented iris from image 2.10  11
2.13 Segmented iris from image 2.11  11
2.14 Example of segmented iris images  12
2.15 Points to fit in parabola  13
2.16 Eyelid segmentation  13
2.17 Example of the normalization of an iris  14
2.18 Rubber sheet model meshwork  15
2.19 Rubber sheet model crosslinks  15
2.20 Bio-mechanical model radial displacement prediction for ρ = 0.75  16
2.21 Bio-mechanical model final position prediction for ρ = 0.75  16
2.22 Bio-mechanical model meshwork for ρ = 0.75  16
2.23 Rubber sheet model meshwork for ρ = 0.75  16
2.24 Set of correction curves  17
2.25 Biometric sample A  18
2.26 Biometric sample B  18
2.27 Comparison of the optimum sampling curves A  18
2.28 Biometric sample A2  19
2.29 Biometric sample B2  19
2.30 Comparison of the optimum sampling curves B  19
2.31 Gabor filter bank  20


3.1 HD distribution of uncorrelated iris  23
3.2 Observed vs Binomial cumulatives  23
3.3 Original HD distribution  25
3.4 Rotated HD distribution  25

4.1 Distribution of shifts for x axis  27
4.2 Distribution of shifts for y axis  27
4.3 Distance from limbic center relative to pupil radius  27
4.4 Distribution of pupil radius  28
4.5 Distribution of limbus radius  28
4.6 Distribution of dilation ratios (ρ)  28
4.7 Code shift distribution of uncorrelated iris  29
4.8 Code shift distribution of same iris  29
4.9 Decision environment at θ = 0 without angular correction  29
4.10 Decision environment at θ = 0 after angular correction  29
4.11 Decision environment at θ = 90 without angular correction  30
4.12 Decision environment at θ = 90 after angular correction  30
4.13 Code sensitivity at θ = 0  30
4.14 Code sensitivity at θ = 90  30
4.15 Decidability  32
4.16 True positives (TP)  33
4.17 Decision environment at λ = 10  33
4.18 Decision environment at λ = 15  33


Chapter 1

Introduction to biometrics

Biometrics refers to metrics related to human characteristics which can be used to measure and describe physiological and behavioural characteristics of an individual. Some examples of physiological characteristics include, but are not limited to, fingerprint, face, palm print, iris, and retina. Most biometric systems of this kind are image-based, but not all: voice, for example, requires spectral analysis to perform biometric identification. These traits can be used to identify an individual and provide reliable automatic recognition of persons.

The principal steps in a biometric system can be described by the following chart:

Biometric sample → ROI segmentation → Feature extraction → Characteristic vector

Figure 1.1: Steps in a biometric system (1)

Once the characteristic vector is extracted from the biometric sample, it is commonly encrypted to prevent possible data leakage, which could lead to identity impersonation. Finally, the characteristic vector is compared with a database of characteristic vectors to validate the identity of an individual; such vectors may or may not have been previously encrypted. If the system relies on encryption, the matching procedure must act accordingly, either by decrypting at the time of matching or by working with homomorphic encryption.

Characteristic vector → Matching (against the database) → Decision

Figure 1.2: Steps in a biometric system (2)


The matching criteria rely on a characteristic and unique pattern associated to the individual. The key issue in all pattern recognition problems is the relation between inter-class and intra-class variability: objects can be reliably classified only if the variability among different instances of a given class is significantly less than the variability between different classes. For example, in face recognition, difficulties arise from the fact that the face is a changeable social organ displaying a variety of expressions, as well as being an active three-dimensional object whose image varies with viewing angle, pose, age and other factors. Against this intra-class (same face) variability, inter-class variability is limited because different faces possess the same basic set of features in the same canonical geometry, an extreme case being monozygotic twins. Similar problems arise when dealing with fingerprint biometrics, since fingerprints are often subject to manual labour and can become damaged, introducing marks or modifying the original print. The iris, instead, presents a much wider range of characteristics with an enormous variability (since some characteristics may or may not be present in a given iris), is well protected from the environment since it is an internal (yet externally visible) organ of the eye, and is stable over time, reaching its final shape by the eighth month of gestation. For all of these reasons, iris patterns become interesting as an alternative approach to reliable recognition of persons.

1.1 Anatomy of the human eye

The external eye is composed of two main pieces. The smaller frontal part, transparent and more curved, called the cornea, is linked to the larger white unit called the sclera. The sclerotic chamber constitutes the remaining region of the eye; it has a white color and its radius is typically about 12 mm. The cornea and sclera are connected by a ring called the limbus. The iris is the colored circular structure concentrically surrounding the center of the eye, and finally there is the pupil, which appears to be black. The size of the pupil controls the amount of light entering the eye.

Figure 1.3: Eye anatomy

The eyeball grows rapidly, increasing from about 16–17 millimetres at birth to 22.5–23 mm by three years of age. By age 13, the eye attains its full size of around 24 mm; dimensions differ among adults by only one or two millimetres, staying remarkably consistent across different ethnicities.


1.1.1 The iris

There are some important aspects of the iris which are to be considered when building a biometric system: what is its anatomic structure? Is this structure genetically bound? When does it attain its maturity? Does its texture remain invariant over time? Some of these aspects may help understand the limitations associated to iris biometric systems.

The iris has a fixed diameter, averaging 12 mm among the population. It is a muscular tissue responsible for controlling the diameter and size of the pupil, and thus the amount of light reaching the retina. There are two groups of smooth muscles for this purpose: a circular group called the sphincter pupillae, and a radial group called the dilator pupillae. This can be seen in fig. 1.4.

Figure 1.4: Iris muscles

The iris begins to form in the third month of gestation and the structures creating its pattern are largely complete by the eighth month, although pigment accretion can continue into the first postnatal years. Its complex pattern can contain many distinctive features such as arching ligaments, furrows, ridges, crypts, rings, corona, freckles, and a zigzag collarette, some of which can be seen in fig. 1.4. Statistical tests of iris texture demonstrate that the patterns associated to each individual are not genetically bound [1], whereas the color of the iris is strongly determined genetically; this means that even monozygotic twins, which possess identical DNA, will present different iris textures. Furthermore, the two eyes of an individual contain completely independent iris patterns.


Figure 1.5: Iris texture

The visible surface of the multilayered iris includes two sectors that differ in color: an outer ciliary part and an inner pupillary part. These two parts are divided by the collarette, which appears as a zigzag pattern.

The striated trabecular meshwork of elastic pectinate ligament creates the predominant texture under visible light, whereas in the near-infrared (NIR) wavelengths slowly modulated stromal features dominate the iris pattern. In NIR wavelengths, even darkly pigmented irises reveal rich and complex features.

1.1.2 The pupil

The pupil is the opening in the centre of the eye which allows light to strike the retina. Light enters through the pupil and goes through the lens, which focuses the image on the retina. The pupil appears black because light rays entering it are either absorbed directly by the tissues inside the eye, or absorbed after diffuse reflections within the eye that mostly miss exiting the narrow pupil. In optical terms, the anatomical pupil is the eye's aperture and the iris is the aperture stop, adapting in diameter to allow more or less light to reach the retina. When more light is needed, the pupil dilates (a process known as mydriasis), and when there is an excess of light reaching the retina, the pupil constricts (a process known as miosis). The normal pupil size in adults varies from 2 to 4 mm in diameter when constricted to 4 to 8 mm when dilated. Taking into consideration that the diameter of the iris is fixed around 12 mm, the expected dilation ratio is defined as:

ρ = pupil diameter / iris diameter  (1.1)

Parameter ρ is thereby expected to lie in the range ρ ∈ [0.15, 0.7]. The pupil and the limbus are not concentric; in fact, the pupil center tends to be shifted towards the nasal region, and it is not unusual to observe shifts of around 20% of the pupil radius. Research shows that the pupil center is related to its constriction [2], the limbus and pupil becoming more concentric the more the pupil dilates.
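As a quick sanity check, the dilation ratio of eq. (1.1) can be evaluated for the extreme adult pupil sizes quoted above. The thesis prototype was written in Matlab; the sketch below uses Python, and the function name is ours:

```python
def dilation_ratio(pupil_diameter_mm, iris_diameter_mm=12.0):
    """Dilation ratio rho = pupil diameter / iris diameter (eq. 1.1)."""
    return pupil_diameter_mm / iris_diameter_mm

# Extreme adult pupil sizes: 2 mm (fully constricted) to 8 mm (fully dilated)
rho_min = dilation_ratio(2.0)   # ≈ 0.17
rho_max = dilation_ratio(8.0)   # ≈ 0.67
```

Both extremes fall inside the expected interval [0.15, 0.7].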

Besides the changes in size experienced by the pupil determined by ambient illumination, focal length, drug action, emotional conditions, among others, the pupil is also subject to rhythmic but regular variations in diameter, called hippus, occurring in a frequency range of 0.05 to 0.3 Hz. This phenomenon is independent of eye movements or changes in illumination and is particularly noticeable when pupil function is tested with a light.

1.2 Iris biometry

First of all, the biometric system has to localize the inner and outer boundaries of the iris (pupil and limbus) on the image of an eye. Further subroutines detect and exclude eyelids, eyelashes, and specular reflections that often occlude parts of the iris. The set of pixels containing only the iris, normalized by a rubber-sheet model to compensate for pupil dilation or constriction, is then analyzed to extract a bit pattern encoding the information needed to compare two iris images.

For identification (one-to-many template matching) or verification (one-to-one template matching), the resultant code obtained by imaging an iris is compared to stored template(s) in a database. If the Hamming distance is below the decision threshold, a positive identification has effectively been made, because of the extreme statistical improbability that two different persons could agree by chance in so many bits, given the high entropy of iris templates.
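The comparison step reduces to a fractional Hamming distance plus a threshold test. A minimal sketch in Python (the 0.32 default threshold below is only illustrative, not the value used in this work; function names are ours):

```python
import numpy as np

def fractional_hamming_distance(code_a, code_b):
    """Fraction of bits in which two binary iris codes disagree."""
    a = np.asarray(code_a, dtype=bool)
    b = np.asarray(code_b, dtype=bool)
    return np.count_nonzero(a ^ b) / a.size

def is_match(code, template, threshold=0.32):
    """Accept when the Hamming distance falls below the decision threshold.
    The default threshold is an illustrative value; it is tuned per system."""
    return fractional_hamming_distance(code, template) < threshold
```

Codes from uncorrelated irises are expected to cluster around a distance of 0.5, which is what makes a low threshold so discriminating.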

A minimum amount of information is required to capture the rich details of iris patterns; therefore an imaging system should resolve a minimum of 70 pixels in iris radius [1]. An individual with darkly pigmented irises exhibits a low contrast between the pupil and the iris region if the image is acquired under natural light, making segmentation more difficult; for this reason NIR imaging is desirable. Furthermore, the majority of persons worldwide have “dark brown eyes”, the dominant phenotype of the human population, revealing less visible texture in the visible wavelength (VW) band but appearing richly structured in the NIR band. Using the NIR spectrum also enables the blocking of corneal specular reflections from a bright ambient environment, by allowing only those NIR wavelengths from the narrow-band illuminator back into the iris camera. An inconvenience when working with NIR imaging is that the limbic boundary usually has extremely soft contrast when long-wavelength NIR illumination is used, causing the segmentation of the iris to become more complicated.


Chapter 2

Encoding an iris

An iris biometric system firstly has to resolve the location of the region of interest (ROI), by locating the inner and outer boundaries of the iris (pupil and limbus) in an image of an eye. The ROI extraction relies on a proper identification of the pupil. The reason for this is that it is the most distinctive and easiest part to identify in the eye, and it also bounds the region in which the limbus should be present. A proper biometric sample has to provide enough information to identify the pupil correctly; otherwise the image sample is discarded. Further subroutines detect and exclude eyelids, eyelashes, and shadows that often occlude parts of the iris.

Locate the pupil → Locate the iris → Identify non iris artifacts

Figure 2.1: ROI extraction process.

Once the ROI has been located, the set of pixels containing only the iris is then normalized to compensate for pupil dilation or constriction and for the non-concentricity of the iris and the pupil, in a process known as iris unwrapping. The normalized iris is then projected under 2D Gabor wavelets to extract the texture information of the iris. The projected iris is finally binarized to extract an iris code encoding the information needed to compare two iris images.

ROI segmentation → Unwrap the iris → Feature extraction → Iris code

Figure 2.2: Encoding process.

In this thesis, two iris unwrapping models have been studied to compensate for pupil dilation and constriction: the rubber sheet model proposed by J. Daugman, which compensates for pupil dilation without considering the dilation ratio, and a bio-mechanical model, which considers the dilation and adjusts the interpolation grid accordingly.
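The rubber sheet model can be sketched as follows: at each angle, sample along the segment joining the pupil boundary point to the limbus boundary point, producing a fixed-size (r, θ) grid regardless of dilation. This is a simplified Python illustration (nearest-pixel sampling, names ours; a real implementation would interpolate):

```python
import numpy as np

def rubber_sheet(img, pupil, limbus, n_r=16, n_theta=64):
    """Map the iris annulus onto a fixed (r, theta) grid by interpolating,
    at each angle, between the pupil and limbus boundary points.
    pupil and limbus are (x, y, radius) triples."""
    xp, yp, rp = pupil
    xl, yl, rl = limbus
    h, w = img.shape
    out = np.zeros((n_r, n_theta), dtype=img.dtype)
    for j, t in enumerate(np.linspace(0, 2 * np.pi, n_theta, endpoint=False)):
        # boundary points at this angle on the (possibly non-concentric) circles
        x0, y0 = xp + rp * np.cos(t), yp + rp * np.sin(t)
        x1, y1 = xl + rl * np.cos(t), yl + rl * np.sin(t)
        for i, r in enumerate(np.linspace(0.0, 1.0, n_r)):
            x = (1 - r) * x0 + r * x1
            y = (1 - r) * y0 + r * y1
            out[i, j] = img[min(h - 1, max(0, round(y))),
                            min(w - 1, max(0, round(x)))]
    return out
```

Because the grid is parameterized by the two detected boundaries, the same iris sampled at different dilations maps to (approximately) the same normalized image.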


2.1 Locating the pupil

When looking for the pupil there are two main characteristics which can be exploited to locate it: the pupil is one of the darkest regions in the eye, and it is circular in shape. Therefore, the main steps to carry out are segmenting the sample to identify the dark regions (binarization) and then looking for circles. The search for circular-shaped regions is efficiently achieved by the circular Hough transform (CHT), but rather than looking for circles in the binarized image, its input is an edge response image provided by the Canny edge detector to reduce the amount of information. The overall process can be described by the following chart:

Biometric sample → Binarization → Canny edge detector → Circular Hough Tr.

Figure 2.3: Pupil location process

2.1.1 Binarization

The binarization process takes an image, say I, and applies a certain threshold Ith. Such a threshold is adjusted through experimental analysis to extract only the dark regions of the biometric sample. The binarization function is defined as:

IB(x, y) = { 1 if I(x, y) < Ith; 0 otherwise }  (2.1)

In fig. 2.4 we can see the original iris sample and in fig. 2.5 the result of the binarization with Ith.

Figure 2.4: Original biometric sample
Figure 2.5: Binarized biometric sample
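Eq. (2.1) translates directly into a one-liner. A Python/NumPy sketch (the thesis prototype was in Matlab; the function name is ours):

```python
import numpy as np

def binarize(img, i_th):
    """Eq. (2.1): 1 where the pixel is darker than the threshold, else 0."""
    return (np.asarray(img) < i_th).astype(np.uint8)
```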


2.1.2 The Canny edge detector

The purpose of edge detection is to significantly reduce the amount of data in an image, while preserving the structural properties to be used for further image processing. Canny's edge detector uses a multi-stage algorithm to detect a wide range of edges in images. The process of the Canny edge detection algorithm can be broken down into 5 steps:

1. Noise suppression: smooth the image using a Gaussian filter to reduce noise.

2. Finding gradients: apply the Sobel operator to find image gradients.

3. Non-maximum suppression: preserve all local maximum edge values in the gradient image and suppress the rest.

4. Double threshold: edge pixels stronger than the high threshold are marked as strong; edge pixels weaker than the low threshold are suppressed, and edge pixels between the two thresholds are marked as weak.

5. Hysteresis: strong edges are interpreted as “true edges” and weak edges are included if and only if they are connected to strong edges.

The resultant image after applying Canny edge detection to the image seen in fig. 2.6 is shown in fig. 2.7. A detailed description of the Canny edge detector steps can be found in [3] and in Canny's original description [4].

Figure 2.6: Canny input sample
Figure 2.7: Canny edge response
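Steps 4 and 5 above (double threshold plus hysteresis) are the least obvious ones. A minimal sketch on a gradient-magnitude array, assuming 8-connectivity (Python, names ours):

```python
import numpy as np
from collections import deque

def hysteresis(grad, low, high):
    """Canny steps 4-5: double threshold, then keep weak edge pixels only
    if they are connected (8-neighbourhood) to a strong edge pixel."""
    strong = grad >= high
    weak = (grad >= low) & ~strong
    edges = strong.copy()
    q = deque(zip(*np.nonzero(strong)))   # start flood-fill from strong pixels
    while q:
        y, x = q.popleft()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < grad.shape[0] and 0 <= nx < grad.shape[1]:
                    if weak[ny, nx] and not edges[ny, nx]:
                        edges[ny, nx] = True
                        q.append((ny, nx))
    return edges
```

Isolated weak responses (typically noise) are discarded, while weak pixels that continue a strong contour survive.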


2.1.3 Circular Hough transform

Hough transform (HT) algorithms can be used to determine the parameters of simple parameterizable structures present in an image, such as lines, circles, ellipses and parabolas. The circular Hough transform (CHT) is designed to find a circle by characterizing its center (x0, y0) and radius r0 in the parameter space. The equation that defines a circle is given by:

(x − x0)² + (y − y0)² = r0²  (2.2)

Where x and y are the points of the circle on the image. The parametric form of this circle equation is given by the following expressions:

x = x0 + r0 × cos(θ) (2.3)

y = y0 + r0 × sin(θ) (2.4)

In the CHT method, for each edge point (xi, yi) a circle, say C, with radius rc is drawn considering (xi, yi) as the center. Consider an arbitrary point p = (xc, yc) on C; then the circle centered on (xc, yc) with radius rc must pass through (xi, yi). To find the desired circle, a majority voting technique (i.e., the Hough transform) is applied: for each point on C, the accumulator value is increased by one. For each edge point (xi, yi), i = 1, 2, 3, ..., n, the HT is defined as:

H(xc, yc, rc) = Σi h(xi, yi, xc, yc, rc)  (2.5)

Where:

h(xi, yi, xc, yc, rc) = { 1 if g(xi, yi, xc, yc, rc) = 0; 0 otherwise }  (2.6)

With:

C: g(xi, yi, xc, yc, rc) = (xi − xc)² + (yi − yc)² − rc²  (2.7)


The parameter triple (xc, yc, rc) that maximizes H is common to the largest number of edge points and is a reasonable choice to represent the circular contour. In figs. 2.8 and 2.9 the CHT of the image obtained in fig. 2.7 can be seen.

Figure 2.8: 2D circular Hough transform
Figure 2.9: 3D circular Hough transform

The peak value corresponds to the center candidate of the voting procedure for agiven radius.
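A direct (unoptimized) accumulator implementation of eqs. (2.5)–(2.7) might look like the following sketch: each edge point votes for every candidate center lying at distance rc from it, and the accumulator peak identifies the circle. Python, names and sampling density ours:

```python
import numpy as np

def circular_hough(edge_points, shape, radii, n_theta=100):
    """Accumulate H(xc, yc, rc): each edge point (xi, yi) votes for the
    candidate centers lying on a circle of radius rc around it."""
    h, w = shape
    acc = np.zeros((len(radii), h, w), dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    for (x, y) in edge_points:
        for k, r in enumerate(radii):
            xc = np.rint(x - r * np.cos(thetas)).astype(int)
            yc = np.rint(y - r * np.sin(thetas)).astype(int)
            keep = (xc >= 0) & (xc < w) & (yc >= 0) & (yc < h)
            # np.add.at accumulates correctly even for repeated cells
            np.add.at(acc, (k, yc[keep], xc[keep]), 1)
    return acc
```

Taking `np.argmax` over the accumulator recovers the (rc, yc, xc) triple voted for by the most edge points, i.e. the peak discussed above.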

2.2 Locating the limbus

The parameters of the pupil can now be used to estimate the iris parameters, since the pupil and iris centers present an offset which bounds the area in which a healthy iris shall be contained. The radius of the iris is also bounded by the extreme dilation ratios (ρ) of the pupil. Therefore, the procedure for locating the iris starts from the parameters of the pupil. Then, it searches for the circular path where there is maximum change in pixel values of the circular contour over the blurred partial derivative of the edge image obtained from the Canny edge detector, by varying the radius between rpup × [1.2, 1.8] and shifting the center in the region contained in (xp, yp) ± 0.2 × rpup. The blurring over the edge response provides a higher tolerance for deviations over the contour image caused by digitization of the pixels, and reduces the negative effect of the eccentricity of the iris. The operator can be described by the following equation:

max_(r, x0, y0) | Gσ(r) ∗ ∂/∂r ∮_(r, x0, y0) I(x, y)/(2πr) ds | (2.8)

Where ∗ denotes the convolution product and ds the arc element along the circular contour of radius r (in pixels) around the candidate center. This operator behaves as a circular edge detector to identify the outer limit of the iris (limbus). Locating the iris can become very difficult since some irises present a soft transition towards the sclera when imaged in the NIR spectrum, as can be seen in fig. 2.10, while other irises present both an outer and an inner limbic boundary, as in fig. 2.11.
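A discretized form of this circular edge detector can be sketched as below. The helper names, the candidate grid, and the small Gaussian smoothing kernel are illustrative assumptions, not the thesis code:

```python
import numpy as np

def contour_mean(I, x0, y0, r, n=128):
    """Mean intensity along the circle of radius r centred at (x0, y0)."""
    th = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    x = np.clip(np.rint(x0 + r * np.cos(th)).astype(int), 0, I.shape[1] - 1)
    y = np.clip(np.rint(y0 + r * np.sin(th)).astype(int), 0, I.shape[0] - 1)
    return I[y, x].mean()

def integro_differential(I, centers, radii, sigma=2.0):
    """Maximise the Gaussian-smoothed radial derivative of the circular
    contour mean: a discretized version of the operator in eq. 2.8."""
    k = np.exp(-0.5 * (np.arange(-3, 4) / sigma) ** 2)
    k /= k.sum()                                   # Gaussian smoothing kernel
    best, best_params = -np.inf, None
    for (x0, y0) in centers:
        means = np.array([contour_mean(I, x0, y0, r) for r in radii])
        dmean = np.convolve(np.diff(means), k, mode='same')
        i = int(np.argmax(np.abs(dmean)))
        if abs(dmean[i]) > best:
            best, best_params = abs(dmean[i]), (x0, y0, radii[i])
    return best_params                             # (x0, y0, r) of sharpest edge
```

On a synthetic dark disk the maximizer recovers the disk's center and radius.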


Figure 2.10: Biometric sample with soft iris transition towards sclera

Figure 2.11: Biometric sample with double limbic boundary

This inconvenience negatively affects the precision of the iris localization. Furthermore, the eyelashes and eyelids are troublesome when looking for the limbus, since they are treated as possible borders of the iris when the operator searches exhaustively over the region of the eye. To address this, the operator can be modified to search only over the maximum variation of the contour along a defined arc of a given radius, thereby suppressing most of the eyelids and eyelashes. Applying these modifications to the operator resulted in the following iris segmentations for the images in figs. 2.10 and 2.11:

Figure 2.12: Segmented iris from image 2.10

Figure 2.13: Segmented iris from image 2.11


The following images show a set of segmented iris images:

Figure 2.14: Example of segmented iris images

If the database of study presents stable imaging conditions across the samples (same distance to the sensor and same zoom factor), then this operator can be optimized by narrowing the range of radii to search for the limbus, since the iris, unlike the pupil, possesses a fixed radius with an average among the human population of around 12 mm. The same can be applied to the center of the iris: according to the distance to the sensor and the zoom, the search region can be narrowed. To achieve this, a small subset of the database can be used as a training set to determine these parameters and increase the efficiency of the algorithm.

2.3 Identify non iris artifacts

A particularly important issue involved in iris segmentation is the localization of eyelids, eyelashes and shadows (EES). EES localization is important because the iris is almost always partially occluded by these factors, which will increase false acceptance and false rejection rates if not properly excluded. It is important to note that there is another major factor which harms the performance of iris recognition: reflections; but this factor only appears when working in the visible wavelength (VW) spectrum. This project targets only samples obtained in the NIR spectrum for practical reasons, but under VW imaging systems specular reflections should be addressed.


Efficient EES localization is difficult: first, accurate eyelid localization is challenging due to eyelash occlusion; and second, eyelashes vary in intensity, amount and shape irregularity. There are two ways to address EES localization: establishing a statistical eyelid curvature model together with a common arc structure to identify eyelashes, or excluding a predefined region of the iris. Although the first method is more desirable, we have to consider that EES localization accuracy is not faultless and carries the time consumption associated with image processing, whereas excluding a predefined region of the iris has no computational cost but comes at the expense of discarding relevant information.

In this project, a first approach to eyelid localization was tackled by applying a rectangular average filter followed by a horizontal Sobel filter, which is then binarized using a threshold determined by experimental analysis. The points within the pupil and outside of the iris are suppressed, and the resultant points are then used to fit a parabola.

Figure 2.15: Points to fit in parabola
Figure 2.16: Eyelid segmentation

The implemented method was proposed by Basit et al. in [5]. The reason for implementing this method was its simplicity, but it did not provide the desired results. For this reason, other more accurate and complex methods should be explored, some of which can be seen in [6, 7, 8, 9]. Instead of addressing EES, a predefined region of the iris was discarded from comparison, but further development of this project should properly address EES.
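The pipeline described above (average filter, horizontal edge response, binarization, point suppression, parabola fit) can be sketched as follows. `fit_eyelid` is a hypothetical helper; the 3x3 box blur and the vertical central difference are simplified stand-ins for the rectangular average and Sobel filters:

```python
import numpy as np

def fit_eyelid(I, threshold, pupil, iris):
    """Sketch of the eyelid localisation: blur, horizontal-edge response,
    binarise, suppress points outside the iris or inside the pupil, then
    least-squares fit y = a x^2 + b x + c.  `pupil` and `iris` are
    (xc, yc, r) circle parameters."""
    F = I.astype(float)
    F = (F + np.roll(F, 1, 0) + np.roll(F, -1, 0)) / 3.0   # vertical box blur
    F = (F + np.roll(F, 1, 1) + np.roll(F, -1, 1)) / 3.0   # horizontal box blur
    gy = np.abs(np.roll(F, -1, 0) - np.roll(F, 1, 0))      # horizontal edges
    gy[0, :] = 0                                           # discard wrap-around rows
    gy[-1, :] = 0
    ys, xs = np.nonzero(gy > threshold)                    # binarisation
    # keep only points inside the iris circle but outside the pupil circle
    keep = ((xs - iris[0]) ** 2 + (ys - iris[1]) ** 2 < iris[2] ** 2) & \
           ((xs - pupil[0]) ** 2 + (ys - pupil[1]) ** 2 > pupil[2] ** 2)
    a, b, c = np.polyfit(xs[keep], ys[keep], 2)
    return a, b, c
```

On a synthetic image with a parabolic boundary, the fit recovers the quadratic coefficient of the boundary curve.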

2.4 Unwrapping the iris

Robust representations for pattern recognition must be invariant to changes in the size, position, and orientation of the patterns. In the case of iris recognition, this means we must create a representation that is invariant to the optical size of the iris in the image (which depends upon the distance to the eye and the zoom), the size of the pupil within the iris (which introduces a non-affine pattern deformation), the location of the iris within the image, and the iris orientation, which depends upon head tilt, torsional eye rotation within its socket, and camera angles. Fortunately, invariance to all of these factors can readily be achieved.


For on-axis but possibly rotated iris images, it is natural to use a projected pseudo-polar coordinate system. The polar coordinate grid is not necessarily concentric, since in most eyes the pupil and the iris are not concentric. This coordinate system can be described as doubly dimensionless: the polar variable, angle (θ), is inherently dimensionless, but in this case the radial variable is also dimensionless, because it ranges from the pupillary boundary to the limbus and can be described by the normalized unit interval [0, 1]. Therefore, the normalized iris space is defined along its radial r ∈ [0, 1] and angular θ ∈ [0, 2π] components. The following image depicts the result of normalizing the surface of the iris with the rubber sheet model:

Figure 2.17: Example of the normalization of an iris

In this project, two different models for constructing the elastic meshwork of the iris have been studied: a first approach known as the rubber sheet model, and a model that intends to compensate for pupil dilation, namely the bio-mechanical model.

2.4.1 Rubber sheet model

This model approaches the dilation and constriction of the pupil by modeling the coordinate system as the stretching of a homogeneous rubber sheet, having the topology of an annulus anchored along its outer perimeter, with tension controlled by an (off-centered) interior ring of variable radius. The homogeneous rubber sheet model assigns to each point on the iris, regardless of its size and pupillary dilation, a pair of real coordinates (r, θ), where r lies on the unit interval [0, 1] and θ is an angle ranging over [0, 2π]. The remapping of the iris image from raw Cartesian coordinates (x, y) to the dimensionless non-concentric polar coordinate system (r, θ) can be represented as:

I(x(r, θ), y(r, θ)) → I(r, θ) (2.9)

Where the remapping equations are given by:

R(r) = (1 − r) × rpupil + r × rlimbus (2.10)

x(r, θ) = (1 − r) × xpupil + r × xlimbus + R(r) × cos(θ)
y(r, θ) = (1 − r) × ypupil + r × ylimbus + R(r) × sin(θ) (2.11)

In which R(r) in eq. 2.10 represents the progression of the radius, and (x(r, θ), y(r, θ)) in eq. 2.11 provide the coordinates associated to each R(r). Since the radial coordinate r ranges from the iris inner boundary rpup to its outer boundary rlimbus as a unit interval, it inherently corrects for the elastic pattern deformation in the iris when the pupil changes in size. The resultant interpolation grid described by equations 2.11 with N = 8 radial sections and M = 32 angular sections can be seen in figs. 2.18 and 2.19.
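The remapping of eqs. 2.10-2.11 can be sketched with nearest-neighbour sampling. `rubber_sheet` is an illustrative helper, not the project's implementation, and the sample counts N, M are free parameters:

```python
import numpy as np

def rubber_sheet(I, pupil, limbus, N=64, M=256):
    """Remap I(x, y) to the dimensionless grid I(r, theta) of eqs. 2.9-2.11.
    `pupil` and `limbus` are (x, y, radius) triples; N radial samples in
    [0, 1] and M angular samples in [0, 2*pi)."""
    xp, yp, rp = pupil
    xl, yl, rl = limbus
    r = np.linspace(0.0, 1.0, N)[:, None]               # radial coordinate
    th = np.linspace(0.0, 2 * np.pi, M, endpoint=False)[None, :]
    R = (1 - r) * rp + r * rl                           # eq. 2.10
    x = (1 - r) * xp + r * xl + R * np.cos(th)          # eq. 2.11
    y = (1 - r) * yp + r * yl + R * np.sin(th)
    xi = np.clip(np.rint(x).astype(int), 0, I.shape[1] - 1)
    yi = np.clip(np.rint(y).astype(int), 0, I.shape[0] - 1)
    return I[yi, xi]                                    # nearest-neighbour sampling
```

On an image whose intensity equals the distance from the center, the unwrapped rows are approximately constant: the first row reads the pupil radius, the last the limbus radius.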

Figure 2.18: Rubber sheet model meshwork

Figure 2.19: Rubber sheet model crosslinks

In the representation of the iris described in figs. 2.18 and 2.19 there is an isolated magenta point that corresponds to a theoretical circle of radius r = 0; this point can be considered the “geometric center” of the meshwork, since it is the central point to which all radial and angular points refer.

The coordinate system described above achieves invariance to the position and size of the iris within the image, and to the dilation of the pupil within the iris. However, it is not invariant to the orientation of the iris within the image plane. How this effect is compensated is detailed in section 3.1.

2.4.2 Bio-mechanical model

The effect of changes in pupil size on iris recognition has become an active research topic in recent years, and several factors have been demonstrated to induce varying levels of pupil dilation that negatively affect the performance of iris recognition systems. These factors include changes in ambient lighting conditions, alcohol, drugs, and aging. Physiological studies indicate that the deformation of the iris tissue caused by pupil dilation is nonlinear. Therefore, the incorporation of a nonlinear iris normalization scheme will likely address the problems associated with large changes in pupil size.

In [10], Tomeo-Reyes et al. proposed a nonlinear normalization scheme that approaches the dilation and constriction of the pupil with a coordinate system that considers the radial displacement of any point in the iris at a given dilation level.

Unlike the rubber sheet model, in which equally spaced radial samples are considered at each angular position, the proposed method uses the radial displacement estimated by the bio-mechanical model to perform the radial sampling. Fig. 2.20 shows the displacement u(r) obtained in an extreme dilation case ρ = 0.75, and fig. 2.21 shows the final radial position r + u(r) associated to the given dilation ratio.

Figure 2.20: Bio-mechanical model radial displacement prediction u(r) for ρ = 0.75 (bio-mechanical vs. rubber sheet model, over the normalized radius)

Figure 2.21: Bio-mechanical model final position prediction r + u(r) for ρ = 0.75 (bio-mechanical vs. rubber sheet model, over the normalized radius)

Therefore, according to equations 2.11, the new function r′ that remaps the coordinate system to compensate for the dilation of the pupil is given by:

r′ = r + u(r) (2.12)

When applying the correction to the hypothetical representation shown in fig. 2.18, for a supposed dilation ratio of ρ = 0.75 and concentric pupil and limbus, the resultant meshworks are:

Figure 2.22: Bio-mechanical model meshwork for ρ = 0.75

Figure 2.23: Rubber sheet model meshwork for ρ = 0.75

One of the problems of the bio-mechanical model proposed by Tomeo-Reyes et al. in [10] is that it does not take into account relevant aspects of iris physiology, such as the non-concentricity of the iris and the pupil, and it lacks a model to compensate for pupil constriction, since the model they present is only valid for dilation scenarios. More detailed and precise models should be elaborated, taking into account dilation, constriction and the pupil shift along its dilation and contraction [2].

2.4.3 Discussion

At this point we have two possible normalization meshworks. The rubber sheet model could be considered a minimalistic approach for not taking the constriction/dilation ratio into consideration when normalizing the iris, but it still provides satisfying results when not dealing with extreme variations in pupil size. On the other hand, we have the models antagonistic to the rubber sheet model, which aim to address constriction/dilation by considering the anatomy of the iris. The complexity of an adequate model for such a purpose raises the question: is it worth it? In regard to the proposed bio-mechanical model, it is discounted here because of the fundamental basis it relies on: it fails by not considering the non-concentricity of the limbus and the pupil and, even worse, assumes that the structure of the iris is homogeneous when it is not. At contraction/dilation the iris folds over itself like a curtain, hiding texture which was previously visible and thereby harming the recognition irreparably. Experimental analysis shows that oversampling at values close to the pupil, when the dilation ratios are close to each other, tends to produce better HD scores than the rubber sheet model. A small example illustrates this statement by taking a set of curves associated to the dilation ratio and comparing the HD scores for two given irises after applying each of the curves to the normalization process:

Figure 2.24: Set of correction curves r + u(r) over the normalized radius


We compare two samples of the same iris with different ρ: sample A with ρA = 0.34 and sample B with ρB = 0.45.

Figure 2.25: Biometric sample A
Figure 2.26: Biometric sample B

After comparing both samples by applying the correction curves shown in fig. 2.24, the optimum curves are very similar and close to the rubber sheet model, as depicted in fig. 2.27. Although different dilation ratios are observed, we cannot appreciate a relation between the dilation and the associated corrective curves. For this particular example we can see that oversampling the region close to the pupil does not yield any improvement in the HD score. The obtained Hamming distances are HDrubber = 0.33 and, after applying the curve correction, HDcorrection = 0.32.

Figure 2.27: Comparison of the optimum sampling curves A (samples A and B, over the normalized radius)


We now take a second pair of irises with similar dilation ratios: ρA = 0.42 and ρB = 0.44.

Figure 2.28: Biometric sample A2
Figure 2.29: Biometric sample B2

The obtained HDs are HDrubber = 0.32 and HDcorrection = 0.29. In this scenario we can see in fig. 2.30 that oversampling the region close to the pupil does improve the HD score, although not by much.

Figure 2.30: Comparison of the optimum sampling curves B (samples A2 and B2, over the normalized radius)

To address this topic in depth, statistical tests should be carried out, but the improvement, if any, does not appear to be significant enough in terms of the HD score.

2.5 Feature extraction and encoding

From now on, we rename (r, θ) to (x, y). An effective strategy for extracting both coherent and incoherent textural information from images, such as the detailed texture of the iris, is the computation of 2D Gabor coefficients. This family of filters is conjointly optimal in providing the maximum possible resolution for information about the orientation and spatial frequency content of local image structure, simultaneously with information about position, achieving the theoretical lower bound for conjoint uncertainty over these four variables as dictated by the uncertainty principle. The general expression for the Gabor filters over the image domain (x, y) has the functional form:

G(x, y | α, β, λ, θ, φ) = e^(−x′²/α² − y′²/β²) e^(j(2πx′/λ + φ)) (2.13)

Where x′ and y′ can be decomposed according to the orientation θ of the filter:

x′ = x cos θ + y sin θ (2.14)

y′ = −x sin θ + y cos θ (2.15)

Where (α, β) specify the effective width and length, and λ specifies the wavelength in pixels/cycle (so 1/λ is the spatial frequency in cycles/pixel). A set of Gabor filters centered at the origin (x0, y0), with aspect ratio β/α = 1 and several wavelengths and orientations, can be seen in fig. 2.31 below.

Figure 2.31: Gabor filter bank
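A single complex kernel like those in the bank above can be generated directly from eq. 2.13. `gabor` is an illustrative helper; subtracting the kernel mean is one simple way to enforce the zero-DC property mentioned in the text (other constructions tune the parameters instead):

```python
import numpy as np

def gabor(size, alpha, beta, lam, theta, phi=0.0):
    """Complex 2D Gabor kernel of eq. 2.13: a Gaussian envelope of widths
    (alpha, beta) modulating a complex carrier of wavelength lam at
    orientation theta.  `size` should be odd."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)      # eq. 2.14
    yr = -x * np.sin(theta) + y * np.cos(theta)     # eq. 2.15
    g = np.exp(-(xr**2 / alpha**2 + yr**2 / beta**2)) \
        * np.exp(1j * (2 * np.pi * xr / lam + phi))
    return g - g.mean()   # remove residual DC so mean illumination is ignored
```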

Each bit in an iris code is computed by evaluating the phase of the projection of a local region of the iris image onto a given Gabor filter:

code(x0, y0 | α, β, λ, θ, φ) = 1 if φ{IG(x0, y0 | α, β, λ, θ, φ)} ∈ [0, π], and 0 if φ{IG(x0, y0 | α, β, λ, θ, φ)} ∈ [−π, 0) (2.16)

Where φ{·} denotes the phase and IG the projection of the normalized iris I(r, θ) onto the complex Gabor filters, produced by the convolution product:

IG(x, y|α, β, λ, θ, φ) = I(x, y)∗G(x, y|α, β, λ, θ, φ) (2.17)


By construction, 2D Gabor filters have no DC response in either their real or imaginary parts; this eliminates any dependency of the computed code bit on the mean illumination of the iris and on its contrast gain. Only phase information is used for recognizing irises, because amplitude information depends upon extraneous factors such as imaging contrast, illumination, and camera gain.

The binarized code captures the information of wavelet zero-crossings, as is clear from the operator sign in eq. 2.16. The extraction of phase has the further advantage that phase angles remain defined regardless of how poor the image contrast may be. For documentation about Gabor filters, some interesting references are [11, 12].
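Evaluating eq. 2.16 over a grid of local regions can be sketched as below. This is an illustrative, much-simplified encoder (one bit per region from one filter); a real system projects onto a bank of filters at several scales and orientations:

```python
import numpy as np

def iris_code(I, kernel, stride=4):
    """Quantise the phase of the projection of each local region of I onto
    the complex Gabor kernel (eqs. 2.16-2.17): bit = 1 if phase in [0, pi]."""
    kh, kw = kernel.shape
    bits = []
    for y in range(0, I.shape[0] - kh + 1, stride):
        for x in range(0, I.shape[1] - kw + 1, stride):
            z = np.sum(I[y:y + kh, x:x + kw] * kernel)   # local projection
            bits.append(1 if np.angle(z) >= 0 else 0)    # phase half-plane test
    return np.array(bits, dtype=np.uint8)
```

On noise-like input, the resulting bits are roughly balanced between 0 and 1, consistent with the equiprobable-bit property discussed in chapter 3.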


Chapter 3

Matching iris codes

The iris codes are matched to obtain a Hamming distance (HD) score as the measure of dissimilarity between any two irises. Each HD score is compared to a certain threshold to ascertain the identity of an individual. The matching of the codes is implemented by the exclusive-or operator (XOR) applied to the resultant phase bit vectors that encode any two iris patterns. For every iris there are two codes: a code encoding iris texture, and a masking code to prevent non-iris artifacts from influencing iris comparisons. The XOR operator detects disagreement between any corresponding pair of bits, while the AND operator ensures that the compared bits are both deemed to have been uncorrupted by eyelashes, eyelids, specular reflections, or other noise. The norms of the resultant code bits and of the AND'ed masks are then measured in order to compute a fractional Hamming distance as the measure of the dissimilarity between any two irises, whose two phase code bit vectors are denoted codeA, codeB and whose mask bit vectors are denoted maskA, maskB:

HD = ‖(codeA ⊕ codeB) ∩ maskA ∩ maskB‖ / ‖maskA ∩ maskB‖ (3.1)

Where ⊕ denotes the bitwise XOR operator, ∩ the bitwise AND operator, and ‖·‖ the Hamming weight, i.e. the number of nonzero elements. The denominator tallies the total number of phase bits that mattered in the iris comparison after artifacts were discounted, so the resulting HD is a fractional measure of dissimilarity in which 0 would represent a perfect match.
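Eq. 3.1 translates directly into code. A minimal sketch with bit arrays, where a mask bit of 1 means "usable":

```python
import numpy as np

def hamming_distance(codeA, codeB, maskA, maskB):
    """Fractional HD of eq. 3.1: XOR of the codes restricted to bits both
    masks deem usable, normalised by the number of usable bits."""
    valid = maskA & maskB                 # bits uncorrupted in both codes
    disagree = (codeA ^ codeB) & valid    # disagreements among usable bits
    return np.count_nonzero(disagree) / np.count_nonzero(valid)
```

For example, two 4-bit codes disagreeing in 2 of 3 usable bits give HD = 2/3.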

As seen in section 3.2, each bit of any iris code has equal a priori odds of being a 1 or a 0; therefore, the expected proportion of agreeing bits between the codes of two different irises gives HD = 0.5 (each of the four states 00, 01, 10, 11 has probability 0.25, so the bits agree in half of the cases and disagree in the other half).


If each of the bits in a given iris code were fully independent of every other bit, then the expected distribution of observed Hamming distances between two such independent iris codes would be a binomial distribution with p = 0.5, where the number of degrees of freedom N is the number of bits of the code (equivalent to tossing a fair coin N times). The histogram in fig. 3.1 shows the distribution of HDs obtained from 283122 comparisons between different pairings of iris images.

Figure 3.1: HD distribution of uncorrelated irises

Figure 3.2: Observed vs. binomial cumulatives

The theoretical binomial distribution plotted in fig. 3.1 corresponds to a normalized binomial of the form:

p(HD = k/N) = (N choose k) p^k (1 − p)^(N−k) (3.2)

After carrying out all the possible comparisons between different pairs of irises in the database, the observed distribution when comparing different iris codes closely fits a binomial distribution with an observed mean of HD = 0.5002 and a standard deviation σ = 0.0255, having N = p(1 − p)/σ² = 385 degrees of freedom. In this example the raw code length was 32 × 360 = 11520 bits. To validate such a statistical model we must also study the behavior of the tails, by examining quantile-quantile plots of the observed cumulatives versus the theoretically predicted cumulatives. Such a quantile-quantile plot is given in fig. 3.2. The straight-line relationship reveals very precise agreement between model and data.
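The degrees-of-freedom calculation above is a one-liner; using the quoted values p = 0.5 and σ = 0.0255:

```python
# Degrees of freedom implied by the observed impostor distribution: a
# binomial proportion with p = 0.5 and standard deviation sigma behaves
# like N = p(1 - p) / sigma^2 independent coin tosses.
p, sigma = 0.5, 0.0255
N = p * (1 - p) / sigma**2
print(round(N))  # close to the 385 degrees of freedom quoted above
```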

The reason for the reduction in the number of degrees of freedom from the expected total number of bits in the code is that there are substantial radial correlations within an iris. For example, a given furrow or ciliary process tends to propagate across a significant radial distance in the iris, exerting its influence on several remote parts of the code and thus reducing their independence. Similarly, a feature such as a furrow influences different parts of the code associated with several different scales of analysis in the Gabor filter.


The encoding algorithm can detect and encode an iris regardless of its position, size and orientation in the image, but the resultant code is not translation-invariant along its angular θ component. For this reason, further stages of the algorithm have to correct the relative orientation of two iris codes when they are matched.

3.1 Achieving orientation invariance

The most efficient way to achieve iris recognition with orientation invariance along the angular component θ is not to rotate the image itself using the Euler matrix, but rather to compute the iris phase code in a single canonical orientation and then to compare this representation at many discrete orientations by cyclic scrolling of its angular variable.
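The cyclic scrolling can be sketched as follows (an illustrative helper; the masks of eq. 3.1 are omitted for brevity):

```python
import numpy as np

def best_match_hd(codeA, codeB, max_shift):
    """Compare codeB against codeA at every cyclic shift of the angular axis
    in [-max_shift, max_shift] columns and keep the smallest Hamming distance.
    Codes are 2D bit arrays of shape (radial, angular)."""
    best = 1.0
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(codeB, s, axis=1)     # cyclic scroll in theta
        hd = np.count_nonzero(codeA ^ shifted) / codeA.size
        best = min(best, hd)
    return best
```

If codeB is a rotated copy of codeA and the rotation lies within the search range, the best HD drops to zero; outside the range it stays high.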

The statistical consequences of seeking the best match after numerous relative rotations of two iris codes are as follows. Let f0(x) be the raw probability density function obtained for the Hamming distances between uncorrelated iris comparisons after comparing them in a single relative orientation, as in eq. 3.2. Then F0(x), the cumulative of f0(x) from 0 to x, becomes the probability of getting a false match in such a test when using HD acceptance criterion x:

F0(x) = ∫_0^x f0(x′) dx′ (3.3)

Which can also be expressed as:

f0(x) = d/dx F0(x) (3.4)

Therefore, the probability of not making a false match when using criterion x is 1 − F0(x) after a single test. Assuming that rotated codes behave as independent codes, the probability of not making a false match after n independent orientation tests is (1 − F0(x))^n. It follows that the probability of a false match after a “best of n” test of agreement, when using HD criterion x, regardless of the actual form of the raw unrotated distribution f0(x), is:

Fn(x) = 1 − (1 − F0(x))^n (3.5)

Finally, the expected density fn(x) associated with this cumulative is:

fn(x) = ∂/∂x Fn(x) = n f0(x) (1 − F0(x))^(n−1) (3.6)


Performing the orientation correction for uncorrelated irises produces a new skewed distribution with a reduced mean, as shown in fig. 3.4. In practice, the resultant distribution after seeking the best match between [−5, 5]° orientations, accounting for a total of n = 11 different trials, is:

Figure 3.3: Original HD distribution
Figure 3.4: Rotated HD distribution

Since only the smallest value in each group of n = 11 samples was retained, the new distribution is skewed and biased to a lower mean value of HD = 0.4702, as we would expect from the theory of extreme value sampling.

3.2 Performance of the code

A primary question is whether there is independent variation in iris detail, both within a given iris and across the human population. Any systematic correlations in iris detail across the population, or within the iris itself, would reduce its entropy, which means that some bits in the code would become irrelevant. From the principle of entropy we know that a code of any length has maximum information capacity if all of its possible states are equiprobable. However, this does not mean that all of these bits are of interest, since information bits coexist with noisy bits. Further development of this project should address this problem by discerning which bits carry information and how to retain them while suppressing most of the noisy bits (compacting the code).


Chapter 4

Experimental results

The image database of study in this project was CASIA Iris Version 1, collected by the Chinese Academy of Sciences [13], which includes 756 iris images from 108 eyes, all of them left eyes. For each eye, 7 samples are captured with a resolution of 320 pixels in width and 280 in height, using eight 850 nm NIR illuminators circularly arranged around the sensor to make sure that the iris is uniformly and adequately illuminated.

When comparing different iris codes, the total number of comparisons can be expressed as the N = 756 images arranged in groups of k = 2. Thereby, the total number of comparisons is:

Ncomparisons = (N choose k) = (756 choose 2) = 285390. (4.1)

The total number of inter-class (same eye) comparisons is the number of eyes times the number of samples for each eye (Ns) arranged in groups of 2:

Ninter−class = Neyes × (Ns choose k) = 108 × (7 choose 2) = 2268. (4.2)

The total number of intra-class comparisons can therefore be expressed as:

Nintra−class = Ncomparisons − Ninter−class = 283122. (4.3)
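The counts of eqs. 4.1-4.3 can be reproduced directly with binomial coefficients:

```python
from math import comb

# Comparison counts for the database (eqs. 4.1-4.3):
# 756 images in total, from 108 eyes with 7 samples each.
total = comb(756, 2)              # eq. 4.1: all pairs of images
same_eye = 108 * comb(7, 2)       # eq. 4.2: pairs drawn from the same eye
different_eye = total - same_eye  # eq. 4.3
print(total, same_eye, different_eye)  # 285390 2268 283122
```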

The reason this database was chosen is that it comprises a reasonable number of samples with close eye views, thereby providing high radial resolution of the eyes. This resolution meant that the number of iterations needed for the iris segmentation was reduced compared with higher-resolution systems, for which a coarse-to-fine (downscaling to upscaling) process should be considered to improve the speed of the segmentation.


4.1 Database characterization

Before addressing the performance of the system, it is important to characterize the performance of the iris segmentation to find out whether the obtained results stay consistent with the anatomic description of the human eye. The first topic to address is the non-concentricity of the pupil and limbus centers. After experimental analysis, the integro-differential operator defined in section 2.2 was optimized by setting the search region to [−8, 8] for both the x and y axes. The following figures depict the obtained distribution of shifts for the mentioned parameters:

Figure 4.1: Distribution of shifts for the x axis

Figure 4.2: Distribution of shifts for the y axis

From the distribution observed in the x axis we can conclude that the samples of study were composed of left eyes: by anatomy, the pupil center is shifted towards the nasal region, and therefore we are seeing a shift towards the left. Examining the distribution of centers in the y axis, we can say that there is no predisposition of the limbus to lie below or above the pupil center. The next step is to analyze the statistical distance between the pupil and iris centers; fig. 4.3 shows the deviation of the limbus center relative to the pupil radius.

Figure 4.3: Distance from limbic center relative to pupil radius

We can see that in most cases the limbus and the pupil are not concentric, and that most of the deviation is comprised within 20% of the pupil radius, staying therefore consistent with the anatomical definitions. After experimental analysis, the integro-differential operator was set to search for a range of radii comprised in the interval rlimbus ∈ [80, 120], and the circular Hough transform was set to search in a range comprised between rpupil ∈ [20, 70]. The observed distribution of the pupil radius presented a general concentration in the range [35, 65]. The observed distribution of radii for the limbus is mainly concentrated between [100, 115]; since the limbus has a fixed size and its average is consistent among the population, its variation can be understood as the variance in the sampling conditions, conditioned by the zoom and the distance to the sensor.

Figure 4.4: Distribution of pupil radius
Figure 4.5: Distribution of limbus radius

In order to quantify pupil dilation, the ratio between the pupil and limbus radii is used; its distribution is shown in fig. 4.6. Since large differences in the dilation ratio can considerably harm the performance of the matching procedure, it is important to pay special attention to this parameter.

Figure 4.6: Distribution of dilation ratios (ρ)

Anatomically, a healthy pupil could in principle vary between 0.15 (highly constricted pupil) and 0.75 (highly dilated pupil); the range of values obtained for the database used mainly spans from about 0.3 to 0.55, which can be considered quite stable. Based on the distribution of ρ, images can be divided into three categories: constricted images (ρ < 0.35) depicted in red, images with a normal dilation ratio (0.35 ≤ ρ ≤ 0.475) depicted in blue, and dilated images (ρ > 0.475) depicted in yellow. According to this categorization, the database is mainly composed of pupils with a normal dilation ratio. This is due to the image acquisition process.
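The categorization above, with the stated thresholds, as a small illustrative helper:

```python
def dilation_category(rho):
    """Categorise a pupil-to-limbus dilation ratio with the thresholds above:
    constricted below 0.35, dilated above 0.475, normal in between."""
    if rho < 0.35:
        return "constricted"
    if rho <= 0.475:
        return "normal"
    return "dilated"
```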

Finally, it is important to characterize the distribution of shifts so as to apply a proper and not excessive correction. For this purpose, a statistical test was carried out to evaluate the distribution of shifts along the database, both for uncorrelated irises and for the same iris; these distributions can be seen in figs. 4.7 and 4.8 respectively. This allowed us to understand how the shifts were distributed and to narrow the shift correction, avoiding excessive rotations that would harm the recognition. As the shift distribution in fig. 4.8 shows, most of the irises were corrected after applying a shift correction along the [−5, 5] range, accounting for a total of n = 11 corrections.

Figure 4.7: Code shift distribution of uncorrelated irises

Figure 4.8: Code shift distribution of the same iris

4.2 Code sensitivity

When the shift correction takes place, it is interesting to understand how much it improves the performance of the recognition. For this reason we will take a look at the decision environment before and after applying the shift correction:

Figure 4.9: Decision environment at θ = 0 without angular correction

Figure 4.10: Decision environment at θ = 0 after angular correction


This decision environment has been obtained with a Gabor filter at orientation θ = 0, which corresponds to the angular direction. We can see that some HD scores for the inter-class (same iris) comparisons are even worse than the intra-class (different iris) ones. This is unexpected and counterintuitive, but before addressing it we will first look at the decision environment for a radial orientation of the filter at θ = 90:

Figure 4.11: Decision environment at θ = 90 without angular correction

Figure 4.12: Decision environment at θ = 90 after angular correction

In this case we can see that intra-class comparisons always score below the inter-class HD scores, which is to be expected, but applying the shift correction brings little improvement, if any. The reason lies in the correlations introduced by the Gabor filter together with the correlation within the same iris. At matching time, the oscillations of the Gabor filter make shifted codes add in-phase and counter-phase, producing scores which can be worse than when comparing against a random iris. It is important to note that this effect only appears when the filter is oriented at θ = 0 (along the angular component), since this is the direction in which the shift correction takes place. Instead, when working at θ = 90 we capture detail along the radial direction, so applying the shift correction along the angular component does not worsen the HD score. The following figures help to understand this effect:

[Plot: code shift sensitivity, HD vs. code shift in degrees from −180° to 180°]

Figure 4.13: Code sensitivity at θ = 0

[Plot: code shift sensitivity, HD vs. code shift in degrees from −180° to 180°]

Figure 4.14: Code sensitivity at θ = 90

The presented figures show the average HD score of all codes when comparing a code with itself rotated a certain number of samples. We can see in fig. 4.13 that


we can shift a code in such a way that its HD score is worse than average, whilst in fig. 4.14 there is no such effect and the code shows a higher tolerance when compared with a shifted version of itself. This demonstrates why, when applying the shift correction at θ = 90, we could not notice much difference.
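The per-code curve behind figs. 4.13 and 4.14 can be sketched as follows. This is a toy illustration using a random binary code, which yields a flat curve near HD ≈ 0.5 away from zero shift; a Gabor-filtered code at θ = 0 would instead oscillate in-phase and counter-phase as described above:

```python
import numpy as np

def shift_sensitivity(code: np.ndarray, max_shift: int) -> list:
    """HD between a binary code and circularly shifted copies of itself,
    one value per shift in [-max_shift, max_shift]; HD is 0 at zero shift."""
    return [float(np.mean(code != np.roll(code, s, axis=1)))
            for s in range(-max_shift, max_shift + 1)]

# Toy usage with a random 16 x 240 binary code.
rng = np.random.default_rng(1)
code = rng.integers(0, 2, size=(16, 240))
curve = shift_sensitivity(code, 8)
```

Averaging such curves over all codes in the database gives the sensitivity plots shown above.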


4.3 Statistical results

To compute the iris code there are many possible Gabor filters that can be used. In this project the orientation chosen was θ = 0, since it reveals all the radial texture on the iris and thus provides more discriminant information; the phase offset was set to φ = 0; the aspect ratio was set to α/β = 1. Finally, to optimize the wavelength, the “decidability” criterion was chosen, since it gives a measure of how well separated the two distributions are. The decidability d′ is defined as:

d′ = |µ1 − µ2| / √((σ1² + σ2²)/2)    (4.4)

This measure of decidability is independent of how liberal or conservative the acceptance threshold is. Rather, by measuring separation, it reflects the degree to which any improvement in (say) the false match rate must be paid for by a worsening of the failure-to-match rate. The performance of any biometric technology can be calibrated by this score, among other metrics. In fig. 4.15 we can see the decidability obtained for every wavelength when a shift correction in the [−5, 5]° range was applied.
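Eq. 4.4 can be computed directly from the two HD score distributions. The following is a minimal sketch using illustrative scores, not the thesis data:

```python
import numpy as np

def decidability(intra: np.ndarray, inter: np.ndarray) -> float:
    """d' = |mu1 - mu2| / sqrt((sigma1^2 + sigma2^2) / 2), eq. 4.4."""
    return float(abs(intra.mean() - inter.mean())
                 / np.sqrt((intra.var() + inter.var()) / 2.0))

# Illustrative HD score samples (hypothetical values):
intra = np.array([0.20, 0.25, 0.22, 0.28])   # same-iris comparisons
inter = np.array([0.45, 0.48, 0.50, 0.47])   # different-iris comparisons
d = decidability(intra, inter)
```

A larger d′ means the two distributions overlap less, so any acceptance threshold placed between them makes fewer errors of either kind.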

[Plot: decidability (d′) vs. wavelength (λ) from 2 to 18]

Figure 4.15: Decidability

As we can see in fig. 4.15, the best decidability is obtained at λ = 15. While it is important to characterize the decidability of the system, it is not the only aspect to take into consideration: there may exist other wavelengths for which the decidability criterion is not optimum but which provide better decision environments according to some other criterion. For this reason we considered the study of the “true positive rate” (TPR) as a measure of how well the system can identify an individual without making any false decisions; such a plot can be seen in fig. 4.16. According to this graphic, the best decision environment when considering the highest possible TPR is achieved at λ = 10 with TPR = 0.87, whilst at λ = 15 the TPR = 0.79.
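One way to operationalize "identifying an individual without making any false decisions" is to place the threshold just below the smallest different-iris HD, so no false match occurs, and count how many same-iris scores are still accepted. A sketch under that assumption, with illustrative scores:

```python
import numpy as np

def tpr_at_zero_fpr(intra_hd: np.ndarray, inter_hd: np.ndarray) -> float:
    """TPR when the acceptance threshold is set just below the smallest
    different-iris HD, so that no false match is possible."""
    threshold = inter_hd.min()
    return float(np.mean(intra_hd < threshold))

# Illustrative HD scores (hypothetical values):
intra_hd = np.array([0.18, 0.22, 0.30, 0.41])  # same iris
inter_hd = np.array([0.40, 0.45, 0.47, 0.52])  # different irises
tpr = tpr_at_zero_fpr(intra_hd, inter_hd)
```

A narrower intra-class distribution pushes more genuine scores below the threshold, which is why λ = 10 outperforms λ = 15 on this metric despite its lower decidability.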


[Plot: true positive rate vs. wavelength (λ) from 2 to 18]

Figure 4.16: True positives (TP)

The reason for this is that the intra-class variability is narrower at λ = 10, while still providing a good class decidability, resulting in a better TPR score, whilst at λ = 15 the intra-class variability is higher, thereby harming the TPR. The decision scenarios at λ = 10 and λ = 15 can be seen in figs. 4.17 and 4.18 respectively.

Figure 4.17: Decision environment at λ = 10

Figure 4.18: Decision environment at λ = 15


Chapter 5

Conclusions and future work

After the realization of the project we have the theoretical basis and a first approach to an iris biometric system adapted to the NIR spectrum. The system extracts an iris code from the segmentation of the iris, which can then be matched against other stored codes by means of the bitwise XOR operator; this makes matching extremely fast, since the XOR can be executed in a single processor instruction. The system relies on a proper segmentation of the iris, which can in some cases become quite tricky. It is very important to capture as much detail of the iris as possible; in this project a single optimized Gabor filter was chosen, but further development of the texture analysis should introduce modifications in the Gabor filter according to the region of study, to address the heterogeneous structure of the iris. Further development of the system may also include segmentation of the eyelashes, eyelids and shadows (EES), minimizing as much as possible the suppression of texture information. A proper detailed study of the intrinsic correlations in the iris should also be done, compacting the generated code.
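The XOR-based matching mentioned above can be sketched with packed codes. This is a toy illustration with 16-bit codes; a real iris code spans many machine words, each compared with one XOR plus a population count:

```python
def hamming_distance(code_a: int, code_b: int, n_bits: int) -> float:
    """Fraction of disagreeing bits between two packed iris codes:
    XOR flags the differing bits, the popcount tallies them."""
    return bin(code_a ^ code_b).count("1") / n_bits

# Toy 16-bit codes (illustrative bit patterns, not real iris codes):
a = 0b1011_0010_1110_0001
b = 0b1011_1010_0110_0101
hd = hamming_distance(a, b, 16)
```

Because the bits are packed into words, a full comparison of two multi-kilobit codes reduces to a handful of XOR and popcount instructions, which is what makes exhaustive database searches practical.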

Future adaptations of the system may also be capable of operating in the visible spectrum. Working in the VW spectrum has the additional difficulty that reflections must be addressed when segmenting the iris. Furthermore, it would be interesting to combine the system with facial recognition algorithms to extract iris codes.


Bibliography

[1] J. Daugman, “How iris recognition works,” IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, no. 1, pp. 21–30, 2004.

[2] J. R. Charlier, M. Behague, and C. Buquet, “Shift of pupil center with pupil constriction.”

[3] B. Green, “Canny edge detection algorithm,” pp. 1–7, 2002. [Online]. Available: http://dasl.mem.drexel.edu/alumni/bGreen/www.pages.drexel.edu/~weg22/can_tut.html

[4] J. Canny, “A Computational Approach to Edge Detection,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-8, no. 6, pp. 679–698, 1986.

[5] A. Basit, M. Y. Javed, and M. A. Anjum, “Eyelid detection in localized iris images,” Proceedings - 2nd International Conference on Emerging Technologies 2006, ICET 2006, pp. 157–159, 2006.

[6] T. H. Min and R. H. Park, “Comparison of eyelid and eyelash detection algorithms for performance improvement of iris recognition,” Proceedings - International Conference on Image Processing, ICIP, pp. 257–260, 2008.

[7] T. Wang, M. Han, and H. Wan, “Improved and robust eyelash and eyelid location method,” 2012 International Conference on Wireless Communications and Signal Processing, WCSP 2012, 2012.

[8] Z. He, Z. Sun, T. Tan, and X. Qiu, “Enhanced Usability of Iris Recognition via Efficient User Interface and Iris Image Restoration,” pp. 261–264, 2008.

[9] L. Yang, Y. X. Dong, Z. T. Wu, and C. Engineering, “[ J [ J,” vol. 1, no.Iccda, pp. 533–536, 2010.

[10] I. Tomeo-Reyes, A. Ross, A. D. Clark, and V. Chandran, “A biomechanicalapproach to iris normalization,” Proceedings of 2015 International Conferenceon Biometrics, ICB 2015, pp. 9–16, 2015.

[11] Z. Lin and B. Lu, “Iris recognition method based on the optimized Gaborfilters,” Image and Signal Processing (CISP), 2010 3rd International Congresson, vol. 4, no. 1, pp. 1868–1872, 2010.


[12] J. G. Daugman, “Complete Discrete 2-D Gabor Transforms by Neural Networks for Image Analysis and Compression,” IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 36, no. 7, pp. 1169–1179, 1988.

[13] Note on CASIA-Iris V1, “Chinese Academy of Sciences Institute of Automation(CASIA).” [Online]. Available: http://biometrics.idealtest.org/

[14] P. Podder, T. Z. Khan, M. H. Khan, and M. M. Rahman, “An Efficient Iris Segmentation Model Based on Eyelids and Eyelashes Detection in Iris Recognition System,” 2015.
