
Slide 1: How To Conduct All-Inclusive Performance Evaluation of Your Biometric System

Dr. Dmitry O. Gorodnichy
Video Surveillance & Biometrics Section
Science and Engineering Directorate

Slide 2: Outline

1. What’s Biometrics? – to the user

1. What is it used for?

2. Biometric systems evolution and taxonomy

CBSA case studies: Iris, faces (ePassport, in video)

2. What’s Biometrics? – to the developer

1. Why may Biometrics fail? What are the factors?

2. What is biometrics Performance Evaluation?

3. Conducting comprehensive Performance Evaluation

1. General workflow

2. Basic metrics: counting False Matches vs. Non-Matches

3. Multi-order analysis: Taking into account all scores

4. Examples – what to look for

4. Lessons learnt

Slide 3: Intro to CBSA

Slide 4: Canada Border Services Agency

CBSA was created in December 2003 as one of six agencies in the Public Safety (PS) portfolio; the RCMP (Police) is another PS agency.

Integrates ‘border’ functions previously spread among 3 organizations:

- Customs branch from the former Canada Customs and Revenue Agency (CCRA)

- Intelligence, Interdiction and Enforcement program of Citizenship and Immigration Canada (CIC)

- Import inspection at Ports of Entry (POE) program from the Canadian Food Inspection Agency (CFIA)

Mandate: to provide integrated border services that support national security and public safety priorities and facilitate the free flow of persons and goods

Slide 5: GoC (* interest in Biometrics)

Transport Canada

Department of Foreign Affairs and International Trade

CRA (Revenue Agency) *

Office of the Privacy Commissioner of Canada*

CSIS

Citizenship and Immigration***

Passport Canada***

RCMP**

DND / Forces***

DRDC***

CBSA-SED***

National Research Council (NRC) – capacity for R&D

Slide 6: Science & Engineering Directorate

Part of Technology Branch

• We are 100 people strong and multi-disciplinary (chemistry, physics, mathematics, forensics, etc.)

• We work from one central facility (i.e. no regional offices)

• Main thrust areas: Advanced Analysis and Analytics

• Traditionally, chemistry-based, now in the forensic and physics (optical) space, moving towards the data/information space

Advanced Sensors (and Integrated Sensor Networks)

• Building off RADNet concepts, where “sensors” can be for Chemical, Biological, Radiological, Nuclear or Explosive (CBRNE) threats, electronics, and/or people, and remoting these sensors to create a “virtual presence”

Slide 7: 5-year Strategy released [2008]

To become the Agency’s Science and Engineering Authority; evolve a Chief Scientific Officer role

Continue operational excellence: regulatory and projects for solving today’s problems

Dedicate resources to proactively shape tomorrow and increase level of readiness

“Science & Engineering Directorate” / Laboratory & Scientific Services Directorate: make decisions or take actions based on sound evidence

Slide 8: … as a result, the Video Surveillance and Biometrics (VSB) Section was created (Jan 2009) –

to support the Agency’s portfolios in Video Surveillance and Biometrics

with expertise in Image Analysis & Pattern Recognition (overlap of Comp.Sci. AND Math)

Slide 9: Video Analytics and Biometrics

Best synergy and impact is achieved by treating Video Surveillance together with Biometrics … because most Biometrics are image-based. The same unique set of skills applies: Image Analysis (CompSci) AND Pattern Recognition (Math). Face Recognition is where Biometrics meets Surveillance.

[Figure: approximate image resolutions required for each task (5, 20, 40–120, 400 pixels): person detection, face detection/extraction, face recognition, iris recognition; vehicle detection also shown, ending in a MATCH. D. Gorodnichy © 2009]

Slide 10: Three foci of our R&D work

Our objective: to find what is possible and best:

• in Video Analytics, Biometrics, Face Recognition

• for LAND and AIRPORT Points of Entry (POE)

to be in a position to recommend solutions to CBSA & OGD.

Focus 1: Evaluation of Market Solutions

Focus 2: In-house R&D

Focus 3: Live Tests/Pilots in the Field

Specifically: in Iris and Face (from Video) recognition

Slide 11: NEXUS Iris Recognition

Cross often? Make it simple, use NEXUS.

NEXUS is designed to expedite the border clearance process for low-risk, pre-approved travellers into Canada and the United States.


Slide 12: NEXUS Iris Recognition (cont’d)

Fully operational at:

Vancouver International Airport

Halifax International Airport

Toronto- L.B. Pearson International Airport

Montréal – Trudeau International Airport

Calgary International Airport

Winnipeg International Airport

Edmonton International Airport

Ottawa-MacDonald Cartier International Airport

More about NEXUS: www.nexus.gc.ca

Slide 13: 1. Biometric needs

Slide 14: What is Biometrics?

Biometrics is an automated technique of measuring a physical characteristic (biometric data) of a person for the purpose of recognizing* him/her.

NB: Most biometric modalities are image-based!

[CBSA NEXUS Iris Recognition system]

Slide 15: Biometric system taxonomy

By recognition task

By operational considerations

By environmental conditions

By modality

By modality characteristics

By recognition performance requirements

By decision making

Slide 16: Four “recognition” tasks

1. Verification / Authentication: 1 to 1 (e.g. Access Card)

Instant decision making, facilitated by pre-stored biometric data

2. “White List” identification: 1 to N

a) N is small/limited (eg. laptop users, secret facility personnel)

b) N is large and growing (bank clients, trusted travelers)

Instant decision making, facilitated by cooperation from users

3. “Black List” identification / Screening: 1 to M, M not large

Time-weighted decision made by trained analysts

with intelligence coming from different sources

4. Classification / Categorization: 1 to K, K – small

What is their type, e.g. gender, age, race? (soft biometrics)

Whom of K people do they resemble most? (e.g. tracking in video)

Slide 17: Operational Considerations

Recognition task: verification vs. identification (harder)

Overt vs covert image capture

Cooperative vs non-cooperative participant

Structured (constrained) vs non-structured environment

Small database vs. large database

Lighting conditions at enrollment and passage

Relative impact (Cost) of False Match vs False Non-Match

Other:

Training of staff etc.

Slide 18: Decision making

Final decision vs. list of options/candidates for further investigation

Based only on biometric data provided vs. with additional data

Instantly made – in live mode – vs. time-weighted decision – by go

Fully automated (no human involved) vs. made by a Forensic Analyst

Example: Face Recognition used for helping police/immigration to capture criminals vs. used for Access Control with ePassports and Smartgates

Slide 19: CBSA case study: Why iris?

Must be easily accepted by the public: not intrusive

Must not have a bad association

To take a fingerprint image – be treated like a criminal

To take a face (eyes) photo – be treated like a traveler

… so that people would voluntarily participate (and cooperate)

Yet, be well-performing…

Slide 20: NEXUS as an example of a “White List” application

Cost of False Match = security breach

False Match – not allowed, smallest possible FMR required

Result of spoofing attack (intentional defeat of a system)

Cost of False Non-Match = inconvenience / frustration

The smaller the number of FM, the larger the number of FNM…

But members cooperate well to help the system

Slide 21: Biometric modality characteristics

Universality: each person should have the characteristic (g)

Uniqueness: how well it separates individuals from one another (g)

Permanence: how well a biometric resists aging/fatigue etc. (g)

Circumvention: ease of use of a substitute (? m)

Performance: accuracy, robustness of technology used (? m–g)

Collectability: ease of acquisition for measurement (? m–g)

Acceptability: degree of approval of a technology (? m–g)

Slide 22: 2. What’s inside that box?

biometric system

“Hello John Smith !”

Slide 23: Why may Biometrics fail?

[Figure: processing pipeline with numbered stages, grouped into IP (Image Processing) and PR (Pattern Recognition), ending in Score(s)]

1. Image(s) captured

2. Best image(s) selected and enhanced - preprocessing

3. Biometric region extracted - segmentation

4. Image features extracted (minutia, color, edges…)

5. Attributes computed (a set of N numbers, 0 < X_i < MAX_i)

6. Best match(es) obtained: 0 < S_j < 1

7. Decision is made (based on a Threshold)
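As an illustration of steps 5–7, here is a minimal Python sketch (the attribute ranges, distance measure, and threshold are assumptions for illustration; a real matcher defines its own):

    import numpy as np

    MAX = np.array([255.0, 255.0, 255.0, 100.0])  # assumed per-attribute maxima (MAX_i)
    THRESHOLD = 0.33                              # assumed; lower score = better match

    def attributes(features):
        # Step 5: reduce extracted features to a set of N numbers, 0 < X_i < MAX_i
        return np.clip(np.asarray(features, dtype=float), 1e-6, MAX - 1e-6)

    def score(probe, template):
        # Step 6: normalized distance in [0, 1]; 0 means identical templates
        return float(np.linalg.norm((probe - template) / MAX) / np.sqrt(len(MAX)))

    def decide(probe, gallery):
        # Step 7: threshold the best (smallest) score over the enrolled gallery
        scores = {name: score(probe, t) for name, t in gallery.items()}
        best = min(scores, key=scores.get)
        return best if scores[best] < THRESHOLD else None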

Slide 24: Scores for Genuine and Imposter users

NB: Traditionally, a Match is declared when the score is below a threshold (i.e., it need not be the smallest score!)

Slide 25: Iris Recognition example

[Figure: iris-recognition pipeline with callouts 2–7 marking the stages listed on Slide 23]

Is image quality good?

Is iris extraction good?

Are the feature details/numbers sufficient?

Are the features used informative?

5 best scores: 0.51, 0.38 *, 0.39 *, 0.41 *, 0.67

Where to put the Threshold?

What about confidence level?

Slide 26: Science behind Biometrics

Biometric recognition success is attributed to success in:

IP: Image Processing theory to deal with variability of the quality of the captured iris.

PR: Pattern Recognition (statistical machine learning algorithms) to determine the similarity between templates.

Academic Conferences: ~10 big ones, ~ 10000 papers / year

Canadian IAPR Conference on Robot and Computer Vision (CRV)

IEEE CVPR, ECCV, ICCV, ICIP, BMVC …

Slide 27: Evolution of Biometrics

[Figure: timeline 1970–2020 plotting the level of automation and intelligent processing (none → little → more → full). Biometrics evolves from controlled environments and manual selection (fingerprint, iris) towards more collectable, unconstrained environments: faces from video, soft biometrics (audio/visual), multi-modal fusion. Surveillance evolves from remote monitoring only, through automated detection of events, towards collecting intelligence/evidence and automated identification of people from video. D. Gorodnichy © 2009]

Slide 28: New biometric technologies

This evolution has brought the arrival of such biometric technologies as

Biometric Surveillance,

Soft Biometrics,

Stand-off Biometrics, also identified as Biometrics at a Distance, Remote Biometrics, Biometrics on the Move or Biometrics on the Go

and increased demand for Face Recognition from Video.

Slide 29: Stand-off vs. Stand-in biometrics

Stand-in biometrics: a person intentionally comes in contact with a biometric sensor

Stand-off biometrics: without person’s direct engagement with the sensor (In many cases, s/he may not even know where a capture device is located or whether his/her biometric trait is being captured.)

As a result, a single biometric measurement is normally much less identifying. This means two things:

1. there could be more than one match below the threshold, or two or more very close matching scores;

2. the final recognition decision is not based on a single measurement, but on a number of biometric measurements taken from the same or different sensors, combined together using some data fusion technique.
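A toy sketch of point 2 (mean-score fusion is just one simple rule; real systems may use min, max, or learned fusion – the names here are illustrative):

    def fused_decision(score_lists, threshold):
        # score_lists: {identity: [scores from several frames/sensors]}
        fused = {name: sum(s) / len(s) for name, s in score_lists.items()}
        best = min(fused, key=fused.get)          # smallest fused score wins
        return best if fused[best] < threshold else None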

Slide 30: To conclude…

Why conduct evaluation?

Because …

A biometric system is not a “magic box”, but a statistically-derived tool, and it is not error-free (and never will be!)

And because you want …

To select the best system for your needs

Or, if you already have one, to make it perform better!

Slide 31: 3. How to do it and what to watch for

Slide 32: What is performance evaluation?

Definition: Performance evaluation is a procedure that involves the testing of a system on a database and/or in a specific setup for the purpose of obtaining measurable statistics (metrics) that can be used to compare systems or system setups to one another.

Basic Measurements:

Scores

Decisions (correct vs. incorrect):

False Match (FM), True Match

False Non-match (FNM), True Non-Match
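A minimal tally of these four decision outcomes, assuming each trial is a (score, is_genuine) pair and a match is declared when the score is below a threshold t (the convention used in these slides):

    def tally(trials, t):
        # trials: iterable of (score, is_genuine) pairs
        counts = {"FM": 0, "TM": 0, "FNM": 0, "TNM": 0}
        for score, is_genuine in trials:
            match = score < t
            if match:
                counts["TM" if is_genuine else "FM"] += 1
            else:
                counts["FNM" if is_genuine else "TNM"] += 1
        return counts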

Slide 33: General evaluation process

1. Determine suitability of modality (-ies)

2. Determine costs/impact of FM and FNM

3. Determine all factors affecting performance

4. Measure performance

1. wrt all factors

1. On large-scale database (>1000)

2. On Pilot project (in real environment)

5. Evaluate the capability to be integrated / customized

1. Wrt input parameters (pre-processing)

2. Wrt output parameters (post-processing)

Slide 34: Factors that affect the performance

THREE sources of problems:

1. Capture device

2. User

3. Light condition (for image-based biometrics)

Slide 35: Example: seven Face Recognition factors

1. face image resolution*

2. facial image quality*
   1. blurred due to motion
   2. lack of focus
   3. low contrast due to insufficient camera exposure or aperture

3. head orientation*

4. facial expression/variation

5. lighting conditions*
   - location of the light source with respect to the camera, the users, and other objects

6. occlusion*

7. aging and facial surgery*

* – concerns ALL image-based biometrics

Slide 36: Example: Face Recognition databases*

Database (year created)   | #individuals / #images | IOD / image width | Orientations | Expressions     | Lighting / quality | Occlusion situations
AT&T Olivetti (1992–1994) | 40 / 400               | ~60 / 92          | yes          | yes             | yes                | yes
FERET (1993–1996)         | 1,199 / 14,126         | ~80 / 256         | 9–20         | 2               | 2                  | 2
PIE (2000)                | 68 / 41,368            | ~75 / 640         | 13           | 3               | 43                 |
CAS-PEAL (2003)           | 1,040                  | ~45 / 360         | 21           | 15              | 6                  | 1–5
Korean                    | 1,000 / 52,000         | ~80 / 640         | 7            | 5               | 16                 |
Equinox                   | 91                     | ~100 / 240        | 1            | 3               | 3 – IR images      |
CMU Hyperspectral         | 54                     | 80 / 640          | 1            | 4               | – IR images        | 5
U of Texas (2002)         | 284                    | ~80 / 720         | video        | video           |                    |
FRVT HCInt (1999–2002)    | 37,437 / 121,589       | >100              | 1            | 3               |                    |
FRVT MCInt (1999–2002)    | >100 / 63              | >80, <80          | several      | still and video | several            | yes

* From [Gor2008]

Slide 37: [figure only]

Slide 38: Basic performance metrics

False Match Rate (False Accept, False Positive, Type 1 Error)

False Non-Match Rate (False Reject, False Negative, Type 2 Error, Miss rate)

These trade off against one another!

Both are functions of many parameters (the main one being the Threshold).

Consider the Cost (Impact) of these errors!

In a “White list”: FM cost > FNM cost

In a “Black list” – the opposite

Focus on the error that costs the most!

[Score histogram: score bin followed by two per-bin counts. The first count column (peaking near 0.48, with ~10^6 entries) is consistent with imposter-pair scores; the second (peaking near 0.26) with genuine scores.]

0.15        0      0
0.16        0      3
0.17        0      3
0.18        0     19
0.19        0     36
0.20        0     69
0.21        0     93
0.22        0    155
0.23        0    211
0.24        0    230
0.25        0    269
0.26        0    295
0.27        0    263
0.28        0    292
0.29        0    253
0.30        0    236
0.31        1    226
0.32        1    191
0.33        6    146
0.34        7    134
0.35       19    130
0.36       47    108
0.37       85    102
0.38      224     98
0.39      559     64
0.40     1563     79
0.41     4464     64
0.42    14866     51
0.43    48037     46
0.44   138788     31
0.45   338895     39
0.46   658719     31
0.47   961218     35
0.48  1066150     41
0.49   669634     24
0.50    10449      1
0.51       22      0
0.52        0      0
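Assuming the two count columns above are per-bin histograms of imposter and genuine scores, FMR and FNMR as functions of the threshold can be read off by cumulative summation (a sketch, not the toolkit’s actual code):

    def rates(bins, imp_counts, gen_counts):
        # FMR(t): fraction of imposter scores below t (false matches)
        # FNMR(t): fraction of genuine scores at/above t (false non-matches)
        total_imp, total_gen = sum(imp_counts), sum(gen_counts)
        out, cum_imp, cum_gen = [], 0, 0
        for t, ni, ng in zip(bins, imp_counts, gen_counts):
            fmr = cum_imp / total_imp            # counts in bins strictly below t
            fnmr = (total_gen - cum_gen) / total_gen
            out.append((t, fmr, fnmr))
            cum_imp += ni
            cum_gen += ng
        return out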

Slide 39: Error trade-off curves

Detection Error Trade-off (DET) curve: graph of FMR vs FNMR, which is obtained by varying the system parameters such as match threshold.

Receiver Operating Characteristic (ROC) curve: similar to the DET curve, but plots TMR (True Match Rate) against FMR.
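Both curves fall out of the same threshold sweep over raw score lists (a sketch; lower score = match, as above):

    import numpy as np

    def det_roc_points(genuine, imposter, thresholds):
        genuine, imposter = np.asarray(genuine), np.asarray(imposter)
        points = []
        for t in thresholds:
            fmr = float(np.mean(imposter < t))    # imposters wrongly matched
            fnmr = float(np.mean(genuine >= t))   # genuines wrongly rejected
            points.append((t, fmr, fnmr, 1.0 - fnmr))
        return points  # DET: plot FNMR vs FMR; ROC: plot TMR (= 1 - FNMR) vs FMR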

Slide 40: Identification curves

Rank-k identification rate (Rk): the fraction of trials in which the correct identity is among the top k most likely candidates.

Cumulative Match Characteristic (CMC) curve: plots the rank-k identification rate against k.
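A sketch of the CMC computation (the data layout is an assumption: scores[i] is a list of (identity, score) pairs for probe i, truth[i] its true identity, assumed to be enrolled):

    def cmc_curve(scores, truth, max_rank):
        n = len(scores)
        ranks = []
        for cands, true_id in zip(scores, truth):
            # order candidates by score; rank 1 = smallest (best) score
            ordered = [name for name, _ in sorted(cands, key=lambda c: c[1])]
            ranks.append(ordered.index(true_id) + 1)
        return [sum(r <= k for r in ranks) / n for k in range(1, max_rank + 1)]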

Slide 41: Other measures

Equal Error Rate

FNMR at a fixed FMR (e.g. 0.001, if the cost of FM is high)
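Both measures come from the same FMR/FNMR sweep (a sketch; the EER is approximated at the threshold where the two rates are closest):

    import numpy as np

    def eer_and_fnmr_at_fmr(genuine, imposter, fmr_target=0.001):
        ts = np.sort(np.concatenate([genuine, imposter]))
        fmr = np.array([np.mean(imposter < t) for t in ts])
        fnmr = np.array([np.mean(genuine >= t) for t in ts])
        i = int(np.argmin(np.abs(fmr - fnmr)))     # approximate crossing point
        ok = np.nonzero(fmr <= fmr_target)[0]      # thresholds meeting the FMR target
        j = int(ok[-1]) if len(ok) else 0          # largest such threshold
        return (fmr[i] + fnmr[i]) / 2.0, float(fnmr[j])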

Slide 42: Reported error rates

Biometrics    | EER   | FMR     | FNMR  | Subjects | Comment                                   | Reference
Face          | n.a.  | 1%      | 10%   | 37,437   | Varied lighting, indoor/outdoor           | FRVT (2002) [24]
Fingerprint   | n.a.  | 1%      | 0.1%  | 25,000   | US Government operational data            | FpVTE (2003) [25]
Fingerprint   | 2%    | 2%      | 2%    | 100      | Rotation and exaggerated skin distortion  | FVC (2004) [26]
Hand geometry | 1%    | 2%      | 0.1%  | 129      | With rings and improper placement         | (2005) [27]
Iris          | < 1%  | 0.94%   | 0.99% | 1,224    | Indoor environment                        | ITIRT (2005) [28]
Iris          | 0.01% | 0.0001% | 0.2%  | 132      | Best conditions                           | NIST (2005) [29]
Keystrokes    | 1.8%  | 7%      | 0.1%  | 15       | Over a 6-month period                     | (2005) [30]
Voice         | 6%    | 2%      | 10%   | 310      | Text-independent, multilingual            | NIST (2004) [31]

• Obtained for specific data/setups/conditions/constraints
• These rates cannot be relied upon; they should be used as a guide only!

Slide 43: Limitations of basic metrics

What if:

1. there is more than one match below the threshold?

2. there are two or more very close matching scores?

A: 0.61, 0.59, 0.36 ***, 0.49, 0.57

B: 0.51, 0.38 *, 0.39 *, 0.41 *, 0.67

Slide 44: Request for new ISO Biometrics standard

ISO/IEC SC 37 N 3060 (May 2009) – “Canadian contribution on Future Testing needs”:

“There is a need for a comprehensive biometrics performance evaluation standard that would take into account not only the best matching scores, but also the "runner-up" matching scores.

Such standard would be most applicable for evaluation of stand-off identification systems, such as Face or Iris Recognition from Video. The standard however would also be applicable for verification systems, in particular to those that make the decision based on examining the entire database of enrolled identities.”

ISO/IEC SC 37 N 3371 (WG 5 Roadmap, July 2009) -

7. Perceived Requirements for New Standards and Technical Reports: “A contribution from Canadian National Body (37N3145) might form the basis of future work”

Slide 45: Multi-order biometric performance analysis

Order 0:

See ALL score distributions

Order 1 (Traditional):

See single-score statistics (FMR/FNMR) and trade-off curves

Order 2:

Examine all scores and see the best (smallest) scores: “Do they coincide with the genuine score?”

Order 3:

Examine relationships between the scores:
- See the difference between the best and second-best scores
- See ALL scores below the threshold
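For a single probe, Orders 2 and 3 might look like this (a sketch over a {identity: score} dictionary with at least two candidates; lower score = better):

    def multi_order_checks(scores, genuine_id, threshold):
        ordered = sorted(scores.items(), key=lambda kv: kv[1])
        (best_id, s1), (_, s2) = ordered[0], ordered[1]
        return {
            "best_is_genuine": best_id == genuine_id,      # Order 2
            "winner_runnerup_margin": s2 - s1,             # Order 3
            "matches_below_threshold": sum(s < threshold   # Order 3
                                           for s in scores.values()),
        }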

Slide 46: General workflow

Step 0: Data preparation

Analyze and select Enrolled and Passage datasets:

• of several sizes (N): 100, 500, 1000, 5000

• corresponding to different factors/setups

Step 1: Encode ALL images (get binary templates)

Record Failure to Acquire (FTA)

Step 2: Get ALL Scores for ALL image PAIRS

A) For Enrolled – Imposters only

B) For Passage – Imposters and Genuine

Step 3: Analyze ALL obtained scores (many, many…)

Using multi-order analysis
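A skeleton of Steps 1–3 (encode and match are assumed stand-ins for the vendor SDK; FTA is recorded when encoding fails):

    def run_evaluation(enrolled_imgs, passage_imgs, encode, match):
        templates, fta = {}, 0
        for name, img in list(enrolled_imgs.items()) + list(passage_imgs.items()):
            t = encode(img)                        # Step 1: binary template
            if t is None:
                fta += 1                           # record Failure to Acquire (FTA)
            else:
                templates[name] = t
        scores = {(e, p): match(templates[e], templates[p])   # Step 2: ALL pairs
                  for e in enrolled_imgs if e in templates
                  for p in passage_imgs if p in templates}
        return scores, fta                         # Step 3: feed multi-order analysis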

Slide 47: Estimate time required

Step 2b (getting the scores) is the most time-consuming
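A back-of-envelope estimate (the dataset sizes and matcher speed below are assumptions, purely for illustration):

    n_enrolled, n_passage = 5000, 3000        # assumed dataset sizes
    comparisons_per_sec = 100_000             # assumed matcher speed
    pairs = n_enrolled * n_passage            # 15,000,000 scores to compute
    hours = pairs / comparisons_per_sec / 3600
    print(f"{pairs:,} comparisons -> about {hours:.2f} hours")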

Slide 48: Order-0 analysis

Visualizing the scores only:

Just by looking at the score distribution (Order-0 Analysis), one may spot a problem or a deficiency of the system

Here, one system displays intolerance to one (or more) of the factors present in the enrolled images.

Modify your setup or buy another system!

Slide 49: Order-1 analysis: FMR / FNMR curves

By plotting FMR/FNMR as function of threshold for different data-set sizes, one may see how to optimally adjust the threshold.

Slide 50: Order-1 analysis: FMR / FNMR curves (cont’d)

By plotting FMR/FNMR as function of threshold for different data-set sizes, one may see how to optimally adjust the threshold.

Slide 51: Order-1 analysis: DET curves

Measured points must be shown, not only extrapolated lines!

Especially in the area of prime interest

INCORRECT

Slide 52: Order-1 analysis: DET curves (cont’d)

Measured points must be shown, not only extrapolated lines!

Especially in the area of prime interest

CORRECT

Slide 53: Order 2: Do Genuine data have the best scores?

Slide 54: Order 3: Recognition confidence I

Number of scores below a threshold (for 3000 images).

Is this a good system?

Hits=2596, Misses=346

Slide 55: Order 3: Recognition confidence I (cont’d)

Number of scores below a threshold

Is this one better?

Hits=2745, Misses=147

Slide 56: Order 3: Recognition confidence I (cont’d)

Many systems can improve the match/non-match tradeoff at the cost of allowing more than one score below the threshold (by raising the threshold). Would you deploy such a system for Access Control?!

Slide 57: Order 3: Recognition confidence II

Distance between the “runner-up” and “winning” scores

Slide 58: Order 3: Recognition confidence II (cont’d)

Distance between the “runner-up” and “winning” scores – which do you prefer?

Slide 59: Trade-off Curves with FCR

DEFINITION: Failure of Confidence Rate (FCR) – the rate of incidences in which there is more than one match below the threshold
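Computing FCR from per-probe score lists is then straightforward (a sketch of the definition above; the data layout is an assumption):

    def fcr(per_probe_scores, threshold):
        # per_probe_scores: one list of gallery scores per probe
        multi = sum(1 for scores in per_probe_scores
                    if sum(s < threshold for s in scores) > 1)
        return multi / len(per_probe_scores)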


Slide 60: Performance Report Card

Slide 61: In addition to Match/Non-Match Errors…

Report:

FTA (Failure To Acquire): because some systems may produce better DET curves by rejecting (i.e. failing to acquire) the images that are more difficult to recognize, e.g. iris images that are occluded.

FCR (Failure of Confidence Rate): because some systems may produce better DET curves by allowing more matches below/above the matching threshold, i.e. by producing less reliable recognition decisions.

Slide 62: 4. Lessons learnt

Main lesson and motivation for deploying biometrics:

“Even though no biometric modality is error-free, with proper system tuning and setup adjustment, critical errors of a biometric system can be minimized to the level allowed for operational use.”

And it is only through performance evaluation that

- biometric systems errors, and

- factors / parameters that affect the recognition performance

can be discovered and properly taken into account!

Slide 63: Especially because…

There are many biometric error types (eg. FMR, FTA, FCR…)

There are many factors that affect the performance (lighting, location …)

Performance may deteriorate over time (as the number of stored people increases and spoofing techniques become more sophisticated).

While …

There are also many ways to improve the performance (eg. more samples, modalities, constraints, …)

Slide 64: References

ISO/IEC 19795: Biometric performance testing and reporting.

D. Gorodnichy. “Face databases and evaluation”, in Encyclopedia of Biometrics (Editor: Stan Li), Elsevier Publisher, 2009.

D. Gorodnichy. “Evolution and Evaluation of Biometric Systems”, Proceedings of Second IEEE Symposium on Computational Intelligence for Security and Defense Applications. Ottawa, Canada, 9-10 July 2009

D. Gorodnichy. “Multi-order analysis framework for comprehensive biometric performance evaluation”, SPIE Conference on Defense, Security, and Sensing. Orlando, 5–9 April 2010.

C-BET (Comprehensive Biometrics Evaluation Toolkit): developed by the Canada Border Services Agency for the GoC, for selecting new and tuning existing biometric systems.