
Copyright © 2010 by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail permissions@acm.org. ETRA 2010, Austin, TX, March 22 – 24, 2010. © 2010 ACM 978-1-60558-994-7/10/0003 $10.00

Gaze Estimation Method based on an Aspherical Model of the Cornea: Surface of Revolution about the Optical Axis of the Eye

Takashi Nagamatsu∗ Yukina Iwamoto† Junzo Kamahara‡ Naoki Tanaka§
Kobe University

Michiya Yamamoto¶
Kwansei Gakuin University

Abstract

A novel gaze estimation method based on a novel aspherical model of the cornea is proposed in this paper. The model is a surface of revolution about the optical axis of the eye. The calculation method is explained on the basis of the model. A prototype system for estimating the point of gaze (POG) has been developed using this method. The proposed method has been found to be more accurate than the gaze estimation method based on a spherical model of the cornea.

CR Categories: H.5.2 [Information Interfaces and Presentation]: User Interfaces—Ergonomics; I.4.9 [Image Processing and Computer Vision]: Applications

Keywords: Gaze tracking, calibration-free, eye movement, eye model

1 Introduction

The use of a physical model of the eye for remote gaze estimation has gained considerable importance in recent times because this technique does not require a large number of calibration points. Most of the studies use a spherical model of the cornea [Shih and Liu 2004; Guestrin and Eizenman 2007]. However, Shih et al. and Guestrin et al. pointed out that a spherical model may not be suitable for modeling the boundary region of the cornea.

In this paper, we propose a novel physical model of the cornea, which is a surface of revolution about the optical axis of the eye, and an estimation method based on the model.

2 Gaze estimation based on spherical model of cornea

In our previous studies, we proposed systems for the estimation of the point of gaze (POG) on a computer display based on the spherical model of the cornea shown in Figure 1 [Nagamatsu et al. 2008a; Nagamatsu et al. 2008b]. Figure 2 shows the evaluation results of the system that used two cameras and two light sources; the evaluation involved three subjects (a, b, c) [Nagamatsu et al. 2008b]. The system had low accuracy in estimating the POG around the top left and right corners of the display. This could be because the boundary region of the modeled cornea was not spherical, or because light from an LED was reflected from the sclera, which was not modeled.

∗e-mail: [email protected]
†e-mail: [email protected]
‡e-mail: [email protected]
§e-mail: [email protected]
¶e-mail: [email protected]

Figure 1: Eye model with spherical model of cornea.


Figure 2: Evaluation results of our previously proposed system based on spherical model of cornea, in display coordinate system.

3 Novel aspherical model of cornea: surface of revolution about optical axis of eye

3.1 Novel aspherical model

We propose an aspherical model of the cornea for remote gaze estimation. This model is a surface of revolution about the optical axis of the eye, as shown in Figure 3. In our model, the cornea and the sclera can be joined smoothly at their boundary.
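For intuition, such a model can be written as a meridian profile z(r) rotated about the optical axis. The following sketch (Python/NumPy, our own illustration) uses a conic-section profile with apical radius R and asphericity Q, a description of corneal shape common in the optics literature; the paper itself only requires rotational symmetry about the optical axis, not this particular profile, so the function name and default values are purely illustrative.

    import numpy as np

    def corneal_surface_point(r, phi, R=7.8, Q=-0.26):
        """A point on a rotationally symmetric corneal surface.
        The meridian is a conic section (sag equation) with apical radius R [mm]
        and asphericity Q; any profile z(r) would do, since the model only
        assumes a surface of revolution about the optical axis (the z-axis here).
        r: distance from the optical axis, phi: rotation angle about the axis."""
        sag = r**2 / (R * (1.0 + np.sqrt(1.0 - (1.0 + Q) * r**2 / R**2)))
        return np.array([r * np.cos(phi), r * np.sin(phi), sag])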

3.2 Determination of optical axis of eye on the basis of novel model

We use a special arrangement of cameras and light sources to determine the optical axis of the eye on the basis of the novel model. We use two cameras with a light source attached to each camera. The position of each light source is assumed to be the same as the nodal point of the camera.

Figure 4 shows a cross section of the eyeball. A is the center of corneal curvature near the optical axis. L_j and C_j denote the position of light source j and the nodal point of camera j, respectively; C_j is assumed to be the same as L_j. The value of C_j (= L_j) is determined by calibrating the camera beforehand.



Figure 3: Surface of revolution about optical axis of eye.

Figure 4: Cross section of eyeball showing center of corneal curvature, along with center of pupil, position of light source, and nodal point of camera.

A ray originating from the center of the pupil B is refracted at a point B′′_j, passes through the nodal point of camera j, C_j, and intersects the camera image plane at a point B′_j.

A ray from L_j is reflected at a point P_j on the corneal surface back along its incident path. It passes through C_j and intersects the camera image plane at a point P′_j. If the cornea were perfectly spherical, the line connecting L_j and P_j would pass through A, and A could be determined by using the two cameras. However, the position of A cannot be estimated accurately when light is reflected from an aspherical region of the surface of the eye.

Because we use the model of a surface of revolution about the optical axis of the eye, the ray from L_j is reflected from the surface of the eye back in the plane that includes the optical axis of the eye. Therefore, A, B, B′_j, B′′_j, C_j, L_j, P_j, P′_j, and the optical axis of the eye are coplanar. The normal vector of a plane that includes the optical axis is {(C_j − B′_j) × (P′_j − C_j)}, and the plane is expressed as

{(C_j − B′_j) × (P′_j − C_j)} · (X − C_j) = 0,    (1)

where X (= (x, y, z)^T) is a point on the plane. We obtain two planes when we use two cameras (j = 0, 1). The optical axis of the eye can be determined from the intersection of the two planes. The two planes must not be coplanar (i.e., the optical axis of the eye must not be coplanar with the nodal points of both cameras).
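As a concrete illustration of Equation (1), the optical axis direction can be obtained with two cross products: each camera yields a plane normal, and the axis, lying in both planes, is orthogonal to both normals. The sketch below assumes that the pupil-center image point B′_j and the glint image point P′_j have already been expressed as 3D points in the world coordinate system using the calibrated camera parameters; the function names are ours, not the paper's.

    import numpy as np

    def plane_normal(C, B_img, P_img):
        """Normal of the plane through C_j containing the optical axis:
        (C_j - B'_j) x (P'_j - C_j), as in Equation (1)."""
        return np.cross(C - B_img, P_img - C)

    def optical_axis_direction(C0, B0_img, P0_img, C1, B1_img, P1_img):
        """Unit direction of the optical axis as the intersection of the two
        planes obtained from cameras j = 0, 1."""
        n0 = plane_normal(C0, B0_img, P0_img)
        n1 = plane_normal(C1, B1_img, P1_img)
        d = np.cross(n0, n1)        # direction common to both planes
        return d / np.linalg.norm(d)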

Irrespective of the part of the surface of the eye from which light is reflected (the central region of the corneal surface, the boundary region of the cornea, the scleral region, etc.), the optical axis of the eye can be determined mathematically. In practice, however, the scleral surface is not smooth, which makes it difficult to estimate the glint positions there.

By using this method, only the optical axis of the eye can be determined. However, it is also necessary to determine A in order to determine the visual axis of the eye, because we have assumed that the visual axis and the optical axis of the eye intersect at A. The visual axis of the eye is described in parametric form as X = A + tc, where c is the unit direction vector of the visual axis of the eye, which can be estimated by the method described in Section 5.2. An estimation error in A leads to a parallel shift of the visual axis of the eye. Therefore, as the distance between the user and the gazed object increases, the estimation error of the POG on the object in terms of view angle decreases. The intersection of the two lines X = C_j + t_j(C_j − B′_j), j = 0, 1, gives an approximation of A; the approximation error is less than approximately 7.8 mm (the average value of the radius of the cornea).

However, when the distance between the user and the object is small, the estimation error of the POG caused by the estimation error of A is relatively large in terms of view angle. To solve this problem, we propose a method for estimating A and thereby determining the visual axis of the eye accurately.

4 Estimation of user-dependent parameters (user calibration)

We have to estimate the following user-dependent parameters: the radius of corneal curvature near the optical axis of the eye, R; the distance between the centers of corneal curvature and the pupil, K; and the offset between the optical and visual axes, α and β. In order to estimate these parameters, the user is instructed to gaze at a single point (calibration point) whose position is known. The position of the calibration point is selected such that the light from the camera is reflected from a part of the corneal surface that can be approximated as a sphere. It is also assumed that the pupil is observed through a part of the corneal surface that can be approximated as a sphere. Therefore, the refraction at the corneal surface where the pupil is observed by the cameras can be treated on the basis of the spherical model of the cornea.

4.1 Estimation of radius of corneal curvature on the basis of spherical model of cornea

When a user gazes at an object near the camera in the user-calibration process, light is reflected from the spherical corneal surface. Hence, we can use the spherical model of the cornea in this case.

We estimate the position of the center of corneal curvature, A. Figure 5 shows a cross section of the cornea including the center of corneal curvature A; the position of light source i, L_i; the position of light source j, L_j; the nodal point of camera i, C_i; and the nodal point of camera j, C_j. The positions of C_i (= L_i) and C_j (= L_j) are known. A ray from L_i reflected from the corneal surface returns to C_i and reaches P′_ii. The extension of this path includes A, because the corneal surface is assumed to be a sphere. Similarly, the line connecting C_j and P′_jj includes A. Therefore, A can be estimated from the intersection of two lines as follows:



X = C_i + t_ii (C_i − P′_ii),    (2)

X = C_j + t_jj (C_j − P′_jj),    (3)

where t_ii and t_jj are parameters.

A ray from L_i is reflected at a point P_ji on the corneal surface such that the reflected ray passes through C_j and intersects the camera image plane at a point P′_ji. Similarly, a ray from L_j is reflected at a point P_ij on the corneal surface such that the reflected ray passes through C_i and intersects the camera image plane at a point P′_ij. In order to estimate the radius of the cornea, we estimate the reflection point P_ji (= P_ij), that is, the intersection of the following lines:

X = C_i + t_ij (C_i − P′_ij),    (4)

X = C_j + t_ji (C_j − P′_ji),    (5)

where t_ij and t_ji are parameters. Therefore, the radius of corneal curvature, R, is determined as R = ||P_ji − A||.
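In practice the two lines in Equations (2)–(3), and likewise (4)–(5), rarely intersect exactly because of measurement noise, so a least-squares "intersection" (the midpoint of the shortest segment between the lines) is commonly used. The sketch below is such a helper; it is our own formulation under that assumption, not code from the paper.

    import numpy as np

    def closest_point_of_two_lines(o0, d0, o1, d1):
        """Approximate intersection of X = o0 + t0*d0 and X = o1 + t1*d1:
        midpoint of the shortest segment connecting the two lines."""
        d0 = d0 / np.linalg.norm(d0)
        d1 = d1 / np.linalg.norm(d1)
        # Normal equations for minimizing ||(o0 + t0*d0) - (o1 + t1*d1)||.
        M = np.array([[d0 @ d0, -(d0 @ d1)],
                      [d0 @ d1, -(d1 @ d1)]])
        b = np.array([d0 @ (o1 - o0), d1 @ (o1 - o0)])
        t0, t1 = np.linalg.solve(M, b)
        return 0.5 * ((o0 + t0 * d0) + (o1 + t1 * d1))

    # Equations (2)-(3): A from the two glint rays through the nodal points.
    #   A = closest_point_of_two_lines(C_i, C_i - P_ii_img, C_j, C_j - P_jj_img)
    # Equations (4)-(5): reflection point, then the corneal radius.
    #   P_ji = closest_point_of_two_lines(C_i, C_i - P_ij_img, C_j, C_j - P_ji_img)
    #   R = np.linalg.norm(P_ji - A)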

Figure 5: Cross section of cornea showing center of corneal curvature, positions of light sources, and nodal points of cameras.

4.2 Estimation of distance between centers of corneal curvature and pupil

As shown in Figure 6, a ray originating from the center of the pupil B is refracted at a point B′′_j, passes through the nodal point of camera j, C_j, and intersects the camera image plane at a point B′_j. B′′_j can be determined by solving the equations given below:

X = C_j + t_j (C_j − B′_j),    (6)

R = ||X − A||.    (7)

These equations may have two solutions; we select the one closer to C_j.
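Equations (6)–(7) amount to intersecting a ray with a sphere and keeping the root nearer to C_j. A minimal sketch, assuming A and R are already available (names are our own choices):

    import numpy as np

    def refraction_point_on_cornea(C, B_img, A, R):
        """Solve Equations (6)-(7): intersect the ray X = C + t*(C - B'_j)
        with the sphere ||X - A|| = R and return the solution closer to C."""
        d = (C - B_img) / np.linalg.norm(C - B_img)
        oc = C - A
        b = d @ oc
        disc = b * b - (oc @ oc - R * R)   # discriminant of the quadratic in t
        if disc < 0.0:
            return None                    # the ray misses the spherical cornea
        t = -b - np.sqrt(disc)             # smaller root: the point closer to C
        return C + t * d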

The refracted vector t_j at B′′_j (shown in Figure 6) can be obtained by using Snell's law as follows:

t_j = (−ρ n_j · v_j − √(1 − ρ²(1 − (n_j · v_j)²))) n_j + ρ v_j,    (8)

where the incident vector v_j = (C_j − B′_j)/||C_j − B′_j||, the normal vector at the point of refraction n_j = (B′′_j − A)/||B′′_j − A||, and ρ = n1/n2 (n1: refractive index of air ≈ 1; n2: effective refractive index ≈ 1.3375).
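Equation (8) is the standard vector form of Snell's law; a direct transcription with our own variable names is:

    import numpy as np

    def refract(v, n, rho):
        """Refracted direction t_j of Equation (8).
        v: unit incident vector (C_j - B'_j normalized), n: unit outward normal
        at B''_j, rho = n1/n2 (about 1/1.3375 when tracing from air into the eye)."""
        c = n @ v                                  # n_j . v_j (negative here)
        k = 1.0 - rho * rho * (1.0 - c * c)
        if k < 0.0:
            return None                            # total internal reflection
        return (-rho * c - np.sqrt(k)) * n + rho * v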

The center of the pupil, B, can be determined from the intersection of the two rays from the two cameras, as follows:

X = B′′_j + s_j t_j (j = 0, 1),    (9)

where s_j is a parameter. Therefore, the distance between the centers of corneal curvature and the pupil, K, is determined as K = ||B − A||.
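Putting Equations (6)–(9) together, B is the (least-squares) intersection of the two refracted rays, and K is then a single norm. A short sketch reusing the helpers above (again, our names, not the paper's code):

    import numpy as np

    def pupil_center_and_K(B2_0, t_0, B2_1, t_1, A):
        """Equation (9) and K = ||B - A||, given the refraction points B''_j and
        refracted directions t_j from the two cameras; relies on
        closest_point_of_two_lines defined after Equations (2)-(5)."""
        B = closest_point_of_two_lines(B2_0, t_0, B2_1, t_1)
        return B, np.linalg.norm(B - A)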

Figure 6: Refraction on corneal surface.

4.3 Estimation of offset between optical and visual axes

The offset between the optical and visual axes is expressed by two parameters, e.g., horizontal and vertical angles. For the case of a user gazing at a known position, the offset between the optical and visual axes is calculated by the method described in Nagamatsu et al. [2008b].

5 Estimation of visual axis of eye after user calibration

After the user calibration, the user moves his/her eyes freely. The optical axis of the eye can be calculated by the method described in Section 3. R, K, and the offset between the optical and visual axes of the eye are known from the user calibration.

The position of the center of corneal curvature, A, and the unit direction vector along the visual axis of the eye, c, are required for the calculation of the visual axis of the eye.

5.1 Estimation of center of corneal curvature

We suppose that the corneal surface where the pupil is observed can be approximated as a spherical surface. The algorithm for searching for the position of A is as follows:

1) Set the position of A on the optical axis; select the position that is nearest to the intersection of the two lines X = C_j + t_j(C_j − B′_j) (j = 0, 1).



2) Calculate B′′_j and t_j by using Equations 6, 7, and 8, where R is known from the user calibration.

3) Calculate B, the position of the center of the pupil, from the intersection of the two lines described by Equation 9.

4) Calculate the distance between B and A, and compare it to the K that was estimated during the user calibration.

5) Shift the position of A toward the rotation center of the eye along the optical axis of the eye and repeat steps 1–4 to determine the accurate position of A. It is sufficient to search the position of A over a length of 10 mm, because the average radius of the cornea is approximately 7.8 mm. The search is finished when ||B − A|| = K.
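A sketch of this search is given below. It reuses the helper functions from the earlier sketches and, instead of stopping exactly when ||B − A|| = K (which a discrete search will rarely hit), keeps the candidate A that minimizes |‖B − A‖ − K|; the step size, search depth, and all names are our own choices, not the paper's.

    import numpy as np

    def estimate_corneal_center(axis_point, axis_dir, cameras, R, K,
                                step=0.05, depth=10.0):
        """1D search for A along the optical axis (Section 5.1).
        axis_point: starting point on the axis (nearest to the two pupil rays),
        axis_dir: unit vector pointing toward the rotation center of the eye,
        cameras: list of (C_j, B'_j) pairs in world coordinates,
        R, K: user-calibrated corneal radius and pupil distance in mm."""
        rho = 1.0 / 1.3375
        best_A, best_err = None, np.inf
        for s in np.arange(0.0, depth, step):
            A = axis_point + s * axis_dir
            rays = []
            for C, B_img in cameras:
                B2 = refraction_point_on_cornea(C, B_img, A, R)   # Eqs. (6)-(7)
                if B2 is None:
                    break
                v = (C - B_img) / np.linalg.norm(C - B_img)
                n = (B2 - A) / np.linalg.norm(B2 - A)
                rays.append((B2, refract(v, n, rho)))             # Eq. (8)
            if len(rays) < 2:
                continue
            B = closest_point_of_two_lines(rays[0][0], rays[0][1],
                                           rays[1][0], rays[1][1])  # Eq. (9)
            err = abs(np.linalg.norm(B - A) - K)
            if err < best_err:
                best_A, best_err = A, err
        return best_A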

5.2 Estimation of visual axis of eye and POG

The unit direction vector of the visual axis of the eye, c, is determined from the unit direction vector of the optical axis of the eye, d, and the offset between the optical and visual axes of the eye by using the method described in Nagamatsu et al. [2008b].

The intersection point between the visual axis of the eye (X = A + tc) and the object gives the POG.
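When the gazed object is the display, this intersection is a simple ray-plane computation. A minimal sketch, assuming the display plane is given by a point on it and its unit normal in the world coordinate system:

    import numpy as np

    def point_of_gaze(A, c, plane_point, plane_normal):
        """POG: intersection of the visual axis X = A + t*c with the display
        plane through plane_point with normal plane_normal."""
        denom = plane_normal @ c
        if abs(denom) < 1e-9:
            return None                    # visual axis parallel to the display
        t = plane_normal @ (plane_point - A) / denom
        return A + t * c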

6 Implementation

A prototype system for the estimation of the POG on a display has been implemented, as shown in Figure 7. This system consists of two synchronized monochrome IEEE-1394 digital cameras (Firefly MV, Point Grey Research Inc.), a 17′′ LCD, and a Windows-based PC (Windows XP). The software was developed using OpenCV 1.0 [Intel]. Each camera is equipped with a 1/3′′ CMOS image sensor whose resolution is 752 × 480 pixels. A 35-mm lens and an IR filter are attached to each camera. Two infrared LEDs are attached to each camera such that the midpoint of the two LEDs coincides with the nodal point of the camera. The cameras are positioned under the display. The intrinsic parameters of the cameras are determined before setting up the system.

Figure 7: Prototype system.

The evaluation of the prototype system in a laboratory involved an adult subject who does not wear glasses or contact lenses. The subject's right eye was approximately 500 mm from the display. She was asked to stare at 25 points on the display. More than 10 data points were recorded for each point.

In order to confirm the effectiveness of our method, we compared our method with the method described in Sections 3.2 and 3.3 of Chen et al. [2008], in which the optical axis was determined as a line connecting the corneal center and the virtual pupil on the basis of the spherical model of the cornea.

Figure 8 shows the evaluation results. The crosses and triangles indicate the POGs obtained by our method and Chen's method, respectively. Our method appears to be more accurate than Chen's method in determining the POG at the top left and right corners of the display. The estimated R and K were 8.04 mm and 4.43 mm, respectively.

Figure 8: Comparison of our method and Chen's method in display coordinate system.

7 Conclusion

We proposed a novel physical model of the eye for remote gaze tracking. This model is a surface of revolution about the optical axis of the eye. We determined the mathematical expression for estimating the POG on the basis of the model. We evaluated the prototype system developed on the basis of our method and found that the system could be used to estimate the POG on the entire computer display.

References

CHEN, J., TONG, Y., GRAY, W., AND JI, Q. 2008. A robust 3D eye gaze tracking system using noise reduction. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, 189–196.

GUESTRIN, E. D., AND EIZENMAN, M. 2007. Remote point-of-gaze estimation with free head movements requiring a single-point calibration. In Proceedings of the 29th Annual International Conference of the IEEE EMBS, 4556–4560.

INTEL. Open source computer vision library. http://sourceforge.net/projects/opencvlibrary/.

NAGAMATSU, T., KAMAHARA, J., IKO, T., AND TANAKA, N. 2008. One-point calibration gaze tracking based on eyeball kinematics using stereo cameras. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, 95–98.

NAGAMATSU, T., KAMAHARA, J., AND TANAKA, N. 2008. 3D gaze tracking with easy calibration using stereo cameras for robot and human communication. In Proceedings of IEEE RO-MAN 2008, 59–64.

SHIH, S.-W., AND LIU, J. 2004. A novel approach to 3-D gaze tracking using stereo cameras. IEEE Transactions on Systems, Man, and Cybernetics, Part B 34, 1, 234–245.
