
Binocular Eye Tracking in Virtual Reality for Inspection Training

Andrew T. Duchowski, Vinay Shivashankaraiah, Tim Rawls
Department of Computer Science, Clemson University

Anand K. Gramopadhye, Brian J. Melloy
Department of Industrial Engineering, Clemson University

Barbara Kanki
Human Factors Research and Technology Division, NASA Ames Research Center

(a) User interface. (b) Eye image from tracking device. (c) 3D scanpaths.

Figure 1: Recording binocular eye movements in aircraft cargo bay virtual inspection environment.

Abstract

This paper describes the development of a binocular eye tracking Virtual Reality system for aircraft inspection training. The aesthetic appearance of the environment is driven by standard graphical techniques augmented by realistic texture maps of the physical environment. A "virtual flashlight" is provided to simulate a tool used by inspectors. The user's gaze direction, as well as head position and orientation, are tracked to allow recording of the user's gaze locations within the environment. These gaze locations, or scanpaths, are calculated as gaze/polygon intersections, enabling comparison of fixated points with stored locations of artificially generated defects located in the environment interior. Recorded scanpaths provide a means of comparison of the performance of experts to novices, thereby gauging the effects of training.

CR Categories: I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism—Virtual Reality; I.3.6 [Computer Graphics]: Methodology and Techniques—Interaction Techniques; J.4 [Computer Applications]: Social and Behavioral Sciences—Psychology

Keywords: Eye Tracking, Virtual Reality, Visual Inspection

1 Introduction

Aircraft inspection and maintenance are an essential part of a safe, reliable air transportation system. Training has been identified as the primary intervention strategy in improving inspection performance [4]. If training is to be successful, it is clear that inspectors need to be provided with training tools to help enhance their inspection skills. In response to this need, this paper outlines a Virtual Reality (VR) inspection simulator under development at the Virtual Reality Eye Tracking (VRET) Laboratory at Clemson University.

The VR inspection system is a collaborative extension of recent efforts being pursued at the Training System Laboratory (TSL) at Clemson. Previous work at the TSL focused on the development of a computer based inspection training program—Automated System of Instruction for Specialized Training (ASSIST) [5]. The ASSIST program, developed using a task analytic methodology, features a PC-based inspection simulation of an aircraft cargo bay, where an image of a portion of the airframe is presented to the user for inspection (visual detection of defects). The image is a photo of a section of an aircraft's interior. The user is shown an image selected from a two-dimensional 4 × 8 grid of images, and is able to "navigate" left, right, up, and down to view the entire grid, one image at a time. Despite its advantages of being a computer-based inspection training/job-aid tool, the static, two-dimensional layout of the airframe lacks realism. To enhance the fidelity of the inspection system, an immersive, three-dimensional VR system has been developed.

The purpose of this paper is to describe the development of the Virtual Reality inspection system. The VR inspection simulator features a binocular eye tracker, built into the system's Head Mounted Display (HMD), which allows the recording of the user's dynamic point of regard within the virtual environment. User gaze directions, as well as head position and orientation, are tracked to enable navigation and post-immersive examination of the user's overt spatio-temporal focus of attention while immersed in the environment. The recorded point of regard addresses imprecision and ambiguity of the user's viewpoint in a virtual environment by explicitly providing the 3D location of the user's gaze.

In Virtual Reality, the three-dimensional location of gaze serves as either a real-time or a post-immersion diagnostic indicator of the user's overt focus of attention. The collection of gaze points taken over the course of immersion, the user's three-dimensional scanpath, serves as a diagnostic tool for post-immersive reasoning about the user's actions in the environment. In our system, we conjecture recorded eye movements, or scanpaths, will enable comparison of the performance of experts to novices, thereby gauging the effects of training.

The main contribution of this paper is the presentation of real-time software techniques for the integration of the binocular eye tracker in a Virtual Reality application. Presently, it appears that the binocular eye tracker coupled with an HMD capable of vergence measurement in VR is the first of its kind to be assembled in the United States. Although binocular eye trackers integrated with HMDs have previously been proposed [8], no reports of their actual construction or operation have been found. For reference, the hardware used at the Clemson University Virtual Reality Eye Tracking lab is briefly described in Section 2. Section 3 presents the evolution of the geometric model of the virtual aircraft cargo bay inspection environment. In Section 4, technical issues of eye tracker system integration and operation are discussed, emphasizing coordinate mapping between the eye tracker and the VR application, and the calculation of the three-dimensional gaze point from binocular eye movement measurements. Concluding remarks are given in Section 5.

2 Hardware Platform

Our primary rendering engine is a dual-rack, dual-pipe SGI Onyx2® InfiniteReality™ system with 8 raster managers and 8 MIPS® R10000™ processors, each with 4 MB secondary cache.1 It is equipped with 3 GB of main memory and 0.5 GB of texture memory.

Multi-modal hardware components include a binocular ISCAN eye tracker mounted within a Virtual Research V8 (high resolution) Head Mounted Display (HMD). The V8 HMD offers 640 × 480 resolution per eye with separate left and right eye feeds. HMD position and orientation tracking is provided by an Ascension 6 Degree-Of-Freedom (6DOF) Flock Of Birds (FOB), a d.c. electromagnetic system with a 10 ms latency. A 6DOF tracked, hand-held mouse provides a means to represent a virtual tool for the user in the environment. The lab is shown in Figure 2.

Figure 2: Virtual Reality Eye Tracking (VRET) laboratory at Clemson University.

3 Geometric Environment Modeling

The goal of the construction of the virtual environment is to match the appearance of the physical inspection environment, an aircraft cargo bay, shown in Figure 3.

1 Silicon Graphics, Onyx2, and InfiniteReality are registered trademarks of Silicon Graphics, Inc.

Figure 3: Aircraft cargo bay physical environment.

The physical environment is a complex three-dimensional cube-like volume, with airframe components (e.g., fuselage ribs) exposed for inspection. A typical visual inspection task of the cargo bay involves carefully walking over the structural elements while searching for surface defects such as corrosion and cracks (among others).

3.1 Model Geometry

The evolution of the virtual inspection environment began with a straightforward emulation of the two-dimensional ASSIST inspection grid. The rationale for this design was the relatively quick development of a simple texture-mapped grid to provide the user with a more natural representation and navigation of the entire inspection region. The image grid, displayed as a flat polygonal wall in VR, is shown in Figure 4.

Figure 4: Initial planar 2D virtual environment.

Although the VR environment provided advantages over the PC-based system, several problems with its design became evident.

The main advantage of the VR system is its display of the entire side of the airframe's wall, which provides better context for the user in terms of the location of the individual panels under inspection. The obvious drawbacks of this implementation, however, are that there are noticeable discrepancies in the appearance of the images (e.g., lighting changes and positional misalignment), and the unsuitable simplicity of the geometry for VR. The simplistic 2D wall in effect defeats the immersive and natural navigational advantages offered by VR technology. Clearly, to provide an immersive environment, a three-dimensional structure was required.

The next evolution of the inspection environment was patterned after a simple three-dimensional enclosure (e.g., a cube), specified by actual dimensions of the real inspection environment, i.e., an aircraft's cargo bay. An early stage of the wireframe rendition of the cargo bay "3D box" is shown in Figure 5.

Figure 5: 3D box-like virtual environment (shown in wireframe).

The model is built entirely out of planar polygons. There are two pragmatic reasons for this design choice. First, since the representation of the true complexity of the airframe structure is avoided, fast display rates are maintained (on the order of 10-30 fps) while tracking latencies are minimized (on the order of 10-30 ms for head and eye trackers).2 Second, planar polygons (quadrilaterals) are particularly suitable for texture mapping. To provide a realistic appearance of the environment, in keeping with the desire of preserving geometric simplicity, images of the physical environment are used for texture maps.

3.2 Lighting and Flashlight Modeling

The SGI Onyx2 host provides real-time graphics rendering performance, while simultaneously processing tracking information sent to the host via the RS-232 serial connection.

2 Display update and latency measures are only rough estimates at this point; we are working on obtaining more formal measurements.

To generate the environment, no specialized rendering algorithms are invoked beyond what is provided by the OpenGL graphics library Application Program Interface (API). Standard (first-order) direct illumination is used to light the environment. Additionally, an OpenGL spotlight is used to provide the user with a "virtual flashlight". The flashlight effect is shown over a 2D polygon in Figure 6.

Figure 6: Virtual flashlight shown over a 2D polygon.

The flashlight's position and orientation are obtained from the 6DOF electromagnetically tracked "flying mouse" from Ascension.

Because OpenGL relies on the Phong illumination model coupled with Gouraud shading to generate lighting effects, large polygons produce a coarse (blocky) flashlight beam. To correct this problem, the polygons were subdivided to smooth out the spotlight, producing a more aesthetically pleasing circular effect. The level of polygonal subdivision is user-adjustable.
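For concreteness, the following sketch shows how such an OpenGL (1.x fixed-function) spotlight could be configured from the tracked 6DOF mouse pose; the function and variable names are illustrative assumptions rather than code from the original system, and the cutoff and exponent values are arbitrary. Because fixed-function lighting is evaluated per vertex (Gouraud), the beam only appears smooth once the receiving polygons are subdivided as described above.

```c
/* Hypothetical sketch: an OpenGL 1.x spotlight used as a "virtual flashlight".
 * Position and direction would come from the tracked 6DOF "flying mouse". */
#include <GL/gl.h>

void update_flashlight(const float pos[3], const float dir[3])
{
    GLfloat light_pos[4] = { pos[0], pos[1], pos[2], 1.0f };  /* w = 1: positional light */
    GLfloat spot_dir[3]  = { dir[0], dir[1], dir[2] };
    GLfloat white[4]     = { 1.0f, 1.0f, 1.0f, 1.0f };

    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT1);
    glLightfv(GL_LIGHT1, GL_POSITION, light_pos);
    glLightfv(GL_LIGHT1, GL_DIFFUSE,  white);
    glLightfv(GL_LIGHT1, GL_SPOT_DIRECTION, spot_dir);
    glLightf (GL_LIGHT1, GL_SPOT_CUTOFF,   15.0f);   /* beam half-angle, in degrees  */
    glLightf (GL_LIGHT1, GL_SPOT_EXPONENT, 32.0f);   /* falloff toward the beam edge */
}
```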

3.3 Realistic Texture Maps

To texture map the simple 3D box-like environment, images of the physical environment were obtained. With permission from a commercial airline, images of an aircraft's cargo bay were taken while the aircraft was withheld from operation for inspection and servicing. Since the time to photograph the cargo bay interior was limited, no particular methodology or special equipment was used to obtain these images. Care was taken to attempt to photograph the environment in sections by translating a hand-held camera; however, inaccuracies in image alignment were inevitable.

In retrospect, a better approach would have been to measure the environment a priori, then calculate and record appropriate positions of the camera, and ensure proper placement of the camera on some stable rigging apparatus (e.g., a levelled tripod which could be fastened to the aircraft interior). This approach would be similar to a motion capture or an image-rendering session used in special effects production work. Of course, this method would require more time and care to carry out.

Although our approach was somewhat haphazard, image alignment was possible through careful labeling of photographs and a good deal of digital post-processing. In our case, we used The Gimp (the freeware analogue of PhotoShop) to orient and resize the images appropriately. The resulting environment, with the user-held "virtual flashlight", is shown in Figure 7.

Figure 7: Virtual flashlight in virtual aircraft cargo bay environment.

4 Eye Tracking

Interest in gaze-contingent interface techniques has endured since early implementations of eye-slaved flight simulators and has since permeated several disciplines including human-computer interfaces, teleoperator environments, and visual communication modalities [7, 9, 6]. Recent applications of an eye tracker in VR have shown promise in the use of the device as a component of multi-modal interface systems. For example, Jacob has used an eye tracker as a selection device in VR [10], and Danforth et al. use an eye tracker as an indicator of gaze in a gaze-contingent multi-resolution terrain navigation environment [1]. Although these are examples of eye trackers used as interactive devices, in the present application, the eye tracker serves a diagnostic purpose. That is, no changes in the display occur due to the location of the user's gaze; rather, the user's eye movements are simply (and unobtrusively) recorded in real-time for post-immersion analysis.

In our system, a dedicated PC calculates the point of regard in real-time (60 Hz) from left and right video images of the user's pupils and infra-red corneal reflections (first Purkinje images). Figure 8 shows a user wearing the eye tracking HMD. Eye images captured by the cameras can be seen in two video monitors near the lower right of the figure. In this section, the technical issues of system integration and operation are presented (for a technical description of system installation and wiring, see [2]).

4.1 Eye Tracker Integration

In designing the visual inspection simulator, a critical concern is the mapping of eye tracker coordinates to the application program's reference frame.

Figure 8: Eye tracking HMD.

The eye tracker calculates the viewer's point of regard relative to the eye tracker's screen reference frame, e.g., a 512 × 512 pixel plane, perpendicular to the optical axis. That is, for each eye, the eye tracker returns a sample coordinate pair of x- and y-coordinates of the point of regard at each sampling cycle (e.g., once every ~16 ms for a 60 Hz device). This coordinate pair must be mapped to the extents of the application program's viewing window. The VR application must also, in the same update cycle, map the coordinates of the head's position and orientation tracking device.

4.2 Eye Tracker Coordinate Mapping

The eye movement data obtained from the tracker must be mapped to a range appropriate for the VR application. Specifically, the 2D eye tracker data, expressed in eye tracker screen coordinates, must be mapped to the 2D dimensions of the near viewing frustum. The 3D viewing frustum employed in the perspective viewing transformation is defined by the parameters left, right, bottom, top, near, far.3 Figure 9 shows the dimensions of the eye tracker screen (left) and the dimensions of the viewing frustum (right).

Figure 9: Eye tracker to 3D viewing frustum screen coordinate mapping.

3 For example, as used in the OpenGL function call glFrustum().

To convert the eye tracker coordinates (x, y) to graphics coordinates (x′, y′), a linear interpolation mapping is used:

x′ = left + x (right − left) / 512    (1)

y′ = bottom + (512 − y) (top − bottom) / 512    (2)

Since the eye tracker origin is at the top-left of the screen and the viewing frustum's origin is at the bottom-left (a common discrepancy between imaging and graphics applications), the term (512 − y) in Equation (2) handles the necessary y-coordinate mirror transformation.

The above coordinate mapping assumes that the eye tracker coordinates are in the range [0, 511]. In reality, the usable, or effective, coordinates will be dependent on: (a) the size of the application window, and (b) the position of the application window. Proper mapping between eye tracker and application coordinates is achieved through the measurement of the application window's extents in the eye tracker's reference frame. This is accomplished by using the eye tracker's own fine cursor movement and cursor location readout.

To obtain the extents of the application window in the eye tracker's reference frame, the application window's corners are measured with the eye tracker's cursor. These window extents are then used in the linear mapping equation. Figure 10 illustrates an example of a 600 × 450 application window as it would appear on the eye tracker scene monitor, with corners measured at (51, 53), (482, 53), (51, 446), and (482, 446), and central point (267, 250).

Figure 10: Mapping measurement example.

Based on the measurements shown in Figure 10, the linear coordinate mapping is:

x′ = (x − 51) (600) / (482 − 51 + 1)    (3)

y′ = 449 − (y − 53) (450) / (446 − 53 + 1)    (4)

The central point on the eye tracker display is (267, 250). Note that y is subtracted from 449 to account for the image/graphics vertical origin flip.
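As a concrete illustration, the following sketch implements the linear mapping of Equations (1)-(4), parameterized by window extents measured in the eye tracker's reference frame. The type and function names are illustrative assumptions, not the original system's code.

```c
/* Hypothetical sketch of the linear mapping in Equations (1)-(4):
 * eye tracker screen coordinates -> application window coordinates. */
typedef struct { double left, top, right, bottom; } Extents;  /* in eye tracker coordinates */

static void map_eye_to_window(double ex, double ey,        /* eye tracker sample (per eye)      */
                              Extents w,                   /* measured application extents      */
                              double win_w, double win_h,  /* application window size in pixels */
                              double *x, double *y)
{
    *x = (ex - w.left) * win_w / (w.right - w.left + 1.0);
    /* mirror the vertical axis: eye tracker origin is top-left,
     * graphics origin is bottom-left (Equations (2) and (4))    */
    *y = (win_h - 1.0) - (ey - w.top) * win_h / (w.bottom - w.top + 1.0);
}

/* With the measurements of Figure 10 (600 x 450 window):
 *   Extents w = { 51.0, 53.0, 482.0, 446.0 };
 *   double x, y;
 *   map_eye_to_window(267.0, 250.0, w, 600.0, 450.0, &x, &y);   // yields (300, 224) */
```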

4.3 Gaze Vector Calculation

The calculation of the point of regard in three-space depends on only the relative positions of the two eyes in the horizontal axis. The parameters of interest here are the three-dimensional virtual coordinates, (xg, yg, zg), which can be determined from traditional stereo geometry calculations. Figure 11 illustrates the basic binocular geometry.

Figure 11: Basic binocular geometry.

Helmet tracking determines both helmet position and the (orthogonal) directional and up vectors, which determine viewer-local coordinates shown in the diagram. The helmet position is the origin, the helmet directional vector is the optical (viewer-local z) axis, and the helmet up vector is the viewer-local y axis.

Given instantaneous, eye tracked, viewer-local coordinates (xl, yl) and (xr, yr) in the left and right image planes (mapped from eye tracker screen coordinates to the near view plane), and head-tracked head position coordinates (xh, yh, zh), the viewer-local coordinates of the gaze point, (xg, yg, zg), are determined by the relations:

xg = (1 − s) xh + s (xl + xr) / 2    (5)

yg = (1 − s) yh + s (yl + yr) / 2    (6)

zg = (1 − s) zh + s f    (7)

where s = b / (xl − xr + b), b is the disparity distance between the left and right eye centers, and f is the distance to the near viewing plane along the viewer-local z axis.
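A minimal sketch of Equations (5)-(7) follows, computing the viewer-local gaze point from the binocular eye tracker samples; the type and function names are assumptions introduced for illustration only.

```c
/* Hypothetical sketch of Equations (5)-(7): viewer-local gaze point
 * from binocular (vergence) eye movement measurements. */
typedef struct { double x, y, z; } Vec3;

static Vec3 gaze_point(Vec3 h,                 /* helmet (head) position (xh, yh, zh) */
                       double xl, double yl,   /* left eye point on the near plane    */
                       double xr, double yr,   /* right eye point on the near plane   */
                       double b,               /* baseline between the eye centers    */
                       double f)               /* distance to the near view plane     */
{
    double s = b / ((xl - xr) + b);            /* scale parameter defined above       */
    Vec3 g;
    g.x = (1.0 - s) * h.x + s * (xl + xr) / 2.0;
    g.y = (1.0 - s) * h.y + s * (yl + yr) / 2.0;
    g.z = (1.0 - s) * h.z + s * f;
    return g;
}
```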

Figure 12 shows a user's 3D gaze points in the aircraft cargo bay environment. Gaze point coordinates based on vergence calculations given by Equations (5)–(7) are presently closely correlated with the user's head location.4

Figure 12: 3D gaze points in cargo bay virtual environment: (a) front view; (b) side view.

Figure 12(b) shows the collection of gaze points from a side viewpoint. With respect to depth, the gaze points do not precisely fall on the polygonal surfaces of the environment. To compare the visual search performance of experts to novices in a task-specific environment such as the aircraft cargo bay, recorded scanpath information should convey the location of gaze points in the same relative reference frame as the visual search targets. Specifically, we wish to compare these gaze points to the set of stored locations of artificially generated defects located at known coordinates in the cargo bay texture maps. An example of defect location in an environment texture map is shown in Figure 13.5 We are therefore interested in recording the points of intersection of the user's gaze and the environment walls (polygons).

Figure 13: Example of location of artificially generated defect in an environment texture map.

To calculate the gaze/polygon intersection, the gaze point is expressed parametrically as a point on a ray with origin (xh, yh, zh), the helmet position, with the ray emanating along a vector scaled by parameter s. That is, rewriting Equations (5), (6), and (7), we have:

xg = xh + s ((xl + xr) / 2 − xh)

yg = yh + s ((yl + yr) / 2 − yh)

zg = zh + s (f − zh)

or, in vector notation,

g = h + s v    (8)

where h is the head position, v is the central view vector, and s is the scale parameter as defined previously. The view vector v is obtained by subtracting the helmet position from the midpoint of the eye tracked coordinates at the focal distance to the near view plane, i.e.,

v = ((xl + xr) / 2, (yl + yr) / 2, f) − (xh, yh, zh) = m − h    (9)

where m denotes the left and right eye coordinate midpoint.6 To align the view vector to the current head orientation, the vector m must first be transformed to the proper (instantaneous) head orientation. This is done by first normalizing m and then multiplying it by the orientation matrix returned by the head tracker.7

Given the three-dimensional gaze vector, v, specified by Equation (9), Equation (8) gives the coordinates of the gaze point parametrically along a ray originating at the head position (xh, yh, zh). The depth of the three-dimensional gaze point in world coordinates is valid only if s > 0.

4 We are working towards the specification of focal and disparity parameters to give us clearer depth information of the gaze point, dissociating the head position from the point of gaze.

5 The simple target symbol in Figure 13 is currently only used for testing purposes; eventually artificial defects will be made more realistic.
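The ray formulation above can be sketched as follows; the 3×3 row-major orientation matrix layout and all identifiers are assumptions for illustration, and the head-orientation step simply follows the normalize-then-rotate procedure described in the text.

```c
/* Hypothetical sketch of Equations (8)-(9): the gaze ray g(s) = h + s v. */
#include <math.h>

typedef struct { double x, y, z; } Vec3;

static Vec3 mat3_mul(const double M[3][3], Vec3 p)   /* assumed row-major rotation */
{
    Vec3 r = { M[0][0]*p.x + M[0][1]*p.y + M[0][2]*p.z,
               M[1][0]*p.x + M[1][1]*p.y + M[1][2]*p.z,
               M[2][0]*p.x + M[2][1]*p.y + M[2][2]*p.z };
    return r;
}

static Vec3 gaze_direction(Vec3 h,                     /* helmet position          */
                           const double orient[3][3],  /* head tracker orientation */
                           double xl, double yl, double xr, double yr,
                           double f)                    /* near view plane distance */
{
    /* midpoint m of the eye tracked coordinates at the near view plane */
    Vec3 m = { (xl + xr) / 2.0, (yl + yr) / 2.0, f };

    /* normalize m, then rotate it into the instantaneous head orientation */
    double len = sqrt(m.x*m.x + m.y*m.y + m.z*m.z);
    if (len > 0.0) { m.x /= len; m.y /= len; m.z /= len; }
    m = mat3_mul(orient, m);

    /* v = m - h (Equation (9)); the gaze ray is then g(s) = h + s v (Equation (8)) */
    Vec3 v = { m.x - h.x, m.y - h.y, m.z - h.z };
    return v;
}
```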

4.4 Gaze Point Calculation

The formulation of the gaze direction given by Equation (8) can be used for testing virtual gaze/polygon intersection coordinates via traditional ray/polygon intersection calculations commonly used in ray tracing [3]. The gaze/polygon intersection point is found on the closest polygon to the viewer intersecting the gaze ray, assuming all polygons are opaque. This polygon is found by testing all polygons in the scene for intersection with the gaze ray. To find the intersection point g of the gaze ray with the closest polygon, a new interpolant t is obtained by calculating the gaze ray intersections with all scene polygons. All such intersections are examined for which t > 0.8 The interpolant t is obtained by substituting the gaze ray equation into the polygon's plane equation (in vector notation):

t = −(N · h + D) / (N · v)    (10)

where N is the negated polygon normal and D is the height parameter of the polygon's plane equation. The geometry of this calculation is depicted in Figure 14.

6 Note that since the vertical eye tracked coordinates yl and yr are expected to be equal (since gaze coordinates are assumed to be epipolar), the vertical coordinate of the central view vector defined by (yl + yr) / 2 is somewhat extraneous; either yl or yr would do for the calculation of the gaze vector. However, since eye tracker data is also expected to be noisy, this averaging of the vertical coordinates enforces the epipolar assumption.

7 Equivalently, head orientation in the form of a quaternion, as returned by the head tracker, may be used for the same purpose.

8 If t < 0, the polygon may intersect the gaze ray, but behind the viewer.

Figure 14: Ray/plane geometry.

The calculation of the ray/plane intersection may be sped up by evaluating the denominator of Equation (10) first. The intersection algorithm is given in Figure 15.

vd = N · v;              // denominator
if (vd < 0) {            // plane faces the viewer (gaze ray can hit its front side)
    vo = -(N · h + D);   // numerator
    t  = vo / vd;        // ray parameter of the intersection
}

Figure 15: Ray/polygon intersection.

Note that the ray/polygon intersection algorithm only returns the intersection point of the ray and the infinite plane defined by the polygon's face normal. Because the normal defines a plane of infinite extent, the point g must be tested against all of the polygon's edges to establish whether the point lies inside the polygon. This is an instance of a solution to the well-known "point-in-polygon" problem. The geometry of this algorithm is shown in Figure 16.

Figure 16: Point-in-polygon problem.

If the point g is bounded by the perpendicular planes defined by the polygon's edges, then g lies within the polygon; otherwise it lies on the plane defined by the face normal N, but outside the polygonal region. The resulting algorithm generates a scanpath constrained to lie on polygonal regions within the virtual environment. Such a scanpath is shown in Figure 17. Provided the number of polygons is sufficiently small, the algorithm executes in real-time.

Figure 17: 3D scanpath in aircraft cargo bay virtual environment.
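Putting the pieces together, the sketch below searches all scene polygons for the nearest gaze/polygon intersection and applies the point-in-polygon edge test described above. It assumes convex, planar polygons with counter-clockwise vertex winding and an outward face normal (rather than the negated-normal convention stated for Equation (10)), and all names are illustrative; it is not the original system's code.

```c
/* Hypothetical sketch: nearest gaze/polygon intersection with a
 * point-in-polygon edge test (convex, planar polygons assumed). */
#include <stddef.h>

typedef struct { double x, y, z; } Vec3;
typedef struct {
    const Vec3 *verts;   /* vertices in counter-clockwise order        */
    int         nverts;
    Vec3        N;       /* face normal; plane equation is N.p + D = 0 */
    double      D;
} Polygon;

static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 sub(Vec3 a, Vec3 b) { Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
static Vec3 cross(Vec3 a, Vec3 b)
{ Vec3 r = { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; return r; }

/* Is g bounded by the planes through each edge, perpendicular to the face? */
static int point_in_polygon(const Polygon *p, Vec3 g)
{
    for (int i = 0; i < p->nverts; i++) {
        Vec3 a = p->verts[i], b = p->verts[(i + 1) % p->nverts];
        Vec3 edge_n = cross(sub(b, a), p->N);     /* outward normal of the edge plane */
        if (dot(edge_n, sub(g, a)) > 0.0)         /* g lies outside this half-space   */
            return 0;
    }
    return 1;
}

/* Returns the polygon first hit by the gaze ray h + t v (t > 0), or NULL. */
static const Polygon *closest_gaze_hit(Vec3 h, Vec3 v,
                                       const Polygon *polys, int npolys, Vec3 *hit)
{
    const Polygon *best = NULL;
    double best_t = 1e30;
    for (int i = 0; i < npolys; i++) {
        double vd = dot(polys[i].N, v);                   /* denominator of eq. (10)  */
        if (vd == 0.0) continue;                          /* ray parallel to plane    */
        double t = -(dot(polys[i].N, h) + polys[i].D) / vd;
        if (t <= 0.0 || t >= best_t) continue;            /* behind viewer or farther */
        Vec3 g = { h.x + t*v.x, h.y + t*v.y, h.z + t*v.z };
        if (point_in_polygon(&polys[i], g)) { best = &polys[i]; best_t = t; *hit = g; }
    }
    return best;
}
```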

5 Conclusions & Future Work

We have described an operational platform for real-time recording of eye movements in Virtual Reality. The platform is based on high-end graphics engines and an electromagnetically tracked, binocular helmet equipped with infra-red eye tracking capability. Rendering techniques are relatively simple, relying only on standard (OpenGL) graphics library calls. Tracking routines deliver helmet position and orientation in real-time, which are used directly to provide updated images to the HMD.

User gaze direction is tracked in real-time, along with calculated gaze/polygon intersections. We are in the process of analyzing recorded gaze intersection points for comparison with stored locations of artificially generated defects in the inspection environment.

Controlled studies of human performance in the virtual cargo bay inspection environment are forthcoming.

6 Acknowledgements

This work was supported in part by a Clemson University Innovation award (Project #1-20-1906-51-4087), NASA Ames grant (Task #NCC2-1114), and NSF CAREER award (Grant IIS-9984278).

References

[1] DANFORTH, R., DUCHOWSKI, A., GEIST, R., AND MCALILEY, E. A Platform for Gaze-Contingent Virtual Environments. In Smart Graphics (Papers from the 2000 AAAI Spring Symposium, Technical Report SS-00-04) (Menlo Park, CA, 2000), AAAI, pp. 66–70.

[2] DUCHOWSKI, A. T., AND VERTEGAAL, R. Course 05: Eye-Based Interaction in Graphical Systems: Theory & Practice. ACM SIGGRAPH, New York, NY, July 2000. SIGGRAPH 2000 Course Notes.

[3] GLASSNER, A. S., Ed. An Introduction to Ray Tracing. Academic Press, San Diego, CA, 1989.

[4] GRAMOPADHYE, A., BHAGWAT, S., KIMBLER, D., AND GREENSTEIN, J. The Use of Advanced Technology for Visual Inspection Training. Applied Ergonomics 29, 5 (1998), 361–375.

[5] GRAMOPADHYE, A. K., MELLOY, B., CHEN, S., AND BINGHAM, J. Use of Computer Based Training for Aircraft Inspectors: Findings and Recommendations. In Proceedings of the HFES/IEA Annual Meeting (San Diego, CA, August 2000).

[6] HELD, R., AND DURLACH, N. Telepresence, time delay and adaptation. In Pictorial Communication in Virtual and Real Environments, S. R. Ellis, M. Kaiser, and A. J. Grunwald, Eds. Taylor & Francis, Ltd., London, 1993, pp. 232–246.

[7] JACOB, R. J. What You Look at is What You Get: Eye Movement-Based Interaction Techniques. In Human Factors in Computing Systems: CHI '90 Conference Proceedings (1990), ACM Press, pp. 11–18.

[8] OHSHIMA, T., YAMAMOTO, H., AND TAMURA, H. Gaze-Directed Adaptive Rendering for Interacting with Virtual Space. In Proceedings of VRAIS '96 (March 30–April 3, 1996), IEEE, pp. 103–110.

[9] STARKER, I., AND BOLT, R. A. A Gaze-Responsive Self-Disclosing Display. In Human Factors in Computing Systems: CHI '90 Conference Proceedings (1990), ACM Press, pp. 3–9.

[10] TANRIVERDI, V., AND JACOB, R. J. K. Interacting with Eye Movements in Virtual Environments. In Human Factors in Computing Systems: CHI 2000 Conference Proceedings (2000), ACM Press, pp. 265–272.