

Discrete Rotation During Eye-Blink

Anh Nguyen, Marc Inhelder, and Andreas Kunz

Innovation Center Virtual Reality, ETH Zurich, Zurich, Switzerland
[email protected]

Abstract. Redirection techniques enable users to explore a virtual environment larger than the real physical space by manipulating the mapping between the virtual and real trajectories without breaking immersion. These techniques can be applied continuously over time (using translational, rotational and curvature gains) or discretely (utilizing change blindness, visual suppression, etc.). While most attention has been devoted to continuous techniques, not much has been done on discrete techniques, particularly those utilizing visual suppression.

In this paper, we propose a study to investigate the effect of discrete rotation of the virtual environment during eye-blink. More specifically, we describe our methodology and experiment design for identifying rotation detection thresholds during blinking. We also discuss preliminary results from a pilot study.

Keywords: Redirected walking · Eye-blink · Rotation detection threshold · Visual suppression

1 Introduction

Compared to other methods of navigating in a virtual environment (VE) such as using controllers or walking-in-place, real walking has been shown to have better integrity and provide better immersion [1]. However, the challenge arises when the VE is much larger than the physical space. One of the solutions to this problem is the use of redirection techniques (RDTs). Depending on how these techniques are applied, Suma et al. categorized them into continuous and discrete. These techniques can be further divided into overt and subtle depending on whether they are noticeable or not [2]. Overt continuous RDTs involve the use of metaphors such as seven league boots [3], flying [1] or virtual elevators and escalators. Subtle continuous RDTs involve continuously manipulating different aspects of the users' trajectory, such as translation (users walk faster/slower in the VE than in real life), rotation (users rotate faster/slower in the VE than in real life) and curvature (users walk on a different curvature in the VE than in real life) [4]. When applied within certain thresholds, these manipulations remain unnoticeable and immersion is maintained. Discrete RDTs refer to instantaneous relocation or reorientation of users in the VE. Some examples of overt discrete RDTs are teleportation [5] and portals [6].
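Before turning to discrete techniques, the following minimal sketch (our own illustration, not code from the paper) makes the idea of a continuous rotational gain concrete; the function name and the example gain value are assumptions.

```python
# Illustrative sketch of a rotational gain (not from the paper): a gain below 1
# makes users physically turn farther than they turn in the VE, a gain above 1
# the opposite. Gains within the detection threshold go unnoticed.

def apply_rotation_gain(real_yaw_delta_deg: float, gain: float) -> float:
    """Map a measured real head-yaw change (degrees) to the virtual yaw change."""
    return gain * real_yaw_delta_deg

# Example: with a gain of 0.9, a real 90 degree turn yields only 81 degrees in the
# VE, so the user must turn slightly more in reality to face the same virtual direction.
virtual_delta = apply_rotation_gain(90.0, 0.9)  # -> 81.0
```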


Subtle discrete RDTs can be performed when users fail to notice the reorientation and relocation due to change blindness [7] or during visual suppression caused by saccadic eye movement or blinking [8,9]. Although overt RDTs offer a higher range of motion and enable users to travel in a much larger VE, it has been shown that subtle RDTs produce fewer breaks in presence [2] and are therefore generally preferred for a more immersive VR experience. Among the subtle RDTs, most attention has been paid to continuous RDTs, including research on detection thresholds and the factors that influence them [10,11], or research on the implementation of these techniques in real walking applications such as steer-to-center, steer-to-orbit, steer-to-predefined-target [4], and model predictive control [12]. Up to now, research on discrete RDTs, especially those using eyetracker information (e.g. eye movements, blinks, gazes), has been quite limited, probably due to the lack of head mounted displays (HMDs) with an integrated eyetracker.

With the development of new HMDs with affordable integrated eyetrackers such as HTC Vive or FOVE, it is promising that research on subtle discrete RDTs using eyetracker information could be widely applicable in the future. In this paper, we propose the application of subtle discrete RDTs, more specifically rotation, in real walking during blinking. We first describe our methodology for blink detection and threshold identification. Furthermore, we discuss our experiment design and setup, and the results from a pilot study.

2 Related Work

We blink spontaneously 20–30 times per minute [13] to moisturize our eyes, and each blink lasts about 100–150 ms [14]. During blinking, the eyelids cover the pupils and prevent light and visual inputs from entering the eyes, resulting in a disruption of the image on the retina. Nevertheless, we rarely notice this disruption because our brain suppresses visual information during blinking, the so-called visual suppression. Interestingly, because of this suppression, people sometimes fail to notice changes happening to the scene during blinking, such as color changes, target appearance/disappearance or target displacement [15]. While visual suppression during blinking is undesirable in tasks that require constant monitoring of visual input, such as driving, it offers a new possibility for subtle discrete redirection in the context of redirected walking. There is, however, a limit to how much redirection can be applied to the scene without the user noticing it. The only study that addresses this question is by Ivleva, where a blink sensor was created and used with the HTC Vive to identify the detection thresholds for reorientation and repositioning during blinking [9]. While the results from this study cannot be used directly in a redirected walking application, they concluded that it could be a potential method. The study also has a few limitations: users were not performing locomotion, and the scene used may have contained reference points that gave the users clues about how they had been redirected. In other contexts not related to redirected walking, many studies have confirmed that people do not notice target displacement during blinking. However, to our knowledge, no other study quantifies this displacement.


3 Methodology

3.1 Blink Detection

Figure 1 shows typical pupil diameter recordings of a participant walking in a VE. It can be seen that during blinking the eyetracker loses track of the eyes and the pupil sizes become zero. However, it is worth noting that the left and right eyes do not open or close at exactly the same time, and that there is occasionally spurious noise, as in Fig. 1(b). Since redirection should only be applied during blinking, it is important that blinks are detected reliably and that there are no false positives. Therefore, in our blink detection algorithm, the following two conditions need to be satisfied for an event to be considered a blink: (i) both eyes' pupil diameters should change from nonzero to zero and remain zero for a certain amount of time; (ii) once the first condition is satisfied, a subsequent step from nonzero to zero will only be considered after a predefined amount of time, to eliminate irregular blinks or noise as in Fig. 1(b).
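A minimal sketch of this two-condition test is given below; the timing constants, the 60 Hz sampling assumption and all names are ours, since the paper does not state its actual parameter values.

```python
# Sketch of the two-condition blink detector described above. The timing constants,
# sample rate and names are assumptions; the paper does not specify them.

MIN_CLOSED_S = 0.05     # condition (i): both pupils must stay at zero at least this long
REFRACTORY_S = 0.30     # condition (ii): ignore further zero-crossings for this long
SAMPLE_DT = 1.0 / 60.0  # eyetracking data assumed to arrive at 60 Hz

class BlinkDetector:
    def __init__(self) -> None:
        self.closed_time = 0.0  # how long both pupil diameters have been zero
        self.cooldown = 0.0     # remaining refractory time after a detected blink
        self.fired = False      # whether the current closure already triggered a blink

    def update(self, left_diam: float, right_diam: float) -> bool:
        """Feed one sample of left/right pupil diameters; returns True once per blink."""
        self.cooldown = max(0.0, self.cooldown - SAMPLE_DT)

        if left_diam > 0.0 or right_diam > 0.0:  # at least one eye open: reset closure
            self.closed_time = 0.0
            self.fired = False
            return False

        self.closed_time += SAMPLE_DT
        # Condition (i): closure long enough; condition (ii): outside the refractory window.
        if not self.fired and self.closed_time >= MIN_CLOSED_S and self.cooldown == 0.0:
            self.fired = True
            self.cooldown = REFRACTORY_S
            return True
        return False
```

In such a scheme, a redirection would only be triggered when update() returns True: short single-eye dropouts never satisfy condition (i), and rapid re-closures are suppressed by the refractory window of condition (ii).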

Fig. 1. Diameter of left and right pupils of a participant during walking


3.2 Threshold Identification

The detection of a stimulus can be modeled by a psychometric curve where the x-axis represents the stimulus value and the y-axis represents the percentage of correct responses. Threshold identification refers to the process of identifying this psychometric function. The classical method to identify the whole psychometric function is the constant stimuli method (CSM), where the whole range of stimulus values is presented in random order. However, this method requires a large number of repetitions and is not efficient, since most of the time only certain aspects of the psychometric function, such as the 75% correct response point or the slope, are of interest. In contrast to the CSM, adaptive methods such as the staircase method, Bayesian adaptive methods, etc. select the next stimulus level based on previous responses and do not present the whole range of stimulus values. These methods require fewer trials but only identify one point on the psychometric curve and/or the slope.

While most existing studies on redirected walking adopt the CSM for threshold identification [8,10], to reduce experiment time we select the Bayesian adaptive method QUEST, whose details are provided by Watson and Pelli [16].
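As a rough illustration of the adaptive idea (not the exact QUEST implementation of Watson and Pelli [16]), the sketch below keeps a discrete posterior over candidate thresholds, assumes a logistic psychometric function, and places each trial at the current posterior mean; the threshold grid, slope and other parameter values are our own assumptions.

```python
import numpy as np

# Rough illustration of a Bayesian adaptive procedure in the spirit of QUEST [16].
# The psychometric shape, threshold grid and parameter values are assumptions.

class ThresholdTracker:
    def __init__(self, max_rotation_deg: float = 15.0, slope: float = 0.5) -> None:
        self.levels = np.linspace(0.5, max_rotation_deg, 200)  # candidate thresholds (deg)
        self.log_post = np.zeros_like(self.levels)             # flat prior, in log space
        self.slope = slope

    def _p_detect(self, stimulus_deg: float) -> np.ndarray:
        # Assumed logistic psychometric function P(detect | stimulus, threshold).
        return 1.0 / (1.0 + np.exp(-self.slope * (stimulus_deg - self.levels)))

    def next_stimulus(self) -> float:
        # Place the next trial at the current posterior mean of the threshold.
        weights = np.exp(self.log_post - self.log_post.max())
        return float(np.average(self.levels, weights=weights))

    def update(self, stimulus_deg: float, detected: bool) -> None:
        # Bayesian update after one detected / not-detected response.
        p = self._p_detect(stimulus_deg)
        self.log_post += np.log(p if detected else 1.0 - p)
```

After all trials, the posterior mean (or a chosen quantile) serves as the threshold estimate; QUEST itself additionally exploits an informative prior and a particular psychometric shape, which this sketch omits.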

4 Experiment Design and Setup

The aim of this study is to identify the detection threshold for scene rotation during blinking. While in other redirected walking threshold studies the participants were informed about the purpose of the study and asked whether they noticed the manipulation, the same design cannot be used in our experiment. If the participants are informed that the scene will be rotated during blinking, they will potentially try to fixate on a reference point and deliberately blink to identify the rotation direction. As a result, the real aim of the study cannot be disclosed. Instead, the participants are given a cover story that they are testing a new system which may contain some technical bugs, and they are encouraged to inform the experimenter whenever such a bug occurs. When a subject reports a bug, the experimenter first makes sure that a scene rotation has just been applied and then verifies that the subject has really noticed the rotation rather than something else. When it is confirmed that the subject has noticed the rotation, it is considered a correct detection response. Otherwise, when a stimulus has been presented after a blink without the user making any comment, it is considered a no detection response. Depending on the type of response, the next stimulus level is selected accordingly. In addition, since there may be asymmetry in users' ability to detect scene rotations in different directions, we identify thresholds for left and right rotations separately.
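The bookkeeping around these responses could look roughly like the following sketch (our own illustration, building on the ThresholdTracker sketch from Sect. 3.2). Whether a stimulus is applied at every blink and how the rotation direction is chosen are not specified in the paper; both are assumptions here, as is the hypothetical rotate_scene call.

```python
import random

# Sketch of the trial bookkeeping described above (our own illustration).
# ThresholdTracker is the sketch from Sect. 3.2; one independent tracker per direction.

trackers = {"left": ThresholdTracker(), "right": ThresholdTracker()}
pending = None  # (direction, stimulus) of the rotation applied at the last blink

def on_blink() -> None:
    """At blink onset: count an unreported previous stimulus as 'no detection',
    then apply the next stimulus level for a randomly chosen direction."""
    global pending
    if pending is not None:
        direction, stimulus = pending
        trackers[direction].update(stimulus, detected=False)
    direction = random.choice(["left", "right"])
    stimulus = trackers[direction].next_stimulus()
    pending = (direction, stimulus)
    # rotate_scene(direction, stimulus)  # hypothetical call into the VE

def on_confirmed_report() -> None:
    """Called after the experimenter confirms the subject really noticed the rotation."""
    global pending
    if pending is not None:
        direction, stimulus = pending
        trackers[direction].update(stimulus, detected=True)
        pending = None
```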

In this study, users are required to walk around a maze-like environment (Fig. 2(a)) to search for a target. The maze is much larger than the available tracking space (Fig. 2(b)), and therefore whenever users approach the physical wall a reset action is performed which reorients them towards the center of the physical space. Once the target has been found, a new scene


will be randomly generated and loaded. The experiment is completed after the users have been exposed to 40 stimulus values per rotation direction.
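The paper does not detail the reset mechanism itself; as one possible ingredient, the sketch below (an assumption on our part) computes the yaw offset that would point the user's forward direction back towards the centre of the tracking space.

```python
import math

# Illustrative reset computation (assumed; the paper only states that users are
# reoriented towards the centre of the physical space when they approach a wall).

def reset_yaw_offset_deg(pos_x_m: float, pos_z_m: float, heading_deg: float) -> float:
    """Yaw offset that turns the user's current heading towards the tracking-space
    centre, assumed at the origin. Heading convention: 0 deg along +z, via atan2(x, z)."""
    to_centre_deg = math.degrees(math.atan2(-pos_x_m, -pos_z_m))  # direction towards (0, 0)
    offset = to_centre_deg - heading_deg
    return (offset + 180.0) % 360.0 - 180.0                       # wrap to [-180, 180)
```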

(a) User view of the VR scene. (b) Top view with real physical space overlay.

Fig. 2. Scene used in the study

Our setup consists of an Oculus DK2 head mounted display (HMD) with an integrated SMI eyetracker providing eyetracking data such as gaze position, pupil diameters, etc. at 60 Hz. An Intersense IS-1200 optical tracking system is attached on top of the HMD and provides 6 DOF position tracking at a rate of 180 Hz. The system is powered by a backpack-mounted laptop and the application was developed in Unity. The environment was optimized to run constantly at the HMD's maximum frame rate of 75 Hz. The available tracking space is 13 m × 6.6 m.

5 Pilot Study and Preliminary Results

A pilot study was performed to verify the applicability of the proposed experiment protocol and the cover story. Five naive subjects (3 males and 2 females, age range: 20–29), all students from the university, volunteered to participate in the study. The subjects were not informed about the real purpose of the study but were instead told the cover story. The first pilot subject remembered to mention to the experimenter every time he noticed a technical "bug", such as: "the color is weird", some things "seem a bit blur", or "the scene just glitched". However, the next two subjects were so immersed in the VE that they did not mention anything, even though the scene rotation was increased up to its predefined maximum of 15°. When asked if they had noticed anything, they replied "I sometimes saw the scene jump" and "I have seen it for a while now but forgot to mention it". Since it is crucial that the users' responses are collected in a timely manner, we changed the experiment protocol for the last two pilot subjects and added a training session. In this training session, the subjects were exposed to the same environment but the scene rotation was always 15°.


This ensured that the subjects experienced the stimulus and understood what they should point out during the experiment. Moreover, keywords were assigned to each "bug" that the subjects discovered in the training session, such as: "blur", "jump", "color", etc. This way, during the final study, the subjects only need to use these keywords when they detect a "bug" and do not have to stop and explain in full sentences what just happened. This adjusted protocol worked well for the last two pilot subjects and will be adopted for the final study. After the experiment, a series of questions was used to debrief the subjects, to determine the effectiveness of the cover story and whether the subjects had realized that the scene rotations were linked to blinking. When asked if they could guess why the technical bugs occurred, all the subjects recited the cover story and none of them identified that the bugs were associated with their blinks.

An average detection threshold could not be obtained from this pilot study due to the limited number of subjects and the varied experiment protocol between subjects. However, it was observed that scene rotations below 5° were on average not detected by the subjects. This estimate is close to the detection threshold during saccadic eye movements found by Bolte and Lappe [8].

6 Conclusion

In this paper, we proposed an experiment design for identifying detection thresholds for scene rotation during blinking. Without being told the true purpose of the study, users were asked to walk around a VE looking for a target and were encouraged to report whenever they detected a technical bug, i.e. a scene manipulation. The pilot study we performed enabled us to refine the experiment design, showed that the cover story was effective, and resulted in a rough estimate of the detection threshold. Further studies with a sufficiently large sample size are required to identify the detection thresholds not only for scene rotation but also for displacement during blinking.

References

1. Usoh, M., Arthur, K., Whitton, M.C., Bastos, R., Steed, A., Slater, M., Brooks Jr., F.P.: Walking > walking-in-place > flying, in virtual environments. In: Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 1999, pp. 359–364. ACM Press/Addison-Wesley Publishing Co., New York (1999)

2. Suma, E.A., Bruder, G., Steinicke, F., Krum, D.M., Bolas, M.: A taxonomy for deploying redirection techniques in immersive virtual environments. In: 2012 IEEE Virtual Reality Workshops (VRW), pp. 43–46, March 2012

3. Interrante, V., Ries, B., Anderson, L.: Seven league boots: a new metaphor for augmented locomotion through moderately large scale immersive virtual environments. In: 2007 IEEE Symposium on 3D User Interfaces, March 2007

4. Razzaque, S., Kohn, Z., Whitton, M.C.: Redirected walking. In: Eurographics 2001 - Short Presentations, Geneva, Switzerland, pp. 1–6. Eurographics Association (2001)


5. Bowman, D.A., Koller, D., Hodges, L.F.: Travel in immersive virtual environments: an evaluation of viewpoint motion control techniques. In: Proceedings of IEEE 1997 Annual International Symposium on Virtual Reality, pp. 45–52, 215, March 1997

6. Freitag, S., Rausch, D., Kuhlen, T.: Reorientation in virtual environments using interactive portals. In: 2014 IEEE Symposium on 3D User Interfaces (3DUI), pp. 119–122, March 2014

7. Suma, E.A., Clark, S., Krum, D., Finkelstein, S., Bolas, M., Wartell, Z.: Leveraging change blindness for redirection in virtual environments. In: 2011 IEEE Virtual Reality Conference, pp. 159–166, March 2011

8. Bolte, B., Lappe, M.: Subliminal reorientation and repositioning in immersive virtual environments using saccadic suppression. IEEE Trans. Vis. Comput. Graph. 21, 545–552 (2015)

9. Ivleva, V.: Redirected Walking in Virtual Reality during eye blinking. Bachelor's thesis, University of Bremen (2016)

10. Steinicke, F., Bruder, G., Jerald, J., Frenz, H., Lappe, M.: Estimation of detection thresholds for redirected walking techniques. IEEE Trans. Vis. Comput. Graph. 16, 17–27 (2010)

11. Neth, C.T., Souman, J.L., Engel, D., Kloos, U., Bülthoff, H.H., Mohler, B.J.: Velocity-dependent dynamic curvature gain for redirected walking. In: 2011 IEEE Virtual Reality Conference, pp. 151–158. IEEE, New York, March 2011

12. Nescher, T., Huang, Y.-Y., Kunz, A.: Planning redirection techniques for optimal free walking experience using model predictive control. In: 2014 IEEE Symposium on 3D User Interfaces (3DUI), pp. 111–118, March 2014

13. Sun, W.S., Baker, R.S., Chuke, J.C., Rouholiman, B.R., Hasan, S.A., Gaza, W., Stava, M.W., Porter, J.D.: Age-related changes in human blinks. Passive and active changes in eyelid kinematics. Invest. Ophthalmol. Vis. Sci. 38(1), 92–99 (1997)

14. VanderWerf, F., Brassinga, P., Reits, D., Aramideh, M., Ongerboer de Visser, B.: Eyelid movements: behavioral studies of blinking in humans under different stimulus conditions. J. Neurophysiol. 89(5), 2784–2796 (2003)

15. O'Regan, J.K., Deubel, H., Clark, J.J., Rensink, R.A.: Picture changes during blinks: looking without seeing and seeing without looking. Vis. Cogn. 7(1–3), 191–211 (2000)

16. Watson, A.B., Pelli, D.G.: QUEST: a Bayesian adaptive psychometric method. Percept. Psychophys. 33, 113–120 (1983)